A function that picks up from a specified page all images larger than 50 kB. The function returns an array containing each image's URL and its size in kilobytes. How do I start? Do I use wget and exec(), or is there an easier way? So first I need to download all the images, then analyze them to get the URL and size.
siina 0
Newbie Poster
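For reference, the steps in the question (download the page, find the image URLs, check each size, return URL + size pairs) can be sketched roughly as below. All function names here are hypothetical, the regex extraction is deliberately naive (DOMDocument would be more robust), and get_headers() can only report a size when the server sends a Content-Length header:

```php
<?php
// Step 1: pull <img src="..."> URLs out of the page HTML with a naive regex.
function extract_image_urls($html)
{
    preg_match_all('/<img[^>]+src=["\']([^"\']+)["\']/i', $html, $m);
    return array_unique($m[1]);
}

// Step 2: keep only images whose reported Content-Length exceeds $min_bytes,
// returning url => size-in-kB pairs.
function large_images($page_url, $min_bytes = 51200)
{
    $html = file_get_contents($page_url);
    if ($html === false) {
        return [];
    }
    $result = [];
    foreach (extract_image_urls($html) as $src) {
        $head = get_headers($src, 1); // 1 => associative header array
        if ($head === false || !isset($head['Content-Length'])) {
            continue; // server did not report a size
        }
        $len = $head['Content-Length'];
        // After redirects the value can be an array; the last entry
        // belongs to the final response.
        $bytes = is_array($len) ? (int) end($len) : (int) $len;
        if ($bytes > $min_bytes) {
            $result[$src] = $bytes / 1024;
        }
    }
    return $result;
}
```

Note this sketch does not resolve relative image URLs against the page URL; that would need an extra step.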
Recommended Answers
There is a nice function called file_get_contents which gets the contents of a URL and stores it in a variable; you can then write it to a file, process it into a MySQL database, and so on. For example:

<?php
// first, specify the url
$url='
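To make that idea concrete, here is a minimal sketch (the helper name image_size_kb is hypothetical) that fetches a file with file_get_contents and converts the byte count to kilobytes. Fetching an http:// URL this way assumes allow_url_fopen is enabled; it also works on local paths:

```php
<?php
// Minimal sketch: fetch a file (local path or URL) and return its size
// in kilobytes, or false if the fetch fails.
function image_size_kb($url)
{
    $data = file_get_contents($url);
    if ($data === false) {
        return false; // fetch failed
    }
    return strlen($data) / 1024; // bytes -> kilobytes
}
```

The downside of this approach is that the whole image is downloaded just to measure it.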
You can also query the remote server with a HEAD request and check whether it provides Content-Length, something like:

<?php
$url = 'http://www.website.tld/image01.jpg';
$head = get_headers($url);
$length = str_replace('Content-Length: ', '', $head[6]);
if ($length < 50000) {
    echo 'too small: ';
} else {
    echo 'ok: ';
}
…
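A sturdier variant of the same idea: calling get_headers($url, 1) returns an associative array, so Content-Length can be looked up by name instead of relying on the fixed index 6, which breaks if the server sends headers in a different order. The helper name below is hypothetical; note also that get_headers() issues a GET by default, so an actual HEAD request needs a stream context that sets the method:

```php
<?php
// Look up Content-Length (in bytes) from an associative header array,
// or return false when the server did not report a size.
function content_length(array $headers)
{
    if (!isset($headers['Content-Length'])) {
        return false;
    }
    $len = $headers['Content-Length'];
    // After redirects the value can be an array; the last entry
    // belongs to the final response.
    return is_array($len) ? (int) end($len) : (int) $len;
}

// Usage (network required):
// $head = get_headers('http://www.website.tld/image01.jpg', 1);
// $bytes = content_length($head);
```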
All 7 Replies