I'm currently building a script to monitor my sites and report back to me with their availability, HTTP status codes and, most importantly here, the PAGE LOAD TIME.

I'm currently doing this with curl requests, which works for getting status codes and finding out many other things about the pages/sites. BUT it will only give me the time it took to grab the HTML of the page.

Is there a way to set it to download more, sort of like a browser would, so it would record the total time it takes a user to download the HTML, CSS, JavaScript, images etc.?

I have thought about using curl to download the HTML first, then using the multi functions (curl_multi_*) to download each of the components that make up the page, but this seems long-winded.
I wondered if there was a better way than curl, or maybe a pre-built set of classes ready for this, but I have searched Google with no luck.
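For what it's worth, the curl_multi_* approach is less long-winded than it sounds. A rough sketch of the idea (the helper names, the tag/attribute list, and the naive relative-URL resolution below are my own assumptions; a real browser also does caching, connection limits, rendering, etc., so this only approximates load time):

```php
<?php
// Pull the URLs of assets (images, external scripts, stylesheets) out of a page.
function extract_asset_urls(string $html, string $pageUrl): array
{
    $doc = new DOMDocument();
    @$doc->loadHTML($html); // suppress warnings from real-world markup

    $urls = [];
    foreach ($doc->getElementsByTagName('img') as $img) {
        $urls[] = $img->getAttribute('src');
    }
    foreach ($doc->getElementsByTagName('script') as $script) {
        if ($script->getAttribute('src') !== '') {
            $urls[] = $script->getAttribute('src');
        }
    }
    foreach ($doc->getElementsByTagName('link') as $link) {
        if (strtolower($link->getAttribute('rel')) === 'stylesheet') {
            $urls[] = $link->getAttribute('href');
        }
    }

    // Resolve relative URLs against the page's host (naive: root-relative only).
    $parts = parse_url($pageUrl);
    $root  = $parts['scheme'] . '://' . $parts['host'];
    return array_map(
        function ($u) use ($root) {
            return preg_match('#^https?://#i', $u) ? $u : $root . '/' . ltrim($u, '/');
        },
        array_unique(array_filter($urls))
    );
}

// Fetch all asset URLs in parallel; return the total wall-clock time in seconds.
function timed_fetch_all(array $urls): float
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    $start = microtime(true);
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh, 0.1); // wait for activity instead of busy-looping
    } while ($running > 0);
    $elapsed = microtime(true) - $start;

    foreach ($handles as $ch) {
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $elapsed;
}
```

You'd first fetch the HTML with a normal (timed) curl request, run the result through extract_asset_urls(), then add the timed_fetch_all() figure on top for a rough total.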

I know of some online tools that request pages sort of how I want to, e.g. http://tools.pingdom.com/ and http://loadimpact.com/pageanalyzer.php, but I don't need all the info they generate, just the end result: the page load time the user would experience.

Not sure if this is possible in PHP; I may need to switch to Python or another language for this part of the script.
If anyone has used or built a script capable of this, please let me know!


I've just been looking at using COM, but I would need to invest in a Windows server.
I found an article on using the COM class to screen-grab sites, so I assume you could use it as well to time the loading of pages.

$browser = new COM("InternetExplorer.Application");

Not sure if I can test this on my Windows PC though. I have XAMPP, but when I run the script I get errors, so I assume I need to install or enable some things. Has anyone here used the COM class who can give me any advice with it? Or does anyone using .NET have ideas I could migrate to PHP, or could PHP execute a .NET or even C program in order to get the resulting time?
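In case the COM route pans out, the timing part might look something like this. This is an untested sketch assuming a Windows box with PHP's COM extension available (note the ProgID has no space in it, and READYSTATE_COMPLETE is 4 in IE's automation interface):

```php
<?php
// Turn two microtime(true) readings into whole milliseconds.
function elapsed_ms(float $start, float $end): int
{
    return (int) round(($end - $start) * 1000);
}

if (class_exists('COM')) {
    // Windows only: drive Internet Explorer and wait for the page to finish loading.
    $browser = new COM('InternetExplorer.Application');
    $browser->Visible = false;

    $start = microtime(true);
    $browser->Navigate('http://example.com/');

    // IE reports ReadyState 4 (READYSTATE_COMPLETE) once the page and its assets are done.
    while ($browser->Busy || $browser->ReadyState != 4) {
        usleep(100000); // poll every 100 ms
    }
    $end = microtime(true);

    $browser->Quit();
    echo 'Page load: ' . elapsed_ms($start, $end) . " ms\n";
}
```

The class_exists() guard just means the script degrades quietly on a non-Windows box instead of fataling.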

Any Ideas?

PHP will only be able to calculate the time taken to generate the page before sending it; it would not be able to account for the time taken to download the content etc.

However, at the beginning of your script you could get the current date/time and output it into some JavaScript as the start time, then in a JavaScript function get the local system time once the page has finished loading, and work out the total from there.
Of course, this depends on the client having the correct time/date and being in the same timezone as the server. So a better method may be to have an Ajax function fetch the current time from the server once the page has loaded, and use that instead.
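A sketch of the client-clock version of that idea, as a PHP helper that emits the JavaScript (the /record-load-time collection endpoint is made up for illustration):

```php
<?php
// Build a JS snippet that measures total load time relative to a server-side
// start timestamp and posts it back. The /record-load-time endpoint is hypothetical.
function load_timer_script(float $serverStart): string
{
    $startMs = $serverStart * 1000;
    return <<<HTML
<script>
var start = {$startMs}; // server-side start time, in ms
window.onload = function () {
    var total = Date.now() - start; // relies on the client clock matching the server's
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/record-load-time');
    xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
    xhr.send('ms=' + total);
};
</script>
HTML;
}

// Usage: capture the start time as the very first line of the page...
$serverStart = microtime(true);
// ...render the page, then emit the snippet at the bottom:
echo load_timer_script($serverStart);
```

Swapping Date.now() for an Ajax round-trip to the server would remove the clock-skew caveat, as suggested above.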

But how would I know that everything has finished loading with this?
It would work if I could put code into the pages (most of the sites I check have their own analytics set up, either Google Analytics or another), but that's not at all what I'm trying to achieve. I want my script to be external, on a different domain: it would go to another website, scrape it with something similar to curl, but record the time taken for the resulting HTML page, CSS, JavaScript and images to load, rather than the time to simply download the HTML from the server, which is all curl would time.

Or wait, are you suggesting I grab pages via curl and use the PHP DOM class to insert JavaScript into the page, which in turn would use Ajax to send the result to the database, and then another script would read this (or attempt to) on a regular basis?
That would work if I could open the file in a browser, but this script needs to be run via cron or similar, therefore not in a browser, and JavaScript becomes useless, unless you know of some way to execute it in PHP as if in a browser?? Which in turn would probably give me the flexibility to time a page anyway, right? :)
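For the record, the injection half of that idea is straightforward with DOMDocument; whether it solves the cron problem is another matter, since something still has to execute the JavaScript. A minimal sketch:

```php
<?php
// Insert a <script> element at the end of <body> in HTML fetched via curl.
function inject_script(string $html, string $jsSource): string
{
    $doc = new DOMDocument();
    @$doc->loadHTML($html); // tolerate sloppy real-world markup

    // Note: DOMDocument::createElement does not escape the value, which is
    // what we want for raw JavaScript source.
    $script = $doc->createElement('script', $jsSource);
    $body = $doc->getElementsByTagName('body')->item(0);
    if ($body !== null) {
        $body->appendChild($script);
    }
    return $doc->saveHTML();
}

// Usage: $page = inject_script($curlResult, "console.log('loaded');");
```

The modified page would then need to be served to, or opened in, a real browser for the injected script to ever run.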