Hi guys,

I've recently moved into PHP for a personal application. Basically, it tries to connect to some servers to see whether they're online. It ultimately returns a string, which the caller echoes. The main function is:

function IsServerOnline($IP, $checkforevent = TRUE, $ischannel = TRUE, $PORT = 11020)
{
	global $UseGlobalOverride, $UseChannelOverride, $GlobalOverride, $ChannelOverride, $UseOfflineOverride, $OfflineOverride, $IsEvent;

	// manual overrides take priority over the live check
	if ($UseGlobalOverride) return $GlobalOverride;
	if ($UseChannelOverride && $ischannel) return $ChannelOverride;

	// try to open a TCP connection with a 2-second timeout
	$fp = fsockopen($IP, $PORT, $errnum, $errstr, 2);
	if (!$fp)
	{
		if ($UseOfflineOverride) return $OfflineOverride; else return "offline";
	}
	else
	{
		fclose($fp);
		if ($checkforevent && $IsEvent) return "event"; else return "online";
	}
}

and it is called by this:

|Server1=<?php echo(IsServerOnline("208.85.108.17", TRUE, TRUE, 80)); ?><br>
|Server2=<?php echo(IsServerOnline("63.251.217.212", FALSE, FALSE, 80)); ?><br>
|Server3=<?php echo(IsServerOnline("208.85.108.15", FALSE, FALSE, 11000)); ?><br>

It all works wonderfully, except when a server goes down. Then the timeouts kick in and the page becomes slow to load. I have approximately 50 servers to test, so this webpage gets very slow at times. What would be my best bet for running the tests asynchronously and then printing the correct output in the correct place?

All 7 Replies

Yeah, not quite what I want. The detection code is fine; the page just loads really slowly when servers are offline.

Okay, some code I hacked together:

$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_URL, $url); // $url is the address of the server you want to check
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 20); // connection timeout in seconds
curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);

// we only want the headers, not the body
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_NOBODY, true);

// curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // uncomment if you want to follow redirects

$content = curl_exec($ch);
curl_close($ch);

Take a look at $content to find the status header; it should contain 200 if the server is up.
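
For example, something like this would pull the status code out of those headers (just a sketch; it assumes $content holds the raw headers returned by the snippet above):

if ($content !== false && preg_match('#^HTTP/\S+\s+(\d{3})#', $content, $matches))
{
	$status = (int) $matches[1];
	echo ($status == 200) ? "online" : "offline";
}
else
{
	echo "offline"; // no response at all
}

You could also call curl_getinfo($ch, CURLINFO_HTTP_CODE) before curl_close() instead of parsing the headers yourself.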

And (pardon my noobishness) is that code async? That is, will it immediately move on to the next server?

No, but this should run fairly fast. It also helps if you only put the url setopt and the exec in the loop, and the rest before or after it.
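
Roughly like this (just a sketch; $servers is a made-up array mapping names to the URLs you want to check, and the 2-second timeout mirrors the fsockopen() call in the original function):

$servers = array(
	'Server1' => 'http://208.85.108.17/',
	'Server2' => 'http://63.251.217.212/',
);
$results = array();

$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_NOBODY, true);

foreach ($servers as $name => $url)
{
	curl_setopt($ch, CURLOPT_URL, $url); // only the URL changes inside the loop
	$content = curl_exec($ch);
	$results[$name] = ($content !== false) ? "online" : "offline";
}
curl_close($ch);

The requests still run one after the other, but reusing a single handle avoids paying the setup cost on every iteration.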

You can use proc_open() to open multiple php processes. You can open a process to a PHP file that will run the server online test. You don't have to wait for proc_open() to return, so you can run it multiple times.

If you want to read the results asynchronously, use stream_select(). Otherwise you can just have the PHP scripts launched by proc_open() write their results to persistent storage, like a database or a file.
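
A rough sketch of that approach (the worker script name check_server.php is made up; it's assumed to print "online" or "offline" for the URL passed as its first argument, and $servers is the same kind of name-to-URL array as in the sketch above):

$descriptors = array(1 => array('pipe', 'w')); // we only need each child's stdout
$procs = array();
$pipes = array();

// launch one child process per server; proc_open() returns immediately
foreach ($servers as $name => $url)
{
	$proc = proc_open('php check_server.php ' . escapeshellarg($url), $descriptors, $p);
	if (is_resource($proc))
	{
		$procs[$name] = $proc;
		$pipes[$name] = $p[1];
	}
}

// collect the results as the children finish
$results = array();
while (count($pipes) > 0)
{
	$read = $pipes;
	$write = NULL;
	$except = NULL;
	$ready = stream_select($read, $write, $except, 5); // wait up to 5 seconds
	if ($ready === FALSE || $ready === 0) break;
	foreach ($read as $stream)
	{
		$name = array_search($stream, $pipes, TRUE);
		$results[$name] = trim(stream_get_contents($stream));
		fclose($stream);
		proc_close($procs[$name]);
		unset($pipes[$name], $procs[$name]);
	}
}

Because all the children run at the same time, the total wait is roughly one timeout instead of one timeout per offline server.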

You can also use exec() to launch processes asynchronously, e.g.:

exec('php /path/to/server_online_test.php http://example.com &');

That would call /path/to/server_online_test.php, passing it "http://example.com" as an argument; the trailing & keeps exec() from waiting for the script to finish. You can retrieve the argument inside the script via $argv[1].
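
The worker script itself only needs a few lines. A sketch (paths and file naming are placeholders; it reuses the fsockopen() check from the original function and writes the result somewhere the main page can pick it up later):

<?php
// hypothetical /path/to/server_online_test.php
// usage: php server_online_test.php http://example.com
$url  = $argv[1];
$host = parse_url($url, PHP_URL_HOST);
$port = parse_url($url, PHP_URL_PORT);
if ($port === NULL) $port = 80; // default to port 80 when the URL doesn't specify one

$fp = @fsockopen($host, $port, $errnum, $errstr, 2);
$status = $fp ? "online" : "offline";
if ($fp) fclose($fp);

// drop the result into a file keyed on the URL so the main page can read it
file_put_contents('/tmp/status_' . md5($url) . '.txt', $status);
echo $status;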

Thanks, exactly what I was looking for!
