Hello.

I'm trying to get some AJAX working to collect statistics on clicked links, using PHP and MySQL as the back end. My approach is a synchronous AJAX request to the server in the onbeforeunload event. I use a synchronous request to be sure the browser has sent the request to the server before the page is unloaded; with an asynchronous request many requests never reach the server, especially on slow connections.

This approach works well so far, but as the MySQL database grows and the PHP script gets more complicated, processing the request will take longer, which forces the browser to wait a little before the response is sent back.

I was just wondering: is there any way to send the response to the browser, close the connection (which normally happens after exit or die() in a PHP script), and only after that continue script execution? That way we can be sure the request reached the server, and the browser will not have to wait while PHP executes all the back-end logic. I've tried

header("HTTP/1.0 200 OK");
header('Status: 200 OK');
header("Connection: close");

but no success so far.

Any ideas how to do it?

Harutyun,

First thoughts are :

  1. Use the stats package(s) provided on your host server. These should allow you to see which of your pages have been visited (by day, month, year, browser variant, colour of your cat etc. etc.), and where visitors exit to (i.e. pages on other domains hyperlinked from your pages).
  2. If you are only interested in logging pages on your own site, then include() code to undertake the logging at the top of every page.
  3. What you suggest (a "fire-and-forget worker request") must be possible. I wonder if a "multi-threaded php" approach might work?

Airshow

Hello. Thanks for reply.

I found a user-contributed note in the PHP manual at http://am.php.net/features.connection-handling and it seems to solve this problem. Tested with Firefox so far, and it works well.

Here is the code:

<?php
  // Send a redirect response to the browser, close the connection,
  // and let the script carry on executing in the background.
  function redirect_and_continue($sURL)
  {
    header("Location: " . $sURL);
    ob_end_clean();               // discard anything already buffered (arr1s code)
    header("Connection: close");  // ask the browser to close the connection
    ignore_user_abort(true);      // keep the script alive after the client disconnects
    ob_start();
    header("Content-Length: 0");  // empty body, so the browser has nothing to wait for
    ob_end_flush();
    flush();                      // push the response out now (end arr1s code)
    session_write_close();        // release the session lock (as pointed out by Anonymous)
  }
?>
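For the AJAX case the same trick should work without the redirect. Here is a minimal sketch of the variant I have in mind (the function name respond_and_continue is my own invention; it assumes the endpoint returns an empty 200 response):

```php
<?php
  // Sketch: answer the AJAX request immediately, then keep working.
  // (My own variant of the manual note above; the function name is made up.)
  function respond_and_continue()
  {
    ignore_user_abort(true);      // keep running after the browser disconnects
    ob_start();
    header("Connection: close");
    header("Content-Length: " . ob_get_length());  // 0 here: empty body
    ob_end_flush();
    flush();                      // the browser now has its response
    session_write_close();        // release the session lock
  }
?>
```

Called at the top of the AJAX back-end file, this lets the rest of the script run the slow database logic without keeping the browser waiting.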

Thanks to all thinking of this problem.

Harutyun,

Well found, and it looks interesting, but I am struggling to understand exactly how you are using this function.

Have you dropped the synchronous AJAX request in favour of a more conventional approach? And which URL do you log: the one that is being exited, or the new one that is about to load?

Airshow

Hello. I'm using two different approaches for the following cases.

1. Regular URL clicked. In this case I just prevent the event's default behaviour (navigating to the page), send a synchronous request to the server with all the information (current page, destination URL, banner identifier in my database, banner location: top, left, etc.), and after getting the response set window.location to the URL.

2. Google Ads clicked. In this case nothing can stop the browser from navigating, because the URL lives in a separate iframe on a different domain, and events inside that iframe cannot be captured.
So I just monitor the onbeforeunload event and decide whether an ad was clicked from the mouse position. If the mouse is within the iframe, I send a synchronous request to the server with all available data (in this case the URL being loaded is not available). Until the response arrives the browser hangs and waits (which is why I decided to use a synchronous request), so we can be sure the request has been properly sent to the server.

In both cases returning the response ASAP is very important. The back end of this process is getting more complicated and smarter (more data is processed and inserted into the database). The function above is called in the back-end file every time an AJAX request is received. This makes the browser think the request has succeeded, and the user notices almost no hang. After sending the response to the browser, the script continues executing all the code needed to process the request data.
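To make the flow concrete, here is a rough sketch of what such a back-end file might look like. The file name, credentials, table and column names are all placeholders I made up for illustration; the early-response lines follow the manual note quoted earlier in the thread:

```php
<?php
// stats.php - hypothetical logging endpoint hit by the synchronous AJAX call.

// 1. Answer the browser first, so it barely notices the request.
ignore_user_abort(true);          // keep running after the browser disconnects
ob_start();
header("Connection: close");
header("Content-Length: 0");      // empty body: nothing for the browser to wait for
ob_end_flush();
flush();
session_write_close();            // release the session lock

// 2. Now do the slow work; the browser has already moved on.
//    Credentials, table and column names are placeholders.
$db = new mysqli("localhost", "user", "password", "stats");
$stmt = $db->prepare(
  "INSERT INTO clicks (page, target, banner_id, position) VALUES (?, ?, ?, ?)"
);
$stmt->bind_param(
  "ssis",
  $_POST["page"],
  $_POST["target"],
  $_POST["banner_id"],
  $_POST["position"]
);
$stmt->execute();
?>
```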

Hope this helped everyone interested in this topic.

Thanks.

Thanks Harutyun, I understand much better now.

My only concern is that if the user goes back/forward (using browser controls) then you may get duplicate log entries every time your page is re-unloaded. Suggest you experiment a bit.

If duplicate logging proves to be an issue, then maybe the simplest thing would be to introduce some "duplicate-reject-within-session" (or similar) logic if, of course, you don't have such logic already.

Best wishes

Airshow

Hello. If the user goes back/forward using the mouse (which is the case with most users) he is not logged anyway (see my previous post about mouse position and the link-click action). The only way to double-log an action is to change the page location while the mouse is over the Google Ads iframe, e.g. using the keyboard to activate the address bar and type a new address, reloading with F5, etc. I don't think this will be an issue, so I decided not to do any additional check for the current user. But maybe I'll think about it later.

Thanks.

Harutyun,

That's cool. Good luck with it.

Airshow
