Hi everybody :)

I have a very stupid question but I cannot solve it!

I connect to the MySQL db and fetch the records that contain URLs; inside the PHP loop I call an Ajax function that reads the title tag of each URL.

Ajax code:

function getTitle(str1, str2) {
  var xmlhttp;
  if (window.XMLHttpRequest) { // code for IE7+, Firefox, Chrome, Opera, Safari
    xmlhttp = new XMLHttpRequest();
  } else { // code for IE6, IE5
    xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
  }
  xmlhttp.onreadystatechange = function () {
    if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
      // write the fetched title into the matching <div>
      document.getElementById("tagTitle_" + str2).innerHTML = xmlhttp.responseText;
    }
  };
  xmlhttp.open("GET", "getTitle.php?url=" + encodeURIComponent(str1), true); // server-side helper name assumed
  xmlhttp.send();
}

PHP code:

while ($fields = @mysql_fetch_array($Query)) {
    // for each URL listed, emit the target <div> and a call to the Ajax function
?>
<div id="tagTitle_<?php echo $fields['idOfUrl']; ?>">HERE OUTPUT</div>
<script type="text/javascript">
getTitle('<?php echo $fields['url']; ?>', '<?php echo $fields['idOfUrl']; ?>');
</script>
<?php
}

But this "dirty" method does not work, I think because the function does not have time to read!
If I change pages and go back in the browser window, the titles are shown correctly, so the function itself must be written correctly.
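For reference, the "does not have time to read" behaviour is ordinary Ajax asynchrony: `send()` returns immediately, and the response only exists inside the callback, never on the next line. A minimal sketch (names assumed; `setTimeout` stands in for the request):

```javascript
// Minimal sketch: setTimeout stands in for an asynchronous Ajax request.
// The callback runs AFTER the current code has finished, so the line
// right after later() cannot see the response yet.
function later(callback) {
  setTimeout(function () { callback("response"); }, 0);
}

var result = "(not yet)";
later(function (r) { result = r; });
// here result is still "(not yet)"; it becomes "response" only later
```

This is why anything that uses the fetched title (like filling the `<div>`) has to happen inside the callback itself.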

how can I solve it?

All 3 Replies

diafol:

This seems very wasteful with regard to server calls. Does this run on page load? If so, what's the point of using Ajax? Just get the info via php.

Surely you're right.
But reading a file (such as an external address) takes time that I wanted to save by using Ajax... so I thought this was a way to do it. If you have other solution(s), please tell me.

diafol:

You can read as many files as you want in php - you don't need ajax to do it. I just don't understand what you're trying to do.

Getting ALL the info before the browser accepts it (server processing) is much better than making server calls -> client calls -> server calls - if that is indeed what you're doing. Sorry if I've got the wrong end of the stick.
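To make that concrete: the title-extraction step can be done entirely server-side inside the existing PHP `while` loop, with `file_get_contents()` on the URL followed by `preg_match()`. The extraction logic itself is just a regular-expression match on the fetched HTML; a small sketch of it (function name assumed, shown in JavaScript for illustration):

```javascript
// Illustration only (function name assumed): pull the text of the first
// <title>…</title> pair out of a fetched HTML string. In PHP the same
// step would be file_get_contents() on the URL plus preg_match().
function extractTitle(html) {
  var m = /<title[^>]*>([\s\S]*?)<\/title>/i.exec(html);
  return m ? m[1].replace(/^\s+|\s+$/g, "") : "(no title)";
}
```

Doing this server-side means every `<div>` is already filled when the page reaches the browser, with no per-URL Ajax round trip.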
