Okay, let's say I open the URL http://www.nothing.com/ using PHP's fopen function. Then I do this:

$S = "sss";
$url = "http://www.nothing.com/" . $S . "&ql=1";
$open = fopen($url, "r");
while (!feof($open)) {
    echo fgets($open) . "<br />";
}
fclose($open);

That returns the whole page I'm looking at. My goal is to have some sort of PHP function that collects specific information from the page. For example, I want to crawl the page, find where it says "Name:", and then see what comes after it, e.g. "Name: Bob York". Then I want to store "Bob York" in a MySQL database. How can I do this?
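In outline, the kind of thing I'm after looks something like this (the regex, database name, table, and credentials are just placeholders I made up):

```php
<?php
// Pull the text that follows "Name:" out of a chunk of HTML.
function extractName(string $html): ?string {
    // Capture everything after "Name:" up to the next tag or line break.
    if (preg_match('/Name:\s*([^<\r\n]+)/i', $html, $m)) {
        return trim($m[1]);
    }
    return null;
}

// Fetch the page (file_get_contents is a shorter alternative to the
// fopen/fgets loop above), then extract and store the name.
// The URL, DSN, credentials, and table are all placeholders:
// $html = file_get_contents('http://www.nothing.com/sss&ql=1');
// $name = extractName($html);
// $pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
// $stmt = $pdo->prepare('INSERT INTO people (name) VALUES (?)');
// $stmt->execute([$name]);

// Quick check against a sample snippet:
var_dump(extractName('<p>Name: Bob York</p>')); // string(8) "Bob York"
```

A prepared statement is used for the insert so the scraped text can't break the SQL.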


Hey, thanks for responding, but how would I use those PHP functions? Can you give me a quick code example?



If your need is pretty simple, then you may get what you want the way that you are doing it.

Even for simple needs, and certainly when you need to access multiple pages, navigate from one to the next, and in some cases log in first, there are better tools.

cURL can be used for some of this, and I have used a class that helps with more complicated situations. Overall, the most effective tool I found, especially for complicated situations, is a Windows automation tool called AutoIt. It has Internet Explorer functions that let it automate navigation and extract fields. It is pretty easy to use and can do just about anything a human can do. It has an advantage over PHP tools in that it works through the browser, so it has no problem with secure (https) pages or pages generated by ASP. I could not get any of the PHP tools to process ASP pages.
