Hi dani-webbers...

I am endeavouring to build an application that pulls in data from multiple sources and compiles it into a single page, from which users can then click through to the item on the desired site.

Most of the sites I want to include provide an API, but one crucial one doesn't. They have no problem with me getting the data; they just don't have an API set up. So they are quite content for me to screen-scrape.

Using Firebug I have deciphered their HTML... now all I need to do is:

1) User opens MY homepage.
2) My homepage grabs the HTML from a specific URL.
3) My script (PHP, I expect) deciphers the HTML and pumps it out in my format (see the sketch after this list).
4) Script combines all the feeds.
5) Script dumps the combined HTML out to my homepage.
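
Roughly, here's how I picture steps 2 and 3 working in PHP (just a rough sketch; the URL and the XPath query are placeholders standing in for the structure I found with Firebug):

<?php
// Step 2: grab the raw HTML from the remote page.
$html = file_get_contents('http://example.com/listings'); // placeholder URL
if ($html === false) {
    die('Could not fetch the remote page.');
}

// Step 3: decipher the HTML and pump it out in my own format.
$doc = new DOMDocument();
libxml_use_internal_errors(true); // real-world HTML is rarely valid XML
$doc->loadHTML($html);
libxml_clear_errors();

$xpath = new DOMXPath($doc);
// Placeholder selector: assumes each item is a <div class="item"> wrapping an <a>.
foreach ($xpath->query('//div[@class="item"]/a') as $link) {
    echo '<li><a href="' . htmlspecialchars($link->getAttribute('href')) . '">'
       . htmlspecialchars(trim($link->textContent)) . '</a></li>' . "\n";
}
?>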

What I have never done is screen-scrape "on demand" / "dynamically" like this... can someone point me at the right tools for the job, please?

Cheers,
Todd

All 2 Replies

diafol:

I use cURL for this - see the PHP manual for an example. Use output buffering.
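
Something along these lines, perhaps (a rough sketch; the URL is a placeholder and the parsing is left to you):

<?php
// Buffer the page so the combined output is sent in one go.
ob_start();

$ch = curl_init('http://example.com/listings'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body as a string
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow any redirects
curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't let one slow feed hang the homepage
$html = curl_exec($ch);

if ($html === false) {
    echo '<p>Feed unavailable: ' . htmlspecialchars(curl_error($ch)) . '</p>';
} else {
    // ... parse $html and echo the items in your own format here ...
}
curl_close($ch);

echo ob_get_clean(); // dump the buffered HTML out to the page
?>

cURL works even on hosts where allow_url_fopen is disabled, which is why I reach for it over file_get_contents for this kind of job.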
