Hi There!

Our company became a reseller, but we didn't receive the product information in the usual way (CD, email, XML, etc.).
So we have to go to the supplier's website and download all of the product info (name, description, price, pictures) one by one. This could be fun with only 10-50 products, but I'm talking about 2,000-10,000 products.

So is there a way to solve this problem?


Thank You,

tmano

This is called screen scraping, and there are lots of useful techniques for it.
Use wget to mirror the whole site, then a couple of grep and sed scripts (or PHP with preg_match or other regular expressions) to extract the relevant fields into CSV. Then load the CSV into an all-encompassing MySQL staging table and normalize it from there.
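A rough sketch of the wget-plus-sed approach. The tag names here (`<h1 class="name">`, `<span class="price">`) are made-up placeholders: you'd have to inspect the supplier's actual HTML and adjust the patterns to match it.

```shell
# Step 1 (run once, against the real site): mirror the product pages.
# The URL below is a placeholder for the supplier's site.
#   wget --mirror --no-parent --accept html http://example.com/products/

# For demonstration, create a local file standing in for one downloaded page:
cat > product1.html <<'EOF'
<h1 class="name">Widget A</h1>
<span class="price">19.99</span>
EOF

# Step 2: extract name and price from every page into one CSV row each.
for f in *.html; do
  name=$(sed -n 's/.*<h1 class="name">\(.*\)<\/h1>.*/\1/p' "$f")
  price=$(sed -n 's/.*<span class="price">\(.*\)<\/span>.*/\1/p' "$f")
  echo "\"$name\",$price"
done > products.csv

# Step 3: bulk-load the CSV into a MySQL staging table
# (assumes a table raw_products(name, price) already exists):
#   mysql shop -e "LOAD DATA LOCAL INFILE 'products.csv' INTO TABLE raw_products
#                  FIELDS TERMINATED BY ',' ENCLOSED BY '\"';"
```

For thousands of pages this loop is slow but workable; if the site's HTML is messy or inconsistent, a real HTML parser beats regexes.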
