Hi!
Does anybody know of a script that can read feeds and store them as static files? Ideally it would also build an index of those files, so visitors to the website can find them easily, by date or name for example.

What I'm trying to do is create a sort of mirror of my website's content, both for slow browsers and as a place to redirect visitors if something happens to my Apache server (which sometimes freezes). My idea is to install a PHP script that reads my site's FeedBurner feeds and stores them in a folder; if any of the problems above occur, visitors get redirected to that folder, where they can at least see the site's content extracted from the feeds. I can't think of anything better right now, and this sounds good enough for me.
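Something along these lines is roughly what I'm imagining. It's only a rough sketch: the FeedBurner URL and the mirror folder are placeholders, and it assumes allow_url_fopen is enabled so simplexml_load_file() can fetch the feed over HTTP:

```php
<?php
// Rough sketch: fetch one RSS feed and save it as a dated static HTML file.
// The feed URL and output folder below are placeholders.
$feedUrl   = 'http://feeds.feedburner.com/example';
$outputDir = __DIR__ . '/mirror';

if (!is_dir($outputDir)) {
    mkdir($outputDir, 0755, true);
}

// Parse the feed (needs allow_url_fopen for remote URLs).
$rss = simplexml_load_file($feedUrl);
if ($rss === false) {
    exit("Could not read the feed.\n");
}

// Build one plain HTML page from the feed items.
$html = '<html><head><title>' . htmlspecialchars((string) $rss->channel->title) . '</title></head><body>';
foreach ($rss->channel->item as $item) {
    $html .= '<h2>' . htmlspecialchars((string) $item->title) . '</h2>';
    $html .= '<p><em>' . htmlspecialchars((string) $item->pubDate) . '</em></p>';
    $html .= '<div>' . (string) $item->description . '</div>';
}
$html .= '</body></html>';

// One snapshot per run, named by date (YYYY-MM-DD.html).
file_put_contents($outputDir . '/' . date('Y-m-d') . '.html', $html);
```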

I appreciate any recommendations!
Thank you!

All 3 Replies

Wget is not what I need. I want to store the feeds (mostly FeedBurner ones) as static HTML files. The script should read the feed, create an HTML file from it with a distinct name (dates are fine), and store it somewhere. Then, if Apache is down (that part I can handle), visitors should be taken to a page where they can browse through these HTML files. It would be a basic version of the main website; better than nothing, anyway, when the Apache server is down.
The script should also run on its own (a cron job would be great) to retrieve the feeds daily or weekly and create those static files.
I've found a script called RSS extender on SourceForge that can read feeds and store them, but unfortunately it cannot create a separate file and keep it.
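For the browsing side, I imagine the index could be rebuilt just as simply after each fetch. This sketch assumes the same mirror folder and date-based file names as the fetch sketch above:

```php
<?php
// Rough sketch: rebuild index.html so visitors can browse the stored snapshots.
// Assumes the same ./mirror folder and YYYY-MM-DD.html names as the fetch script.
$outputDir = __DIR__ . '/mirror';

$files = glob($outputDir . '/*.html');
rsort($files);  // date-based names sort newest first this way

$html = '<html><head><title>Site mirror</title></head><body><h1>Feed snapshots</h1><ul>';
foreach ($files as $file) {
    $name = basename($file);
    if ($name === 'index.html') {
        continue;  // skip the index itself
    }
    $html .= '<li><a href="' . htmlspecialchars($name) . '">' . htmlspecialchars($name) . '</a></li>';
}
$html .= '</ul></body></html>';

file_put_contents($outputDir . '/index.html', $html);
```

Both scripts could then be run from a daily cron job with the PHP command-line binary, so the mirror refreshes itself without any manual work.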
