Hello all,

Using Python, I want to pick up the news titles on http://www.boston.com/yourtown/ and save them into a .txt file.

I have two questions:
1. How can I have the code run every 2 hours?
2. How can I have the output written to a .txt file?

Thanks.

from bs4 import BeautifulSoup
import urllib2

url = "http://www.boston.com/yourtown/"
page = urllib2.urlopen(url)
soup = BeautifulSoup(page.read())

news_titles = soup.find_all('h4')

for nts in news_titles:
    for n in nts("a"):
        print n.renderContents()

All 3 Replies

Please search for cron jobs in Python. Cron jobs are tasks to be performed at specified times. That will solve your problem.
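As a sketch, assuming the script above is saved as, say, scraper.py (the filename and paths here are just examples), a crontab entry that runs it every 2 hours and appends the printed titles to a text file could look like this:

```shell
# crontab entry: run at minute 0 of every 2nd hour,
# appending stdout (the printed titles) to a .txt file
0 */2 * * * /usr/bin/python /home/user/scraper.py >> /home/user/titles.txt
```

Note that the output redirection here also answers the second question, since whatever the script prints ends up in the file.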

Thanks for the reply.

Python has a scheduler module (`sched`), which is an alternative to OS schedulers like cron or the Windows Task Scheduler.
I have used it a few times and it works well, sleeping in the background and doing work only when it should.
One positive aspect is that it is cross-platform.

So this will run every 2 hours.

from bs4 import BeautifulSoup
import urllib2
import sched, time

def do_event():
    url = "http://www.boston.com/yourtown/"
    page = urllib2.urlopen(url)
    soup = BeautifulSoup(page.read())
    news_titles = soup.find_all('h4')
    for nts in news_titles:
        for n in nts("a"):
            print n.renderContents()
    # Reschedule this function to run again in 2 hours (7200 seconds)
    s.enter(7200, 1, do_event, ())

s = sched.scheduler(time.time, time.sleep)
s.enter(7200, 1, do_event, ())  # first run fires after 2 hours
s.run()
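On the second question (writing the output to a .txt file): a minimal sketch, assuming you collect the titles into a list instead of printing them; the helper name `save_titles` and the filename `titles.txt` are just examples:

```python
# Write one title per line to a text file instead of printing.
def save_titles(titles, path):
    with open(path, "w") as f:
        for title in titles:
            f.write(title + "\n")

# Inside do_event() you could collect the titles like this:
# titles = [n.renderContents() for nts in news_titles for n in nts("a")]
# save_titles(titles, "titles.txt")
```

Opening the file with mode "a" instead of "w" would append each run's titles rather than overwrite the file every 2 hours.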