
Hey guys,

I am making a program called 'Weather Watch' which basically gets weather updates for any city you type in.

For now, it only gets info for one particular city. I don't know how to search www.weather.com for the city the user types in and then pull the updates from the results. The code so far:

import urllib2 as url
import os
import time

os.system('cls')    # clear the console (Windows only)

print("[Content provided by The Weather Channel]")
time.sleep(3)
os.system('cls')

print("Please wait...this may take a few seconds.")
time.sleep(3)
os.system('cls')

condition = True

try:
    url_open = url.urlopen("http://www.weather.com")
    # grab the slice of lines that currently holds the Bangalore data
    # (fragile: breaks as soon as the page layout changes)
    lines = url_open.readlines()[1192:1199]
except IOError:
    condition = False

if condition:
    print("Weather forecast for Bangalore, INDIA\n")
    for x in lines:
        # split each line on the '>' that closes the HTML tags
        onsplit = x.split(">")

        tags = onsplit[2][:-5]    # the label, e.g. "Pressure:"

        if tags == "Pressure:":
            data = onsplit[4][:5]
        elif tags == "Dew Point:":
            data = onsplit[4][:2]
        else:
            data = onsplit[4][:-5]

        print(tags + " " + data)
    raw_input("\n<Any key to quit>")

This is what I've learned about searching for a term on a site (Google, in this case):

import urllib2 as url2
import urllib as url

# form data to send with the request
values = {"search": "city"}

data = url.urlencode(values)

# passing data to Request turns this into a POST request
req = url2.Request("http://www.google.com", data)
res = url2.urlopen(req)

print(res.read())
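For what it's worth, search forms like Google's are usually GET requests, so the encoded parameters go into the URL after a '?' rather than into a POST body. A minimal sketch of that pattern (the parameter name 'q' is Google's; weather.com's own search parameter would have to be read off the URL your browser shows after running a search there):

import urllib
import urllib2

# for a GET-style search the encoded parameters become part of the URL;
# 'q' is Google's query parameter -- other sites use their own names
params = urllib.urlencode({"q": "Bangalore weather"})
request_url = "http://www.google.com/search?" + params

req = urllib2.Request(request_url)
# some sites refuse requests that lack a browser-like User-Agent header
req.add_header("User-Agent", "Mozilla/5.0")
page = urllib2.urlopen(req).read()
print(len(page))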


foosion, I did not understand you fully. In what format is the URL formed? If I can get the URL right for any city, I can take care of 'getting' the data by slicing and so on. So, how is the URL formed?

It depends on which service you use and what info you want.
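For example, weather.com's local-forecast pages just append a location code to a fixed path (the same pattern the example further down uses). A small sketch; the Bangalore location code mentioned in the comment is an assumption you'd confirm from the address bar after searching the site:

import urllib2

def forecast_url(location_code):
    """Build the weather.com local-forecast URL from a US zip code or a
    weather.com location code (Bangalore's is something like 'INXX0012' --
    an assumption; check the URL in your browser after a search there)."""
    return "http://www.weather.com/weather/local/" + location_code

html = urllib2.urlopen(forecast_url("91201")).read()
print(len(html))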

Here's an example for Google (and Yahoo and NOAA) weather. It was the first hit when I searched Google for "python weather google": http://code.google.com/p/python-weather-api/
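If you go the pywapi route, usage is roughly along these lines (sketched from memory of the project's documented helpers; verify the exact function names and the keys of the returned dictionary against the project page):

import pywapi  # from http://code.google.com/p/python-weather-api/

# helper name as documented on the project page at the time;
# verify it there -- newer releases may differ
weather = pywapi.get_weather_from_yahoo('91201')
print(weather)  # a nested dict; pull out whichever keys you need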

Following the thinking of sneekula at:
http://www.daniweb.com/forums/post892908.html#post892908
You can do some detective work and pull weather data out of the www.weather.com HTML code, as my little example shows ...

# given the zip code, extract the weather conditions of the area
# tested with Python25

import urllib2
import time

def extract(text, sub1, sub2):
    """
    extract a substring from text between the first
    occurrences of substrings sub1 and sub2
    """
    return text.split(sub1, 1)[-1].split(sub2, 1)[0]


zipcode = '91201'
url_str = 'http://www.weather.com/weather/local/' + zipcode
try:
    fin = urllib2.urlopen(url_str)
    html = fin.readlines()
    fin.close()
except IOError:
    print( 'Cannot open URL %s for reading' % url_str )
    html = False
  
if html:
    for line in html:
        #print( line )  # test
        if line.startswith('OAS_spoof'):
            location = line
        if line.startswith('OAS_query'):
            weather = line

    #print( location )  # test
    #print( weather )  # test

    location_list = location.split('/')
    #print( location_list )  # test

    town = location_list[9].capitalize()
    state = location_list[7].capitalize()
    zip_code = location_list[10][:5]  # avoid shadowing the zip() builtin

    print( time.strftime("%A, %d%b%Y at %H:%M hours", time.localtime()) )
    print( "Lovely %s, %s  %s" % (town, state, zip_code) )

    temp_now = extract(weather, 'temp=', '&')
    cond_now = extract(weather, 'cond=', '&')
    temp_high = extract(weather, 'temph1=', '&')
    temp_low = extract(weather, 'templ1=', '&')

    sf = "is %s with %sF (low=%sF and high=%sF)"
    print( sf % (cond_now, temp_now, temp_low, temp_high) ) 

"""my result -->
Tuesday, 08Sep2009 at 13:15 hours
Lovely Glendale, Ca  91201
is clear_sunny with 81F (low=62F and high=84F)
"""

I did something very similar for my father, who happens to be a weather nut. I went online, found the webpage for our zip code, and used urllib to fetch the page every 15 minutes or so. Then I just parsed the HTML for common things like temperature, dew point, and cloud coverage. It was pretty simple once I knew what to look for. My solution worked fine for me, but if you plan on sharing this program with anyone beyond yourself or maybe a family member, I would recommend using the APIs, as they will probably be more reliable. If The Weather Channel were to change their webpage design, my little program could become useless.

--EAnder
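A bare-bones sketch of that polling loop (the URL, the 15-minute interval, and the 'temp=' marker are all placeholders; your own page will have its own markers that you find by inspecting its source):

import urllib2
import time

URL = "http://www.weather.com/weather/local/91201"  # placeholder page
INTERVAL = 15 * 60  # seconds between checks

def extract(text, sub1, sub2):
    # text between the first sub1 and the following sub2
    return text.split(sub1, 1)[-1].split(sub2, 1)[0]

while True:
    try:
        html = urllib2.urlopen(URL).read()
    except IOError:
        print("fetch failed, will retry")
    else:
        # 'temp=' stands in for whatever marker your page actually uses
        print("temperature now: " + extract(html, 'temp=', '&'))
    time.sleep(INTERVAL)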
