krazineurons 0 Newbie Poster

Hi friends, I have an idea for writing a subtitle crawler for the movies stored on my hard disk. However, I am on GPRS and my connection is very slow, so I had this idea: I write a web application or a script and host it on a web server on the internet, keeping in mind that a web server is always connected and runs 24x7 on a high-speed link. I load my application with the list of files it has to crawl, it does all the work on the server, and then I just download what I need from it. That way I save all the bandwidth that would otherwise go into connecting, crawling, and everything else. My question is: is this idea logically feasible?
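To make the idea concrete, here is a minimal sketch of what the server-side part might start with, assuming you upload a plain-text list of your movie filenames (the file name `movies.txt` and both helper functions are my own illustration, not an existing tool). It only turns release-style filenames into clean search queries; the actual crawling of a subtitle site would hang off `build_queries`.

```python
import os
import re

def clean_title(filename):
    """Turn a release-style filename into a search query,
    e.g. 'The.Matrix.1999.720p.x264.mkv' -> 'The Matrix 1999'."""
    name = os.path.splitext(filename)[0]          # drop the extension
    name = re.sub(r'[._]+', ' ', name)            # dots/underscores -> spaces
    # cut everything after a common release tag (quality, codec, rip source)
    name = re.split(r'\b(720p|1080p|x264|x265|BluRay|DVDRip|WEBRip)\b',
                    name, flags=re.IGNORECASE)[0]
    return name.strip()

def build_queries(list_file):
    """Read one movie filename per line, return cleaned search queries."""
    with open(list_file) as f:
        return [clean_title(line.strip()) for line in f if line.strip()]
```

The server script would then loop over these queries, fetch the matching subtitle files from whatever site you crawl, and bundle them into one archive for you to download in a single transfer, which is where the bandwidth saving comes from.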