How do I retrieve the content of a website in Perl?
That is, given a URL, say google.com, I want a function get so that
$text = get($url);
leaves $text containing the content of google.com/index.htm.

You could use the wget Unix utility in combination with Perl to do this. If you're on Windows and don't have it, there is a port here: http://pages.interlog.com/~tcharron/wgetwin.html. Just call it with system and it will save the content of a web page to a file on your machine; then you can do whatever you like with it in Perl. I hope this helps.
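As a minimal sketch of that approach, assuming wget is on your PATH (the URL and output filename here are just examples):

```perl
use strict;
use warnings;

my $url  = 'http://www.google.com/';  # example URL
my $file = 'page.html';               # example output filename

# -q: quiet, -O: write the page to the named file
system('wget', '-q', '-O', $file, $url) == 0
    or die "wget failed with status $?";

# Slurp the saved file into a string
open my $fh, '<', $file or die "can't open $file: $!";
my $text = do { local $/; <$fh> };
close $fh;

print length($text), " bytes fetched\n";
```

Note that system returns 0 on success, so the `== 0` check is how you detect a failed download.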

Steven.

Just use the Perl WWW::Mechanize module.

Google for "WWW::Mechanize perl" and it will take you to CPAN, where you can download the WWW::Mechanize module. Install it (for example, unzip it under your cgi-bin) and use it
like

$mech->get($url);
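For completeness, a minimal sketch of how that call fits together, assuming the module is installed (the URL is just an example):

```perl
use strict;
use warnings;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new();
my $url  = 'http://www.google.com/';  # example URL

# Fetch the page; Mechanize dies on failure by default
$mech->get($url);

# The page body is available as a string via content()
my $text = $mech->content;
print length($text), " bytes fetched\n";
```

get() performs the request, and content() gives you the page text as a string, which matches the $text = get($url) behaviour the question asks for.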

Hope this helps.