Hi,

I have installed WWW-Mechanize-1.60 on a Linux platform. The problem is that I am getting this error.

Error GETing http://www.example.com:Can't connect to eutils.ncbi.nlm.nih.gov:80 (connect: timeout)

Here is my code.

#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

my $url = "http://www.example.com";
my $mechanize = WWW::Mechanize->new(autocheck => 1);
$mechanize->get($url);
my $page = $mechanize->content;

How can I solve this error?

I get this error when I try to fetch larger amounts of data.

Is this method suitable for downloading large amounts of data?

Are there other methods for downloading large amounts of data?
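Not an expert here, but a "connect: timeout" means the TCP connection never opened, so the size of the download usually isn't the cause. Two things worth trying: raise the `timeout` option (WWW::Mechanize inherits it from LWP::UserAgent) and, for large downloads, stream the body to disk with `:content_file` instead of holding it all in memory. This is just a sketch; the URL, timeout value, and filename below are placeholders:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new(
    autocheck => 0,     # check errors ourselves instead of dying
    timeout   => 60,    # seconds; option inherited from LWP::UserAgent
);

my $url = "http://eutils.ncbi.nlm.nih.gov/";   # placeholder URL

# Stream the response straight to a file rather than keeping the
# whole page in memory via $mech->content.
$mech->get($url, ':content_file' => 'page.html');

if ($mech->success) {
    print "Saved ", -s 'page.html', " bytes to page.html\n";
}
else {
    print "GET failed: ", $mech->status, " ", $mech->response->message, "\n";
}
```

If the timeout persists no matter what value you set, check whether your network requires an HTTP proxy; calling `$mech->env_proxy;` before the `get` will pick up the `http_proxy` environment variable.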

Regards
Vanditha

I ran your script and didn't get any error. I added some print statements to see the content, and added another $mechanize->get call to fetch a different page. Both pages downloaded in about a second, and the output looks complete to me.

Is it working for you now? Perhaps the servers or your internet connection were slow when you tested it, or maybe a firewall is blocking your access to one of those sites?

I don't have any expertise with this module; I'm just saying your script works OK for me. (My platform is Win/DOS, but I don't see why it wouldn't work on Linux.)

#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

my $url = "http://eutils.ncbi.nlm.nih.gov/";
my $mechanize = WWW::Mechanize->new(autocheck => 1);
$mechanize->get($url);
my $page = $mechanize->content;
print $page;            # Output looked complete to me... no error
print '*' x 75, "\n";   # Row of asterisks to separate from the next page
$mechanize->get("http://www.example.com");
print $mechanize->content;   # Output looked complete to me... no error