
WWW::Mechanize timeout - all urls timing out

I am having a problem using WWW::Mechanize. It seems no matter what website I try accessing, my script just sits there in the command prompt until it times out. The only things that come to mind that might be relevant are the following:

  • I have IE7, Chrome, and FF installed. FF was my default browser, but I recently switched that to Chrome.
  • I seem to be able to access websites on port 8080 just fine.
  • I recently experimented with the cookie jar but stopped using it because, honestly, I'm not sure how it works. That experiment may have changed something (a minimal cookie-jar sketch is shown after this list).
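
For reference, the cookie jar in WWW::Mechanize is just an HTTP::Cookies object inherited from LWP::UserAgent; by default Mechanize keeps an in-memory jar, so an explicit one is only needed to persist cookies between runs. A minimal sketch (the file name is a placeholder, not from the question):

#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;
use HTTP::Cookies;

# Hypothetical cookie file; autosave writes cookies back to disk on exit.
my $jar = HTTP::Cookies->new(
    file     => 'cookies.txt',
    autosave => 1,
);

# Pass the jar to the constructor; omit cookie_jar entirely to use the
# default in-memory jar.
my $mech = WWW::Mechanize->new( cookie_jar => $jar );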

Here is an example:

#!/usr/bin/perl -w
use strict;
use WWW::Mechanize;

my $url = 'http://docstore.mik.ua/orelly/perl/learn/';

my $mech = WWW::Mechanize->new();

$mech->get( $url );

print $mech->content;

The code itself works, so this is most likely a firewall or proxy issue on your machine. You can try setting a proxy:

   $mech->proxy(['http', 'ftp'], 'http://your-proxy:8080/');
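
If a proxy is the culprit, a quick way to confirm is to shorten the timeout and let Mechanize pick up the standard proxy environment variables. A diagnostic sketch (the 15-second timeout and proxy URL are placeholders, not values from the question):

#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

# Fail fast instead of hanging; autocheck => 0 lets us inspect the
# response instead of dying on an HTTP error.
my $mech = WWW::Mechanize->new( timeout => 15, autocheck => 0 );

# Either set an explicit proxy ...
# $mech->proxy( ['http', 'ftp'], 'http://your-proxy:8080/' );
# ... or pick up http_proxy/ftp_proxy from the environment.
$mech->env_proxy();

my $response = $mech->get('http://docstore.mik.ua/orelly/perl/learn/');
print $response->is_success
    ? $mech->content
    : 'Request failed: ' . $response->status_line . "\n";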
