
WWW::Mechanize vs Curl


Background: I need to get updated data for all of my users. The data resides on a secure site, so the script needs to log in (using cookies), traverse some inner URLs, and then fetch the required data.

Tools: WWW::Mechanize or Curl

What is the best tool for my needs? Performance is a big issue: I need to get the updated data as fast as possible, because I have to fetch updated data for a lot of users.

Is it possible to fire up multiple requests using the WWW::Mechanize library?

Update:

I got it running using Curl, but I was thinking I could speed it up using Mechanize. Which library performs better for HTTP requests? Are there any statistics? Right now I am using Curl with the multi interface.

WWW::Mechanize is a Perl module, so you can use the full power of the language with it, e.g. fork multiple processes.
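A minimal sketch of that fork approach: spawn one child process per user so several fetches can run concurrently, then wait for them all in the parent. The actual WWW::Mechanize work is stubbed out as comments here (the login URL, form fields, and user list are illustrative assumptions, not details from the original post).

```perl
#!/usr/bin/env perl
# Sketch: parallelize per-user fetches by forking one child per user.
use strict;
use warnings;

my @users = qw(alice bob carol);   # hypothetical user list

my @pids;
for my $user (@users) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: each gets its own independent session (own cookie jar).
        # my $mech = WWW::Mechanize->new( cookie_jar => {} );
        # $mech->get('https://example.com/login');      # hypothetical URL
        # $mech->submit_form( with_fields => { user => ..., pass => ... } );
        # ... traverse inner URLs and fetch the data for $user ...
        print "fetched data for $user\n";
        exit 0;
    }
    push @pids, $pid;   # parent records the child and spawns the next one
}

# Parent: wait for every child to finish before reporting.
waitpid($_, 0) for @pids;
print scalar(@users), " users updated\n";
```

With N children running, total wall-clock time is roughly that of the slowest single fetch rather than the sum of all of them; for large user counts you would cap the number of concurrent children (e.g. with Parallel::ForkManager from CPAN) to avoid hammering the remote site.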

