
Ways to speed up phpseclib sftp get to download remote file

So I use phpseclib, which downloads a 50MB file over SFTP in roughly 45 seconds. That is fast compared to ssh2_scp_recv(), which takes 90+ seconds, but slow compared to my SFTP client (FileZilla), which takes 10 seconds at most.

My question is: what can I do to speed up file downloads through SFTP, other than enabling the mcrypt, gmp and bcmath extensions, which I've already done?

I'm running PHP 5.5 on Windows 7, and got the same results whether using the CLI or the browser/Apache, and whether using $sftp->get to download the file as a whole or downloading it in chunks of various sizes.

Source:

set_include_path(get_include_path() . PATH_SEPARATOR . 'phpseclib');
require 'phpseclib/Net/SFTP.php';

$sftp = new Net_SFTP($host, $port, $timeout);
if (!$sftp->login($user, $password)) {
    exit('Login failed');
}

$sftp->get($remoteFile, $localFile);

With the SFTP protocol, a client (client library) repeatedly issues "READ" requests to fetch chunks of the file contents.

A naive implementation, which phpseclib uses, sends one "READ" request (for up to 32 kB), waits for the "DATA" response, sends another "READ" request, waits, and so on, until it has the whole file.

If the round trip to/from the server is long (high latency), the client (library) may spend most of the time uselessly waiting.

Smart clients (libraries) overcome this by sending multiple "READ" requests without waiting for the responses, by using larger "READ" requests, or both.

FileZilla, for instance, sends a sequence of 32 kB "READ" requests, keeping up to 1 MB worth of data outstanding at a time.
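To see why latency dominates here, a rough back-of-the-envelope model (the 25 ms round-trip time below is an assumed figure, not measured from the question):

```php
<?php
// Model of strictly sequential 32 kB READs over a high-latency link.
// The 25 ms RTT is an assumption for illustration only.
$fileSize   = 50 * 1024 * 1024; // 50 MB in bytes
$chunkSize  = 32 * 1024;        // the 32 kB READ size mentioned above
$rttSeconds = 0.025;            // assumed round-trip time

$roundTrips = (int) ceil($fileSize / $chunkSize);
$waitTime   = $roundTrips * $rttSeconds;

echo "$roundTrips round trips, ~{$waitTime}s spent on latency alone\n";
```

With those numbers, 1600 sequential round trips account for about 40 seconds of pure waiting, which is in the same ballpark as the 45 seconds observed. A client that keeps 1 MB (32 requests) in flight cuts that waiting time by roughly a factor of 32.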

phpseclib does not support this optimization for downloads (note that it does for uploads, though).

All you can do is increase the size of each "READ" request, using Net_SFTP::max_sftp_packet.
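A minimal sketch of that tweak, assuming max_sftp_packet is accessible on the phpseclib version in use (it is an internal property, so check your copy of Net/SFTP.php before relying on it; the server may also cap the READ size it actually honors):

```php
<?php
set_include_path(get_include_path() . PATH_SEPARATOR . 'phpseclib');
require 'phpseclib/Net/SFTP.php';

$sftp = new Net_SFTP($host, $port, $timeout);
if (!$sftp->login($user, $password)) {
    exit('Login failed');
}

// Ask for 1 MB per READ instead of the default 32 kB.
// Fewer round trips per file, at the cost of larger responses;
// whether the server honors the full size is up to the server.
$sftp->max_sftp_packet = 1024 * 1024;

$sftp->get($remoteFile, $localFile);
```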
