
Why is there a limit in the concurrent number of downloads?

I am trying to make my own simple web crawler. I want to download files with specific extensions from a URL. I have the following code written:

    private void button1_Click(object sender, RoutedEventArgs e)
    {
        if (bw.IsBusy) return;
        // Note: this re-subscribes bw_DoWork on every click after a completed
        // run; ideally subscribe once, e.g. in the constructor.
        bw.DoWork += new DoWorkEventHandler(bw_DoWork);
        bw.RunWorkerAsync(new string[] { URL.Text, SavePath.Text, Filter.Text });
    }
    //--------------------------------------------------------------------------------------------
    void bw_DoWork(object sender, DoWorkEventArgs e)
    {
        try
        {
            // Note: this returns false (and does nothing) if 4 is below the
            // processor count, and it does not cap Parallel.ForEach anyway.
            ThreadPool.SetMaxThreads(4, 4);
            string[] strs = e.Argument as string[];
            Regex reg = new Regex("<a(\\s*[^>]*?){0,1}\\s*href\\s*\\=\\s*\\\"([^>]*?)\\\"\\s*[^>]*>(.*?)</a>", RegexOptions.Compiled | RegexOptions.CultureInvariant | RegexOptions.IgnoreCase);
            int i = 0;
            string domainS = strs[0];
            string Extensions = strs[2];
            string OutDir = strs[1];
            var domain = new Uri(domainS);
            string[] Filters = Extensions.Split(new char[] { ';', ',', ' ' }, StringSplitOptions.RemoveEmptyEntries);
            string outPath = System.IO.Path.Combine(OutDir, string.Format("File_{0}.html", i));

            WebClient webClient = new WebClient();
            string str = webClient.DownloadString(domainS);
            str = str.Replace("\r\n", " ").Replace('\n', ' ');
            MatchCollection mc = reg.Matches(str);
            int NumOfThreads = mc.Count;

            Parallel.ForEach(mc.Cast<Match>(), new ParallelOptions { MaxDegreeOfParallelism = 2 },
            mat =>
            {
                string val = mat.Groups[2].Value;
                var link = new Uri(domain, val);
                foreach (string ext in Filters)
                    if (val.EndsWith("." + ext))
                    {
                        Download(new object[] { OutDir, link });
                        break;
                    }
            });
            // Deliberately thrown so that ReportException below signals completion.
            throw new Exception("Finished !");

        }
        catch (System.Exception ex)
        {
            ReportException(ex);
        }
    }
    //--------------------------------------------------------------------------------------------
    private static void Download(object o)
    {
        try
        {
            object[] objs = o as object[];
            Uri link = (Uri)objs[1];
            string outPath = System.IO.Path.Combine((string)objs[0], System.IO.Path.GetFileName(link.ToString()));
            if (!File.Exists(outPath))
            {
                //WebClient webClient = new WebClient();
                //webClient.DownloadFile(link, outPath);

                DownloadFile(link.ToString(), outPath);
            }
        }
        catch (System.Exception ex)
        {
            ReportException(ex);
        }
    }
    //--------------------------------------------------------------------------------------------
    private static bool DownloadFile(string url, string filePath)
    {
        try
        {
            HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(url);
            request.UserAgent = "Web Crawler";
            request.Timeout = 40000;
            // Dispose the response and stream; otherwise the connection is not
            // returned to the pool, which can exhaust the per-host connection
            // limit and produce exactly these timeouts.
            using (WebResponse response = request.GetResponse())
            using (Stream stream = response.GetResponseStream())
            using (FileStream fs = new FileStream(filePath, FileMode.CreateNew))
            {
                const int siz = 1000;
                byte[] bytes = new byte[siz];
                for (; ; )
                {
                    int count = stream.Read(bytes, 0, siz);
                    fs.Write(bytes, 0, count);
                    if (count == 0) break;
                }
                fs.Flush();
            }
        }
        catch (System.Exception ex)
        {
            ReportException(ex);
            return false;
        }
        finally
        {

        }
        return true;
    }

The problem is that while it works fine for 2 parallel downloads:

        new ParallelOptions { MaxDegreeOfParallelism = 2 }

...it doesn't work for greater degrees of parallelism, like:

        new ParallelOptions { MaxDegreeOfParallelism = 5 }

...and I get connection timeout exceptions.

At first I thought it was because of WebClient:

                //WebClient webClient = new WebClient();
                //webClient.DownloadFile(link, outPath);

...but when I replaced it with the DownloadFile function that uses HttpWebRequest, I still got the error.

I have tested it on many web pages and nothing changed. I have also confirmed with the Chrome extension "Download Master" that these web servers allow multiple parallel downloads. Does anyone have any idea why I get timeout exceptions when trying to download many files in parallel?

You need to assign ServicePointManager.DefaultConnectionLimit. The default number of concurrent connections to the same host is 2. Also see the related SO post on using web.config connectionManagement.
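
A minimal sketch of the fix, assuming the limit is raised once at startup before any request is created (the value 10 here is an arbitrary example; match it to your MaxDegreeOfParallelism):

    // Raise the global per-host connection limit before the first request is issued.
    ServicePointManager.DefaultConnectionLimit = 10;   // arbitrary example value

    // Alternatively, raise it for one specific host only
    // (example.com is a placeholder):
    ServicePoint sp = ServicePointManager.FindServicePoint(new Uri("http://example.com"));
    sp.ConnectionLimit = 10;

The web.config equivalent is the connectionManagement element under system.net, with an add entry specifying an address (or "*") and a maxconnection value.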

As far as I know, IIS will limit the total number of connections in and out; however, that number should be on the order of 10^3, not ~5.

Is it possible you are testing against the same URL? I know a lot of web servers limit the number of simultaneous connections from a single client. For example: are you testing by trying to download 10 copies of http://www.google.com?

If so, you might want to try testing with a list of different sites, so that no single host sees more than one or two of your connections; a sketch follows.
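
A minimal diagnostic sketch, assuming hypothetical placeholder URLs (substitute any set of unrelated hosts) and reusing the question's DownloadFile helper:

    // Hypothetical list of unrelated hosts, so each server sees only one
    // connection; substitute real URLs of your choice.
    string[] testUrls =
    {
        "http://www.example.com/",
        "http://www.example.org/",
        "http://www.example.net/",
    };

    // C:\temp is a placeholder output directory.
    Parallel.ForEach(testUrls, new ParallelOptions { MaxDegreeOfParallelism = 5 },
        url => DownloadFile(url, System.IO.Path.Combine(@"C:\temp",
                                 new Uri(url).Host + ".html")));

If downloads from distinct hosts succeed at MaxDegreeOfParallelism = 5 while downloads from a single host time out, the bottleneck is a per-host limit rather than your code.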
