I have the following requirement:
There is an array of URLs that should be downloaded.
How do I do this in C# 5.0? I tried the following:
class Program
{
    static Stopwatch sw = Stopwatch.StartNew();

    static void Main(string[] args)
    {
        List<Task> tasks = new List<Task>();
        string[] urls = new string[] { "http://site1.ru", "http://www.site2.com", "http://site3.com", "http://site4.ru" };
        foreach (var url in urls)
        {
            var task = AsyncVersion(url);
            tasks.Add(task);
        }
        Task.WaitAll(tasks.ToArray());
    }

    static async Task AsyncVersion(string url)
    {
        var webRequest = WebRequest.Create(url);
        Console.WriteLine(
            "Before calling webRequest.GetResponseAsync(). Thread Id: {0}, Url: {1}",
            Thread.CurrentThread.ManagedThreadId, url);
        var webResponse = await webRequest.GetResponseAsync();
        Console.WriteLine("{0} : {1}, elapsed {2}ms. Thread Id: {3}", url,
            webResponse.ContentLength, sw.ElapsedMilliseconds,
            Thread.CurrentThread.ManagedThreadId);
    }
}
There are parts of this I don't understand.
This looks like an ideal job for Parallel.ForEach(). You can set the concurrency limit via a parameter, and then use the WebRequest.Timeout property to bail out after waiting too long for a response. Something like this:
Parallel.ForEach(
    urls,
    new ParallelOptions { MaxDegreeOfParallelism = 3 },
    url =>
    {
        try
        {
            var request = WebRequest.Create(url);
            request.Timeout = 10000; // milliseconds
            using (var response = request.GetResponse())
            {
                // handle response
            }
        }
        catch (WebException x)
        {
            // timeout or some other problem with the request
        }
        catch (Exception x)
        {
            // make sure this Action doesn't ever let an exception
            // escape, as that would stop the whole ForEach loop
        }
    });
The call to Parallel.ForEach() will block the calling thread until all the urls have been processed. It will, however, use up to MaxDegreeOfParallelism threads to run the work.
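Since the question specifically targets C# 5.0's async/await, it's worth noting that Parallel.ForEach ties up a thread for the duration of each blocking GetResponse() call. A non-blocking alternative is to keep the async version from the question and throttle it with SemaphoreSlim. This is a minimal sketch, assuming a limit of 3 concurrent downloads (the limit and the example URLs are placeholders, not from the original answer):

```csharp
using System;
using System.Linq;
using System.Net;
using System.Threading;
using System.Threading.Tasks;

public class Throttled
{
    // At most 3 downloads in flight at once; adjust the limit as needed.
    static readonly SemaphoreSlim gate = new SemaphoreSlim(3);

    public static async Task DownloadAsync(string url)
    {
        await gate.WaitAsync();
        try
        {
            var request = WebRequest.Create(url);
            using (var response = await request.GetResponseAsync())
            {
                Console.WriteLine("{0} : {1} bytes", url, response.ContentLength);
            }
        }
        catch (WebException ex)
        {
            // Swallow per-URL failures so one bad URL doesn't fault the whole batch.
            Console.WriteLine("{0} failed: {1}", url, ex.Status);
        }
        finally
        {
            gate.Release();
        }
    }

    public static void Main()
    {
        string[] urls = { "http://site1.ru", "http://www.site2.com", "http://site3.com", "http://site4.ru" };
        // Start all tasks; the semaphore, not the scheduler, limits concurrency.
        Task.WaitAll(urls.Select(DownloadAsync).ToArray());
    }
}
```

One caveat: WebRequest.Timeout applies only to the synchronous GetResponse() path; for the async path you would need to implement a timeout yourself (for example with a cancellation token), so the Parallel.ForEach answer's timeout trick does not carry over directly.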