I wrote some code to check URLs, but it runs really slowly. I'd like to run it against several URLs at once, say 10 at a time, or at least make it as fast as possible.
My code:
Parallel.ForEach(urls, new ParallelOptions
{
    MaxDegreeOfParallelism = 10
}, s =>
{
    try
    {
        using (HttpRequest httpRequest = new HttpRequest())
        {
            httpRequest.UserAgent = "Mozilla/5.0 (Windows NT 10.0; WOW64; rv:52.0) Gecko/20100101 Firefox/52.0";
            httpRequest.Cookies = new CookieDictionary(false);
            httpRequest.ConnectTimeout = 10000;
            httpRequest.ReadWriteTimeout = 10000;
            httpRequest.KeepAlive = true;
            httpRequest.IgnoreProtocolErrors = true;

            string check = httpRequest.Get(s + "'", null).ToString();
            if (errors.Any(new Func<string, bool>(check.Contains)))
            {
                Valid.Add(s);
                Console.WriteLine(s);
                File.WriteAllLines(Environment.CurrentDirectory + "/Good.txt", Valid);
            }
        }
    }
    catch
    {
    }
});
Your service calls are unlikely to be CPU-bound, so throwing more threads at the workload probably isn't the best approach. You'll get better throughput with async/await, and, if you can, by using the more modern HttpClient instead of HttpRequest or HttpWebRequest.
Here's an example of how to do that:
var client = new HttpClient();

// Start with a list of URLs
var urls = new string[]
{
    "http://www.google.com",
    "http://www.bing.com"
};

// Start requests for all of them.
// ToList() materializes the query so the requests are started exactly once;
// without it, every enumeration of the LINQ query would issue new requests.
var requests = urls.Select(url => client.GetAsync(url)).ToList();

// Wait for all the requests to finish and collect the responses
var responses = await Task.WhenAll(requests);

// Read each response body
foreach (var r in responses)
{
    var s = await r.Content.ReadAsStringAsync();
    Console.WriteLine(s);
}
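For completeness, here is one way the same idea could be applied to the original checking loop while still capping concurrency at about 10 requests in flight, using a SemaphoreSlim as a throttle. This is only a sketch: urls.txt, the errors markers, and the Good.txt output path are placeholders standing in for whatever your actual urls, errors, and Valid collections contain.

using System;
using System.Collections.Concurrent;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class Checker
{
    static readonly HttpClient client = new HttpClient();

    static async Task Main()
    {
        var urls = File.ReadAllLines("urls.txt");              // placeholder input
        var errors = new[] { "error marker 1", "error marker 2" }; // placeholder markers
        var valid = new ConcurrentBag<string>();

        // Allow at most 10 requests to be in flight at once.
        var throttle = new SemaphoreSlim(10);

        var tasks = urls.Select(async url =>
        {
            await throttle.WaitAsync();
            try
            {
                // GetAsync does not throw on HTTP error codes,
                // similar to IgnoreProtocolErrors in the original code.
                var response = await client.GetAsync(url + "'");
                var body = await response.Content.ReadAsStringAsync();

                if (errors.Any(body.Contains))
                {
                    valid.Add(url);
                    Console.WriteLine(url);
                }
            }
            catch
            {
                // Ignore failed requests, as in the original code.
            }
            finally
            {
                throttle.Release();
            }
        }).ToList();

        await Task.WhenAll(tasks);

        // Write the results once, after everything has finished,
        // instead of rewriting the file inside the loop.
        File.WriteAllLines("Good.txt", valid);
    }
}

Because SemaphoreSlim.WaitAsync does not block a thread while waiting, this keeps at most 10 requests outstanding without tying up thread-pool threads, and writing Good.txt once at the end avoids the repeated (and not thread-safe) File.WriteAllLines call inside the loop.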