System.Net.WebClient request gets 403 Forbidden, but a browser does not (Apache server, .NET)

2023-09-02 12:00:26 Author: 少年多沧桑


An odd one: I'm trying to read the <head> section of a lot of different websites out there, and one particular type of server, Apache, sometimes returns 403 Forbidden. Not all Apache servers do this, so it may be a config setting or a particular version of the server.


When I then check the URL with a web browser (Firefox, for example), the page loads fine. The code sorta looks like this:

var client = new WebClient();
var stream = client.OpenRead(new Uri("http://en.wikipedia.org/wiki/Barack_Obama"));


Normally, a 403 is an access-permission failure sort of thing, but these are normally unsecured pages. I'm thinking that Apache is filtering on something in the request headers, since I'm not bothering to set any.


Maybe someone who knows more about Apache can give me some ideas of what's missing in the headers. I'd like to keep the headers as small as possible to minimize bandwidth.

Thanks

Answer


Try setting the UserAgent header:

string _UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";
client.Headers.Add(HttpRequestHeader.UserAgent, _UserAgent);
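Putting the two snippets together, a minimal sketch might look like the following. The User-Agent value is just an example; any realistic browser-like string should work, and some Apache configurations simply reject requests that send no User-Agent at all:

```csharp
using System;
using System.IO;
using System.Net;

class Program
{
    static void Main()
    {
        var client = new WebClient();

        // Some Apache setups return 403 for requests with no User-Agent,
        // so set a browser-like one before opening the stream.
        client.Headers.Add(HttpRequestHeader.UserAgent,
            "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)");

        using (var stream = client.OpenRead(new Uri("http://en.wikipedia.org/wiki/Barack_Obama")))
        using (var reader = new StreamReader(stream))
        {
            // Read only the start of the document; a few KB is usually
            // enough to cover the <head> section and keeps bandwidth low.
            var buffer = new char[4096];
            int read = reader.Read(buffer, 0, buffer.Length);
            Console.WriteLine(new string(buffer, 0, read));
        }
    }
}
```

Note that `WebClient` strips most custom headers after each request, so if you loop over many URLs with the same client, re-add the User-Agent header before every call.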