Question
How would one enable gzip compression of responses with Content-Type application/json when the ASP.NET 5 app is deployed to IIS 8 on Azure? Typically this would've been done using web.config, but that's gone now... what's the new approach?
Answer
You need to reverse-proxy your Kestrel application; then you can tell the reverse proxy to compress.
In nginx, this goes as follows:
server {
    listen 80 default_server;
    listen [::]:80 default_server ipv6only=on;

    server_name localhost;

    gzip on;
    gzip_min_length 1000;
    #gzip_proxied expired no-cache no-store private auth;
    gzip_proxied any;
    gzip_comp_level 6;
    gzip_buffers 16 8k;
    gzip_http_version 1.1;
    gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript;
    gzip_vary on;
    gzip_disable "MSIE [1-6]\.(?!.*SV1)";

    location / {
        proxy_pass http://127.0.0.1:5004;
    }
}
So here nginx will catch incoming requests on port 80 and forward them to Kestrel on the same machine, but on port 5004. Kestrel then sends the response back to nginx. Since gzip is on, nginx will compress the response and send it to the user. All you need to ensure is that the application on Kestrel does not return HTTP headers such as HTTP/1.1 chunked encoding when outputting, for example, a file (e.g. when using what used to be Response.TransmitFile).
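For reference, a minimal sketch of making Kestrel listen on port 5004 in a beta-era ASP.NET 5 project.json; the exact package versions and command names varied between betas, so treat the values below as illustrative assumptions rather than configuration taken from the answer:

{
  "dependencies": {
    "Microsoft.AspNet.Server.Kestrel": "1.0.0-beta8"
  },
  "commands": {
    "kestrel": "Microsoft.AspNet.Hosting --server Microsoft.AspNet.Server.Kestrel --server.urls http://localhost:5004"
  }
}

Running dnx kestrel (or dnx . kestrel on earlier betas) from the project folder then starts the app on the port that nginx's proxy_pass points at.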
IIS 7.5+ supports reverse proxying.
See here for more information:
https://serverfault.com/questions/47537/can-iis-be-configure-to-forward-request-to-another-web-server
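On IIS 8 / Azure, the rough equivalent of the nginx setup above is the URL Rewrite module (with Application Request Routing installed and proxying enabled) plus IIS dynamic compression. Below is a minimal, hedged web.config sketch, assuming Kestrel is listening on localhost:5004; note that enabling application/json under <httpCompression><dynamicTypes> normally has to be done at the applicationHost.config level rather than in web.config:

<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Forward every request to the Kestrel process (assumed to be on port 5004) -->
        <rule name="KestrelProxy" stopProcessing="true">
          <match url="(.*)" />
          <action type="Rewrite" url="http://localhost:5004/{R:1}" />
        </rule>
      </rules>
    </rewrite>
    <!-- Ask IIS to gzip dynamic responses; application/json must also be listed
         under <httpCompression><dynamicTypes> (applicationHost.config) for JSON
         responses to actually be compressed. -->
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  </system.webServer>
</configuration>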