Pre-compress static files in IIS 6

I am implementing Gzip compression for CSS and JS files on my site and just need to double check something.
Is the file compressed on every request, or is it compressed once and then served from the temporary compression folder (if it already exists there)? I just want to be sure that my files are not compressed on every request.
Also, is this the default behaviour or do I need some extra configuration?
And last, do I need to worry about or configure anything when using hash tags in the path (to inform the browser that the file has changed) together with static file compression, or should it work with no problem?
Edit: I am just using static compression
Many thanks

In order to get the most out of IIS compression you will need to add a few extra bits to the metabase file.
Back up your metabase file.
Enable live editing of the metabase in IIS (otherwise you will need to restart IIS when you are done).
Find the IIsCompressionScheme elements and make the following edits to the metabase file:
<IIsCompressionScheme Location ="/LM/W3SVC/Filters/Compression/deflate"
HcCompressionDll="%windir%\system32\inetsrv\gzip.dll"
HcCreateFlags="0"
HcDoDynamicCompression="TRUE"
HcDoOnDemandCompression="TRUE"
HcDoStaticCompression="TRUE"
HcDynamicCompressionLevel="10"
HcFileExtensions="htm
html
css
js
txt
xml"
HcOnDemandCompLevel="10"
HcPriority="1"
HcScriptFileExtensions="asp
dll
aspx
axd
ashx
asbx
asmx
swf
exe"
>
</IIsCompressionScheme>
<IIsCompressionScheme Location ="/LM/W3SVC/Filters/Compression/gzip"
HcCompressionDll="%windir%\system32\inetsrv\gzip.dll"
HcCreateFlags="1"
HcDoDynamicCompression="TRUE"
HcDoOnDemandCompression="TRUE"
HcDoStaticCompression="TRUE"
HcDynamicCompressionLevel="10"
HcFileExtensions="htm
html
js
css
txt
xml"
HcOnDemandCompLevel="10"
HcPriority="1"
HcScriptFileExtensions="asp
dll
aspx
axd
ashx
asbx
asmx
swf
exe"
>
</IIsCompressionScheme>
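If you would rather not hand-edit the XML, the same properties can usually be set from the command line with adsutil.vbs. A rough sketch (it assumes the default AdminScripts location and an extension list like the one above; adjust to taste):
cd C:\Inetpub\AdminScripts
rem Turn static compression on for both schemes, set the extension list, then restart IIS.
cscript.exe adsutil.vbs set W3Svc/Filters/Compression/GZIP/HcDoStaticCompression true
cscript.exe adsutil.vbs set W3Svc/Filters/Compression/DEFLATE/HcDoStaticCompression true
cscript.exe adsutil.vbs set W3Svc/Filters/Compression/GZIP/HcFileExtensions "htm" "html" "css" "js" "txt" "xml"
cscript.exe adsutil.vbs set W3Svc/Filters/Compression/DEFLATE/HcFileExtensions "htm" "html" "css" "js" "txt" "xml"
iisreset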
Once done, test a page from your site using a Firefox plug-in like YSlow or Firebug. With Firebug you can inspect each element in the Net tab and check whether the right compression is being applied to the right file types.
There is a great article with examples here: http://www.codinghorror.com/blog/2004/08/http-compression-and-iis-6-0.html
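As an alternative to the browser plug-ins, you can check the response headers from the command line with curl and look for Content-Encoding: gzip (the URL is just a placeholder for one of your CSS/JS files; use /dev/null instead of NUL outside Windows). Bear in mind that with static compression the very first hit on a file may come back uncompressed while IIS writes the compressed copy to its temporary compression directory, so request the file a couple of times:
rem Dump only the response headers, discard the body.
curl -s -D - -o NUL -H "Accept-Encoding: gzip, deflate" http://www.example.com/scripts/site.js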

IIS 6 supports both dynamic and static compression.
Have a look at the relevant documentation and a decent blog entry on the subject.

"The newly compressed file is then stored in the compression directory, and subsequent requests for that file are serviced directly from the compression directory. In other words, an uncompressed version of the file is returned to the client unless a compressed version of the file already exists in the compression directory."*
Taken from this article.
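If you want to see (or move) that compression directory, the relevant properties sit on the Parameters key of the compression metabase node; a quick check with adsutil.vbs (a sketch, assuming the default AdminScripts path) should show it defaulting to %windir%\IIS Temporary Compressed Files:
cd C:\Inetpub\AdminScripts
rem Where compressed copies of static files are cached, and the disk space cap.
cscript.exe adsutil.vbs get W3Svc/Filters/Compression/Parameters/HcCompressionDirectory
cscript.exe adsutil.vbs get W3Svc/Filters/Compression/Parameters/HcMaxDiskSpaceUsage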

Related

Browser cannot download .txt file if the project is .Net Core Restful API

I am trying to activate SSL for my project, but one of the URLs doesn't respond.
This is the web site; when you paste the URL into the browser, it downloads the .txt file:
http://www.xxx.co.uk/.well-known/pki-validation/83CB00D29E282E1FFD6DFB220F030EF4.txt
This is the RESTful .NET Core API domain; when you paste the URL into the browser, it returns a 404 error:
http://api.xxx.co.uk/.well-known/pki-validation/83CB00D29E282E1FFD6DFB220F030EF4.txt
Under IIS I have compared the physical paths and permissions, and everything seems to be the same. I believe something in the REST API is blocking the .txt file download. Should I check the web.config, or do I need to add something to the C# source code? Should I update ConfigureServices(IServiceCollection services)? In IIS I have checked the MIME Types and .txt is already defined.
Any suggestions?
I believe my problem is related to this subject, but I am not sure how to enable/serve .txt files in startup.cs. Any code snippets?
How to Serve Static File
EDIT: I finally found a solution, and now the browser can download the .txt files. What I did was create a virtual directory in IIS. Here are the steps:
Go to the C: drive
Create a new folder called well-known
Inside the well-known folder, create another folder named pki-validation
So far, your folders should look like this: C:\well-known\pki-validation
Upload the TXT file into the pki-validation folder
Open the IIS Manager on your server
Right-click your website and select Add Virtual Directory
In the Alias field, enter .well-known
In the Physical Path field, enter the path to the well-known folder, for example: C:\well-known
Press OK to create the alias
The URLs now serve the .txt files. I hope these steps save other developers some time one day.
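For completeness, the same thing can also be done in code rather than with a virtual directory, by mapping the physical folder in Startup. This is only a minimal sketch: the folder path is an assumption, and ServeUnknownFileTypes is only relevant if the challenge files ever lack an extension (.txt already has a MIME mapping).
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.FileProviders;

public class Startup
{
    // Only the static-file wiring is shown; the rest of Configure stays as before.
    public void Configure(IApplicationBuilder app)
    {
        // Serve wwwroot as usual.
        app.UseStaticFiles();

        // Additionally expose the physical validation folder under /.well-known.
        app.UseStaticFiles(new StaticFileOptions
        {
            FileProvider = new PhysicalFileProvider(@"C:\well-known"),
            RequestPath = "/.well-known",
            ServeUnknownFileTypes = true // only needed for extensionless files
        });

        // ... MVC / endpoint configuration continues here.
    }
}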

Serve gzipped content via a Web Resource

I have a Dynamics 365 instance that makes heavy use of custom front-end interfaces built with a modern Node.js-based build pipeline involving the usual suspects such as webpack/babel/etc. I'm hosting these files as web resources in Dynamics (one HTML file and one bundle.js file per SPA).
As my team nears production, I'm trying to set up a nice production build for our front-end code to reduce load times. Unfortunately, I can't find a good way to serve our bundle.js files gzip-encoded, because Dynamics does not return the Content-Encoding: gzip header when a request is made, so the browser doesn't decompress the file and tries to read the compressed content as plain JavaScript.
Of course, we can serve the uncompressed file just fine but we would like to provide the smaller, faster loading file if possible as it's generally about 1/3 the size.
Does anyone have any brilliant ideas for how to override the default response headers coming back from dynamics when I request a web resource? Or any other clever solutions to this problem?
Thanks, and let me know if any clarification is needed.
I don't know of any way to serve gzipped content via a web resource.
If the download size is a huge concern perhaps encode the gzipped code to base64 and store it as a string variable in JS.
Then during execution you could decode, unzip, and eval() the code.
You could also store base64 gzipped code as a file attachment via an annotation record or within an XML web resource, though those options would require an additional API call to get the code, so a string variable may be your best bet.
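To build on that, the base64 string itself could be produced as part of the build. Below is a rough C# sketch (the file names are placeholders, not anything from the original setup) that gzips the bundle and writes out the base64 text you would paste into the string variable; at runtime you would still need a JavaScript inflate routine (a library such as pako, for example) to reverse it before handing the result to eval().
using System;
using System.IO;
using System.IO.Compression;

class GzipBundleToBase64
{
    static void Main()
    {
        // Placeholder paths -- adjust to your build output.
        byte[] raw = File.ReadAllBytes("bundle.js");

        using (var buffer = new MemoryStream())
        {
            // Gzip the bundle into an in-memory buffer.
            using (var gzip = new GZipStream(buffer, CompressionLevel.Optimal, leaveOpen: true))
            {
                gzip.Write(raw, 0, raw.Length);
            }
            // Emit base64 text suitable for embedding as a JS string variable.
            File.WriteAllText("bundle.js.gz.b64", Convert.ToBase64String(buffer.ToArray()));
        }
    }
}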

Server side: detecting if a user is downloading (save as...) or visualizing a file in the browser

I'm writing an apache2 module.
By default, when viewed in a web browser, the module would only print the first lines of a large file and convert them to HTML.
If the user chooses 'Save as...', the whole raw file would be downloaded.
Is it possible to detect this choice on the server side? (For example, is there a specific HTTP header set?)
Note: I would like to avoid any parameter in the GET URL (e.g. "http://example.org/file?mode=raw").
Pierre
Added my own answer to close the question: as #alexeyten said, there is no difference. I ended up using JavaScript code that alters the index.html file generated by Apache.

Make Indexed File Downloadable In Apache Solr

I am trying to index a PDF file into Solr, which I have done successfully using the command:
curl "http://localhost:8983/solr/update/extract?literal.id=id&commit=true" -F "myfile=@filename.pdf"
I am able to see the file contents and search, but when I try to click on the file name it shows:
HTTP ERROR 404
Problem accessing /solr/collection1/id. Reason:
not found
What I want is to have a link which allows downloading the file. I know Solr merely indexes the file and stores its contents. I was wondering if there is a way I can add a location attribute like you have done and proceed from there. Can you please share what you have done? If you want any more clarity regarding my problem, do ask.
We have the actual files hosted through a separate web application, so they can be downloaded with auditing and additional security.
You can always host these files directly through an HTTP server.
If the file names match the id, it is as easy as appending id.extension to the fixed HTTP-hosted URL.
Otherwise, index the path of the file with an additional parameter, e.g. literal.url.
The url will then be a Solr field that is available in the Solr response.
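A sketch of what that extract call could look like with the extra parameter (the id, URL and field name are placeholders, and the url field has to exist in your schema):
curl "http://localhost:8983/solr/update/extract?literal.id=doc1&literal.url=http://files.example.com/doc1.pdf&commit=true" -F "myfile=@filename.pdf"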

HTTP compression - How to send precompressed files that exist in an EAR file?

Is it possible to send pre-compressed files that are contained within an EAR file? More specifically, the JSP and JS files within the WAR file. I am using Apache HTTP Server as the web server, and although it is simple to turn on the deflate module and set it up to use a pre-compressed version of the files, I would like to apply this to files that are contained within an EAR file deployed to JBoss. The reason is that the content is quite static, and compressing it on the fly on every request is quite costly in terms of CPU time.
Quite frankly, I am not entirely familiar with how JBoss deploys these EAR files and 'serves' them. The gist of what I want to do is pre-compress the files contained inside the war so that when they are requested they are sent back to the client with gzip for Content-Encoding.
In theory, you could compress them before packaging them in the EAR, and then serve them up with a custom controller which adds the HTTP header to the response telling the client they're compressed, but that seems like a lot of effort to go to.
When you say that on-the-fly compression is quite costly, have you actually measured it? Have you tried requesting a large number of uncompressed pages, measured the CPU usage, then tried it again with compressed pages? I think you may be over-estimating the impact. It uses quite low-intensity stream compression, designed to use little CPU.
You need to be very sure that you have a real performance problem before going to such lengths to mitigate it.
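If you want to put a number on it, one quick-and-dirty check (a sketch; the URL is a placeholder) is to run ApacheBench against the same page with and without an Accept-Encoding header and watch the server's CPU during each run:
ab -n 1000 -c 10 http://www.example.com/some-page.jsp
ab -n 1000 -c 10 -H "Accept-Encoding: gzip,deflate" http://www.example.com/some-page.jsp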
I don't frequent this site often and I seem to have left this thread hanging; sorry about that. I did succeed in getting compression for my JavaScript and CSS files. What I did was pre-compress them in the Ant build process using gzip. I then had to spoof the name to get rid of the gzip extension: I had foo.js, compressed it into foo.js.gzip, and renamed foo.js.gzip back to foo.js; that is the file that gets packaged into the WAR file. That handles the pre-compression part. To get the file served up properly, we just have to tell the browser that it is compressed, via the Content-Encoding header of the HTTP response. This was done via an output filter applied to files matching the *.js extension (some Java/JBoss, WEB-INF/web.xml if it helps; I'm not too familiar with this, so sorry guys).
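For anyone trying to reproduce this, the web.xml part is roughly the fragment below. GzipHeaderFilter is a hypothetical custom filter class (not the exact name from my project) whose only job is to set the Content-Encoding: gzip response header before passing the request down the chain:
<!-- WEB-INF/web.xml: map the header-setting filter onto the pre-compressed *.js files -->
<filter>
    <filter-name>GzipHeaderFilter</filter-name>
    <filter-class>com.example.web.GzipHeaderFilter</filter-class>
</filter>
<filter-mapping>
    <filter-name>GzipHeaderFilter</filter-name>
    <url-pattern>*.js</url-pattern>
</filter-mapping>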