uncompress gzipped assets with custom extension in browser - gzip

We want to serve gzipped files to the browser, and it works with common file extensions such as css, txt, js, etc.
However, when we change the extension to something else, e.g. filename.abc, the browser does not decompress the gzipped file even though the Content-Encoding header is gzip. We have tried various combinations of the Content-Type header.
How can we keep the custom extension of a file (say .abc) and still have browsers decompress the gzipped file automatically based on the header info?

We were adding the Content-Encoding and Content-Type headers via the AWS CLI s3 cp --metadata option, which actually sets custom x-amz-meta-* user metadata rather than the real response headers. Setting them with the dedicated --content-encoding and --content-type flags worked fine.
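As a concrete sketch (the file and bucket names here are hypothetical, not from the question): gzip the asset first, then upload the compressed bytes under the original key using the dedicated flags rather than --metadata.

```shell
# Create a sample asset with a custom extension and gzip it.
printf '{"hello": "world"}' > data.abc
gzip -kf data.abc            # keeps data.abc, writes data.abc.gz

# Upload the compressed bytes under the original key with explicit headers
# (requires AWS credentials; bucket name is hypothetical):
# aws s3 cp data.abc.gz s3://my-bucket/data.abc \
#     --content-encoding gzip \
#     --content-type application/json
```

The Content-Type value here (application/json) is just an example; browsers key their automatic decompression off Content-Encoding, not the extension.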

Related

How to serve a static html that has a php extension with S3?

I can't change the extension of the file, but I need to serve it as an HTML file. There is no PHP inside.
That will work fine. Set the Content-Type to text/html.
You can do this in the S3 console, as described in How do I add metadata to an S3 object?, or when you initially upload the file.
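A minimal CLI sketch of the same idea, for setting the type at upload time (the file and bucket names are hypothetical; the console route works equally well):

```shell
# A static page that happens to have a .php extension.
printf '<html><body>hello</body></html>' > page.php

# Upload it with an explicit Content-Type so S3 serves it as HTML
# (requires AWS credentials; bucket name is hypothetical):
# aws s3 cp page.php s3://my-bucket/page.php --content-type text/html
```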

Is there a way to reupload a file to S3 without having to reset a custom MIME type?

I have two XML files that need (or that I really want) to be served with specific MIME types that S3 doesn't apply by default. The files are sitemap.xml and rss.xml, served as application/xml and application/rss+xml respectively.
I am able to set the Content-Type header for these files no problem.
The problem is that these files change every time my site changes. I should say that my site is completely static from a web-server perspective: I update it by building the files locally and uploading them to S3. When I upload my updated sitemap.xml and rss.xml files, though, S3 nukes my custom Content-Type settings.
Is there a way to get it to associate these settings with the name of the file as opposed to the instance of the file?
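One workaround (my own suggestion, not from the thread): S3 metadata belongs to the object, not the key, so a re-upload that omits Content-Type falls back to S3's default guess. Passing the flag on every upload, e.g. in a deploy script, keeps the types stable; the bucket name below is hypothetical.

```shell
# Regenerate the files (stand-in contents here), then re-upload with
# explicit types on every deploy.
printf '<?xml version="1.0"?><urlset/>' > sitemap.xml
printf '<?xml version="1.0"?><rss/>'    > rss.xml

# (requires AWS credentials; bucket name is hypothetical)
# aws s3 cp sitemap.xml s3://my-bucket/sitemap.xml --content-type application/xml
# aws s3 cp rss.xml     s3://my-bucket/rss.xml     --content-type application/rss+xml
```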

s3_website: setting MIME type for files without an extension

I'm trying to deploy a Jekyll site. Here's the flow:
1. Content is added and pushed to BitBucket
2. BitBucket Pipelines builds the site
3. All HTML files in _site/ have their extension removed
4. s3_website push (s3_website) pushes the contents to the designated S3 bucket
I'm removing the extension from the HTML files because I need clean URLs. However, there's an additional step required: setting the MIME type on these files so that S3 serves them correctly.
Right now the MIME type is somehow being detected automatically and the site works, but I'm uncomfortable not having control over it. So I tried adding the following to s3_website.yml to set the MIME type:
content_type:
"*": text/html
But that breaks the site.
How do I set s3_website to pick only those files that do not have an extension, and set the MIME type only to them?
The site works without setting the MIME type as long as Tika correctly detects the content type by itself.
For users who need more control over this, the s3_website gem supports it via a configuration option as of version 1.15.0.
Add this to s3_website.yml:
extensionless_mime_type: text/html
This sets the MIME type of all files that don't have an extension to text/html.

Sails 0.10.5 compress middleware and serving gzipped assets

In Sails 0.10.5, Express compression is supposed to be in the middleware for production mode by default, according to the issues on GitHub, but none of the response headers have a Content-Encoding that would suggest they have been gzipped. Furthermore, the sizes of the assets all match the uncompressed assets.
After searching for any other issues related to this, I found an SO question that was essentially the opposite of my problem: he had the gzipped files in place and needed the middleware, while I have the middleware (supposedly by default) but no files. His problem was (apparently) solved by adding the middleware config, which was required for compression before 0.10.5. So I npm-installed grunt-contrib-compress and set up the config file. Now the gzipped files are produced successfully, but they're not being served. I tried manually requesting the gzipped version of the asset by injecting it in sails-linker instead of the regular JS, but the Content-Type on the response header was application/octet-stream.
Has anyone successfully served gzipped static assets from a sails app? Am I doing anything obviously incorrectly? Even an outline of the general process would be appreciated.
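A quick way to verify whether compressed responses are actually going out is to request an asset while advertising gzip support and inspect the headers. The URL below assumes Sails' default port 1337 and an asset path of /js/app.js, both of which are assumptions; canned headers stand in for the live response here.

```shell
# Against the live app you would run something like:
#   curl -sI -H 'Accept-Encoding: gzip' http://localhost:1337/js/app.js
# Below, canned response headers stand in for the live response.
headers='HTTP/1.1 200 OK
Content-Type: application/javascript
Content-Encoding: gzip'

# A correctly configured setup should print the gzip line:
printf '%s\n' "$headers" | grep -i 'content-encoding'
```

If grep prints nothing, the middleware never compressed the response, regardless of whether the .gz files exist on disk.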

Setting Content-Encoding Response header via WebLogic

I have a web application where part of its job is to serve out zipped binary files that are included within the war. The Web App is deployed to WebLogic.
The client application is JavaScript-based and requests the zipped files. Since WebLogic is not currently setting a Content-Encoding header in the HTTP response for the GET of the file, the web browser does not know it is a compressed file and so can't handle it correctly.
I need a way to set the Content-Encoding header to gzip within WebLogic when one of these files is requested. I have worked around it for now by hosting the files via Apache and adding an
AddEncoding x-gzip .
directive. Any ideas on how to do the equivalent thing via WebLogic?