Using ASP.NET MVC 2.0, I am making an Amazon S3 downloader.
In the download method I prepare a URL, something like http://s3.amazon.com/mysite.com/image.gif?awsKey=abcde
I redirect the user to that URL (which opens image.gif in the browser).
I see the image opened in the browser, but no Save As window to save it at a location.
I have heard that I can add headers to the Response which can force a Save As dialog to save the file.
Any idea how those headers are added?
That would be the Content-Disposition header set to "attachment". However, the webservers serving s3.amazon.com would have to set this header, and changing that is outside of your control.
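If you are willing to stream the file through your own MVC action instead of redirecting to S3, you can set that header yourself. A minimal sketch for ASP.NET MVC (the S3 URL is copied from the question, the controller and file names are made up, and error handling is omitted):

public class DownloadController : Controller
{
    public ActionResult Image()
    {
        // Fetch the object from S3 on the server side instead of redirecting the user.
        using (var client = new System.Net.WebClient())
        {
            byte[] data = client.DownloadData(
                "http://s3.amazon.com/mysite.com/image.gif?awsKey=abcde");

            // File() with a download name emits "Content-Disposition: attachment; filename=image.gif",
            // which is what triggers the Save As dialog.
            return File(data, "image/gif", "image.gif");
        }
    }
}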
I have a .jpg file stored in S3 and distributed using CloudFront. I can view the file when I download it, but I cannot view the file in Chrome or Safari. From what I can tell, I can't view the file in a browser because the Content-Type isn't getting sent despite the fact that I've set it in S3.
You can see what happens when you enter this signed CloudFront url into a browser. It should remain valid for roughly 24 hours after this post. https://media.development.doctheapp.com/claims/us-east-1:4877c3da-786a-4b3b-b1e0-c70bde0f9c4e741afc7c5a8304564963080a98e5675d09d1aca3d623911e34bd3b4eb0808579/300ae913-8f88-44b9-a4ab-e46d11133c76/receipt.jpg?Expires=1526533530&Signature=crnjhje1noP-7WfBMI6rMDPd-zdCAVKLaojFFNvxCZEdx0~EJHeqbL8oKwL64AULavekMHm~2r6vHto1d4IAt5eoLpbZR~q5PAfhSakte1iNNvuTxQ7q-mYOwoCemb5VD~bFXUBdrF1yiybaRHw-v6USbw53QZ2Qa4hfDkqgoEKwvEznBvR~sQnk5v-slX8~aJBhySS5XpkfdoE-yl8hh697xIyH~OliwrCg7h5iSkotwW9~EvTnLoVkXkuvru35eLhN4~gGMs3WDUAuucOl8JZdeg6CjAQQ~JWv6FJnb2wyvGrGJzOf70~8s08~qSqiCroyZfqUiZmw20eCIWXp4A__&Key-Pair-Id=APKAJVSE2BEPIQCCPH6Q
It seems like the original image is a TIFF file, not a JPEG. That's why you cannot see it in a browser but can open it after downloading. The Content-Type header is being sent correctly by CloudFront if you look at the response headers.
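One quick way to confirm this locally is to look at the first bytes of the downloaded file; JPEG and TIFF have distinct signatures. A small sketch in C# (the file path is a placeholder):

using System;
using System.IO;

class FileSignatureCheck
{
    static void Main()
    {
        byte[] header = new byte[4];
        using (var fs = File.OpenRead("receipt.jpg"))   // placeholder path to the downloaded file
        {
            fs.Read(header, 0, header.Length);
        }

        // JPEG starts with FF D8 FF; TIFF starts with "II*\0" (little-endian) or "MM\0*" (big-endian).
        if (header[0] == 0xFF && header[1] == 0xD8 && header[2] == 0xFF)
            Console.WriteLine("Looks like a JPEG");
        else if ((header[0] == 0x49 && header[1] == 0x49 && header[2] == 0x2A && header[3] == 0x00) ||
                 (header[0] == 0x4D && header[1] == 0x4D && header[2] == 0x00 && header[3] == 0x2A))
            Console.WriteLine("Looks like a TIFF");
        else
            Console.WriteLine("Unknown signature");
    }
}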
It seems unclear from the docs, but here is my goal:
Create a pre-signed URL to upload a file to S3
Pass that URL to the browser
The browser uploads a file selected by the user to S3 using the pre-signed URL
When the object on S3 is requested, the "Cache-Control: max-age=604800" header is on the response.
I would LIKE to not have to rely on the client to do anything special to make this happen. Meaning, some signal to S3 that it should set the Cache-Control header to that value would have to be present in the pre-signed URL, but I can't tell from the docs or 50 Google searches how that is accomplished.
Any illumination into how this can be accomplished would be great!
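For reference, the general pattern is to bake the header into the signature when generating the pre-signed PUT URL. Note that the uploading client then has to send the exact same Cache-Control header with its PUT, otherwise S3 rejects the signature, so this only partially avoids relying on the client. A sketch with the AWS SDK for .NET, assuming your SDK version exposes a Headers collection on GetPreSignedUrlRequest (bucket and key are placeholders):

using System;
using Amazon.S3;
using Amazon.S3.Model;

class PresignUpload
{
    static void Main()
    {
        var s3 = new AmazonS3Client();   // credentials and region come from the environment

        var request = new GetPreSignedUrlRequest
        {
            BucketName = "my-bucket",            // placeholder
            Key = "uploads/some-object",         // placeholder
            Verb = HttpVerb.PUT,
            Expires = DateTime.UtcNow.AddMinutes(15)
        };

        // Include Cache-Control in the signed headers (assumed SDK support); the client's PUT
        // must carry the identical header value for the signature to validate.
        request.Headers.CacheControl = "max-age=604800";

        Console.WriteLine(s3.GetPreSignedURL(request));
    }
}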
We are developing a web application. Inside that application, to download a file, I have created a WCF REST service that downloads files based on this link: Download using WCF Rest. The purpose is to check user authentication before downloading. I used the streaming concept to download the file. I have now found out a few things.
When the user downloads the file, he is not able to determine the file size or the time remaining. I analyzed this and found that the reason is that the response uses "Transfer-Encoding: chunked" in the header, so the file is downloaded in chunks. One of the advantages is that memory consumption on the server stays low even when many users are downloading a file. So I thought of adding a "Content-Length" header, but I found out that you can only use one of the two headers, not both. So I wondered how Hotmail and Gmail download attachments. From my investigation, I found that Hotmail uses the chunked header whereas Gmail uses the Content-Length header. Also, Gmail checks whether the session is active and then downloads the file accordingly. I want to achieve the following:
a) Like Gmail, I want to check if the session is active or not and then download the files accordingly. What would be the method to implement this?
b) When downloading the file, I want to use the Content-Length header instead of the chunked header. Also, the memory consumption should stay low. Can we achieve this in WCF REST? If so, how? (See the sketch after this question.)
c) Is it possible for me to add a header in WCF that will display the file size in the browser's Downloads window?
d) When downloading inline images from WCF, I found that the image is not cached on the local machine after loading. I was thinking that once an image is shown in an HTML page it would get cached automatically, and the next time the user visits the page the image would load from the cache instead of from the server. I want inline images to be cached; what option can I use for that? Are there any headers I need to specify when serving an inline image from the server?
e) When I download a zip file using WCF in the iPhone Chrome browser, it doesn't download at all. The same link works in the Android Chrome browser. What could be the problem? Am I missing a header in WCF?
Are there any methods that will achieve the above?
Regards,
Jollyguy
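For points (b) and (c), the usual place to set explicit response headers in a WCF REST service is WebOperationContext. A hedged sketch (service name, path, and UriTemplate are made up; whether an explicit Content-Length actually replaces chunked encoding depends on your binding and transfer mode):

using System.IO;
using System.ServiceModel;
using System.ServiceModel.Web;

[ServiceContract]
public class DownloadService
{
    [OperationContract]
    [WebGet(UriTemplate = "files/{name}")]
    public Stream GetFile(string name)
    {
        // Placeholder path; validate 'name' in real code to avoid path traversal.
        string path = Path.Combine(@"C:\files", name);
        var response = WebOperationContext.Current.OutgoingResponse;

        response.ContentType = "application/octet-stream";
        // Announcing the size lets the browser show progress and time remaining.
        response.ContentLength = new FileInfo(path).Length;
        response.Headers.Add("Content-Disposition", "attachment; filename=" + name);

        // Returning a stream keeps server memory consumption low (with TransferMode.Streamed).
        return File.OpenRead(path);
    }
}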
I am using PhantomJS 1.9.7 to scrape a web page. I need to send the returned page content to S3. I am currently using the filesystem module included with PhantomJS to save to the local file system and using a php script to scan the directory and ship the files off to S3. I would like to completely bypass the local filesystem and send the files directly from PhantomJS to S3. I could not find a direct way to do this within PhantomJS.
I toyed with the idea of using the child_process module and passing the content as an argument, like so:
var execFile = require("child_process").execFile;
var page = require('webpage').create();
var content = page.content;
// Arguments must be passed as an array, and the third callback parameter is stderr.
execFile('php', ['path/to/script.php', content], null, function (err, stdout, stderr) {
    console.log("execFileSTDOUT:", JSON.stringify(stdout));
    console.log("execFileSTDERR:", JSON.stringify(stderr));
});
which would call a PHP script directly to accomplish the upload. This would require spawning an additional process to run a CLI command, and I am not comfortable with having another asynchronous process running. What I am looking for is a way to send the content directly to S3 from the PhantomJS script, similar to what the filesystem module does with the local filesystem.
Any ideas as to how to accomplish this would be appreciated. Thanks!
You could just create and open another page and point it to your S3 service. Amazon S3 has a REST API and a SOAP API, and REST seems easier.
For SOAP you would have to build the request manually. The only problem might be a wrong content-type; it looks as if setting it is implemented, but I cannot find a reference in the documentation.
You could also create a form in the page context and send the file that way.
I am developing a Windows application in VB.NET in which I have a URL that first asks me to log in to the website and then displays a view PDF link. When I click it, it redirects to another page where, instead of asking to download the PDF, it opens it in my WebBrowser control. Now I want to save that opened PDF to a path I specify. I have googled a lot but didn't find any solution. I even found some related posts, but none of them had my answer. Here the PDF URL doesn't contain any file name like '.pdf'; the URL contains some token values, and opening it requires logging in to the website. I have been trying to download the PDF file for many days. Please help me.
You have to push your file using HTTP headers, because these headers are the only thing controlling how your browser handles the file. From the "Unique HTTP Headers Returned" section of the page linked below:
Save As Mode (askapache_pdf=s)
Content-Disposition: attachment
Content-Type: application/pdf
For more info, go to http://www.askapache.com/htaccess/pdf-cookies-headers-rewrites.html#Unique_HTTP_Headers_Returned
This does not have anything to do with the server-side scripting language; it is the same everywhere: you have to add a header to your HTTP response. In ASP you could try something like the below:
Response.AddHeader("content-disposition", "attachment;filename=somefile.ext")
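As a slightly fuller illustration of the same idea in ASP.NET (a hedged sketch; the file path and name are placeholders):

// Inside an ASP.NET page or handler; Response is the current HttpResponse.
Response.Clear();
Response.ContentType = "application/pdf";
Response.AddHeader("Content-Disposition", "attachment; filename=somefile.pdf");
Response.WriteFile(Server.MapPath("~/files/somefile.pdf"));   // placeholder path
Response.End();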