Best way to preview private S3 documents - amazon-s3

Users can upload documents through my site. They are stored in a private AWS S3 bucket. If a user wants to preview an uploaded file from my site, I generate a pre-signed URL and stick it in an <embed> tag. I was wondering if this is the best way to do it.
It currently works most of the time; however, a few users don't get the preview. Instead, their browser wants to download the file, mostly on Windows machines.
Any insight or help would be greatly appreciated!
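
For reference, here is a minimal sketch of what I do now, assuming boto3; the bucket name, key, and content type are placeholders, and the Content-Disposition/Content-Type response overrides are my own guess at what might encourage inline preview rather than download:

    import boto3

    s3 = boto3.client("s3")

    # Generate a short-lived link to the private object. The Response* overrides
    # ask S3 to send Content-Disposition: inline and an explicit Content-Type,
    # which may help browsers preview the file instead of downloading it.
    url = s3.generate_presigned_url(
        "get_object",
        Params={
            "Bucket": "my-private-bucket",            # placeholder
            "Key": "uploads/user-123/document.pdf",   # placeholder
            "ResponseContentDisposition": "inline",
            "ResponseContentType": "application/pdf",
        },
        ExpiresIn=300,  # 5 minutes
    )

    # The URL is then dropped into the page, e.g.:
    # <embed src="{url}" type="application/pdf" width="100%" height="600">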

Related

upload files to any user's google drive after they authenticate

I want to upload files from my server to my users' Google Drive storage after they authenticate. The questions are:
Is it possible to do that?
Do I have to re-authenticate the user every time they want to upload a new file?
Could I embed the uploaded MP4 files in my website? (using the HTML video source as the file's download link from the user's Google Drive account)
Do I need to re-authenticate the user if they just want to see the embedded videos some time later? In other words, is there any way I could save some sort of token so they don't keep doing that?
Is it possible to do that?
Yes
Do I have to re-authenticate the user every time they want to upload a new file?
No. Store a refresh token for the user and they won't need to authenticate each time.
Could I embed the uploaded MP4 files in my website? (using the HTML video source as the file's download link from the user's Google Drive account)
I wouldn't recommend it. Google Drive isn't really designed for hosting files in this manner, and people would need access to the file to download it anyway; it's a big can of worms.
Do I need to re-authenticate the user if they just want to see the embedded videos some time later? In other words, is there any way I could save some sort of token so they don't keep doing that?
Yes, as mentioned: if the files are uploaded to your Drive account, you own them. You would need to share the files with anyone you want to have access to them. You could set the files to public, but that's not the best way to go about this.
Reference
Using OAuth 2.0 to Access Google APIs
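
As a rough sketch of the refresh-token flow, assuming the google-auth and google-api-python-client libraries, and that you have already stored the user's refresh token along with your OAuth client credentials (all names below are placeholders):

    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaFileUpload

    CLIENT_ID = "your-oauth-client-id"            # placeholder
    CLIENT_SECRET = "your-oauth-client-secret"    # placeholder
    stored_refresh_token = "loaded-from-your-db"  # placeholder

    # Rebuild credentials from the stored refresh token; the access token
    # is refreshed automatically when it is missing or expired.
    creds = Credentials(
        token=None,
        refresh_token=stored_refresh_token,
        token_uri="https://oauth2.googleapis.com/token",
        client_id=CLIENT_ID,
        client_secret=CLIENT_SECRET,
        scopes=["https://www.googleapis.com/auth/drive.file"],
    )

    drive = build("drive", "v3", credentials=creds)

    # Upload an MP4 from the server into the user's Drive.
    media = MediaFileUpload("video.mp4", mimetype="video/mp4", resumable=True)
    uploaded = drive.files().create(
        body={"name": "video.mp4"},
        media_body=media,
        fields="id",
    ).execute()
    print("Uploaded file id:", uploaded["id"])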

Migrate videos from Vimeo to S3

I have a large quantity of videos on my Vimeo account that I would like to migrate to my AWS S3 account.
Rather than go through the time-consuming process of downloading from Vimeo to my local machine and then uploading from my local machine to S3, is there a way I can do a direct transfer from Vimeo to S3?
If possible, I would want to create a script that iterates through each video via the Vimeo API, sets up the path to where it would go in S3, and then initiates a direct transfer. Any ideas or suggestions would be much appreciated!
If you have a PRO account or higher, you can use the API to get download links for videos on your account, including download links for the original source file. Those download links should be usable for importing into S3. Note that the links provided via the Vimeo API are expiring HTTP 302 redirects to the video file resource, so make sure you also take note of the expiration time provided in the response.
Download links are returned with the rest of a video's metadata, so I suggest using the fields parameter to only return the metadata needed.
http://developer.vimeo.com/api/common-formats#json-filter
https://developer.vimeo.com/api/reference/videos#GET/users/{user_id}/videos
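A rough sketch of that approach, assuming a Vimeo PRO access token, the requests library, and boto3; each download link is streamed straight into S3 rather than saved to disk first, and the token and bucket names are placeholders:

    import boto3
    import requests

    VIMEO_TOKEN = "your-vimeo-access-token"   # placeholder
    BUCKET = "my-video-bucket"                # placeholder
    s3 = boto3.client("s3")

    headers = {"Authorization": f"Bearer {VIMEO_TOKEN}"}
    url = "https://api.vimeo.com/me/videos?fields=name,download&per_page=50"

    while url:
        page = requests.get(url, headers=headers).json()
        for video in page["data"]:
            # Prefer the original source file if present, otherwise the first rendition.
            downloads = video.get("download", [])
            if not downloads:
                continue
            source = next((d for d in downloads if d.get("quality") == "source"), downloads[0])
            # Stream the (expiring) download link straight into S3.
            with requests.get(source["link"], stream=True) as resp:
                resp.raise_for_status()
                s3.upload_fileobj(resp.raw, BUCKET, f"vimeo/{video['name']}.mp4")
        # Follow the API pagination until there are no more pages.
        next_page = page.get("paging", {}).get("next")
        url = f"https://api.vimeo.com{next_page}" if next_page else None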

Can I upload files from a custom website form to Dropbox?

I have clients uploading files directly to my Dropbox folder. I was curious whether I can build a website form that uploads the file to my Dropbox folder and also saves the form's data to my server with a reference to the file that was uploaded to Dropbox. These are legal documents, and I don't want to worry about security if Dropbox handles that.
It looks like it should definitely be possible.
You would have to use the Dropbox API:
https://www.dropbox.com/developers
https://github.com/dropbox/dropbox-js
There are some JavaScript examples there that are likely what you would need to upload directly to Dropbox without going through your server first.
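If you do route the upload through your own server first (so you can record the form data alongside it), a minimal sketch with the official Dropbox Python SDK might look like this; the access token, folder path, and surrounding form handling are placeholders:

    import dropbox

    DROPBOX_TOKEN = "your-app-access-token"   # placeholder
    dbx = dropbox.Dropbox(DROPBOX_TOKEN)

    def save_uploaded_document(file_bytes, filename):
        """Push the uploaded file into Dropbox and return its path for your DB record."""
        path = f"/client-uploads/{filename}"
        dbx.files_upload(file_bytes, path, mode=dropbox.files.WriteMode("add"), autorename=True)
        return path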

Edit S3 doc with google docs and store it back to S3

I am trying to write a program that would allow my users to edit their S3 docs with Google Docs; the program would then store them back to their S3 bucket.
Any ideas on how to start?
I know it's possible to simply open a document with Google Docs by supplying a URL.
While Google Docs is able to open a document from a URL, it cannot "save" the document back to Amazon S3.
Your users would need to save the document within Google Docs (on Google Drive), then your program would need to retrieve that document and save it into Amazon S3.
The problem is... how do you trigger your program to perform the export?
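Once something does trigger it, the retrieve-and-save step itself might look roughly like this; a sketch assuming the Drive API (google-api-python-client) and boto3, with the credentials, file ID, bucket, and key supplied by your own code:

    import boto3
    from googleapiclient.discovery import build

    def copy_doc_back_to_s3(creds, file_id, bucket, key):
        """Export a Google Doc as .docx and write it into the user's S3 bucket.

        `creds` are the user's Google OAuth credentials; `file_id`, `bucket`,
        and `key` are placeholders supplied by the calling code.
        """
        drive = build("drive", "v3", credentials=creds)
        s3 = boto3.client("s3")

        # files().export returns the document bytes in the requested format.
        exported = drive.files().export(
            fileId=file_id,
            mimeType="application/vnd.openxmlformats-officedocument.wordprocessingml.document",
        ).execute()

        s3.put_object(Bucket=bucket, Key=key, Body=exported)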
As an alternative, you could synchronize between Google Drive and Amazon S3. See:
Zapier
CloudHQ
GoodSync
...and probably many more!

How to Give Access to non-public Amazon S3 bucket folders using Parse authenticated user

We are developing a mobile app using Parse as our BaaS solution, but using Amazon S3 for storage of our media files. All of our users upload media files into their own individual folders inside our app's bucket. As a user uploads media files, we update their records in Parse so it knows where to download the files from. That's the easy part.
I've spent quite a bit of time researching the different policies for S3 buckets, and I am trying to get a grip on the proper way to ensure the security of the uploaded content. If you do all of your work with DynamoDB or SimpleDB, then it's easy because you're essentially adjusting your ACLs with IAM accounts and whatnot. If you use Amazon Cognito, it's also easy because authentication happens through Google, Facebook, or Amazon accounts. In my case I am using Parse to authenticate users, which cannot speak to Amazon directly.
My goal is that only the currently logged in Parse user with ID #1234567 can access their own 1234567 folder and files (as well as any other user given permission by this person for collaboration). Here is a post similar to what I'm trying to accomplish: amazon S3 bucket policy - restricting access by referer BUT not restricting if urls are generated via query string authentication
...but how do I accomplish this with the current user's ID number?
An even better question is whether the post mentioned above is best practice, or whether I should instead be looking at creating an EC2 server to handle access to these files. Should I be looking at CloudFront to serve private content? Or is there another method that works better for what I am trying to accomplish? I am going in circles and my head is spinning.
Thanks to whoever can help straighten me out.
Well, since Parse is being shut down, I am migrating to another service. This question is no longer relevant.