Hi Stack Overflow folks,
I have a number of questions about the differences between Amplify Storage and the S3 SDK when uploading files to AWS.
I have already opened a discussion here for reference - https://github.com/aws-amplify/amplify-js/discussions/8973
I would be glad if you could visit the link, read through my questions, and let me know your answers. Thanks in advance.
For your question, given the examples on GitHub:
The S3 SDK wraps the AWS S3 API directly and uploads based on your IAM policy (and bucket ACL).
Amplify Storage uses Cognito auth; Cognito has access to S3 and uses a service role to pass the file through to S3.
Amplify Storage would be a tiny bit slower in this case because of the intermediate auth, but otherwise the two behave mostly the same.
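For a concrete comparison, here is a minimal sketch of the two upload paths side by side, assuming an Amplify project already configured with the CLI; the region, bucket, and key names are placeholders:

```typescript
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { Amplify, Storage } from "aws-amplify";
import awsExports from "./aws-exports"; // generated by the Amplify CLI (assumed to exist)

Amplify.configure(awsExports);

async function uploadBothWays(file: Blob) {
  // 1) S3 SDK: calls the S3 API directly; access is governed by whatever IAM
  //    credentials the client resolves (env vars, instance role, etc.).
  const s3 = new S3Client({ region: "us-east-1" }); // placeholder region
  await s3.send(
    new PutObjectCommand({
      Bucket: "my-app-uploads",   // placeholder bucket
      Key: "uploads/photo.jpg",   // placeholder key
      Body: file,
    })
  );

  // 2) Amplify Storage (v4/v5-style API): authenticates through Cognito first,
  //    then uploads to the bucket configured in aws-exports.
  await Storage.put("photo.jpg", file, {
    level: "private",             // key is prefixed per Cognito identity
    contentType: "image/jpeg",
  });
}
```

Functionally the result is the same object landing in S3; the difference is purely in how the request is authorized.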
I tried to use Google Cloud Speech-to-Text in my Node.js project. It works fine with smaller files that I have on disk, but I want to transcribe longer files that are stored in AWS S3. Is that possible, or do I need to use Google Cloud Storage?
Thanks to the interoperability of the Cloud Storage XML API, you can use S3-style libraries in your Node.js code against Cloud Storage and so bridge the two services:
"The Cloud Storage XML API is interoperable with some cloud storage tools and libraries that work with services such as Amazon Simple Storage Service (Amazon S3) and Eucalyptus Systems, Inc. To use these tools and libraries, change the request endpoint (URI) that the tool or library uses so it points to the Cloud Storage URI (https://storage.googleapis.com), and configure the tool or library to use your Cloud Storage HMAC keys." For more information, please check the Google documentation.
For longer audio files, you can only use files in Google Cloud Storage. You can't use audio files stored in AWS S3. https://cloud.google.com/speech-to-text/docs/reference/rest/v1/RecognitionAudio
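So the usual approach for long audio is to copy (or upload) the file into a Cloud Storage bucket and pass its gs:// URI to the asynchronous recognize call. A minimal Node.js sketch, with the bucket, encoding, and sample rate as placeholders:

```typescript
import { SpeechClient } from "@google-cloud/speech";

async function transcribeLongAudio() {
  const client = new SpeechClient();

  // Long audio must be referenced by a Cloud Storage URI, not raw bytes or an S3 URL.
  const [operation] = await client.longRunningRecognize({
    config: {
      encoding: "FLAC",          // placeholder: match your file's encoding
      sampleRateHertz: 16000,    // placeholder sample rate
      languageCode: "en-US",
    },
    audio: { uri: "gs://my-gcs-bucket/audio/long-recording.flac" }, // placeholder bucket/key
  });

  // Wait for the asynchronous operation to finish, then print the transcript.
  const [response] = await operation.promise();
  const transcript = response.results
    ?.map((r) => r.alternatives?.[0]?.transcript)
    .join("\n");
  console.log(transcript);
}
```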
Today I am using an AWS S3 bucket with AWS CloudFront on top of it.
I would like the same setup on Google Cloud. I found Cloud Storage, where I can create a bucket and put my static files/images there, which is the equivalent of the S3 bucket. But what about CloudFront? Where do I set up the CloudFront equivalent in Google Cloud?
Thanks in advance.
Google Cloud features built-in edge caching in its points of presence for services like Cloud Storage and App Engine, so in many cases you may not need a separate CDN product. I would suggest measuring your use case with and without a CDN from a few countries before adding in the extra expense. Keep in mind that objects need to be publicly readable with cache control settings that allow caching (which is the default for public objects) in order for Google's edge caches to cache them.
Google Cloud does have a CDN service, though, called Google Cloud CDN. It ties in with Cloud Load Balancing. It offers direct support for GCS buckets, although that's still in alpha. The upside is that serving GCS resources via Cloud CDN adds some nice perks, such as the ability to use custom domains with HTTPS or mapping GCS bucket names to differently-named domains.
In addition, if you're happy with CloudFront, I believe that you can use GCS (or pretty much anything else) as an origin server for it.
How should I control access to content on S3? For example, for a social media application: I host my photos and videos on S3, and a user might upload content meant for a few friends only. How can I control access?
I know I could use IAM, but those are AWS users, right? Not application users. I'd prefer not to create one policy for each user in my application.
You can look at S3 presigned requests (presigned POST for uploads, presigned URLs for downloads); this will help.
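In practice that means keeping the objects private and having your backend hand out short-lived presigned URLs only to the friends who are allowed to see a given photo or video. A minimal sketch with the AWS SDK for JavaScript v3; the region, bucket, and the isFriendAllowed check are placeholders for your own logic:

```typescript
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const s3 = new S3Client({ region: "us-east-1" }); // placeholder region

// Hypothetical app-level check: replace with your own friends/permissions lookup.
async function isFriendAllowed(viewerId: string, ownerId: string): Promise<boolean> {
  return true; // placeholder: always allow in this sketch
}

// Called by your backend when a friend requests a photo; the object itself stays private.
async function getPhotoUrlForFriend(viewerId: string, ownerId: string, key: string) {
  if (!(await isFriendAllowed(viewerId, ownerId))) {
    throw new Error("not authorized");
  }
  return getSignedUrl(
    s3,
    new GetObjectCommand({ Bucket: "my-social-media-bucket", Key: key }), // placeholder bucket
    { expiresIn: 300 } // URL is valid for 5 minutes
  );
}
```

For uploads, the same idea applies with a presigned POST or PUT, so the content never has to pass through your servers.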
You can implement the same thing using the MinIO client, aka mc, which supports all S3-compatible services.
Libraries are also available for Go, Java, .NET, Python, and Node.js (a Node.js sketch follows below).
PS: I am a contributor to the MinIO project.
Hope it helps.
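A small sketch of the Node.js flavour, using the minio npm package against any S3-compatible endpoint; the endpoint, credentials, bucket, and object name are placeholders:

```typescript
import * as Minio from "minio";

// Works against a MinIO server or any S3-compatible endpoint.
const client = new Minio.Client({
  endPoint: "play.min.io",                  // placeholder endpoint (MinIO public playground)
  port: 9000,
  useSSL: true,
  accessKey: process.env.MINIO_ACCESS_KEY!, // placeholder env var names
  secretKey: process.env.MINIO_SECRET_KEY!,
});

async function getShortLivedUrls(bucket: string, objectName: string) {
  // Short-lived URL a user can PUT their upload to (10 minutes).
  const uploadUrl = await client.presignedPutObject(bucket, objectName, 600);
  // Short-lived URL you hand only to viewers you have authorized.
  const downloadUrl = await client.presignedGetObject(bucket, objectName, 600);
  return { uploadUrl, downloadUrl };
}
```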
I have a pretty tough problem here: I need to allow the users of my site to upload very large files to their accounts, and I want to store these files on an AWS S3 file system. I can't just write a web service to receive them and save them to S3, because all kinds of things can go wrong during the upload and I need a sophisticated uploader client, the kind of client that Amazon provides for uploading files into S3, but of course I can't give my users direct access to that. I'd seriously appreciate any ideas for this!
Thank you
Best practice would be to let your client application upload directly to S3, rather than flowing through your own web infrastructure. This leverages the massively parallel nature of S3 and offloads your web infrastructure, which lets you use fewer or smaller instances to serve the same amount of traffic, hence a lower infrastructure cost for you.
You would need to write an S3 IAM policy that limits each of your users to their own "directory" (aka key prefix) on S3.
Have a look at this blog post: http://blogs.aws.amazon.com/security/post/Tx1P2T3LFXXCNB5/Writing-IAM-policies-Grant-access-to-user-specific-folders-in-an-Amazon-S3-bucke
If your application is a web app, you can even let your customers' browsers upload directly to S3. See how to implement this securely at http://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-UsingHTTPPOST.html (a minimal sketch follows below).
Just a last note about your question: S3 is not a file system, it is an object store. Read more about the differences between object storage and file systems at http://www.infoworld.com/article/2614094/data-center/what-is-object-storage-.html
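To make the "per-user prefix" and "browser POST" ideas concrete, here is a minimal sketch: the backend issues a presigned POST policy locked to the current user's key prefix, and the browser sends the file straight to S3. The region, bucket, prefix layout, and size cap are placeholders (for multi-gigabyte files you would typically switch to multipart uploads, but the flow is the same idea):

```typescript
import { S3Client } from "@aws-sdk/client-s3";
import { createPresignedPost } from "@aws-sdk/s3-presigned-post";

const s3 = new S3Client({ region: "us-east-1" }); // placeholder region

// Backend: issue a short-lived POST policy locked to this user's "directory" (key prefix).
async function presignUpload(userId: string, fileName: string) {
  return createPresignedPost(s3, {
    Bucket: "my-user-uploads",                            // placeholder bucket
    Key: `users/${userId}/${fileName}`,
    Conditions: [
      ["starts-with", "$key", `users/${userId}/`],        // keep users inside their own prefix
      ["content-length-range", 0, 5 * 1024 * 1024 * 1024], // placeholder 5 GB cap
    ],
    Expires: 600,                                         // policy valid for 10 minutes
  });
}

// Browser: post the file directly to S3 using the returned URL and fields.
async function uploadDirect(
  file: File,
  presigned: { url: string; fields: Record<string, string> }
) {
  const form = new FormData();
  Object.entries(presigned.fields).forEach(([k, v]) => form.append(k, v));
  form.append("file", file); // the file field must come after the policy fields
  const res = await fetch(presigned.url, { method: "POST", body: form });
  if (!res.ok) throw new Error(`upload failed: ${res.status}`);
}
```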
First you will need to create users in your AWS account; next you assign roles to these users, giving them access to AWS services such as S3. Then you create identity and access policies for those roles and relax: AWS does the rest. You don't need to bother with CloudFront APIs. If you do what is outlined here, the problem should be solved.
I know this isn't a direct technical problem, but this seems like an ideal place to ask since I know other developers have experience using this service. I was about to ask this on the Amazon AWS forums but realized you need to be an AWS account holder to do that, and I don't want to sign up with them before getting the following answered:
Is Amazon S3 a CDN, or is it just an online storage service meant for personal use? Even if it isn't a CDN, are you at least allowed to serve website assets from it to a high-traffic site?
I have an adult dating site I would like to store assets for in S3. Is this type of site allowed under their ToS? What they have to say on the matter in their ToS is far too broad. Basically, this site has nude images of members, but they are all of age and the images are uploaded by the users themselves. The site is targeted only at U.S. users and is legal under U.S. law.
Amazon's S3 service can be used as a CDN if you want. Depending on the size of your site, you might want to look at CloudFront, which will allow you to have your content served from multiple edge locations. For what you're describing, S3 will be fine for your needs, but as for Amazon's rules about content, I'm not too sure.
S3 stands for Simple Storage Service.
You can use S3 to store files for private or public use.
If you want CDN features, you have to use CloudFront.
CloudFront accepts S3 as an origin and distributes the content to its CDN edge servers.
About the policies, I'm uncertain, but you can use it to store any type of data as long as you have the rights to it.