Storing a remotely hosted image on S3 directly using the Java SDK

I know I can download the image to my server and then upload it again to S3 or another cloud hosting service, but is there any way to store the image asset directly on S3 by supplying the URL of the asset instead of a file? I don't want to add an unnecessary download and upload on my server.
Note: I am confident the URI will be up 99.9% of the time and that the image file will be there. I am also open to services other than S3 if they offer such a feature.

No. There is no API call for Amazon S3 that will retrieve content from another location.
You must supply the content as part of the API call.
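If the goal is just to avoid staging a temporary file on your server, you can stream the bytes from the source URL straight into the PutObject call. A minimal sketch with the AWS SDK for Java v1 (bucket, key, and source URL are placeholders):

```java
import java.net.URL;
import java.net.URLConnection;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;

public class RemoteImageToS3 {
    public static void main(String[] args) throws Exception {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Open a stream to the remote image; nothing is staged on local disk.
        URLConnection source = new URL("https://example.com/image.png").openConnection();

        ObjectMetadata meta = new ObjectMetadata();
        meta.setContentLength(source.getContentLengthLong()); // known length avoids SDK buffering
        meta.setContentType(source.getContentType());

        s3.putObject(new PutObjectRequest(
                "my-bucket", "images/image.png", source.getInputStream(), meta));
    }
}
```

The bytes still transit your machine, but only as an in-flight stream; nothing is written to local disk.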

Related

How does Twitter do multipart file upload from the client app to its storage?

I am not speaking from the perspective of a user uploading a file to Twitter. Rather, I want to build a Twitter-like app and upload a file from my client app -> to my server -> to my S3 storage.
My current approach is to break the file into chunks and upload them to my web server; from the web server, I then upload to S3 storage.
The part I am unsure about is how to hold the file on the web server before it goes to S3. Currently I keep it in memory and, once complete, send it to S3, which consumes too much memory. The alternative is to store the file locally on the web server and then upload it to S3. Is this how a high-traffic site usually does it?
This is an architecture question: how is this done for a high-traffic website like Twitter?
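There is no single canonical answer, but one common pattern is to avoid staging the whole file on the web server at all: as each chunk arrives from the client, relay it to S3 as one part of a multipart upload, so the server holds at most one chunk in memory at a time. A sketch with the AWS SDK for Java v1 (bucket, key, and chunk handling are illustrative; S3 requires every part except the last to be at least 5 MB):

```java
import java.io.ByteArrayInputStream;
import java.util.ArrayList;
import java.util.List;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.CompleteMultipartUploadRequest;
import com.amazonaws.services.s3.model.InitiateMultipartUploadRequest;
import com.amazonaws.services.s3.model.PartETag;
import com.amazonaws.services.s3.model.UploadPartRequest;

public class ChunkRelay {
    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
    private final List<PartETag> etags = new ArrayList<>();
    private final String bucket, key;
    private final String uploadId;
    private int partNumber = 1;

    public ChunkRelay(String bucket, String key) {
        this.bucket = bucket;
        this.key = key;
        this.uploadId = s3.initiateMultipartUpload(
                new InitiateMultipartUploadRequest(bucket, key)).getUploadId();
    }

    // Called once per chunk received from the client; the chunk is relayed
    // immediately, so memory use is bounded by the chunk size.
    public void onChunk(byte[] chunk) {
        etags.add(s3.uploadPart(new UploadPartRequest()
                .withBucketName(bucket).withKey(key)
                .withUploadId(uploadId)
                .withPartNumber(partNumber++)
                .withInputStream(new ByteArrayInputStream(chunk))
                .withPartSize(chunk.length)).getPartETag());
    }

    // Called when the client signals that the last chunk has been sent.
    public void finish() {
        s3.completeMultipartUpload(
                new CompleteMultipartUploadRequest(bucket, key, uploadId, etags));
    }
}
```

This keeps memory bounded by the chunk size and needs no local disk; if the client disappears mid-upload, abort the multipart upload so the orphaned parts don't keep accruing storage charges.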

Google cloud storage compatibility with aws s3 multipart upload

I have working apps that use Amazon S3 multipart upload: they call CreateMultipartUpload, UploadPart, and CompleteMultipartUpload.
Now we are migrating to Google Cloud Storage, and we have a problem with multipart. As far as I understand, Google doesn't support the S3 multipart API; see Google Cloud Storage support of S3 multipart upload.
The closest Google method seems to be Compose (https://cloud.google.com/storage/docs/composite-objects), where I upload separate objects and then send a request to combine them. There is also uploadType=multipart (https://cloud.google.com/storage/docs/json_api/v1/how-tos/upload#resumable), but this seems to be completely different from S3 multipart. And there is resumable upload (https://cloud.google.com/storage/docs/resumable-uploads), which seems to allow uploading a file in chunks, but without a complete-multipart step.
What is the best option to use? Some services already call CreateMultipartUpload, UploadPart, and CompleteMultipartUpload, and I need to write an "adapter" for these services to make them compatible with Google Cloud Storage.
Update: the answer below is no longer correct. GCS now supports multipart uploads: https://cloud.google.com/storage/docs/xml-api/post-object-multipart
You are correct. Google Cloud Storage does not currently support multipart upload.
The main benefits of multipart upload are allowing multiple streams to upload in parallel from one or more machines and allowing a partial upload failure not to ruin the whole upload. The best way to get those same benefits with GCS is to upload the parts as separate objects and then use Compose to combine them into a final object. Indeed, this is exactly what the gsutil command-line utility does when uploading in parallel.
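For example, with the google-cloud-storage Java client, the CompleteMultipartUpload step of such an adapter could map onto compose roughly like this (bucket and object names are placeholders; the parts were uploaded earlier as ordinary objects):

```java
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

public class ComposeParts {
    public static void main(String[] args) {
        Storage storage = StorageOptions.getDefaultInstance().getService();

        // Each "part" was uploaded earlier as an ordinary object in the same bucket.
        Storage.ComposeRequest request = Storage.ComposeRequest.newBuilder()
                .addSource("video.mp4.part1")
                .addSource("video.mp4.part2")
                .addSource("video.mp4.part3")
                .setTarget(BlobInfo.newBuilder("my-bucket", "video.mp4").build())
                .build();

        storage.compose(request); // combines up to 32 source objects per call
    }
}
```

Because a single compose call accepts at most 32 sources, an adapter for uploads with more parts would need to compose in stages.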
Resumable uploads are a great tool if you want to upload a single object in a single stream, in order, and you want the ability to resume if the connection is lost.
"uploadtype=multipart" uploads are a bit different. They are a way to specify an object's complete metadata and also its data in a single upload operation, using an HTTP multipart request.

Migrate videos from Vimeo to S3

I have a large quantity of videos on my Vimeo account that I would like to migrate to my AWS S3 account.
Rather than go through the time-consuming process of downloading from Vimeo to my local machine and then uploading from my local machine to S3, is there a way to do a direct transfer from Vimeo to S3?
If possible, I would want to create a script to iterate through each video via Vimeo API and set up the path to where it would go into S3 then initiate a direct transfer. Any ideas or suggestions would be much appreciated!
If you have a PRO account or higher, you can use the API to get download links for the videos on your account, including links for the original source file. Those download links can then be used to import the files into S3. Note that the links provided by the Vimeo API are expiring HTTP 302 redirects to the video file resource, so take note of the expiration time also provided in the response.
Download links are returned with the rest of a video's metadata, so I suggest using the fields parameter to only return the metadata needed.
http://developer.vimeo.com/api/common-formats#json-filter
https://developer.vimeo.com/api/reference/videos#GET/users/{user_id}/videos
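A rough sketch of that script in Java (AWS SDK v1 plus java.net.http; the access token, field list, bucket, and key are placeholders, and JSON parsing is left to whichever library you already use):

```java
import java.net.URI;
import java.net.URL;
import java.net.URLConnection;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectMetadata;

public class VimeoToS3 {
    public static void main(String[] args) throws Exception {
        // 1. List the account's videos, asking only for the fields we need.
        HttpClient http = HttpClient.newHttpClient();
        HttpRequest list = HttpRequest.newBuilder()
                .uri(URI.create("https://api.vimeo.com/me/videos?fields=uri,download"))
                .header("Authorization", "Bearer VIMEO_ACCESS_TOKEN") // placeholder token
                .build();
        String json = http.send(list, HttpResponse.BodyHandlers.ofString()).body();

        // 2. Parse `json` with your JSON library of choice and pull each
        //    download[].link (parsing omitted here); remember the links expire.
        String downloadLink = "..."; // one link extracted from the response

        // 3. Stream the source file straight from Vimeo into S3.
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        URLConnection src = new URL(downloadLink).openConnection();
        ObjectMetadata meta = new ObjectMetadata();
        meta.setContentLength(src.getContentLengthLong());
        s3.putObject("my-video-bucket", "vimeo/video.mp4", src.getInputStream(), meta);
    }
}
```

Because the download links are expiring redirects, fetch each one shortly before its transfer rather than collecting them all up front.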

How to use medias stored in AWS with Hybris

In a Hybris project, I want to use media that are stored in AWS S3. How can I add these media by their URLs?
Normally I could do it with an AWS storage policy by uploading the media, but I don't want to upload the images; I want to add media that link to the AWS URL.
You can use the OOTB Hybris amazoncloud extension.
This extension provides MediaURLStrategy and MediaStorageStrategy implementations for Amazon S3.
If you have access to the Hybris Wiki, you can get more details here.

Amazon S3 Images

I'm attempting to build a Rails application that uses Amazon's S3 service. I'm able to upload images to my bucket, but when I try to display them in a browser window, the image is downloaded to my computer instead of being shown in the browser, which is what I want.
https://s3.amazonaws.com/skateparks/b_72e3d8d31fc4263f40b6.png
Set the Content-Type header when you upload the image to S3. Without it, S3 serves the object as binary/octet-stream, and browsers download it instead of rendering it inline.
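The question's app is Rails, but here is a minimal sketch of the idea with the AWS SDK for Java v1 (the local file path is a placeholder); the equivalent content-type option exists in the Ruby SDK's upload call:

```java
import java.io.File;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;

public class UploadWithContentType {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        ObjectMetadata meta = new ObjectMetadata();
        meta.setContentType("image/png"); // browsers will render instead of download

        s3.putObject(new PutObjectRequest(
                "skateparks", "b_72e3d8d31fc4263f40b6.png",
                new File("b_72e3d8d31fc4263f40b6.png")) // placeholder local path
                .withMetadata(meta));
    }
}
```

For objects already in the bucket, you can copy each object onto itself with replaced metadata to fix the header without re-uploading the bytes.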