In a Hybris project, I want to use media that are stored in AWS S3. How can I add these media items with their URLs?
Normally I could do this by uploading the media with an AWS storage policy, but I don't want to upload the images; I want to add media that simply link to their AWS URLs.
You can use the OOTB Hybris amazoncloud extension.
This extension provides MediaURLStrategy and MediaStorageStrategy implementations for Amazon S3.
If you have access to the Hybris Wiki, you can get more details there.
I upload the files by following https://www.youtube.com/watch?v=9x5LGaL2W7E, but I can't find any reference videos or links on how to view the files in a bucket using an access key and secret key rather than a user ID and password. I am specifically looking to build this in Vue.js (Vue 2).
Please point me in the right direction.
You could achieve that programmatically, but the simplest solution is probably to use the AWS CLI and run something like aws s3 ls on your bucket.
Here is the reference: https://docs.aws.amazon.com/cli/latest/reference/s3/
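If you do want to list the bucket contents from your own code with just an access key and secret key, a minimal sketch using the AWS SDK for JavaScript v3 could look like this (the region, bucket name, and environment variable names are placeholders):

```typescript
import { S3Client, ListObjectsV2Command } from "@aws-sdk/client-s3";

// Placeholder region and credentials -- replace with your own values.
const s3 = new S3Client({
  region: "us-east-1",
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID ?? "",
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY ?? "",
  },
});

// List up to 1000 objects in the bucket, similar to `aws s3 ls s3://my-bucket`.
async function listBucket(bucket: string): Promise<void> {
  const response = await s3.send(new ListObjectsV2Command({ Bucket: bucket }));
  for (const object of response.Contents ?? []) {
    console.log(object.Key, object.Size);
  }
}

listBucket("my-bucket").catch(console.error);
```

Keep in mind that embedding a secret key in a Vue frontend exposes it to every visitor, so a call like this normally belongs in a backend service or should use temporary credentials.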
I have working apps that use Amazon S3 multipart upload; they call CreateMultipartUpload, UploadPart, and CompleteMultipartUpload.
Now we are migrating to Google Cloud Storage and we have a problem with multipart uploads. As far as I understand, Google doesn't support the S3 multipart API; I got that from Google Cloud Storage support of S3 multipart upload.
The closest method I see from Google is Compose (https://cloud.google.com/storage/docs/composite-objects), where I upload separate objects and then send a request to combine them. I could also use uploadType=multipart (https://cloud.google.com/storage/docs/json_api/v1/how-tos/upload#resumable), but that seems to be something completely different from S3 multipart. And there are resumable uploads (https://cloud.google.com/storage/docs/resumable-uploads), which seem to allow uploading a file in chunks, but without a CompleteMultipartUpload step.
What is the best option to use? Some services already use CreateMultipartUpload, UploadPart, and CompleteMultipartUpload, and I need to write an "adapter" for these services so that they become compatible with Google Cloud Storage.
Update: the answer below is no longer correct. GCS now supports multipart uploads: https://cloud.google.com/storage/docs/xml-api/post-object-multipart
You are correct. Google Cloud Storage does not currently support multipart upload.
The main benefits of multipart upload are allowing multiple streams to upload in parallel from one or more machines and allowing a partial upload failure not to ruin the whole upload. The best way to get those same benefits with GCS is to upload the parts as separate objects and then use Compose to combine them into a final object. Indeed, this is exactly what the gsutil command-line utility does when uploading in parallel.
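As an illustration, a rough sketch of that part-then-compose approach with the @google-cloud/storage Node client might look like the following (the bucket name and part naming scheme are made up for the example):

```typescript
import { Storage } from "@google-cloud/storage";

const storage = new Storage();
const bucket = storage.bucket("my-bucket"); // placeholder bucket name

// Upload each part as its own object (these could run in parallel),
// then combine them into the final object with Compose.
async function composeUpload(partPaths: string[], destination: string): Promise<void> {
  const partNames = await Promise.all(
    partPaths.map(async (path, index) => {
      const name = `${destination}.part-${index}`; // illustrative part naming
      await bucket.upload(path, { destination: name });
      return name;
    })
  );

  // Compose the uploaded parts into the final object.
  await bucket.combine(partNames, destination);

  // Clean up the temporary part objects.
  await Promise.all(partNames.map((name) => bucket.file(name).delete()));
}

composeUpload(["./part0.bin", "./part1.bin"], "final-object.bin").catch(console.error);
```

Note that a single compose call accepts at most 32 source objects, so a very large upload may need to be composed in stages.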
Resumable uploads are a great tool if you want to upload a single object in a single stream, in order, and you want the ability to resume if the connection is lost.
"uploadtype=multipart" uploads are a bit different. They are a way to specify an object's complete metadata and also its data in a single upload operation, using an HTTP multipart request.
I have a large quantity of videos on my Vimeo account that I would like to migrate to my AWS S3 account.
Rather than go through the time-consuming process of downloading from Vimeo to my local machine and then uploading from my local machine to S3, is there a way I can do a direct transfer from Vimeo to S3?
If possible, I would like to create a script that iterates through each video via the Vimeo API, sets up the path it should go to in S3, and then initiates a direct transfer. Any ideas or suggestions would be much appreciated!
If you have a PRO account or higher, you can use the API to get download links for videos on your account, including download links for the original source file. Those download links can then be used to import the videos into S3. Note that the links provided via the Vimeo API are expiring HTTP 302 redirects to the video file resource, so make sure you also take note of the expiration time provided in the response.
Download links are returned with the rest of a video's metadata, so I suggest using the fields parameter to only return the metadata needed.
http://developer.vimeo.com/api/common-formats#json-filter
https://developer.vimeo.com/api/reference/videos#GET/users/{user_id}/videos
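To sketch what such a script could look like, the snippet below assumes a PRO-level Vimeo access token and streams each file from its expiring download link straight into S3, so nothing is written to local disk. The token variable, bucket, key scheme, and page size are illustrative assumptions:

```typescript
import { Readable } from "node:stream";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

const VIMEO_TOKEN = process.env.VIMEO_TOKEN ?? ""; // assumed PRO-account access token
const s3 = new S3Client({ region: "us-east-1" });  // placeholder region

async function migrateMyVideos(bucket: string): Promise<void> {
  // Ask only for the fields we need, as suggested above.
  const res = await fetch(
    "https://api.vimeo.com/me/videos?fields=name,download&per_page=25",
    { headers: { Authorization: `Bearer ${VIMEO_TOKEN}` } }
  );
  const page = await res.json();

  for (const video of page.data ?? []) {
    // Pick the largest available rendition; "download" is only present on PRO+ accounts.
    const file = (video.download ?? []).sort((a: any, b: any) => b.size - a.size)[0];
    if (!file) continue;

    // The link is an expiring redirect to the actual file, so fetch it right away.
    const download = await fetch(file.link);
    const upload = new Upload({
      client: s3,
      params: {
        Bucket: bucket,
        Key: `vimeo/${video.name}.mp4`, // placeholder key scheme
        Body: Readable.fromWeb(download.body as any),
      },
    });
    await upload.done();
    console.log(`Copied ${video.name}`);
  }
}

migrateMyVideos("my-archive-bucket").catch(console.error);
```

This only walks the first page of results; a real migration script would follow the paging links returned by the API and derive the file extension from the download entry instead of hard-coding .mp4.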
I know I can download the image to my server and then upload it again to S3 or any other cloud hosting service, but is there any way to store the image asset directly on S3 by supplying the URL of the asset instead of a file? I don't want to add an unnecessary download and upload on my server.
Note: I am confident the URI will be up 99.9% of the time and that the image file will be there. I am also OK with using services other than S3 if they have such a feature.
No. There is no API call for Amazon S3 that will retrieve content from another location.
You must supply the content as part of the API call.
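In practice that means something on your side fetches the bytes and passes them to S3 in the request. A minimal sketch of that relay with the AWS SDK for JavaScript v3 (the bucket, key, and source URL are placeholders) could be:

```typescript
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" }); // placeholder region

// S3 will not fetch the URL for you: your code downloads the bytes
// and supplies them as the Body of the PutObject call.
async function copyUrlToS3(sourceUrl: string, bucket: string, key: string): Promise<void> {
  const response = await fetch(sourceUrl);
  const body = Buffer.from(await response.arrayBuffer());

  await s3.send(
    new PutObjectCommand({
      Bucket: bucket,
      Key: key,
      Body: body,
      ContentType: response.headers.get("content-type") ?? undefined,
    })
  );
}

copyUrlToS3("https://example.com/image.jpg", "my-bucket", "images/image.jpg").catch(console.error);
```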
I'm evaluating potential content management systems I want to use for a project. Many of the users will need to upload static files and include links to them in their posts.
In the Admin UI I can only see the ability to upload an image in a post. Does anyone know if it is possible to upload files to Keystone through the Admin UI?
You could use their Amazon S3 storage adapter. Depending on which version of Keystone you're using (3 or 4), the setup differs slightly. Either way, you need to create credentials for Amazon S3 and configure Keystone to work with them. From there, you can use Types.S3File to make part of your MongoDB model a reference to an S3 object. See this page for more info on the S3File type in Keystone.
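For Keystone 4, a rough sketch of a list that stores uploads in S3 via the keystone-storage-adapter-s3 package might look like the following. The bucket and credential environment variable names are placeholders, and the exact options can differ between Keystone versions, so treat this as an outline rather than a drop-in config:

```typescript
const keystone = require("keystone");
const Types = keystone.Field.Types;

// S3-backed storage; credentials and bucket are placeholders read from the environment.
const storage = new keystone.Storage({
  adapter: require("keystone-storage-adapter-s3"),
  s3: {
    key: process.env.S3_KEY,
    secret: process.env.S3_SECRET,
    bucket: process.env.S3_BUCKET,
    region: process.env.S3_REGION,
    path: "/uploads",
  },
  schema: { url: true }, // also store the public URL on the document
});

// A list whose "file" field uploads to S3 and can be managed from the Admin UI.
const Upload = new keystone.List("Upload");
Upload.add({
  name: { type: String },
  file: { type: Types.File, storage: storage },
});
Upload.register();
```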