Upload files from SharePoint Online to AWS S3 bucket

When any file or folder is created in a document library in SharePoint Online, that file or folder needs to be uploaded to an AWS S3 bucket with the help of Power Automate.
Also, if any attachment is uploaded to a SharePoint Online list, that attachment should be uploaded to the AWS S3 bucket as well.

I have created the flow below with Power Automate/MS Flow, triggered when a file is created or modified in SharePoint.

Related

How to create an interaction between Google Drive and AWS S3?

I'm trying to set up a connection between a Google Drive folder and an S3 bucket, but I'm not sure where to start.
I've already created a sort of "Frankenstein process", but only I can use it easily, and sharing it with my co-workers is a pain.
I have a script that generates a plain text file and saves it into a Drive folder. To upload, I installed Drive File Stream to sync it to my Mac, then all I did was create a Python 3 script, using the boto3 library, that uploads the text file to different S3 buckets depending on the file name.
I was thinking that I could create a Lambda to process the file into the S3 buckets, but I cannot work out how to create the connection between Drive and S3. I would appreciate it if someone could give me some advice on how to start with this.
Thanks
If you simply want to connect Google Drive and AWS S3, there is a service named Zapier which provides different types of integrations without writing a line of code:
https://zapier.com/apps/amazon-s3/integrations/google-drive
For more details you can check this link out.

How can I export all contents to my local drive from Cloudinary?

Is there any method to download all contents in Cloudinary as one zip file, or to download all contents using a plugin?
I have multiple folders and subfolders containing images in Cloudinary.
Bulk downloading images from your Cloudinary account can currently be done in the following ways:
Using the Admin API: listing all resources and extracting their URLs for download.
Downloading as a zip file using the ZIP generation API.
Backing up to a private S3 bucket. Please see - http://support.cloudinary.com/hc/en-us/articles/203744391-How-do-I-grant-Cloudinary-with-the-permissions-to-backup-on-my-private-S3-bucket-
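For the Admin API option, a rough sketch of paging through all image resources and collecting their URLs for download — it assumes Cloudinary's standard Admin API endpoint with basic auth, and the cloud name and credentials are placeholders:

```python
import requests

CLOUD_NAME = "my-cloud"                 # placeholder
API_KEY, API_SECRET = "key", "secret"   # placeholders

def extract_urls(page):
    """Pull the downloadable URLs out of one Admin API response page."""
    return [r["secure_url"] for r in page.get("resources", [])]

def list_all_image_urls():
    """Follow next_cursor until every image resource has been listed."""
    urls, cursor = [], None
    while True:
        params = {"max_results": 500}
        if cursor:
            params["next_cursor"] = cursor
        page = requests.get(
            f"https://api.cloudinary.com/v1_1/{CLOUD_NAME}/resources/image",
            params=params,
            auth=(API_KEY, API_SECRET),
        ).json()
        urls += extract_urls(page)
        cursor = page.get("next_cursor")
        if not cursor:
            return urls
```

Each URL can then be fetched with any HTTP client; the folder structure is encoded in each resource's public ID, so it can be recreated locally.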

Can I upload files directly from S3 to LightSail without having to download them locally?

I'd like to write a script that can take a list of urls for some files on S3 and upload them directly to a LightSail instance. I know I can download the files from S3 and use sftp to upload to LightSail, but I'm hoping there is a way that I can trigger the file transfer directly from S3 to LightSail. Has anyone done this before?

AWS S3 auto save folder

Is there a way I can auto-save AutoCAD files, or changes to the AutoCAD files, directly to an S3 bucket? Is there an API I can utilize for this workflow?
While I was not able to quickly find a plug-in that does that for you, what you can do is one of the following:
Mount the S3 bucket as a drive. You can read more at CloudBerry Drive - Mount S3 bucket as Windows drive.
This might create some performance issues with AutoCAD.
Sync saved files to S3.
You can set a script to run every n minutes that automatically syncs your files to S3 using aws s3 sync. You can read more about AWS S3 Sync here. Your command might look something like
aws s3 sync /path/to/cad/files s3://bucket-with-cad/project/test
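If you would rather not set up a scheduled task, the same command can be driven from a small Python loop; the paths mirror the example above, and the interval is an arbitrary choice:

```python
import subprocess
import time

SOURCE = "/path/to/cad/files"
DEST = "s3://bucket-with-cad/project/test"
INTERVAL_SECONDS = 300  # "every n minutes" -- 5 here, arbitrarily

def build_sync_cmd(src, dest):
    """Assemble the aws s3 sync invocation from the answer."""
    return ["aws", "s3", "sync", src, dest]

def sync_forever():
    """Run the sync on a fixed interval until interrupted."""
    while True:
        subprocess.run(build_sync_cmd(SOURCE, DEST), check=True)
        time.sleep(INTERVAL_SECONDS)
```

Because aws s3 sync only copies new or changed files, re-running it frequently is cheap.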

Unable to open file from WinSCP (EC2) which was mounted to AWS S3

I have mounted an S3 bucket on my EC2 instance using the details mentioned here: https://winscp.net/eng/docs/guide_amazon_s3_sftp. I also updated the bucket policy.
From WinSCP, I can see the list of files, add new files, delete files uploaded via the S3 web console, and open files uploaded by me via WinSCP.
However, I cannot open a file from WinSCP that I uploaded via the S3 web console. I get the following error:
Permission denied to read s3 object