Amazon S3 server-side bucket transfer - amazon-s3

I'm trying to transfer buckets between Amazon S3 accounts. I see there is s3cmd for Unix and CloudBerry Explorer for Windows. I have tested both, but I'm not sure whether the transfer (both accounts are in the same region) is server-side or client-side. Can this be done server-side?
s3cmd
CloudBerry Explorer
From this question: Best way to move files between S3 buckets?
I am also checking this: http://docs.aws.amazon.com/AmazonS3/latest/dev/CopyingObjectUsingRuby.html, I like Ruby :)
Edit: In CloudBerry Explorer I have also checked the server-side transfer option, and I am using the sync option, BUT I am still not sure whether this is entirely client-side.

Yes, if these tools use the PUT Object - Copy operation, the file duplication is done entirely server-side; the data never passes through your machine.
You can read up on the details in the API docs.
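For reference, here is a minimal sketch of such a server-side copy using boto3 (the AWS SDK for Python); the bucket and key names are placeholders, and it assumes the credentials making the call have read access to the source object and write access to the destination (e.g. via a bucket policy when copying across accounts):

    import boto3

    # Credentials are resolved from the environment or ~/.aws/credentials.
    # For a cross-account copy, they need read access to the source object
    # and write access to the destination bucket.
    s3 = boto3.client("s3")

    # PUT Object - Copy: S3 duplicates the object internally; the data never
    # passes through the machine issuing this request.
    s3.copy_object(
        CopySource={"Bucket": "source-bucket", "Key": "path/to/object"},
        Bucket="destination-bucket",
        Key="path/to/object",
    )

Note that a single PUT Object - Copy request handles objects up to 5 GB; larger objects have to be copied with the multipart Upload Part - Copy operation.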

Related

Password protected storage server which can be accessed programmatically (like Amazon S3)

We want to store our files somewhere on a storage server, and some of them need to be password protected. S3 is a very good option since:
it can be password protected, and
we can access it programmatically (for example, uploading or downloading files from Java).
Although the storage itself is cheap, the download/upload pricing on S3 is not. So we are looking for alternatives. One option is to use our own servers. Is there any way to simulate similar behavior with a personal server?
You can consider using this: https://www.minio.io/
It's an object storage server that is compatible with Amazon S3
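Because MinIO implements the S3 API, existing S3 client code can usually be pointed at it just by overriding the endpoint. A minimal sketch with boto3, assuming a MinIO server running locally on port 9000 with placeholder credentials (the same endpoint-override idea applies to the Java SDK mentioned in the question):

    import boto3

    # An ordinary S3 client pointed at a self-hosted MinIO endpoint.
    # The endpoint URL and credentials are placeholders for a local test setup.
    s3 = boto3.client(
        "s3",
        endpoint_url="http://localhost:9000",
        aws_access_key_id="minio-access-key",
        aws_secret_access_key="minio-secret-key",
    )

    s3.create_bucket(Bucket="my-files")
    s3.upload_file("report.pdf", "my-files", "report.pdf")

Access control then lives on your own server (MinIO credentials and policies) rather than in AWS, and you avoid S3's per-GB transfer pricing.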

Send files to S3 using Tumbleweed Secure File Transport

My org uses Tumbleweed Secure File Transport to transfer files to different locations.
I have a requirement to move files to S3 but am not sure whether this is possible using Tumbleweed.
The way my org currently does it is to SFTP the files across to an EC2 instance, which then transfers them to S3.
Does anyone know if Tumbleweed can send files directly to S3?
Thanks in advance.
Uploading to S3 is secure, and there are numerous tools that implement the API: the AWS CLI, the SDKs, and third-party applications. Even if Tumbleweed doesn't support it, you can find a tool that does.
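As a concrete example, a minimal upload sketch with boto3; the bucket name and file path are placeholders, and credentials are assumed to come from the environment or an attached IAM role:

    import boto3

    # Credentials are picked up from the environment, ~/.aws/credentials,
    # or the instance's IAM role; the transfer itself runs over HTTPS.
    s3 = boto3.client("s3")
    s3.upload_file("outbound/report.csv", "my-intake-bucket", "incoming/report.csv")

If Tumbleweed can only hand files off over SFTP, the EC2 relay you already use is a reasonable bridge; a small script like the one above running on that instance is all the S3 side needs.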

Secure way to access Amazon S3

I have been using the Transmit FTP program to access my Amazon S3 storage buckets. I've just been reading on here that this isn't very secure.
I'm not a command-line person, as you can probably tell, so what would be the best way for me to access my S3 storage on my Mac?
I'm using it to store image files that I am making available for download on my website.
Thanks
FTP isn't secure, but it sounds like you are confusing this fact with the fact that you are using a multiprotocol client to access S3, and that client also happens to support FTP.
S3 cannot be directly accessed using the FTP protocol, so the client you are using can't actually be accessing S3 using FTP... hence, your security-related concern appears to be unfounded.
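For what it's worth, when a client like Transmit talks to S3 it is issuing signed HTTPS requests against the S3 API rather than FTP commands. A rough equivalent of listing a bucket directly against that API (a boto3 sketch; the bucket name and prefix are placeholders):

    import boto3

    # boto3 talks to S3's HTTPS endpoints by default; nothing here is FTP.
    s3 = boto3.client("s3")

    response = s3.list_objects_v2(Bucket="my-image-bucket", Prefix="images/")
    for obj in response.get("Contents", []):
        print(obj["Key"], obj["Size"])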

Is it possible to use Amazon S3 for folder in a .net site

Is it possible to use Amazon Simple Storage Service (S3) for folders & files on a .net site?
Background:
I have 200 websites and I would like to have a single common code base. Right now they are on a single dedicated server. I plan to move them to an EC2 instance.
As you can see, some of the folders & files are not on S3 and some are.
Admin Panel - a folder that requires authentication - is this an issue?
/Bin/ - contains DLLs - is this an issue?
EC2 is a normal Windows Server, like your current dedicated server. You remote desktop into it, install whatever you need, set up IIS, etc.
S3, on the other hand, is just a storage service. Think of it like a big NAS device. You can use it to serve your static content (possibly in conjunction with CloudFront), but the actual website (DLLs, .aspx pages, etc.) will have to be on EC2 in IIS.
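A rough sketch of the "static content on S3" half — uploading an asset with a public-read ACL and an explicit content type so it can be served straight from the bucket (bucket name and paths are placeholders, and the bucket must be configured to allow public ACLs, which newer buckets block by default):

    import boto3

    s3 = boto3.client("s3")

    # Push a static asset to S3 so the EC2/IIS site only serves the dynamic pages.
    # Bucket name and paths are placeholders.
    s3.upload_file(
        "wwwroot/css/site.css",
        "my-static-assets-bucket",
        "css/site.css",
        ExtraArgs={"ACL": "public-read", "ContentType": "text/css"},
    )

    # The object would then be reachable at a URL along the lines of:
    # https://my-static-assets-bucket.s3.amazonaws.com/css/site.css

The application folders you listed (the admin panel, /Bin/ with its DLLs) are exactly the parts that stay on EC2 under IIS; only the static files move to S3.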

404 redirect with cloud storage

I'm hoping to reach someone with experience using a service like Amazon's S3 with this question. On my site we have a dedicated image server, and on this server we have an automatic 404 redirect through Apache so that, if a user tries to access an image that doesn't exist, they'll see a snazzy "Image Not Available" image.
We're looking to move the hosting of these images to a cloud storage solution (S3 or Rackspace's Cloud Files), and I'm wondering if anyone's had any success replicating this behavior on a cloud storage service and, if so, how they did it.
The Amazon EC2 instances are just like normal hosted server instances once they are up and running, so your Apache configuration could presumably be identical to what you currently have.
Your only issue will be where to store the images. The new Amazon Elastic Block Store makes it easy to attach a persistent volume to an instance (its snapshots are stored in S3). You could store all your images on such a volume and serve them from your Apache instance.