Can't access personal S3 server with boto3

I have a private installation of a server which is fully S3-compatible. I have one bucket there, and I can browse it using the S3 Browser client. I am trying to interact with the server using boto3 for Python (with the same credentials I use in S3 Browser); however, every request fails with a NoSuchBucket error. This is my code:
import boto3

s3 = boto3.resource(
    's3',
    endpoint_url=hostname,
    use_ssl=False,
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key,
)

for bucket in s3.buckets.all():
    print(bucket.name)
Initially I thought there was an issue with the credentials, but then I was able to interact with the server through the S3 Browser client using the same ones.
So the problem is: I really don't understand the error code, since I am not querying any particular bucket. What could be the cause of the problem?

Problem solved! It was a DNS resolution issue.
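For reference, a common DNS pitfall with custom endpoints: boto3 defaults to virtual-hosted-style addressing, which prepends the bucket name to the endpoint's hostname (bucket.myserver.local), and that subdomain often does not resolve for private installations. A minimal sketch that forces path-style addressing instead, reusing the hostname/credential variables from the question:

import boto3
from botocore.config import Config

# Force path-style requests (http://hostname/bucket/key) so that no
# per-bucket subdomain needs to resolve in DNS.
s3 = boto3.resource(
    's3',
    endpoint_url=hostname,
    use_ssl=False,
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key,
    config=Config(s3={'addressing_style': 'path'}),
)

for bucket in s3.buckets.all():
    print(bucket.name)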

Related

Copying folders from S3 to an Azure Storage Blob and receiving a "cannot list objects, access is denied" error. Has anyone else had this and resolved it?

I've confirmed that my S3 credentials are set correctly and even tried granting full permissions on the bucket, but I still receive the same message. On the Azure side, I've added the Storage Blob Data Owner role to my account and can list files I manually upload through the portal with the credentials I used (signing in through AAD rather than using a token).
Any help is appreciated here.
As it turned out, there were no issues with either the Azure or AWS configuration. Trying the same commands in a Linux VM worked perfectly. If you're experiencing this error, it may be an issue with the AWS CLI or some other local configuration.

S3 to FTP server without downloading to EC2

Is there a way (programmatically) to transfer a file from an S3 bucket to an external FTP server without downloading it to an EC2 instance?
More details:
1) I have a Django server running on EC2 which serves an Angular web app.
2) A user uploads a file to an S3 bucket using my web app, and once the upload is complete the web app sends a POST request containing the file object's S3 URL.
3) The Django server, upon receiving the POST request, may need to copy the file (uploaded to S3) to an external FTP server. The target FTP server may be different depending on the user who uploaded the file (each user group may have its own FTP server).
4) I understand that upon receiving the POST request, the Django server can download the file from S3 and then upload it to the appropriate target FTP server.
My question is: can I reduce the overhead on my EC2 instance in step 4 by somehow initiating a transfer from S3 to the target FTP server and getting a callback/notification when that transfer completes (success or error)?
Thanks.
You can create a Lambda function to do this.
A complete reference implementation is discussed here:
https://pythonvibes.wordpress.com/2016/12/09/ftp-and-sftp-through-lambda/
Hope it helps.
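To illustrate the idea, here is a minimal sketch of such a Lambda handler that streams the object from S3 to the FTP server without writing it to disk. The event shape and the FTP_USER/FTP_PASS environment variables are assumptions for this example, not something taken from the linked article:

import os
from ftplib import FTP

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # Assumed event shape: {"bucket": ..., "key": ..., "ftp_host": ...}
    bucket = event['bucket']
    key = event['key']

    # get_object returns a StreamingBody, which exposes read() and can
    # therefore be handed straight to ftplib without a temp file.
    body = s3.get_object(Bucket=bucket, Key=key)['Body']

    ftp = FTP(event['ftp_host'])
    ftp.login(os.environ['FTP_USER'], os.environ['FTP_PASS'])
    try:
        ftp.storbinary('STOR ' + key.rsplit('/', 1)[-1], body)
    finally:
        ftp.quit()

    return {'status': 'transferred', 'bucket': bucket, 'key': key}

The Lambda's success or failure result (or a follow-up S3/SNS notification) can then serve as the completion callback the question asks about.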

Amazon S3 suddenly stopped working with EC2 but works from localhost

Creating folders and uploading files to my S3 bucket stopped working:
The remote server returned an error: (403) Forbidden.
Everything worked previously, and I did not change anything recently.
After days of testing, I see that I am able to create folders in my bucket from localhost, but the same code doesn't work on the EC2 instance.
I must resolve the issue ASAP.
Thanks
Does your EC2 instance have an IAM role? If yes, what is that role? Is it possible that someone detached or modified a policy that was attached to it?
If your instance doesn't have a role, how do you upload files to S3? Using the AWS CLI tools? The same questions apply for the IAM profile used.
If you did not change anything: are you using the same IAM credentials from the server and from localhost? This may be related.
Just random thoughts...
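One quick way to check which identity the code is actually running as, both on localhost and on the instance, is to ask STS; if the two environments print different ARNs, you have found your difference. A minimal sketch:

import boto3

# Prints the account and user/role ARN of whatever credentials boto3
# resolves in this environment (env vars, ~/.aws/credentials, or the
# EC2 instance role).
identity = boto3.client('sts').get_caller_identity()
print(identity['Account'])
print(identity['Arn'])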

How to upload files directly to Amazon S3 from a remote server?

Is it possible to upload a file to S3 from a remote server?
The remote server is basically a URL-based file server. For example, http://example.com/1.jpg serves the image. It doesn't do anything else, and I can't run code on this server.
Is it possible to have another server tell S3 to fetch the file from http://example.com/1.jpg, like this?
upload from http://example.com/1.jpg
server -------------------------------------------> S3 <-----> example.com
If you can't run code on the server or execute requests from it, then no, you can't do this. You will have to download the file to a server or computer that you own and upload it from there.
You can see the operations you can perform on Amazon S3 at http://docs.amazonwebservices.com/AmazonS3/latest/API/APIRest.html
Checking the operations for both the REST and SOAP APIs, you'll see there's no way to give Amazon S3 a remote URL and have it grab the object for you. All of the PUT requests require the object's data to be provided as part of the request, meaning the server or computer initiating the request needs to have the data.
I had a similar problem in the past where I wanted to download my users' Facebook thumbnails and upload them to S3 for use on my site. The way I did it was to download the image from Facebook into memory on my server and then upload it to Amazon S3; the whole thing took under two seconds. After the upload to S3 was complete, I wrote the bucket/key to a database.
Unfortunately there's no other way to do it.
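As a minimal sketch of that download-into-memory-then-upload approach (the requests library, the bucket name, and the key are placeholders, not part of the original answer):

import boto3
import requests

# Fetch the remote file into memory; no temp file on disk.
resp = requests.get('http://example.com/1.jpg')
resp.raise_for_status()

# Upload the bytes to S3. 'my-bucket' is a placeholder bucket name.
s3 = boto3.client('s3')
s3.put_object(
    Bucket='my-bucket',
    Key='images/1.jpg',
    Body=resp.content,
    ContentType=resp.headers.get('Content-Type', 'image/jpeg'),
)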
I think the suggestion above is quite good: you can SCP the file to your EC2 instance and push it to the S3 bucket from there. Using the .pem file gives password-less authentication, and a PHP script can validate the file extension and pass the file as an argument to the scp command.
The only problem with this solution is that you must have your instance in AWS. You can't use this solution if your website is hosted with another hosting provider and you are trying to upload files straight to an S3 bucket.
Technically it's possible using AWS Signature Version 4. Assuming your remote server is the "customer" in the browser-based upload flow, you could prepare the signed form fields on the main server and send them to the remote server for it to curl. Detailed example here.
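boto3 can produce exactly those signed form fields with generate_presigned_post; a minimal sketch, with placeholder bucket and key names:

import boto3

s3 = boto3.client('s3')

# Returns a URL plus the signed form fields for a browser-style
# multipart POST upload; valid for one hour here.
post = s3.generate_presigned_post(
    Bucket='my-bucket',
    Key='uploads/1.jpg',
    ExpiresIn=3600,
)
print(post['url'])
print(post['fields'])

The remote machine would then POST the file together with those fields to the returned URL, for example with curl -F.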
You can use the scp command from a terminal.
1) Using the terminal, go to the directory containing the file you want to transfer to the server.
2) Type this:
scp -i yourAmazonKeypairPath.pem fileNameThatYouWantToTransfer.php ec2-user@ec2-00-000-000-15.us-west-2.compute.amazonaws.com:
N.B. Add "ec2-user@" before the ec2-... hostname you got from the EC2 website! This is such an easy error to make!
3) Your file will be uploaded and the progress will be shown. When it reaches 100%, you are done!

Parallel Download from S3 to EC2

I was reading this blog entry about parallel uploads to S3 using boto. Near the end it suggests a few tools for downloading using multiple connections (axel, aria2, and lftp). How can I use these with S3? I don't know how to pass the authentication keys to Amazon to access the file. I could make the file public temporarily, but that solution is non-optimal.
Generate a signed URL using the AWS API and use that for your downloads. Only someone with the signed URL (which expires after the given timeout) can download the file.
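A minimal sketch of generating such a URL with boto3 (the question mentions boto, but the idea is the same; bucket and key are placeholders). The resulting URL carries the signature in its query string, so it can be handed directly to axel or aria2:

import boto3

s3 = boto3.client('s3')

# Signed GET URL, valid for one hour; anyone holding it can download
# the object until it expires, no AWS credentials required.
url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'my-bucket', 'Key': 'big-file.bin'},
    ExpiresIn=3600,
)
print(url)

For example, aria2c -x 8 "<url>" would then download the file over eight parallel connections.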