AWS S3 bucket (AccessDenied) on ls through CLI but not through CDN - amazon-s3

When attempting to execute ls through the AWS CLI I get the following error:
An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
However, through the browser (doing just a GET request) I get a listing:
<ListBucketResult>
<Name>static.example.com</Name>
<Prefix/>
<Marker/>
<MaxKeys>1000</MaxKeys>
<IsTruncated>true</IsTruncated>
<Contents>
I know that:
The static page is set up behind Cloudflare
The static page is set up as a subdomain
With this in mind, is the AWS CLI getting (AccessDenied) because the S3 bucket is behind Cloudflare? Should I be able to ls its contents if it's doable through a web browser? If so, is there any way to debug it to see the response that's being received from the server?
Additionally, I see that the response is truncated. Can I somehow get more than 1000 items when accessing it via the subdomain? I have already tried adding ?marker=1000 to the endpoint; it reflects the marker in the response, but lists the same files as if no marker parameter was provided.
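For what it's worth, marker in the V1 ListObjects API takes a key name to start after, not an item count, which would explain why ?marker=1000 is echoed back without changing the page. A minimal sketch of paginating the public listing, assuming the bucket serves it anonymously and Cloudflare passes the query string through (the endpoint is taken from the question):

import requests
import xml.etree.ElementTree as ET

URL = 'https://static.example.com/'  # bucket endpoint from the question
# S3 listing responses carry this XML namespace
NS = {'s3': 'http://s3.amazonaws.com/doc/2006-03-01/'}

keys, marker = [], None
while True:
    params = {'marker': marker} if marker else {}
    root = ET.fromstring(requests.get(URL, params=params).content)
    page = [k.text for k in root.findall('s3:Contents/s3:Key', NS)]
    keys.extend(page)
    # IsTruncated=true means there are more keys after the last one returned
    if root.findtext('s3:IsTruncated', namespaces=NS) != 'true' or not page:
        break
    marker = page[-1]  # start the next page after the last key seen, not a count

print(len(keys))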

Related

How to upload a PDF to an S3 bucket?

I recorded a JMeter script with BlazeMeter to upload a PDF file to the S3 bucket. When I parameterize the required values and execute the script to upload the file, I get the error response:
Response code: 403
Response message: Forbidden
All the parameters are passed with correct values.
You should also get an appropriate Error Code, which provides far more information about what's wrong, e.g. Access Denied or All access to this Amazon S3 resource has been disabled. Once you know the root cause, you should be able to figure out what needs to be done to resolve the issue.
Also double-check all the request parameters using the Debug Sampler and the View Results Tree listener. For example, you cannot record and replay X-Amz-Signature; you need to generate a proper one for each and every request. See the How to Handle Dynamic AWS SigV4 in JMeter for API Testing article for more details.
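For illustration, a minimal sketch of producing a fresh SigV4 signature outside JMeter with botocore (the same signing code the AWS CLI uses); the bucket URL, key, and region are assumptions:

import boto3
from botocore.auth import S3SigV4Auth
from botocore.awsrequest import AWSRequest

# Build the request we want to sign (hypothetical bucket/key/region)
request = AWSRequest(
    method='PUT',
    url='https://my-bucket.s3.us-east-1.amazonaws.com/test.pdf',
    data=b'%PDF-1.4 ...')

# Sign with the credentials from the default profile
creds = boto3.Session().get_credentials().get_frozen_credentials()
S3SigV4Auth(creds, 's3', 'us-east-1').add_auth(request)

# These headers (Authorization, X-Amz-Date, ...) must be fresh per request
print(dict(request.headers))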

S3 Access Denied with boto for private bucket as root user

I am trying to access a private S3 bucket that I've created in the console with boto3. However, when I try any action, e.g. to list the bucket contents, I get
import boto3

boto3.setup_default_session()
s3Client = boto3.client('s3')
blist = s3Client.list_objects(Bucket=bucketName)['Contents']  # bucketName holds the bucket's name
ClientError: An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied
I am using my default profile (no need for IAM roles). The Access Control List shown in the browser states that the bucket owner has list/read/write permissions. The canonical ID listed as the bucket owner is the same as the canonical ID I get when I go to 'Your Security Credentials'.
In short, it feels like the account permissions are OK, but boto3 is not logging in with the right profile. In addition, running similar commands from the command line, e.g.
aws s3api list-buckets
also gives Access Denied. I have no problem running these commands at work, where I have a work log-in and IAM roles. It's just when running them under my personal 'default' profile.
Any suggestions?
It appears that your credentials have not been stored in a configuration file.
You can run this AWS CLI command:
aws configure
It will then prompt you for an Access Key and Secret Key, and will store them in the ~/.aws/credentials file. That file is automatically used by the AWS CLI and boto3.
It is a good idea to confirm that it works via the AWS CLI first, then you will know that it should work for boto3 also.
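A quick way to confirm which identity boto3 actually picks up is to ask STS; a sketch, assuming the 'default' profile from ~/.aws/credentials:

import boto3

# Explicitly select the profile from ~/.aws/credentials
session = boto3.Session(profile_name='default')

# STS tells you which account/ARN these credentials map to
print(session.client('sts').get_caller_identity())

# Then retry the failing call with the same session
print(session.client('s3').list_buckets()['Buckets'])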
I would highly recommend that you create IAM credentials and use them instead of root credentials. It is quite dangerous if the root credentials are compromised. A good practice is to create an IAM User for each specific application, then limit the permissions granted to that application. This avoids situations where a programming error (or a security compromise) could lead to unwanted behaviour (e.g. resources being used or data being deleted).

how to find if a certain bucket/folder is configured for multi part file upload

I am trying to complete a multipart upload to S3 but it fails with 403 AccessDenied. How can I find out whether a certain bucket/folder is configured for multipart file upload as part of the AWS policy?
{"code" : 500,
"message" : "Error initiating MultipartUploadResult: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: 3215EEC708AC0XXC; S3 Extended Request ID: eF5EBmDp6Pqribb+FGOd7rEBB42lPdVPdOxamp3nda7nsACI07VwQ7SOOowxXfSCV3eG332ahuY=)"}
You should be able to do a multipart upload if you have write access to that specific bucket. Check whether the role you are using has write permissions to S3. For testing, you can attach the AmazonS3FullAccess policy to your role; you should then be able to do the upload.
By the way, all you need is the s3:PutObject permission to do a multipart upload. There is no specific configuration for enabling/disabling it. Here is a doc for further reference:
https://docs.aws.amazon.com/AmazonS3/latest/dev/mpuAndPermissions.html
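To see exactly where the 403 occurs, a minimal multipart upload with boto3 looks like this (bucket and key names are made up; the first call is the one failing in the question):

import boto3

s3 = boto3.client('s3')
bucket, key = 'my-bucket', 'big-file.bin'  # hypothetical names

# Initiating the upload is what fails with AccessDenied in the question
mpu = s3.create_multipart_upload(Bucket=bucket, Key=key)

# Upload one part; every part except the last must be at least 5 MB
part = s3.upload_part(Bucket=bucket, Key=key, PartNumber=1,
                      UploadId=mpu['UploadId'],
                      Body=b'0' * 5 * 1024 * 1024)

# Completing the upload requires the ETag of each uploaded part
s3.complete_multipart_upload(
    Bucket=bucket, Key=key, UploadId=mpu['UploadId'],
    MultipartUpload={'Parts': [{'PartNumber': 1, 'ETag': part['ETag']}]})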

S3 access denied when trying to run aws cli

Using the AWS CLI, I'm trying to run
aws cloudformation create-stack --stack-name FullstackLambda --template-url https://s3-us-west-2.amazonaws.com/awsappsync/resources/lambda/LambdaCFTemplate.yam --capabilities CAPABILITY_NAMED_IAM --region us-west-2
but I get the error
An error occurred (ValidationError) when calling the CreateStack operation: S3 error: Access Denied
I have already set my credential with
aws configure
P.S. I got the create-stack command from the AppSync docs (https://docs.aws.amazon.com/appsync/latest/devguide/tutorial-lambda-resolvers.html)
Looks like you accidentally dropped the letter l at the end of the template file name:
LambdaCFTemplate.yam -> LambdaCFTemplate.yaml
First, make sure the S3 URL is correct. But since this is a 403, I doubt that's the case.
Your error could result from a few different scenarios:
1. If both the APIs and the IAM user are MFA-protected, you have to generate temporary credentials using aws sts get-session-token and use them (see the sketch after this list).
2. Use a role to give CloudFormation read access to the template object in S3. First create an IAM role with read access to S3. Then create a parameter like the one below and reference it in the resource properties' IamInstanceProfile block:
"InstanceProfile":{
"Description":"Instance Profile Name",
"Type":"String",
"Default":"iam-test-role"
}
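For scenario 1, a sketch of obtaining MFA-backed temporary credentials with boto3 (the MFA device ARN and token code are placeholders):

import boto3

# SerialNumber is the MFA device ARN from the IAM console; TokenCode is the
# current 6-digit code from the device (both placeholders here)
resp = boto3.client('sts').get_session_token(
    SerialNumber='arn:aws:iam::123456789012:mfa/my-user',
    TokenCode='123456')
creds = resp['Credentials']

# Use the temporary credentials for the CreateStack call
cfn = boto3.client('cloudformation',
                   aws_access_key_id=creds['AccessKeyId'],
                   aws_secret_access_key=creds['SecretAccessKey'],
                   aws_session_token=creds['SessionToken'])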

How to fix AWS S3 bucket error "Sorry! You do not have permissions to view this bucket."

After messing around with S3 bucket permissions, I can't access the S3 bucket from the AWS console or the CLI. I always get this error from the console:
Sorry! You do not have permissions to view this bucket.
Using the CLI, any s3api call gets Access Denied:
A client error (AccessDenied) occurred when calling the GetBucketVersioning operation: Access Denied
A client error (AccessDenied) occurred when calling the PutObjectAcl operation: Access Denied
Does anyone know how to fix this issue?
I solved the problem. In the end it was the bucket policy: my IP was on the blocked side instead of being allowed access. I connected from a different IP and was able to update the bucket policy.
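For illustration, a bucket policy of the kind that causes this lockout denies everyone whose request does not come from an allowed IP range; a sketch of setting one with boto3 (bucket name and CIDR are made up):

import json
import boto3

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        # Deny every request NOT coming from the allowed range; get the
        # range wrong and you lock yourself out, as happened here
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": ["arn:aws:s3:::my-bucket",
                     "arn:aws:s3:::my-bucket/*"],
        "Condition": {"NotIpAddress": {"aws:SourceIp": "203.0.113.0/24"}}
    }]
}

boto3.client('s3').put_bucket_policy(Bucket='my-bucket',
                                     Policy=json.dumps(policy))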