Unable to Sync to S3 with s3cmd - amazon-s3

After setting up s3cmd and my S3 bucket, when I try this command
sudo s3cmd sync --recursive --preserve /srv s3://MyS3Bucket
I get this error:
ERROR: S3 error: 400 (InvalidRequest): The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.
My s3cmd version is 1.0.0, which was installed by default after following their "deb" installation guide for Ubuntu 12.04.

These days, it is recommended to use the AWS Command-Line Interface (CLI), which also provides a sync capability.

s3cmd version 1.5.2 is necessary for working with regions that require the v4 (AWS4-HMAC-SHA256) signature, such as eu-central-1 (Frankfurt) or cn-north-1 (Beijing). Debs for it are available in Debian experimental and unstable, and in Ubuntu Wily universe. Or you can install from source from https://github.com/s3tools/s3cmd.
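A sketch of the equivalent sync using the AWS CLI, reusing the source directory and bucket name from the question (the CLI signs requests with AWS4-HMAC-SHA256, so it avoids this error; run it with credentials configured):

```shell
# Equivalent of the s3cmd sync above. aws s3 sync recurses by default;
# nothing is deleted from the bucket unless --delete is added.
aws s3 sync /srv s3://MyS3Bucket
```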

Related

Error while enabling server side encryption policy for aws s3 bucket through cli

aws s3api put-bucket-encryption --bucket my-buxket-en --server-side-encryption-configuration '{"Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]}'
I am getting the below error:
usage: aws [options] <command> <subcommand> [<subcommand> ...] [parameters]
To see help text, you can run:
aws help
aws <command> help
aws <command> <subcommand> help
Unknown options: {SSEAlgorithm:, AES256}}]}', [{ApplyServerSideEncryptionByDefault:
Kindly help me to resolve the error
I'm going to add an answer that explains this for people using Windows, in case they find this and can't figure it out.
aws s3api put-bucket-encryption --bucket my-bucket-name --server-side-encryption-configuration '{"Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]}'
Needs to become
aws s3api put-bucket-encryption --bucket my-bucket-name --server-side-encryption-configuration "{\"Rules\": [{\"ApplyServerSideEncryptionByDefault\": {\"SSEAlgorithm\": \"AES256\"}}]}"
Once you handle the quotes in the JSON, it works as expected: the Windows command prompt does not treat single quotes as quoting characters, so the inner double quotes must be escaped inside an outer double-quoted string.
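Since this failure is easy to misread as a quoting problem, it can help to validate the JSON locally before handing it to the AWS CLI — a small sketch on a POSIX shell, assuming python3 is available (the JSON is the same value from the question):

```shell
# The server-side-encryption configuration from the question, single-quoted
# so the inner double quotes survive the shell intact.
SSE_CONFIG='{"Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]}'

# python3 -m json.tool exits non-zero on malformed JSON, so this catches
# missing braces or mangled quotes before the AWS CLI ever sees them.
echo "$SSE_CONFIG" | python3 -m json.tool
```

If this prints the pretty-printed JSON, the value itself is fine and any remaining error comes from the shell quoting or the CLI version.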
I have examined your AWS CLI syntax and, to the best of my ability, I can confirm there's nothing wrong with it.
From the error, the issue is more likely related to your AWS CLI version: an older version will not recognize the required --server-side-encryption-configuration parameter.
Resolution Steps:
1. Check the current version of your AWS CLI :
aws --version
If the output shows a version older than 1.18.31, proceed to upgrade your AWS CLI as shown below.
2. Upgrade your AWS CLI using pip (or pip3):
To upgrade an existing AWS CLI installation, use the --upgrade option:
pip install --upgrade awscli
OR
pip3 install --upgrade awscli
3. Upgrade your AWS CLI using AWS Bundled Installer:
curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"
unzip awscli-bundle.zip
sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws
Note: You may need to log out and back in for the changes to take effect.
Hope this helps!

EMR spark-shell not picking up jars

I am using spark-shell and I am unable to pick up external jars. I run spark in EMR.
I run the following command:
spark-shell --jars s3://play/emr/release/1.0/code.jar
I get the following error:
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
Warning: Skip remote jar s3://play/emr/release/1.0/code.jar
Thanks in advance.
This is a limitation of Apache Spark itself, not specifically Spark on EMR. When running Spark in client deploy mode (all interactive shells like spark-shell or pyspark, or spark-submit without --deploy-mode cluster or --master yarn-cluster), only local jar paths are allowed.
The reason for this is that in order for Spark to download this remote jar, it must already be running Java code, at which point it is too late to add the jar to its own classpath.
The workaround is to download the jar locally (using the AWS S3 CLI) then specify the local path when running spark-shell or spark-submit.
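A sketch of that workaround on the EMR master node, using the jar path from the question (the local destination path is just an example):

```shell
# Fetch the jar from S3 onto the local filesystem first...
aws s3 cp s3://play/emr/release/1.0/code.jar /home/hadoop/code.jar

# ...then hand spark-shell the local path, which client deploy mode accepts.
spark-shell --jars /home/hadoop/code.jar
```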
Alternatively, you can run the jar straight from S3 by switching to cluster deploy mode, with a spark-submit command line on the EMR box itself:
spark-submit --verbose --deploy-mode cluster --class com.your.package.and.Class s3://bucket/path/to/thejar.jar 10
You can also launch this command using the AWS Java EMR Client Library or the AWS CLI. The key is to use '--deploy-mode cluster'.
Had the same issue; you can add the "--master yarn --deploy-mode cluster" args, which will allow you to execute S3-hosted jars remotely.

Error: AWS CLI SSH Certificate Verify Failed _ssl.c:581

I am trying to use the sync command from my file system to S3 on a Windows 2008 R2 server.
I have previously had no problem running this command on multiple local machines:
AWS S3 SYNC 'File system Name' S3://'S3 file directory name'
However when I try to run it from this box I get this error:
[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:581)
Every forum I see is using python scripts but I am just using the simple CLI commands.
Any idea why I am getting this error?
If you are running AWS CLI commands on Windows, the sudo pip commands given in other answers will not work.
1) To avoid the "[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:581)" error, you can use a command of the form:
aws [aws-service-name] --no-verify-ssl [subcommand]
2) Your CLI command for the S3 sync then becomes:
aws s3 --no-verify-ssl sync 'File system Name' s3://'S3 file directory name'
This worked around the issue for me on Ubuntu 14.04. I cannot confirm whether it is an ideal or complete solution:
sudo pip uninstall certifi
sudo pip install certifi==2015.04.28
From here: https://github.com/aws/aws-cli/issues/1499

Resolving Chef Cookbook Dependencies

I'm following a basic chef tutorial outlined here, which walks you through creating an initial chef-repo with various cookbooks from the supermarket.
I'm at the point where I have a hosted chef account set up and I need to upload all my local cookbooks to my hosted chef server.
So I run this locally -
> knife cookbook upload --all
Uploading apache2 [3.0.1]
Uploading apt [2.7.0]
Uploading aws [2.7.0]
Uploading build-essential [2.1.2]
Uploading chef-sugar [3.1.0]
Uploading chef_handler [1.1.8]
Uploading database [4.0.6]
Uploading homebrew [1.12.0]
Uploading iis [4.1.1]
Uploading iptables [1.0.0]
Uploading logrotate [1.9.1]
Uploading mariadb [0.3.0]
Uploading mysql [4.1.2]
ERROR: Cookbook mysql depends on cookbooks which are not currently
ERROR: being uploaded and cannot be found on the server.
ERROR: The missing cookbook(s) are: 'build-essential' version '~> 1.4'
Ok, so mysql cookbook is complaining that it needs build-essential, ~> 1.4. No problem, let me just get that specific version...
> knife cookbook site download build-essential 1.4.4
Great, now I have the right build-essential version. Let's try it again..
> knife cookbook upload --all
Uploading apache2 [3.0.1]
Uploading apt [2.7.0]
Uploading aws [2.7.0]
Uploading build-essential [1.4.4]
Uploading chef-sugar [3.1.0]
Uploading chef_handler [1.1.8]
Uploading database [4.0.6]
Uploading homebrew [1.12.0]
ERROR: Cookbook homebrew depends on cookbooks which are not currently
ERROR: being uploaded and cannot be found on the server.
ERROR: The missing cookbook(s) are: 'build-essential' version '>= 2.1.2'
Well, now it breaks homebrew, which complains it needs build-essential >= 2.1.2.
How do I get out of this dependency cycle? I can't have two different versions of the same cookbook, can I? I downloaded this straight from the tutorial's site - am I stuck just trying to figure out the right version of all these things?
Thanks!
Your mysql cookbook version is old and therefore has old dependencies. Try upgrading it to the latest release, and also use the new version of build-essential.
See https://supermarket.chef.io/cookbooks/mysql
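A sketch of that upgrade, reusing the same knife subcommands from the question (this assumes the latest Supermarket releases of mysql and build-essential are mutually compatible; run it from the chef-repo):

```shell
# Pull the current releases so the dependency constraints line up.
knife cookbook site download mysql
knife cookbook site download build-essential

# Re-run the bulk upload once both are unpacked into the cookbooks directory.
knife cookbook upload --all
```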

Duplicity, amazon s3 backend exception

I'm trying to create an auto backup using duplicity into amazon s3 following this guide: easy server backups to amazon s3 with duplicity
However, at this command:
duplicity /var/www s3+http://com.mycorp.myhost.backup
I encounter the error:
BackendException: Could not initialize backend: No module named boto
Googling does not yield understandable results for me. FYI, the actual command I run is:
duplicity /Users/okyretina/Dropbox/archive/ s3+http://com.sinkdrive.okyretina.dropbox.archieve
I understand that this guide is for Linux and I am using Mac OS X Lion, but I figured it should work as well. Any help is appreciated. Thanks.
If you are on debian/ubuntu you can install it via: sudo apt-get install python-boto
I was getting the same problem again when upgrading to Ubuntu 19.10 and installing python-boto was not helping. Turns out python3-boto is required in newer versions.
sudo apt install python3-boto
It turns out that I needed to install boto first, from here.
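On Mac OS X, where apt-get is unavailable, the usual route is pip — a sketch, assuming pip is on the PATH for the same Python that duplicity uses:

```shell
# Install the boto library that duplicity's S3 backend imports.
pip install boto

# Confirm the module is importable before re-running duplicity.
python -c "import boto"
```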