running shell script in cloud SDK - gsutil

I am getting the error below when I try to run a shell script with gsutil in Cloud SDK.

Related

Install Zenko Cloud Server on ubuntu 20.04 for development purpose

For a couple of days, I have been trying to install the Zenko cloud server for development purposes on my Ubuntu 20.04 machine. I am new to Docker and not very comfortable with Kubernetes and Helm. I am trying to follow these instructions. During the installation stage, when I try to follow this link, I get this error
Error: validation failed: [unable to recognize "": no matches for kind "PodSecurityPolicy" in version "policy/v1beta1", unable to recognize "": no matches for kind "PodDisruptionBudget" in version "policy/v1beta1", unable to recognize "": no matches for kind "CronJob" in version "batch/v1beta1"]
while executing this command:
helm install https://github.com/scality/Zenko/releases/download/1.2.5/zenko-1.2.5.tgz
I have also tried this link to install Zenko. I have successfully cloned Zenko from the git repository.
git clone https://github.com/scality/Zenko.git
But while executing cd ./zenko/charts, I am getting this error.
bash: cd: ./zenko/charts: No such file or directory
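Could this just be a case-sensitivity issue? git clone creates a directory named Zenko (capital Z) by default, so I would expect something like the following to work, assuming the charts directory exists in the cloned version:
ls              # should show Zenko, not zenko
cd Zenko/charts # note the capital Z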
I have installed Minikube by following this link and also installed Helm 2 by following this link. Also, I have tried to follow this Zenko documentation but did not quite understand it.
My current goal is to install the Zenko cloud server and upload files to Amazon S3 as well as to a local directory, with both managed via Zenko according to their documentation.
It will be very helpful if someone shows me some way to solve this issue. Thanks in advance.

Terraform: How to automate pulling and running docker images from Azure Container Registry

I want to automate the process of pulling Docker images from Azure Container Registry to an Azure VM. I have already done the following:
Created an Azure Container Registry.
Set up a username and password in the Azure Container Registry.
Pushed the image from my local machine to the Container Registry.
I have set up Terraform code to automate the build-out of the Azure VM. I also want to include the docker pull and docker run commands so that those tasks are automated. Below are the commands I would like to automate with Terraform:
sudo docker login --username xxx --password xxx xxx.azurecr.io
sudo docker pull xxx.azurecr.io/xx/xxx
sudo docker run --network=host xxx.azurecr.io/xxx/xxx
Any help would be much appreciated. Thank you folks!
As far as I know, if you want to execute Docker CLI commands in the VM, you need to install the Docker Engine first.
In addition, if you want to run the Docker CLI commands in the VM automatically after creating the VM through Terraform, you can use a VM extension in Terraform. Write a shell script with the commands and then run it via the VM extension. Here is an example of using Terraform with Azure VM extensions.
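For example, a rough sketch of such a shell script, assuming an Ubuntu VM and keeping the xxx placeholders from the question (in practice you would inject the registry credentials securely rather than hard-coding them):
#!/bin/bash
# install the Docker Engine (Ubuntu package; adjust for your distro)
sudo apt-get update
sudo apt-get install -y docker.io
# log in to the Azure Container Registry, then pull and run the image
sudo docker login --username xxx --password xxx xxx.azurecr.io
sudo docker pull xxx.azurecr.io/xx/xxx
sudo docker run --network=host xxx.azurecr.io/xx/xxx
You can then point a CustomScript VM extension in your Terraform configuration at this script (for example via the extension's fileUris and commandToExecute settings) so that it runs right after the VM is provisioned.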

EMR spark-shell not picking up jars

I am using spark-shell and I am unable to pick up external jars. I run Spark on EMR.
I run the following command:
spark-shell --jars s3://play/emr/release/1.0/code.jar
I get the following error:
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
Warning: Skip remote jar s3://play/emr/release/1.0/code.jar
Thanks in advance.
This is a limitation of Apache Spark itself, not specifically Spark on EMR. When running Spark in client deploy mode (all interactive shells like spark-shell or pyspark, or spark-submit without --deploy-mode cluster or --master yarn-cluster), only local jar paths are allowed.
The reason for this is that in order for Spark to download this remote jar, it must already be running Java code, at which point it is too late to add the jar to its own classpath.
The workaround is to download the jar locally (using the AWS S3 CLI) then specify the local path when running spark-shell or spark-submit.
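For example, something along these lines on the EMR master node (the local path is just illustrative):
aws s3 cp s3://play/emr/release/1.0/code.jar /home/hadoop/code.jar
spark-shell --jars /home/hadoop/code.jar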
You can also run the S3 jar from the command line on the EMR master node itself by using spark-submit in cluster deploy mode:
spark-submit --verbose --deploy-mode cluster --class com.your.package.and.Class s3://bucket/path/to/thejar.jar 10
You can also call this command using the AWS Java EMR client library or the AWS CLI. The key is to use '--deploy-mode cluster'.
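For instance, with the AWS CLI you could submit it as an EMR step; the cluster ID, class name, and jar path below are placeholders:
aws emr add-steps --cluster-id j-XXXXXXXXXXXXX \
  --steps Type=Spark,Name="SparkJob",ActionOnFailure=CONTINUE,Args=[--deploy-mode,cluster,--class,com.your.package.and.Class,s3://bucket/path/to/thejar.jar,10]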
I had the same issue; you can add the "--master yarn --deploy-mode cluster" arguments and it will allow you to execute S3 jars remotely.

Error: AWS CLI SSL Certificate Verify Failed _ssl.c:581

I am trying to use the sync command from my file system to S3 on a Windows 2008 R2 server.
I have previously had no problem running this command on multiple local machines:
aws s3 sync 'File system Name' s3://'S3 file directory name'
However when I try to run it from this box I get this error:
[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:581)
Every forum I see is using python scripts but I am just using the simple CLI commands.
Any idea why I am getting this error?
If you are running AWS CLI commands on Windows, the sudo pip commands given in the other answer will not work.
1) To avoid the "[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:581)" error, you can use the following format on the CLI:
aws [aws-service-name] --no-verify-ssl [functions]
2) Your CLI command for the S3 sync then becomes:
aws s3 --no-verify-ssl sync 'File system Name' s3://'S3 file directory name'
This worked around the issue for me on Ubuntu 14.04. I cannot confirm whether it is an ideal/complete solution:
sudo pip uninstall certifi
sudo pip install certifi==2015.04.28
From here: https://github.com/aws/aws-cli/issues/1499

Does Google's gsutil command line app work on 64-bit Windows?

Is it possible to run Google's Cloud Storage command line tool, gsutil, on 64-bit Windows 7?
I could not get this to work because of a dependent Python module called pyOpenSSL-0.13, which I could not install without building it using Microsoft Express 2008. Just wondered if this was a known issue.
Thanks.
OK. I finally came up with a solution for those with a similar problem:
1) Install Cygwin with the dev tools (i.e. the gcc compiler, make, automake, etc.).
2) Install openssl-dev for Cygwin.
3) Download the pyOpenSSL-0.13 gzip file and decompress it into your home folder. (Google for this.)
4) Run "python setup.py install" from inside a Cygwin prompt.
5) Download the gsutil source code from Google and decompress it into your home folder.
6) Run "python setup.py install" from inside a Cygwin prompt.
7) cd to the gsutil directory
8) Run ./gsutil
This solution worked for me on a 64-bit Windows 7 machine. It could be that I broke my installation and that others may not run into this problem. However, it does seem that OpenSSL support for gsutil on a 64-bit Windows machine running 64-bit Python is non-existent.
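Putting the steps above together, the Cygwin session looked roughly like this (the archive file names are assumptions and may differ from what you actually download):
# build and install pyOpenSSL from source
tar xzf pyOpenSSL-0.13.tar.gz
cd pyOpenSSL-0.13
python setup.py install
cd ..
# install gsutil from source and run it
tar xzf gsutil.tar.gz
cd gsutil
python setup.py install
./gsutil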