gcloud compute ssh requires a password even after using a JSON key file for authentication - ssh

I am trying to authenticate gcloud using a JSON key file, and even after doing everything as per the docs it asks for a password when I run gcloud compute ssh root@production.
Here is a snapshot of the steps I performed.
1. Authorizing access to Google Cloud Platform with a service account
tahir@NX00510:~/www/helloworld$ gcloud auth activate-service-account 1055703200677-compute@developer.gserviceaccount.com --key-file=gcloud_key.json
Activated service account credentials for: [1055703200677-compute@developer.gserviceaccount.com]
2. Initializing the gcloud
tahir@NX00510:~/www/helloworld$ gcloud init
Welcome! This command will take you through the configuration of gcloud.
Settings from your current configuration [default] are:
compute:
  region: us-central1
  zone: us-central1-b
core:
  account: 1055703200677-compute@developer.gserviceaccount.com
  disable_usage_reporting: 'True'
  project: concise-hello-122320
Pick configuration to use:
[1] Re-initialize this configuration [default] with new settings
[2] Create a new configuration
Please enter your numeric choice: 1
Your current configuration has been set to: [default]
You can skip diagnostics next time by using the following flag:
gcloud init --skip-diagnostics
Network diagnostic detects and fixes local network connection issues.
Checking network connection...done.
Reachability Check passed.
Network diagnostic passed (1/1 checks passed).
Choose the account you would like to use to perform operations for
this configuration:
[1] 1055703200677-compute@developer.gserviceaccount.com
[2] Log in with a new account
Please enter your numeric choice: 1
You are logged in as: [1055703200677-compute@developer.gserviceaccount.com].
API [cloudresourcemanager.googleapis.com] not enabled on project
[1055703200677]. Would you like to enable and retry (this will take a
few minutes)? (y/N)? N
WARNING: Listing available projects failed: PERMISSION_DENIED: Cloud Resource Manager API has not been used in project 1055703200677 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/cloudresourcemanager.googleapis.com/overview?project=1055703200677 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.
- '@type': type.googleapis.com/google.rpc.Help
  links:
  - description: Google developers console API activation
    url: https://console.developers.google.com/apis/api/cloudresourcemanager.googleapis.com/overview?project=1055703200677
Enter project id you would like to use: concise-hello-122320
Your current project has been set to: [concise-hello-122320].
Do you want to configure a default Compute Region and Zone? (Y/n)? n
Your Google Cloud SDK is configured and ready to use!
* Commands that require authentication will use 1055703200677-compute@developer.gserviceaccount.com by default
* Commands will reference project `concise-hello-122320` by default
Run `gcloud help config` to learn how to change individual settings
This gcloud configuration is called [default]. You can create additional configurations if you work with multiple accounts and/or projects.
Run `gcloud topic configurations` to learn more.
Some things to try next:
* Run `gcloud --help` to see the Cloud Platform services you can interact with. And run `gcloud help COMMAND` to get help on any gcloud command.
* Run `gcloud topic --help` to learn about advanced features of the SDK like arg files and output formatting
3. SSHing to gcloud
tahir@NX00510:~/www/helloworld$ gcloud compute ssh root@production
No zone specified. Using zone [us-central1-b] for instance: [production].
root@compute.1487950061407628967's password:
I don't know which password I should enter here; I also believe it should not ask for a password in the first place, because I have used a JSON key file for authentication.
Could you please help me fix this?
Thanks !
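For context: the JSON key only authenticates gcloud's API calls, while gcloud compute ssh logs into the VM with an SSH key pair (by default ~/.ssh/google_compute_engine), so a password prompt usually means the instance's sshd rejected that key and fell back to password authentication, which is common when connecting as root. A minimal sketch of what one might try instead, where the username tahir is purely illustrative and both flags are standard gcloud compute ssh flags:
gcloud compute ssh production --zone us-central1-b
gcloud compute ssh tahir@production --ssh-key-file ~/.ssh/google_compute_engine --zone us-central1-b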

Related

Pulumi automation backend

I am a newbie in Pulumi and I am having an issue. When I do pulumi login with a GCP backend, an error appears:
stderr: error: getting secrets manager: passphrase must be set with
PULUMI_CONFIG_PASSPHRASE or PULUMI_CONFIG_PASSPHRASE_FILE environment
variables
When I do pulumi logout, the deployment works (Pulumi Automation API). Does anyone have an idea how to fix this?
I tried setting PULUMI_CONFIG_PASSPHRASE.
When using the self-managed backends for Pulumi, you need to provide a passphrase to encrypt secret values.
This can be done by setting a global environment variable, which will depend on the operating system you're using. In Unix-like environments (e.g. macOS or Linux) you can do:
export PULUMI_CONFIG_PASSPHRASE=<a password you can remember>
On Windows in PowerShell this can be done using:
$env:PULUMI_CONFIG_PASSPHRASE="<a password you can remember>"
If you don't wish to use a passphrase, you can leverage the Pulumi service as your state store, or configure a cloud secrets provider.
This is done when initializing your stack; more information on that can be found here.
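A hedged sketch, assuming a Google Cloud KMS key is used as the secrets provider (the project, key ring, and key names are placeholders):
pulumi stack init dev --secrets-provider="gcpkms://projects/<your-project>/locations/global/keyRings/<your-keyring>/cryptoKeys/<your-key>"
With a secrets provider set this way, the stack encrypts its secrets with KMS and no longer prompts for PULUMI_CONFIG_PASSPHRASE.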

ERROR: (gcloud.compute.ssh) Could not fetch resource: - Insufficient Permission

I am having trouble working through the Compute Engine Quickstart: Build a to-do app with a MongoDB tutorial. (edit: I am running the tutorial from within the compute engine console; i.e. https://console.cloud.google.com/compute/instances?project=&tutorial=compute_quickstart)
I SSH into the backend instance. I enter the "gcloud compute" command as copied from the tutorial. I am prompted to enter a passphrase. The following is returned:
WARNING: The public SSH key file for gcloud does not exist.
WARNING: The private SSH key file for gcloud does not exist.
WARNING: You do not have an SSH key for gcloud.
WARNING: SSH keygen will be executed to generate a key.
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in
...
<< Identifying detail omitted >>
...
ERROR: (gcloud.compute.ssh) Could not fetch resource:
- Insufficient Permission
I had run through this stage of the tutorial on a previous occasion with no problems.
I am working from a Windows 10 PC with the google-cloud-sdk installed. I am using google chrome. I have tried in both regular and incognito modes.
Any help or advice gratefully received!
DaveDub
It looks like the attempt to SSH is recognising the instance in your project, but the user doesn't have the required permissions to access the machine.
Have you tried running:
gcloud auth login
and completing the web-based authorization to ensure you are attempting to access the machine as the correct (authenticated) user? This process ensures the Cloud SDK you are running inherits the permissions of the user specified in the web-based authorization. See here for more information on this.
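To confirm which account is active after logging in, you can list credentialed accounts:
gcloud auth list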
It's also worth adding the link to the tutorial you are following to your question.
Besides the accepted answer, be sure you are in the correct gcloud project
gcloud projects list
Then
gcloud config set project <your-project>
I just ran into this for yet another reason. Google has always had poor handling of multi-user auth conflicts with their business products. Whatever account you sign into a clean Chrome session with 'first' gets a 'special', invisible role. I've noticed with G Suite that I get 'forced' into that first user when I try to access the admin panel, and the only way to escape is to make sure that whatever Google user I use for the G Suite admin is 'first', or to open an incognito window. I've seen this bug for years and can't believe it still exists.
Anyway, I ran into a similar issue. Somehow I was signed in as the wrong Google user, so the command I got when copy/pasting out of 'connect with gcloud command' implied the wrong Google user. I only noticed later, when I gave up and used the terminal, that I was not my normal user... So, you might look into that.

Google cloud dataproc failing to create new cluster with initialization scripts

I am using the command below to create a Dataproc cluster:
gcloud dataproc clusters create informetis-dev
--initialization-actions "gs://dataproc-initialization-actions/jupyter/jupyter.sh,gs://dataproc-initialization-actions/cloud-sql-proxy/cloud-sql-proxy.sh,gs://dataproc-initialization-actions/hue/hue.sh,gs://dataproc-initialization-actions/ipython-notebook/ipython.sh,gs://dataproc-initialization-actions/tez/tez.sh,gs://dataproc-initialization-actions/oozie/oozie.sh,gs://dataproc-initialization-actions/zeppelin/zeppelin.sh,gs://dataproc-initialization-actions/user-environment/user-environment.sh,gs://dataproc-initialization-actions/list-consistency-cache/shared-list-consistency-cache.sh,gs://dataproc-initialization-actions/kafka/kafka.sh,gs://dataproc-initialization-actions/ganglia/ganglia.sh,gs://dataproc-initialization-actions/flink/flink.sh"
--image-version 1.1 --master-boot-disk-size 100GB --master-machine-type n1-standard-1 --metadata "hive-metastore-instance=g-test-1022:asia-east1:db_instance"
--num-preemptible-workers 2 --num-workers 2 --preemptible-worker-boot-disk-size 1TB --properties hive:hive.metastore.warehouse.dir=gs://informetis-dev/hive-warehouse
--worker-machine-type n1-standard-2 --zone asia-east1-b --bucket info-dev
But Dataproc failed to create the cluster, with the following errors in the failure file:
+ mysql -u hive -phive-password -e ''
ERROR 2003 (HY000): Can't connect to MySQL server on 'localhost' (111)
+ mysql -e 'CREATE USER '\''hive'\'' IDENTIFIED BY '\''hive-password'\'';'
ERROR 2003 (HY000): Can't connect to MySQL server on 'localhost' (111)
Does anyone have any idea what is behind this failure?
It looks like you're missing the --scopes sql-admin flag as described in the initialization action's documentation, which will prevent the CloudSQL proxy from being able to authorize its tunnel into your CloudSQL instance.
Additionally, aside from just the scopes, you need to make sure the default Compute Engine service account has the right project-level permissions in whichever project holds your CloudSQL instance. Normally the default service account is a project editor in the GCE project, so that should be sufficient when combined with the sql-admin scopes to access a CloudSQL instance in the same project, but if you're accessing a CloudSQL instance in a separate project, you'll also have to add that service account as a project editor in the project which owns the CloudSQL instance.
You can find the email address of your default compute service account under the IAM page for your project deploying Dataproc clusters, with the name "Compute Engine default service account"; it should look something like <number>@project.gserviceaccount.com.
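As a hedged sketch of the fix, only --scopes sql-admin is new here and the other values come straight from the question (the rest of the original flags would stay unchanged):
gcloud dataproc clusters create informetis-dev \
--scopes sql-admin \
--initialization-actions "gs://dataproc-initialization-actions/cloud-sql-proxy/cloud-sql-proxy.sh" \
--metadata "hive-metastore-instance=g-test-1022:asia-east1:db_instance"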
I am assuming that you already created the Cloud SQL instance with something like this, correct?
gcloud sql instances create g-test-1022 \
--tier db-n1-standard-1 \
--activation-policy=ALWAYS
If so, then it looks like the error is in how the argument for the metadata is formatted. You have this:
--metadata "hive-metastore-instance=g-test-1022:asia-east1:db_instance”
Unfortunately, the zone looks to be incomplete (asia-east1 instead of asia-east1-b).
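If that is indeed the problem, the flag would presumably become:
--metadata "hive-metastore-instance=g-test-1022:asia-east1-b:db_instance"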
Additionally, with that many initialization actions running, you'll want to provide a pretty generous initialization action timeout so the cluster does not assume something has failed while your actions take a while to install. You can do that by specifying:
--initialization-action-timeout 30m
That will allow the cluster to give the initialization actions 30 minutes to bootstrap.
Around the time you reported this, an issue had been detected with the Cloud SQL proxy initialization action. Most probably that issue affected you.
Nowadays, it shouldn't be an issue.

error opening a port in Google Compute Engine

I am trying to open a port via ssh on my VM instance in Google Compute Engine, but I keep getting error messages.
Here is my command:
myname@instance-2:~$ gcloud compute firewall-rules create baasbox-console-port --allow tcp:9000 --source-ranges=0.0.0.0/0
Here is the error message:
NAME NETWORK SRC_RANGES RULES SRC_TAGS TARGET_TAGS
ERROR: (gcloud.compute.firewall-rules.create) Some requests did not succeed:
- Insufficient Permission
Please, what am I doing wrong?
gcloud auth login
Go to the following link in your browser:
(Cut and paste the link into your browser address bar)
For me (Ubuntu 14.04) this does not return a verification code in Firefox; use Chromium. You should get a long string of characters as a verification code. Cut and paste this into the terminal. I would then see this:
ERROR: There was a problem with web authentication.
ERROR: (gcloud.auth.login) invalid_grant
After several tries at generating the code and pasting it, I copied the code along with the trailing colon (:), and then it worked.
You need to do either of the following:
run gcloud auth login in your instance, or
when you create your VM, you need to give it read-write access to Google Cloud Platform APIs by adding the compute-rw scope as follows:
gcloud compute instances create $VM --scopes compute-rw [...]
See the gcloud compute instances create docs for more info.

How do I configure Google BigQuery command line tool to use a Service Account?

I've created a service account using the Google API Console and wish to use this service account with the Google BigQuery CLI (bq) tool.
I've been using the command line tool to successfully access the BigQuery service using my valid OAuth2 credentials in ~/.bigquery.v2.token, however I can't seem to find any documentation on how to modify this file (or otherwise configure the tool) to use a service account instead.
Here is my current .bigquery.v2.token file
{
"_module": "oauth2client.client",
"_class": "OAuth2Credentials",
"access_token": "--my-access-token--",
"token_uri": "https://accounts.google.com/o/oauth2/token",
"invalid": false,
"client_id": "--my-client-id--.apps.googleusercontent.com",
"id_token": null,
"client_secret": "--my-client-secret--",
"token_expiry": "2012-11-06T15:57:12Z",
"refresh_token": "--my-refresh-token--",
"user_agent": "bq/2.0"
}
My other file: ~/.bigqueryrc generally looks like this:
project_id = --my-project-id--
credential_file = ~/.bigquery.v2.token
I've tried setting the credential_file parameter to the .p12 private key file for my service account, but with no luck; it gives me back the following error:
******************************************************************
** No OAuth2 credentials found, beginning authorization process **
******************************************************************
And asks me to go to a link in my browser to set up my OAuth2 credentials again.
The command line tool's initial configuration option "init":
bq help init
displays no helpful information about how to set up this tool to use a service account.
I ended up finding some documentation on how to set this up:
$ bq --help
....
--service_account: Use this service account email address for authorization. For example, 1234567890@developer.gserviceaccount.com.
(default: '')
--service_account_credential_file: File to be used as a credential store for service accounts. Must be set if using a service account.
--service_account_private_key_file: Filename that contains the service account private key. Required if --service_account is specified.
(default: '')
--service_account_private_key_password: Password for private key. This password must match the password you set on the key when you created it in the Google APIs Console. Defaults to the default Google APIs Console private key password.
(default: 'notasecret')
....
You can either set these specifically on each bq (BigQuery command-line client) request, i.e.:
$ bq --service_account --my-client-id--.apps.googleusercontent.com --service_account_private_key_file ~/.bigquery.v2.p12 ... [command]
Or you can set up defaults in your ~/.bigqueryrc file like so
project_id = --my-project-id--
service_account = --my-client-id--@developer.gserviceaccount.com
service_account_credential_file = /home/james/.bigquery.v2.cred
service_account_private_key_file = /home/james/.bigquery.v2.p12
The service account can be found in the Google API Console, and you set up service_account_private_key_password when you created your service account (this defaults to "notasecret").
Note: file paths in .bigqueryrc had to be the full path; I was unable to use ~/.bigquery...
Some additional dependencies were required; you will need to install OpenSSL via yum/apt-get
--yum--
$ yum install openssl-devel libssl-devel
--or apt-get--
$ apt-get install libssl-dev
and pyopenssl via easy_install/pip
--easy install--
$ easy_install pyopenssl
--or pip--
$ pip install pyopenssl
The bq authorization flags are now deprecated; see the bq documentation.
1.) Tell gcloud to authenticate as your service account
gcloud auth activate-service-account \
test-service-account@google.com \
--key-file=/path/key.json \
--project=testproject
2.) Run a bq command as you would with your user account
# ex: bq query
bq query --use_legacy_sql=false 'SELECT CURRENT_DATE()'
3.) (optional) Revert gcloud authentication to your user account
gcloud config set account you@google.com
3a.) (optional) See which account gcloud uses for authentication
gcloud auth list
The bq tool requires two configuration files, controlled by the --bigqueryrc and --credential_file flags. If neither one is found, bq will attempt to automatically initialize during startup.
To avoid this for the --bigqueryrc file, you can place a ".bigqueryrc" file in the default location, or override it with --bigqueryrc to some writeable file path.
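For example, both locations can be pinned explicitly on any invocation (the paths are placeholders, and ls is just a cheap command that exercises the credentials):
bq --bigqueryrc=/home/james/.bigqueryrc --credential_file=/home/james/.bigquery.v2.token ls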
For anyone else who comes along struggling to use bq with a service account... I had a seriously hard time getting this to work inside of a CI/CD pipeline using the Google Cloud SDK Docker images on GitLab CI. It turns out the missing bit for me was making sure to set the default project. On my laptop, gcloud was happy inferring the default project from the service account, but for some reason the version within the Docker image was defaulting to a public free project.
- gcloud auth activate-service-account --key-file=${PATH_TO_SVC_ACCT_JSON};
- gcloud config set project ${GOOGLE_BIGQUERY_PROJECT}
After this I was able to use the bq utility as the service account. I imagine setting the default project in the .bigqueryrc file does the trick too, which is why the OP didn't run into this issue.