Detach Disk from GCE VM to mount and edit SSH

I am trying to detach a disk from a temp instance so I can mount it and edit ssh_config, but when I run gcloud compute instances detach-disk INSTANCE --disk mydisk it shows:
ERROR: (gcloud.compute.instances.detach-disk) There was a problem fetching the resource:
- Insufficient Permission
Any suggestions? I'm new to Google Cloud.

You need to authorize your account before you can use gcloud commands. You can run the following command to authorize:
$ gcloud auth login
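If you are not sure which account gcloud is currently using, a quick check like the following can help (PROJECT_ID is a placeholder for your own project):
$ gcloud auth list                      # show authorized accounts and which one is active
$ gcloud auth login                     # open the browser flow to authorize your Google account
$ gcloud config set project PROJECT_ID  # make sure the right project is selected afterwards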

Related

Regularly losing SSH connection rights on Hetzner Cloud

I have a strange issue logging into my Hetzner Cloud server via SSH.
initial situation
I made a fresh SSH key, added it to a fresh Hetzner Cloud server, and made the initial login. I was able to access the server via terminal with the command ssh root@MY_IP.
the issue
When I try to access my server again with ssh root@MY_IP a few days after I made the setup, I get the following error message: root@MY_IP: Permission denied (publickey).
I haven't made any changes in the meantime: I didn't do anything with the SSH connection, didn't create a new SSH key, nothing. I don't understand why it denies my connection attempt when it was working fine before.
Probably your ssh-agent was configured in a different shell?
Try listing your stored keys with
ssh-add -l
If you don't see the one you created for this specific machine/cluster, try adding it again with:
ssh-add <absolute_path_to_your_private_key>
If your agent is not even running, start it in the background with:
eval "$(ssh-agent -s)"

Why are the folders I created via two different SSH sessions not visible to each other on the same cloud instance in GCP?

I'm logging into a Google Cloud instance, via:
SSH from my local machine
SSH from the cloud console
The local machine username is erjan222, and after SSH the prompt shows this:
erjan222@instance-1:~$
The gcloud username is erjcan, and after SSH I see this:
erjcan@instance-1:~$
I ran mkdir in both sessions.
I created two folders, named 'created_from_erjan222' and 'created_from_erjcan'.
However, when I run ls in either session, I expect to see both folders, but I don't!
Each SSH session can only see the folder that was created in it.
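A quick way to see what is going on is to check where each session actually lands; each Linux user has its own home directory, so something like the following (usernames taken from the prompts above) should show the two folders living in different places:
pwd                               # /home/erjan222 in one session, /home/erjcan in the other
ls /home                          # both home directories are visible from either session
ls /home/erjan222 /home/erjcan    # each folder sits in its creator's home directory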

Google Cloud Platform, VM instance's SSH permissions

In my Google Cloud Platform VM instance, I accidentally changed the permissions of /etc/ssh, and now I can't access it using SSH or FileZilla.
The log is as below:
###########################################################
# WARNING: UNPROTECTED PRIVATE KEY FILE! #
###########################################################
Permissions 0660 for '/etc/ssh/ssh_host_ed25519_key' are too open.
It is required that your private key files are NOT accessible by others.
This private key will be ignored.
key_load_private: bad permissions
The only things I can access are the gcloud command and the serial console.
I know I need to change the permissions back to 644 or 400, but I have no idea how, as I can't get in over SSH.
How do I change the permission without accessing ssh?
Any help would be much appreciated!
This problem can be solved by attaching the boot disk to another instance.
STEP 1:
Shut down your instance with the SSH problem. Log in to the Google Cloud Console. Go to Compute Engine -> VM instances. Click on your instance and make note of the "Boot disk" name. This will be the first disk under "Boot disk and local disks".
STEP 2:
Create a snapshot of the boot disk before doing anything further.
While still in Compute Engine -> Disks, click on your boot disk, then click "CREATE SNAPSHOT".
STEP 3:
Create a new instance in the same zone. A micro instance will work.
STEP 4:
Open a Cloud Shell prompt (this also works from your desktop if gcloud is set up) and execute the following command. Replace NAME with your instance name (the broken SSH system), DISK with the boot disk name, and ZONE with the zone that the system is in:
gcloud compute instances detach-disk NAME --disk=DISK --zone=ZONE
Make sure that the previous command did not report an error.
STEP 5:
Now we will attach this disk to the new instance that you created.
Make sure that the repair instance is already running. Sometimes an instance can get confused about which disk to boot from if more than one attached disk is bootable.
Go to Compute Engine -> VM instances. Click on your instance. Click Edit. Under "Additional disks" click "Add item". For name enter/select the disk that you detached from your broken instance. Click Save.
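If you prefer the command line to the console for this step, attaching the disk to the repair instance should look roughly like this (REPAIR_NAME is a placeholder for whatever you called the instance created in STEP 3):
gcloud compute instances attach-disk REPAIR_NAME --disk=DISK --zone=ZONE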
STEP 6:
SSH into your new instance with both disks attached.
STEP 7:
Follow these steps carefully. We will mount the second disk at /mnt/repair, then change the permissions on the /mnt/repair/etc/ssh directory and its contents.
Become superuser. Execute sudo -s
Execute df. Make sure that /dev/sdb1 is not mounted.
Create a directory for the mountpoint: mkdir /mnt/repair
Mount the second disk: mount /dev/sdb1 /mnt/repair
Change directories: cd /mnt/repair/etc
Set permissions for /etc/ssh (notice relative paths here): chmod 755 ssh
Change directories: cd ssh
Execute: chmod 644 *.pub
Execute: chmod 400 *key
ssh_config and sshd_config should still be 644. If not, fix them too.
Shutdown the repair system: halt
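Put together, the whole sequence on the repair instance looks roughly like this (assuming the broken boot disk really appears as /dev/sdb1; check df or lsblk first):
sudo -s                            # become root
df -h                              # confirm /dev/sdb1 is not already mounted
mkdir /mnt/repair                  # create the mount point
mount /dev/sdb1 /mnt/repair        # mount the broken instance's boot disk
cd /mnt/repair/etc
chmod 755 ssh                      # restore the directory permissions
cd ssh
chmod 644 *.pub                    # public keys stay world-readable
chmod 400 *key                     # private host keys must be owner-only
chmod 644 ssh_config sshd_config   # config files stay 644
halt                               # shut the repair instance down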
STEP 8:
Now reverse the procedure and move the second disk back to your original instance and reattach. Start your instance and connect via SSH.
Note: To reattach the boot disk you have to use gcloud with the --boot option.
gcloud beta compute instances attach-disk NAME --disk=DISK --zone=ZONE --boot
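Reversing the procedure from the command line would look roughly like this (REPAIR_NAME is the temporary instance, NAME the original one; both are placeholders):
gcloud compute instances detach-disk REPAIR_NAME --disk=DISK --zone=ZONE
gcloud beta compute instances attach-disk NAME --disk=DISK --zone=ZONE --boot
gcloud compute instances start NAME --zone=ZONE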

Your credentials are invalid. Please run $ gcloud auth login

gsutil was working as a stand-alone on my system.
Then I installed SDK, including some authentication stuff.
Now gsutil says my credentials are invalid.
$ gcloud auth login wolfvolpi@gmail.com
WARNING: `gcloud auth login` no longer writes application default credentials.
If you need to use ADC, see:
gcloud auth application-default --help
You are now logged in as [redacted].
Your current project is [redacted]. You can change this setting by running:
$ gcloud config set project PROJECT_ID
$ gsutil ls
Your credentials are invalid. Please run
$ gcloud auth login
How do I set my credentials so that gsutil works again?
$ gcloud version
Google Cloud SDK 146.0.0
core 2017.02.28
core-nix 2017.02.28
gcloud
gcloud-deps 2017.02.28
gcloud-deps-linux-x86_64 2017.02.28
gsutil 4.22
gsutil-nix 4.22
If you previously had gsutil configured with credentials, it is possible that it is picking up those old credentials that will no longer work in the Cloud SDK mode. Take a look in your boto file (typically at ~/.boto) and remove any credentials found in there.
You can see the list of the valid fields in the boto file here: https://cloud.google.com/storage/docs/gsutil/commands/config#additional-configuration-controllable-features
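If you are not sure whether the boto file still holds stale credentials, a quick check is to look at its [Credentials] section and remove or comment out any credential lines found there (this assumes the default ~/.boto location):
$ grep -n -A5 '\[Credentials\]' ~/.boto   # list the credentials section, if present
$ nano ~/.boto                            # then delete or comment out any old credential entries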
The marked solution does not always work. You can instead create a new configuration setting by running
gcloud init
Welcome! This command will take you through the configuration of gcloud.
Settings from your current configuration [old config file's name] are:
core:
  account: [account]
  disable_usage_reporting: 'True'
  project: [projectname]
Pick configuration to use:
[1] Re-initialize this configuration [old config file's name] with new settings
[2] Create a new configuration
[3] Switch to and re-initialize existing configuration: [default]
Please enter your numeric choice:
Then choose option 2 to create a new configuration. This can work for those who could not fix it with the solution above.
This also helps when you cannot use the gsutil cors set [filename].json gs://[BucketName] command to set CORS.
Then try again.
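For completeness, the gsutil cors set command mentioned above expects a small JSON file; a minimal sketch (the origin and bucket name are placeholders) could look like this:
[
  {
    "origin": ["https://example.com"],
    "method": ["GET"],
    "responseHeader": ["Content-Type"],
    "maxAgeSeconds": 3600
  }
]
Save that as cors.json and run gsutil cors set cors.json gs://my-bucket.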

SSH into a Google Compute Engine instance from Ubuntu

I am not able to access the Google Compute Engine instance using ssh or gcutil ssh. I have tried adding my local machine's keys to the metadata and SSH keys of the specific instance. How do I get access using an SSH client?
These are the links to the guides I followed.
Google Compute Engine - troubleshooting SSH "Connection refused"
https://cloud.google.com/compute/docs/instances/connecting-to-instance#standardssh
The official way is gcloud compute ssh [instance-name].
If this isn't what you did to get the error, let me know and I'll try to help.
Just to let you know, Google Compute Engine recommends that all users transition from gcutil to the gcloud compute tool. gcloud compute is a unified command-line tool that features a number of improvements over gcutil (link).
To connect to a Google Compute Engine instance you need to run the command below in your Cloud Shell:
gcloud compute ssh [INSTANCE_NAME]
Here is the public documentation. If you want to SSH from inside an instance, just run the same command as above.
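If the plain command cannot locate your instance, spelling out the zone and project usually helps; a minimal example (all values are placeholders):
gcloud compute ssh INSTANCE_NAME --zone=us-central1-a --project=my-project-id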
The trick here is to use the -C (comment) parameter to specify your GCE userid.
If the Google user who owns the GCE instance is myname@gmail.com (which you will use as your login userid), then generate the key pair with (for example)
ssh-keygen -b521 -t ecdsa -C myname -f mykeypair
When you paste mykeypair.pub into the instance's public key list, you should see "myname" appear as the userid of the key.
Setting this up will let you use ssh, scp, etc from your command line.
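Putting that together, an end-to-end sketch (myname, mykeypair, and EXTERNAL_IP are placeholders) would be:
ssh-keygen -b 521 -t ecdsa -C myname -f mykeypair   # creates mykeypair and mykeypair.pub
cat mykeypair.pub       # paste this into the instance's SSH keys in the Cloud Console
ssh -i mykeypair myname@EXTERNAL_IP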