gsutil AccessDeniedException - Login Required in both API Explorer and Service Account - authentication

I am using gsutil with a service account to download a file from a gs bucket. I ran it an hour ago and it worked without a problem, but when I ran the same code again I got an AccessDeniedException: 401 Login Required.
To get more information, I run the command with a -D flag, like:
gsutil -D cp gs://mybucket/file localpath
In the debug output, I look at:
Command being run: /path/to/gsutil
config_file_list: /path/to/boto/config
Case 1: Running gsutil under a service account or as another user
I configured a service account using the executable at /path/to/gsutil (not gcloud auth or any other gsutil executable on the machine):
/path/to/gsutil config -e
This created a .boto config file in my home directory, $HOME/.boto, which I then moved to another location for the service to reference.
Since I'm running the service as another user, I need to reference the newly-created .boto file. I set the environment variable BOTO_CONFIG:
BOTO_CONFIG=/path/to/$HOME/.boto
export BOTO_CONFIG
I can confirm that I am referencing the correct config file by looking at the config_file_list variable in the gsutil -D command's output.
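A quicker check is gsutil's detailed version output, which should list the same config path (a sketch; the grep pattern assumes the "config path(s)" label printed by current gsutil releases):
gsutil version -l | grep -i "config path"
# should print the path to the .boto file actually being loaded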
To set up the service account, I followed:
https://console.cloud.google.com/permissions/serviceaccounts
The key file was downloaded during the service account set-up process, and the path to it is supplied during the gsutil config -e step.
However, the response to running the gsutil command from the service account is AccessDeniedException: 401 Login Required.
Case 2: API Explorer
On further exploration, I used the web-based API explorer storage.objects.get at:
https://developers.google.com/apis-explorer/#p/storage/v1/storage.objects.get
to get the object's "mediaLink". When I click the mediaLink, I still receive a "Login Required" response, despite being logged into my Google account in the browser.
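For reference, the same mediaLink can be fetched from the command line with an OAuth access token attached (a sketch, assuming the Cloud SDK is installed; the URL is a placeholder for the actual mediaLink value):
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://www.googleapis.com/download/storage/v1/b/mybucket/o/file?generation=GENERATION&alt=media"
# with a valid token the object downloads; a plain browser click sends no Authorization header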
Why am I receiving the above "Login Required" responses in both cases, when I have set up the service account as instructed in Case 1, and am logged into the API explorer in Case 2?

I was able to solve this problem by looking at the read permissions on the .boto file. The file that was created by
gsutil config -e
only had read permission for the current user. Since the service ran as a different user, it could not read the file, which yielded the 401 Login Required error. I fixed it by adding read permission for the service's group.
In the least sophisticated case, you could fix it by giving any user read permission with
chmod a+r .boto
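For a more targeted fix than world-readable, a minimal sketch assuming the service runs under a hypothetical group named svcgroup:
ls -l /path/to/.boto            # e.g. -rw------- : only the owner can read it
chgrp svcgroup /path/to/.boto   # put the file in the service's group
chmod g+r /path/to/.boto        # grant read permission to that group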

Related

How to create symbolic link for Google Cloud SDK gcloud directory on a NAS drive?

The problem: we have several servers that need to reference the same Google Cloud SDK credentials and we want to reference those credentials from a central location. What is the easiest way to share these credentials between several servers?
What we tried: we tried to create soft (symbolic) and hard links to the gcloud directory, but we did not have success; in both cases we received the following error message:
C:\Users\Redacted\AppData\Local\Google\Cloud SDK> bq ls
WARNING: Could not open the configuration file: [C:\Users\Redacted\AppData\Roaming\gcloud\configurations\config_default].
ERROR: (bq) You do not currently have an active account selected.
Found the answer... use the commands below to make soft symbolic links for the two credential files, not for the gcloud directory itself. Run them from an elevated command prompt in the local gcloud configuration directory (C:\Users\<user>\AppData\Roaming\gcloud); the \\NAS\... targets are placeholders for wherever the central copies live:
mklink access_tokens.db \\NAS\central\gcloud\access_tokens.db
mklink credentials.db \\NAS\central\gcloud\credentials.db

Create an SSH key for other account on Google Cloud Platform

I have installed the Cloud SDK for Google Cloud. I logged in using gcloud auth, which redirected me to the Gmail login. I created the SSH key and even logged in by SFTP using FileZilla.
The problem is that when I log in using the Gmail auth, the SDK shell (or PuTTY?) logs me into an account that is not admin. It has created another SSH user account (named 'Acer', after my PC) and logs me into it. Because of this, FTP starts at the /home/Acer folder. I want access to the /home/admin/web folder, but I don't have it now.
How can I create an SSH key for the admin account so that I can gain access to the folder mentioned above? Otherwise, is it possible to grant 'Acer' the permissions to access all the folders?
I have a few suggestions.
First a bit of background. If you run this command on your home workstation:
sudo find / -iname gcloud
You'll discover a gcloud configuration folder for each user on your home workstation. You'll probably see something like this:
/root/.config/gcloud
/home/Acer/.config/gcloud
If you change directory into /home/Acer/.config/gcloud/configurations you'll see a file named 'config_default'. This file will contain the default account to use for that user ('Acer').
Because you performed gcloud auth login as that user and selected your Gmail account during the process, the config file for that user contains that Gmail ID/account. If you would like a user named 'admin' to log into your project, you could try adding a user named 'admin' to your home workstation and then, before attempting gcloud auth login, switching to user 'admin' on your home workstation. This will generate a gcloud configuration on your home workstation for user 'admin' and propagate SSH keys etc.
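A sketch of that flow on the workstation (assuming a Debian-like system; the local user name 'admin' is whatever you want the instance-side user to be):
sudo useradd -m admin              # create the local 'admin' user with a home directory
sudo su - admin                    # switch to that user
gcloud auth login                  # authenticate; writes /home/admin/.config/gcloud
gcloud compute ssh INSTANCE_NAME   # logs in as 'admin' and propagates the SSH key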
If you want to create ssh keys manually there's some useful info here.
(For what it's worth, if you decide to use gcloud compute ssh to log into your instance from your home workstation, you can specify the user you would like to log in as. For example gcloud compute ssh admin@INSTANCE_NAME.)
I want access to the /home/admin/web folder, but I don't have it now.
Even if you are logged into the machine as a different user (in this case 'Acer'), the folder /home/admin/web should still exist on the instance if it existed previously. If you land in folder /home/Acer have you tried changing directory to the folder above and then listing the folders to see if /home/admin/ exists?
For example, from /home/Acer run:
$ cd ..
then
$ ls
You should be able to see /home/admin/.
Otherwise, is it possible to grant 'Acer' the permissions to access all the folders?
Yes, this is also possible. Access the instance as the project owner (the easiest way is to log into the Console as the owner of the project and use the SSH functionality there). Then run this command:
$ sudo chown -R Acer:Acer /home/admin/web
This will make user 'Acer' owner of directory /home/admin/web and all files/directories below it (thanks to the -R switch).
Now when you next access the instance as user 'Acer' you'll be able to access /home/admin/web by running the following and you'll also have read/write capabilities:
$ cd /home/admin/web

Your credentials are invalid. Please run $ gcloud auth login

gsutil was working as a stand-alone on my system.
Then I installed SDK, including some authentication stuff.
Now gsutil says my credentials are invalid.
$ gcloud auth login wolfvolpi@gmail.com
WARNING: `gcloud auth login` no longer writes application default credentials.
If you need to use ADC, see:
gcloud auth application-default --help
You are now logged in as [redacted].
Your current project is [redacted]. You can change this setting by running:
$ gcloud config set project PROJECT_ID
$ gsutil ls
Your credentials are invalid. Please run
$ gcloud auth login
How to set my credentials so that gsutil runs again?
$ gcloud version
Google Cloud SDK 146.0.0
core 2017.02.28
core-nix 2017.02.28
gcloud
gcloud-deps 2017.02.28
gcloud-deps-linux-x86_64 2017.02.28
gsutil 4.22
gsutil-nix 4.22
If you previously had gsutil configured with credentials, it is possible that it is picking up those old credentials that will no longer work in the Cloud SDK mode. Take a look in your boto file (typically at ~/.boto) and remove any credentials found in there.
You can see the list of the valid fields in the boto file here: https://cloud.google.com/storage/docs/gsutil/commands/config#additional-configuration-controllable-features
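For reference, the stale entries usually live in the [Credentials] section of ~/.boto and look something like this (the key shown is the standalone OAuth2 refresh token; its value is a placeholder):
[Credentials]
# gs_oauth2_refresh_token = <old-refresh-token>   <- delete or comment out this line
With the old token removed, gsutil defers to the Cloud SDK's credentials.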
The marked solution did not always work for me. Instead, you can create a new configuration by running:
gcloud init
Welcome! This command will take you through the configuration of gcloud.
Settings from your current configuration [old config file's name] are:
core:
account: [account]
disable_usage_reporting: 'True'
project: [projectname]
Pick configuration to use:
[1] Re-initialize this configuration [old config file's name] with new settings
[2] Create a new configuration
[3] Switch to and re-initialize existing configuration: [default]
Please enter your numeric choice:
Then choose option 2 to create a new configuration. This may work for those who could not solve the problem with the solution above. It also helps when the gsutil cors set [filename].json gs://[BucketName] command fails to set CORS; after re-initializing, try the command again.
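Afterwards, you can list the configurations and switch between them (the configuration name is whatever you entered during gcloud init):
gcloud config configurations list
gcloud config configurations activate my-new-config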

Revoking access to gsutil OAuth Token

We had configured standalone gsutil on a remote server, but we no longer have access to that server. How do we revoke the access granted to gsutil there? The .boto file on the server will contain the OAuth 2.0 refresh token.
We do not have access to the server, so we cannot remove the .boto file.
The configured project is active in our console, but we cannot see any specific access in the permissions section.
A standalone gsutil script was installed (not gcloud).
Use gcloud auth revoke.
https://cloud.google.com/sdk/gcloud/reference/auth/revoke
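For a machine you do still control, a sketch of revoking Cloud SDK credentials locally (the account address is a placeholder):
gcloud auth list                        # show which accounts are credentialed
gcloud auth revoke someone@example.com  # revoke and delete that account's local credentials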
The .gsutil directory just gets recreated within 10 seconds for me.
OK we can revoke access to gsutil from account permissions through this link:
https://security.google.com/settings/security/permissions
[Screenshot: Google security permissions page]
Just remove credstore files.
rm -rf ~/.gsutil/

Plesk Cron jobs and FTP - who is the owner for file access?

Trying to set up a cron task that gets a file via FTP, but it seems to fail due to file permissions.
The code runs perfectly in the browser, i.e. when Apache is the owner, but fails when cron runs the same page.
I'm assuming this is a directory/file permission error; if so, who should I set the directory owner to for cron jobs?
Most likely Dan's thought is going to be your problem. However, since it works from a browser, you can also call the page like this:
wget -q "http://www.domain.com/path/to/script/script.whatever" >/dev/null 2>&1
If you still get errors, you can remove the >/dev/null 2>&1 part; then, provided your email address is set correctly in the domain administrator account, the output, including errors, should get emailed to you.
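For example, a crontab entry along those lines (the schedule here is arbitrary):
*/15 * * * * wget -q "http://www.domain.com/path/to/script/script.whatever" >/dev/null 2>&1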
As for the correct permissions, don't change the default plesk ones or you will get issues with normal ftp.
Defaults are:
everything under httpdocs = ftpuser.psacln
anything written by PHP/Apache = apache.apache, unless you are running PHP as a CGI on that domain, in which case those files will belong to the FTP user as well.
-sean
Cron jobs run as the user that created them. More likely than a permissions error is a path error: if you're not specifying full absolute paths to the program/script being run, and to any files you reference, you'll likely have problems, because cron doesn't have the same PATH in its environment as Apache does or as you do at your shell prompt.
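A sketch of both fixes in the crontab itself (all paths here are hypothetical):
PATH=/bin:/usr/bin:/usr/local/bin
0 * * * * /usr/bin/php /var/www/vhosts/domain.com/httpdocs/fetch.php >/dev/null 2>&1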