"server unavailable" error when trying to authenticate user on Google Colab - google-colaboratory

I'm getting a server unavailable error whenever I try to authenticate a user on Colab.
My code is simply:
from google.colab import auth
auth.authenticate_user()
And I'm getting a
WARNING:google.auth.compute_engine._metadata:
Compute Engine Metadata server unavailable on attempt 1 of 3.
Reason: [Errno 115] Operation now in progress
I wonder if Google has changed anything.

Related

Dataflow: missing required authentication credential

I am getting the following error while running a Dataflow pipeline:
Error reporting inventory checksum: code: "Unauthenticated", message: "Request is missing required authentication credential.
Expected OAuth 2 access token, login cookie or other valid authentication credential.
We have created a service account dataflow#12345678.iam.gserviceaccount.com with the following roles:
BigQuery Data Editor
Cloud KMS CryptoKey Decrypter
Dataflow Worker
Logs Writer
Monitoring Metric Writer
Pub/Sub Subscriber
Pub/Sub Viewer
Storage Object Creator
In our Python code we are using import google.auth.
Any idea what I am missing here?
I do not believe I need to create a key for the SA; however, I am not sure whether an "OAuth 2 access token" for the SA needs to be created. If yes, how?
This was the issue in my case: https://cloud.google.com/dataflow/docs/guides/common-errors#lookup-policies
If you are trying to access a service through HTTP with a custom request (not using a client library), you can obtain an OAuth2 token for that service account using the metadata server of the worker VM. See this example for Cloud Run; you can use the same code snippet in Dataflow to get a token and use it with your custom HTTP request:
https://cloud.google.com/run/docs/authenticating/service-to-service#acquire-token
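For illustration, a minimal sketch of that pattern in Python: it asks the worker VM's metadata server for a token for the default service account, following the linked Cloud Run example. The target service URL is a placeholder, and the requests library is assumed to be available on the worker:

import requests

SERVICE_URL = "https://example-service.example.com"  # hypothetical service you want to call

# Ask the worker VM's metadata server for a token for the default service account.
token_url = (
    "http://metadata.google.internal/computeMetadata/v1/"
    "instance/service-accounts/default/identity?audience=" + SERVICE_URL
)
token = requests.get(token_url, headers={"Metadata-Flavor": "Google"}).text

# Attach the token as a Bearer credential on the custom HTTP request.
response = requests.get(SERVICE_URL, headers={"Authorization": f"Bearer {token}"})
print(response.status_code)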

bigquery.Client() throws OAuth connection timeout error

I am trying to connect to BigQuery via Google. I have a service account for my project, and while trying to run a query I am getting a timeout error.
Error: connection to oauth2.googleapis.com time out
Code snippet:
from google.cloud import bigquery

client = bigquery.Client()
client.query("")

Jupyter Notebook cannot access BigQuery

I've created my first Jupyter Notebook in Google Cloud. I've followed the instructions here
https://cloud.google.com/ai-platform/notebooks/docs/use-r-bigquery
to use R with BQ. However, when I try to run the code I keep getting
"Error: Access Denied: Project ABC: User does not have bigquery.jobs.create permission in project ABC. [accessDenied] " where ABC is my project ID.
I've added the BigQuery User & Admin permissions, logged out, and logged back in, but I keep getting the same message.
I recommend that you review the Manage access using IAM roles documentation, which explains how to add the required permissions to access the dataset in BigQuery.
Given the error message
"Error: Access Denied: Project ABC: User does not have bigquery.jobs.create permission in project ABC. [accessDenied]"
you need to add the bigquery.jobs.create permission to the user or service account being used.
In the Compute Engine menu you can check the service account used by the notebook; confirm that this service account has all the permissions mentioned above, or the BigQuery Admin role mentioned in your question.
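As a quick check from inside the notebook, running a trivial query will tell you whether the active credentials can create jobs. A minimal sketch with the Python client, assuming it is available in the notebook environment; the project ID "ABC" stands in for yours:

from google.api_core.exceptions import Forbidden
from google.cloud import bigquery

client = bigquery.Client(project="ABC")  # "ABC" stands in for your project ID
try:
    client.query("SELECT 1").result()  # running any query creates a job
    print("bigquery.jobs.create is granted")
except Forbidden as exc:  # an Access Denied error means the permission is still missing
    print("Job creation still fails:", exc)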

How can I access https://storage.googleapis.com/download.tensorflow.org/models/inceptions5h.zip

As far as I know, $ wget https://storage.googleapis.com/download.tensorflow.org/models/inceptions5h.zip should allow me to do so, but I get the following error:
Resolving storage.googleapis.com (storage.googleapis.com)... 216.58.220.208, 2404:6800:4005:80d::2010
Connecting to storage.googleapis.com (storage.googleapis.com)|216.58.220.208|:443... connected.
HTTP request sent, awaiting response... 403 Forbidden
2018-05-02 16:05:23 ERROR 403: Forbidden.
Your request doesn't appear to be authenticated.
cloudstorage.ForbiddenError
This error (403) indicates that the user was not authorized by Google Cloud Storage to make the request.
The various possible causes for this error are listed in the Google Cloud Storage error documentation for 403-Forbidden.
A common source of this error is that the bucket permissions (bucket ACL) are not set properly to allow your app access. See Google Cloud Storage Authentication for information on setting up access.
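If the object is not publicly readable, one alternative to plain wget is an authenticated download with the Cloud Storage Python client. A minimal sketch, assuming your application default credentials have storage.objects.get on the bucket; the bucket and object names are taken from the URL in the question:

from google.cloud import storage

client = storage.Client()  # uses your application default credentials
bucket = client.bucket("download.tensorflow.org")
blob = bucket.blob("models/inceptions5h.zip")  # object path from the question's URL
blob.download_to_filename("inceptions5h.zip")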

Magento1 REST API Access Denied

I am getting a permission denied issue in Magento 1.9 with the REST API even though I added all the required roles and permissions.
It's working for guest users, and I'm getting the JSON result.
I can fetch data via the URL api/rest/products?limit=1 without any authentication if I enable the Guest permission.
At the same time, it's working with OAuth for a valid admin user.
But if I disable guest permissions, it's not working for a valid admin user and shows the permission denied message.
When I check the access log, I see the following:
exception 'Mage_Api2_Exception' with message 'Access denied' in /var/www/html/app/code/core/Mage/Api2/Model/Server.php:217
Stack trace: #0 var/www/html/app/code/core/Mage/Api2/Model/Server.php(106): Mage_Api2_Model_Server->_allow(Object(Mage_Api2_Model_Request), Object(Mage_Api2_Model_Auth_User_Guest))
#1 /var/www/html/api.php(73): Mage_Api2_Model_Server->run()
Is it because each API request via OAuth is treated as a Guest request?
How are you testing? Are you absolutely sure that you are indeed running an authorised request when disabling guest permission?
Maybe you can do some step-by-step debugging in the _allow method of the Mage_Api2_Model_Server class.