How do I solve this access error from hal config provider appengine? - spinnaker

I get an error when I run the command below. Which resources am I denied access to, and how do I get access to them?
Steins-MacBook-Pro-2:Spinnaker stein$ hal config provider appengine account add my-appengine-account --project $GCP-PROJECT-ID --write-permissions=[]
Get current deployment   Success
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.fasterxml.jackson.databind.util.ClassUtil (file:/opt/halyard/lib/jackson-databind-2.9.7.jar) to constructor java.lang.Void()
WARNING: Please consider reporting this to the maintainers of com.fasterxml.jackson.databind.util.ClassUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Add the my-appengine-account account   Failure
Problems in default.provider.appengine.my-appengine-account:
! ERROR Failed to connect to appengine Admin API: 403 Forbidden
{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "The caller does not have permission",
    "reason" : "forbidden"
  } ],
  "message" : "The caller does not have permission",
  "status" : "PERMISSION_DENIED"
}
Failed to add account my-appengine-account for provider appengine.
Steins-MacBook-Pro-2:Spinnaker stein$ hal config provider appengine account list
Get current deployment Success
Get the appengine provider   Success
Problems in default.provider.appengine:
WARNING Provider appengine is enabled, but no accounts have been configured.
No configured accounts for appengine.

What was missing was the right value in place of $GCP-PROJECT-ID.
I changed it to the actual project ID, project-235906, and then it worked.
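A note for anyone hitting the same thing: shell variable names cannot contain hyphens, so $GCP-PROJECT-ID expands as $GCP followed by the literal text -PROJECT-ID, which sends a garbage project ID to the API. A minimal sketch of a working invocation (project-235906 stands in for your own project ID; the hyphen-free variable name is my own choice):

# Use a hyphen-free variable name, or pass the project ID literally
export GCP_PROJECT_ID=project-235906
hal config provider appengine account add my-appengine-account \
  --project "$GCP_PROJECT_ID"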

Related

Radixx Api Integration

I have been given the credentials and endpoint by Radixx, but when I use the endpoint to get a security token, I get this error:
Additional Source Information: ERR9999: Unexpected error in GetSecurityGUID.Error Generation Source Location: DSAIXUAT1APP01 Session ID: -1 Error Details: Your user ID and Password combination did not validate.

Moving data from Google Big Query to Azure Data Lake Store using Azure Data Factory

I have a scenario where I need to connect Azure Data Factory (v2) to Google BigQuery in order to move data to my Azure Data Lake, but it doesn't work.
When I create a Linked Service, I choose BigQuery as the source and fill in all the information about BigQuery (project name, type of connection, etc.), but when I click the Validate button I am shown a message like this:
UserError: ERROR [28000] [Microsoft][BigQuery] (80) Authentication failed: invalid_grant ERROR [28000] [Microsoft][BigQuery] (80) Authentication failed: invalid_grant'Type=,Message=ERROR [28000] [Microsoft][BigQuery] (80) Authentication failed: invalid_grant ERROR [28000] [Microsoft][BigQuery] (80) Authentication failed: invalid_grant,Source=,', activityId: 05ff5ce1-5262-4794-853c-c9e39b7f4b71
Any ideas? Has anyone already tested this connector?
Thanks.
Peeter Bonomo
The documentation for the ADF connector to BigQuery explains what the parameters mean, but it doesn't give any guidance on how to obtain the values for those parameters.
I spent a couple of days on this and finally got it to work. I used "User Authentication" because I wanted to use a cloud-based IR. The "Client Id" and "Client Secret" can be obtained by creating new credentials in the GCP Console. But to get the "Refresh Token", you have to do the OAuth2 dance with Google and intercept the token.
I put together a PowerShell script to do this for you and wrote up a post on GitHub that walks you through the process of authenticating ADF v2 to Google BigQuery:
https://github.com/AnalyticJeremy/ADF_BigQuery
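For context, here is a minimal sketch of the "OAuth2 dance" that script automates, assuming you have already created OAuth client credentials in the GCP Console (every YOUR_* value below is a placeholder you must supply):

# Step 1: open this URL in a browser, grant access, and capture the "code"
# parameter from the redirect (access_type=offline is what makes Google
# return a refresh token):
#   https://accounts.google.com/o/oauth2/v2/auth?client_id=YOUR_CLIENT_ID&redirect_uri=http://localhost&response_type=code&scope=https://www.googleapis.com/auth/bigquery&access_type=offline

# Step 2: exchange the authorization code for tokens; the refresh_token
# field of the JSON response is what the ADF linked service needs.
curl -s https://oauth2.googleapis.com/token \
  -d client_id=YOUR_CLIENT_ID \
  -d client_secret=YOUR_CLIENT_SECRET \
  -d code=YOUR_AUTH_CODE \
  -d redirect_uri=http://localhost \
  -d grant_type=authorization_code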
This is the error you get for any access issue, unfortunately. It's also the error you get when your refresh token has expired, which it always does after 60 minutes. Which is incredibly curious: this, like so many sets of instructions on OAuth 2.0 authentication for ADF, never mentions that all this work is to get a code that expires in 60 minutes. Without some method of refreshing it every time you connect, this is worthless. At least the following link mentions this error and that you get it because the token has expired; it's the only blog post (or piece of Microsoft documentation) that bothers to mention this super big issue: https://medium.com/@varunck/data-transfer-from-google-bigquery-to-azure-blob-storage-using-azure-data-factory-adf-80659c078f83
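One clarification I am fairly confident of: in Google's OAuth 2.0 flow it is the short-lived access token that expires after about 60 minutes; the refresh token itself is long-lived, and clients present it to mint a new access token on each connection, roughly like this (a sketch; all YOUR_* values are placeholders):

# Exchange a long-lived refresh token for a fresh ~60-minute access token
curl -s https://oauth2.googleapis.com/token \
  -d client_id=YOUR_CLIENT_ID \
  -d client_secret=YOUR_CLIENT_SECRET \
  -d refresh_token=YOUR_REFRESH_TOKEN \
  -d grant_type=refresh_token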
Here's a different method, which is what I will be attempting; it uses a service account and an IR: https://www.youtube.com/watch?v=oRqRt7ya_DM
According to https://learn.microsoft.com/en-us/azure/data-factory/connector-google-bigquery, to connect to Google BigQuery via a cloud IR you need to provide the following information:
{
    "name": "GoogleBigQueryLinkedService",
    "properties": {
        "type": "GoogleBigQuery",
        "typeProperties": {
            "project" : "<project ID>",
            "additionalProjects" : "<additional project IDs>",
            "requestGoogleDriveScope" : true,
            "authenticationType" : "UserAuthentication",
            "clientId": "<id of the application used to generate the refresh token>",
            "clientSecret": {
                "type": "SecureString",
                "value": "<secret of the application used to generate the refresh token>"
            },
            "refreshToken": {
                "type": "SecureString",
                "value": "<refresh token>"
            }
        }
    }
}
Also note that the user you use to grant permission to the application should have access to the project you want to query.
Thanks,
Eva

Google OAuth1 migration to OAuth2

Please don't confuse this question with this one: Migration from OAuth1 3L to OAuth2:
I have been migrating my system's users from Google OAuth1 to OAuth2 as specified here.
The usual problem with this migration is building the base_string, which was a problem for me before asking this question. After fixing the base_string construction I migrated 95% of my users, but a small number of users kept returning a 400 error:
{
  "error": "invalid_request",
  "error_description": "Invalid authorization header."
}
Here are the most important fields to check on:
base_string:
POST&https%3A%2F%2Fwww.googleapis.com%2Foauth2%2Fv3%2Ftoken&client_id%3DXXX%26client_secret%3DXXX%26grant_type%3Durn%253Aietf%253Aparams%253Aoauth%253Agrant-type%253Amigration%253Aoauth1%26oauth_consumer_key%3DXXX%26oauth_nonce%3D178143337915967474871427127026%26oauth_signature_method%3DHMAC-SHA1%26oauth_timestamp%3D1427127026%26oauth_token%3D1%XXX
URL: https://www.googleapis.com/oauth2/v3/token
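For reference, a sketch of how an OAuth1 HMAC-SHA1 signature over a base string like the one above is typically computed, assuming the signing key is the percent-encoded consumer secret and token secret joined by "&" (the secrets below are placeholders):

# BASE_STRING is the percent-encoded base string shown above;
# the signing key is <consumer secret>&<token secret>, each percent-encoded
BASE_STRING='POST&https%3A%2F%2Fwww.googleapis.com%2Foauth2%2Fv3%2Ftoken&...'
SIGNING_KEY='CONSUMER_SECRET&TOKEN_SECRET'
printf '%s' "$BASE_STRING" \
  | openssl dgst -sha1 -hmac "$SIGNING_KEY" -binary \
  | base64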
I have to say, these users do work with OAuth1 at the moment, so they are fully valid connections. An invalid/expired connection returns this 500 error:
{
  "error": "invalid_token",
  "error_description": "Either the token is invalid or we could not decode it."
}

Google Big-query api 403-Forbidden Exception

I am getting the following JSON exception while executing a query against the BigQuery API from my Java application:
{
    "error": {
        "errors": [
            {
                "domain": "global",
                "reason": "accessDenied",
                "message": "Access Denied: Job eco-span-847:job_LyHmZIvlY1_0J8JQ3pxThEBf19I: The user does not have permission to run a query in project eco-span-847"
            }
        ],
        "code": 403,
        "message": "Access Denied: Job eco-span-847:job_LyHmZIvlY1_0J8JQ3pxThEBf19I: The user does not have permission to run a query in project eco-span-847"
    }
}
Why is this occurring and how could I resolve this?
To insert a query job in a project the calling user must be at least a Reader on the project.
When using the BigQuery API, you need to construct the client with OAuth credentials for some user. For programmatic access, this is often a Service Account identity. When you create a new Service Account, that account is not automatically granted a membership role on your project.
To update the users and service accounts that are members of your project, go to https://console.developers.google.com/ for your project, select "Permissions" in the navigation panel, and make sure the user or service account identity you are calling with is a "Reader" on the project.
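Equivalently, from the command line, a sketch of granting that membership, using the project ID from the error message and a hypothetical service account address (roles/viewer corresponds to the legacy "Reader" project role):

# Grant the calling identity Reader (viewer) access on the project
gcloud projects add-iam-policy-binding eco-span-847 \
  --member="serviceAccount:query-runner@eco-span-847.iam.gserviceaccount.com" \
  --role="roles/viewer"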
Check whether the email/service account has been added under Google Cloud's IAM & Admin.
Verify that it has the role/permission needed to use the API (see access control).
If the above are not met, ask the Owner/Admin of the project to add them.
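For a quick way to check who currently holds which role on the project (a sketch, using the project ID from the question):

# List every role binding on the project, one member per row
gcloud projects get-iam-policy eco-span-847 \
  --flatten="bindings[].members" \
  --format="table(bindings.role, bindings.members)"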

Server security configuration for WLClientLogReceiver

I have a Worklight app and I am trying to set up an adapter to capture client-side logs from it. (I first tried to use Analytics, but it keeps crashing with a PermGen out-of-memory error; perhaps I will have to look at that again if this fails.) I followed the steps described in http://www-01.ibm.com/support/knowledgecenter/#!/SSZH4A_6.2.0/com.ibm.worklight.dev.doc/devref/c_uploaded_client_log_data.html up to "Server security". I have no idea how to actually configure the server realm/security check/etc. for the log uploader servlet. Currently it returns this error (on both the development and the production server):
[ERROR ] FWLSE0059E: Login into realm 'WLRemoteDisableNullLoginModule' failed. Cannot find application 'null'. [project Project]
Cannot find application 'null'
[ERROR ] FWLSE0117E: Error code: 4, error description: AUTHENTICATION_ERROR, error message: An error occurred while performing authentication using loginModule WLRemoteDisableNullLoginModule, User Identity {wl_directUpdateRealm=null, wl_authenticityRealm=null, Project=(name:2, loginModule:ProjectLoginModule), wl_remoteDisableRealm=null, SampleAppRealm=null, wl_antiXSRFRealm=null, wl_deviceAutoProvisioningRealm=null, WorklightConsole=null, wl_deviceNoProvisioningRealm=null, myserver=(name:2, loginModule:ProjectLoginModule), wl_anonymousUserRealm=null}. [project Project] [project Project]
[ERROR ] com.worklight.core.messages:Invoke procedure failed due to: null
I tried uncommenting the customTests section in authenticationConfig.xml containing the wl_remoteDisableRealm, but to no avail.
How should this be configured?
I see from your comment you got it working. We did not want to duplicate documentation for authenticationConfig.xml and risk it getting out of sync on the "Server preparation for uploaded log data" KnowledgeCenter topic page in the "Server security" section. That said, we should have provided a link to the Worklight Security Framework topic page.
There is nothing special or unique about the configuration of the log receiver servlet in the context of security. The point being made in that section is that if you configure authenticationConfig.xml so that security issues challenges to the app that require user interaction, you should either:
send logs only when you are sure you are already authenticated, or
change the security constraints so that authentication for the log upload servlet URL does not require user interaction.
If you leave these in place, the risk is that the end user will see a random prompt for credentials when they do not expect it.
The reason the "Server preparation for uploaded log data"