Unable to create a connection (JWT/OAuth) with Google BigQuery connector - google-bigquery

I am facing connection issues with the Google BigQuery connector version 1.0.0, newly launched by MuleSoft in March 2022.
I've created a service account as well as an OAuth web application in Google Cloud Platform (GCP) and used values from the JSON file generated by GCP.
Test Connection fails, but the application deploys successfully; when the flow reaches any BigQuery connector operation, an error is raised. (Please see the attached images.)
I failed to connect using the "JWT Connection" as well as the "OAuth2 Connection".
Can someone guide me on how to connect with the Google BigQuery connector?
JWT Connection Image
OAuth Connection Image
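One way to narrow this down is to verify that the service-account JSON works outside Mule before debugging the connector fields. Below is a minimal sketch using the google-cloud-bigquery Java client; the key path is a placeholder. If this fails with the same credential error, the problem is in GCP (key, scopes, or service-account permissions); if it succeeds, compare these values against the connector's JWT configuration.

import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import java.io.FileInputStream;

public class BigQueryCredentialCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder path: the service-account key JSON downloaded from GCP.
        try (FileInputStream keyStream = new FileInputStream("/path/to/service-account.json")) {
            GoogleCredentials credentials = GoogleCredentials.fromStream(keyStream)
                    .createScoped("https://www.googleapis.com/auth/bigquery");

            BigQuery bigquery = BigQueryOptions.newBuilder()
                    .setCredentials(credentials)
                    .build()
                    .getService();

            // Listing datasets is a cheap call that exercises authentication.
            bigquery.listDatasets().iterateAll()
                    .forEach(ds -> System.out.println(ds.getDatasetId().getDataset()));
        }
    }
}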

Related

Unable to connect to Azure Log Analytics from Grafana

The Grafana application is hosted on a Linux server. When trying to establish a connection from the Grafana web application page to Log Analytics, I get the error below:
1. Successfully queried the Azure Monitor Service.
2. Azure Log Analytics: Bad Gateway: Cannot connect to Azure Log Analytics REST API

Use Google Storage Transfer API to transfer data from external GCS into my GCS

I am working on a web application that comprises a ReactJS frontend and a Java Spring Boot backend. The application requires users to upload data from their own Google Cloud Storage into my Google Cloud Storage.
The application flow will be as follows:
1. The frontend requests read access to the user's storage. For this I have used OAuth 2.0 access tokens, as described here.
2. The generated OAuth token is passed to the backend.
3. The backend also has credentials for my service account, allowing it to access my Google Cloud APIs. I have created the service account with the required permissions and generated the key using the instructions from here.
4. The backend uses the generated access token and my service account credentials to transfer the data.
In the final step, I want to create a transfer job using the Google Storage Transfer API. I am using the Java API client provided here for this.
I am having difficulty providing the authentication credentials to the Transfer API.
As I understand it, two different sets of credentials are required: one for reading the user's bucket and another for starting the transfer job and writing the data to my Cloud Storage. I haven't found any relevant documentation or working examples for my use case. All the available samples assume that the same service account credentials have access to both the source and sink buckets.
tl;dr
Does the Google Storage Transfer API allow setting different source and target credentials for GCS-to-GCS transfers? If yes, how does one provide these credentials in the transfer job specification?
Any help is appreciated. Thanks!
Unfortunately, this is not allowed by the GCS Transfer API; for this to work, the service account would need access to both the source and the sink buckets, as you mentioned.
You can try opening a feature request in Google's Issue Tracker so that Google's product team can consider such functionality for newer versions of the API. You could also mention that this subject is not covered in the documentation, so the documentation can be improved.
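For reference, the single-credential setup that the API does support looks roughly like the sketch below, assuming the google-cloud-storage-transfer Java client; the project ID and bucket names are placeholders. The service account running the job needs read access on the source bucket and write access on the sink bucket.

import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.CreateTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.GcsData;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferSpec;

public class GcsToGcsTransfer {
    public static void main(String[] args) throws Exception {
        // Placeholders: your project and the source/sink bucket names.
        String projectId = "my-project-id";
        String sourceBucket = "users-source-bucket";
        String sinkBucket = "my-sink-bucket";

        TransferJob job = TransferJob.newBuilder()
                .setProjectId(projectId)
                .setTransferSpec(TransferSpec.newBuilder()
                        .setGcsDataSource(GcsData.newBuilder().setBucketName(sourceBucket))
                        .setGcsDataSink(GcsData.newBuilder().setBucketName(sinkBucket)))
                .setStatus(TransferJob.Status.ENABLED)
                .build();

        // Uses Application Default Credentials: a single identity for both buckets.
        try (StorageTransferServiceClient client = StorageTransferServiceClient.create()) {
            TransferJob created = client.createTransferJob(
                    CreateTransferJobRequest.newBuilder().setTransferJob(job).build());
            System.out.println("Created transfer job: " + created.getName());
        }
    }
}

So in practice, having the user grant your service account read permission on their source bucket (rather than passing their OAuth token) is the arrangement the API currently expects.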

API Connect on IBM Cloud: error when trying to expose a local API through API Connect on cloud

I'm not able to expose a local REST API through API Connect on cloud.
API Connect on Cloud: error
I created a REST API on my laptop using IIB, and I want to expose it through API Connect on IBM Cloud. Since the "Push REST API" option in the IIB Web Admin is not working, I used the swagger.json file to add the API details manually to APIC on Cloud. I followed these steps:
1. In IBM Cloud, created resources for the API Connect and Secure Gateway Cloud Foundry services.
2. Created a Secure Gateway destination and have the SG client running on my laptop.
3. Created a simple REST API using IIB V10 and deployed it to my local Integration Node.
4. Tried to push the REST API using the IIB Web Admin, giving the host as api.us-south.apiconnect.appdomain.cloud and my IBM Cloud account username/password, but it failed, saying it was unable to connect:
Unable to connect to IBM API Connect at host 'api.us-south.apiconnect.appdomain.cloud' port '443'
5. Then I tried to create an API manually using the swagger.json file available in the IIB REST API project, via the "from file or URL" option in APIC on IBM Cloud.
6. I gave my laptop's IP as the "Host" value in the APIC designer.
7. In the "Assembly", I included a "Proxy" policy and updated its Target URL to cap-sg-prd-2.securegateway.appdomain.cloud:17041.
When I try to test the above, I get the following error:
<httpMessage>Internal Server Error</httpMessage>
<moreInformation>Backside URL invalid</moreInformation>
Can you please help resolve it?
You're missing one or both of the following:
1) The "Target URL" must be a valid URL. It looks like you entered just a hostname, so you likely need https://cap-sg-prd-2.securegateway.appdomain.cloud:17041. Doing that and republishing the API should resolve the "Backside URL invalid" error.
Once you do that, you may find that you still can't reach the backend due to either a timeout or a connection-refused error. If so:
2) Did you allow access to the Secure Gateway destination via the client on your local machine? You have to intentionally set an ACL on the client to allow traffic to the host/port on your network. A quick reachability check is sketched below.
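To tell the two causes apart, you can first confirm that the Secure Gateway endpoint accepts connections at all. A hedged sketch in plain Java, using the endpoint from the question: a connection-refused or timeout error points at the ACL in point 2, while a clean HTTP status points back at the Target URL in point 1.

import java.net.HttpURLConnection;
import java.net.URL;

public class GatewayReachabilityCheck {
    public static void main(String[] args) throws Exception {
        // Endpoint from the question; note the explicit https:// scheme,
        // which the API Connect Target URL also requires.
        URL url = new URL("https://cap-sg-prd-2.securegateway.appdomain.cloud:17041/");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        conn.setConnectTimeout(5000);
        conn.setReadTimeout(5000);
        // A ConnectException or SocketTimeoutException here usually means the
        // Secure Gateway client ACL is still blocking the destination.
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}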

Unable to create platform for GCM on AWS

I created a platform for an iOS app, and it was created fine on AWS, but when I tried to create a GCM platform in the SNS section, it gave an error. I have attached the image.
I got exactly the same issue as you. It seems Google is migrating Google Cloud Messaging (GCM) to Firebase Cloud Messaging (FCM), and the API key created via Credentials in the API Manager of Google Cloud Platform is not working.
Here is how I got it to work:
1. Go to the Firebase Console and import your Google Cloud project.
2. Go to Project settings in the Firebase Console, and you should see the Web API Key of your project.
3. Go back to Google Cloud Platform, and go to Credentials in the API Manager; you should see that two API keys have been generated: Browser key (auto created by Google Service) and Server key (auto created by Google Service).
4. The Server key (auto created by Google Service) is what you need to use on Amazon SNS.
Hope this resolves your problem, and hope it is only a temporary solution; once Google completes the migration, we should be able to use the API key created in the API Manager directly.
You should check whether you are using the correct Server key. It should look something like AIzaSyZ-1u...0GBYzPu7Udno5aA.
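If you are creating the platform application programmatically rather than in the console, the same Server key goes into the PlatformCredential attribute. A sketch with the AWS SDK for Java v1; the application name and key value are placeholders.

import com.amazonaws.services.sns.AmazonSNS;
import com.amazonaws.services.sns.AmazonSNSClientBuilder;
import com.amazonaws.services.sns.model.CreatePlatformApplicationRequest;
import com.amazonaws.services.sns.model.CreatePlatformApplicationResult;

public class CreateGcmPlatformApplication {
    public static void main(String[] args) {
        AmazonSNS sns = AmazonSNSClientBuilder.defaultClient();

        CreatePlatformApplicationRequest request = new CreatePlatformApplicationRequest()
                .withName("my-android-app")     // placeholder application name
                .withPlatform("GCM")            // SNS uses "GCM" for GCM/FCM apps
                .addAttributesEntry("PlatformCredential", "YOUR_SERVER_KEY"); // Server key from Firebase

        CreatePlatformApplicationResult result = sns.createPlatformApplication(request);
        System.out.println(result.getPlatformApplicationArn());
    }
}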

Azure HDInsight authentication failure when connecting to HDInsight web portal

After creating a new HDInsight cluster, I receive the following error when I provide the authentication information and try to access the HDInsight web portal at https://{Cluster Name}.azurehdinsight.net/:
403 - Forbidden: Access is denied.
I have tried recreating clusters, using different browsers, and clearing the browser cache, but I have not been able to log in. Please suggest.
The interactive console has been discontinued, and the team now recommends using Windows Azure PowerShell:
http://www.windowsazure.com/en-us/documentation/articles/hdinsight-submit-hadoop-jobs-programmatically/
You mention clearing the browser cache; have you also tried clearing cookies? I have had this problem after a failed login to the HDInsight portal and needed to clear both cookies and cache.
Also, if you have Active Directory integration and are trying to log in with a domain account, try logging in with the cluster user that you created at deployment instead.
Use the Windows Azure Management Portal instead: https://manage.windowsazure.com.