Connecting Power BI to BigQuery issue - SSL verification failed

I am trying to connect Power BI to BigQuery, but the error below is preventing the connection. I was previously able to connect without issue, but for the last month I have been unable to because of this error.
I have updated Power BI to the latest version and tried the instructions in the solution here (https://community.powerbi.com/t5/Desktop/Bigquery-connection-error/td-p/1602196), but nothing has worked.
How can I get around this SSL verification issue?
Error Message:
Details: "ODBC: ERROR [HY000] [Microsoft][DriverSupport] (1120) SSL verification failed because the server host name specified for the connection does not match the "CN" entry in the "Subject" field or any of the "DNS Name" entries of the "Subject Alternative Name" field in the server certificate.
ERROR [HY000] [Microsoft][DriverSupport] (1120) SSL verification failed because the server host name specified for the connection does not match the "CN" entry in the "Subject" field or any of the "DNS Name" entries of the "Subject Alternative Name" field in the server certificate."
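A quick way to see which certificate the driver is actually being shown (for example, if a corporate proxy is doing SSL inspection and rewriting it) is a short check like the sketch below. This is only a diagnostic sketch, and bigquery.googleapis.com is an assumption about which endpoint the driver contacts:
# Diagnostic sketch: print the subject and SAN entries of the certificate
# presented on port 443. The host below is an assumption; adjust it if your
# connection goes through a proxy or a different endpoint.
import socket
import ssl

host = "bigquery.googleapis.com"
ctx = ssl.create_default_context()
try:
    with socket.create_connection((host, 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
            print("subject:", cert.get("subject"))
            print("subjectAltName:", cert.get("subjectAltName"))
except ssl.SSLCertVerificationError as exc:
    # A mismatch here points at something between you and Google
    # (e.g. an SSL-inspecting proxy) presenting its own certificate.
    print("verification failed:", exc)
If this reports a hostname mismatch, the driver is not seeing Google's real certificate, which would explain the 1120 error above.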

Related

Configuring 32-bit ODBC on Windows 11 is failing

Trying to configure the ODBC driver on Windows returns the following message:
[Simba][DriverSupport] (1120) SSL verification failed because the server host name specified for the connection does not match the 'CN' entry in the 'Subject' field or any of the 'DNS Name' entries of the 'Subject Alternative Name' field in the server certificate.
I am using the default cacerts.pem file.
What step am I missing?

Identity Server 4. Getting error when using self-signed certificate

I have an Identity Server, a Web API and a frontend app. Usually they're running on localhost and it works fine. Now I need to run the app on my local IP address. I changed all the settings from localhost:port to <my_ip>:port and I'm getting the following error in the Web API:
System.InvalidOperationException: IDX20803: Unable to obtain configuration from: '<my_ip>:5003/.well-known/openid-configuration'.
---> System.IO.IOException: IDX20804: Unable to retrieve document from: '<my_ip>:5003/.well-known/openid-configuration'.
---> System.Net.Http.HttpRequestException: The SSL connection could not be established, see inner exception.
---> System.Security.Authentication.AuthenticationException: The remote certificate is invalid according to the validation procedure.
I tried generating a self-signed certificate and adding it to the certificate store, and I tried many solutions on SO, e.g. this one. All I achieved was switching between this error and "Keyset does not exist".
This is how I try to load the certificate:
var key = Configuration["certBase64"]; // exported from store as x509 base64 encoded
var pfxBytes = Convert.FromBase64String(key);
var cert = new X509Certificate2(pfxBytes, "<certificate-pwd>", X509KeyStorageFlags.MachineKeySet);
builder.AddSigningCredential(cert);
What am I missing?
UPDATE
I also tried to use a local DNS name for my IP. On my router I set up forwarding from myapp.local to <my_ip>. Then I created a self-signed certificate with Subject and DNS name = myapp.local, but it didn't help.
The current error:
System.InvalidOperationException: IDX20803: Unable to obtain configuration from: 'myapp.local:5003/.well-known/openid-configuration'.
---> System.IO.IOException: IDX20804: Unable to retrieve document from: 'myapp.local:5003/.well-known/openid-configuration'.
---> System.Net.Http.HttpRequestException: The SSL connection could not be established, see inner exception.
---> System.Security.Authentication.AuthenticationException: The remote certificate is invalid according to the validation procedure.
You cannot use HTTPS and certificates when you try to connect to a service using an IP address. You must always have a domain name and a valid certificate that your local machine trusts.

Snowflake SSL peer certificate or SSH remote key was not OK

I am doing a data load via ODBC, by extracting data from an on-prem database into files and then importing those into a Snowflake target table.
During that process, I am getting the error below. What could be causing it?
Exception calling "Open" with "0" argument(s): "ERROR [HY000] [Snowflake][Snowflake] (4)
REST request for URL <hiding Snowflake URL> failed: CURLerror (curl_easy_perform() failed) - code=60 msg='SSL peer certificate or SSH remote key was not OK' osCode=2 osMsg='No such file or directory'.
ERROR [HY000] [Snowflake][Snowflake] (4)
REST request for URL <hiding Snowflake URL> failed: CURLerror (curl_easy_perform() failed) - code=60 msg='SSL peer certificate or SSH remote key was not OK' osCode=2 osMsg='No such file or directory'.
"
At C:\temp\wsla3205x.ps1:254 char:5
+ $sfOdbc.Open()
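The osCode=2 osMsg='No such file or directory' part of the error suggests the driver cannot find a file it expects, most likely the CA bundle it uses to verify the server. The sketch below is only an illustration; both the bundle path and the host are placeholders, not values from this post:
# Sketch: check that a candidate CA bundle exists and can validate the
# Snowflake endpoint. Replace both placeholders with the values your DSN uses.
import os
import socket
import ssl

ca_bundle = r"C:\path\to\cacert.pem"            # placeholder path
host = "<your_account>.snowflakecomputing.com"  # placeholder host

if not os.path.isfile(ca_bundle):
    print("CA bundle not found:", ca_bundle)  # matches osMsg='No such file or directory'
else:
    ctx = ssl.create_default_context(cafile=ca_bundle)
    with socket.create_connection((host, 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            print("TLS handshake OK, issuer:", tls.getpeercert().get("issuer"))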
Is there any more information required on this from my side?

rsyslogd-2291: imrelp: could not activate relp listner

I'm trying to configure rsyslog TLS with RELP but keep getting errors.
I'm using RHEL 7.2 with rsyslog 8.15.
I do manage to send messages using RELP + TLS without certificates, but when I add the certificates I get the following error:
Jan 20 11:00:17 ip-10-0-0-114 rsyslogd-2353: imrelp[514]: error 'Failed to set certificate trust files [gnutls error -64: Error while reading file.]', object 'lstn 514' - input may not work as intended [v8.15.0 try http://www.rsyslog.com/e/2353 ]
Jan 20 11:00:17 ip-10-0-0-114 rsyslogd-2291: imrelp: could not activate relp listner, code 10031 [v8.15.0 try http://www.rsyslog.com/e/2291 ]
Server conf:
module(load="imrelp" ruleset="relp")
input(type="imrelp" port="514" tls="on"
tls.caCert="/home/ec2-user/rsyslog/ca.pem"
tls.myCert="/home/ec2-user/rsyslog/server-cert.pem"
tls.myPrivKey="/home/ec2-user/rsyslog/server-key.pem"
tls.authmode="name"
tls.permittedpeer=["client.example.co"]
)
ruleset(name="relp") {
action(type="omfile" file="/var/log/relptls2")
}
The following is the client configuration:
module(load="omrelp")
action(type="omrelp" target="10.0.0.114" port="514" tls="on"
tls.caCert="/home/ec2-user/rsyslog/ca.pem"
tls.myCert="/home/ec2-user/rsyslog/client-cert.pem"
tls.myPrivKey="/home/ec2-user/rsyslog/client-key.pem"
tls.authmode="name"
tls.permittedpeer=["server.example.co"]
)
When I remove the TLS cert fields from the server configuration I get the following client error:
Jan 20 10:35:29 ip-10-0-0-206 rsyslogd-2353: omrelp[10.0.0.114:514]: error 'Failed to set certificate trust file [gnutls error -64: Error while reading file.]', object 'conn to srvr 10.0.0.114:514' - action may not work as intended [v8.15.0 try http://www.rsyslog.com/e/2353 ]
Help would be really appreciated, as I've been stuck on this for a long time.
Thanks!
The gnutls error -64: Error while reading file message means one of two things: either the certificates' actual path is different from what is in the configuration file, or the rsyslog service cannot read the certificates because of a permission problem.
In case of a permission issue you can move the certificates under /etc/rsyslog.d; in case of a path issue, just fix the path :)
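A quick way to check both possibilities at once is a small script along these lines, run as the same user rsyslogd runs as; the paths are the ones from the server configuration above:
# Sketch: verify that the certificate files from the rsyslog configuration
# exist and are readable. The permission check is only meaningful when the
# script runs as the user rsyslogd runs as.
import os

paths = [
    "/home/ec2-user/rsyslog/ca.pem",
    "/home/ec2-user/rsyslog/server-cert.pem",
    "/home/ec2-user/rsyslog/server-key.pem",
]
for path in paths:
    if not os.path.isfile(path):
        print("missing:", path)
    elif not os.access(path, os.R_OK):
        print("not readable:", path)
    else:
        print("ok:", path)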

Certificate sent by the other side could not be validated - Oracle Wallet

I have written the following PL/SQL code for calling third-party APIs from Oracle 11g.
Begin
-- preparing Request...
l_http_request := UTL_HTTP.begin_request ('https://www..........'
, 'GET'
, 'HTTP/1.1');
-- set header's attributes...
UTL_HTTP.set_header(l_http_request, 'Content-Type', 'application/json');
UTL_HTTP.set_header(l_http_request, 'Content-Length', LENGTH(t_request_body));
UTL_HTTP.set_header(l_http_request, 'Api-Key','..............');
-- get Response and obtain received value
l_http_response := UTL_HTTP.get_response(l_http_request);
UTL_HTTP.read_text(l_http_response, l_response_text);
end;
When I run this code I get the following error:
Error report:
ORA-29273: HTTP request failed
ORA-06512: at "SYS.UTL_HTTP", line 1130
ORA-29024: Certificate validation failure
ORA-06512: at line 13
29273. 00000 - "HTTP request failed"
*Cause: The UTL_HTTP package failed to execute the HTTP request.
*Action: Use get_detailed_sqlerrm to check the detailed error message.
Fix the error and retry the HTTP request.
I figured out that this is caused by the HTTPS protocol, so I downloaded all the relevant certificates and handed them over to our DB team. Although they have configured an Oracle wallet with these certificates, we are still getting the same error report.
Any thoughts?
UPDATE:
I've added the following code as the very first lines in the begin block:
UTL_HTTP.SET_DETAILED_EXCP_SUPPORT(TRUE);
UTL_HTTP.SET_WALLET('file:/../wallet','pwd.....' );
But now it gives the following exception, "Certificate is invalid", even though the certificate sender confirms its validity. The validity can also be confirmed with an external SSL checker such as https://www.sslshopper.com.
Error report:
ORA-29024: Certificate validation failure
ORA-06512: at "SYS.UTL_HTTP", line 1128
ORA-06512: at line 16
29024. 00000 - "Certificate validation failure"
*Cause: The certificate sent by the other side could not be validated. This may occur if
the certificate has expired, has been revoked, or is invalid for another reason.
*Action: Check the certificate to determine whether it is valid. Obtain a new certificate,
alert the sender that the certificate has failed, or resend.
Please note that I've tried all formats of certificate files (Base-64 encoded / PKCS#7, etc.) as explained in http://oracle-base.com/articles/misc/utl_http-and-ssl.php
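One more thing that may be worth checking is which CA actually issued the certificate the endpoint serves, since ORA-29024 usually means the wallet is missing that issuer or an intermediate certificate in its chain. A minimal Python sketch, assuming the third-party cryptography package is available and using a placeholder host in place of the hidden URL:
# Sketch: fetch the leaf certificate served by the API endpoint and print who
# issued it; that issuer (and its chain up to the root) is what the wallet needs.
import ssl
from cryptography import x509  # third-party 'cryptography' package

host = "www.example-api.com"  # placeholder for the hidden https://www... host
pem = ssl.get_server_certificate((host, 443))
cert = x509.load_pem_x509_certificate(pem.encode())
print("subject:", cert.subject.rfc4514_string())
print("issuer: ", cert.issuer.rfc4514_string())
print("expires:", cert.not_valid_after)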
Any thoughts?
Personally, I find it a pain to load the certificates of each and every website you want to access into an Oracle wallet (which is probably why you're getting the error: you need to install the certificates and chain of the website you're trying to access into the wallet).
The easiest thing to do is to install stunnel (https://www.stunnel.org/index.html).
Configure stunnel to listen for incoming connections on a local port such as 8800 and then make an outbound connection to somesite.com:443.
Something like this:
1. Oracle issues a GET as: http://localhost:8800/index.html
2. stunnel intercepts the request and fetches https://somesite.com/index.html
3. stunnel returns the result to Oracle
This lets Oracle talk plain HTTP to stunnel on the local port, while stunnel talks HTTPS to somesite.com and delivers the data back to Oracle over the same local connection.
This completely bypasses the Oracle Wallet.
While this is not a direct answer to your question, it avoids the many, many issues with Oracle Wallet and in my opinion is the best solution.
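For reference, a minimal client-mode stunnel configuration for the setup described above might look like the sketch below; the local port and somesite.com are just the example values used in this answer, not anything from the original question:
; stunnel.conf sketch: accept plain HTTP locally, forward to somesite.com over TLS
[oracle-to-somesite]
client = yes
accept = 127.0.0.1:8800
connect = somesite.com:443
; add verifyChain / CAfile options here if you still want stunnel to validate
; the server certificate instead of skipping verification entirely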