I'm trying to connect BigQuery to a BI tool (Cognos Analytics) through a JDBC driver.
Everything works fine until I try to read a table backed by a Google Sheet. Then I get the following error:
Data source adapter error: java.sql.SQLException: [Simba][BigQueryJDBCDriver](100033) Error getting job status.
[Simba][BigQueryJDBCDriver](100033) Error getting job status.
[Simba][BigQueryJDBCDriver](100033) Error getting job status.
400 Bad Request
{
"code" : 400,
"errors" : [ {
"domain" : "global",
"message" : "Error while reading table: <GSHEET-TABLE>, error message: Failed to read the spreadsheet. Error code: PERMISSION_DENIED",
"reason" : "invalid"
} ],
"message" : "Error while reading table: <GSHEET-TABLE>, error message: Failed to read the spreadsheet. Error code: PERMISSION_DENIED",
"status" : "INVALID_ARGUMENT"
}
My connection string is the following:
jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;ProjectId=<MY-PROJECT-ID>;OAuthType=0;OAuthServiceAcctEmail=<CLIENT-EMAIL-FROM-JSON>;OAuthPvtKeyPath=<PATH-TO-JSON>;Timeout=60;RequestGoogleDriveScope=1;
The service account has full admin ownership of the entire project.
Does anyone know how to access tables generated from Google Sheets?
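For what it's worth, project-level IAM roles alone do not grant access to the underlying spreadsheet: Drive permissions are separate from BigQuery permissions, so the sheet itself generally has to be shared (Viewer is enough) with the service-account email, in addition to the driver requesting the Drive scope. A connection-string sketch with the documented Simba property (placeholders as in the question):

```
jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;ProjectId=<MY-PROJECT-ID>;OAuthType=0;OAuthServiceAcctEmail=<CLIENT-EMAIL-FROM-JSON>;OAuthPvtKeyPath=<PATH-TO-JSON>;Timeout=60;RequestGoogleDriveScope=1;
```

If the scope is requested but the spreadsheet is not shared with <CLIENT-EMAIL-FROM-JSON>, a PERMISSION_DENIED error like the one above is the expected result.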
I have created a Delphi database application that uses the DBX TSQLConnection component to connect to an Oracle database (version 19c).
I'm getting an access violation error when I call the Oracle LISTAGG function (on the line SQLQuery1.Open).
When I debugged, I traced the error to the following unit:
FileName : Data.DBXDynalink.pas
Function : function TDBXDynalinkReader.DerivedNext: Boolean;
Error Line : DBXResult := FMethodTable.FDBXReader_Next(FReaderHandle);
Actual Error : Access violation at address 04FD6CC2 in module 'dbxora.dll'. Read of address 00000004
Below is my code:
// ...SQLQuery1 initialization...
SQLQuery1.CommandText := Trim(memoSQLText.Lines.Text); // Assign the query text
SQLQuery1.Open; // The access violation is raised exactly on this line
if SQLQuery1.RecordCount > 0 then
  // ...do something here...
Note: the same query executes successfully against Oracle versions below 19c (19.3).
IDE used for development: Delphi XE3 (I have also checked with Delphi 10.1 Berlin).
DB version : Oracle 19C (19.3)
Steps to reproduce:
// 1. Execute the queries below to create test data:
create table myuserlist(myuser varchar2(10));
Insert into myuserlist(myuser) values('karthik');
Insert into myuserlist(myuser) values('aarush');
Insert into myuserlist(myuser) values('yuvan');
// 2. Open the query below using TSQLConnection and TSQLQuery:
select listagg(a.myuser, ', ') within group (order by a.myuser) as myusernames from myuserlist a
A sample project is available on GitHub:
https://github.com/yuvankarthik/DELPHI-DemoOracleConnect.git
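To make the repro self-contained, the expected result of the query can be sketched outside Oracle. SQLite's GROUP_CONCAT over a pre-ordered subquery stands in for LISTAGG ... WITHIN GROUP here (SQLite is purely an illustration, not part of the original setup):

```python
import sqlite3

# In-memory stand-in for the Oracle repro: same test data, and GROUP_CONCAT
# over an ordered subquery plays the role of LISTAGG ... WITHIN GROUP.
conn = sqlite3.connect(":memory:")
conn.execute("create table myuserlist(myuser text)")
conn.executemany(
    "insert into myuserlist(myuser) values (?)",
    [("karthik",), ("aarush",), ("yuvan",)],
)
row = conn.execute(
    "select group_concat(myuser, ', ') "
    "from (select myuser from myuserlist order by myuser)"
).fetchone()
print(row[0])  # one comma-separated string of all three users
conn.close()
```

The point is that the query returns a single row with one aggregated string, which is what TSQLQuery should be able to fetch without crashing.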
Please help me resolve this issue.
I am trying to execute an on-premises stored procedure that inserts 1000 records at a time from a Logic App. I have tested the stored procedure separately (it takes less than a second to insert the data), and I have also tested the on-premises gateway connection from the Logic App with another procedure. When I run the Logic App, which gets data from another web service in JSON format and passes it to the on-premises SQL Server stored procedure, I get a Bad Gateway (502) error.
I did find a similar question on Stack Overflow, but it is unanswered. Since the error itself isn't very informative, can anyone advise how to troubleshoot this issue? All I can see is that the command is not supported. I appreciate the help!
{
"error": {
"code": 502,
"source": "logic-apis-centralus.azure-apim.net",
"clientRequestId": "xxxxxxx",
"message": "BadGateway",
"innerError": {
"status": 502,
"message": **"The command '**[ AzureConnection = [gateway=\"true\",server=\"xxx.com\",database=\"xxx\"],\r\n request = [Connection = AzureConnection],\r\n dataSet = \"default\",\r\n procedure = \"[dbo].[UpsertEventData]\",\r\n parameters = [jsonObject=[\r\n {\r\n \"eventId\": \"a835795db0f7a13ab03be8e41bde7f56bc9b772905d82e7c111252d998a002be\",\r\n \"startTime\": \"2018-09-26T00:23:06+00:00\",\r\n \"endTime\": \"2018-09-26T00:24:50+00:00\",\r\n \"userId\": \"6aa080743b27d1a9a67afd2683ede38a\",\r\n \"channelId\": \"987fb87a4f7b4d867c1f391d38ec98b2\",\r\n \"shareId\": null,\r\n \"deviceId\": \"bdf8ada28095551a705737900d7c5bf3\",\r\n \"divisionId\": null,\r\n,\r\n \"contactId\": null,\r\n \"type\": \"asset-in-app-viewed\",\r\n \"page\": 1\r\n }, **isn't supported.**\r\n **inner exception:** The command '[ AzureConnection = [gateway=\"true\",server=\"xxx.com\",database=\"xxx\"],\r\n request = [Connection = AzureConnection],\r\n dataSet = \"default\",\r\n procedure = \"[dbo].[UpsertEventData]\",\r\n parameters = [jsonObject=[\r\n {\r\n \"eventId\": \"a835795db0f7a13ab03be8e41bde7f56bc9b772905d82e7c111252d998a002be\",\r\n \"startTime\": \"2018-09-26T00:23:06+00:00\",\r\n \"endTime\": \"2018-09-26T00:24:50+00:00\",\r\n \"userId\": \"6aa080743b27d1a9a67afd2683ede38a\",\r\n \"channelId\": \"987fb87a4f7b4d867c1f391d38ec98b2\",\r\n \"shareId\": null,\r\n \"deviceId\": \"bdf8ada28095551a705737900d7c5bf3\",\r\n \"divisionId\": null,\r\n \"assetId\": \"44ecbff1f7427e1fdc2bc3e1ad47f827\",\r\n \"contactId\": null,\r\n \"type\": \"asset-in-app-viewed\",\r\n \"page\": 1\r\n },\r\n {\r\n \"eventId\": \"a835795db0f7a13ab03be8e41bde7f5666f3195ab9eaf651a159d6ce22b04cc7\",\r\n \"startTime\": \"2018-09-26T00:25:45+00:00\",\r\n \"endTime\": \"2018-09-26T00:25:49+00:00\",\r\n \"userId\": \"6aa080743b27d1a9a67afd2683ede38a\",\r\n \"channelId\": \"7ae82...**' isn't supported.**\r\nclientRequestId: 7d9cbf4d-b50f-4905-8825-554f7e48d54c",
"source": "sql-logic-cp-centralus.logic-ase-centralus.p.azurewebsites.net"
}
}
}
Finally, I was able to fix the issue after some debugging of the Logic App and the stored procedure.
Fix: the problem was passing the 'Parse JSON' output from the Logic App directly as input to the stored procedure. Wrapping it in the string conversion function fixed the issue.
I wish a future Logic Apps release would give more specific exception details instead of a generic Bad Gateway error.
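In the workflow definition this amounts to wrapping the Parse JSON body in the string() expression function before it reaches the stored-procedure parameter. A sketch of the relevant action (the action and parameter names here are assumptions, not taken from the original app):

```json
"Execute_stored_procedure": {
    "type": "ApiConnection",
    "inputs": {
        "body": {
            "jsonObject": "@{string(body('Parse_JSON'))}"
        }
    }
}
```

string() serializes the parsed object back into a single NVARCHAR-friendly value, which the SQL connector can pass as one parameter.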
I am trying to run a job that succeeds from within Visual Studio. I'd like to run it in my ADF pipeline, but there the job fails with a syntax error.
ERRORID: E_CSC_USER_SYNTAXERROR
SEVERITY: Error
COMPONENT: CSC
SOURCE: USER
MESSAGE:
syntax error. Expected one of: '[' end-of-file ALTER COMBINE CREATE DEPLOY DROP EXTRACT IF INSERT OUTPUT PROCESS REDUCE REFERENCE RESOURCE SELECT TABLE TRUNCATE UPDATE USE USING VIEW identifier quoted-identifier variable ';' '('
DETAILS:
at token [], line 2
near the ###:
**************
DECLARE @outSlice string = "somepath.csv";
### USE DATABASE myDB;
//LOCAL
//DECLARE @root string = @"some local path";
//CLOUD
//DECLARE @root string = "adl://storeuri";
DECLARE @root string = "wasb://container@account/";
//RUN MODE 0
//DECLARE @var2 int = 1;
//DECLARE @var1 int = 1;
DECLARE @path1 string = @root + @"path1/xyz.csv";
EDIT: I tried it both with the USE DATABASE statement and with it commented out as shown above; the ### marker appears in exactly the same location in both cases.
EDIT 2: Added consecutive lines of code per request from @michael-rys.
Later in the script the parameter @outSlice is used in an OUTPUT statement like
OUTPUT @dataset
TO @outSlice
USING Outputters.Csv();
The parameter is determined within a pipeline activity. Snippet below:
"type": "DataLakeAnalyticsU-SQL",
"typeProperties": {
    "scriptPath": "script.usql",
    "scriptLinkedService": "storageXYX",
    "degreeOfParallelism": 2,
    "priority": 0,
    "parameters": {
        "outSlice": "$$Text.Format('/Output/{0:yyyy}/{0:MM}/{0:dd}/{0:HH}/somefile.csv', SliceStart)"
    }
}
Per an offline conversation with Michael, I removed the BOM from the script file and the ADF job ran successfully. If you are using Visual Studio, go to File -> Advanced Save Options; in my case I had to choose UTF-8 without signature to remove the BOM. Thanks again @michael-rys!
I'm new to SQL stored procedures. All the procedures are encrypted, so I cannot see the code itself, but I have a manual that documents the parameters.
This stored procedure inserts data into the database. It takes many parameters, e.g. @orderid and @customerid. Since I don't need to use all of them, I tried the following code to insert data with the procedure.
$sql = $pdowin->query("EXEC salesorderimport @orderid=:orderid, @customerid=:customerid");
$sql->bindValue(':orderid', 000050);
$sql->bindValue(':customerid', 801040);
$sql->execute();
It gives me the following error:
Fatal error: Uncaught exception 'PDOException' with message 'SQLSTATE[07002]: [Microsoft][ODBC Driver 11 for SQL Server]COUNT field incorrect or syntax error' in ...\test.php:14 Stack trace: #0 ...\test.php(14): PDO->query('EXEC salesorder...') #1 {main} thrown in ...\test.php on line 14
Thanks in advance.
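A likely cause of SQLSTATE 07002 here is that PDO::query() sends the statement immediately, before any placeholders are bound; the placeholder workflow needs PDO::prepare() followed by bindValue()/execute() (in the code above, $pdowin->prepare(...) instead of $pdowin->query(...)). Note also that a PHP integer literal with leading zeros such as 000050 is octal, so it would bind as 40; passing it as a quoted string preserves the leading zeros. The same prepare-then-bind pattern, sketched in Python with SQLite purely for illustration (the table and column names are made up, not from the real database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table salesorder(orderid text, customerid integer)")

# Prepare-then-bind: the SQL with named placeholders and the values are
# submitted together, instead of executing first and binding afterwards.
conn.execute(
    "insert into salesorder(orderid, customerid) values (:orderid, :customerid)",
    {"orderid": "000050", "customerid": 801040},  # string keeps leading zeros
)
row = conn.execute("select orderid, customerid from salesorder").fetchone()
conn.close()
```

The key point is that the driver only learns how many parameters to expect when the statement is prepared with its placeholders before execution.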