I'm trying to use the BigQuery REST API to execute some queries, but, for some reason, I can't use SQL functions.
The endpoint I've been using is the following:
POST https://bigquery.googleapis.com/bigquery/v2/projects/{project-id}/queries
This works for regular queries (with no functions), but if I try to use the EXTRACT or FORMAT_DATE functions I always get a 400 Bad Request.
Examples
Payload:
{
"query": "SELECT user_id, timestamp, (EXTRACT(ISOWEEK FROM timestamp)) as week FROM table_name WHERE DATE(_PARTITIONTIME) >= '2022-01-01' ORDER BY week DESC"
}
Response:
{
"error": {
"code": 400,
"message": "Encountered \" \"FROM\" \"FROM \"\" at line 1, column 45.\nWas expecting:\n \")\" ...\n[Try using standard SQL (https://cloud.google.com/bigquery/docs/reference/standard-sql/enabling-standard-sql)]",
"errors": [
{
"message": "Encountered \" \"FROM\" \"FROM \"\" at line 1, column 45.\nWas expecting:\n \")\" ...\n[Try using standard SQL (https://cloud.google.com/bigquery/docs/reference/standard-sql/enabling-standard-sql)]",
"domain": "global",
"reason": "invalidQuery",
"location": "q",
"locationType": "parameter"
}
],
"status": "INVALID_ARGUMENT"
}
}
Second Payload:
{
"query": "SELECT user_id, timestamp, FORMAT_DATE('%Y%W',timestamp) as week FROM table_name WHERE DATE(_PARTITIONTIME) >= '2022-01-01' ORDER BY week DESC"
}
Response:
{
"error": {
"code": 400,
"message": "1.39 - 1.56: Unrecognized function format_date\n[Try using standard SQL (https://cloud.google.com/bigquery/docs/reference/standard-sql/enabling-standard-sql)]",
"errors": [
{
"message": "1.39 - 1.56: Unrecognized function format_date\n[Try using standard SQL (https://cloud.google.com/bigquery/docs/reference/standard-sql/enabling-standard-sql)]",
"domain": "global",
"reason": "invalidQuery",
"location": "q",
"locationType": "parameter"
}
],
"status": "INVALID_ARGUMENT"
}
}
Is there any particular way to escape BigQuery functions in the REST API?
Thank you,
I suspect (given that you mention the REST endpoint directly) you're constructing requests without the use of a client library.
Try setting the "useLegacySql" field to false as part of the request:
https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/query#QueryRequest
Due to historical precedent, and to avoid breaking users during the evolution of the standard SQL dialect, the default value of this field is true. The various BigQuery client libraries tend to handle this automatically for you.
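For example, with that flag set, your first payload should parse fine (table and columns are the question's own):
{
"query": "SELECT user_id, timestamp, EXTRACT(ISOWEEK FROM timestamp) as week FROM table_name WHERE DATE(_PARTITIONTIME) >= '2022-01-01' ORDER BY week DESC",
"useLegacySql": false
}
EXTRACT(... FROM ...) and FORMAT_DATE exist only in the standard SQL dialect, which is why the legacy SQL parser rejects both queries.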
Related
I use JHipster with React as the frontend and LoopBack on the server side. I need to show custom errors (e.g. a tax code, "partita IVA", already present in the archive).
This is the error format:
{
"error": {
"statusCode": 422,
"name": "UnprocessableEntityError",
"message": "The request body is invalid. See error object `details` property for more info.",
"code": "VALIDATION_FAILED",
"details": [
{
"path": "partitaIva",
"message": "Partita Iva giĆ presente",
"code": "CUSTOM_ERROR",
"info": {}
}
]
}
}
There could be more than one error too, for example when validating a form.
I want to know how to display the errors returned by the server.
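Not a complete answer, but here is a minimal sketch of one way to turn that response into per-field messages on the React side. The types mirror the JSON above; everything else (function name, the _global key) is made up:
// Types mirroring the LoopBack error body shown above.
interface ValidationDetail {
  path: string;    // field name, e.g. "partitaIva"
  message: string; // e.g. "Partita Iva già presente"
  code: string;
  info?: object;
}

interface ErrorBody {
  error: {
    statusCode: number;
    name: string;
    message: string;
    code: string;
    details?: ValidationDetail[];
  };
}

// Build a { fieldName: message } map that a form can render next to
// each input; fall back to the top-level message when no details exist.
function extractFieldErrors(body: ErrorBody): Record<string, string> {
  const details = body.error.details ?? [];
  if (details.length === 0) {
    return { _global: body.error.message };
  }
  return Object.fromEntries(details.map(d => [d.path, d.message] as const));
}
Each entry can then be rendered next to the matching input, with _global shown as a banner.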
Running a POST with the following:
https://www.googleapis.com/admin/directory/v1/users/{{userid}}/aliases/
With the following as the JSON body:
{
"alias":"person#gsuite.company.com"
}
I'm just getting the following error:
{
"error": {
"code": 400,
"message": "Invalid Input: alias_email",
"errors": [
{
"message": "Invalid Input: alias_email",
"domain": "global",
"reason": "invalid"
}
]
}
}
I would love some help on this because it's getting super frustrating.
Google Support Page
https://developers.google.com/admin-sdk/directory/v1/guides/manage-user-aliases
See the comments above, but always give your secondary domains in Google Workspace an alias to the main domain, and the domain alias will appear.
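As a hypothetical example, once secondary.company.com has been added in the Admin console (as a secondary domain or a domain alias), a body like this should be accepted; the address itself is made up:
{
"alias": "person@secondary.company.com"
}
In general, Invalid Input: alias_email points at the alias address itself, usually because it is malformed or because its domain is not registered in the Google Workspace account.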
I am trying to add a UDF (I've tried both options of inline vs. on Cloud Storage) and always get the same message:
{
"error": {
"errors": [
{
"domain": "global",
"reason": "invalidQuery",
"message": "Unknown TVF: funcName",
"locationType": "other",
"location": "query"
}
],
"code": 400,
"message": "Unknown TVF: funcName"
}
}
I set the resource via:
$udf_resource = new Google_Service_Bigquery_UserDefinedFunctionResource();
$udf_resource->setResourceUri('gs://path/to/bucket/funcName.js');
or
$udf_resource = new Google_Service_Bigquery_UserDefinedFunctionResource();
$udf_resource->setInlineCode("FUNC_NAME_CODE");
Both are being inserted into a job query config via:
$query_config->setUserDefinedFunctionResources($udf_resource);
The UDF runs fine via the Web UI.
Is there something I am missing?
Try passing an array to setUserDefinedFunctionResources(); it expects a list of UDF resources rather than a single object, e.g.:
$query_config->setUserDefinedFunctionResources([$udf_resource]);
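For context, this mirrors the underlying REST field, which is also a list; the query job configuration carries something like the following (URI as in the question):
"userDefinedFunctionResources": [
{ "resourceUri": "gs://path/to/bucket/funcName.js" }
]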
I have been trying to create a job to load a compressed JSON file from Google Cloud Storage into a Google BigQuery table. I have read/write access in both Google Cloud Storage and Google BigQuery. Also, the uploaded file belongs to the same project as the BigQuery table.
The problem happens when I POST to the resource behind this URL: https://www.googleapis.com/upload/bigquery/v2/projects/NUMERIC_ID/jobs. The content of the request is as follows:
{
"kind" : "bigquery#job",
"projectId" : NUMERIC_ID,
"configuration": {
"load": {
"sourceUris": ["gs://bucket_name/document.json.gz"],
"schema": {
"fields": [
{
"name": "id",
"type": "INTEGER"
},
{
"name": "date",
"type": "TIMESTAMP"
},
{
"name": "user_agent",
"type": "STRING"
},
{
"name": "queried_key",
"type": "STRING"
},
{
"name": "user_country",
"type": "STRING"
},
{
"name": "duration",
"type": "INTEGER"
},
{
"name": "target",
"type": "STRING"
}
]
},
"destinationTable": {
"datasetId": "DATASET_NAME",
"projectId": NUMERIC_ID,
"tableId": "TABLE_ID"
}
}
}
}
However, the error doesn't make sense to me; it can be found below:
{
"error": {
"errors": [
{
"domain": "global",
"reason": "invalid",
"message": "Job configuration must contain exactly one job-specific configuration object (e.g., query, load, extract, spreadsheetExtract), but there were 0: "
}
],
"code": 400,
"message": "Job configuration must contain exactly one job-specific configuration object (e.g., query, load, extract, spreadsheetExtract), but there were 0: "
}
}
I know the problem lies neither in the project ID nor in the access token placed in the authentication header, because I have successfully created an empty table before. Also, I set the Content-Type header to application/json, which I don't think is the issue here, because the body content is JSON encoded.
Thanks in advance
Your HTTP request is malformed -- BigQuery doesn't recognize this as a load job at all.
You need to look into the POST request, and check the body you send.
You need to ensure that all of the above (which seems correct) is the body of the POST call. The above JSON should be on a single line, and if you are manually creating the multipart message, make sure there is an extra newline between the headers and body of each MIME part.
If you are using some sort of library, make sure the body is not expected in some other form, like resource, content, or body. I've seen libraries that use these differently.
Try out the BigQuery API explorer (https://developers.google.com/bigquery/docs/reference/v2/jobs/insert) and ensure your request body matches the one generated by the explorer.
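One more option worth trying: since your sourceUris already points at Cloud Storage, there is no file content to upload, so you can POST the configuration as a plain JSON body to the non-upload jobs endpoint instead of the multipart /upload/ one. A sketch, with placeholders as in the question and the schema fields elided (same as above); note that projectId must be a JSON string, and that sourceFormat is needed for JSON input because the default is CSV:
POST https://www.googleapis.com/bigquery/v2/projects/NUMERIC_ID/jobs
{
"configuration": {
"load": {
"sourceUris": ["gs://bucket_name/document.json.gz"],
"sourceFormat": "NEWLINE_DELIMITED_JSON",
"schema": { "fields": [ ... ] },
"destinationTable": {
"projectId": "NUMERIC_ID",
"datasetId": "DATASET_NAME",
"tableId": "TABLE_ID"
}
}
}
}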
I am trying to send a segment document manually using the CLI, following the example on this page: https://docs.aws.amazon.com/xray/latest/devguide/xray-api-sendingdata.html#xray-api-segments
I created my own trace ID as well as start and end times.
The commands I used are:
> DOC='{"trace_id": "'$TRACE_ID'", "id": "6226467e3f841234", "start_time": 1581596193, "end_time": 1581596198, "name": "test.com"}'
>echo $DOC
{"trace_id": "1-5e453c54-3dc3e03a3c86f97231d06c88", "id": "6226467e3f845502", "start_time": 1581596193, "end_time": 1581596198, "name": "test.com"}
> aws xray put-trace-segments --trace-segment-documents $DOC
{
"UnprocessedTraceSegments": [
{
"ErrorCode": "ParseError",
"Message": "Invalid segment. ErrorCode: ParseError"
},
{
"ErrorCode": "MissingId",
"Message": "Invalid segment. ErrorCode: MissingId"
},
{
"ErrorCode": "MissingId",
"Message": "Invalid segment. ErrorCode: MissingId"
},
.................
put-trace-segments keeps giving me errors. The segment document complies with the JSON schema too. Am I missing something else?
Thanks.
I needed to enclose the JSON in double quotes. The command that works for me was: aws xray put-trace-segments --trace-segment-documents "$DOC"
Without the quotes, the shell splits $DOC on whitespace, so the CLI receives several fragments and parses each one as a separate (invalid) segment document, which is why several errors come back. This is probably due to an error in the documentation, or the X-Ray team was using another kind of shell.
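If quoting keeps getting in the way, writing the document to a file and letting the CLI read it should also sidestep the issue (file name made up):
> aws xray put-trace-segments --trace-segment-documents file://segment.json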