Invalid Path error while inserting job from google cloud storage to google bigquery

I am trying to insert a job through an HTTP POST request, but I am getting an "Invalid path" error.
My request body is as follows:
{
  "configuration": {
    "load": {
      "sourceUris": [
        "gs://onianalytics/PersData.csv"
      ],
      "schema": {
        "fields": [
          {
            "name": "Name",
            "type": "STRING"
          },
          {
            "name": "Age",
            "type": "INTEGER"
          }
        ]
      },
      "destinationTable": {
        "datasetId": "Test_Dataset",
        "projectId": "lithe-anvil-404",
        "tableId": "tb_test_Pers"
      }
    }
  },
  "jobReference": {
    "jobId": "10",
    "projectId": "lithe-anvil-404"
  }
}
For the sourceUris parameter I am passing "gs://onianalytics/PersData.csv", where onianalytics is my bucket name and PersData.csv is my CSV file (from which I want to upload data into Google BigQuery).
I am getting the response below:
"status": {
"state": "DONE",
"errorResult": {
"reason": "invalid",
"message": "Invalid path: gs://onianalytics/PersData.csv"
},
"errors": [
{
"reason": "invalid",
"message": "Invalid path: gs://onianalytics/PersData.csv"
}
]
},
"statistics": {
"creationTime": "1387276603674",
"startTime": "1387276603751",
"endTime": "1387276603751"
}
}
My bucket is under the same project ID that has the BigQuery service activated. I also have Google Cloud Storage enabled under APIs & Auth. The following scopes are added while authenticating:
https://www.googleapis.com/auth/bigquery, https://www.googleapis.com/auth/cloud-platform, https://www.googleapis.com/auth/devstorage.full_control, https://www.googleapis.com/auth/devstorage.read_only, https://www.googleapis.com/auth/devstorage.read_write
I am inserting this job via the "Try it!" link available at https://developers.google.com/bigquery/docs/reference/v2/jobs/insert.
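For reference, the same request can also be sent outside the API explorer. Here is a minimal sketch in Python against the jobs.insert REST endpoint; the access token is a placeholder and is assumed to carry the BigQuery scope:

import json
import requests

PROJECT_ID = "lithe-anvil-404"
ACCESS_TOKEN = "ya29.placeholder"  # hypothetical OAuth2 access token

job_body = {
    "configuration": {
        "load": {
            "sourceUris": ["gs://onianalytics/PersData.csv"],
            "schema": {"fields": [
                {"name": "Name", "type": "STRING"},
                {"name": "Age", "type": "INTEGER"},
            ]},
            "destinationTable": {
                "projectId": PROJECT_ID,
                "datasetId": "Test_Dataset",
                "tableId": "tb_test_Pers",
            },
        }
    },
    "jobReference": {"jobId": "10", "projectId": PROJECT_ID},
}

# jobs.insert: POST /bigquery/v2/projects/{projectId}/jobs
resp = requests.post(
    "https://www.googleapis.com/bigquery/v2/projects/%s/jobs" % PROJECT_ID,
    headers={"Authorization": "Bearer " + ACCESS_TOKEN,
             "Content-Type": "application/json"},
    data=json.dumps(job_body),
)
print(resp.status_code)
print(resp.json().get("status"))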
In fact, I am able to create buckets and objects in Google Cloud Storage through the APIs. But when I try to insert a job from the uploaded object (which is a CSV file), I get the "Invalid Path" error. Can anyone please help me identify why this error is occurring?

The error I get when trying the code above is "Not found: URI gs://onianalytics/PersData.csv". I'm wondering if, instead of /onianalytics/, you had a different path with invalid characters?
Is your bucket under the same projectId that has the BigQuery service activated and that you requested tokens with? If not, have you tried enabling read/write access for that project?
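One quick way to rule out a bad path is to ask Cloud Storage directly whether the object exists. A rough sketch against the JSON API (objects.get), again with a placeholder token:

import requests

ACCESS_TOKEN = "ya29.placeholder"  # hypothetical token with a devstorage scope
BUCKET = "onianalytics"
OBJECT_NAME = "PersData.csv"  # URL-encode this if it contains slashes or spaces

# objects.get: GET /storage/v1/b/{bucket}/o/{object}
resp = requests.get(
    "https://www.googleapis.com/storage/v1/b/%s/o/%s" % (BUCKET, OBJECT_NAME),
    headers={"Authorization": "Bearer " + ACCESS_TOKEN},
)
if resp.status_code == 200:
    print("Object exists:", resp.json()["name"])
else:
    print("Lookup failed:", resp.status_code, resp.text)

If that GET fails, BigQuery will not be able to read the file either.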

Related

Facebook ads custom audience: "Data is missing or does not match schema" error

I was building an integration with the Facebook Ads audience API, and according to the documentation the request must be created like this:
POST - https://graph.facebook.com/v15.0/<MY_CUSTOM_AUDIENCE_ID>/users?access_token=<MY_ACCESS_TOKEN>
{
  "session": {
    "session_id": 1,
    "batch_seq": 1,
    "last_batch_flag": true,
    "estimated_num_total": 1
  },
  "payload": {
    "schema": [
      "FN"
    ],
    "data": [
      "8b1ebea129cee0d2ca86be6706cd2dfcf79aaaea259fd0c311bdbf2a192be148"
    ]
  }
}
Using the previous example I received a 400 error:
{
  "error": {
    "message": "(#100) Data is missing or does not match schema",
    "type": "OAuthException",
    "code": 100,
    "fbtrace_id": "AqrLd9uIw0D4BBFtHF33bdU"
  }
}
To do this I used this documentation: https://developers.facebook.com/docs/marketing-api/audiences/guides/custom-audiences#hash
Has anyone used this before?
Your schema field type is an array, but the array form is for multi-key qualification. Change it to a string: "schema": "FN".
You can see all the accepted formats in the docs.
This payload with multiple keys works for me:
{
  "session": {
    "session_id": 123,
    "batch_seq": 1,
    "last_batch_flag": true
  },
  "payload": {
    "schema": [
      "EMAIL",
      "PHONE",
      "FN"
    ],
    "data": [
      ["EMAIL_HASH", "PHONE_HASH", "FN_HASH"]
    ]
  }
}
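For completeness, here is a rough sketch of how such a payload can be built and sent in Python. It assumes the normalize-then-SHA-256 hashing rule from the docs (trim and lowercase before hashing) and uses placeholder credentials and example values:

import hashlib
import requests

AUDIENCE_ID = "MY_CUSTOM_AUDIENCE_ID"  # placeholder
ACCESS_TOKEN = "MY_ACCESS_TOKEN"       # placeholder

def norm_hash(value):
    # Normalize (trim, lowercase) and SHA-256 hash, as the docs describe.
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()

body = {
    "session": {
        "session_id": 123,
        "batch_seq": 1,
        "last_batch_flag": True,
    },
    "payload": {
        "schema": ["EMAIL", "PHONE", "FN"],
        # One inner list per user, in the same order as the schema.
        "data": [[
            norm_hash("jane@example.com"),
            norm_hash("16505551234"),  # digits only, with country code
            norm_hash("jane"),
        ]],
    },
}

resp = requests.post(
    "https://graph.facebook.com/v15.0/%s/users" % AUDIENCE_ID,
    params={"access_token": ACCESS_TOKEN},
    json=body,
)
print(resp.status_code, resp.text)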

WhatsApp API error when sending a created template

{ "messaging_product": "whatsapp",
"to": "91********5",
"type": "template",
"template": {
"name": "demo_template",
"language": {
"code": "en_US"
}
}
}
This is sent as a POST request.
demo_template is an existing template created by me.
I am getting an error like this:
{
  "error": {
    "message": "(#132012) Parameter format does not match format in the created template",
    "type": "OAuthException",
    "code": 132012,
    "error_data": {
      "messaging_product": "whatsapp",
      "details": "header: Format mismatch, expected IMAGE, received UNKNOWN"
    },
    "error_subcode": 2494073,
    "fbtrace_id": "ARtWScjGa0rADjfHvbOH4bS"
  }
}
For an image you first have to upload the image to the media library; you then get an ID for the media you uploaded, and you can attach that ID to the message template you send out in the header (in your case it is an image). Check this link here.
Or you can directly use a URL link to attach the media:
{
  "type": "header",
  "parameters": [
    {
      "type": "image",
      "image": {
        "link": "https://URL"
      }
    }
  ]
}
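Putting that together, sending the template with an image header would look roughly like this sketch against the Cloud API messages endpoint; the phone number ID, token, and image URL are placeholders:

import requests

PHONE_NUMBER_ID = "YOUR_PHONE_NUMBER_ID"  # placeholder
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"        # placeholder

body = {
    "messaging_product": "whatsapp",
    "to": "91********5",
    "type": "template",
    "template": {
        "name": "demo_template",
        "language": {"code": "en_US"},
        # The template was created with an IMAGE header, so a matching
        # header component has to be supplied when sending.
        "components": [
            {
                "type": "header",
                "parameters": [
                    {"type": "image", "image": {"link": "https://URL"}}
                ],
            }
        ],
    },
}

resp = requests.post(
    "https://graph.facebook.com/v15.0/%s/messages" % PHONE_NUMBER_ID,
    headers={"Authorization": "Bearer " + ACCESS_TOKEN},
    json=body,
)
print(resp.status_code, resp.text)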
The parameter values included in the request are not using the format specified in the template. See the Message Template Guidelines.

BigQuery: Routine deployment failing with error "Unknown option: description"

We use Terraform to deploy BigQuery objects (datasets, tables, routines etc.) to region europe-west2 in GCP. We do this many times a day, and all of a sudden, at "2021-08-18T21:15:44.033910202Z", our deployments started failing when attempting to deploy BigQuery routines. They are all failing with errors of the form:
status: {
  code: 3
  message: "Unknown option: description"
}
Here is the first log message I can find pertaining to this error (I have redacted project names):
"protoPayload": {
"#type": "type.googleapis.com/google.cloud.audit.AuditLog",
"status": {
"code": 3,
"message": "Unknown option: description"
},
"authenticationInfo": {
"principalEmail": "deployer-dev#myadminproject.iam.gserviceaccount.com",
"serviceAccountDelegationInfo": [
{
"firstPartyPrincipal": {
"principalEmail": "deployer-dev#myadminproject.iam.gserviceaccount.com"
}
}
]
},
"requestMetadata": {
"callerIp": "10.51.0.116",
"callerSuppliedUserAgent": "Terraform/0.14.7 (+https://www.terraform.io) Terraform-Plugin-SDK/2.5.0 terraform-provider-google/3.69.0,gzip(gfe)",
"callerNetwork": "//compute.googleapis.com/projects/myadminproject/global/networks/__unknown__",
"requestAttributes": {},
"destinationAttributes": {}
},
"serviceName": "bigquery.googleapis.com",
"methodName": "google.cloud.bigquery.v2.RoutineService.InsertRoutine",
"authorizationInfo": [
{
"resource": "projects/myproject/datasets/p00003818_dp_model",
"permission": "bigquery.routines.create",
"granted": true,
"resourceAttributes": {}
}
],
"resourceName": "projects/myproject/datasets/p00003818_dp_model/routines/UserProfile_Events_AllCarData_Deployment",
"metadata": {
"routineCreation": {
"routine": {
"routineName": "projects/myproject/datasets/p00003818_dp_model/routines/UserProfile_Events_AllCarData_Deployment"
},
"reason": "ROUTINE_INSERT_REQUEST"
},
"#type": "type.googleapis.com/google.cloud.audit.BigQueryAuditMetadata"
}
},
"insertId": "ak27xdbke",
"resource": {
"type": "bigquery_dataset",
"labels": {
"dataset_id": "p00003818_dp_model",
"project_id": "myproject"
}
},
"timestamp": "2021-08-18T21:15:43.109609Z",
"severity": "ERROR",
"logName": "projects/myproject/logs/cloudaudit.googleapis.com%2Factivity",
"receiveTimestamp": "2021-08-18T21:15:44.033910202Z"
}
The fact that this occurred without any changes on our part indicates that this is a problem at the Google end. I also observe that, whilst we witnessed this in a few projects, it occurred first in one project and then a few minutes later in another - that may or may not be helpful information.
Posting here in case anyone else hits this problem and also hoping it might catch the attention of a googler.
UPDATE! I have reproduced the problem using the REST API: https://cloud.google.com/bigquery/docs/reference/rest/v2/routines/insert
I entered a payload that does not include a description, and it successfully creates a routine. However, if I include a description, which is a valid parameter for this method, the request fails.
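For anyone trying to reproduce this, a minimal routines.insert call along these lines reproduces the failure; project and dataset names are placeholders, and removing the "description" field makes the same request succeed:

import requests

PROJECT = "myproject"   # placeholder
DATASET = "mydataset"   # placeholder
ACCESS_TOKEN = "ya29.placeholder"

routine = {
    "routineReference": {
        "projectId": PROJECT,
        "datasetId": DATASET,
        "routineId": "times_two",
    },
    "routineType": "SCALAR_FUNCTION",
    "language": "SQL",
    "arguments": [{"name": "x", "dataType": {"typeKind": "INT64"}}],
    "definitionBody": "x * 2",
    "description": "Doubles the input",  # the field that triggers the error
}

# routines.insert: POST /bigquery/v2/projects/{p}/datasets/{d}/routines
resp = requests.post(
    "https://bigquery.googleapis.com/bigquery/v2/projects/%s/datasets/%s/routines"
    % (PROJECT, DATASET),
    headers={"Authorization": "Bearer " + ACCESS_TOKEN},
    json=routine,
)
print(resp.status_code, resp.text)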

BigQuery: Load Data into EU Dataset from GCS

In the past I have successfully loaded data into US-hosted BigQuery datasets from CSV data in US-hosted GCS buckets. We have since decided to move our BigQuery data to the EU, and I created a new dataset with that region selected. I have successfully populated those of our tables that are small enough to be uploaded from my machine at home, but two tables are far too large for this, so I would like to load them from files in GCS. I have tried doing this from both a US-hosted GCS bucket and an EU-hosted GCS bucket (thinking that bq load might not like to cross regions), but the load fails every time. Below is the error detail I'm getting from the bq command line (500, Internal Error). Does anyone know a reason why this might be happening? Is loading data into EU-hosted BigQuery datasets from GCS known to work for others?
{
  "configuration": {
    "load": {
      "destinationTable": {
        "datasetId": "######",
        "projectId": "######",
        "tableId": "test"
      },
      "schema": {
        "fields": [
          {
            "name": "test_col",
            "type": "INTEGER"
          }
        ]
      },
      "sourceFormat": "CSV",
      "sourceUris": [
        "gs://######/test.csv"
      ]
    }
  },
  "etag": "######",
  "id": "######",
  "jobReference": {
    "jobId": "job_Y4U58uTyxitsvbgljFi2x534N7M",
    "projectId": "######"
  },
  "kind": "bigquery#job",
  "selfLink": "https://www.googleapis.com/bigquery/v2/projects/######",
  "statistics": {
    "creationTime": "1445336673213",
    "endTime": "1445336674738",
    "startTime": "1445336674738"
  },
  "status": {
    "errorResult": {
      "message": "An internal error occurred and the request could not be completed.",
      "reason": "internalError"
    },
    "errors": [
      {
        "message": "An internal error occurred and the request could not be completed.",
        "reason": "internalError"
      }
    ],
    "state": "DONE"
  },
  "user_email": "######"
}
After searching through other related questions on StackOverflow I eventually realised that I had set my GCS bucket region to EUROPE-WEST-1 and not the multi-region EU location. Things are now working as expected.
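If you hit the same thing, you can confirm where a bucket actually lives with a quick buckets.get call; a multi-region EU bucket reports "EU", while a regional one reports, for example, "EUROPE-WEST1". A sketch with placeholder names:

import requests

ACCESS_TOKEN = "ya29.placeholder"  # hypothetical token
BUCKET = "my-bucket"               # placeholder

# buckets.get: GET /storage/v1/b/{bucket}?fields=location
resp = requests.get(
    "https://www.googleapis.com/storage/v1/b/%s" % BUCKET,
    params={"fields": "location"},
    headers={"Authorization": "Bearer " + ACCESS_TOKEN},
)
print(resp.json())  # e.g. {"location": "EU"} for the multi-region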
