Importing data to BQ error

I updated our BigQuery client to the new Google API client, and suddenly I started seeing this error when uploading via JSON:
"errors": [
{
"reason": "invalid",
"location": "Offset:0 / Line:1 / Column:159 / Field:q1",
"message": "Could not convert value to string"
},
Job reference:
"jobReference": {
"projectId": "dot-metrics",
"jobId": "job_8e0511a40c1845cca5717daf78b605f7"
},
This worked before we updated our client and stopped right afterwards, so it must be some change inside BigQuery. Any help is appreciated!

This looks like a regression in a recent release that broke importing null values in JSON. A fix should be forthcoming.
Note that if you drop the null fields (i.e. instead of "field": null you just don't include "field" at all) it should continue to work.
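A minimal sketch of that workaround in Go (stripNulls is a hypothetical helper, not part of any client library):
package main

import (
	"encoding/json"
	"fmt"
)

// stripNulls removes null-valued fields from a record before upload,
// since an omitted field loads fine while an explicit JSON null
// currently triggers the error.
func stripNulls(record map[string]interface{}) map[string]interface{} {
	for k, v := range record {
		if v == nil {
			delete(record, k)
		}
	}
	return record
}

func main() {
	row := map[string]interface{}{"q1": "answer", "q2": nil}
	b, _ := json.Marshal(stripNulls(row))
	fmt.Println(string(b)) // prints {"q1":"answer"}: "q2" is dropped instead of null
}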

Related

How do I create a direct Pub/Sub-to-BigQuery subscription in GCP? I get an error: failed to create

I am trying to publish data directly from Pub/Sub to BigQuery.
I have created a topic with a schema.
I have created a table.
But when I create the subscription, I get an error: request contains an invalid argument.
gcloud pubsub subscriptions create check-me.httpobs --topic=check-me.httpobs --bigquery-table=agilicus:checkme.httpobs --write-metadata --use-topic-schema
ERROR: Failed to create subscription [projects/agilicus/subscriptions/check-me.httpobs]: Request contains an invalid argument.
ERROR: (gcloud.pubsub.subscriptions.create) Failed to create the following: [check-me.httpobs].
There's not really a lot of diagnostics I can do here.
Is there any worked example that shows this? What am I doing wrong to get this error?
Side note: it's really a pain to have to create the BQ schema in its native JSON format, and then create the message schema in Avro format. Similar, but different, and no conversion tools that I can find.
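As an aside on that last point, a rough, untested sketch of such a conversion in Go (hypothetical helper; handles only flat scalar fields, and the type mapping is an assumption):
package main

import (
	"encoding/json"
	"fmt"
)

// bqField is one entry of a BigQuery JSON schema.
type bqField struct {
	Name string `json:"name"`
	Type string `json:"type"`
	Mode string `json:"mode"`
}

// Assumed scalar type mapping; adjust as needed.
var bqToAvro = map[string]string{
	"STRING": "string", "INTEGER": "long", "FLOAT": "double",
	"BOOLEAN": "boolean", "BYTES": "bytes",
}

// toAvro builds an Avro record schema from flat BigQuery fields,
// wrapping NULLABLE fields in a ["null", type] union.
func toAvro(name string, fields []bqField) ([]byte, error) {
	avroFields := make([]map[string]interface{}, 0, len(fields))
	for _, f := range fields {
		var t interface{} = bqToAvro[f.Type]
		if f.Mode == "NULLABLE" {
			t = []interface{}{"null", t}
		}
		avroFields = append(avroFields, map[string]interface{}{"name": f.Name, "type": t})
	}
	return json.Marshal(map[string]interface{}{
		"type": "record", "name": name, "fields": avroFields,
	})
}

func main() {
	b, _ := toAvro("httpobs", []bqField{{Name: "url", Type: "STRING", Mode: "REQUIRED"}})
	fmt.Println(string(b))
}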
If I run with --log-http, it doesn't really enlighten:
{
"error": {
"code": 400,
"message": "Request contains an invalid argument.",
"status": "INVALID_ARGUMENT"
}
}
Update: switched to protobuf, same problem.
https://gist.github.com/donbowman/5ea8f8d8017493cbfa3a9e4f6e736bcc has the details.
gcloud version
Google Cloud SDK 404.0.0
alpha 2022.09.23
beta 2022.09.23
bq 2.0.78
bundled-python3-unix 3.9.12
core 2022.09.23
gsutil 5.14
I have confirmed all the fields are present and in the correct format, as per https://github.com/googleapis/googleapis/blob/master/google/pubsub/v1/pubsub.proto#L639
specifically:
{"ackDeadlineSeconds": 900, "bigqueryConfig": {"dropUnknownFields": true, "table": "agilicus:checkme.httpobs", "useTopicSchema": true, "writeMetadata": true}, "name": "projects/agilicus/subscriptions/check-me.httpobs", "topic": "projects/agilicus/topics/check-me.httpobs"}
I have also tried using the API Explorer to post this, same effect.
I have also tried using the python example:
https://cloud.google.com/pubsub/docs/samples/pubsub-create-bigquery-subscription#pubsub_create_bigquery_subscription-python
to create it. Same error, with a bit more info (IP, grpc_status 3):
debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B2607:f8b0:400b:807::200a%5D:443 {created_time:"2022-10-04T20:54:44.600831924-04:00", grpc_status:3, grpc_message:"Request contains an invalid argument."}"
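For comparison, a minimal sketch of the same subscription with the Go client (assuming a recent cloud.google.com/go/pubsub that includes BigQueryConfig; note the table is written here in the dotted project.dataset.table form that the linked Python sample uses, rather than the colon form above):
package main

import (
	"context"
	"fmt"
	"log"

	"cloud.google.com/go/pubsub"
)

func main() {
	ctx := context.Background()
	client, err := pubsub.NewClient(ctx, "agilicus")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Mirrors the gcloud flags above: --use-topic-schema, --write-metadata,
	// plus dropUnknownFields from the JSON body.
	sub, err := client.CreateSubscription(ctx, "check-me.httpobs", pubsub.SubscriptionConfig{
		Topic: client.Topic("check-me.httpobs"),
		BigQueryConfig: pubsub.BigQueryConfig{
			Table:             "agilicus.checkme.httpobs", // dotted form, as in the Python sample
			UseTopicSchema:    true,
			WriteMetadata:     true,
			DropUnknownFields: true,
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("created:", sub)
}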

This location is unknown

Today I started getting "This location is unknown" errors on almost every Flight Offers Search call to the test API. I'm using this data https://github.com/amadeus4dev/data-collection/blob/master/data/flightsearch.md to make requests.
The kind of error that I get is this:
{
"code": 4926,
"title": "INVALID DATA RECEIVED",
"detail": "This location code is unknown",
"status": 400
}
Is there something going on today with the test API? On Friday everything worked.

How to index CouchDB in Hyperledger Fabric

I'm trying to get the result sorted by posting date, for which I have defined an index:
{
"index": {
"fields": [{
"PostingDate": "DESC"
}]
},
"ddoc": "indexPostingDate",
"name": "indexPostingDate",
"type": "json"
}
According to the Hyperledger Fabric documentation, I have to place this in META-INF/statedb/couchdb/indexes.
With the SAP Hyperledger Fabric platform I tried placing this folder both in the vendor folder and directly under the src folder, but neither worked for me.
This is my code for querying the result:
queryString := fmt.Sprintf("{\"selector\":{\"id\":\"%s\"},\"sort\": [{\"PostingDate\": \"desc\"}],\"use_index\": \"indexPostingDate\"}",
id)
logger.Infof("Getting data for %s", id)
oResultsIterator, responseMetaData, oFetchErr := stub.GetQueryResultWithPagination(queryString,
pageSize, bookmark)
On invoking this function, I get the following error:
{
"error": {
"message": "GET_QUERY_RESULT failed: transaction ID: 9b7c42c09069855758ccdfbc30f5d75d38b51569a78e101584051e0fa142ebc3: error handling CouchDB request. Error:no_usable_index, Status Code:400, Reason:No index exists for this sort, try indexing by the sort fields.",
"code": "CustomError",
"status": 500
}
}
How do I resolve this?
It's a known limitation that you can't specify the sort direction in the query with Fabric. If you need an ordered search, your index creation should specify the direction, so you can simply exclude it from your query.
Now that we support and recommend the official CouchDB 3.1, this may work, but I would need to test it.
By the way, you can simplify your query by using backticks, so you don't have to escape your quotes, for ease of reading and writing: fmt.Sprintf(`{"selector":{"id":"%s"},"sort": [{"PostingDate": "desc"}],"use_index": "indexPostingDate"}`, id)
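For example (untested, per the caveat above): keep "PostingDate": "desc" in the index definition, and drop the direction from the sort in the query so it can use that index:
queryString := fmt.Sprintf(`{"selector":{"id":"%s"},"sort":["PostingDate"],"use_index":"indexPostingDate"}`, id)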

Error on Paysafe Card Payment API?

This may sound stupid but I keep getting this error for every request that I make for the Card Payments API.
{
"error": {
"code": "5270",
"message": "The credentials provided with the request do not have permission to access the data requested.",
"links": [
{
"rel": "errorinfo",
"href": "https://developer.optimalpayments.com/en/documentation/cardpayments/error-5270"
}
]
}
}
The error seems to indicate that I do not have permission to do something. Is this an access issue, even though I think I am using the proper key? Has anyone ever seen this error with Paysafe?
@Crazyshezy is correct. Write to Paysafe with the key that you are using and they will be able to isolate what permissions you have or may not have with your API key.
I was getting the same error. It was due to the card expiry year: it should be a 4-digit year (e.g. 2017), but I was sending a 2-digit year (17).
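For illustration, the fix is just the year width, e.g. "year": 2017 rather than "year": 17 in the card expiry object of the request body (exact field names depend on the API version you are using).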

BigQuery Command Line Tool: get error details

One of my jobs keeps failing, and when I looked into why (by requesting job details) I got the following output:
status": {
"errorResult": {
"location": "gs://sf_auto/Datastore Mapper modules.models.userData/15716706166748C8426AD/output-46",
"message": "JSON table encountered too many errors, giving up. Rows: 1; errors: 1.",
"reason": "invalid"
},
"errors": [
{
"location": "gs://sf_auto/Datastore Mapper modules.models.userData/15716706166748C8426AD/output-46",
"message": "JSON table encountered too many errors, giving up. Rows: 1; errors: 1.",
"reason": "invalid"
}
],
"state": "DONE"
Problem is, it doesn't help at all, and I need more details. Is there any way to understand which column or attribute caused the failures? Is there any way to get more information?
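(For reference, the job details shown above can be dumped from the command line with something like bq --format=prettyjson show -j <job_id>.)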
Edit: Additional details
We're running a MapReduce job on App Engine to transfer our datastore from App Engine to BigQuery.
The files are stored on Google Cloud Storage.
It's creating a brand new table instead of adding to an existing one.
Update #2
I played around with the query, trying lots of things as well as adjusting the schema, and I've narrowed down the problem to the uuid. For some reason this type of data messes everything up:
"uuid": "XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX"
The schema defines it as a String
OK, after loads of debugging I found the error... in the newline-delimited JSON file we had two attributes that were similar:
uuid: "XXX..."
uuId: "XXX..."
This has been there for a while, so I think some change within BigQuery started to require that keys be unique regardless of capitalization. Will test some more and confirm!
A recent change made field names case-insensitive for JSON data loads, to be consistent with how SQL queries treat field names. I have opened a work item to track improving the error message for this case.