Error parsing JSON: missing comma, pos <number> - SQL

I have a varchar column and want to convert it to JSON using PARSE_JSON.
({u'meta': {u'removedAt': None, u'validation': {u'createdTime': 157....)
When I use:
select get_path(PARSE_JSON(OFFER), 'field') from
this error occurs: SQL Error [100069] [22P02]: Error parsing JSON: missing colon, pos 3.
So I tried to add a colon at position 3:
select get_path(PARSE_JSON(REPLACE (offer,'u','u:')), 'field') from
and this error occurred: SQL Error [100069] [22P02]: Error parsing JSON: misplaced colon, pos 10.
At this point I don't know how to handle this, and the information from Snowflake doesn't really help:
https://support.snowflake.net/s/article/error-error-parsing-json-missing-comma-pos-number
Thanks for your help

Your 'JSON' input is actually the Python string representation of a dictionary, not valid JSON. While Python dictionaries may look similar to JSON when printed in an interactive shell, they are not the same: note the u'...' prefixes, the single quotes, and None instead of null in your sample.
To produce valid JSON from your Python objects, use the json module's dump or dumps functions, then pass the properly serialized JSON string to your PARSE_JSON call.
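For example, a minimal Python sketch (using a made-up record shaped like the sample above; Python 3, so without the u'' prefixes):

import json

# Made-up record shaped like the sample in the question
record = {'meta': {'removedAt': None, 'validation': {'createdTime': 157}}}

# repr() is effectively what ended up in the varchar column - not valid JSON
print(repr(record))

# json.dumps produces valid JSON (double quotes, None becomes null),
# which PARSE_JSON can then handle
print(json.dumps(record))
# {"meta": {"removedAt": null, "validation": {"createdTime": 157}}}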

Related

Flatbuffers: converting JSON to binary - unexpected force_align value

I convert a binary file to JSON with the following flatbuffers command:
flatc --json schema.fbs -- model.blob
When I try to immediately convert the JSON back to a binary with this command:
flatc -b schema.fbs model.json
it throws an error:
error: unexpected force_align value '64', alignment must be a power of two integer ranging from the type's natural alignment 1 to 16
It points to the very last line of the JSON file as the problem. Does anybody know what the problem is? Could it be escape sequences?
Is there a force_align: 64 somewhere in schema.fbs? That would be the real source of the problem. flatc ignores this attribute when generating the JSON, but it validates it again when converting the JSON back to binary, and as the error message says, the value must be a power of two between the type's natural alignment and 16.

Parsing JSON from a column with an invalid token in BigQuery

This is the JSON file:
{"success":false,"error":{"type":"ValidationError","message":{"Period":{"maxValue:$1":"Value"}}}}
I am trying to extract "Value" from it:
JSON_EXTRACT_SCALAR(response,"$.error.message.loanPeriod.maxValue:$1']")
The tricky part is the "$" and the ":" in "maxValue:$1".
Please note that "response" is the column.
Below is for BigQuery Standard SQL
In cases where a JSON key uses invalid JSONPath characters, you can escape those characters using single quotes and brackets, as in the example below:
JSON_EXTRACT_SCALAR(response,"$.error.message.Period['maxValue:$1']")
See more in the documentation: JSON Functions in Standard SQL.

How to extract a file having a varbinary column in a U-SQL script using the default extractor?

I have to extract a varbinary column from a file. When I tried to extract it as byte[], it showed the error "Conversion Error. Column having invalid characters".
U-SQL Script:
EXTRACT Id int?, createddate DateTime?, Photo byte[]
FROM #input
USING Extractors.Csv(quoting: true, nullEscape: "\N");
The built-in Csv/Tsv/Text extractors assume that they operate on textual data, where binary content is hex-encoded. This is necessary since the binary data could otherwise contain any of the delimiting characters. See https://msdn.microsoft.com/en-us/library/azure/mt621366.aspx under byte[].
So if your photo is not hex-encoded, you will have to write your own custom extractor.
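For instance, a minimal Python sketch (with hypothetical values) of how a row with a hex-encoded byte[] column could be prepared for the built-in Csv extractor:

# Hypothetical binary content; in practice this comes from the image file
photo = b"\x89PNG\r\n\x1a\n"

# Hex-encode the bytes so they cannot collide with delimiters or newlines
row = "%d,%s,%s" % (1, "2017-01-01T00:00:00", photo.hex())
print(row)  # 1,2017-01-01T00:00:00,89504e470d0a1a0a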

Unexpected token \n in JSON when parsing with Elm.Json

I'm working with Elm, but I have a few issues with JSON parsing in this language. The error the compiler gives me is:
Err "Given an invalid JSON: Unexpected token \n in JSON at position 388"
What I need to do is this:
example
Inside char_meta I want something like this:
[("Biographical Information", [("Japanese Name", "緑谷出久"), ...]), ...]
Here is the code:
Ellie link
PS: The only constant keys are character_name, lang, summary and char_meta; the keys inside char_meta are dynamic (that's why I use keyValuePairs), and the length of this array is always different (sometimes it's empty).
Thanks, I hope someone can help me.
EDIT:
The Ellie link now redirects to the fixed code.
The issue is that Elm (or JavaScript, once transpiled) interprets the \n and \" sequences when parsing the string literal; they are replaced with an actual newline and a double quote respectively, which results in invalid JSON.
If you want to have the JSON inline in the code, you need to escape the five \ characters by doubling them (\\n and \\").
This only applies to literals; you won't have the issue if you load the JSON from the network, for instance.
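The same effect can be reproduced in any language with C-style escape sequences; here is a minimal Python sketch (not Elm, purely to illustrate the principle):

import json

# The \n in this literal becomes a real newline inside the string,
# which is not allowed inside a JSON string value
broken = '{"summary": "line one\nline two"}'
try:
    json.loads(broken)
except json.JSONDecodeError as err:
    print(err)  # e.g. "Invalid control character at ..."

# Doubling the backslash keeps the two characters \ and n in the text,
# so the JSON parser sees a proper escape sequence
fixed = '{"summary": "line one\\nline two"}'
print(json.loads(fixed))  # {'summary': 'line one\nline two'}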

Inserting into Postgres Multidimensional Text Array from NodeJS Knex

I am attempting to insert a single row into a Postgres table from a NodeJS application using the knex-seed-file module for Knex.
On each attempt, I receive an error for only one column/field, which is a multidimensional text array (photo_urls text[][] NULL). The error states there is a malformed array literal.
Having gone through the official Postgres documentation, I've tried using double quotes:
(8.14.2. Array Value Input)
"To write an array value as a literal constant, enclose the element values within curly braces and separate them by commas...You can put double quotes around any element value, and must do so if it contains commas or curly braces."
I've also tried using ARRAY constructor syntax.
Here are my various attempts at constructing the input, along with the actual SQL generated and the returned error:
Attempt 1:
array[['ext_profile:','ext_random:','int_random:'],['https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile','https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2','https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3']]
Result 1:
'array[[\'ext_profile:\',\'ext_random:\',\'int_random:\'],[\'https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile\',\'https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2\',\'https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3\']]'
Error 1:
- malformed array literal: "array[['ext_profile:','ext_random:','int_random:'],['https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile','https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2','https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3']]
Attempt 2:
$${"ext_profile:", "ext_random:", "int_random:"},{"https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile", "https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2", "https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3"}$$
Result 2:
'"$${""ext_profile:"", ""ext_random:"", ""int_random:""},{""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile"", ""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2"", ""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3""}$$"'
Error 2:
- malformed array literal: ""$${""ext_profile:"", ""ext_random:"", ""int_random:""},{""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile"", ""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2"", ""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3""}$$"
Attempt 3:
($${"ext_profile:", "ext_random:", "int_random:"},{"https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile", "https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2", "https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3"}$$)
Result 3:
'"($${""ext_profile:"", ""ext_random:"", ""int_random:""},{""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile"", ""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2"", ""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3""}$$)"'
Error 3:
- malformed array literal: ""($${""ext_profile:"", ""ext_random:"", ""int_random:""},{""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile"", ""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2"", ""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2, https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3""}$$)"
Attempt 4:
array[['ext_profile:','ext_random:','int_random:'],["https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile","https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2","https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3"]]
Result 4:
'"array[[\'ext_profile:\',\'ext_random:\',\'int_random:\'],[""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile"",""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2"",""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3""]]"'
Error 4:
- malformed array literal: ""array[['ext_profile:','ext_random:','int_random:'],[""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile"",""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2"",""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3""]]"
Attempt 5 (Post knex-seed-file upgrade):
[["ext_profile:","ext_random:","int_random:"],["https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile","https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2","https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3"]]
Result 5:
'"[[""ext_profile:"",""ext_random:"",""int_random:""],[""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile"",""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2"",""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3""]]"'
Error 5:
- malformed array literal: ""[[""ext_profile:"",""ext_random:"",""int_random:""],[""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile"",""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2"",""https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3""]]"
There appear to be many bugs/issues reported related to the Knex Postgres integration:
#658, #828, #869, #1602,... which seem to have been closed and/or merged into #1661.
From what I can tell, it appears the issue was closed as resolved.
Can anyone help identify what I'm doing wrong or what I can do to resolve the issue?
The module has now been upgraded (0.3.1) and should handle arrays properly. To enter an array value after updating the package, you should use the following pattern:
[["ext_profile:","ext_random:","int_random:"],["https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Profile","https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Ext+Random+2","https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+1,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=In+Random+2,https://dummyimage.com/300x250/0cb3f5/fcfcfc.png&text=Int+Random+3"]]
Please open an issue at https://github.com/tohalla/knex-seed-file if you encounter more problems.
@touko correctly identified that the issue is a result of the default quoting behavior for CSV files.
When a CSV file is saved, double quotes are added around any field with embedded commas or double quotes, and each embedded double quote is doubled.
It is explained in these posts and articles:
superuser post 1
superuser post 2
csvreader.com
https://www.rfc-editor.org/rfc/rfc4180
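For example, a minimal Python sketch of that quoting rule (pipe-delimited, matching the file described below):

import csv, io

# RFC 4180-style quoting: a field containing the quote character is wrapped
# in double quotes and each embedded double quote is doubled - exactly where
# the stray "" around the array literal come from
buf = io.StringIO()
csv.writer(buf, delimiter='|').writerow(['[["ext_profile:","ext_random:"]]'])
print(buf.getvalue())  # "[[""ext_profile:"",""ext_random:""]]" (plus a line terminator)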
Regarding the knex-seed-file module, there is an issue open on GitHub. Currently, I'm using the workaround of opening the CSV file in a text editor and manually removing the undesired double quotes.
Example (note: I am using a pipe-delimited CSV file):
find "[[ and replace with [[
find ]]" and replace with ]]
find "" and replace with "