Using a reserved word as a field name in DocumentDB (SQL)

I inherited a database loaded into DocumentDB, where one field name happens to be "Value".
An example of my structure:
{
...
"Alternates": [
{ "Type": "ID", "Value": "NOCALL" }
]
}
When I query (using DocumentDB's SQL) for all documents where Alternates.Value = "NOCALL", I get a "syntax error near 'Value'" error. If I query for Type = "ID", everything is fine.
It seems the word Value has a special meaning in DocumentDB and is causing the issue.
Wrapping "Value" in quotes (single or double) does not help.
Any suggestion on how to resolve this will be much appreciated!
Thank you in advance!

You are correct: Value is a reserved keyword.
To escape it, use the [""] bracket syntax.
So for your structure
"Alternates": [
{ "Type": "ID", "Value": "NOCALL" }
]
the query becomes:
SELECT c
FROM c
JOIN alt IN c.Alternates
WHERE alt["Value"] = 'NOCALL'
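A note on how the query above behaves: the JOIN iterates the Alternates array once per document, and the bracket lookup reads the reserved-named field. A rough Python analogue, with made-up sample documents (the second document is invented for contrast):

```python
# Made-up sample documents mirroring the question's structure.
docs = [
    {"id": "1", "Alternates": [{"Type": "ID", "Value": "NOCALL"}]},
    {"id": "2", "Alternates": [{"Type": "ID", "Value": "K7ABC"}]},
]

# Analogue of: JOIN alt IN c.Alternates WHERE alt["Value"] = 'NOCALL'
matches = [
    doc for doc in docs
    if any(alt.get("Value") == "NOCALL" for alt in doc["Alternates"])
]
print([d["id"] for d in matches])  # → ['1']
```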

In my case, the structure looks something like this: { "name": "ABC", "Value": 123 }.
I could escape the reserved keyword using [""] (as answered by others) along with the source name (the collection alias), i.e.
SELECT c["Value"] FROM c -- 123
Ref.: Querying in Azure Cosmos DB

How to get length of array in SPL2 splunk query

My splunk data looks like this
{
"name": "john",
"foo": []
}
Sometimes foo is empty, and sometimes it has data in it. I want to query for all the events where foo is empty, using SPL2.
I tried foo=[] and I tried foo="[]", but neither works.
You can try the following syntax:
<your_search>
| where isnull('foo{}')
Splunk extracts a JSON array as a multivalue field named foo{}; an empty array produces no values for that field, so isnull('foo{}') matches exactly the empty-array events.
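Outside Splunk, the set being asked for is simply the events whose foo array is empty; a small Python analogue (sample events are made up) to make the target set concrete:

```python
import json

# Made-up raw events mirroring the question's structure.
events = [
    '{"name": "john", "foo": []}',
    '{"name": "jane", "foo": ["a", "b"]}',
]

# Keep only events whose "foo" array is empty -- the set the SPL2 search returns.
empty_foo = [json.loads(e) for e in events if not json.loads(e)["foo"]]
print([e["name"] for e in empty_foo])  # → ['john']
```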

Snowflake Searching string in semi structured data

I have a table with many columns and rows. One column that I am trying to query in Snowflake holds semi-structured data. For example, when I query
select response
from table
limit 5
This is what is returned
[body={\n "id": "xxxxx",\n "object": "charge",\n "amount": 500,\n "amount_refunded": 0,\n "application": null,\n "application_fee": null,\n "application_fee_amount": null,\n "balance_transaction": null,\n "billing_details": {\n "address": {\n "city": null,\n "zip": "xxxxx",]
I want to select only the zip in this data. When I run code:
select response:zip
from table
limit 5
I get an error.
SQL compilation error: error line 1 at position 21 Invalid argument types for function 'GET': (VARCHAR(16777216), VARCHAR(11))
Is there a reason why this is happening? I am new to Snowflake, so I am trying to parse out this data but am stuck. Thanks!
Snowflake has very good documentation on the subject.
For your specific case, have you attempted to use dot notation? It's the appropriate method for accessing nested JSON. So
select response:body.zip
from table
Remember that you have your 'body' element. You need to access that one first with a colon, because it's a level-1 element. zip is located within body, so it's level 2. Level-1 elements are accessed with a colon, level-2 elements with dot notation.
I think you have multiple issues here.
First, I suspect your response column is not a VARIANT column. Please run the query below and confirm:
SHOW COLUMNS IN TABLE your_table;
Even if the column is VARIANT, the data as stored is not valid JSON. You will need to strip out the JSON part and then store that in the VARIANT column.
Please do the first part and share the result; I will then suggest next steps. I wanted to put this in a comment, but comments do not allow that many sentences.
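The stripping step can be prototyped outside Snowflake. This Python sketch uses a shortened, cleaned-up version of the stored string from the question (the real value is longer and truncated); the cleaned text could then be loaded into a VARIANT column with PARSE_JSON:

```python
import json
import re

# Shortened, cleaned-up sample of the stored string from the question.
raw = '[body={\n  "id": "xxxxx",\n  "object": "charge",\n  "amount": 500,\n  "billing_details": {"address": {"city": null, "zip": "xxxxx"}}\n}]'

# Strip the leading "[body=" and the trailing "]" so plain JSON remains.
m = re.match(r'\[body=(.*)\]\s*$', raw, re.DOTALL)
body = json.loads(m.group(1))
print(body["billing_details"]["address"]["zip"])  # → xxxxx
```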

Is it possible to prevent ORDS from escaping my GeoJSON?

I have a problem with Oracle ORDS escaping my GeoJSON with "
{
"id": 1,
"city": "New York",
"state_abrv": "NY",
"location": "{\"type\":\"Point\",\"coordinates\":[-73.943849, 40.6698]}"
}
In Oracle DB it is stated correctly:
{"type":"Point","coordinates":[-73.943849, 40.6698]}
I need help figuring out why the " characters are added and how to prevent this from happening.
Add this column alias to your RESTful service handler query for the JSON column:
SELECT id,
jsons "{}jsons" --this one
FROM table_with_json
Then, when ORDS sees the data for that column, it won't escape it as a string, because it already IS JSON.
You can use whatever alias you want; in your case it should probably be
"{}location"

How to extract this json into a table?

I have a SQL column filled with JSON documents, one per row:
[{
"ID":"TOT",
"type":"ABS",
"value":"32.0"
},
{
"ID":"T1",
"type":"ABS",
"value":"9.0"
},
{
"ID":"T2",
"type":"ABS",
"value":"8.0"
},
{
"ID":"T3",
"type":"ABS",
"value":"15.0"
}]
How is it possible to transform it into tabular form? I tried Redshift's json_extract_path_text and JSON_EXTRACT_ARRAY_ELEMENT_TEXT functions, and I also tried json_each and json_each_text (on Postgres), but didn't get what I expected... any suggestions?
desired results should appear like this:
T1 T2 T3 TOT
9.0 8.0 15.0 32.0
I assume you printed 4 rows. In PostgreSQL, note that each row holds a JSON array, so expand the array first:
SELECT elem->'ID'
FROM that_table,
     json_array_elements(this_column) AS elem;
This will return a column of JSON strings. Use ->> if you want a text column. More info here: https://www.postgresql.org/docs/current/static/functions-json.html
In case you were using some old Postgresql (before 9.3), this gets harder : )
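To reach the one-row layout shown in the desired results, the extracted ID/value pairs still need to be pivoted (in SQL, typically with conditional aggregation over the expanded rows). As a sanity check of the reshaping itself, here is a minimal Python sketch using the array from the question:

```python
import json

raw = '''[{"ID":"TOT","type":"ABS","value":"32.0"},
          {"ID":"T1","type":"ABS","value":"9.0"},
          {"ID":"T2","type":"ABS","value":"8.0"},
          {"ID":"T3","type":"ABS","value":"15.0"}]'''

# One row per object, then pivot ID -> value.
pivot = {row["ID"]: row["value"] for row in json.loads(raw)}
print(pivot["T1"], pivot["T2"], pivot["T3"], pivot["TOT"])  # → 9.0 8.0 15.0 32.0
```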
Your best option is to use COPY from JSON Format. This will load the JSON directly into a normal table format, which you can then query as normal data.
However, I suspect that you will need to slightly modify the format of the file by removing the outer [...] square brackets and also the commas between records, e.g.:
{
"ID": "TOT",
"type": "ABS",
"value": "32.0"
}
{
"ID": "T1",
"type": "ABS",
"value": "9.0"
}
If, however, your data is already loaded and you cannot re-load the data, you could either extract the data into a new table, or add additional columns to the existing table and use an UPDATE command to extract each field into a new column.
Or, very worst case, you can use one of the JSON Functions to access the information in a JSON field, but this is very inefficient for large requests (eg in a WHERE clause).

MongoDB query returns 3 dots instead of answer

I am following an online course on a website, and when I try to submit a query to my local MongoDB, it returns ... instead of the answer.
The query I submit is
db.scores.find( { "type" : "essay", "score" : 50 }, { student : true, _id : false ).pretty()
The "..." that I get as an "answer" from the local MongoDB server indicates that the server is expecting from me to provide it with more input.
I clearly have a syntax error on my query, I forgot to close a curly bracket.
The correct query db.scores.find( { "type" : "essay", "score" : 50 }, { student : true, _id : false } ).pretty() does not return "..."
HINT: In case the forgotten input is not in the end of the query, but somewhere in the middle (as happened in this query) you can escape the "..." mode by hitting the "enter" two times and then try to type in the new query again.
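The shell keeps prompting with "..." until every bracket is balanced. As an illustration, this small Python helper (hypothetical, not part of the mongo shell) counts unmatched opening brackets; it reports a nonzero count for the broken query and zero for the corrected one:

```python
def unclosed(s):
    # Count opening brackets that never get matched.
    # (Simplified: ignores brackets inside string literals.)
    pairs = {')': '(', ']': '[', '}': '{'}
    stack = []
    for ch in s:
        if ch in '([{':
            stack.append(ch)
        elif ch in pairs and stack and stack[-1] == pairs[ch]:
            stack.pop()
    return len(stack)

bad = 'db.scores.find( { "type" : "essay", "score" : 50 }, { student : true, _id : false ).pretty()'
good = 'db.scores.find( { "type" : "essay", "score" : 50 }, { student : true, _id : false } ).pretty()'
print(unclosed(bad), unclosed(good))  # nonzero for bad, 0 for good
```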
When I had this same error, it was the result of a string value being terminated prematurely by a ' or " in the string. Look for any extraneous quotation marks or apostrophes in the values you're adding that may interfere with the declaration.
Just as an addition: watch out for the password string. Make sure it does not contain a quotation mark that interferes with the quotation marks delimiting it, as happened in my case. I got "..." too.