Objective-C: Add JSON response to SQLite DB

I am getting a JSON response and would like to add it to an SQLite database. The response comes back as an array, and each array item has about 30 key-value pairs. The keys in the JSON correspond to the columns in the SQLite table. What would be the most efficient approach to adding all of those values to the table?
Would it be best to do a for loop over each array item, then another for loop inside it to collect the values, build up a string, and insert them that way? Or is there a better approach using FMDB to add a JSON response directly to the database when the JSON keys match the database table columns?

If you believe your JSON response will not change and your data model will not change (or will rarely change), then I'd just loop through the array and write the slightly long...
[db executeUpdate:@"INSERT INTO response (key1, key2, ..., key30) VALUES (?, ?, ?, ...);", json_response.value1, json_response.value2, ..., json_response.value30];
However, if this model would change, be extended, etc... then I'd probably just use Core Data.
The biggest factor, though, is what you are doing with the data after it is stored: creating objects, displaying a report, or converting the objects back to JSON (in which case, just store the raw JSON in a text field)?

I ended up looping over the JSON keys and using those as the column names.
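For reference, a minimal FMDB sketch of that approach. It assumes the response has already been parsed with NSJSONSerialization into an array of dictionaries, and that the table (here called response) already has columns matching the JSON keys; all names are illustrative. Since column names cannot be bound as parameters, this is only safe if the JSON keys are trusted.

// One item from the parsed JSON array; every item is an NSDictionary.
NSDictionary *row = items[0];
NSArray *keys = [row allKeys];

// Build the "col1, col2, ..." and "?, ?, ..." lists from the keys.
NSMutableArray *placeholders = [NSMutableArray arrayWithCapacity:keys.count];
for (NSUInteger i = 0; i < keys.count; i++) {
    [placeholders addObject:@"?"];
}
NSString *sql = [NSString stringWithFormat:@"INSERT INTO response (%@) VALUES (%@);",
                 [keys componentsJoinedByString:@", "],
                 [placeholders componentsJoinedByString:@", "]];

// Values in key order; FMDB binds NSNull as SQL NULL.
NSArray *values = [row objectsForKeys:keys notFoundMarker:[NSNull null]];
[db executeUpdate:sql withArgumentsInArray:values];

If you insert many rows this way, wrapping the loop in an FMDB transaction (beginTransaction / commit) will make the 30-column inserts noticeably faster.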

Related

Add Array to field instead of String

I have the following problem:
I want to store the following array as the value of a field:
value = [0,16,33,50,67,84,101,118,135,152,169,186,203,220,237,254,271,288,305,322,338,355,372,389,406,423,440,457,474,491,508,525,542,559,576,593,610,627,644,661,677,694,711,728,745,762,779,796,813,830,847,864,881,898,915,932,949,966,983,1000,1016,1033,1050,1067,1084,1101,1118,1135,1152,1169,1186,1203,1220,1237,1254,1271,1288,1305,1322,1338,1355,1372,1389,1406,1423,1440,1457,1474,1491,1508,1525,1542,1559,1576,1593,1610,1627,1644,1661,1677]
Whether I use the JSON field type or any other type, the value is returned to me as a string (wrapped in quotes), and since I process it further, that does not work. How can I work around this?
I'm not entirely sure if this answers your question, but Directus 6 saves data using only the native MySQL 5 datatypes. Therefore, CSV / JSON values are saved as strings (often in the TEXT datatype). If you want to use this data in your application as an array / JSON, you will have to convert it yourself.
The Directus team is working to support more (custom) datatypes in future versions, so the API can respond with nested arrays/objects in JSON.

What happens if I send integers to a BigQuery field "string"?

One of the columns I send (in my code) to BigQuery contains integers. I added the columns to BigQuery, and in my haste I created them with type string.
Will the values be automatically converted? Or will the data be totally corrupted (i.e., can I not trust the resulting strings at all)?
Data shouldn't be automatically converted as this would destroy the purpose of having a table schema.
What I've seen people do is save a whole JSON line as a string and then process that string inside BigQuery. Other than that, if you try to save values that don't correspond to the field's schema definition, you should see an error being thrown.
If you need to change a table schema's definition, you can check this tutorial on updating a table schema.
Actually, BigQuery automatically converted the integers I sent into strings, so my table populates fine.
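If you do end up with integer data in a STRING column, you can still recover typed values at query time. A small sketch in BigQuery standard SQL (the table and column names here are made up):

SELECT
  SAFE_CAST(amount AS INT64) AS amount_int  -- returns NULL instead of failing on non-numeric rows
FROM `my_project.my_dataset.my_table`;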

Use Postgres to parse stringified JSON object

I've been using Postgres to store JSON objects as strings, and now I want to utilize PG's built-in json and jsonb types to store the objects more efficiently.
Basically, I want to parse the stringified JSON and put it in a json column from within PG, without having to resort to reading all the values into Python and parsing them there.
Ideally, my migration should look like this:
UPDATE table_name SET json_column=parse_json(string_column);
I looked at Postgres's JSON functions, and there doesn't seem to be a method of doing this, even though it seems pretty trivial. For the record, my JSON objects are just one-dimensional arrays of strings.
Is there any way to do this?
There is no need for a parse_json function; just change the type of the column:
ALTER TABLE table_name
ALTER COLUMN json_column TYPE json USING json_column::json;
Note that if you plan on doing a lot of JSON operations on these values (i.e. extracting elements from objects, modifying objects, etc.) it's better to use jsonb; json should only be used for storing JSON data. Also, as Laurenz Albe points out, if you don't need to do any JSON operations on these values and you are not interested in the validation that PostgreSQL can do on them (e.g. because you trust that the source always provides valid JSON), then using text (or bytea) is a perfectly valid option.
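For example, assuming you do want to query into the values later (same table and column names as above; the jsonb choice is the only assumption here):

-- Migrate straight to jsonb instead of json
ALTER TABLE table_name
ALTER COLUMN json_column TYPE jsonb USING json_column::jsonb;

-- One-dimensional string arrays can then be accessed directly,
-- e.g. the first element of each array:
SELECT json_column->>0 AS first_element FROM table_name;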

Dynamic type cast in select query

I have completely rewritten my question because the original description of the problem was inaccurate!
We have to store a lot of different information about a specific region. For this we need a flexible data structure that does not limit the user's possibilities.
So we've created a key-value table for this additional data, described by a meta table that contains the datatype of each value.
We already use this information for queries through our REST API, where we automatically wrap the requested field in a cast.
SQL Fiddle
We return this data, together with information from other tables, as a JSON object. We convert the corresponding rows from the data table into a JSON object with array_agg and json_object:
...
CASE
WHEN count(prop.name) = 0 THEN '{}'::json
ELSE json_object(array_agg(prop.name), array_agg(prop.value))
END AS data
...
This works very well. The problem is that if we store something like a floating-point number in this field, we get back a string representation of that number:
e.g. 5.231 returns as "5.231"
Now we would like to CAST this number during our SELECT statement into the right datatype, so the JSON result would be correctly formatted. We have all the information we need, so we tried the following:
SELECT
json_object(array_agg(data.name),
-- here I cast the value into the right datatype!
-- results in an error
array_agg(CAST(value AS datatype))) AS data
FROM data
JOIN (
SELECT name, datatype
FROM meta)
AS info
ON info.name = data.name
The error message is the following:
ERROR: type "datatype" does not exist
LINE 3: array_agg(CAST(value AS datatype))) AS data
^
So is it possible to dynamically cast the text of the data_type column to a postgresql type to return a well-formatted JSON object?
First, that's a terrible abuse of SQL, and ought to be avoided in practically all scenarios. If you have a scenario where this is legitimate, you probably already know your RDBMS so intimately that you're writing custom indexing plugins, and wouldn't even think of asking this question...
If you tell us what you're actually trying to do, there's about a 99.9% chance we can tell you a better way to do it.
Now with that disclaimer aside:
This is not possible without dynamic SQL. With a sufficiently recent version of PostgreSQL, you can accomplish this in a PL/pgSQL function using EXECUTE, which you can read about in the manual.
Note, however, that even using this method, every row fetched in the same query must produce the same data type for a given column. In other words, you can't expect row 1 to have a data type of VARCHAR and row 2 to have INT. That is completely impossible.
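To illustrate the dynamic-SQL route, here is a minimal PL/pgSQL sketch. The function is hypothetical, the result is deliberately forced back into a single output type (text), and it assumes meta stores simple one-word type names such as numeric or integer:

-- Casts a text value to the type named in datatype, then back to text,
-- so the cast is validated even though the output type stays uniform.
CREATE OR REPLACE FUNCTION cast_via(value text, datatype text)
RETURNS text AS $$
DECLARE
    result text;
BEGIN
    -- %L quotes the value as a literal and %I the type name as an
    -- identifier, which guards against injection from the meta table.
    EXECUTE format('SELECT CAST(%L AS %I)::text', value, datatype)
    INTO result;
    RETURN result;
END;
$$ LANGUAGE plpgsql;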
The problem you have is that json_object creates an object out of a string array for the keys and another string array for the values. So if you feed your JSON objects into this function, it will always return an error.
So the first problem is that you have to use a json or jsonb column for the values; or you can convert the values from string to json with to_json().
The second problem is that you need another function to create your JSON object, because you want to feed it string keys and json values. For this there is an aggregate function called json_object_agg.
Then your output should look like the one you expected! Here is the full query:
SELECT
json_object_agg(data.name, to_json(data.value)) AS data
FROM data
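Building on that, if the values are stored as text (as in the question), you still need a cast before to_json, or the numbers stay quoted. A sketch that uses the meta table for this (the datatype names in the CASE are assumptions about what meta contains):

SELECT
json_object_agg(
    data.name,
    CASE info.datatype
        WHEN 'numeric' THEN to_json(data.value::numeric)  -- 5.231 instead of "5.231"
        WHEN 'boolean' THEN to_json(data.value::boolean)
        ELSE to_json(data.value)                          -- everything else stays a string
    END) AS data
FROM data
JOIN meta AS info ON info.name = data.name;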

Dynamic DB storage based on unknown result from source

I have a source of data that I get from a web service. I can never know when it will change, and I need to store it in a DB as soon as I get it. What is the best way to make the storage solution adapt to whatever I put in it? I am using MySQL. Would serialization be the key?
I would store the content in a column using the TEXT datatype, and consider MEDIUMTEXT or LONGTEXT if the content is over 4,000 characters. MySQL 5.1 has XML functions to get values out of an XML payload...
Ideally, I'd consume the webservice and populate tables appropriately.
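As a minimal sketch of the TEXT-column route described above (the table name and XPath are illustrative; ExtractValue() is available from MySQL 5.1 on):

-- Store the raw payload untouched; switch body to MEDIUMTEXT or
-- LONGTEXT if responses exceed what TEXT comfortably holds.
CREATE TABLE webservice_payload (
    id INT AUTO_INCREMENT PRIMARY KEY,
    received_at DATETIME NOT NULL,
    body TEXT NOT NULL
);

-- If the payload is XML, individual values can still be queried later:
SELECT ExtractValue(body, '/response/status') AS status
FROM webservice_payload;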