It seems that the only way to handle JSON in Elm is to decode each JSON schema manually with Json.Decode.
Is there a nice alternative, like F#'s Type Provider (F# Data: JSON Type Provider) or something similar?
There is no official package that does this, but there are community alternatives like this one: https://github.com/eeue56/json-to-elm
Create Elm type aliases and decoders based on JSON input
This project allows you to automate the creation of:
type aliases from JSON data
decoders from type aliases and some union types
encoders from type aliases and some union types
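For illustration, here is a minimal sketch of the kind of output such a tool produces (the JSON shape, type name, and field names are invented for this example):

-- hypothetical JSON input: {"code": "A1", "qty": 3}
import Json.Decode exposing (Decoder, field, int, map2, string)

type alias Item =
    { code : String
    , qty : Int
    }

itemDecoder : Decoder Item
itemDecoder =
    map2 Item
        (field "code" string)
        (field "qty" int)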
Related
I want to validate a JSON Schema itself to see whether the schema is valid, not validate a standard JSON document against a schema.
I need a meta-schema that is a bit stricter than the current meta-schema.
I would like to have a schema that
only allows appropriate properties on a type
e.g. not maxLength on an integer type
adds custom required fields on a type.
In this question, some insights are given by @Jason Desrosiers.
I was wondering if there is any update on the improvements and whether there is an example of how to extend the meta-schema.
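To illustrate the kind of restriction I mean, a rough draft-07 sketch of my own (the exact keywords may need adjusting) would wrap the standard meta-schema and forbid maxLength whenever type is integer:

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "allOf": [
    { "$ref": "http://json-schema.org/draft-07/schema#" },
    {
      "if": {
        "properties": { "type": { "const": "integer" } },
        "required": ["type"]
      },
      "then": { "not": { "required": ["maxLength"] } }
    }
  ]
}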
We want to design a new database schema for our company's application.
The program is developed with C# and Nancy on the server side, and React, Redux and GraphQL on the client side.
Our company often has to implement sudden changes to handle new business data, so we want to build a solid core for the fundamental data that is not subject to decay, e.g. Article (Code, Description, Qty, Value, Price, CategoryId).
But we often need to add a particular category to an article, or a special implementation for a limited period of time only. We are thinking of implementing a TOXI-like solution to handle those situations.
In our TOXI-pattern implementation, however, we want to add a third table to define each tag's data type and definition.
Here is a simple explanatory image:
In the Metadata table we have two columns with JSON data: DataType and DefinedValue.
DataType defines how the program (possibly a function in the database) must cast the varchar data in articoli_meta.value.
DefinedValue, when not null, defines whether the type must have a series of predefined values, e.g. High, Medium, Low, etc.
Those two columns are varchar and contain JSON following a standard defined by our programming team (possibly enforced by an SQL function that validates those two columns).
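Roughly, the structure we have in mind looks like this (table and column names are simplified examples):

CREATE TABLE articoli (
    id          INT PRIMARY KEY,
    code        VARCHAR(20),
    description VARCHAR(200),
    qty         DECIMAL(12,3),
    value       DECIMAL(12,2),
    price       DECIMAL(12,2),
    category_id INT
);

CREATE TABLE metadata (
    id            INT PRIMARY KEY,
    name          VARCHAR(50),
    data_type     VARCHAR(400),  -- JSON: how to cast articoli_meta.value
    defined_value VARCHAR(400)   -- JSON: optional list of allowed values (High, Medium, Low, ...)
);

CREATE TABLE articoli_meta (
    articolo_id INT REFERENCES articoli(id),
    metadata_id INT REFERENCES metadata(id),
    value       VARCHAR(400),
    PRIMARY KEY (articolo_id, metadata_id)
);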
I understand that this kind of approach is not a 'pure' relational approach, but we must consider that we often pass data to the client in JSON format, so the DefinedValue column can easily be queried as a string and passed to the interface as data for a dropdown list.
Any ideas, experience, or design tips are appreciated.
I'm setting up a pipeline in NiFi where I get JSON records, which I then use to make a request to an API. The response I get has both numeric and textual data. I then have to write this data to Hive. I use InferAvroSchema to infer the schema. Some numeric values are signed values like -2.46 and -0.1. While inferring the type, the processor considers them strings instead of double, float, or decimal types.
I know we can hard-code our Avro schema in the processors, but I thought making it more dynamic by utilizing InferAvroSchema would be even better. Is there any other way we can overcome/resolve this?
InferAvroSchema is good for guessing an initial schema, but once you need something more specific it is better to remove InferAvroSchema and provide the exact schema you need.
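For example, a hand-written Avro schema along these lines (record and field names are placeholders) pins the numeric field to double instead of letting it be inferred as a string:

{
  "type": "record",
  "name": "ApiResponse",
  "fields": [
    { "name": "sensor", "type": "string" },
    { "name": "reading", "type": ["null", "double"], "default": null }
  ]
}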
I've been using Postgres to store JSON objects as strings, and now I want to utilize PG's built-in json and jsonb types to store the objects more efficiently.
Basically, I want to parse the stringified JSON and put it in a json column from within PG, without having to resort to reading all the values into Python and parsing them there.
Ideally, my migration should look like this:
UPDATE table_name SET json_column=parse_json(string_column);
I looked at Postgres's JSON functions, and there doesn't seem to be a method of doing this, even though it seems pretty trivial. For the record, my JSON objects are just one-dimensional arrays of strings.
Is there any way to do this?
There is no need for a parse_json function; just change the type of the column:
ALTER TABLE table_name
ALTER COLUMN json_column TYPE json USING json_column::json;
Note that if you plan on doing a lot of JSON operations on these values (i.e. extracting elements from objects, modifying objects, etc.) it's better to use jsonb; json should only be used if you merely store and retrieve the whole values. Also, as Laurenz Albe points out, if you don't need to do any JSON operations on these values and you are not interested in the validation that PostgreSQL can do on them (e.g. because you trust that the source always provides valid JSON), then using text is a perfectly valid option (or bytea).
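As a quick sanity check after the conversion, element access works directly on the converted column; since the values here are one-dimensional arrays of strings, something like this (using the question's column names) returns the first element as text:

SELECT json_column ->> 0 AS first_element
FROM table_name;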
I want to have a table parameter in an RFC function module of type CGPL_TEXT1, which uses the domain TEXT40 (a CHAR 40).
I tried to create it:
IT_PARTS_DESCRIPTION LIKE CGPL_TEXT1
But I keep getting this error:
tables using like may only reference flat structures
I am also not able to use TYPE. If I do so, I get this error:
Flat types may only be referenced using LIKE for table parameters
Don't go there. For RFC-enabled function modules, always use a structure as the line type for your table. The RFC protocol itself also supports unstructured tables, but many adapters don't. So you should:
declare a data dictionary structure Z_MY_PARTS_DATA with a single field DESCRIPTION TYPE CGPL_TEXT2
declare a data dictionary table type Z_MY_PARTS_TABLE using this structure
use that table type in your function module.
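Conceptually, the line type and the table type correspond to the declarations below; in practice create them in the data dictionary (SE11) so that RFC clients can read the metadata (the names are only examples):

TYPES: BEGIN OF z_my_parts_data,
         description TYPE cgpl_text2,
       END OF z_my_parts_data.

TYPES z_my_parts_table TYPE STANDARD TABLE OF z_my_parts_data
                       WITH DEFAULT KEY.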
Look in the data dictionary for a table type that has only one column representing your text.
If you cannot find one, just go the proper way and define a Z structure and a Z table type based on that structure. This is the proper way, and I also prefer to use it (sometimes I do this even when I would not really need it), because the structures and table types can be documented.