Modify data using AJV for JSON Schema - jsonschema

I'm using AJV (Another JSON Schema Validator) on Node.js.
I have the following schema:
var schema = {
  "$id": "testSchema.json",
  "type": "object",
  "$schema": "http://json-schema.org/draft-06/schema#",
  "additionalProperties": false,
  "properties": {
    "userId": {
      "type": "integer"
    },
    "userName": {
      "type": "string"
    },
    "uniqueID": {
      "type": "integer"
    }
  }
}
I need to overwrite the uniqueID property with a value that I could somehow pass to the JSON Schema or to AJV.
I think this can be done using AJV's addKeyword method. I tried it but failed, because I don't know how to manipulate (and return) the data value from an AJV custom keyword.
Is it possible to modify data with AJV? Or are there other ways to do it?
Thank you!

You can create a custom keyword with a function that will do whatever you want to the data.
var Ajv = require('ajv');
var ajv = new Ajv({allErrors: true});

ajv.addKeyword('my_id_rewrite', {
  type: 'object',
  compile: function (sch, parentSchema) {
    return function (data) {
      console.log(data);
      data['my_id'] = parentSchema.my_id_rewrite;
      return true;
    }
  }
});

var schema = { "my_id_rewrite": 2 };
var validate = ajv.compile(schema);

o = {"my_id": 1}
console.log(validate(o)); // true
console.log(o); // Object {my_id: 2}
https://runkit.com/embed/cxg0vwqazre3
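Applied to the schema from the question, a minimal sketch might look like the following (the keyword name overrideUniqueID is made up for illustration, and the AJV v6-style addKeyword signature from the snippet above is assumed):

var Ajv = require('ajv');
var ajv = new Ajv({allErrors: true});

// Hypothetical keyword: overwrite data.uniqueID with the value given in the schema,
// then report the data as valid.
ajv.addKeyword('overrideUniqueID', {
  type: 'object',
  compile: function (sch, parentSchema) {
    return function (data) {
      data.uniqueID = parentSchema.overrideUniqueID;
      return true;
    };
  }
});

var schema = {
  "type": "object",
  "additionalProperties": false,
  "overrideUniqueID": 99, // value to force into the data
  "properties": {
    "userId": { "type": "integer" },
    "userName": { "type": "string" },
    "uniqueID": { "type": "integer" }
  }
};

var validate = ajv.compile(schema);
var data = { "userId": 1, "userName": "foo", "uniqueID": 0 };
console.log(validate(data)); // true
console.log(data.uniqueID);  // 99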

Related

SQL Server stored procedure in .NET Core 6 Web API to produce JSON data used in Angular app

I have a SQL Server stored procedure that has an ID parameter and returns a string in JSON format that is needed in the Angular app.
Here is a sample of the JSON needed:
[
  {
    "type": "date",
    "name": "asofdate",
    "ui":
    {
      "label": "As Of Date",
      "placeholder": "Enter a date"
    },
    "validators": { "required": "true" }
  },
  {
    "type": "select",
    "name": "scope",
    "ui": { "label": "Scope", "placeholder": "Select a scope" },
    "validators": { "required": "true" },
    "source": [
      { "value": 1, "text": "ABC" },
      { "value": 2, "text": "CDE" },
      { "value": 3, "text": "FGI" }
    ]
  }
]
Here is what the result of running the stored procedure looks like (screenshot omitted).
When I run the Web API passing the ID parameter to the stored procedure, I would like to capture the response as a JSON object to be used in the Angular app.
But the Web API is returning this:
[
{
"jsonResponse": "[
{
\"type\":\"date\",
\"name\":\"asofdate\",
\"ui\":{\"label\":\"As Of Date\",\"placeholder\":\"Enter a date\"},
\"validators\":{\"required\":\"true\"}
}
,
{
\"type\":\"select\",
\"name\":\"scope\",
\"ui\":{\"label\":\"Scope\",\"placeholder\":\"Select a scope\"},
\"validators\":{\"required\":\"true\"},
\"source\":[{\"value\":1,\"text\":\"ABC\"},{\"value\":2,\"text\":\"DEF\"},{\"value\":3,\"text\":\"GHI\"}]}
}
]
Is there a way to get the JSON response from the Web API without all the "\" and without:
{
"jsonResponse": "
so that it matches the sample above?
Here is the code from the Web API:
[HttpGet("{ReportID}")]
public async Task<ActionResult<IEnumerable<usp_ReportParameterResult>>> GetReportParameters(int ReportID)
{
    if (_context.usp_ReportParameterAsync == null)
    {
        return NotFound();
    }

    var op = new OutputParameter<int>();
    var JSONresponse = await _context.usp_ReportParameterAsync(ReportID, op);

    if (JSONresponse == null)
    {
        return NotFound();
    }

    return JSONresponse;
}
The stored procedure uses JSON_QUERY and FOR JSON PATH to create the needed nested arrays.
So, in the Angular code I have the following hard-coded:
TESTDATA: any[] = [
  {
    type: 'text',
    name: 'firstName',
    validators: {
      required: true
    },
    ui: { label: 'First Name', placeholder: 'Enter Your First Name' }
  },
  {
    "type": "date",
    "name": "asofdate",
    "ui": { "label": "****As Of Date", "placeholder": "Enter a date", "class": ["date-picker-wrapper"] },
    "validators": { "required": "true" }
  }
]
What I need is for this data, instead of being hard-coded, to be dynamically generated from the Web API.
In the browser debugger, the hard-coded data shows up as an array, while the data coming from the Web API does not (screenshots omitted). Is there a way to get the response from the Web API into the required array format?
Actually, the easiest solution was to remove the backslashes in the Angular app by simply doing the following:
for (let item of this.formattedJSON) {
  item.ui = JSON.parse(item.ui);
  item.validators = JSON.parse(item.validators);
}

JSON Schema v7: formatMinimum & formatMaximum validate everything

I am using the AJV JSON Schema library (v7) and trying to validate a date against a given value. It looks pretty straightforward using formatMinimum/formatMaximum, but it seems that every date passes validation when using these keywords.
Here's my schema:
"some-date": {
"type": "object",
"properties": {
"data": {
"type": "object",
"properties": {
"value": {
"type": "string",
"format": "date-time",
"formatMinimum": "2021-03-10T14:25:00.000Z"
}
}
}
}
}
Here's the JSON:
{
  "some-date": {
    "data": {
      "value": "2011-03-10T14:25:00.000Z"
    }
  }
}
Here's how I am validating:
const ajv = new Ajv({allErrors: true})
require('ajv-formats')(ajv)
require('ajv-errors')(ajv)
require('ajv-keywords')(ajv)
const validate = ajv.validate(mySchema)
const isValid = validate(myJSON)
I've tried it on JSONSchemalint and it validates the above JSON against the given schema. Also, I have tried several dates and it validates all of them.
Please let me know if I am missing something.
Thanks
I'm not sure where you're getting formatMinimum and formatMaximum from, but they are not standard keywords in the JSON Schema specification, under any version. Are they documented as supported keywords in the implementation that you are using?

Can Strongloop generate string uids?

I want to stop using auto-generated numeric IDs for my models in StrongLoop. Can StrongLoop generate string UIDs, e.g. 067e6162-3b6f-4ae2-a171-2470b63dff00?
Yes, StrongLoop will generate a UUID via the uuid default function in the model definition. You can use something like the following in your model properties:
"id": {
"type": "string",
"defaultFn": "uuid"
}
You can check the following URLs for more info:
https://loopback.io/doc/en/lb3/Model-definition-JSON-file.html and https://github.com/strongloop/loopback/issues/292.
You need to modify the .js file along with the .json file.
Depending on your logic, you can also add a remote method and generate the UUID from the node-uuid module.
I'm assuming a User model here with the properties id, name, and age, and creating an entry in the User model.
User.json
{
  "name": "User",
  "properties": {
    "id": {
      "type": "string",
      "id": true,
      "defaultFn": "uuid",
      "required": true
    },
    "name": {
      "type": "string",
      "required": true
    },
    "age": {
      "type": "string",
      "required": true
    }
  }
}
User.js
var uuid = require('node-uuid');

module.exports = function(User) {
  var userObj = {};
  userObj.id = uuid();
  userObj.name = 'John';
  userObj.age = 22;

  User.create(userObj, function(err, userInstance) {
    if (err) {
      console.log(err);
    } else if (userInstance) {
      console.log(userInstance);
    }
  });
};
This will work.

JSON Schema - require all properties

The required field in JSON Schema
JSON Schema features the properties, required and additionalProperties fields. For example,
{
  "type": "object",
  "properties": {
    "elephant": {"type": "string"},
    "giraffe": {"type": "string"},
    "polarBear": {"type": "string"}
  },
  "required": [
    "elephant",
    "giraffe",
    "polarBear"
  ],
  "additionalProperties": false
}
Will validate JSON objects like:
{
  "elephant": "Johnny",
  "giraffe": "Jimmy",
  "polarBear": "George"
}
But will fail if the list of properties is not exactly elephant, giraffe, polarBear.
The problem
I often copy-paste the list of properties to the list of required, and suffer from annoying bugs when the lists don't match due to typos and other silly errors.
Is there a shorter way to denote that all properties are required, without explicitly naming them?
You can just use the "minProperties" keyword instead of explicitly naming all the fields.
{
  "type": "object",
  "properties": {
    "elephant": {"type": "string"},
    "giraffe": {"type": "string"},
    "polarBear": {"type": "string"}
  },
  "additionalProperties": false,
  "minProperties": 3
}
I doubt there exists a way to specify required properties other than explicitly naming them in the required array.
But if you encounter this issue very often, I would suggest writing a small script that post-processes your JSON Schema and automatically adds the required array to all defined objects.
The script just needs to traverse the JSON Schema tree and, at each level, if a "properties" keyword is found, add a "required" keyword listing all the keys defined in properties at that level.
Let the machines do the boring stuff.
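For instance, a minimal JavaScript sketch of such a traversal (the function name is illustrative, not from this answer) could be:

// Walk the schema and, wherever a "properties" object appears,
// require every key defined in it.
function addRequiredEverywhere(node) {
  if (node && typeof node === 'object') {
    if (node.properties) {
      node.required = Object.keys(node.properties);
    }
    Object.values(node).forEach(addRequiredEverywhere);
  }
  return node;
}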
I do this in code with a one-liner; for instance, if I want to use required when inserting into a DB, but only want to validate against the base schema when performing an update.
prepareSchema(action) {
  const actionSchema = R.clone(schema)
  switch (action) {
    case 'insert':
      actionSchema.$id = `/${schema.$id}-Insert`
      actionSchema.required = Object.keys(schema.properties)
      return actionSchema
    default:
      return schema
  }
}
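A quick usage sketch, assuming prepareSchema is callable as a plain function and an AJV instance named ajv is available (both are assumptions):

const validateInsert = ajv.compile(prepareSchema('insert')) // requires every property
const validateUpdate = ajv.compile(prepareSchema('update')) // falls through to the base schema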
If you are using the jsonschema library in Python, you can use custom validators.
First, create a custom validator:
# Custom validator for requiring all properties listed in the instance to be in the 'required' list of the instance
def allRequired(validator, allRequired, instance, schema):
    if not validator.is_type(instance, "object"):
        return
    if allRequired and "required" in instance:
        # requiring all properties to be in 'required'
        instanceRequired = instance["required"]
        instanceProperties = list(instance["properties"].keys())
        for property in instanceProperties:
            if property not in instanceRequired:
                yield ValidationError("%r should be required but only the following are required: %r" % (property, instanceRequired))
        for property in instanceRequired:
            if property not in instanceProperties:
                yield ValidationError("%r should be in properties but only the following are properties: %r" % (property, instanceProperties))
Then extend an existing validator:
all_validators = dict(Draft4Validator.VALIDATORS)
all_validators['allRequired'] = allRequired

customValidator = jsonschema.validators.extend(
    validator=Draft4Validator,
    validators=all_validators
)
Now test:
schema = {"allRequired": True}
instance = {"properties": {"name": {"type": "string"}}, "required": []}
v = customValidator(schema)
errors = validateInstance(v, instance)
You will get the error:
'name' should be required but only the following are required: []
As suggested by others, here's such a post-processing script in Python:
def schema_to_strict(schema):
    if schema['type'] not in ['object', 'array']:
        return schema
    if schema['type'] == 'array':
        schema['items'] = schema_to_strict(schema['items'])
        return schema
    for k, v in schema['properties'].items():
        schema['properties'][k] = schema_to_strict(v)
    schema['required'] = list(schema['properties'].keys())
    schema['additionalProperties'] = False
    return schema
You can use the function below:
export function addRequiredAttributeRecursive(schema) {
  if (schema.type === 'object') {
    schema.required = [];
    Object.keys(schema.properties).forEach((key) => {
      schema.required.push(key);
      if (schema.properties[key].type === 'object') {
        schema.properties[key] = addRequiredAttributeRecursive(
          schema.properties[key],
        );
      } else if (schema.properties[key].type === 'array') {
        schema.properties[key].items = addRequiredAttributeRecursive(
          schema.properties[key].items,
        );
      }
    });
  } else if (schema.type === 'array') {
    if (schema.items.type === 'object') {
      schema.items = addRequiredAttributeRecursive(schema.items);
    }
  }
  return schema;
}
It recursively writes the required attribute for every property of every object in the schema you have.
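For example, applied to the animals schema from the question, a quick sketch:

// Usage sketch with the schema from the question.
const schema = {
  type: 'object',
  properties: {
    elephant: { type: 'string' },
    giraffe: { type: 'string' },
    polarBear: { type: 'string' },
  },
  additionalProperties: false,
};

const strictSchema = addRequiredAttributeRecursive(schema);
console.log(strictSchema.required); // ['elephant', 'giraffe', 'polarBear']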
If you are using JavaScript, you can use a property getter.
{
  "type": "object",
  "properties": {
    "elephant": {"type": "string"},
    "giraffe": {"type": "string"},
    "polarBear": {"type": "string"}
  },
  get required() { return Object.keys(this.properties) },
  "additionalProperties": false
}
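A quick sketch of how the getter behaves when the schema is a plain JavaScript object (the variable name is illustrative):

const schema = {
  "type": "object",
  "properties": {
    "elephant": {"type": "string"},
    "giraffe": {"type": "string"}
  },
  // recomputed from "properties" every time "required" is read
  get required() { return Object.keys(this.properties) },
  "additionalProperties": false
};

console.log(schema.required);        // ['elephant', 'giraffe']
console.log(JSON.stringify(schema)); // the computed "required" array is serialized too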

How to make ember work with Django REST gis

I am currently trying to set up Ember to interact with Django REST Framework using the ember-django-adapter.
This works flawlessly. But since I started using djangorestframework-gis, Ember is not able to process the responses anymore.
I have not found anyone building GeoJSON with Ember except for https://gist.github.com/cspanring/5114078, but that does not seem to be the right approach because I do not want to change the data model.
This is the API response:
{
  "type": "FeatureCollection",
  "features": [
    {
      "id": 1,
      "type": "Feature",
      "geometry": {
        "coordinates": [
          9.84375,
          53.665466308594
        ],
        "type": "Point"
      },
      "properties": {
        "date_created": "2014-10-05T20:08:43.565Z",
        "body": "Hi",
        "author": 1,
        "expired": false,
        "anonymous": false,
        "input_device": 1,
        "image": "",
        "lat": 0.0,
        "lng": 0.0
      }
    }
  ]
}
While Ember expects something like:
[{"id":1,
"date_created":"2014-10-05T20:08:43.565Z",
"body":"Hi",
"author":1,
"expired":false,
"anonymous":false,
"input_device":1,
"image":"",
"lat":0,
"lng":0
}
]
My take on this was to write my own Serializer:
import Ember from "ember";
import DS from "ember-data";

export default DS.DjangoRESTSerializer.extend({

  extractArray: function(store, type, payload) {
    console.log(payload);
    //console.log(JSON.stringify(payload));
    var features = payload["features"];
    var nPayload = [];

    for (var i = features.length - 1; i >= 0; i--) {
      var message = features[i];
      var nmessage = {"id": message.id};
      for (var entry in message.properties) {
        var props = message.properties;
        if (message.properties.hasOwnProperty(entry)) {
          nmessage[entry] = props[entry];
        }
      }
      nPayload.push(nmessage);
    }

    console.log(nPayload); // prints in the format above
    this._super(store, type, nPayload);
  },

});
But I receive the following error:
The response from a findAll must be an Array, not undefined
What am I missing here? Or is this the wrong approach? Has anyone ever tried to get this to work?
An alternative would be to handle this on the server side and simply output a regular REST framework response, setting lat and lng in the backend.
This is not a valid answer to the question above. I wanted to share my solution anyway,
just in case anyone ever gets into the same situation:
I now do not return valid GeoJSON, but custom lat and lng values. The following is backend code for django-rest-framework.
Model:
# models/message.py
class Message(models.Model):

    def lat(self):
        return self.location.coords[1]

    def lng(self):
        return self.location.coords[0]
And in the serializer:
# message/serializer.py
class MessageSerializer(serializers.ModelSerializer):
    lat = serializers.Field(source="lat")
    lng = serializers.Field(source="lng")
Ember can easily handle the format.
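On the Ember side, a plain ember-data model can then consume these flat attributes; a minimal sketch, with the model and attribute names assumed from the payload shown earlier:

// app/models/message.js - sketch only; field names follow the payload above.
import DS from 'ember-data';

export default DS.Model.extend({
  body: DS.attr('string'),
  date_created: DS.attr('date'),
  expired: DS.attr('boolean'),
  anonymous: DS.attr('boolean'),
  lat: DS.attr('number'),
  lng: DS.attr('number')
});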