I'm trying to bump the exact match to the first position in the results, using Specification and Pageable from Spring Data.
When making a request to the endpoint http://localhost:8080/configs?search=PATH, this is the response now:
{
"content": [
{
"groupId": "PATHOD",
"type": "company"
},
{
"groupId": "PATH",
"type": "company"
},
{
"groupId": "APATH",
"type": "company"
}
],
"pageable": {
...
},
"sort": {
"sorted": true,
"unsorted": false,
"empty": false
},
"size": 10,
"empty": false
}
I want it to be like this, with the exact match bumped to first place:
{
"content": [
{
"groupId": "PATH",
"type": "company"
},
{
"groupId": "APATH",
"type": "company"
},
{
"groupId": "PATHOD",
"type": "company"
}
],
"pageable": {
....
},
"sort": {
"sorted": true,
"unsorted": false,
"empty": false
},
"size": 10,
"empty": false
}
This is my code right now, but I don't know how to write the specification:
fun findAllWithFilters(
    exactMatchFirst: Boolean = false,
    pageable: Pageable
): Page<OrderConfig> {
    val spec = and(
        withExactMatchFirst(exactMatchFirst)
    )
    return findAll(spec, pageable)
}
The PATH groupId should be the first one to appear in the results
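For reference, here is a minimal Kotlin sketch of one way such a Specification could look, assuming an OrderConfig entity with a String groupId attribute and that the raw search term is available when the Specification is built; the function name exactMatchFirst and the LIKE filter are placeholders, not existing code. The bump is done by ordering on a CASE expression that yields 0 for an exact match and 1 otherwise:

import org.springframework.data.jpa.domain.Specification

// Sketch only: bump exact groupId matches to the top, then sort alphabetically.
fun exactMatchFirst(search: String): Specification<OrderConfig> =
    Specification { root, query, cb ->
        // Skip the ordering for the count query Spring Data runs for paging.
        if (query.resultType != java.lang.Long::class.java) {
            val bump = cb.selectCase<Int>()
                .`when`(cb.equal(root.get<String>("groupId"), search), 0)
                .otherwise(1)
            query.orderBy(cb.asc(bump), cb.asc(root.get<String>("groupId")))
        }
        // Placeholder filter; replace with the real search predicate.
        cb.like(root.get<String>("groupId"), "%$search%")
    }

Note that if the Pageable passed to findAll carries its own Sort, Spring Data applies that sort after the Specification runs and it replaces the orderBy above, so use an unsorted Pageable when this ordering should win.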
I need to set the datatype for an Additional Column with Dynamic Content in the Sink in ADF.
By default it's taking nvarchar(max) from the JSON object, but I need bigint.
Below is the JSON object which creates the table with the additional column:
{
"source": {
"type": "SqlServerSource",
"additionalColumns": [
{
"name": "ApplicationId",
"value": 3604509277250831000
}
],
"sqlReaderQuery": "SELECT * from Table A",
"queryTimeout": "02:00:00",
"isolationLevel": "ReadUncommitted",
"partitionOption": "None"
},
"sink": {
"type": "AzureSqlSink",
"writeBehavior": "insert",
"sqlWriterUseTableLock": false,
"tableOption": "autoCreate",
"disableMetricsCollection": false
},
"enableStaging": false,
"translator": {
"type": "TabularTranslator",
"typeConversion": true,
"typeConversionSettings": {
"allowDataTruncation": true,
"treatBooleanAsNumber": false
}
}
}
ADF Configuration
After creating the table, the database column has this datatype:
If I convert the dynamic content into an int:
@int(pipeline().parameters.application.applicationId)
Then I get the warning below.
Please let me know how I can set the datatype in ADF.
I also tried the same and am getting the same result.
By default it's taking nvarchar(max) from the JSON object, but I need bigint.
To resolve this, when you add the additional column in your source dataset, go to Mapping and click on Import schema. It will import the schema of the source and also show the additional column in the schema; change the type of that column to Int64, as shown in the image below. In the image you can see "additional" after the name, which means it is an additional column.
After this, run your pipeline. It will create the additional column with data type bigint.
{
"name": "pipeline2",
"properties": {
"activities": [
{
"name": "Copy data1",
"type": "Copy",
"dependsOn": [],
"policy": {
"timeout": "0.12:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [],
"typeProperties": {
"source": {
"type": "JsonSource",
"additionalColumns": [
{
"name": "name",
"value": {
"value": "#pipeline().parameters.demo.age",
"type": "Expression"
}
}
],
"storeSettings": {
"type": "AzureBlobFSReadSettings",
"recursive": true,
"enablePartitionDiscovery": false
},
"formatSettings": {
"type": "JsonReadSettings"
}
},
"sink": {
"type": "AzureSqlSink",
"writeBehavior": "insert",
"sqlWriterUseTableLock": false,
"tableOption": "autoCreate",
"disableMetricsCollection": false
},
"enableStaging": false,
"translator": {
"type": "TabularTranslator",
"mappings": [
{
"source": {
"path": "$['taskId']"
},
"sink": {
"name": "taskId",
"type": "String"
}
},
{
"source": {
"path": "$['taskObtainedScore']"
},
"sink": {
"name": "taskObtainedScore",
"type": "String"
}
},
{
"source": {
"path": "$['multiInstance']"
},
"sink": {
"name": "multiInstance",
"type": "String"
}
},
{
"source": {
"path": "$['name']"
},
"sink": {
"name": "name",
"type": "Int64"
}
}
],
"collectionReference": ""
}
},
"inputs": [
{
"referenceName": "Json1",
"type": "DatasetReference"
}
],
"outputs": [
{
"referenceName": "AzureSqlTable1",
"type": "DatasetReference"
}
]
}
],
"parameters": {
"demo": {
"type": "object",
"defaultValue": {
"name": "John",
"age": 30,
"isStudent": true
}
}
},
"annotations": []
}
}
OUTPUT:
I am trying to create a Delivery Plan in an Azure DevOps project using the Azure DevOps REST APIs. I have used the following method to create it:
https://learn.microsoft.com/en-us/rest/api/azure/devops/work/plans/create?view=azure-devops-rest-6.0
POST https://dev.azure.com/{organization}/{project}/_apis/work/plans?api-version=6.0
and I am sending the following data in the request body's properties:
{
"properties": {
"teamBacklogMappings": [
{
"teamId": "09d57738-697f-4433-abdd-b80a2bc6337b",
"categoryReferenceName": "Microsoft.RequirementCategory"
},
{
"teamId": "5df45eec-4108-474a-8d93-bc09c0b9037e",
"categoryReferenceName": "Microsoft.RequirementCategory"
},
{
"teamId": "e8ed402b-68e7-4140-96f2-07790a08788b",
"categoryReferenceName": "Microsoft.RequirementCategory"
},
{
"teamId": "14425694-efa5-454e-811c-e9e03d79198f",
"categoryReferenceName": "Microsoft.RequirementCategory"
},
{
"teamId": "b02690e9-1f48-421a-a918-3b23cc9a1b73",
"categoryReferenceName": "Microsoft.RequirementCategory"
}
],
"cardSettings": {
"fields": {
"showId": true,
"showAssignedTo": true,
"assignedToDisplayFormat": "avatarOnly",
"showState": true,
"showTags": true,
"showParent": false,
"showEmptyFields": true,
"showChildRollup": true,
"additionalFields": null,
"coreFields": [
{
"referenceName": "System.AssignedTo",
"displayName": "Assigned To",
"fieldType": "string",
"isIdentity": true
},
{
"referenceName": "System.Id",
"displayName": "ID",
"fieldType": "integer",
"isIdentity": false
},
{
"referenceName": "System.State",
"displayName": "State",
"fieldType": "string",
"isIdentity": false
},
{
"referenceName": "System.Tags",
"displayName": "Tags",
"fieldType": "plainText",
"isIdentity": false
}
]
}
},
"markers": [],
"styleSettings": [
{
"name": "BLOCKER",
"isEnabled": "True",
"filter": "[System.Tags] CONTAINS 'BLOCKER'",
"clauses": [
{
"fieldName": "System.Tags",
"logicalOperator": "AND",
"operator": "CONTAINS",
"value": "BLOCKER"
}
],
"settings": {
"background-color": "#E60017",
"title-color": "#000000"
}
},
{
"name": "New",
"isEnabled": "True",
"filter": "[System.State] = 'New'",
"clauses": [
{
"fieldName": "System.State",
"logicalOperator": "AND",
"operator": "=",
"value": "New"
}
],
"settings": {
"background-color": "#AAAAAA",
"title-color": "#000000"
}
},
{
"name": "Dev Completed",
"isEnabled": "True",
"filter": "[System.State] = 'Development Completed'",
"clauses": [
{
"fieldName": "System.State",
"logicalOperator": "AND",
"operator": "=",
"value": "Development Completed"
}
],
"settings": {
"background-color": "#D7E587",
"title-color": "#000000"
}
},
{
"name": "Deployed to QA",
"isEnabled": "True",
"filter": "[System.State] = 'Deployed to QA (SIT)'",
"clauses": [
{
"fieldName": "System.State",
"logicalOperator": "AND",
"operator": "=",
"value": "Deployed to QA (SIT)"
}
],
"settings": {
"background-color": "#C3D84C",
"title-color": "#000000"
}
},
{
"name": "Deployed to UAT",
"isEnabled": "True",
"filter": "[System.State] = 'Deployed to UAT'",
"clauses": [
{
"fieldName": "System.State",
"logicalOperator": "AND",
"operator": "=",
"value": "Deployed to UAT"
}
],
"settings": {
"background-color": "#60AF49",
"title-color": "#000000"
}
},
{
"name": "Deployed to PROD",
"isEnabled": "True",
"filter": "[System.State] = 'Completed'",
"clauses": [
{
"fieldName": "System.State",
"logicalOperator": "AND",
"operator": "=",
"value": "Completed"
}
],
"settings": {
"background-color": "#00643A",
"title-color": "#000000"
}
}
],
"tagStyleSettings": []
}
}
However, the styling rules are not getting created in the project.
I got this done by using Update for the Delivery Plan immediately after creating it. Update it with the same properties, though you will have to add the revision property to the request body.
https://learn.microsoft.com/en-us/rest/api/azure/devops/work/plans/update?view=azure-devops-rest-6.0
https://dev.azure.com/{organization}/{project}/_apis/work/plans/{id}?api-version=6.0
Create Delivery Plan doesn't accept style settings; you can notice this from the UI:
Just get the plan id and the response from the Create API and then use them in the Update API; this is the only way.
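In case it helps, here is a rough Kotlin sketch of that create-then-update flow (not official sample code: AZDO_PAT, organization, project and createBody are placeholders, Jackson is assumed to be on the classpath, and createBody is assumed to already contain the plan name and properties you send to Create):

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.databind.node.ObjectNode
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse
import java.util.Base64

fun createPlanWithStyles(organization: String, project: String, createBody: String): String {
    val mapper = ObjectMapper()
    val client = HttpClient.newHttpClient()
    // Azure DevOps PATs are sent as Basic auth with an empty user name.
    val auth = "Basic " + Base64.getEncoder()
        .encodeToString((":" + System.getenv("AZDO_PAT")).toByteArray())
    val base = "https://dev.azure.com/$organization/$project/_apis/work/plans"

    // 1. Create the plan; the style settings in the body are ignored at this step.
    val createReq = HttpRequest.newBuilder(URI.create("$base?api-version=6.0"))
        .header("Authorization", auth)
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(createBody))
        .build()
    val created = mapper.readTree(client.send(createReq, HttpResponse.BodyHandlers.ofString()).body())

    // 2. PUT the same body plus the returned revision to the Update endpoint.
    val updateBody = (mapper.readTree(createBody) as ObjectNode)
        .put("revision", created.get("revision").asInt())
    val updateReq = HttpRequest.newBuilder(URI.create("$base/${created.get("id").asText()}?api-version=6.0"))
        .header("Authorization", auth)
        .header("Content-Type", "application/json")
        .PUT(HttpRequest.BodyPublishers.ofString(mapper.writeValueAsString(updateBody)))
        .build()
    return client.send(updateReq, HttpResponse.BodyHandlers.ofString()).body()
}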
My login has different payloads. One is:
{
"username": "",
"pass": ""
}
And another is:
{
"username": "",
"pass": "",
"facebook": true
}
And the last:
{
"username": "",
"pass": "",
"google": true
}
My schema is as follows:
login_schema = {
"title": "UserLogin",
"description": "User login with facebook, google or regular login.",
"type": "object",
"properties": {
"username": {
"type": "string"
},
"pass": {
"type": "string"
},
"facebook": {
"type": "string"
},
"google": {
"type": "string"
}
},
"oneOf": [
{
"required": [
"username",
"pass"
],
"additionalProperties": False,
},
{
"required": [
"username",
"pass"
"google"
]
},
{
"required": [
"username",
"pass",
"facebook"
]
}
],
"minProperties": 2,
"additionalProperties": False,
}
It should give an error for the below sample:
{
"username": "",
"pass": "",
"google": "",
"facebook": ""
}
But it validates successfully! What have I done wrong in the above schema?
EDIT-1:
pip3 show jsonschema
Name: jsonschema
Version: 3.0.2
Summary: An implementation of JSON Schema validation for Python
Home-page: https://github.com/Julian/jsonschema
Author: Julian Berman
Author-email: Julian@GrayVines.com
License: UNKNOWN
Location: /usr/local/lib/python3.7/site-packages
Requires: setuptools, six, attrs, pyrsistent
EDIT-2:
What I get as an error is:
jsonschema.exceptions.ValidationError: {'username': '', 'pass': '', 'google': '12'} is valid under each of {'required': ['username', 'pass', 'google']}, {'required': ['username', 'pass']}
A live demo of the error: https://jsonschema.dev/s/mXg5X
Your solution is really close. You just need to change /oneOf/0 to
{
"properties": {
"username": true,
"pass": true
},
"required": ["username", "pass"],
"additionalProperties": false
}
The problem is that additionalProperties doesn't consider the required keyword when determining what properties are considered "additional". It considers only properties and patternProperties. When just using required, additionalProperties considers all properties to be "additional" and the only valid value is {}.
However, I suggest a different approach. The dependencies keyword is useful in these situations.
{
"type": "object",
"properties": {
"username": { "type": "string" },
"pass": { "type": "string" },
"facebook": { "type": "boolean" },
"google": { "type": "boolean" }
},
"required": ["username", "pass"],
"dependencies": {
"facebook": { "not": { "required": ["google"] } },
"google": { "not": { "required": ["facebook"] } }
},
"additionalProperties": false
}
There is a Project model:
{
"name": "Project",
"plural": "Projects",
"base": "PersistedModel",
"idInjection": true,
"options": {
"validateUpsert": true
},
"properties": {
"title": {
"type": "string",
"required": true
},
"description": {
"type": "string"
},
"code": {
"type": "string"
},
"startDate": {
"type": "date",
"required": true
},
"endDate": {
"type": "date"
},
"value": {
"type": "number"
},
"infoEN": {
"type": "string"
},
"infoRU": {
"type": "string"
},
"infoAM": {
"type": "string"
},
"externalLinks": {
"type": [
"string"
]
}
},
"validations": [],
"relations": {
"industry": {
"type": "belongsTo",
"model": "Industry",
"foreignKey": "",
"options": {
"nestRemoting": true
}
},
"service": {
"type": "belongsTo",
"model": "Service",
"foreignKey": "",
"options": {
"nestRemoting": true
}
},
"tags": {
"type": "hasAndBelongsToMany",
"model": "Tag",
"foreignKey": "",
"options": {
"nestRemoting": true
}
}
},
"acls": [],
"methods": {}
}
And it has a hasAndBelongsToMany relation to Tag.
Here is the Tag model:
{
"name": "Tag",
"plural": "Tags",
"base": "PersistedModel",
"idInjection": true,
"options": {
"validateUpsert": true
},
"properties": {
"name": {
"type": "string",
"required": true
}
},
"validations": [],
"relations": {},
"acls": [],
"methods": {}
}
Now when the relation is created, the LoopBack API gives this endpoint:
POST /Projects/{id}/tags
This creates a new tag in the Tag collection and adds it to the project.
But what about adding an already existing tag to the project?
So I figured maybe I should add a before save hook to the Tag model.
Here I'll check if the tag exists and then pass the existing one for the relation.
Something like this:
tag.js
'use strict';
module.exports = function(Tag) {
  Tag.observe('before save', function(ctx, next) {
    console.log(ctx.instance);
    Tag.find({name: ctx.instance.name})
    next();
  });
  // Tag.validatesUniquenessOf('name', {message: 'name is not unique'});
};
@HaykSafaryan This is just a demo to show you how to use Tag inside Project:
var app = require('../../server/server');

module.exports = function(project) {
  var tag = app.models.Tag;
  // afterRemote 'create' is just for the demo; you can use any method
  project.afterRemote('create', function(ctx, next) {
    tag.find({where: {name: ctx.instance.name}}, function(err, result) {
      if (err) throw err;
      next();
    });
  });
};
This is just example code to show you how to use update, create, find, upsertWithWhere, etc. on Tag. For validation you have to set up the condition here; it will not apply the validation which you defined in the Tag model.
Now suppose I have JSON data formatted like the following:
{
"ServiceName": "cacheWebApi",
"Description": "This is a CacheWebApiService",
"IsActive": true,
"Urls": [{ "ServiceAddress": "http://192.168.111.210:8200", "Weight": 5, "IsAvailable": true },
{ "ServiceAddress": ",http://192.168.111.210:8200", "Weight": 3, "IsAvailable": true }]
}
Now what worries me is that "Urls" is another nested JSON structure. So how do I bind this value to the DataTable? Have you got any good ideas (e.g. something like showing only all of the ServiceAddress values)?
This should do what you need:
var data = [{
"ServiceName": "cacheWebApi",
"Description": "This is a CacheWebApiService",
"IsActive": true,
"Urls": [
{
"ServiceAddress": "http://192.168.111.210:8200",
"Weight": 5,
"IsAvailable": true
},
{
"ServiceAddress": ",http://192.168.111.210:8200",
"Weight": 3,
"IsAvailable": true
}
]
}];
$(function() {
var table = $('#example').dataTable({
"data": data,
"columns": [
{
"data": "ServiceName"
}, {
"data": "Description"
}, {
"data": "IsActive"
}, {
"data": "Urls[0].ServiceAddress"
}, {
"data": "Urls[0].Weight"
}, {
"data": "Urls[0].IsAvailable"
}, {
"data": "Urls[1].ServiceAddress"
}, {
"data": "Urls[1].Weight"
}, {
"data": "Urls[1].IsAvailable"
}
],
});
});
You should put your data in an array though. Working JSFiddle
EDIT
If the number of Urls isn't defined, then you could do something like this:
var table = $('#example').dataTable({
"data": data,
"columns": [
{
"data": "ServiceName"
}, {
"data": "Description"
}, {
"data": "IsActive"
}, {
"data": "Urls",
"render": function(d){
return JSON.stringify(d);
}
}
],
});
I guess that isn't brilliant, but you could do almost anything in that function, for instance:
var table = $('#example').dataTable({
"data": data,
"columns": [
{
"data": "ServiceName"
}, {
"data": "Description"
}, {
"data": "IsActive"
}, {
"data": "Urls",
"render": function(d){
return d.map(function(c){
return c.ServiceAddress
}).join(", ");
}
}
],
});