JPQL filter string with number value throws IllegalArgumentException in portal-app REST API V1 - cuba-platform

I want to get a customer by zip (a String) using this body:
{
"entity": "demo$Customer",
"query": "select c from demo$Customer c where c.zip = :zip",
"params": [
{
"name": "zip",
"value": "12345"
}
]
}
I get this error:
java.lang.IllegalArgumentException: You have attempted to set a value
of type class java.math.BigDecimal for parameter zip with expected
type of class java.lang.String from query string select c from
demo$Customer c where c.zip = :zip.
If I change the value to C12345 I get the data.
Are my params wrong, or is it a bug that the value is parsed as a BigDecimal while the domain property is a String? How can I mark the value as a String explicitly?
Thanks for any answers.

You have to specify the parameter type explicitly. Your request will look like this:
{
"entity": "demo$Customer",
"query": "select c from demo$Customer c where c.zip = :zip",
"params": [
{
"name": "zip",
"value": "123",
"type": "string"
}
]
}
Arguments with implicit types are handled successfully when a date or number argument arrives in the expected format. When a string argument merely looks like a date or number, the explicit type is required.

Related

A json Schema with an array of a $ref or an enum

I would like a JSON Schema that enforces an array whose items are either a $ref or an enum of null. I accidentally defined a tuple, which is not what I want. Here is my current schema (note that I must use draft-04):
{
"$schema": "http://json-schema.org/draft-04/schema#",
"version": "4.4.0",
"title": "myCollection",
"description": "Resume/CV",
"type": "object",
"properties": {
"EmploymentHistories": {
"type": "array",
"items": {
"oneOf": [
{
"$ref": "../../../Common/json/base/TextType.json#"
},
{
"enum": [
null
]
}
]
},
"additionalProperties": false
}
}
}
And here is an instance I would like:
{
"EmploymentHistories": [
{
"value": "String",
"languageCode": "aa"
},
{
"value": "String",
"languageCode": "aa"
},
null,
null
]
}
But I am getting an error on validation like:
File D:\Dev\Proj\Recruiting\json\resumecv\samples\Untitled5.json is not valid.
A value of type 'null' is not permitted here.
Reason: it must be of one of the following types (see below)
'string'
'object'
Hint: Either 'type' is present and doesn't contain 'null' or 'enum' is present and doesn't contain a value of type 'null'.
Error location: EmploymentHistories / 3
Details
Array item '2' is not valid.
Property 'EmploymentHistories' is not valid.
A value of type 'null' is not permitted here.
Reason: it must be of one of the following types (see below)
'string'
'object'
Hint: Either 'type' is present and doesn't contain 'null' or 'enum' is present and doesn't contain a value of type 'null'.
Error location: EmploymentHistories / 4
Details
Array item '3' is not valid.
Property 'EmploymentHistories' is not valid.
Any help is appreciated.
This looks like a bug in the validator implementation you are using. It seems to be saying that "enum": [null] is not allowed in a schema. The error is incorrect. This should be perfectly fine. However, you can probably work around this bug by changing it to "type": "null", which should have the same effect.
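With that workaround applied, the items subschema would look like this (keeping the $ref path from the original schema):

```json
"items": {
  "oneOf": [
    {
      "$ref": "../../../Common/json/base/TextType.json#"
    },
    {
      "type": "null"
    }
  ]
}
```

In draft-04, "type": "null" matches exactly the JSON null value, so it accepts the same instances the "enum": [null] branch was meant to accept.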

How to use the try function with map in DataWeave 2.0

Hi everybody, I hope you are well. I have a question about how to use try inside map in DataWeave. I receive a CSV file with multiple rows, grouped by order number, and I use map to transform the data to JSON format (joining all the rows with the same order) using a couple of columns from the CSV file. If any of those columns is empty or null, the map fails and breaks the whole file. How can I use the try function in DataWeave so that, if any group of orders fails, I only capture that order in another part of the JSON and continue with the next order, without breaking the loop?
Part of the CSV File - Demo:
number,date,upc,quantity,price
1234556,2022-08-04,4015,1,
1234556,2022-08-04,4019,1,2.00
1234556,2022-08-04,4016,1,3.00
1234557,2022-08-04,4015,1,3.00
Dataweave:
%dw 2.0
output application/json
---
payload groupBy ($.number) pluck $ map ( () -> {
"number": $[0].number,
"date": $[0].date,
"items": $ map {
"upc": $.upc,
"price": $.price as Number {format: "##,###.##"} as String {format: "##,###.00"},
"quantity": $.quantity
}
})
Error Message:
Unable to coerce `` as Number using `##,###.##` as format.
NOTE: if I put data in the price position, the issue is solved for the first row, but I need to use the try function or whatever you recommend; I can't validate every element in the CSV because this is a demo and the complete file has many more columns. If you have any other comments to improve my code, I'd appreciate them.
Expected Result: (I don't know if this is possible)
[
{
"data": [
{
"number":"1234557",
"date":"2022-08-04",
"items":[
{
"upc":"4015",
"price":"3.00",
"quantity":"1"
}
]
}
]
},
{
"Error":[
{
"number":"1234556",
"message":"Unable to coerce `` as Number using `##,###.##` as format."
}
]
}
]
best regards!!
Hi, the closest I got to what you asked for was:
%dw 2.0
output application/json
import * from dw::Runtime
fun safeMap<T, R>(items: Array<T>, callback: (item:T) -> R ): Array<R | {message: String}> =
items map ((item) -> try(() -> callback(item)) match {
case is {success: false} -> {message: $.error.message as String}
case is {success: true, result: R} -> $.result
})
---
payload
groupBy ($.number)
pluck $
safeMap ((item) -> {
"number": item[0].number,
"date": item[0].date,
"items": item safeMap {
"upc": $.upc,
"price": $.price as Number {format: "##,###.##"} as String {format: "##,###.00"},
"quantity": $.quantity
}
})
This uses a combination of the map and try functions.
And it outputs
[
{
"number": "1234556",
"date": "2022-08-04",
"items": [
{
"message": "Unable to coerce `` as Number using `##,###.##` as format."
},
{
"upc": "4019",
"price": "2.00",
"quantity": "1"
},
{
"upc": "4016",
"price": "3.00",
"quantity": "1"
}
]
},
{
"number": "1234557",
"date": "2022-08-04",
"items": [
{
"upc": "4015",
"price": "3.00",
"quantity": "1"
}
]
}
]
If you want to resolve the value of price when it is null/empty in the input CSV and get rid of the error (which happens because null values cannot be formatted as a Number), try adding a default for empty/null values and formatting to String only when the value exists, like below:
%dw 2.0
output application/json
---
payload groupBy ($.number) pluck $ map ( () -> {
"number": $[0].number,
"date": $[0].date,
"items": $ map {
"upc": $.upc,
"price": ($.price as String {format: "##,###.00"}) default $.price,
"quantity": $.quantity
}
})
Note:
For price, you don't need to convert to Number at all if you want your output as formatted string ultimately.

Django Rest Framework- How to send data as strings

I want to send all the data in a response as strings. For example, id is stored in the database as an integer, but I want to send it as a string in the response.
eg: I have the response as
{
"categories": [
{
"id": 1,
"category": "xya",
"quantity": 25
}
]
}
I want it to be as:
{
"categories": [
{
"id": "1",
"category": "xya",
"quantity": "25"
}
]
}
I am using ModelSerializer to send all the fields.
Another option is to convert int to str using the to_representation method of your model serializer.
class YourSerializer(serializers.ModelSerializer):
    # other fields

    def to_representation(self, instance):
        """Override the `to_representation` method."""
        repr = super().to_representation(instance)
        repr['id'] = str(repr['id'])
        repr['quantity'] = str(repr['quantity'])
        return repr
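If every field should come back as a string, not just id and quantity, the same to_representation override can stringify values generically. Here is a minimal sketch of that idea in plain Python (the stringify_values helper is hypothetical, and the dict stands in for what super().to_representation(instance) returns):

```python
def stringify_values(representation):
    """Convert every non-null value in a serializer representation to str."""
    return {
        key: str(value) if value is not None else None
        for key, value in representation.items()
    }

# Example: the dict a ModelSerializer might produce for one category row.
row = {"id": 1, "category": "xya", "quantity": 25}
print(stringify_values(row))  # {'id': '1', 'category': 'xya', 'quantity': '25'}
```

Inside a real serializer you would call this on the result of super().to_representation(instance) and return it; null fields are left as null so they serialize as JSON null rather than the string "None".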
You can explicitly define the id field in your serializer as a CharField(), like this:
class YourSerializer(serializers.ModelSerializer):
    # other fields
    id = serializers.CharField()

    class Meta:
        model = SomeModel
        fields = ('id', ..... other fields)

Trying to construct PostgreSQL Query to extract from JSON a text value in an object, in an array, in an object, in an array, in an object

I am constructing an interface between a PostgreSQL system and a SQL Server system and am attempting to "flatten" the structure of the JSON data to facilitate this. I'm very experienced in SQL Server but I'm new to both PostgreSQL and JSON.
The JSON contains essentially two types of structure: those of type "text" or "textarea" where the value I want is in an object named value (the first two cases below) and those of type "select" where the value object points to an id object in a lower-level options array (the third case below).
{
"baseGroupId": {
"fields": [
{
"id": "1f53",
"name": "Location",
"type": "text",
"options": [],
"value": "Over the rainbow"
},
{
"id": "b547",
"name": "Description",
"type": "textarea",
"options": [],
"value": "A place of wonderful discovery"
},
{
"id": "c12f",
"name": "Assessment",
"type": "select",
"options": [
{
"id": "e5fd",
"name": "0"
},
{
"id": "e970",
"name": "1"
},
{
"id": "0ff4",
"name": "2"
},
{
"id": "2db3",
"name": "3"
},
{
"id": "241f",
"name": "4"
},
{
"id": "3f52",
"name": "5"
}
],
"value": "241f"
}
]
}
}
Those with a sharp eye will see that the value of the last value object "241f" can also be seen within the options array against one of the id objects. When nested like this I need to extract the value of the corresponding name, in this case "4".
The JSON-formatted information is in table customfield, field textvalue. Its datatype is text, but I'm coercing it to json. I was originally getting errors about set-returning functions when trying to apply the criteria in a WHERE clause, and then I read about using a LATERAL subquery instead. It now runs, but it returns all the options, not just the one matching the value.
I'm afraid I couldn't get an SQL Fiddle working to reproduce my results, but I would really appreciate an examination of my query to see if the problem can be spotted.
with cte_custombundledfields as
(
select
textvalue
, cfname
, json_array_elements(textvalue::json -> 'baseGroupId'->'fields') ->> 'name' as name
, json_array_elements(textvalue::json -> 'baseGroupId'->'fields') ->> 'value' as value
, json_array_elements(textvalue::json -> 'baseGroupId'->'fields') ->> 'type' as type
from
customfield
)
, cte_custombundledfieldsoptions as
(
select *
, json_array_elements(json_array_elements(textvalue::json -> 'baseGroupId'->'fields') -> 'options') ->> 'name' as value2
from
cte_custombundledfields x
, LATERAL json_array_elements(x.textvalue::json -> 'baseGroupId'->'fields') y
, LATERAL json_array_elements(y -> 'options') z
where
type = 'select'
and z ->> 'id' = x.value
)
select *
from
cte_custombundledfieldsoptions
I posted a much-simplified rewrite of this question, which was answered by Bergi:
How do I query a string from JSON based on another string within the JSON in PostgreSQL?
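For readers who land here: calling json_array_elements several times in the SELECT list produces independent row sets, which is what multiplies the options. One way to flatten this kind of structure keeps a single LATERAL expansion per nesting level and moves the option lookup into a join condition. A minimal, untested sketch against the customfield table described above:

```sql
SELECT f ->> 'name'                          AS name,
       f ->> 'type'                          AS type,
       -- for "select" fields resolve the option name, otherwise keep the raw value
       COALESCE(o ->> 'name', f ->> 'value') AS value
FROM customfield
CROSS JOIN LATERAL json_array_elements(textvalue::json -> 'baseGroupId' -> 'fields') AS f
LEFT JOIN LATERAL json_array_elements(f -> 'options') AS o
       ON o ->> 'id' = f ->> 'value';
```

The LEFT JOIN keeps "text" and "textarea" fields (whose options array is empty) while resolving "select" fields to the matching option's name, e.g. "241f" to "4".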

BigQuery: Unable to set values in an "in unnest()" clause

I'm trying to use an "in unnest()" clause with the BigQuery client from a Java application, where I've set the type of a named parameter to an array type because a list of values is to be sent in the clause. But I get this error response:
{
"code" : 400,
"errors" : [ {
"domain" : "global",
"message" : "Invalid query parameter type",
"reason" : "invalid"
} ],
"message" : "Invalid query parameter type"
}
I got the same error when I tried using REST API where parameter type was set as
"parameterType": {
"arrayType": {
"type": "STRING"
}
How do I set multiple values in the "in unnest(@myparam)" clause?
From the documentation it should be:
"parameterType": {
"type": "ARRAY",
"arrayType": {
"type": "STRING"
}
}
Specifically, you also need to provide the ARRAY type.
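For completeness, a full named parameter in the REST request body pairs that parameterType with a matching parameterValue holding arrayValues (the parameter name myparam and the sample values are placeholders):

```json
{
  "queryParameters": [
    {
      "name": "myparam",
      "parameterType": {
        "type": "ARRAY",
        "arrayType": { "type": "STRING" }
      },
      "parameterValue": {
        "arrayValues": [
          { "value": "first" },
          { "value": "second" }
        ]
      }
    }
  ]
}
```

Note that query parameters require standard SQL, so useLegacySql must be false in the same request.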