I am moving data from SQL Server to Cosmos DB using the Copy Activity in Data Factory v2. One of the columns in SQL Server contains a JSON object (although its data type is varchar(MAX)), and I have mapped it to a column in the Cosmos DB collection. The issue is that the value is added as a string, NOT as a JSON object. How can I set up the Copy Activity so that the data for that particular column gets added as a JSON object rather than a string?
It gets added as follows:
MyObject:"{SomeField: "Value" }"
However I want this to be:
MyObject:{SomeField: "Value" } // Without quotes so that it appears as json object rather than string
Use the JSON conversion function available in Data Factory:
https://learn.microsoft.com/en-us/azure/data-factory/control-flow-expression-language-functions#json
MyObject: json('{"SomeField": "Value"}')
It will result in:
MyObject: {"SomeField": "Value"}
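To illustrate the difference between the two shapes (a Python sketch of the concept, not ADF expression code): copying the column value as-is keeps it as quoted JSON text, while parsing it first yields a nested object, which is what the `json()` conversion does.

```python
import json

# The SQL column value: a JSON object serialized as a varchar(MAX) string.
raw = '{"SomeField": "Value"}'

# Copied as-is, the Cosmos document holds the JSON text as a string.
as_string = {"MyObject": raw}

# What the json() conversion does conceptually: parse the text first,
# so the document holds a nested object instead.
as_object = {"MyObject": json.loads(raw)}

print(json.dumps(as_string))  # the value is a quoted string
print(json.dumps(as_object))  # the value is a nested object
```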
I am new to JaVers, so my question may be silly. I have a Java Spring application and have pulled in JaVers version 6.8.2. I can save my entities properly, and at the same time JaVers saves the state of my entity in its table structure. But whenever I try to query historical information, I fail. I want to get all the historical data related to a few fields' values. Let me give you an example:
I have the following Java entity (I omit the JPA annotations here):
public class DummyClass {
    String field_1;
    Integer field_2;
    OffsetDateTime field_3;
    List<Integer> field_4;
}
@Repository
@JaversSpringDataAuditable
public interface DummyClassRepository extends JpaRepository<DummyClass, String> {}
I can persist my entity into the jv_snapshot table via the JaVers repository; my entity's structure is saved in JSON format under the "state" column. Here is my jv_snapshot table column list:
snapshot_pk;
type;
state;
changed_properties;
managed_type;
global_id_fk;
commit_fk;
And here is how my data was persisted:
column state:
{
"field_1": "XXXXX",
"field_2": "wwww",
"field_3": "2022-12-20T02:46:00.540",
"field_4": "zzz"
}
Now I want to get back all of my historical records; in particular, I want to get back all of the domain objects (using the original Java class) where certain fields carry certain values.
So, for instance, I want to see all of my entities where field_1=XXXXX and field_3>2021-12-20T02:46:00.540.
Is there any way I can achieve this in JaVers?
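Since JaVers stores each snapshot's state as plain JSON, one workaround (independent of whatever JaVers' query language supports) is to fetch the snapshots for the class and filter their state on the application side. A minimal Python sketch of that filtering logic, using made-up "state" rows shaped like the jv_snapshot data above:

```python
import json
from datetime import datetime

# Hypothetical "state" column values, shaped like the jv_snapshot rows above.
states = [
    '{"field_1": "XXXXX", "field_3": "2022-12-20T02:46:00.540"}',
    '{"field_1": "YYYYY", "field_3": "2022-12-20T02:46:00.540"}',
    '{"field_1": "XXXXX", "field_3": "2020-01-01T00:00:00.000"}',
]

cutoff = datetime.fromisoformat("2021-12-20T02:46:00.540")

def matches(state_json: str) -> bool:
    # The desired filter: field_1 == "XXXXX" and field_3 > the cutoff.
    state = json.loads(state_json)
    return (state["field_1"] == "XXXXX"
            and datetime.fromisoformat(state["field_3"]) > cutoff)

hits = [s for s in states if matches(s)]
print(len(hits))  # only the first row satisfies both conditions
```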
I am unable to load empty string values for vertex and edge properties. As far as I know, I am following the Gremlin load format and providing the right request JSON to the Neptune loader endpoint.
This is how the vertex CSV file (generated using pandas) on S3 looks.
The request payload sent to the loader endpoint (http://database:8182/loader):
{
    "source" : "s3://bucket/dir/or/object/containing/csvfile/",
    "format" : "csv",
    "iamRoleArn" : "arn:sample",
    "region" : "us-east-1",
    "failOnError" : "FALSE",
    "parserConfiguration" : {
        "allowEmptyStrings" : true
    }
}
The data is loaded successfully and the vertex is created.
The 'label' and 'id' fields are assigned the values mentioned in the CSV.
The 'key2' property shows its value as 'value' (as mentioned in the CSV).
The 'key1' property is not found/loaded in the database, in spite of providing allowEmptyStrings as true in the request payload.
If you want an empty string, you need to encapsulate it in quotes within the CSV:
~id,~label,key1:String,key2:String
testid1,testlabel,"",value2
testid2,testlabel,value1,""
If the field is not encapsulated in quotes, it is treated as null (which is different from an empty string). Null values are implicit within TinkerPop as it presently stands.
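Since the CSV was generated with pandas, note that the default CSV quoting (in pandas' `to_csv` and in Python's `csv` module alike) writes empty fields bare, which the loader then treats as null. Forcing full quoting makes the empty string explicit. A sketch with the standard-library `csv` module (with pandas, the equivalent is its `quoting` parameter):

```python
import csv
import io

rows = [
    ["~id", "~label", "key1:String", "key2:String"],
    ["testid1", "testlabel", "", "value2"],
]

# Default quoting: the empty field is written bare (",,"), read as null.
minimal = io.StringIO()
csv.writer(minimal).writerows(rows)

# QUOTE_ALL: the empty field is written as "", an explicit empty string.
quoted = io.StringIO()
csv.writer(quoted, quoting=csv.QUOTE_ALL).writerows(rows)

print(minimal.getvalue())
print(quoted.getvalue())
```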
The data source in PowerApps gallery was a SQL View.
Search('[dbo].[vwCandidate]', textSearchCandidate.Text, "NameLast", "NameFirst", "MiscellaneousTags", "EmailAddress", "PhoneNumber")
The selected record populated a global variable for the form item.
Set(varCandidate, gallerySearchResults.Selected)
Everything works as expected. Then I changed the data source to a stored procedure, to move the search from PowerApps to SQL Server. After doing so, I received the error message:
"Incompatible Type. We can't evaluate your formula because the context
variable types are incompatible with the types of the values in other
places in your app"
I cannot revert back to the view that was working without getting the same error. I hope my only option is NOT to use a new variable and change every occurrence in the form/app; I'd like to avoid this if possible.
I cannot view the form, so I'm not sure how to debug properly. My hunch is that the date fields being returned via Flow are causing the problem. They are 'smalldatetime' types, and the Flow returns a string 'yyyy-mm-ddThh:mm:ss.000' even though 'date' is being requested.
"PhoneNumber": {
"type": "string"
},
"CandidateStatus": {
"type": "string"
},
"DateApplied": {
"type": "string",
"format": "date"
},
The Flow JSON schema here does not seem to accept any of the other 'date' format types.
Are there any workarounds from Flow? Should I reformat the date values when I am setting the global variable? Any advice?
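To see the mismatch concretely (a Python sketch; the names are illustrative): the value Flow hands back is a timestamp-shaped string, not a date value, so it has to be parsed before it can be compared with anything typed as a date elsewhere in the app.

```python
from datetime import datetime

# What Flow returns for the smalldatetime column, despite the schema
# declaring "format": "date".
flow_value = "2019-03-13T17:40:52.000"

# The string itself is not a date value; it must be parsed first.
parsed = datetime.fromisoformat(flow_value)
date_only = parsed.date().isoformat()

print(type(flow_value).__name__)  # str
print(date_only)                  # 2019-03-13
```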
It turns out I was on the right track in thinking that the DATE data type was coming from Flow as a string. Here's why:
A new record was created using a Patch function while setting the global variable:
Set(varCandidate, Patch('[dbo].[candidate]', Defaults('[dbo].[candidate]'), {DateApplied: DateTimeValue(Text(Now())), CreatedDate:DateTimeValue(Text(Now())), CreatedBy:varUser.Email}))
The "DateApplied" field was a "DATE" type in the SQL table, and it was coming from Flow as the string "2019-03-13T17:40:52.000". The recordset from Flow was being assigned to the same global variable when I wanted to edit the record:
Set(varCandidate, gallerySearchResults.Selected)
The error "Incompatible Type" (see the question for the full error message) was due to this field being a date value in a new record and a string in an edited record.
My fix was to remove the "Date"-typed fields from the Patch and modify the Flow to retrieve the newly created record by ID.
Reset everything back, including the data source, then save and close the app completely and re-test.
Remove any Flow connections, then save and close the app completely, re-test, and re-add the Flow connections.
I don't know why, but PowerApps sometimes persists data-connection errors until you have closed the app down completely.
And just to confirm: PowerApps doesn't support stored procedures as data sources, only for writes using the Patch function etc.
I'll try to keep a long story short:
MySQL has had a spatial Point data type for some time. To insert data, we need to use an expression of the form INSERT (...) VALUES (POINT(lon, lat), ...). To do that in the Cake ORM, we need to set the given property to new QueryExpression("POINT($lon,$lat)"), and CakePHP will handle the data binding upon saving the entity.
Now what I want is to have an entity with a property (field) of type GeoJSON Point that corresponds to a MySQL column of type Point, and the ability to store and fetch entities to and from the DB.
I have used the beforeSave callback for that, in the following manner:
public function beforeSave(Event $event, \Cake\Datasource\EntityInterface $entity, ArrayObject $options) {
    $coords = $entity->position->getCoordinates();
    $entity->position = new \Cake\Database\Expression\QueryExpression('POINT(' . $coords[0] . ',' . $coords[1] . ')');
}
where $entity->position is of type Point (from an additional library, but it could be anything). This works in general, but it modifies the actual entity, so if I do:
$entity->position = new Point(5, 10);
$table->save($entity);
$entity->position; // <- this is now a QueryExpression instead of Point(5, 10)
In the third step I still want to have my Point object, not the modified value (a QueryExpression).
I know CakePHP is supposed to have support for custom database data types, but it is useless because... well, it just does not work as I would expect. Here is a GitHub issue describing why the solution recommended by the docs does not work:
https://github.com/cakephp/cakephp/issues/8041
I am using an indexer to sync data from my SQL database to my Azure Search service. I have a field in my SQL view which contains XML data; the column contains a list of strings. The corresponding field in my Azure Search index is a Collection(Edm.String).
On checking some documentation, I found that the indexer does not convert XML (SQL) to a Collection (Azure Search).
Is there any workaround for creating the Collection from the XML data?
P.S. I am extracting the data from a view, so I can change the XML to JSON if needed.
UPDATE on October 17, 2016: Azure Search now automatically converts a string coming from a database to a Collection(Edm.String) field if the data represents a JSON string array: for example, ["blue", "white", "red"]
Old response: great timing, we just added a new "field mappings" feature that allows you to do this. This feature will be deployed sometime early next week. I will post a comment on this thread when this is rolled out in all datacenters.
To use it, you will indeed need to use JSON. Make sure your source column contains a JSON array, for example ["hello", "world"]. Then update your indexer definition to contain the new fieldMappings property:
"fieldMappings" : [ { "sourceFieldName" : "YOUR_SOURCE_FIELD", "targetFieldName" : "YOUR_TARGET_FIELD", "mappingFunction" : { "name" : "jsonArrayToStringCollection" } } ]
NOTE: You'll need to use API version 2015-02-28-Preview to add fieldMappings.
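Shaping the source column is then just a matter of emitting a valid JSON array string. A Python sketch of preparing such a value from a list of strings (the same transformation could equally be done inside the SQL view):

```python
import json

# The strings originally stored as XML elements in the SQL view.
tags = ["blue", "white", "red"]

# Serialize them as a JSON array; this is the shape that the
# jsonArrayToStringCollection mapping (and the later automatic
# conversion) expects in the source column.
source_value = json.dumps(tags)
print(source_value)  # ["blue", "white", "red"]
```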
HTH,
Eugene