I'm kind of a newbie with Apache Solr, and I'm indexing a document which has lists of complex objects, like this:
{
  PropA: AnyValue,
  PropB: [{p1:'v1', p2:'v2'}, {p1:'v3', p2:'v4'}],
  PropC: [{p1:'v1', p2:'v2'}, {p1:'v3', p2:'v4'}]
}
When I send it to Solr, it stores the same data but in a different, flattened format:
{
  PropA: AnyValue,
  PropB.p1: ['v1','v3'],
  PropB.p2: ['v2','v4'],
  PropC.p1: ['v1','v3'],
  PropC.p2: ['v2','v4']
}
This format is causing me problems when deserializing. Is it possible, and what can I do, to get Solr to return the object in the original format?
Do I have to specify something special at the schema level to support subdocuments? I'm kind of lost on this.
Any ideas?
To get a well-formatted document from Solr you can use the Carrot framework.
It is easy to implement, and you can generate XML or JSON output to match your custom requirements using XSLT.
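For reference, Solr itself also ships an XSLT response writer that can reshape the response on the way out. Assuming a stylesheet named custom.xsl has been placed in the core's conf/xslt directory (the core and stylesheet names here are hypothetical), a request might look like:

http://localhost:8983/solr/mycore/select?q=*:*&wt=xslt&tr=custom.xsl

The tr parameter names the stylesheet, and the transformed output is returned instead of Solr's default response format.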
I have a custom schema, stored in a JSON file, that I want to parse. Unfortunately, the schema is part of an enterprise product, so I cannot find any documentation on how to parse it. Is there any approach for parsing a custom schema? I am quite stuck at this point.
One example of such a schema is the following:
{
  "typeid": "org.name:prop1.0.0",
  "properties": {
    "typevalue": "Float64",
    "length": 4,
    "typeid": "element",
    "description": "List to store elements"
  }
}
One thing I was able to figure out is that typeid is analogous to $id in JSON Schema, so I think I can parse that. But I am unsure about the others.
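For what it's worth, a schema like this can at least be read generically with a tree-model JSON parser before deciding what each field means. A minimal sketch using Jackson; the interpretation of typeid as an identifier is the assumption from above:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class CustomSchemaReader {
    public static void main(String[] args) throws Exception {
        // The sample schema from the question, quoted so that it is valid JSON.
        String schema = "{ \"typeid\": \"org.name:prop1.0.0\", \"properties\": {"
                + " \"typevalue\": \"Float64\", \"length\": 4,"
                + " \"typeid\": \"element\", \"description\": \"List to store elements\" } }";

        JsonNode root = new ObjectMapper().readTree(schema);
        // typeid appears analogous to $id in JSON Schema (an assumption).
        System.out.println("id:         " + root.path("typeid").asText());
        System.out.println("value type: " + root.path("properties").path("typevalue").asText());
        System.out.println("length:     " + root.path("properties").path("length").asInt());
    }
}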
I'm trying to generate a JSON schema (http://jsonschema.net) from the syncthing (https://docs.syncthing.net/rest/system-connections-get.html) JSON below.
The problem is that the connection objects are keyed by their ID (e.g. YZJBJFX-RDB...), which is interpreted as a type.
Is it the JSON from syncthing that isn't standard, or is it an issue with the schema generator?
Do you have any suggestions for how to get around this if schema generation is a requirement (i.e. no writing schemas manually)?
{
  "total": {
    "paused": false,
    "clientVersion": "",
    "at": "2015-11-07T17:29:47.691637262+01:00",
    "connected": false,
    "inBytesTotal": 1479,
    "type": "",
    "outBytesTotal": 1318,
    "address": ""
  },
  "connections": {
    "YZJBJFX-RDBL7WY-6ZGKJ2D-4MJB4E7-ZATSDUY-LD6Y3L3-MLFUYWE-AEMXJAC": {
      "connected": true,
      "inBytesTotal": 556,
      "paused": false,
      "at": "2015-11-07T17:29:47.691548971+01:00",
      "clientVersion": "v0.12.1",
      "address": "127.0.0.1:22002",
      "type": "TCP (Client)",
      "outBytesTotal": 550
    },
    "DOVII4U-SQEEESM-VZ2CVTC-CJM4YN5-QNV7DCU-5U3ASRL-YVFG6TH-W5DV5AA": {
      "outBytesTotal": 0,
      "type": "",
      "address": "",
      "at": "0001-01-01T00:00:00Z",
      "clientVersion": "",
      "paused": false,
      "inBytesTotal": 0,
      "connected": false
    },
    "UYGDMA4-TPHOFO5-2VQYDCC-7CWX7XW-INZINQT-LE4B42N-4JUZTSM-IWCSXA4": {
      "address": "",
      "type": "",
      "outBytesTotal": 0,
      "connected": false,
      "inBytesTotal": 0,
      "paused": false,
      "at": "0001-01-01T00:00:00Z",
      "clientVersion": ""
    }
  }
}
Any input is appreciated.
Is it the JSON from syncthing that isn't standard, or is it an issue with the schema generator?
There is nothing non-standard about this JSON. Neither is there any issue with the schema generation.
Unfortunately, defining a schema for what is effectively dynamic content is difficult. This will always be the case because the job of schemas is to describe static data structures.
That said, it may be possible to do this using the patternProperties field in JSON schema. This post is effectively asking the same question as yours.
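For illustration, a hand-written schema along those lines might key the connection objects by a pattern instead of by name. This is only a sketch, and the device-ID pattern is an assumption inferred from the IDs above:

{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "type": "object",
  "properties": {
    "total": { "$ref": "#/definitions/connection" },
    "connections": {
      "type": "object",
      "patternProperties": {
        "^([A-Z0-9]{7}-){7}[A-Z0-9]{7}$": { "$ref": "#/definitions/connection" }
      },
      "additionalProperties": false
    }
  },
  "definitions": {
    "connection": {
      "type": "object",
      "properties": {
        "connected":     { "type": "boolean" },
        "paused":        { "type": "boolean" },
        "clientVersion": { "type": "string" },
        "at":            { "type": "string" },
        "type":          { "type": "string" },
        "address":       { "type": "string" },
        "inBytesTotal":  { "type": "integer" },
        "outBytesTotal": { "type": "integer" }
      }
    }
  }
}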
I am using an indexer to sync data from my SQL database to my Azure Search service. I have a field in my SQL view which contains XML data; the column contains a list of strings. The corresponding field in my Azure Search index is a Collection(Edm.String).
On checking the documentation, I found that the indexer does not convert XML (SQL) to Collection (Azure Search).
Is there any workaround for creating the Collection from the XML data?
P.S. I am extracting the data from a view, so I can change the XML to JSON if needed.
UPDATE on October 17, 2016: Azure Search now automatically converts a string coming from a database to a Collection(Edm.String) field if the data represents a JSON string array: for example, ["blue", "white", "red"].
Old response: great timing, we just added a new "field mappings" feature that allows you to do this. This feature will be deployed sometime early next week. I will post a comment on this thread when this is rolled out in all datacenters.
To use it, you indeed need to use JSON. Make sure your source column contains a JSON array, for example ["hello", "world"]. Then, update your indexer definition to contain the new fieldMappings property:
"fieldMappings" : [ { "sourceFieldName" : "YOUR_SOURCE_FIELD", "targetFieldName" : "YOUR_TARGET_FIELD", "mappingFunction" : { "name" : "jsonArrayToStringCollection" } } ]
NOTE: You'll need to use API version 2015-02-28-Preview to add fieldMappings.
HTH,
Eugene
I'm trying to query RavenDB using the HTTP client for all documents by type.
I would like a collection of the documents with a given type.
I understand that there might be a limitation where only the first 1024 documents are returned.
I am well under that number, and besides, it's for a proof of concept.
I am able to obtain all the documents using the following syntax:
http://localhost:8080/databases/{database name}/docs/
I see that I could use the #metadata field to get the documents of the type I want but I don't know the syntax.
Since the HTTP api allows you to query indexes, I attempted to write a static index.
When I wrote the index from Raven Studio, the index was not returning the documents of the type I wanted. It was giving zero results.
from doc in docs.MyType
select new { doc };
I also tried this:
from doc in docs
let Tag = doc["#metadata"]["Raven-Entity-Name"]
where Tag == "MyType"
select new { doc };
You can do it using:
http://localhost:8080/databases/{database name}/indexes/dynamic/CollectionName
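For example, if the collection name stored in the documents' Raven-Entity-Name metadata is MyType, the request from the question becomes:

http://localhost:8080/databases/{database name}/indexes/dynamic/MyType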
I am new to using MongoDB and am exploring the frameworks available for migrating from MySQL to MongoDB. So far, from my findings, Spring Mongo looks like the best fit for my requirements.
The only problem is that instead of using a DSL-based or abstract querying mechanism, I wish the framework allowed me to pass a plain JSON string as an argument to the methods exposed by the API (find, findOne), so that the query parameters could be written out to an external file (referenced by a key) and passed to the methods by reading and parsing the file at runtime. The framework should still be capable of mapping the results to the domain objects.
Is there a way in spring-mongo to achieve this? Or is there any other framework along the same lines?
You could use Spring Data to do that; just use the BasicQuery class instead of the Query class. Your code will look like the following:
/* Any arbitrary string that can be parsed to a DBObject */
Query q = new BasicQuery("{ filter : true }");
List<Entity> entities = this.template.find(q, Entity.class);
If you want more details:
http://static.springsource.org/spring-data/data-mongo/docs/current/reference/html/#mongo.query
http://static.springsource.org/spring-data/data-mongodb/docs/current/api/org/springframework/data/mongodb/core/query/BasicQuery.html
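Building on that, the query strings can live in an external file keyed by name, as the question asks. A minimal sketch, assuming a queries.json file of the form {"byUsername": "{ username: 'mickey' }"} (the file name and key are hypothetical), using Jackson for parsing:

import java.io.File;
import java.util.List;
import java.util.Map;

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.BasicQuery;

public class ExternalQueryRunner {

    private final MongoTemplate template;
    private final Map<String, String> queries;

    public ExternalQueryRunner(MongoTemplate template, File queryFile) throws Exception {
        this.template = template;
        // Load the key -> JSON-query-string map from the external file.
        this.queries = new ObjectMapper().readValue(
                queryFile, new TypeReference<Map<String, String>>() {});
    }

    public <T> List<T> find(String queryKey, Class<T> entityClass) {
        // BasicQuery accepts the raw JSON string; Spring Data still maps
        // the results onto the given domain class.
        return template.find(new BasicQuery(queries.get(queryKey)), entityClass);
    }
}

Usage would then be along the lines of runner.find("byUsername", User.class).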
Well, I managed to find this one in the Spring Data MongoOperations...
String jsonCommand = "{username: 'mickey'}";
MongoOperations mongoOps = /* obtain a MongoOperations implementation */;
mongoOps.executeCommand(jsonCommand);
It returns an instance of CommandResult that encapsulates the result.
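A minimal usage sketch (CommandResult.ok() comes from the underlying MongoDB Java driver):

CommandResult result = mongoOps.executeCommand(jsonCommand);
if (result.ok()) {
    // CommandResult is a DBObject, so the raw result document can be
    // inspected or printed directly.
    System.out.println(result);
}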