Querying for all fields in a document - ravendb

I want to gather all of the data within a database in RavenDB, similar to SELECT * in SQL.
from #all_docs
select {
    Name: Value
}
Instead of typing in all possible names and their corresponding values, I am wondering if there is an "all" character.

from #all_docs
should return all the documents
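For reference, a minimal sketch under the assumption (not confirmed above) that omitting the select clause returns every field, both for all documents and for a single collection (Orders is a hypothetical collection name):

from #all_docs

from Orders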

Related

Abstract view of how distinct queries are implemented in NoSQL

I am developing a system using Google Datastore, where there is a Kind, Posts, which has 2 properties:
1. message (string)
2. hashtags (list)
I want to query the distinct hashtags together with their counts. For example, say the posts are:
[
    {
        "message": "msg1",
        "tags": ["abc", "cde", "efr"]
    },
    {
        "message": "msg2",
        "tags": ["abc", "efgh", "efk"]
    },
    {
        "message": "msg3",
        "tags": ["abc", "efgh", "efr"]
    }
]
The output should be
{
    "abc": 3,
    "cde": 1,
    "efk": 1,
    "efgh": 2,
    "efr": 2
}
But with the Datastore NoSQL implementation I can't query this directly; I would have to load all the posts and compute the distinct tags myself, which would be time-consuming.
I have seen a distinct function, db.collection.distinct(), which I think might address this problem. If this has to be done on any NoSQL database, what would be the optimal solution?
Unfortunately, projection queries with 'distinct on' will only return a single result per distinct value (https://cloud.google.com/datastore/docs/concepts/queries#projection_queries). It will not provide a count of each distinct value. You'll need to do the count yourself, but you can use a projection query to save cost by only returning the tag values instead of the full entities.
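A rough sketch of that approach with the Node.js client (the kind and property names come from the question; the client setup and the counting logic are illustrative, not part of the answer above):

import { Datastore } from '@google-cloud/datastore';

const datastore = new Datastore();

async function countTags(): Promise<Record<string, number>> {
    // Projection query: only the 'tags' values come back, which is cheaper
    // than fetching the full Posts entities.
    const query = datastore.createQuery('Posts').select('tags');
    const [rows] = await datastore.runQuery(query);

    const counts: Record<string, number> = {};
    for (const row of rows) {
        // A projection on an array property yields one row per array value
        // (assumption based on Datastore projection-query semantics).
        const tag = row.tags as string;
        counts[tag] = (counts[tag] ?? 0) + 1;
    }
    return counts;
}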

How do you index an array inside a JSON with an Oracle 12c query?

I have a table "move" with one column "move_doc", which is a CLOB. The JSON stored inside has the structure:
{
    "moveid": "123",
    "movedate": "xyz",
    "submoves": [
        {
            "submoveid": "1",
            ...
        },
        {
            "submoveid": "2",
            ...
        }
    ]
}
I know I can run an Oracle 12c query to access the submoves list with:
select move.move_doc.submoves from move move
How do I access particular submoves of the array? And the attributes inside a particular submove?
You have to use the Oracle functions json_query and/or json_value, like this:
SELECT json_value(move_doc, '$.submoves[0].submoveid' RETURNING NUMBER) FROM move;
returns 1.
SELECT json_query(move_doc, '$.submoves[1]') FROM move;
would return the second JSON element, i.e. something like
{
    "submoveid": "2",
    ...
}
json_value is used to retrieve a scalar value, while json_query is used to retrieve JSON values. You might also want to have a look at json_table, which returns an SQL result table and thus can be used in joins.
See this Oracle Doc for more examples
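For illustration, a json_table sketch against the table from the question (the column type and size are assumptions):

SELECT jt.submoveid
FROM   move m,
       json_table(m.move_doc, '$.submoves[*]'
         COLUMNS (
           submoveid VARCHAR2(10) PATH '$.submoveid'
         )) jt;

This returns one row per element of the submoves array, which can then be filtered or joined like any other SQL result.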
Beda here from the Oracle JSON team.
We have added a new multi-value index in release 21c, allowing you to index values from a JSON array. Obviously, 21c is brand new and you want to know how to do this in older releases: functional indexes (using the JSON_Value function) are limited to a single value per JSON document and therefore cannot index array values. But there is a 'JSON search index' which indexes your entire JSON document, and therefore also the values in the array. Another solution is to use a materialized view using JSON_Table. This will expand the array values into separate rows. Then you can add a regular B-Tree index on that column.
Sample code here:
JSON indexing with functional indexes and JSON search index
https://livesql.oracle.com/apex/livesql/file/content_HN507PELCEEJGVNW4Q61L34DS.html
JSON and materialized views
https://livesql.oracle.com/apex/livesql/file/content_HYMB1YBP4CPMG6T6MXY5G9X5L.html
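A minimal sketch of that materialized-view approach (object names are illustrative and not taken from the linked samples):

CREATE MATERIALIZED VIEW move_submoves_mv
BUILD IMMEDIATE
AS
SELECT jt.submoveid
FROM   move m,
       json_table(m.move_doc, '$.submoves[*]'
         COLUMNS (submoveid VARCHAR2(10) PATH '$.submoveid')) jt;

CREATE INDEX move_submoves_idx ON move_submoves_mv (submoveid);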
From what I've seen, in Oracle you can index the whole array as a single index entry, but not individual elements of an array.
NoSQL databases like MongoDB, Couchbase, and Cassandra have array/collection indexes which can index individual elements, or fields of objects within an array, and query them.

Solr query with dynamic field

I have documents with the following link.* dynamic fields:
"docs": [{
"id":"id1"
"link.1.text":"mytext"
"link.1.nImg":1
"link.2.text":"mytext"
"link.2.nImg":2
}, {
"id":"id2"
"link.1.text":"mytext"
"link.1.nImg":1
"link.2.text":"mytext"
"link.2.nImg":1
}]
How can I write a query like link.*.text:"mytext" or link.*.nImg:2?
You can't do that in Solr.
Dynamic fields allow Solr to index fields that you did not explicitly define in your schema. This is useful if you discover you have forgotten to define one or more fields. Dynamic fields can make your application less brittle by providing some flexibility in the documents you can add to Solr.
In a query you need to list the exact field name, so dynamic fields only give you flexibility at index time.
Some more info - https://cwiki.apache.org/confluence/display/solr/Dynamic+Fields
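For illustration, a query over the example documents has to enumerate the concrete field names (a sketch assuming only link.1.* and link.2.* exist):

q=link.1.text:"mytext" OR link.2.text:"mytext"
q=link.1.nImg:2 OR link.2.nImg:2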

How to query a specific collection from a RavenDB?

In one database, I am storing two separate document types - CumulativeSprintData and Features. I'm trying to query them from JavaScript. Right now I'm just using the default:
http://servername:8080/databases/sprintprogress/indexes/dynamic?
The problem is that this default query pulls in documents of both types. How do I specify which document type I want to pull down?
Thanks!
You can use:
http://servername:8080/databases/sprintprogress/indexes/dynamic/Features
http://servername:8080/databases/sprintprogress/indexes/dynamic/CumulativeSprintDatas
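A small sketch of calling that endpoint from browser JavaScript (the fetch call and the Results property are assumptions about the REST response shape, not part of the answer above):

const url = 'http://servername:8080/databases/sprintprogress/indexes/dynamic/Features';

const response = await fetch(url);
const data = await response.json();
// RavenDB's dynamic index endpoint typically wraps the matching documents
// in a Results array.
console.log(data.Results);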

How to get elasticsearch to perform similar to SQL 'LIKE'

If you use a SQL LIKE statement to query data, it will return data even if it's only partially matched. For instance, if I'm searching for food and there's an item in my db called "raisins", when using SQL the query would return "raisins" even if my search only contained "rai". In Elasticsearch, the query won't return a record unless the entire name (in this case "raisins") is specified. How can I get Elasticsearch to behave similarly to the SQL statement? I'm using Rails 3.1.1 and PostgreSQL. Thanks!
When creating the index for your model in Elasticsearch, use a tokenizer on the index that fulfils this requirement, e.g.
tokenizer: {
    :name_tokenizer => {type: "edgeNGram", max_gram: 100, min_gram: 3, side: "front"}
}
This will create tokens of size 3 to 100 from your fields, and since side is "front" it will match from the start of the term. You can get more details here: http://www.slideshare.net/clintongormley/terms-of-endearment-the-elasticsearch-query-dsl-explained
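For context, a sketch of how that tokenizer might be wired into a model with the Tire gem that was common on Rails 3.1 (the model, analyzer, and field names, and the Tire usage itself, are assumptions, not part of the answer above):

class Product < ActiveRecord::Base
  include Tire::Model::Search
  include Tire::Model::Callbacks

  settings analysis: {
    tokenizer: {
      name_tokenizer: { type: "edgeNGram", max_gram: 100, min_gram: 3, side: "front" }
    },
    analyzer: {
      name_analyzer: { type: "custom", tokenizer: "name_tokenizer", filter: ["lowercase"] }
    }
  } do
    mapping do
      # The edge n-gram analyzer must be applied to the field you search on,
      # otherwise a partial term like "rai" will not match "raisins".
      indexes :name, analyzer: "name_analyzer"
    end
  end
end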