Ambiguous MongoDB query matches everything

Why does this query match everything:
{result: $and[{$exists:true}, {$ne: 0}]}
{result:{$exists:true}, result:{$ne:0}} (this too as suggested)
The idea was to match documents that have a key "result" and where result is not equal to zero. Why does this match a document that only has an _id?
edit:
What works as expected is the following:
{ $and: [ { result:{$exists:true}}, {result:{$ne: 0}}]}
The question is still the same: why do those queries behave like this?

try:
{result:{$exists:true}, result:{$ne:0}}
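For what it's worth: in a JavaScript/JSON object literal the duplicate result key means the second entry overwrites the first, so the one-object form above collapses to just {result: {$ne: 0}}, and $ne also matches documents where the field is missing entirely. Combining both operators under a single result key (or using the explicit $and from the edit) avoids that. A minimal mongo shell sketch; the collection name coll is only a placeholder:
// both operators under one key act as an implicit AND on that field
db.coll.find({ result: { $exists: true, $ne: 0 } })
// equivalent to the explicit $and form that works as expected
db.coll.find({ $and: [ { result: { $exists: true } }, { result: { $ne: 0 } } ] })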

Azure Logic Apps: Set condition to False when SQL query returns no rows of data

How can I conditionally test the output from an Execute SQL Query action to make sure it returns some rows of data?
In my example, if the query returns no rows I don't want it to send an email; I want to do something else instead. What is the test?
Thanks for your time
I tested this: if the query returns no rows, the response body looks like this:
{
    "OutputParameters": {},
    "ResultSets": {}
}
So you could add a Condition that checks whether #{body('Execute_a_SQL_query')['OutputParameters']} is equal to {}. If true, do the things you want. You could set this up in Code view mode.
Hope this is what you want.
This will work with Execute a SQL query (V2).
What it does is take the ResultSets value and convert it to a string, which prevents a null error on the length function. An empty result set is {}, so its string length is 2; if the length is 2, the result is empty.
"expression": {
"and": [
{
"equals": [
"#length(string(body('Execute_a_SQL_query_(V2)')?['ResultSets']))",
2
]
}
]
}
I am using something similar to this in an Until loop, which runs until the length is zero. I guess you could do the same?
#equals(length(body('Execute_a_SQL_query')?['value']), 0)
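For orientation, a rough sketch of how that expression could sit inside a Condition action in code view; the action name Check_for_rows and the empty branches are placeholders, and expressions here are written with the usual leading @:
"Check_for_rows": {
    "type": "If",
    "expression": {
        "and": [
            {
                "equals": [
                    "@length(string(body('Execute_a_SQL_query_(V2)')?['ResultSets']))",
                    2
                ]
            }
        ]
    },
    "actions": {},
    "else": {
        "actions": {}
    },
    "runAfter": {
        "Execute_a_SQL_query_(V2)": [ "Succeeded" ]
    }
}
With this condition, the true branch is the "no rows" case (length 2) and the else branch handles results that contain data.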

Having problems with ordering by numeric value

I have JSON data in my collection similar to the following example. There is an icCount property with a numeric value. Now when I issue a query ordered by icCount, it is sorted as text and not by numeric value. The index is automatic here. Any idea what is wrong? (Running RavenDB 4.1.1.)
{
"enabled": true,
"description": "",
"icCount": 3865,
"companyname": "ABC Data"
}
OK, so I just found it myself. The help at https://ravendb.net/docs/article-page/4.1/csharp/indexes/querying/sorting states that I should specify the ordering mode (type). For my case I can simply rewrite it to: order by icCount as long desc ... note the as long in the clause. This way my data list is ordered correctly.
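For completeness, the full RQL query would then look roughly like this; the collection name Companies is just a guess based on the sample document:
from Companies
order by icCount as long desc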

Boosting individual elasticsearch indices to have preference in results

I am trying to boost certain indices in my Elasticsearch query. Right now my query looks like this:
var query = {
    "query": {
        "query_string": {
            "fields": ["FirstName", "LastName"],
            "query": "Hank Hill",
            "default_operator": "AND"
        }
    }
};
var boosted_indices = {
    "index_A": 1.0,
    "index_B": 1.0,
    "index_C": 10.0
};
if (boosted_indices) {
    query["indices_boost"] = boosted_indices;
}
// stringify and send query in an http.get request
I know that my query without boosting any indices works as I expect. However, I am still getting a lot of results from "index_A" in my query results, rather than the heavily boosted index_C. I know that there should be a similar number of matching results in A and C, so the issue must be that I am not boosting the query correctly.
Did I set up my query JSON incorrectly? The tutorial I linked to did not give much context.
One other thing I noticed: the "_score" field for the returned documents is null for all of them. Might this have something to do with my documents not being boosted according to the index they came from?
I hope you are not using the sort parameter in the query. That could be the reason _score is null and you are not getting the expected results.
Does this help?
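If a sort is being applied, here is a rough sketch of the two usual ways around it; track_scores and the LastName.keyword sort field are assumptions on my part, not something from the question:
// Option 1: drop the field sort entirely - without one, hits are ranked
// by _score, so indices_boost is reflected in the ordering.
// Option 2: keep the field sort but have Elasticsearch still compute scores;
// the field sort still decides the ordering, _score is just filled in.
var query = {
    "query": {
        "query_string": {
            "fields": ["FirstName", "LastName"],
            "query": "Hank Hill",
            "default_operator": "AND"
        }
    },
    "indices_boost": { "index_A": 1.0, "index_B": 1.0, "index_C": 10.0 },
    "sort": [{ "LastName.keyword": "asc" }],
    "track_scores": true
};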

Neo4j 2.0.1 Cypher performance difference between using start and match with a predicate

Started using Cypher about a week ago (really like it). In the 'browser' interface I'm running two queries:
1) start n=node:Node(name="foo") match (n)-[r*..4]-(m) return n,m
2) match (n{name:"foo"})-[r*..4]-(m) return n,m
The first query returns almost immediately; the second has been running for more than an hour and counting. Naively I would think these would be equivalent, but clearly they are not. I ran a 'smaller' version of both (path only up to 1) in the neo4j shell so I could profile them.
profile start n=node:Node(name="foo") match (n)-[r*..1]-(m) return n,m;
ColumnFilter(symKeys=["m", "n", " UNNAMED51", "r"], returnItemNames=["n", "m"], _rows=4, _db_hits=0)
TraversalMatcher(start={"expr": "Literal(foo)", "identifiers": ["n"], "key": "Literal(name)",
"idxName": "Node", "producer": "NodeByIndex"}, trail="(n)-[*1..1]-(m)", _rows=4, _db_hits=5)
profile match (n{name:"foo"})-[r*..1]-(m) return n,m
ColumnFilter(symKeys=["n", "m", " UNNAMED33", "r"], returnItemNames=["n", "m"], _rows=4, _db_hits=0)
Filter(pred="Property(n,name(0)) == Literal(foo)", _rows=4, _db_hits=196870)
TraversalMatcher(start={"producer": "AllNodes", "identifiers": ["m"]},
trail="(m)-[*1..1]-(n)", _rows=196870, _db_hits=396980)
From other Stack Overflow questions I understand db_hits is good to look at, so it looks like the second query has basically done a linear scan (my db is almost 400k nodes). This seems to be confirmed by the "producer" value of "AllNodes" instead of "NodeByIndex".
Obviously I need to specify the match predicate differently so that it hits the index. The index is called 'Node' on the 'name' property. My Googling and Stack Overflow searching are failing me: how do I specify the condition in the match so that it hits the index?
Update:
After some poking around, it appears I'm using a 'legacy' index and then trying to hit it with the new-style (no start clause) query... (kind of extrapolating here). So I can do the following:
create index ON :label(name)
and that would provide an index for a particular label on the name property, but I really want an index (I guess a non-legacy index) on ALL the node names. I have use cases where that's important (the user may not know the label but does know the name).
Any suggestions or guidance is much appreciated.
Right now there is no global schema index, so you would probably want to use a generic label like Entity or Node and create an index on it like this:
create index on :Entity(name);
And add that Entity label to all your nodes.
match (n) set n:Entity;
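Tying it back to the original query, a sketch of how the new-style match could then be written so the planner can use that index (label name as in the answer above):
// with every node labelled :Entity and :Entity(name) indexed,
// the label plus property in the pattern lets the index be used
match (n:Entity {name: "foo"})-[r*..4]-(m) return n,m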

ActiveRecord search model using LIKE only returning exact matches

In my rails app I am trying to search the Users model based on certain conditions.
In particular, I have a location field which is a string and I want to search this field based on whether it contains the search string. For example, if I search for users with location 'oxford' I want it to also return users with a variation on that, like 'oxford, england'.
Having searched the web for an answer, it seems that I should be using the LIKE keyword in the ActiveRecord query, but for me this only returns exact matches.
Here is a snippet of my code from the search method
conditions_array = []
conditions_array << [ 'lower(location) LIKE ?', options[:location].downcase ] if !options[:location].empty?
conditions = build_search_conditions(conditions_array)
results = User.where(conditions)
Am I doing something wrong? Or is using LIKE not the right approach to achieving my objective?
You need to use LIKE '%oxford%'.
% matches any number of characters, even zero characters.
conditions_array << [ 'lower(location) LIKE ?', "%#{options[:location].downcase}%" ] if !options[:location].empty?
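For reference, a minimal sketch of the same idea without the build_search_conditions helper, assuming a User model with a string location column as in the question:
# wrap the search term in % wildcards so LIKE does a substring match
def search_users(options)
  scope = User.all
  if options[:location].present?
    scope = scope.where('lower(location) LIKE ?', "%#{options[:location].downcase}%")
  end
  scope
end
search_users(location: 'oxford')  # also matches "Oxford, England"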