Mongoengine How to Filter with ISODate - mongodb-query

I'm trying to convert a query string into MongoEngine. The filter looks at an array to limit results to a lifecycle match, where enable_date is either null or less than or equal to an ISODate:
filter = { "integration" : { "$elemMatch" : { "lifecycle" : lifecycle, "$or" : [ { "enable_date" : None}, {"enable_date" : { "$lte" : ISODate(f"{as_of_date}")} } ] } }}
subs = Collection.objects(__raw__=filter)

Use the Q class (queryset visitor) from MongoEngine:
from datetime import datetime
from mongoengine.queryset.visitor import Q
subs = Collection.objects(Q(lifecycle=lifecycle) & (Q(enable_date=None) | Q(enable_date__lte=datetime.now())))
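Note that the raw filter in the question matches lifecycle and enable_date inside the integration array via $elemMatch, and that ISODate() is a mongo shell helper that does not exist in Python; in a __raw__ query you pass a datetime object instead. A hedged sketch along those lines, assuming as_of_date is an ISO 8601 string:
from datetime import datetime
from mongoengine.queryset.visitor import Q

as_of = datetime.fromisoformat(as_of_date)  # assumption: as_of_date looks like "2021-06-01T00:00:00"

# keep the $elemMatch semantics of the original filter, but hand the driver a
# real datetime instead of the shell-only ISODate() helper
filter = {
    "integration": {
        "$elemMatch": {
            "lifecycle": lifecycle,
            "$or": [
                {"enable_date": None},
                {"enable_date": {"$lte": as_of}},
            ],
        }
    }
}
subs = Collection.objects(__raw__=filter)

# alternatively, with Q objects on the embedded fields (note: this does not
# enforce $elemMatch, so the two conditions may match different array elements):
# subs = Collection.objects(
#     Q(integration__lifecycle=lifecycle)
#     & (Q(integration__enable_date=None) | Q(integration__enable_date__lte=as_of))
# )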

Related

Due to limitations of the com.mongodb.BasicDocument, you can't add a second ''?

I have a requirement where PhoneNumber and emailID are passed as search parameters together, and I have to query MongoDB for a condition like the one shown below:
Where (PhoneNumber = "primaryPhoneNumber" or "secondaryPhoneNumber") and (emailId = "primaryEmailid" or "secondaryEmailID")
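Expressed as a single raw MongoDB filter, that condition combines the two $or groups under one $and. A pymongo-style sketch (the field names follow the code below; the values and the lspCode are placeholders):
# hypothetical placeholder values
phone = "9999999999"
email = "user@example.com"

filter_doc = {
    "lspCode": "LSP001",  # placeholder
    "$and": [
        {"$or": [
            {"userInfo.primaryMobileNumber": phone},
            {"userInfo.secondaryMobileNumber": phone},
        ]},
        {"$or": [
            {"userInfo.primaryEmailId": email},
            {"userInfo.secondaryEmailId": email},
        ]},
    ],
}
# e.g. collection.find(filter_doc) with pymongo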
So when I try to frame my query somewhat like below, I get this exception:
Due to limitations of the com.mongodb.BasicDocument, you can't add a second '' criteria. Query already contains '{ "$or" : [{ "createdByEmailId" : "brokermanager_lion#perfios.com"}, { "clientRM" : { "$in" : ["brokermanager_lion#perfios.com"]}}, { "productRM" : { "$in" : ["brokermanager_lion#perfios.com"]}}]}'
if (StringUtils.isNotBlank(leadsFilters.getPhoneNumber())) {
    Criteria andCriteria = new Criteria();
    query.addCriteria(andCriteria.andOperator(Criteria.where("")
            .orOperator(Criteria.where("userInfo.primaryMobileNumber").is(leadsFilters.getPhoneNumber()),
                    Criteria.where("userInfo.secondaryMobileNumber").is(leadsFilters.getPhoneNumber()))));
}
query.addCriteria(Criteria.where("lspCode").is(lspCode));
if (StringUtils.isNotBlank(leadsFilters.getEmailId())) {
    Criteria andEmailCriteria = new Criteria();
    query.addCriteria(Criteria.where("")
            .orOperator(Criteria.where("userInfo.primaryEmailId").is(leadsFilters.getEmailId()),
                    Criteria.where("userInfo.secondaryEmailId").is(leadsFilters.getEmailId())));
}

Nested Conditional Properties for Local file references not working in python-jsonschema

My RefResolver and Validator script is
from jsonschema import RefResolver
from jsonschema.validators import validator_for
from tests.automation.io.helpers.loader import load_schema
base = load_schema('base.schema.json') # {"$schema": "http://json-schema.org/draft-07/schema#" }
definitions = load_schema('defination.schema.json') # https://jsonschema.dev/s/FZDbO
schema = load_schema('update.schema.json') # https://jsonschema.dev/s/lvLFa
schema_store = {
    base.get('$id', 'base.schema.json'): base,
    definitions.get('$id', 'defination.schema.json'): definitions,
    schema.get('$id', 'update.schema.json'): schema,
}
resolver = RefResolver.from_schema(base, store=schema_store)
Validator = validator_for(base)
validator = Validator(schema, resolver=resolver)
data = {
    "common_data": {
        "os_ip": "127.0.0.1",
        "os_user": "root",
        "os_pwd": "hello",
        "remote_os": "Windows"
    },
    "dup_file": "ad7a.exe"
}
validator.validate(data)
My JSON Schema looks like
base.schema.json => {"$schema": "http://json-schema.org/draft-07/schema#" }
defination.schema.json => https://jsonschema.dev/s/FZDbO
update.schema.json => https://jsonschema.dev/s/lvLFa
I am getting this error: jsonschema.exceptions.ValidationError: 'ad7a.exe' does not match '^(.*.)(bin)$'
I have tested the same schemas at https://json-schema.hyperjump.io/ and they validate perfectly fine there, so I suspect the issue is specific to python-jsonschema.
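For reference, a self-contained sketch of how the resolver and store are meant to fit together; the inline schemas here are hypothetical stand-ins for the three files, not the real ones, and the point is that the store keys must match the URIs that the $ref values resolve to:
from jsonschema import RefResolver
from jsonschema.validators import validator_for

# hypothetical stand-in for defination.schema.json
definitions = {
    "$id": "defination.schema.json",
    "definitions": {
        "exe_file": {"type": "string", "pattern": "^(.*\\.)(exe)$"}
    },
}
# hypothetical stand-in for update.schema.json
schema = {
    "$id": "update.schema.json",
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "properties": {
        "dup_file": {"$ref": "defination.schema.json#/definitions/exe_file"}
    },
}
store = {definitions["$id"]: definitions, schema["$id"]: schema}

resolver = RefResolver.from_schema(schema, store=store)
Validator = validator_for(schema)
Validator(schema, resolver=resolver).validate({"dup_file": "ad7a.exe"})  # passes: the $ref resolves through the store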

React Native - Using promises when fetching and returning them to components

I want to display related data in React Native. I have two API responses.
response 1
{
"total": 2,
"data" : [
{
"date" : "2020-12-01",
"time" : "08:00:00"
},
{
"date" : "2020-12-02",
"time" : "09:00:00"
}
]
}
response 2, where date is a parameter
date : 2020-12-01
{
"total": 2,
"data" : [
{
"date" : "2020-12-01",
"description" : "bla bla bla"
},
{
"date" : "2020-12-01",
"description" : "vla vla vla"
}
]
}
date : 2020-12-02
{
"total": 1,
"data" : [
{
"date" : "2020-12-02",
"description" : "cla cla cla"
}
]
}
How do I use promises with fetch, and how do components return them,
so that the content descriptions can be displayed like this?
Are you asking how to use promises with Fetch? Or how to take the results of those promises and use them to create a visual component?
It doesn't look like you've put much work into this question, and it's quite wide-ranging, covering everything from setting up components to creating fetch requests against an API and formatting the results into visual components.
I'd consider starting here for your fetch request:
https://reactnative.dev/docs/network#making-requests
Then look at how they take the text and place it on a screen here:
https://reactnative.dev/docs/network#making-requests
EDIT:
Per the comments below.
You would most likely want to store the results of your request in some form of local state, and pass that state to your FlatList or similar, for instance:
import React, { useEffect, useState } from 'react';
import { FlatList, Text } from 'react-native';

const PromiseExample = () => {
  const [listItems, setListItems] = useState([]);

  useEffect(() => {
    async function fetchData() {
      // fetch() resolves to a Response; .json() is a second promise to await
      const dataResponse = await fetch('localhost:3000/data');
      const data = await dataResponse.json();
      const otherDataResponse = await fetch('localhost:3000/otherData');
      const otherData = await otherDataResponse.json();

      let joinedData = [];
      // at this point you would join your two data structures in whatever way
      // you need to and push the joined data into joinedData
      setListItems(joinedData);
    }
    fetchData(); // remember to actually invoke the async function
  }, []);

  return (
    <FlatList
      data={listItems}
      renderItem={({ item }) => <Text>{item.description}</Text>}
    />
  );
};

How to convert `json query` to `sql query`?

I have an AngularJS app designed to build queries in JSON format; those queries are built from many tables, fields, and operators like "join", "inner", "where", "and", "or", "like", etc.
The AngularJS app sends these JSON queries to my Django REST Framework backend, so I need to translate each JSON query into a SQL query in order to run raw SQL, with prior validation of which tables/models are allowed in the select.
I don't need a full JSON-query-to-SQL-query translation; I just want to translate selects with support for clauses like "where", "and", "or", "group_by".
For a better understanding of my question, here are the relevant snippets:
{
    "selectedFields": {
        "orders": {
            "id": true,
            "orderdate": true
        },
        "customers": {
            "customername": true,
            "customerlastname": true
        }
    },
    "from": ["orders"],
    "inner_join": {
        "customers": {
            "on_eq": [
                {
                    "orders": {
                        "customerID": true
                    }
                },
                {
                    "customers": {
                        "customerID": true
                    }
                }
            ]
        }
    }
}
SELECT
Orders.OrderID,
Customers.CustomerName,
Customers.CustomerLastName,
Orders.OrderDate
FROM Orders
INNER JOIN Customers
ON Orders.CustomerID=Customers.CustomerID;
I took the example from: http://www.w3schools.com/sql/sql_join.asp
Please note that I am not trying to serialize any SQL query output to JSON.
I found a NodeJS package (https://www.npmjs.com/package/json-sql) that converts JSON queries to SQL queries, so I made a NodeJS script and then created a Python class to call the NodeJS script.
With this approach I just need to send all AngularJS queries following this syntax (https://github.com/2do2go/json-sql/tree/master/docs#join).
NodeJS script.
// Use:
//$ nodejs reporter/services.js '{"type":"select","fields":["a","b"],"table":"table"}'
var jsonSql = require('json-sql')();
var json_query = JSON.parse(process.argv[2]);
var sql = jsonSql.build(json_query);
console.log(sql.query);
DRF class:
from unipath import Path
import json
from django.conf import settings
from Naked.toolshed.shell import muterun_js
full_path = str(Path(settings.BASE_DIR, "reporter/services.js"))
class JSONToSQL:
    def __init__(self, json_):
        self.json = json_
        self.sql = None
        self.dic = json.loads(json_)
        self.to_sql()

    def to_sql(self):
        response = muterun_js('%s \'%s\'' % (full_path, self.json))
        if response.exitcode == 0:
            self.sql = str(response.stdout).replace("\n", "")
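A hypothetical usage sketch of the class above, reusing the example query from the comment in the NodeJS script:
import json

query = {"type": "select", "fields": ["a", "b"], "table": "table"}
converter = JSONToSQL(json.dumps(query))
print(converter.sql)  # prints the SQL string generated by json-sql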
You could write some custom JS to parse the object like this:
// obj is the JSON query object from the question
var selectedfields = '';
var fields = Object.keys(obj.selectedFields);
for (var i = 0; i < fields.length; i++) {
    var subfields = Object.keys(obj.selectedFields[fields[i]]);
    for (var j = 0; j < subfields.length; j++) {
        // build a comma-separated list of table.field names
        if (selectedfields !== '') {
            selectedfields = selectedfields + ', ';
        }
        selectedfields = selectedfields + fields[i] + '.' + subfields[j];
    }
}
var from = '';
for (var i = 0; i < obj.from.length; i++) {
    if (from === '') {
        from = obj.from[i];
    } else {
        from = from + ',' + obj.from[i];
    }
}
var output = 'SELECT ' + selectedfields + ' FROM ' + from;
document.getElementById('output').innerHTML = output;
Or in Angular you would use $scope.output = ... from within a controller, perhaps.
jsfiddle here: https://jsfiddle.net/jsheridan390/fpbp6cz0/1/

Translate SQL query to MongoDB mapreduce

I have the following collection in mongo:
{
"_id" : ObjectId("506217890b50f300d020d237"),
"o_orderkey" : NumberLong(1),
"o_orderstatus" : "O",
"o_totalprice" : 173665.47,
"o_orderdate" : ISODate("1996-01-02T02:00:00Z"),
"o_orderpriority" : "5-LOW",
"o_clerk" : "Clerk#000000951",
"o_shippriority" : 0,
"o_comment" : "blithely final dolphins solve-- blithely blithe packages nag blith",
"customer" : {
"c_custkey" : NumberLong(36901),
"c_name" : "Customer#000036901",
"c_address" : "TBb1yDZcf 8Zepk7apFJ",
"c_phone" : "23-644-998-4944",
"c_acctbal" : 4809.84,
"c_mktsegment" : "AUTOMOBILE",
"c_comment" : "regular accounts after the blithely pending dependencies play blith",
"c_nationkey" : {
"n_nationkey" : NumberLong(13),
"n_name" : "JORDAN",
"n_comment" : "blithe, express deposits boost carefully busy accounts. furiously pending depos",
"n_regioin" : {
"r_regionkey" : NumberLong(4),
"r_name" : "MIDDLE EAST",
"r_comment" : "furiously unusual packages use carefully above the unusual, exp"
}
}
},
"o_lineitem" : [
{
"l_linenumber" : 1,
"l_quantity" : 17,
"l_extendedprice" : 21168.23,
"l_discount" : 0.04,
"l_tax" : 0.02,
"l_returnflag" : "N",
"l_linestatus" : "O",
"l_shipdate" : ISODate("1996-03-13T03:00:00Z"),
"l_commitdate" : ISODate("1996-02-12T03:00:00Z"),
"l_receiptdate" : ISODate("1996-03-22T03:00:00Z"),
"l_shipinstruct" : "DELIVER IN PERSON",
"l_shipmode" : "TRUCK",
"l_comment" : "blithely regular ideas caj",
"partsupp" : {
"ps_availqty" : 6157,
"ps_supplycost" : 719.17,
"ps_comment" : "blithely ironic packages haggle quickly silent platelets. silent packages must have to nod. slyly special theodolites along the blithely ironic packages nag above the furiously pending acc",
"ps_partkey" : {
"p_partkey" : NumberLong(155190),
"p_name" : "slate lavender tan lime lawn",
"p_mfgr" : "Manufacturer#4",
"p_brand" : "Brand#44",
"p_type" : "PROMO BRUSHED NICKEL",
"p_size" : 9,
"p_container" : "JUMBO JAR",
"p_retailprice" : 1245.19,
"p_comment" : "regular, final dol"
},
"ps_suppkey" : {
"s_suppkey" : NumberLong(7706),
"s_name" : "Supplier#000007706",
"s_address" : "BlHq75VoMNCoU380SGiS9fTWbGpeI",
"s_phone" : "33-481-218-6643",
"s_acctbal" : -379.71,
"s_comment" : "carefully pending ideas after the instructions are alongside of the dolphins. slyly pe",
"s_nationkey" : {
"n_nationkey" : NumberLong(23),
"n_name" : "UNITED KINGDOM",
"n_comment" : "fluffily regular pinto beans breach according to the ironic dolph",
"n_regioin" : {
"r_regionkey" : NumberLong(3),
"r_name" : "EUROPE",
"r_comment" : "special, bold deposits haggle foxes. platelet"
}
}
}
}
},
.
.
.
]
}
And I'm trying to translate the following SQL query:
select
s_acctbal,
s_name,
n_name,
p_partkey,
p_mfgr,
s_address,
s_phone,
s_comment
from
part,
supplier,
partsupp,
nation,
region
where
p_partkey = ps_partkey
and s_suppkey = ps_suppkey
and p_size = 15
and p_type like '%BRASS'
and s_nationkey = n_nationkey
and n_regionkey = r_regionkey
and r_name = 'EUROPE'
and ps_supplycost = (
select
min(ps_supplycost)
from
partsupp, supplier,
nation, region
where
p_partkey = ps_partkey
and s_suppkey = ps_suppkey
and s_nationkey = n_nationkey
and n_regionkey = r_regionkey
and r_name = 'EUROPE'
)
order by
s_acctbal desc,
n_name,
s_name,
p_partkey;
My function that I was trying:
db.runCommand({
    mapreduce: "ordersfull",
    query: {},
    map: function Map() {
        var pattern = /BRASS$/g;
        for (var i in this.o_lineitem) {
            var p_size = this.o_lineitem[i].partsupp.ps_partkey.p_size;
            var p_type = this.o_lineitem[i].partsupp.ps_partkey.p_type;
            var region = this.o_lineitem[i].partsupp.ps_suppkey.s_nationkey.n_regioin.r_name;
            if (p_size == 15 && p_type.match(pattern) != null && region == "EUROPE") {
                emit("", {
                    s_acctbal: this.o_lineitem[i].partsupp.ps_suppkey.s_acctbal,
                    s_name: this.o_lineitem[i].partsupp.ps_suppkey.s_name,
                    n_name: this.o_lineitem[i].partsupp.ps_suppkey.s_nationkey.n_name,
                    p_partkey: this.o_lineitem[i].partsupp.ps_partkey.p_partkey,
                    p_mfgr: this.o_lineitem[i].partsupp.ps_partkey.p_mfgr,
                    s_address: this.o_lineitem[i].partsupp.ps_suppkey.s_address,
                    s_phone: this.o_lineitem[i].partsupp.ps_suppkey.s_phone,
                    s_comment: this.o_lineitem[i].partsupp.ps_suppkey.s_comment
                });
            }
        }
    },
    reduce: function(key, values) {
    },
    out: 'query002'
});
In my result I got a null value for all entries. What happened?
You can debug your MapReduce output by including print() or printjson() statements in the JavaScript functions. The resulting print output will be saved in your MongoDB log.
There are several issues with your MapReduce:
The for..in loop will not work as you expect; you should instead use array.forEach(..).
If you are iterating an array, you will already have a reference to the array item and should not use array[index].
You should emit() with a unique key name if you don't want unexpected grouping.
Your reduce() should return a value matching the structure of the emitted data.
You should ideally use a query parameter to limit the documents that need to be inspected.
Given you appear to just be iterating documents without doing any grouping or reduce(), you may find it easier to fetch the documents and perform the same matching in your application code.
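For instance, a minimal pymongo sketch of that application-side approach (the connection details are placeholders; the field paths follow the document structure in the question):
import re
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder connection
collection = client["test"]["ordersfull"]          # placeholder database name

pattern = re.compile(r"BRASS$")
results = []
for doc in collection.find({}, {"o_lineitem": 1}):
    for item in doc.get("o_lineitem", []):
        part = item["partsupp"]["ps_partkey"]
        supp = item["partsupp"]["ps_suppkey"]
        region = supp["s_nationkey"]["n_regioin"]["r_name"]
        if part["p_size"] == 15 and pattern.search(part["p_type"]) and region == "EUROPE":
            results.append({
                "s_acctbal": supp["s_acctbal"],
                "s_name": supp["s_name"],
                "n_name": supp["s_nationkey"]["n_name"],
                "p_partkey": part["p_partkey"],
                "p_mfgr": part["p_mfgr"],
                "s_address": supp["s_address"],
                "s_phone": supp["s_phone"],
                "s_comment": supp["s_comment"],
            })

# mirror the SQL ORDER BY: s_acctbal desc, then n_name, s_name, p_partkey
results.sort(key=lambda r: (-r["s_acctbal"], r["n_name"], r["s_name"], r["p_partkey"]))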
In any case, the map() function should actually look more like:
var map = function () {
    var pattern = /BRASS$/;
    this.o_lineitem.forEach(function(item) {
        var partKey = item.partsupp.ps_partkey;
        var suppKey = item.partsupp.ps_suppkey;
        var region = suppKey.s_nationkey.n_regioin.r_name;
        if (partKey.p_size == 15 && partKey.p_type.match(pattern) != null && region == "EUROPE") {
            emit(suppKey.s_name,
                {
                    s_acctbal: suppKey.s_acctbal,
                    s_name: suppKey.s_name,
                    n_name: suppKey.s_nationkey.n_name,
                    p_partkey: partKey.p_partkey,
                    p_mfgr: partKey.p_mfgr,
                    s_address: suppKey.s_address,
                    s_phone: suppKey.s_phone,
                    s_comment: suppKey.s_comment
                }
            );
        }
    });
}
It would be easier to translate this query into the new Aggregation Framework in MongoDB 2.2, given your data structure and the desired multiple matching and sorting.
There are some current limitations to be aware of (such as the present 16MB maximum on output from the Aggregation pipeline), but you will likely find the queries easier to create and debug.
Here is a commented example using the Aggregation Framework, including initial match criteria for order status, date, and part/supplier items of interest:
db.ordersfull.aggregate(
    // Find matching documents first (can take advantage of index)
    { $match: {
        o_orderstatus: 'O',
        o_orderdate: { $gte: new ISODate('2012-10-01') },
        $and: [
            { o_lineitem: { $elemMatch: { 'partsupp.ps_partkey.p_size': 15 }} },
            { o_lineitem: { $elemMatch: { 'partsupp.ps_partkey.p_type': { $exists: true } }} },
            { o_lineitem: { $elemMatch: { 'partsupp.ps_suppkey.s_nationkey.n_regioin.r_name': 'EUROPE' }} }
        ]
    }},
    // Filter to fields of interest
    { $project: {
        _id: 0,
        o_lineitem: 1
    }},
    // Convert line item arrays into document stream
    { $unwind: '$o_lineitem' },
    // Match desired line items
    { $match: {
        'o_lineitem.partsupp.ps_partkey.p_size': 15,
        'o_lineitem.partsupp.ps_partkey.p_type': /BRASS$/,
        'o_lineitem.partsupp.ps_suppkey.s_nationkey.n_regioin.r_name': 'EUROPE'
    }},
    // Final field selection
    { $project: {
        s_acctbal: '$o_lineitem.partsupp.ps_suppkey.s_acctbal',
        s_name: '$o_lineitem.partsupp.ps_suppkey.s_name',
        n_name: '$o_lineitem.partsupp.ps_suppkey.s_nationkey.n_name',
        p_partkey: '$o_lineitem.partsupp.ps_partkey.p_partkey',
        p_mfgr: '$o_lineitem.partsupp.ps_partkey.p_mfgr',
        s_address: '$o_lineitem.partsupp.ps_suppkey.s_address',
        s_phone: '$o_lineitem.partsupp.ps_suppkey.s_phone',
        s_comment: '$o_lineitem.partsupp.ps_suppkey.s_comment'
    }},
    // Sort the output
    { $sort: {
        s_acctbal: -1,
        n_name: 1,
        s_name: 1,
        p_partkey: 1
    }}
)