Ngb Datepicker Select Multiple Days - datetimepicker

[
  {
    "year": 2022,
    "month": 11,
    "day": 23
  },
  {
    "year": 2022,
    "month": 11,
    "day": 26
  },
  {
    "year": 2022,
    "month": 11,
    "day": 6
  }
]
I want to mark the dates that arrive as JSON as selected in the DateTimePicker. Unlike the classic range selection, it is sufficient to select only certain individual days.
There is an example like this at the link, but I want the dates to be marked according to the JSON while the form is loading.
Example


Issues creating JSON object on Snowflake using SQL

I am new to Snowflake and I am trying to create a new table from an existing table, converting some columns into JSON format.
What I have on Snowflake (table name: lex):

area | source | type | date
-----+--------+------+-----
One  | modak  | good | 2021

What I want to achieve on Snowflake:

area | sources
-----+------------------------------------------------------------------
One  | [{"source": "modak","period": {"type": "good","date": "2021"} }]
Any direction on how to go about it using SQL will be appreciated.
You can use object_construct to produce JSON objects:
select
  area,
  array_construct(
    object_construct(
      'source', source,
      'period', object_construct('type', type, 'date', date)
    )
  ) as sources
from
  values ('One','modak','good','2021') tmp (area, source, type, date);
+------+-------------------------------------------------------------------------+
| AREA | SOURCES |
+------+-------------------------------------------------------------------------+
| One | [ { "period": { "date": "2021", "type": "good" }, "source": "modak" } ] |
+------+-------------------------------------------------------------------------+
I also used array_construct to add the brackets ([])
OBJECT_CONSTRUCT:
https://docs.snowflake.com/en/sql-reference/functions/object_construct.html
ARRAY_CONSTRUCT:
https://docs.snowflake.com/en/sql-reference/functions/array_construct.html
PS: The order of the JSON elements is not important when accessing them:
select parse_json(j):source, parse_json(j):period.type
from values
  ('{ "period": { "date": "2021", "type": "good" }, "source": "modak" }'),
  ('{"source": "modak","period": {"type": "good","date": "2021"} }') tmp(j);
+----------------------+---------------------------+
| PARSE_JSON(J):SOURCE | PARSE_JSON(J):PERIOD.TYPE |
+----------------------+---------------------------+
| "modak" | "good" |
| "modak" | "good" |
+----------------------+---------------------------+
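This order-insensitivity is a property of JSON objects generally, not just of Snowflake; a quick Python sketch of the same check (purely illustrative):

```python
import json

# The two JSON strings from the example above, keys in different orders
a = json.loads('{ "period": { "date": "2021", "type": "good" }, "source": "modak" }')
b = json.loads('{"source": "modak","period": {"type": "good","date": "2021"} }')

# Parsed objects compare equal, and each path resolves the same way
assert a == b
print(a["source"], a["period"]["type"])  # modak good
```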
So the complete answer is:
WITH lex AS (
  SELECT * FROM VALUES
    ('One', 'modak', 'good', 2021)
    v(area, source, type, date)
)
SELECT
  l.area,
  ARRAY_AGG(
    OBJECT_CONSTRUCT('source', l.source, 'period',
      OBJECT_CONSTRUCT('type', l.type, 'date', l.date))
  ) AS sources
FROM lex AS l
GROUP BY 1
ORDER BY 1;
which gives:
AREA | SOURCES
-----+-----------------------------------------------------------------------
One  | [ { "period": { "date": 2021, "type": "good" }, "source": "modak" } ]
Gokhan's answer shows how to build a static array with ARRAY_CONSTRUCT, but if you have dynamic input like:
WITH lex AS (
  SELECT * FROM VALUES
    ('One', 'modak', 'good', 2021),
    ('One', 'modak', 'bad', 2022),
    ('Two', 'modak', 'good', 2021)
    v(area, source, type, date)
)
and you want the result below, then you will need to use ARRAY_AGG:
AREA | SOURCES
-----+--------------------------------------------------------------------------------------------------------------------------
One  | [ { "period": { "date": 2021, "type": "good" }, "source": "modak" }, { "period": { "date": 2022, "type": "bad" }, "source": "modak" } ]
Two  | [ { "period": { "date": 2021, "type": "good" }, "source": "modak" } ]
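The group-then-collect behaviour of ARRAY_AGG(OBJECT_CONSTRUCT(...)) can be mimicked outside Snowflake; a minimal Python sketch over the same sample rows (purely illustrative, no Snowflake API involved):

```python
from itertools import groupby

# Same sample rows as the VALUES clause above, already sorted by area
rows = [
    ('One', 'modak', 'good', 2021),
    ('One', 'modak', 'bad', 2022),
    ('Two', 'modak', 'good', 2021),
]

# For each area, collect one object per row into a list, mirroring
# ARRAY_AGG(OBJECT_CONSTRUCT('source', ..., 'period', OBJECT_CONSTRUCT(...)))
result = {
    area: [
        {'source': source, 'period': {'type': type_, 'date': date}}
        for _, source, type_, date in group
    ]
    for area, group in groupby(rows, key=lambda r: r[0])
}

print(result['Two'])  # [{'source': 'modak', 'period': {'type': 'good', 'date': 2021}}]
```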

How to perform a RANK query in Django

I am stuck at this point and this query seems a bit tricky to me.
Basically, I am getting data from devices at periodic intervals (let's say 3 hours). But every device sends its data twice in each interval, and I am required to extract only the record each device sent last (per interval):
{
"id": 1,
"date": "2021-11-28",
"time": "12 : 35",
"meter": "70F8E7FFFE10C495",
"dr": 3,
"date_received": "2021-11-28T18:05:20.473430+05:30",
},
{
"id": 2,
"date": "2021-11-28",
"time": "12 : 37",
"meter": "70F8E7FFFE10C459",
"date_received": "2021-11-28T18:07:09.980403+05:30",
},
{
"id": 3,
"date": "2021-11-28",
"time": "12 : 37",
"meter": "70F8E7FFFE10C459", <---- this occurrence only
"dr": 3,
"date_received": "2021-11-28T18:07:11.533388+05:30",
},
{
"id": 4,
"date": "2021-11-28",
"time": "12 : 50",
"meter": "70F8E7FFFE10C463",
"date_received": "2021-11-28T18:20:44.197284+05:30",
},
{
"id": 5,
"date": "2021-11-28",
"time": "12 : 56",
"meter": "70F8E7FFFE10C47D",
"date_received": "2021-11-28T18:26:43.221316+05:30",
},
{
"id": 6,
"date": "2021-11-28",
"time": "12 : 56",
"meter": "70F8E7FFFE10C47D", <---- only want to get this occurrence in similar manner
"date_received": "2021-11-28T18:26:44.925292+05:30",
},
So if device 70F8E7FFFE10C459 sends data twice, I want to retrieve only the last record it sent, and I need this as objects in a queryset. I was told about the RANK concept but don't understand how to apply it here.
In general I just want to know the query to apply; a raw query will also work.
class MeterData(models.Model):
    meter = models.ForeignKey('dashboard.Meter', on_delete=models.SET_NULL, null=True)
    customer = models.ForeignKey('Customer', on_delete=models.SET_NULL, null=True, blank=True)
    battery = models.CharField(max_length=40, null=True, blank=True)
    crc_status = models.CharField(max_length=43, null=True, blank=True)
    Pulse_count = models.IntegerField(null=True, blank=False)
    status_data = models.CharField(max_length=50, blank=True, null=True)
    status_magnet_temper = models.BooleanField(default=False)
    meter_reading = models.CharField(max_length=50, null=True, blank=True)
    status_switch_temper = models.BooleanField(default=False)
    status_voltage_level = models.CharField(max_length=10, null=True, blank=True)
    uplink_type = models.CharField(max_length=50, null=True, blank=True)
    confirmed_uplink = models.BooleanField(null=True, blank=True)
    tx_info_frequency = models.CharField(max_length=50, null=True, blank=True)
    dr = models.IntegerField(null=True, blank=True)
    date_received = models.DateTimeField(auto_now_add=True, null=True)
It doesn't work on SQLite.
I think the following query is the correct answer:
MeterData.objects.filter(
    put_your_filters_here
).order_by('customer_id', '-date_received').distinct('customer_id')
(On PostgreSQL the DISTINCT ON field must also lead the ORDER BY clause, hence customer_id comes first.)
For a SQLite DB use the query below:
from django.db.models import Subquery, OuterRef

distinct_subquery = Subquery(
    MeterData.objects.filter(
        put_your_filters_here,
        customer_id=OuterRef('customer_id'),
    ).order_by('-date_received').values('id')[:1]
)
MeterData.objects.filter(
    put_your_filters_here,
    id__in=distinct_subquery,
).order_by('-date_received')
If it doesn't work, let me know.
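The intent of both queries, keeping only the most recent row per group, can be sketched in plain Python (grouping by meter here, as in the question; the sample values mirror the JSON above):

```python
# Keep only the most recent record per meter, mimicking
# DISTINCT ON (...) ORDER BY ..., date_received DESC in PostgreSQL.
records = [
    {'id': 2, 'meter': '70F8E7FFFE10C459', 'date_received': '2021-11-28T18:07:09'},
    {'id': 3, 'meter': '70F8E7FFFE10C459', 'date_received': '2021-11-28T18:07:11'},
    {'id': 5, 'meter': '70F8E7FFFE10C47D', 'date_received': '2021-11-28T18:26:43'},
    {'id': 6, 'meter': '70F8E7FFFE10C47D', 'date_received': '2021-11-28T18:26:44'},
]

latest = {}
for rec in records:  # ISO-8601 timestamps compare correctly as strings
    cur = latest.get(rec['meter'])
    if cur is None or rec['date_received'] > cur['date_received']:
        latest[rec['meter']] = rec

print(sorted(r['id'] for r in latest.values()))  # [3, 6]
```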

SQL Query to Node Red Line Chart

I am attempting to query data out of SQL Server and display it as a line chart in Node Red.
My Data from SQL looks like
1556029184000 0.0675168918918922
1556029139000 0.0675515463917528
1556029079000 0.0679347826086958
1556029019000 0.0674082568807342
1556028959000 0.0674431818181822
1556028898000 0.0675537634408605
1556028838000 0.0673611111111115
1556028779000 0.0675917431192663
1556028719000 0.06744212962963
1556028659000 0.0673148148148151
Left column is a timestamp converted to Epoch and right column is the value to plot.
Node red debug shows this:
[{"x":"1556029788000","y":0.06772222222222232},
{"x":"1556029738000","y":0.06855053191489367},
{"x":"1556029678000","y":0.06858333333333343},
{"x":"1556029619000","y":0.06751146788990835},
{"x":"1556029559000","y":0.06805180180180205},
{"x":"1556029499000","y":2.714885321100926},
{"x":"1556029439000","y":11.43350290697674},
{"x":"1556029378000","y":6.6709253246753235},
{"x":"1556029319000","y":0.06748842592592619},
{"x":"1556029259000","y":0.06760714285714318}]
Nothing is displayed in the chart. All help is appreciated.
The node-red-dashboard sidebar help has a link to the details of the format for passing data to the chart node.
What you currently have is a msg.payload that contains an array of objects with x & y values. These need to be moved into the "data" field of the structure described:
[{
  "series": ["A", "B", "C"],
  "data": [
    [{ "x": 1504029632890, "y": 5 },
     { "x": 1504029636001, "y": 4 },
     { "x": 1504029638656, "y": 2 }],
    [{ "x": 1504029633514, "y": 6 },
     { "x": 1504029636622, "y": 7 },
     { "x": 1504029639539, "y": 6 }],
    [{ "x": 1504029634400, "y": 7 },
     { "x": 1504029637959, "y": 7 },
     { "x": 1504029640317, "y": 7 }]
  ],
  "labels": [""]
}]
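In a function node this reshaping would be written in JavaScript; the target structure itself can be sketched in Python (the series name "A" and the two sample points are placeholders):

```python
# A flat list of {x, y} points, as currently produced by the SQL query
points = [
    {"x": 1556029788000, "y": 0.0677},
    {"x": 1556029738000, "y": 0.0685},
]

# Wrap it into the shape the Dashboard chart node expects:
# msg.payload = [{series, data, labels}], with one inner list of points per series
payload = [{
    "series": ["A"],
    "data": [points],
    "labels": [""],
}]
```

Note that the x values here are numbers, while the debug output above shows them as quoted strings; converting them to numbers in the function node may also be necessary.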

Rally Lookback API doesn't retrieve records newer than 1 week

I'm running some queries with Rally Lookback API and it seems that revisions newer than 1 week are not being retrieved:
λ date
Wed, Nov 28, 2018 2:26:45 PM
using the query below:
{
"ObjectID": 251038028040,
"__At": "current"
}
results:
{
"_rallyAPIMajor": "2",
"_rallyAPIMinor": "0",
"Errors": [],
"Warnings": [
"Max page size limited to 100 when fields=true"
],
"GeneratedQuery": {
"find": {
"ObjectID": 251038028040,
"$and": [
{
"_ValidFrom": {
"$lte": "2018-11-21T14:44:34.694Z"
},
"_ValidTo": {
"$gt": "2018-11-21T14:44:34.694Z"
}
}
],
"_ValidFrom": {
"$lte": "2018-11-21T14:44:34.694Z"
}
},
"limit": 10,
"skip": 0,
"fields": true
},
"TotalResultCount": 1,
"HasMore": false,
"StartIndex": 0,
"PageSize": 10,
"ETLDate": "2018-11-21T14:44:34.694Z",
"Results": [
{
"_id": "5bfe7e3c3f1f4460feaeaf11",
"_SnapshotNumber": 30,
"_ValidFrom": "2018-11-21T12:22:08.961Z",
"_ValidTo": "9999-01-01T00:00:00.000Z",
"ObjectID": 251038028040,
"_TypeHierarchy": [
-51001,
-51002,
-51003,
-51004,
-51005,
-51038,
46772408020
],
"_Revision": 268342830516,
"_RevisionDate": "2018-11-21T12:22:08.961Z",
"_RevisionNumber": 53,
}
],
"ThreadStats": {
"cpuTime": "15.463705",
"waitTime": "0",
"waitCount": "0",
"blockedTime": "0",
"blockedCount": "0"
},
"Timings": {
"preProcess": 0,
"findEtlDate": 88,
"allowedValuesDisambiguation": 1,
"mongoQuery": 1,
"authorization": 3,
"suppressNonRequested": 0,
"compressSnapshots": 0,
"allowedValuesHydration": 0,
"TOTAL": 93
}
}
Bear in mind that this artifact has, as of now, 79 revisions, with the latest revision pointing to 11/21/2018 02:41 PM CST per the Revisions tab in Rally Central.
One other thing: if I run the query a couple of minutes later, the ETL date seems to update, as if some sort of indexing were being run:
{
"_rallyAPIMajor": "2",
"_rallyAPIMinor": "0",
"Errors": [],
"Warnings": [
"Max page size limited to 100 when fields=true"
],
"GeneratedQuery": {
"find": {
"ObjectID": 251038028040,
"$and": [
{
"_ValidFrom": {
"$lte": "2018-11-21T14:45:50.565Z"
},
"_ValidTo": {
"$gt": "2018-11-21T14:45:50.565Z"
}
}
],
"_ValidFrom": {
"$lte": "2018-11-21T14:45:50.565Z"
}
},
"limit": 10,
....... rest of the output omitted.
Is there any reason why the Lookback API wouldn't be processing current data, instead of showing a week of difference between records?
It appears that your workspace's data is currently being "re-built". The _ETLDate is the date of the most-current revision in the LBAPI database and should eventually catch up to the current revision's date.

Select game scores with a player's context from Postgres

I am developing a web app with Rails and Postgres as the backend for a shooting-range game. It has an endpoint to save players' scores, which writes the scores to the DB and returns the first three places plus each player's rank with some context around it: one place before and one place after. For example, if a player has rank 23, the method should return the 1st, 2nd and 3rd places, the player's rank itself, and also the two records with ranks 22 and 24 beside it. I don't store the rank in the DB; it is calculated dynamically using the following SQL query:
SELECT RANK() OVER(ORDER BY score DESC) AS place, name, score
FROM scores
WHERE game_id=? AND rules_version=? AND game_mode=?
LIMIT 10
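As a reminder of the semantics being relied on here, RANK() gives tied scores the same place and then skips; a small Python sketch (the names are made up):

```python
# Rows already sorted by score descending, as in ORDER BY score DESC
scores = [("Piter", 800), ("Lisa", 700), ("Ann", 700), ("Philip", 600)]

places = []
for i, (name, score) in enumerate(scores):
    if i > 0 and score == scores[i - 1][1]:
        place = places[-1][0]   # tie: same place as the previous row
    else:
        place = i + 1           # RANK() resumes at the row number after a tie
    places.append((place, name, score))

print(places)  # [(1, 'Piter', 800), (2, 'Lisa', 700), (2, 'Ann', 700), (4, 'Philip', 600)]
```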
POST request with following data:
{
"players": [
{ "name": "Jack", "score": 100, "local_id": 1, "stats": {}},
{ "name": "Anna", "score": 200, "local_id": 2, "stats": {}}
]
}
should return the following result set:
{
"scores": [
{
"name": "Piter",
"place": 1,
"score": 800
},
{
"name": "Lisa",
"place": 2,
"score": 700
},
{
"name": "Philip",
"place": 3,
"score": 600
},
{
"name": "Max",
"place": 25,
"score": 204
},
{
"name": "Anna",
"place": 26,
"score": 200,
"local_id": 2
},
{
"name": "Ashley",
"place": 27,
"score": 193
},
{
"name": "Norman",
"place": 36,
"score": 103
},
{
"name": "Jack",
"place": 37,
"score": 100,
"local_id": 1
},
{
"name": "Chris",
"place": 38,
"score": 95
}
]
}
Every field except local_id and, as I said, place is stored in the DB. I can't figure out the right SQL query to do such a select. Please help.
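I can't confirm the exact SQL, but the selection logic being described, the top three places plus a one-place window around each submitted player, can be sketched in Python over an already-ranked list (the data is made up; only the place numbers matter):

```python
# 40 ranked rows: (place, name, score), ordered by place
ranked = [(i, f'player{i}', 1000 - 10 * i) for i in range(1, 41)]

def context_places(player_places, top=3):
    """Places to return: top-N plus one place either side of each player."""
    wanted = set(range(1, top + 1))
    for p in player_places:
        wanted.update({p - 1, p, p + 1})
    return wanted

# Players at places 26 and 37, as in the example response (Anna and Jack)
wanted = context_places([26, 37])
result = [row for row in ranked if row[0] in wanted]

print([r[0] for r in result])  # [1, 2, 3, 25, 26, 27, 36, 37, 38]
```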