MPAndroidChart - Setting a fixed interval on X axis for a time value

I am trying to display time on my X axis with a fixed interval, but currently the X axis is not behaving the way I want: I'd like an interval of 5 minutes. When I zoom in, my points are at the right timestamps.

Maybe you can use the setGranularity API:
xAxis.setGranularityEnabled(true);
xAxis.setGranularity(5 * 60 * 1000); // if time is in ms
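For intuition, granularity is just the minimum spacing the axis allows between drawn values. This hypothetical Python sketch (an illustration of the concept, not MPAndroidChart's actual algorithm) thins millisecond timestamps to at most one label per 5-minute step:

```python
GRANULARITY_MS = 5 * 60 * 1000  # 5 minutes in ms, as passed to setGranularity

# Keep a timestamp only if it is at least one granularity step
# after the previously kept label.
timestamps = [0, 60_000, 300_000, 400_000, 650_000]
kept = []
for t in timestamps:
    if not kept or t - kept[-1] >= GRANULARITY_MS:
        kept.append(t)
# kept -> [0, 300000, 650000]
```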

Subtraction of dates with hours and minutes (result in float)

I would like some help with an SSIS problem.
I have two columns: one with the date a demand was opened and another with the date it was responded to.
My data comes in this way:

DT_ANSWERED_DATE         DT_CREATED_DATE
2021-02-04 19:48:00.000  2021-02-04 19:44:00.000

I would like to subtract DT_ANSWERED_DATE minus DT_CREATED_DATE, but I would like the result to be a float number, like the result I get when I do the subtraction in Excel:

DT_ANSWERED_DATE         DT_CREATED_DATE          DT_ANSWERED_DATE minus DT_CREATED_DATE
2021-02-04 19:48:00.000  2021-02-04 19:44:00.000  0,00277777777228039

I would like to do the same thing, but in a derived column in SSIS (Microsoft Visual Studio).
Thanks in advance for the response.
It looks like your granularity is in minutes. This should get you the decimal number you are looking for (note the 60.0, which forces floating-point division; with plain integer operands the SSIS expression language truncates the result to 0):
DATEDIFF("mi", DT_CREATED_DATE, DT_ANSWERED_DATE) / (60.0 * 24)
(60 minutes per hour * 24 hours in a day)
Microsoft documentation: https://learn.microsoft.com/en-us/sql/integration-services/expressions/datediff-ssis-expression?view=sql-server-ver16
In your example above this results in:
4 min / (60 * 24) = 0.00277777777
Note:
I highly recommend using decimal rather than float, unless you really, really have a reason. Equality comparisons that look like they should be true often aren't with float values because of binary rounding; they always behave as expected with integers and decimals.
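The day-fraction arithmetic is easy to sanity-check outside SSIS; here is a quick Python sketch (an illustration only, not SSIS code) using the timestamps from the question:

```python
from datetime import datetime

# Timestamps from the question
created = datetime(2021, 2, 4, 19, 44)
answered = datetime(2021, 2, 4, 19, 48)

# Minutes between the two, then expressed as a fraction of a day
minutes = (answered - created).total_seconds() / 60
day_fraction = minutes / (60 * 24)   # 4 / 1440 ≈ 0.0027777...
```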

extracting HOUR from an interval in spark sql

I was wondering how to properly extract the number of hours between two given timestamp objects.
For instance, when the following SQL query gets executed:
select x, extract(HOUR FROM x) as result
from
(select (TIMESTAMP'2021-01-22T05:00:00' - TIMESTAMP'2021-01-01T09:00:00') as x)
The result value is 20, while I'd expect it to be 500.
It seems odd to me, considering that the displayed x value itself shows the expected difference.
Can anyone explain what I'm doing wrong and perhaps suggest an alternative query that returns the desired result?
Thanks in advance!
I think you have to do the maths with this one: EXTRACT(HOUR FROM interval) returns only the hours component of the interval (here 20 days 20 hours, hence 20), and datediff in Spark SQL only supports days. This worked for me:
SELECT (unix_timestamp(to_timestamp('2021-01-22T05:00:00')) - unix_timestamp(to_timestamp('2021-01-01T09:00:00'))) / 60 / 60 AS diffInHours
(My results were from a Synapse notebook, not Databricks, but I expect it to be the same.)
The unix_timestamp function converts the timestamp to a Unix timestamp (in seconds), so you can apply date math to it. Subtracting the two gives the number of seconds between the timestamps; divide by 60 for minutes, and by 60 again for hours.
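The same seconds-to-hours arithmetic can be sanity-checked outside Spark; a quick Python sketch (an illustration only) using the question's timestamps:

```python
from datetime import datetime

t1 = datetime(2021, 1, 1, 9, 0, 0)   # TIMESTAMP'2021-01-01T09:00:00'
t2 = datetime(2021, 1, 22, 5, 0, 0)  # TIMESTAMP'2021-01-22T05:00:00'

# Same idea as the unix_timestamp subtraction: seconds -> minutes -> hours
diff_in_hours = (t2 - t1).total_seconds() / 60 / 60
```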

How to create an SQL time-in-location table from location/timestamp SQL data stream

I have a question that I'm struggling with in SQL.
I currently have a series of location and timestamp data: devices in locations at varying timestamps. The locations repeat, so while they are lat/long coordinates, several of them recur. The timestamps come at irregular intervals (sometimes multiple times a second, sometimes nothing for 30 seconds). For example, see the representational data below (I am sorting by device name in this example, but could order by anything if it would help):
Device Location Timestamp
X A 1
X A 1.7
X A 2
X A 3
X B 4
X B 5.2
X B 6
X A 7
X A 8
Y A 2
Y A 4
Y C 6
Y C 7
I wish to create a table based on the above data that would show entry/exit or first/last time in each location, with the total duration of that instance. i.e:
Device Location EntryTime ExitTime Duration
X A 1 3 2
X B 4 6 2
X A 7 8 1
Y A 2 4 2
Y C 6 7 1
From here I could process it further to work out a total time in location for a given day, for example.
This is something I could do in Python or some other language with something like a while loop, but I'm really not sure how to accomplish this in SQL.
It's probably worth noting that this is in Azure SQL and I'm creating this table via a Stream Analytics Query to an Event Hubs instance.
The reason I don't want to just simply total all in a location is because it is going to be streaming data and rolling through for a display for say, the last 24 hrs.
Any hints, tips or tricks on how I might accomplish this would be greatly appreciated. I've looked and haven't been able to quite find what I'm looking for: I can see things like DATEDIFF for calculating the duration between two timestamps, or MAX and MIN for finding the first and last dates, but none quite seem to tick the box.

The challenge is that the devices move around and come back to the same locations many times within the period. Taking the first occurrence of device X at location A and subtracting it from the last, for example, doesn't account for the other locations it may have visited in between. Complicating things further, the timestamps are irregular, so I can't simply count the number of occurrences for each location and add them up either.
Maybe I'm missing something simple or obvious, but this has got me stumped! Help would be greatly appreciated :)
I believe grouping would work:
SELECT Device, Location,
       [EntryTime] = MIN(Timestamp),
       [ExitTime] = MAX(Timestamp),
       [Duration] = MAX(Timestamp) - MIN(Timestamp)
FROM <table>
GROUP BY Device, Location
Note, though, that a plain GROUP BY merges repeat visits to the same location into a single row (X at A would span 1 to 8); to keep each visit separate you need to group consecutive runs, e.g. the gaps-and-islands pattern using a difference of ROW_NUMBER() values.
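One caveat with grouping by Device and Location alone is that repeat visits to the same location collapse into one row, while the entry/exit table in the question needs each consecutive run kept separate ("gaps and islands"). A hypothetical Python sketch of that run-grouping logic, using the question's sample data:

```python
from itertools import groupby

# Representational data from the question: (device, location, timestamp),
# already ordered by device and time of arrival.
rows = [
    ("X", "A", 1), ("X", "A", 1.7), ("X", "A", 2), ("X", "A", 3),
    ("X", "B", 4), ("X", "B", 5.2), ("X", "B", 6),
    ("X", "A", 7), ("X", "A", 8),
    ("Y", "A", 2), ("Y", "A", 4),
    ("Y", "C", 6), ("Y", "C", 7),
]

# Group only *consecutive* rows sharing (device, location), so a return
# visit to a location starts a fresh group instead of merging with the
# earlier one.
visits = []
for (device, location), run in groupby(rows, key=lambda r: (r[0], r[1])):
    stamps = [t for _, _, t in run]
    visits.append((device, location, stamps[0], stamps[-1], stamps[-1] - stamps[0]))
```

In T-SQL the same effect comes from the classic difference-of-ROW_NUMBER() trick (one numbering partitioned by device, another by device and location), then grouping on that difference.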
I was working on a similar issue, to some extent, in my dataset:
SELECT U.*,
       TO_DATE(U.WEND, 'DD-MM-YY HH24:MI') - TO_DATE(U.WSTART, 'DD-MM-YY HH24:MI') AS DURATION
FROM
(
    SELECT EMPNAME, TLOC,
           TRUNC(TO_DATE(T.TDATETIME, 'DD-MM-YY HH24:MI')) AS WDATE,
           MIN(T.TDATETIME) AS WSTART,
           MAX(T.TDATETIME) AS WEND
    FROM EMPTRCK_RSMSM T
    GROUP BY EMPNAME, TLOC, TRUNC(TO_DATE(T.TDATETIME, 'DD-MM-YY HH24:MI'))
) U

Grafana x-axis to show data with 10 seconds granularity instead of 1 sec

It's probably something easy, but I'm new to Grafana, so bear with me.
I have data collected every 10 seconds I would like to display in Grafana.
select time, value from "metrics_value" where instance='Processor' and type='counter' and type_instance='message' and time> now() - 1m;
name: metrics_value
---------------
time value
2016-10-13T09:24:33Z 23583
2016-10-13T09:24:43Z 23583
2016-10-13T09:24:53Z 23583
2016-10-13T09:25:03Z 23583
2016-10-13T09:25:13Z 23583
But Grafana fills in intermediate points between the 10-second samples with interpolated values.
How can I set the x-axis interval in Grafana so it shows only one point every 10 seconds?
I know I could use an aggregate/summarize function to sum things up, as described here: How to change the x axis in Graphite/Grafana (to graph by day)?
But I don't think I can use that.
Works properly with:
select sum("value") from "metrics_value" where instance='Processor' and type='counter' and type_instance='message' and time> now() - 1m GROUP BY time(10s) fill(null);
Edit: I also changed the "sum" aggregation to mean, so Grafana calculates the mean of the values when zoomed out. (Otherwise it summed the values.)
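The GROUP BY time(10s) behavior can be illustrated with a small sketch; this hypothetical Python snippet (not InfluxDB code) buckets sample points into 10-second bins and takes the mean of each, mirroring the query above:

```python
from collections import defaultdict

# Hypothetical samples: (unix_seconds, value) arriving every few seconds
samples = [(0, 10), (3, 20), (12, 30), (25, 40), (27, 50)]

# Floor each timestamp to its 10-second bin, then average each bin,
# mirroring GROUP BY time(10s) with a mean() aggregation.
buckets = defaultdict(list)
for t, v in samples:
    buckets[t - t % 10].append(v)

means = {b: sum(vs) / len(vs) for b, vs in sorted(buckets.items())}
```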

Best way to store time above 24:00:00 in postgresql?

I'm storing GTFS feeds into a SQL database and some times are expected to be stored above the 24:00:00 cap on time values. For example, some trains run at 12:30AM, but are listed for the previous days service, and that running time is stored as 24:30 in the GTFS specifications.
What would be the best way of getting around this? Should I just store it as a string?
I suggest using an int for that; your value could be:
Sec + Min * 60 + Hour * 3600
For 24:30:00, you get 88200.
When loading the value from the DB, you can reverse it with simple math:
Hour = int(value / 3600)
Min = int(value % 3600 / 60)
Sec = value % 60
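The encode/decode round trip is easy to check; a hypothetical Python sketch of the same math:

```python
def gtfs_time_to_seconds(hour, minute, sec):
    # Encode a (possibly >24h) GTFS time as seconds past midnight
    return hour * 3600 + minute * 60 + sec

def seconds_to_gtfs_time(value):
    # Reverse the encoding with integer division and remainders
    return value // 3600, value % 3600 // 60, value % 60

total = gtfs_time_to_seconds(24, 30, 0)   # the 24:30:00 example -> 88200
```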
I'd store two fields:
departure_time timestamp with time zone,
service_date date
Departure time would be calculated like this:
=> select '2015-07-08'::timestamptz+'24:30'::interval;
2015-07-09 00:30:00+02
This way:
you have a normal moment-in-time field for sorting events;
you don't lose the service date information;
you are able to calculate the original GTFS data back if needed.
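The service-date-plus-offset calculation can be sketched outside Postgres too; a hypothetical Python illustration (note it ignores time zones and DST transitions, which the timestamptz approach handles):

```python
from datetime import date, datetime, timedelta

service_date = date(2015, 7, 8)
gtfs_offset = timedelta(hours=24, minutes=30)   # "24:30:00" in the feed

# Midnight of the service date plus the GTFS offset rolls past 24:00
# into the next calendar day, like '2015-07-08'::timestamptz + '24:30'
departure = datetime.combine(service_date, datetime.min.time()) + gtfs_offset
```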