Access Queries Over Date Range - sql

I have an Access query that runs to calculate the total activity in a location at a specified time. Is there a way to run this over a range of dates and return the results using SQL, or should I write a form that runs the query with various dates from code?
To make matters slightly more complicated, the query is also dealing with radioactive decay, correcting each item to its current activity at the time requested - although this can probably be ignored for testing.
Finally, on a slightly unrelated note: items arrive at and leave a location at any time, but it is unfeasible to have data points for every second of every day, so I am using three-hour intervals as a compromise. Since radioactive decay follows an equation, does anyone have a better way of doing this (so that data is plotted as soon as an item arrives; otherwise the gap between the arrival time and the plotted interval can affect my result a lot), or is this the best I am likely to manage?
Current SQL for single date:
SELECT FORMAT(SUM(Items.Activity * Exp(-(0.693 * (dDate - Items.MeasuredOn) / Nuclides.HalfLife))), '#,##0.00') AS CurrentActivity
FROM Nuclides
INNER JOIN ((Locations
  INNER JOIN (Items INNER JOIN ItemLocations ON Items.ItemID = ItemLocations.ItemID)
  ON Locations.LocationID = ItemLocations.LocationID)
  INNER JOIN LocationLimits ON Locations.LocationID = LocationLimits.LocationID)
ON (Nuclides.Nuclide = Items.Nuclide) AND (Nuclides.Nuclide = LocationLimits.Nuclide)
WHERE ItemLocations.LocationID = lLocationID
  AND ItemLocations.RecieptDate < dDate
  AND (ItemLocations.DisposalDate > dDate OR ItemLocations.DisposalDate IS NULL);
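For reference, the decay correction inside the SUM is the standard half-life formula, with 0.693 approximating ln 2. A quick Python sketch of the same calculation (illustrative only, not the Access query itself):

```python
import math

def current_activity(measured_activity, measured_on, at_time, half_life_days):
    """Decay-correct an activity measurement to a later time.

    Mirrors the expression in the query above; the constant 0.693 there
    is an approximation of ln(2), used exactly here.
    """
    elapsed = at_time - measured_on  # in days, matching the half-life unit
    return measured_activity * math.exp(-math.log(2) * elapsed / half_life_days)

# An item measured at 100 MBq with an 8-day half-life is at half its
# activity after one half-life:
print(round(current_activity(100.0, 0.0, 8.0, 8.0), 2))  # → 50.0
```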

Could not figure out how to solve this the way I initially imagined, so I solved it as follows instead:
User inputs a date range for the time period they want to view data for.
One query returns a single list of all dates on which an event happened (date an item was added, date an item was removed).
Using set time intervals (now one hour), activity is calculated at each step until an event time is reached. At the event time, activity is calculated for half a second before the event occurred and for the time of the event itself. We then continue with the hourly intervals until the next event.
After the final event, keep calculating at the interval until the end time is reached.
This works quickly enough not to be a problem for the user, and the graph is as smooth as it needs to be. FYI, activity is also calculated half a second before each event to give a vertical slope on the line; otherwise it would slant from the previous point (potentially an hour or more earlier).
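The sampling scheme described above can be sketched as follows (a rough illustration with timestamps in seconds; the half-second offset before each event is what produces the vertical step on the chart):

```python
def sample_times(start, end, events, step=3600):
    """Generate plot timestamps: regular intervals, plus a point just
    before and at each event so the chart shows a vertical step."""
    events = sorted(t for t in events if start < t < end)
    times, t = [], start
    for ev in events:
        while t < ev:
            times.append(t)
            t += step
        times.append(ev - 0.5)  # half a second before the event
        times.append(ev)        # the event itself
        t = ev + step           # resume regular intervals after the event
    while t <= end:
        times.append(t)
        t += step
    return times

print(sample_times(0, 7200, [5000], step=3600))  # → [0, 3600, 4999.5, 5000]
```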

Related

Laravel where clause based on conditions from value in database

I am building an event reminder page where people can set a reminder for certain events. There is an option for the user to set the amount of time before they need to be notified. It is stored in notification_time and notification_unit. notification_time keeps track of the time before they want to be notified and notification_unit keeps track of the PHP date format in which they selected the time, e.g. i for minutes, H for hours.
E.g. notification_time = 2 and notification_unit = H means they need to be notified 2 hours before.
I have Cron jobs running in the background for handling the notification. This function is being hit once every minute.
Reminder::where(function ($query) {
    $query->where('event_time', '>=', now()->subMinutes(Carbon::createFromFormat('i', 60)->diffInMinutes() - 1)->format('H:i:s'));
    $query->where('event_time', '<=', now()->subMinutes(Carbon::createFromFormat('i', 60)->diffInMinutes())->format('H:i:s'));
})
In this function, I am hard-coding the 'i', 60 while it should be fetched from the database. event_time is also part of the same table.
The table looks something like this -
id event_time ... notification_unit notification_time created_at updated_at
Is there any way to solve this issue? Is it possible to do the same logic with SQL instead?
A direct answer to this question is not possible. I found two ways to resolve my issue.
First solution
MySQL has DATEDIFF and DATE_SUB to get a timestamp difference and to subtract intervals from a timestamp. In my case the function runs every minute, so to use them I would have to refactor my database to store the time and unit in seconds, then do the calculation. I chose not to go this way because both operations are a bit heavy on the server side, since I am running the function every minute.
Second Solution
This is the solution I personally went with. Here I did the calculations while storing the reminder in the database. Meaning? Let me explain. I created a new table, notification_settings, which is linked to the reminder (a one-to-one relation). The table looks like this:
id, unit, time, notify_at, repeating, created_at, updated_at
The unit and time columns are only used when displaying the reminder. What I did is calculate the notification time up front and store it in the notify_at column, so the event scheduler only needs to check for reminders whose notify_at is due (since it runs every minute). The repeating column keeps track of whether the reminder repeats; if it does, notify_at is recalculated at scheduling time. Once the user is notified, notify_at is set to null.
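The precompute-on-save idea can be sketched in Python (illustrative only; the real implementation would be PHP/Laravel, and the column and unit names are taken from the question):

```python
from datetime import datetime, timedelta

# Map the stored unit character to a timedelta keyword:
# 'i' is minutes and 'H' is hours, as in the question.
UNIT_TO_DELTA = {"i": "minutes", "H": "hours"}

def compute_notify_at(event_time, notification_time, notification_unit):
    """Precompute the notify_at timestamp when the reminder is saved,
    so the per-minute scheduler only compares notify_at to now()."""
    delta = timedelta(**{UNIT_TO_DELTA[notification_unit]: notification_time})
    return event_time - delta

event = datetime(2024, 5, 1, 18, 0)
print(compute_notify_at(event, 2, "H"))  # two hours before the event
```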

Qlikview line chart with multiple expressions over time period dimension

I am new to Qlikview and after several failed attempts I have to ask for some guidance regarding charts in Qlikview. I want to create Line chart which will have:
One dimension – a time period of one month, broken down by its days
One expression – Number of created tasks per day
Second expression – Number of closed tasks per day
Third expression – Number of open tasks per day
This is a very basic example and I couldn't find a solution for it; to be honest, I don't think I understand how I should set up my time period dimension and expressions. Each time I try to introduce more than one expression, things go south. Maybe it's because I have multiple dates, or my dimension is wrong.
Here is my simple data:
http://pastebin.com/Lv0CFQPm
I have been reading about helper tables like the Master Calendar or a "Date Island", but I couldn't grasp it. I have tried to follow the guide here: https://community.qlik.com/docs/DOC-8642 but that only worked for one date (for me at least).
How should I set up the dimension and expressions on my chart so I can count the ID field if Created Date matches one from the dimension and Status is appropriate?
I have the Personal Edition, so I am unable to open qvw files from other authors.
Thank you in advance, kind regards!
My solution to this would be to change from a single line per Call with associated dates to a concatenated list of Call Events, each with a single date; i.e. each Call will have a creation event and a resolution event. This is how I achieve that. (I turned your data into a spreadsheet, but the concept is the same for any data source.)
Calls:
LOAD Type,
     Id,
     Priority,
     'New' as Status,
     date(floor(Created)) as [Date],
     time(Created) as [Time]
FROM [Calls.xlsx]
(ooxml, embedded labels, table is Sheet1) where Created > 0;

LOAD Type,
     Id,
     Priority,
     Status,
     date(floor(Resolved)) as [Date],
     time(Resolved) as [Time]
FROM [Calls.xlsx]
(ooxml, embedded labels, table is Sheet1) where Resolved > 0;
Key concepts here: the first is allowing QlikView's auto-concatenate to do its job by making the field names of both load statements exactly the same, including capitalisation. The second is splitting the timestamp into a date and a time; this allows you to have a dimension of Date only and group the events for the day. (In big data sets the resource saving is also significant.) The third is creating the dummy 'New' status for each event on the day of its creation date.
With just this data and these expressions
Created = count(if(Status='New',Id))
Resolved = count(if(Status='Resolved',Id))
and then
Created-Resolved
all with Full Accumulation ticked for Open (to give you a running total rather than a daily total, which might go negative and look odd), you could draw this graph.
For extra completeness, you could add this to the script section to fill in your dates and create the Master Calendar you spoke of. (There are many other ways of achieving this.)
MINMAX:
LOAD floor(num(min([Date]))) as MINTRANS,
     floor(num(max([Date]))) as MAXTRANS
Resident Calls;

let zDateMin = FieldValue('MINTRANS', 1);
let zDateMax = FieldValue('MAXTRANS', 1);

// complete calendar
Dates:
LOAD Date($(zDateMin) + IterNo() - 1, '$(DateFormat)') as [Date]
AUTOGENERATE 1
WHILE $(zDateMin) + IterNo() - 1 <= $(zDateMax);
Then you could draw this chart. Don't forget to turn off Suppress Zero Values on the Presentation tab.
But my suggestion would be to use a combo chart rather than a line chart, so that the calls per day are shown as discrete buckets (bars) while the running total of Open calls is a line.
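For clarity, the AUTOGENERATE/WHILE loop above simply produces every date between the earliest and latest transaction dates. A quick Python sketch of the same idea (illustrative only, not QlikView syntax):

```python
from datetime import date, timedelta

def master_calendar(min_date, max_date):
    """Generate every date from min_date to max_date inclusive,
    mirroring the AUTOGENERATE/WHILE loop in the script above."""
    days = (max_date - min_date).days
    return [min_date + timedelta(days=i) for i in range(days + 1)]

cal = master_calendar(date(2024, 1, 30), date(2024, 2, 2))
print([d.isoformat() for d in cal])
# → ['2024-01-30', '2024-01-31', '2024-02-01', '2024-02-02']
```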

SQL Adding unmatched Join results to output

Maybe I am approaching the entire problem wrong - or inefficiently.
Essentially, I am trying to combine two views of data, one of them a log table, based upon 2 criteria:
RoomName field match
Timestamp matches
vw_FusionRVDB_Schedule (RoomName, StartTime, EndTime, Subject, etc)
This contains the schedule of events for all indexed rooms - times in UTC.
vw_FusionRVDB_DisplayUsageHistory (RoomName, OnTime, OffTime, etc)
This is a log of activity that has been pared down to just show when the room display has been turned on and off - times in UTC.
I am wanting to match display on/off activities with the events scheduled in the room when the logged activities occurred.
The query is really long, and includes a lot of derived fields. Hopefully just focusing on the join section will make it more clear.
SELECT <foo>
FROM dbo.vw_FusionRVDB_Schedule
INNER JOIN dbo.vw_FusionRVDB_DisplayUsageHistory
    ON dbo.vw_FusionRVDB_Schedule.RoomName = dbo.vw_FusionRVDB_DisplayUsageHistory.RoomName
    AND dbo.vw_FusionRVDB_Schedule.EndTime >= dbo.vw_FusionRVDB_DisplayUsageHistory.OnTime
    AND dbo.vw_FusionRVDB_Schedule.StartTime <= dbo.vw_FusionRVDB_DisplayUsageHistory.OffTime
This query is working great. By design, some events are listed more than once. This happens when there are multiple on/off display cycles that occur within the window of the same event. Similarly, if a room display is turned on before or during one event and stays on through a following event, data from that single log entry is used on both the first and second event record. So this query is doing exactly what is needed in this aspect.
However, I also want to add back into the output, scheduled events (from the vw_FusionRVDB_Schedule view) that have no corresponding logged activities in the vw_FusionRVDB_DisplayUsageHistory.
I have tried various forms of UNION with another query of the vw_FusionRVDB_Schedule view, with null values in the fields otherwise taken or derived from the vw_FusionRVDB_DisplayUsageHistory view. But it adds all scheduled activities back in, not just the ones with no match from the initial join.
I can provide more details if needed. Thank you in advance.
HepC answered in the comments. I was letting the results confuse me; a LEFT JOIN did the trick.
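The LEFT JOIN behaviour can be sketched in Python (hypothetical room/time tuples; None stands in for the NULLs on unmatched schedule rows):

```python
def match_events(schedule, usage):
    """LEFT JOIN-style matching: pair each scheduled event with every
    overlapping display on/off cycle in the same room; keep events with
    no overlapping cycle, filling the usage side with None."""
    rows = []
    for room, start, end in schedule:
        matches = [(on, off) for r, on, off in usage
                   if r == room and end >= on and start <= off]
        if matches:
            rows.extend((room, start, end, on, off) for on, off in matches)
        else:
            rows.append((room, start, end, None, None))  # unmatched event kept
    return rows

schedule = [("A", 9, 10), ("A", 13, 14)]
usage = [("A", 9, 11)]
print(match_events(schedule, usage))
# → [('A', 9, 10, 9, 11), ('A', 13, 14, None, None)]
```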

Dynamically filtering large query result for presentation in SSRS

We have a system that records data to an SQL Server DB captured from field equipment every minute. This data is used for a number of purposes, one of which is for charting in reports via SSRS.
The issue is that with such a high volume of data, when a report is run for a period of, for example, 3 months, the volume of data returned causes excessive report rendering times.
I've been thinking of finding a way of dynamically reducing the amount of data returned, based on the start and end periods chosen. Something along the lines of a sliding scale where, based on the duration between the start and end periods, I can apply different levels of filtering: more filtering for larger periods, less or none for smaller periods.
There is still a need to be able to produce higher resolution (as in more data points returned) reports for troubleshooting purposes.
For example:
Scenario 1:
User is executing a report for a period of 3 months. Result set returned by the query is reduced for performance reasons without adversely affecting what information the user wants to see (the chart is still representative of the changes over time).
Scenario 2:
User executes the report for a period of 1 hour, in order to look for potential indicator(s) of problems with field devices while troubleshooting the system. For this short time period, no filtering is applied.
My first thought was to use a modulo operation on the primary key of the data (which is an identity field), whereby the divisor is chosen depending on the difference between the start and end dates.
For example: if the difference between the start and end dates for the report execution period is 5 weeks, choose a divisor of 5 and apply a mod to the PK, selecting rows where the result is zero.
I would love to get feedback as to whether this sounds like a valid approach or whether there is a better way to do this.
Thanks.
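To illustrate the modulo idea above, here is a hedged Python sketch (it assumes the identity PK is reasonably contiguous; gaps from deleted rows would thin the sample unevenly):

```python
def downsample(rows, start_day, end_day):
    """Sliding-scale filter: derive a modulo divisor from the report
    span in weeks, so longer periods return proportionally fewer rows.
    rows: dicts with an 'id' key standing in for the identity PK."""
    weeks = max(1, (end_day - start_day) // 7)  # spans under a week keep everything
    return [r for r in rows if r["id"] % weeks == 0]

rows = [{"id": i, "value": i * 1.5} for i in range(10)]
print([r["id"] for r in downsample(rows, 0, 35)])  # 5-week span → [0, 5]
```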

SQL query to calculate visit duration from log table

I have a MySQL table LOGIN_LOG with fields ID, PLAYER, TIMESTAMP and ACTION. ACTION can be either 'login' or 'logout'. Only around 20% of the logins have an accompanying logout row. For those that do, I want to calculate the average duration.
I'm thinking of something like
SELECT AVG(LL2.TIMESTAMP - LL1.TIMESTAMP)
FROM LOGIN_LOG LL1
INNER JOIN LOGIN_LOG LL2
    ON LL1.PLAYER = LL2.PLAYER AND LL2.TIMESTAMP > LL1.TIMESTAMP
LEFT JOIN LOGIN_LOG LL3
    ON LL3.PLAYER = LL1.PLAYER
    AND LL3.TIMESTAMP BETWEEN LL1.TIMESTAMP + 1 AND LL2.TIMESTAMP - 1
    AND LL3.ACTION = 'login'
WHERE LL1.ACTION = 'login' AND LL2.ACTION = 'logout' AND LL3.ID IS NULL
is this the best way to do it, or is there one more efficient?
Given the data you have, there probably isn't anything much faster you can do because you have to look at a LOGIN and a LOGOUT record, and ensure there is no other LOGIN (or LOGOUT?) record for the same user between the two.
Alternatively, find a way to ensure that a disconnect records a logout, so that the data is complete (instead of 20% complete). However, the query probably still has to ensure that the criteria are all met, so it won't help the query all that much.
If you can get the data into a format where the LOGIN and corresponding LOGOUT times are both in the same record, then you can simplify the query immensely. I'm not clear if the SessionManager does that for you.
Do you have a SessionManager type object that can timeout sessions? Because a timeout could be logged there, and you could get the last activity time from that and the timeout period.
Or you log all activity on the website/service, and thus you can query website/service visit duration directly, and see what activities they performed. For a website, Apache log analysers can probably generate the required stats.
I agree with JeeBee, but another advantage to a SessionManager type object is that you can handle the sessionEnd event and write a logout row with the active time in it. This way you would likely go from 20% accompanying logout rows to 100% accompanying logout rows. Querying for the activity time would then be trivial and consistent for all sessions.
If only 20% of your users actually log out, this search will not give you a very accurate duration for each session. A better way to gauge the average session length would be to take the average time between actions, or the average time per page; this can then be multiplied by the average number of pages/actions per visit to give a more accurate figure.
Additionally, you can determine avg. time for each page, and then get your session end time = session time to that point + avg time spent on their last page. This will give you a much more fine-grained(and accurate) measure of time spent per session.
Regarding the given SQL: it seems more complicated than you really need. This sort of statistical operation can often be handled better, and more maintainably, in code external to the database, where you have the full power of whichever language you choose rather than SQL's rather convoluted facilities for statistical calculations.
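As an illustration of handling this outside the database, here is a rough Python sketch that pairs each login with the next logout for the same player and averages only the complete sessions (timestamps assumed to be in seconds; this mirrors the query's intent, not its exact join logic):

```python
def average_session(events):
    """Pair each 'login' with the next 'logout' for the same player and
    average the durations of sessions that actually ended.
    events: list of (player, timestamp, action) tuples."""
    open_logins = {}
    durations = []
    for player, ts, action in sorted(events, key=lambda e: e[1]):
        if action == "login":
            open_logins[player] = ts  # a newer login discards any open one
        elif action == "logout" and player in open_logins:
            durations.append(ts - open_logins.pop(player))
    return sum(durations) / len(durations) if durations else None

log = [("a", 0, "login"), ("a", 100, "logout"),
       ("b", 10, "login"),                      # no logout: ignored
       ("a", 200, "login"), ("a", 500, "logout")]
print(average_session(log))  # → 200.0
```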