$queryResults = $bigQuery->runQuery("
    SELECT
      date,
      (value/10)+273 AS temperature
    FROM
      [bigquery-public-data:ghcn_d.ghcnd_$year]
    WHERE
      id = '{$station['id']}' AND
      element LIKE '%$element%'
    ORDER BY
      date,
      temperature"
);
I am calling this query and iterating over the years for each station. It gets through one or two stations, and then I get a
killed
in my output and the process is halted.
Is it possible that the queries are not closing? I am looking for a way to close the query, but it appears that it should close itself.
Any ideas?
I have a table of events, some date/times, and revision numbers for each event. The number of revisions varies. Most events open, then things happen to cause revisions, then they close. Sometimes events open, things happen, the event closes, then it re-opens using the same event number, things happen . . .
There is no available table of events that tells me "at this time the event opened", "at this time the event was closed", "at this time the event was re-opened", etc. That would be too easy.
The same event can open and close multiple times.
I want to create a report where a user can pick some date/time parameters and see what events are active (open) at that moment, or between two date/times. Here is an example of both an event that opens and closes three times, and an event that only opens and closes once, and what I desire for an output. The results below are stored in a temp table called #CEvents.
It looks to me like the English version of the question is something like this:

If all of an event's ClosedDateTime values are NULL:
- The Created or reopened DT = the RevDateTime value of the MIN(rev_num);
- The Closed DT = the RevDateTime value of the MAX(rev_num).

If an event has one or more non-NULL ClosedDateTime values:
- A Created or reopened DT = the RevDateTime value of the MIN(rev_num);
- A Closed DT = the RevDateTime value for the row previous to the row where the ClosedDateTime is not NULL;
- A Created or reopened DT = the RevDateTime value for the row right after the row where the ClosedDateTime is not NULL;
- Repeat for each instance where there is a ClosedDateTime value;
- Finish up with a Closed DT = the RevDateTime value of the MAX(rev_num).
I cannot figure out how to write the SQL for this question. Or, perhaps you have a better idea? Thanks in advance.
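In Python terms (not SQL, and with a made-up data shape: each revision as a (rev_num, RevDateTime, ClosedDateTime) tuple, ClosedDateTime being None when absent), the rules I described above would look roughly like this, untested against the real data:

```python
from datetime import datetime

def open_close_spans(revisions):
    """Turn an event's revisions into (created_or_reopened_dt, closed_dt) pairs,
    following the English rules above.  Each revision is a
    (rev_num, rev_datetime, closed_datetime) tuple; closed_datetime is None
    for rows with no ClosedDateTime."""
    rows = sorted(revisions)                  # order by rev_num
    spans = []
    opened = rows[0][1]                       # Created DT = RevDateTime of MIN(rev_num)
    for i, (_, rev_dt, closed_dt) in enumerate(rows):
        if closed_dt is not None:
            # Closed DT = RevDateTime of the row previous to the closing row
            spans.append((opened, rows[i - 1][1] if i > 0 else rev_dt))
            # Reopened DT = RevDateTime of the row right after the closing row
            opened = rows[i + 1][1] if i + 1 < len(rows) else None
    if opened is not None:
        # Finish up with Closed DT = RevDateTime of MAX(rev_num)
        spans.append((opened, rows[-1][1]))
    return spans

revs = [
    (1, datetime(2020, 1, 1, 8), None),                      # opens
    (2, datetime(2020, 1, 1, 9), datetime(2020, 1, 1, 9)),   # closes
    (3, datetime(2020, 1, 1, 10), None),                     # re-opens
    (4, datetime(2020, 1, 1, 11), None),
]
print(open_close_spans(revs))
```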
I solved this to my satisfaction by building a temp table containing the columns I wanted and updating the table with a series of INSERT queries.
Each event's first rev_num was given an INT 'Status' value of 0, meaning open.
Since there was a ClosedDT field, a 'Status' value of -1 (closed) was added for each row that had one.
To get the "I'm back open again" values for events with more than one open/close pair, I used a query that joined the table to itself, matching each row's rev_num to the copy's rev_num + 1, thereby retrieving the timestamp of the row after the ClosedDT, which is the re-opened datetime:
INSERT INTO #CEvents2
SELECT a.ag_id, a.dgroup, a.num_1, a.RevDT TimeStamp, Status = 0, RowNum = NULL, a.eid
FROM #CEvents a
LEFT JOIN #CEvents b on a.num_1 = b.Num_1 and a.rev_num = b.rev_num + 1
WHERE b.ClosedDT is NOT NULL
To get a final closed date/time I used a summary table containing the earliest and latest date/time for each event, choosing the field with the latest date/time to insert as the timestamp.
I added one more field, RowNum, partitioned by the event number and ordered by the timestamp, which resulted in a table that shows the chronology of each event, with each row flagged 0 (I'm open) or -1 (I'm closed).
It just remains for the report user to pick a set of dates to see the chronology, or to pick a time and see which events are in a 0 status at that very moment.
I opted to forgo putting the results into a side-by-side table showing a column for Open and one for Close. Instead each row has the open/close INT values of 0 or -1.
I want to perform stream calculations on a super table with 100,000 sub-tables in TDengine, with a time window of 30 minutes and a slide of 5 minutes. However, the system gets stuck after a while.
My stream creation statement is:
create stream calctemp_stream into calctemp_stream_output_stb as
select
  _wstart as start,
  _wend as wend,
  last(temperature) - first(temperature) as difftemperature,
  last(temperature) as lasttemperature,
  last(ts) as lastts,
  first(temperature) as firsttemperature,
  last(alarm_type) as alarm_type,
  xl, atype, xh
from calctemp partition by xl, atype, xh interval(30m) sliding(5m);
xl, atype, and xh are tags of the supertable.
I don't know how to debug it.
I have this code:
#Name("Creating_hourly_measurement_Position_Stopper for line 2")
insert into CreateMeasurement
select
m.measurement.source as source,
current_timestamp().toDate() as time,
"Line2_Count_Position_Stopper_Measurement" as type,
{
"Line2_DoughDeposit2.Hourly_Count_Position_Stopper.value",
count(cast(getNumber(m, "Status.Sidestopper_positioning.value"), double)),
"Line2_DoughDeposit2.Hourly_Count_Position_Stopper.unit",
getString(m, "Status.Sidestopper_positioning.unit")
} as fragments
from MeasurementCreated.win:time(1 hours) m
where getNumber(m, "Status.Sidestopper_positioning.value") is not null
and cast(getNumber(m, "Status.Sidestopper_positioning.value"), int) = 1
and m.measurement.source.value = "903791"
output last every 1 hours;
but it seems to loop. I believe it's because each new measurement modifies this group, meaning it is constantly extending. This means that a recalculation is performed each time new data becomes available.
Is there a way to count the measurement or get the total of the measurements per hour or per day?
The stream it consumes is "MeasurementCreated" (see the from clause), and that isn't produced by any EPL, so one can safely say that this EPL by itself cannot possibly loop.
If you want to improve the EPL there is some information at this link: http://esper.espertech.com/release-8.2.0/reference-esper/html_single/index.html#processingmodel_basicfilter
By moving the where-clause text into a filter you can discard events early.
Doesn't the insert into CreateMeasurement then cause an event in MeasurementCreated?
I have an Access 2003 database which records fault-call help requests in a medium-sized organisation of around 200 users. Calls are logged (and appended to the database) via a Classic ASP page, and a team of systems administrators use a separate Classic ASP web page to view calls, provide a response, etc.
All calls are recorded in one table called tblFaultCall; its structure is below:
tblFaultCall
ID : Autonumber
strName
strPhone
dtmDateOpen : Date/Time (date call logged)
dtmDateClosed : Date/Time (date call closed)
dtmTime : Date/Time (time call logged)
strStatus (always 'Open', 'Pending' or 'Closed')
strCategory (always one of 10 categories, held as a list in tblCatgory, and used in lookup lists in the ASP web page)
strFaultDesc
strResolution
strCallOwner
dtmDatePending : Date/Time (date call set to pending, if it ever was)
For management, I need a way of easily creating a quarterly report which shows the following:

Calls received between dd/mm/yyyy and dd/mm/yyyy
----
Category     Calls received   Of which 'Closed'   Closed within 5 days   Closed within 14 days   Open   Pending
Category X   1052             950                 700                    200                     50     50
Category Y   65               60                  50                     5                       0      5
I need an easy way to do this. I need the manager to be able to enter the dates he wants, click a button, and have it all come up. I cannot work out how to create one query which gives all of this. It's easy to give just the categories and the number of Open calls, but I can't work out how to add a further column showing the number of Closed calls, or the number closed within x days, etc. I can create individual queries for the harder columns, but can't get it all together.
So, the options are:
Classic ASP - I think this would involve a lot of individual SQL queries for the calculated fields
An Access report?
Some kind of export to Excel?
VBA in Excel linking back to prepared queries in Access?
Any advice would be appreciated.
You should be able to get that data in one query. Try this one:
SELECT AllCalls.strCategory, CallsReceived, CallsClosed, ClosedWithin5Days, ClosedWithin14days, CallsOpen, CallsPending
FROM
((
SELECT strCategory,
Count(ID) AS CallsReceived,
Sum(IIF(strStatus='Closed',1,0)) AS CallsClosed,
Sum(IIF(strStatus='Open',1,0)) AS CallsOpen,
Sum(IIF(strStatus='Pending',1,0)) AS CallsPending
FROM tblFaultCall
WHERE dtmDateOpen BETWEEN #6/1/2014# and #6/30/2014#
GROUP BY strCategory
) AS AllCalls
LEFT JOIN
(
SELECT strCategory,
Count(ID) AS ClosedWithin5Days
FROM tblFaultCall
WHERE DateDiff("d", dtmDateOpen, dtmDateClosed) <=5
AND dtmDateOpen BETWEEN #6/1/2014# and #6/30/2014#
GROUP BY strCategory
) AS FiveDay ON AllCalls.strCategory=FiveDay.strCategory)
LEFT JOIN
(
SELECT strCategory,
Count(ID) AS ClosedWithin14Days
FROM tblFaultCall
WHERE DateDiff("d", dtmDateOpen, dtmDateClosed) BETWEEN 6 AND 14
AND dtmDateOpen BETWEEN #6/1/2014# and #6/30/2014#
GROUP BY strCategory
) AS FourteenDay ON AllCalls.strCategory=FourteenDay.strCategory
The classic ASP part should be very similar to your other pages: query the database, loop through the resulting data, output it to the screen. You would use the same approach if you were generating a spreadsheet too.
Each column can be calculated, mostly with iif statements:
Total calls = count(calls)
Closed calls = sum(iif(<call is closed>,1,0)) (however you define <call is closed>)
Closed in 5 days = sum(iif(<call is closed in 5 days>,1,0))
and so on
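If you want to try the conditional-aggregation pattern outside Access, here is a minimal sketch using Python's built-in sqlite3 (CASE WHEN plays the role of Access's IIF; the data and the 5-day cutoff are made up for illustration, and the 14-day column is left out for brevity):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tblFaultCall (
    ID INTEGER PRIMARY KEY,
    strCategory TEXT, strStatus TEXT,
    dtmDateOpen TEXT, dtmDateClosed TEXT
);
INSERT INTO tblFaultCall (strCategory, strStatus, dtmDateOpen, dtmDateClosed) VALUES
  ('Hardware', 'Closed',  '2014-06-02', '2014-06-04'),
  ('Hardware', 'Closed',  '2014-06-03', '2014-06-20'),
  ('Hardware', 'Open',    '2014-06-10', NULL),
  ('Software', 'Pending', '2014-06-05', NULL);
""")

# One pass over the table: each column is a conditional count.
rows = conn.execute("""
    SELECT strCategory,
           COUNT(ID)                                             AS CallsReceived,
           SUM(CASE WHEN strStatus = 'Closed' THEN 1 ELSE 0 END) AS CallsClosed,
           SUM(CASE WHEN strStatus = 'Closed'
                     AND julianday(dtmDateClosed) - julianday(dtmDateOpen) <= 5
                    THEN 1 ELSE 0 END)                           AS ClosedWithin5Days,
           SUM(CASE WHEN strStatus = 'Open'    THEN 1 ELSE 0 END) AS CallsOpen,
           SUM(CASE WHEN strStatus = 'Pending' THEN 1 ELSE 0 END) AS CallsPending
    FROM tblFaultCall
    WHERE dtmDateOpen BETWEEN '2014-06-01' AND '2014-06-30'
    GROUP BY strCategory
    ORDER BY strCategory
""").fetchall()
for r in rows:
    print(r)
```

Because every column is an aggregate over the same filtered rows, a single GROUP BY query produces the whole report row per category, with no self-joins needed for the status counts.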
Hi, I am new to programming and Rails, and I am trying to create an admin interface in my app that shows stats. I have a Job model that has many Responses, and I need to collect the average response time for each day. To collect the response time for the first job I would do the following:
job = Job.first
response = job.responses.first
response_time = response.created_at - job.created_at
This is very simple, but I am having trouble collecting this information for all the jobs of a given day. I'm trying to come up with a solution that will give me an array of data pairs, for example {[June 17, 51s], [June 18, 60s], [June 19, 38s], ... etc}.
I can't seem to figure out the correct Rails Active Record call that will give me what I need.
I don't think you are going to find an Active Record solution, but you have what you need; you just need to add a little Ruby.
There are probably a hundred ways to do it; here is one that creates a hash with the number of whole days from the job creation date as the key and the count as the value:
job = Job.first
start_date = job.created_at
response_dates = job.responses.pluck(:created_at) #creates an array of created_at datetimes
day_stats = response_dates.each_with_object(Hash.new(0)) { |dt, h| h[((dt - start_date)/1.day).round(0)] += 1 }
This basically iterates through the datetime array, subtracts the job's creation date from each response date, divides the difference by 1 day, and rounds it to a whole number of days.
Output would be something like:
=> {0=>1, 5=>2, 6=>1, 7=>2, 9=>1, 31=>1, 37=>6, 40=>1, 42=>3, 44=>1, 59=>32, 60=>59, 61=>2, 64=>1, 65=>2, 78=>168, 97=>39, 93=>2, 110=>1, 214=>1}
If you want the actual date, you could add key * 1.day to the start_date.
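The same grouping idea extends from counts to the averages the question asked for. Here is a quick sketch in Python (the Ruby equivalent with group_by would look much the same), using made-up (job created_at, first response created_at) pairs:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical sample data: (job created_at, first response created_at) pairs.
pairs = [
    (datetime(2015, 6, 17, 9, 0, 0),  datetime(2015, 6, 17, 9, 0, 40)),
    (datetime(2015, 6, 17, 10, 0, 0), datetime(2015, 6, 17, 10, 1, 2)),
    (datetime(2015, 6, 18, 8, 0, 0),  datetime(2015, 6, 18, 8, 1, 0)),
]

# Group response times (in seconds) by the job's creation date, then average.
by_day = defaultdict(list)
for job_at, resp_at in pairs:
    by_day[job_at.date()].append((resp_at - job_at).total_seconds())

averages = {day: sum(v) / len(v) for day, v in sorted(by_day.items())}
for day, avg in averages.items():
    print(day, f"{avg:.0f}s")
```

In Rails you would build the pairs with something like a pluck over jobs joined to their first responses, but the per-day averaging itself is plain in-memory grouping.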