Find frequency of data in SQL Server 2014 by date and time

So here is the question.
I have a table FacebookInfo with a column name.
Another table, FacebookPost, has a column created_time and a foreign key facebookinfoid mapped to the FacebookInfo Id column.
So basically FacebookInfo holds the Facebook pages and FacebookPost holds the posts of those pages.
What I want to find out is how frequently the pages post on Facebook: the average number of posts per day, the difference in hours between those posts, the average time of the first post of a day, and the average time of the last post of a day.
Thanks for the help.
Here is some sample data:
FacebookInfo
id name
3 Qatar Airways
4 KLM Royal Dutch Airlines
5 LATAM Airlines
6 Southwest Airlines
FacebookPost
id facebookinfoid created_time
777 3 2016-12-06 12:54:31.000
778 3 2016-12-05 09:54:09.000
779 3 2016-12-02 12:40:46.000
780 3 2016-12-01 13:00:00.000
781 3 2016-11-30 11:29:53.000
782 3 2016-11-30 09:00:00.000
783 3 2016-11-29 10:09:45.000
784 3 2016-11-28 14:00:00.000
785 3 2016-11-27 11:21:11.000
786 3 2016-11-26 12:00:01.000
787 3 2016-11-25 11:58:55.000
788 3 2016-11-24 10:28:19.000
789 3 2016-11-23 16:20:29.000
790 3 2016-11-23 11:19:42.000
791 3 2016-11-21 12:03:07.000
792 3 2016-11-18 13:36:41.000
793 3 2016-11-17 11:08:41.000
794 3 2016-11-16 12:01:00.000
795 3 2016-11-15 13:39:06.000
796 3 2016-11-11 15:11:56.000
1454 4 2016-12-06 15:00:22.000
1455 4 2016-12-05 14:59:04.000
1456 4 2016-12-05 09:00:07.000
1457 4 2016-12-04 15:00:07.000
1458 4 2016-12-03 10:00:08.000
1459 4 2016-12-02 15:00:15.000
1460 4 2016-12-01 14:00:00.000
1461 4 2016-11-30 13:30:24.000
1462 4 2016-11-29 15:00:07.000
1463 4 2016-11-28 15:00:19.000
1464 4 2016-11-28 09:00:09.000
1465 4 2016-11-26 10:00:06.000
1466 4 2016-11-24 15:00:04.000
1467 4 2016-11-23 09:00:09.000
1468 4 2016-11-22 15:01:04.000
1469 4 2016-11-21 15:00:07.000
1470 4 2016-11-21 05:00:10.000
1471 4 2016-11-19 10:00:07.000
1472 4 2016-11-18 09:00:10.000
1473 4 2016-11-17 15:00:01.000
2454 5 2016-12-05 16:00:01.000
2455 5 2016-12-02 16:02:37.000
2456 5 2016-12-01 16:00:09.000
2457 5 2016-11-30 16:00:48.000
2458 5 2016-11-29 16:01:34.000
2459 5 2016-11-28 16:00:00.000
2460 5 2016-11-25 16:00:01.000
2461 5 2016-11-23 16:00:00.000
2462 5 2016-11-22 16:00:00.000
2463 5 2016-11-21 16:00:00.000
2464 5 2016-11-19 16:00:03.000
2465 5 2016-11-18 16:00:00.000
2466 5 2016-11-17 16:00:01.000
2467 5 2016-11-16 16:00:03.000
2468 5 2016-11-15 16:00:01.000
2469 5 2016-11-12 16:00:00.000
2470 5 2016-11-11 16:00:00.000
2471 5 2016-11-10 16:00:01.000
2472 5 2016-11-09 16:00:00.000
2473 5 2016-11-08 16:00:02.000
3059 6 2016-12-06 15:14:30.000
3060 6 2016-12-04 21:38:33.000
3061 6 2016-12-03 22:27:40.000
3062 6 2016-12-02 21:29:42.000
3063 6 2016-12-01 23:00:04.000
3064 6 2016-11-30 22:00:02.000
3065 6 2016-11-30 20:28:17.000
3066 6 2016-11-29 17:57:02.000
3067 6 2016-11-28 20:49:59.000
3068 6 2016-11-26 17:10:55.000
3069 6 2016-11-26 12:50:45.000
3070 6 2016-11-25 21:16:31.000
3071 6 2016-11-25 01:27:09.000
3072 6 2016-11-24 15:50:16.000
3073 6 2016-11-23 22:00:01.000
3074 6 2016-11-23 15:10:32.000
3075 6 2016-11-22 21:42:42.000
3076 6 2016-11-22 16:29:28.000
3077 6 2016-11-22 03:03:21.000
3078 6 2016-11-22 01:45:41.000
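No answer is shown for this question, but the metrics being asked for can be stated precisely. As an illustration only (plain Python rather than T-SQL, with a hypothetical posting_stats helper and a hand-copied subset of page id 3's rows), one way to define them:

```python
from collections import defaultdict
from datetime import datetime
from statistics import mean

# Hand-copied subset of the FacebookPost rows for page id 3.
times = [
    "2016-12-06 12:54:31", "2016-12-05 09:54:09", "2016-12-02 12:40:46",
    "2016-12-01 13:00:00", "2016-11-30 11:29:53", "2016-11-30 09:00:00",
]

def hour_of(t):
    # Clock time as fractional hours since midnight.
    return t.hour + t.minute / 60 + t.second / 3600

def posting_stats(times):
    ts = sorted(datetime.strptime(t, "%Y-%m-%d %H:%M:%S") for t in times)
    # Average posts per day over the span covered by the data.
    span_days = (ts[-1].date() - ts[0].date()).days + 1
    avg_per_day = len(ts) / span_days
    # Average gap in hours between consecutive posts.
    avg_gap_hours = mean((b - a).total_seconds() / 3600 for a, b in zip(ts, ts[1:]))
    # Average clock time of the first and of the last post of each day.
    by_day = defaultdict(list)
    for t in ts:
        by_day[t.date()].append(t)
    avg_first = mean(hour_of(min(v)) for v in by_day.values())
    avg_last = mean(hour_of(max(v)) for v in by_day.values())
    return avg_per_day, avg_gap_hours, avg_first, avg_last

stats = posting_stats(times)
```

In SQL the same four numbers would come from grouping on CAST(created_time AS date) per facebookinfoid; the sketch above only pins down what each average means.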


Add Rank column to pandas DataFrame based on column condition

What I have is below.
DOG      Date       Steps
Tiger    2021-11-01 164
Oakley   2021-11-01 76
Piper    2021-11-01 65
Millie   2021-11-01 188
Oscar    2021-11-02 152
Foster   2021-11-02 191
Zeus     2021-11-02 101
Benji    2021-11-02 94
Lucy     2021-11-02 186
Rufus    2021-11-02 65
Hank     2021-11-03 98
Olive    2021-11-03 122
Ellie    2021-11-03 153
Thor     2021-11-03 152
Nala     2021-11-03 181
Mia      2021-11-03 48
Bella    2021-11-03 23
Izzy     2021-11-03 135
Pepper   2021-11-03 22
Diesel   2021-11-04 111
Dixie    2021-11-04 34
Emma     2021-11-04 56
Abbie    2021-11-04 32
Guinness 2021-11-04 166
Kobe     2021-11-04 71
What I want is below: a rank by the value of the 'Steps' column within each Date.
DOG      Date       Steps Rank
Tiger    2021-11-01 164   2
Oakley   2021-11-01 76    3
Piper    2021-11-01 65    4
Millie   2021-11-01 188   1
Oscar    2021-11-02 152   3
Foster   2021-11-02 191   1
Zeus     2021-11-02 101   4
Benji    2021-11-02 94    5
Lucy     2021-11-02 186   2
Rufus    2021-11-02 65    6
Hank     2021-11-03 98    6
Olive    2021-11-03 122   5
Ellie    2021-11-03 153   2
Thor     2021-11-03 152   3
Nala     2021-11-03 181   1
Mia      2021-11-03 48    7
Bella    2021-11-03 23    8
Izzy     2021-11-03 135   4
Pepper   2021-11-03 22    9
Diesel   2021-11-04 111   2
Dixie    2021-11-04 34    5
Emma     2021-11-04 56    4
Abbie    2021-11-04 32    6
Guinness 2021-11-04 166   1
Kobe     2021-11-04 71    3
I tried the following, but it failed:
df['Rank'] = df.groupby('Date')['Steps'].rank(ascending=False)
Your solution basically works for me; you may just need method='dense' and a cast to integers:
df['Rank'] = df.groupby('Date')['Steps'].rank(ascending=False, method='dense').astype(int)
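For reference, method='dense' means equal values share a rank and the ranks have no gaps. A quick pure-Python illustration (no pandas needed) of descending dense ranking within one date group:

```python
# Minimal illustration of what method='dense' ranking does within one group
# (here, the 2021-11-01 rows), without pandas.
def dense_rank_desc(values):
    # Rank 1 = largest value; equal values share a rank, ranks have no gaps.
    order = {v: i + 1 for i, v in enumerate(sorted(set(values), reverse=True))}
    return [order[v] for v in values]

steps = [164, 76, 65, 188]     # Tiger, Oakley, Piper, Millie on 2021-11-01
print(dense_rank_desc(steps))  # [2, 3, 4, 1]
```

The groupby('Date') in the pandas answer simply applies this within each date separately.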

Update fields in table based on previous field records

I have a table of records for pay periods, with WeekStart and WeekEnd fields populated for every fiscal year. In my application a user should be able to update the first WeekEnd date for a given fiscal year; that should in turn update each subsequent record's WeekStart date by adding 1 day to the previous record's WeekEnd date, and the same record's WeekEnd date by adding 13 days to the new WeekStart date.
This is part of a Stored Procedure written in SQL Server 2016.
UPDATE [dbo].[staffing_BiweeklyPPCopy]
SET
[WeekStart] = DATEADD(DD, 1, LAG([WeekEnd], 1) OVER (ORDER BY [ID])),
[WeekEnd] = DATEADD(DD, 14, LAG([WeekEnd], 1) OVER (ORDER BY [ID]))
WHERE
[FiscalYear] = @fiscalyear
Original Table contents shown below...
ID WeekStart WeekEnd
163 2018-10-01 2018-10-13
164 2018-10-14 2018-10-27
165 2018-10-28 2018-11-10
166 2018-11-11 2018-11-24
167 2018-11-25 2018-12-08
168 2018-12-09 2018-12-22
169 2018-12-23 2019-01-05
170 2019-01-06 2019-01-19
171 2019-01-20 2019-02-02
172 2019-02-03 2019-02-16
173 2019-02-17 2019-03-02
174 2019-03-03 2019-03-16
175 2019-03-17 2019-03-30
176 2019-03-31 2019-04-13
177 2019-04-14 2019-04-27
178 2019-04-28 2019-05-11
179 2019-05-12 2019-05-25
180 2019-05-26 2019-06-08
181 2019-06-09 2019-06-22
182 2019-06-23 2019-07-06
183 2019-07-07 2019-07-20
184 2019-07-21 2019-08-03
185 2019-08-04 2019-08-17
186 2019-08-18 2019-08-31
187 2019-09-01 2019-09-14
188 2019-09-15 2019-09-28
189 2019-09-29 2019-09-30
For example, if a user updates the WeekEnd date for record ID 163 to '2018-10-14', the table should update as follows:
ID WeekStart WeekEnd
163 2018-10-01 2018-10-14
164 2018-10-15 2018-10-28
165 2018-10-29 2018-11-11
166 2018-11-12 2018-11-25
167 2018-11-26 2018-12-09
...
189 2019-09-30 2019-09-30
Thank you in advance.
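No answer appears here, but it is worth making the recurrence explicit: each row's new WeekStart depends on the previous row's *new* WeekEnd, so the change cascades down the table, which is exactly what makes a single pass with LAG awkward. As a sketch of that recurrence only (hedged Python with a hypothetical recompute_periods helper, not the stored-procedure fix):

```python
from datetime import date, timedelta

def recompute_periods(n_periods, first_start, first_end):
    # After a user edits the first period's end date, every later period's
    # start is the previous end + 1 day, and its end is its start + 13 days.
    periods = [(first_start, first_end)]
    for _ in range(n_periods - 1):
        start = periods[-1][1] + timedelta(days=1)
        periods.append((start, start + timedelta(days=13)))
    return periods

# User moves record 163's WeekEnd to 2018-10-14; the next periods shift:
rows = recompute_periods(3, date(2018, 10, 1), date(2018, 10, 14))
```

This reproduces the expected rows for IDs 163-165 (2018-10-15/2018-10-28, then 2018-10-29/2018-11-11); in T-SQL the same effect typically needs a recursive CTE or an iterative update rather than LAG in a single UPDATE.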

MSSQL MAX returns all results?

I have tried the following query to return the highest P.MaxValue for each ME.Name from the last day, between 06:00 and 18:00:
SELECT MAX(P.MaxValue) AS Value, P.DateTime, ME.Name AS ID
FROM vManagedEntity AS ME
INNER JOIN Perf.vPerfHourly AS P ON ME.ManagedEntityRowId = P.ManagedEntityRowId
INNER JOIN vPerformanceRuleInstance AS PRI ON P.PerformanceRuleInstanceRowId = PRI.PerformanceRuleInstanceRowId
INNER JOIN vPerformanceRule AS PR ON PRI.RuleRowId = PR.RuleRowId
WHERE ME.ManagedEntityTypeRowId = 2546
  AND PR.ObjectName = 'VMGuest-cpu'
  AND PR.CounterName LIKE 'cpuUsageMHz'
  AND CAST(P.DateTime AS time) >= '06:00:00'
  AND CAST(P.DateTime AS time) <= '18:00:00'
  AND P.DateTime > DATEADD(day, -1, GETUTCDATE())
GROUP BY ME.Name, P.DateTime
ORDER BY ID
But it seems to return every hourly MaxValue for each ID instead of just the highest, like:
Value DateTime ID
55 2018-02-19 12:00:00.000 bob:vm-100736
51 2018-02-19 13:00:00.000 bob:vm-100736
53 2018-02-19 14:00:00.000 bob:vm-100736
52 2018-02-19 15:00:00.000 bob:vm-100736
52 2018-02-19 16:00:00.000 bob:vm-100736
51 2018-02-19 17:00:00.000 bob:vm-100736
54 2018-02-19 18:00:00.000 bob:vm-100736
51 2018-02-20 06:00:00.000 bob:vm-100736
51 2018-02-20 07:00:00.000 bob:vm-100736
53 2018-02-20 08:00:00.000 bob:vm-100736
52 2018-02-20 09:00:00.000 bob:vm-100736
78 2018-02-19 12:00:00.000 bob:vm-101
82 2018-02-19 13:00:00.000 bob:vm-101
79 2018-02-19 14:00:00.000 bob:vm-101
78 2018-02-19 15:00:00.000 bob:vm-101
79 2018-02-19 16:00:00.000 bob:vm-101
77 2018-02-19 17:00:00.000 bob:vm-101
82 2018-02-19 18:00:00.000 bob:vm-101
82 2018-02-20 06:00:00.000 bob:vm-101
79 2018-02-20 07:00:00.000 bob:vm-101
81 2018-02-20 08:00:00.000 bob:vm-101
82 2018-02-20 09:00:00.000 bob:vm-101
155 2018-02-19 12:00:00.000 bob:vm-104432
There is one value per hour for each ID, hence twelve results per ID.
Does MAX not work the way I want?
Thanks.
The expected output is like this:
Value DateTime ID
55 2018-02-19 12:00:00.000 bob:vm-100736
82 2018-02-19 13:00:00.000 bob:vm-101
etc
If you group by both the datetime and the id, you'll get a row for every (datetime, id) pair; it's that simple.
If you don't need the exact time, you can group by the date only:
SELECT MAX(P.MaxValue) AS Value, cast(P.DateTime as date) as dat, ME.Name AS ID
...
group by ME.Name, cast(P.DateTime as date)
Or, if you do need the exact time, you can use a NOT EXISTS clause instead of GROUP BY.
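To illustrate the NOT EXISTS alternative on a toy version of the data (a sketch using Python's built-in sqlite3 with made-up table and column names, not the original schema):

```python
import sqlite3

# Keep a row only if no other row for the same ID has a larger value;
# this preserves the DateTime of the winning row, which GROUP BY cannot.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE perf (id TEXT, dt TEXT, value INTEGER)")
con.executemany("INSERT INTO perf VALUES (?,?,?)", [
    ("bob:vm-100736", "2018-02-19 12:00", 55),
    ("bob:vm-100736", "2018-02-19 13:00", 51),
    ("bob:vm-101",    "2018-02-19 12:00", 78),
    ("bob:vm-101",    "2018-02-19 13:00", 82),
])
rows = con.execute("""
    SELECT p.value, p.dt, p.id
    FROM perf AS p
    WHERE NOT EXISTS (
        SELECT 1 FROM perf AS q
        WHERE q.id = p.id AND q.value > p.value
    )
    ORDER BY p.id
""").fetchall()
print(rows)  # [(55, '2018-02-19 12:00', 'bob:vm-100736'), (82, '2018-02-19 13:00', 'bob:vm-101')]
```

Note that a tie on the maximum value would return more than one row per ID; add a tiebreaker on dt if that matters.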

SQL Aggregate Woes

I have this code:
SELECT q.HospitalNumber, q.Patient_Name,
q.[Date/Time] as DATE_OF_GCS, --convert(varchar(5),q.[Date/Time],108) as TIME_OF_GCS,
max(q.[GCS_COUNT]) as BEST_GCS
From
(
Select pat.HospitalNumber, pat.FirstName + ' ' + pat.LastName as Patient_Name, ts.Time as [Date/Time], sum(pt.value) as [GCS_COUNT]
from ParametersText pt INNER JOIN TextSignals ts ON ts.TextID = pt.TextID AND ts.ParameterID = pt.ParameterID
INNER JOIN Patients pat ON pat.patientID = ts.PatientID
WHERE ts.ParameterID = 21654 or ts.ParameterID = 21655 or ts.ParameterID = 21656
GROUP BY pat.HospitalNumber, pat.FirstName, pat.LastName, ts.PatientID, ts.Time
) q
GROUP BY q.HospitalNumber, q.Patient_Name, q.[Date/Time]
--,convert(varchar(5),q.[Date/Time],108)
But all I'm after is the highest BEST_GCS per day per patient, and I need the time as well. The GCS can be recorded many times a day, and the same score can occur several times. Any help would be most appreciated.
Thanks very much in advance.
This is on SQL Server (T-SQL).
Here is a snippet of the data this query returns:
patientID Patient_Name DATE_OF_GCS GCS
442 patient name 2014-02-13 16:00:00.000 15
442 patient name 2014-02-13 18:00:00.000 15
442 patient name 2014-02-13 20:00:00.000 15
442 patient name 2014-02-14 00:00:00.000 15
442 patient name 2014-02-14 04:00:00.000 15
442 patient name 2014-02-14 05:00:00.000 15
442 patient name 2014-02-14 06:00:00.000 15
442 patient name 2014-02-14 08:00:00.000 15
442 patient name 2014-02-14 12:00:00.000 15
442 patient name 2014-02-14 16:00:00.000 15
442 patient name 2014-02-14 17:00:00.000 15
442 patient name 2014-02-14 20:00:00.000 15
442 patient name 2014-02-14 23:00:00.000 15
442 patient name 2014-02-15 00:00:00.000 15
442 patient name 2014-02-15 02:00:00.000 15
442 patient name 2014-02-15 05:00:00.000 15
442 patient name 2014-02-15 08:00:00.000 15
442 patient name 2014-02-15 12:00:00.000 15
442 patient name 2014-02-15 15:00:00.000 11
442 patient name 2014-02-15 16:00:00.000 15
442 patient name 2014-02-15 17:00:00.000 15
442 patient name 2014-02-15 18:00:00.000 15
442 patient name 2014-02-15 20:00:00.000 15
442 patient name 2014-02-16 00:00:00.000 15
442 patient name 2014-02-16 02:00:00.000 15
442 patient name 2014-02-16 05:00:00.000 15
442 patient name 2014-02-16 08:00:00.000 15
442 patient name 2014-02-16 20:00:00.000 11
442 patient name 2014-02-16 20:51:00.000 4
442 patient name 2014-02-17 01:00:00.000 15
442 patient name 2014-02-17 02:00:00.000 15
442 patient name 2014-02-17 04:00:00.000 15
442 patient name 2014-02-17 06:00:00.000 15
442 patient name 2014-02-17 08:00:00.000 15
442 patient name 2014-02-17 10:00:00.000 15
442 patient name 2014-02-17 15:00:00.000 15
442 patient name 2014-02-17 18:00:00.000 15
442 patient name 2014-02-17 20:00:00.000 15
442 patient name 2014-02-18 00:00:00.000 15
442 patient name 2014-02-18 04:00:00.000 15
442 patient name 2014-02-18 08:00:00.000 15
442 patient name 2014-02-18 12:00:00.000 15
442 patient name 2014-02-18 14:00:00.000 15
442 patient name 2014-02-18 15:00:00.000 15
442 patient name 2014-02-18 17:00:00.000 15
442 patient name 2014-02-18 19:00:00.000 15
442 patient name 2014-02-18 20:00:00.000 11
442 patient name 2014-02-19 02:00:00.000 15
442 patient name 2014-02-19 06:00:00.000 15
442 patient name 2014-02-19 09:00:00.000 15
471 patient name 2014-02-13 09:00:00.000 7
471 patient name 2014-02-13 11:00:00.000 7
471 patient name 2014-02-13 13:00:00.000 7
471 patient name 2014-02-13 15:00:00.000 8
471 patient name 2014-02-13 17:00:00.000 8
471 patient name 2014-02-13 19:00:00.000 7
471 patient name 2014-02-13 21:00:00.000 5
471 patient name 2014-02-13 22:00:00.000 5
471 patient name 2014-02-14 00:00:00.000 5
471 patient name 2014-02-14 02:00:00.000 5
471 patient name 2014-02-14 04:00:00.000 5
471 patient name 2014-02-14 06:00:00.000 5
471 patient name 2014-02-14 08:00:00.000 9
471 patient name 2014-02-14 10:00:00.000 6
471 patient name 2014-02-14 12:00:00.000 6
471 patient name 2014-02-14 14:00:00.000 8
471 patient name 2014-02-14 16:00:00.000 6
471 patient name 2014-02-14 18:00:00.000 6
471 patient name 2014-02-14 20:00:00.000 5
471 patient name 2014-02-14 22:00:00.000 6
471 patient name 2014-02-15 00:00:00.000 6
471 patient name 2014-02-15 02:00:00.000 6
471 patient name 2014-02-15 04:00:00.000 6
471 patient name 2014-02-15 06:00:00.000 6
471 patient name 2014-02-15 08:00:00.000 8
471 patient name 2014-02-15 09:00:00.000 6
471 patient name 2014-02-15 10:00:00.000 6
471 patient name 2014-02-15 11:00:00.000 3
471 patient name 2014-02-15 13:00:00.000 5
471 patient name 2014-02-15 14:00:00.000 3
471 patient name 2014-02-15 16:00:00.000 3
471 patient name 2014-02-15 18:00:00.000 3
471 patient name 2014-02-15 19:00:00.000 3
471 patient name 2014-02-15 21:00:00.000 3
471 patient name 2014-02-15 22:00:00.000 3
471 patient name 2014-02-16 00:00:00.000 3
471 patient name 2014-02-16 02:00:00.000 3
471 patient name 2014-02-16 02:30:00.000 3
471 patient name 2014-02-16 03:00:00.000 3
471 patient name 2014-02-16 06:00:00.000 3
471 patient name 2014-02-16 08:00:00.000 3
471 patient name 2014-02-16 12:00:00.000 5
471 patient name 2014-02-16 14:00:00.000 3
471 patient name 2014-02-16 18:00:00.000 3
471 patient name 2014-02-16 19:00:00.000 3
471 patient name 2014-02-16 21:00:00.000 3
472 patient name 2014-02-13 08:00:00.000 15
472 patient name 2014-02-13 12:00:00.000 15
472 patient name 2014-02-13 15:00:00.000 15
472 patient name 2014-02-13 19:00:00.000 15
472 patient name 2014-02-13 22:00:00.000 15
472 patient name 2014-02-14 03:00:00.000 15
472 patient name 2014-02-14 08:00:00.000 15
472 patient name 2014-02-14 14:00:00.000 15
472 patient name 2014-02-14 17:00:00.000 15
472 patient name 2014-02-14 19:00:00.000 15
472 patient name 2014-02-14 21:00:00.000 15
472 patient name 2014-02-14 23:00:00.000 15
472 patient name 2014-02-15 01:00:00.000 15
472 patient name 2014-02-15 05:00:00.000 14
472 patient name 2014-02-15 07:00:00.000 15
472 patient name 2014-02-15 08:00:00.000 15
472 patient name 2014-02-15 20:00:00.000 15
472 patient name 2014-02-15 22:00:00.000 15
472 patient name 2014-02-16 00:00:00.000 15
472 patient name 2014-02-16 03:00:00.000 15
472 patient name 2014-02-16 05:00:00.000 15
472 patient name 2014-02-16 07:00:00.000 15
472 patient name 2014-02-16 09:00:00.000 15
472 patient name 2014-02-16 12:00:00.000 15
472 patient name 2014-02-16 15:00:00.000 15
472 patient name 2014-02-16 18:00:00.000 15
472 patient name 2014-02-16 20:00:00.000 15
472 patient name 2014-02-16 22:00:00.000 15
472 patient name 2014-02-17 00:00:00.000 15
472 patient name 2014-02-17 02:00:00.000 15
472 patient name 2014-02-17 04:00:00.000 15
I figured it out, thanks anyway.
Here's the code; if there's any way it could be better, I'd love to hear it:
With t as
(
SELECT q.HospitalNumber, q.Patient_Name,
q.[Date/Time] as DATE_OF_GCS,
max(q.[GCS_COUNT]) as BEST_GCS
From
(
Select pat.HospitalNumber, pat.FirstName + ' ' + pat.LastName as Patient_Name, ts.Time as [Date/Time], sum(pt.value) as [GCS_COUNT]
from ParametersText pt INNER JOIN TextSignals ts ON ts.TextID = pt.TextID AND ts.ParameterID = pt.ParameterID
INNER JOIN Patients pat ON pat.patientID = ts.PatientID
WHERE ts.ParameterID = 21654 or ts.ParameterID = 21655 or ts.ParameterID = 21656
GROUP BY pat.HospitalNumber, pat.FirstName, pat.LastName, ts.PatientID, ts.Time
) q
GROUP BY q.HospitalNumber, q.Patient_Name, q.[Date/Time]
)
Select hospitalnumber, Date_Of_GCS, Best_GCS
FROM (
select t.hospitalnumber, t.Date_Of_GCS, t.Best_GCS, ROW_NUMBER() over (partition by t.hospitalnumber, cast(t.Date_Of_GCS as date) order by t.Best_GCS Desc ) as GCS
FROM t ) as p
where gcs = 1
Order By hospitalnumber, Date_Of_GCS
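The ROW_NUMBER() query above keeps, for each patient and calendar day, the row with the highest score while preserving its timestamp. A small pure-Python sketch of that same selection (toy subset of the data; ties broken arbitrarily, just like an ORDER BY with no tiebreaker):

```python
# For each (patient, calendar day), keep the row with the highest GCS.
rows = [  # (patient, 'YYYY-MM-DD HH:MM', gcs) -- toy subset of the data
    (442, "2014-02-16 08:00", 15),
    (442, "2014-02-16 20:00", 11),
    (442, "2014-02-16 20:51", 4),
    (471, "2014-02-16 12:00", 5),
    (471, "2014-02-16 14:00", 3),
]
best = {}
for patient, ts, gcs in rows:
    key = (patient, ts[:10])       # partition by patient and date
    if key not in best or gcs > best[key][2]:
        best[key] = (patient, ts, gcs)
result = sorted(best.values())
print(result)  # [(442, '2014-02-16 08:00', 15), (471, '2014-02-16 12:00', 5)]
```

If the same best score occurs several times in a day (as it does in the real data), adding a time column to the ORDER BY inside ROW_NUMBER() makes the choice deterministic, e.g. earliest occurrence first.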

Calculate average values for rows with different ids in MS Excel

The file contains information about products per day, and I need to calculate monthly average values for each product.
Source data looks like this:
A B C D
id date rating price
1 1 2014/01/01 2 20
2 1 2014/01/02 2 20
3 1 2014/01/03 2 20
4 1 2014/01/04 1 20
5 1 2014/01/05 1 20
6 1 2014/01/06 1 20
7 1 2014/01/07 1 20
8 3 2014/01/01 5 99
9 3 2014/01/02 5 99
10 3 2014/01/03 5 99
11 3 2014/01/04 5 99
12 3 2014/01/05 5 120
13 3 2014/01/06 5 120
14 3 2014/01/07 5 120
I need to get:
A B C D
id date rating price
1 1 1.42 20
2 3 5 108
How can I do that? I need an advanced formula or a VBScript.
Update: I have data for a long period, about two years. I need to calculate average values for each product for each week, and then for each month.
Source data example:
id date rating
4 2013-09-01 445
4 2013-09-02 446
4 2013-09-03 447
4 2013-09-04 448
4 2013-09-05 449
4 2013-09-06 450
4 2013-09-07 451
4 2013-09-08 452
4 2013-09-09 453
4 2013-09-10 454
4 2013-09-11 455
4 2013-09-12 456
4 2013-09-13 457
4 2013-09-14 458
4 2013-09-15 459
4 2013-09-16 460
4 2013-09-17 461
4 2013-09-18 462
4 2013-09-19 463
4 2013-09-20 464
4 2013-09-21 465
4 2013-09-22 466
4 2013-09-23 467
4 2013-09-24 468
4 2013-09-25 469
4 2013-09-26 470
4 2013-09-27 471
4 2013-09-28 472
4 2013-09-29 473
4 2013-09-30 474
4 2013-10-01 475
4 2013-10-02 476
4 2013-10-03 477
4 2013-10-04 478
4 2013-10-05 479
4 2013-10-06 480
4 2013-10-07 481
4 2013-10-08 482
4 2013-10-09 483
4 2013-10-10 484
4 2013-10-11 485
4 2013-10-12 486
4 2013-10-13 487
4 2013-10-14 488
4 2013-10-15 489
4 2013-10-16 490
4 2013-10-17 491
4 2013-10-18 492
4 2013-10-19 493
4 2013-10-20 494
4 2013-10-21 495
4 2013-10-22 496
4 2013-10-23 497
4 2013-10-24 498
4 2013-10-25 499
4 2013-10-26 500
4 2013-10-27 501
4 2013-10-28 502
4 2013-10-29 503
4 2013-10-30 504
4 2013-10-31 505
7 2013-09-01 1445
7 2013-09-02 1446
7 2013-09-03 1447
7 2013-09-04 1448
7 2013-09-05 1449
7 2013-09-06 1450
7 2013-09-07 1451
7 2013-09-08 1452
7 2013-09-09 1453
7 2013-09-10 1454
7 2013-09-11 1455
7 2013-09-12 1456
7 2013-09-13 1457
7 2013-09-14 1458
7 2013-09-15 1459
7 2013-09-16 1460
7 2013-09-17 1461
7 2013-09-18 1462
7 2013-09-19 1463
7 2013-09-20 1464
7 2013-09-21 1465
7 2013-09-22 1466
7 2013-09-23 1467
7 2013-09-24 1468
7 2013-09-25 1469
7 2013-09-26 1470
7 2013-09-27 1471
7 2013-09-28 1472
7 2013-09-29 1473
7 2013-09-30 1474
7 2013-10-01 1475
7 2013-10-02 1476
7 2013-10-03 1477
7 2013-10-04 1478
7 2013-10-05 1479
7 2013-10-06 1480
7 2013-10-07 1481
7 2013-10-08 1482
7 2013-10-09 1483
7 2013-10-10 1484
7 2013-10-11 1485
7 2013-10-12 1486
7 2013-10-13 1487
7 2013-10-14 1488
7 2013-10-15 1489
7 2013-10-16 1490
7 2013-10-17 1491
7 2013-10-18 1492
7 2013-10-19 1493
7 2013-10-20 1494
7 2013-10-21 1495
7 2013-10-22 1496
7 2013-10-23 1497
7 2013-10-24 1498
7 2013-10-25 1499
7 2013-10-26 1500
7 2013-10-27 1501
7 2013-10-28 1502
7 2013-10-29 1503
7 2013-10-30 1504
7 2013-10-31 1505
This is a job for a pivot table, and it takes about 30 seconds to do.
Update: as per your update, put the date into the Report Filter and modify to suit.
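For comparison, the aggregation the pivot table performs can be sketched in plain Python (using the first sample; the grouping key here is per product id, and swapping in (pid, d[:7]) would give the per-month grouping from the update):

```python
from collections import defaultdict
from statistics import mean

rows = [  # (id, date, rating, price) -- the first sample from the question
    (1, "2014/01/01", 2, 20), (1, "2014/01/02", 2, 20), (1, "2014/01/03", 2, 20),
    (1, "2014/01/04", 1, 20), (1, "2014/01/05", 1, 20), (1, "2014/01/06", 1, 20),
    (1, "2014/01/07", 1, 20),
    (3, "2014/01/01", 5, 99), (3, "2014/01/02", 5, 99), (3, "2014/01/03", 5, 99),
    (3, "2014/01/04", 5, 99), (3, "2014/01/05", 5, 120), (3, "2014/01/06", 5, 120),
    (3, "2014/01/07", 5, 120),
]
groups = defaultdict(list)
for pid, d, rating, price in rows:
    groups[pid].append((rating, price))   # use (pid, d[:7]) to group per month
averages = {pid: (round(mean(r for r, _ in vals), 2), mean(p for _, p in vals))
            for pid, vals in groups.items()}
```

This yields 1.43 / 20 for product 1 and 5 / 108 for product 3, matching the "Need to get" table up to rounding (10/7 is 1.4286, shown there truncated to 1.42).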