I am currently using Oracle 11g (11.1) and I need to create a query that takes the ID and CusCODE from Table_with_value and, using the ID, checks Table_with_status for rows that have an active Co_status but a different CusCODE.
This is what I have so far; obviously it does not work as it should unless the CusCODE and ID are provided manually:
SELECT wm_concat(CoID) AS active_CO_Status_for_same_ID_but_different_CusCODE
FROM Table_with_status
WHERE CoID IN (SELECT CoID
               FROM Table_with_status
               WHERE ID = Table_with_value.ID
                 AND CusCODE != Table_with_value.CusCODE)
  AND Co_status = 'active';
Table_with_value:
|CoID | CusCODE | ID | Value |
|--------|---------|----------|----|
|354223 | 1.432 | 0784296L | 99 |
|321232 | 4.212321.22 | 0432296L | 32 |
|938421 | 3.213 | 0021321L | 93 |
Table_with_status:
|CoID | CusCODE | ID | Co_status|
|--------|--------------|----------|--------|
|354223 | 1.432 | 0784296L | active|
|354232 | 1.432 | 0784296L | inactive |
|666698 | 1.47621 | 0784296L | active |
|666700 | 1.5217 | 0784296L | active |
|938421 | 3.213 | 0021321L | active |
|938422 | 3.213 | 0021321L | active |
|938423 | 3.213 | 0021321L | active |
|321232 | 4.212321.22 | 0432296L | active |
|321232 | 4.212321.22 | 0432296L | active |
|321232 | 1.689 | 0432296L | inactive |
Expected output:
|CoID    | active_CO_Status_for_same_ID_but_different_CusCODE | CusCODE     | ID       | Value |
|--------|-----------------------------------------------------|-------------|----------|-------|
|354223  | 666698,666700                                       | 1.432       | 0784296L | 99    |
|321232  | N/A                                                 | 4.212321.22 | 0432296L | 32    |
|938421  | N/A                                                 | 3.213       | 0021321L | 93    |
Any idea how this can be implemented, ideally without any PL/SQL for loops? Loops should also be fine, though, since the output dataset is expected to be fewer than 300 IDs.
I apologize in advance for the cryptic way I structured the question :) Let me know if something is not clear.
From your description and expected output, it looks like you need a left outer join, something like:
SELECT v.CoID,
       wm_concat(s.CoID) AS other_active_CusCODE, -- active_CO_Status_for_same_ID_but_different_CusCODE
       v.CusCODE,
       v.ID,
       v.value
FROM Table_with_value v
LEFT JOIN Table_with_status s
  ON s.ID = v.ID
  AND s.CusCODE != v.CusCODE
  AND s.Co_status = 'active'
GROUP BY v.CoID, v.CusCODE, v.ID, v.value;
The SQL Fiddle uses listagg() instead of the never-supported and now-removed wm_concat(), with a couple of different approaches in case the logic isn't quite what I interpreted. With your sample data they all get:
COID OTHER_ACTIVE_CUSCODE CUSCODE ID VALUE
------ -------------------- ----------- -------- -----
321232 (null) 4.212321.22 0432296L 32
354223 666698,666700 1.432 0784296L 99
938421 (null) 3.213 0021321L 93
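For reference, a listagg() version of the same join could look like the sketch below. listagg() is only available from Oracle 11.2 onward, and the delimiter and ordering here are my own assumptions:
SELECT v.CoID,
       listagg(s.CoID, ',') WITHIN GROUP (ORDER BY s.CoID)
           AS active_CO_Status_for_same_ID_but_different_CusCODE,
       v.CusCODE,
       v.ID,
       v.value
FROM Table_with_value v
LEFT JOIN Table_with_status s
  ON s.ID = v.ID
  AND s.CusCODE != v.CusCODE
  AND s.Co_status = 'active'
GROUP BY v.CoID, v.CusCODE, v.ID, v.value;
Since listagg() ignores NULLs, rows with no matching active status still come back with an empty aggregate, matching the (null) values shown above.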
Your code looks like it should work, assuming you are referring to the correct tables:
SELECT wm_concat(s.CoID) as active_CO_Status_for_same_ID_but_different_CusCODE
FROM Table_with_status s
WHERE s.CoID IN (SELECT v.CoID
FROM Table_with_value v
WHERE v.ID = s.ID AND
v.CusCODE <> s.CusCODE
) AND
s.Co_status = 'active';
I'm hoping that I'm overthinking this, but I need to sum a column where I have no unique link to join on, and when I do the join it duplicates rows.
This is my current SQL, which works until I add the join on vwBatchInData; then it doubles up every record. What is the best way to achieve this?
select b.fldBatchID as 'ID',
       sum(bIn.fldBatchDetailsWeight) as 'Batch In',
       sum(t.fldTransactionNetWeight) as 'Batch Out',
       format((sum(t.fldTransactionNetWeight) / sum(bIn.fldBatchDetailsWeight)), 'P2') as 'Yield'
from [TRANSACTION] t
right join vwBatchInData bIn on bIn.fldBatchID = t.fldBatchID
inner join Batch b on b.fldBatchID = t.fldBatchID
where CAST(b.fldBatchDate as date) = '2020-03-04'
group by b.fldBatchID
vwBatchInData Table
+------------+---------------+-----------------------+
| fldBatchID | fldKillNumber | fldBatchDetailsWeight |
+------------+---------------+-----------------------+
| 2862 | 601598 | 164.40 |
| 2862 | 601599 | 190.80 |
| 2862 | 601596 | 195.00 |
| 2862 | 601597 | 200.20 |
| 2862 | 601594 | 176.60 |
+------------+---------------+-----------------------+
Transaction Table
+------------+------------------+-------------------------+
| fldBatchID | fldTransactionID | fldTransactionNetWeight |
+------------+------------------+-------------------------+
| 2862 | 10242352 | 16.26 |
| 2862 | 10242353 | 22.82 |
| 2862 | 10242362 | 18.52 |
| 2862 | 10242363 | 21.44 |
| 2862 | 10242364 | 20.32 |
+------------+------------------+-------------------------+
Batch Table
+------------+-------------------------+
| fldBatchID | fldBatchDate |
+------------+-------------------------+
| 2862 | 2020-03-04 00:00:00.000 |
+------------+-------------------------+
Desired output with the above snippets
+------+----------+-----------+---------+
| ID | Batch In | Batch Out | Yield |
+------+----------+-----------+---------+
| 2862 | 927.00 | 90.36 | 10.76 % |
+------+----------+-----------+---------+
I think you just want to aggregate before joining:
select b.fldBatchID as ID,
       bIn.fldBatchDetailsWeight as batch_in,
       t.fldTransactionNetWeight as batch_out,
       format(t.fldTransactionNetWeight / bIn.fldBatchDetailsWeight, 'P2') as Yield
from Batch b left join
     (select bIn.fldBatchID, sum(fldBatchDetailsWeight) as fldBatchDetailsWeight
      from vwBatchInData bIn
      group by bIn.fldBatchID
     ) bIn
     on bIn.fldBatchID = b.fldBatchID left join
     (select t.fldBatchID, sum(fldTransactionNetWeight) as fldTransactionNetWeight
      from [TRANSACTION] t
      group by t.fldBatchID
     ) t
     on t.fldBatchID = b.fldBatchID
where CAST(b.fldBatchDate as date) = '2020-03-04';
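If you prefer, the same pre-aggregation can be written with CTEs; this is just a restructuring of the query above, using the table and column names from the question:
with batch_in as (
    select fldBatchID, sum(fldBatchDetailsWeight) as weight_in
    from vwBatchInData
    group by fldBatchID
),
batch_out as (
    select fldBatchID, sum(fldTransactionNetWeight) as weight_out
    from [TRANSACTION]
    group by fldBatchID
)
select b.fldBatchID as ID,
       bi.weight_in as [Batch In],
       bo.weight_out as [Batch Out],
       format(bo.weight_out / bi.weight_in, 'P2') as Yield
from Batch b
left join batch_in  bi on bi.fldBatchID = b.fldBatchID
left join batch_out bo on bo.fldBatchID = b.fldBatchID
where cast(b.fldBatchDate as date) = '2020-03-04';
Because each side is summed to one row per batch before the join, the batch-in and batch-out totals can no longer multiply each other's rows.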
I have a table with the following stock data, with columns such as date, ticker, open and close (stock prices).
To query this data, I want to know which stock has given the highest margin on a particular date. So if I have 516 different stocks, my query should return 516 rows with ticker, date, open, close and a new column Margin (which will be max(close - open)).
| deep_stocks.date_ | deep_stocks.ticker | deep_stocks.open | deep_stocks.close |
+--------------------+---------------------+-------------------+--------------------+
| 20100721 | A | 27.68 | 27.58 |
| 20100722 | A | 27.95 | 28.72 |
| 20100723 | A | 28.56 | 29.3 |
| 20100726 | A | 29.22 | 29.64 |
| 20100727 | A | 29.73 | 28.87 |
| 20100728 | A | 28.79 | 28.78 |
| 20100729 | A | 28.97 | 28.15 |
| 20100730 | A | 27.78 | 27.93 |
| 20100802 | A | 28.35 | 28.82 |
| 20100803 | A | 28.7 | 27.84 |
I have written a query where my approach was:
Step 1 - Get the difference between Close and Open prices (Inner/Sub query)
Step 2 - Get the maximum of margin for every stock (used group by with max function)
Step 3 - Join the results with Main Table and get the data.
I'll put my query below; can someone please correct it, as it is taking quite a long time? I would also like to know whether there is any alternative approach.
As described above, please find my query below:
SELECT ds.ticker, ds.date_, ds.close, ds.open, ds.Margin
FROM
  (SELECT ticker, date_, close, open,
          CASE (close - open) > 0 WHEN true THEN round(close - open, 2) ELSE 0 END AS Margin
   FROM DataStocks) ds
JOIN
  (SELECT dsIn.ticker, max(dsIn.Margin) AS mxMargin
   FROM (SELECT ticker,
                CASE (close - open) > 0 WHEN true THEN round(close - open, 2) ELSE 0 END AS Margin
         FROM DataStocks) dsIn
   GROUP BY dsIn.ticker) dsEx
ON ds.ticker = dsEx.ticker AND ds.Margin = dsEx.mxMargin
ORDER BY ds.Margin;
Are there any other alternatives to this query, or is it possible to optimize it?
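One alternative worth testing, assuming the SQL dialect supports window functions (recent Hive versions do), is to rank each ticker's rows by margin in a single pass instead of joining the table to itself. A sketch, using rank() so that ties are kept like the self-join does:
SELECT ticker, date_, open, close, Margin
FROM (SELECT ticker, date_, open, close,
             round(close - open, 2) AS Margin,
             rank() OVER (PARTITION BY ticker
                          ORDER BY (close - open) DESC) AS rnk
      FROM DataStocks) ranked
WHERE rnk = 1;
Note that this keeps the best row even when close - open is negative, whereas the CASE ... ELSE 0 in the original clamps negative margins to 0, so results can differ for tickers that never closed above their open.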
I am looking to add an 'if else' condition within a WHERE clause, but I'm not sure how to do it. Looking at the table below, I am trying to filter the results so that the query extracts all Product Line Subtypes and applies an extra condition only when [Product Line Subtype] = 'Marine'. When it is Marine, it should consider only two combinations of Section and Profit Code while omitting all other combinations.
Combination 1 When Prod line Subtype = Marine then Section = Inland and Profit Code = Builders
Combination 2 When Prod line Subtype = Marine then Section = Ocean and Profit Code = Stocks
My actual table has a larger set of distinct combinations than shown below when Prod line Subtype = Marine, but I want to keep only the above two combinations in my result set. Any help would be much appreciated!
Main table
+--+------------------+-------------+-----------------+
| |Prod line Subtype | Section | Profit Code |
+--+------------------+-------------+-----------------+
| | Marine | Inland | Builders |
| | Marine | Ocean | Stock |
| | Property | General | Transport |
| | Energy | Source | Others |
| | Property | General | Transport |
| | Energy | Source | Transport |
| | Marine | Inland | Transport |
| | Marine | Floaters | Transport |
| | Marine | Cargo | Others |
+--+------------------+-------------+-----------------+
Expected Results
+--+------------------+-------------+-----------------+
| |Prod line Subtype | Section | Profit Code |
+--+------------------+-------------+-----------------+
| | Marine | Inland | Builders |
| | Marine | Ocean | Stock |
| | Property | General | Transport |
| | Energy | Source | Others |
| | Property | General | Transport |
| | Energy | Source | Transport |
+--+------------------+-------------+-----------------+
My query attempt:
select *
from #Step1 c1
where c1.row_ord = 1
and c1.[Prod Line Subtype] = 'Marine' AND (
(c1.[Section] = 'Inland' AND c1.[Profit Code] = 'Builder')
OR (c1.[Section] = 'Ocean' AND c1.[Profit Code] = 'Stock')
)
What about:
Select * from [Your table]
Where ([Prod line Subtype]<>'Marine' Or
(Section='Inland' And [Profit Code]='Builders') Or
(Section='Ocean' And [Profit Code]='Stocks')
)
Note that the [Prod line Subtype]='Marine' check can be omitted from the OR conditions, as above.
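For comparison, this is the same filter with the Marine check written out inside the OR branches; it is logically equivalent to the query above, just more verbose:
Select * from [Your table]
Where [Prod line Subtype] <> 'Marine'
   Or ([Prod line Subtype] = 'Marine'
       And ((Section = 'Inland' And [Profit Code] = 'Builders')
            Or (Section = 'Ocean' And [Profit Code] = 'Stocks')))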
I have an MS Access database with rainfall data from several climate stations.
For each day at each station, I want to calculate the rainfall on the previous day (if recorded), and the sum of the rainfall over the previous 3 and 7 days.
Due to the huge amount of data and the limitations of Access, I made a query that handles one station at a time. I first applied an auxiliary query to find the dates; for each station, the following SQL statement is applied (and saved as the RainFallStudy query):
SELECT
[173].ID, [173].AirportCode, [173].RFmm,
DateSerial([rYear], [rMonth], [rDay]) AS DateSer,
[DateSer]-1 AS DM1,
[DateSer]-2 AS DM2,
[DateSer]-3 AS DM3,
[DateSer]-4 AS DM4,
[DateSer]-5 AS DM5,
[DateSer]-6 AS DM6,
[DateSer]-7 AS DM7
FROM
[173]
WHERE
((([173].AirportCode) = 786660));
I used DM1, DM2, etc. as the date serials of day-1, day-2, and so on.
Then I used another query that joins the RainFallStudy query to itself with LEFT JOINs, as shown in the figure.
The SQL statement is:
SELECT
RainFallStudy.ID, RainFallStudy.AirportCode,
RainFallStudy.RFmm AS RF0, RainFallStudy.DateSer,
RainFallStudy.DM1, RainFallStudy_1.RFmm AS RF1,
RainFallStudy_2.RFmm AS RF2, RainFallStudy_3.RFmm AS RF3,
RainFallStudy_4.RFmm AS RF4, RainFallStudy_5.RFmm AS RF5,
RainFallStudy_6.RFmm AS RF6, RainFallStudy_7.RFmm AS RF7,
Nz([rf1], 0) + Nz([rf2], 0) + Nz([rf3], 0) + Nz([rf4], 0) + Nz([rf5], 0) + Nz([rf6], 0) + Nz([rf7], 0) AS RF_W
FROM
((((((RainFallStudy
LEFT JOIN
RainFallStudy AS RainFallStudy_1 ON RainFallStudy.DM1 = RainFallStudy_1.DateSer)
LEFT JOIN
RainFallStudy AS RainFallStudy_2 ON RainFallStudy.DM2 = RainFallStudy_2.DateSer)
LEFT JOIN
RainFallStudy AS RainFallStudy_3 ON RainFallStudy.DM3 = RainFallStudy_3.DateSer)
LEFT JOIN
RainFallStudy AS RainFallStudy_4 ON RainFallStudy.DM4 = RainFallStudy_4.DateSer)
LEFT JOIN
RainFallStudy AS RainFallStudy_5 ON RainFallStudy.DM5 = RainFallStudy_5.DateSer)
LEFT JOIN
RainFallStudy AS RainFallStudy_6 ON RainFallStudy.DM6 = RainFallStudy_6.DateSer)
LEFT JOIN
RainFallStudy AS RainFallStudy_7 ON RainFallStudy.DM7 = RainFallStudy_7.DateSer;
Now I suffer from the slow performance of this query, as each station has between 1,000 and 750,000 records! Is there a better way to get what I need with a faster SQL statement? Second question: can I write this as a single standalone SQL statement (one query without the auxiliary query)? I will use it from Python, which, as far as I know, requires a single SQL statement.
Thanks in advance.
Update
As requested by @Andre, here is some sample data from table [173]:
+-------+-------------+-------+--------+------+-------+
| ID    | AirportCode | rYear | rMonth | rDay | RFmm  |
+-------+-------------+-------+--------+------+-------+
| 11216 | 409040      | 2012  | 1      | 23   | 0.51  |
| 11217 | 409040      | 2012  | 1      | 24   | 0     |
| 11218 | 409040      | 2012  | 1      | 25   | 0     |
| 11219 | 409040      | 2012  | 1      | 26   | 2.03  |
| 11220 | 409040      | 2012  | 1      | 27   | 0     |
| 11221 | 409040      | 2012  | 1      | 28   | 0     |
| 11222 | 409040      | 2012  | 1      | 29   | 0     |
| 11223 | 409040      | 2012  | 1      | 30   | 0     |
| 11224 | 409040      | 2012  | 1      | 31   | 0.25  |
| 11225 | 409040      | 2012  | 2      | 1    | 0     |
| 11226 | 409040      | 2012  | 2      | 2    | 0     |
| 11227 | 409040      | 2012  | 2      | 3    | 4.32  |
| 11228 | 409040      | 2012  | 2      | 4    | 13.21 |
| 11229 | 409040      | 2012  | 2      | 5    | 1.02  |
| 11230 | 409040      | 2012  | 2      | 6    | 0     |
| 11231 | 409040      | 2012  | 2      | 7    | 0     |
| 11232 | 409040      | 2012  | 2      | 8    | 0     |
| 11233 | 409040      | 2012  | 2      | 9    | 0     |
| 11234 | 409040      | 2012  | 2      | 10   | 5.08  |
| 11235 | 409040      | 2012  | 2      | 11   | 0     |
| 11236 | 409040      | 2012  | 2      | 12   | 12.95 |
| 11237 | 409040      | 2012  | 2      | 13   | 5.59  |
| 11238 | 409040      | 2012  | 2      | 14   | 0.25  |
| 11239 | 409040      | 2012  | 2      | 15   | 0     |
| 11240 | 409040      | 2012  | 2      | 16   | 0     |
| 11241 | 409040      | 2012  | 2      | 17   | 0     |
| 11242 | 409040      | 2012  | 2      | 18   | 0     |
| 11243 | 409040      | 2012  | 2      | 19   | 0     |
| 11244 | 409040      | 2012  | 2      | 20   | 14.48 |
| 11245 | 409040      | 2012  | 2      | 21   | 9.65  |
| 11246 | 409040      | 2012  | 2      | 22   | 3.05  |
| 11247 | 409040      | 2012  | 2      | 23   | 0     |
| 11248 | 409040      | 2012  | 2      | 24   | 0     |
| 11249 | 409040      | 2012  | 2      | 25   | 0     |
| 11250 | 409040      | 2012  | 2      | 26   | 0     |
| 11251 | 409040      | 2012  | 2      | 27   | 0     |
| 11252 | 409040      | 2012  | 2      | 28   | 7.37  |
| 11253 | 409040      | 2012  | 2      | 29   | 0     |
+-------+-------------+-------+--------+------+-------+
And here is the sample output:
+-------+-------------+------------+---------+-----------+-----------+----------+
| ID    | AirportCode | DateSer    | ThisDay | Yesterday | Prev3days | PrevWeek |
+-------+-------------+------------+---------+-----------+-----------+----------+
| 11216 | 409040      | 23-01-2012 | 0.51    | 0         | 0         | 0        |
| 11217 | 409040      | 24-01-2012 | 0       | 0.51      | 0.51      | 0.51     |
| 11218 | 409040      | 25-01-2012 | 0       | 0         | 0.51      | 0.51     |
| 11219 | 409040      | 26-01-2012 | 2.03    | 0         | 0.51      | 0.51     |
| 11220 | 409040      | 27-01-2012 | 0       | 2.03      | 2.03      | 2.54     |
| 11221 | 409040      | 28-01-2012 | 0       | 0         | 2.03      | 2.54     |
| 11222 | 409040      | 29-01-2012 | 0       | 0         | 2.03      | 2.54     |
| 11223 | 409040      | 30-01-2012 | 0       | 0         | 0         | 2.54     |
| 11224 | 409040      | 31-01-2012 | 0.25    | 0         | 0         | 2.03     |
| 11225 | 409040      | 01-02-2012 | 0       | 0.25      | 0.25      | 2.28     |
| 11226 | 409040      | 02-02-2012 | 0       | 0         | 0.25      | 2.28     |
| 11227 | 409040      | 03-02-2012 | 4.32    | 0         | 0.25      | 0.25     |
| 11228 | 409040      | 04-02-2012 | 13.21   | 4.32      | 4.32      | 4.57     |
| 11229 | 409040      | 05-02-2012 | 1.02    | 13.21     | 17.53     | 17.78    |
| 11230 | 409040      | 06-02-2012 | 0       | 1.02      | 18.55     | 18.8     |
| 11231 | 409040      | 07-02-2012 | 0       | 0         | 14.23     | 18.8     |
| 11232 | 409040      | 08-02-2012 | 0       | 0         | 1.02      | 18.55    |
| 11233 | 409040      | 09-02-2012 | 0       | 0         | 0         | 18.55    |
| 11234 | 409040      | 10-02-2012 | 5.08    | 0         | 0         | 18.55    |
| 11235 | 409040      | 11-02-2012 | 0       | 5.08      | 5.08      | 19.31    |
| 11236 | 409040      | 12-02-2012 | 12.95   | 0         | 5.08      | 6.1      |
| 11237 | 409040      | 13-02-2012 | 5.59    | 12.95     | 18.03     | 18.03    |
| 11238 | 409040      | 14-02-2012 | 0.25    | 5.59      | 18.54     | 23.62    |
| 11239 | 409040      | 15-02-2012 | 0       | 0.25      | 18.79     | 23.87    |
| 11240 | 409040      | 16-02-2012 | 0       | 0         | 5.84      | 23.87    |
| 11241 | 409040      | 17-02-2012 | 0       | 0         | 0.25      | 23.87    |
| 11242 | 409040      | 18-02-2012 | 0       | 0         | 0         | 18.79    |
| 11243 | 409040      | 19-02-2012 | 0       | 0         | 0         | 18.79    |
| 11244 | 409040      | 20-02-2012 | 14.48   | 0         | 0         | 5.84     |
| 11245 | 409040      | 21-02-2012 | 9.65    | 14.48     | 14.48     | 14.73    |
| 11246 | 409040      | 22-02-2012 | 3.05    | 9.65      | 24.13     | 24.13    |
| 11247 | 409040      | 23-02-2012 | 0       | 3.05      | 27.18     | 27.18    |
| 11248 | 409040      | 24-02-2012 | 0       | 0         | 12.7      | 27.18    |
| 11249 | 409040      | 25-02-2012 | 0       | 0         | 3.05      | 27.18    |
| 11250 | 409040      | 26-02-2012 | 0       | 0         | 0         | 27.18    |
| 11251 | 409040      | 27-02-2012 | 0       | 0         | 0         | 27.18    |
| 11252 | 409040      | 28-02-2012 | 7.37    | 0         | 0         | 12.7     |
| 11253 | 409040      | 29-02-2012 | 0       | 7.37      | 7.37      | 10.42    |
+-------+-------------+------------+---------+-----------+-----------+----------+
I created an additional column rDate (DateTime) and filled it with this query:
UPDATE Rainfall SET Rainfall.rDate = DateSerial([rYear],[rMonth],[rDay]);
Then your desired result can be achieved with several subqueries, using SUM() for the last two columns:
SELECT r.ID, r.AirportCode, r.rDate, r.RFmm,
(SELECT RFmm FROM Rainfall r1 WHERE r1.AirportCode = r.AirportCode AND r1.rDate = r.rDate-1) AS Yesterday,
(SELECT SUM(RFmm) FROM Rainfall r3 WHERE r3.AirportCode = r.AirportCode AND r3.rDate BETWEEN r.rDate-3 AND r.rDate-1) AS Prev3days,
(SELECT SUM(RFmm) FROM Rainfall r7 WHERE r7.AirportCode = r.AirportCode AND r7.rDate BETWEEN r.rDate-7 AND r.rDate-1) AS PrevWeek
FROM Rainfall r
Make sure AirportCode and rDate are indexed for larger numbers of records.
Result:
+-------+-------------+------------+-------+-----------+-----------+----------+
| ID | AirportCode | rDate | RFmm | Yesterday | Prev3days | PrevWeek |
+-------+-------------+------------+-------+-----------+-----------+----------+
| 11216 | 409040 | 23.01.2012 | 0,51 | | | |
| 11217 | 409040 | 24.01.2012 | 0 | 0,51 | 0,51 | 0,51 |
| 11218 | 409040 | 25.01.2012 | 0 | 0 | 0,51 | 0,51 |
| 11219 | 409040 | 26.01.2012 | 2,03 | 0 | 0,51 | 0,51 |
| 11220 | 409040 | 27.01.2012 | 0 | 2,03 | 2,03 | 2,54 |
| 11221 | 409040 | 28.01.2012 | 0 | 0 | 2,03 | 2,54 |
| 11222 | 409040 | 29.01.2012 | 0 | 0 | 2,03 | 2,54 |
| 11223 | 409040 | 30.01.2012 | 0 | 0 | 0 | 2,54 |
| 11224 | 409040 | 31.01.2012 | 0,25 | 0 | 0 | 2,03 |
| 11225 | 409040 | 01.02.2012 | 0 | 0,25 | 0,25 | 2,28 |
| 11226 | 409040 | 02.02.2012 | 0 | 0 | 0,25 | 2,28 |
| 11227 | 409040 | 03.02.2012 | 4,32 | 0 | 0,25 | 0,25 |
| 11228 | 409040 | 04.02.2012 | 13,21 | 4,32 | 4,32 | 4,57 |
| 11229 | 409040 | 05.02.2012 | 1,02 | 13,21 | 17,53 | 17,78 |
+-------+-------------+------------+-------+-----------+-----------+----------+
Use Nz() to avoid NULL values in the first row.
It appears that you store the day in separate fields (rYear, rMonth, rDay), so to get the date you use the DateSerial function. This means that, in order to use the date in a join or WHERE clause, Access must calculate it for the entire table. You should store the date in a separate, indexed field to avoid that calculation.
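For example, assuming the table is named Rainfall as above, a composite index covering the station and the stored date could be created like this (the index name is arbitrary):
CREATE INDEX idxAirportCodeDate ON Rainfall (AirportCode, rDate);
With that index in place, the correlated subqueries on AirportCode and rDate can seek directly instead of computing DateSerial for every row.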