Insert into table based on rows in same table - SQL

I have a table as per below which will contain multiple rows of data - a couple of thousand rows max.
lvl2 | lvl3 | lvl6 | this_rep_cycle | last_rep_cycle | prev_rep_cycle | rowType
================================================================================
ASSET | CURR | FI | 214,060,924,928 | 0 | 0 | 1-Total
ASSET | CURR | FI | 25,199,630,336 | 0 | 0 | 3-Bal
ASSET | CURR | FX | 123,941,472 | 0 | 0 | 1-Total
ASSET | CURR | FX | 0 | 0 | 0 | 3-Bal
What I need to do is insert a new row into the table with the same lvl2, lvl3, lvl6, but where:
this_rep_cycle = (this_rep_cycle for rowType = '3-Bal' / this_rep_cycle for rowType = '1-Total')
last_rep_cycle = (last_rep_cycle for rowType = '3-Bal' / last_rep_cycle for rowType = '1-Total')
prev_rep_cycle = (prev_rep_cycle for rowType = '3-Bal' / prev_rep_cycle for rowType = '1-Total')
The end result should look like:
lvl2 | lvl3 | lvl6 | this_rep_cycle | last_rep_cycle | prev_rep_cycle | rowType
================================================================================
ASSET | CURR | FI | 214,060,924,928 | 0 | 0 | 1-Total
ASSET | CURR | FI | 25,199,630,336 | 0 | 0 | 3-Bal
ASSET | CURR | FI | 11.77 | 0 | 0 | 4-%
ASSET | CURR | FX | 123,941,472 | 0 | 0 | 1-Total
ASSET | CURR | FX | 0 | 0 | 0 | 3-Bal
ASSET | CURR | FX | 0 | 0 | 0 | 4-%
I have written a self join to achieve this:
set arithignore on
select
pd_1.lvl2, pd_1.lvl3, pd_1.lvl4, pd_1.lvl6, pd_2.last_report_cycle as p2_lrp, pd_1.last_report_cycle as p1_lrp,
(pd_2.this_report_cycle / pd_1.this_report_cycle)*100 as this_report_cycle,
(pd_2.last_report_cycle / pd_1.last_report_cycle)*100 as last_report_cycle
-- (pd_2.prev_report_cycle / pd_1.prev_report_cycle)*100 as prev_report_cycle,
-- '4-%' as [percentage]
from ProxyTrending pd_1
inner join ProxyTrending pd_2 on pd_2.rowType = '3-Bal'
AND pd_1.lvl2 = pd_2.lvl2
AND pd_1.lvl3 = pd_2.lvl3
AND pd_1.lvl4 = pd_2.lvl4
AND pd_1.lvl6 = pd_2.lvl6
where pd_1.rowType = '1-Total'
--order by pd_1.lvl2, pd_1.lvl3, pd_1.lvl4, pd_1.lvl6
set arithignore off
I need SET ARITHIGNORE because I can experience div/zero, but when I execute the above it only (partially) works if just one of the (report_cycle / report_cycle)*100 lines is uncommented - with 2 or more of these lines, zero results are returned.
Also, if I have just one of the (report_cycle / report_cycle)*100 lines uncommented, 60 results are returned where there are 106 '1-Total' records and 106 '3-Bal' records - I would have expected the query to return 106 '4-%' rows.
I'm not sure what I'm missing.

Well, I have found the cause of the error. If I execute:
select pd_2.this_report_cycle, pd_1.this_report_cycle...
the statement executes and returns values. If I place:
select pd_2.this_report_cycle, pd_1.this_report_cycle,
(pd_2.this_report_cycle / pd_1.this_report_cycle)*100 as expected_result
the query returns zero rows if either of the values is 0, despite the ARITHIGNORE setting.
And the CASE statement solves the issue.
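For completeness, here is a minimal sketch of how that final insert might look - the column list for ProxyTrending is assumed from the query above, and each division is wrapped in a CASE so a zero denominator yields 0 instead of an error:
insert into ProxyTrending
    (lvl2, lvl3, lvl4, lvl6, this_report_cycle, last_report_cycle, prev_report_cycle, rowType)
select
    pd_1.lvl2, pd_1.lvl3, pd_1.lvl4, pd_1.lvl6,
    case when pd_1.this_report_cycle = 0 then 0
         else (pd_2.this_report_cycle / pd_1.this_report_cycle) * 100 end,
    case when pd_1.last_report_cycle = 0 then 0
         else (pd_2.last_report_cycle / pd_1.last_report_cycle) * 100 end,
    case when pd_1.prev_report_cycle = 0 then 0
         else (pd_2.prev_report_cycle / pd_1.prev_report_cycle) * 100 end,
    '4-%'
from ProxyTrending pd_1
inner join ProxyTrending pd_2
    on  pd_2.rowType = '3-Bal'
    and pd_1.lvl2 = pd_2.lvl2
    and pd_1.lvl3 = pd_2.lvl3
    and pd_1.lvl4 = pd_2.lvl4
    and pd_1.lvl6 = pd_2.lvl6
where pd_1.rowType = '1-Total'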


How to update column in table from two different tables USING DML Command

I have three different tables here:
df_umts_relation table:
| cell_name | n_cell_name | technology | source_ops_num | target_ops_num |
|-----------|-------------|------------|----------------|----------------|
| 121       | 221         | UMTS       | 1              |                |
| 122       | 222         | GSM        | 2              |                |
| 123       | 223         | UMTS       | 3              |                |
| 124       | 224         | GSM        | 4              |                |
| 125       | 225         | GSM        | 5              |                |
| 126       | 226         | UMTS       | 6              |                |
| 127       | 227         | UMTS       | 7              |                |
So now I want to update target_ops_num from the two tables below.
df_umts_carrier table - this table contains the two columns I want to work with (and some integer values as well):
|---------------------|------------------|
| opsnum_umts | cell_name_umts |
|---------------------|------------------|
And I have another table called df_gsm_carrier:
|---------------------|------------------|
| opsnum_gsm | cellname |
|---------------------|------------------|
So all I need is to update [MyNewDatabase].[dbo].[df_umts_relation].[target_ops_num]: CASE WHEN technology is UMTS, then update from table df_umts_carrier (joining on n_cell_name = cell_name_umts); ELSE when technology is GSM, then update from df_gsm_carrier (joining on n_cell_name = cellname).
So I tried to create a query; the one below works with one condition only and updates only the rows which are UMTS:
UPDATE [MyNewDatabase].[dbo].[df_umts_relation]
SET [MyNewDatabase].[dbo].[df_umts_relation].[target_ops_num] = [MyNewDatabase].[dbo].[df_umts_carrier].[opsnum_umts]
FROM [MyNewDatabase].[dbo].[df_umts_relation]
INNER JOIN [MyNewDatabase].[dbo].[df_umts_carrier]
ON [n_cell_name] = [cell_name_umts]
and it works fine but doesn't update the rows which contain GSM...
I then tried another query to handle this, but it didn't update the GSM part and took a long time:
UPDATE [MyNewDatabase].[dbo].[df_umts_relation]
SET [MyNewDatabase].[dbo].[df_umts_relation].[target_ops_num] = (CASE WHEN [MyNewDatabase].[dbo].[df_umts_relation].[technology] = 'UMTS'
THEN [MyNewDatabase].[dbo].[df_umts_carrier].[opsnum_umts] ELSE [MyNewDatabase].[dbo].[df_gsm_carrier].[opsnum_gsm] END)
FROM [MyNewDatabase].[dbo].[df_umts_relation]
LEFT JOIN [MyNewDatabase].[dbo].[df_umts_carrier]
ON [n_cell_name] = [cell_name_umts]
LEFT JOIN [MyNewDatabase].[dbo].[df_gsm_carrier]
ON [n_cell_name] = [cell_name]
So does anyone have any idea how to solve this?
Please check if this will help.
update df_umts_relation
set target_ops_num = ( select case when dur.technology = 'UMTS' then du.opsnum_umts
                                   when dur.technology = 'GSM' then dg.opsnum_gsm
                              end
                       from df_umts_relation dur
                       left join df_umts_carrier du on dur.n_cell_name = du.cell_name_umts
                       left join df_gsm_carrier dg on dur.n_cell_name = dg.cellname
                       where dur.id = df_umts_relation.id)
Here is a demo
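Another option worth considering (a sketch only, reusing the table and column names from the question) is to skip the CASE entirely and run two targeted UPDATEs, one per technology, so each join only touches rows it can actually match:
UPDATE r
SET r.target_ops_num = u.opsnum_umts
FROM [MyNewDatabase].[dbo].[df_umts_relation] r
INNER JOIN [MyNewDatabase].[dbo].[df_umts_carrier] u
    ON r.n_cell_name = u.cell_name_umts
WHERE r.technology = 'UMTS';

UPDATE r
SET r.target_ops_num = g.opsnum_gsm
FROM [MyNewDatabase].[dbo].[df_umts_relation] r
INNER JOIN [MyNewDatabase].[dbo].[df_gsm_carrier] g
    ON r.n_cell_name = g.cellname
WHERE r.technology = 'GSM';
Two small updates like this are often easier to index and reason about than a single UPDATE with two LEFT JOINs.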

SQL: How to get in a single query info from a table and aggregate info from another table?

I am using MS SQL Server 2014 on Windows 7.
I have a table named "PollingStatus".
The below query :
SELECT DeviceIP
,ServiceStatus
,ReportedErrors
FROM PollingStatus
...gives some info like this:
DeviceIP | ServiceStatus | ReportedErrors
10.20.1.1 | 0 | 0
10.20.1.2 | 0 | 0
And in another table named "DeviceProperties" I have something like:
DeviceIP | DeviceType | SwVersion
10.20.1.1 | 3 | 800
10.20.1.1 | 2 | 802
10.20.1.1 | 1 | 804
10.20.1.2 | 2 | 800
10.20.1.2 | 3 | 801
10.20.1.2 | 2 | 806
What I would need is a query to get something like this:
DeviceIP | ServiceStatus | ReportedErrors | DeviceType | SwVersion
10.20.1.1 | 0 | 0 | 1 | 804
10.20.1.2 | 0 | 0 | 2 | 806
i.e.: compared with my existing query, I would like to have also the DeviceType and the maximum SwVersion of the device, from the second table "DeviceProperties".
This should suffice (DeviceIP needs to be qualified since it exists in both tables):
SELECT p.DeviceIP
,ServiceStatus
,ReportedErrors
,Max(SwVersion) as MaxSwVersion
FROM PollingStatus p
INNER JOIN DeviceProperties d
ON p.DeviceIP = d.DeviceIP
GROUP BY p.DeviceIP, ServiceStatus, ReportedErrors
try:
SELECT
ps.DeviceIp,
ps.ServiceStatus,
ps.ReportedErrors,
dp.DeviceType,
Max(dp.SwVersion)
FROM PollingStatus ps
INNER JOIN DeviceProperties dp ON dp.DeviceIp = ps.DeviceIp
GROUP BY ps.DeviceIp, ps.ServiceStatus, ps.ReportedErrors, dp.DeviceType;
Other than that, use a different type of join if you expect records in one table to have no counterpart in the other.
Select
    D.DeviceIP,
    D.ReportedErrors,
    D.ServiceStatus,
    DD.DeviceType,
    DD.SwVersion
from #Device D
INNER JOIN (Select
                DeviceIP,
                MIN(DeviceType) AS DeviceType,
                MAX(SwVersion) AS SwVersion
            from #Devicetype
            GROUP BY DeviceIP) DD
    ON D.DeviceIP = DD.DeviceIP
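If what you need is specifically the DeviceType that belongs to each device's highest SwVersion (as in the sample output), a window-function variant is another option - a sketch only, assuming SQL Server 2014 and the table/column names above:
SELECT p.DeviceIP, p.ServiceStatus, p.ReportedErrors, d.DeviceType, d.SwVersion
FROM PollingStatus p
INNER JOIN (
    SELECT DeviceIP, DeviceType, SwVersion,
           ROW_NUMBER() OVER (PARTITION BY DeviceIP ORDER BY SwVersion DESC) AS rn
    FROM DeviceProperties
) d ON d.DeviceIP = p.DeviceIP AND d.rn = 1;
ROW_NUMBER() keeps only the highest-SwVersion row per DeviceIP, so the DeviceType comes from that same row rather than from separate MIN()/MAX() aggregates.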

How to define a sub query inside SQL statement to be used several times as a table alias?

I have an MS Access database for rainfall data of several climate stations.
For each day of each station, I want to calculate the rainfall in the previous day (if recorded), and the sum of the rainfall at the previous 3 and 7 days.
Due to the huge amount of data and the limitations of Access, I made a query that works station by station; I first applied an auxiliary query to find the dates. For each station, the following SQL statement is applied (and named the RainFallStudy query):
SELECT
[173].ID, [173].AirportCode, [173].RFmm,
DateSerial([rYear], [rMonth], [rDay]) AS DateSer,
[DateSer]-1 AS DM1,
[DateSer]-2 AS DM2,
[DateSer]-3 AS DM3,
[DateSer]-4 AS DM4,
[DateSer]-5 AS DM5,
[DateSer]-6 AS DM6,
[DateSer]-7 AS DM7
FROM
[173]
WHERE
((([173].AirportCode) = 786660));
I used DM1, DM2, etc. as the date serial of day-1, day-2, etc.
Then I used another query that joins the RainFallStudy query to itself with left joins, as shown in the figure:
The SQL statement is
SELECT
RainFallStudy.ID, RainFallStudy.AirportCode,
RainFallStudy.RFmm AS RF0, RainFallStudy.DateSer,
RainFallStudy.DM1, RainFallStudy_1.RFmm AS RF1,
RainFallStudy_2.RFmm AS RF2, RainFallStudy_3.RFmm AS RF3,
RainFallStudy_4.RFmm AS RF4, RainFallStudy_5.RFmm AS RF5,
RainFallStudy_6.RFmm AS RF6, RainFallStudy_7.RFmm AS RF7,
Nz([rf1], 0) + Nz([rf2], 0) + Nz([rf3], 0) + Nz([rf4], 0) + Nz([rf5], 0) + Nz([rf6], 0) + Nz([rf7], 0) AS RF_W
FROM
((((((RainFallStudy
LEFT JOIN
RainFallStudy AS RainFallStudy_1 ON RainFallStudy.DM1 = RainFallStudy_1.DateSer)
LEFT JOIN
RainFallStudy AS RainFallStudy_2 ON RainFallStudy.DM2 = RainFallStudy_2.DateSer)
LEFT JOIN
RainFallStudy AS RainFallStudy_3 ON RainFallStudy.DM3 = RainFallStudy_3.DateSer)
LEFT JOIN
RainFallStudy AS RainFallStudy_4 ON RainFallStudy.DM4 = RainFallStudy_4.DateSer)
LEFT JOIN
RainFallStudy AS RainFallStudy_5 ON RainFallStudy.DM5 = RainFallStudy_5.DateSer)
LEFT JOIN
RainFallStudy AS RainFallStudy_6 ON RainFallStudy.DM6 = RainFallStudy_6.DateSer)
LEFT JOIN
RainFallStudy AS RainFallStudy_7 ON RainFallStudy.DM7 = RainFallStudy_7.DateSer;
Now I suffer from the slow performance of this query, as the records for each station range from 1,000 to 750,000 rows! Is there any better way to get what I need with a faster SQL statement? The second question: can I make it a standalone SQL statement (one query without the auxiliary query), as I will use it in Python, which (as far as I know) requires a single SQL statement?
Thanks in advance.
Update
As requested by @Andre, here is some sample data from table [173]:
ID    | AirportCode | rYear | rMonth | rDay | RFmm
--------------------------------------------------
11216 | 409040      | 2012  | 1      | 23   | 0.51
11217 | 409040      | 2012  | 1      | 24   | 0
11218 | 409040      | 2012  | 1      | 25   | 0
11219 | 409040      | 2012  | 1      | 26   | 2.03
11220 | 409040      | 2012  | 1      | 27   | 0
11221 | 409040      | 2012  | 1      | 28   | 0
11222 | 409040      | 2012  | 1      | 29   | 0
11223 | 409040      | 2012  | 1      | 30   | 0
11224 | 409040      | 2012  | 1      | 31   | 0.25
11225 | 409040      | 2012  | 2      | 1    | 0
11226 | 409040      | 2012  | 2      | 2    | 0
11227 | 409040      | 2012  | 2      | 3    | 4.32
11228 | 409040      | 2012  | 2      | 4    | 13.21
11229 | 409040      | 2012  | 2      | 5    | 1.02
11230 | 409040      | 2012  | 2      | 6    | 0
11231 | 409040      | 2012  | 2      | 7    | 0
11232 | 409040      | 2012  | 2      | 8    | 0
11233 | 409040      | 2012  | 2      | 9    | 0
11234 | 409040      | 2012  | 2      | 10   | 5.08
11235 | 409040      | 2012  | 2      | 11   | 0
11236 | 409040      | 2012  | 2      | 12   | 12.95
11237 | 409040      | 2012  | 2      | 13   | 5.59
11238 | 409040      | 2012  | 2      | 14   | 0.25
11239 | 409040      | 2012  | 2      | 15   | 0
11240 | 409040      | 2012  | 2      | 16   | 0
11241 | 409040      | 2012  | 2      | 17   | 0
11242 | 409040      | 2012  | 2      | 18   | 0
11243 | 409040      | 2012  | 2      | 19   | 0
11244 | 409040      | 2012  | 2      | 20   | 14.48
11245 | 409040      | 2012  | 2      | 21   | 9.65
11246 | 409040      | 2012  | 2      | 22   | 3.05
11247 | 409040      | 2012  | 2      | 23   | 0
11248 | 409040      | 2012  | 2      | 24   | 0
11249 | 409040      | 2012  | 2      | 25   | 0
11250 | 409040      | 2012  | 2      | 26   | 0
11251 | 409040      | 2012  | 2      | 27   | 0
11252 | 409040      | 2012  | 2      | 28   | 7.37
11253 | 409040      | 2012  | 2      | 29   | 0
And here is the desired sample output:
ID    | AirportCode | DateSer    | ThisDay | Yesterday | Prev3days | PrevWeek
------------------------------------------------------------------------------
11216 | 409040      | 23-01-2012 | 0.51    | 0         | 0         | 0
11217 | 409040      | 24-01-2012 | 0       | 0.51      | 0.51      | 0.51
11218 | 409040      | 25-01-2012 | 0       | 0         | 0.51      | 0.51
11219 | 409040      | 26-01-2012 | 2.03    | 0         | 0.51      | 0.51
11220 | 409040      | 27-01-2012 | 0       | 2.03      | 2.03      | 2.54
11221 | 409040      | 28-01-2012 | 0       | 0         | 2.03      | 2.54
11222 | 409040      | 29-01-2012 | 0       | 0         | 2.03      | 2.54
11223 | 409040      | 30-01-2012 | 0       | 0         | 0         | 2.54
11224 | 409040      | 31-01-2012 | 0.25    | 0         | 0         | 2.03
11225 | 409040      | 01-02-2012 | 0       | 0.25      | 0.25      | 2.28
11226 | 409040      | 02-02-2012 | 0       | 0         | 0.25      | 2.28
11227 | 409040      | 03-02-2012 | 4.32    | 0         | 0.25      | 0.25
11228 | 409040      | 04-02-2012 | 13.21   | 4.32      | 4.32      | 4.57
11229 | 409040      | 05-02-2012 | 1.02    | 13.21     | 17.53     | 17.78
11230 | 409040      | 06-02-2012 | 0       | 1.02      | 18.55     | 18.8
11231 | 409040      | 07-02-2012 | 0       | 0         | 14.23     | 18.8
11232 | 409040      | 08-02-2012 | 0       | 0         | 1.02      | 18.55
11233 | 409040      | 09-02-2012 | 0       | 0         | 0         | 18.55
11234 | 409040      | 10-02-2012 | 5.08    | 0         | 0         | 18.55
11235 | 409040      | 11-02-2012 | 0       | 5.08      | 5.08      | 19.31
11236 | 409040      | 12-02-2012 | 12.95   | 0         | 5.08      | 6.1
11237 | 409040      | 13-02-2012 | 5.59    | 12.95     | 18.03     | 18.03
11238 | 409040      | 14-02-2012 | 0.25    | 5.59      | 18.54     | 23.62
11239 | 409040      | 15-02-2012 | 0       | 0.25      | 18.79     | 23.87
11240 | 409040      | 16-02-2012 | 0       | 0         | 5.84      | 23.87
11241 | 409040      | 17-02-2012 | 0       | 0         | 0.25      | 23.87
11242 | 409040      | 18-02-2012 | 0       | 0         | 0         | 18.79
11243 | 409040      | 19-02-2012 | 0       | 0         | 0         | 18.79
11244 | 409040      | 20-02-2012 | 14.48   | 0         | 0         | 5.84
11245 | 409040      | 21-02-2012 | 9.65    | 14.48     | 14.48     | 14.73
11246 | 409040      | 22-02-2012 | 3.05    | 9.65      | 24.13     | 24.13
11247 | 409040      | 23-02-2012 | 0       | 3.05      | 27.18     | 27.18
11248 | 409040      | 24-02-2012 | 0       | 0         | 12.7      | 27.18
11249 | 409040      | 25-02-2012 | 0       | 0         | 3.05      | 27.18
11250 | 409040      | 26-02-2012 | 0       | 0         | 0         | 27.18
11251 | 409040      | 27-02-2012 | 0       | 0         | 0         | 27.18
11252 | 409040      | 28-02-2012 | 7.37    | 0         | 0         | 12.7
11253 | 409040      | 29-02-2012 | 0       | 7.37      | 7.37      | 10.42
I created an additional column rDate (DateTime) and filled it with this query:
UPDATE Rainfall SET Rainfall.rDate = DateSerial([rYear],[rMonth],[rDay]);
Then your desired result can be achieved with several subqueries, using SUM() for the last two columns:
SELECT r.ID, r.AirportCode, r.rDate, r.RFmm,
(SELECT RFmm FROM Rainfall r1 WHERE r1.AirportCode = r.AirportCode AND r1.rDate = r.rDate-1) AS Yesterday,
(SELECT SUM(RFmm) FROM Rainfall r3 WHERE r3.AirportCode = r.AirportCode AND r3.rDate BETWEEN r.rDate-3 AND r.rDate-1) AS Prev3days,
(SELECT SUM(RFmm) FROM Rainfall r7 WHERE r7.AirportCode = r.AirportCode AND r7.rDate BETWEEN r.rDate-7 AND r.rDate-1) AS PrevWeek
FROM Rainfall r
Make sure AirportCode and rDate are indexed for larger numbers of records.
Result:
+-------+-------------+------------+-------+-----------+-----------+----------+
| ID | AirportCode | rDate | RFmm | Yesterday | Prev3days | PrevWeek |
+-------+-------------+------------+-------+-----------+-----------+----------+
| 11216 | 409040 | 23.01.2012 | 0,51 | | | |
| 11217 | 409040 | 24.01.2012 | 0 | 0,51 | 0,51 | 0,51 |
| 11218 | 409040 | 25.01.2012 | 0 | 0 | 0,51 | 0,51 |
| 11219 | 409040 | 26.01.2012 | 2,03 | 0 | 0,51 | 0,51 |
| 11220 | 409040 | 27.01.2012 | 0 | 2,03 | 2,03 | 2,54 |
| 11221 | 409040 | 28.01.2012 | 0 | 0 | 2,03 | 2,54 |
| 11222 | 409040 | 29.01.2012 | 0 | 0 | 2,03 | 2,54 |
| 11223 | 409040 | 30.01.2012 | 0 | 0 | 0 | 2,54 |
| 11224 | 409040 | 31.01.2012 | 0,25 | 0 | 0 | 2,03 |
| 11225 | 409040 | 01.02.2012 | 0 | 0,25 | 0,25 | 2,28 |
| 11226 | 409040 | 02.02.2012 | 0 | 0 | 0,25 | 2,28 |
| 11227 | 409040 | 03.02.2012 | 4,32 | 0 | 0,25 | 0,25 |
| 11228 | 409040 | 04.02.2012 | 13,21 | 4,32 | 4,32 | 4,57 |
| 11229 | 409040 | 05.02.2012 | 1,02 | 13,21 | 17,53 | 17,78 |
+-------+-------------+------------+-------+-----------+-----------+----------+
Use Nz() to avoid NULL values in the first row.
It appears that you store the day in separate fields (rYear, rMonth, rDay). So, in order to get the date you use the DateSerial function. This means that in order to use the date for a join or where clause, Access must calculate the date for the entire table. You need to store the date in a separate field and index it to avoid the calculation.
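For example (a sketch only, using the Rainfall table and rDate column introduced in the answer above), a composite index can be created in Access with plain DDL:
CREATE INDEX idxRainfallAirportDate ON Rainfall (AirportCode, rDate);
With that index in place, the correlated subqueries can seek directly on (AirportCode, rDate) instead of recomputing DateSerial() for every row.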

SQLite - Complex Query

This is what I want to get.
Art|CANTIDAD1|CANTIDAD2|CANTIDAD1CARGA1 |CANTIDAD2CARGA1 |CANTIDAD1CARGA2 | CANTIDAD2CARGA2
----------------------------------------------------------------------------------------------
001| 7 | 0 | 4 | 0 | 3 | 0
002| 0 | 2 | 0 | 1 | 0 | 1
003| 2 | 0 | 2 | 0 | 0 | 0
004| 3 | 0 | 1 | 0 | 2 | 0
005| 2 | 0 | 0 | 0 | 2 | 0
006| 0 | 1 | 0 | 0 | 0 | 1
I get CANTIDAD1 and CANTIDAD2 with this query; they are the sums of the amounts matching the WHERE clause:
SELECT
SUM(D.NCANTIDAD1) AS NTOTCANTIDAD1,
SUM(D.NCANTIDAD2) AS NTOTCANTIDAD2
FROM
CABPEDIDOS C,
DETPEDIDOS D,
ARTICULOS A
WHERE
C.DFECHAALBARAN IS NULL
AND C.CSERIE = D.CSERIE
AND C.NPEDIDO = D.NPEDIDO
AND D.NFABRICANTE = A.NFABRICANTE
AND D.CARTICULO = A.CARTICULO
GROUP BY
D.NFABRICANTE, D.CARTICULO, A.CNOMBRE
CANTIDAD1CARGA1 and CANTIDAD2CARGA1 are quantities that are in the database (d.cantidad1 and d.cantidad2 are the real names; I have to sum all of them to get CANTIDAD1 and CANTIDAD2), but I need to get the quantities corresponding to the respective C.CARGA:
(CANTIDAD1 = CANTIDAD1CARGA1 + CANTIDAD1CARGA2)
How can I get these values?
** C.NCARGA can have more than one value; I need to get all the CANTIDAD1CARGA'x' and CANTIDAD2CARGA'x' values.
I don't care if I have to use two queries:
- one for CANTIDAD1 and CANTIDAD2
- other for CANTIDAD1CARGA1, CANTIDAD2CARGA1, CANTIDAD1CARGA2... etc
I have a feeling I'm not really understanding the question, but it seems like you just need:
SELECT CANTIDAD1CARGA1 + CANTIDAD1CARGA2 AS CANTIDAD1,
CANTIDAD2CARGA1 + CANTIDAD2CARGA2 AS CANTIDAD2,
CANTIDAD1CARGA1, CANTIDAD2CARGA1, CANTIDAD1CARGA2, CANTIDAD2CARGA2
FROM ...
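If the aim is instead to derive the per-carga columns from the detail rows (which is how I read the question), conditional aggregation is a common approach in SQLite - a sketch only, assuming C.NCARGA holds the carga number (1, 2, ...) and reusing the tables and joins from the query above:
SELECT
    SUM(D.NCANTIDAD1) AS CANTIDAD1,
    SUM(D.NCANTIDAD2) AS CANTIDAD2,
    SUM(CASE WHEN C.NCARGA = 1 THEN D.NCANTIDAD1 ELSE 0 END) AS CANTIDAD1CARGA1,
    SUM(CASE WHEN C.NCARGA = 1 THEN D.NCANTIDAD2 ELSE 0 END) AS CANTIDAD2CARGA1,
    SUM(CASE WHEN C.NCARGA = 2 THEN D.NCANTIDAD1 ELSE 0 END) AS CANTIDAD1CARGA2,
    SUM(CASE WHEN C.NCARGA = 2 THEN D.NCANTIDAD2 ELSE 0 END) AS CANTIDAD2CARGA2
FROM CABPEDIDOS C, DETPEDIDOS D, ARTICULOS A
WHERE C.DFECHAALBARAN IS NULL
  AND C.CSERIE = D.CSERIE
  AND C.NPEDIDO = D.NPEDIDO
  AND D.NFABRICANTE = A.NFABRICANTE
  AND D.CARTICULO = A.CARTICULO
GROUP BY D.NFABRICANTE, D.CARTICULO, A.CNOMBRE;
Each additional carga value just adds another pair of SUM(CASE ...) columns.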

How to add one column to another column on all rows in a table?

Given a table like the one below, I want to move (add) all values from the column escrow1 to balance1 for the corresponding uid, and likewise for escrow2 to balance2. So in the case below, the row with uid 4 will have balance1 = 1858000+42000 and escrow1 = 0, the row with uid 3 will have balance1 = 1859265+30735 and escrow1 = 0, and the row with uid 2 will have balance2 = 940050+1050000 and escrow2 = 0. Everything else stays the same. Is it possible to do this in one query? I've been trying hard but can't come up with a solution, so I might have to do it in a function and loop over all the rows, but I would prefer not to. Also, I know that only a small number of rows will have escrow values not equal to 0. Given that, is there a way to optimize the query?
uid | balance1 | escrow1 | balance2 | escrow2
-----+----------+---------+----------+---------
1 | 5000 | 0 | 0 | 0
9 | 5000 | 0 | 0 | 0
6 | 1900000 | 0 | 1899960 | 0
5 | 1900000 | 0 | 1900000 | 0
7 | 1900000 | 0 | 1900000 | 0
8 | 1900000 | 0 | 1900000 | 0
4 | 1858000 | 42000 | 1900014 | 0
2 | 1910000 | 0 | 940050 | 1050000
3 | 1859265 | 30735 | 1895050 | 0
If you just want to select the data from the table use the query provided by Greg. If you want to update the table itself, the below query can help.
Update TABLENAME
Set Balance1 = Balance1 + Escrow1,
Balance2 = Balance2 + Escrow2,
Escrow1 = 0, Escrow2 = 0
Hope this helps.
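Since only a small number of rows have non-zero escrow values, the UPDATE can also be restricted to just those rows so the rest of the table is left untouched - a sketch only, reusing the column names above:
UPDATE TABLENAME
SET balance1 = balance1 + escrow1,
    balance2 = balance2 + escrow2,
    escrow1 = 0,
    escrow2 = 0
WHERE escrow1 <> 0 OR escrow2 <> 0;
For a table of a few thousand rows that filter alone should be plenty; a partial index on the same condition could help if the table ever grows much larger.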
I think it's as simple as:
SELECT uid
,Balance1 + Escrow1 AS Balance1
,Balance2 + Escrow2 AS Balance2
FROM TableName
In terms of optimizing, I haven't done much with Postgres, but I doubt you'd need to do any optimizing (assuming you have a proper primary key, etc. on the table).