Using IntervalMatch without a synthetic key - QlikView

I'm attempting to use the IntervalMatch function to join two tables together, as below:
InvoiceData:
load Supplier
, SupplierName
, SupplierValue
, Invoice
, InvoiceDate
, DueDate
, OrigInvValue
, OrigDiscValue
, PaymentReference
, PaymentNumber
, PostValue
, Value
, MthInvBal1
, MthInvBal2
, MthInvBal3
, Currency
, ConvRate
, DatabaseName&'.'&Supplier&'.'&Invoice as SupplierInvoice
, DatabaseName as Company
;
SQL Select ****;
CurrencyRates:
Load date(floor([StartDateTime])) as [StartDate]
,date(floor([EndDateTime])) as [EndDate]
,[Currency] as BaseCurrency
,[CADDivision]
,[CHFDivision]
,[EURDivision]
,[GBPDivision]
,[JPYDivision]
,[USDDivision]
,[CADMultiply]
,[CHFMultiply]
,[EURMultiply]
,[GBPMultiply]
,[JPYMultiply]
,[USDMultiply];
SQL SELECT [CR].[StartDateTime]
, [CR].[EndDateTime]
, [CR].[Currency]
, [CR].[CADDivision]
, [CR].[CHFDivision]
, [CR].[EURDivision]
, [CR].[GBPDivision]
, [CR].[JPYDivision]
, [CR].[USDDivision]
, [CR].[CADMultiply]
, [CR].[CHFMultiply]
, [CR].[EURMultiply]
, [CR].[GBPMultiply]
, [CR].[JPYMultiply]
, [CR].[USDMultiply]
FROM [Lookups].[CurrencyRates] [CR];
IntervalMatch:
IntervalMatch (InvoiceDate)
Load distinct [StartDate],[EndDate] Resident CurrencyRates;
From reading the literature, I don't think there should be a synthetic key between the IntervalMatch table and the currency rates table; however, my data model is still showing one. Is this correct?

You get a synthetic key every time any two tables are linked by more than one field (in your case StartDate and EndDate).
Looking at the article from Henric Cronström on the Qlik Design Blog (https://community.qlik.com/blogs/qlikviewdesignblog/2013/04/04/intervalmatch), you can read that:
Further, the data model contains a composite key (the FromDate and ToDate fields) which will manifest itself as a QlikView synthetic key. But have no fear. This synthetic key should be there; not only is it correct, but it is also optimal given the data model. You do not need to remove it.
So it seems only natural you'll get that synthetic key.
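If you would still rather not have the synthetic key in your model, a common pattern is to join the bridge table produced by IntervalMatch into CurrencyRates and then drop it, so the only remaining link to InvoiceData is the single InvoiceDate field. A minimal sketch, using the names from your script:
IntervalMatchTable:
IntervalMatch (InvoiceDate)
Load distinct [StartDate], [EndDate] Resident CurrencyRates;

// Fold the bridge into CurrencyRates so InvoiceDate becomes the only link
Join (CurrencyRates)
Load * Resident IntervalMatchTable;

Drop Table IntervalMatchTable;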

Related

Condensing Data from 4 Columns to 2

I'm trying to combine two sets of columns down to one set in SQL, where all sets have a common JobID and Date.
I want to take columns FrOpr and BkOpr and condense them down to one Opr field, while also taking their corresponding FrExtract and BkExtract fields down to one corresponding Extract field.
Any thoughts on how to do this?
All the responses are much appreciated. I adapted one of the queries below and used it to create a column of data that I wanted to reference and extract from in a larger query.
The output gives me two columns, an Opr and an Extract column. In the larger query, I'm looking to select just values from the new Extract column and then sum them up as a "Completed" output. My problem is knowing where/how to splice/nest this into the existing query. Any thoughts on how to do this without creating a temp table? I'll post the larger query I want to add this to:
SELECT CONCAT(Operators.OprExtID, '_CIREG') AS Processor
, CONVERT(VARCHAR(8), Data.StartDateTime, 112) AS [Processed Date]
, CONCAT('DEPTRI_', Machines.EquipmentType, '_', JobTypes.JobTypeDesc, '_', Jobs.JobName) AS [Activity Type]
, SUM(Data.Handled) AS Completed
FROM dbo.Operators, dbo.Data, dbo.Jobs, dbo.Machines, dbo.JobTypes WITH (NOLOCK)
WHERE (Jobs.ID = Data.JobID
AND Data.FrOpr = Operators.Operator
AND Data.MachNo = Machines.MachNo
AND Data.JobTypeID = JobTypes.JobTypeID)
GROUP BY CONCAT(Operators.OprExtID, '_CIREG')
, CONVERT(VARCHAR(8), Data.StartDateTime, 112)
, CONCAT('DEPTRI_', Machines.EquipmentType, '_', JobTypes.JobTypeDesc, '_', Jobs.JobName)
Processor      Processed Date  Activity Type                      Completed
0023390_CIREG  20190116        DEPTRI_LWACS_EXTRACTION_UTGENERAL  43.61
0023390_CIREG  20190116        DEPTRI_MWACS_DOC PREP_AGGEN        7.76
0023390_CIREG  20190116        DEPTRI_SWACS_OPENING_UTGENERAL     808
Use UNION ALL:
SELECT JobId , Date , FrOpr AS Opr , FrExtract AS Extract
FROM <TableName>
WHERE FrOpr IS NOT NULL
UNION ALL
SELECT JobId , Date , BkOpr AS Opr , BkExtract AS Extract
FROM <TableName>
WHERE BkOpr IS NOT NULL
One option is a CROSS APPLY
Example
Select A.JobID
,A.Date
,B.*
From YourTable A
Cross Apply ( values (FrOpr,FrExtract)
,(BkOpr,BKExtract)
) B(Opr,Extract)
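To answer the follow-up about splicing this into the larger query without a temp table: wrap the un-pivot in a CTE (or a derived table) and aggregate from it. A minimal sketch, assuming the table is dbo.Data and Extract is numeric; the names are placeholders, not from the original schema:
WITH Unpivoted AS
(
    SELECT D.JobID
         , D.StartDateTime
         , X.Opr
         , X.Extract
    FROM dbo.Data D
    CROSS APPLY ( VALUES (D.FrOpr, D.FrExtract)
                       , (D.BkOpr, D.BkExtract)
                ) X (Opr, Extract)
    WHERE X.Opr IS NOT NULL
)
SELECT Opr
     , SUM(Extract) AS Completed
FROM Unpivoted
GROUP BY Opr;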
Welcome to Stack Overflow! In the future, please provide sample data and desired results in text form.
This is a pretty simple un-pivot, which I'd do with a Union:
Select
JobId
, Date
, FrOpr as Opr
, FrExtract as Extract
, 'Fr' as Source_Column_Set
From <table_name>
Where <whatever conditions your application requires>
Union
Select
JobId
, Date
, BkOpr as Opr
, BkExtract as Extract
, 'Bk' as Source_Column_Set
From <table_name>
Where <whatever conditions your application requires>
You can make that a CTE and sort the results any way you like.
P.S. I included Source_Column_Set to avoid data loss.

Return records when subdatasheets values are null

I'm pretty new to SQL and am kind of jumping in at the deep end here.
I'm building a tool from scratch in Excel that utilises an Access database, and whilst the basic queries aren't causing me any serious issues, the more complex ones are.
I have four tables. Users, Issues, Votes and Comments.
One user can create many issues, one issue can have many votes and one issue can also have many comments.
I want to create a query that shows a list of issues, with the count of vote_id and comment_id for each.
I.e. Issue 1 has 3 votes and 4 comments, and so on. However, when an issue has zero comments or votes, my query returns nothing at all:
SELECT
users.user_name
, Count(vote.query_id) AS CountOfquery_id
, Count(comments.query_id) AS CountOfquery_id1
, issues.query_id
, issues.query_raised_by
, issues.query_raised_date
, issues.query_summary
, issues.query_status
, issues.query_status_date
, issues.query_detail
, issues.query_response
, issues.query_tag1
, issues.query_tag2
, issues.query_tag3
, issues.query_tag4
FROM
(
(users INNER JOIN issues
ON users.user_id = issues.query_raised_by)
INNER JOIN vote
ON issues.query_id = vote.query_id)
INNER JOIN comments
ON issues.query_id = comments.query_id
GROUP BY
users.user_name
, issues.query_id
, issues.query_raised_by
, issues.query_raised_date
, issues.query_summary
, issues.query_status
, issues.query_status_date
, issues.query_detail
, issues.query_response
, issues.query_tag1
, issues.query_tag2
, issues.query_tag3
, issues.query_tag4;
Is there an easy way to do this? Am I massively overcomplicating the issue?
Basically I want to populate a table in Excel with a list of issues and the number of votes and comments for each. How can I get the counts to work?
Does doing:
COUNT(vote.query_id) OVER (PARTITION BY users.user_name)
COUNT(comments.query_id) OVER (PARTITION BY users.user_name)
and changing your group statement to group by users.user_name only, give you the result you're after?
@Dekks suggested using Left Join instead of Inner Join - I just googled what the different joins do and it makes a lot more sense now.
Just wanted to post an update for you, in case it helps anyone else :)
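For anyone landing here later, a minimal sketch of the LEFT JOIN version (Access SQL, mirroring the original query's names). One caveat: when an issue has both votes and comments, joining both tables multiplies rows, so each count is inflated to votes x comments; counting each child table in its own subquery avoids that.
SELECT
users.user_name
, Count(vote.query_id) AS CountOfVotes
, Count(comments.query_id) AS CountOfComments
, issues.query_id
FROM
(
(users INNER JOIN issues
ON users.user_id = issues.query_raised_by)
LEFT JOIN vote
ON issues.query_id = vote.query_id)
LEFT JOIN comments
ON issues.query_id = comments.query_id
GROUP BY
users.user_name
, issues.query_id;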

How can I add a total row to my code

I have this code from a shop system software and am wondering how I can add a total row. I'm a total noob, a former accountant transitioning into data analysis. I've taken many tutorials and continue to learn, but I'm still a beginner. Supposedly the shop system software's database is based on SQL Server 2012. Here's my code; it works fine, I just need a total row for the last column only:
Select ESTIM.DESCRIP
, ESTIM.PARTNO
, ESTIM.PRODCODE
, ESTIM.QTYONHAND
, ESTIM.QTYONORDER
, ESTIM.REORDLEVEL
, ESTIM.STOCKINGCOST
, ESTIM.QTYONHAND * ESTIM.STOCKINGCOST As "Total Item Value in Stock"
From ESTIM
Where ((ESTIM.PRODCODE Like [ENTER PRODUCT CODE:]))
Order By ESTIM.PARTNO;
There are a few ways to get an overall total out of this query.
You could use a UNION query to get a new row with just the total, and all other fields NULL:
SELECT 'TOTAL', NULL, NULL, NULL, NULL, NULL, NULL
, SUM(ESTIM.QTYONHAND * ESTIM.STOCKINGCOST)
FROM ESTIM
WHERE ((ESTIM.PRODCODE Like [ENTER PRODUCT CODE:]));
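Spelled out against the original query, the whole statement would look something like this (a sketch; the ORDER BY is dropped because, without an explicit sort key, ordering by PARTNO would not keep the TOTAL row last):
Select ESTIM.DESCRIP
, ESTIM.PARTNO
, ESTIM.PRODCODE
, ESTIM.QTYONHAND
, ESTIM.QTYONORDER
, ESTIM.REORDLEVEL
, ESTIM.STOCKINGCOST
, ESTIM.QTYONHAND * ESTIM.STOCKINGCOST As "Total Item Value in Stock"
From ESTIM
Where ((ESTIM.PRODCODE Like [ENTER PRODUCT CODE:]))
UNION ALL
SELECT 'TOTAL', NULL, NULL, NULL, NULL, NULL, NULL
, SUM(ESTIM.QTYONHAND * ESTIM.STOCKINGCOST)
FROM ESTIM
WHERE ((ESTIM.PRODCODE Like [ENTER PRODUCT CODE:]));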
You could add a new field using a windowing function that holds the total (which will be repeated on every row):
Select ESTIM.DESCRIP
, ESTIM.PARTNO
, ESTIM.PRODCODE
, ESTIM.QTYONHAND
, ESTIM.QTYONORDER
, ESTIM.REORDLEVEL
, ESTIM.STOCKINGCOST
, ESTIM.QTYONHAND * ESTIM.STOCKINGCOST As "Total Item Value in Stock"
, SUM(ESTIM.QTYONHAND * ESTIM.STOCKINGCOST) OVER (PARTITION BY 1) as "Total of Total Item Value"
From ESTIM
Where ((ESTIM.PRODCODE Like [ENTER PRODUCT CODE:]))
Order By ESTIM.PARTNO;
You could also... possibly... get crafty with GROUP BY <fields> WITH ROLLUP, but I think that would add more records to the output than you are looking for.
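If ROLLUP produces too many subtotal rows, GROUPING SETS can limit the extras to a single grand-total row. A sketch, assuming PARTNO identifies each detail row (if it doesn't, the detail rows get aggregated too) and returning only the part number and value rather than every column:
Select ESTIM.PARTNO
, SUM(ESTIM.QTYONHAND * ESTIM.STOCKINGCOST) As "Total Item Value in Stock"
From ESTIM
Where ((ESTIM.PRODCODE Like [ENTER PRODUCT CODE:]))
Group By GROUPING SETS ((ESTIM.PARTNO), ())
Order By GROUPING(ESTIM.PARTNO), ESTIM.PARTNO;
GROUPING(ESTIM.PARTNO) returns 0 for detail rows and 1 for the grand-total row, so the total sorts last.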

Performance on Degenerate Dimension for drillthrough Action and Processing

There is a lot of information that doesn't fit as measures and also doesn't have the necessary dimensionality, so I decided to integrate this data into the FactTable for a later drillthrough action (information like Document Number, Document Line, etc.). So I use the FactTable as a fact dimension (or degenerate dimension, as Kimball calls it). The fact dimension was related after creation to the measure group as in the picture below:
The FactTable/fact dimension has 140,000,000 rows, so I decided to use ROLAP as the storage mode to avoid the MOLAP processing, but now the performance issues have moved to the drillthrough action. All other dimensions are in MOLAP.
Analysis Services is installed on a 64-bit server with 98 GB RAM, and Memory\TotalMemoryLimit was set to 70%.
I also ran a Profiler trace while the drillthrough action (over the degenerate dimension in ROLAP) was performed, so I captured the SQL query. Lots of aggregation and GROUP BY - no wonder.
How can I deal with performance in this case, so that both the drillthrough action and the processing of the degenerate dimension perform in a timely manner?
UPDATE 13.04
I've attached below the execution plan for the query captured in Profiler:
SELECT
SUM ( [dbo_FactCdbSAP_Details].[Amount] ) AS Amount,
SUM ( [dbo_FactCdbSAP_Details].[SharedAmount] ) AS SharedAmount,
[dbo_FactCdbSAP_Details].[Pk_id] ,
[dbo_FactCdbSAP_Details].[DocumentNo] ,
[dbo_FactCdbSAP_Details].[DocumentLine] ,
[dbo_FactCdbSAP_Details].[DocumentHeader] ,
[dbo_FactCdbSAP_Details].[DocumentType] ,
[dbo_FactCdbSAP_Details].[Reference] ,
[dbo_FactCdbSAP_Details].[DocumentDate] ,
[dbo_FactCdbSAP_Details].[EntryDate] ,
[dbo_FactCdbSAP_Details].[FiscalPeriod] ,
[dbo_FactCdbSAP_Details].[StornoDocNo] ,
[dbo_FactCdbSAP_Details].[DocumentCurrency] ,
[dbo_FactCdbSAP_Details].[CustomerNumber] ,
[dbo_FactCdbSAP_Details].[EnteredBy] ,
[dbo_FactCdbSAP_Details].[PartnerSegment] ,
[dbo_FactCdbSAP_Details].[PartnerBusinessArea] ,
[dbo_FactCdbSAP_Details].[ItemText] ,
[dbo_FactCdbSAP_Details].[ID_Date] ,
[dbo_FactCdbSAP_Details].[ID_CostCategory] ,
[dbo_FactCdbSAP_Details].[ID_CostCenter] ,
[dbo_FactCdbSAP_Details].[ID_Currency] ,
[dbo_FactCdbSAP_Details].[ID_Branch] ,
[dbo_FactCdbSAP_Details].[ID_Customer] ,
[dbo_FactCdbSAP_Details].[ID_Scenario] ,
[dbo_DimCostCategory_3].[AccountNo] ,
[dbo_DimCostCategory_3].[AccountNameDEU] ,
[dbo_DimCostCategory_3].[AccountNameEng] ,
[dbo_DimCostCategory_3].[AccountType] ,
[dbo_DimCostCategory_3].[AccountSetSAP] ,
[dbo_DimCostCenter_4].[CostCenterNo] ,
[dbo_DimCostCenter_4].[CostCenterName] ,
[dbo_DimCostCenter_4].[CostCenterAliasDEU] ,
[dbo_DimCostCenter_4].[CostCenterAliasENG] ,
[dbo_DimCurrency_5].[CurrencyCode] ,
[dbo_DimCurrency_5].[CurrencyENG] ,
[dbo_DimBranchShare_6].[Branch No] ,
[dbo_DimBranchShare_6].[Branch Name DE] ,
[dbo_DimBranchShare_6].[Branch Name TM1] ,
[dbo_DimBranchShare_6].[Branch Name ENG] ,
[dbo_DimBranchShare_6].[BranchId] ,
[dbo_DimBranchShare_6].[SharePercentage] ,
[dbo_DimBranchShare_6].[Branch Name ASL] ,
[dbo_DimBranchShare_6].[Country] ,
[dbo_DimBranchShare_6].[Currency] ,
[dbo_DimBranchShare_6].[IsSAP] ,
[dbo_DimCustomers_7].[Customer No] ,
[dbo_DimCustomers_7].[Customer Name1] ,
[dbo_DimCustomers_7].[Short Name] ,
[dbo_DimCustomers_7].[Street] ,
[dbo_DimCustomers_7].[Country] ,
[dbo_DimCustomers_7].[Postal Code] ,
[dbo_DimCustomers_7].[Telefon No] ,
[dbo_DimCustomers_7].[Fax TeletexNo] ,
[dbo_DimCustomers_7].[Attending BST] ,
[dbo_DimCustomers_7].[Key Industry Sector] ,
[dbo_DimCustomers_7].[Booking No] ,
[dbo_DimCustomers_7].[Status Inactiv] ,
[dbo_DimCustomers_7].[Company Key] ,
[dbo_DimCustomers_7].[Direct Mailing Forwarder] ,
[dbo_DimCustomers_7].[Direct Mailing BKeeping] ,
[dbo_DimCustomers_7].[Direct Mailing Sales] ,
[dbo_DimCustomers_7].[Direct Mailing Magazines] ,
[dbo_DimCustomers_7].[Customer Name2] ,
[dbo_DimCustomers_7].[Customer Name3] ,
[dbo_DimScenario_8].[ScenarioTypeENG] ,
[dbo_DimDate_2].[Quarter] ,
[dbo_DimDate_2].[Jan-Feb] ,
[dbo_DimDate_2].[Jan-Mrz] ,
[dbo_DimDate_2].[Jan-Apr] ,
[dbo_DimDate_2].[Jan-Mai] ,
[dbo_DimDate_2].[Jan-Jun] ,
[dbo_DimDate_2].[Jan-Jul] ,
[dbo_DimDate_2].[Jan-Aug] ,
[dbo_DimDate_2].[Jan-Sep] ,
[dbo_DimDate_2].[Jan-Okt] ,
[dbo_DimDate_2].[Jan-Nov] ,
[dbo_DimDate_2].[Jan-Dez] ,
[dbo_DimDate_2].[MonthName] ,
[dbo_DimDate_2].[Semester]
FROM (
SELECT
[dbo].[FactCdbSAP_Details].[Pk_id],
[dbo].[FactCdbSAP_Details].[ID_Date],
[dbo].[FactCdbSAP_Details].[ID_Scenario],
[dbo].[FactCdbSAP_Details].[ID_Branch],
[dbo].[FactCdbSAP_Details].[ID_CostCategory],
[dbo].[FactCdbSAP_Details].[ID_CostCenter],
[dbo].[FactCdbSAP_Details].[ID_Customer],
[dbo].[FactCdbSAP_Details].[ID_Currency],
[dbo].[FactCdbSAP_Details].[DocumentNo],
[dbo].[FactCdbSAP_Details].[DocumentLine],
[dbo].[FactCdbSAP_Details].[DocumentHeader],
[dbo].[FactCdbSAP_Details].[DocumentType],
[dbo].[FactCdbSAP_Details].[Reference],
[dbo].[FactCdbSAP_Details].[DocumentDate],
[dbo].[FactCdbSAP_Details].[EntryDate],
[dbo].[FactCdbSAP_Details].[FiscalPeriod],
[dbo].[FactCdbSAP_Details].[StornoDocNo],
[dbo].[FactCdbSAP_Details].[DocumentCurrency],
[dbo].[FactCdbSAP_Details].[CustomerNumber],
[dbo].[FactCdbSAP_Details].[EnteredBy],
[dbo].[FactCdbSAP_Details].[PartnerSegment],
[dbo].[FactCdbSAP_Details].[PartnerBusinessArea],
[dbo].[FactCdbSAP_Details].[ItemText],
[dbo].[FactCdbSAP_Details].[Amount],
[dbo].[FactCdbSAP_Details].[SharedAmount]
FROM [dbo].[FactCdbSAP_Details]
WHERE
id_date >201509
) AS [dbo_FactCdbSAP_Details],
[dbo].[DimCostCategory] AS [dbo_DimCostCategory_3],
[dbo].[DimCostCenter] AS [dbo_DimCostCenter_4],
[dbo].[DimCurrency] AS [dbo_DimCurrency_5],
[dbo].[DimBranchShare] AS [dbo_DimBranchShare_6],
[dbo].[DimCustomers] AS [dbo_DimCustomers_7],
[dbo].[DimScenario] AS [dbo_DimScenario_8],
[dbo].[DimDate] AS [dbo_DimDate_2]
WHERE
[dbo_FactCdbSAP_Details].[ID_Date] = [dbo_DimDate_2].[ID_Date]
AND
[dbo_FactCdbSAP_Details].[ID_CostCategory] = [dbo_DimCostCategory_3].[PK_Cost]
AND
[dbo_FactCdbSAP_Details].[ID_CostCenter] = [dbo_DimCostCenter_4].[Pk_CostCenter]
AND
[dbo_FactCdbSAP_Details].[ID_Currency] = [dbo_DimCurrency_5].[Pk_Currency]
AND
[dbo_FactCdbSAP_Details].[ID_Branch] = [dbo_DimBranchShare_6].[PK_ShareBranch]
AND
[dbo_FactCdbSAP_Details].[ID_Customer] = [dbo_DimCustomers_7].[Pk_Customer]
AND
[dbo_FactCdbSAP_Details].[ID_Scenario] = [dbo_DimScenario_8].[Pk_Scenario]
AND
[dbo_DimCurrency_5].[CurrencyDEU] = 'Lokale Währung'
AND
[dbo_DimScenario_8].[ScenarioTypeDEU] = 'Ist'
AND
[dbo_DimDate_2].[Year] = 2016
AND
[dbo_DimDate_2].[Month] = 2
group by
....
In order to get good performance both for the drillthrough action and for processing, the following solution was found and implemented:
I changed the storage mode of the degenerate dimension to MOLAP
AttributeHierarchyOptimizedState=FullyOptimized for all attributes of the degenerate dimension
AttributeHierarchyOrdered=false for the primary key of the degenerate dimension
I implemented ProcessAdd. A delta table (Today minus Yesterday) was created to find the rows that are suitable for ProcessAdd; in this scenario old data won't change (a sketch of building such a delta table appears at the end of this answer)
An SSIS package was created with a Data Flow Task. Inside the Data Flow Task, the delta table was set as the OLE DB Source and a Dimension Processing task as the destination (this means incremental adds go directly into the MOLAP dimension). The picture below shows that:
In the end, only the affected cube partition is processed, also with the ProcessAdd option.
Many thanks to Greg Galloway for describing ProcessAdd on large dimensions in this post: http://www.artisconsulting.com/blogs/greggalloway/2007/4/20/processadd-on-large-dimensions
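For reference, a minimal sketch of how such a delta table can be built with EXCEPT; the _Today/_Yesterday snapshot tables and the _Delta target are assumptions, not names from the cube:
-- Rows present today but not yesterday are the candidates for ProcessAdd
INSERT INTO dbo.FactCdbSAP_Details_Delta
SELECT T.*
FROM dbo.FactCdbSAP_Details_Today AS T
EXCEPT
SELECT Y.*
FROM dbo.FactCdbSAP_Details_Yesterday AS Y;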

Abysmal performance using DECRYPTBYKEY in SQL Server 2008 R2

I'm looking at a query that has relatively complex relationships with other tables. Here's what my original query looks like:
SELECT
SE_CUS_DELIVERY.RECEIPT_NAME,
SE_CUS_DELIVERY.ORDER_PHONE,
SE_CUS_DELIVERY.ORDER_ZIP,
SE_CUS_DELIVERY.ORDER_ADDR,
ISNULL(SE_CUS_DELIVERY.DELIV_QTY,SRCHGT.CREQTY),
SE_CUS_DELIVERY.ORDER_HAND
FROM LC_OUT_REQUEST_DETAIL ,
SE_INVENTORY ,
SE_CUSTOMER ,
SRCHGT ,
SE_CUS_DELIVERY
WHERE
LC_OUT_REQUEST_DETAIL.TOTDATE = '20140203'
AND LC_OUT_REQUEST_DETAIL.IO_GB = '021'
AND LC_OUT_REQUEST_DETAIL.LOCCD >= 'A0000'
... A lot of additional joins here
group by SRCHGT.CRDATE + SRCHGT.CRESEQ + SRCHGT.CRESEQ_SEQ + SE_CUS_DELIVERY.DELIV_SEQ ,
SE_CUS_DELIVERY.RECEIPT_NAME ,
SE_CUS_DELIVERY.ORDER_PHONE ,
SE_CUS_DELIVERY.ORDER_ZIP ,
SE_CUS_DELIVERY.ORDER_ADDR ,
ISNULL(SE_CUS_DELIVERY.DELIV_QTY,SRCHGT.CREQTY) ,
... Also a lot of group by's following here
order by LC_OUT_REQUEST_DETAIL.TOTDATE,
LC_OUT_REQUEST_DETAIL.TOT_NO asc,
LC_OUT_REQUEST_DETAIL.TOT_NO_SEQ
To my surprise, it takes about a second to retrieve more than 10,000 rows.
I've since encrypted the data in some columns that contain sensitive data, and I modified my SELECT query like so to get the original values:
open Symmetric Key Sym_Key_TestEnc
decryption by certificate Cert_Test
with password = 'somepasswordhere'
GO
SELECT
DECRYPTBYKEY(SE_CUS_DELIVERY.RECEIPT_NAME),
DECRYPTBYKEY(SE_CUS_DELIVERY.ORDER_PHONE),
DECRYPTBYKEY(SE_CUS_DELIVERY.ORDER_ZIP),
DECRYPTBYKEY(SE_CUS_DELIVERY.ORDER_ADDR),
ISNULL(SE_CUS_DELIVERY.DELIV_QTY,SRCHGT.CREQTY),
DECRYPTBYKEY(SE_CUS_DELIVERY.ORDER_HAND)
FROM LC_OUT_REQUEST_DETAIL,
SE_INVENTORY ,
SE_CUSTOMER ,
SRCHGT ,
SE_CUS_DELIVERY
WHERE
LC_OUT_REQUEST_DETAIL.TOTDATE = '20140203'
AND LC_OUT_REQUEST_DETAIL.IO_GB = '021'
AND LC_OUT_REQUEST_DETAIL.LOCCD >= 'A0000'
AND LC_OUT_REQUEST_DETAIL.LOCCD <= 'A9999'
AND LC_OUT_REQUEST_DETAIL.MAT_CD = SE_INVENTORY.MAT_CD
AND LC_OUT_REQUEST_DETAIL.JCOLOR = SE_INVENTORY.JCOLOR
....
group by SRCHGT.CRDATE + SRCHGT.CRESEQ + SRCHGT.CRESEQ_SEQ + SE_CUS_DELIVERY.DELIV_SEQ ,
SE_CUS_DELIVERY.RECEIPT_NAME ,
SE_CUS_DELIVERY.ORDER_PHONE ,
SE_CUS_DELIVERY.ORDER_ZIP ,
SE_CUS_DELIVERY.ORDER_ADDR ,
.......
GO
Close Symmetric key Sym_Key_TestEnc
Now the performance is abysmal. I've been running the same query for more than 5 minutes and it still hasn't completed.
According to MSDN, there shouldn't be many issues performance-wise:
Symmetric encryption and decryption is relatively fast, and is
suitable for working with large amounts of data.
Which leads me to think that I must be doing something wrong - or MSDN is lying to me, but that's probably not the case.
Is there a way to optimize the data decryption in this process?
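One approach worth trying (a sketch, not a verified fix): keep the fast, unencrypted query shape in a derived table, and call DECRYPTBYKEY only on the final, already-reduced result set, converting the varbinary output back to the original type. The simplified inner query below stands in for the original joins, filters, and grouping; NVARCHAR(200) is an assumed target type:
open Symmetric Key Sym_Key_TestEnc
decryption by certificate Cert_Test
with password = 'somepasswordhere'
GO
SELECT
    CONVERT(NVARCHAR(200), DECRYPTBYKEY(T.RECEIPT_NAME)) AS RECEIPT_NAME,
    CONVERT(NVARCHAR(200), DECRYPTBYKEY(T.ORDER_PHONE)) AS ORDER_PHONE,
    T.DELIV_QTY
FROM (
    -- the original joins, WHERE clause, and GROUP BY go here, unchanged;
    -- only the ciphertext columns are carried out of the subquery
    SELECT SE_CUS_DELIVERY.RECEIPT_NAME,
           SE_CUS_DELIVERY.ORDER_PHONE,
           SE_CUS_DELIVERY.DELIV_QTY
    FROM SE_CUS_DELIVERY
) AS T
GO
Close Symmetric key Sym_Key_TestEnc
Whether this helps depends on where the optimizer actually places the function calls. The other usual suspect is that DECRYPTBYKEY returns varbinary(8000), so grouping or comparing on decrypted values forces a decrypt per row and defeats any indexes.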