QlikView - QVW execution failure

We've got a QVW script failing as it can't find the table to concatenate onto, or store into the QVD.
ERROR messages shown on partial reload
ERROR MESSAGE 1
Table not found
Concatenate (DATES)
LOAD
'P' & Num(period,'00') & yearcode AS #dFinYearPeriod,
Num(period,'00') as dFinYearPeriod,
Num(period,'00') as dFinPeriod,
'' as dMonthEnd,
Text(yearcode-1) & '/' & Text(yearcode-2000) AS dFinYear,
Num(yearcode) AS dFinYearOnly,
'' AS dMonth,
yearcode as dYear,
'' AS dMonthNo,
'' as dFinYearEnd_Cur,
'' as dFinYearEnd_Prev
ERROR MESSAGE 2
Table not found
STORE DATES into C:\QlikView\QVD\DATES.qvd (qvd)
We've been running back and forth through the script and can't find the cause of the error. Nothing has been changed in the QVW as far as we're aware, the OLEDB connection is fine, and the stored procedure involved is working correctly, as is the SQL script.
From the error messages we're getting, this looks to be the script failure point, but we can't work out why...
DATES:
LOAD
'P' & Num(dFinPeriod,'00') & Date(dFinYearEnd_Cur,'YYYY') AS #dFinYearPeriod,
if(isnull(dMonthEnd),
Num(dFinPeriod,'00'),
(if(dMonthEnd = '',
Num(dFinPeriod,'00'),
Num(dFinPeriod,'00') & ' (' & Text(Date (dMonthEnd,'MMM')) & ')'
)
)
) as dFinYearPeriod,
Num(dFinPeriod,'00') as dFinPeriod,
Date(dMonthEnd, 'DD/MM/YYYY') as dMonthEnd,
Text(Date(dFinYearEnd_Prev,'YYYY')) & '/' & Text(Date (dFinYearEnd_Cur,'YY')) AS dFinYear,
Year(Date(dFinYearEnd_Cur, 'DD/MM/YYYY')) AS dFinYearOnly, //Return integer
Text(Date(dMonthEnd,'MMM')) AS dMonth,
Text(Date(dMonthEnd,'YYYY')) as dYear,
Num(Month(dMonthEnd),'00') AS dMonthNo,
Date(dFinYearEnd_Cur,'DD/MM/YYYY') as dFinYearEnd_Cur,
Date(dFinYearEnd_Prev,'DD/MM/YYYY') as dFinYearEnd_Prev
//Filter to only financial year 2011/2 and later
WHERE Text(Date(dFinYearEnd_Cur,'YYYY'))>=2012
;
SQL EXEC
dbo.spGetMonthEnds
;
//Add on the non-date f periods ie. 13 to 16
Concatenate (DATES)
LOAD
'P' & Num(period,'00') & yearcode AS #dFinYearPeriod,
Num(period,'00') as dFinYearPeriod,
Num(period,'00') as dFinPeriod,
'' as dMonthEnd,
Text(yearcode-1) & '/' & Text(yearcode-2000) AS dFinYear,
Num(yearcode) AS dFinYearOnly,
'' AS dMonth,
yearcode as dYear,
'' AS dMonthNo,
'' as dFinYearEnd_Cur,
'' as dFinYearEnd_Prev
;
SQL Select
yearcode,
period
from
d_details
where
period <>'R' and
period >12 and period <=16
and yearcode >=2012
group by
yearcode,
period
;
STORE DATES into $(vFolder)DATES.qvd (qvd);
DROP Table DATES;
Message on full reload:
Connecting to Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=WRVS;Data Source=rvs-psfsql-1-a;Use Procedure for Prepare=1;Auto Translate=True;Packet Size=4096;Workstation ID=WRVS-CLICK-1-A;Use Encryption for Data=False;Tag with column collation when possible=False
Connected
DATES << EXEC
dbo.spGetMonthEnds
48 lines fetched
DATES << Select
yearcode,
period
from
d_details
where
61 lines fetched
The script execution seems to work, ish: it's pulling lines back, but it can't seem to find the DATES table to concatenate onto or store into the QVD.
The date manipulation taking place was all in place previously, and there's nothing weird coming through on the SQL scripts to break any of that.
Any ideas please?
Thanks

It turned out to be a Kerberos error unrelated to the QlikView application itself:
Error Code: 0x7 KDC_ERR_S_PRINCIPAL_UNKNOWN
I think it was related to some virus cleanup work that took place overnight. A server reboot solved the issue.
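For anyone hitting the same symptom: KDC_ERR_S_PRINCIPAL_UNKNOWN means the domain controller couldn't find the service principal name (SPN) the client requested a ticket for. A quick sanity check from a command prompt, sketched below, can confirm that before you reach for a reboot; the service account name is a placeholder and port 1433 is just the SQL Server default, so adjust both.
REM Ask the domain whether the SPN the client requests is registered anywhere
setspn -Q MSSQLSvc/rvs-psfsql-1-a:1433
REM List the SPNs registered on the SQL Server service account (placeholder name)
setspn -L DOMAIN\sqlsvc
REM Purge the client's cached Kerberos tickets before retesting the reload
klist purge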

Related

SAS Passthrough Query Runs With Hard Coded Dates, but not Macro Variables as Dates

I have a script that runs a SAS pass-through query connecting to an Oracle database. This is part of a cron job that runs on a Unix server and has had no issues for years. In the past few weeks, however, the job has started hanging on this one particular step: according to the logs it used to take about 15 seconds to run, but now it runs indefinitely until we have to kill the job, with no associated errors or warnings in the log; the job just creates a lockfile and sits there.
The step where the job hangs is pasted below. There are two macro variables, &start_dt and &end_dt, which represent the date range the job is pulling sales data for.
While investigating, we tried a few different approaches, and were able to get this step to run successfully and in its usual time by changing three things:
1) Running the script through an Enterprise Guide client which connects to the same server, as opposed to running the script via CLI / shell script
2) Changing the library the step writes to to WORK, instead of writing the dataset to the salesdata library (as seen in the code below)
3) Changing the dates to hardcoded values instead of macro variables.
As for the date variables themselves, they are strings in a date9-style format, e.g.
&start_dt = '08-May-22', &end_dt = '14-May-22'. Initially I suspected the issue was related to the way the dates are structured, since this is an older project I have inherited, but I am confused as to why the job ran without issue for so long up until a few weeks ago, even with these oddly formatted date macro vars.
The other possibility I considered was that some sort of resource on the unix server was getting locked up when it got to this step, potentially from some sort of hanging job or some other conflict with an older file such as a log or a previous sas dataset.
Problematic version of the step in the script pasted below:
PROC SQL;
connect to oracle(user=&uid pass=&pwd path='#dw');
create table salesdata.shipped as
Select
SKN_NBR,
COLOR_NBR,
SIZE_NBR,
SALESDIV_KEY,
ORDER_LINE_QTY as QUANTITY label="SUM(ORDER_LINE_QTY)",
EX1 as DOLLARS label="SUM(EX1)" from connection to oracle(
select
A1."SKN_NBR",
A1."COLOR_NBR",
A1."SIZE_NBR",
decode(A1."SALESDIV_KEY", 'ILB', 'IQ',
'IQ ', 'IQ',
'IQC', 'IQ',
'ISQ', 'IQ',
'IWC', 'IQ',
'QVC'),
SUM(A1."ORDER_LINE_QTY"),
SUM(A1."ORDER_LINE_QTY" * A1."ORDER_LINE_PRICE_AMT")
from DW.ORDERLINE A1, DISTINCT_SKN A2, DW.ORDERSTATUSTYPE A3
where
A2."SKN_NBR" = A1."SKN_NBR" AND
A1."CURRENT_STATUS_DATE" Between &start_dt and &end_dt AND
A1."ORDERLINESTATUS_KEY" = A3."ORDERLINESTATUS_KEY" AND
A3."ORDERSTATUS_SHIPPED" = 'Y' AND
A1."ORDER_LINE_PRICE_AMT" > 0
group by A1."SKN_NBR",
A1."COLOR_NBR",
A1."SIZE_NBR",
decode(A1."SALESDIV_KEY", 'ILB', 'IQ',
'IQ ', 'IQ',
'IQC', 'IQ',
'ISQ', 'IQ',
'IWC', 'IQ',
'QVC')
order by A1."SKN_NBR",
A1."COLOR_NBR",
A1."SIZE_NBR",
decode(A1."SALESDIV_KEY", 'ILB', 'IQ',
'IQ ', 'IQ',
'IQC', 'IQ',
'ISQ', 'IQ',
'IWC', 'IQ',
'QVC')
) as t1(SKN_NBR, COLOR_NBR, SIZE_NBR, SALESDIV_KEY, ORDER_LINE_QTY, EX1)
;
disconnect from oracle; quit;
What style you need to use for date constants in Oracle depends on your Oracle settings, but normally you can use expressions like one of these:
date '2022-05-14'
'2022-05-14'
You seem to claim that on your system you can use values like
'14-May-22'
(how does Oracle know what century you mean by that?).
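To illustrate the ambiguity (a sketch assuming default NLS settings): Oracle's YY format element always assumes the current century, while RR maps 00-49 to the current century and 50-99 to the previous one (while the current year's last two digits are below 50), so the two can disagree:
-- '22' happens to resolve to 2022 either way today, but '52' would not:
SELECT TO_CHAR(TO_DATE('14-May-52', 'DD-Mon-YY'), 'YYYY-MM-DD') AS with_yy, -- 2052
       TO_CHAR(TO_DATE('14-May-52', 'DD-Mon-RR'), 'YYYY-MM-DD') AS with_rr  -- 1952
FROM dual;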
Note that in Oracle it is important to use single quotes around constants as it interprets strings in double quotes as object names.
So if you have a date value in SAS just make sure to make the macro variable value look like what Oracle wants.
For example to set ENDDT to today's date you could use:
data _null_;
call symputx('enddt',quote(put(today(),date11.),"'"));
run;
Which would be the same as:
%let enddt='17-MAY-2022';
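After which the Between predicate from the question resolves to something Oracle can parse without guessing the century, e.g. (illustrative values):
A1."CURRENT_STATUS_DATE" Between '08-MAY-2022' and '14-MAY-2022'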
So @Tom's answer was helpful - it appears that our DBAs updated some settings a few weeks back that changed how stringent Oracle is about which date formats it accepts.
For what it's worth, the date macro vars were being constructed on the fly using a clunky data step that read off a date key dataset. You'll notice the last piece of the date string being put together for both variables uses the year2. format, i.e. just the last two digits of the year. To @Tom's point, this was apparently confusing Oracle about which century it's in, so the job got hung up:
data dateparm;
set salesdata.week_end_date;
start = "'" || put(day(week_end_date - 6), z2.) || '-' || put(week_end_date - 6, monname3.) || '-' ||
put(week_end_date - 6, year2.) || "'";
end = "'" || put(day(week_end_date), z2.) || '-' || put(week_end_date, monname3.) || '-' ||
put(week_end_date, year2.) || "'";
call symput('start_dt', start);
call symput('end_dt', end);
run;
Once I changed this step to use the year4. format for the last piece, the job ran without incident on both Unix and E Guide. Example below:
data dateparm;
set npdd.week_end_date;
start = "'" || put(day(week_end_date - 6), z2.) || '-' || put(week_end_date - 6, monname3.) || '-' ||
put(week_end_date - 6, year4.) || "'";
end = "'" || put(day(week_end_date), z2.) || '-' || put(week_end_date, monname3.) || '-' ||
put(week_end_date, year4.) || "'";
call symput('start_dt', start);
call symput('end_dt', end);
run;
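Not part of the original answer, but if you want to sanity-check what the data step generated before the pass-through runs, a %put to the log is a cheap test:
/* should now show four-digit years, e.g. start_dt='08-May-2022' end_dt='14-May-2022' */
%put start_dt=&start_dt end_dt=&end_dt;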

embedded IIF States in SSRS

Thank you for taking the time to help me today. I am trying to use multiple IF statements to control what value is displayed depending on whether each statement is true. So right now I have the expression below, which is essentially:
IIF(expression = NULL
, CompanyAddress
, IIF(Expression='TX'
, IIF(BOOL=TRUE
,CompanyAddress
, SWITCH(DEALER ADDRESSES))
,CompanyAddress)
)
I have tested each individual IIF statement separately and I get the outcomes I expect. Currently, when the Expression = NULL check in the first IIF statement is TRUE, it just outputs #Error and not the "nothin" (or, in my real case, the CompanyAddress). But if Expression = NULL is FALSE, I do get the correct output of either the CompanyAddress or the Dealer.
=IIF(IsNothing(Fields!CoOppId.Value)
,("nothin")
, (IIF(Fields!Addr1.Value.Contains("TX")
, IIF(Fields!UDFCustProv.Value = 1
, Fields!Addr0.Value
, Switch(
Fields!UDFMake.Value.Contains("Chevy")
, "Knapp Chevrolet" + chr(10) + "PO box " + chr(10) + "Houston TX 77210"
, Fields!UDFMake.Value.contains("Ford")
, "Sterling McCall Ford" + chr(10) + "6445 Southwest Freeway" + chr(10) + "Houston TX 77074"
, Fields!UDFMake.Value.contains("International")
, "Pliler International" + chr(10) + "2016 S. Eastman Rd" + chr(10) + "Longview TX 75602"
, Fields!UDFMake.Value.contains("Freightliner")
, "Houston Freightliner, Inc" + chr(10) +"9550 North Loop East" + chr(10) + "Houston TX 77029"
, Fields!UDFMake.Value.contains("RAM")
, "Max Haik Dodge Chrysler Jeep" +chr(10)+ "11000 I-45 North Freeway" + chr(10) + "Houston TX 77037")),Fields!Addr0.Value)))
I agree with @Daniel, the error is most likely being produced by the Fields!UDFMake.Value.Contains call when the value is null, as IIF does not short-circuit.
As an alternative to the good options @Daniel mentioned, you can replace the Contains method with the InStr function, as in:
... , Switch(
InStr(Fields!UDFMake.Value,"Chevy") > 0
, "Knapp Chevrolet" + chr(10) + "PO box " + chr(10) + "Houston TX 77210" ...
This will not produce an error even when the value of the field is Null.
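For instance, the full expression from the question rewritten with InStr throughout (same dealer strings, only the Contains calls swapped) would look roughly like this:
=IIF(IsNothing(Fields!CoOppId.Value)
, "nothin"
, IIF(InStr(Fields!Addr1.Value, "TX") > 0
, IIF(Fields!UDFCustProv.Value = 1
, Fields!Addr0.Value
, Switch(
InStr(Fields!UDFMake.Value, "Chevy") > 0
, "Knapp Chevrolet" + chr(10) + "PO box " + chr(10) + "Houston TX 77210"
, InStr(Fields!UDFMake.Value, "Ford") > 0
, "Sterling McCall Ford" + chr(10) + "6445 Southwest Freeway" + chr(10) + "Houston TX 77074"
, InStr(Fields!UDFMake.Value, "International") > 0
, "Pliler International" + chr(10) + "2016 S. Eastman Rd" + chr(10) + "Longview TX 75602"
, InStr(Fields!UDFMake.Value, "Freightliner") > 0
, "Houston Freightliner, Inc" + chr(10) + "9550 North Loop East" + chr(10) + "Houston TX 77029"
, InStr(Fields!UDFMake.Value, "RAM") > 0
, "Max Haik Dodge Chrysler Jeep" + chr(10) + "11000 I-45 North Freeway" + chr(10) + "Houston TX 77037"))
, Fields!Addr0.Value))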
I'm going to take a guess that when your CoOppId value is NULL, the other fields in that row are also NULL. Because IIF does not use short-circuit logic (it always evaluates both sides of the IIF), you end up evaluating NULL.Contains("TX"), which generates an #Error because NULL is not a string and cannot be operated on with the Contains function.
There are two workarounds available for this scenario, neither of them particularly nice in my opinion, however:
1) Use nested IIFs to ensure that nothing is ever invalid.
IIF(expression is NULL
, CompanyAddress
, IIF(IIF(expression is NULL, "", expression) ='TX'
, IIF(BOOL=TRUE
,CompanyAddress
, SWITCH(DEALER ADDRESSES))
,CompanyAddress)
)
Look at the pseudo code above and notice the additional nested IIF around the expression that is using the CONTAINS functionality. If CoOppId doesn't exist, it substitutes an empty string for the CONTAINS check. Even though this branch never shows its value in the null scenario, it will at least be valid now.
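Concretely, with the fields from the question, each guarded check would look something like this, so a null field degrades to an empty-string comparison instead of an #Error:
IIF(IsNothing(Fields!Addr1.Value), "", Fields!Addr1.Value).Contains("TX")
IIF(IsNothing(Fields!UDFMake.Value), "", Fields!UDFMake.Value).Contains("Chevy")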
2) Create a code-behind function that actually does perform short circuit logic for you:
Public Function CheckForNull(ByVal CoOppId As String, ByVal Addr1 As String, ByVal UDFMake As String, ... all fields) As String
If String.IsNullOrEmpty(CoOppId) Then
Return "Nothing"
Else
Return *** do your calculation with your fields here
End If
End Function
Which you utilize in your report like:
=Code.CheckForNull(values....)
I just roughly laid out how such a code-behind function works; it's obviously not complete, but it should be enough to point you in the right direction.

PowerBuilder 12.5 sql cursors transaction size error

I have a major problem and am trying to find a workaround. I have an application in PB12.5 that works on both SQL Server and Oracle DBs (with a lot of data),
and I'm using a CURSOR at one point, but the application crashes only on SQL Server. Debugging in PB, I found that the SQL connection returns -1 due to a huge transaction size, but I want to fetch my data row by row. Is there any workaround to fetch the data in pages? I mean, fetch the first 1000 rows, then the next 1000, and so on. I hope you understand what I want to achieve (to break up the fetch process and thus reduce the transaction size, if possible). Here is my code:
DECLARE trans_Curs CURSOR FOR
SELECT associate_trans.trans_code
FROM associate_trans
WHERE associate_trans.usage_code = :ggs_vars.usage ORDER BY associate_trans.trans_code ;
OPEN trans_Curs;
FETCH trans_Curs INTO :ll_transId;
DO WHILE sqlca.sqlcode = 0
ll_index += 1
hpb_1.Position = ll_index
if not guo_associates.of_asstrans_updatemaster( ll_transId, ls_error) then
ROLLBACK;
CLOSE trans_Curs;
SetPointer(Arrow!)
MessageBox("Update Process", "Problem with the update process on~r~n" + sqlca.sqlerrtext)
cb_2.Enabled = TRUE
return
end if
FETCH trans_Curs INTO :ll_transId;
LOOP
CLOSE trans_Curs;
Since the structure of your source table is not fully presented, I'll make some assumptions here.
Let's assume that the records include a unique field that can be used as a reference (could be a counter or a timestamp). I'll assume here that the field is a timestamp.
Let's also assume that PB accepts cursors with parameters (not all solutions do; if it does not, there are simple workarounds).
You could modify your cursor to be something like:
[Note: I'm assuming also that the syntax presented here is valid for your environment; if not, adaptations are simple]
DECLARE TopTime TIMESTAMP ;
DECLARE trans_Curs CURSOR FOR
SELECT ots.associate_trans.trans_code
FROM ots.associate_trans
WHERE ots.associate_trans.usage_code = :ggs_vars.usage
AND ots.associate_trans.Timestamp < TopTime
ORDER BY ots.associate_trans.trans_code
LIMIT 1000 ;
:
:
IF (p_Start_Timestamp IS NULL) THEN
TopTime = CURRENT_TIMESTAMP() ;
ELSE
TopTime = p_Start_Timestamp ;
END IF ;
OPEN trans_Curs;
FETCH trans_Curs INTO :ll_transId;
:
:
In the above:
p_Start_Timestamp is a received timestamp parameter which would initially be empty and then will contain the OLDEST timestamp fetched in the previous invocation,
CURRENT_TIMESTAMP() is a function of your environment returning the current timestamp.
This solution will work solely when you need to progress in one direction (i.e. from present to past) and that you are accumulating all the fetched records in an internal buffer in case you need to scroll up again.
Hope this makes things clearer.
First of all, thank you FDavidov for your effort. I managed to do it using a dynamic DataStore instead of a cursor, so here is my solution in case someone else needs it.
String ls_sql, ls_syntax, ls_err
Long ll_row, ll_count
DataStore lds_info
ls_sql = "SELECT associate_trans.trans_code " &
+ " FROM associate_trans " &
+ " WHERE associate_trans.usage_code = '" + ggs_vars.usage +"' "&
+ " ORDER BY associate_trans.trans_code"
ls_syntax = SQLCA.SyntaxFromSQL( ls_sql, "", ls_err )
IF ls_err <> '' THEN
MessageBox( 'Error...', ls_err )
RETURN
END IF
lds_info = CREATE DataStore
lds_info.Create( ls_syntax, ls_err )
lds_info.SetTransObject( SQLCA )
ll_count = lds_info.Retrieve( ) // Retrieve() returns the number of rows fetched (or -1 on error)
FOR ll_row = 1 TO ll_count
ll_transId = lds_info.GetItemNumber( ll_row, 'trans_code' )
ll_index += 1
hpb_1.Position = ll_index
do while yield(); loop
if not guo_associates.of_asstrans_updatemaster( ll_transId, ls_error) then
ROLLBACK;
DESTROY lds_info
SetPointer(Arrow!)
MessageBox("Update Process", "Problem with the update process on~r~n" + sqlca.sqlerrtext)
cb_2.Enabled = TRUE
return
end if
NEXT
DESTROY lds_info

Error "DBCC execution completed" when running data connection from Excel

I get this error when I try to run this data connection from Excel 2010
Connection string:
Provider=SQLOLEDB.1;Persist Security Info=True;User ID=sa;Data Source=sql-server;Use Procedure for Prepare=1;Auto Translate=True;Packet Size=4096;Workstation ID=PV-SAMSUNG;Use Encryption for Data=False;Tag with column collation when possible=False;Initial Catalog=BVR_AUTOMAX
Command text:
EXECUTE sp_executesql N'
BEGIN
DBCC TRACEON(8765);
SELECT *
FROM OPENQUERY(SugarCRM, ''
select ticker_symbol,count(a.id) as pocet,sum(case when ifnull(a.account_erp_id,0)=''''0'''' then 0 else 1 end) as bvr, count(a.id) - sum(case when ifnull(a.account_erp_id,0)=''''0'''' then 0 else 1 end) as delta
from crm.accounts a inner join crm.users u on a.assigned_user_id=u.id
inner join crm.accounts_cstm ac on a.id=ac.id_c
where a.deleted=0
group by ticker_symbol
having delta>0 and bvr>0
order by delta desc;
'' );
END';
When I run this code in MS SQL Server Management Studio it works fine.
Thanks for your help
Petr
I found a solution, quite a simple one. Just change this line of code to the following (add the WITH NO_INFOMSGS option):
DBCC TRACEON(8765) WITH NO_INFOMSGS ;
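Presumably Management Studio tolerates the informational "DBCC execution completed..." message while Excel's OLEDB connection treats it as the first result and gives up before reaching the SELECT; WITH NO_INFOMSGS suppresses it. Inside the command text above, only the one line changes:
-- before: the informational DBCC message reaches Excel ahead of the result set
DBCC TRACEON(8765);
-- after: informational messages suppressed, only the SELECT's result set comes back
DBCC TRACEON(8765) WITH NO_INFOMSGS;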

Distributed Transaction SQL Error using Linked Server

I'm having some database troubles, and can't seem to find a solution anywhere for this specific problem. I'm trying to grab information from a database, using a table on the database and comparing it to a table on a linked server:
[3/7/14 10:10:14:181 EST] 00000021 SystemErr R org.apache.ibatis.exceptions.PersistenceException:
### Error querying database. Cause: com.microsoft.sqlserver.jdbc.SQLServerException: The operation could not be performed because OLE DB provider "SQLNCLI" for linked server "EADWH02" was unable to begin a distributed transaction.
### The error may involve com.moog.app.weldlog.dao.mybatis.WeldLogMapping.listEmployee-Inline
### The error occurred while setting parameters
### Cause: com.microsoft.sqlserver.jdbc.SQLServerException: The operation could not be performed because OLE DB provider "SQLNCLI" for linked server "EADWH02" was unable to begin a distributed transaction.
Here is one of the specific stored procedures that is throwing this error:
@workOrder varchar(25) = null,
@idPartNumber varchar(25) = null
begin
IF @workOrder is null
SELECT DISTINCT TOP 25 WO.WO_NBR, PARTS.PART_NBR
FROM EADWH02.MEDW.LBR.WO WO
JOIN EADWH02.MEDW.MFG.PARTS PARTS
ON PARTS.PART_KEY = WO.PART_KEY
AND PARTS.SRC_DB_KEY = WO.SRC_DB_KEY
WHERE (WO.WO_TYPE_KEY = '2' or WO.WO_TYPE_KEY = '3' or WO.WO_TYPE_KEY = '4' or WO.WO_TYPE_KEY = '5') and
(PARTS.PART_NBR like '' + @idPartNumber + '%')
ELSE IF @idPartNumber is null
SELECT DISTINCT TOP 25 WO.WO_NBR, PARTS.PART_NBR, dbo.getDescriptions(WO.WO_NBR) as strDescriptions
FROM EADWH02.MEDW.LBR.WO WO
JOIN EADWH02.MEDW.MFG.PARTS PARTS
ON PARTS.PART_KEY = WO.PART_KEY
AND PARTS.SRC_DB_KEY = WO.SRC_DB_KEY
WHERE ((WO.WO_TYPE_KEY = '2' or WO.WO_TYPE_KEY = '3' or WO.WO_TYPE_KEY = '4' or WO.WO_TYPE_KEY = '5') and
(WO.WO_NBR like '' + @workOrder))
ORDER BY WO.WO_NBR
end
((dbo.getDescriptions is a scalar-valued function and works fine))
And here is my xml mapping file in the project:
<select id="listWorkOrder" statementType="CALLABLE" parameterType="WorkOrder" resultMap="WorkOrderMap"> {
call listWorkOrder( #{workOrder, jdbcType=VARCHAR},
#{partNumber, jdbcType=VARCHAR})
}
</select>
Any chance anyone knows what's causing this error? This SQL stored procedure breaks both on a remote server we're using and on a local server. Thanks for the assistance!