Omitting the creation of a temporary access path with DB2/400 when accessing DDS-defined tables with SQL

I have two table definitions in DDS, compiled into *FILE objects and filled with data:
Kunpf:
     A                                      UNIQUE
     A          R KUNTBL
     A            FIRMA         60A         ALWNULL
     A            KUNR           5S 0B
     A            KUNID          4S 0B
     A          K KUNR
     A          K KUNID
Kunsupf:
     A          R KUNSUTBL
     A            KUNID          R    B     REFFLD(KUNID KUN/KUNPF)
     A
     A            SUCHSTR       78A
     A          K SUCHSTR
     A          K KUNID
I'm using the following statement in interactive SQL (STRSQL):
SELECT DISTINCT FIRMA, KUNR FROM KUN/KUNPF
LEFT JOIN KUN/KUNSUPF ON (KUNPF.KUNID = KUNSUPF.KUNID)
WHERE SUCHSTR LIKE 'Freiburg%'
ORDER BY FIRMA
FOR READ ONLY
Every time I execute this statement, there is a considerable delay before the answer screen opens. Beforehand, a message is shown stating that a temporary access path is being created.
How can I find out which temporary access path is created, and how? My goal is to make this access path permanent so that it doesn't need to be rebuilt on every invocation of this query.
I searched the net (especially the IBM site), but what I found was mostly for DB2 on z/OS. The F4 prompting facility in STRSQL doesn't help: I was searching for something like MySQL's EXPLAIN SELECT. The IBM DB2 Advanced Functions and Administration PDF states that there is a debug mode, but it seems to be available only from some (old) Windows tool I don't remember having.
I'm using V4R5, if that's relevant.

To see the access path messages on the green screen:
STRDBG
STRSQL
run your statement
exit with F3
ENDDBG
DSPJOBLOG
The access path messages are at the bottom of the log (F10, then F18, AFAIK).
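Once those joblog messages tell you which key columns the optimizer built the temporary access path over, you can create a permanent index so the rebuild stops. A minimal sketch, assuming the joblog suggests an access path over KUNPF keyed by KUNID for the join (the index name KUNPFIX1 is made up; take the actual key columns from the joblog suggestion):
-- hypothetical index name; key columns come from the joblog suggestion
CREATE INDEX KUN/KUNPFIX1
ON KUN/KUNPF (KUNID)
With the index in place, rerun the query under STRDBG and check the joblog again; the optimizer should now report using the existing access path instead of building a temporary one.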

v4r5??? That's like 20 years old...
For IBM i, the "Run SQL Scripts" component of both the old iSeries Navigator (part of Client Access for Windows) and the new Access Client Solutions (ACS) contains Visual Explain (VE).
Luckily, it seems it was already available at V4R5:
http://ibmsystemsmag.com/ibmi/administrator/db2/database-performance-tuning-with-visual-explain/
Just start iNav, right click on "Database" and select "Run SQL Scripts"
Paste your query there and click "Visual Explain" -->"Run and Explain"
(or the corresponding button)
Alternatively, on the green screen:
Do a STRDBG to enter debug mode, F12 to continue, and then go into STRSQL. The DB2 optimizer will then output additional messages into the joblog, giving you more information about what it is doing.

How to edit an XML column in SQL Developer?

I would like to edit an XML column which is displayed as (XMLTYPE) in the SQL Developer editor (I get there by double-clicking the field, editing, then saving).
After that, the displayed value changes to sqldev.xml:/home/myuser/.sqldeveloper/tmp/XMLType8226206531089284015.xml
Build after save retrieving next build context...
Build after save building project 1 of 1 queued projects
Compiling...
Ignoring /home/username/.sqldeveloper/tmp/XMLType5691884284875805681.xml; not on source path
[11:45:33 AM] Compilation complete: 0 errors, 1 warnings.
Build after save finished
and when I try to commit:
UPDATE "USERNAME"."TABLENAME" SET WHERE ROWID = 'AABWNKAAEAAABSbAAB' AND ORA_ROWSCN = '6951979'
One error saving changes to table "USERNAME"."TABLENAME":
Row 1: Illegal format in column NEXTCOLUMN.
I tried to look for this error and found people who also had it, but no solution.
If you have advice on how to report it to Oracle, that would also be helpful.
Hope this will be of help to you:
UPDATE table_name
SET table_column =
    UPDATEXML(table_column,
              '/sampleTag1/sampleTag2/text()', 'value2')
WHERE some_column = some_value --<< this part is where you put your condition
Here is where you can find more about it:
https://docs.oracle.com/cd/B19306_01/server.102/b14200/functions205.htm
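For instance, a minimal sketch (the table, column, and predicate here are hypothetical): if XML_DOC holds <order><status>OPEN</status></order>, this rewrites the status text node in place, bypassing the editor entirely:
-- hypothetical table/column names; UPDATEXML replaces the matched text node
UPDATE orders_tbl
SET xml_doc = UPDATEXML(xml_doc, '/order/status/text()', 'CLOSED')
WHERE order_id = 42;
Because this runs as plain SQL, it works regardless of which SQL Developer version you use.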
-------------------------
If your problem is with editing manually through SQL Developer's integrated editor, then it is, as far as my testing and research can tell, because of the SQL Developer version.
You noted in your comment that you use version 4.1.x, and I have found a few places where people confirm they had the same problem with this version.
I also have the 4.1.x version, and I have successfully reproduced your error, where the developer complains that my .xml file in my ...\sqldeveloper\tmp folder is not on its source path:
Compiling... Ignoring C:\Users\trle\AppData\Roaming\SQL
Developer\tmp\XMLType6413036461637067751.xml; not on source path
[4:33:29 PM] Compilation complete: 0 errors, 1 warnings.
I then downloaded version 19.2.x, where there is no such problem and everything works fine.
So my answer to your problem is to download a newer version of SQL Developer. In my case, 19.2 works.
-------------------------
UPDATE: Just tested on version 4.2.x; it also works.

HSTMT, UCHAR, SDWORD meaning in ODBC Log

I have the following lines in my ODBC Log :
(Scenario: pgAdmin4, using PostgreSQL server through ODBC32 in another application named System Administration)
System Admi 6a0-6f8 ENTER SQLExecDirect
HSTMT 0x00630718
UCHAR * 0x036B29C0 [ 63] "create table new (npages integer, ifnds integer, ifnid integer)"
SDWORD 63
I need to know the meaning of HSTMT, UCHAR * and SDWORD. What do the numbers next to them mean?
From the article Everything You Want To Know About ODBC Tracing
In the trace log, you’ll see the data type of the handle being
SQLHANDLE, HENV / SQLHENV, HDBC / SQLHDBC, HSTMT / SQLHSTMT. This
value is unique to the particular series of items being called. You
can search in the trace log for this handle number and each call made
on the handle will be shown. This allows you to pick out only the
calls in the trace log that are relevant to your error message. Keep
in mind, some applications may create multiple handles. For example,
your application may execute three statements consecutively with each
residing on a different statement handle.
For examples and more details, see this helpful document:
How to read an ODBC trace file.rtf
This is a record of ODBC tracing.
The call was to SQLExecDirect with statement handle 0x00630718; the text argument (passed as a pointer) is "create table ...", and the last argument is the length of that text. See the related reference in the Microsoft documentation.
HSTMT, UCHAR and SDWORD are data types used by the C/C++ ODBC API on Windows: HSTMT is a statement handle, UCHAR * is a pointer to unsigned char (here, the SQL text), and SDWORD is a signed double word (a 32-bit integer; here, the statement length 63). The hexadecimal numbers are the values of the handle and of the pointer.

Invalid operation result set is closed errorcode 4470 sqlstate null - DB2 data extract

I am running a very simple query and trying to extract the results to a text file. The entire query is essentially what is below: I am selecting everything from one single table, with one piece of WHERE criteria limiting the data to one month's worth. After it has extracted around 1.2 GB, this error shows up. Is there any way to work around this other than extracting smaller date ranges? I am trying to pull a couple of years' worth of data, so if I can only get it a few days at a time, it will take a lot of manual work.
I am currently using the free trial of a DB2 query tool (RazorSQL), if that makes a difference; I can probably purchase different software if it would help. I am trying to get IBM's tool, but for some reason it freezes during the download, so I am still working on that. I have searched for this error, but everything I see seems much more complex than what I am doing, and I can't tell whether it applies. Thanks in advance.
select *
from MyTable
where date_col between date '2014-01-01' and date '2014-01-31'
I stumbled over this error too and found out it is related to the db2jcc.jar (type 4) driver.
Excerpt: if there are no items left in the result set (or none to begin with), the result set is closed automatically, hence the exception. The suggestion is to handle it in the application; in my case, I started checking if (rs.next()), but otherwise there is a workaround. Check the source link below for how you can set some properties on the data source and avoid the exception.
Source :
"Invalid operation: result set is closed" error with Data Server Driver for JDBC
In my case, I was missing some properties in WAS; after adding allowNextOnExhaustedResultSet the issue was fixed.
1. Log in to the WebSphere Application Server administration console.
2. Select Resources > JDBC > Data sources > Application Center DataSource name > Custom properties and click New.
3. In the Name field, enter allowNextOnExhaustedResultSet.
4. In the Value field, type 1.
5. Change the type to java.lang.Integer.
6. Click OK.
Sometimes you also need to check whether the resultSetHoldability property exists; for details, refer here.
I also encountered this failure when upgrading from the JDBC type 2 driver (db2java.zip) to the JDBC type 4 driver (db2jcc4.jar):
Statement statement = results.getStatement();
if (statement != null)
{
    connection = statement.getConnection(); // ** failed here
    statement.close();
}
The solution was to check whether the statement was closed. Changed to:
Statement statement = results.getStatement();
if (statement != null && !statement.isClosed())
{
    connection = statement.getConnection();
    statement.close();
}
Creating the property below with type Integer worked for me:
allowNextOnExhaustedResultSet
I had the same issue on WAS 7, so I had to add and change a few things in the Admin Console.
This TeamWorksRuntimeException should be fixed by applying APAR JR50863, which is available on top of BPM V8.5.5 or included in BPM V8.5 refresh pack 6.
In case the APAR does not solve the problem, try the following workaround:
Log in to the WebSphere Application Server admin console
Select Resources > JDBC > Data sources > DataSource name (TeamWorksDB) > Custom properties and click New
In the Name field, enter downgradeHoldCursorsUnderXa
In the Value field, type true
Change the type to java.lang.Boolean
Click OK to save your changes
Select custom property resultSetHoldability
In the Value field, type 1
Click OK to save your changes
Source of the answer: https://developer.ibm.com/answers/questions/194821/invalid-operation-result-set-is-closed-errorcode-4/
Restarting the app may fix the problem if the connection pool has lost its session to Db2. If you are using Tomcat, the connection pool property testOnBorrow may reestablish the connection to Db2.

UPDATEBLOB fails in PowerBuilder

In PowerBuilder I am trying to update a table (Oracle) with a blob, but I get a SQL error: "Database statement must refer to blob variable". My declaration and UPDATEBLOB statements are as follows:
blob lblob_newxml
long llong_subid
UPDATEBLOB RP_XML_FORMS SET XML_DOC = :lblob_newxml
WHERE SUBMISSION_ID = :llong_subid
USING SQLCA;
Does anybody know why this is happening and/or how to solve it? Thanks.
To get more information on this problem and its possible causes, I'd run with one of the database traces turned on. (You can check out database trace options in the Connecting to Your Database manual; the link may not match your PB version, which you haven't mentioned yet.) This may or may not tell you more, but it tracks everything between the app and the point where the PB drivers pass the commands "over the wall" to the database's driver.
Good luck,
Terry.
"The PowerBuilder VM can get the SQL syntax for the following types of errors, and passes it to the Transaction object’s DBError event for the following types of errors: ..." (see this page).
If your lblob_newxml is null, then use this UPDATE statement instead:
UPDATE RP_XML_FORMS SET XML_DOC = NULL
WHERE SUBMISSION_ID = :llong_subid
USING SQLCA;

Restarting my delta load after deleting the InfoPackage request in the PSA by mistake

Here I have got an issue; can someone please help me resolve it?
I was trying to extract some data with DataSource 0FI_AP_6...
Then, in the InfoPackage monitor, I can see:
-->Requests (messages): Everything OK
-->Extraction (messages): Everything OK
-->Transfer (IDocs and TRFC): Missing messages or warnings
-->Info IDoc 2 : sent, not arrived ; IDoc ready for dispatch (ALE service)
Data Package 1 : 23752 Records arrived in BW
Data Package 2 : 15216 Records arrived in BW
Request IDoc : Application document posted
Info IDoc 1 : Application document posted
Info IDoc 3 : Application document posted
Info IDoc 4 : Application document posted
-->Processing (data packet): Everything OK
Data Package 1 ( 38672 Records ) : Everything OK
In the status menu I get a message like:
Missing data packages for PSA Table
Diagnosis
Data packets are missing from PSA Table . BI processing does not
return any errors. The data transport from the source system to BI was
probably incorrect.
Procedure
Check the tRFC overview in the source system.
You access this log using the wizard or following the menu path
"Environment -> Transact. RFC -> Source System".
Error handling:
If the tRFC is incorrect, resolve the errors listed there.
Check that the source system is connected properly to BI. In
particular, check the remote user authorizations in BI.
Please suggest how I can resolve this issue.
Thanks in advance for your help; a quick reply is much appreciated.
The worst thing is that I deleted the InfoPackage request in the PSA by mistake.
Normally, if I repeated the process, the delta load would be OK, but now the delta load keeps failing.
So, gurus:
1. How can I restart my delta loading correctly?
2. I want to modify the timestamp in the delta table, but how do I do it?
Go to transaction RSA7 in the source system. This will tell you the date/timestamp that the delta is set to. If the date was changed to a range that no longer works, then you will need to re-initialize the DataSource on the BW system side. However, the delta date may still be fine, because it may never have been changed when you first tried your load, due to the connection issues.
You can create a new InfoPackage and set the update mode to "Initialize DataSource with Data Transfer". This will essentially run a full load from the DataSource and then reset the delta pointer date/timestamp to when you ran it. This way you will capture all the data that you need, and anything that was already in the PSA should be overwritten.
Also note that you should delete, or set the request status to red on, the previous request that may contain bad data in the PSA.
From the original error, it seems you are having an RFC connection issue between the DataSource and BW. Contact your Basis support and have them check the connection to make sure it is good. To verify that your DataSource is extracting properly, you can run transaction RSA3 on it in the source system. This will confirm that the extraction of data is working properly.