System i SQL stream file from IFS

IBM System i V7R3M0. I would like to pull a .json file with SQL, but SQL cannot resolve QSYS2.IFS_READ.
select x.id, x.username, x.firstName, x.lastName, x.email
from json_table(
         QSYS2.IFS_READ(PATH_NAME => '/FileTransferData/userList.json'),
         'lax $'
         columns (
             id        INTEGER     PATH 'lax $.id',
             username  VARCHAR(30) FORMAT JSON PATH 'lax $.username',
             firstName VARCHAR(30) FORMAT JSON PATH 'lax $.firstName',
             lastName  VARCHAR(30) FORMAT JSON PATH 'lax $.lastName',
             email     VARCHAR(75) FORMAT JSON PATH 'lax $.email'
         )
     ) as x;
The error is:
SQL State: 42704
Vendor Code: -204
Message: [SQL0204] IFS_READ in QSYS2 type *N not found. Cause . . . . . : IFS_READ in QSYS2 type *N was not found.
The job log from QZDASOINIT tells me the same thing: IFS_READ in QSYS2 was not found.
Job 111566/QUSER/QZDASOINIT started on 04/12/21 at 12:23:06 in subsystem QUSRWRK in QSYS. Job entered system on 04/12/21 at 12:23:06.
User MOLNARJ from client 10.111.0.24 connected to server.
The following special registers have been set: CLIENT_ACCTNG: ,
CLIENT_APPLNAME: System i Navigator - Run SQL Scripts, CLIENT_PROGRAMID:
cwbunnav.exe, CLIENT_USERID: MOLNARJ, CLIENT_WRKSTNNAME:
FS2ISYMOB943.umassmemorial.org
Trigger Q__OSYS_QAQQINI_BEFORE_INSERT_______ in library QTEMP was added to file QAQQINI in library QTEMP.
Trigger Q__OSYS_QAQQINI_AFTER_INSERT________ in library QTEMP was added to file QAQQINI in library QTEMP.
Trigger Q__OSYS_QAQQINI_BEFORE_UPDATE_______ in library QTEMP was added to file QAQQINI in library QTEMP.
Trigger Q__OSYS_QAQQINI_AFTER_UPDATE________ in library QTEMP was added to file QAQQINI in library QTEMP.
Trigger Q__OSYS_QAQQINI_BEFORE_DELETE_______ in library QTEMP was added to file QAQQINI in library QTEMP.
Trigger Q__OSYS_QAQQINI_AFTER_DELETE________ in library QTEMP was added to file QAQQINI in library QTEMP.
Object QAQQINI in QTEMP type *FILE created.
1 objects duplicated.
IFS_READ in QSYS2 type *N not found.
I'm between a rock and a hard place. My outsourced support company claims that
I need to create a program to utilize the API. However, I believe the API is not installed.
I have based my work on IBM technical documents such as this:
https://www.ibm.com/docs/en/i/7.4?topic=is-ifs-read-ifs-read-binary-ifs-read-utf8-table-functions
Running the example in this document (with the file path and name changed to mine) gives the same error:
SELECT * FROM TABLE(QSYS2.IFS_READ(PATH_NAME => '/FileTransferData/userList.json', END_OF_LINE => 'CRLF'));
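One way to confirm whether the table functions exist at all on the system is to query the catalog. This is a quick check, not a fix; QSYS2.SYSFUNCS is the Db2 for i catalog view that lists the SQL functions actually present:

-- List any IFS_READ* table functions installed in QSYS2;
-- zero rows back means the function really is not on the system yet.
SELECT routine_schema, routine_name
FROM qsys2.sysfuncs
WHERE routine_schema = 'QSYS2'
  AND routine_name LIKE 'IFS_READ%';

If this returns no rows, no authority change or wrapper program will help; the Db2 PTF group has to be brought up to the required level first.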

That link indicates PTF group SF99703 level 22 is required. You can check what's installed or available with:
with iLevel(iVersion, iRelease) as (
    select OS_VERSION, OS_RELEASE
    from sysibmadm.env_sys_info
)
select
    case PTF_GROUP_CURRENCY when 'INSTALLED LEVEL IS CURRENT' then '' else PTF_GROUP_CURRENCY end,
    PTF_GROUP_ID "ID",
    PTF_GROUP_TITLE "Title",
    PTF_GROUP_LEVEL_INSTALLED "Installed",
    PTF_GROUP_LEVEL_AVAILABLE "Available",
    PTF_GROUP_LEVEL_AVAILABLE - PTF_GROUP_LEVEL_INSTALLED "Diff",
    date(to_date(PTF_GROUP_LAST_UPDATED_BY_IBM, 'MM/DD/YYYY')) "Available since",
    current date - date(to_date(PTF_GROUP_LAST_UPDATED_BY_IBM, 'MM/DD/YYYY')) "Days since available",
    PTF_GROUP_RELEASE "Release",
    PTF_GROUP_STATUS_ON_SYSTEM "Status"
from
    iLevel,
    systools.group_ptf_currency P
where
    PTF_GROUP_ID = 'SF99703'
order by
    PTF_GROUP_LEVEL_AVAILABLE - PTF_GROUP_LEVEL_INSTALLED desc;

Related

How should I get snowpipe auto-ingest working?

Following is my Snowpipe definition:
create or replace pipe protection_job_runs_dms_test auto_ingest = true as
copy into protection_job_runs_dms_test_events
from (select t.$1, t.$2, t.$3, t.$4, t.$5, t.$6, t.$7, t.$8, t.$9, t.$10, t.$11, t.$12,
             t.$13, t.$14, t.$15, t.$16, t.$17, t.$18, t.$19, t.$20, t.$21, t.$22, t.$23, t.$24,
             current_timestamp
      from #S3DMSTESTSTAGE t)
file_format = (FIELD_OPTIONALLY_ENCLOSED_BY = '"')
pattern = 'dmstest/(?!LOAD).*[.]csv';
When I execute the COPY command manually, it works correctly.
Does anyone know what the issue might be?
According to the comments on your question, you tested your COPY command by loading the same files without Snowpipe beforehand. This means your files have already been loaded once, and thus you cannot load them again with Snowpipe afterwards. Reason: Snowflake prevents loading the same files twice by default.
You can add the FORCE=TRUE parameter to your COPY command to override this behaviour and load all files, regardless of whether they have been loaded before.
More info about the FORCE parameter here: https://docs.snowflake.com/en/sql-reference/sql/copy-into-table.html
create or replace pipe protection_job_runs_dms_test auto_ingest = true as
copy into protection_job_runs_dms_test_events
from (select t.$1, t.$2, t.$3, t.$4, t.$5, t.$6, t.$7, t.$8, t.$9, t.$10, t.$11, t.$12,
             t.$13, t.$14, t.$15, t.$16, t.$17, t.$18, t.$19, t.$20, t.$21, t.$22, t.$23, t.$24,
             current_timestamp
      from #S3DMSTESTSTAGE t)
file_format = (FIELD_OPTIONALLY_ENCLOSED_BY = '"')
pattern = 'dmstest/(?!LOAD).*[.]csv'
force = true;
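Separately, when auto-ingest still loads nothing, it can help to inspect the pipe and the recent load history. This is a debugging sketch: SYSTEM$PIPE_STATUS and the INFORMATION_SCHEMA.COPY_HISTORY table function are standard Snowflake features, and the names below are taken from this example.

-- Check that the pipe is running and has a notification channel
SELECT SYSTEM$PIPE_STATUS('protection_job_runs_dms_test');

-- Review what Snowpipe tried to load in the last 24 hours, including errors
SELECT file_name, status, first_error_message
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
    TABLE_NAME => 'protection_job_runs_dms_test_events',
    START_TIME => DATEADD(hour, -24, CURRENT_TIMESTAMP())));

The pipe status JSON includes a notificationChannelName; the S3 bucket's event notifications must point at that channel, or auto-ingest never fires.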

ABAP syntax error in production only, but not in dev/QA

I have recently come across a syntax error, "No type can be derived from the context for the operator 'VALUE'", in production that did not happen in dev or QA.
Declare a standard table type gty_t_type (of char20) in my class.
Create method A with an importing parameter iv_param of type gty_t_type.
Call class=>method A( exporting iv_param = value #( ( 'xxx' ) ( 'zzz' ) ) ).
The code compiled fine in dev and was imported into QA without errors, but it failed to import into production with a syntax error. What might have caused the difference and led to the error?
Thanks!

Access violation at address 04FD6CC2 in module 'dbxora.dll'. Read of address 00000004

I have created a Delphi database application that uses the DBX TSQLConnection component to connect to an Oracle database (version 19c).
I'm getting an access violation error when I call the Oracle LISTAGG function (on the SQLQuery1.Open line).
When I debugged, I traced the error to the following unit:
FileName : Data.DBXDynalink.pas
Function : function TDBXDynalinkReader.DerivedNext: Boolean;
Error Line : DBXResult := FMethodTable.FDBXReader_Next(FReaderHandle);
Actual Error : Access violation at address 04FD6CC2 in module 'dbxora.dll'. Read of address 00000004
Below is my code:
// ...SQLQuery1 initialization...
SQLQuery1.CommandText := Trim(memoSQLText.Lines.Text); // assign the query
SQLQuery1.Open; // the access violation is raised exactly on this line
if SQLQuery1.RecordCount > 0 then
  // ...do something here...
Note: the same query executes fine against Oracle versions below 19c (19.3).
IDE version used for application development : DELPHI XE3 (i have checked with DELPHI 10.1 Berlin also)
DB version : Oracle 19C (19.3)
Steps to reproduce:
1. Execute the queries below to create test data:
create table myuserlist(myuser varchar2(10));
Insert into myuserlist(myuser) values('karthik');
Insert into myuserlist(myuser) values('aarush');
Insert into myuserlist(myuser) values('yuvan');
2. Try to open the following query using TSQLConnection and TSQLQuery:
select listagg(a.myuser, ', ') within group (order by a.myuser) as myusernames from myuserlist a
A sample project is available on GitHub:
https://github.com/yuvankarthik/DELPHI-DemoOracleConnect.git
Please help me resolve this issue.
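One thing that may be worth trying while this is debugged (an unverified guess, not a confirmed fix): give the driver an explicit result type by wrapping the aggregate in a CAST, so dbxora.dll sees a plain VARCHAR2 of known length rather than whatever column metadata it mishandles for LISTAGG on 19c:

-- Hypothetical workaround: the same query with an explicit VARCHAR2 length
select cast(listagg(a.myuser, ', ') within group (order by a.myuser)
            as varchar2(4000)) as myusernames
from myuserlist a;

If the CAST version opens cleanly, the problem lies in how the driver describes the LISTAGG column, not in the query itself.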

How to edit an XML column in SQL Developer?

I would like to edit an XML column, displayed as (XMLTYPE), in the SQL Developer editor (I get there by double-clicking the field, editing, then saving).
After that the displayed value changes to sqldev.xml:/home/myuser/.sqldeveloper/tmp/XMLType8226206531089284015.xml
Build after save retrieving next build context...
Build after save building project 1 of 1 queued projects
Compiling...
Ignoring /home/username/.sqldeveloper/tmp/XMLType5691884284875805681.xml; not on source path
[11:45:33 AM] Compilation complete: 0 errors, 1 warnings.
Build after save finished
and when I try to commit:
UPDATE "USERNAME"."TABLENAME" SET WHERE ROWID = 'AABWNKAAEAAABSbAAB' AND ORA_ROWSCN = '6951979'
One error saving changes to table "USERNAME"."TABLENAME":
Row 1: Illegal format in column NEXTCOLUMN.
I searched for this error and found other people who also hit it, but no solution.
Advice on how to report it to Oracle would also be helpful.
I hope this will be of help to you:
UPDATE table_name
SET table_column =
    UPDATEXML(table_column,
              '/sampleTag1/sampleTag2/text()', 'value2')
WHERE some_column = some_value; -- << this is where you put your condition
Here is where you can find more about it:
https://docs.oracle.com/cd/B19306_01/server.102/b14200/functions205.htm
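If the goal is to replace the whole document rather than a single node, the column can also be assigned a new XMLTYPE value directly. This is a sketch with the same hypothetical names as above:

-- Replace the entire XML document in one statement
UPDATE table_name
SET table_column = XMLTYPE('<sampleTag1><sampleTag2>value2</sampleTag2></sampleTag1>')
WHERE some_column = some_value;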
-------------------------
If your problem is with editing manually through SQL Developer's integrated editor, then, as far as my testing and research can tell, it is caused by the SQL Developer version.
You noted in your comment that you use version 4.1.x, and I have found a few places where people confirm they had the same problem with that version.
I also have a 4.1.x version, and I successfully reproduced your error, where SQL Developer complains that the .xml file in my ...\sqldeveloper\tmp folder is not on its source path:
Compiling... Ignoring C:\Users\trle\AppData\Roaming\SQL
Developer\tmp\XMLType6413036461637067751.xml; not on source path
[4:33:29 PM] Compilation complete: 0 errors, 1 warnings.
I then downloaded version 19.2.x, where there is no such problem and everything works fine.
So my answer to your problem is to download a newer version of SQL Developer. In my case, 19.2 works.
-------------------------
UPDATE: just tested on version 4.2.x; it also works.

What is the reason for null entries in the export address table of a Windows PE file?

I have a project I have built that inspects a Windows PE file. When processing certain files, such as User32 and Shell32, I notice there are entries in the export address table that are 0. What is the purpose of having a null (0) entry in the export address table? (An entry of 0 does not resolve to a valid virtual address)
FYI: applications like NikPEViewer and DLL Export Viewer will not show these entries at all; DumpBin shows exports that are not contained in the export name table and ordinal table, but it skips the null entries.
I only have a partial answer.
Suppose we write the following exports .def file (for the Win 8.1 x64 user32.dll):
EXPORTS
...
wvsprintfW #2412
NtUserDelegateInput #2503 NONAME
...
with no ordinals at all between #2412 and #2503. To preserve the requested ordinal order, the linker has to generate 90 zero entries in the export address table. So the linker's reason is clear: it is honoring our designation. But this redirects us to another question: what is the reason for writing this kind of .def file in the first place?
I think it is somehow related to rewriting this file from version to version. For example, in user32.dll from Win 7 x64, the highest ordinal is #2502 (compare NtUserDelegateInput #2503).
In the Win 10 x64 user32.dll you can see:
...
NtUserUpdateWindowTrackingInfo #2585 NONAME
; the interval [#2586, #2700) is zero
GetDialogBaseUnits96 #2700 NONAME
; #2701 is zero
EnablePerMonitorMenuScaling #2702 NONAME
The new API set of exports begins at ordinal #2700 (is the range [#2586, #2700) reserved?). But user32.dll does not, in general, export known (stable) ordinals, so they would not have to be preserved from version to version. So the reason for assigning ordinals directly in the DEF file at all remains unclear to me.