Db2 LUW Java SP issue - db2-luw

I am trying to run a stored procedure that is registered as an external Java program in Db2. Whenever I call it, I keep getting the error:
SQL4304N Java stored procedure or user-defined function
"RSARKAR.CREATE_USER", specific name "CREATE_USER" could not load Java class
"/home/rsarkar/sqllib/function/jar/RSARKAR", reason code "". SQLSTATE=42724
The db2diag.log shows the following messages (with DIAGLEVEL 4):
2023-01-03-22.10.02.119785+000 I27596E440 LEVEL: Info
PID : 5363 TID : 139874111354624 PROC : db2fmp (
INSTANCE: rsarkar NODE : 000 DB : SAMPLE
APPID : *LOCAL.rsarkar.230103220934
HOSTNAME: ip-172-31-20-141.us-west-1.compute.internal
FUNCTION: DB2 UDB, BSU Java support, sqlejCallJavaRoutine_dll, probe:90
DATA #1 : String, 47 bytes
Exception thrown during class loader loadClass:
2023-01-03-22.10.02.119949+000 I28037E448 LEVEL: Warning
PID : 5363 TID : 139874111354624 PROC : db2fmp (
INSTANCE: rsarkar NODE : 000 DB : SAMPLE
APPID : *LOCAL.rsarkar.230103220934
HOSTNAME: ip-172-31-20-141.us-west-1.compute.internal
FUNCTION: DB2 UDB, BSU Java support, sqlejCallJavaRoutine_dll, probe:40
MESSAGE : Class loader loadClass failed. Possible out of memory in JAVA_HEAP_SZ
I've tried with JAVA_HEAP_SZ set to 1,000,000 but still get the same error.
Java Virtual Machine heap size (4KB) (JAVA_HEAP_SZ) = 1000000
2023-01-03-22.10.02.120041+000 I28486E636 LEVEL: Warning
PID : 5363 TID : 139874111354624 PROC : db2fmp (
INSTANCE: rsarkar NODE : 000 DB : SAMPLE
APPID : *LOCAL.rsarkar.230103220934
HOSTNAME: ip-172-31-20-141.us-west-1.compute.internal
FUNCTION: DB2 UDB, BSU Java support, sqlejCallJavaRoutine_dll, probe:40
MESSAGE : PATH to jar is in the form of sqllib/function/jar/.
jar_id:
DATA #1 : Hexdump, 17 bytes
0x00007F372B039C22 : 2252 4453 4442 2020 2022 2E44 4232 4A41 "RSARKAR ".DB2JA
0x00007F372B039C32 : 52 R
2023-01-03-22.10.02.120139+000 I29123E871 LEVEL: Warning
PID : 5363 TID : 139874111354624 PROC : db2fmp (
INSTANCE: rsarkar NODE : 000 DB : SAMPLE
APPID : *LOCAL.rsarkar.230103220934
HOSTNAME: ip-172-31-20-141.us-west-1.compute.internal
FUNCTION: DB2 UDB, BSU Java support, sqlejCallJavaRoutine_dll, probe:40
MESSAGE : If not using JAR, then common paths are sqllib/function and
sqllib/function/unfenced. Check that you have file .class or this
jar:
DATA #1 : Hexdump, 52 bytes
0x00007F372B03ADA0 : 2F68 6F6D 652F 7264 7364 622F 7371 6C6C /home/rsarkar/sqll
0x00007F372B03ADB0 : 6962 2F66 756E 6374 696F 6E2F 6A61 722F ib/function/jar/
0x00007F372B03ADC0 : 5244 5344 422F 4442 324A 4152 2E6A 6172 RSARKAR/DB2JAR.jar
0x00007F372B03ADD0 : 3A44 6232 :Db2
2023-01-03-22.10.02.120232+000 I29995E766 LEVEL: Warning
PID : 5363 TID : 139874111354624 PROC : db2fmp (
INSTANCE: rdsdb NODE : 000 DB : SAMPLE
APPID : *LOCAL.rdsdb.230103220934
HOSTNAME: ip-172-31-20-141.us-west-1.compute.internal
FUNCTION: DB2 UDB, BSU Java support, sqlejCallJavaRoutine_dll, probe:41
MESSAGE : Method missing from the class above:
DATA #1 : Hexdump, 63 bytes
0x00007F372B03ADD5 : 6372 6561 7465 284C 6A61 7661 2F6C 616E create(Ljava/lan
0x00007F372B03ADE5 : 672F 5374 7269 6E67 3B4C 6A61 7661 2F6C g/String;Ljava/l
0x00007F372B03ADF5 : 616E 672F 5374 7269 6E67 3B4C 6A61 7661 ang/String;Ljava
0x00007F372B03AE05 : 2F6C 616E 672F 5374 7269 6E67 3B29 56 /lang/String;)V
2023-01-03-22.10.02.122131+000 I32501E843 LEVEL: Info
PID : 5363 TID : 139874111354624 PROC : db2fmp (
INSTANCE: rsarkar NODE : 000 DB : SAMPLE
APPID : *LOCAL.rsarkar.230103220934
HOSTNAME: ip-172-31-20-141.us-west-1.compute.internal
FUNCTION: DB2 UDB, oper system services, sqlofica, probe:10
DATA #1 : SQLCA, PD_DB2_TYPE_SQLCA, 136 bytes
sqlcaid : SQLCA sqlcabc: 136 sqlcode: -4304 sqlerrml: 70
sqlerrmc: RSARKAR.CREATE_USER CREATE_USER /home/rsarkar/sqllib/function/jar/RSARKAR
sqlerrp : SQLEJEXT
sqlerrd : (1) 0x00000000 (2) 0x00000000 (3) 0x00000000
(4) 0x00000000 (5) 0x00000000 (6) 0x00000000
sqlwarn : (1) (2) (3) (4) (5) (6)
(7) (8) (9) (10) (11)
sqlstate: 42724
The definition of the SP is as follows:
call sqlj.install_jar( 'file:/home/rsarkar/plugins/DB2.jar', 'DB2JAR');
call sqlj.refresh_classes();
CREATE OR REPLACE PROCEDURE RSARKAR.CREATE_USER
(
USERID VARCHAR(128),
PASS VARCHAR(128) DEFAULT NULL,
GROUPS VARCHAR(256) DEFAULT NULL
)
LANGUAGE JAVA
SPECIFIC RDSADMIN.CREATE_USER
PARAMETER STYLE JAVA
DETERMINISTIC
NO EXTERNAL ACTION
NO SQL
FENCED THREADSAFE
EXTERNAL NAME 'DB2JAR:Db2.create';
I was expecting it to run, but it keeps giving me an error saying it cannot find the method in the class.
I tried a smaller test with a simple Java program and registered it as follows:
create or replace procedure MYJAVASP (in input char(6))
specific myjavasp
dynamic result sets 0
deterministic
language java
parameter style java
no dbinfo
fenced
threadsafe
modifies sql data
program type sub
external name 'MYJAVASPJAR:MYJAVASP.my_JAVASP';
This one works fine without any issues. Any pointers or ideas are welcome.
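For comparison, the simple test's Java class is not shown here, but given the EXTERNAL NAME 'MYJAVASPJAR:MYJAVASP.my_JAVASP' it presumably looks something like the following, with the class in the default package so the unqualified class name resolves at the root of the jar:
// Assumed shape of the simple test class (not in the original post):
// default package, public static void method as PARAMETER STYLE JAVA expects.
public class MYJAVASP
{
    public static void my_JAVASP(String input) throws Exception
    {
        // ... test body ...
    }
}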
The definition in Db2.java:
package db2.registry.sql.storedprocedure;

import db2.registry.User;
import db2.registry.UserRegistry;
import org.apache.commons.lang3.StringUtils;

import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.util.Arrays;
import java.util.Collection;
import java.util.Optional;
import java.util.stream.Collectors;

import static db2.registry.utils.LineBasedFileUtils.DEFAULT_LOCALUSERS_FILE;

public class Db2 {
    public static void create(String username, String password, String groups) {
        UserRegistry.createUser(username, password, unpackGroups(groups), DEFAULT_LOCALUSERS_FILE);
    }

    public static void delete(String username) {
        UserRegistry.deleteUser(username, DEFAULT_LOCALUSERS_FILE);
    }

    public java.sql.ResultSet list() {
        Collection<User> users = UserRegistry.listUsers(DEFAULT_LOCALUSERS_FILE).values();
        return convertToListCallResultSet(users);
    }

    public static void modify(String username, String password, String groups) {
        UserRegistry.modifyUser(username, password, unpackGroups(groups), DEFAULT_LOCALUSERS_FILE);
    }
}
The create procedure statement:
call sqlj.remove_jar( 'DB2JAR');
call sqlj.install_jar( 'file:/home/rsarkar/plugins/DB2.jar', 'DB2JAR');
call sqlj.refresh_classes();
CREATE OR REPLACE PROCEDURE RSARKAR.CREATE_USER
(
USERID VARCHAR(128),
PASS VARCHAR(128) DEFAULT NULL,
GROUPS VARCHAR(256) DEFAULT NULL
)
LANGUAGE JAVA
SPECIFIC RSARKAR.CREATE_USER
PARAMETER STYLE JAVA
DETERMINISTIC
NO EXTERNAL ACTION
NO SQL
FENCED THREADSAFE
EXTERNAL NAME 'DB2JAR:Db2.create';
Calling the Stored Procedure:
db2 "call rsarkar.create_user('testuser','xx','DBA')"

Related

Check if a request has a response in Zeek language

Good morning,
I have a Zeek machine generating logs on Modbus traffic.
Currently, my script generates logs that look like this:
ts tid id.orig_h id.orig_p id.resp_h id.resp_p unit_id func network_direction
1342774501.072580 32 10.2.2.2 51411 10.2.2.3 502 255 READ_HOLDING_REGISTERS request
1342774501.087014 32 10.2.2.2 51411 10.2.2.3 502 255 READ_HOLDING_REGISTERS response
'tid' is the transaction id identifying a request/response pair. I want to know whether a robot has not responded to the controller, by logging only the requests that did not get a response within 1 second.
My code is:
module Modbus_Extended;

export {
    redef enum Log::ID += { LOG_DETAILED,
                            LOG_MASK_WRITE_REGISTER,
                            LOG_READ_WRITE_MULTIPLE_REGISTERS };

    type Modbus_Detailed: record {
        ts                : time    &log;             # Timestamp of event
        tid               : count   &log;             # Zeek unique ID for connection
        id                : conn_id &log;             # Zeek connection struct (addresses and ports)
        unit_id           : count   &log;             # Modbus unit-id
        func              : string  &log &optional;   # Modbus Function
        network_direction : string  &log &optional;   # Message direction (request or response)
        address           : count   &log &optional;   # Starting address for value(s) field
        quantity          : count   &log &optional;   # Number of addresses/values read or written to
        values            : string  &log &optional;   # Coils, discrete_inputs, or registers read/written to
    };

    global log_modbus_detailed: event(rec: Modbus_Detailed);

    global transaction_ids: set[string, string] = {};

    event modbus_message(c: connection,
                         headers: ModbusHeaders,
                         is_orig: bool) &priority=-5 {
        local modbus_detailed_rec: Modbus_Detailed;
        if ( headers$tid !in transaction_ids[count] ) {
            add transaction_ids[headers$tid, c$modbus$ts]
        } else {
            delete transaction_ids[headers$tid, c$modbus$ts]
        }
        for ( i in transaction_ids[timestamp] ) {
            if ( c$modbus$ts > transactions_ids[headers$tid, i] + 1 ) {
                Log::write(LOG_DETAILED, modbus_detailed_rec);
            }
        }
    }
}
My guess is that I have to store the transaction ids and check whether I get only one occurrence within this window, then log it to a file, but I can't figure out how to do it. Currently I can only generate logs with all of the Modbus traffic.
Thank you for your help
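No answer was included for this one; as a hedged sketch of one way to do it (standalone, names are illustrative, not the original modbus_detailed script): keep each request in a table whose entries expire after 1 second, delete the entry when the matching response shows up, and log from the table's &expire_func, so only unanswered requests ever get written.
module Modbus_Extended;

export {
    redef enum Log::ID += { LOG_UNANSWERED };

    type Modbus_Unanswered: record {
        ts  : time    &log;   # When the request was seen
        tid : count   &log;   # Modbus transaction id
        id  : conn_id &log;   # Connection 4-tuple
    };
}

# Called when an entry has sat in the table for 1 second without being
# deleted, i.e. the request never saw a matching response.
function expire_request(t: table[count] of Modbus_Unanswered, tid: count): interval {
    Log::write(LOG_UNANSWERED, t[tid]);
    return 0sec;
}

# Requests still waiting for a response, keyed by transaction id.
global pending: table[count] of Modbus_Unanswered
    &create_expire=1sec &expire_func=expire_request;

event zeek_init() {
    Log::create_stream(LOG_UNANSWERED,
                       [$columns=Modbus_Unanswered, $path="modbus_unanswered"]);
}

event modbus_message(c: connection, headers: ModbusHeaders, is_orig: bool) &priority=-5 {
    if ( is_orig ) {
        # Remember the request; it is only logged if it expires unanswered.
        pending[headers$tid] = Modbus_Unanswered($ts=network_time(),
                                                 $tid=headers$tid,
                                                 $id=c$id);
    } else if ( headers$tid in pending ) {
        # The response arrived in time; forget the request.
        delete pending[headers$tid];
    }
}
Two caveats: Zeek sweeps expired table entries periodically, so the log line can appear a little after the 1-second deadline, and keying on the transaction id alone assumes tids are not reused across concurrent connections (an index of [c$uid, headers$tid] avoids that). The unit id, function name, and other fields from the existing Modbus_Detailed record can be copied into the pending entry in the same way.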

Bro script for reading a list of Ips and domains

I am trying to read a file with a list of IP addresses and another one with domains, as a proof of concept of the Input Framework defined in https://docs.zeek.org/en/stable/frameworks/input.html
I've prepared the following Bro scripts:
reading.bro:
type Idx: record {
ip: addr;
};
type Idx: record {
domain: string;
};
global ips: table[addr] of Idx = table();
global domains: table[string] of Idx = table();
event bro_init() {
Input::add_table([$source="read_ip_bro", $name="ips",
$idx=Idx, $destination=ips, $mode=Input::REREAD]);
Input::add_table([$source="read_domain_bro", $name="domains",
$idx=Idx, $destination=domains, $mode=Input::REREAD]);
Input::remove("ips");
Input::remove("domains");
}
And the bad_ip.bro script, which loads the previous one and checks whether an IP is in the blacklist:
bad_ip.bro
@load reading.bro
module HTTP;
event http_reply(c: connection, version: string, code: count, reason: string)
{
if ( c$id$orig_h in ips )
print fmt("A malicious IP is connecting: %s", c$id$orig_h);
}
However, when I run bro, I get the error:
error: Input stream ips: Table type does not match index type. Need type 'string':string, got 'addr':addr
Segmentation fault (core dumped)
You cannot assign a string type to an addr type. In order to do so, you must use the utility function to_addr(). Of course, it would be wise to verify that the string contains a valid addr first. For example:
if ( is_valid_ip(inputString) ) {
    inputAddr = to_addr(inputString);
} else {
    print "addr expected, got a string";
}
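Separately, the pasted error ("Table type does not match index type. Need type 'string', got 'addr'") is about the index records: both Input::add_table calls reuse the same record name Idx, and the one the ips stream ends up with is the string version, which does not match the table[addr] destination. A minimal sketch of reading.bro that gives each stream its own index record, using sets since only membership checks are needed (type names are illustrative):
type IdxIP: record {
    ip: addr;
};

type IdxDomain: record {
    domain: string;
};

global ips: set[addr] = set();
global domains: set[string] = set();

event bro_init() {
    Input::add_table([$source="read_ip_bro", $name="ips",
                      $idx=IdxIP, $destination=ips, $mode=Input::REREAD]);
    Input::add_table([$source="read_domain_bro", $name="domains",
                      $idx=IdxDomain, $destination=domains, $mode=Input::REREAD]);
}
With addr-typed columns parsed by the input framework, the in check in http_reply works directly on addr values; the to_addr() conversion above is only needed when the address arrives as a string from somewhere else.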

Cannot use threads to insert data to PostgreSQL with DBIish. What's going wrong?

Edit: This was solved by moritz. I've added a note to the code on the line that's wrong.
My application is a web server talking to a game client. The server is multithreaded, which Postgres allows. When loading client data into the database, I noticed parallel requests fail with several different errors, none of which make sense to me.
This short-ish test case dumps a nested hash into the database. When run without start, it works perfectly. When run with threads, it almost always gives one or more of the following errors:
DBDish::Pg: Error: (7) in method prepare at
D:\rakudo\share\perl6\site\sources\BAD7C1548F63C7AA7BC86BEDDA0F7BD185E141AD
(DBDish::Pg::Connection) line 48 in block at testcase.p6 line 62
in sub add-enum-mappings at testcase.p6 line 59 in block at
testcase.p6 line 91
DBDish::Pg: Error: ERROR: prepared statement
"pg_3448_16" already exists (7) in method prepare at
D:\rakudo\share\perl6\site\sources\BAD7C1548F63C7AA7BC86BEDDA0F7BD185E141AD
(DBDish::Pg::Connection) line 46 in block at testcase.p6 line 62
in sub add-enum-mappings at testcase.p6 line 59 in block at
testcase.p6 line 91
DBDish::Pg: Error: Wrong number of arguments to
method execute: got 1, expected 0 (-1) in method enter-execute at
D:\rakudo\share\perl6\site\sources\65FFB78EFA3030486D1C4D339882A410E3C94AD2
(DBDish::StatementHandle) line 40 in method execute at
D:\rakudo\share\perl6\site\sources\B3190B6E6B1AA764F7521B490408245094C6AA87
(DBDish::Pg::StatementHandle) line 52 in sub add-enum-mappings at
testcase.p6 line 54 in block at testcase.p6 line 90
message type 0x31 arrived from server while idle
message type 0x5a arrived from server while idle
message type 0x74 arrived from server while idle
message type 0x6e arrived from server while idle
message type 0x5a arrived from server while idle
Here's the code. (If you choose to run it, remember to set the right password. It creates/manipulates a table called "enummappings", but does nothing else.) The meat is in add-enum-mappings(). Everything else is just setup. Oh, and dbh() creates a separate DB connection for each thread. This is necessary, according to the PostgreSQL docs.
#!/usr/bin/env perl6
use DBIish;
use Log::Async;

my Lock $db-lock;
my Lock $deletion-lock;
my Lock $insertion-lock;

INIT {
    logger.send-to($*ERR);
    $db-lock        .= new;
    $deletion-lock  .= new;
    $insertion-lock .= new;
}

# Get a per-thread database connection.
sub dbh() {
    state %connections;
    my $dbh := %connections<$*THREAD.id>; # THIS IS WRONG. Should be %connections{$*THREAD.id}.
    $db-lock.protect: {
        if !$dbh.defined {
            $dbh = DBIish.connect('Pg', :host<127.0.0.1>, :port(5432), :database<postgres>,
                                  :user<postgres>, :password<PASSWORD>);
        }
    };
    return $dbh;
}

sub create-table() {
    my $name = 'enummappings';
    my $column-spec =
        'enumname TEXT NOT NULL, name TEXT NOT NULL, value INTEGER NOT NULL, UNIQUE(enumname, name)';
    my $version = 1;
    my $sth = dbh.prepare("CREATE TABLE IF NOT EXISTS $name ($column-spec);");
    $sth.execute;
    # And add the version number to a version table:
    dbh.execute:
        "CREATE TABLE IF NOT EXISTS tableversions (name TEXT NOT NULL UNIQUE, version INTEGER NOT NULL);";
    $sth = dbh.prepare:
        'INSERT INTO tableversions (name, version) VALUES (?, ?)
         ON CONFLICT (name)
         DO
           UPDATE SET version = ?;';
    $sth.execute($name, $version, $version);
}

sub add-enum-mappings($enumname, @names, @values --> Hash) {
    $deletion-lock.protect: {
        my $sth = dbh.prepare('DELETE FROM enummappings WHERE enumname = ?;');
        $sth.execute($enumname);
    };
    my @rows = (^@names).map: -> $i {$enumname, @names[$i], @values[$i]};
    info "Inserting @rows.elems() rows...";
    $insertion-lock.protect: {
        my $sth = dbh.prepare('INSERT INTO enummappings (enumname,name,value) VALUES '~
                              ('(?,?,?)' xx @rows.elems).join(',') ~ ';');
        $sth.execute(@rows>>.list.flat);
    };
    return %(status => 'okay');
}

# Create a bunch of long enums with random names, keys, and values.
sub create-enums(--> Hash[Hash]) {
    my @letters = ('a'..'z', 'A'..'Z').flat;
    my Hash %enums = ();
    for ^36 {
        my $key = @letters.pick(10).join;
        for ^45 {
            my $sub-key = @letters.pick(24).join;
            %enums{$key}{$sub-key} = (0..10).pick;
        }
    }
    return %enums;
}

sub MAIN() {
    create-table;
    await do for create-enums.kv -> $enum-name, %enum {
        start {
            add-enum-mappings($enum-name, %enum.keys, %enum.values);
            CATCH { default { note "Got error adding enum: " ~ .gist; } }
        };
    }
}
I'm on Windows 10, with an 8-core computer. I know I could insert the data single-threadedly, but what if the game gets a hundred connections at once? I need to fix this for good.
I suspect your problem is here:
my $dbh := %connections<$*THREAD.id>;
The %hash<...> syntax is only for literals. You really need to write %connections{$*THREAD.id}.
With your error in place, you have just one DB connection that's shared between all threads, and I guess that's what DBIish (or the underlying postgresql C client library) is unhappy about.
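A tiny standalone Raku snippet (not part of the test case) that shows the difference:
# <...> quotes its contents literally, so every thread uses one and the same key;
# {...} evaluates the expression, giving a per-thread key.
my %connections;
%connections<$*THREAD.id> = 'literal key';   # key is the string '$*THREAD.id'
%connections{$*THREAD.id} = 'computed key';  # key is the current thread's id, e.g. 1
say %connections.keys.sort;                  # ($*THREAD.id 1)
With the literal form, every thread reads and writes the same entry, which is why the test case ends up sharing one connection (and one prepared-statement namespace) across all threads.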

hive-hbase integration throws classnotfoundexception NULL::character varying

Following this link https://cwiki.apache.org/confluence/display/Hive/HBaseIntegration#HBaseIntegration-HiveMAPtoHBaseColumnFamily,
I'm trying to integrate Hive and HBase. I have this configuration in hive-site.xml:
<property>
<name>hive.aux.jars.path</name>
<value>
file:///$HIVE_HOME/lib/hive-hbase-handler-2.0.0.jar,
file:///$HIVE_HOME/lib/hive-ant-2.0.0.jar,
file:///$HIVE_HOME/lib/protobuf-java-2.5.0.jar,
file:///$HIVE_HOME/lib/hbase-client-1.1.1.jar,
file:///$HIVE_HOME/lib/hbase-common-1.1.1.jar,
file:///$HIVE_HOME/lib/zookeeper-3.4.6.jar,
file:///$HIVE_HOME/lib/guava-14.0.1.jar
</value>
</property>
Then I create a table named 'ts:testTable' in HBase:
hbase> create 'ts:testTable','pokes'
hbase> put 'ts:testTable', '10000', 'pokes:value','val_10000'
hbase> put 'ts:testTable', '10001', 'pokes:value','val_10001'
...
hbase> scan 'ts:testTable'
ROW COLUMN+CELL
10000 column=pokes:value, timestamp=1462782972084, value=val_10000
10001 column=pokes:value, timestamp=1462783514212, value=val_10001
....
And then create an external table in Hive:
Hive> CREATE EXTERNAL TABLE hbase_test_table(key int, value string )
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key, pokes:value")
TBLPROPERTIES ("hbase.table.name" = "ts:testTable",
"hbase.mapred.output.outputtable" = "ts:testTable");
So far so good. But when I tried to select data from the test table, an exception was thrown:
Hive> select * from hbase_test_table;
FAILED: RuntimeException java.lang.ClassNotFoundException: NULL::character varying
Error: Error while compiling statement: FAILED: RuntimeException java.lang.ClassNotFoundException: NULL::character varying (state=42000,code=40000)
Am I missing anything?
I'm trying this with Hive 2.0.0 and HBase 1.2.1.
OK, I figured it out: the "NULL::character varying" is not part of Hive; it comes from PostgreSQL, which I'm using as the back end of the metastore. The problem is that Hive doesn't recognize this value coming from PostgreSQL. We have the following code in Hive 2.0.0:
300: if (inputFormatClass == null) {
301: try {
302: String className = tTable.getSd().getInputFormat();
303: if (className == null) {
304: if (getStorageHandler() == null) {
305: return null;
306: }
307: inputFormatClass = getStorageHandler().getInputFormatClass();
308: } else {
309: inputFormatClass = (Class<? extends InputFormat>)
310: Class.forName(className, true, Utilities.getSessionSpecifiedClassLoader());
}
Line 302 does not return null as it is supposed to, so line 310 ends up trying to load a non-existent class. That's the reason the query fails.
I believe it is a compatibility bug. The proper way to fix it would be to change the metastore database, which I'd rather not do, so I simply replaced line 302 with
if (className == null || className.toLowerCase().startsWith("null::")) {
and did the same thing for the getOutputFormat() method, then recompiled the jar. That's it.
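For reference, a sketch of the quoted block with that change applied (only the null check changes; everything else is as quoted above):
String className = tTable.getSd().getInputFormat();
// Treat the stray PostgreSQL literal "NULL::character varying" like a missing value.
if (className == null || className.toLowerCase().startsWith("null::")) {
  if (getStorageHandler() == null) {
    return null;
  }
  inputFormatClass = getStorageHandler().getInputFormatClass();
} else {
  inputFormatClass = (Class<? extends InputFormat>)
      Class.forName(className, true, Utilities.getSessionSpecifiedClassLoader());
}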

Passing Argument to Specific Function in STAX Job Using Command Line

I have a STAX job, new15.xml, in the /home/dharm/staf/services/stax/samples directory.
new15.xml has two functions, main and readFile.
file_name = '/home/dharm/datafiles/ReadData3.txt'
The required argument for readFile is file_name.
I want to execute the following command from the command line, but I also want to pass the required parameter to the readFile function:
staf local execute file /home/dharm/staf/services/stax/samples/new15.xml wait return result
What modifications should I make to the command to get it to work?
What I have tried:
First command:
dharm@ubuntu:~$ staf local stax execute file /home/dharm/staf/services/stax/samples/new15.xml ARGS file_name="'/home/dharm/datafiles/ReadData3.txt'" wait returnresult
Response
--------
{
Job ID : 3
Start Date-Time: 20130418-23:54:39
End Date-Time : 20130418-23:54:40
Status : Terminated
Result : None
Job Log Errors : [
{
Date-Time: 20130418-23:54:40
Level : Error
Message : STAXPythonEvaluationError signal raised. Terminating job.
===== XML Information =====
File: /home/dharm/staf/services/stax/samples/new15.xml, Machine: local://local
Line <Error in ARGS option>: Error in element type "<External>".
===== Python Error Information =====
com.ibm.staf.service.stax.STAXPythonEvaluationException:
Python object evaluation failed for:
file_name='/home/dharm/datafiles/ReadData3.txt'
SyntaxError: ("mismatched input '=' expecting EOF", ('<pyEval string>', 1, 9, "file_name='/home/dharm/datafiles/ReadData3.txt'\n"))
===== Call Stack for STAX Thread 1 =====
[]
}
]
Testcase Totals: {
Tests : 0
Passes: 0
Fails : 0
}
}
"2ND Command"
dharm@ubuntu:~$ staf local stax execute file /home/dharm/staf/services/stax/samples/new15.xml SCRIPT file_name="'/home/dharm/datafiles/ReadData3.txt'" wait returnresult
Response
--------
{
Job ID : 4
Start Date-Time: 20130418-23:56:01
End Date-Time : 20130418-23:56:02
Status : Terminated
Result : None
Job Log Errors : [
{
Date-Time: 20130418-23:56:02
Level : Error
Message : STAXFunctionArgValidate signal raised. Terminating job.
===== XML Information =====
File: /home/dharm/staf/services/stax/samples/new15.xml, Machine: local://local
Line 20: Error in element type "call".
Required argument "file_name" is not provided in the call to function "readFile".
===== Call Stack for STAX Thread 1 =====
[
function: main (Line: 19, File: /home/dharm/staf/services/stax/samples/new15.xml, Machine: local://local)
]
}
]
Testcase Totals: {
Tests : 0
Passes: 0
Fails : 0
}
}
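No answer is included above, but the two error logs narrow things down: the ARGS value is evaluated as a single Python expression, so file_name='...' is rejected, and the SCRIPT variant fails because main (line 19) calls readFile at line 20 without forwarding file_name. Assuming the STAX EXECUTE request's FUNCTION option is used to call readFile directly, and that readFile takes the file name as its single argument, something along these lines might work (untested sketch):
staf local STAX EXECUTE FILE /home/dharm/staf/services/stax/samples/new15.xml FUNCTION readFile ARGS "'/home/dharm/datafiles/ReadData3.txt'" WAIT RETURNRESULT
If main must stay the entry point, the other route is to edit new15.xml so its call element passes the argument along, for example <call function="'readFile'">file_name</call> with file_name supplied via SCRIPT as in the second attempt, since that error shows main currently calls readFile with no arguments.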