Connecting to an Oracle 11g database from WebSphere Message Broker 6 - sql

I am trying a simple insert from WebSphere Message Broker 6, from a Compute node.
The data source name provided in the broker's odbc.ini file is specified in the Data Source property of the Compute node, and I have written the following ESQL code:
DECLARE tableName CHARACTER 'MYTABLE';
DECLARE myValue CHARACTER 'TESTVALUE';
-- braces make the table reference dynamic in ESQL; MYCOLUMN is a placeholder column name
INSERT INTO Database.{tableName} (MYCOLUMN) VALUES (myValue);
The connection URL is provided in tnsnames.ora. The URL is a cluster URL, which points to 3 database instances.
When I run the query, I get an exception in the trace that the table or view does not exist.
But when I connect to the DB using any of the 3 direct URLs, I am able to see the table.
Note: the database is Oracle 11g.
Can anyone explain to me what is happening?

The problem was that my application was using the same DSN as my broker. When the broker was created, the username and password provided pointed to a different schema, which did not have the tables for my application.
The solution was to create a new DSN and use mqsisetdbparams to point it at the correct schema.
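For reference, pointing a DSN at different credentials is done with mqsisetdbparams; a minimal sketch, where the broker name, DSN, and credentials are placeholders (the broker must be restarted for the change to take effect):

```
mqsisetdbparams MYBROKER -n MYNEWDSN -u appuser -p apppassword
mqsistop MYBROKER
mqsistart MYBROKER
```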

Related

Azure SQL Database sys.fn_get_audit_file or sys.fn_xe_file_target_read_file troubles

I'm having trouble on an Azure SQL Database where I'm trying to read DB audit logs.
Both functions sys.fn_get_audit_file and sys.fn_xe_file_target_read_file should be able to read a file.
But whatever I do, I get blank tables. Even if I specify a non-existent file, I receive a table with zero records instead of an error.
So I'm afraid it's something else.
My login is in the db_owner group.
Any suggestions?
I found that I could only read XEL files by using the same server and same database context that they were created for. So for example, consider the following scenario:
ServerA is the Azure Synapse instance I was creating the audit XEL files from, all related to DatabaseA
ServerB is a normal SQL instance that I want to read the XEL files on
Test 1:
Using ServerB, try to read file directly from blob storage
Result: 0 rows returned, no error message
Test 2:
Using ServerB, download the XEL files locally, and try to read from the local copy
Result: 0 rows returned, no error message
Test 3:
Using ServerA, with the current DB = 'master', try to read file directly from blob storage
Result: 0 rows returned, no error message
Test 4:
Using ServerA, with the current DB = 'DatabaseA', try to read file directly from blob storage
Result: works perfectly
Because I really wanted to read the files from ServerB, I also tried doing a CREATE CREDENTIAL there that was able to read & write to my blob storage account. That didn't make any difference unfortunately - a repeat of Test 1 got the same result as before.
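To summarize the working case (Test 4), the call has to be made while connected to the database the audit was created for; a sketch, where the storage account and blob path are placeholders:

```sql
-- Run while connected to DatabaseA on ServerA (the instance that wrote the XEL files)
SELECT TOP 100 event_time, action_id, succeeded, statement
FROM sys.fn_get_audit_file(
    'https://<storageaccount>.blob.core.windows.net/sqldbauditlogs/ServerA/DatabaseA/',
    DEFAULT, DEFAULT);
```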

Sybase - SAP ASE - replication server: routing declaration

I'm trying to declare a route in SAP Replication Server.
I have:
A server (let's call it S1) with ASE and RS server (let's call it RS1).
A server (let's call it S2) with ASE and RS server (let's call it RS2).
A server (let's call it S3) with ASE server.
I have a replication in RS1 from a database in S1 to databases in S1 and S2.
Now I'm trying to add a replication to a database in S3 via RS2: a route from RS1 to RS2 and a subscription to the DB in S3.
I declared the route and an agent between the two RSSDs.
When I try to create the subscription (in RS2) to the database in S3, I get an error saying that it doesn't know the replication definition.
Is anyone familiar with route declaration?
Thanks.
The critical thing with routes is that both Replication Servers need to be in the same replication system, meaning they must share the SAME primary Replication Server (known as the ID server). The ID server contains information about all the Replication Servers in the setup, or domain as it's known. You can create many Replication Servers in a domain, but for them to link together via routes they must all use the same ID server.
NOTE: You can't set them up separately and then link them later. When you set up RS2, you have to name RS1 as the ID server and supply the required RS1 details in rs_init as you run through its menus to create RS2.
If that's already been done correctly, then:
First, set up a route between RS1 and RS2 (via a 'create route' command). If you want data to flow in both directions at some point, it makes sense to set up routes both ways between RS1 and RS2, since by definition a route is one-directional. This will mean you can set up replication between any of the three ASE instances.
NOTE: You need to check that the route is actually fully up and active (via admin who). If not, start looking through the Replication Server error logs for why it is failing, e.g. a missing entry in the interfaces file, a login issue, etc.
Once the routes are set up, you can create a replication definition against the source database and a subscription at the target database even when these are attached to different Replication Servers. This can be a table-level or a database-level (MSA) replication definition, depending on what your aim is.
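Assuming the ID-server prerequisite is met, the route itself is created with RCL along these lines (run on RS1; the RSI user name and password are placeholders):

```sql
-- On RS1: create the one-way route toward RS2
create route to RS2
set username RS2_rsi
set password RS2_rsi_ps
go

-- Verify that the route comes up as Active
admin who
go
```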
Update: I resolved it.
Essentially, what I did was clean up the settings and delete the duplicates, then set up the connections again, and then the subscription:
Drop the connections.
Drop the route.
Purge the route, to clean up any old references created by the failed create route.
Suspend the connection.
Stop the Rep Agent, run rs_zeroltm to tell the Rep Agent to start at the end of the log, and restart the Rep Agent.
Resume the connection.
Re-create the route between the RSSDs.
Verify that the replication definitions were copied to the target RSSD.
Create a subscription.
Resume the replicate connection on the second RS.
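The rs_zeroltm step above can be sketched as follows; the data server and database names are placeholders, the Rep Agent stop/start is issued in the primary ASE, and rs_zeroltm runs in the RSSD:

```sql
-- In the primary ASE: stop the Rep Agent for the database
sp_stop_rep_agent DatabaseA
go

-- In the RSSD: move the secondary truncation point to the end of the log
rs_zeroltm ServerA, DatabaseA
go

-- Back in the primary ASE: restart the Rep Agent
sp_start_rep_agent DatabaseA
go
```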

DB link in postgres looking at another localhost image : relation tablename does not exist

I'm using Docker to host two different PostgreSQL instances, to produce a proof of concept for Python as an ETL tool to move data between the two. I can connect to the first instance via Python fine, but when calling a procedure inside the first instance, it can't find the table in the second instance. I'm connecting to the second instance using dblink with this code:
FROM dblink('host=localhost port=5432 user=postgres password=postgres dbname=postgres','SELECT * FROM staging.test')
the error message from this is that the relation staging.test does not exist
When using multiple Docker containers, each with PostgreSQL installed, get them to talk to each other by using the service name instead of localhost in the connection string:
FROM dblink('host=ServiceName port=5432 user=postgres password=postgres dbname=postgres',
'SELECT * FROM staging.test')
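The same connection string works with a named dblink connection as well; a sketch, where 'pg2' stands in for the docker-compose service name of the second instance and the column list is a placeholder for the real schema of staging.test:

```sql
-- Open a named connection to the second instance once
SELECT dblink_connect('conn2',
       'host=pg2 port=5432 user=postgres password=postgres dbname=postgres');

-- dblink results must be given a column definition list via AS
SELECT *
FROM dblink('conn2', 'SELECT id, name FROM staging.test')
     AS t(id integer, name text);
```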

Talend Open Studio: Load input files into database

I have an empty SQLite database. Next to that, I have 6 input files (delimited, Excel, JSON, XML).
Now, all I want to do is load the input files into the empty database.
I tried to connect one input file with the DB and just run it. That didn't work (the DB doesn't have anything in it; I suspect that is the problem).
Then I tried to connect an input file to a tMap, define the table there, define the schema, and connect the tMap to the DB (tSQLiteOutput).
When I tried to run it, I receive the following error:
Starting job ProductDemo_Load at 16:46 15/11/2015.
[statistics] connecting to socket on port 3843
[statistics] connected
Exception in component tSQLiteOutput_1
java.sql.SQLException: no such table:
at org.sqlite.DB.throwex(DB.java:288)
at org.sqlite.NativeDB.prepare(Native Method)
at org.sqlite.DB.prepare(DB.java:114)
at org.sqlite.PrepStmt.<init>(PrepStmt.java:37)
at org.sqlite.Conn.prepareStatement(Conn.java:231)
at org.sqlite.Conn.prepareStatement(Conn.java:224)
at org.sqlite.Conn.prepareStatement(Conn.java:213)
at workshop_test.productdemo_load_0_1.ProductDemo_Load.tFileInputExcel_1Process(ProductDemo_Load.java:751)
at workshop_test.productdemo_load_0_1.ProductDemo_Load.runJobInTOS(ProductDemo_Load.java:1672)
at workshop_test.productdemo_load_0_1.ProductDemo_Load.main(ProductDemo_Load.java:1529)
[statistics] disconnected
Job ProductDemo_Load ended at 16:46 15/11/2015. [exit code=1]
I see there's something wrong with the import, but what exactly?
What should I do in order to successfully load the data from the input files into the database?
I did the exact steps from this little tutorial:
Talend Job: load data into database.
Most Talend output components have a "Create table if not exists" option. Did you check this in your tSQLiteOutput? The error indicates that when Talend inserts data into the empty database, it cannot find the table because it does not exist. So you have to tell Talend to create the table first.
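Alternatively, the target table can be created in the SQLite database up front, before the job runs; the table and column names below are placeholders for your actual schema:

```sql
CREATE TABLE IF NOT EXISTS product (
    id    INTEGER PRIMARY KEY,
    name  TEXT,
    price REAL
);
```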

Sql power architect to compare two data models

I need to compare the current data model with the old data model.
I am using SQL Power Architect to do the comparison. I am able to configure the connections for accessing the database, and the connection is successful.
(I am using an Amazon Redshift DB as the source.)
But when I expand the children, I get the list of table objects associated with it, and when I try the Compare Data Model option, I see the error below.
Please help me resolve it.
Caused by: ca.sqlpower.sqlobject.SQLObjectException: relationship.populate
    at ca.sqlpower.sqlobject.SQLRelationship.fetchExportedKeys(SQLRelationship.java:740)
    at ca.sqlpower.sqlobject.SQLTable.populateRelationships(SQLTable.java:731)
    at ca.sqlpower.sqlobject.SQLTable.populateImpl(SQLTable.java:1337)
    at ca.sqlpower.sqlobject.SQLObject.populate(SQLObject.java:186)
    ... 4 more
Caused by: org.postgresql.util.PSQLException: Unable to determine a value for MaxIndexKeys due to missing system catalog data.
    at org.postgresql.jdbc2.AbstractJdbc2DatabaseMetaData.getMaxIndexKeys(AbstractJdbc2DatabaseMetaData.java:64)
    at org.postgresql.jdbc2.AbstractJdbc2DatabaseMetaData.getImportedExportedKeys(AbstractJdbc2DatabaseMetaData.java:3196)
    at org.postgresql.jdbc2.AbstractJdbc2DatabaseMetaData.getExportedKeys(AbstractJdbc2DatabaseMetaData.java:3584)
    at ca.sqlpower.sql.jdbcwrapper.DatabaseMetaDataDecorator.getExportedKeys(DatabaseMetaDataDecorator.java:388)
    at ca.sqlpower.sqlobject.SQLRelationship.fetchExportedKeys(SQLRelationship.java:735)
    ... 7 more
You are using the wrong version of the JDBC driver.
Please follow the steps below:
- Download the Redshift JDBC driver from the AWS website.
- Configure the JDBC driver in SQL Power Architect from the Connection Manager.
- Go to JDBC Driver -> select PostgreSQL -> under Add JARs add the downloaded jar -> configure the driver class name -> click OK.
- Now go back to the Connection Manager.
- Select the appropriate connection, choose Edit, and test the connection.
- You should see the downloaded jar configured.
- Now you can add the data objects to SQL Power Architect.