ORA-31655 when using VERSION=10.2 with expdp

I'm trying to export a table with Oracle Data Pump, running on an Oracle 12c instance. The schema has a table called KAT.
When I do the export with:
expdp USER/PASS directory=exp dumpfile=dump.dmp logfile=kat.log TABLES=KAT
everything works as expected.
When I try the following (so that the data can be imported into an Oracle 10g database), I get this error:
expdp USER/PASS directory=exp dumpfile=dump.dmp logfile=kat.log TABLES=KAT VERSION=10.2
ORA-39166: Object USER.KAT was not found.
ORA-31655: no data or metadata objects selected for job
Why? Any ideas?

The most likely issue is that your table uses features that exist in 12c but not in 10.2. I get the exact same error message when I try to export a table with a virtual column (a feature introduced in 11.1) from a 12c database:
No VERSION (i.e. COMPATIBLE): works
VERSION=11.2 or 11.1: works
VERSION=10.2: ORA-39166 error.
The culprit could be a feature on the table itself, or on one of its indexes or constraints. Check the table's DDL.
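If you're not sure which feature is to blame, a quick first check is to pull the table's DDL and look for virtual columns. A minimal sketch using the standard dictionary views, assuming you run it as the schema owner (otherwise switch to the ALL/DBA views and add an owner filter):

-- Full DDL of the table, including any clauses newer than 10.2
SELECT DBMS_METADATA.GET_DDL('TABLE', 'KAT') FROM dual;

-- List any virtual columns on the table (the expression is stored in DATA_DEFAULT)
SELECT column_name, data_default
FROM   user_tab_cols
WHERE  table_name = 'KAT'
AND    virtual_column = 'YES';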

Related

Write back to Oracle from PowerApps

I am trying to write back to an Oracle table from PowerApps, but I'm getting the following error:
"The Data source is read-only, so the function Patch can't write to it"
I have access to this Oracle table and its data is visible in my PowerApp, but I cannot insert into the database from my PowerApp.
The primary key is defined correctly with all constraints. I can insert into the same table using a SQL query, but when I do the same from PowerApps I get the error.
Here is my Patch function.
Patch(
    '[PLAN].[V_PLAN_L_TYP]',
    Defaults('[PLAN].[V_PLAN_L_TYP]'),
    { TYP_ID: TYP_ID_TextInput.Text, TYP_DESC: TYP_DESC_TextInput.Text, KANAL_AKT: KANAL_AKT_TextInput.Text }
);
Is it possible to write back to Oracle table or not?
The name V_PLAN_L_TYP sounds to me like a view rather than a table. Please verify this.
Here is why that matters: the Oracle connector documentation currently says:
Support Oracle view as read-only table
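If you can query the database directly, the data dictionary will tell you whether the object is a table or a view. A quick check (add an owner filter if the name exists in more than one schema):

SELECT owner, object_name, object_type
FROM   all_objects
WHERE  object_name = 'V_PLAN_L_TYP';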

Unable to access the temp tables in azure sql database

Using the following code, I have created a temp table in an Azure SQL database:
CREATE TABLE ##UpsertTempTable (
eno varchar(25),
ename varchar(25)
);
and I want to check the data using the query below:
select * from ##UpsertTempTable
Ideally this should run without any issue, as it does throughout the Azure documentation, but unfortunately it fails with the error below.
I have looked for a solution all over the internet but could not find any relevant documentation for this issue.
Error : Failed to execute query. Error: Invalid object name '##UpsertTempTable'.
I tried this in the Query Editor (Preview) in the portal, and the temporary table code does not work. I used both ##UpsertTempTable and #UpsertTempTable.
When I run the CREATE TABLE statement, no error is reported.
But when I then run select * from ##UpsertTempTable, the Query editor gives the error shown in the question.
I also tried SSMS v17.9 and SSMS v18.1, and everything works there.
My conclusion is that the Query editor does not support temporary tables well.
I have asked Azure Support and am waiting for their reply; I will update this answer.
Update:
Azure Support replied:
"This is by design; the temp tables exist as long as the connection is open.
The way the portal Query editor is currently designed, the connection is killed, resulting in the temp table being deleted."
Hope this helps.
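One possible workaround, assuming the tool you use sends the whole batch over a single connection (SSMS does; the portal Query editor apparently does not), is to create, fill, and read the temp table in one batch. A sketch with illustrative sample values:

-- Create, use and drop the temp table within a single batch / connection
CREATE TABLE ##UpsertTempTable (
    eno   varchar(25),
    ename varchar(25)
);

INSERT INTO ##UpsertTempTable (eno, ename)
VALUES ('1', 'John');

SELECT * FROM ##UpsertTempTable;

DROP TABLE ##UpsertTempTable;

If that is not an option, a permanent staging table that you drop afterwards avoids the connection-lifetime issue entirely.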

Bad changelog from generateChangeLog with MariaDB

I am attempting to generate a changelog against a MariaDB server. I can generate one successfully; however, when I run dropAll and update to validate that it is usable, there are multiple problems with it. I have tried both the native MariaDB and the MySQL JDBC connectors, and both exhibit these problems. I am using Liquibase 3.1.1.
The first problem is that the changelog contains deferrable and initiallyDeferred flags, which are not supported by MySQL. The update command specifically calls these out as invalid flags for MySQL.
Once I remove all of those references from the .xml, update runs into a SQL syntax error because a double datatype is defined as DOUBLE(22) in the XML. This is not valid syntax for a double in MariaDB or MySQL; they accept either no parameters or DOUBLE(M,D), and my columns use the default (no parameters).
Next, it tries to create a table with an auto_increment column without specifying that the column is also a key in the CREATE TABLE statement; i.e., the primaryKey constraint is missing.
And I'm sure there are more problems ahead as I work my way through the changelog (this is just changeset 116 of 1500+).
It's almost as if Liquibase is generating the changelog while thinking the database is a different type (Postgres/Oracle?).
Am I missing something?
You are right: the problem is that Liquibase treats it as an unknown database and does not realize that it is almost MySQL. There is an open issue to add MariaDB support (https://liquibase.jira.com/browse/CORE-1411), but it has not been implemented yet.
The easiest work-around is to add an extension:
Create a new liquibase.database.ext.MariaDBDatabase class in your codebase that extends liquibase.database.core.MySQLDatabase and overrides the isCorrectDatabaseImplementation(DatabaseConnection conn) method so that it returns true when conn.getDatabaseProductName() equals whatever the MariaDB JDBC driver returns.
You may also want to override getDefaultDatabaseProductName() to return "mariadb" instead of MySQL so you can differentiate it with contexts.
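A rough sketch of what that extension could look like, assuming the driver reports "MariaDB" as the product name (check what conn.getDatabaseProductName() actually returns for your driver):

package liquibase.database.ext;

import liquibase.database.DatabaseConnection;
import liquibase.database.core.MySQLDatabase;
import liquibase.exception.DatabaseException;

public class MariaDBDatabase extends MySQLDatabase {

    @Override
    public String getDefaultDatabaseProductName() {
        // Lets changesets target MariaDB separately from plain MySQL
        return "mariadb";
    }

    @Override
    public boolean isCorrectDatabaseImplementation(DatabaseConnection conn) throws DatabaseException {
        // "MariaDB" is an assumption -- verify against what your JDBC driver reports
        return "MariaDB".equalsIgnoreCase(conn.getDatabaseProductName());
    }
}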

Specifying database other than default with Impala JDBC driver

I'm using the Impala JDBC driver (or I guess it's actually the Hive Server 2 JDBC driver). I have a view created in another database -- let's call it "store55".
Let's say my view is defined as follows:
CREATE VIEW good_customers AS
SELECT * from customers WHERE good = true;
When I try to query this view using JDBC as follows:
SELECT * FROM store55.good_customers LIMIT 10
I get an error such as:
java.sql.SQLException: AnalysisException: Table does not exist: default.customers
Ideally, I'd like to specify the database name somewhere in the JDBC URL or as a parameter, but when I use this JDBC URL, I still get the same error:
jdbc:hive2://<host>:<port>/store55;auth=noSasl
Does the Hive2 JDBC driver just ignore the database part of the URL and assume all queries are executed against the default database?
The only way I was able to have the queries return is to change the view definition itself to include the database name:
CREATE VIEW good_customers AS
SELECT * from store55.customers WHERE good = true;
However, I'd like to keep the view definition free of database names.
Thanks!
You might want to issue a "USE store55;" statement over the JDBC connection before querying.
Also, if you are already using the database, try an "INVALIDATE METADATA" statement.
Is the URL jdbc:hive2://<host>:<port>/store55;auth=noSasl correct?
Can you run a few diagnostics, such as:
SHOW TABLES - to ensure that the view is created in store55
Are you using a USE statement in the DDL scripts?
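For example, a quick sequence to run from the same session (INVALIDATE METADATA is only needed if the view was created or changed outside Impala):

-- Switch the session to the store55 database
USE store55;
-- Refresh Impala's view of the metastore for this object, if it was created via Hive
INVALIDATE METADATA store55.good_customers;
-- Confirm the view is visible in store55
SHOW TABLES IN store55;
-- The unqualified query should now resolve against store55
SELECT * FROM good_customers LIMIT 10;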

ORA-00980 error when attempting export using the EXP command

I'm trying to export a schema from an Oracle 10g database using the EXP command. Let's call the schema "myschema" and the TNS name "mydb" to protect the names of the innocent. Anyway, here's the command line I'm using:
exp myschema/mypassword@mydb file=myschema.dmp grants=y
This works when I try to run an export on other instances, but I get the following error when I try against "mydb".
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
With the Partitioning, OLAP and Data Mining options
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
server uses UTF8 character set (possible charset conversion)
. exporting pre-schema procedural objects and actions
. exporting foreign function library names for user MYSCHEMA
. exporting PUBLIC type synonyms
EXP-00008: ORACLE error 980 encountered
ORA-00980: synonym translation is no longer valid
EXP-00000: Export terminated unsuccessfully
Anybody have any ideas? If any further info is needed let me know and I'll edit this question accordingly.
This can happen if the JVM installation is corrupted. Try:
SELECT comp_id, schema, status, version, comp_name
FROM dba_registry
ORDER BY 1;
If this returns a row with a comp_id of JAVAVM and a status of 'INVALID', you'll need to reinstall the Java VM.
Metalink document 276554.1 has the procedure for doing so. If your data is easy to reload, it might be simpler to just recreate the database and reload it.
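As a related sanity check, you can also count invalid Java objects in the data dictionary (just a diagnostic; it is not part of the Metalink procedure):

SELECT owner, object_type, COUNT(*)
FROM   dba_objects
WHERE  status = 'INVALID'
AND    object_type LIKE 'JAVA%'
GROUP BY owner, object_type;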
EDIT: I found an Oracle-Base link where a poster claims the following will uninstall and reinstall the JVM (on Unix); I presume it works on Windows with slight modifications:
(WARNING! You can seriously hose your database if things go wrong here. BACKUP first!)
cd $ORACLE_HOME/javavm/install
sqlplus / as sysdba
@rmjvm.sql
@initjvm.sql
-- Recompile invalid objects
@?/rdbms/admin/utlrp.sql
The metalink note is quite a bit more involved.