I'm generating a changelog from a SQL Server database, which I would like to use to create an Oracle version. One of the tables has a varbinary(max) column whose data needs to come over as well. The generated inserts from this table look like this:
<changeSet author=... id="1415603816743-555">
<insert tableName="my_table">
<column name="my_table_id" value="0007JL11X000OZ10J60000948UM000000P8P"/>
...
<column name="my_table_image" value="[B#70eded35"/>
...
</insert>
...
</changeSet>
This throws an "ORA-01465: invalid hex number" when I attempt to insert the my_table_image data on the Oracle side (target column is a BLOB).
The original data on the SQL Server side is hex. Any ideas on how to successfully generate/update from the changelog to both SQL Server varbinary(max) and Oracle BLOB? Thanks!
I would say that Liquibase is probably not the right tool to use for this kind of data transfer. Liquibase is primarily concerned with database structure rather than database contents. (The [B#70eded35 value in your generated insert is the clue: that is the default toString() of a Java byte[], not the actual binary content, which is why Oracle rejects it as an invalid hex literal.) For transferring simple data (strings, numerics, etc.) Liquibase can do the job, but for something like pictures I would look at more specialized ETL (Extract-Transform-Load) tools.
The tool I have used most and had success with (although I've never done image data, so YMMV) is called Pentaho - they have a free open-source "community" version that might work for you.
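If you do want to keep this in Liquibase, one option worth testing is the valueBlobFile attribute on the column tag, which (in Liquibase versions that support it) reads the binary content from a file relative to the changelog instead of inlining a value. A rough sketch, with a made-up file path:
<changeSet author="me" id="blob-insert-example">
<insert tableName="my_table">
<column name="my_table_id" value="0007JL11X000OZ10J60000948UM000000P8P"/>
<!-- valueBlobFile loads the file's bytes into the BLOB/varbinary column -->
<column name="my_table_image" valueBlobFile="data/my_table_image_0007.bin"/>
</insert>
</changeSet>
You would still need a one-off export of each image to its own file on the SQL Server side, which is where a dedicated ETL tool may end up being less work anyway.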
Related
I have some SQL scripts that I'd like to convert into Liquibase XML changelogs. Is this possible?
What exactly do you mean by "convert into changelogs"?
If you don't need cross-database compatibility, you could just use the plain <sql> change type:
<changeSet id="testing123" author="me">
<sql splitStatements="false">
<!-- paste all your existing SQL here -->
</sql>
</changeSet>
The splitStatements="false" setting ensures the complete SQL is sent as a single statement; otherwise, Liquibase splits it at semicolons into multiple statements.
You could take a detour via a database: apply your script to a database of your choice (one that Liquibase supports), then use generateChangeLog to generate an XML changelog from that database. You will end up with an XML changelog covering everything your SQL scripts created.
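A minimal sketch of that second step from the command line (connection details and file names are placeholders):
liquibase --url=jdbc:postgresql://localhost:5432/scratch \
  --username=me --password=secret \
  --changeLogFile=generated-changelog.xml \
  generateChangeLog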
Alternatively, have a look at the answer on this SO post.
I am trying to migrate from an Oracle DB to MySQL or Postgres using Liquibase. I have generated the SQL file using Liquibase, but the syntax is not right; there are a lot of issues with the generated SQL. If anyone has a solution, please let me know. Thank you.
The best approach is to use the generateChangeLog function to create an XML changeSet description of your Oracle database. Go through the generated changelog to make sure everything expected is there, and make any changes to the file as needed, such as data type changes.
Once the changelog is correct, you can run the XML changelog directly against your MySQL or PostgreSQL database, or use updateSQL to generate the SQL Liquibase would use. Liquibase will create the correct database-specific SQL when it runs a changelog against a given database.
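For example, a hedged sketch with placeholder connection details (updateSQL prints the SQL Liquibase would run without executing it):
liquibase --url=jdbc:mysql://localhost:3306/target \
  --username=me --password=secret \
  --changeLogFile=oracle-export.xml \
  updateSQL > preview.sql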
We have 40+ tables in a SQL Server DB, and we need to copy the data to an IBM DB2 database. What methods do you recommend to accomplish this?
My analysis so far:
BCP and data import - the team is trying to avoid BCP files.
Write a stored procedure, use a linked server in SQL Server, and insert the data into DB2 (see the sketch after this list).
SSIS packages to move the data.
Please let us know if you have a better way to approach this.
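For the linked-server option, a rough sketch of what the insert might look like, run on the SQL Server side (DB2LINK and all object names are placeholders; four-part names only work if the OLE DB provider supports them, otherwise OPENQUERY or EXEC ... AT is needed):
-- DB2LINK is a hypothetical linked server pointing at the DB2 database
INSERT INTO DB2LINK.MYDB.MYSCHEMA.MY_TABLE (ID, NAME)
SELECT ID, NAME FROM dbo.MY_TABLE;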
Have you considered Information Integration, known in DB2 as federation? You can select from SQL Server directly inside DB2, and with this feature you can declare a cursor over the remote table and then just use the LOAD command.
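A minimal sketch from the DB2 command line, assuming a federated nickname MSSQL_MY_TABLE has already been created for the SQL Server table (all names are placeholders):
-- Declare a cursor over the federated SQL Server table...
DECLARE c1 CURSOR FOR SELECT ID, NAME FROM MSSQL_MY_TABLE;
-- ...then bulk-load directly from that cursor into the local DB2 table
LOAD FROM c1 OF CURSOR INSERT INTO MYSCHEMA.MY_TABLE;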
I need to get a copy of a SQL Server 2008 table into an Oracle RDBMS. I have a database link to SQL Server, and the table contains a LONG BINARY column.
When I issue
create table test_ora as select * from mssqltable@dblink
I get the error
Can't convert LONG
I tried to use to_lob, to_char, hextoraw, and a raft of other Oracle conversion functions, but still haven't defeated the issue. Do you have any ideas?
P.S. I'm away from work right now, so I can't give the exact ORA- error number.
There is a way to do this with an undocumented Oracle package:
http://tonguc.wordpress.com/2008/08/28/how-to-transfer-long-datatype-over-dblink/
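Alternatively, INSERT ... SELECT is one of the few operations Oracle permits on LONG data across a database link, while CREATE TABLE AS SELECT is not. A hedged sketch, assuming the gateway exposes the column as LONG RAW and using made-up column names:
-- Create the target table up front; CTAS cannot copy LONG/LONG RAW
CREATE TABLE test_ora (id NUMBER, image LONG RAW);
-- INSERT ... SELECT over the dblink can move the LONG data
INSERT INTO test_ora SELECT id, image FROM mssqltable@dblink;
COMMIT;
-- Optionally convert the column to a modern LOB type afterwards
ALTER TABLE test_ora MODIFY (image BLOB);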
I would recommend a tool called Pentaho Data Integration. It is a free, small, and superb ETL tool.
Download page: community.pentaho.com
It will recreate all tables and types for you. Here is how to do it:
pldwh.blogspot.co.uk/2013/03/pentaho-data-integration-create-tables_1.html
I'm looking to automatically generate an XML version of a database schema from SQL (Postgres) DDL.
Are there any tools to help with getting from the DDL to XML?
xml2ddl claims to do this, but it fails to connect and appears unmaintained since 2005.
You can use the built-in table_to_xmlschema and related mapping functions; see http://www.postgresql.org/docs/current/static/functions-xml.html#FUNCTIONS-XML-MAPPING.
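For example (table and schema names are placeholders; the remaining arguments are nulls, tableforest, and target namespace):
-- Emit an XML Schema document describing one table
SELECT table_to_xmlschema('my_table', true, false, '');
-- Or describe every table in a schema at once
SELECT schema_to_xmlschema('public', true, false, '');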
Things that spring immediately to my mind:
Liquibase
Schemaspy
SQL Workbench's WbSchemaReport
None of them take a DDL (SQL) script as input, though; they all require a live database connection.
Have you also researched DbUnit?