Liquibase: Unable to recognize data type CLOB

I used Liquibase to reverse engineer a Microsoft MySQL database, and I see that the changeset for a CLOB data type is generated as VARCHAR.
When I execute the changeset against a new environment, the column Profile is, as expected, created as VARCHAR instead of CLOB.
Is this a known issue, or is there a workaround available through the API?
Liquibase version: 3.6.2

You have two options:
You can use updateSQL to generate a SQL file in which you can manually change the data type from VARCHAR to CLOB.
You can use <sql> tags in your changelog file to have Liquibase generate exactly the SQL you want, for example:
<sql>
CREATE TABLE my_table (ID NUMBER, QUERY CLOB);
</sql>
In this case you have to take care of the rollback yourself.
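A minimal sketch of such a changeset, including a manual rollback (the table name, author and id are just placeholders):
<changeSet author="me" id="create-my-table">
<sql>
CREATE TABLE my_table (ID NUMBER, QUERY CLOB);
</sql>
<rollback>
DROP TABLE my_table;
</rollback>
</changeSet>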

Related

Liquibase - Generating Change Logs

I have an existing database, with its tables and data already in it:
jdbc:mysql://localhost:3306/testing
Now I want Liquibase to generate a changelog from this DB 'testing'. Is that possible?
This is my command, but it doesn't work.
liquibase --driver=com.mysql.jdbc.Driver --classpath=C:\mysql-connector-java-5.1.47.jar
--changeLogFile=C:\db.changelog.xml --url="jdbc:mysql://localhost:3306/testing"
--username=root generateChangeLog
I don't use any password.
The error is related to --changeLogFile=C:\db.changelog.xml
I thought Liquibase would connect to my DB 'testing' and generate a changelog named 'db.changelog.xml' in the root of C:.
Which part am I getting wrong? Am I missing something?
Or is Liquibase not intended to generate a changelog from an existing DB?
Or is Liquibase only intended to generate a DB from a changelog, and not the other way around?
This is possible. You might be having trouble since you are writing to a file in the root of your c: drive. Try c:\temp\changelog instead.
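For example, the same command with the changelog written somewhere other than the drive root (the exact path is only an illustration):
liquibase --driver=com.mysql.jdbc.Driver --classpath=C:\mysql-connector-java-5.1.47.jar
--changeLogFile=C:\temp\db.changelog.xml --url="jdbc:mysql://localhost:3306/testing"
--username=root generateChangeLog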
My experience is that Liquibase works in increments. If you run that command against your database, it will produce a changelog file that creates everything currently in the database (as if starting from a completely empty database).
If you read the text on Liquibase's site regarding this command, it says:
When starting to use Liquibase on an existing database, it is often useful, particularly for testing, to have a way to generate the change log to create the current database schema.
This means that if you:
Execute this command once against your dev database
Run the result against a new database (let's say test)
Run it again on your dev database
Run that file against your test database
You will get a load of errors stating that functions already exist.
I gather that the idea behind this is that you create new entries in the changelog files and execute them against ALL of your databases, instead of making changes with other tools and only using Liquibase for the delta.
Sample entry
<changeSet author="liquibase-docs" id="addColumn-example">
<addColumn catalogName="cat" schemaName="public" tableName="YY">
<column name="xx" type="varchar(255)"/>
</addColumn>
</changeSet>
SQL equivalent
ALTER TABLE yy ADD COLUMN xx VARCHAR(255)

Avoiding creating the databasechangelog table in Liquibase

I want to generate a SQL file that updates my database's DATABASECHANGELOG table, using changeLogSyncSQL on the command line. When I generate the SQL file, it contains a CREATE TABLE [DATABASECHANGELOG] statement. That is fine the first time I run the file against a database, but what can I do to generate a SQL file without this CREATE TABLE [DATABASECHANGELOG] statement, since afterwards the databasechangelog table will already exist in the database? Running Liquibase directly against the database is not an option.
My properties file:
driver: com.microsoft.sqlserver.jdbc.SQLServerDriver
classpath: sqljdbc42.jar
url: offline:mssql?outputLiquibaseSql=true
changeLogFile: cl.xml
outputFile: output.sql
I am using liquibase 3.5.5.
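For reference, the generation step is invoked along these lines (assuming the properties above live in a file named liquibase.properties):
liquibase --defaultsFile=liquibase.properties changeLogSyncSQL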

Replicating XML data from a source into SAP HANA Table

I have been asked many times how data from a source in XML format can be replicated into HANA tables.
I have actually done the opposite, i.e. converting data from a HANA CV into XML format using XSJS.
But I am not sure whether the reverse is possible, i.e. loading data from an XML-format source into HANA using XSJS.
Please point me to any documentation on this.
Thanks,
Sarthak
If you want to insert XML into a HANA database table, you can use the following SQL statements.
First create the database table, including a column with data type NVARCHAR, then execute the INSERT command:
create column table XMLData (
id integer,
xml nvarchar(5000)
);
insert into XMLData (id,xml) values (1,N'
-- your xml here
');
Smart Data Integration (SDI) provides XML connectors to easily integrate XML data into HANA. Since you explicitly asked for XSJS (which means you want to go the "hard" way), in XS Classic you can use the $.util.SAXParser library to parse your loaded XML. The same library can be used in XS Advanced in a Node module with XSJS compatibility, but on XSA I would prefer one of the many freely available XML libraries for Node or Java.
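A rough XSJS sketch of that approach, assuming the XMLData table from above and using $.util.SAXParser together with the classic $.db interface (the input string and the ID value are placeholders):
// placeholder input; in practice this would come from the request body or a file
var xmlString = '<root><query>some payload</query></root>';

// walk the XML with the SAX parser and collect the text content
var collected = '';
var parser = new $.util.SAXParser();
parser.characterDataHandler = function(text) {
    collected += text;
};
parser.parse(xmlString);

// store the raw XML (or the values extracted above) in the XMLData table
var conn = $.db.getConnection();
var pstmt = conn.prepareStatement('INSERT INTO XMLDATA (ID, XML) VALUES (?, ?)');
pstmt.setInteger(1, 1);
pstmt.setString(2, xmlString);
pstmt.execute();
pstmt.close();
conn.commit();
conn.close();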

Liquibase - Handling Multiple Schemas on the SQL format

XML-format changesets allow a schemaName="mySchema" attribute to indicate which schema the DDL for a given changeset should be executed in. This can then be set via a property, so that the same install can be deployed to servers that use different schema names, e.g.
<createTable tableName="test" schemaName="${primarySchema}">
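The property itself would then be defined in the changelog (or supplied as a changelog parameter), for example (a sketch, with the value being just a placeholder):
<property name="primarySchema" value="MYSCHEMA"/>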
With the SQL-format files for Liquibase there does not seem to be any documented equivalent of the schemaName attribute, nor any obvious mechanism to parameterize it, beyond separating out all of the DDL per schema and running the migrate command once per set of schema files.
Is there a documented (or undocumented) way to give Liquibase directives in the SQL file format so that it handles schemas in a neater fashion, such as the following? (which does not work)
--changeset me:1 schemaName:mySchema

Using BLOB in DBUnit

I have to test a class where we retrieve data from Oracle, from an XMLTYPE column. We use BLOB for the cast, because the system is also prepared to run on MySQL:
BLOB salePlanXmlType = (BLOB) jdsLoad.getValueCell(0, "SALEPLAN");
In DBUnit, first we create the tables and then load the data. The loading part was fun, and I managed to load two XMLs using the tips found here.
Anyway, I can't manage to create a table with a BLOB column in DBUnit. Here's the script I try to execute:
CREATE TABLE TSHT_SALEPLAN
(
SALEPLANCODE INTEGER,
VENDORCODE VARCHAR(10),
HOTELCODE INTEGER,
SALEPLAN BLOB
);
When I run the test with this script, I get the following error:
java.sql.SQLException: Wrong data type: BLOB in statement
[CREATE TABLE TSHT_SALEPLAN
(
SALEPLANCODE INTEGER,
VENDORCODE VARCHAR(10),
HOTELCODE INTEGER,
SALEPLAN BLOB]
at org.hsqldb.jdbc.Util.throwError(Unknown Source)
at org.hsqldb.jdbc.jdbcPreparedStatement.executeUpdate(Unknown Source)
I don't understand this, because BLOB seems to be supported by HSQLDB.
If I change the BLOB column definition and use VARBINARY, it works. But then the casting to Blob in my code throws an exception.
Has anybody used BLOB in a create table statement with DBUnit?
The exception is from an old version of HSQLDB (probably 1.8) that does not support BLOB. Use the latest version 2.3.x jar instead.
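If the project uses Maven, that means pulling in something like the following (the version shown is just one example from the 2.3.x line):
<dependency>
<groupId>org.hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>2.3.4</version>
<scope>test</scope>
</dependency>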