I have some SQL scripts that I'd like to convert into Liquibase XML changelogs. Is this possible?
What exactly do you mean by "convert into changelogs"?
If you don't need cross-database compatibility, you could just use the sql tag:
<changeSet id="testing123" author="me">
    <sql splitStatements="false">
        <!-- paste all your existing SQL here -->
    </sql>
</changeSet>
Setting splitStatements="false" ensures the complete SQL is sent as a single statement; otherwise, Liquibase splits the block into separate statements at each semicolon.
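If you'd rather keep your scripts as separate files, Liquibase's sqlFile tag can reference them from the changelog instead of inlining the SQL. A sketch (the path below is illustrative):

```xml
<changeSet id="testing124" author="me">
    <!-- path is resolved relative to this changelog file -->
    <sqlFile path="scripts/my-script.sql"
             relativeToChangelogFile="true"
             splitStatements="false"/>
</changeSet>
```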
Alternatively, take a detour through a database: apply your script to a database of your choice (one supported by Liquibase), then use generateChangeLog to produce an XML changelog from that database. You end up with an XML changelog covering all the SQL scripts you applied.
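As a sketch, that detour could look like this on the command line (URL, credentials, and file name are placeholders):

```shell
# 1. Apply your existing SQL scripts to a scratch database with your usual client.
# 2. Let Liquibase reverse-engineer that database into an XML changelog:
liquibase --url=jdbc:postgresql://localhost:5432/scratch \
          --username=me --password=secret \
          --changelog-file=converted.changelog.xml \
          generateChangeLog
```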
Alternatively, have a look at the answer on this SO post.
Related
We are deploying SQL queries using the Liquibase automation tool.
liquibase version: 4.3.0
We created the DB files in SQL format and used the "liquibase validate" command to check for syntax errors, but I didn't get the expected result.
Ex: dem1.sql, demo2.sql
I found the docs below for validation:
https://docs.liquibase.com/commands/validate.html?msclkid=72d2d27da9cf11ecae4cc3525df5294e
As per the above document, the "liquibase validate" command does not support SQL-format files.
Can anyone help me validate the SQL-format files with Liquibase, or in some other way (like a shell script or a Jenkins pipeline stage), before deploying the SQL queries to my DB?
One possible approach for syntax validation is to use a database Testcontainer matching your runtime environment in your integration tests.
That way your Liquibase changelogs are executed during test setup, and the syntax is implicitly validated.
See example here (https://www.baeldung.com/spring-boot-testcontainers-integration-test)
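Since the question also mentions a Jenkins pipeline stage, one hedged sketch of the same idea at the pipeline level (stage name, image, and connection details are assumptions, not from the original posts) is to spin up a throwaway database and run the changelogs against it before the real deploy:

```groovy
stage('Validate SQL changelogs') {
    steps {
        // Throwaway MySQL container; applying the changelogs here surfaces syntax errors early
        sh 'docker run -d --name validate-db -e MYSQL_ROOT_PASSWORD=root -p 3306:3306 mysql:8'
        sh '''liquibase --url=jdbc:mysql://localhost:3306/mysql \
                  --username=root --password=root \
                  --changelog-file=changelog.xml update'''
    }
    post {
        always {
            sh 'docker rm -f validate-db'
        }
    }
}
```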
I'm generating a changelog from a SQL Server database which I would like to use to create an Oracle version. One of the tables has a varbinary(max) column and contains data to bring over as well. The generated inserts from this table look like this:
<changeSet author=... id="1415603816743-555">
    <insert tableName="my_table">
        <column name="my_table_id" value="0007JL11X000OZ10J60000948UM000000P8P"/>
        ...
        <column name="my_table_image" value="[B#70eded35"/>
        ...
    </insert>
    ...
</changeSet>
This throws an "ORA-01465: invalid hex number" when I attempt to insert the my_table_image data on the Oracle side (target column is a BLOB).
The original data on the SQL Server side is hex. Any ideas on how to successfully generate/update from the changelog to both SQL Server varbinary(max) and Oracle BLOB? Thanks!
I would say that Liquibase is probably not the right tool to use for this kind of data transfer. Liquibase is primarily concerned with database structure rather than database contents. For transferring simple data (strings, numerics, etc.) Liquibase can do the job, but for something like pictures I would look at more specialized ETL (Extract-Transform-Load) tools.
The tool I have used most and had success with (although I've never done image data, so YMMV) is called Pentaho - they have a free open-source "community" version that might work for you.
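For completeness, the core of such a transfer is just reading the binary value through one driver and binding it as a parameter through the other, so the hex round-trip problem never arises. A minimal sketch of that pattern, using sqlite3 as a stand-in for both the SQL Server source and the Oracle target (table and column names mirror the question but are otherwise made up):

```python
import sqlite3

def copy_blobs(src: sqlite3.Connection, dst: sqlite3.Connection) -> int:
    """Copy rows with binary payloads between databases via bound parameters."""
    rows = src.execute("SELECT my_table_id, my_table_image FROM my_table").fetchall()
    # Binding the bytes as a parameter lets the target driver handle the encoding
    dst.executemany(
        "INSERT INTO my_table (my_table_id, my_table_image) VALUES (?, ?)", rows
    )
    dst.commit()
    return len(rows)

# Demo with two in-memory databases standing in for source and target
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE my_table (my_table_id TEXT, my_table_image BLOB)")
src.execute("INSERT INTO my_table VALUES (?, ?)", ("row1", b"\x00\x07\xff"))
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE my_table (my_table_id TEXT, my_table_image BLOB)")
copied = copy_blobs(src, dst)
print(copied)  # 1
```

With real databases you would swap sqlite3 for the SQL Server and Oracle drivers; the parameter-binding pattern stays the same, which is essentially what an ETL tool does under the hood.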
I found two issues when I ran the updateSQL command line in Liquibase:
The last statement in the Liquibase updateSQL output, the insert into the DBCHANGELOG table, is not committed automatically when the SQL is run via the sqlplus command line.
As a result, although the changeset gets executed, the DBCHANGELOG table has no insert recording it. So when I run updateSQL again, the last changeset appears in the SQL output once more, which is incorrect.
Liquibase does not validate or check for syntax errors in the SQL.
As a result, even if the changeset's SQL fails, the insert into the DBCHANGELOG table for that changeset succeeds, which is incorrect. Is there a way to stop or fail the insert statement following the changeset if the changeset's SQL actually failed?
Any help is greatly appreciated... we are this close to getting Liquibase implemented!
To answer the question in your subject line, no, Liquibase cannot validate the SQL. Liquibase supports many different databases, and each has different SQL syntax.
If you can, stop using the SQL generated by updateSQL to actually do the updates, and use Liquibase itself to apply them. That way Liquibase can detect errors and behave properly. If DBAs are wary of Liquibase touching the database, I recommend teams use the updateSQL output as a pre-check to see what Liquibase will do, but let Liquibase do its job.
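That workflow might look like the following on the command line (a sketch; file names are placeholders, and connection settings are assumed to be configured elsewhere, e.g. in liquibase.properties):

```shell
# DBA pre-check: see exactly what Liquibase would run, without touching the database
liquibase --changelog-file=changelog.xml updateSQL > review.sql

# After review, let Liquibase apply the changes itself,
# so the DATABASECHANGELOG bookkeeping stays in sync
liquibase --changelog-file=changelog.xml update
```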
I also find it a Liquibase best practice not to use a SQL script but to write the Liquibase XML file for the change by hand.
I've tried using the executeCommand tag to launch sqlplus or sqlcmd (since I know my target database), and it has a bug which, as of now, is closed?! (but this is open source, so I can't complain :) )
Having said that, I found that working on XML to specify the changes causes many other challenges, for example:
1. Making sure that every change is included in the changelog XML file. I've heard of many organizations that forget to add the file to the changelog.
2. Making sure the file for a specific change is always in sync with file-based version control. Imagine what will happen if it isn't - which happens to many of my customers...
3. Time wasted on merging changelogs between different environments (branches, UAT critical fixes, sandboxes, etc.)
I am trying to migrate from an Oracle DB to MySQL or Postgres using Liquibase. I generated the SQL file using Liquibase, but the syntax is not right; there are a lot of issues with the generated SQL. If anyone has a solution, please let me know. Thank you.
The best approach is to use the generateChangeLog function to create an XML changeSet description of your Oracle database. Go through the generated changelog to make sure everything expected is there, and make any changes to the file as needed, such as data type changes.
Once the changelog is correct, you can run the XML changelog directly against your MySQL or PostgreSQL database, or use updateSQL to generate the SQL Liquibase would use. Liquibase creates the correct database-specific SQL when it runs a changelog against a given database.
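For example, the same changelog can be previewed or applied per target database (URLs, credentials, and file names are placeholders):

```shell
# Preview the MySQL-specific SQL Liquibase would generate from the changelog
liquibase --url=jdbc:mysql://localhost:3306/appdb --username=me --password=secret \
          --changelog-file=oracle-export.changelog.xml updateSQL > mysql-preview.sql

# Or apply it directly to PostgreSQL; Liquibase emits
# PostgreSQL-specific syntax for the same changeSets
liquibase --url=jdbc:postgresql://localhost:5432/appdb --username=me --password=secret \
          --changelog-file=oracle-export.changelog.xml update
```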
I'm looking to automatically generate an XML version of a database schema from SQL (Postgres) DDL.
Are there any tools to help with getting from the DDL to XML?
xml2ddl claims to do this, but it fails to connect and seems unsupported since 2005.
You can use the built-in table_to_xmlschema etc.; see http://www.postgresql.org/docs/current/static/functions-xml.html#FUNCTIONS-XML-MAPPING.
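For example, run against your own database (the table name here is illustrative; the arguments are nulls, tableforest, and target namespace):

```sql
-- XML Schema describing the table's structure
SELECT table_to_xmlschema('my_table', true, false, '');

-- Or map a whole schema's structure to XML Schema in one call
SELECT schema_to_xmlschema('public', true, false, '');
```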
Things that spring immediately to my mind:
Liquibase
Schemaspy
SQL Workbench's WbSchemaReport
They don't use a DDL (SQL) script as input but require a database connection.
Have you also researched DbUnit?