Using Liquibase to version control remote SQLite Databases

The setup I have is a local development machine with the dev DB and a number of remote servers with the production database. While looking for a system to manage the versions of my SQLite database I found Liquibase, but I can't tell whether it will work for what I need, which is updating the schema of the production databases when I release a new version, applying the changes configured in Liquibase's changelog file for that version. Of course, all the rest of the code is under Git, so if Liquibase only needs the changelog files I can put them in the repository, but if it needs something else it could become a problem.

Yes, it should work. If you are using Liquibase against a database for the first time, it will run all the migrations and record what it has applied by creating a separate tracking table for itself in that database. You should, however, verify that the structure is the same locally and in production, and that the migrations won't cause errors.
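A minimal sketch of the command you would run against each production server, assuming the SQLite JDBC driver is on Liquibase's classpath and the file paths are placeholders:

liquibase --changeLogFile=db/changelog.xml \
          --url=jdbc:sqlite:/path/to/production.db \
          --driver=org.sqlite.JDBC \
          update

Only the changelog files (plus a liquibase.properties file, if you choose to use one) are needed as input, so keeping them in the Git repository works; the tracking state itself lives in the DATABASECHANGELOG table inside each target database.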

Related

GitHub doesn't change the database in our team's Django environment

Our team is using a Django environment to develop a website. The main issue is that one team member recently updated one of the databases and the change will not carry through to MySQL. We are literally on the same branch, but the database tables are completely different. We are using the current versions of Django, Python, and MySQL as the development environment, and GitHub to share work.
The truth is that Git never syncs your MySQL database: a MySQL database is local to your machine. (If you used SQLite, the database file itself could be committed to Git.) If you both need to access the same database, you need a database in the cloud so that you are both on the same page.
Also, each of you needs to run the migrations to get the same tables; this is independent of the system, but it does depend on which migrations have been created and applied.
Running the following in the terminal will create the same tables and columns:
python3 manage.py makemigrations
python3 manage.py migrate
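If the tables still differ afterwards, Django's built-in showmigrations command is a quick way to compare which migrations each machine has actually applied:

python3 manage.py showmigrations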

Adding SQL Scripts to TFS Build Process

Our team currently updates our production databases via dacpacs/bacpacs, since we do not have any client data. This will not be the case in the near future, and I'm looking to change the process so that the database is only modified via SQL scripts run through build automation.
Is managing these SQL scripts in Team Foundation Server and executing them in the build doable? And if so, how have people approached this?
Thanks!
You should probably not be deploying DACPACs during the build process; it's more appropriate to push changes as part of a release/deployment workflow that occurs after a build.
That said, you can publish a DACPAC without dropping and recreating the database; there are numerous options for controlling the publish behavior that can be set at the project level and overridden at the command line if necessary.
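For illustration, a hedged sketch of a publish that preserves the existing database (server, database, and file names are placeholders; the two properties shown are just examples of the publish-time controls available):

sqlpackage /Action:Publish /SourceFile:MyDb.dacpac \
    /TargetServerName:prod-sql /TargetDatabaseName:MyDb \
    /p:BlockOnPossibleDataLoss=true /p:DropObjectsNotInSource=false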
There are two main ways of managing SQL database changes when deploying to production:
DACPAC
Sequential SQL scripts
Each has its own issues and benefits when deploying. If you control your entire DevOps pipeline, then DACPAC is a good option. If you have to deal with corporate DBAs, then SQL scripts can also be made to work; a minimal runner for that approach is sketched below.
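A minimal sketch of the sequential-script approach, assuming the scripts sort lexically in the order they must run and sqlcmd is available on the build or release agent:

# apply numbered migration scripts in order, stopping at the first failure
for f in migrations/*.sql; do
    sqlcmd -S prod-sql -d MyDb -b -i "$f" || exit 1
done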

Create an Oracle PL/SQL package using a package on a different remote database

Is it possible to create a package or replace an existing package in a local database using a package from a different database without having to export it from the remote database?
Basically I have two environments/servers (DEV and QA).
The developers who work on the packages use the development environment, and I would like to update the same packages in the QA environment using the packages in DEV (ignore any possible issues for now, e.g. compilation failures).
Is it possible to frequently update the package in QA using the package in DEV as the source (instead of compiling from a .sql file)? Maybe a database link?
Yes, it's possible: you could create a process on your target system which uses the DBMS_METADATA package on the remote system to fetch the DDL for the desired package spec and body, and then use dynamic SQL on the local system to compile the fetched code.
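A minimal sketch of that idea, run from the QA side, assuming a database link named dev_link already points at DEV, that MY_PKG and DEV_SCHEMA are placeholders, and that your Oracle version supports returning CLOBs across a database link in PL/SQL (12.2 and later):

sqlplus qa_user@qa <<'SQL'
DECLARE
  l_spec CLOB;
  l_body CLOB;
BEGIN
  -- fetch spec and body separately so each is a single executable statement
  l_spec := DBMS_METADATA.GET_DDL@dev_link('PACKAGE_SPEC', 'MY_PKG', 'DEV_SCHEMA');
  l_body := DBMS_METADATA.GET_DDL@dev_link('PACKAGE_BODY', 'MY_PKG', 'DEV_SCHEMA');
  EXECUTE IMMEDIATE l_spec;  -- CREATE OR REPLACE PACKAGE ...
  EXECUTE IMMEDIATE l_body;  -- CREATE OR REPLACE PACKAGE BODY ...
END;
/
SQL

Note that the fetched DDL is qualified with the DEV owner, so this assumes the same schema name exists in QA.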
Alternatively, you could use a tool such as Oracle's SQL Developer to migrate the code, using either the database diff functionality to detect differences and prepare the appropriate DDL scripts, or the Cart functionality to pick and choose what gets migrated. However, I'm not sure how well the SQL Developer approach can be automated.

Proper application version update that includes database and code update

I have an application written in Yii that from time to time needs a version update. Currently, when we release a new update, we manually run a shell script that copies/overwrites the application source files from our Git repo and sets the appropriate permissions and other things; at the end of the script, we run a Yii command to apply our database update. We have versioning on our database updates, and we roll back the changes if one of the SQL statements in a version fails. The issue occurs when a database update fails but the application source has already been updated: the code then fails when it tries to access some table, field, or view.
How is an application update with versioning best handled? Much like the way WordPress handles its updates, or better.
I would like to ask for suggestions on the right approach; it may involve RPM, Git, or other tools.
A detailed list of steps from you guys would be good.
Thanks.
Database updates may include backups and running multiple scripts, and should be handled outside of RPM packaging. There are too many failure modes for RPM scripting to handle flawlessly.
You can always ship the database schema update script in the package and then check for the correct schema version when your application starts, providing instructions (or a pointer to instructions) on how to upgrade the database, and how to reinstall the last-known-good application, when the wrong schema version is detected.
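A minimal sketch of that startup check (the schema_version table, the expected version number, and the credentials are all hypothetical placeholders):

#!/bin/sh
# hypothetical startup guard: refuse to run against the wrong schema version
EXPECTED=42
ACTUAL=$(mysql -N -u app -p"$DB_PASS" appdb -e 'SELECT version FROM schema_version')
if [ "$ACTUAL" != "$EXPECTED" ]; then
    echo "Schema is at version ${ACTUAL:-unknown}, expected $EXPECTED."
    echo "See the upgrade instructions, or reinstall the last-known-good package."
    exit 1
fi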

Prestashop 1.3.6 to 1.4.1

I have an eshop running PS version 1.3.6. On my local copy I've updated to 1.4 first and then to 1.4.1...
Now I would like to update on the server... is it possible to just upload the files from my local 1.4.1, adjust the settings file, and run the update script from 1.4.1 directly (without the intermediate step to 1.4)?
I can see there are database update scripts for each version, so it should be safe to do it like that, but I want to be sure before I run it on the server... thanks
I usually do major upgrades this way:
Take a snapshot of the current site (tar.gz) and back up the database using the mysqldump tool (for compatibility);
Download all the files and set up the site on your local machine using the database dump (via the mysql command) and the downloaded snapshot. Adjust settings if necessary.
Perform the upgrade on your local site, test it thoroughly, and test it again with your code & theme.
Repackage the updated files and database (tar.gz & mysqldump) and upload them to the server.
Erase the old site and untar the upgraded site into its folder to take its place.
Replace the old database with the upgraded one (using the mysql command on the server).
Adjust settings if necessary. Test and run it! :)
That should be all. If you're more advanced you can optimize most of the steps; the core backup and restore commands are sketched below. Give me a shout if you need more of the useful commands to back up and restore files & DB.
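For reference, the core commands behind those steps (paths, user, and database names are placeholders):

# on the server: snapshot the files and dump the database
tar -czf shop-backup.tar.gz /var/www/shop
mysqldump -u shopuser -p shopdb > shopdb.sql

# after upgrading and testing locally: upload, then restore files and database
tar -xzf shop-upgraded.tar.gz -C /var/www/
mysql -u shopuser -p shopdb < shopdb.sql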