Proper application version update that includes database and code updates - Yii

I have an application written in Yii that needs a version update from time to time. Currently, when we release an update, we manually run a shell script that copies/overwrites the application source files from our Git repo, sets the appropriate permissions, and does a few other things; at the end of the script, we run a Yii command that performs our database update. Our database updates are versioned, and we roll back the changes for a version if one of its SQL statements fails. The issue is that if a database update fails after the application source has already been updated, the application breaks when it tries to access table fields, tables, or views that don't exist yet.
How do I best handle an application update with versioning, much like the way WordPress handles its updates, or better?
I would like suggestions on the right approach; it may involve RPM, Git, or other tooling.
A detailed list of process steps would be appreciated.
Thanks.

Database updates may include backups and running multiple scripts, and should be handled outside of RPM packaging; there are too many failure modes for RPM scripting to handle flawlessly.
You can still package the database schema update script and then check for the correct schema version when your application starts. When the wrong schema version is detected, provide instructions (or a pointer to instructions) on how to upgrade the database and how to reinstall the last-known-good application.
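A minimal sketch of that start-up check in Yii 1.x terms, assuming a one-row schema_version table and an expected-version constant shipped with each release (all names here are hypothetical):

    <?php
    // Run early in the application bootstrap (e.g. onBeginRequest).
    // EXPECTED_SCHEMA_VERSION is bumped in the same commit as each migration.
    const EXPECTED_SCHEMA_VERSION = 42;

    $row = Yii::app()->db
        ->createCommand('SELECT version FROM schema_version LIMIT 1')
        ->queryRow();

    if ($row === false || (int) $row['version'] !== EXPECTED_SCHEMA_VERSION) {
        // Refuse to serve requests against a mismatched schema and point the
        // operator at the upgrade instructions (or the last-known-good package).
        throw new CHttpException(503,
            'Database schema version mismatch. See docs/UPGRADE.md for how to ' .
            'run the migration or reinstall the previous release.');
    }

Because the check runs before any feature code touches the database, a failed migration leaves you with a clear error page instead of scattered "missing table/column" failures.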

Related

How can I migrate new versions after having executed a repair using flyway?

I executed a repair using Flyway and deleted the file that caused the error.
Since the repair, it is no longer possible to migrate any other files, although only one version shows up in flyway info, which is the Flyway baseline itself.
The command flyway migrate says that the schema is up to date and no migration is necessary, although I provided SQL files in the "sql" sub-folder.
It worked properly before I executed the repair command...
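For reference, the sequence described comes down to something like this:

    flyway repair    # run after deleting the failing migration file
    flyway info      # now lists only the baseline version
    flyway migrate   # reports "up to date", despite new files under sql/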
Hope someone can help!
Cheers,
Yasmin

Using Liquibase to version control remote SQLite Databases

The system I have is a local machine for development with the dev DB, and a number of remote servers with the production database. While looking for a system to manage the versions of my SQLite database I found Liquibase, but I can't tell whether it will work for what I need, which is updating the schema of the production databases when I release a new version, applying the changes configured in Liquibase's changelog file for that version. Of course, all the rest of the code is under Git, so if Liquibase only needs the changelog files I can put them in the repository, but if it needs something else it could become a problem.
Yes, it should work. If you are using Liquibase for the first time, it will run all the migrations and store tracking information in your database by creating a separate table for itself. You should, however, verify that the structure is the same locally and in production, and that the migrations won't cause errors.
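A minimal sketch of a changelog that could live in the Git repo, using Liquibase's SQL-formatted changelog syntax (the author, table, and file names here are made up):

    --liquibase formatted sql

    --changeset alice:1
    CREATE TABLE users (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    --rollback DROP TABLE users;

On each production server you would then run something like liquibase --changelog-file=changelog.sql --url=jdbc:sqlite:/path/to/prod.db update. Liquibase records what has already been applied in the DATABASECHANGELOG table it creates, so re-running on an up-to-date database is safe. Note that SQLite requires a JDBC driver on the classpath.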

Wordpress development process [closed]

I want to design a WordPress development process like the one in the following picture:
First, I want to create a Bitbucket repository for my WordPress site. From this repository, all our software developers should be able to clone the site to their local machines for development. For development, every developer should have a local database to test changes.
After a developer finishes a task, he should be able to push his changes to the repo. When a sprint is done, I want to send all changes from the repo to the test environment with a Jenkins pipeline/job. In this environment, a tester should be able to test all the new functions against a database cloned from the prod system (including the dev changes).
When all tests have passed, I want to be able to apply the database changes to the prod system (with a SQL script) and send all code changes to the prod system with another Jenkins pipeline/job.
Do you think this can work? What about plugin updates? Can I set up environment variables for each system so that plugin updates only need to be done on the dev machine?
I'm not sure whether this can work, because a plugin or plugin update creates a lot of new database changes, and I think I need a tool that can display all those changes, the way Sourcetree does for Git.
Is there someone with expert knowledge of WordPress and this kind of development process who can share his experience with me?
Or do you think this process won't work with WordPress? If so, that would be really bad, because I need a process like this.
Thanks a lot!
I don't really know Wordpress, but the process you describe is definitely possible (I've implemented similar solutions on Drupal and Adobe Experience Manager, for instance).
However...
It's hard.
In a CMS project, a change/new feature can include:
a code change (PHP, CSS, JavaScript)
a database structure change (e.g. a new table)
a database content change (e.g. a copy fix, or default/test content)
a configuration change
Working out which version should get what is really hard. You want a developer to commit a change and have that change replicated on QA with test content - but once QA signs it off, you probably don't want to promote that test content to production. And config changes should probably flow between systems, but with different values for each environment.
For managing the database changes, I've found a plug-in that monitors database changes; no idea how scriptable that is.
See WP Activity Log.
What I've done in the past in similar situations is write a script that creates the database definition for each change, so a developer can run that script and commit the output as part of their code change. It requires a lot of discipline, though: you can only modify the database structure by using the scripts.
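A sketch of that kind of script, assuming a MySQL-backed site (the file name, credentials, and database name are placeholders):

    #!/bin/bash
    # dump-schema.sh - export the database structure (no content) so the
    # definition can be committed together with the code change
    mysqldump --no-data --skip-comments \
        -u "$DB_USER" -p"$DB_PASS" wordpress_dev > db/schema.sql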
The correct answer is: yes, you can do this. I know WordPress, Bitbucket, Git, SVN, Linux, and Ubuntu exceptionally well. I have built a system very similar to what you describe and use it daily.
The concern raised is that the CMS can get tricky. That is true, but you need to use the correct tools for the correct upgrades. WordPress ALREADY has versioning and revisions built into it, so the database doesn't need to be involved at all.
First off, the database doesn't need to be updated unless you are updating plugins; for strict development, no DB pushes are necessary. So have your developers check files in and out of Bitbucket. When the lead developer approves the changes, have him merge/push them to the MASTER BRANCH in your repo. Bitbucket has hooks that can trigger a PHP file on your server every time there is a push to the production branch. All that PHP file does is run the Linux command git pull, which updates all the code on the server to what is on your PRODUCTION BRANCH; git pull will also remove files if files were removed, and so on. On the server you will have a checked-out copy of the Git repo, and on Linux the credentials are stored after the first clone. Simply have your PHP file trigger a Bash script that does a git pull. Done.
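A minimal sketch of that PHP trigger, assuming the web root is a checked-out clone of the repo (the token, path, and branch name are placeholders):

    <?php
    // deploy.php - hit by the Bitbucket hook on every push to production
    $token = 'CHANGE_ME'; // shared secret so only the hook can trigger a deploy
    if (!hash_equals($token, $_GET['token'] ?? '')) {
        http_response_code(403);
        exit('Forbidden');
    }
    // Pull the production branch into the checked-out copy; Git credentials
    // are already cached on the server from the first clone.
    echo shell_exec('cd /var/www/html && git pull origin master 2>&1');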
No matter how many developers you have, there will always need to be a set of eyes that reviews the code changes and merges them into production - that is where the lead developer comes into play.
FYI: the only directories in your WordPress instance that need to be in Bitbucket are the THEME directory and the PLUGINS directory. You DO NOT need to sync the entire WP install, which is pretty large.
In the case that you are building custom plugins: again, it is just code that is stored in the plugins directory. If your custom plugins are built correctly and require database tables, then when they are activated they will immediately build the WP tables that are needed; likewise, a correctly built plugin will drop its own custom tables when uninstalled, as sketched below.
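A sketch of those two hooks in a custom plugin (the table name and columns are made up):

    <?php
    // On activation: create the plugin's table. dbDelta() needs the schema
    // helper, and is picky about the two spaces before "PRIMARY KEY".
    register_activation_hook(__FILE__, 'my_plugin_activate');
    function my_plugin_activate() {
        global $wpdb;
        require_once ABSPATH . 'wp-admin/includes/upgrade.php';
        dbDelta("CREATE TABLE {$wpdb->prefix}my_plugin_data (
            id BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
            created_at DATETIME NOT NULL,
            PRIMARY KEY  (id)
        ) {$wpdb->get_charset_collate()};");
    }

    // On uninstall: drop the table again (the callback must be a named
    // function, not a closure).
    register_uninstall_hook(__FILE__, 'my_plugin_uninstall');
    function my_plugin_uninstall() {
        global $wpdb;
        $wpdb->query("DROP TABLE IF EXISTS {$wpdb->prefix}my_plugin_data");
    }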
You will need to sync the two directories below.
The plugins folder resides in: wp-content/plugins/
The themes folder resides in: wp-content/themes/SELECTED_THEME
If you have any additional questions, just ask - I am here.
From my experience, it is always better to let each developer have their own branch, and to set up the dev server with a dedicated master branch for quality control. You can check out some documentation on how to set this up: https://plixxer.com/docs/server-management/website-quality-control/
Basically, you want a live server and a dev server. The live server should only ever pull from the repo, while the devs and coders can pull from or push to the repo. My team treats the dev server as a quality-checking station: if the current dev code is not up to our standards, the entire dev server is rolled back to what is live on the master branch; when code in master meets our standards, we pull from the master branch onto the live server. Each developer should have their own branch for testing on their local server - a sketch of the flow is below. Let me know if you need some help setting up a local environment with Git.
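The day-to-day flow under that model looks roughly like this (branch names are just examples):

    # developer: work on an isolated branch, push it for review
    git checkout -b feature/new-header
    git push -u origin feature/new-header

    # lead developer: merge the reviewed branch into master
    git checkout master
    git merge --no-ff feature/new-header
    git push origin master

    # live server: only ever pulls the approved master branch
    git pull origin master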
You will want to make a distinction between "build" and "release". The workflow as I understand it is that developers call their local workstations "dev" and pull-request their work to the develop branch (you may have already read through Gitflow). Then, using your choice of CI automation, you get the latest source into a build area and do just that - build it. Check out Ansible. If you have Bitbucket, maybe you also want to organize your sprint with the likes of Jira? Then you have pretty seamless integration of your sprint objectives with the actual branches containing the relevant work/source. Ansible can help you automate builds and releases to the point where you are doing daily builds and running the unit tests across your builds in the various integration environments.
During builds, you would have different configuration files factored in depending on the target environment. This is how to handle environment-specific configuration: it is part of the build process, and ideally all configuration is possible through the build. For example, a connection string might differ across environments if you have different databases to isolate the migration of schema changes. In an Angular application, you would execute ng b --prod to build for production, and this would bring in a production configuration file during the build to change the connection string (for example).
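A sketch of that Angular mechanism (build-time file replacement), with made-up values:

    // src/environments/environment.prod.ts - swapped in by `ng b --prod`
    // through the fileReplacements section of angular.json
    export const environment = {
      production: true,
      // hypothetical per-environment connection target
      apiBaseUrl: 'https://api.example.com',
    };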
More about environment-specific configuration: you can also include post-deployment scripts that are deployed and executed after the files are uploaded, so that they configure the environment as required.
Ask your questions below, and I will do my best to build this out into a comprehensive guide.

Create an Oracle PL/SQL package using a package on a different remote database

Is it possible to create a package or replace an existing package in a local database using a package from a different database without having to export it from the remote database?
Basically, I have two environments/servers (DEV and QA).
The developers who work on the packages use the development environment, and I would like to update the same packages in the QA environment using the package in DEV (ignore any possible issues for now, e.g. compilation failures).
Is it possible to frequently update the package in QA using the package in DEV as the source (instead of compiling from a .sql file)? Maybe via a database link?
Yes, it's possible. You could create a process on your target system that uses the DBMS_METADATA package on the remote system to fetch the DDL for the desired package spec and body, and then uses dynamic SQL on the local system to compile the fetched code.
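A sketch of that approach, run on the QA side (the database link, schema, and package names are placeholders, and fetching CLOBs across a link can be restricted in some Oracle versions):

    -- assumes a database link DEV_LINK pointing at the DEV instance
    DECLARE
      l_spec CLOB;
      l_body CLOB;
    BEGIN
      -- fetch spec and body separately so each compiles as one statement
      l_spec := DBMS_METADATA.GET_DDL@DEV_LINK('PACKAGE_SPEC', 'MY_PKG', 'DEV_SCHEMA');
      l_body := DBMS_METADATA.GET_DDL@DEV_LINK('PACKAGE_BODY', 'MY_PKG', 'DEV_SCHEMA');
      EXECUTE IMMEDIATE l_spec;
      EXECUTE IMMEDIATE l_body;
    END;
    /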
Alternatively, you could use tools such as Oracle's SQL Developer to migrate the code, using either the database diff functionality to detect differences and prepare the appropriate DDL scripts, or the Cart functionality to pick and choose what gets migrated. However, I'm not sure how well the SQL Developer method can be automated.

Need a .NET database versioning script runner

I'm looking at versioning databases and came across the usual articles on how to do this (Coding Horror, Ode to Code, etc.). This all makes perfect sense to me; however, I'm trying to find a script runner that will run the SQL scripts for me. All these articles mention having something to run them automatically, but none of them make any recommendations.
Does anybody know of any utilities for running these scripts? Ideally something that works in the following way:
Runs everything in a transaction so if any single update fails, the whole thing fails
I have control over the name of the schema version database table
Ability to have a series of scripts that are always run if an upgrade takes place
Can be run as part of an automated task
EDIT
It must also be open source.
We use DbUp as the script runner in our web project. It's a simple, nice open-source tool that helps you write your own script runner in console-application fashion.
DbUp is a .NET library that helps you to deploy changes to SQL Server databases. It tracks which SQL scripts have been run already, and runs the change scripts that are needed to get your database up to date.
You can run scripts from a folder in the filesystem, or you can embed them in your assembly and run them as embedded scripts.
You can find more information and samples in their code repository on GitHub:
http://dbup.github.com
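A minimal sketch of such a console runner against the DbUp API (the connection string is a placeholder); note the single transaction and the custom journal table name, matching the requirements above:

    using System;
    using System.Reflection;
    using DbUp;

    class Program
    {
        static int Main()
        {
            var connectionString = "Server=.;Database=MyApp;Trusted_Connection=True;";

            var upgrader = DeployChanges.To
                .SqlDatabase(connectionString)
                .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                .WithTransaction()                            // all-or-nothing upgrade
                .JournalToSqlTable("dbo", "MySchemaVersions") // your own table name
                .LogToConsole()
                .Build();

            var result = upgrader.PerformUpgrade();
            if (!result.Successful)
            {
                Console.Error.WriteLine(result.Error);
                return -1; // non-zero exit code for the automated task
            }
            return 0;
        }
    }

Because it's just a console application with an exit code, it slots straight into any scheduled or CI task.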
Check out SSW SQL Deploy - it would appear to do just about all you're asking for. It keeps track of already executed scripts, it'll run a whole batch of scripts at once and on multiple servers (if required), and so forth.
It's a pretty simple, but nifty tool - highly recommended!