How/Where to run Sequelize migrations in a serverless project?

I am trying to use Sequelize.js with the Serverless Framework. Coming from a traditional server background, I am confused about where/how to run database migrations.
Should I create a dedicated function for running migrations, or is there another way of running them?

I found myself with this same question some days ago while structuring a serverless project, so I decided to develop a simple Serverless plugin to manage Sequelize migrations through the CLI.
With the plugin you can:
Create a migration file
List pending and executed migrations
Apply pending migrations
Revert applied migrations
Reset all applied migrations
I know this question was posted about two years ago but, for those who keep coming here looking for answers, the plugin can be helpful.
The code and the instructions to use it are in the plugin repository on GitHub and on the plugin page on npm.
To install the plugin directly on your project via npm, you can run:
npm install --save serverless-sequelize-migrations
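After installing, registering the plugin in your serverless.yml is what exposes its commands to the serverless CLI; a minimal sketch (the rest of the file is omitted):

plugins:
  - serverless-sequelize-migrations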

Lambda functions were designed to be available to run whenever necessary; you deploy them when you expect multiple executions.
Why would you create a Lambda function for a migration task? Applying a database migration is a maintenance task that you should execute just once per migration ID. If you don't want to execute the same SQL script multiple times, you should avoid creating a Lambda function for that purpose.
In this case, I would use a command-line tool to connect to the database and execute the appropriate task. You could also run a Node.js script for this, but creating a Lambda to execute the script and later removing that Lambda sounds strange and should be done only if you don't have direct access to the database.
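For example, pending migrations could be applied from a developer machine or a CI job with sequelize-cli, pointing it straight at the database; a minimal sketch (the connection URL is a placeholder):

npx sequelize-cli db:migrate --url 'postgres://user:pass@db-host:5432/mydb'

The same tool's db:migrate:undo command reverts the most recent migration if something goes wrong.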

Related

How to detect on CI if there is Prisma schema change but I forgot to create a migration file?

OK, let's say:
I changed the Prisma schema and ran yarn prisma migrate dev, which created a migration.
I noticed there was a typo in the schema, so I fixed it.
I forgot to run yarn prisma migrate dev again after fixing the typo, so no migration was created for this change.
And this mistake went into a pull request, and it will be merged if nobody notices it.
At this point, how can CI detect that there is a schema change but no migration for it?
Thanks.
I just decided to use CircleCI's no_output_timeout: 1m option on the run step, and run yarn prisma migrate dev against a Postgres Docker image on CI.
So, if there is nothing unsynced between the schema and the migrations, it will pass.
If not, the prompt will wait for the user's input to get a new migration name, so CircleCI's timeout kicks in here and makes the job fail. That way we know there was a mistake.
It works pretty well.
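For reference, a minimal sketch of what that CircleCI job could look like (image tags, the job name, and the DATABASE_URL value are assumptions to adapt to your project):

version: 2.1
jobs:
  check-migrations:
    docker:
      - image: cimg/node:16.20      # primary container running yarn/prisma
      - image: cimg/postgres:14.8   # throwaway database for the check
    environment:
      DATABASE_URL: postgres://postgres:postgres@localhost:5432/circle_test  # placeholder
    steps:
      - checkout
      - run: yarn install
      - run:
          name: Fail if the schema changed without a migration
          command: yarn prisma migrate dev
          no_output_timeout: 1m

If schema and migrations are in sync the command exits normally; if Prisma stops to ask for a migration name, the step produces no output for a minute and CircleCI kills it.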

Testing in GitLab CI/CD with different dependency versions

I'm currently developing a (Laravel) package on GitLab, and I want to automate testing using its CI/CD pipeline.
The problem
I already know how to set up a pipeline in GitLab, but what I want to achieve is to automate testing against different versions of the same dependency, in order to keep checking compatibility with old versions and add checks for upcoming new ones.
The case
My Laravel package is not so complex right now and doesn't use any particular or version-specific Laravel features, so I would like to keep it compatible with as many Laravel versions as possible: I would trigger different testing stages in my pipeline to run my tests against Laravel 5.6, 5.7, 5.8, 6, 7, and 8.
The question
How do I trigger different testing stages using different laravel/framework versions?
When downloading dependencies, Composer will go for the latest version available if I define it with '^', so which files do I have to edit?
OK, I've analyzed the problem a bit more and made some considerations about it.
I'm writing this not to properly answer my question, since I hope someone will eventually come up with a better solution/idea, but just to share some thoughts with everyone facing the same problem.
First: since I'm developing a package for Laravel, I cannot declare Laravel as a dependency of it (production or dev); it is the consuming Laravel project that needs to declare my package as a dependency.
Second: to test my package's compatibility with Laravel I'm using orchestra/testbench as a dev dependency, and as per its documentation every release targets a single, precise Laravel version, so if I want to test my package against different Laravel versions I need to test it with different orchestra/testbench releases (see the sketch after this list).
Third: the only dependency my package has is PHP 7.3, so I can easily test against this and subsequent versions using a GitLab pipeline, creating a job for each PHP version that uses a Docker image with the correct PHP version and the latest Composer.
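To illustrate the second point, pinning the testbench release that targets the Laravel version under test might look like this (per the orchestra/testbench docs, the 3.8.x series targets Laravel 5.8; each branch would pin its own constraint):

composer require --dev "orchestra/testbench:^3.8"

A branch testing Laravel 6 would instead require orchestra/testbench:^4.0, and so on for each targeted release.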
Conclusion
It is neither trivial nor straightforward to test a Laravel package against different Laravel versions.
The only idea I came up with, but haven't tried since I gave it up and just test PHP versions (for now), is to make a branch for each Laravel version I want to test and update its composer.json dev dependency with the correct orchestra/testbench release.
Then I can execute the PHP tests on my feature branch's merge request and, in case of success, merge the develop branch into each "Laravel branch" and execute the Laravel compatibility tests on those.
Finally, if every Laravel branch passes its tests, or at least the ones I decide to keep development/support active for, I can merge the develop branch into master.
I'm not going for it
I decided to avoid all of this since I'm not quite sure how to implement it all in the pipeline, and I strongly think it would just add maintenance burden to the project.
So I just keep the PHP jobs to check against different PHP versions; this way I only need to copy/paste a job definition in my gitlab-ci.yml file and change the Docker image version according to the new PHP version to test against, as sketched below.
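For what it's worth, the copy/paste job matrix I mean looks roughly like this (a minimal sketch of gitlab-ci.yml; image tags are assumptions, and Composer is installed in before_script because the official php images don't ship it):

test-php7.3:
  image: php:7.3
  before_script:
    - php -r "copy('https://getcomposer.org/installer', 'composer-setup.php');"
    - php composer-setup.php --install-dir=/usr/local/bin --filename=composer
    - composer install
  script:
    - vendor/bin/phpunit

test-php7.4:
  image: php:7.4
  before_script:
    - php -r "copy('https://getcomposer.org/installer', 'composer-setup.php');"
    - php composer-setup.php --install-dir=/usr/local/bin --filename=composer
    - composer install
  script:
    - vendor/bin/phpunit

Adding a new PHP version is then just one more copied job with a different image tag.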

Adding SQL Scripts to TFS Build Process

Our team currently updates our production databases via dacpacs/bacpacs, since we do not have any client data. This will not be the case in the near future, and I'm looking to change the process so the database is only modified via SQL scripts run through build automation.
Is managing these SQL scripts in Team Foundation Server and executing them in the build doable? And if so, how have people approached this?
Thanks!
You should probably not be deploying DACPACs during the build process; it's more appropriate to push changes as part of a release/deployment workflow that occurs after a build.
That said, you can publish a DACPAC without dropping the database and recreating it; there are numerous options for controlling the database publishing behavior that can be set at the project level and overridden at the command line if necessary.
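For example, a command-line publish with SqlPackage might look like this (server, database, and file names are placeholders; BlockOnPossibleDataLoss is one of the publish properties that guards against destructive changes):

SqlPackage.exe /Action:Publish /SourceFile:"MyDatabase.dacpac" /TargetServerName:"myserver" /TargetDatabaseName:"MyDatabase" /p:BlockOnPossibleDataLoss=True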
There are two main ways of managing SQL data base changes when deploying to production.
DACPAC
Sequential SQL Scripts
Both have their own issues and benefits when trying to deploy. If you control your entire DevOps pipeline, then DACPAC is a good option. If you have to deal with corporate DBAs, then SQL scripts can also be done.

Proper application version update that includes database and code updates

I have an application written in Yii that from time to time needs a version update. Currently, when we release a new update, we manually run a shell script that copies/overwrites the application source files from our Git repo, sets the appropriate permissions, and does other housekeeping; at the end of the script, we run a Yii command to apply our database update. We have versioning on our database updates, and we roll back changes to the database if one of the SQL statements of a version fails. Now the issue occurs if a database update fails after the application source has already been updated: the application then fails when it tries to access missing table fields, tables, or views.
How do I best handle an application update with versioning? Much like the way WordPress handles its updates, or better.
I would like to ask for suggestions on the right approach; it may include RPM, Git, or other tools.
It would be good to have a detailed list of processes from you guys.
Thanks.
Database updates may include backups and running multiple scripts, and should be handled outside of RPM packaging. There are too many failure modes for RPM scripting to handle flawlessly.
You can always package up the database schema update script in the package and then check for the correct schema version when your application starts, providing instructions (or a pointer to instructions) on how to upgrade the database, and how to reinstall the last-known-good application, when the wrong schema version is detected.
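A minimal sketch of that startup check in PHP, assuming a schema_version table that your update scripts maintain (the table name, connection details, and version numbers are all illustrative):

<?php
// Schema version this build of the code was written against.
$expected = 42;

// Placeholder connection; use your application's existing DB config.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

// Highest migration recorded in the database by the upgrade scripts.
$actual = (int) $pdo->query('SELECT MAX(version) FROM schema_version')->fetchColumn();

if ($actual !== $expected) {
    // Refuse to start and point the operator at the upgrade instructions.
    exit("Database schema is at version $actual but this release expects $expected. See the upgrade notes.");
}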

Need a .NET database versioning script runner

I'm looking at versioning databases and came across the usual articles regarding how to do this (Coding Horror, Ode to Code, etc.). This all makes perfect sense to me; however, I'm trying to find a script runner that will run the SQL scripts for me. All these articles mention having something to run them automatically, but none of them make any recommendations.
Does anybody know of any utilities for running these scripts? Ideally something that works in the following way:
Runs everything in a transaction so if any single update fails, the whole thing fails
I have control over the name of the schema version database table
Ability to have a series of scripts that are always run if an upgrade takes place
Can be run as part of an automated task
EDIT
Open Source
We use DbUp as the script runner in our web project. It's a simple, nice open-source tool that helps you write your own script runner in console-application fashion.
DbUp is a .NET library that helps you to deploy changes to SQL Server
databases. It tracks which SQL scripts have been run already, and runs
the change scripts that are needed to get your database up to date.
You can run scripts from a folder in the filesystem, or you can embed them in your assembly and run them as embedded scripts.
You can find more information and samples in their code repository on GitHub.
http://dbup.github.com
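A minimal sketch of such a console runner built on DbUp (the connection string is a placeholder; WithTransaction() makes the whole upgrade fail or succeed as one unit, and JournalToSqlTable() gives you control over the version table's name):

using System;
using System.Reflection;
using DbUp;

class Program
{
    static int Main()
    {
        var connectionString = "Server=.;Database=MyApp;Trusted_Connection=True;"; // placeholder

        var upgrader = DeployChanges.To
            .SqlDatabase(connectionString)
            .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly()) // scripts compiled into this exe
            .WithTransaction()                          // one transaction: any failure rolls everything back
            .JournalToSqlTable("dbo", "SchemaVersions") // name of the table tracking executed scripts
            .LogToConsole()
            .Build();

        var result = upgrader.PerformUpgrade();
        if (!result.Successful)
        {
            Console.Error.WriteLine(result.Error);
            return -1; // non-zero exit so an automated task can detect failure
        }
        return 0;
    }
}

Because it returns a non-zero exit code on failure, it slots straight into an automated build or release task.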
Check out SSW SQL Deploy - it would appear to do just about all you're asking for. It keeps track of already executed scripts, it'll run a whole batch of scripts at once and on multiple servers (if required), and so forth.
It's a pretty simple, but nifty tool - highly recommended!