How to create a production database in Hanami?

Hanami provides commands to create a database, but both db create and db prepare are unavailable in the production environment.
http://hanamirb.org/guides/command-line/database/
How can I create a database in production?

It depends.
We deliberately disabled these commands in production because they can be potentially destructive, and we can't guess where you're going to deploy your project.
Is it Heroku? Then add the database via their CLI. Do you use a VPS? Is the database on the same node? Does the Unix user who runs the Ruby process have permission to create the database? We can't guess.
It depends on where you're going to deploy.
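On a VPS with a local PostgreSQL server, for example, the database is typically created once, by hand or by a deploy script, before the app ever runs. A minimal sketch, assuming PostgreSQL and a hypothetical bookshelf app; the commands are echoed rather than executed, so this is a dry run:

```shell
#!/bin/sh
# Hypothetical names: a "bookshelf" app with a dedicated Postgres role of
# the same name. Each command is echoed instead of run (dry run).
APP_NAME="bookshelf"
DB_NAME="${APP_NAME}_production"

# Create a non-superuser role for the app, then a database it owns:
CREATE_ROLE="createuser --no-superuser --no-createdb ${APP_NAME}"
CREATE_DB="createdb --owner=${APP_NAME} ${DB_NAME}"

echo "$CREATE_ROLE"
echo "$CREATE_DB"

# On Heroku the add-on provisions the database instead (plan name varies):
echo "heroku addons:create heroku-postgresql --app ${APP_NAME}"
```

The Ruby process then only needs a DATABASE_URL pointing at the already-created database; it never needs createdb rights itself.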

Related

Required permission to use tSQLt on SQL Server

Is there any other way to run tSQLt on a SQL Server database without sysadmin role or ALTER TRACE permission?
We are currently trying to use a test tool called SQLTest from Redgate, which uses the tSQLt framework. We installed it successfully on the database using the sysadmin role, but no one apart from the person with that role is able to use the tool. Anyone else who tries gets an error message relating to permissions.
I've been in touch with Redgate support and all they tell me is that the sysadmin role is needed or at least ALTER TRACE permission. These are elevated permissions and shouldn't be given to all users on a database.
Any help would be appreciated.
It is not clear exactly what your use case is but...
Typically, developers would use SQLTest and/or tSQLt on a local sandbox e.g. SQL Server Developer Edition installed on their laptops. If that is the case, most orgs should have no problems allowing developers to be sysadmin on their own locally installed SQL Server instance.
If you are using this on a shared SQL Server instance, again this should be a dev environment where, hopefully, SQL devs are allowed to administer their own environment.
I can't imagine any org allowing developers sysadmin access in a production instance but then you really shouldn't be using tSQLt in production anyway.
I might be wrong, but I believe it's only db_owner that's required to run tSQLt tests. Might depend on what procedures you're testing. Almost all of our tests are just data quality AssertTableEmpty tests.
If you have the Redgate Toolbelt (or SQL Compare, specifically), you should be able to create personal dev databases on the fly, make your changes, run the tests, then SQL Compare back to shared dev db.
There are two things at play here. tSQLt requires the executing principal to have permissions to alter any object in the database. Usually, you achieve that by making them part of the db_owner role.
Redgate SQLTest requires (in some circumstances) additional permissions. For example, the ALTER TRACE permission is needed to execute the code coverage tool that comes with SQLTest.
In addition to the above, there is a file called prepareserver.sql that needs to be executed once per SQL Server instance (not per database). It requires sysadmin privileges.
So, if you cannot use dedicated environments as #datacentricity recommended, you can execute tSQLt directly and not use SQLTest, or figure out which options to disable if you want to use SQLTest.
Like others here, I strongly recommend you use dedicated environments to do your development, independent of tSQLt.
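The permission setup described in the answers above could be scripted roughly like this. A sketch with hypothetical names (a devsql01 instance, an AppDb database, a test_runner principal); the sqlcmd invocations are echoed rather than executed:

```shell
#!/bin/sh
# Hypothetical: instance "devsql01", database "AppDb", login "test_runner".
# Dry run: the sqlcmd calls are echoed, not executed.
SERVER="devsql01"
DATABASE="AppDb"
LOGIN="test_runner"

# db_owner is enough to run plain tSQLt tests:
GRANT_DBO="ALTER ROLE db_owner ADD MEMBER [${LOGIN}];"
# ALTER TRACE is a server-level permission, only needed for SQLTest
# extras such as code coverage:
GRANT_TRACE="GRANT ALTER TRACE TO [${LOGIN}];"

echo "sqlcmd -S ${SERVER} -d ${DATABASE} -Q \"${GRANT_DBO}\""
echo "sqlcmd -S ${SERVER} -Q \"${GRANT_TRACE}\""
```

Note that prepareserver.sql still requires sysadmin and is a one-off per instance; the grants above cover the day-to-day running of tests.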

How to test Azure database components virtually without publishing a database in Azure

I have a Microsoft Azure SQL Database project. I also have a Python 3.9 project that uses unittest to unit test this database project. I have an Azure DevOps build pipeline, defined in YAML, that runs the unit tests against the development-integration environment.
I do not want to publish changes to the development-integration environment before running the tests. If you think this is the wrong approach, I will consider your arguments.
I want to 'virtually' test the changes. I want to deploy the new objects to a temporary, ad-hoc database instance that is equivalent to an Azure SQL Database instance. When the tests have been executed, I want to clear everything away. I do not want to deploy a database in Azure for this purpose because of billing, although a serverless instance would avoid that problem.
Any ideas?
If you are on the cloud and you need to test, you need to test on the cloud too.
You cannot "virtually" test; there is nothing on-prem that is equivalent to Azure SQL Database.
Go with the serverless instance, as you said.
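A throwaway serverless database could be provisioned per pipeline run with the az CLI and deleted when the tests finish, so billing is limited to the run itself. A sketch with hypothetical resource names; the commands are echoed rather than executed:

```shell
#!/bin/sh
# Hypothetical: resource group "rg-ci-tests", logical server "sql-ci",
# a per-run database "unittest-db". Dry run: commands are echoed.
RG="rg-ci-tests"
SERVER="sql-ci"
DB="unittest-db"

# Create a serverless database that auto-pauses after 60 minutes idle:
CREATE_CMD="az sql db create --resource-group ${RG} --server ${SERVER} --name ${DB} --edition GeneralPurpose --family Gen5 --capacity 1 --compute-model Serverless --auto-pause-delay 60"
echo "$CREATE_CMD"

# ... run the unittest suite against ${DB} here ...

# Tear the database down at the end of the pipeline:
echo "az sql db delete --resource-group ${RG} --server ${SERVER} --name ${DB} --yes"
```

In a YAML pipeline, the create step would run before the test job and the delete step in a cleanup job with a condition of always(), so the database is removed even when tests fail.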

How can I automatically restore my Heroku Postgres staging database from a daily backup?

I have my Heroku production database scheduled to make daily backups, and I want to restore the backups onto my staging database daily as well. This way I can keep the staging box in sync with production for testing/debugging purposes and have a daily test of the restoration process run automatically.
I've tried to schedule a bash script to run on the staging box to perform the restore. The script I have uses the Heroku CLI to pull the url of the latest backup and perform the restoration. The problem I have is with authenticating the Heroku CLI. Since I can't open a browser on the dyno, I need to find a safe way to authenticate.
Should I pull a .netrc file from somewhere? Is it even a good idea to give a dyno the CLI? Is there a better way to go about this without standing up another server to run the restorations?
You can put an authorization token in the HEROKU_API_KEY env variable on your staging environment. Generate the token with heroku auth:token.
Then set the token on staging with heroku config:set HEROKU_API_KEY=token -a staging
From a security standpoint, this means your staging environment pretty much has full access to your production environment.
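With the token in place, the nightly restore itself can be a one-liner built from real Heroku CLI commands (pg:backups:url fetches a signed URL for the latest backup, pg:backups:restore loads it). A sketch with hypothetical app names, echoed rather than executed:

```shell
#!/bin/sh
# Hypothetical app names "myapp-prod" and "myapp-staging".
# Dry run: the command is echoed, not executed. With HEROKU_API_KEY set
# in the environment, the CLI authenticates without a browser.
PROD_APP="myapp-prod"
STAGING_APP="myapp-staging"

RESTORE_CMD="heroku pg:backups:restore \$(heroku pg:backups:url --app ${PROD_APP}) DATABASE_URL --app ${STAGING_APP} --confirm ${STAGING_APP}"
echo "$RESTORE_CMD"
```

The --confirm flag skips the interactive destructive-operation prompt, which is required for an unattended scheduled job.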
A more secure way is to have a scheduled task, running on the production app (or a new app created just for this purpose), that copies the db backup to an S3 bucket the staging app has access to. The staging app then restores from the backup in the S3 bucket. Staging needs no access to production.
This is a good idea anyway - if you lose access to Heroku you'll still have access to your backups.
There's a buildpack for this: https://github.com/kbaum/heroku-database-backups. I encourage you to read the code in the buildpack; it is a pretty simple process. I would also either fork the buildpack or just write your own code, because it will have full access to your production environment. I would never trust a third-party buildpack with that.
Bonus points if your job scrubs sensitive information from your production database for staging. It could do this by:
Restoring the production backup to a second database
Scrubbing sensitive information from the second database
Backing up the second database
Pushing the second database backup to the S3 bucket
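Those four steps might be scripted roughly like this, assuming PostgreSQL tooling, a hypothetical users table with an email column to scrub, and a hypothetical bucket name; each step is echoed as a dry run:

```shell
#!/bin/sh
# Hypothetical names: bucket "myapp-db-backups", scratch db "scrub_tmp".
# Dry run: every step is echoed, not executed.
BUCKET="s3://myapp-db-backups"
SCRUB_DB="scrub_tmp"

# 1. Restore the production backup to a second database:
echo "createdb ${SCRUB_DB}"
echo "pg_restore --no-owner --dbname=${SCRUB_DB} latest.dump"

# 2. Scrub sensitive information (hypothetical users.email column):
echo "psql -d ${SCRUB_DB} -c \"UPDATE users SET email = 'user' || id || '@example.com';\""

# 3. Back up the scrubbed database:
echo "pg_dump --format=custom --file=scrubbed.dump ${SCRUB_DB}"

# 4. Push the scrubbed backup to the S3 bucket, then clean up:
echo "aws s3 cp scrubbed.dump ${BUCKET}/scrubbed.dump"
echo "dropdb ${SCRUB_DB}"
```

Staging then restores from scrubbed.dump instead of the raw production backup, so no real user data ever lands on the staging box.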

How to apply migrations to production server

Using Code First and Entity Framework, I created my web application on my dev machine, used migrations, and published my beta application to my production server and database.
Then, on my dev system, I made lots of changes, created several migrations, and applied them to my local dev database. Running update-database updates my local dev database, but how do I then apply the migrations to my production server database?
I've been using update-database -script to get the SQL to apply manually to my production server. Is there a better way?
You should ideally employ some kind of actual database deployment system like ReadyRoll. Short of that, you should generate SQL scripts that you can commit and deploy manually, preferably via a DBA role in your organization. Code-based migrations can do all sorts of potentially bad things to your database with no notice, but in a SQL script, you can easily see that a table is about to be dropped or a column with lots of irreplaceable data is about to be removed.
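With EF Core, the equivalent of update-database -script is dotnet ef migrations script; the --idempotent flag produces a script that checks the __EFMigrationsHistory table and can be run safely against a database at any earlier migration. A sketch with hypothetical server and file names, echoed rather than executed:

```shell
#!/bin/sh
# Dry run: commands are echoed, not executed. Server and db names are
# hypothetical.
SCRIPT_FILE="migrate.sql"

# Generate one reviewable, idempotent script covering all migrations:
GEN_CMD="dotnet ef migrations script --idempotent --output ${SCRIPT_FILE}"
echo "$GEN_CMD"

# A DBA can then review the script and apply it, e.g.:
echo "sqlcmd -S prod-sql-server -d AppDb -i ${SCRIPT_FILE}"
```

The generated file is also a natural artifact to commit or attach to a release, which gives you the review step described above.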
Your connection string in Web.config is what establishes which database the application is pointing to. When you point it to production and run the same EF commands (dotnet ef migrations add migrationName and dotnet ef database update), it should update your production environment.
For my setup, I just don't deploy my web.config, so in production it always points to the production database. When I run the EF update scripts in production, it updates production and I'm good to go.

Integration Test on Continuous Integration server

We would like to set up the CI system so that the integration tests run in a centralized place.
How can we set up a database for each developer for their branch of work?
We want to guarantee 100% compatibility with the deployed platform, at the cost of having multiple databases that are synchronized with a main db.
Installation and data transfer should be automated and painless during the application build.
You have to set up database sandboxes for your CI server. This setup will depend a lot on which database solution you use and the size of your database.
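One common sandbox pattern is to derive a database name from the branch under test, create it at the start of the build from a snapshot of the main db, and drop it when the build finishes. A sketch assuming PostgreSQL and a hypothetical naming scheme; the create/load/drop commands are echoed as a dry run:

```shell
#!/bin/sh
# Hypothetical branch name; in CI this would come from
# $(git rev-parse --abbrev-ref HEAD) or the CI server's branch variable.
BRANCH="feature/login-form"

# Turn the branch name into a safe database identifier:
DB_NAME="ci_$(echo "$BRANCH" | tr -c 'a-zA-Z0-9\n' '_')"

# Dry run: create the sandbox, load the main-db snapshot, drop afterwards.
echo "createdb ${DB_NAME}"
echo "pg_restore --dbname=${DB_NAME} main_db_snapshot.dump"
echo "dropdb ${DB_NAME}   # after the build finishes"
```

Because each branch gets its own database, parallel builds never collide, and refreshing main_db_snapshot.dump nightly keeps the sandboxes synchronized with the main db.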