Flywaydb error in Kotlin

I want to use Flyway in Kotlin, but I get an error when using it.
My database is PostgreSQL and my ORM is Kotlin Exposed.
Code:
val url = "jdbc:postgresql://127.0.0.1/test1"
Database.connect(url, driver = "org.postgresql.Driver", user = "postgres", password = "123")
var flyway = Flyway()
flyway.setDataSource(url, "postgres", "123")
flyway.migrate()
Error:
Exception in thread "main" org.flywaydb.core.api.FlywayException: Found non-empty schema(s) "public" without schema history table! Use baseline() or set baselineOnMigrate to true to initialize the schema history table.
    at org.flywaydb.core.Flyway$1.execute(Flyway.java:1197)
    at org.flywaydb.core.Flyway$1.execute(Flyway.java:1168)
    at org.flywaydb.core.Flyway.execute(Flyway.java:1655)
    at org.flywaydb.core.Flyway.migrate(Flyway.java:1168)
How can I solve it? Where is my code wrong?

Found non-empty schema(s) "public" without schema history table! Use baseline() or set baselineOnMigrate to true to initialize the schema history table.
That error message pretty much says it all. You seem to be running Flyway on a database already populated with tables.
By default, Flyway expects to be run on a brand-new database, in a greenfield project. Flyway installs its own table first, for its internal tracking. This is the “schema history table” mentioned in your error message. After installing its own table, Flyway runs your SQL scripts creating further tables.
If you are adding Flyway to an existing database, choose one of these solutions:
Recreate your database from scratch: start with an empty database, run Flyway first, then write and execute SQL scripts to recreate all the elements of your old database, and finally import the existing data.
Read about Flyway's baseline feature, just as the error message suggests (a minimal sketch follows below).
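For illustration, a minimal sketch of the baselineOnMigrate route, assuming a Flyway version that has the fluent configure() API (with the older API used in the question, you can call flyway.setBaselineOnMigrate(true) before migrate() instead):
import org.flywaydb.core.Flyway

// Baseline the existing, non-empty "public" schema on the first run,
// then apply only migrations newer than the baseline version.
val flyway = Flyway.configure()
    .dataSource("jdbc:postgresql://127.0.0.1/test1", "postgres", "123")
    .baselineOnMigrate(true)
    .load()
flyway.migrate()
Baselining records the current state of the schema in the history table instead of trying to recreate it, so only later migrations are applied on top.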

Related

PostgreSQL error: the "PostAudienceEnum" type already exists

I generated a migrate.sql file from the Prisma ORM, which I then imported into my PostgreSQL database from the Query Tool. However, when I run this file I get the error: the "PostAudienceEnum" type already exists. I don't know why, since I did declare PostAudienceEnum as an enum. Here are the lines where the PostAudienceEnum enumeration appears:
-- CreateEnum
CREATE TYPE "PostAudienceEnum" AS ENUM ('PUBLIC', 'FRIENDS', 'ONLY_ME', 'SPECIFIC');
CREATE TABLE "Post" (
...
"audience" "PostAudienceEnum" NOT NULL DEFAULT E'PUBLIC',
...
CONSTRAINT "Post_pkey" PRIMARY KEY ("id")
);
This file was generated from my Prisma schema. I don't know how to modify it without messing up my database, or why PostgreSQL throws this error.
You might be getting this error if the database already has data, and you're attempting to manually execute the SQL file against the database. You should only use prisma migrate to manage changes in your schema against your database. Prisma internally records what migrations have and have not been executed against the database, and attempting to run an SQL file manually outside of prisma migrate defeats the purpose of using it in the first place.
Migrate docs: https://www.prisma.io/docs/concepts/components/prisma-migrate
You should ONLY use Prisma Migrate to make changes to your database tables/columns/relationships, and not an external tool. However, if you are developing your database FIRST and then looking to keep your Prisma schema up to date with your database (and not the other way around), you will want to introspect your database. Same deal applies: Prisma knows what parts of your database are reflected in your Prisma schema and what's not.
Introspection docs: https://www.prisma.io/docs/concepts/components/introspection
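For illustration, the typical workflow looks roughly like this (the migration name is a placeholder, and prisma db pull is the newer name for prisma introspect):
npx prisma migrate dev --name add_post_audience   # create and apply a migration during development
npx prisma migrate deploy                          # apply pending migrations against the target database
npx prisma db pull                                 # introspect an existing database into the Prisma schema
Letting Prisma run the generated SQL keeps its migration history accurate; running migrate.sql by hand in the Query Tool bypasses that history, which is what leads to "already exists" errors.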

Getting started with Liquibase on Snowflake

I am trying to get started with Liquibase on Snowflake.
I think I am almost there with the liquibase.properties file:
driver: net.snowflake.client.jdbc.SnowflakeDriver
classpath: ./liquibase-snowflake-1.0.jar
url: jdbc:snowflake://XXXXXX.us-east-1.snowflakecomputing.com
username: YYYYYYYYY
password: ZZZZZZZZZZ
changeLogFile: mySnowflakeChangeLog.xml
Unfortunately, Liquibase complains about not having a "current database" when trying to create the databasechangelog and/or databasechangeloglock tables.
Since I do not have access to the SQL script that creates these tables, how do I instruct Liquibase which DATABASE to use?
I pinged an internal team here at Snowflake. They recommended:
adding the db=mydb database connection parameter to the URL, or setting a default namespace for the user: alter user mike set default_namespace=mydb
Hope that helps!
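For example, a sketch of the first suggestion applied to the liquibase.properties above (the database, schema, and warehouse names are placeholders):
url: jdbc:snowflake://XXXXXX.us-east-1.snowflakecomputing.com/?db=mydb&schema=public&warehouse=my_wh
With the database (and ideally a schema and warehouse) set on the connection, Liquibase has a current database in which to create databasechangelog and databasechangeloglock.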
I am not an expert in Liquibase, but the JDBC standard allows custom connection properties to be passed in. If Liquibase supports that, you can specify the database as a custom connection property, and the Snowflake JDBC driver will send the database information with the connection request to the server.

DACPAC deployment fails on EXTERNAL DATA SOURCEd schemas

(Here is a problem with a similar issue:
Publish to SQL Azure fails with 'Cannot drop the external data source' message)
There is this new pattern called Sql Db Elastic Query (Azure).
The gist of it is captured here:
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-elastic-query-overview
Now, I have a SqlProj that defines:
an External Data Source (EDS) and
Database Scoped Credentials (DSC).
To keep passwords out of the script, I am using SqlCmd variables.
I probably have a gazillion views on "external" tables, based on the elastic query pattern.
And during DACPAC deployments to SQL Azure, I always get an error on:
Error SQL72014: .Net SqlClient Data Provider: Msg 33165, Level 16, State 1, Line 1 Cannot drop the external data source 'XXXX' because it is used by an external table.
Error SQL72045: Script execution error. The executed script:
DROP EXTERNAL DATA SOURCE [XXXX];
Checking the logs, I realize that there are all these views/tables that exist and use this EDS/DSC combo.
The workaround comes with an ever-deepening price.
So the question is: has anyone else hit this problem and found its root cause?

Entity Framework Core Error: dotnet.exe : System.Data.SqlClient.SqlException (0x80131904): There is already an object named 'Company' in the database

I am building an application with ASP.NET Core MVC 6 and Entity Framework Core code first with a built-in DB context; the SQL database has already been populated with data records. I recently made some small changes to the data models and recreated the migration with the commands "dotnet ef migrations add Stage3" and "dotnet ef database update" in the VS 2015 Package Manager Console, but it ran into this error:
dotnet.exe : System.Data.SqlClient.SqlException (0x80131904): There is already an object named 'Company' in the database.
The Company table is at the top of the table relationships; it seems that because the Company table is already there, EF cannot update to the new table structure. If I change the DB name in the connection string, it creates a new database with the new table structure without any issues. I am not sure how to address this. After the application goes live in the near future I will probably make more changes to the models and will have the same issue again, and I cannot delete a database with live data just to recreate the table structure. Maybe I should configure it in the Startup.cs file, but I haven't found any useful resources yet. Please give me some advice.
I have attempted to change the DB initializer as in the attached screenshot, but I am not sure how to do it.
I checked the project code again; the migration has not been applied to the __MigrationHistory table, and the migration code actually contains the code to create the whole database structure, as in the sample below:
migrationBuilder.CreateTable(
    name: "Company",
    columns: table => new
    {
        CompanyId = table.Column<int>(nullable: false)
            .Annotation("SqlServer:ValueGenerationStrategy", SqlServerValueGenerationStrategy.IdentityColumn),
        CompanyName = table.Column<string>(maxLength: 100, nullable: false),
        IsAdmin = table.Column<bool>(nullable: false)
    },
    constraints: table =>
    {
        table.PrimaryKey("PK_Company", x => x.CompanyId);
    });
And I haven't changed the project namespace. Recently I just made some changes to a few table relationships, such as the site user permission table (a company has many sites). I added a permission table, so now the site user permission table can have multiple permission types instead of a single permission type.
I am not sure how to set up automatic migrations in Entity Framework Core.
In the Entity Framework code-first approach, there are four different database initialization strategies:
CreateDatabaseIfNotExists: This is the default initializer. As the name suggests, it will create the database if none exists, as per the configuration. However, if you change the model classes and then run the application with this initializer, it will throw an exception.
DropCreateDatabaseIfModelChanges: This initializer drops an existing database and creates a new one if your model classes (entity classes) have changed. So you don't have to worry about maintaining your database schema when your model classes change.
DropCreateDatabaseAlways: As the name suggests, this initializer drops an existing database every time you run the application, irrespective of whether your model classes have changed or not. This is useful when you want a fresh database every time you run the application, for example while you are developing it.
Custom DB initializer: You can also create your own custom initializer if none of the above satisfies your requirements, or if you want to perform some other process that initializes the database using the above initializers.
So if you are using DropCreateDatabaseIfModelChanges or DropCreateDatabaseAlways, replace it with CreateDatabaseIfNotExists.
Please try this out.
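For illustration, a minimal sketch of setting that initializer with the classic EF6 API from System.Data.Entity (the context name is a placeholder, and note that EF Core itself does not ship these initializer classes):
using System.Data.Entity;

public class CompanyContext : DbContext
{
    static CompanyContext()
    {
        // Keep the existing database and its data; only create it when it is missing.
        Database.SetInitializer(new CreateDatabaseIfNotExists<CompanyContext>());
    }
}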

Can I exclude a custom schema from a Schema comparison in SSDT?

We have a SQL Server database that is very dynamic and is always creating new tables and dropping existing ones in a custom schema called 'temp' (we have a dbo schema and a temp schema). We also use SSDT to maintain and monitor changes in our schema, but we are unable to use the update feature on a schema comparison, because if a new table is created (say temp.MyTable) after the schema comparison is made and before the update is attempted, SSDT invalidates the schema comparison because something has changed. At the moment, our only solution is to run the schema comparisons around midnight when system activity is practically non-existent, which is not ideal for the person who has to do the comparison.
My question is: is there a way to exclude tables that are part of the 'temp' schema from the schema comparison?
How are you doing the deployment? As a test, I used sqlpackage.exe to publish a dacpac while constantly creating new tables, and it deployed without complaining.
However, there are a couple of things you can do. The first is to stop the deployment from blocking when drift is detected:
/p:BlockWhenDriftDetected=False
This is set to true by default.
The second thing is to ignore the temp schema. I don't think this will help unless you also stop the drift check, but you might want to use this filter to stop all changes to the temp schema:
http://agilesqlclub.codeplex.com/
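For illustration, a sketch of passing that property on a sqlpackage.exe publish (the dacpac, server, and database names are placeholders):
sqlpackage.exe /Action:Publish /SourceFile:MyDatabase.dacpac /TargetServerName:myserver.database.windows.net /TargetDatabaseName:MyDatabase /p:BlockWhenDriftDetected=False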
Ed