Converting a "DBML" file to a "SQL database file" - sql

I have the DBML file of a database and would like to generate an SQL database file from this file.
Thanks

There's a method on the DataContext called CreateDatabase() that you could use.
http://msdn.microsoft.com/en-us/library/system.data.linq.datacontext.createdatabase.aspx
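A minimal sketch of that approach (MyModelDataContext is a hypothetical name for the designer-generated DataContext of your .dbml, and the connection string is only illustrative; it should point at a database that does not exist yet):

using System.Data.Linq;

class CreateDbFromDbml
{
    static void Main()
    {
        // The generated DataContext (from the .dbml designer) carries the mapping.
        using (var db = new MyModelDataContext(@"Server=.\SQLEXPRESS;Database=MyNewDb;Integrated Security=True"))
        {
            // CreateDatabase() creates the database plus the schema described by the mapping.
            if (!db.DatabaseExists())
                db.CreateDatabase();
        }
    }
}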

I know of no available utilities that do that, and it's a bit much to do for a SO answer.
But, for the most part, it's not that big a deal. The DBML file is written in XML; it should be easy to read via LINQ-to-XML. Then just write out the SQL commands for the values in the XML into a script file, and run the script. (It could also be done with an XSLT transformation.)
<Table Name="dbo.Person" Member="Persons">
becomes
CREATE TABLE Persons (
and
<Column Name="PersonID" Type="System.Int32" DbType="Int NOT NULL IDENTITY"
IsPrimaryKey="true" IsDbGenerated="true" CanBeNull="false">
</Column>
<Column Name="AddressID" Type="System.Int32" DbType="Int NOT NULL"
CanBeNull="false"></Column>
becomes:
PersonID Int NOT NULL,
AddressID int NOT NULL,
and so on.
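A rough sketch of that idea, assuming the usual namespace declared by .dbml files and hypothetical file names (verify both against your own model). It only emits column definitions, so primary keys, constraints, and associations would need the same kind of treatment:

using System;
using System.IO;
using System.Linq;
using System.Text;
using System.Xml.Linq;

class DbmlToSql
{
    static void Main()
    {
        // The namespace typically declared by .dbml files; check yours.
        XNamespace ns = "http://schemas.microsoft.com/linqtosql/dbml/2007";
        var dbml = XDocument.Load("MyModel.dbml");   // hypothetical input file
        var sql = new StringBuilder();

        foreach (var table in dbml.Descendants(ns + "Table"))
        {
            // <Table Name="dbo.Person" Member="Persons"> -> CREATE TABLE Persons (
            sql.AppendLine($"CREATE TABLE {table.Attribute("Member")?.Value} (");

            // <Column Name="PersonID" DbType="Int NOT NULL IDENTITY" ...> -> PersonID Int NOT NULL IDENTITY
            var columns = table.Descendants(ns + "Column")
                .Select(c => $"    {c.Attribute("Name")?.Value} {c.Attribute("DbType")?.Value}");
            sql.AppendLine(string.Join("," + Environment.NewLine, columns));

            sql.AppendLine(");");
        }

        File.WriteAllText("schema.sql", sql.ToString());   // hypothetical output file
    }
}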

You can use the dbml2sql CLI tool.
dbml2sql schema.dbml -o schema.sql
https://www.dbml.org/cli/#convert-a-dbml-file-to-sql
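If you don't have the CLI yet, the linked docs install it as an npm package (so a Node.js environment is assumed):
npm install -g @dbml/cli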

Related

Liquibase - Generating Change Logs

I have an existing database already, with its tables and data inside.
jdbc:mysql://localhost:3306/testing
Now I want Liquibase to generate a changelog from this DB 'testing'. Is it possible?
This is my command, but it doesn't work.
liquibase --driver=com.mysql.jdbc.Driver --classpath=C:\mysql-connector-java-5.1.47.jar
--changeLogFile=C:\db.changelog.xml --url="jdbc:mysql://localhost:3306/testing"
--username=root generateChangeLog
I don't use any password.
The error is related to --changeLogFile=C:\db.changelog.xml
I thought Liquibase would refer to my DB 'testing' and generate a changelog named 'db.changelog.xml' in the C:\ folder.
Which part am I getting wrong? Am I missing something?
Or maybe Liquibase is not intended to generate a changelog from an existing DB?
Or is Liquibase only intended to generate a DB from a changelog, and not vice versa?
This is possible. You might be having trouble since you are writing to a file in the root of your c: drive. Try c:\temp\changelog instead.
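For example, the same command with the changelog written to a folder you can write to (the path is just an illustration):
liquibase --driver=com.mysql.jdbc.Driver --classpath=C:\mysql-connector-java-5.1.47.jar
--changeLogFile=C:\temp\db.changelog.xml --url="jdbc:mysql://localhost:3306/testing"
--username=root generateChangeLog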
My experience is that Liquibase works in increments. So if you run that command on your database, it will produce a changelog file as if everything in the database had to be created (as if you were starting with a completely empty database).
If you read the text on Liquibase's site regarding this command, it says:
When starting to use Liquibase on an existing database, it is often useful, particularly for testing, to have a way to generate the change log to create the current database schema.
This means that if you:
Execute this command once against your dev database
Run the result against a new database (let's say test)
Run it again on your dev database
Run that file against your test database
You will get a load of errors stating that functions already exist.
I gather that the idea behind this is that you create new entries in the changelog files and execute them against ALL your databases, instead of using other tools and using Liquibase only for the delta.
Sample entry
<changeSet author="liquibase-docs" id="addColumn-example">
<addColumn catalogName="cat" schemaName="public" tableName="YY">
<column name="xx" type="varchar(255)"/>
</addColumn>
</changeSet>
SQL equivalent
ALTER TABLE YY ADD COLUMN xx VARCHAR(255);

How to use the Biml OfflineSchema

I have defined a table in Biml that does not exist.
I then create the table using ExecuteSQL and want to load that table.
However, Biml will fail as it tries to query the non-existent table.
I have already provided the column mapping, so it doesn't need to query the table anymore.
This thread Create table before the dataflow in BIML mentions the offline schema, but there is not much content on how to use it.
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Metadata>
<OfflineSchema Name="OfflineSchema1">
<Items>
<Item Name="[Schema].[Table]" ConnectionName="ConnName">
<ExternalTableSource Table="[Schema].[Table]" />
</Item>
</Items>
</OfflineSchema>
</Metadata>
</Biml>

Liquibase data generateChangeLog with extra spaces

I am migrating data from MSSQL to Oracle. I generated the changelog for MSSQL by specifying the diff type data, and I got the following:
<insert tableName="TEST_TABLE">
<column name="test_column" value="TEST_value "/>
</insert>
But TEST_value contains extra trailing spaces which I don't have in my MSSQL database. The generated changelog has added extra spaces. Could anyone please help me understand why I got the extra spaces?

Connect databases to HSQLDB Server

I'm using hsqldb-1.8 in a stand-alone way. I want to start the server with some databases. I'd like to use this command line from the HSQLDB documentation:
java -cp ../lib/hsqldb.jar org.hsqldb.Server -database.0 file:mydb -dbname.0 xdb
My problem is that I have the script for each database I need, and this command line creates a new database without using my script. Maybe the problem comes from one of these points:
The location of my script
The extension of my script, which is a SQL file
Something wrong or missing in the command line
The command line can't do it => if so, is there any other way to do it?
I'd like to stay in the console for all of this; that way, I'll only have to launch a script to do the whole job. Any help will be much appreciated! Thanks
I found the solution to my issue :)
I've created a server.properties file located in the directory ../hsqldb-1.8.10/hsqldb/ which contains this:
server.database.0=file:mydb;user=test;password=test
server.dbname.0=mydb
I've also created the mydb.script file with this code in it:
CREATE SCHEMA PUBLIC AUTHORIZATION DBA
CREATE MEMORY TABLE MYDB(ID BIGINT NOT NULL,VERSION INTEGER NOT NULL,NOM VARCHAR(255))
CREATE USER TEST PASSWORD "TEST"
GRANT DBA TO TEST
SET WRITE_DELAY 10
SET SCHEMA PUBLIC
INSERT INTO MYDB VALUES(1,0,'test')
Then, I launch the HSQLDB Server with this command:
java -cp ../lib/hsqldb.jar org.hsqldb.Server
We can see that the database is successfully created:
[Server#10f0f6ac]: Database [index=0, id=0, db=file:mydb, alias=mydb] opened successfully in 313 ms.
To check if the database really contains my data, I use the HSQLDB DatabaseManager tool with this command:
java -cp ../lib/hsqldb.jar org.hsqldb.util.DatabaseManager
To connect:
URL : jdbc:hsqldb:file:mydb
User : test
Password : test
After that, we are connected to the database. Execute the command SELECT * FROM MYDB; and we can see the row in the database.
Hope that will help ! :)
The easiest way to connect with HSQLDB (in-memory DB)
Connection String
<property name="driverClassName" value="org.hsqldb.jdbcDriver" />
<property name="url" value="jdbc:hsqldb:file:/home/vikask/elmo/db/elmo;" />
<property name="username" value="sa" />
<property name="password" value="" />
This is a simple Java project to demonstrate Hibernate, HSQL and Maven using Java annotations. The HSQL database is used to keep the project simple, as we can use an in-memory database and only need a single JAR file included in the project.
To connect to an embedded HSQLDB database, select the JDBC (HSQLDB Embedded) connection type from the connection type list. Enter any login information if applicable, and then specify whether to use an existing embedded database, or to have HSQLDB create a new embedded database.
If the embedded database already exists, browse to the directory where the database files are located (such as database_name.log, database_name.script, and database_name.properties) and select the database_name.script file.
If the database does not exist, type in or browse to create a new location for the HSQLDB database. HSQLDB will then create the necessary files, prefixed with the database name typed in. For example, if you type /home/vikask/sample as the location of the database, HSQLDB will create a file called sample.properties, and perhaps sample.log, etc. The actual name of the database is simply sample in this case.
HSQLDB creates a file with the .script extension for its internal use. This is not something that you create.
First run the server and connect to one of the databases. The database will be empty at this point. Then use the SqlTool utility, which is in the HSQLDB zip package, to execute YOUR script against the database. All the tables and data created by your script are persisted in the HSQLDB database.
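A sketch of how that could look from the console, assuming the mydb alias from the answer above, a sqltool.rc file in your home directory with the entries below, and a hypothetical mydb-schema.sql holding your script:
urlid mydb
url jdbc:hsqldb:hsql://localhost/mydb
username test
password test
Then run the script against the running server:
java -cp ../lib/hsqldb.jar org.hsqldb.util.SqlTool mydb mydb-schema.sql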

Can Datanucleus SchemaTool create additional columns not bound to a PC but mapped in JDO ORM?

I have a User class (a PC) that currently has only one property: 'email'. Now I want the user table (where the User class is stored) to have additional columns which are not managed by JDO but are used by authentication happening outside of the PM. When I let DataNucleus auto-create the table and then ALTER the table to add my columns, everything works as expected.
Of course I would be happy to use SchemaTool for generation/update of the schema, yet I don't want a manual ALTER TABLE step on that user table. Naively, I've tried to put the two extra columns into the ORM file (omitting targets):
<package name="bo">
<class name="User" table="tb_user">
<column name="USER_SECURITY" jdbc-type="VARCHAR" length="64"/>
<column name="SEC_SALT" jdbc-type="VARCHAR" length="10"/>
</class>
</package>
but SchemaTool did not generate the extra columns, although the ORM file was loaded according to the logs.
BTW: I do not want those columns mapped and managed during the JDO lifecycle.
So, is it possible to get SchemaTool to generate extra columns on tables, or do I have to move them out into another table not managed by DataNucleus?
Thanks
The JDO spec defines exactly that ("unmapped" columns), as seen in the link below, and I have no problem with our tests using such unmapped columns in SchemaTool:
http://www.datanucleus.org/products/accessplatform_3_3/jdo/orm/schema_mapping.html#unmapped