Execute service builder generated sql file on postgresql - sql

I would like to execute the SQL files generated by Service Builder, but the problem is that the SQL files contain types like LONG, VARCHAR... etc.
Some of these types don't exist in PostgreSQL (for example, LONG would be bigint).
Is there a simple way to convert the SQL files' structures so they can be run on PostgreSQL?

Execute ant build-db on the plugin and you will find an sql folder with various vendor-specific scripts.

Daniele is right: using the build-db task is obviously correct and is the right way to do it.
But... I remember a similar situation some time ago: I had only the Liferay pseudo-SQL file and needed to create proper DDL. I managed to do it in the following way:
You need to have Liferay running on your desktop (or on the machine where the source SQL file is), as this operation requires the portal's Spring context to be fully wired.
Go to Configuration -> Server Administration -> Script
Change the language to Groovy
Run the following script:
import com.liferay.portal.kernel.dao.db.DB
import com.liferay.portal.kernel.dao.db.DBFactoryUtil
DB db = DBFactoryUtil.getDB(DB.TYPE_POSTGRESQL)
db.buildSQLFile("/path/to/folder/with/your/sql", "filename")
Here the first parameter is obviously the path and the second is the filename without the .sql extension. The file on disk should have the proper extension: it must be called filename.sql.
This will produce a tables folder next to your filename.sql, containing a single tables-postgresql.sql with your Postgres DDL.
As far as I remember, Service Builder uses the same method to generate database-specific code.
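If running the portal is not an option at all, a rough offline fallback is a hand-rolled type substitution. The mapping below is an assumption based on common Liferay-pseudo-type-to-PostgreSQL conversions; verify it against a tables-postgresql.sql actually produced by build-db before trusting it:

```python
import re

# Assumed mapping from Liferay pseudo-SQL column types to PostgreSQL types.
# Check these against real build-db output; they are not authoritative.
TYPE_MAP = {
    "LONG": "bigint",
    "DOUBLE": "double precision",
    "BOOLEAN": "bool",
    "DATE": "timestamp",
    "STRING": "varchar(75)",
    "TEXT": "text",
}

def to_postgres(line):
    # Replace whole-word pseudo-type tokens only, so identifiers such as
    # LONGITUDE are left untouched.
    pattern = re.compile(r"\b(" + "|".join(TYPE_MAP) + r")\b")
    return pattern.sub(lambda m: TYPE_MAP[m.group(1)], line)

print(to_postgres("userId LONG not null,"))  # userId bigint not null,
```

This handles column definitions only; index and constraint syntax differences would still need manual attention.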

Related

SQL Server - Copying data between tables where the Servers cannot be connected

We want some of our customers to be able to export some data into a file and then we have a job that imports that into a blank copy of a database at our location. Note: a DBA would not be involved. This would be a function within our application.
We can ignore table schema differences - they will match. We have different tables to deal with.
So on the customer side the function would run something like:
insert into myspecialstoragetable select * from source_table
insert into myspecialstoragetable select * from source_table_2
insert into myspecialstoragetable select * from source_table_3
I then run a select * from myspecialstoragetable and get a .sql file they can ship to me, which we can then import into our copy of the db with some job/SQL script.
I'm thinking we can use XML somehow, but I'm a little lost.
Thanks
Have you looked at the bulk copy utility bcp? You can wrap it with your own program to make it easier for less sophisticated users.
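As a sketch of wrapping bcp in your own program, the following builds the export command line. The table, file, and server names are placeholders, and the flags assume a trusted (Windows-authenticated) connection:

```python
import subprocess

def build_bcp_export(table, out_file, server):
    # -c: character (text) format, -T: trusted connection, -S: server name.
    return ["bcp", table, "out", out_file, "-c", "-T", "-S", server]

cmd = build_bcp_export("mydb.dbo.myspecialstoragetable", "export.dat", "CUSTOMERSQL")
# subprocess.run(cmd, check=True)  # run this only where bcp is installed
```

Wrapping it this way lets the application present a single "Export" button while bcp does the heavy lifting underneath.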
Since it is a function within your application, what language is the application front-end written in? If it is .NET, you can use Data Transformation Services (DTS) in SQL Server to do a sample export. In the last step, you can save the steps into a VB/.NET module. If necessary, modify this file to change table names etc., then integrate this DTS module into your application. While doing the sample export, export to a suitable format such as .CSV or Excel - whichever format you will be able to import into a blank database from.
Every time the user wants to do an export, he will have to click a button that invokes the DTS module integrated into your application, which will dump the data to the desired format. He can then mail such a file to you.
If your application is not written in .NET, whichever language it is written in will have options to read data from SQL Server and dump it to a .CSV or delimited text file. If it is a primitive language, you may have to do it by looping through the records, concatenating the fields of each record, and writing them to a file.
XML would be too far-fetched for this, though it's not impossible. At your end, you would need the ability to parse the XML file and import it at your location. Also, XML is not really suited if the number of records is too large.
You are probably thinking of a .sql file as in MySQL. In SQL Server, the .sql files generated by the 'Generate Scripts' function of SQL Server's interface are used for table structures/DDL rather than for generating insert statements with each record's hard values.
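The loop-and-write approach described above can be sketched as follows. An in-memory SQLite table stands in for the real SQL Server source; the real application would read from its own database connection instead:

```python
import csv, io, sqlite3

# Stand-in for the customer-side source table; in production this would be
# a connection to the real SQL Server database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_table (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO source_table VALUES (?, ?)", [(1, "a"), (2, "b")])

buf = io.StringIO()  # in the real job this would be the export file
writer = csv.writer(buf)
for row in conn.execute("SELECT * FROM source_table"):
    writer.writerow(row)  # one delimited line per record

print(buf.getvalue(), end="")
```

The resulting delimited file is what the customer would ship over, to be bulk-loaded into the blank copy at the other end.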

Import Oracle User Schema

I've got an Oracle database with several users (Other Users?), and I would like to import a schema which is in an .sql file.
My question is how to specify in my .sql file that the import is for a specific user.
Thank you in advance.
Examine your sql file. If the commands in there specify a schema name, then you'll need to modify it before you can import it into a different schema.
For example, does it have commands like this:
CREATE TABLE scott.mytable (...)
or like:
CREATE TABLE mytable (...)
If the schema name (e.g. "scott") has been hard-coded, then you'll need to edit your sql script to carefully remove it.
If not, then you just need to log in as the target username and run your sql script.
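If the prefix does have to come out, a scripted search-and-replace can help. This is a minimal sketch (the schema name here is the "scott" example from above), and the output should still be reviewed by hand, because a blind replace can also hit string literals and comments:

```python
import re

def strip_schema(sql, schema):
    # Remove a hard-coded "schema." prefix wherever it appears as a whole
    # word, case-insensitively, so the script runs in the current schema.
    return re.sub(r"\b" + re.escape(schema) + r"\.", "", sql, flags=re.IGNORECASE)

print(strip_schema("CREATE TABLE scott.mytable (id NUMBER)", "scott"))
# CREATE TABLE mytable (id NUMBER)
```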
That depends on the content of your SQL file. You're not doing an import; you are running an SQL file, and that is a bit like "running a script": it can contain anything. So it's hard for us to tell you from here how to run a file whose content we have no clue about. There are many ways of defining the owner of an object; it can be done explicitly or implicitly. So that's the first thing to check: is a user (schema) specified IN the script? If it is, where is it specified, and how?
In the simplest case, people just write a script that connects and installs objects in the current schema - sometimes even without the connect. In that case, you can run the script as whichever user you want the objects to be created in.
At the other extreme, you can have a script where a given owner is specified at each object reference. In that case, you'll probably end up doing a global search and replace.
So let us know how your script works, and we can go into detail.

How to generate the change log in SQL format to create the current database schema?

When starting to use Liquibase on an existing database, it is often useful, particularly for testing, to have a way to generate the change log that creates the current database schema. Liquibase lets you do this with the generateChangeLog command-line command. However, this command generates the database change log in XML format only.
So how can the change log be generated in SQL format to create the current database schema? Is there any way to convert the XML change log to SQL format? If not, is there an extension point in the Liquibase API to add this feature?
There is currently no support for generating SQL, but you can write an extension to do it. The *Serializer classes, like liquibase.serializer.core.xml.XMLChangeLogSerializer, take a changelog object and output it in whatever format you want.
With something like a FormattedSqlChangeLogSerializer that overrides getValidFileExtensions() to return new String[]{"xml"}, you can just run generateChangeLog with outputFile=some.file.sql and get the SQL generated by your custom serializer class.
The extension system will allow you to create this class in a custom jar.
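For very simple change logs, a rough conversion can also be done outside Liquibase entirely. The sketch below only handles createTable change sets and is no substitute for the serializer extension described above:

```python
import xml.etree.ElementTree as ET

# Liquibase changelog XML uses a default namespace; ElementTree needs it
# spelled out when matching tags.
NS = "{http://www.liquibase.org/xml/ns/dbchangelog}"

def changelog_to_sql(xml_text):
    # Render each createTable change set as a CREATE TABLE statement.
    # Other change types (addColumn, createIndex, ...) are ignored here.
    statements = []
    root = ET.fromstring(xml_text)
    for table in root.iter(NS + "createTable"):
        cols = ", ".join(
            f"{c.get('name')} {c.get('type')}" for c in table.findall(NS + "column")
        )
        statements.append(f"CREATE TABLE {table.get('tableName')} ({cols});")
    return "\n".join(statements)

sample = """<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog">
  <changeSet id="1" author="gen">
    <createTable tableName="person">
      <column name="id" type="int"/>
      <column name="name" type="varchar(255)"/>
    </createTable>
  </changeSet>
</databaseChangeLog>"""
print(changelog_to_sql(sample))
```

Anything beyond trivial schemas (constraints, defaults, data types that differ per database) is exactly why the proper route is a Liquibase serializer extension.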

Include SQL file to another SQL file

I have a specific SQL file that may be "connected" to another, more generic SQL init file.
Is it possible to somehow include a reference from one SQL file to another SQL file?
I am using Oracle, and the DB is populated using Spring's DataSourceInitializer class.
If you are using SQL*Plus to run your script, you can use the @ (or @@) sign to include another SQL script.
See the manual for details:
http://download.oracle.com/docs/cd/B19306_01/server.102/b14357/ch12002.htm#i2696724
http://download.oracle.com/docs/cd/B19306_01/server.102/b14357/ch12003.htm#i2696759
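Note that Spring's DataSourceInitializer will not resolve SQL*Plus-style @ references itself, so one workaround is to inline them into a single script beforehand. A minimal sketch; nested includes and SQL*Plus path-resolution rules are not handled:

```python
def inline_includes(path):
    # Expand lines of the form "@other.sql" (or "@@other.sql") by splicing
    # in the referenced file's contents; copy every other line through.
    parts = []
    with open(path) as f:
        for line in f:
            if line.strip().startswith("@"):
                included = line.strip().lstrip("@")
                with open(included) as inc:
                    parts.append(inc.read())
            else:
                parts.append(line)
    return "".join(parts)
```

The flattened result can then be handed to DataSourceInitializer as one ordinary script.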

Using wix3 SqlScript to run generated temporary sql-script files

I am starting to write an installer which will use the SqlScript element.
That element takes a reference to the Binary table for the script to run.
I would like to generate the script dynamically during the installation.
I can see three possibilities:
1. Somehow get SqlScript to read its data from a file rather than a Binary entry.
2. Inject my generated script into the Binary table.
3. Use SqlString, which would mean placing some rather long strings into Properties, but I guess that shouldn't really be a problem.
Any advice?
Regards
Leif
(My reason, should anyone be interested, is that the database should have a job set up that calls an installed exe file. I prefer to create the job using a SQL script, and the path of that file is not known until InstallDir has been chosen.)
The way this is typically handled is to have the static stuff in SqlScript and use SqlString (which can contain formatted Properties) to execute the dynamic stuff. You can interleave the two with careful use of the Sequence attribute.
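A sketch of that shape in WiX 3 authoring follows. The database, binary, table, and file names are all placeholders, and it assumes the WixSqlExtension namespace (xmlns:sql="http://schemas.microsoft.com/wix/SqlExtension") is declared on the Wix element:

```xml
<sql:SqlDatabase Id="JobDb" Database="MyAppDb" Server="[DATABASESERVER]" />

<!-- Static part: a fixed script stored in the Binary table. -->
<sql:SqlScript Id="StaticSetup" SqlDb="JobDb" BinaryKey="StaticSql"
               ExecuteOnInstall="yes" Sequence="1" />

<!-- Dynamic part: SqlString is a Formatted field, so [INSTALLDIR] is
     resolved at install time. The job_config table is hypothetical. -->
<sql:SqlString Id="RecordToolPath" SqlDb="JobDb" ExecuteOnInstall="yes"
               Sequence="2"
               SQL="INSERT INTO job_config (tool_path) VALUES (N'[INSTALLDIR]MyTool.exe')" />
```

The Sequence attributes are what let the dynamic SqlString run after (or between) the static SqlScript steps.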