Include one SQL file in another SQL file - sql

I have a specific SQL file that may be "connected" to another, more generic SQL init file.
Is it possible to somehow reference one SQL file from another?
I am using Oracle, and the DB is populated using Spring's DataSourceInitializer class.

If you are using SQL*Plus to run your script, you can use the @ (or @@) sign to include another SQL script.
See the manual for details:
http://download.oracle.com/docs/cd/B19306_01/server.102/b14357/ch12002.htm#i2696724
http://download.oracle.com/docs/cd/B19306_01/server.102/b14357/ch12003.htm#i2696759
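For example (the file names here are invented for illustration), a driver script run with SQL*Plus can pull in the generic init script like this:
-- main.sql: run as "sqlplus user/pwd @main.sql"
@generic_init.sql        -- @ resolves the file against the current directory (and SQLPATH)
@@table_data.sql         -- @@ resolves the file relative to the directory of the calling script
Note that this is a SQL*Plus client feature; a plain JDBC-based runner such as Spring's DataSourceInitializer will not understand @.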

Related

executing a common sql file using liquibase

I have a situation to handle: my Liquibase project is structured as per the recommended best practices, and I have the changelog XML organized as given below.
Master XML
-->Release XML
-->Feature XML
-->changelog XML
In our application group, we run updateSQL to generate the consolidated SQL file and get the changes executed through our DBA group.
However, the real problem I have is executing a common set of SQL statements during every iteration, such as
ALTER SESSION SET CURRENT_SCHEMA=APPLNSCHEMA
since the DBA executes the changes as SYSTEM while the target schema is APPLNSCHEMA.
How can I include such common, repeating statements in the Liquibase changelog?
You would be able to write an extension (http://liquibase.org/extensions) that injects it in. If you need to do it per changeLog, it may work best to extend XMLChangeLogParser to automatically create and add a new changeSet that runs the needed SQL.
You could make a changeSet with the attribute 'runAlways' set to true and include the SQL.
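For example, here is a minimal sketch of such a changeSet written as a Liquibase formatted SQL changelog (the author/id are made up, and this assumes your Liquibase version can include .sql changelogs from the XML master):
--liquibase formatted sql
--changeset appgroup:set-current-schema runAlways:true
ALTER SESSION SET CURRENT_SCHEMA=APPLNSCHEMA;
Including it at (or near) the top of the master changelog makes updateSQL emit the ALTER SESSION before the release changes on every run.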
As far as I know, there isn't a way to have Liquibase itself do this. I suggest that you wrap Liquibase with your favorite scripting language such that you run a command "generateSQLforThoseCrazyDBAs" that runs Liquibase and then prepends the SQL you need to the output created by Liquibase.

Execute Service Builder-generated SQL file on PostgreSQL

I would like to execute the SQL files generated by Service Builder, but the problem is that these files contain types like LONG, VARCHAR, etc.
Some of these types don't exist in PostgreSQL (for example, LONG should be BIGINT).
Is there a simple way to convert the SQL files' structure so that they can be run on PostgreSQL?
Execute ant build-db on the plugin and you will find an sql folder with various vendor-specific scripts.
Daniele is right: using the build-db task is obviously correct and is the right way to do it.
But... I remember a similar situation some time ago, when I had only the Liferay pseudo-SQL file and needed to create proper DDL. I managed to do it in the following way:
You need to have Liferay running on your desktop (or on the machine where the source SQL file is), as this operation requires the portal's Spring context to be fully wired.
Go to Configuration -> Server Administration -> Script
Change the language to Groovy
Run the following script:
import com.liferay.portal.kernel.dao.db.DB
import com.liferay.portal.kernel.dao.db.DBFactoryUtil
DB db = DBFactoryUtil.getDB(DB.TYPE_POSTGRESQL)
db.buildSQLFile("/path/to/folder/with/your/sql", "filename")
The first parameter is obviously the path and the second is the file name without the .sql extension. The file on disk must have the proper extension, i.e. it must be called filename.sql.
This will produce a tables folder next to your filename.sql, containing a single tables-postgresql.sql file with your Postgres DDL.
As far as I remember, Service Builder uses the same method to generate database-specific code.
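To illustrate what the conversion does (the table and columns here are invented), a line of Liferay's vendor-neutral pseudo-SQL such as
create table Foo_Entry (entryId LONG not null primary key, name VARCHAR(75) null);
comes out in tables-postgresql.sql with PostgreSQL-specific types, roughly as
create table Foo_Entry (entryId bigint not null primary key, name varchar(75));
which is exactly the LONG-to-BIGINT mapping asked about in the question.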

How to generate the change log in SQL format to create the current database schema?

When starting to use Liquibase on an existing database, it is often useful, particularly for testing, to have a way to generate the change log that would create the current database schema. Liquibase allows you to do this with the "generateChangeLog" command-line command. However, this command generates the database change log in XML format only.
So how can I generate the change log in SQL format to create the current database schema? Is there any way to convert the database change log from XML format to SQL format? If not, is there any extension point in the Liquibase API to add this feature?
There is no support currently to generate SQL, but you would be able to write an extension to do it. The *Serializer classes like liquibase.serializer.core.xml.XMLChangeLogSerializer take a changelog object and output into whatever format you want.
With something like a FormattedSqlChangeLogSerializer that overrides getValidFileExtensions() to return new String[]{"sql"}, you can just run generateChangeLog with outputFile=some.file.sql and get the SQL produced by your custom serializer class.
The extension system will allow you to create this class in a custom jar.

How to Export data to Excel in SQL Server using SQL Jobs

I need to export the data from a particular table in my database to Excel files (.xls/.xlsx) that will be located in a shared folder on my network. The situation is as follows:
1. I need to use SQL Server Agent jobs.
2. I need to generate a new Excel file every 2 minutes that will contain the refreshed data.
I am using SQL Server 2008, which doesn't include BI Development Studio, and I'm clueless about how to solve this. First, I'm not sure how to export the data using jobs, because every approach I tried had issues with the OLEDB connection. 'sp_makewebtask' is also not available in SQL Server 2008. And I'm also confused about how to dynamically generate the names of the files.
Any reference or solution will be helpful.
Follow the steps given below:
1) Make a stored procedure that creates a temporary table and inserts records into it.
2) Make a stored procedure that reads records from that temporary table and writes them to a file (a rough sketch of this step follows the list). You can use this link: click here
3) Create a SQL job that executes step 1 and step 2 sequentially.
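As a rough sketch of step 2 (the share path, database, table, and delimiter are all placeholders, and xp_cmdshell must be enabled on the instance), you could shell out to bcp and build a time-stamped file name; note that bcp writes a delimited text file that Excel can open, not a native .xls workbook:
DECLARE @fileName varchar(260), @cmd varchar(1000);
-- e.g. \\myserver\share\Export_20240101_1200.csv (the timestamp keeps each 2-minute run distinct)
SET @fileName = '\\myserver\share\Export_'
              + REPLACE(REPLACE(REPLACE(CONVERT(varchar(16), GETDATE(), 120), '-', ''), ' ', '_'), ':', '')
              + '.csv';
SET @cmd = 'bcp "SELECT * FROM MyDb.dbo.MyTable" queryout "' + @fileName + '" -c -t, -T -S ' + @@SERVERNAME;
EXEC master..xp_cmdshell @cmd;   -- runs bcp under the SQL Server service account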
I found a better way out. I created an SSIS (SQL Server Integration Services) package to automate the whole export-to-Excel task, then deployed that package using a SQL Server Agent job. This is a neater and cleaner solution, as I found.
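If it helps, the every-2-minutes schedule itself can be created in T-SQL (the job, step, and package path below are placeholders; the step assumes the package is deployed as a .dtsx file that SQL Server Agent can reach):
USE msdb;
EXEC dbo.sp_add_job         @job_name = N'ExportToExcel';
EXEC dbo.sp_add_jobstep     @job_name = N'ExportToExcel',
                            @step_name = N'Run SSIS export package',
                            @subsystem = N'SSIS',
                            @command   = N'/FILE "\\myserver\share\ExportToExcel.dtsx"';
EXEC dbo.sp_add_jobschedule @job_name = N'ExportToExcel',
                            @name = N'Every 2 minutes',
                            @freq_type = 4,             -- daily
                            @freq_interval = 1,
                            @freq_subday_type = 4,      -- subday units are minutes
                            @freq_subday_interval = 2;  -- every 2 minutes
EXEC dbo.sp_add_jobserver   @job_name = N'ExportToExcel';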

Can I retrieve config values for SSIS from XML in a table?

My current client stores all of their configuration information for the enterprise applications in a single table that holds XML. They then use a custom built front end to maintain the configuration values.
I'm writing a fairly straight-forward import process for them using SSIS. I need to make the connection strings and some other information configurable and they want me to use their table. It seems like SSIS expects a file though. Is there any way that I can point SSIS to retrieve its configuration values from an XML stream instead of a path to a file?
The configuration table that they use does not match the structure of a standard SSIS configuration table that you would get using SQL Server as a configuration source with the standard wizard.
Thanks for any advice!
You can retrieve the values from the table, put them in variables, and, using a script, transfer the variable values into the SSIS parameters.
Having the XML formatted just like the SSIS XML file is a huge bonus, though.
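For example (the table name, XML column, and XPath below are pure guesses at their schema, and assume the column is of type xml), an Execute SQL Task could shred the stored XML into a one-row result set whose columns you map to package variables:
SELECT
    ConfigXml.value('(/config/connectionString)[1]', 'nvarchar(4000)') AS ConnString,
    ConfigXml.value('(/config/inputFolder)[1]',      'nvarchar(4000)') AS InputFolder
FROM dbo.AppConfig                      -- placeholder for the client's config table
WHERE ApplicationName = 'MyImport';     -- placeholder filter
A Script Task can then copy those variables onto connection managers or other properties at run time, as described above.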
Is there a way to put a trigger on their table to update the SQL Server config table or create a new XML document any time an SSIS configuration is inserted or updated? Then you could use what you need, they could do what they need, and all would be happy.
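A very rough sketch of that trigger idea (dbo.AppConfig, its columns, and the XPath are assumptions; [SSIS Configurations] is the table layout the SQL Server configuration wizard creates):
CREATE TRIGGER trg_SyncSsisConfig ON dbo.AppConfig   -- placeholder name for their config table
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- copy the value SSIS needs out of the XML into the standard SSIS configuration table
    UPDATE c
    SET    c.ConfiguredValue = i.ConfigXml.value('(/config/connectionString)[1]', 'nvarchar(255)')
    FROM   dbo.[SSIS Configurations] c
    JOIN   inserted i
           ON c.ConfigurationFilter = i.ApplicationName;  -- the join key is an assumption
END;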