Liferay ServiceBuilder doesn't alter tables - sql

Short story
When I modify the column widths (VARCHAR(4000)) in the tables.sql generated by Service Builder, redeploying the portlet does not cause Liferay to alter the DB tables. How can I make sure that the column widths get expanded?
Long story
I have to make some changes to a Liferay 6.1.20 EE GA2 project developed by another contractor. The project uses Maven as its build tool.
After adding some columns to the service.xml and running mvn liferay:build-service, I noticed that the portlet-model-hints.xml got overridden (see https://issues.liferay.com/browse/MAVEN-37) and reset to the default column widths.
There's a lot of data in the tables (the portlet is running in production), so I cannot simply drop and recreate the tables.
So I manually modified the column widths in the generated tables.sql and redeployed the portlet. The new columns are now present in the DB tables, but the column widths were not altered.
Does Liferay alter column widths on redeploy, or do I have to fire some SQL statements against the database manually?
(We are working with an Oracle 10g database.)

If you want to change the column widths, you need to set them in portlet-model-hints.xml.
For instance, to widen a String field to 255 characters, add a max-length hint to that field (the field name here is just an example): <field name="description" type="String"><hint name="max-length">255</hint></field>. It's important to run build-service again after that change.

ServiceBuilder doesn't do ALTER TABLE by itself - you'll have to write an UpgradeProcess for this yourself. Check this blog post or the underlying documentation.
In short: the only update that can always be done automatically is of the "DROP TABLE - CREATE TABLE" type, but, as you say, this is typically not desirable. Anything fancier needs to be done manually, and that's exactly what this mechanism is for.
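For the width change itself, the statement that such an upgrade process (or a DBA, if you run it by hand) has to execute against Oracle looks roughly like this; the table and column names below are placeholders, not taken from the actual project:
ALTER TABLE FOO_ENTRY MODIFY (DESCRIPTION VARCHAR2(4000));
Inside a custom UpgradeProcess this is typically the string you would pass to runSQL(), so it runs once during deployment instead of being fired manually.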

Related

Track Structure Changes in a database

I am using SQL Server 2012. I want to track the schema changes made in a database. For example, when a new column is added to a table (whether from the designer or a script), a script should be generated like 'alter table tbl1 add col1 int', and so on.
I got as far as the Schema Changes History report, but it doesn't provide the information I need: it only shows the table changed, the change type, the date and time, etc., but not the script that was run to make the change.
Well, I got to the solution via the link below. It is very easy to enable:
https://www.mssqltips.com/sqlservertip/1723/auto-generate-change-scripts-in-sql-server-management-studio-ssms-for-tables/
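If you also want the server itself to record the exact DDL that was executed (regardless of whether the change came from the designer or a script), a database-level DDL trigger is another common option. This is only a rough sketch; dbo.SchemaChangeLog is a made-up logging table, not something the built-in report uses:
CREATE TABLE dbo.SchemaChangeLog (
    LogId INT IDENTITY(1,1) PRIMARY KEY,
    EventTime DATETIME2 NOT NULL DEFAULT SYSDATETIME(),
    EventType NVARCHAR(100),
    ObjectName NVARCHAR(256),
    SqlText NVARCHAR(MAX)
);
GO
CREATE TRIGGER trgLogSchemaChanges
ON DATABASE
FOR CREATE_TABLE, ALTER_TABLE, DROP_TABLE
AS
BEGIN
    DECLARE @e XML = EVENTDATA();
    INSERT INTO dbo.SchemaChangeLog (EventType, ObjectName, SqlText)
    VALUES (
        @e.value('(/EVENT_INSTANCE/EventType)[1]', 'NVARCHAR(100)'),
        @e.value('(/EVENT_INSTANCE/ObjectName)[1]', 'NVARCHAR(256)'),
        @e.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'NVARCHAR(MAX)')
    );
END;
GO
Every schema change in the database then ends up in that table together with the statement text that caused it.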

Can I exclude a custom schema from a Schema comparison in SSDT?

We have a SQL Server database that is very dynamic: it is always creating new tables and dropping existing ones in a custom schema called 'temp' (we have a dbo schema and a temp schema). We also use SSDT to maintain and monitor changes in our schema, but we are unable to use the update feature of a schema comparison: if a new table (say temp.MyTable) is created after the comparison is made and before the update is attempted, SSDT invalidates the comparison because something has changed. At the moment, our only workaround is to run the schema comparisons around midnight when system activity is practically non-existent, which is not ideal for the person who has to do them.
My question is: is there a way to exclude tables that are part of the 'temp' schema from the schema comparison?
How are you doing the deployment? As a test I used sqlpackage.exe to publish a dacpac while constantly creating new tables, and it deployed without complaining.
However, there are a couple of things you can do. The first is to stop the deployment from blocking when drift is detected:
/p:BlockWhenDriftDetected=False
This is set to true by default.
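For example, a publish command with that property set might look like this (the server, database and dacpac names are placeholders):
sqlpackage.exe /Action:Publish /SourceFile:MyDb.dacpac /TargetServerName:MyServer /TargetDatabaseName:MyDb /p:BlockWhenDriftDetected=False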
The second is to ignore the temp schema. I don't think this will help unless you also disable the drift check, but you might want to use this deployment filter to stop all changes to the temp schema:
http://agilesqlclub.codeplex.com/
Ed

How To Edit .sdf file

My application installs a .sdf file, and I have added columns to my database. How do I update the deployed database without losing my data?
You can execute standard data definition language (DDL) commands in code at application startup.
For example:
myCommand.CommandText = "ALTER TABLE MyTable ADD NewColumn1 INT NULL"
myCommand.ExecuteNonQuery()
We have done this for years in devices ranging from PocketPC to Tablets.
We used to check whether a database table or column existed before modifying the DB structure, but we have found it much easier to record the current schema version in a table. On startup we check that version and apply only the modifications needed to get from the version stored in the database to the version the application expects.
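A minimal sketch of that version-table pattern in plain SQL (the table and column names are made up, and SQL Server Compact only supports a subset of T-SQL, so the statements are kept simple):
CREATE TABLE SchemaVersion (Version INT NOT NULL);
INSERT INTO SchemaVersion (Version) VALUES (1);
-- On startup, read the version; if it is still 1, apply the step below and bump it:
ALTER TABLE MyTable ADD NewColumn1 INT NULL;
UPDATE SchemaVersion SET Version = 2;
The application then only ever runs the steps between the version stored in the database and the version it ships with.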
You can extend/update a database schema by choosing default values for new columns or by allowing new columns to accept NULLs.

Doctrine schema changes while keeping data?

We're developing a Doctrine backed website using YAML to define our schema. Our schema changes regularly (including fk relations) so we need to do a lot of:
Doctrine::generateModelsFromYaml(APPPATH . 'models/yaml', APPPATH . 'models', array('generateTableClasses' => true));
Doctrine::dropDatabases();
Doctrine::createDatabases();
Doctrine::createTablesFromModels();
We would like to keep existing data and store it back in the re-created database. So I copy the data into a temporary database before the main db is dropped.
How do I get the data from the "old-scheme DB copy" to the "new-scheme DB"? (the new scheme only contains NEW columns, NO COLUMNS ARE REMOVED)
NOTE:
This obviously doesn't work because the column count doesn't match.
INSERT INTO newscheme.Table SELECT * FROM copy.Table;
This approach does work, but it takes too much time to write out for every table:
INSERT INTO newscheme.Table SELECT old.col, old.col2, old.col3, 'somenewdefaultvalue' FROM copy.Table AS old;
Have you looked into Migrations? They allow you to alter your database schema in a programmatic way, without losing data (unless you remove columns, of course).
How about writing a script (using the Doctrine classes for example) which parses the yaml schema files (both the previous version and the "next" version) and generates the sql scripts to run? It would be a one-time job and not require that much work. The benefit of generating manual migration scripts is that you can easily store them in the version control system and replay version steps later on. If that's not something you need, you can just gather up changes in the code and do it directly through the database driver.
Of course, the fancier your schema changes become, the harder the maintenance will get, e.g. column name changes, NULL to NOT NULL, etc.
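If the databases are MySQL, one way to avoid hand-writing the column lists is to generate the copy statements from information_schema. A rough sketch, assuming the old data lives in a schema named copy and the recreated one is called newscheme:
SELECT CONCAT(
    'INSERT INTO newscheme.', table_name,
    ' (', GROUP_CONCAT(column_name ORDER BY ordinal_position), ') ',
    'SELECT ', GROUP_CONCAT(column_name ORDER BY ordinal_position),
    ' FROM copy.', table_name, ';'
) AS copy_statement
FROM information_schema.columns
WHERE table_schema = 'copy'
GROUP BY table_name;
Each generated statement copies only the columns that existed in the old schema; the new columns are left to their defaults, so they must either allow NULL or have a default value.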

Applying changes easily in Access Database

I have a backup of a live database (a copy of an ACCDB-format Access database) in which I've worked, adding new fields to existing tables as well as whole new tables.
How do I extract these changes and apply them quickly to the running database?
In MS SQL Server, I'd right-click > Script Table As > Alter To, save the query and run it wherever I like. Is there an equally easy way to do that in an Access database?
Details:
It's an ACCDB MS Access database created in Access 2007 and copied and edited in Access 2007. I need some "alter" scripts to run against the other database so that it gets all the new columns and tables I've created in my copy.
For new tables, just import them from one database into the other. In the "External Data" section of the ribbon, choose the Access icon above "Import". That choice starts an import wizard to allow you to select which objects you want imported. You will have a choice to import just the table structure, or both structure and data.
Remou is right that you can use DDL ALTER TABLE statements to add new columns. However, DDL might not support every feature you want for your new columns. And if you want not just the empty columns added but also data in those new columns, you will probably need to run UPDATE statements to fill them.
As far as "Script Table As", see if OmBelt's Export Table to SQL tool for MS Access can do what you want.
Edit: Allen Browne has sample ALTER TABLE statements. See CreateFieldDDL and the following one, CreateFieldDDL2.
You can run DDL in Access. I think it would be easiest to run the SQL with VBA, in this case.
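For example, the kind of DDL those answers refer to could look like this in Access SQL (the table and field names are only placeholders); from VBA, each statement can be passed to CurrentDb.Execute:
ALTER TABLE Customers ADD COLUMN Notes TEXT(255)
ALTER TABLE Customers ADD COLUMN CreatedOn DATETIME
CREATE TABLE AuditLog (AuditId COUNTER CONSTRAINT PK_AuditLog PRIMARY KEY, ChangedOn DATETIME, Details TEXT(255))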
There is a product called DbWeigher that can compare Access database schemas and synchronize them. You can get a free trial (30 days). DbWeigher will write a script of all schema differences and write it out as DDL. The script is thorough and includes relationships, indexes, validation rules, allow zero length, etc.
A free tool from the same developer, DBWConsole, will let you execute a DDL script against any Access database. If you wrote your own DDL scripts this would be an easy way to apply the changes to your live database. It even handles some DDL that I don't know how to process in VBA (so it must be magic). DBWConsole is included if you downloaded the trial version of DBWeigher. Be aware that you can't make schema changes to a table in a shared Access database if anyone has the table open.
DbWeigher creates a script of all differences between the two files. It can be a lot to manually parse through if you just want a few of the changes. I built a parser for DbWeigher script files so they could be filtered by table, to extract just the parts I wanted. I contacted the DbWeigher author about it but never heard back. It's safe to say that I have no affiliation with this developer.