NHibernate "View or function 'x' is not updatable..." Error - nhibernate

Using NHibernate 2.1.2.4000, I'm getting the following error when running Session.Flush() while saving another table, for a view whose data I have not modified:
System.Data.SqlClient.SqlException: View or function 'x' is not updatable because the modification affects multiple base tables.

The problem was that NHibernate's model for the view mapped columns that are nullable in the view to non-nullable int properties. NHibernate was automatically converting the null values to 0s and then, at Session.Flush(), trying to update the nulls in the database to 0s.
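One way to spot the mismatch is to list which of the view's columns are actually nullable and compare that against the mapping; the corresponding properties should be nullable (e.g. int? rather than int). A minimal sketch against SQL Server's catalog views, where 'x' stands in for the view name from the error message:
-- List the nullable columns of view 'x' so the NHibernate mapping can be checked against them.
SELECT c.name AS column_name, c.is_nullable
FROM sys.columns AS c
JOIN sys.views AS v ON v.object_id = c.object_id
WHERE v.name = 'x'
ORDER BY c.column_id;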

Related

how to remove Error for schema bindings in redshift

I want to be able to use a CTE to make the SQL below work, but I am getting the error
ERROR: Cannot replace a normal view with a late binding view
Is there any way I could change it so that it doesn't bind to the schema views?
CREATE OR REPLACE
VIEW "dev"."XXBRK_DAILY_FX_RATES" ("F_C", "CURRENCY", "C_D", "C_R") AS
SELECT DISTINCT GL.GL_R.F_C, GL.GL_R.CURRENCY,
GL.GL_R.DATE, GL.GL_R.C_R
FROM GL.GL_R
with no schema binding;
WHERE GL.GL_R.C_T='Corporate'
UNION ALL
SELECT DISTINCT GL.GL_R.F_C, GL.GL_R.F_C CURRENCY, GL.GL_R.DATE, 1
FROM GL.GL_R;
So you seem to have a statement issue. The last 4 lines are after the ';' and not part of the statement being run. I'm guessing that these are extraneous and were posted by mistake.
Views on Redshift come in several types - normal and late binding are two of them. The view "dev"."XXBRK_DAILY_FX_RATES" seems to already exist in your cluster, so your command is trying to replace it, not create it. The error message is correct: you cannot replace a view with a view of a different type. You need to drop the view, then recreate it as late binding.
Now be careful, as other objects dependent on this view will be impacted when you drop it (especially if you CASCADE the drop). Dropping and recreating the view makes a new object in the database, whereas replacing a view just makes a new definition for the same object. Understand the impact of the drop on your database before you execute it.
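As a sketch of the drop-and-recreate, reusing the SQL from the question: WITH NO SCHEMA BINDING goes at the very end of the statement, after the full SELECT, and late binding views require schema-qualified table names (which GL.GL_R already is).
DROP VIEW "dev"."XXBRK_DAILY_FX_RATES";   -- add CASCADE if dependent objects must be dropped too
CREATE VIEW "dev"."XXBRK_DAILY_FX_RATES" ("F_C", "CURRENCY", "C_D", "C_R") AS
SELECT DISTINCT GL.GL_R.F_C, GL.GL_R.CURRENCY, GL.GL_R.DATE, GL.GL_R.C_R
FROM GL.GL_R
WHERE GL.GL_R.C_T = 'Corporate'
UNION ALL
SELECT DISTINCT GL.GL_R.F_C, GL.GL_R.F_C AS CURRENCY, GL.GL_R.DATE, 1
FROM GL.GL_R
WITH NO SCHEMA BINDING;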

Hibernate + DB2 trigger - SQLCODE:-746

I have a DB2 database with some legacy triggers and I am moving from JDBC to Hibernate. My problem is that I am stuck with an error thrown by DB2 while storing data to the DB through Hibernate.
The error -746 says:
If a table is being modified (by INSERT, DELETE, UPDATE, or MERGE),
the table can not be accessed by the lower level nesting SQL
statement.
If any table is being accessed by a SELECT statement, no table can be
modified (by INSERT, DELETE, UPDATE, or MERGE) in any lower level
nesting SQL statement.
In my case I am trying to save an entity (which is owned by another entity already saved in this transaction), but there is a before-insert trigger in the DB which checks a constraint (like "is this instance the only one that has the flag set to true?"). When this trigger executes, the error is thrown.
I already had a similar error while saving another entity with another trigger, and I worked around it by storing the problematic entity via JDBC, loading it with Hibernate, and then saving the rest of the entities. But this approach seems a bit cumbersome to me. Is there a way to resolve this problem? And why exactly does this not work through Hibernate when it does through JDBC? In both cases it is in one transaction. I tried to flush the data before storing the problematic entity, but it did not help.

Views are failing when their underlying table is repopulated

I am creating a view on a BigQuery table. The view works fine until I truncate that table and put new data in it. I get the following error:
"Query Failed Error: A view from this query references an old version of
a table that might be incompatible. Please delete and re-create
repcore-dev:views.listings."
If I run the query that defines the view, it works. If I re-create the view with the same query, it works. But my question is why this is necessary. For us the real value in the view is that it provides an abstraction of the table data (even when that data is re-populated).
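For what it's worth, the re-create can at least be scripted so that it runs as part of the repopulation step. A minimal sketch using BigQuery's standard SQL DDL; the SELECT is a placeholder standing in for the view's real defining query:
-- Re-create the view over the repopulated table (placeholder defining query).
CREATE OR REPLACE VIEW `repcore-dev.views.listings` AS
SELECT *
FROM `repcore-dev.some_dataset.listings_base`;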

ColdFusion ORM how to update without dropping?

I have a persistent entity in ColdFusion and I need to update a property
property name="createdDateTime" ormtype="date";
to
property name="createdDateTime" ormtype="timestamp";
Before, I used to delete the table and then reload the ORM. However, now I have data in my table and I cannot just delete it. Is there any way I can update this field in ORM without dropping the whole table?
Thanks
Yes, you should be able to just change the property and do ormReload(). Try it in a test environment first, but the ormtype is not directly tied to the database type.
In your Application.cfc:
this.ormSettings.dbCreate = "Update";
Anyway, in your case (date -> timestamp), the underlying SQL type should be the same (at least in SQL Server, where it is datetime).
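If you want to confirm that nothing needs to change at the database level, you can check the column's type before and after the ormReload(). A sketch against SQL Server's INFORMATION_SCHEMA; the table name here is a placeholder for whatever your entity maps to:
-- Check the SQL type backing createdDateTime; for both ormtype="date" and ormtype="timestamp" this is typically datetime on SQL Server.
SELECT DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'YourEntityTable' -- placeholder table name
  AND COLUMN_NAME = 'createdDateTime';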

SqlException (0x80131904): Invalid column name for mvc3 SaveChanges() after schema change

I've built an MVC3 application using the Entity Framework database first approach. I was able to round trip all objects, then needed to make a schema change. After changing the database schema and updating the .edmx, SaveChanges() fails for objects that map to a db table with column changes.
Specifically: Originally I had a table 'project_issue_installation' that had a column 'installation_system_id'. I changed the schema to remove 'installation_system_id' from 'project_issue_installation', ran an 'update model from database', recompiled, and checked the data model .edmx. No errors on compilation, and the model .edmx looks correct.
When I try to persist a project_issue_installation object, I get an Invalid column name 'installation_system_id' exception.
I've searched the entire solution for 'installation_system_id' and came up with nothing. Can anyone point me to where the app is holding on to that column name?
-Dan
Turned out to be a problem in the db schema, not the EF model. My DBA missed a trigger that was still pointing to the old column.
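If you need to track down such a leftover yourself, here is a minimal sketch against SQL Server's catalog views that finds triggers on the table still mentioning the dropped column (the dbo schema is an assumption):
-- Find triggers on project_issue_installation whose definition still references installation_system_id.
SELECT t.name AS trigger_name, m.definition
FROM sys.triggers AS t
JOIN sys.sql_modules AS m ON m.object_id = t.object_id
WHERE t.parent_id = OBJECT_ID('dbo.project_issue_installation')
  AND m.definition LIKE '%installation_system_id%';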