NHibernate default schema unused in formula - nhibernate

Hey Guys,
I'm using NHibernate 2.2 and ran into a problem that I can't seem to find an answer to. My program uses a default schema assigned in the hibernate.cfg.xml file like this:
<property name="default_schema">MY_SCHEMA</property>
which works as advertised for all generated SQL statements; however, I have statements in a formula that need the default schema applied as well:
<property name="Count" type="int" formula="SELECT COUNT(*) FROM DETAILS WHERE DETAILS.ID = ID" />
MY_SCHEMA changes relatively often, so I need the SQL to be interpreted as <property name="Count" type="int" formula="SELECT COUNT(*) FROM MY_SCHEMA.DETAILS WHERE DETAILS.ID = ID" />
Is this possible without resorting to hardcoded schemas? Thanks!
Kevin

You can change your mappings on the fly when building the session factory.
Of course that's easier to do if you use a code-based mapping solution, like Fluent or ConfORM.
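Even with plain hbm.xml mappings, one option is to put a placeholder token into the formula and replace it with the configured default_schema just before building the session factory. A rough sketch of that idea (the {schema} token is my own convention, and the Formula.FormulaString property name is an assumption worth verifying against your NHibernate version):

// In the mapping: formula="SELECT COUNT(*) FROM {schema}.DETAILS WHERE DETAILS.ID = ID"
using NHibernate.Cfg;
using NHibernate.Mapping;

var cfg = new Configuration().Configure();          // reads hibernate.cfg.xml
string schema = cfg.GetProperty("default_schema");  // e.g. MY_SCHEMA

foreach (PersistentClass pc in cfg.ClassMappings)
{
    foreach (Property prop in pc.PropertyIterator)
    {
        foreach (ISelectable selectable in prop.Value.ColumnIterator)
        {
            var formula = selectable as Formula;
            if (formula != null)
            {
                // Assumption: the formula text is exposed as FormulaString.
                formula.FormulaString = formula.FormulaString.Replace("{schema}", schema);
            }
        }
    }
}

var sessionFactory = cfg.BuildSessionFactory();

Since the replacement happens on the Configuration object, the hardcoded schema never appears in the mapping files themselves.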

Related

How to use HSQLDB in Oracle query syntax mode?

I am trying to use HSQLDB as an embedded database in a Spring application (for testing). As the target production database is Oracle, I would like to use HSQLDB's Oracle syntax mode feature.
In the Spring config I use
<jdbc:embedded-database type="HSQL" id="dataSource">
</jdbc:embedded-database>
<jdbc:initialize-database data-source="dataSource" enabled="true">
<jdbc:script location="classpath:schema.sql"/>
</jdbc:initialize-database>
And in schema.sql at the top I wrote:
SET DATABASE SQL SYNTAX ORA TRUE;
However, when running my test, I get the following error:
java.sql.SQLException: Unexpected token: DATABASE in statement [SET DATABASE SQL SYNTAX ORA TRUE]
Is this a syntax error or a permissions error or something entirely different?
Thanks - also for any pointers that might lead to the answer.
Given that HSQL is the Spring default for jdbc:embedded-database and given the target is Oracle, this scenario should actually be very common. However, I found nothing on the Web even touching the issue.
Update:
The issue above is resolved thanks to answer #1.
However, I now get another exception:
org.springframework.dao.DataAccessResourceFailureException: Failed to populate database; nested exception is java.sql.SQLException: java.lang.RuntimeException: unsupported internal operation: StatementCommand unsupported internal operation: StatementCommand
Any idea what this is caused by?
This option was introduced with HSQLDB 2.0.
Are you sure you are using the correct version?
Maybe you have 1.8 still in the classpath somewhere.
But that won't get you far in terms of testing anyway, because this only turns on some basic syntax "replacing"; there is no real behaviour change involved here (and I'm not even talking about more advanced Oracle features like analytical functions, CONNECT BY or something similar).
It is very seldom a good idea to test your application with a DBMS that will not be used in production. It is simply not a valid test.
Even if it only changes some basic syntax, here is an example of how you can do it:
<bean class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close" id="dataSource">
<property name="driverClassName" value="org.hsqldb.jdbcDriver" />
<property name="url" value="jdbc:hsqldb:mem:PUBLIC;sql.syntax_ora=true" />
<property name="username" value="sa" />
<property name="password" value="" />
</bean>
The sql.syntax_ora=true URL property enables syntax compatibility, including the NUMBER type and the DUAL table. Additional properties can be used for more behavior compatibility. These are documented in the HSQLDB Guide:
http://hsqldb.org/doc/2.0/guide/compatibility-chapt.html#coc_compatibility_oracle

How to map small binary objects properly in SQLite/NHibernate combo (wrong type affinity)?

I'm trying to store a property of C#/.NET type byte[] in SQLite. Here is my mapping:
<class name="MyClass" lazy="false" table="MyTable">
<property name="MyProperty" type ="BinaryBlob" access="property" />
</class>
In SQL Server it works like a charm, even without the explicit type="BinaryBlob" in the mapping. In SQLite I've tried various type combinations between the SQL CREATE TABLE statements and the NHibernate data types, but each time I get either an exception that the mapping can't be compiled (because of type incompatibility) or an exception that a cast from the fetched data type to the mapping type is impossible.
The value of MyProperty in the INSERT statement looks like this: 0x7C87541FD3F3EF5016E12D411900C87A6046A8E8.
Update: continuing to debug System.Data.SQLite.SQLiteDataReader, it looks like no matter what the SQL type is (I tried decimal, blob, unsigned big int), the type affinity is always text.
What am I doing wrong (either technically or in general)? Any suggestion is welcome.
The reason for the text affinity was that the data was imported into the table from a CSV (comma-separated values) file. Switching to a SQL file with a proper INSERT statement solved the problem.
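For reference, a small sketch of round-tripping a byte[] through System.Data.SQLite with a parameterized INSERT (table and column names are purely illustrative); binding the parameter as a byte[] stores a genuine BLOB value, whereas the CSV import had stored text:

using System.Data.SQLite;

var bytes = new byte[] { 0x7C, 0x87, 0x54, 0x1F };  // sample payload

using (var conn = new SQLiteConnection("Data Source=test.db;Version=3;"))
{
    conn.Open();

    using (var cmd = conn.CreateCommand())
    {
        cmd.CommandText = "CREATE TABLE IF NOT EXISTS MyTable (Id INTEGER PRIMARY KEY, MyProperty BLOB)";
        cmd.ExecuteNonQuery();

        // The provider binds the byte[] parameter as a BLOB, not as text.
        cmd.CommandText = "INSERT INTO MyTable (MyProperty) VALUES (@data)";
        cmd.Parameters.AddWithValue("@data", bytes);
        cmd.ExecuteNonQuery();

        cmd.CommandText = "SELECT MyProperty FROM MyTable";
        cmd.Parameters.Clear();
        var roundTripped = (byte[])cmd.ExecuteScalar();  // comes back as byte[]
    }
}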
Did you look at: How do I store and retrieve a blob from sqlite? There is an article on ayende.com as well: Lazy loading BLOBS and the like in NHibernate. These links might help push you in the right direction to see what is going on.

How to use native Sql for Insert and Update in Castle Active Record?

How do I use native SQL for INSERT and UPDATE in Castle ActiveRecord? There is a sample for using a SELECT query here: http://www.castleproject.org/activerecord/documentation/trunk/usersguide/nativesql.html
But I can't find any samples for UPDATE and INSERT.
Update: Basically I am looking for support for UPDATE/INSERT queries like this:
<class name="Person">
<id name="id">
<generator class="increment"/>
</id>
<property name="name" not-null="true"/>
<sql-insert>INSERT INTO PERSON (NAME, ID) VALUES ( UPPER(?), ? )</sql-insert>
<sql-update>UPDATE PERSON SET NAME=UPPER(?) WHERE ID=?</sql-update>
<sql-delete>DELETE FROM PERSON WHERE ID=?</sql-delete>
</class>
AFAIK <sql-insert> et al. are not implemented in ActiveRecord. You could try implementing INHContributor to modify the NHibernate configuration and add those queries to the class mapping, but it won't be easy.
Even better would be implementing it and submitting a patch! For guidance, ask on the Castle developers Google group.
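A rough sketch of the contributor approach, assuming INHContributor exposes a Contribute(Configuration) hook and that PersistentClass offers SetCustomSQLInsert/Update/Delete with the signatures shown (both assumptions worth verifying against your ActiveRecord and NHibernate versions):

using Castle.ActiveRecord;
using NHibernate.Cfg;
using NHibernate.Engine;
using NHibernate.Mapping;

public class PersonCustomSqlContributor : INHContributor
{
    // Assumed contract: receives the NHibernate Configuration before the
    // session factory is built; check the actual interface in your AR version.
    public void Contribute(Configuration cfg)
    {
        PersistentClass mapping = cfg.GetClassMapping(typeof(Person));

        // Equivalent of <sql-insert> / <sql-update> / <sql-delete> in an hbm.xml mapping.
        mapping.SetCustomSQLInsert("INSERT INTO PERSON (NAME, ID) VALUES (UPPER(?), ?)",
                                   false, ExecuteUpdateResultCheckStyle.Count);
        mapping.SetCustomSQLUpdate("UPDATE PERSON SET NAME = UPPER(?) WHERE ID = ?",
                                   false, ExecuteUpdateResultCheckStyle.Count);
        mapping.SetCustomSQLDelete("DELETE FROM PERSON WHERE ID = ?",
                                   false, ExecuteUpdateResultCheckStyle.Count);
    }
}

The contributor still has to be registered with ActiveRecord before initialization; the exact registration call depends on the ActiveRecord version.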

How to make NHibernate generate table with Text field instead of nvarchar(255)

I'm trying to make NHibernate generate my schema for SQL Server 2008, and with the mapping below it keeps wanting to create an nvarchar(255) column instead of text... any ideas?
<property name="AnnouncementText" column="AnnouncementText" type="StringClob">
<column name="AnnouncementText" sql-type="NTEXT"/>
</property>
Thanks!
The problem was specifying the name of the column twice... once I took it and the length out of the property element, it worked perfectly:
<property name="AnnouncementText" type="StringClob">
<column name="AnnouncementText" sql-type="text"/>
</property>
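If you want to see the DDL NHibernate will actually emit for the mapping, you can dump it with SchemaExport. A minimal sketch, assuming the Configuration is loaded from your hibernate.cfg.xml and mapping files:

using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;

var cfg = new Configuration().Configure();  // loads hibernate.cfg.xml and the hbm.xml mappings

// First argument: print the script to the console; second: don't execute it against the database.
new SchemaExport(cfg).Create(true, false);
// AnnouncementText should now appear with the sql-type declared in the <column> element.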
I'm used to SQL Server 2005 and the dialect it uses, but I presume you can do something similar. Since nvarchar(n) allows n up to 4000, a value above this will use nvarchar(max).
I presume that SQL Server 2000, which it sounds like you're using, does something similar once you hit the limit. If I read the NHibernate code correctly (NHibernate.Dialect.MsSql2000Dialect..ctor()) you get ntext once you pass 0xFA0 = 4000 characters.
<property name="AnnouncementText" column="AnnouncementText" type="string" length="10000"/>
This won't help at all, but I remember the good old days ;-)...
create table YourTable
(
...
AnnouncementText text null
...
)

NHibernate "database" schema confusion [.\hibernate-mapping\#schema]

I'm using NHibernate primarily against an MSSQL database, where I've used MSSQL schemas for the various tables.
In my NH mapping (HBM) files, I've specified the schema for each table in the mapping as follows:
<?xml version="1.0"?>
<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2"
auto-import="true"
schema="xyz"> <!-- schema specified -->
<class name="Customer">
<id name="Id">
<generator class="native" />
</id>
<property name="Name" />
</class>
</hibernate-mapping>
For my unit testing, I've been experimenting with SQLite; however, my mappings now fail, as NH reports that the database "xyz" cannot be found.
I understand there is a difference in the interpretation of schema, so what is NH's interpretation/implementation, and what is the best approach when using schemas?
BTW: Searching the web using keywords like "nhibernate database schema" doesn't yield anything relevant.
The "standard" interpretation is that a table has a three-part name: "CATALOG.SCHEMA.TABLE" : these are the names used in the standard (ISO-SQL standard?) "information_schema" views. Hibernate (presumably also NHibernate) follows this convention, you can specify catalog and schema in a class mapping, and default_catalog and default_schema in a configuration.
In my own unit test environment (using Hypersonic), I massaged the Hibernate Configuration before building the SessionFactory: I myself did this to do things like setting HSQL-compatible IdentifierGenerators, but you can problably go through clearing the schema properties of the mapped classes.
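In NHibernate terms, a minimal sketch of that idea is to strip the schema from every mapped table in the test configuration before building the SessionFactory (the config file name here is hypothetical, and collection/join tables may need the same treatment via cfg.CollectionMappings):

using NHibernate.Cfg;
using NHibernate.Mapping;

// Hypothetical SQLite-specific configuration used only by the unit tests.
var cfg = new Configuration().Configure("sqlite.test.cfg.xml");

foreach (PersistentClass pc in cfg.ClassMappings)
{
    // Drop the schema so SQLite sees "Customer" instead of "xyz.Customer".
    pc.Table.Schema = null;
}

var sessionFactory = cfg.BuildSessionFactory();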
In general, I try to avoid specifying schemas and catalogs in applications at all. In Oracle, I generally create synonyms so users see the objects in their own namespace; in PostgreSQL, set the search_path in the database configuration; in SQL Server, put all tables into 'dbo'.
The NHibernate.Mapping.Table class has a GetQualifiedName(NHibernate.Dialect.Dialect dialect) method, which is defined as follows:
public string GetQualifiedName(NHibernate.Dialect.Dialect dialect)
{
    string quotedName = this.GetQuotedName(dialect);
    return ((this.schema == null) ?
        quotedName :
        (this.GetQuotedSchemaName(dialect) + '.' + quotedName));
}
So there's basically no way to make SQLite ignore the schema name other than having a separate set of mappings for every scenario (or preprocessing them before compilation).
You can specify the schema (if you need it) in the configuration file using the default_schema property. You can use multiple configuration files, or alter the one you're using: one for production and another for test.
It's possible you can simply ignore the schema setting and use different credentials.