Binary Blob truncated to 8000 bytes - SQL Server 2008 / varbinary(max) - nhibernate

I have upgraded from Fluent NHibernate 1.0 with NHibernate 2.1 to the pre-release 1.x build with NHibernate 3.0 GA, and I have hit what I think is a regression, but I want to hear whether that's indeed the case.
I am using SQL Server 2008 Express with the MSSQL 2008 dialect, and I have an Image property of type System.Drawing.Image, mapped like this:
Map(food => food.Image)
    .Length(int.MaxValue)
    .Nullable();
The Image column in the table is of type varbinary(MAX).
The generated hbm for the property is:
<property name="Image" type="System.Drawing.Image, System.Drawing, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a">
    <column name="Image" length="2147483647" not-null="false" />
</property>
However, no matter what I do, the binary blob is truncated to 8000 bytes when serialized with the current FNH and NH versions. That wasn't the case with the previous versions.
Any ideas why this is happening, and how to fix or work around it?

I encountered a similar problem, and after much experimentation I noticed that when using NHibernate to generate my schema to a file, the generated column was always of length 8000.
Setting CustomSqlType to varbinary(max) as suggested above made no difference; however, this workaround in my Fluent mapping seemed to do the trick:
Map(x => x.LogoBytes).CustomType("BinaryBlob").Length(1048576).Nullable();
The length is of course arbitrary, but I think it should be set to something less than int.MaxValue. I am new to NHibernate, so I'm still figuring things out, but I'd be interested to know whether this helps you.
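For context, here is what that workaround looks like in a full class map (a sketch; the Company/LogoBytes names are hypothetical, and LogoBytes is assumed to be a byte[]):
using FluentNHibernate.Mapping;

// Hypothetical entity; the LogoBytes property is assumed to be a byte[]
public class Company
{
    public virtual int Id { get; set; }
    public virtual byte[] LogoBytes { get; set; }
}

public class CompanyMap : ClassMap<Company>
{
    public CompanyMap()
    {
        Id(x => x.Id);
        Map(x => x.LogoBytes)
            .CustomType("BinaryBlob") // use NHibernate's BinaryBlob type instead of plain Binary
            .Length(1048576)          // arbitrary, as noted; keep it below int.MaxValue
            .Nullable();
    }
}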

In 3.0.0 GA, the following mapping seems to do the trick:
<property name="Data" type="Serializable" length="2147483647" />
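The Fluent equivalent would presumably be (a sketch, assuming a Data property of a serializable type):
Map(x => x.Data)
    .CustomType("Serializable")
    .Length(int.MaxValue);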

This is a regression. I have raised a bug and provided patches at https://nhibernate.jira.com/browse/NH-2484

Map(x => x.Image).Length(100000).Not.Nullable();
Add the Length(...) call as above and it will work :)

Have you tried this?
Map(x => x.Image).CustomSqlType("VARBINARY(MAX)");
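If that alone doesn't help, combining the custom SQL type with an explicit length might be worth trying, so that neither the DDL nor the parameter binding falls back to an 8000-byte default (a sketch):
Map(x => x.Image)
    .CustomSqlType("VARBINARY(MAX)")
    .Length(int.MaxValue)
    .Nullable();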

Related

How to persist a String/Object array such that indices and values are stored in separate columns in EclipseLink

I am using EclipseLink 2.4.2 with a Derby database. I have an entity class which has a String array. I need to map this array to a separate table 'MY_ARGS' such that the index is mapped to one column, 'ARRAY_INDEX', and the value at that index is mapped to another column, 'ARG'.
In Hibernate we have the 'array' element through which we can do this, like below:
<array name="args" table="MY_ARGS" cascade="all">
<key column="PARENT_ID"/>
<list-index column="ARRAY_INDEX" />
<element type="string" column="ARG" length="16384" />
</array>
But in EclipseLink I am not able to find any such element (I am using eclipselink-orm.xml) through which I can achieve this.
Does EclipseLink support such an array-to-table (multi-column) mapping?
I read about @Converter, through which we can convert the data type while storing to/retrieving from the DB, but it looks like a converter deals with a single column and cannot store to two columns simultaneously.
Is there a way I can do this through a converter or some such workaround? (I prefer not to use collections.)
Any quick help is greatly appreciated!
Thanks in Advance!
-Alekhya
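For what it's worth, JPA 2.0's element collections map exactly this index-plus-value table shape, and EclipseLink supports them, although it would mean changing the String[] field to a java.util.List<String>, since JPA element collections require a collection type. A sketch of the eclipselink-orm.xml entry, untested:
<element-collection name="args">
    <order-column name="ARRAY_INDEX"/>
    <column name="ARG" length="16384"/>
    <collection-table name="MY_ARGS">
        <join-column name="PARENT_ID"/>
    </collection-table>
</element-collection>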

NHibernate setting access="field.camelcase-underscore" fails in version 3

I have a solution that was created with NHib 1.2 which we're upgrading to NHib 3.0.
Our hbm file has the following property:
<property name="ContentId" column="ContentId" access="field.camelcase-underscore" />
The class doesn't have a ContentId property. This was working fine in NHib 1.2, but now we're getting the following exception:
Could not compile the mapping document: XXXX.Core.Domain.Video.hbm.xml ---> NHibernate.MappingException: Problem trying to set property type by reflection ---> NHibernate.MappingException: class Core.Domain.Video, Core, Version=1.0.0.29283, Culture=neutral, PublicKeyToken=null not found while looking for property: ContentId ---> NHibernate.PropertyNotFoundException: Could not find the property 'ContentId', associated to the field '_contentId', in class 'Core.Domain.Video'.
Why would this stop working? Is it still supported in NHib 3?
We have many, many properties like this that we might need to add.
NHibernate greatly improved its error messaging and diagnostics in NH2.X and again in NH3.X. You are telling NHibernate that you have a property and you want to map it via field access to a field named by _camelCase convention. You don't have a property named ContentId and NHibernate is letting you know that you lied to it. :)
Try updating your mapping to:
<property name="_contentId" column="ContentId" access="field" />
You will need to update any HQL or Criteria queries to use _contentId rather than ContentId. Another option would be to add a private ContentId property.
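To illustrate the two options side by side (a hypothetical sketch of the Video class; the field type is assumed to be int):
public class Video
{
    // Option 1: map the field directly with
    //   <property name="_contentId" column="ContentId" access="field" />
    private int _contentId;

    // Option 2: keep the original mapping and add a (possibly private)
    // ContentId property for NHibernate to find
    private int ContentId
    {
        get { return _contentId; }
        set { _contentId = value; }
    }
}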
I'd like to provide information which helped me answer this question:
http://groups.google.com/group/nhusers/browse_thread/thread/e078734a221c3c0c/ec8b873b385d4426?lnk=gst&q=field+camelcase+underscore#ec8b873b385d4426
In this link Fabio explains the same problem you are having like this:
This mapping
<property name="PositiveValue" access="field.camelcase-underscore" />
means: for my property named "PositiveValue", you (NH) have to access the field; to discover which field is associated, you (NH) have to use the "camelcase-underscore" strategy.
If there is no property, you can't use the accessor with a specific strategy.
This struck me as a little odd, because it means adding dummy, unused properties just to make the NHibernate 3 mapping compiler happy; the underlying functionality is the same.

FluentNHibernate CustomType("Binary") MappingException

I have an issue with mapping an entity with the latest FluentNHibernate build available on NuGet (package version 1.1.1.694) and NHibernate 3.0 GA.
What I am trying to get is the SQL type binary(64), with FluentNHibernate, in a database-agnostic manner (I don't want to use CustomSqlType).
The default is varbinary(64), which I don't want; lowercase "binary" leads to this as well.
My mapping code:
this.Map(x => x.PasswordHash)
    .CustomType("Binary")
    .Length(64)
    .Not.Nullable();
This produces, in the NHibernate mapping XML file:
<property name="PasswordHash" type="Binary">
<column name="PasswordHash" length="64" not-null="true" />
</property>
Exception on generating schema:
Could not load type Binary.
System.TypeLoadException: Could not load type Binary. Possible cause: no assembly name specified.
at NHibernate.Util.ReflectHelper.TypeFromAssembly(AssemblyQualifiedTypeName name, Boolean throwOnError)
On the other hand, CustomType("StringClob") works. Is there something I am missing?
Is there a way to make FluentNHibernate's .CustomType<>() work with built-in NHibernate types? (This would be useful for AnsiChar, or other non-standard mappings between .NET types and database types.)
I believe you have to change the sql-type, not the type (the Fluent syntax is probably .SqlType("binary") or something like that).
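For reference, the actual Fluent method is CustomSqlType, so that suggestion would read like this (note it gives up the database agnosticism the question asked for):
this.Map(x => x.PasswordHash)
    .CustomSqlType("binary(64)")
    .Not.Nullable();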

ColdFusion ORM 9.0.1 - Error while resolving relationship

I got this example from the Adobe ColdFusion documentation; some of the names are changed, but everything else is the same, unless I am just so frustrated that I have missed a letter.
user.cfc:
/**
* @persistent
*/
component
{
property name="id" fieldtype="id" generator="native";
property name="userName" type="string" length="100";
property name="Credential" fieldtype="one-to-one" cfc="model.user.credentials";
}
credentials.cfc:
/**
* @persistent
*/
component
{
property name="id" fieldtype="id" generator="foreign" params="{property='userinfo'}";
property name="userinfo" fieldtype="one-to-one" cfc="model.user.user" constrained="true";
property name="passwordHash" type="string";
}
No matter how I word it, after searching many sites, I still get an error of:
Error while resolving the relationship Credential in cfc user. Check the column mapping for this property.
I have checked that both CFCs are accessible by ColdFusion: with the one-to-one properties removed, the tables are created successfully.
I am using SQL Server 2008 with ColdFusion 9.0.1 under the Apache 2.2 web server.
I am new to ORM and Hibernate, but I have successfully created different types of relationships; I will confess to a less-than-expert level of ColdFusion.
Thanks; this is really bothering me, as the example came directly from the ColdFusion documentation.
Do you have a mapping for model?
If not, add one, or you could try:
property name="Credential" fieldtype="one-to-one" cfc="credentials";
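If the components don't live under an existing mapped path, one way to make cfc="model.user.credentials" resolvable is a ColdFusion mapping in Application.cfc (a sketch, assuming the model folder sits next to Application.cfc):
// Application.cfc
this.ormenabled = true;
this.mappings["/model"] = getDirectoryFromPath(getCurrentTemplatePath()) & "model";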

Unable to bulk insert using NHibernate

I've tried adding bulk inserts to my application, but the Batcher is still a NonBatchingBatcher with a BatchSize of 1.
This is using C# 3, NH3 RC1 and MySQL 5.1.
I've added this to my SessionFactory
<property name="adonet.batch_size">100</property>
And my code goes pretty much like this:
var session = SessionManager.GetStatelessSession(type);
var tx = session.BeginTransaction();
session.Insert(instance);
I'm using HiLo identity generation for the entities in question, but not for all entities in the database. SessionFactory.OpenStatelessSession doesn't take a type, so it can't really know whether it can do batching on this type, or can it?
After some digging into NHibernate, I found something in SettingsFactory.CreateBatcherFactory that might give some additional info:
// It defaults to the NonBatchingBatcher
System.Type tBatcher = typeof (NonBatchingBatcherFactory);
// Environment.BatchStrategy == "adonet.factory_class", but I haven't
// defined this in my config file
string batcherClass = PropertiesHelper.GetString(Environment.BatchStrategy, properties, null);
if (string.IsNullOrEmpty(batcherClass))
{
if (batchSize > 0)
{
// MySqlDriver doesn't implement IEmbeddedBatcherFactoryProvider,
// so it still uses NonBatchingFactory
IEmbeddedBatcherFactoryProvider ebfp = connectionProvider.Driver as IEmbeddedBatcherFactoryProvider;
Could my configuration be wrong?
<hibernate-configuration xmlns="urn:nhibernate-configuration-2.2" >
<session-factory name="my application name">
<property name="adonet.batch_size">100</property>
<property name="connection.driver_class">NHibernate.Driver.MySqlDataDriver</property>
<property name="connection.connection_string">my connection string
</property>
<property name="dialect">NHibernate.Dialect.MySQL5Dialect</property>
<property name="proxyfactory.factory_class">NHibernate.ByteCode.Castle.ProxyFactoryFactory, NHibernate.ByteCode.Castle</property>
<!-- To avoid "The column 'Reserved Word' does not belong to the table : ReservedWords" -->
<property name="hbm2ddl.keywords">none</property>
</session-factory>
</hibernate-configuration>
I know this question is a year old, but there is a NuGet package that adds MySQL batching functionality to NHibernate. The reason that it's not baked directly into NHibernate is that the functionality required a reference to the MySQL.Data assembly, and the dev team didn't want the dependency.
IIRC, batching is currently supported for Oracle and SqlServer only.
As with almost any other aspect of NH, this is extensible, so you can write your own IBatcher/IBatcherFactory and inject them via configuration.
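Wiring a custom factory in would look like this in the session-factory configuration, where MyApp.MySqlBatchingBatcherFactory is a hypothetical class implementing IBatcherFactory (the adonet.factory_class key is the Environment.BatchStrategy setting mentioned in the question's code comments):
<property name="adonet.factory_class">MyApp.MySqlBatchingBatcherFactory, MyApp</property>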
Sidenote: current version of NH is 3.0 GA.
Really old question, but... let's be completely correct.
Another reason for batching not working can be the use of a stateless session (as in your case). Stateless sessions do not support batching. From the documentation:
The insert(), update() and delete() operations defined by the StatelessSession interface are considered to be direct database row-level operations, which result in immediate execution of a SQL INSERT, UPDATE or DELETE respectively. Thus, they have very different semantics to the Save(), SaveOrUpdate() and Delete() operations defined by the ISession interface.
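Given that, batching only kicks in with a regular (stateful) session. A minimal sketch of the usual bulk-insert pattern (sessionFactory and items are hypothetical):
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    for (int i = 0; i < items.Count; i++)
    {
        session.Save(items[i]);
        if (i % 100 == 0)    // align with adonet.batch_size
        {
            session.Flush(); // push the current batch to the database
            session.Clear(); // detach flushed entities to keep the session small
        }
    }
    tx.Commit();
}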