I have an issue with mapping an entity with the latest FluentNHibernate build available on NuGet (package version: 1.1.1.694) and NHibernate 3.0 GA.
What I am trying to get is the SQL type binary(64), using FluentNHibernate in a database-agnostic manner (I don't want to use CustomSqlType).
The default is varbinary(64), which I don't want; lowercase "binary" leads to the same result.
My mapping code:
this.Map(x => x.PasswordHash)
.CustomType("Binary")
.Length(64)
.Not.Nullable();
This gives the following in the NHibernate mapping XML file:
<property name="PasswordHash" type="Binary">
<column name="PasswordHash" length="64" not-null="true" />
</property>
Exception on generating schema:
Could not load type Binary.
System.TypeLoadException: Could not load type Binary. Possible cause: no assembly name specified.
at NHibernate.Util.ReflectHelper.TypeFromAssembly(AssemblyQualifiedTypeName name, Boolean throwOnError)
On the other hand, CustomType("StringClob") works. Is there something I am missing?
Is there a way to make FluentNHibernate's .CustomType<> work with built-in NHibernate types?
(This would be useful for AnsiChar, or other non-standard mappings between a .NET type and a database type.)
I believe you have to change the sql-type, not the type (the fluent syntax is probably .CustomSqlType("binary") or something along those lines).
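A minimal sketch of that approach, assuming FluentNHibernate's CustomSqlType method (note that this is exactly the database-specific route the question wanted to avoid):

this.Map(x => x.PasswordHash)
    .CustomSqlType("binary(64)")   // emits sql-type="binary(64)" on the column
    .Not.Nullable();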
I have an entity with a property that I wish to be read-only, meaning that when I insert this entity into the DB, SQL Server will generate the property's value automatically, so I need NHibernate to ignore this property when executing the INSERT command but retrieve it when selecting the entity.
Important note: this property isn't the ID! I don't want NHibernate to initialize it using a generator; SQL Server will do it by itself.
Another note: I use XML configuration mapping, so no fluent mapping solutions, please.
That functionality is supported; there are two attributes, insert and update:
<property name="GeneratedBySql" insert="false" update="false" />
The same can be applied to a reference mapping:
<many-to-one name="ReferenceGeneratedBySql" insert="false" update="false" />
If we want to use Mapping-by-Code, the same settings are available; see:
Mapping-by-Code - Property (by Adam Bar)
Snippet cited:
Property(x => x.Property, m =>
{
    m.Column("columnName");
    // ...
    m.Update(false);
    m.Insert(false);
});
Recently I had some performance problems in a SOAP web service I wrote a while ago. I noticed I had a lot of queries going on, and my hbm.xml mappings were full of lazy="false" statements. I upgraded to NHibernate 3.0 and removed the lazy="false" stuff, and everything was a LOT faster... but now I am getting the following error:
System.InvalidOperationException: There was an error generating the XML document. ---> System.InvalidOperationException: The type UserProxy was not expected. Use the XmlInclude or SoapInclude attribute to specify types that are not known statically.
User is a class from whose class tag I removed the lazy="false" attribute, like this:
<class name="User" table="Users" >
<id name="DatabaseID" unsaved-value="0" column="ID" type="integer" >
<generator class="native"/>
</id>
<property name="IsExpert"/>
.....more stuff here....
</class>
My web service has a method like this (simplified a little; in real life I use a repository-like pattern between the service and NHibernate):
[WebMethod]
public User GetUser(int userid)
{
session = GetCurrentSession();
return session.Load<User>(userid);
}
The web service expects to serialize a User, but NHibernate gives me a UserProxy (which is not exactly a User). How should I overcome this?
Don't return entities from the web method. Use a DTO.
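For example, a hand-rolled DTO along these lines (a sketch; the property types are assumptions based on the mapping above):

public class UserDto
{
    public int DatabaseID { get; set; }
    public bool IsExpert { get; set; }
}

[WebMethod]
public UserDto GetUser(int userid)
{
    var session = GetCurrentSession();
    User user = session.Get<User>(userid);
    // Copy only the fields the client needs; no proxy ever reaches the serializer.
    return new UserDto { DatabaseID = user.DatabaseID, IsExpert = user.IsExpert };
}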
Web services cannot serialise proxies; session.Load(userId) will return a proxy. You should use session.Get(userId) instead.
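For example (a sketch based on the simplified method from the question):

[WebMethod]
public User GetUser(int userid)
{
    var session = GetCurrentSession();
    // Get hits the database and returns the actual entity (or null),
    // whereas Load may return an uninitialized proxy.
    return session.Get<User>(userid);
}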
I think the answers saying you should use DTOs are not helpful; there is a time and place for DTOs, and sometimes you may just want to return the entity.
If the User has child proxy properties, I have a class for handling this situation. Basically it loops through all properties (using reflection, recursing into child objects and collections) and uses NHibernateUtil.IsInitialized to check whether each value is an uninitialized proxy or the genuine article. If it is a proxy, it sets the property to null, thus making it possible for WCF to serialise it.
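A minimal sketch of that idea (the class and method names are mine, and this version does not guard against cycles in the object graph):

using System.Collections;
using System.Reflection;
using NHibernate;

public static class ProxyStripper
{
    // Null out uninitialized proxy references and collections so the
    // serializer never tries to touch them.
    public static void Strip(object entity)
    {
        if (entity == null)
            return;

        foreach (PropertyInfo prop in entity.GetType().GetProperties(BindingFlags.Public | BindingFlags.Instance))
        {
            if (!prop.CanRead || !prop.CanWrite || prop.GetIndexParameters().Length > 0)
                continue;
            if (prop.PropertyType.IsValueType || prop.PropertyType == typeof(string))
                continue;

            object value = prop.GetValue(entity, null);
            if (value == null)
                continue;

            if (!NHibernateUtil.IsInitialized(value))
            {
                // Uninitialized proxy or lazy collection: drop it.
                prop.SetValue(entity, null, null);
            }
            else if (value is IEnumerable)
            {
                foreach (object item in (IEnumerable)value)
                    Strip(item);
            }
            else
            {
                Strip(value);
            }
        }
    }
}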
I have a solution that was created with NHib 1.2 which we're upgrading to NHib 3.0.
Our hbm file has the following property:
<property name="ContentId" column="ContentId" access="field.camelcase-underscore" />
The class doesn't have a ContentId property. This was working fine in NHib 1.2, but now we're getting the following exception:
Could not compile the mapping document: XXXX.Core.Domain.Video.hbm.xml ---> NHibernate.MappingException: Problem trying to set property type by reflection ---> NHibernate.MappingException: class Core.Domain.Video, Core, Version=1.0.0.29283, Culture=neutral, PublicKeyToken=null not found while looking for property: ContentId ---> NHibernate.PropertyNotFoundException: Could not find the property 'ContentId', associated to the field '_contentId', in class 'Core.Domain.Video'.
Why would this stop working? Is it still supported in NHib 3?
We have many, many properties like this that we might need to add.
NHibernate greatly improved its error messaging and diagnostics in NH 2.x and again in NH 3.x. You are telling NHibernate that you have a property and you want to map it via field access to a field named according to the camelcase-underscore convention. You don't have a property named ContentId, and NHibernate is letting you know that you lied to it. :)
Try updating your mapping to:
<property name="_contentId" column="ContentId" access="field" />
You will need to update any HQL or Criteria queries to use _contentId rather than ContentId. Another option would be to add a private ContentId property.
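For example, the second option would look roughly like this (a sketch, assuming ContentId is an int; only the relevant members are shown):

public class Video
{
    private int _contentId;

    // A private ContentId property lets the original
    // access="field.camelcase-underscore" mapping keep working,
    // while callers still cannot touch the value directly.
    private int ContentId
    {
        get { return _contentId; }
        set { _contentId = value; }
    }
}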
I'd like to provide information which helped me answer this question:
http://groups.google.com/group/nhusers/browse_thread/thread/e078734a221c3c0c/ec8b873b385d4426?lnk=gst&q=field+camelcase+underscore#ec8b873b385d4426
In this link Fabio explains the same problem you are having like this:
This mapping
<property name="PositiveValue" access="field.camelcase-underscore" />
means: for my property named "PositiveValue", you (NH) have to access the field; to discover which field is associated, you (NH) have to use the "camelcase-underscore" strategy.
If there is no property, you can't use the accessor with a specific strategy.
This struck me as a little odd, because it means adding dummy, unused properties just to make the NHibernate 3 mapping compiler happy; the underlying functionality is the same.
I have upgraded from Fluent NHibernate 1.0 with NHibernate 2.1 to pre-release 1.x with NHibernate 3.0 GA, and I have hit what I think is a regression, but I want to hear whether that's indeed the case.
I am using SQL Server Express 2008 and the MSSQL 2008 dialect. I have an Image property of type System.Drawing.Image, and I have mapped it like this:
Map (food => food.Image)
.Length (int.MaxValue)
.Nullable ();
The Image column in the table is of type varbinary(MAX).
The generated hbm for the property is:
<property name="Image" type="System.Drawing.Image, System.Drawing,
Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a">
<column name="Image" length="2147483647" not-null="false" />
</property>`
However, no matter what I do, the binary blob is truncated to 8000 bytes when serialized with the current FNH and NH versions. That didn't use to be the case with previous versions.
Any ideas why this is happening and how to fix or work around it?
I too have encountered a similar problem, and after much experimentation I noticed that when using NHibernate to generate my schema to a file, the generated column was always of length 8000.
Setting CustomSqlType to varbinary(max) as suggested above made no difference; however, this workaround in my fluent mapping seemed to do the trick:
Map(x => x.LogoBytes).CustomType("BinaryBlob").Length(1048576).Nullable();
The length, of course, is an arbitrary amount, but I think it should be set to something less than int.MaxValue. I am new to NHibernate, so I'm still figuring things out, but I'd be interested to know if this helps you.
In 3.0.0GA, the following mapping seems to do the trick:
<property name="Data" type="Serializable" length="2147483647" />
This is a regression. I have raised a bug and provided patches at https://nhibernate.jira.com/browse/NH-2484
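If you prefer to stay in Fluent NHibernate, a roughly equivalent mapping would be the following sketch (untested; it assumes the CustomType overload that takes the NHibernate type name as a string):

Map(x => x.Data)
    .CustomType("Serializable")
    .Length(2147483647);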
Map(x => x.Image).Length(100000).Not.Nullable();
Add the Length(...) call with a large value, as above, and it will work :)
Have you tried this?
Map(x => x.Image).CustomSqlType("VARBINARY(MAX)");
I've asked this elsewhere and haven't got a sensible reply.
I'm trying to map an IDictionary keyed by an enum. I have this mapping:
<class name="MyProject.Item, MyProject" table="Item">
<...>
<map name="Properties" access="property" table="ItemProperties" lazy="false">
<key column="ItemID" />
<index column="Idx" type="int" />
<element column="Value" type="System.Boolean, mscorlib"/>
</map>
I can persist data, but when the data is retrieved I get an NHibernate exception:
{"The value "0" is not of type "Project.PropertyType" and cannot be used in this generic collection. Parameter name: key"}
So it can't map to the enum, but why? If I have a regular property that uses an enum, it works fine.
Is what I'm trying to do even possible? I can't find much info on doing this.
Your mapping declares the index (the dictionary key) as an integer, not as an enum. To map the enum properly, use type="MyProject.Project.PropertyType, MyProject" on the <index> element.
However, normally for an enum the best approach is to leave the type information out of the mapping file altogether and let NHibernate pick it up through reflection. My reading of the NHibernate source implies that if you are mapping into a generic IDictionary<K,V>, NHibernate should pick up the exact type of your key via reflection. In other words, you should still be able to leave out the type attribute.
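For reference, the entity side would look something like this (a sketch; the enum members and property types are assumptions based on the exception message and the mapping above):

using System.Collections.Generic;

public enum PropertyType
{
    // members elided
}

public class Item
{
    private IDictionary<PropertyType, bool> _properties = new Dictionary<PropertyType, bool>();

    public virtual int ItemID { get; set; }

    // Generic dictionary keyed by the enum; per the answer above, if the type
    // attribute is removed from <index>, NHibernate can infer the key type
    // from this declaration via reflection.
    public virtual IDictionary<PropertyType, bool> Properties
    {
        get { return _properties; }
        set { _properties = value; }
    }
}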