NHibernate connection error - nhibernate

I am using NHibernate 3.1.0, Oracle 10g Express Edition and ASP.NET MVC on .NET 3.5. I am getting the error "ORA-06413: Connection not open". Please help me solve this problem. My connection properties are:
<property name="connection.driver_class">NHibernate.Driver.OracleClientDriver</property>
<property name="connection.connection_string">User ID=user;Password=pwd;Data Source=localhost:1521/XE</property>
<property name="show_sql">false</property>
<property name="dialect">NHibernate.Dialect.Oracle10gDialect</property>
<property name="query.substitutions">true 1, false 0, yes 'Y', no 'N'</property>

Just taking a stab here - but it looks like your connection string isn't formatted properly.
The connection string you're using appears to be for "XE Client", but I don't think that's what NHibernate uses. I'd be willing to bet NHibernate is going to use something more similar to one of the Microsoft connection strings shown on that page.
In all the Oracle connection strings I see at ConnectionStrings.com/Oracle, it seems the properties "User ID", "Password", and "Data Source" are not valid.
You could also reference this example of using NHibernate with an Oracle server: http://tiredblogger.wordpress.com/2008/11/07/using-oracle-odp-with-nhibernate-from-a-c-class-library/
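As an illustration only (the host, port, service name and credentials are placeholders, and this hasn't been tested against your setup), a Microsoft-provider-style connection string for OracleClientDriver might look more like this:
<!-- placeholder values; OracleClientDriver wraps System.Data.OracleClient, which accepts a TNS descriptor or alias as the Data Source -->
<property name="connection.connection_string">Data Source=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=localhost)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=XE)));User Id=user;Password=pwd;</property>
For what it's worth, ORA-06413 is also commonly reported when a 32-bit Oracle client stumbles over the parentheses in a path like "Program Files (x86)" on 64-bit Windows, which may be the finickiness mentioned below.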

Apparently Oracle 10g is a bit finicky on 64-bit systems. The best option seemed to be switching to MySQL.
Using MySQL also provides more features in Hibernate.

Related

WebSphere datasource configuration in IntelliJ

I'm trying to migrate a very heavy, old-school J2EE application from RAD 8.5.5.1 to IntelliJ 2016.1.1. The DataSource is built using JNDI.
I have compiled and configured all the components (correctly, for now) except the DataSource.
In RAD, the DataSource is configured like this in resource.xml:
<resources.jdbc:JDBCProvider xmi:id="JDBCProvider_1163951110780" name="DB2 DataSource" description="DB2 Universal JDBC Driver Provider" implementationClassName="com.ibm.db2.jcc.DB2ConnectionPoolDataSource">
<classpath>${DB2UNIVERSAL_JDBC_DRIVER_PATH}/db2jcc.jar</classpath>
<classpath>${UNIVERSAL_JDBC_DRIVER_PATH}/db2jcc_license_cu.jar</classpath>
<classpath>${DB2UNIVERSAL_JDBC_DRIVER_PATH}/db2jcc_license_cisuz.jar</classpath>
<nativepath>${DB2UNIVERSAL_JDBC_DRIVER_NATIVEPATH}</nativepath>
<factories xmi:type="resources.jdbc:DataSource" xmi:id="DataSource_1163951270521" name="pensionjndi" jndiName="pensionjndi" description="DB2 Universal Driver Datasource" category="" authDataAlias="sec" relationalResourceAdapter="builtin_rra" statementCacheSize="150" datasourceHelperClassname="com.ibm.websphere.rsadapter.DB2UniversalDataStoreHelper">
<propertySet xmi:id="J2EEResourcePropertySet_1163951270522">
<resourceProperties xmi:id="J2EEResourceProperty_1163951270523" name="databaseName" type="java.lang.String" value="value" description="This is a required property. This is an actual database name, and its not the locally catalogued database name. The Universal JDBC Driver does not rely on information catalogued in the DB2 database directory." required="true"/>
<resourceProperties xmi:id="J2EEResourceProperty_1163951270524" name="driverType" type="java.lang.Integer" value="4" description="The JDBC connectivity-type of a data source. If you want to use type 4 driver, set the value to 4. If you want to use type 2 driver, set the value to 2. On WAS z/OS, driverType 2 uses RRS and supports 2-phase commit processing." required="true"/>
<resourceProperties xmi:id="J2EEResourceProperty_1163951270525" name="serverName" type="java.lang.String" value="serverName" description="The TCP/IP address or host name for the DRDA server. If custom property driverType is set to 4, this property is required." required="false"/>
<resourceProperties xmi:id="J2EEResourceProperty_1163951270526" name="portNumber" type="java.lang.Integer" value="50000" description="The TCP/IP port number where the DRDA server resides. If custom property driverType is set to 4, this property is required." required="false"/>
...
...
...
<resourceProperties xmi:id="J2EEResourceProperty_1175088739299" name="webSphereDefaultIsolationLevel" type="java.lang.Integer" value="2" description="" required="false"/>
</propertySet>
<connectionPool xmi:id="ConnectionPool_1163951270521" connectionTimeout="15" maxConnections="200" minConnections="5" reapTime="180" unusedTimeout="1800" agedTimeout="0" purgePolicy="EntirePool"/>
<mapping xmi:id="MappingModule_1163951296456" mappingConfigAlias="DefaultPrincipalMapping" authDataAlias="sec"/>
</factories>
I tried to define a DataSource with the same name (pensionjndi) using IntelliJ's Data Sources and Drivers window:
[Screenshot: IntelliJ Data Sources and Drivers window]
No luck! The application doesn't recognize the DataSource (but it is looking for the RIGHT DS name, "pensionjndi").
The question is: what is the right way to configure a DataSource for IntelliJ artifacts (using the existing DataSources)?
If additional information is required, I'll edit the post.
I didn't find any example or guide for DataSource configuration for WebSphere.
Please HELP!?
The problem was solved by defining the data source in the WebSphere administrative console: WAS console | Resources | Data Sources.
See the IBM topic "Configuring a data source using the administrative console".
Here is a discussion about the problem with IntelliJ support.

Coldfusion ORM 9.0.1 - Error while resolving relationship

I got this example from the Adobe ColdFusion documentation; some of the names are changed, but everything else is the same, unless I am just so frustrated that I have missed a letter.
user.cfc:
/**
*#persistent
*/
component
{
    property name="id" fieldtype="id" generator="native";
    property name="userName" type="string" length="100";
    property name="Credential" fieldtype="one-to-one" cfc="model.user.credentials";
}
credentials.cfc:
/**
*#persistent
*/
component
{
    property name="id" fieldtype="id" generator="foreign" params="{property='userinfo'}";
    property name="userinfo" fieldtype="one-to-one" cfc="model.user.user" constrained="true";
    property name="passwordHash" type="string";
}
No matter how I word it, after searching many sites, I still get an error of:
Error while resolving the relationship Credential in cfc user. Check the column mapping for this property.
I have checked that both CFCs are accessible by ColdFusion by removing the one-to-one properties, and the tables have been created successfully.
I am using SQL Server 2008 with ColdFusion 9.0.1 under the Apache 2.2 web server.
I am new to ORM and Hibernate, but have successfully created different types of relationships, and will confess to a less than expert level of ColdFusion.
Thanks; this is really bothering me, as this came directly from the ColdFusion documentation.
Do you have a mapping for model?
If not, add one, or you could try:
property name="Credential" fieldtype="one-to-one" cfc="credentials";

Unable to bulk insert using NHibernate

I've tried adding bulk inserts to my application, but the Batcher is still a NonBatchingBatcher with a BatchSize of 1.
This is using C# 3, NHibernate 3 RC1 and MySQL 5.1.
I've added this to my SessionFactory
<property name="adonet.batch_size">100</property>
And my code goes pretty much like this
var session = SessionManager.GetStatelessSession(type);
var tx = session.BeginTransaction();
session.Insert(instance);
I'm using HiLo ID generation for the instances in question, but not for all entities in the database. SessionFactory.OpenStatelessSession doesn't take a type, so it can't really know whether it can do batching on this type, or...?
After some digging into NHibernate, I found something in SettingsFactory.CreateBatcherFactory that might give some additional info
// It defaults to the NonBatchingBatcher
System.Type tBatcher = typeof (NonBatchingBatcherFactory);
// Environment.BatchStrategy == "adonet.factory_class", but I haven't
// defined this in my config file
string batcherClass = PropertiesHelper.GetString(Environment.BatchStrategy, properties, null);
if (string.IsNullOrEmpty(batcherClass))
{
if (batchSize > 0)
{
// MySqlDriver doesn't implement IEmbeddedBatcherFactoryProvider,
// so it still uses NonBatchingFactory
IEmbeddedBatcherFactoryProvider ebfp = connectionProvider.Driver as IEmbeddedBatcherFactoryProvider;
Could my configuration be wrong?
<hibernate-configuration xmlns="urn:nhibernate-configuration-2.2" >
<session-factory name="my application name">
<property name="adonet.batch_size">100</property>
<property name="connection.driver_class">NHibernate.Driver.MySqlDataDriver</property>
<property name="connection.connection_string">my connection string
</property>
<property name="dialect">NHibernate.Dialect.MySQL5Dialect</property>
<property name="proxyfactory.factory_class">NHibernate.ByteCode.Castle.ProxyFactoryFactory, NHibernate.ByteCode.Castle</property>
<!-- To avoid "The column 'Reserved Word' does not belong to the table : ReservedWords" -->
<property name="hbm2ddl.keywords">none</property>
</session-factory>
</hibernate-configuration>
I know this question is a year old, but there is a NuGet package that adds MySQL batching functionality to NHibernate. The reason that it's not baked directly into NHibernate is that the functionality required a reference to the MySQL.Data assembly, and the dev team didn't want the dependency.
IIRC, batching is currently supported for Oracle and SqlServer only.
As with almost any other aspect of NH, this is extensible, so you can write your own IBatcher/IBatcherFactory and inject them via configuration.
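As a sketch of that extension point only (the namespace and class name below are placeholders for whatever IBatcherFactory implementation you write or install), the factory would be plugged in through the adonet.factory_class property that the SettingsFactory snippet above reads:
<!-- placeholder type name: point this at your own IBatcherFactory implementation -->
<property name="adonet.factory_class">My.Namespace.MySqlBatchingBatcherFactory, My.Assembly</property>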
Sidenote: current version of NH is 3.0 GA.
Really old question but...let's be completely correct
Another reason for batching not working can be the use of a stateless session (as in your case). Stateless sessions do not support batching. From the documentation:
The insert(), update() and delete() operations defined by the StatelessSession interface are considered to be direct database row-level operations, which result in immediate execution of a SQL INSERT, UPDATE or DELETE respectively. Thus, they have very different semantics to the Save(), SaveOrUpdate() and Delete() operations defined by the ISession interface.
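If you can switch to a regular (stateful) session for the bulk load, a minimal sketch might look like the one below; sessionFactory and instances are assumed to exist, adonet.batch_size is set as in your config, and the entities must not use identity-style generators (HiLo, as you have, is fine). Flushing and clearing per batch keeps the first-level cache from growing:
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    const int batchSize = 100; // matches adonet.batch_size
    for (var i = 0; i < instances.Count; i++)
    {
        session.Save(instances[i]);
        if ((i + 1) % batchSize == 0)
        {
            // push the current batch to the database and evict it from the session cache
            session.Flush();
            session.Clear();
        }
    }
    tx.Commit();
}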

Binary Blob truncated to 8000 bytes - SQL Server 2008 / varbinary(max)

I have upgraded from Fluent NHibernate 1.0 with NHibernate 2.1 to pre-release 1.x with NHibernate 3.0 GA and have hit what I think is a regression, but I want to hear if that's indeed the case.
I am using SQL Server Express 2008 and the MSSQL 2008 dialect, and have an Image property of type System.Drawing.Image which I have mapped like this:
Map (food => food.Image)
.Length (int.MaxValue)
.Nullable ();
The Image column in the table is of type varbinary(MAX).
The generated hbm for the property is:
<property name="Image" type="System.Drawing.Image, System.Drawing,
Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a">
<column name="Image" length="2147483647" not-null="false" />
</property>`
However, no matter what I do, the binary blob is truncated to 8000 bytes when serialized with the current FNH and NH versions. That didn't use to be the case with previous versions.
Any ideas why this is happening, and how to fix or work around it?
I too have encountered a similar problem, and after much experimentation I noticed that when using NHibernate to generate my schema to a file, the generated column type was always length 8000.
Setting CustomSqlType to Varbinary(max) as suggested above made no difference; however, this workaround in my Fluent mapping seemed to do the trick:
Map(x => x.LogoBytes).CustomType("BinaryBlob").Length(1048576).Nullable();
The length of course is an arbitrary amount, but I think it should be set to something less than int.MaxValue. I am new to NHibernate, so I'm still figuring things out, but I'd be interested to know if this helps you.
In 3.0.0GA, the following mapping seems to do the trick:
<property name="Data" type="Serializable" length="2147483647" />
This is a regression. I have raised a bug and provided patches at https://nhibernate.jira.com/browse/NH-2484
Map(x => x.Image).Length(100000).Not.Nullable();
Add the 'Length(MAXVALUE)' as above and it will work :)
Have you tried this?
Map(x => x.Image).CustomSqlType("VARBINARY(MAX)");
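Pulling the suggestions above together, a hedged Fluent NHibernate sketch (class and property names are made up for illustration) combines the explicit length with the SQL type, so both the runtime mapping and any exported schema stay at varbinary(max):
using FluentNHibernate.Mapping;

public class FoodMap : ClassMap<Food>
{
    public FoodMap()
    {
        Id(f => f.Id);
        // the explicit Length avoids the 8000-byte default;
        // CustomSqlType keeps schema export at varbinary(max)
        Map(f => f.Image)
            .CustomSqlType("VARBINARY(MAX)")
            .Length(int.MaxValue)
            .Nullable();
    }
}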

Connect to ESRI Shape File (DBase *.dbf file) from NHibernate

I've been trying to work out how to connect to an ESRI shape file (which I think is a DBase table file) through NHibernate but haven't had any luck with anything I've tried.
Currently, my config's looking like this:
<property name="connection.provider">NHibernate.Connection.DriverConnectionProvider</property>
<!--<property name="dialect">NHibernate.Dialect.GenericDialect</property>
<property name="connection.driver_class">NHibernate.Driver.OdbcDriver</property>
<property name="connection.connection_string">Database=A303.dbf;protocol=TCPIP</property>-->
<property name="connection.driver_class">NHibernate.Driver.OdbcDriver</property>
<!--<property name="connection.connection_string">driver={IBM DB2 ODBC DRIVER};Database=a303.dbf;protocol=TCPIP</property>-->
<property name="connection.connection_string">Provider=VFPOLEDB.1; Data Source=C:\projects\rm4\Sandbox\bin\Debug\A303.dbf;Extended Properties=dBase III</property>
<property name="dialect">NHibernate.Dialect.DB2Dialect</property>
<property name="use_outer_join">true</property>
<property name="proxyfactory.factory_class">NHibernate.ByteCode.Castle.ProxyFactoryFactory, NHibernate.ByteCode.Castle</property>
<property name="show_sql">true</property>
I've left the commented-out bits in so you can see what values I've been trying. No matter what I try, I get the error message:
ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified
I've gone through most of the connection strings I've found online and in some answers to questions on here, and was getting to the 'clutching at straws' phase where I'm just putting anything in, so I thought I'd better ask for help.
I'm not even sure if it's possible to connect to this type of file from NHibernate but, if it is, does anyone know what should be in the config?
A Shapefile (.shp) is not a dbf, per se. It actually is a collection of files, one of which is a DBF, but the shapefile itself that stores the geometry is a different format altogether.
There is a whitepaper on the ESRI website (www.esri.com)
I would try a different NHibernate driver. Here is a list of NHibernate drivers from the documentation.
Judging from the provider name in your connection string, I would try NHibernate.Driver.OleDbDriver.
Failing this, I would eliminate NHibernate from the mix and see if you can connect using the standard .NET data classes, such as System.Data.Odbc.OdbcConnection and System.Data.OleDb.OleDbConnection. If you cannot connect at this level, then the problem is not NHibernate.
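For that last step, a minimal sanity check outside NHibernate might look like the sketch below (it assumes the VFP OLE DB provider from your connection string is installed; with that provider the Data Source usually points at the folder containing the .dbf, and the file name becomes the table name):
using System;
using System.Data.OleDb;

class DbfConnectionTest
{
    static void Main()
    {
        // folder and table name taken from the question's connection string (assumptions)
        var connectionString = @"Provider=VFPOLEDB.1;Data Source=C:\projects\rm4\Sandbox\bin\Debug;";

        using (var connection = new OleDbConnection(connectionString))
        using (var command = new OleDbCommand("SELECT * FROM A303", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader[0]); // print the first column of each row
            }
        }
    }
}
If this fails with a similar error, the problem lies with the provider or connection string rather than with NHibernate.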