OracleBulkCopy WriteToServer method not working for large record counts - ODP.NET

We have recently migrated from Oracle 12c to Oracle 19c. Previously, our data access used the DDTek.Oracle provider; the migrated code uses the Oracle.DataAccess (ODP.NET) DLL.
We have a requirement to load file data into the database. To achieve this, we implemented .NET code using the Oracle.DataAccess.Client OracleBulkCopy WriteToServer() functionality against the Oracle 19c database. One file of 2.6 million records throws the error:
"Attempted to read or write protected memory. This is often an indication that other memory is corrupt."
The same code works fine with 2.6 million records against Oracle 12c using the DDTek.Oracle DLL, but not against the migrated Oracle version.
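For reference, the load path looks essentially like the sketch below (a minimal reconstruction, not the exact production code; the table name, timeout, and BatchSize values are illustrative):

using System.Data;
using Oracle.DataAccess.Client;

static void BulkLoad(string connectionString, DataTable rows)
{
    using (var connection = new OracleConnection(connectionString))
    {
        connection.Open();
        using (var bulkCopy = new OracleBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "FILE_DATA"; // hypothetical table name
            bulkCopy.BulkCopyTimeout = 0;                // 0 = no timeout
            bulkCopy.BatchSize = 100000;                 // chunked commits instead of
                                                         // one 2.6M-row write
            bulkCopy.WriteToServer(rows);                // AccessViolationException
                                                         // is thrown here on 19c
        }
    }
}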

Related

Data Migration Assistant

I am using Data Migration Assistant to assess compatibility issues when migrating a SQL database to Azure SQL. After running for a couple of minutes, it throws an error saying "The file contains the XML node type {0}. This type is unsupported or in an unsupported location." I have successfully assessed other databases using DMA, but this particular database always aborts after throwing this error.
I decided to go ahead and migrate the database using the wizard (Deploy Database to Microsoft Azure SQL Database) from SSMS, and ran into several compatibility issues that showed as errors. The database had several triggers, created by a third-party database tool, that referred to table objects with 3- and 4-part naming conventions, which are not supported on Azure SQL. There were several other errors in addition to these, but I decided to delete these triggers first and run Data Migration Assistant again. This time it ran to completion and I got a compatibility report. I am not sure whether it was the sheer number of issues found, or something in the triggers I deleted, that had caused the error.

The data value could not be converted for reasons other than sign mismatch or data overflow

I have recently migrated an application from .NET Framework 2.0 to .NET Framework 4.5. It uses Informix version 3.50 as the database. After migration, when it fetches from the database and the fields contain special characters, it throws the following error:
"The data value could not be converted for reasons other than sign mismatch or data overflow. For example, the data was corrupted in the data store but the row was still retrievable."
I am using a SELECT query to fill a DataSet. When I run the query manually against the database, it returns the results without any issues, but when filling the DataSet in the code I get the error. The non-migrated solution works properly.
Can anyone provide a solution for this? Please let me know if you need any more information.
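For reference, the fetch path is essentially the following (a minimal sketch assuming the Informix ODBC driver; the DSN, credentials, table, and locale values are placeholders, not our real configuration):

using System.Data;
using System.Data.Odbc;

// CLIENT_LOCALE / DB_LOCALE are shown because a client/database locale
// mismatch is a common cause of conversion errors on special characters;
// that it is the cause here is an assumption, not confirmed.
var connectionString = "DSN=myInformixDsn;UID=user;PWD=secret;" +
                       "CLIENT_LOCALE=en_US.CP1252;DB_LOCALE=en_US.819;";

using (var connection = new OdbcConnection(connectionString))
using (var adapter = new OdbcDataAdapter("SELECT * FROM my_table", connection))
{
    var dataSet = new DataSet();
    adapter.Fill(dataSet); // the error surfaces here, not when the query
                           // is run manually against the database
}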
For .NET Framework 4.x driver support, the minimum recommended version of the Informix Client SDK is CSDK 4.10 xC2 or higher; the latest is CSDK 4.10 xC12.
For the error "The data value could not be converted for reasons other than sign mismatch or data overflow": I downloaded the Microsoft Access Database Engine 2010 Redistributable again using the re-install option, and the issue was fixed.

SSIS custom transformation component memory usage

I am moving large volumes of data from Oracle to SQL Server on a daily basis. The application using the Oracle system stores dates in non-standard formats, and these need to be converted to SQL Server dates.
I have created a custom SSIS transformation component to do this. My issue is that, when running, the SSIS packages consume a huge amount of memory, often reaching multiple gigabytes, and the usage keeps ballooning while the package is running.
The issue is with the process "ISServerExec.exe" running on the server. Seen from Task Manager, its memory usage constantly increases while running. Several of these packages need to run at the same time each day, but with this memory ballooning the system can only manage two or three.
I have also followed numerous online examples, and they suffer from the same problem. One example is the simple component from Microsoft that converts a string to uppercase: it consumed over 600 MB of RAM with 6 input columns over 9 million rows.
If I create similar (but simpler) transformations using Derived Columns, these consume less than 100 MB of memory.
I am using SQL Server 2012 SP2 (11.0.5058) and have tested on four separate machines running Windows 7 and Server 2008 R2, each with all updates installed through Windows Update. All programming is done in VB in Visual Studio 2013. Oracle connections use the Attunity source connector.
Is there a command you need to run at the end of the ProcessInput section to flush the RAM, or is this memory usage expected?
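For reference, my ProcessInput override follows the Microsoft uppercase sample closely. A sketch is below, written in C# rather than the VB we actually use (the shape is identical); the column index is illustrative, and the registration attribute is omitted:

using Microsoft.SqlServer.Dts.Pipeline;

public class UppercaseTransform : PipelineComponent
{
    public override void ProcessInput(int inputID, PipelineBuffer buffer)
    {
        // Rows are modified in place in the buffer the engine hands us;
        // nothing is cached across calls, so the component itself holds
        // no per-row state.
        while (buffer.NextRow())
        {
            string value = buffer.GetString(0); // index 0 is illustrative; real
                                                // code maps column indexes in PreExecute
            buffer.SetString(0, value == null ? null : value.ToUpperInvariant());
        }
    }
}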

Issue with System.Data.OracleClient and ODP.Net 11g together used in .net 2.0 web site

In our .NET Framework 2.0-based application we were using System.Data.OracleClient, and we are now migrating to ODP.NET. The project is too large to migrate in one go,
so for now the application uses both providers: System.Data.OracleClient and ODP.NET.
We are also changing our OS from Windows XP 32-bit to Windows 7 64-bit. While doing so we observed the following:
1) A query executes in < 1 sec using System.Data.OracleClient with ODP.NET 10g 64-bit (Oracle.DataAccess.dll version 2.102.2.20),
and the same query executes in < 1 sec in Oracle SQL Developer v1.5.
2) However, the same query takes 2-3 minutes to execute using System.Data.OracleClient with ODP.NET 11g 64-bit (Oracle.DataAccess.dll version 2.112.3.0).
This is a remarkable performance degradation in case 2). We have to use System.Data.OracleClient with ODP.NET 11g 64-bit (Oracle.DataAccess.dll version 2.112.3.0) on Windows 7 64-bit,
but we cannot live with this degradation, and we cannot convert all the code that uses System.Data.OracleClient to ODP.NET quickly.
Can anyone explain why we see such a remarkable performance degradation in case 2), and what we can do to resolve it?
Regards
Sanjib Harchowdhury
Adding the following to your config will send ODP.NET tracing info to a log file:
<oracle.dataaccess.client>
  <settings>
    <add name="TraceFileName" value="c:\temp\odpnet-tests.trc"/>
    <add name="TraceLevel" value="63"/>
  </settings>
</oracle.dataaccess.client>
This will probably only be helpful if you can find a large gap in time. Chances are rows are actually coming in, just at a slower pace.
Try adding "enlist=false" to your connection string. I don't consider this a solution, since it effectively disables distributed transactions, but it should help you isolate the issue. You can get a little more information from an Oracle forums post:
From an ODP perspective, all we can really point out is that the
behavior occurs when OCI_ATR_EXTERNAL_NAME and OCI_ATR_INTERNAL_NAME
are set on the underlying OCI connection (which is what happens when
distrib tx support is enabled).
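If you want to try it quickly, a minimal test harness might look like the following (connection details and the query are placeholders; Enlist=false is the only relevant part):

using System;
using System.Diagnostics;
using Oracle.DataAccess.Client;

// Placeholder connection details. "Enlist=false" stops ODP.NET from enlisting
// in distributed transactions, which is what sets OCI_ATR_EXTERNAL_NAME /
// OCI_ATR_INTERNAL_NAME on the underlying OCI connection.
var connectionString = "Data Source=ORCL;User Id=scott;Password=tiger;Enlist=false";

using (var connection = new OracleConnection(connectionString))
using (var command = new OracleCommand("select /* your slow query */ 1 from dual", connection))
{
    connection.Open();
    var stopwatch = Stopwatch.StartNew();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read()) { } // drain all rows so fetch time is included
    }
    Console.WriteLine(stopwatch.Elapsed); // compare against the default (Enlist=true)
}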
I'd guess what you're not seeing is that the execution plan is actually different between the ODP.NET call and the SQL Developer call (meaning the actual performance hit is occurring on the server). Have your DBA trace the connection and obtain execution plans from both the ODP.NET call and the call straight from SQL Developer (or with the enlist=false parameter).
If you confirm different execution plans, or if you want to take a preemptive shot in the dark, update the statistics on the related tables. In my case this corrected the issue, indicating that execution plan generation doesn't really follow different rules for the different types of connections, but that the cost analysis is just slightly more pessimistic when a distributed transaction might be involved. Query hints to force an execution plan are also an option, but only as a last resort.
Finally, it could be a network issue. If your ODP.NET install is using a fresh Oracle home (which I would expect unless you did some post-install configuring), then the tnsnames.ora could be different. Host names might not be fully qualified, creating more delays resolving the server. I'd only expect the first attempt (and not subsequent attempts) to be slow in this case, so I don't think this is the issue, but I thought it should be mentioned.
Please refer to this link, or just replace the ODP.NET 64-bit component with the ODP.NET 32-bit one; since we are using ASP.NET, we could easily configure our application to run using the 32-bit component on Windows 7 (x64).

Upgrading an old project from NHibernate 1.2 to 3.3

We have an old project, originally written in .NET 2.0 with VS2005, which ended up in VS2008. It uses NHibernate 1.2 for data access. As part of our upgrade we moved to .NET 4.0 and VS2010, but we are having some problems with the move from NHibernate 1.2 to 3.3.
The main problem we are having is querying a table which has a link (association) on it. The query we are running is as follows:
IQuery query = base.Session.CreateSQLQuery("select t from Transaction t inner join Order o where TransactionDate >= ? && TransactionDate <= ? order by TransactionDate desc");
We get 2 different errors: either t.Transaction or t.Orders does not exist in the database. We know these tables exist; I have checked multiple times, and I know there is data in there...
I have seen the question "What to be aware of when upgrading from NHibernate 1.2 to 3.2", and it mentions that we may need to modify our mapping files, but it does not mention what needs to be changed. Is there something that will look at our mapping files and tell us what needs to change? I will admit this is my first time using NHibernate at the lower level (actually talking to the DB); up till this point, all the database work was already "done". It's only now, with the upgrade, that the problems have occurred...
Since CreateSQLQuery, as the name implies, executes raw SQL, the only explanation I can think of is that you are connecting to the wrong database.
Considering you're using ? for the parameter placeholders, I know you're not using SQL Server... so it's probably a DB that requires data source configuration outside of the connection string.
That opens up something I've seen before: 32-bit and 64-bit drivers using different configuration files/registry keys.
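As a side note, part of the confusion may be that CreateSQLQuery expects raw SQL (database table/column names) while CreateQuery expects HQL (mapped entity/property names); the query in the question mixes the two styles. A sketch of both forms, with hypothetical table, column, and entity names:

using System;
using NHibernate;

// "session" is an open ISession from your session factory; the date range
// values are examples.
DateTime fromDate = DateTime.Today.AddDays(-7), toDate = DateTime.Today;

// Raw SQL via CreateSQLQuery: database object names, explicit join condition,
// named parameters. All names here are hypothetical.
var sqlQuery = session.CreateSQLQuery(
        "select t.* from TRANSACTIONS t " +
        "join ORDERS o on o.ORDER_ID = t.ORDER_ID " +
        "where t.TRANSACTION_DATE between :fromDate and :toDate " +
        "order by t.TRANSACTION_DATE desc")
    .AddEntity(typeof(Transaction))
    .SetDateTime("fromDate", fromDate)
    .SetDateTime("toDate", toDate);

// HQL via CreateQuery: mapped entity and property names; no explicit join
// columns are needed when the association is mapped.
var hqlQuery = session.CreateQuery(
        "from Transaction t " +
        "where t.TransactionDate between :fromDate and :toDate " +
        "order by t.TransactionDate desc")
    .SetDateTime("fromDate", fromDate)
    .SetDateTime("toDate", toDate);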