0x8007000E Description: "Out of memory." error - sql

I am trying to load data from DB2 to DB2 using SSIS. There is only a source table, which has 2.4 million records, and there is no transformation between the source and destination tables, but the load stops after 1.6 million records. The error I am getting is:
Error: 0xC0202009 at LOAD TO SATGE_GLMXPF_COMPRESSED, OLE DB
Destination [227]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error
has occurred. Error code: 0x8007000E. An OLE DB record is available.
Source: "Microsoft Cursor Engine" Hresult: 0x8007000E Description:
"Out of memory.". Error: 0xC0047022 at LOAD TO
SATGE_GLMXPF_COMPRESSED, SSIS.Pipeline: SSIS Error Code
DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "OLE
DB Destination" (227) failed with error code 0xC0202009 while
processing input "OLE DB Destination Input" (240). The identified
component returned an error from the ProcessInput method. The error is
specific to the component, but the error is fatal and will cause the
Data Flow task to stop running. There may be error messages posted
before this with more information about the failure. Error: 0xC02090F5
at LOAD TO SATGE_GLMXPF_COMPRESSED, DataReader Source [2]: The
DataReader Source was unable to process the data. Exception from
HRESULT: 0xC0047020 Error: 0xC0047038 at LOAD TO
SATGE_GLMXPF_COMPRESSED, SSIS.Pipeline: SSIS Error Code
DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on DataReader Source
returned error code 0xC02090F5. The component returned a failure code
when the pipeline engine called PrimeOutput(). The meaning of the
failure code is defined by the component, but the error is fatal and
the pipeline stopped executing. There may be error messages posted
before this with more information about the failure.

You need to configure the OLE DB Destination's "Rows per batch" and "Maximum insert commit size" properties carefully, so the load is committed in manageable chunks instead of accumulating in memory.
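For illustration only (these numbers are assumptions to tune against your own memory and transaction-log throughput, not values from the original answer), the destination's fast-load settings might look like:

    Rows per batch:             10000
    Maximum insert commit size: 10000

Committing every 10,000 rows stops the destination from accumulating millions of uncommitted rows in one giant batch.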

The "Out of Memory" error occurs when the memory allocated to your process is overwhelmed by the amount of data that you are trying to process. Here are a few considerations:
Do you actually have enough memory installed on the machine you are running this on? A 32 GB server would not raise an eyebrow these days. If you are testing locally, use a smaller dataset, or install more memory in your development machine; it is not uncommon to have a 16 GB laptop these days.
32-bit or 64-bit? A 32-bit process can typically only address 2 GB of memory, which is quickly exhausted. Try switching to the 64-bit runtime if possible.
Reduce the columns in the data flow. If possible, remove any columns that are not being used; narrower rows make better use of the memory buffers (see the sketch after this list).
Use data providers from IBM, or buy a commercial one, e.g. http://www.cozyroc.com/ssis/db2-destination. The error you got when trying to fast load:
Failed to open a fastload rowset for...
is most likely caused by a provider that does not support bulk loading. Bulk loading moves data much faster, which matters on a memory-constrained system.
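To illustrate the column-reduction point: select only the columns the destination needs in the source query, instead of the whole table. The table and column names here are hypothetical, not taken from the question:

-- Hypothetical source query: listing only the needed columns means each
-- pipeline memory buffer holds more rows than it would with SELECT *.
SELECT
    account_id,
    posting_date,
    amount
FROM GLMXPF_SOURCE;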

Related

SQL Server: resuming an interrupted export

I'm migrating 2 million rows from a SQL Server table over to a Postgres database. I'm using the SQL Server Import/Export Wizard and everything was going fine until I got to 1.5 million records (took all night).
I'm not sure why it crashed (see error below) but I need to get the rest of the records over there. I can append them if it wasn't a resource issue. If it was, then I can export the remaining 500k as a new table and then merge them later in pgAdmin.
My question is: when selecting the remaining 500k rows to finish the job, how do I correctly exclude the 1.5 million records that DID copy? There is no primary key and there are no unique values.
I tried the query below, listed in a post about selecting a range of rows. The problem is I don't know for sure that it's correctly gathering the remaining records. Postgres indicates that 1,574,503 of the 2 million rows were successfully migrated, and stats show the Postgres database is at 495 MB now.
-- Number every row in an arbitrary order, then take the tail of that range.
-- Note: ORDER BY (SELECT NULL) imposes no deterministic order, which is
-- exactly why I'm not sure the numbering matches the rows already copied.
SELECT *
FROM
    (SELECT
         ROW_NUMBER() OVER (ORDER BY (SELECT NULL AS noorder)) AS RowNum,
         *
     FROM invoices) AS alias
WHERE RowNum BETWEEN 1574504 AND 2000000
In case it helps, here is the SQL Server error info from when it crashed:
Error 0xc020844b: Data Flow Task 1: An exception has occurred during data insertion, the message returned from the provider is: The connection has been disabled.
Error 0xc0047022: Data Flow Task 1: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Destination - invoices" (139) failed with error code 0xC020844B while processing input "Destination Input" (142). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
Error 0xc02020c4: Data Flow Task 1: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error 0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on Source - Query returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.

SSIS Conversion Error - Visual Studio 2013

Currently I am working with two databases, migrating information from an operational database to a data warehouse with the well-known Data Vault structure. However, it seems impossible for me to load data from the operational database into the data vault; I keep receiving the same errors over and over again.
I have checked several common causes of errors like this:
The COLLATE setting of both databases is the same;
No change in data types between the operational database and the data warehouse;
Removed and rewrote the queries and databases multiple times.
The error output is in the next code block. Could one of you help me out here?
If needed, I can provide a screenshot of the Visual Studio error screen. I have tried googling and searching for this particular problem everywhere, but I cannot seem to get it fixed.
Error: 0xC0202009 at Load SATT, SATT_DELIVERY_PRICE [439]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
Error: 0xC0202009 at Load SATT, SATT_DISTRICT [461]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
Error: 0xC0209029 at Load SATT, SATT_DISTRICT [461]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "SATT_DISTRICT.Inputs[OLE DB Destination Input]" failed because error code 0xC020907B occurred, and the error row disposition on "SATT_DISTRICT.Inputs[OLE DB Destination Input]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047022 at Load SATT, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "SATT_DISTRICT" (461) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (474). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
Error: 0xC0202009 at Load SATT, SATT_EMPLOYEE [483]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
Error: 0xC02020C4 at Load SATT, CONSIGNMENT [2]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error: 0xC0209029 at Load SATT, SATT_EMPLOYEE [483]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "SATT_EMPLOYEE.Inputs[OLE DB Destination Input]" failed because error code 0xC020907B occurred, and the error row disposition on "SATT_EMPLOYEE.Inputs[OLE DB Destination Input]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047038 at Load SATT, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on CONSIGNMENT returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
Error: 0xC0047022 at Load SATT, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "SATT_EMPLOYEE" (483) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (496). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
Error: 0xC0202009 at Load SATT, SATT_CUSTOMER [403]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
Error: 0xC0209029 at Load SATT, SATT_CUSTOMER [403]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "SATT_CUSTOMER.Inputs[OLE DB Destination Input]" failed because error code 0xC020907B occurred, and the error row disposition on "SATT_CUSTOMER.Inputs[OLE DB Destination Input]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047022 at Load SATT, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "SATT_CUSTOMER" (403) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (416). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
Error: 0xC0209029 at Load SATT, SATT_DELIVERY_PRICE [439]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "SATT_DELIVERY_PRICE.Inputs[OLE DB Destination Input]" failed because error code 0xC020907B occurred, and the error row disposition on "SATT_DELIVERY_PRICE.Inputs[OLE DB Destination Input]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047022 at Load SATT, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "SATT_DELIVERY_PRICE" (439) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (452). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
This is why there is no substitute for understanding the meaning of your data and the meaning of error codes. You are trying to put data into a column where you have no data in the row for that field. This could be a problem of database design, where the first database allows nulls and the one you are inserting into does not; or an incorrect query that, through a bad join or WHERE condition, is not passing the right data to the other database; or incorrect SSIS package design that causes the data not to populate the way the business rules say it should. We can't tell which from here; only someone with expertise in your system can.
I can give you an example I ran across recently where this type of thing happened. A package that had been running correctly for months (and for multiple clients) suddenly stopped working for one client, with an error that the PK was null. This made no sense at all, since the PK was generated in one table and passed to two related staging tables in an earlier step, and it was a required field in the staging tables. The problem turned out to be that one of the tables had no record matching the data in the other, and the data set for the failing table took the key from the table without the matching record rather than the one that had a value. In this case the SSIS package was at fault and needed to be fixed.
Another way this happens is when the column is not a PK in the originating table and allows nulls. In that case you need to fix the data, and you should fix the SSIS package to send that sort of row to an exception table; you should have been doing that as a matter of course anyway.
It is far easier to troubleshoot when you send failed rows to an exception table. It also means you can process the rest of the data if your business rules allow it. However, be careful about sending data to an exception table and never trying to fix those exceptions: if you have an exception table, someone should review it and try to fix the underlying problems at least weekly.
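As a minimal sketch of the exception-table idea (the table and column names are hypothetical, not from the original package): make every data column nullable so any bad row can land there, and keep the SSIS error metadata alongside it.

-- Hypothetical exception table for rows rejected by a destination.
-- ErrorCode and ErrorColumn come from the SSIS error output; the
-- remaining columns mirror the row being loaded, all nullable.
CREATE TABLE dbo.SATT_CUSTOMER_Exceptions (
    ExceptionId  INT IDENTITY(1,1) PRIMARY KEY,
    LoadedAt     DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME(),
    ErrorCode    INT           NULL,
    ErrorColumn  INT           NULL,
    CustomerId   INT           NULL,
    CustomerName NVARCHAR(200) NULL
);

Set the destination's error row disposition to redirect rows to this table instead of "failure on error", and the rest of the load can keep going while someone reviews the rejects.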
Or you could be pulling data from a query that needs adjustment. Suppose all types of sales had always included the customer id. Then the application was adjusted to allow internal sales, and in a fit of stupidity someone decided that internal customers did not have to have customer ids. Suddenly a field that was always filled in is occasionally empty. Application devs rarely consider other things, such as data warehousing and reporting, that might be affected by their decisions, and they often don't notify the BI devs who work on these things when they make changes. So looking for what might have changed in the underlying data model is always a possibility to consider. In this case, perhaps the internal orders are not needed for the data warehouse; or, if they are, perhaps a fake customer id like -1 can be attached in the data flow if the underlying model cannot be changed (a sketch follows).
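A minimal sketch of that fake-id substitution, done in the source query (table and column names are hypothetical):

-- Substitute a sentinel customer id of -1 when internal sales have none,
-- so a NOT NULL destination column never receives a null.
SELECT
    s.sale_id,
    COALESCE(s.customer_id, -1) AS customer_id,  -- -1 = internal/unknown
    s.sale_amount
FROM dbo.Sales AS s;

The same substitution can be done inside the data flow with a Derived Column transformation if the source query can't be touched.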
The biggest problem I see with your package, however, is that you apparently don't understand what the errors mean (or even how to pull them out of a mess of information), or you would have known all this already. You need to start looking at the errors you get and learning through Google what they mean. Then you need to apply that meaning to the way your particular database and SSIS package are designed.
You need to develop a far better way to troubleshoot, so set up your SSIS packages with logging, exception tables, and whatever else you will need to capture the problem when it occurs.
There is no such thing as an SSIS package that works perfectly 100% of the time. Data models change, people start entering data differently than they used to, and one bad data-entry person can break a working SSIS package by putting in values no one ever considered. Or, worse, the data can be in error and continue on to the next location blindly.
Error handling, and trapping of the data being sent through the process, is something every SSIS package should have, because you will need it. You can't design an SSIS package on the assumption that everything will remain the same. You need to know that it will fail when it should, that it will reject bad data, and that it will recognize when unexpected data shows up. Things will change over time, and an SSIS package needs to be able to throw up a smoke signal when they do.
I have packages that fail because there are too few records in the import file. I have packages that send data to the exception table when I was expecting an integer and got text, and on and on. Most of my packages spend far more tasks on fixing data, sending unfixable data to the exception table, and handling errors than they do on the actual movement of data. This is one reason I never design a package without staging tables: far better to fail before you reach the final destination than to have to roll back a bunch of production tables (see the sketch below). If you are not doing these things, you need to start.
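A minimal sketch of that staging pattern (all names hypothetical): land the raw rows in a staging table, divert the rows that fail validation to the exception table, and only then load production.

-- Assume SSIS has landed the raw rows in dbo.Stage_Sales, whose columns
-- are all nullable so nothing is rejected on the way in.

-- Divert rows that fail the business rules to the exception table.
INSERT INTO dbo.Sales_Exceptions (sale_id, customer_id, sale_amount)
SELECT sale_id, customer_id, sale_amount
FROM dbo.Stage_Sales
WHERE customer_id IS NULL;              -- hypothetical validation rule

-- Load only the clean rows into the production table.
INSERT INTO dbo.Sales (sale_id, customer_id, sale_amount)
SELECT sale_id, customer_id, sale_amount
FROM dbo.Stage_Sales
WHERE customer_id IS NOT NULL;

A failure here rolls back one staging step, not a set of production tables.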

Access 2007 Attachment type to SQL Server

My predecessor created an Access database in 2007, which has a number of tables, forms and queries.
One of the tables holds images, which are stored as attachments.
The database is almost 2GB in size (compacting and repairing doesn't change it). So I'd like to convert it to a SQL Server 2014 database.
I've used the Data Import tool, which copied the tables and data into SQL Server, but the attachments are just converted to a string containing their file name (image1.jpg).
Is there any way to import the images from the MS Access database into SQL Server?
I also have a Sharepoint 2010 server, would this be a better option?
Update (sorry, new to Stack Overflow; not sure what to put where)
I've changed the field mapping in the Import Wizard to IMAGE, and I get the following error. If I set it to ignore, the fields are blank; using VARSTRING puts either NULL or 0x in the field.
Executing (Error) Messages Error 0xc02020c5: Data Flow Task 1: Data conversion failed while converting column "Photos" (23) to column
"Photos" (57). The conversion returned status value 2 and status text
"The value could not be converted because of a potential loss of
data.". (SQL Server Import and Export Wizard)
Error 0xc0209029: Data Flow Task 1: SSIS Error Code
DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "Data Conversion 0 -
0.Outputs[Data Conversion Output].Columns[Photos]" failed because error code 0xC020907F occurred, and the error row disposition on "Data
Conversion 0 - 0.Outputs[Data Conversion Output].Columns[Photos]"
specifies failure on error. An error occurred on the specified object
of the specified component. There may be error messages posted before
this with more information about the failure. (SQL Server Import and
Export Wizard)
Error 0xc0047022: Data Flow Task 1: SSIS Error Code
DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Data
Conversion 0 - 0" (49) failed with error code 0xC0209029 while
processing input "Data Conversion Input" (50). The identified
component returned an error from the ProcessInput method. The error is
specific to the component, but the error is fatal and will cause the
Data Flow task to stop running. There may be error messages posted
before this with more information about the failure. (SQL Server
Import and Export Wizard) Error 0xc02020c4: Data Flow Task 1: The
attempt to add a row to the Data Flow task buffer failed with error
code 0xC0047020. (SQL Server Import and Export Wizard) Error
0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.
The PrimeOutput method on Source - Attachments returned error code
0xC02020C4. The component returned a failure code when the pipeline
engine called PrimeOutput(). The meaning of the failure code is
defined by the component, but the error is fatal and the pipeline
stopped executing. There may be error messages posted before this
with more information about the failure. (SQL Server Import and
Export Wizard)
You've used SQL Server to import the data, which is a good thing. I would probably use the IMAGE data type for the column that holds the images, although IMAGE is deprecated, so you could (and probably should) try VARBINARY(MAX) instead.
Use the Import and Export Wizard to import the data and see which data type works.
Hope this helps you out.
Edit:
You say that when you convert the attachment table you only get the names of the images. Are those names related to the actual image files? If you still have the images, you can store them in a folder and use FILESTREAM to connect to them from SQL Server.
Using FILESTREAM keeps the database lean, because the image data is no longer saved inside the database, only a reference to it.
Unfortunately, SQL Server doesn't support multi-valued or attachment-like fields.
For more information about FILESTREAM look at this website: http://msdn.microsoft.com/library/hh461480
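A minimal sketch of a FILESTREAM-backed table (this assumes FILESTREAM is already enabled on the instance and the database has a FILESTREAM filegroup; the names are hypothetical):

-- FILESTREAM stores the binary data on the file system while keeping it
-- queryable as a column. The table must have a UNIQUE ROWGUIDCOL column.
CREATE TABLE dbo.Photos (
    PhotoId  UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(),
    FileName NVARCHAR(260)  NOT NULL,
    Photo    VARBINARY(MAX) FILESTREAM NULL
);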

SQL Server 2005 SSIS Buffer Overflow

I'm trying to import a UTF-8 encoded file (not sure if this is relevant) containing 10 columns; the delimiter is þ (thorn). I've added a flat file source, set up all the columns as usual, and set them to DT_WSTR.
However, it fails to read the file, and I keep getting the [advertiser-id] buffer overflow error below. Advertiser ID is the first column in the file. I've checked the file and all the data is correct, with no missing delimiters or columns anywhere. Strangely, though, it works if there are 1193 rows of data or fewer, regardless of which rows from the file they are.
Any ideas why it works for 1193 rows but no more? I've tried setting the maximum rows per buffer to both small and large numbers, but I still get the same message.
SSIS package "pck_NetsightV3_Import_Doubleclick.dtsx" starting.
Information: 0x4004300A at Transfer Creative Match Table, DTS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at Transfer Creative Match Table, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Transfer Creative Match Table, DTS.Pipeline: Pre-Execute phase is beginning.
Information: 0x402090DC at Transfer Creative Match Table, Flat File Source [1]: The processing of file "\\man2\systems\Team\datafiles\netsight\doubleclick\latest\doubleclick_matchtables_creative.csv" has started.
Information: 0x4004300C at Transfer Creative Match Table, DTS.Pipeline: Execute phase is beginning.
Error: 0xC020209C at Transfer Creative Match Table, Flat File Source [1]: The column data for column "Advertiser-ID" overflowed the disk I/O buffer.
Error: 0xC0202091 at Transfer Creative Match Table, Flat File Source [1]: An error occurred while skipping data rows.
Error: 0xC0047038 at Transfer Creative Match Table: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "Flat File Source" (1) returned error code 0xC0202091. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
Error: 0xC0047021 at Transfer Creative Match Table: SSIS Error Code DTS_E_THREADFAILED. Thread "SourceThread0" has exited with error code 0xC0047038. There may be error messages posted before this with more information on why the thread has exited.
Error: 0xC0047039 at Transfer Creative Match Table: SSIS Error Code DTS_E_THREADCANCELLED. Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown. There may be error messages posted before this with more information on why the thread was cancelled.
Error: 0xC0047021 at Transfer Creative Match Table: SSIS Error Code DTS_E_THREADFAILED. Thread "WorkThread0" has exited with error code 0xC0047039. There may be error messages posted before this with more information on why the thread has exited.
Information: 0x40043008 at Transfer Creative Match Table, DTS.Pipeline: Post Execute phase is beginning.
Information: 0x402090DD at Transfer Creative Match Table, Flat File Source [1]: The processing of file "\\man2\systems\Team\datafiles\netsight\doubleclick\latest\doubleclick_matchtables_creative.csv" has ended.
Information: 0x402090DF at Transfer Creative Match Table, OLE DB Destination [156]: The final commit for the data insertion has started.
Information: 0x402090E0 at Transfer Creative Match Table, OLE DB Destination [156]: The final commit for the data insertion has ended.
Information: 0x40043009 at Transfer Creative Match Table, DTS.Pipeline: Cleanup phase is beginning.
Information: 0x4004300B at Transfer Creative Match Table, DTS.Pipeline: "component "OLE DB Destination" (156)" wrote 0 rows.
Task failed: Transfer Creative Match Table
SSIS package "pck_NetsightV3_Import_Doubleclick.dtsx" finished: Success.
The program '[7916] pck_NetsightV3_Import_Doubleclick.dtsx: DTS' has exited with code 0 (0x0).

Issue exporting from SQL Server 2012 to PostgreSQL

I am trying to export a SQL Server 2012 database into a PostgreSQL 9.3 database, but I am getting a weird conversion error on float columns. If the import skips the float columns, everything goes well (about 300k rows); otherwise I get the following:
- Copy in "TABLE_TO_COPY" (Error)
Messages
Error 0xc020844b: Data Flow Task 1: An exception has occurred during data insertion, the message returned from the provider is: Unable to cast object of type 'System.Double' to type 'System.Char[]'.
(SQL Server Import and Export Wizard)
Error 0xc0047022: Data Flow Task 1: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Destination - TABLE_TO_COPY" (94) failed with error code 0xC020844B while processing input "Destination Input" (97). The identified component returned an error from the ProcessInput method. [...]
Error 0xc02020c4: Data Flow Task 1: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020. [...]
(SQL Server Import and Export Wizard)
Error 0xc0047038: Data Flow Task 1: Error Code SSIS DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on Origin - PATIENT_DIALYSIS_SYMPTOM returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
It's not a memory issue, since the server settings have the highest possible value. What could cause this?
The issue is only present in SQL Server Management Studio's Import/Export Wizard; building the same transfer as an SSIS package does not have the problem.
This is the configuration I have set up for my Data Flow Task:
Source: OLE DB pointing to my SQL Server database table
Destination: ODBC connection pointing to my PostgreSQL table
It doesn't look like SSIS is able to create the destination tables in the target database, so I had to create them before setting up the project.
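Creating the destination table up front also lets you make sure the float columns land in a numeric type; the "Unable to cast object of type 'System.Double' to type 'System.Char[]'" message suggests they were being mapped to a character column. A hypothetical sketch (the real columns are not shown above):

-- Hypothetical PostgreSQL destination table, created before running the
-- package. SQL Server float corresponds to PostgreSQL double precision;
-- mapping a float to a char/varchar column is what produces the
-- System.Double -> System.Char[] cast error.
CREATE TABLE patient_dialysis_symptom (
    patient_id    integer,
    symptom_name  varchar(100),
    symptom_score double precision  -- not char/varchar
);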