Why doesn't the SSIS OLE DB Command transformation insert all the records into a table?

I have an SSIS package that takes data from tables in a SQL database and inserts rows (or updates existing ones) in a table that is in another database.
Here is my problem: after the lookup, I either insert or update the rows, but over half of the rows that go down the insert path are not added to the table.
For the insert, I am using an OLE DB Command object with an INSERT statement that I have tested. I found that the package runs without any error notification but still does not insert all of the rows into the table.
I have checked in SQL Profiler and it shows the command as RPC:Completed, which I assume means it supposedly worked.
If I do the insert manually in SQL Server Management Studio with the data SQL Profiler gives me (the values it uses to execute the insert statement with), it works. I have checked the data and everything seems fine (no illegal data in the rows that are not inserted).
I am totally lost as to how to fix this; does anyone have an idea?
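For reference, the statement inside an OLE DB Command transformation is normally written with ? parameter markers that get bound to input columns in the transformation's column mappings; the table and column names below are placeholders, since the question doesn't give the real schema.

-- Sketch of the per-row statement an OLE DB Command typically runs;
-- dbo.DestTable, Id, Name and Amount are made-up names for illustration.
INSERT INTO dbo.DestTable (Id, Name, Amount)
VALUES (?, ?, ?);   -- each ? is mapped to an input column of the transformation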

Any specific reason to use OLE DB Command instead of OLE DB Destination to insert the records?
EDIT 1:
So, you are seeing x rows (say 100) sent from the Lookup transformation's match output to the OLE DB destination, but only y rows (say 60) are being inserted. Is that correct? Instead of inserting into your actual destination table, try inserting into a dummy table to see whether all the rows are being redirected correctly. You can create a dummy table by clicking the New... button on the OLE DB destination; it will create a table for you matching the input columns. That might help narrow down the issue.
EDIT 2:
What is the name of the table that you are trying to use? I don't think it matters; I am just curious whether the name is a reserved keyword. One other thing I can think of: are there any other processes that might trigger some action on your destination table (either from within the package or outside of it)? I suspect that some other process might be deleting rows from the table.
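If you want to confirm the deleting-process theory, a minimal sketch (assuming you can add objects to the destination database; the table and column names are placeholders) is an AFTER DELETE trigger that records every delete:

-- Hypothetical audit table and trigger; adjust names and columns to your schema.
CREATE TABLE dbo.DestTable_DeleteAudit
(
    DeletedAt datetime NOT NULL DEFAULT GETDATE(),
    DeletedId int      NULL
);
GO
CREATE TRIGGER trg_DestTable_AuditDelete ON dbo.DestTable
AFTER DELETE
AS
BEGIN
    -- Log the key of every row removed, and when it happened.
    INSERT INTO dbo.DestTable_DeleteAudit (DeletedId)
    SELECT id FROM deleted;
END;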

Related

Copy rows from one database table to another database table with the same structure, using a WHERE condition

Well, I would like to copy an entire table that has about 400,000,000 rows in it, filtered by a WHERE condition, to another table in another database with the same structure.
I left this command executing overnight:
INSERT INTO db2.tb1 SELECT * from db1.tb1 where column=xxx;
But nothing has been copied since then. What am I doing wrong?
EDIT
When I isolate the execution this is what happens:
EDIT2
Setting autocommit to 1:
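For reference, a quick way to check whether the statement is actually still doing work (a sketch, assuming MySQL, which the autocommit setting and the db2.tb1 cross-database syntax suggest):

-- From a second session, while the copy is running:
SHOW PROCESSLIST;               -- confirms the INSERT ... SELECT is still executing, and for how long
SELECT COUNT(*) FROM db2.tb1;   -- stays at 0 until the copy's transaction commits

-- The whole INSERT ... SELECT is a single transaction, so with autocommit = 0
-- nothing is kept unless it is explicitly committed afterwards:
SET autocommit = 1;             -- or issue COMMIT; once the INSERT finishes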

Data Agent - SELECT from one table and insert into another

Is there any type of product where I can write a SQL statement to select from one table and then insert into another database? (The other database is out in the cloud.) It also needs to be able to check whether a record already exists and update the row if anything has changed. Then it needs to run every 10-30 minutes to check what has changed or whether new records have been added.
The source database and the destination database have different schemas (if that matters?). I've been looking, but it seems the only products out there are ones that will just copy one table and insert into a table with the same schema.
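For the exists-then-update logic itself, once the source rows have been pulled across (for example into a staging table on the destination, and assuming the destination supports T-SQL MERGE), a MERGE is one way to express it; all names below are invented for illustration, not from the question:

-- Sketch of the insert-or-update step; dbo.Staging and dbo.Target are placeholder names.
MERGE dbo.Target AS t
USING dbo.Staging AS s
    ON t.Id = s.Id
WHEN MATCHED AND t.Name <> s.Name THEN
    UPDATE SET t.Name = s.Name                  -- row exists and something changed: update it
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, Name) VALUES (s.Id, s.Name);    -- row is new: insert it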

Load data from excel file into a temporary table

I needed to load 100,000 rows of data from an Excel file into a temporary table that I created using "on commit preserve rows". But somehow the most efficient methods did not seem to populate the temporary table, apparently due to session issues.
I used Toad to Import Table Data and it showed that x amount of records were imported. But when I selected from the temp table, it was empty. Then I generated a bunch of insert scripts, saved them in a notepad.sql file, called it from the Toad editor using #/script/location/notepad.sql, and hit F5. It ran and showed how many records were inserted. Again, the temp table was somehow still empty. So I decided to run a random insert script manually in the editor, and that row showed up in the temp table. I believe the methods that didn't work are not running in the same session?
I haven't tried SQLLDR, but I am assuming it will not work, judging from the methods I tried. Can someone confirm? I can't access SQLLDR, so I won't know.
Is there any way to get this to work? I can't run the insert scripts manually; that would be time consuming, and Toad can't take that many scripts at the same time.
Oracle temp tables created with ON COMMIT PRESERVE ROWS are session-specific, so the data put into them is only visible within a single session, and for the duration of that session. Toad may be creating a separate session for each window and thus data which is populated from one window/session isn't visible from another window/session. The fact that you can run an insert script and then select the data back suggests this may be the case if both operations were done from the same window. I expect you'd see the same behavior if you used SQL*Loader to load the tables because the load would run in one session and the data would be discarded when the session terminated. Best of luck.
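To illustrate the session behaviour described above, here is a small sketch (the table and column names are made up):

-- Session-scoped temp table: rows survive COMMIT but disappear when the session ends,
-- and are never visible to other sessions.
CREATE GLOBAL TEMPORARY TABLE tmp_load
(
    id   NUMBER,
    name VARCHAR2(50)
) ON COMMIT PRESERVE ROWS;

-- Session A (one Toad window):
INSERT INTO tmp_load VALUES (1, 'test');
COMMIT;
SELECT COUNT(*) FROM tmp_load;   -- 1: the row is still there after COMMIT

-- Session B (another Toad window, or a SQL*Loader run):
SELECT COUNT(*) FROM tmp_load;   -- 0: Session A's rows are not visible here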

OleDB Destination executes full rollback on error, Bulk Insert Task doesn't

I'm using SSIS and BIDS to process a text file that contains lots (millions) of records. I decided to use the Bulk Insert Task and it worked great, but then the destination table needed an additional column with a default value on the insert operation, and the Bulk Insert Task stopped working. After that, I decided to use a Derived Column with the default value and an OleDB Destination to insert the bulk data. That solved my last problem but created a new one: if there is an error when inserting the data in the OleDB Destination, it executes a full rollback and no rows are added to my table, whereas when I used the Bulk Insert Task, there were rows left, based on the BatchSize configuration. Let me explain it with a sample:
I used a text file with 5000 lines. The file contained a duplicate id (intentionally) between rows 3000 and 4000.
Before starting the DTS, the destination table was totally empty.
Using the Bulk Insert Task, after the error was raised (and the DTS stopped), the destination table had 3000 rows. I had set the BatchSize attribute to 1000.
Using the OleDB Destination, after the error was raised, the destination table had 0 rows! I had set the Rows per batch attribute to 1000 and the Maximum insert commit size to its maximum value, 2147483647. I also tried changing the latter to 0, with no effect.
Is this the normal behavior of the OleDB Destination? Can someone provide me a guide on working with these tasks? Should I forget about these tasks and use BULK INSERT from T-SQL instead?
As a side note, I also tried following the instructions for KEEPNULLS in Keep Nulls or Use Default Values During Bulk Import (SQL Server) so I could avoid the OleDB Destination task, but it didn't work (maybe it's just me).
EDIT: Additional info about the problem.
Table structure (sample)
Table T
id int, name varchar(50), processed int default 0
CSV File (sample)
1, hello
2, world
There is no rolling back on bulk inserts; that's why they are fast.
Take a look at using format files:
http://msdn.microsoft.com/en-us/library/ms179250.aspx
You could potentially place this in a transaction in SSIS (you'll need MSDTC running), or you could create a T-SQL script with a try-catch to handle any exceptions from the bulk insert (probably just roll back or commit).
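A rough sketch of that T-SQL route, assuming the sample table above, a placeholder file path, and a format file that maps only the id and name columns so the processed column falls back to its default:

BEGIN TRY
    BULK INSERT dbo.T
    FROM 'C:\data\input.csv'                 -- placeholder path
    WITH (FORMATFILE = 'C:\data\input.fmt',  -- placeholder; maps only id and name
          BATCHSIZE  = 1000);                -- commits every 1000 rows, like the Bulk Insert Task
END TRY
BEGIN CATCH
    -- Batches that committed before the failure stay in the table;
    -- decide here whether to delete them or keep them, then report the error.
    SELECT ERROR_NUMBER() AS ErrorNumber, ERROR_MESSAGE() AS ErrorMessage;
END CATCH;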

SQL Server: how to get the last inserted data?

I ran a large query (~30 MB) which inserts data into ~20 tables. Accidentally, I selected the wrong database. There are only 2 tables with the same name but with different columns. Now I want to make sure that no data was inserted into this database; I just don't know how.
If your table has a timestamp column, you can test for that.
Also, SQL Server keeps a log of all transactions.
See: https://web.archive.org/web/20080215075500/http://sqlserver2000.databases.aspfaq.com/how-do-i-recover-data-from-sql-server-s-log-files.html
This will show you how to examine the log to see if any inserts happened.
The best option is to go for a trigger. Use a trigger to capture the database name, the table name, and the full history of records manipulated.
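A minimal sketch of that idea (the audit table and the monitored table dbo.SomeTable are placeholder names; it only catches inserts that happen after the trigger exists, so it is a going-forward safeguard rather than a way to inspect the accidental run):

-- Hypothetical audit table; add one trigger like this per table you want to watch.
CREATE TABLE dbo.InsertAudit
(
    AuditId      int IDENTITY(1,1) PRIMARY KEY,
    TableName    sysname  NOT NULL,
    RowsInserted int      NOT NULL,
    InsertedAt   datetime NOT NULL DEFAULT GETDATE()
);
GO
CREATE TRIGGER trg_SomeTable_AuditInsert ON dbo.SomeTable
AFTER INSERT
AS
BEGIN
    -- Record that an insert happened, on which table, and how many rows it added.
    INSERT INTO dbo.InsertAudit (TableName, RowsInserted)
    SELECT 'dbo.SomeTable', COUNT(*) FROM inserted;
END;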