I need to copy data from one database to another, but while copying I need the ID values from the source database to be retained. Is there a way to do this? I tried copying the data, but the IDs get assigned new values, causing my queries/reports to function incorrectly.
Help is greatly appreciated.
Thanks
Should have thought of this earlier. I ended up copying the tables into my current database and then ran an INSERT INTO table SELECT * statement, which retained the ID numbers when importing.
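In case it helps others: if this is SQL Server (the thread doesn't say) and the ID column is an identity column, you also need IDENTITY_INSERT switched on for the explicit IDs to be accepted. A minimal sketch, with hypothetical table and column names:

-- Allow explicit values to be inserted into the identity column.
SET IDENTITY_INSERT dbo.MyTable ON;

-- An explicit column list is required while IDENTITY_INSERT is ON.
INSERT INTO dbo.MyTable (Id, Col1, Col2)
SELECT Id, Col1, Col2
FROM SourceDb.dbo.MyTable;

SET IDENTITY_INSERT dbo.MyTable OFF;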
I am trying to write data to an Azure SQL DB with Azure Data Factory. I'm using a Copy Data task inside a ForEach that loops through all the rows in an ETL table in the DB. In the pre-copy script, I have
TRUNCATE TABLE [master].[dbo].[#{item().DestinationObjectName}]
DestinationObjectName is the name of the table being loaded, read from the ETL table. The problem I'm having is that for some of the tables (not all; some work perfectly fine) I get the error 'Cannot find the object % because it does not exist or you do not have permissions'. The account I'm using has all the necessary privileges. I am able to see the script that ADF sends; I have copied it into the DB and confirmed it works for some tables but not others. If I run a SELECT TOP 1000 on the table in question and swap that object in for the one in the TRUNCATE TABLE script, it works. I'm really at a loss here. Like I said, the TRUNCATE works for the majority of the tables, but not all, and I have double-checked that the object names are exactly the same.
Any help is appreciated.
This issue has been solved. I had to drop the affected tables and recreate them without the brackets surrounding each name in the CREATE TABLE statements. Very strange issue.
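For the record, that symptom is consistent with the brackets having become part of the table names themselves (e.g. if the CREATE TABLE statements escaped or quoted the already-bracketed names). A minimal sketch of how that can happen, using a hypothetical table name:

-- Doubling the closing bracket escapes it, so this creates a table
-- literally named [MyTable], brackets included.
CREATE TABLE dbo.[[MyTable]]] (Id INT);

TRUNCATE TABLE dbo.[MyTable];    -- fails: no object named MyTable exists
TRUNCATE TABLE dbo.[[MyTable]]]; -- succeeds: matches the literal name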
I'm having issues using sysproc.admin_cmd, which I thought we should use, as I get error SQL3001. It says I should put the file location of the export into the bcufenc folder, but I can't find it, maybe because I am running the Data Studio client on my machine.
I was hoping to be pointed in the right direction on this, or toward another way to export/unload and import/load.
I just want to transfer data from one table to a table with the same structure in another database or schema. The columns are the same; I just need to transfer the data.
Thank you in advance!
If the source and target tables are in the same Db2 database, consider using the LOAD FROM (<query>) variant of ADMIN_CMD:
CALL SYSPROC.ADMIN_CMD('LOAD FROM (SELECT * FROM T1) OF CURSOR INSERT INTO T2')
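If the tables are in different databases, the EXPORT/IMPORT commands work through ADMIN_CMD as well. A minimal sketch with a hypothetical server-side path; note that ADMIN_CMD executes on the database server, so the file path must exist on the server, not on your Data Studio client machine (a frequent cause of SQL3001-type file errors):

-- Connected to the source database: export to a file on the SERVER.
CALL SYSPROC.ADMIN_CMD('EXPORT TO /tmp/t1.del OF DEL SELECT * FROM T1')

-- Connected to the target database: load the file into the matching table.
CALL SYSPROC.ADMIN_CMD('IMPORT FROM /tmp/t1.del OF DEL INSERT INTO T2')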
I have about 100,000+ delimited text files (they don't all have the same number of columns; e.g. some files have 10 columns, some have 20, and so on). I need to upload all of them to SQL Server. Please suggest how I can do this.
I also have an Excel spreadsheet listing the names/paths where the files are stored and the number of columns in each text file. I am clueless about how to go about it.
Thanks in advance
I assume you are able to use C# (or another programming language) to create an app that will help you complete the task. The program should do the following:
Run through all the files and determine all the columns you need.
Create a table on SQL Server with the columns the program found. Set datatype varchar(max) for each column.
Run through all the files again and insert the data into the table. There are two ways you can go (a SQL-side sketch follows this list):
a) Insert data row by row. Pretty slow, but simple.
b) If you use C#, you can implement your own DataReader and use the SqlBulkCopy class to bulk insert data into the table.
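For the SQL side of steps 2 and 3, here is a minimal sketch with hypothetical names; BULK INSERT is a server-side alternative to SqlBulkCopy and assumes the files are readable from the SQL Server machine:

-- Step 2: one wide table sized to the widest file, every column
-- varchar(max) so any file's values will fit.
CREATE TABLE dbo.RawImport (
    Col01 VARCHAR(MAX),
    Col02 VARCHAR(MAX),
    Col03 VARCHAR(MAX)  -- ... continue up to the maximum column count
);

-- Step 3: load one file; repeat per file, driven by the spreadsheet of
-- paths. Files with fewer columns than the table may need a per-file
-- format file or a staging table of matching width.
BULK INSERT dbo.RawImport
FROM 'C:\data\file0001.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');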
Here are some links that may help you:
http://www.sqlteam.com/article/use-sqlbulkcopy-to-quickly-load-data-from-your-client-to-sql-server
http://www.michaelbowersox.com/2011/12/22/using-a-custom-idatareader-to-stream-data-into-a-database/
http://daniel.wertheim.se/2010/11/10/c-custom-datareader-for-sqlbulkcopy/
I have one Excel file that I want to import into two different tables, tblUni and tblUser.
I have a third table that contains the IDs from the other two tables:
tblUni_Students
Id
UniId
StudentId
What I need is, when I import the Excel data into the first two tables, for each record the newly created IDs also to be inserted into the tblUni_Students table.
Using SSIS, I have managed to import the data into the two SQL destinations but cannot seem to then take the new IDs from these destinations and insert them into the lookup table.
Can anyone advise please. Thanks.
It's a bit difficult to answer without knowing the target database or the structure of the data, but generally speaking this would be much better done by adding the data to a "load" table, i.e. one whose sole purpose is to temporarily hold data while you process it. You would then update the tblUser, tblUni and tblUni_Students tables from the load area using SQL statements, either via a stored procedure or via an Execute SQL Task component.
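A minimal sketch of that load-table route, assuming SQL Server, a hypothetical load table dbo.LoadData(UniName, StudentName), identity primary keys named Id, and hypothetical Name columns on the target tables. MERGE ... OUTPUT is used because, unlike a plain INSERT ... OUTPUT, it can return source columns alongside the newly generated identity values:

-- Capture each new UniId together with its natural key.
DECLARE @UniMap TABLE (UniId INT, UniName NVARCHAR(200));

MERGE dbo.tblUni AS tgt
USING (SELECT DISTINCT UniName FROM dbo.LoadData) AS src
    ON 1 = 0                        -- never matches, so every row inserts
WHEN NOT MATCHED THEN
    INSERT (Name) VALUES (src.UniName)
OUTPUT inserted.Id, src.UniName INTO @UniMap (UniId, UniName);

-- After repeating the pattern for tblUser, join the maps back to the
-- load table to fill the lookup table with both generated IDs.
INSERT INTO dbo.tblUni_Students (UniId, StudentId)
SELECT m.UniId, u.Id
FROM dbo.LoadData ld
JOIN @UniMap m     ON m.UniName = ld.UniName
JOIN dbo.tblUser u ON u.Name    = ld.StudentName;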
You'd do it with an OLE DB Command component, where the command inserts the values into the table. In the same component you'd output the generated identity: assign it to a new column in the output, and now you have all your data plus the generated identity in the data flow.
This will be processed one row at a time, so it will be slow. Personally I'd put it in a staging table and do it as Ciarán described.
I have two tables, one called Staging and another called Report. All processing happens in Staging, and upon completion of that process I have to copy all the records to Report.
The Staging table contains millions of records, so I just want to know the fastest way to copy this data to Report.
The three options I know of are:
INSERT INTO
SELECT INTO
Creating a package and executing it via a job.
Any help in this regard is much appreciated.
Another option is BCP out (queryout) and then BCP in / BULK INSERT.
You can also use the Bulk Insert Task in SSIS.
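A minimal sketch of that route, with hypothetical server, path and table names; bcp runs from the command line, the load runs in T-SQL:

-- Export with bcp (native format, trusted connection):
--   bcp "SELECT * FROM StagingDb.dbo.Staging" queryout C:\temp\staging.dat -n -T -S MyServer
-- Then bulk load the file into the Report table:
BULK INSERT dbo.Report
FROM 'C:\temp\staging.dat'
WITH (DATAFILETYPE = 'native', TABLOCK);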
Have a look at Transferring Data from One Table to Another
It discusses:
The INSERT INTO method
The DTS Import/Export Wizard method
The BCP/Bulk Insert method
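For reference, minimal sketches of the first two options, assuming both tables are in the same database and have compatible schemas (hypothetical names); with millions of rows, TABLOCK under the simple or bulk-logged recovery model can make the INSERT minimally logged:

-- INSERT INTO: the Report table already exists and keeps its
-- indexes and constraints.
INSERT INTO dbo.Report WITH (TABLOCK)
SELECT * FROM dbo.Staging;

-- SELECT INTO: creates a brand-new table and is minimally logged,
-- but the new table starts with no indexes or constraints.
SELECT *
INTO dbo.Report_New
FROM dbo.Staging;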