In SQL Server, I connected to the server from my desktop, and I want to move data from one database to another. I have used both SELECT INTO and the Import Wizard, but the Import Wizard seems slow. Why?
Is there any difference in methodology for transferring data?
SELECT INTO is a SQL query, and it is executed directly.
The Import and Export Wizard is a tool which invokes Integration Services (SSIS).
The wizard is slower, but it can use a variety of data sources.
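For comparison, here is a minimal SELECT INTO that copies a table across databases on the same server; the database, schema, and table names are placeholders:
SELECT *
INTO TargetDb.dbo.EmployeeAddresses
FROM SourceDb.dbo.EmployeeAddresses;
GO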
More about export/import wizard
https://msdn.microsoft.com/en-us/library/ms141209.aspx
Topic about select into and export/import wizard
https://social.msdn.microsoft.com/forums/sqlserver/en-US/e0524b2a-0ea4-43e7-b74a-e9c7302e34e0/super-slow-performance-while-using-import-export-wizard
I agree with Andrey. The Wizard is super slow. If you perform a Google search on "sql server import and export wizard slow", you will receive nearly 50k hits. You may want to consider a couple of other options.
BCP Utility
Note: I have used this on a number of occasions. Very fast processing.
The bcp utility bulk copies data between an instance of Microsoft SQL Server and a data file in a user-specified format. The bcp utility can be used to import large numbers of new rows into SQL Server tables or to export data out of tables into data files. Except when used with the queryout option, the utility requires no knowledge of Transact-SQL. To import data into a table, you must either use a format file created for that table or understand the structure of the table and the types of data that are valid for its columns.
Example (note this uses the Transact-SQL BULK INSERT statement rather than the bcp command itself):
BULK INSERT TestServer.dbo.EmployeeAddresses
FROM 'D:\Users\Addresses.txt';
GO
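For the bcp utility itself, an import from the command prompt might look like the sketch below; the server name and the -T (trusted connection) switch are assumptions for your environment:
bcp TestServer.dbo.EmployeeAddresses in "D:\Users\Addresses.txt" -c -T -S MyServer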
OPENROWSET(BULK) Function
The OPENROWSET(BULK) function connects to an OLE DB data source to read data in bulk, and it allows access to remote data by connecting to a remote data source.
Example:
INSERT INTO AllAddress(Address)
SELECT * FROM OPENROWSET(
BULK 'D:\Users\Addresses.txt',
SINGLE_BLOB) AS x;
Reference
https://msdn.microsoft.com/en-us/library/ms175915.aspx
http://solutioncenter.apexsql.com/sql-server-bulk-copy-and-bulk-import-and-export-techniques/
The database engine stores data in many places, in small chunks of files, for faster retrieval. When you use the Export Wizard, it first writes all of the metadata and data to RAM, which, depending on your system, adds overhead; the same happens when importing. SELECT INTO is fast because the engine only has to create an internal replica of data that already exists.
In real life, SELECT INTO is like photocopying a page, whereas the wizard is like rewriting the page by hand.
I have two tables, A and B, in an Azure SQL Database. I have a clone of the same database running locally, and I want to populate it with the data in Azure by using the SSMS Export Data option. Using that option, I specify the source and destination and then choose "Write a query to specify the data to transfer".
I then add the query "Select * from A where Condition1" and select the destination table.
The issue is that if I have 5 tables to export data from, I have to go through this whole process 5 times; the only differences are the queries and destination tables. Does anyone have any idea how I can do this faster by some other means? I just need to copy data using some SELECT statements with WHERE clauses.
As per the official documentation:
When you select Write a query to specify the data to transfer, you can only copy the results of one query to one destination table.
So you have to repeat the entire process multiple times if you want to export data like that.
You can use the following ways for importing and exporting data:
Use Transact-SQL statements.
Use BCP (Bulk Copy Program) from the command prompt.
If you want to design a custom data import, you can use SQL Server Integration Services.
Use Azure Data Factory.
Use a BACPAC file (AccuWebHosting has material explaining it). Rather than filtering with a query before exporting the data, you can export everything and then delete the unwanted rows in the destination database with a DELETE statement, as sketched below.
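A minimal sketch of that cleanup, reusing the filter from the question (the table and condition names are placeholders):
-- Keep only the rows matching the original filter; remove the rest.
DELETE FROM dbo.A
WHERE NOT (Condition1);
GO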
REFERENCE:
Import & export data from SQL Server & Azure SQL Database - SQL Server | Microsoft Docs
I will soon need to import millions of records into a single SQL Server database table which we use in production. The data to import will be available in the form of about 40 CSV files, each having hundreds of thousands of records.
For each row, some of the column values are supplied by the CSV files, whereas other values I must supply myself.
I am trying to determine which tool to use. I noticed that SQL Server Management Studio comes with the Import and Export Wizard. Is that tool advisable for this type of job? Or should I use SSIS instead?
Some other questions I have:
Should I "lock" the table during the operation?
Should I perform the insert into a copy of the production table and then, once the operation is validated, make the copy the official version of the production table?
Since you have logic to apply to the rows from the CSV files (some rows you will insert as-is, and some require you to supply values), you cannot express that kind of logic in the Import and Export Wizard; it only does straightforward loads. So you have to go with SSIS.
You need conditional branching to split the rows and supply values to the target table.
For the second question: if possible, I would suggest loading into a separate table and then renaming it later. That way, production system users are not impacted by the load.
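A minimal sketch of that swap, assuming the staging table has already been loaded and validated (all table names are placeholders):
BEGIN TRANSACTION;
-- Move the current production table out of the way.
EXEC sp_rename 'dbo.Employees', 'Employees_Old';
-- Promote the freshly loaded staging table.
EXEC sp_rename 'dbo.Employees_Staging', 'Employees';
COMMIT;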
I want to transfer SQL query results to a new CSV file. This is because I have placed my SQL query inside a loop which will export the query results to a CSV file each time. I'm using MS SQL Server 2012. I don't want to use a GUI option.
SQL Server is not really designed to import and export files. You can use the bulk copy program (bcp), but I don't think it works inside T-SQL code (your loop). You can use OPENROWSET, but you need to enable a configuration option that opens up your surface area of attack, which some do not want to do.
The answer is SSIS (or a tool like Talend). It comes with SQL Server and is designed by Microsoft as the go-to tool for importing to and exporting from SQL Server. If you right-click the database, choose Tasks and then Export Data, the wizard eventually creates and executes an SSIS package.
I recommend you reconsider a GUI option.
P.S. Another answer was to use Save Results As. I have heard of problems using this method, including problems with delimiters or text-qualified fields.
There are multiple ways to attain this. You can export the result set using BCP, using Import/Export, or using CTRL+SHIFT+F (this switches the result destination to a file, with a Save As prompt). Hope this may help.
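If you do take the bcp route from your loop, a rough sketch of one export to CSV from the command prompt (the query, server, and path are placeholders; -T assumes a trusted connection, -t, makes the output comma-separated):
bcp "SELECT Col1, Col2 FROM MyDb.dbo.MyTable WHERE Col1 > 0" queryout "C:\exports\results.csv" -c -t, -T -S MyServer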
How can I generate large table scripts (data only) in SQL Server 2012?
I have approximately 116,463 rows selected (the SELECT query was cancelled, so it could be more than that).
Please suggest.
For large amounts of data only, the bcp utility may be of a lot of help; it can export data very quickly. It runs through the command prompt, but it is very clean and fast.
It is a bulk copy.
This is the information from Microsoft:
https://msdn.microsoft.com/en-us/library/ms162802.aspx
Look into the DTS Wizard. It is fast, easy, and just right for such one-time jobs. You can control where the data goes, including another SQL Server, Excel, CSV, etc., and, if needed, move the data in reverse, from your backup medium back into the original database. DTS Wizard... don't go anywhere without it. ;)
Right click your database in the Object Explorer
Choose Tasks
Choose Export Data to bring up the SQL Server Import and Export Wizard and pick your source (DB and table or query) and destination.
I am a C# developer; I am not really good with SQL, so I have a simple question here. I need to move more than 50 million records from one database to another. I tried to use the import function in MS SQL, but it got stuck because the log was full (I got the error message "The transaction log for database 'mydatabase' is full due to 'LOG_BACKUP'"). The database recovery model was set to simple. My friend said that importing millions of records using Tasks > Import Data will cause the log to become massive, and told me to use a loop instead to transfer the data. Does anyone know how and why? Thanks in advance.
If you are moving the entire database, use backup and restore; it will be the quickest and easiest.
http://technet.microsoft.com/en-us/library/ms187048.aspx
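A minimal sketch of that, assuming the default file layout on the destination (database and path names are placeholders):
-- On the source server:
BACKUP DATABASE MyDatabase
TO DISK = 'D:\Backups\MyDatabase.bak';
-- Copy the .bak file over, then on the destination server:
RESTORE DATABASE MyDatabase
FROM DISK = 'D:\Backups\MyDatabase.bak';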
If you are just moving a single table, read about and use the bcp command-line tool for this many records:
The bcp utility bulk copies data between an instance of Microsoft SQL Server and a data file in a user-specified format. The bcp utility can be used to import large numbers of new rows into SQL Server tables or to export data out of tables into data files. Except when used with the queryout option, the utility requires no knowledge of Transact-SQL. To import data into a table, you must either use a format file created for that table or understand the structure of the table and the types of data that are valid for its columns.
http://technet.microsoft.com/en-us/library/ms162802.aspx
The fastest and probably most reliable way is to bulk copy the data out via SQL Server's bcp.exe utility. If the schema on the destination database is exactly identical to that on the source database, including nullability of columns, export it in "native format":
http://technet.microsoft.com/en-us/library/ms191232.aspx
http://technet.microsoft.com/en-us/library/ms189941.aspx
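As a rough sketch, a native-format export from the command prompt could look like this (database, table, server, and path names are placeholders; -T assumes a trusted connection):
bcp MyDb.dbo.MyTable out "D:\export\MyTable.dat" -n -T -S SourceServer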
If the schema differs between source and target, you will encounter...interesting (yes, interesting is a good word for it) problems.
If the schemas differ or you need to perform any transforms on the data, consider using text format. Or another format (BCP lets you create and use a format file to specify the format of the data for export/import).
You might consider exporting data in chunks: if you encounter problems it gives you an easier time of restarting without losing all the work done so far.
You might also consider zipping the exported data files up to minimize time on the wire.
Then FTP the files over to the destination server.
Then bcp them in. You can use the bcp utility on the destination server, or the BULK INSERT statement in SQL Server, to do the work. It makes no real difference.
The nice thing about using BCP to load the data is that the load is what is described as a 'non-logged' transaction, though it's really more like a 'minimally logged' transaction.
If the tables on the destination server have IDENTITY columns, you'll need to use the SET IDENTITY_INSERT statement to disable the identity behavior on the table(s) involved for the nonce (don't forget to re-enable it). After your data is imported, you'll need to run DBCC CHECKIDENT to get things back in sync.
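A minimal sketch of that bracketing, with a hypothetical table name (note that bcp also has a -E switch to keep identity values; SET IDENTITY_INSERT applies to Transact-SQL inserts):
-- Allow explicit values in the identity column during the load.
SET IDENTITY_INSERT dbo.MyTable ON;
-- ... bulk load the data here ...
SET IDENTITY_INSERT dbo.MyTable OFF;
-- Correct the identity counter from the current data.
DBCC CHECKIDENT ('dbo.MyTable', RESEED);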
And depending on what you're doing, it can sometimes be helpful to put the database in single-user mode or dbo-only mode for the duration of the surgery: http://msdn.microsoft.com/en-us/library/bb522682.aspx
Another approach I've used to great effect is Perl's DBI/DBD modules (which provide access to the bulk copy interface): write a Perl script to pull the data out of the source server, transform it, and bulk load it directly into the destination server, without having to save it to disk and move it. It also means you can trap errors and design for recovery and restart right at the point of failure.
Use BCP to migrate data.
Another approach I have used in the past is to take a backup of the transaction log and shrink the log prior to the migration. Split the migration script into parts and run the backup log / shrink / migrate iteration a few times, as in the sketch below.
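A minimal sketch of one such iteration (the database name, logical log file name, and backup path are placeholders):
-- Back up the transaction log, then shrink the log file.
BACKUP LOG MyDatabase
TO DISK = 'D:\Backups\MyDatabase_log.trn';
DBCC SHRINKFILE (MyDatabase_log, 1024); -- target size in MB
-- ... run the next part of the migration, then repeat ...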