How do you import a large MS SQL .sql file?

I used Red Gate SQL Data Compare to generate a .sql file so I could run it on my local machine. The problem is that the file is over 300 MB, which means I can't copy and paste it because the clipboard can't handle it, and when I try to open the file in SQL Server Management Studio I get an error about the file being too large.
Is there a way to run a large .sql file? The file basically contains data for two new tables.

From the command prompt, start up sqlcmd:
sqlcmd -S <server> -i C:\<your file here>.sql
Just replace <server> with the location of your SQL box and <your file here> with the name of your script. Don't forget, if you're using a named SQL instance the syntax is:
sqlcmd -S <server>\<instance> -i C:\<your file here>.sql
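For example, a hypothetical run against a local named instance, using Windows authentication (-E, trusted connection) and a made-up file name:
sqlcmd -S MYSERVER\SQLEXPRESS -E -i C:\scripts\two_new_tables.sql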
Here is the list of all the arguments you can pass to sqlcmd:
Sqlcmd [-U login id] [-P password]
[-S server] [-H hostname] [-E trusted connection]
[-d use database name] [-l login timeout] [-t query timeout]
[-h headers] [-s colseparator] [-w screen width]
[-a packetsize] [-e echo input] [-I Enable Quoted Identifiers]
[-c cmdend] [-L[c] list servers[clean output]]
[-q "cmdline query"] [-Q "cmdline query" and exit]
[-m errorlevel] [-V severitylevel] [-W remove trailing spaces]
[-u unicode output] [-r[0|1] msgs to stderr]
[-i inputfile] [-o outputfile] [-z new password]
[-f <codepage> | i:<codepage>[,o:<codepage>]] [-Z new password and exit]
[-k[1|2] remove[replace] control characters]
[-y variable length type display width]
[-Y fixed length type display width]
[-p[1] print statistics[colon format]]
[-R use client regional setting]
[-b On error batch abort]
[-v var = "value"...] [-A dedicated admin connection]
[-X[1] disable commands, startup script, environment variables [and exit]]
[-x disable variable substitution]
[-? show syntax summary]

I had exactly the same issue and struggled for a while, then finally found the solution: set the -a parameter on sqlcmd to increase its default packet size (32767 is the maximum):
sqlcmd -S [servername] -d [databasename] -i [scriptfilename] -a 32767

You can use this tool as well. It is really useful.
BigSqlRunner

Open a command prompt with administrator privileges.
Change directory to where the .sql file is stored.
Execute the following command:
sqlcmd -S <server name> -U <user name> -P <password> -d <db name> -i script.sql
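With a file this large it can also help to add -b, so the run aborts on the first error, and -o, to write output to a log file instead of the console (both flags appear in the argument list above). A sketch with hypothetical names:
sqlcmd -S MYSERVER -U appuser -P S3cret! -d MyDb -i script.sql -b -o C:\import.log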

I am using MS SQL Express 2014 and none of the solutions worked for me. They all just crashed SQL. As I only needed to run a one-off script with many simple insert statements, I got around it by writing a little console app as a very last resort:
class Program
{
    static void Main(string[] args)
    {
        RunScript();
    }

    private static void RunScript()
    {
        My_DataEntities db = new My_DataEntities();

        // Assumes one complete SQL statement per line in the file.
        using (var file = new System.IO.StreamReader("c:\\ukpostcodesmssql.sql"))
        {
            string line;
            while ((line = file.ReadLine()) != null)
            {
                db.Database.ExecuteSqlCommand(line);
            }
        }
    }
}

Run it at the command line with osql; see here:
http://metrix.fcny.org/wiki/display/dev/How+to+execute+a+.SQL+script+using+OSQL

Hope this helps!
sqlcmd -U UserName -S <ServerName\InstanceName> -i U:\<Path>\script.sql
(Note that sqlcmd's flags are case-sensitive: -U is the login and -S is the server.)

I had a similar problem. My SQL script file was over 150 MB (almost 900k very simple INSERTs). I used the solution advised by Takuro (the answer in this question) but I still got an error saying that there was not enough memory ("There is insufficient system memory in resource pool 'internal' to run this query").
What helped me was putting a GO command after every 50k INSERTs.
(This doesn't directly address the question (file size), but I believe it resolves a problem that is indirectly connected with the large size of the sql script itself: in my case, many insert commands.)
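A minimal sketch of that batching, assuming the dump has exactly one INSERT per line and awk is available (Git Bash, WSL, or similar on Windows); the file names are made up:
awk '{ print } NR % 50000 == 0 { print "GO" }' inserts.sql > inserts_batched.sql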

sqlcmd -S [servername] -d [databasename] -i [scriptfilename] -a 32767
I successfully ran a 365 MB sql file with this command; it completed in about 15 minutes. It helped me solve a problem that had taken me a long time to figure out.

Run the script file
Open a command prompt window.
In the Command Prompt window, type: sqlcmd -S <ServerName\InstanceName> -i C:\yourScript.sql
Press ENTER.

Your question is quite similar to this one
You can save your file/script as .txt or .sql and run it from SQL Server Management Studio (I think the menu is Open/Query, then just run the query in the SSMS interface). You might have to update the first line, indicating the database to be created or selected on your local machine.
If you have to do this data transfer very often, you could then go for replication. Depending on your needs, snapshot replication could be OK. If you have to sync the data between your two servers, you could go for a more complex model such as merge replication.
EDIT: I didn't notice that your problems with SSMS were related to file size. Then you can go for the command line, as proposed by others, snapshot replication (publish on your main server, subscribe on your local one, replicate, then unsubscribe), or even backup/restore.

The file basically contains data for two new tables.
Then you may find it simpler to just DTS (or SSIS, if this is SQL Server 2005+) the data over, if the two servers are on the same network.
If the two servers are not on the same network, you can backup the source database and restore it to a new database on the destination server. Then you can use DTS/SSIS, or even a simple INSERT INTO SELECT, to transfer the two tables to the destination database.
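For that last step, a minimal T-SQL sketch with hypothetical database, table, and column names, assuming the backup was restored as RestoredCopy on the destination server:
-- copy one of the two new tables from the restored copy into the destination database
INSERT INTO DestDb.dbo.NewTable (Col1, Col2)
SELECT Col1, Col2
FROM RestoredCopy.dbo.NewTable;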

There is probably another way for all the fellows still encountering problems importing really large SQL dumps.
What can also be considered, when possible: if you have access to the server, you could export the database in multiple parts, like first the structure, then per table (or related objects) an export of the data in smaller pieces, instead of one big file.
When you don't have access to the server and/or are required to use the existing big file, you could try to split it into parts with SQLDumpSplitter: https://philiplb.de/sqldumpsplitter3/.
Then import the pieces to get a full copy of the database.
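A sketch of importing the pieces from a Windows command prompt (hypothetical server, database, and folder names; inside a .bat file, %f becomes %%f):
for %f in (C:\parts\*.sql) do sqlcmd -S MYSERVER -d MyDb -i "%f" -b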
Good luck, guys.

Related

LocalDB import and export options not working

I've created a SQL LocalDB instance by using the command prompt, and somehow I cannot seem to import or export data into my databases. On any organisational database I don't have this problem, but my LocalDB won't let me do so.
Is there some way I can fix this issue, since it only affects my LocalDB?
I still don't know much about your localdb instance, e.g. creation, rights, where you see the greyed-out restore option, etc., but the docs show two commands:
SqlLocalDB.exe create "DEPARTMENT" 12.0 -s
And then, for a 'shared instance' using Administrator privileges:
SqlLocalDB.exe create "DeptLocalDB"
SqlLocalDB.exe share "DeptLocalDB" "DeptSharedLocalDB"
SqlLocalDB.exe start "DeptLocalDB"
SqlLocalDB.exe info "DeptLocalDB"
REM The previous statement outputs the Instance pipe name for the next step
sqlcmd -S np:\\.\pipe\LOCALDB#<use your pipe name>\tsql\query
CREATE LOGIN NewLogin WITH PASSWORD = 'Passw0rd!!#52';
GO
CREATE USER NewLogin;
GO
EXIT
Then you can enter the instance using
sqlcmd -S (localdb)\.\DeptSharedLocalDB -U NewLogin -P Passw0rd!!#52
I am still unsure what you are trying to do and how you have done things on your side, but I feel you should try to run a shared instance; you still have the "current user" issue then, though. I fear your greyed-out button issue is also a problem related to user rights, although I am not 100% sure.
This user also switched to SQL Express after some difficulties.
Usually you import databases like this:
To find the names needed to import a DB you can run
RESTORE FILELISTONLY
FROM DISK = 'D:\tmp\db.bak'
Then restore the DB using
RESTORE DATABASE dbnameofyourchoice
FROM DISK = 'D:\tmp\db.bak'
WITH MOVE 'name_of_data_file' TO 'C:\...\db.mdf',
     MOVE 'name_of_log' TO 'C:\...\db.ldf',
     REPLACE;
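If SSMS is the part that's misbehaving, those statements can also be run through sqlcmd; a sketch assuming the default LocalDB instance name (localdb)\MSSQLLocalDB and the path above:
sqlcmd -S "(localdb)\MSSQLLocalDB" -Q "RESTORE FILELISTONLY FROM DISK = 'D:\tmp\db.bak'"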

Changing default database in RODBC for sql server database

I have tried to connect a Sql server database to R using:
conn<-odbcConnect("dsnDb",uid="",pwd="")
It retrieves the default database 'master' instead of the database I need. How can I manually select a particular database?
So, there's no way to do this within the RODBC package, but here's a function which achieves the same thing:
Assuming your DSN is called "MyODBClink":
MakeChannel = function(dbName) {
  # Rewrite the DSN's registry entry to point at the requested database, then reconnect.
  system(paste("REG ADD HKEY_CURRENT_USER\\Software\\ODBC\\ODBC.INI\\MyODBClink /f /v \"Database\" /t REG_SZ /d ", "\"", dbName, "\"", sep = ""), intern = T)
  return(odbcConnect("MyODBClink"))
}
# usage: conn <- MakeChannel("MyDatabase")
The way this function works is to edit the details of the registry (I'm assuming you're on Windows) for that particular DSN to point to the correct database, and then return an open connection to that DSN.

Import postgres database without roles

I have a database that was exported with pg_dump, but now when I'm trying to import it again with:
psql -d databasename < mydump.sql
It fails trying to grant roles to users that don't exist (the error says 'Role "xxx" does not exist').
Is there a way to import and set all the roles automatically to my user?
The default behavior of the import is that it replaces all roles it does not know with the role you are doing the import with. So, depending on what you need the database for, you might be just fine importing it and ignoring the error messages.
Quoting from http://www.postgresql.org/docs/9.2/static/backup-dump.html#BACKUP-DUMP-RESTORE
Before restoring an SQL dump, all the users who own objects or were granted permissions on objects in the dumped database must already exist. If they do not, the restore will fail to recreate the objects with the original ownership and/or permissions. (Sometimes this is what you want, but usually it is not.)
The answer that you might be looking for is adding the --no-owner flag to the pg_restore command. Unlike the currently accepted answer, the command should create every object as the current user even if the roles in the dump don't exist in the database.
So no element will get skipped by pg_restore, but if some imported elements were owned by different users, all of them will now be owned by a single user, as far as I can tell.
With pg_restore you can use the --role=rolename option to force a role name to be used to perform the restore. But the dump must be in a non-plain-text format. For example, you can dump with:
pg_dump -F c -Z 9 -f my_file.backup my_database_name
and then you can restore it with:
pg_restore -d my_database_name --role=my_role_name my_file.backup
for more info:
http://www.postgresql.org/docs/9.2/static/app-pgrestore.html
Yes, you can dump all the "Global" objects from your source DB with pg_dumpall's -g option:
pg_dumpall -g > globals.sql
Then run globals.sql against your target DB before importing.
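A sketch of that order of operations, assuming the database name from the question:
pg_dumpall -g > globals.sql
psql -d databasename -f globals.sql
psql -d databasename < mydump.sql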
I used the following:
pg_dump --no-privileges --no-owner $OLD_DB_URL | psql $NEW_DB_URL
From
https://www.postgresql.org/docs/12/app-pgdump.html
-O
--no-owner
Do not output commands to set ownership of objects to match the
original database. By default, pg_dump issues ALTER OWNER or SET
SESSION AUTHORIZATION statements to set ownership of created database
objects. These statements will fail when the script is run unless it
is started by a superuser (or the same user that owns all of the
objects in the script). To make a script that can be restored by any
user, but will give that user ownership of all the objects, specify
-O.
This option is only meaningful for the plain-text format. For the
archive formats, you can specify the option when you call pg_restore.
-x
--no-privileges
--no-acl
Prevent dumping of access privileges (grant/revoke commands).
In newer versions, pg_restore will complain about text-file imports. However, you can just remove the offending lines with awk. You can pipe the result to a new file to make sure it didn't break anything, or pipe it directly into psql like this:
cat my-import-file.sql | awk '!/old-role/' | psql -U target_owner -d target_db_name -1 -v ON_ERROR_STOP=1
replace my-import-file.sql with your import file
replace target_owner with your username
replace target_db_name with the name of the database to create
replace old-role with the role it is complaining about
If you have multiple roles:
cat my-import-file.sql | awk '!/role1|role2|role3/' | psql -U target_owner -d target_db_name -1 -v ON_ERROR_STOP=1
This will remove all lines that mention that role. Everything will be created owned by the user logged in to psql.
If you are trying to import a backup using pgAdmin, enable the flags that skip restoring ownership and privileges when you are restoring your DB.
I hope it helps you.
Cheers!
You can also just create a new role with the same name as the one you are missing, and then import the dump with no errors.
The error says 'Role "xxx" does not exist', so create it :)
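A one-line sketch of that, assuming the missing role is literally named xxx and only needs to exist (no login):
psql -d databasename -c "CREATE ROLE xxx;"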

MS SQL Server use old log file location after detach/copy/attach

I create database "Test" in folder "d:\test". The database files are "d:\test\Test.mdf" and "d:\Test\Test_log.ldf". I detach the database from MS SQL Server 2008 R2, copy all files to a new folder ("d:\test_new"), delete the log file ("d:\test_new\Test_log.ldf"), and try to attach the database again from the new location. When I use SQL Server Management Studio and choose the "d:\test_new\Test.mdf" file, it determines that the log file is located in "d:\test\Test_log.ldf" (the old location).
How can I attach this database with the log rebuilt in the new location? Just imagine that I cannot copy the ldf file again to the new location, and that it is still available there, so SQL Server sees it anyway. I want to say to SQL Server: "please, forget that log file, and create a new log file here". A T-SQL script would be best, but if the answer is steps in Management Studio I will convert them to a script myself.
What I had tried already:
1.
CREATE DATABASE [test]
ON ( FILENAME = N'D:\test_new\test.mdf' )
FOR ATTACH_REBUILD_LOG
attaches the log file from the old location (FOR ATTACH behaves the same)
2.
CREATE DATABASE [test]
ON ( FILENAME = N'D:\test_new\test.mdf' )
LOG ON ( FILENAME = N'D:\test_new\test_log.ldf' )
FOR ATTACH_REBUILD_LOG
returns an error: Unable to open the physical file "D:\test_new\test_log.ldf". Operating system error 2: "2(File not found.)".
3.
sp_attach_db and sp_attach_single_file_db
were tried too. I even checked their source code: they just build dynamic SQL and call the CREATE DATABASE ... FOR ATTACH statement.
The question has slowly changed to: "Is it possible?"
UPDATE
Well, it looks like it's not possible with current versions of SQL Server. If anybody knows a way to do it - please, I will be very pleased to know it too!
Edit2: To my knowledge, it is not possible for SQL Server to recreate a log file. It can shrink the ldf, but not create it when only the mdf exists.
When you copy your files from d:\test\ to d:\test_new\, do not delete the d:\test_new\Test_log.ldf.
Leave the log file there, because you cannot reattach the new DB without that log file. Afterwards, you can shrink that log to a minimum size.
So, to synthesize:
Copy your files from d:\test\ to d:\test_new\ and leave the log
file there.
Run your create database script that you posted in your question (point 2).
Run the following script to shrink the log to a minimum size:
USE test
GO
DBCC SHRINKFILE(logicalFileName, 1)
GO
To find out what logicalFileName is, run sp_helpfile, that will give you the logical file name for your log file:
USE test
GO
EXEC sp_helpfile
GO
more info here
Edit:
I think you need first to detach the test database from the old location:
(You might create a script that does it all, from the following commands)
C:\> osql -E
1> sp_detach_db 'test'
2> go
3> quit
C:\>
Then copy the files to the new location.
C:\> copy d:\test\*.* d:\test_new\
Next, attach the test DB to the new path location:
C:\> osql -E
1> sp_attach_db @dbname = N'test', @filename1 = N'd:\test_new\Test.mdf', @filename2 = N'd:\test_new\Test_log.ldf'
2> go
3> quit
C:\>
To test whether the new database was successfully attached:
C:\> osql -E
1> use test
2> go
3> quit
C:\>
If there are no errors after the go command, then all is ok
Hope this helps
Microsoft article on how to move files
The users must copy BOTH the .mdf and .ldf files. They then have to use one of the following commands.
sp_attach_db (deprecated; use CREATE DATABASE ... FOR ATTACH in the future)
EXEC sp_attach_db @dbname = 'dbname', @filename1 = 'd:\test_new\test.mdf', @filename2 = 'd:\test_new\test.ldf'
This will result in the database using the data file (mdf) and transaction log (ldf) from the \test_new directory.
CREATE DATABASE FOR ATTACH
CREATE DATABASE dbname ON (FILENAME = 'd:\test_new\test.mdf'), (FILENAME = 'd:\test_new\test.ldf') FOR ATTACH

How do you stop a user-instance of Sql Server? (Sql Express user instance database files locked, even after stopping Sql Express service)

When using SQL Server Express 2005's User Instance feature with a connection string like this:
<add name="Default" connectionString="Data Source=.\SQLExpress;
AttachDbFilename=C:\My App\Data\MyApp.mdf;
Initial Catalog=MyApp;
User Instance=True;
MultipleActiveResultSets=true;
Trusted_Connection=Yes;" />
We find that we can't copy the database files MyApp.mdf and MyApp_Log.ldf (because they're locked), even after stopping the SqlExpress service, and have to resort to setting the SqlExpress service from automatic to manual startup mode and then restarting the machine before we can copy the files.
It was my understanding that stopping the SqlExpress service should stop all the user instances as well, which should release the locks on those files. But this does not seem to be the case - could anyone shed some light on how to stop a user instance, such that its database files are no longer locked?
Update
OK, I stopped being lazy and fired up Process Explorer. The lock was held by sqlserver.exe - but there are two instances of sql server:
sqlserver.exe PID: 4680 User Name: DefaultAppPool
sqlserver.exe PID: 4644 User Name: NETWORK SERVICE
The file is held open by the sqlserver.exe instance with PID 4680.
Stopping the "SQL Server (SQLEXPRESS)" service, killed off the process with PID: 4644, but left PID: 4680 alone.
Seeing as the owner of the remaining process was DefaultAppPool, next thing I tried was stopping IIS (this database is being used from an ASP.Net application). Unfortunately this didn't kill the process off either.
Manually killing off the remaining sql server process does remove the open file handle on the database files, allowing them to be copied/moved.
Unfortunately I wish to copy/restore those files in some pre/post-install tasks of a WiX installer - as such I was hoping there might be a way to achieve this by stopping a Windows service, rather than having to shell out to kill all instances of sqlserver.exe, as that poses some problems:
Killing all the sqlserver.exe instances may have undesirable consequences for users with other SQL Server instances on their machines.
I can't restart those instances easily.
It introduces additional complexity into the installer.
Does anyone have any further thoughts on how to shutdown instances of sql server associated with a specific user instance?
Use "SQL Server Express Utility" (SSEUtil.exe) or the command to detach the database used by SSEUtil.
SQL Server Express Utility,
SSEUtil is a tool that lets you easily interact with SQL Server,
http://www.microsoft.com/downloads/details.aspx?FamilyID=fa87e828-173f-472e-a85c-27ed01cf6b02&DisplayLang=en
Also, the default timeout to stop the service after the last connection is closed is one hour. On your development box, you may want to change this to five minutes (the minimum allowed).
In addition, you may have an open connection through Visual Studio's Server Explorer Data Connections, so be sure to disconnect from any database there.
H:\Tools\SQL Server Express Utility>sseutil -l
1. master
2. tempdb
3. model
4. msdb
5. C:\DEV_\APP\VISUAL STUDIO 2008\PROJECTS\MISSICO.LIBRARY.1\CLIENTS\CORE.DATA.CLIENT\BIN\DEBUG\CORE.DATA.CLIENT.MDF
H:\Tools\SQL Server Express Utility>sseutil -d C:\DEV*
Failed to detach 'C:\DEV_\APP\VISUAL STUDIO 2008\PROJECTS\MISSICO.LIBRARY.1\CLIENTS\CORE.DATA.CLIENT\BIN\DEBUG\CORE.DATA.CLIENT.MDF'
H:\Tools\SQL Server Express Utility>sseutil -l
1. master
2. tempdb
3. model
4. msdb
H:\Tools\SQL Server Express Utility>
Using .NET Reflector, I found that the following command is used to detach the database:
string.Format("USE master\nIF EXISTS (SELECT * FROM sysdatabases WHERE name = N'{0}')\nBEGIN\n\tALTER DATABASE [{1}] SET OFFLINE WITH ROLLBACK IMMEDIATE\n\tEXEC sp_detach_db [{1}]\nEND", dbName, str);
I have been using the following helper method to detach MDF files attached to SQL Server in unit tests (so that SQL Server releases its locks on the MDF and LDF files and the unit test can clean up after itself)...
private static void DetachDatabase(DbProviderFactory dbProviderFactory, string connectionString)
{
    using (var connection = dbProviderFactory.CreateConnection())
    {
        if (connection is SqlConnection)
        {
            // close pooled connections so SQL Server releases its file locks
            SqlConnection.ClearAllPools();

            // convert the connection string (to connect to 'master' db), extract original database name
            var sb = dbProviderFactory.CreateConnectionStringBuilder();
            sb.ConnectionString = connectionString;
            sb.Remove("AttachDBFilename");
            var databaseName = sb["database"].ToString();
            sb["database"] = "master";
            connectionString = sb.ToString();

            // detach the original database now
            connection.ConnectionString = connectionString;
            connection.Open();
            using (var cmd = connection.CreateCommand())
            {
                cmd.CommandText = "sp_detach_db";
                cmd.CommandType = CommandType.StoredProcedure;

                var p = cmd.CreateParameter();
                p.ParameterName = "@dbname";
                p.DbType = DbType.String;
                p.Value = databaseName;
                cmd.Parameters.Add(p);

                p = cmd.CreateParameter();
                p.ParameterName = "@skipchecks";
                p.DbType = DbType.String;
                p.Value = "true";
                cmd.Parameters.Add(p);

                p = cmd.CreateParameter();
                p.ParameterName = "@keepfulltextindexfile";
                p.DbType = DbType.String;
                p.Value = "false";
                cmd.Parameters.Add(p);

                cmd.ExecuteNonQuery();
            }
        }
    }
}
Notes:
SqlConnection.ClearAllPools() was very helpful in eliminating "stealth" connections (when a connection is pooled, it will stay active even though you Close() it; by explicitly clearing pooled connections you don't have to worry about setting the pooling flag to false in all connection strings).
The "magic ingredient" is the call to the system stored procedure sp_detach_db (Transact-SQL).
My connection strings included "AttachDBFilename" but didn't include "User Instance=True", so this solution might not apply to your scenario.
I just used this post to solve my WiX uninstall problem. I used this line from AMissico's answer:
string.Format("USE master\nIF EXISTS (SELECT * FROM sysdatabases WHERE name = N'{0}')\nBEGIN\n\tALTER DATABASE [{1}] SET OFFLINE WITH ROLLBACK IMMEDIATE\n\tEXEC sp_detach_db [{1}]\nEND", dbName, str);
It worked pretty well with WiX, only I had to add one thing to make it work for me.
I took out the sp_detach_db and then brought the db back online. If you don't, WiX will leave the mdf files around after the uninstall. Once I brought the db back online, WiX would properly delete the mdf files.
Here is my modified line.
string.Format( "USE master\nIF EXISTS (SELECT * FROM sysdatabases WHERE name = N'{0}')\nBEGIN\n\tALTER DATABASE [{0}] SET OFFLINE WITH ROLLBACK IMMEDIATE\n\tALTER DATABASE [{0}] SET ONLINE\nEND", dbName );
This may not be what you are looking for, but the free tool Unlocker has a command-line interface that could be run from WiX. (I have used Unlocker for a while and have found it stable and very good at what it does best: unlocking files.)
Unlocker can unlock and move/delete most any file.
The downside is that the apps that need a lock on the file will no longer have it (but sometimes they still work just fine). Note that this does not kill the process that holds the lock; it just removes its lock. (It may be that restarting the sql services that you are stopping will be enough for them to re-lock and/or work correctly.)
You can get Unlocker from here: http://www.emptyloop.com/unlocker/
To see the command line options run unlocker -H
Here they are for convenience:
Unlocker 1.8.8
Command line usage:
Unlocker.exe Object [Option]
Object:
Complete path including drive to a file or folder
Options:
/H or -H or /? or -?: Display command line usage
/S or -S: Unlock object without showing the GUI
/L or -L: Object is a text file containing the list of files to unlock
/LU or -LU: Similar to /L with a unicode list of files to unlock
/O or -O: Outputs Unlocker-Log.txt log file in Unlocker directory
/D or -D: Delete file
/R Object2 or -R Object2: Rename file, if /L or /LU is set object2 points to a text file containing the new name of files
/M Object2 or -M Object2: Move file, if /L or /LU is set object2 points a text file containing the new location of files
Assuming your goal was to replace C:\My App\Data\MyApp.mdf with a file from your installer, you would want something like unlocker "C:\My App\Data\MyApp.mdf" -S -D. This would delete the file so you could copy in a new one.