So we had an encrypted database that got corrupted and took down our whole SQL Server setup. Losing the data hurts, but we were smart enough to keep our data structure/stored procs/functions in Git.
The problem is they're saved as .sql files.
Is there any way we can batch restore our schema from directories full of these files?
I've looked around and I can only find tutorials for restoring from .bak or .mdf files. This isn't the lazy man's way out - I just need to find a solution ASAP. Any advice, resources, or anything at all would be greatly appreciated.
Thanks Interwebs,
Dylan
Considering the large size of the data structure I was trying to restore, running each file individually was not a practical solution. I'm sure I could have written a .bat file, but I got it done pretty quickly in Python:
import os, subprocess

processDir = 'C:\\Database-master\\'
# Each subdirectory of processDir holds the scripts for one database
files = os.listdir(processDir)
for f in files:
    db = processDir + f
    # potentially drop the corrupt db here and create a new one named f
    # run every script in this database's folder through sqlcmd
    scripts = os.listdir(db)
    for script in scripts:
        path = db + '\\' + script
        proc = subprocess.Popen('sqlcmd -S 127.0.0.1 -i "' + path + '"', shell=True)
        proc.wait()
If your database was large/complex, the real problem you are going to encounter is not batch execution but the order in which the scripts should be executed.
Unless you have some backup file, this is going to be the real issue here.
If you only have your scripts, then I'd suggest an order something like this:
tables
views
everything else...
Just execute one script after another until you get an error. When you do encounter an error, it is most probably because the script references an object that doesn't exist yet. Save that script for later and continue executing the rest. Then start from the beginning again and go through the scripts that caused an error; by now the objects they depend on are probably there. Repeat this as many times as needed until all your objects are created.
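If you want to automate that retry loop, here is a rough sketch in Python (building on the script above; the server address and directory layout are assumptions):

import os, subprocess

scriptsDir = 'C:\\Database-master\\MyDb\\'  # hypothetical folder full of .sql scripts
pending = [os.path.join(scriptsDir, s) for s in os.listdir(scriptsDir)]

# Keep re-running the scripts that failed until a pass makes no further progress
while pending:
    failed = []
    for path in pending:
        # -b makes sqlcmd return a non-zero exit code when a batch fails
        result = subprocess.run(['sqlcmd', '-S', '127.0.0.1', '-b', '-i', path])
        if result.returncode != 0:
            failed.append(path)  # probably a missing dependency; retry on the next pass
    if len(failed) == len(pending):
        break  # no progress; the remaining scripts have real errors
    pending = failed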
Related
I am currently working on a project and I want to know how to save an SQLite database in Rails as a CSV file. I want it so that when you click a button, the current database on the system downloads. Can anybody help me? Thanks!
Your problem isn't really specific to Rails. Instead, you're mostly dealing with an administrative issue. You should write a script to export your database as csv, something like this:
#!/bin/bash
./bin/sqlite3 ./my_app/db/my_database.db <<!
.headers on
.mode csv
.output my_output_file.csv
select * from my_table;
!
This script exports a single table. If you have additional tables, you'll want to add them to your script.
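If you would rather not maintain the list of tables by hand, here is a rough alternative sketch in Python using the standard sqlite3 and csv modules (the database path and output file names are assumptions):

import csv, sqlite3

db_path = 'my_app/db/my_database.db'  # hypothetical path to your SQLite file
conn = sqlite3.connect(db_path)

# Find every user table, then dump each one to its own CSV file
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' AND name NOT LIKE 'sqlite_%'")]
for table in tables:
    cursor = conn.execute('SELECT * FROM "%s"' % table)
    with open(table + '.csv', 'w', newline='') as out:
        writer = csv.writer(out)
        writer.writerow([col[0] for col in cursor.description])  # header row
        writer.writerows(cursor)
conn.close()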
The only Rails related issue is the matter of calling that script. Save the script within your application structure; I'd suggest my_app/assets or some similar location.
Now you can run that script using system(command) where command is the absolute path for your script, within a set of double-quotes.
I use the sqlcmd utility to import a 7 GB SQL dump file into a remote SQL Server. The command I use is this:
sqlcmd -S IP address -U user -P password -t 0 -d database -i file.sql
After about 20-30 min the server regularly responds with:
Sqlcmd: Error: Scripting error.
Any pointers or advice?
I assume file.sql is just a bunch of INSERT statements. For a large number of rows, I suggest using the BCP command-line utility. This will perform orders of magnitude faster than individual INSERT statements.
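For illustration, a bcp invocation along these lines (the table, data file, and credentials are placeholders, and it assumes the data is in a flat file rather than INSERT statements) would bulk-load a CSV:

import subprocess

# Hypothetical table and data file; -c = character data, -t, = comma field
# terminator, -b = commit in batches of 10000 rows
subprocess.run([
    'bcp', 'MyDatabase.dbo.MyTable', 'in', 'C:\\data\\rows.csv',
    '-S', 'ServerIP', '-U', 'user', '-P', 'password',
    '-c', '-t,', '-b', '10000',
], check=True)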
You could also bulk insert data using the T-SQL BULK INSERT command. In that case, the file path needs to be accessible by the database server (i.e. a UNC path or a file copied to a drive on the server), along with the needed permissions. See http://msdn.microsoft.com/en-us/library/ms188365.aspx.
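As a rough illustration (the table name, file path, and terminators are assumptions), the statement could be run through sqlcmd from a script like this:

import subprocess

# BULK INSERT runs on the server, so the path below must be visible to the server
bulk_sql = (
    "BULK INSERT dbo.MyTable "
    "FROM '\\\\fileserver\\share\\rows.csv' "
    "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n', BATCHSIZE = 10000)"
)
subprocess.run(['sqlcmd', '-S', 'ServerIP', '-U', 'user', '-P', 'password',
                '-Q', bulk_sql], check=True)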
Why not use SSIS? While I am certified as a DBA, I always try to use the right tool for the job.
Here are some reasons to use SSIS.
1 - You can still use fast-load bulk copy. Make sure you set the batch size.
2 - Error handling is much better.
However, if you are using fast-load, either the whole batch commits or it gets tossed.
If you are loading one record at a time, you can direct each error row to a separate destination.
3 - You can perform transformations on the source data before loading it into the destination.
In short: Extract, Transform, Load.
4 - SSIS loves memory and buffers. If you want to get really in depth, read some articles from Matt Mason or Brian Knight.
Last but not least, the LAN/WAN always plays a factor if the job is not running on the target server with the input file on a local disk.
If you are on the same backbone with a good pipe, things go fast.
In summary, yeah you can use BCP. It is great for little quick jobs. Anything complicated with robust error handling should be done with SSIS.
Good luck,
I have been playing with MySQL, but then I read this post:
https://blog.stackoverflow.com/2011/04/creative-commons-data-dump-apr-11/.
I want to play with this data in SQL Server.
When I downloaded it I found many .rar files there. When I extract one of them, I find an XML file, but I really do not know how I can restore it.
Can anyone explain what I need to do to restore these files?
You can do this from the shell / command line:
$ mysql -u [uname] -p[pass] [db_to_restore] < [backupfile.sql]
http://www.webcheatsheet.com/SQL/mysql_backup_restore.php
I have two backup files
1) is named 'backup.sql' with a bunch of SQL defining TABLES
2) is named 'backup' with a bunch of encoded data, which I believe are the ROWS
I need to restore these TABLES + ROWS, but all I am able to figure out is how to restore the tables.
Any tips on dealing with these files? It's the first time I've ever dealt with SQL Server.
The backup process would not create a file with actual SQL statements, it would create a binary file. So #1 is not a backup file (it's probably a script someone saved to re-create the schema).
I would try to use SQL Server Management Studio to restore the second file and see what happens. I don't think it will allow you to restore an invalid file, but I would take some basic precautions like backing up the system first.
What is the extension for the 'backup' file? Is the filename backup.bak? If you have a backup file created by sql server then it 'should' contain the logic to create both the tables and restore the data, but it could depend on how the backup was created.
---Edit
It is possible for a .SQL file to contain data values as well as the logic to create the tables/columns for a database. I used to run backups of a MySQL database this way a long time ago... it just is not seen very often with SQL Server, since it has built-in backup/restore functionality.
It seems unlikely they would export all the rows from all tables into a CSV file, and given that you said the data looks encoded, that makes me think it's your actual backup file.
Try this: save a copy of the "backup" file, rename it to backup.bak, and run this from SQL Server Management Studio:
restore filelistonly from disk='C:\backup.bak'
(assuming your file is saved on the root of the C: drive)
Any results/errors?
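If that returns a list of logical file names, the file really is a SQL Server backup, and a full restore along these lines should work (the logical names 'MyDb' and 'MyDb_log' and the target paths below are assumptions - substitute whatever filelistonly actually reports):

import subprocess

# Hypothetical logical names and target paths - use the values that
# "restore filelistonly" returned for your backup file
restore_sql = (
    "RESTORE DATABASE MyDb FROM DISK = 'C:\\backup.bak' "
    "WITH MOVE 'MyDb' TO 'C:\\Data\\MyDb.mdf', "
    "MOVE 'MyDb_log' TO 'C:\\Data\\MyDb_log.ldf'"
)
subprocess.run(['sqlcmd', '-S', '127.0.0.1', '-E', '-Q', restore_sql], check=True)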
I have a SQL script which is extremely large (about 700 megabytes). I am wondering if there is a good way to reduce the size of the script?
I know there are code minimizers for JavaScript and am looking for one to use with SQL scripts.
I am not looking to improve the performance of the SQL script. I am trying to make the file size smaller by removing excess whitespace and keeping name qualification down so that the script file can be smaller.
If I attempt to load the file in SQL Server Management Studio I get this error.
Not enough storage is available to process this command. (Exception from HRESULT: 0x80070008) (mscorlib)
What's in this 700 MB script?! I would hope that there are some similarities/repetitions that would allow you to shorten the file.
Just some guesses:
Instead of inserting a million records using Insert statements, use a bulk loading tool
Instead of updating a number of individual records, try to batch updates that set the same value into one statement (e.g. UPDATE tab SET col=1 WHERE id IN (..) instead of individual updates; see the sketch after this list)
Long manipulations can be defined as a stored procedure (created before running the script), so the script would only have to call the stored proc
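As a sketch of that batching idea (the table/column names and the input format are assumptions), a small pre-processing step could collapse single-row updates that set the same value into one statement:

from collections import defaultdict

# Hypothetical input: one (id, value) pair per single-row UPDATE in the script
updates = [(1, 'A'), (2, 'A'), (3, 'B'), (4, 'A')]

# Group ids by the value they are set to, then emit one UPDATE per value
ids_by_value = defaultdict(list)
for row_id, value in updates:
    ids_by_value[value].append(row_id)

for value, ids in ids_by_value.items():
    id_list = ', '.join(str(i) for i in ids)
    print("UPDATE tab SET col = '%s' WHERE id IN (%s);" % (value, id_list))
# -> UPDATE tab SET col = 'A' WHERE id IN (1, 2, 4);
# -> UPDATE tab SET col = 'B' WHERE id IN (3);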
Of course, splitting the script up into smaller portions and calling each one from a simple batch file would work too. But I'd be a little worried about performance (how long does the execution take?!) and would look for some faster ways.
What about breaking your script into several small files, and calling those files from a single master script?
This link describes how to do it from a stored procedure.
Or you can do it from a batch file like this:
REM =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
REM Define widely-used variables up here so they can be changed in one place
REM Search for "sqlcmd.exe" and make sure this path is valid for you
REM =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
set sqlcmd="C:\Program Files\Microsoft SQL Server\100\Tools\Binn\sqlcmd.exe"
set uname="your_uname_here"
set pwd="your_pwd_here"
set database="your_db_name_here"
set server="your_server_name_here"
%sqlcmd% -S %server% -d %database% -U %uname% -P %pwd% -i "c:\script1.sql"
%sqlcmd% -S %server% -d %database% -U %uname% -P %pwd% -i "c:\script2.sql"
%sqlcmd% -S %server% -d %database% -U %uname% -P %pwd% -i "c:\script3.sql"
pause
I like the batch file approach myself, because it is easier to tinker with, and you can schedule it as a Windows job.
Make sure the .BAT file is in a folder with the appropriate security restrictions, since it has your credentials in a plain text .BAT file.
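If you need to produce those smaller script files in the first place, a rough splitter along these lines might help (it cuts only at GO batch separators; the file names and the batches-per-file count are assumptions):

# Split a huge .sql script into smaller files, cutting only at GO separators
# so each piece remains a sequence of complete batches
batches_per_file = 1000  # hypothetical; tune to taste
part, batch_count, out = 1, 0, None

with open('c:\\huge_script.sql', 'r') as source:  # hypothetical input file
    for line in source:
        if out is None:
            out = open('c:\\script%d.sql' % part, 'w')
        out.write(line)
        if line.strip().upper() == 'GO':
            batch_count += 1
            if batch_count >= batches_per_file:
                out.close()
                out = None
                part += 1
                batch_count = 0
if out is not None:
    out.close()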
gzip should do.
SQL is much harder to shrink: the field names, table names, and commands need to be what they are. Plus, you wouldn't want to rewrite the commands as something shorter, because it could have implications for performance.
Depending on the DBMS that you use, it may allow short names for commands, and then there might be a converter.
(Answering this because it is the top item returned when I searched for "SQL script size")
I got the same error when trying to load a large script into Management Studio. In my case I was trying to downgrade a database from SQL2008 R2 to SQL 2008 by using the SQL Server script generator, which created a 700mb structure and data .sql file.
To get around it I used the command line to run the script instead:
C:>sqlcmd -S [SQLSERVER\INSTANCE] -i [FILELOCATION\FILENAME].sql
Hopefully this helps someone else.
Compressing the SQL file will give the best compression ratio.
Minifying the text of the SQL file will only save a few bytes or kilobytes per megabyte... it is not worth it.
The better approach is to create a routine that unzips and reads the file. That gives the biggest benefit, I think.
Today, file size shouldn't be a problem anyway. Dial-up connection? Floppy disks?
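As a rough sketch of that unzip-and-run idea (the file names and server are assumptions), you could decompress to a temporary file and feed it to sqlcmd:

import gzip, os, shutil, subprocess, tempfile

# Decompress the hypothetical huge_script.sql.gz to a temp file, run it, clean up
with gzip.open('huge_script.sql.gz', 'rb') as packed, \
     tempfile.NamedTemporaryFile(suffix='.sql', delete=False) as unpacked:
    shutil.copyfileobj(packed, unpacked)
    temp_path = unpacked.name
try:
    subprocess.run(['sqlcmd', '-S', '127.0.0.1', '-E', '-i', temp_path], check=True)
finally:
    os.remove(temp_path)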
pg-minify can do it, and not just for PostgreSQL but for most SQL dialects, including MS SQL Server, MySQL, etc.