I am using the Database Publishing Wizard to generate a large create script containing both data and schema. The file it produces is ginormous, so opening the script to fix any syntax errors is next to impossible, and machines with less than 4 GB of RAM have a hard time getting it to run! What should I do, and how should I do it? Thanks everyone for any help you can provide.
With the Database Publishing Wizard, you can have all the objects created as separate files instead of one big one. Then you can put all the files in source control and track all changes.
My current project uses a script to recreate the database for development. The script deletes all existing objects and then re-adds them, using the following statements for each object file:
sqlcmd -S %1 -d THRIVEHQ -E -b -i "../Tables/Schema.sql"
if %ERRORLEVEL% NEQ 0 goto errors
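The same pattern scales to a whole folder of object files. A minimal PowerShell sketch of the idea (the file list and names here are hypothetical, not my actual build script):

# Recreate database objects from individual files; stop on the first failure.
param($Server)   # server name, like %1 in the batch version above
foreach ($file in '../Tables/Schema.sql', '../Tables/Data.sql', '../Procs/Procs.sql') {
    sqlcmd -S $Server -d THRIVEHQ -E -b -i $file   # -b: non-zero exit code on SQL errors
    if ($LASTEXITCODE -ne 0) { throw "Failed on $file" }
}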
Just want to add to Kevin's comment: break the scripts into separate files, then write a script that runs all the files in order of execution.
Dumping a large database that has a lot of inter-dependencies as one large file won't do you much good, as in most cases the script won't execute without errors. In my world I use a naming convention that helps me quickly see which objects, in this case views, are dependent on other views. For example, if I have a view that just produces a dump of data, I'd use something like v_VIEW_NAME_ORIGINATING-TABLE_Dump; then I'd change the suffix to something like _weekly or _weekly_Summary for views that are derived from the main dump table.
I learned my lesson many years ago and have since followed this naming scheme in all my databases.
Try DBSourceTools.
http://dbsourcetools.codeplex.com
It will script your entire database to disk, one file per database object.
Using Deployment Targets, you can then re-create any database from file.
It's specifically designed to help developers get their databases under source code control.
I would do it in steps.
Generate all of your tables and views as 1 script.
Generate all of your stored procedures and grants as 1 script.
Use DTS or SSIS to migrate your data.
All of this can be achieved with MS SQL Server Management Studio.
Off the top of my head, I would say write a script to separate the file into multiple ones, with a break occurring after each GO batch separator. You could then write another script to execute each broken-out file in order.
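For example, a rough PowerShell sketch of the splitter (paths are hypothetical, and it naively assumes each GO sits alone on its own line). It reads line by line, so the big file never has to fit in memory:

# Split a huge script into one file per GO batch.
$outDir = 'C:\scripts\split'
New-Item -ItemType Directory -Force -Path $outDir | Out-Null
$batch = New-Object System.Text.StringBuilder
$index = 0
Get-Content 'C:\scripts\huge_publish_script.sql' | ForEach-Object {
    if ($_ -match '^\s*GO\s*$') {
        $index++
        Set-Content -Path ("$outDir\{0:D5}.sql" -f $index) -Value $batch.ToString()
        [void]$batch.Clear()
    } else {
        [void]$batch.AppendLine($_)
    }
}
if ($batch.Length -gt 0) {   # flush a trailing batch with no closing GO
    Set-Content -Path ("$outDir\{0:D5}.sql" -f (++$index)) -Value $batch.ToString()
}

The zero-padded file names keep the batches in their original order when you run them back.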
I have an application that has been released for several years. I have updated the SQL Server database around 90 times and each time I have created an update SQL script, so when the user runs the new software version the database is updated.
I would like to amalgamate all the updates into a single SQL script to make deployment faster and simpler. Is there an easy way to do this? I presume I could simply grab the latest database, after it has run through all 90 updates, via SQL Server Management Studio?
EDIT
For clarification: the software I wrote automatically applies new database updates when the user downloads the latest version. This is done via C#/.NET; on startup it looks for embedded SQL scripts named in the format XX_update.sql, calling each script one by one, i.e.
1_update.sql - this creates the tables and initial data, etc. This was my initial release database.
2_update.sql - updates to the initial database, such as adding a new SP or changing a column's data type, etc.
3_update.sql
4_update.sql
...
90_update.sql (4 years and lots of program updates later!).
Ideally, I would install my software and create a brand-new database by running through all 90 update scripts, then take that database and convert it into a single script with which I could replace all 90 scripts above.
This is too long for a comment.
There is no "easy" way to do this. It depends on what the individual updates are doing.
There is a process you can follow in the future, though. Whenever an update occurs, you should maintain scripts both for incremental updates and for complete updates. You might also want to periodically introduce major versions, and be able to upgrade to and from those.
In the meantime, you'll need to build the complete update by walking through the individual ones.
I use a similar system at work, and while I prefer to run the scripts separately, I have sometimes amalgamated several scripts with no problem when they had to be deployed by another person.
In SQL Server, the rule is that as long as you separate the scripts with GO and use SQL Server Management Studio or another tool that processes the batch separator properly, there is no problem in amalgamating them: they still look like separate scripts to SQL Server, because instead of being sent as one big script, the tool sends them in batches using GO as the batch separator.
The only problem is that if you have an error in the middle of the script, the tool will continue sending the rest of the batches to the server (at least by default; maybe there is an option to change this). That is why, when possible, I prefer to run them separately: it errs on the safe side, stopping if there is a problem and making it easy to locate the problematic command.
Also, for new deployments, your best option is what you say: use a database that is already updated instead of taking the original one and applying all the updates. And to avoid being in this situation again, you can keep an empty template database that is not used for any testing and update it whenever you update the rest of your databases.
Are you running your updates manually on the server? Instead, you can create a .bat file, PowerShell script, or an exe. The update scripts in your folder can be numbered, and the script can just pick up the updates one by one and execute them over the database connection instead of you doing it.
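For example, a minimal PowerShell sketch (server, database, and folder names are placeholders, not your actual layout):

# Run N_update.sql files in numeric order against the target database.
Get-ChildItem 'C:\updates\*_update.sql' |
    Sort-Object { [int]($_.Name -split '_')[0] } |   # 1, 2, ... 90, not 1, 10, 11...
    ForEach-Object {
        sqlcmd -S 'MYSERVER' -d 'MYAPPDB' -E -b -i $_.FullName
        if ($LASTEXITCODE -ne 0) { throw "Update failed: $($_.Name)" }
    }

The numeric Sort-Object matters: a plain alphabetical sort would run 10_update.sql before 2_update.sql.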
If you have multiple script files and you want to combine them into one, rename them as .txt, combine them, and rename the resulting file back to .sql.
https://www.online-tech-tips.com/free-software-downloads/combine-text-files/
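For what it's worth, the rename step isn't needed if PowerShell is available; something like this (paths hypothetical) concatenates the files in name order:

# Assumes each source file already ends with a GO batch separator.
Get-Content 'C:\scripts\*.sql' | Set-Content 'C:\combined\all_scripts.sql'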
This is easy. Download SQL Server Data Tools and use that to create the deployment script. If there's no existing database, it will create all the objects, and if targeting an older version, it will perform a diff against the target database and create scripts that create the missing objects and alter the existing ones.
You can also generate scripts for the current version using SSMS, or use SSMS to take a backup and restore it during the install.
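If you go the SSDT route, the diff script can also be generated from the command line. A sketch with placeholder names, assuming SqlPackage.exe is installed alongside SSDT:

# Generate an upgrade script by diffing a .dacpac against an existing database.
SqlPackage.exe /Action:Script /SourceFile:MyAppDb.dacpac `
    /TargetServerName:MYSERVER /TargetDatabaseName:MyAppDb `
    /OutputPath:upgrade_to_current.sql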
I have searched for an answer to this, and one seems not to exist.
Problem:
A website is querying a database and is unable to return results (as an export to Excel) in a timely fashion, primarily due to result-set size. I'd like to set up a background process to 'ping' for waiting queries and execute them one by one, dumping the data into a location to be downloaded from. The 'pinging' task can be handled in a whole host of ways. My original ideal solution was a trigger (alternatively, a SQL Server Agent task) that exported the data to the filesystem, but I have run into an issue: I don't know how to set up an amorphous output to the filesystem with a simple T-SQL statement.
SSIS is apparently the standard solution to this. I don't know enough about SSIS to know whether it will handle what I want it to do, but I have been told the queries are too great in number / various in output for that to be a feasible solution.
xp_cmdshell can be run to do a BCP export. This works fine, but apparently opens a security hole.
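For reference, the BCP route I mean looks something like this (run via xp_cmdshell or a scheduled task; server, table, and path are placeholders):

# -c: character data, -t,: comma field terminator, -T: trusted connection
bcp "SELECT Col1, Col2 FROM MyDb.dbo.BigResult" queryout "C:\exports\BigResult.csv" -c -t, -S MYSERVER -T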
Previous solutions:
A solution I used years ago, DTS passing data straight to the operating system, seems to have been disabled in SQL Server 2008/2012. I also used to be able to use sp_makewebtask to export data directly to the filesystem, but I can no longer do that either.
Current solution
I am writing a PowerShell script tied to some SQL tables and stored procedures to manage execution. This seems like a non-ideal solution; I'm curious as to whether I have missed something. Is there an easy way to set up SSIS to export data without a structure? A way to create an Excel file on the fly and fill it with data?
The answer seems to be No.
You can export to CSV instead of Excel (Excel opens CSV files easily), but CSV files don't carry any formatting. You can set up SSIS (or BCP in a scheduled task) to export into the CSV file a single column which already contains the commas and the text delimiters, so the data will be presented by Excel in separate columns.
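If PowerShell is an option, the CSV route can skip SSIS/BCP entirely. A minimal sketch, assuming the SqlServer module's Invoke-Sqlcmd is installed (query, names, and paths are placeholders):

# Export-Csv adds the commas and text delimiters, so Excel splits the columns correctly.
Invoke-Sqlcmd -ServerInstance 'MYSERVER' -Database 'MyDb' `
    -Query 'SELECT Col1, Col2 FROM dbo.BigResult' |
    Select-Object Col1, Col2 |
    Export-Csv -Path 'C:\exports\BigResult.csv' -NoTypeInformation

The Select-Object is needed because Invoke-Sqlcmd returns DataRow objects carrying extra properties you don't want in the file.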
I was wondering if there is a way to automatically append to a script file all the changes I am making to my columns, tables, relationships, etc.
The thing is, I am making a lot of different changes on a TEST db, and the idea is to apply this change script when I move the test db to production... hence keeping production data but applying all schema and object changes.
Is there an easy way to do this? Can it also migrate database diagram changes?
I have seen how you can create a change script each time I make a change, but this means I have to copy and paste it into a master file. Actually, that's pretty easy!
I was just wondering if I was missing something?
Do not make changes to the test server using the UI. Write scripts and keep them under source control. You can test your scripts starting from backups of the live data, and you can tune your scripts until they achieve the desired result. Then you can check in the scripts for reference and later apply them on the live server. See the article Version Control and Your Database.
BTW, check out the SSMS Tools Pack; I think it may do what you want (I'm not sure). My advice stands nonetheless: version your schema, use explicitly created/saved scripts, and use source control.
There's no way to directly generate a "delta" script in SSMS.
However, if every time you publish changes you script out the entire database, including data, to SQL using the SQL Server Database Publishing Wizard, you should be able to extract diffs between the versions and get your deltas that way.
If money is no object, you can purchase Visual Studio Team System Database Architect edition and use its fantastic database comparison tools to generate and version control exactly the diffs you want.
Try using the tablediff utility that came with SQL Server 2005:
SQL Server 2005 TableDiff Utility
tablediff Utility
Our process is that when a developer finishes a change, they script it out and check it into Subversion. In Subversion we have a folder for Tables, Stored Procs, Data, etc. They script it so it is repeatable (i.e., don't insert the new data if it is already there). This is important to do anyway, so you keep the history of changes for a given object in the database.
In the past, we would just enter each of the files that we wanted scripted out into a text file (i.e. FileListV102.txt). When we were ready to make a release we would do “get latest” on all of the files (from VSS back then.) We then had a simple utility that would read the “file list” file and open each of those files in turn concatenating them into an output file. That is pretty easy to code.
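For what it's worth, that concatenation utility is only a few lines in PowerShell today (file names hypothetical, and it assumes the list contains resolvable paths):

# Read the ordered file list, append each script plus a GO separator, write one release file.
Get-Content 'FileListV102.txt' |
    ForEach-Object { Get-Content $_; 'GO' } |
    Set-Content 'Release.sql'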
We outgrew that, and now we have a release management tool (which can be found here and will be on sale in mid-September) that takes all of the files and creates one big SQL script file out of them. It does this in the order you would expect based on the folder names – so files found in the "Tables" folder are done before those in the "Data" folder, etc.
Either way, once you are done you have a big SQL script file that you can then apply to a fresh copy of production and that is what you test against.
I know I'm way late to the party, but I just wanted to add that there are dozens of third-party products out there. Some are very good, some are very cheap or free, and some are a mixture. I listed 22 here:
http://bertrandaaron.wordpress.com/2012/04/20/re-blog-the-cost-of-reinventing-the-wheel/
We have been using a relatively new tool called Kal Admin.
It has a change management feature and makes distributing selected changes to other databases very easy. We used to do this by comparing two databases, but that did not satisfy our need for change tracking.
BTW, Kal Admin has metadata and data compare capabilities as well.
I've been looking at the SqlPubWiz.exe command to write a batch file so that I can keep my script up to date in my source control. But what I need is for the command-line tool to allow me to pick specific tables to include (and to exclude others).
I think SqlPubWiz.exe won't do that for me (let me know if I'm wrong), but if someone can point me to another tool that can, that's what I'm looking for.
There are several commercial tools out there that can create database scripts, e.g.
ApexSQL's SQL Script
EMS DB Extract
Here's an article showing off a free tool - however, it will only script ALL objects from your database: Eric Moreau's blog.
If you want to "roll your own", have a look at the SQL Server Management Objects (SMO) - they allow you to inspect your database objects and create scripts from them.
See info here, here or here.
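A rough sketch of the SMO route from PowerShell (server, database, and output paths are placeholders, and the assembly load line varies by SMO version):

# Script every non-system table to one file per table; assumes the output folder exists.
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null
$server = New-Object Microsoft.SqlServer.Management.Smo.Server 'MYSERVER'
$db = $server.Databases['MyDb']
$scripter = New-Object Microsoft.SqlServer.Management.Smo.Scripter $server
$scripter.Options.DriAll = $true   # include keys, constraints, and indexes
foreach ($table in $db.Tables | Where-Object { -not $_.IsSystemObject }) {
    $scripter.Script($table) | Set-Content ("C:\scripts\Tables\{0}.sql" -f $table.Name)
}

Since you control the loop, including or excluding specific tables is just a Where-Object filter - which is exactly the selectivity SqlPubWiz.exe doesn't give you.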
Marc
Try DBSourceTools. http://dbsourcetools.codeplex.com
It's open source, and specifically designed to script databases - tables, views, procs - to disk.
It also allows you to select which tables, views, db-objects to script.
I have a VB project that runs on SQL Server 2005. When making the setup file for it, how do I include the DB?
You don't.
Typically you have a DB generation script that is run either as part of setup or as part of the first run of the application.
You also need to consider migrations (changes to the DB when new releases of your application are published).
Consider using MigratorDotNet or RikMigrations to solve these problems in a separate installer/upgrade program if you are still using VB6.
I disagree; you could include the database. Simply distribute the .MDF file with your application.
Of course, the setup application would have to know how to attach the database to an existing SQL Server RDBMS.
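The attach step itself can be scripted. A minimal sketch with placeholder names and paths, assuming the installer runs with sufficient rights:

# Attach the shipped .mdf and rebuild its log file.
sqlcmd -S '.\SQLEXPRESS' -E -Q "CREATE DATABASE MyAppDb ON (FILENAME = 'C:\Program Files\MyApp\Data\MyAppDb.mdf') FOR ATTACH_REBUILD_LOG"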
Both methods given in the above answers will work; I have tried them both. However, using a DB generation script reduces the size of the final deployment files considerably. I would launch the script on the first run of the application and not in the setup itself.
I will second jack on this one.
In my experience, installs that require an actual database file tend to have more issues, both when updating and on first install, than those that run scripts. As jack mentioned, another bonus is reduced file size.
You can create whole-database scripts by right-clicking on the required database and selecting the script database option. Note, however, that this will only create the tables and fields, not replicate any data.