Which is the better option for deploying databases when using a database project (VS 2010): VSDBCMD or SQLCMD? Is there any major drawback other than the defaulted variables (databasename, datapath and logpath)?
vsdbcmd is a diff tool: it can analyze the .dbschema file, compare it with the target database, and bring the target up to the schema in the .dbschema file by selectively adding, dropping and altering existing objects. sqlcmd is only an execution tool: it takes a .sql script and blindly runs it. So it's really apples to oranges; the two tools are quite different in purpose and capabilities.
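For illustration, a minimal vsdbcmd invocation might look like the following; the server, database and model names are made up, and the switches (/a for the action, /dd to control whether changes are actually deployed, /cs for the connection string, /model for the compiled schema) should be checked against your version's documentation:

rem Compare the compiled model to the target database and deploy the differences
vsdbcmd /a:Deploy /dd:+ /model:MyDatabase.dbschema /cs:"Data Source=.;Integrated Security=True" /p:TargetDatabase=MyDatabase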
Your question is not very clear, so it isn't easy to give you a good answer, but according to your comment to Remus's answer, I assume you are trying to execute the .sql script that vsdbcmd.exe generated. If my assumption is correct, you need to use sqlcmd.exe to execute this script.
According to this thread on MSDN Forums, the VSDB team didn't want to duplicate functionality in vsdbcmd.exe that already exists in sqlcmd.exe.
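So a plausible two-step flow, with all names and paths illustrative, is to have vsdbcmd emit the change script without deploying (/dd:-) and then hand that script to sqlcmd, passing values for the SQLCMD variables mentioned in the question via -v:

rem Step 1: generate the deployment script only; nothing is changed on the server
vsdbcmd /a:Deploy /dd:- /model:MyDatabase.dbschema /cs:"Data Source=.;Integrated Security=True" /p:TargetDatabase=MyDatabase

rem Step 2: execute the generated script, supplying the variable values
sqlcmd -S . -i MyDatabase.sql -v DatabaseName=MyDatabase DataPath="C:\Data" LogPath="C:\Logs"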
I have set up a project in DataGrip with several SQL files spread over a couple of directories, like this:
My hope is to manage the complexity as this turns into hundreds of files. This is a learning/proof of concept level effort right now.
What I want to do is have a way to run/build/publish this project, but at present the best I have found is to select the files and then do a "Run Files" (CTRL+SHIFT+F10). This worked for a bit, but now I have a foreign key that gets created in the wrong order. I don't want to resort to a hack like prefixing the file names with integers to force a specific order; it feels like a real kludge.
How should I accomplish this? I must have missed something, since the alternative is very manual and error-prone. If it matters, the database I am working against is Oracle.
Since DataGrip 2020.1, you can create a Run Configuration and specify a data source and multiple files or scripts to run against it.
Refer to the DataGrip blog post for details.
I've got a question about building a deployment script using SSDT.
Could anyone tell me if it's possible to build a deployment script using SQLPackage.exe where the source file is NOT a dacpac file, but uses the .sql files instead?
To give some background, I've created a project in Visual Studio 2012 for my database schema. This works great, and SSDT builds the folder structure without a problem (functions, stored procedures, etc., which contain all the .sql files).
Here's the problem: the database in question is from a legacy system and is riddled with errors. Most of these errors we don't care about anymore, and it's not practical or safe to fix them all, so for years we've basically ignored them. However, it means we can't build the project and therefore can't generate the dacpac file. This doesn't prevent us from doing the schema compare and syncing the database with the file system (a local Mercurial repository), but it does seemingly prevent us from building a deployment script.
What I'm looking for is a way of building the deployment script using SQLPackage.exe without having to generate the dacpac file; I need to use the .sql files in the file system instead. Visual Studio will produce a script of the differences without building the dacpac, which makes me think it must be possible with SQLPackage.exe using one of its parameters.
Here's an example of SQLPackage.exe which I'd like to adapt to use the .sql files instead of the dacpac:
sqlpackage.exe /Action:Script ^
  /SourceFile:"E:\SourceControl\Project\Database\test_SSDTProject\bin\Debug\test_SSDTProject.dacpac" ^
  /TargetConnectionString:"Data Source=local;Initial Catalog=TestDB;User ID=abc;Password=abc" ^
  /OutputPath:"C:\temp\hbupdate.sql" ^
  /OverwriteFiles:true ^
  /p:IgnoreExtendedProperties=True ^
  /p:IgnorePermissions=True ^
  /p:IgnoreRoleMembership=True ^
  /p:DropObjectsNotInSource=True
This works fine because it uses the dacpac file. However, I need to point it at the folder structure where the .sql files are instead.
Any help would be great.
As has been suggested in comments, I think that biting the bullet and fixing the errors is the way ahead. You say
it's not practical or safe to fix them all,
but I think you should give this a bit more thought. I have recently been in a similar situation to you, and the key to emerging from it is to realise that the operational risk associated with dropping procedures and functions that will throw an exception as soon as they are called is zero.
Note that this does not apply if the reason these objects won't build is that they contain cross-database or cross-server references that are present in production but not in your project; this is a separate problem altogether, but also a solvable one.
Nor am I in favour of "exclude from build" as an alternative to "delete"; a while ago I saw a project where this technique had been deployed extensively; it makes it harder to see what does what from the source files and I am now of the opinion that "Build Action=None" is simply "commenting out the bits that don't work" for the Snapchat generation.
The key to all of this, of course, is source control. This addresses the residual risk that one day you might indeed want to implement a working version of one of your currently non-working procedures, using the non-working code as a starting point. It also obviates the need to keep stuff hanging around in the solution using Build Action=None, as one can simply summon an earlier revision of the code that contained the offending objects.
If my experience is any guide, 60 build errors is nothing; these could easily be caused by references to three or four objects that no longer exist, and can be consigned to the dustbin of source control with some enthusiastic use of the "Delete" key.
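If it would help to locate those dead references on a live copy of the database, here is a rough starting point using SQL Server's dependency view (sys.sql_expression_dependencies exists from SQL Server 2008 onwards; expect some noise from dynamic and caller-dependent references):

-- Modules whose referenced objects SQL Server can no longer resolve
SELECT
    OBJECT_NAME(d.referencing_id) AS referencing_object,
    d.referenced_entity_name
FROM sys.sql_expression_dependencies AS d
WHERE d.referenced_id IS NULL                -- the referenced entity does not exist
  AND d.referenced_database_name IS NULL     -- skip cross-database references, which also bind as NULL
  AND d.is_ambiguous = 0;                    -- skip names that could not be bound unambiguously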
Do you have a copy of SQL Compare at your disposal? If not, it might be worth downloading the trial to see if it will work in your scenario.
Here are the available switches:
http://documentation.red-gate.com/display/SC10/Switches+used+in+the+command+line
At the very least you'll need to specify the following:
/scripts1:
/server2:
/database2:
/ScriptFile:
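Put together, a minimal invocation might look like the sketch below; the executable name and all paths are illustrative, so check the linked switch documentation for your SQL Compare version:

sqlcompare /scripts1:"C:\Projects\MyDatabase\Scripts" /server2:MYSERVER /database2:TestDB /ScriptFile:"C:\temp\update.sql"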
I am installing a major upgrade of a product on my system.
Folders and registry entries will be updated as part of this.
I would like to take a snapshot of the folder structure and registry before/after installing the update so that I can compare them easily.
Is there any tool or simple PowerShell module available to do this?
I expect testers will have done this while doing installation testing. If you have followed any good approach, please share it.
One of the best tools I've come across for before/after registry comparisons is called RegShot:
http://portableapps.com/apps/utilities/regshot_portable
The 1.8.3 version supports 64-bit registries:
http://sourceforge.net/projects/regshot/
Yet another tool is called ZSoft Uninstaller:
http://portableapps.com/apps/utilities/zsoft_uninstaller_portable
This one is tailored toward software installation analysis.
Both of these can perform registry and file system before/after comparisons.
Well, practically, I think you'll have to limit the paths you want to "monitor". You can use the PowerShell registry provider very easily. For example:
# Dump the HKCU\SOFTWARE subtree to a text file for later diffing
# (note: Out-File writes a formatted object listing, not a valid .reg file)
Get-ChildItem -Path HKCU:\SOFTWARE -Recurse | Out-File HKCU_Software.txt
More information here. Then you can diff the before and after snapshots using a tool like DiffMerge. The same principle applies to directories.
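A minimal end-to-end sketch of that idea in PowerShell (the file names and the HKCU:\SOFTWARE scope are illustrative):

# Snapshot the key names before the install
Get-ChildItem -Path HKCU:\SOFTWARE -Recurse |
    Select-Object -ExpandProperty Name |
    Out-File before.txt

# ... install or upgrade the product here ...

# Snapshot again after the install
Get-ChildItem -Path HKCU:\SOFTWARE -Recurse |
    Select-Object -ExpandProperty Name |
    Out-File after.txt

# Keys in only one snapshot show up with <= (removed) or => (added)
Compare-Object (Get-Content before.txt) (Get-Content after.txt)

The same Compare-Object pattern works for directories if you snapshot with Get-ChildItem -Path C:\SomeFolder -Recurse instead.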
However, once again, beyond a shallow check I don't think that approach is realistic.
I don't know your context, but Microsoft's Attack Surface Analyzer might be useful.
I recommend RegistryChangesView by NirSoft.
It could be particularly good for you, as you mentioned PowerShell modules in your question and RegistryChangesView has documented command-line options. RegistryChangesView also supports exporting the comparison to a .reg, HTML, or CSV file, among other formats.
I'm still looking for a tool that can directly output the difference in the form of PowerShell commands; until then, I can export to a .reg file and then run it through a .reg-to-PowerShell-commands converter.
Other options include the classic Regshot which has also been forked as Regshot Advanced. However, personally, I go with RegistryChangesView because I think it has better documentation/command-line options.
I'm using Visual Studio 2010 database edition and running a schema compare against a SQL database instance to sync up new objects on the database which aren't yet in my project (dbproj). My solution contains many projects, some of which reference each other via database references. When I write updates from my schema compare to my target project, any references to objects in my other projects get added as hard references and do not use the SQLCMD variables that get set up when adding a database reference.
This causes my solution to have warnings and errors until I manually swap out the hard references for the SQLCMD variables. I've found I can re-make the database references, which has an option to pick up all the hard references through the project, but this is still cumbersome. Is there a way to use the existing set of SQLCMD variables defined for my project so that updates are written to my project with the variable references used?
This is just a thought, but we never make direct changes in the database; we use our project to build against the db and produce a deploy script. By treating the project as the master of all knowledge, we feel this works very well for us.
We branch out to create our dev, test environments then merge the project when we deploy.
As for references, we found that referencing the dbmeta file instead of the actual database was also useful, as it makes all the projects more transferable.
My solution isn't exactly what you want to do, but it might help. When I run into similar issues, I usually create a (temp) DB for the schema compare and use it instead of my project.
I have an Oracle .dmp file and no access to a local Oracle install.
Is there any way I can read the data or open it in another program to see what data is in this file?
Nope, .dmp files are generally meant to be read and imported by Oracle's imp tool.
If that's it, don't waste your time. DMP files are to be read by Oracle's IMP utility and nothing else (you could, of course, use any editor, but you're close to see nothing useful in there).
http://www.orafaq.com/forum/t/64809/42800/
It seems that if you have a licensed copy of Toad, you can use the DBA utilities to view the files:
The DBA utilities in TOAD has File Browser that allows you to view export files. Although I think this function is only available in a fully licensed copy of TOAD.
http://www.orafaq.com/forum/t/64809/42800/
If all you want is a feel for what's in there, open it with Vi or Notepad++ or some other reasonable editor and you can see the DDL statements scattered throughout. A lot of the data will be intelligible as well. However, it IS a binary file, so making a lot of use of it is problematic. Also, if it's huge, the editor you use may not be able to handle it, so you might want to use a pager like less.
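If you're on a Unix-like system (or have Git Bash or Cygwin on Windows), the classic trick is to pull out just the printable text; the file name below is a placeholder:

# Page through whatever readable text the dump contains
strings mydata.dmp | less

# Or hunt for the DDL specifically
strings mydata.dmp | grep -i "CREATE TABLE"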
That said, can't you download the Oracle Client software and install it? The administrator utilities in there include imp/exp IIRC.
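If you do get the client installed, imp can list the contents of a dump without importing anything via its SHOW parameter; the credentials and file name here are placeholders:

rem Display the contents of the export file without actually importing it
imp scott/tiger file=mydata.dmp full=y show=y log=contents.log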