I've got a .dacpac file that MSBuild publishes to a QA database for testing. The publish is failing, and the error I'm getting back is a generic "an error has occurred" message. I was hoping I could generate the deployment script from the dacpac and walk through it to see where the problem is occurring, and hopefully teach the QA team how to do this as well.
Is there any way to point a dacpac at a specific database and have it generate the SQL for updating that database without actually publishing to it?
You can use SqlPackage.exe. Look for it on your machine in a directory with a name similar to this:
C:\Program Files (x86)\Microsoft SQL Server\130\DAC\bin
Note that it may be found in the 110, 120 or 130 folder.
If you don't have SqlPackage.exe already, you can download it from here: https://www.microsoft.com/en-us/download/details.aspx?id=53013
If you download it, be sure to look in the System Requirements section of the download page to find the dependencies SqlSysClrTypes.msi and SqlDom.msi, which must be installed as well.
Example usage:
SqlPackage.exe /a:Script /SourceFile:C:\temp\mydb.dacpac /TargetConnectionString:"Data Source=myserver;Initial Catalog=mydb;Integrated Security=true" /OutputPath:C:\temp\mydb_deploy.sql
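If you'd also like a summary of what the deployment would change, rather than the full script, SqlPackage supports a DeployReport action along the same lines; the report file name here is just an example:
REM the /OutputPath file name below is an assumed example; use any path you like
SqlPackage.exe /a:DeployReport /SourceFile:C:\temp\mydb.dacpac /TargetConnectionString:"Data Source=myserver;Initial Catalog=mydb;Integrated Security=true" /OutputPath:C:\temp\mydb_report.xml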
I am working, as part of a wider team, on an SSIS package. We use TFS for collaboration, and the package has links to .sql files amongst other things.
When I run the package manually, the package runs without any issues, as the connection manager references my directory. When someone else tries to execute the package, they get an error message "Could not find a part of the path ......... script.sql"
Is there a way to correctly reference the TFS location so it works for everyone, or some other way to reference these files in SSIS so they work for everyone on the team?
Thank you
I'm trying to use VSTS to deploy to my database. The problem is that in one of the steps I need to pick up the dacpac file and deploy it to the Azure SQL server, but that step fails.
In that step I'm using "Execute Azure SQL: DacpacTask", which is provided by Microsoft in VSTS.
There is a field for this called "DACPAC File", and the documentation says to use it like this:
$(agent.releaseDirectory)\AdventureWorksLT.dacpac
but it gave me the below error:
No files were found to deploy with search pattern
d:\a\1\s\$(agent.releaseDirectory)\AdventureWorksLT.dacpac
so I cheated and hard-coded the value below in it:
d:\a\1\s\AdventureWorksLT.dacpac
It does work, but obviously it won't work forever, as I need to use an environment variable, something like:
$(agent.releaseDirectory)\AdventureWorksLT.dacpac
Any suggestions?
I've had this same problem. I wasn't able to find detailed documentation, but from experimenting, this is what I found.
I'm assuming that your DACPAC is created as part of a Build Solution task. After the build completes and the DACPAC is created, it exists in a sub-folder of the $(System.DefaultWorkingDirectory) directory.
Apparently, the Azure SQL Database Deployment task cannot access the $(System.DefaultWorkingDirectory) folder, so the file must be copied somewhere it can be accessed. Here's what I did:
1. The Visual Studio Build task builds the solution, including the DACPAC. The resulting DACPAC is placed in a $(System.DefaultWorkingDirectory) sub-folder.
2. Add a Copy Files task as your next step. The Source Folder property should be "$(System.DefaultWorkingDirectory)". The Contents property should be "**/YourDacPacFilename.dacpac". The Target Folder should be "$(build.artifactstagingdirectory)". The "**/" tells VSTS to search all subfolders for matching files. (A rough command-line equivalent of this copy is sketched after these steps.)
3. Add an Azure SQL Database Deployment task to deploy the actual DACPAC. The DACPAC file will be in $(build.artifactstagingdirectory).
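For reference, here's a rough command-line equivalent of that Copy Files step, as a sketch only; it assumes a batch script step, where the $(...) macros are VSTS variables expanded before the script runs:
REM recursively find any .dacpac under the working directory and copy it to the staging directory
for /r "$(System.DefaultWorkingDirectory)" %%f in (*.dacpac) do copy "%%f" "$(Build.ArtifactStagingDirectory)\"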
I had the same problem, and I solved it by removing the old artifact from the release and adding it again so it picked up the correct alias of the new artifact.
That's why the Azure SQL Database Deployment task says it doesn't have access to the $(System.DefaultWorkingDirectory) folder: the artifact has changed, and you must make sure you're using the new one saved in the Azure pipeline.
I'm trying to use the vsdbcmd.exe command to deploy a SQL DB Project. However, when I try to build said project using MSBuild, or even in VS (2013), it is not generating a dbschema file (as documented here).
Can anyone offer any suggestions as to why this might be? I have permissions in the solution and project directory. I've tried build, rebuild, msbuild /t:Build c:\DB\DB.sqlproj, but I can't see the .dbschema file anywhere.
You're using the wrong command for the job. You're using either SSDT or the native VS2013 projects. Those generate *.dacpac files (which are glorified zip files and contain the dbschema and other files). You need to use SQLPackage.exe to publish those.
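For example, a minimal publish command might look like the following; the .dacpac path and connection string here are placeholders for your own values:
REM source file path and connection string are assumed examples
SqlPackage.exe /a:Publish /SourceFile:C:\DB\bin\Debug\DB.dacpac /TargetConnectionString:"Data Source=myserver;Initial Catalog=mydb;Integrated Security=true"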
I've got some articles about the SSDT process on my blog: http://schottsql.blogspot.com/2013/10/all-ssdt-articles.html
This article in particular likely pertains to your issue: http://schottsql.blogspot.com/2012/11/ssdt-publishing-your-project.html
When I run my TeamCity build with the only build step being of runner type Visual Studio (sln), I get the following error:
C:\TeamCity\buildAgent\work\4978ec6ee0ade5b4\Test\Code\Test.sln(2, 1): error MSB4025: The project file could not be loaded. Data at the root level is invalid. Line 2, position 1.
This is on a dedicated CI server running TeamCity Professional 8.1.1 (build 29939). There are several other successfully-running builds on this server.
The odd bit is that the same build runs successfully on TeamCity on my dev machine. I followed an answer to a similar question, and copied the specified folders across, but that didn't help.
I'm sure the project/solution file isn't invalid because in addition to the build running on my dev box, I have opened the solution in Visual Studio and built it there with no problems.
Any suggestions?
I just fixed this.
Look inside the Test.sln file for Project or EndProject tags that aren't closed. For us, an EndProject was missing; it broke on TeamCity but caused no issues in Visual Studio.
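A quick way to check for an unbalanced pair is to list both kinds of lines with findstr (assuming the file is Test.sln); the count of Project( lines and EndProject lines should match:
REM prints each matching line with its line number; ^ anchors the match to the start of a line
findstr /n "^Project ^EndProject" Test.sln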
It seems the TeamCity error message can occur for any number of root causes. In my case, the problem was that a line inside the GlobalSection(NestedProjects) section referred to a project GUID which didn't relate to any project defined in the solution file.
As with the previous post, I didn't have any issues building in Visual Studio; I only got a more helpful error message, which let me discover the real problem, when I built using msbuild.
See https://therightjoin.wordpress.com/2014/07/04/msb4025-the-project-file-could-not-be-loaded-data-at-the-root-level-is-invalid-error-when-building-ssdt-project-in-teamcity for another example where using msbuild helped identify the true problem.
In our case, it was a duplicate project reference in the solution file (caused by near simultaneous commits and an automatic merge).
In our situation, the problem was specifying a ToolsVersion that was not installed on that machine (14.0, which VS2015 includes but VS2017 does not by default).
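If that's the cause, one option besides installing the missing version is to point MSBuild at a ToolsVersion the machine actually has; a sketch, assuming 15.0 (VS2017) is installed and the solution is Test.sln:
REM /toolsversion (or /tv) overrides the ToolsVersion declared in the project files
msbuild Test.sln /toolsversion:15.0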
In my case, after merging, the .sln file had a mismatch of lines under the
GlobalSection(NestedProjects) = preSolution
{6B971E15-6B61-4AA8-9B93-9639C23269C3} = {9A14E7EF-3FA1-4B9A-B413-C550B3E5AC62}
{54D14F01-D576-4DE6-9404-D21AD0DC4916} = {9A14E7EF-3FA1-4B9A-B413-C550B3E5AC62}
... (was some extra entry here )
...
EndGlobalSection
section. In plain words, some extra lines were added by the merge. So, if you have merged, compare the two solution files manually. You can start by comparing the total line counts of both files.
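For the manual comparison, the built-in fc tool works well; a sketch, assuming you've saved pre-merge and post-merge copies as before.sln and after.sln (both file names are just examples):
REM /n prefixes each differing line with its line number
fc /n before.sln after.sln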
In another case, we had blank lines, so make sure any blank lines are removed!
Hope this helps someone else too!
I got this same error with Jenkins. It turns out the root Jenkins folder was set to C:\Program Files (x86)\ and it didn't have write access to bin and obj directories.
Error:
error MSB4025: The project file could not be loaded. Data at the root level is invalid.
I launched cmd as Administrator and ran this:
"C:\Program Files (x86)\MSBuild\14.0\Bin\MSBuild.exe" "C:\Program Files (x86)\Jenkins\workspace\BuildBI_1\Reports\Test\ReportsTests.sln" /t:Build /p:RunOctoPack=true
And that gave me clues about not being able to write to bin and obj.
This worked for me: install Build Tools for Visual Studio 2017; make sure to select the C++ tools, the Windows 10 SDK, and MSBuild, and you're set.
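If you'd rather script that install, the Build Tools bootstrapper accepts workload switches on the command line; a sketch, with workload IDs as I understand them from the VS2017 installer documentation:
REM assumes vs_buildtools.exe (the downloaded bootstrapper) is in the current directory
vs_buildtools.exe --add Microsoft.VisualStudio.Workload.MSBuildTools --add Microsoft.VisualStudio.Workload.VCTools --includeRecommended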
Use MSBuild to identify the underlying problem:
$> msbuild mysolution.sln
That gave me the real error, with the correct line number.
If msbuild cannot be accessed like that from the command line / PowerShell, try the MSBuild.exe shipped with Visual Studio, e.g. C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\MSBuild\Current\Bin\amd64\MSBuild.exe.
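One way to locate that MSBuild.exe without guessing the edition folder is vswhere.exe, which the Visual Studio installer places at a fixed path; a sketch:
REM prints the full path of the newest installed MSBuild.exe
"C:\Program Files (x86)\Microsoft Visual Studio\Installer\vswhere.exe" -latest -requires Microsoft.Component.MSBuild -find MSBuild\**\Bin\MSBuild.exe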
Visual Studio itself seems to be very tolerant of errors and inconsistencies in the solution file, so the fact that it opens in VS is no guarantee that the .sln file is correct.
I fixed it by updating the solution file.
Another possible problem (and resolution): I had a stray, unused solution file in my repo, pointing to who-knows-where, and the MSBuild step in my Azure DevOps pipeline was set to **\*.sln.
I have an SSIS master package which executes several child packages. It works great, but when I deploy it to the file system on the server, I get error code 0xC00220DE, "The system cannot find the file specified."
When I run the package on the server by double-clicking it, it works correctly. But when I use DTExec:
dtexec /FILE "d:\cmcdx\ssis\MAESTRO_FACTURACION.dtsx"
I get the mentioned error.
The package configuration is correct, and the user I'm executing the package as is an administrator on the machine.
Should I deploy the packages to Sql Server? What are the best practices for deploying a master-child package? I'm running out of ideas here...
By the way, I'm running SQL Server 2005 SP3.
Solved it.
I was using relative paths to point to the child packages, and at runtime SSIS was unable to find them.
In the end I used a specific path, set in a configuration file. Then I used the deployment utility, copied everything to the server, ran it by double-clicking the SSISDeploymentManifest file, and changed the paths to the proper location.
Thanks to James and Justin for your answers.
Is the package not getting a path or location value from a package configuration file? If so, make sure you include the /ConfigFile argument and the path to the config file. Another thing to check is whether any connections in the package refer to mapped network drives; these may not work when running under a different service account than your local console account.
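For example, an invocation that points dtexec at a configuration file explicitly might look like this; the .dtsConfig path is just an assumed example:
REM the maestro.dtsConfig file name is hypothetical; substitute your own config file
dtexec /FILE "d:\cmcdx\ssis\MAESTRO_FACTURACION.dtsx" /CONFIGFILE "d:\cmcdx\ssis\maestro.dtsConfig"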
[Edit]
Try this command line below on the server (notice the double slashes).
dtexec /FILE "d:\\cmcdx\\ssis\\MAESTRO_FACTURACION.dtsx"
There are several things that could be going wrong here. You mention that you're using a master package to run several child packages. Are all of your child packages in their proper location on the server as well?
Remember that the paths to the child packages should be variables in your master package so that those values can be changed through a configuration file on the server if need be.
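As an illustration, a child-package path held in a variable can also be overridden straight from the command line with dtexec's /SET option; the variable name User::ChildPackagePath and the child package path below are hypothetical:
REM User::ChildPackagePath is a hypothetical variable name; substitute your own
dtexec /FILE "d:\cmcdx\ssis\MAESTRO_FACTURACION.dtsx" /SET \Package.Variables[User::ChildPackagePath].Properties[Value];"d:\cmcdx\ssis\CHILD.dtsx"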
You might also want to check out this set of tutorials on MSDN:
Package Deployment How-To Topics
These tutorials explain how to properly enable package configurations on the server when your package runs.