WebDeploy only if package is newer than current deployment

I am using WebDeploy packages to update UAT and Production environments. Is it possible to run the deployment process such that it will only actually deploy if the package is an update to the currently deployed application?

Based on my observations, it appears that this is in fact the default behaviour.
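If you want the "only copy what changed" comparison to be content-based rather than timestamp-based, msdeploy's -useCheckSum switch can be added to the sync. A minimal sketch, with the package path and destination as placeholder values:

# Sync the package to the target; only files that differ are copied.
# -useCheckSum compares file contents instead of timestamps, which helps
# when rebuilt packages contain identical files with newer timestamps.
& "C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" `
    -verb:sync `
    -source:package="MyApp.zip" `
    -dest:auto,computerName="UatServer" `
    -useCheckSum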

TFS Build definition - add step to run DacPac for Unit Tests

We are using an on-premises installation of Microsoft Visual Studio Team Foundation Server, version 16.131.
We have a Development continuous-integration build definition that includes running some unit tests on a very old application. As such, the unit tests require a database to run (I know, not an ideal situation for unit tests...)
One of the artifacts of the build is a DACPAC of the database.
I'd like to deploy any database changes from that DACPAC as part of the build definition, before running the unit-test steps. That way, any tests that are added or changed and that depend on database changes will (hopefully) pass.
Any ideas if this is possible, and if so, how can I publish from the DACPAC within the build definition?
We are using a PowerShell step in the build/release pipeline.
In the PowerShell script you call SqlPackage:
&"C:\Program Files\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" /Action:Publish /SourceFile:"Dacpac_Artifact\DACPAC\DB.dacpac" /Profile:"Dacpac_Artifact\DACPAC\FileWithPublishPrife.publish.xml"
You could also have a look at this Microsoft DevLabs task; it contains a sub-task to deploy a DACPAC remotely, so you could deploy to another machine (or use localhost to deploy on the build server itself if you want). There are also other marketplace tasks you could use; just navigate to the marketplace and search for "sql" and you'll see a dozen.
IIS Web App Deployment Using WinRM

Download Nuget Packages for Offline Development/Publishing

Short Question
Is there a way to programmatically download all of the NuGet packages needed to build/publish a solution and output them into a single directory that can be moved to a local NuGet repository?
Some Background
Where I work, most development is done on an air-gapped network (no internet). On a recent project, I was able to develop on our internet-enabled network. This happens to be the first application developed on the internet-enabled network and the first ASP.NET Core app we've ever developed. The solution builds, runs, and publishes just fine on the internet-facing network. I am now trying to move the solution to the air-gapped network, but I am having issues getting all of the dependencies moved over.
At first, the solution wouldn't build because of missing ASP.NET Core NuGet packages, so I copied ALL of the NuGet packages from the local cache on the machine that I used to develop the application to the local NuGet repository on the air-gapped network. Now the application builds, but I can't seem to publish the web project (ASP.NET Core). I'm getting 25+ errors along the lines of:
Unable to find package runtime.any.System.Diagnostics.Tools. No packages exist with this id in source(s)...
Unable to find Microsoft.NETCore.App with version (>= 2.1.6)...
Unable to find...
I'm also getting a runtime error "This page cannot be displayed" when I try to run from Visual Studio (using IIS Express), but I'm not sure if that's a related issue. The unit/integration tests run fine.
I could try manually downloading each NuGet package from Nuget.org and moving them over to the air-gapped network, but it takes hours to get things moved from one network to another. Is there any way I can automate the retrieval of all NuGet packages required to build/publish a solution, so that I can make a single transfer from one network to the other instead of moving what I have and waiting to see what breaks? Preferably I'd like an exe or a PowerShell script that could look at a sln file and drop all the necessary NuGet packages into a specified directory.
On your internet enabled dev env, you can browse to the root directory of your project and use:
dotnet restore --packages .\packages\
This will use the directory called 'packages' as the local cache for the solution, and all NuGet packages will be copied to it.
You can then include the same switch in your build, to ensure that the local cache is used.
dotnet build --packages .\packages\
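On the air-gapped side, the transferred folder can then be used as the only package source. A sketch, where the transfer path is hypothetical:

# Restore exclusively from the folder carried across the air gap
dotnet restore --packages .\packages\ --source "D:\transfer\packages"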
I answered two very similar questions a few weeks ago. Have a look at these answers to see if they help:
Command to download a Nuget package with all dependencies to a folder
Is it possible to create a cache of nuget packages for computers without internet?

Using Octopus Deploy with EPiServer to handle database upgrades

Since EPiServer 7, upgrading to a newer version has involved:
Updating all EPiServer.* NuGet packages
Running PM > Update-EPiServer - to upgrade the local database
Running PM > Export-EPiServer - to produce a set of database upgrade script files that can be run on other servers.
However, if only the upgraded EPiServer solution is built and deployed by Octopus Deploy, the database will not be upgraded, meaning the site will not run.
Currently I run the EPiServerPackage manually on a server in each of our environments after a deployment.
I'm trying to decide on the cleanest way to include the /EPiUpdatePackage folder and contents that running the Export-EPiServer command produces, so that it will:
Be checked into source control
Be turned into a NuGet package on the build server
Be deployed by Octopus Deploy so that it can be remotely executed on the server the script is deployed to
As per Eric Herlitz's suggestion, I have simply used <episerver.framework updateDatabaseSchema="true"> in the web.config transforms for the environments on which I want the database automatically upgraded.
I'm unsure if this will present a problem when the SQL connection string user does not have the required level of permissions. However, in my case this is working correctly.
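If you would rather not bake the setting into a transform, the same attribute can be flipped by a post-deployment script instead. A hedged sketch of an Octopus PowerShell step; the Octopus variable and the config structure are assumptions, not something the answer above prescribes:

# Enable EPiServer's automatic schema upgrade in the deployed web.config
$webConfig = Join-Path $OctopusParameters["Octopus.Action.Package.InstallationDirectoryPath"] "web.config"
[xml]$config = Get-Content $webConfig
$node = $config.SelectSingleNode("//episerver.framework")
$node.SetAttribute("updateDatabaseSchema", "true")
$config.Save($webConfig)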

TFS release management, .net core rc2 deployment to on-premise servers

I'm new to Visual Studio Team Services (previously Visual Studio Online).
I have a couple of .NET Core RC2 apps.
Example 1: the solution contains one .NET Core MVC app, one Web API app, and multiple dependent assemblies.
I've got 3 on-premises servers (Dev, QA, Staging).
My dev server contains the build agent. My confusion is about how best to deploy these apps to my on-premises servers and finally to my production server on Azure.
Do I generate webdeploy packages? If so, where and with what? In my build definition or release definition (on tfs)?
What I'm trying to figure out is the proper way to do the deployment part for these .NET Core RC2 apps: using what tools, and in what order.
To my understanding so far: on check-in (CI build), I build and deploy to all 3 environments (dev, QA, staging), with dev and QA being automatic and staging being either automatic or approved (authorized), depending on QA results. Finally, production is manual. I understand this is not set in stone, and certain things can be done differently, but does that sound right?
Oh, and all my servers are Windows Server 2012 R2.
You can use a build definition to build your project and generate the deployment packages, and then use a release definition to deploy the packages. In the release definition, you can add three environments to deploy to Dev, QA & Staging. For how to build and deploy an ASP.NET Core app, please refer to this link for details: Build and deploy your ASP.NET Core RC2 app to Azure.
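To illustrate the build side, the packaging step can be as simple as a command-line/PowerShell task that publishes into the artifact staging directory; the project path here is a placeholder:

# Publish the web app where the release definition can pick it up as an artifact
dotnet publish .\src\MyWebApp -c Release -o "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\MyWebApp"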

Anybody out there using MsBuild to do Installs?

I've noticed projects such as MSBuild Extension Pack and MSBuild Community Tasks give MSBuild the power to install assemblies, deploy SQL, and set up IIS. These features seem to be oriented toward doing installs, not builds.
So I was wondering how many people out there are using MSBuild, perhaps in conjunction with CruiseControl.NET, to do installs on staging environments?
I use MSBuild to build, and part of the build process runs WiX to create an installer (MSI), which is used to deploy to production.
I wrote up a little sample of templating configurations for different target environments with msbuild: http://blog.privosoft.com/2010/10/msbuild-environment-sandboxing.html
We use CC.NET & MSBuild to build and then also to publish to our dev and stage environments; however, we do not have the push-to-live on CruiseControl.NET, we run that MSBuild by hand. We just thought it would be way too tempting with a button to publish live ;) It took probably 2 or 3 revisions to get our MSBuild set up right. But now everything is in one file, and everything is based on Targets and Properties to do all the work. About 6 months ago we made what should be the last update, and that was a multi-server push, so we are ready for scaling up. We can now push any combination of parts to any combination of servers. So if we want 5 database servers, 3 content servers, and 2 web servers, we have that ability. No need to use anything else. MSBuild can do it.
I created a deployment system where a central coordinator can:
- identify the right target server for a given component (e.g. a Windows service goes to a given server, web services go to another, etc.)
- perform a PsExec of a deployment MSBuild script on the target server
- the deployment MSBuild script is responsible for:
a) downloading the right component package (in my case a .zip)
b) backing up previous versions of the component
c) extracting the package to the right place
d) tailoring the installation steps to the type of component to deploy (e.g. it needs to perform an Exec task of installutil.exe on a Windows service)
e) logging the result of the deployment
This system is built using a mix of:
- core MSBuild tasks
- the Tigris MSBuild community tasks
- the MS SDC tasks
- and custom tasks
The system allows us to perform consistent deployment of complex apps across partitioned environments (e.g. DEV, QA, UAT, etc) made of virtual servers.
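To make the coordinator step concrete, here is a hedged sketch of kicking off the remote deployment script; the server, script path, and property names are all illustrative:

# $targetServer, $component and $packageUrl are chosen by the coordinator
# PsExec runs the deployment MSBuild script on the target server itself
& psexec.exe "\\$targetServer" -accepteula msbuild.exe `
    "C:\Deploy\Deploy.proj" /t:Deploy `
    "/p:Component=$component;PackageUrl=$packageUrl"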
I use MSBuild to build a fairly large client/server application. I use InstallShield 2008 to create a separate client and server install set.
By adding a custom target into the build process you can combine the creation of the installers into the build.
I would recommend that you create and test the build and the installer separately, before attempting to integrate the two.
I know this is an old question... but...
I am currently using MSBuild with MSBuild Extension Pack (http://msbuildextensionpack.codeplex.com) to do my entire deployment. The database portion is handled with the VS database command-line tool (vsdbcmd.exe - http://msdn.microsoft.com/en-us/library/dd193283.aspx). That Extension Pack is pretty amazing, and is letting me build web sites, app pools, Windows services, update config, and much more.
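For reference, a typical vsdbcmd invocation looks roughly like this; the model file and connection string are placeholders, and the exact switches should be checked against the MSDN page above:

# Deploy the database model directly to a target database
& vsdbcmd.exe /a:Deploy /dd:+ `
    /model:"MyDatabase.dbschema" `
    /cs:"Data Source=.;Initial Catalog=MyDatabase;Integrated Security=True"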
I've also put TeamCity agents on the test servers, so I can deploy as part of a build chain (introduced in version 7 of TeamCity). And running my deploy MSBuild script is super easy from TeamCity.
I used to use MSBuild; now I'm using PowerShell. MSBuild is a build language, and it is painful to script in. There was a lot I wanted to do in it that was difficult and sometimes impossible.
Over the past year, I've created a PowerShell module somewhat equivalent to MSBuild Extension Pack, called Carbon.
I strongly, strongly encourage everyone out there to learn and use PowerShell.
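To give a flavour of what that looks like with Carbon, a small hedged sketch; the cmdlet names come from Carbon's documentation, but treat the exact parameters and the app layout as assumptions:

# A typical web + service deployment using Carbon cmdlets
Import-Module Carbon
Install-IisAppPool -Name 'MyApp'
Install-IisWebsite -Name 'MyApp' -PhysicalPath 'C:\Apps\MyApp\Web' -AppPoolName 'MyApp'
Install-Service -Name 'MyAppWorker' -Path 'C:\Apps\MyApp\Worker\Worker.exe'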