Best mix of ASP.NET deployment tools - MSBuild

I know how many questions have been asked around this, but I still can't figure out the answer to my question.
I need to deploy an ASP.NET 4.0 site, and I want to do something like this:
Get the latest version of the entire solution - a website and a couple of class projects that are used by the web app (I am doing this already using CCNet - not a problem)
Build and deploy in debug config to a test site
Build and deploy in a release configuration to a staging site
If everything looks fine on the staging site, I'll run a script that deploys the release build from the staging site to 7-8 similar sites used by different customers on the same server. In the future this will be on another server.
There is MSDeploy (Web Deploy 2.0), aspnet_compiler, MSBuild, PowerShell (my weapon of choice..) and probably more ... I am not 100% sure what to use where.
I would love to mimic the "only deploy files needed to run the site" behavior from the VS2010 GUI-assisted deployment, and I'd love the option of not touching some existing folders in the websites that we deploy to.
I feel like I should use MSDeploy a lot ... but I find it pretty hard to GET. I'm reading away on IIS.NET and I've listened to the Scott Hanselman/Jon Arild Tørresdal podcast. I am not 100% sure where to start ... and I'm not an MSBuild expert, so PowerShell is looking pretty good to me. But I feel I'm missing out on the right tools by going that way ...
Which tool would you use in which step?

In my opinion, MSDeploy is built to achieve all of your needs, with much more control. MSDeploy is a product of the IIS team and it's far more feature-rich.
Answers to your questions
Considerations: SVN as source control, MSDeploy for deployment
You can use the SVN command line tool to get the latest version of your source code into a temp directory and build it using the MSBuild command line tool. Then you can create a deployment package (or publish content package) which can be deployed to a specific environment. I am currently working on a project which creates a deployment package independent of the environment, meaning the package can be deployed to any stage/test/live environment. A sample command for creating a deployment package is:
msbuild C:\Projects\NimBuildDeployTestApp\NimBuildDeployWebApp\NimBuildDeployWebApp.csproj /t:package /p:Configuration=Release /p:PackageLocation=C:\Temp\DeployPackage\NimTestWebApp.zip /p:EnablePackageProcessLoggingAndAssert=true
Use the MSDeploy command line tool to deploy the package, as below:
msdeploy -verb:sync -source:package=packagelocation -dest:CONFIGURABLE_ACCORDING_TO_YOUR_NEEDS
Use Step 2 to build the solution in Release mode using MSBuild and configure the dest parameter in msdeploy accordingly.
If everything seems fine on your staging server, run a deploymentToLive batch file which deploys a particular publish content/package to your live server. Execute the 7-8 msdeploy calls in the same batch file, or else use DFS (which is a far more complex solution); a sketch follows below.
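For that last step, a rough PowerShell sketch of such a script might look like this; the site names and paths are placeholders, and -whatif only previews the sync (remove it to actually deploy). The DoNotDeleteRule is what leaves existing extra folders on the target sites untouched:
# Sync the staging site's content to each customer site (add computerName=... to -dest for a remote server)
$staging = "D:\Sites\StagingSite"
$customerSites = "CustomerSite1", "CustomerSite2", "CustomerSite3"   # ...and so on for all 7-8 sites
foreach ($site in $customerSites) {
    & msdeploy.exe -verb:sync "-source:contentPath=$staging" "-dest:contentPath=D:\Sites\$site" -enableRule:DoNotDeleteRule -whatif
}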
MSDeploy is definitely the way to go. One thing that you may consider is having a deployment package ready which can be deployed to any of your environments. What I mean by that is that your web.config file won't be the same for all environments, and hence you could create a deployment package which contains a web.config file for each environment. Refer to my solution here.
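As an alternative to bundling one web.config per environment, MSDeploy can also override declared package parameters at deploy time with -setParam; the parameter name below is hypothetical and would have to be declared in the package (e.g. via a parameters.xml in the project):
msdeploy -verb:sync -source:package=C:\Temp\DeployPackage\NimTestWebApp.zip -dest:auto "-setParam:name=MyConnectionString,value=Server=staging-db;Database=NimTest"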
Hope this helps and gives you confidence in moving forward to automate your deployment process

Related

Options for locally hosted client-side package management in VS2019?

A common issue I keep bumping into for web projects in .NET Core is the need to share JavaScript as easy-to-use modules from project to project. Oftentimes large quantities of code written in VS project A could very much be used in project B, sometimes in the same solution.
Restrictions:
Must be self-hosted and not publicly exposed; only machines on the local network can access the libs/modules/packages.
Ideally this can be driven from Visual Studio projects, making use of build tasks, PowerShell, MSBuild, or other such automation tools to package, minify, bundle, and deploy the JavaScript libraries.
The absolute ideal is if this can all be hosted from just a network folder
NPM/Yarn
I'm not super familiar with either of these, but is there a way we can drag and drop JavaScript code we've built into some designated folder, perhaps modify some form of manifest (JSON or XML file or what have you), and then anyone can just npm install those packages? I guess what I'm wondering is: is there a way to tell npm "This folder is now a source of packages you can install from"?
Bonus points: If said "trust this folder" config can be set inside of the VS project, so if someone new grabs the git repo, it will just work "out of the box" and they don't need to go through steps configuring npm or yarn so it knows how to find those packages.
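For what it's worth, npm can already treat a plain folder as an install source, and a registry URL can be committed in a repo-level .npmrc so a fresh clone picks it up automatically. A rough sketch, where the paths, package names, and the buildbox.local host are all made up:
# Install a package straight from a folder; npm records a file: reference in package.json
npm install ..\shared-js\our-grid-component
# Or run a lightweight private registry such as Verdaccio on a machine in the LAN...
npm install --global verdaccio
# ...and commit a .npmrc next to package.json so clones resolve packages from it with no extra setup
Set-Content -Path .npmrc -Value "registry=http://buildbox.local:4873/"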
Libman
Same as above, but mostly I'm trying to figure out if there is any way at all to configure Libman from VS. It's the default and what is currently in use, but it just has the four default CDNs it ships with and trusts, and I am not seeing any way at all to tell Libman "Here's a new source of files to trust, add that to the selectable drop-down".
But I am seeing basically zero configuration as an option for libman, which is quite disappointing.
Nuget
This is the other option that is already popular locally, but something about using NuGet to deliver JS files when NPM, Yarn, and Libman already exist sets my teeth on edge. However, we already have (I believe) a locally hosted NuGet server that could be used, so the infrastructure should already be set up; if not, I know how to do it. I do like the fact that NuGet 100% for sure could leverage actual projects, build steps, MSBuild, etc. for deploying.
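If the locally hosted NuGet route wins, note that nuget.exe can also treat a plain network folder as a feed, which matches the "just a network folder" ideal above; the share path and package name here are placeholders:
# Publish a package into a folder-based feed on the share
nuget add our-grid-component.1.0.0.nupkg -Source \\fileserver\nuget-feed
# Register that share as a package source on a developer machine (or commit it in a NuGet.config)
nuget sources add -Name LocalShare -Source \\fileserver\nuget-feed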
Conclusion
What's the popular and easy way to do this nowadays? Best case scenario is if there's a way to go, "Put a manifest.json file in the folder root that points to all the modules inside, then add it as a trusted source to your package manager, and now you can install those packages"

ASP.NET Core front-end developer workflow with VSCode and VS 2019

I haven't done any cshtml front-end development for a few years.
What's the current, generally accepted way for ASP.NET Core front-end developers to work across a range of tools on Windows?
By that, I mean a way to have the front-end JS build and the .NET project(s) also build and to work rapidly in the browser and code.
My thinking is:
We have much better command line story around dotnet today.
Some folk like VS Code.
Some folk prefer VS 2019, and some like either, depending.
We need to work on UI aspects sometimes.
But we also need to attach a debugger and debug the server logic sometimes.
The build server setup should be unproblematic and simple, relying mostly on build logic held in the repo.
Tooling, and kicking off the whole build and serve process should be understandable and familiar.
It should be pretty simple to get going after a team noob clones the repo.
My initial thought would be to set up NPM and then use something like Gulp to kick off everything, including running dotnet run.
Then when running under the Visual Studio 2019 debugger, use the Task Runner Explorer to kick off the Gulp stuff but skip the dotnet run part.
(Shame there doesn't seem to be a command line to start VS (Code or 2019) and attach the debugger.)
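In command-line terms, that initial thought roughly boils down to two watchers running side by side; the gulp task name is made up and depends on your gulpfile:
# Terminal 1: rebuild and restart the ASP.NET Core app whenever server code changes
dotnet watch run
# Terminal 2: re-bundle/minify the client assets on change
npx gulp watch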
Now I'm expecting to get a "primarily opinion based" SO beating, but there are general trends and ideas that go into designing all these tools for how they can all play ball together and what the dev story looks like.
You've pretty much already described the process. However, I'll add a few things:
You don't need the dotnet run bit. Visual Studio and VS Code are both capable of debugging directly.
You can assign the gulp tasks to build tasks in Task Runner Explorer, so you really don't even need to think about running those directly. I'm not as sure on this aspect of VS Code, but I'm sure there's probably some extension to handle it, if it's not already built-in.
If you want true ease of development, the best thing you can do is use Docker. Just add a Dockerfile to each project that actually runs (i.e. not a class library) and set up the steps to build and run it there. In Visual Studio, you can right-click the project and choose Add > Docker Support, and it will actually generate a ready-made Dockerfile, though you may need to add a step or two to handle the client-side build steps. In any case, this then becomes truly click and run, with nothing to worry about. The story is even better when you use docker-compose, as then Visual Studio and VS Code can spin up your entire application stack all at once, including external dependencies such as a database, Redis instance, etc. If you haven't used Docker before, start now. It's absolutely revolutionary for development.
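Once a docker-compose.yml describing the app and its dependencies is in the repo, the day-to-day loop is just a couple of commands; the web service name here is hypothetical:
# Build the images and start the whole stack (web app, database, Redis, ...) in the background
docker-compose up --build -d
# Tail the web app's logs, then tear everything down when done
docker-compose logs -f web
docker-compose down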
One note for CI/CD: as much as possible, you should add a YAML file to describe your CI/CD pipeline. Depending on the actual provider you're using for build/release, there might be some differences, so consult the relevant documentation. (Azure DevOps, for example, doesn't currently support describing release pipelines in YAML, though you can still do your build that way.) In any case, this allows you to configure all of this in code and have it committed to source control.
You may consider the same for your infrastructure. Azure has ARM templates, AWS has CloudFormation, GCP has Deployment Manager. There are also third-party tools like Terraform or Ansible. All of these, in some form or fashion (usually JSON or YAML), allow you to define all the characteristics of the infrastructure you're going to deploy to and commit that to source control. This makes deployment, and things like creating new environments, a breeze.
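With Terraform, for example, the infrastructure definition lives in the repo alongside the code, and the workflow is roughly:
# Initialize providers/modules, preview the changes, then apply them
terraform init
terraform plan -out=tfplan
terraform apply tfplan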

How to setup a TeamCity build for a ASP.NET 5 project

I'm trying to setup a CI server for a website that I'm developing, but I can't find any info regarding how to do it with the new ASP.NET 5.
I got you, brother. This took me a few days to figure out. This configuration is on TeamCity v10 for an ASP.NET Core 1.0 RC2/preview2 project. As a bonus, I am including the step where it pushes to Octopus Deploy. You will need to install the dotnet TeamCity plugin and the newest Octopus Deploy plugin with Push functionality. Here's an overview of the build steps:
First off, don't try to use dotnet restore to restore the packages. It won't work if you have internal NuGet packages that are not compiled as .NET Core. This took forever to figure out. I would ignore trying to use dotnet restore until people have converted everything over to .NET Core or Microsoft fixes dotnet.exe to be more flexible.
Some of the stuff I read said to use the newest beta version of NuGet, 3.5. When I tried this, I got the following error:
[14:30:09][restore] Starting NuGet.exe 3.5.0.1737 from D:\buildAgent\tools\NuGet.CommandLine.3.5.0-rc1\tools\NuGet.exe
[14:30:10][restore] Could not load type 'NuGet.CommandAttribute' from assembly 'NuGet, Version=3.5.0.1737, Culture=neutral, PublicKeyToken=31bf3856ad364e35'.
I don't know what that means, and I don't care. Use 3.4.4 for now. Fill in the rest as appropriate.
The dotnet publish step is pretty straightforward. Make sure you provide the output directory because you want to use it in the final step. Also, be sure to specify an absolute path by using the %teamcity.build.workingDir% variable because of this bug. Otherwise it will fail to find your web.config file and not finish publishing the entire site. You'll be missing things like web.config and wwwroot!
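Roughly speaking, that build step ends up running something like the following; the project path and output folder are placeholders, and %teamcity.build.workingDir% is the TeamCity parameter mentioned above:
dotnet publish src\OrderReviewBoard.Web -c Release -o %teamcity.build.workingDir%\published-app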
Finally, we push to Octopus. This was very tricky for me. Note the part that says:
%teamcity.build.workingDir%/published-app/**/* => OrderReviewBoard.1.0.0.zip
IF ANY PART OF THIS IS INVALID, YOUR STEP WILL FAIL WITHOUT EXPLAINING ITSELF!!! By invalid, I mean maybe you put a TeamCity environment variable (like the %build.number% they show in all the examples) in that zip name that doesn't properly resolve. Or you specify a non-existent path. Or any number of other things; you will see an error that says "[Octopus Deploy] Please specify a package to push". That means the package was never generated because that statement failed. I realize you want an auto-incrementing build number there; I'll leave it up to you to figure out how to do that.
Don't get all confused by what is running here. Octopus tries to explain it on their site, but it is hidden here. There is octo pack and octo push. The new version of octo pack is running out of sight, based on whatever statement you put in that "Package paths" box. Don't get sidetracked trying to create a nuspec package, or trying to use dotnet pack. These are dead ends for our purposes. Create a .zip file and move on with your life. Finally, notice the additional command line arguments I added. These help you out a tiny bit. They aren't required. Good luck.
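For reference, the plugin's push step is effectively doing something along these lines with the Octopus CLI; the server URL and API key are obviously placeholders:
octo push --package OrderReviewBoard.1.0.0.zip --server https://your-octopus-server --apiKey API-XXXXXXXX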
We (the ASP.NET team) use TeamCity as the build server. Each repo has a build.cmd file, similar to this one. TeamCity simply invokes that file.
For Mac/Linux builds, there is a build.sh file.
At the moment you can try to use the TeamCity plugin for .NET Core projects:
https://github.com/JetBrains/teamcity-dotnet-plugin
Please check these blog posts:
http://blog.coderinserepeat.com/2015/01/25/building-asp-net-5-projects-in-teamcity/
http://blog.maartenballiauw.be/post/2014/12/19/Building-future-NET-projects-is-quite-pleasant.aspx
Since there have been many changes in the ASP.NET Core world and I got asked about it a few times, I wrote down a step-by-step guide on how to set up a CI/CD environment using TeamCity for .NET Core. I think it is especially helpful for beginners.

Testing a NuGet package

We are big users of NuGet, we've got 25-30 packages which we make available on a network share.
We'd like to be able to test new packages in the consuming applications before they're built and released. Ideally, this could be done with something similar to Maven's snapshot functionality, i.e. having a specific development version of a package.
Has anyone else come up with an (ideally reasonably non-hacky) way of doing this?
Our favoured method is to generate the package assemblies and then manually overwrite the assemblies in the packages/ directory, i.e. to replace the actual project references, but that doesn't seem particularly clean.
Update:
We use a CI build server which creates builds on every commit and has a specific manually triggered NuGet build which works off specifically tagged versions of the codebase. We don't want to create a NuGet build off every commit, but we would like to be able to test a likely candidate in the wild before we trigger the manual NuGet package build.
I ended up writing a unit/integration testing framework to solve a similar problem. Basically, I needed to verify the content of the package, the versions and info, what would happen when I installed and uninstalled the package, what versions the assemblies in lib were, what bitness the assemblies were built for (x86 or x64), and so on - and I needed it all to run without Visual Studio installed and on my (headless) build machine as a quality gate.
Standing on the shoulders of giants like Pester, PETools, and SharpDevelop's package management module, I put together nuget-test.
Clone the project into your package directory (where your .nuspec file and package files are). If for whatever reason you want to keep the nuget-test project as a "git" repo, then simply remove "remove-item nuget-test/.git -Recurse -Force" from the command below.
git clone https://github.com/nickfloyd/nuget-test.git; remove-item nuget-test/.git -Recurse -Force
Run Setup.ps1 in the root of the nuget-test directory in an x86 instance of PowerShell.
PS> .\setup.ps1
Write tests and place them in the nuget-test/test directory using the Pester syntax.
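A minimal test, with a made-up package name and path, might look something like this in Pester's Describe/It syntax:
# Hypothetical check: the unpacked package should contain the assembly we expect
Describe "MyCompany.Logging package" {
    It "contains the net45 assembly" {
        Test-Path ".\MyCompany.Logging\lib\net45\MyCompany.Logging.dll" | Should Be $true
    }
}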
Run the tests.
PS> Invoke-Pester
Project page: nuget-test
On github: https://github.com/nickfloyd/nuget-test
I hope this helps you get closer to what you're trying to get done.
If you're using NuGet packages to distribute your libraries, you should not limit yourself to testing only the libraries. You should test the packages themselves as well (if your binaries are OK but incorrectly installed, consumers still have issues). The whole point is to improve this experience.
One way could be to have an additional CI or QA repository. The one you currently have is actually your "production" repository containing consumable releases, considered finished high-quality products.
Going further, you could have a logical package promotion flow (based on Continuous Integration or even using a Continuous Delivery approach), where:
- each check-in produces a package on your CI repository
- testers pick up a CI package for QA and, if found OK, promote it to either a QA feed or the Production feed (whatever you prefer; it depends on the quality of your testing and how well it is automated)
There are various ways of implementing this scenario, using simple network shares, internal NuGet.Server or Gallery implementations, or simply use http://myget.org to give it a try with minimal cost and zero effort.
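Promotion itself can be as simple as re-pushing the exact same .nupkg to the next feed; the feed URL and API key below are placeholders:
# Push the already-built package from the CI feed to the QA (or production) feed without rebuilding it
nuget push MyCompany.Logging.1.2.3-beta.nupkg -Source https://qa-feed.example.com/nuget -ApiKey YOUR_API_KEY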
Hope that helps!
Cheers,
Xavier

Local Build Automation?

Working in a team environment, we have a Team Foundation Server that also contains a Team Build component. It is configured to automatically build all projects and solutions at specific times or on request.
We develop a product that is built from several solutions that depend on each other. When something has been changed in one solution, it has to be rebuilt locally and manually, in both debug and release mode, so that the changes take effect in another solution that depends on it.
Also, when a developer retrieves all sources for the first time, he has to build all solutions manually in the correct order to get a working environment.
What is the best way to automate things like this? Create .cmd files that trigger the correct msbuild files? Using a program such as CruiseControl.NET?
What do you people do to maintain a clean local development environment?
What I did for our team was to provide a Visual Studio solution which contains all projects. Then I created a simple .cmd file which uses the command-line tools of Visual Studio to build this solution in its respective debug/release/profile configurations. This is a one-step build solution that can be used from every engineering machine.
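Such a one-step build script might look roughly like this (written here as PowerShell rather than a .cmd; the solution name and configurations are placeholders, and msbuild is assumed to be on the PATH, e.g. from a Visual Studio command prompt):
# Build the all-projects solution in each configuration
foreach ($config in "Debug", "Release", "Profile") {
    msbuild AllProjects.sln /t:Build /p:Configuration=$config /m
}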
The next level is the continuous integration system that is set up to check for changes every 15 minutes and start a build if there are changes in the VCS. I'm using Hudson as our CI system. The CI system is used to build the native projects, the Java projects, as well as the Flex stuff. Since everything can be built from the command line, it's pretty easy to use it with Hudson or CruiseControl.NET.