MSBuild and Webpack

I am developing an Angular 2 application in VS2015 and have a webpack bundling and minification setup for it. This is my webpack.conf.js:
switch (process.env.NODE_ENV) {
  case 'prod':
  case 'production':
    module.exports = require('./config/webpack.prod');
    break;
  case 'test':
  case 'testing':
    //module.exports = require('./config/webpack.test');
    break;
  case 'dev':
  case 'development':
  default:
    module.exports = require('./config/webpack.dev');
}
I have also installed a webpack task runner, which invokes this with the following commands:
cmd /c SET NODE_ENV=development&& webpack -d --color
and
cmd /c SET NODE_ENV=production&& webpack -p --color
The setup seems to work fine. However, I want to integrate this with my TFS CI builds. The webpack command should run after the project is built and report a build failure in case the webpack build fails. I have tried incorporating the following code in my .csproj file:
<Target Name="AfterBuild">
  <Exec Condition="$(Configuration) == 'Debug'" Command="cmd /c SET NODE_ENV=production&& webpack -p --color">
  </Exec>
</Target>
It failed with error 9009, which means the command was not found.
After that I tried starting the command from the node_modules folder where webpack was installed:
<Target Name="AfterBuild">
  <Exec Condition="$(Configuration) == 'Debug'" Command="./node_modules/.bin cmd /c SET NODE_ENV=production&& webpack -p --color">
  </Exec>
</Target>
still in vain. Even if I get this to work, I am not sure if it would cause the VS build to fail if it encounters an error in webpack.
How do I go ahead with this?

Put different scripts in package.json for development and production mode. Then, in the AfterBuild event of Visual Studio, call those scripts for the respective configurations.
Add the following two scripts, 'buildDev' and 'buildProd', to package.json:
"scripts": {
"buildDev": "SET NODE_ENV=development && webpack -d --color",
"buildProd": "SET NODE_ENV=production && webpack -p --color",
}
In the AfterBuild events of Visual Studio, add these two conditional commands:
<Target Name="AfterBuild">
  <Exec Condition="$(Configuration) == 'Debug'" Command="npm run buildDev" />
  <Exec Condition="$(Configuration) == 'Release'" Command="npm run buildProd" />
</Target>
And finally, webpack.conf.js looks like this:
switch (process.env.NODE_ENV.trim()) {
  case 'prod':
  case 'production':
    module.exports = require('./config/webpack.prod');
    break;
  case 'dev':
  case 'development':
  default:
    module.exports = require('./config/webpack.dev');
    break;
}
Important note: Make sure to call trim() on process.env.NODE_ENV, because cmd treats the space before && in "SET NODE_ENV=development && webpack -d --color" as part of the value and appends it to the NODE_ENV variable. So when we set it to 'development', it actually gets set to 'development ' (with a trailing space).
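If you also want the Visual Studio/MSBuild build to fail when webpack reports errors, one option (not part of the original answer, so treat it as a sketch) is to add webpack's --bail flag to the scripts. With --bail, webpack exits with a non-zero code on the first error, and since Exec fails the task on a non-zero exit code by default, the build breaks as expected:
"scripts": {
  "buildDev": "SET NODE_ENV=development && webpack -d --color --bail",
  "buildProd": "SET NODE_ENV=production && webpack -p --color --bail"
}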

For the TFS CI build, you can follow these steps to achieve your requirement:
Add an npm step
Add a Command Line step
Note: the --bail argument is required; otherwise the step/task will succeed even though the webpack command throws an exception.
Also, you can add a variable in the build definition (Variables tab).
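For example, the Command Line step could invoke webpack roughly like this (the exact path to the webpack binary is an assumption about your repository layout):
cmd /c SET NODE_ENV=production&& node_modules\.bin\webpack -p --bail --color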

Related

What MSBuild condition should be used to detect target OS?

What Condition expression for a PropertyGroup/ItemGroup should be used to distinguish the target OS (the -r argument of dotnet publish)? E.g. for these commands:
dotnet publish -c Release -r win-x86 --self-contained false
dotnet publish -c Release -r linux-arm --self-contained false
Currently I'm forced to use different Configurations and build with these commands:
dotnet publish -c ReleaseWin32 -r win-x86 --self-contained false
dotnet publish -c ReleaseLinux -r linux-arm --self-contained false
I know that MSBuild conditions can target the .NET Core/Framework version (e.g. Condition="'$(TargetFramework)' == 'netcoreapp3.1'"), so it should probably also be possible to target the OS (something like Condition="'$(TargetOS)' == 'win-x86'").
Is there some way to detect the target OS directly in the .csproj file, without resorting to -c ReleaseWin32 / -c ReleaseLinux for builds for different platforms? In short, does MSBuild syntax have any Condition for the target OS?
The CLI's -r linux-arm translates to MSBuild's -property:RuntimeIdentifier=linux-arm, so you can use $(RuntimeIdentifier) in conditions:
<PropertyGroup Condition="'$(RuntimeIdentifier)' == 'linux-arm'">
</PropertyGroup>
<ItemGroup Condition="$(RuntimeIdentifier.StartsWith('win'))">
</ItemGroup>
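As a slightly fuller illustration of the same idea, the constant name and the native file path below are made up for the example:
<PropertyGroup Condition="'$(RuntimeIdentifier)' == 'linux-arm'">
  <DefineConstants>$(DefineConstants);LINUX_ARM</DefineConstants>
</PropertyGroup>
<ItemGroup Condition="$(RuntimeIdentifier.StartsWith('win'))">
  <None Include="native\win\*.dll" CopyToOutputDirectory="PreserveNewest" />
</ItemGroup>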

dotnet publish -o ./dist does not set $OutDir or $OutPath in msbuild

I have a file copy step which I want to trigger after a publish:
<Target Name="CopyEmailTemplates" AfterTargets="AfterPublish">
<ItemGroup>
<TemplatesFolder Include="Views\EmailTemplates\*.cshtml" />
</ItemGroup>
<Copy SourceFiles="#(TemplatesFolder)" DestinationFolder="$(OutDir)Views\EmailTemplates\" />
</Target>
I've confirmed that the command does not return the publish directory with this target:
<Target Name="OutputTest" AfterTargets="AfterPublish">
<Exec Command="echo OutPath: $(OutputPath)" />
<Exec Command="echo OutDir: $(OutDir)" />
</Target>
Expected:
OutDir is set to dist/
Actual behavior:
OutDir is set to bin/Release/netcoreapp2.0/
I am using: .NET Command Line Tools (2.1.4) on osx.10.12-x64
Publish is a two-step process. The project is built using normal build settings and then published to $(PublishDir). Use this property wherever you need to know the path of the publish output.
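Applied to the target from the question, that would look roughly like this (same item name as above, only the destination property changed):
<Target Name="CopyEmailTemplates" AfterTargets="AfterPublish">
  <ItemGroup>
    <TemplatesFolder Include="Views\EmailTemplates\*.cshtml" />
  </ItemGroup>
  <Copy SourceFiles="@(TemplatesFolder)" DestinationFolder="$(PublishDir)Views\EmailTemplates\" />
</Target>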
Self-answering in hopes of preventing future headaches for people.
The dotnet publish -o ./dist command will set the $(PublishDir) variable in msbuild.
dotnet build -o ./dist does, however, set $(OutDir).
To be more explicit with our build, I now use the msbuild command instead. The command
dotnet publish -o ./dist -c Release
Becomes:
dotnet msbuild /t:publish /p:PublishDir=dist/ /p:Configuration=Release

Passing argument to the middle of an npm script

Title says it all. I want to be able to pass an argument to the middle of an npm script so that I can do the following:
$ npm run deploy -- <destination-path>
In package.json:
"scripts": {
  "deploy": "robocopy dist <destination-path> /E /NP"
}
Is this possible without using environment variables or npm's configuration variables?
Per Passing args into run-scripts #5518, it would appear that it is not possible to pass arguments to the middle of the script.
We are not going to support passing args into the middle of the script, sorry. If you really need this, write your test command using literally any of the command line parsers that anyone uses. (Minimist, dashdash, nopt, and commander all support this just fine.)
However, an alternative to this using the npm configuration block has been documented here. My implementation then looks like this:
"name": "foo"
"config": { "destination" : "deploy" },
"scripts": { "deploy": "robocopy dist %npm_package_config_destination% /E /NP" }
I can then override this on the command line and on my build server with:
npm run deploy --foo:destination=C:\path\to\deploy\dir
You can use an environment variable to set the destination path (use a name that does not clash with system variables such as PATH):
DEST=/path/to/file npm run deploy -- $DEST
or
export DEST=/path/to/file
npm run deploy -- $DEST
I have a different way to do that via shell.
If your npm command is:
"deploy": "robocopy dist ${1} /E /NP"
Where ${1} is the parameter you want to substitute.
Then wrap it in a function as follows:
"deploy": "func() { robocopy dist ${1} /E /NP; }; func"
Then you can do positional parameter substitution in the shell as follows:
npm run deploy -- xyz
which would run
robocopy dist xyz /E /NP
And since this is a shell script, you can use default parameters as well:
"deploy": "func() { robocopy dist ${1:-xyz} /E /NP";}; func"
And you can use it as follows:
npm run deploy <==> robocopy dist xyz /E /NP
npm run deploy -- abc <==> robocopy dist abc /E /NP
You can make use of the argument structure of "bash -c". In my example below, I have to echo-feed the npm argument into a language parser. The argument to the script (e.g. npm run sayhello <some word> below) takes the place of $0:
"sayhello": "bash -c 'echo hello $0!'"
A cross-platform solution I use is:
"arg-helper": "node -e \"process.stdout.write(require('child_process').execSync(process.argv[1].replace('$', process.argv[2] || '')))\"",
"sayhello": "npm run arg-helper \"echo hello $!\"
...
>npm run sayhello world
Leveraging npm_config_* variables.
package.json
{
  "scripts": {
    "echoMyParam": "echo 'your param value is' $npm_config_foo"
  }
}
Run
npm run echoMyParam --foo=bar
Result
your param value is bar
It's important to check the docs for other cases: https://docs.npmjs.com/using-npm/config

.NET Core 1.0 - How to run "All tests in Solution" with xUnit command line

The Getting started with xUnit.net (.NET Core / ASP.NET Core) page describes how to run tests with the dotnet test command line.
It states that this requires a specific project.json, where we add the xunit dependencies and the test runner:
"testRunner": "xunit",
"dependencies": {
"xunit": "2.1.0",
"dotnet-test-xunit": "1.0.0-rc2-build10015"
}
If I try calling it from the parent directory:
C:\git\Project\test [master ≡]> dotnet test
dotnet-test Error: 0 : System.InvalidOperationException: C:\git\Project\test\project.json does not exist.
at Microsoft.DotNet.Tools.Test.TestCommand.GetProjectPath(String projectPath)
at Microsoft.DotNet.Tools.Test.TestCommand.DoRun(String[] args)
C:\git\Project\test [master ≡]>
Question: Is there a way to run all tests (multiple project.json) with a single dotnet test?
In case anyone is looking for a Windows answer, here's a one-liner in PowerShell that does the job:
dir test | % { dotnet test $_.FullName }
Since it's been almost a month with no answer, I'll at least share what I've been doing. (This won't be relevant once Visual Studio "15" RTM launches, because project.json is dead.)
Simply using a for loop on all project.json:
Locally, from the test directory, I just run:
for /f %a in ('dir /b /s project.json ^| find /v "TestUtilities"') do dotnet test %a
This runs it on every project.json except those whose path contains TestUtilities.
Note that on TeamCity you need to escape % (and in batch scripts you need to double it: %%), so it becomes:
for /f %%%a in ('dir /b /s project.json ^| find /v "TestUtilities"') do dotnet test %%%a
Note the %%%. Since % in TeamCity is used for variables, the third % escapes it.
The Serilog team has an example of building multiple test projects in their CI pipeline. Check out this PowerShell script: https://github.com/serilog/serilog/blob/dev/Build.ps1#L44
Thanks to Andrzej Lichnerowicz for the initial pointer. I've been trying to integrate with AppVeyor, and while this approach executed all test assemblies, the build would no longer break if any tests failed.
Taking it to the next level, I created a PowerShell macro and imported it into the AppVeyor build...
version: 1.0.{build}
install:
  - ps: Import-Module .\Appveyor.psm1
before_build:
  - ps: dotnet restore
build:
  verbosity: minimal
test_script:
  - ps: Invoke-AppVeyorTest
...and then executed the following macro:
function Invoke-AppVeyorTest
{
    [CmdletBinding()]
    param()

    $result = "true"
    Get-ChildItem NetCoreXunit* -Recurse -Directory | % {
        $test_path = $_.FullName
        $output = & dotnet test $test_path
        if ($output -Match ", Failed: 0, ")
        {
            Write-Output "All tests passed in $test_path"
        }
        else
        {
            Write-Output "Located failed tests in $test_path"
            $result = "false"
        }
    }
    if ($result -eq "false")
    {
        $host.ui.WriteErrorLine("Failed tests detected.")
        exit 1
    }
}
AppVeyor collates all the test results, and the build once again fails if any tests failed.
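As a simpler alternative to matching the summary text, you could rely on dotnet test returning a non-zero exit code when tests fail and check $LASTEXITCODE instead; a sketch reusing the same directory pattern as the macro above:
function Invoke-AppVeyorTest
{
    $failed = $false
    Get-ChildItem NetCoreXunit* -Recurse -Directory | % {
        & dotnet test $_.FullName
        # dotnet test exits non-zero when any test in the project fails
        if ($LASTEXITCODE -ne 0) { $failed = $true }
    }
    if ($failed) { exit 1 }
}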
For a cross-platform solution, you can use Node and NPM with the foreach-cli package. If you don't have a package.json in the root folder, do npm init, then:
npm install foreach-cli -D
In package.json:
"scripts : {
...
"test": "foreach -g 'test/**/project.json' -x 'cd #{dir} && dotnet test'"
}
To run tests:
npm test
It doesn't look like this will be possible at all via the command line, given the latest feedback from the CLI team on a recent GitHub issue regarding the project search algorithm:
...though the team decided to move in a different direction. Specifically, we decided to have all of the commands require a path to a root artifact from which a closure is determined.
However, if you're using TFS builds, there is an option in the dotnet build step (currently 'Preview') called "Project(s)" which accepts wildcards, so you can use the following settings to run tests in all projects:
Command: 'test'
Projects: '**/project.json'
Beware, however: **/project.json will attempt to execute tests in every project, even those without a test runner defined, which may cause the build to fail.

Exec Task in MSBuild for execution of command on remote machine

I am using the following command to install a service via an MSBuild file. This works great:
<Exec Command= 'c:\test\myService.Appservices.exe install' ContinueOnError='false' />
But the above command installs the service on the local machine. I want to install it on a remote machine. How can I specify the machine name in this command?
As per Mike Vine's comment, MSBuild doesn't include tools for remote execution. You could, however, use something like PsExec, e.g.:
<Exec Command='psexec -accepteula -s \\RemoteServer "C:\Path To EXE on Remote Machine\my.EXE"' IgnoreExitCode="false" ContinueOnError="false" Timeout="600000">
  <Output TaskParameter="ExitCode" PropertyName="exitCode1" />
</Exec>
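With IgnoreExitCode="false", a non-zero exit code from psexec already fails the build. If you prefer to inspect the code yourself (for example, to produce a custom error message), a sketch reusing the property captured above could look like this:
<Exec Command='psexec -accepteula -s \\RemoteServer "C:\Path To EXE on Remote Machine\my.EXE"' IgnoreExitCode="true">
  <Output TaskParameter="ExitCode" PropertyName="exitCode1" />
</Exec>
<Error Condition="'$(exitCode1)' != '0'" Text="Remote install failed with exit code $(exitCode1)." />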