MSBuild.exe doesn't exist on Jenkins slave VM

I'm having trouble setting up a slave node in Jenkins to build my .NET projects. The error I keep getting is "FATAL: C:\Windows\Microsoft.NET\Framework\v4.0.30319\msbuild.exe doesn't exist", but MSBuild DOES exist on the slave VM at that path.
I am using the Jenkins MSBuild plug-in, version 1.15 (also didn't work using 1.13). I have set the "path to msbuild" as "C:\Windows\Microsoft.NET\Framework\v4.0.30319\msbuild.exe" with no default parameters and install automatically unticked. If I run the same project configuration on the master node, it builds fine.
I have also tried setting the Node Properties - Tool Locations for the slave node but this has no effect either.
Does anyone know how to get Jenkins to see MSBuild on a slave node?
Thanks
Tom

I had a similar problem. Make sure you do not have quotes in the path in your config. When you use "Copy as path" via Shift + right-click in Windows, it adds quotes.

I managed to insert whitespace at the beginning of the MSBuild executable path, which apparently caused the exists check performed by Jenkins to fail. You would think the MSBuild plugin would trim the input before committing the configuration... Just remove the whitespace and you are fine!
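For reference, the "Path to MSBuild" value should be just the bare path, with no surrounding quotes and no leading or trailing whitespace, e.g.:

C:\Windows\Microsoft.NET\Framework\v4.0.30319\msbuild.exe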

Related

Build steps after each other

How do I run several build steps after each other in IntelliJ? I think I want a mini CI/CD build system inside the editor.
For example, the project I work on now is a Spring Boot and JavaScript web site. I need to build it with Maven using mvn clean package -Pdockerimage. This copies the files for building the Docker image to target/dockerimgbuild.
Then I want to build the Docker image using docker build -t scheduling-ui-dev . and after that run it with docker-compose up --build from src/main/resources/docker-compose.
I have built one run configuration for each of these steps, but how do I run them after each other? I have found that you can add "Before launch" tasks, but the system is clunky and complains that target/dockerimgbuild doesn't exist even before it has run the Maven step which creates it. The latest problem I stumbled on was that a file prevented Maven from removing target/dockerimgbuild, and all run steps were automatically removed from the run configurations.
There is a run configuration called compound, but that runs everything in parallel and you cannot specify the order, which is a problem.
I wonder if it is feasible to start TeamCity in a container. Does anyone have a clue about that (is TeamCity easy to configure, how do I make it launch a docker-compose container on my host machine, etc.)?
My solution right now is to have several terminals (if this gets more permanent I will replace it with a script) where I just press Up and Enter to execute the steps manually. That seems stupid, as I guess Maven itself can do all of this, but I don't know how or how much work it is.
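Roughly, the script I have in mind would just chain the commands above (I'm assuming target/dockerimgbuild is the Docker build context; adjust if not):

#!/bin/sh
set -e                                               # stop at the first failing step
mvn clean package -Pdockerimage                      # copies the image files to target/dockerimgbuild
docker build -t scheduling-ui-dev target/dockerimgbuild
(cd src/main/resources/docker-compose && docker-compose up --build)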
There is a compound Run/Debug configuration: https://www.jetbrains.com/help/idea/run-debug-configuration-compound-run-configuration.html
Also, there is a multi-run plugin: https://plugins.jetbrains.com/plugin/7248-multirun

Viewing log file after AppVeyor script fails

I am trying to diagnose an error during the build of a project with AppVeyor. This same project is also built with Travis CI without any problems, so I assume it is Windows-related.
The script produces some log files, but I have no clue how to view them after AppVeyor has finished trying to build.
As a specific example: See the log of this build. At line 11706 it says:
Logs have been written to: C:\stack.stack-work\logs\yaml-0.8.28.log
How can I view the contents of that file?
You can push this file as an artifact at the on_finish stage, or simply RDP to the build worker and explore it interactively.
Side note: you can also try to debug your build in RDP, but note that environment variables from the build session are not available in the RDP session, so you need to re-create all or part of them.
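For example, something along these lines in appveyor.yml would upload the log quoted above (adjust the path if it differs on the worker):

on_finish:
  - appveyor PushArtifact C:\stack.stack-work\logs\yaml-0.8.28.log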

On UnsatisfiedLinkError, clarification needed

When building the project from command line using mvn clean install everything builds without any issues.
When running some tests that use precompiled C libraries from IntelliJ, the tests fail with java.lang.UnsatisfiedLinkError.
I may be completely off here, but does IntelliJ not see the .so file? If so, how can it be added, please?
A shared library fails to load with UnsatisfiedLinkError if:
1. it's not in the working directory configured in the test run configuration;
2. it's not on the PATH environment variable (on Mac, Terminal and GUI apps have different environments, see this answer). Run IDEA from the Terminal with open -a /Applications/IntelliJ\ IDEA\ 12.app/ to make the environments the same;
3. it's not in the location specified via the -Djava.library.path VM option (see the example after this list);
4. the .so depends on some other library that is not found for any of the reasons 1-3 (or a dependency of that dependency is not found, and so on).
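For case 3, the VM options field of the test run configuration could point at the directory that contains the native library (directory name here is hypothetical):

-Djava.library.path=/path/to/native/libs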

How do I control which version of an msbuild file is used between .NET4 and 4.5RC?

On my development laptop I have only VS2012 RC installed, and I am successfully able to hook into the new MSDeploy .pubxml plumbing (DeployOnBuild and PublishProfile settings) from PowerShell (via psake) to deploy my web site to our test server.
However, on my build server, I initially had VS2010 SP1 installed, and I've now additionally installed the 2012 RC (I have other builds on this machine that are still .NET 4).
When running the same script with exactly the same parameters, I see different results between my dev machine and the build server. The command I'm running is
exec { msbuild "Website\WebSite.csproj" /m /p:DeployOnBuild=True /p:PublishProfile=MyTestProfile }
On the build server, this does not in fact trigger MSDeploy, but only the packaging bits that zip the site up and make a deployment package. My machine picks the .pubxml file up and deploys successfully.
Eventually, I believe I've traced the problem to the file Microsoft.Web.Publishing.targets. On my dev machine I have only
C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v11.0\Web
but the server additionally has
C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\Web
and it seems like this file (without knowledge of the .pubxml stuff) is what's being used there.
Has anybody got any idea what I need to twiddle (preferably within my own msbuild files so I don't screw up anything else on the build server for the 4.0 builds) to get msbuild to pick up the v11.0 version of the file and thereby use my .pubxml file?
That is interesting; it should be picking up the latest (v11.0), so it seems like there is a bug here. This is controlled by the MSBuild property VisualStudioVersion.
Here are the rules for how this value is populated at build time.
1. If VisualStudioVersion is defined as an environment variable or a global property (e.g. /p: on the command line), that wins. This is how Dev11 and the Dev11 command prompt are always v11 – they both define VisualStudioVersion as an environment variable.
2. Otherwise, if there is a sub-toolset that matches the equivalent solution version (which is currently always the file format version – 1), choose that.
3. Otherwise, use the default version: 10.0 if Dev10 is installed, otherwise the highest-versioned sub-toolset version installed (currently always 11).
In your case, since you are running into an issue, you can pass in the property /p:VisualStudioVersion=11.0 to ensure that the correct targets are used.
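For example, with the psake task from the question that would be:

exec { msbuild "Website\WebSite.csproj" /m /p:DeployOnBuild=True /p:PublishProfile=MyTestProfile /p:VisualStudioVersion=11.0 }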

Problems executing an SSIS package deployed to the file system

I have an SSIS master package, which executes several child packages. It works great, but when I deploy it to the file system on the server, I get error code 0xC00220DE: "The system cannot find the file specified."
When I run the package on the server by double-clicking it, it works correctly. But when I use DTExec:
dtexec /FILE "d:\cmcdx\ssis\MAESTRO_FACTURACION.dtsx"
I get the mentioned error.
The package configuration is correct, and the user I'm executing the package as is an administrator of the machine.
Should I deploy the packages to Sql Server? What are the best practices for deploying a master-child package? I'm running out of ideas here...
By the way, I'm running SQL Server 2005 SP3.
Solved it.
I was using relative paths to point to the child packages, and at runtime SSIS was unable to find them.
In the end I used a specific path, set in a configuration file. Then I used the deployment utility, copied everything to the server, ran it by double-clicking the SSISDeploymentManifest file, and changed the paths to the proper location.
Thanks to James and Justin for your answers.
Is the package not getting a path or location value from a package configuration file? If so, make sure you include the /ConfigFile argument and the path to the config file. Another thing to check is whether you have any connections in the package that refer to mapped network drives; these may not work when running under a different service account than your local console account.
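For example (the configuration file name here is hypothetical):

dtexec /FILE "d:\cmcdx\ssis\MAESTRO_FACTURACION.dtsx" /ConfigFile "d:\cmcdx\ssis\MAESTRO_FACTURACION.dtsConfig"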
[Edit]
Try this command line below on the server (notice the double slashes).
dtexec /FILE "d:\\cmcdx\\ssis\\MAESTRO_FACTURACION.dtsx"
There are several things that could be going wrong here. You mention that you're using a master package to run several child packages. Are all of your child packages in their proper location on the server as well?
Remember that the paths to the child packages should be variables in your master package so that those values can be changed through a configuration file on the server if need be.
You might also want to check out this set of tutorials on MSDN:
Package Deployment How-To Topics
These tutorials explain how to properly enable package configurations on the server when your package runs.