RoboCopy not included in latest MSBuild Community release? - msbuild

I'm setting up a RoboCopy job using the MSBuild Community Tasks. It seems, however, that the task has not been released, despite being in the list of tasks on the project front page. The latest release, v1.2.0.306, does not include it, but it is present in the SVN trunk. Am I looking in the wrong place?
I know there is an MSBuild Extension project that also has a RoboCopy task, but I'm already using some of the other Community tasks, and I'd rather not make my build depend on two almost identical extension packs.
The Tigris site seems abandoned in terms of documentation, so I'm asking here in case anyone knows.

In case you didn't notice: the latest official release on the download page (v1.2.0.306, exactly what you downloaded) is nearly five years old (February 2007).
Since then, a lot has obviously happened in the trunk.
You can download and compile the trunk yourself, or you can use the nightly build which you can download at the bottom of the main project page:
Download The Latest Nightly Build
The latest test binaries and source from the automated build server.
Version: 1.3.0.516 Date: 9/8/2011
MSBuild.Community.Tasks.Nightly.zip
MSBuild.Community.Tasks.Nightly.msi
I'm using this version.
So...yes, you are looking in the wrong place :-)

The latest MSBuild Community Tasks are also available through NuGet as the MSBuildTasks package: http://nuget.org/packages/MSBuildTasks
You can install it via the GUI, or run the following in the Package Manager Console:
Install-Package MSBuildTasks

Related

What is the upgrade path for the C# MSBuild Build Tools?

We are using Bamboo to compile our C# projects and recently enhanced our build AMIs to include the MSBuild 15 Build Tools so devs are able to use C# 7. With the advent of C# 7.1, Microsoft state they are "increasing the cadence" of language releases, and I've been trying to find out what the upgrade path for the MSBuild Build Tools is and how to keep it updated with the latest version.
At the moment it appears that the Bamboo admins would have to always be one step ahead of the devs (who update their IDE to use the new language release) to enable clean compilations.
I can't find a decent way of automating this (other than through something like Chocolatey). I'd be interested to see how other people get round this or ensure their Build Tools are always up to date.
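For the Chocolatey route mentioned above, one hedged sketch is a scheduled job on the build AMIs that simply upgrades the Build Tools package on a regular basis; the package id below is an assumption, so verify the exact ids for your toolset and workloads on chocolatey.org.
# Hypothetical nightly job on the build agent: keep the Build Tools current.
# "visualstudio2017buildtools" is an assumed package id -- verify it on chocolatey.org.
choco upgrade visualstudio2017buildtools -y
The same command could also be baked into the AMI build so that freshly provisioned agents come up with the current toolset.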

Install npgsql for PowerBI

I'm not a programmer and I don't have Visual Studio installed on my PC, but I need Npgsql to connect Microsoft Power BI (Power Query) to a Postgres instance.
Is there a way to install it without having to compile it?
Are any special configurations needed on my PC to make it work?
Thank you very much.
Javier
To expand on @Shay's comment, you can use NuGet if you want the latest version of Npgsql.
You don't need to install nuget.exe; you can download the latest NuGet package from http://packages.nuget.org/api/v1/package/Npgsql/
Rename the .NUPKG file to .ZIP and unzip it, and you'll find Npgsql.dll in /lib/net45.
The Office.com instructions indicate you'll also need Mono.Security.dll, which you can download at http://packages.nuget.org/api/v1/package/Mono.Security/
It seems the Office.com instructions are slightly out of date, because the latest Npgsql GitHub releases don't include compiled binary downloads. You can follow the Office.com instructions after downloading an older GitHub release.
(It may also work to just run the setup EXE instead of continuing with the Office.com instructions.)
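If you prefer to script the rename-and-unzip steps above, a minimal PowerShell sketch follows. It assumes PowerShell 5+ (for Expand-Archive) and uses the package URL from this answer; the destination is illustrative, so copy the DLL wherever you need it.
# Download the Npgsql package (a .nupkg is just a zip), extract it, and pull out Npgsql.dll.
$url  = "http://packages.nuget.org/api/v1/package/Npgsql/"
$work = Join-Path $env:TEMP "npgsql-download"
New-Item -ItemType Directory -Path $work -Force | Out-Null
Invoke-WebRequest -Uri $url -OutFile (Join-Path $work "Npgsql.zip")
Expand-Archive -Path (Join-Path $work "Npgsql.zip") -DestinationPath $work -Force
# The assembly sits under lib\net45 inside the package; the desktop is just an example destination.
Copy-Item (Join-Path $work "lib\net45\Npgsql.dll") -Destination ([Environment]::GetFolderPath("Desktop"))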

Is nuget appropriate for daily development workflow?

I am looking at nuget for improving automatic handling of dependencies (both internal and third party) during development.
As long as you develop through the CI Build Server, all is good:
get latest source for A and B, where B depends on A
fix bug in A
build A
check into source control
CI Build Server initiated
new nuget package is created and placed in corporate repository
build B (which will get the updated A package)
run B to verify that the bug in A was fixed
...repeat n times
However, I'm wondering if it is possible to work locally as a single developer, without having to wait for the CI Build Server to produce a new package?
NuGet has a Package Restore feature, which will download all dependencies automatically on build. You can also configure the order of the repositories that Package Restore should search for packages.
If the workflow could become:
get latest source for A and B, where B depends on A
fix bug in A
build A
(building creates a local nuget package)
run B to test the (resolved) bug in A (should now use our local nuget package, not local repository)
...repeat n times
check into source control
CI Build Server initiated
new nuget package created in corporate repository
Is this possible using Visual Studio, MSBuild, a CI Build Server and nuget? I'm especially interested in the making of local packages while developing locally.
Note that I have native projects; apart from the post-build generation of the NuGet package, I hope this workflow works for both C# and C++ projects.
The solution I have now, though far from ideal, is the best I could figure out. Oh, and it is a work in progress, so it WILL change in the coming weeks/months as I figure out how to get around the kinks.
I mostly have to deal with managed DLLs right now, but I do have some native code and, worse, multi-platform native code to deal with eventually.
Create a local repository, which is basically just a folder, and configure it in your list of NuGet feeds.
Then I created an MSBuild task that will package the project and output it in the local repository's root folder. Make sure the version of your package is always increasing; presently I do this manually by editing the assembly version. (A command-line sketch of this loop appears after these steps.)
Once built, update the other projects that reference it; I usually do this through the Package Manager Console (Update-Package).
For each project that was updated, bump up its version; rinse, lather, and repeat until you get to your top-most project (the actual program).
Once everything is in good shape and you are ready to commit, the build system should do its own packaging and send it to your official repository.
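The answer above does the packaging with an MSBuild task; purely for illustration, here is roughly the same inner loop from the command line. It assumes nuget.exe is on the PATH, the project name, feed folder and version number are illustrative, and the last command is meant for the Visual Studio Package Manager Console rather than a plain shell.
# One-time: register a plain folder as an extra NuGet feed.
nuget sources add -Name LocalDev -Source C:\LocalNuGet
# After fixing the bug in A, pack it straight into the local feed with a bumped version.
nuget pack A\A.csproj -OutputDirectory C:\LocalNuGet -Version 1.0.0.42 -Properties Configuration=Release
# In the Package Manager Console of the solution that consumes A:
Update-Package A -Source LocalDev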
The Good
No clogging of the repository and build system with intermediary development versions; that garbage remains (as it should) local.
Local repos are super easy to set up; it can even be done without changes to VS through the global NuGet config.
This is friendly to both paradigms of package restore or checking in packages with the project. That said, I would recommend not checking in the packages you built locally, but rather ones that were committed to your local repository, ideally through the build system. What's built locally should remain local.
The Bad
Still much more complicated than just adding projects to a solution.
The deeper (or wider) your dependency tree, the bigger the pain.
The Ugly
It makes some native NuGet behaviors quite quirky and annoying:
The update operation takes forever if your VS is connected to a version control system (Perforce for me). I hear they "solved" the problem; I'd hate to see how it was before if it was worse than it is now!
Having NuGet change non-code references back to "never copy" is a major pain.
If Only
Configure the desired state of a content dependency (copy always, never, or newer) directly from the nuspec and be done with it! (Oh, and the same story with ClickOnce content status: include, exclude, etc.)
Make the update operation quick; 2 minutes for a dozen projects is just insane, especially if the ultimate goal is to manage 500+.
Perhaps a hybrid mode where locally we work with project inclusion, but the build system works with NuGet dependencies (and builds them if necessary).
If you are going to parse the project, do follow MSBuild parsing rules and honor the conditional statements.
There are still issues I have yet to figure out, like how to manage multiple branches of the code in the repository, and how to handle version conflicts further up the food chain. In a large project (ultimately we have to bring 500+ separate projects together into a single application executable), conflicts are expected.
I would love to bring in all the goodness of sane dependency management à la Maven, but thus far I have not found NuGet to be mature enough to even think of proposing it to the dev team.
Certainly. In our solutions, NuGet parks the libraries in the "packages" directory of the solution's hierarchy, which is ultimately kept in TFS. This allows for complete solution check-outs that include the required libraries. If it's your intention to update the libraries normally provided by NuGet, you'll need to update the dependent projects' references to point to the project containing the updated code normally provided by the NuGet process.
Prior to checking in your regular solution work (not the NuGet-related libs), make sure the solution's NuGet libs are up to date and the references in the solution point back to the NuGet-installed libs. Of course, you'll check in and fetch the NuGet-related libs beforehand.

Testing a NuGet package

We are big users of NuGet, we've got 25-30 packages which we make available on a network share.
We'd like to be able to test new packages in the consuming applications before they're built and released. Ideally, this could be done using something similar to Maven's snapshot functionality, i.e. a specific development package.
Has anyone else come up with an (ideally reasonably non-hacky) way of doing this?
Our favoured method is to generate the package assemblies and then manually overwrite the assemblies in the packages/ directory, i.e. to replace the actual project references, but that doesn't seem particularly clean.
Update:
We use a CI build server which creates builds on every commit and has a specific manually triggered NuGet build which works off specifically tagged versions of the codebase. We don't want to create a NuGet build off every commit, but we would like to be able to test a likely candidate in the wild before we trigger the manual NuGet package build.
I ended up writing a unit/integration testing framework to solve a similar problem. Basically, I needed to verify the content of the package, the versions and info, what would happen when I installed and uninstalled the package, what versions the assemblies in lib were, what bitness the assemblies were built for (x86 or x64), and so on, and I needed it all to run without Visual Studio installed and on my build machine (headless) as a quality gate.
Standing on the shoulders of giants like Pester, PETools, and SharpDevelop's package management module, I put together nuget-test.
Clone the project into your package directory (where your .nuspec file and package files are). If for whatever reason you want to keep the nuget-test project as a git repo, simply remove "remove-item nuget-test/.git -Recurse -Force" from the command below.
git clone https://github.com/nickfloyd/nuget-test.git; remove-item nuget-test/.git -Recurse -Force
Run Setup.ps1 in the root of the nuget-test directory in an x86 instance of PowerShell.
PS> .\setup.ps1
Write tests and place them in the nuget-test/test directory using the Pester syntax (a small example follows these steps).
Run the tests.
PS> Invoke-Pester
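For illustration, here is a minimal test in the older Pester v3 "Should Be" syntax; the MyLibrary package name and the expected lib\net45 layout are assumptions, not something nuget-test prescribes, and Expand-Archive needs PowerShell 5+.
# test\MyLibrary.package.Tests.ps1 -- hypothetical example
Describe "MyLibrary NuGet package" {
    $pkg = Get-ChildItem ..\*.nupkg | Select-Object -First 1
    $out = Join-Path $env:TEMP "MyLibrary-pkg"
    It "produces a .nupkg at all" {
        $pkg | Should Not BeNullOrEmpty
    }
    It "contains the net45 assembly" {
        Copy-Item $pkg.FullName "$out.zip" -Force          # a .nupkg is just a zip
        Expand-Archive -Path "$out.zip" -DestinationPath $out -Force
        Test-Path (Join-Path $out "lib\net45\MyLibrary.dll") | Should Be $true
    }
}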
Project page: nuget-test
On github: https://github.com/nickfloyd/nuget-test
I hope this helps you get closer to what you're trying to get done.
If you're using NuGet packages to distribute your libraries, you should not limit yourself to testing only the libraries. You should test the packages themselves as well (if your binaries are OK but incorrectly installed, consumers will still have issues). The whole point is to improve this experience.
One way could be to have an additional CI or QA repository. The one you currently have is actually your "production" repository containing consumable releases, considered finished high-quality products.
Going further, you could have a logical package promotion flow (based on Continuous Integration or even using a Continuous Delivery approach), where:
- each check-in produces a package on your CI repository
- testers pick up a CI package for QA and, if found OK, promote it either to a QA feed or to the Production feed (whatever you prefer; it depends on the quality of your testing and how well it is automated)
There are various ways of implementing this scenario, using simple network shares, internal NuGet.Server or Gallery implementations, or simply use http://myget.org to give it a try with minimal cost and zero effort.
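When the feeds are plain network shares, the promotion step itself can be as small as copying the .nupkg; a hedged sketch, with illustrative share paths and package name:
# The CI build drops every package on the CI share.
Copy-Item .\MyLib.2.1.0-ci0042.nupkg -Destination \\server\feeds\ci
# After QA signs off, promote the exact same .nupkg, unchanged, to the production share.
Copy-Item \\server\feeds\ci\MyLib.2.1.0-ci0042.nupkg -Destination \\server\feeds\production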
Hope that helps!
Cheers,
Xavier

Nightly build for VB.NET program, Versioning

I currently have a nightly build system running as a Windows scheduled task, calling a batch file, that works sort of like this:
Check out the latest revision from subversion
Modify the AssemblyInfo.vb file of the main executable and the libraries to set the version number to 0.0.0.revision
Invoke MSBuild to build everything (including the installer)
Upload the installer and a log of the build to an FTP server
This works OK, but step 2 is dirty and fragile, and I can't imagine that this is the only way to do what I want. Any ideas?
There are a couple of ways to deal with this. You may want to check this post or others tagged with svn (and containing "AssemblyInfo").
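For example, the fragile step 2 can be replaced by a small script that stamps the version just before MSBuild runs. A minimal sketch, assuming PowerShell 3+ (for Get-Content -Raw) and AssemblyVersion/AssemblyFileVersion attributes in their default template form; the script name, parameter names, and paths are illustrative, not part of the original setup.
# Set-Version.ps1 (hypothetical name): stamp 0.0.0.<revision> into every AssemblyInfo.vb.
param(
    [Parameter(Mandatory = $true)][int]$Revision,
    [string]$SourceRoot = "."
)
$version = "0.0.0.$Revision"
Get-ChildItem -Path $SourceRoot -Recurse -Filter AssemblyInfo.vb | ForEach-Object {
    $text = Get-Content $_.FullName -Raw
    $text = $text -replace 'AssemblyVersion\("[\d\.\*]+"\)', "AssemblyVersion(""$version"")"
    $text = $text -replace 'AssemblyFileVersion\("[\d\.\*]+"\)', "AssemblyFileVersion(""$version"")"
    Set-Content -Path $_.FullName -Value $text -Encoding UTF8
}
The batch file would call it before the MSBuild step, for example: powershell -ExecutionPolicy Bypass -File Set-Version.ps1 -Revision %REVISION%, where %REVISION% is whatever value the job already extracts from Subversion.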