Using NuGet in a development environment - best practices / how to

We're trying to figure out the best way to use NuGet in a development environment to manage our own libraries.
We want to standardize on the NuGet way of doing things for our 3rd-party libs, but we'd also like to use NuGet to manage our internal utility libraries. For developers consuming the in-house libs this is great and everyone's happy. However, for devs actively working on a utility lib it seems more problematic: their previous process of build lib, build main app, F5 and go is now slowed down by publishing and updating, potentially across lots of packages, not to mention the moaning about the additional process!
We use TDD on the internal libs, but everyone needs to be able to debug and modify the libs along with the main app. I have seen Phil Haack's demo on debugging packages in 1.3 and read David Ebbo's blog, but those fit a different scenario.
So what is the best process for dev/debug cycles? If we use NuGet, do we need to accept the existing constraints? Is there a hybrid practice people are using? Maybe 1.3 gets closer to automating all this, or do we just avoid NuGet for internal packages, which would be a real shame?
Loving NuGet; maybe I'm wanting way too much from the little guy. Feedback appreciated.
Thanks

I'd suggest you use separate network shares or feeds (similar to what myget.org supports in the cloud) for different scenarios.
You could imagine creating a CI share, a QA share, a Releases share, ...
Have the people working on the referenced library do CI builds that drop CI packages into the CI repository, for instance, and have those picked up by the other projects (which just need to do a simple update; this could be automated through PowerShell in a pre-build step: check for a new version and, if there is one, update, as in the sketch below).
Just make sure that when products release their milestones, they also release against released dependencies (this could be as simple as switching feeds; releases will always have a higher version number than CI builds).
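A rough sketch of that pre-build automation, assuming nuget.exe is on the PATH; the feed path, package ID, and solution name below are placeholders, not part of any real setup:
# Hedged pre-build sketch: pull in a newer CI package when the feed has one.
# 'nuget update' is effectively a no-op when the installed version is already
# the latest available on the given source.
nuget update MySolution.sln -Id MyCompany.Utils -Source \\server\nuget-ci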
Hope that helps!
Cheers,
Xavier

If you're working on the source code for the lib and the main app at the same time, I'd say NuGet is probably not a good solution. I think it'll only work in situations where you work with a "stable" version of the library that doesn't need to change frequently during the development of your main app.
That said, is it possible the development on your library could be done in isolation? You already mention you're doing TDD on the lib, so why can't that work be done and the lib built and deployed, and then the main app work done against it?

Related

Versioning APIs during internal development

In our team we have a number of APIs specified using the OpenAPI Specification (formerly Swagger). We use Maven and OpenAPI Generator to generate code, then build and publish the artifact to our local Nexus. We build our code on TeamCity. The artifact is given the version specified in Maven's pom.xml file.
During development we only use snapshot versions, that is, versions which can be overwritten and will be cleaned up. This is the opposite of release versions, which cannot be overwritten and need administrative privileges to clean up. The reason for this is that a developer usually changes a little bit at a time, which is much more convenient with snapshot versions. It also makes cleaning up outdated unreleased artifacts much easier.
Our problem is that from time to time a developer makes API changes but forgets to set a new version. This works fine locally, but when the code is built on TeamCity the changed API overwrites the artifact of an older version. A developer not working on that branch will then get a compile error, because the code does not match the API artifact being used.
What do others do? Is there a best practice? Preferably with standard tools. We have tried many things and nothing works well. At the same time, this issue is so basic that someone must have a good solution - or at least enough experience to point to the least bad one.
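To make the failure mode concrete: the kind of guard we'd want amounts to something like the sketch below. Everything in it (the spec file name, the pom.xml location, and a "last-published.txt" record of "<version> <spec-hash>" written on each publish) is a placeholder assumption, not standard tooling.
# Hedged CI-guard sketch: fail the build when the API spec changed
# but the Maven version did not. All file names are placeholders.
$specHash = (Get-FileHash api.yaml -Algorithm SHA256).Hash
$version  = ([xml](Get-Content pom.xml)).project.version
$last     = (Get-Content last-published.txt) -split ' '
if ($last[0] -eq $version -and $last[1] -ne $specHash) {
    Write-Error "API spec changed but version $version was not bumped."
    exit 1
}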

Is NuGet appropriate for a daily development workflow?

I am looking at NuGet to improve the automatic handling of dependencies (both internal and third-party) during development.
As long as you develop through the CI build server, all is good:
get latest source for A and B, where B depends on A
fix bug in A
build A
check into source control
CI Build Server initiated
new nuget package is created and placed in corporate repository
build B (which will get the updated A package)
run B to verify that the bug in A was fixed
...repeat n times
However, I'm wondering if it is possible to work locally as a single developer, without having to wait for the CI Build Server to produce a new package?
NuGet has a Package Restore feature, which will download all dependencies automatically on build. You can also list the order of the repositories that Package Restore should look through for packages.
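For illustration, a plain folder can be registered as an extra package source with nuget.exe; the feed name and path here are placeholders:
PS> nuget sources add -Name LocalDev -Source C:\dev\local-packages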
If the workflow could become:
get latest source for A and B, where B depends on A
fix bug in A
build A
(building creates a local NuGet package)
run B to test the (resolved) bug in A (B should now pick up our local NuGet package, not the one from the corporate repository)
...repeat n times
check into source control
CI Build Server initiated
new nuget package created in corporate repository
Is this possible using Visual Studio, MSBuild, a CI build server and NuGet? I'm especially interested in the creation of local packages while developing locally.
Note that I have native projects; however, apart from the post-build generation of the NuGet package, this is a workflow that I hope should work for both C# and C++ projects.
The solution I have now, though far from ideal, is what I could figure out works best. Oh, and it is a work in progress, so it WILL change in the coming weeks/months as I figure out how to work around the kinks.
I mostly have to deal with managed DLLs right now, but I do have some native code and, worse, multi-platform native code to deal with eventually.
Create a local repository: basically just a folder, configured in your list of NuGet feeds.
Then I created an MSBuild task that packages the project and outputs the package into the local repository's root folder. Make sure the version of your package is always increasing; presently I do this manually by editing the assembly version.
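A minimal sketch of that post-build packaging step, assuming nuget.exe and a hypothetical project name, version, and local feed folder:
PS> nuget pack MyUtilityLib.csproj -Version 1.0.42 -OutputDirectory C:\dev\local-packages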
Once it's built, update the other projects that reference it; I usually do this through the Package Manager Console (Update-Package).
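For example, from the Package Manager Console (the package ID and feed path are placeholders):
PM> Update-Package MyUtilityLib -Source C:\dev\local-packages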
For each project that was updated, bump its version; rinse, lather, and repeat until you get to your top-most project (the actual program).
Once everything is nice and good and you are ready to commit, the build system should do its own packaging and send it to your official repository.
The Good
No clogging of the repository and build system with intermediate development versions; that garbage remains (as it should) local.
Local repos are super easy to set up; it can even be done without changes to VS, through the global NuGet config.
This is friendly to both paradigms of package restore and checking in packages with the project. That said, I would recommend not checking in the packages you built locally, but rather ones that were published to your official repository, ideally by the build system. What's built locally should remain local.
The Bad
Still much more complicated than just adding projects to a solution.
The deeper (or wider) your dependency tree the bigger the pain.
The Ugly
It makes some native NuGet behaviors quite quirky and annoying:
The update operation takes forever if your VS is connected to a version control system (Perforce for me). I hear they "solved" the problem; I would hate to see how it was before, if it used to be worse than it is now!
Having NuGet change non-code references back to "never copy" is a major pain.
If Only
Let me configure the desired state of a content dependency (copy always, copy never, or copy if newer) directly from the nuspec and be done with it! (And the same story with the ClickOnce content status: include, exclude, etc.)
Make the update operation quick; 2 minutes for a dozen projects is just insane, especially if the ultimate goal is to manage 500+.
Perhaps a hybrid mode where locally we work with project inclusion, but the build system works with NuGet dependencies (and builds them if necessary).
If you are going to parse the project files, do follow the MSBuild parsing rules and honor the conditional statements.
There are still issues I have yet to figure out, like how to manage multiple branches of the code in the repository, and how to handle version conflicts further up the food chain. In a large project (ultimately we have to bring 500+ separate projects together in a single application executable), conflicts are expected.
I would love to bring in all the goodness of sane dependency management à la Maven, but thus far I have not found NuGet to be mature enough to even think of proposing it to the dev team.
Certainly. In our solutions, NuGet parks the libraries in the "packages" directory of the solution's hierarchy, which is ultimately kept in TFS. This allows for complete solution check-outs that include the required libraries. If it's your intention to update the libraries normally provided by NuGet, you'll need to update the dependent projects' references to point to the project containing the updated code normally provided by the NuGet process.
Prior to checking in your regular solution work (not the NuGet-related libs), make sure the solution's NuGet libs are up to date and that the references in the solution point back to the NuGet-installed libs. Of course, you'll check in and fetch the NuGet-related libs beforehand.

Testing a NuGet package

We are big users of NuGet; we've got 25-30 packages which we make available on a network share.
We'd like to be able to test new packages in the consuming applications before they're built and released. Ideally, this could be done using something similar to Maven's snapshots, i.e. a specific development version of a package (snapshot functionality).
Has anyone else come up with a way, ideally a reasonably non-hacky one, of doing this?
Our favoured method is to build the package assemblies and then manually overwrite the assemblies in the packages/ directory (i.e. as a stand-in for actual project references), but that doesn't seem particularly clean.
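Concretely, the overwrite amounts to something like this, where the package ID, version, and framework folder are placeholders:
# Hedged sketch of the manual assembly swap; all paths are placeholders.
Copy-Item .\MyLib\bin\Release\MyLib.dll .\App\packages\MyLib.1.2.3\lib\net40\ -Force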
Update:
We use a CI build server which creates builds on every commit, and we have a specific, manually triggered NuGet build which works off specially tagged versions of the codebase. We don't want to create a NuGet build off every commit, but we would like to be able to test a likely candidate in the wild before we trigger the manual NuGet package build.
I ended up writing a unit/integration testing framework to solve a similar problem. Basically, I needed to verify the content of the package, the versions and info, what would happen when I installed and uninstalled the package, which versions of the assemblies were in lib, what bitness the assemblies were built as (x86 or x64) and so on - and I needed it all to run without Visual Studio installed and on my (headless) build machine as a quality gate.
Standing on the shoulders of giants like Pester, PETools, and SharpDevelop's package management module, I put together nuget-test.
Clone the project into your package directory (where your .nuspec file and package files are). If for whatever reason you want to keep the nuget-test project as a git repo, then simply remove "remove-item nuget-test/.git -Recurse -Force" from the command below.
git clone https://github.com/nickfloyd/nuget-test.git; remove-item nuget-test/.git -Recurse -Force
Run Setup.ps1 in the root of the nuget-test directory in an x86 instance of PowerShell.
PS> .\setup.ps1
Write tests and place them in the nuget-test/test directory using the Pester syntax (a minimal example follows these steps).
Run the tests.
PS> Invoke-Pester
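As a minimal illustration of the Pester syntax used for those tests (the package ID, version, and path are placeholders, not taken from nuget-test itself):
# Hedged Pester sketch (Pester v3-style syntax; names and paths are placeholders).
Describe "MyLib package" {
    It "ships the net40 assembly" {
        Test-Path ".\MyLib.1.2.3\lib\net40\MyLib.dll" | Should Be $true
    }
}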
Project page on GitHub: https://github.com/nickfloyd/nuget-test
I hope this helps you get closer to what you're trying to get done.
If you're using NuGet packages to distribute your libraries, you should not limit yourself to only testing the libraries. You should test the packages themselves as well (if your binaries are OK but incorrectly installed, consumers still have issues). The whole point is to improve this experience.
One way could be to have an additional CI or QA repository. The one you currently have is actually your "production" repository, containing consumable releases considered finished, high-quality products.
Going further, you could have a logical package promotion flow (based on continuous integration, or even using a continuous delivery approach), where:
- each check-in produces a package in your CI repository
- testers pick up a CI package for QA and, if it is found OK, promote it to either a QA feed or the production feed (whatever you prefer; it depends on the quality of your testing and how well it is automated)
There are various ways of implementing this scenario: using simple network shares, internal NuGet.Server or Gallery implementations, or simply using http://myget.org to give it a try with minimal cost and zero effort.
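With plain network shares, "promotion" can be as simple as copying the tested package between folders; a hedged sketch with placeholder paths:
PS> Copy-Item \\server\nuget-ci\MyLib.1.2.3.nupkg \\server\nuget-qa\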
Hope that helps!
Cheers,
Xavier

How do you distribute the IDE and its configuration within your team?

I'm wondering how software development teams distribute their standard IDE(s).
E.g. developing with Eclipse: a custom code formatter, an SVN repository, a copyright header...
At the moment my team has a standard zip file which is then distributed among the developers.
Problem:
If one file, a plugin, or the IDE itself changes (e.g. new coding guidelines, or an upgrade to Eclipse 3.5.1), the whole distribution has to be done again, and every developer needs to unzip the bundle again. Imagine working with different workspaces (Jetty, different Tomcat versions, WTP) due to project history. That doesn't scale.
I know that there are some related articles:
A new version of Eclipse just came out. Is there anything I can do to avoid having to manually hunt down my plugins again?
Manage Your Eclipse Install With A Local Git Repository
And some commercial programs.
Eclipse also has a new update-installer approach.
But I don't see the killer app. How does your team solve this? Is there a best practice?
I guess the best would be a program that lets you choose your current project, then downloads the configured IDE from the server and lets you know if the project config files have been updated.
For Eclipse, look at Buckminster; it targets exactly your use case, I suppose, though I didn't use it personally.
At my previous company they wrote a custom update agent that pulled from a centrally configured server which was updated by the team leaders. It worked well, until people wanted to install their own plugins.
Basically, a developer wanted a plugin, fought in futility to get it included in the default (managed) repo, installed it himself, then updates broke on his machine when the team lead had a sudden stroke of common sense and included it.
They never did come up with a 'good' way to manage it. But, at least they didn't put us all on terminal servers with thin clients.

What build tool do you use professionally?

At home, I use CTRL+SHIFT+B or F7 or whatever key sequence initiates the build in my build tool. At work, this doesn't quite cut it.
At my first job (an internship) we used a product called Visual Build, which I have come to like very much. It's the best build tool I've ever worked with. The downside is that it's not free.
At my latest job, I came in knowing literally nothing about Ant. Now, unfortunately, I've become deeply involved in our build processes and cannot extricate myself. It works, yes, but after coming from Visual Build, it seems like it's fighting me every step of the way. Yes, it's free, but we're not trying to be a free-software-only development company or anything.
I've never looked into make or any other build tools, so I don't really know what else is out there.
Has anybody ever seen or had experience with Visual Build? Mostly I'm fond of a few key things:
it has a GUI
it runs arbitrary VBScript without the need of a compiled class
you can step through the build process, or start from anywhere in the middle.
Are there any free build tools that have this? Is there any way to convince people that it's worth it to move on? It's 2008. We use IDEs to develop, so why not "IBEs" to build?
Edit: I'm mostly looking for an answer to my last question: is there a solution with a built-in GUI that I can use for free?
Not very sophisticated, but we use a set of batch files. And that works great.
We use FinalBuilder - I think it's very similar to VisualBuild, though I've not used the latter.
It does run from the command line, and you can integrate it with CC.Net if you want.
For Java projects we use TeamCity, which is sort of CruiseControl-like, but you can also do a remote run, i.e. you send your changes to the server, it builds and runs the unit tests, and if everything works OK, THEN you check in. A very nice build tool, and free for up to 20 build configurations.
For our Visual Studio 2005 projects, including packaging the final exes and DLLs with InstallShield and putting them up on a shared server, we use FinalBuilder. It's not free, but it is very easy to use and get started with.
We also telnet out (from FinalBuilder) to a number of other platforms (Unix/Linux/OpenVMS) and start remote builds by running makefiles there.
We do not use continuous builds, but there is a FinalBuilder Server which handles that and comes free with the FinalBuilder Professional license.
We are very happy with FinalBuilder, it's quite easy to get up to speed with and powerful enough to solve most problems.
CMake. It generates build files for KDevelop, Eclipse, Makefiles, and Visual Studio (and Xcode), and it really works. You can easily extend it with macros, although the programming capabilities are rather limited. It's easy to learn, and porting an existing application from Visual Studio to it is pretty easy. However, you are limited to C++/C and, IIRC, Fortran code.
KDE is also using CMake now, so it seems to scale very well (i.e. generation time for the projects/dependency checking is not too bad).
I am not sure this is exactly what you are looking for, but I LOVE CruiseControl.NET. I have it build my projects using the MSBuild task. It doesn't have a GUI exactly, but there is a web interface to view the results of your builds and a System Tray resident program which will alert you of the build status.
UppercuT. It's free.
UppercuT uses NAnt to build, and it is an insanely easy to use build framework.
Automated Builds as easy as (1) solution name, (2) source control path, (3) company name for most projects!!!
http://code.google.com/p/uppercut/
Some good explanations here: UppercuT
Going back to the keystrokes thing for a second: I found Hoekey, which the CTO loves. I don't use it myself, but as a way of assigning keystrokes to things, it's pretty good.
I know nothing of Visual Build, but from your description it sounds like it is tied to Windows and doesn't run from the command line.
If you are building Java software (I assume you are, since you are using Ant), it's preferable to have a cross-platform tool. If you can run the tool from the command line, then it is scriptable, which is extremely important for automation.
Ant is also extensible and a de facto standard. Many tools that you may use (Cobertura, TestNG, etc.) provide Ant tasks so that they can easily be integrated with your build.
I use Ant for all Java projects. Some people prefer Maven, but I'm not one of them. Ant is far from perfect (the XML syntax is a bit clunky) but it is well documented, extremely stable and pretty straightforward.
If you use a standard tool, such as Ant or Maven, you will be able to take advantage of any number of Continuous Integration products. I doubt you will find many that work with Visual Build.
Most IDEs support Ant, so they give you a GUI of sorts and your CI server will give you a web interface for doing builds.
NAnt (.NET port of Ant). Works great and is easily extensible.
For small projects I use post-build scripts, and with the support of 7z, NSIS, and similar CLI tools, they do the job perfectly for me.
TeamCity and CruiseControl both work well for any project, but here is why you might choose TeamCity:
Ease of setup: during setup we found TeamCity easier to set up and use, especially compared to CruiseControl. We did not need to edit XML files or massively configure individual build machines as with CruiseControl.
Ease of extensibility: TeamCity stands out in its ease of extensibility too. If we find that builds are waiting in the queue too long, we can add more computers as agents. The only additional work on our end is registering the new computers with the TeamCity server and installing MSBuild and Subversion.
Interaction with Subversion: you can check how many and which changes were committed to Subversion since the last build, who started a build, etc.
I've grown very fond of SCons for building C++ files. It's very straightforward, and the build scripts are written in Python (which is much better than some hacked-together DSL, IMO).
Ant and Maven are great little build tools.
And if you want to automate the build process there are some great tools like TeamCity and Bamboo.
Personally I use Makefiles for pretty much everything, because they are simple as hell. But at work I'm forced to use Ant.
The main problem I have with Ant is that the XML makes it hard to read and understand, even with correct indentation. On the other hand, the verbosity of XML can help when reading someone else's Ant file, but it still makes it a PITA when the file is more than a few tens of lines.
As for having a GUI to build... I've always felt that's a minus rather than a plus.
Maven is the best for me because it handles the project dependencies.