Is NuGet appropriate for a daily development workflow? - automation

I am looking at nuget for improving automatic handling of dependencies (both internal and third party) during development.
As long as you develop through the CI Build Server, all is good:
1. Get latest source for A and B, where B depends on A.
2. Fix bug in A.
3. Build A.
4. Check into source control.
5. CI Build Server build is initiated.
6. A new NuGet package is created and placed in the corporate repository.
7. Build B (which will get the updated A package).
8. Run B to verify that the bug in A was fixed.
9. Repeat as many times as needed.
However, I'm wondering if it is possible to work locally as a single developer, without having to wait for the CI Build Server to produce a new package?
NuGet has a Package Restore feature, which downloads all dependencies automatically on build. You can also specify the order of repositories in which Package Restore should look for packages.
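For reference, a minimal sketch of what such a feed list might look like in a nuget.config; the feed names, local folder path and corporate feed URL are placeholders rather than anything from the actual setup:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- hypothetical feeds: a local folder first, then the corporate feed, then nuget.org -->
    <add key="Local" value="C:\LocalNuGetFeed" />
    <add key="Corporate" value="https://nuget.corp.example.com/api/v2" />
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>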
If the workflow could become:
1. Get latest source for A and B, where B depends on A.
2. Fix bug in A.
3. Build A (the build creates a local NuGet package).
4. Run B to test that the bug in A is resolved (B should now use the locally built NuGet package, not the one from the corporate repository).
5. Repeat as many times as needed.
6. Check into source control.
7. CI Build Server build is initiated.
8. A new NuGet package is created in the corporate repository.
Is this possible using Visual Studio, MSBuild, a CI Build Server and NuGet? I'm especially interested in creating local packages while developing locally.
Note that I have native projects as well; apart from the post-build generation of the NuGet package, I hope this workflow would work for both C# and C++ projects.

The solution I have now, though far from ideal, is the best I could figure out. Oh, and it is a work in progress, so it WILL change in the coming weeks/months as I figure out how to work around the kinks.
I mostly have to deal with managed DLLs right now, but I do have some native code and, worse, multi-platform native code to deal with eventually.
Create a local repository, basically just a folder, and configure it in your list of NuGet feeds.
Then I created an MSBuild task that packages the project and outputs it in the local repository's root folder (a sketch of such a target follows these steps). Make sure the version of your package is always increasing; presently I do this manually by editing the assembly version.
Once it is built, update the other projects that reference it; I usually do this through the Package Manager Console (update-package).
For each project that was updated, bump its version; rinse, lather, and repeat until you get to your top-most project (the actual program).
Once everything is nice and good and you are ready to commit, the build system should do its own packaging and send the result to your official repository.
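As a rough illustration of that packaging step, here is a minimal sketch of a post-build MSBuild target that shells out to nuget pack and drops the .nupkg into a local folder feed; the feed path and the Debug-only condition are assumptions for the example, not part of the original setup:

<Target Name="PackToLocalFeed" AfterTargets="Build" Condition="'$(Configuration)' == 'Debug'">
  <!-- pack the current project and place the package in the local folder feed -->
  <Exec Command="nuget pack &quot;$(MSBuildProjectFullPath)&quot; -OutputDirectory C:\LocalNuGetFeed -Properties Configuration=$(Configuration)" />
</Target>

With the local folder registered as a feed, Update-Package in the Package Manager Console then picks the new version up in the consuming projects.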
The Good
No clogging of the repository and build system with intermediate development versions; that garbage remains (as it should) local.
Local repos are super easy to set up; it can even be done without changes to VS, through the global NuGet config.
This is friendly to both paradigms: package restore or checking in packages with the project. That said, I would recommend not checking in the packages you built locally, but rather ones that were published to your official repository, ideally through the build system. What's built locally should remain local.
The Bad
Still much more complicated than just adding projects to a solution.
The deeper (or wider) your dependency tree the bigger the pain.
The Ugly
It makes some native NuGet behaviors quite quirky and annoying:
The update operation takes forever if your VS is connected to a version control system (Perforce for me). I hear they "solved" the problem; I would hate to see how it was before if it was worse than it is now!
Having NuGet change non-code references back to "never copy" is a major pain.
If Only
Let me configure the desired state of a content dependency (copy always, never, or if newer) directly from the nuspec and be done with it! (And the same story for ClickOnce content status: include, exclude, etc.)
Make the update operation quick; 2 minutes for a dozen projects is just insane, especially if the ultimate goal is to manage 500+.
Perhaps a hybrid mode where locally we work with project inclusion but the build system works with NuGet dependencies (and builds them if necessary).
If you are going to parse the project, do follow MSBuild parsing rules and honor the conditional statements.
There are still issues I have yet to figure out, like how to manage multiple branches of the code in the repository and how to handle version conflicts further up the food chain. In a large project (ultimately we have to bring 500+ separate projects together into a single application executable), conflicts are expected.
I would love to bring in all the goodness of sane dependency management à la Maven, but thus far I have not found NuGet to be mature enough to even think of proposing it to the dev team.

Certainly. In our solutions, NuGet parks the libraries in the "packages" directory of the solution's hierarchy, which is ultimately kept in TFS. This allows for complete solution check-outs that include the required libraries. If it's your intention to update the libraries normally provided by NuGet, you'll need to update the dependent projects' references to point to the project containing the updated code that NuGet would normally provide.
Prior to checking in your regular solution work (not the NuGet-related libs), make sure the solution's NuGet libs are up to date and that the references in the solution point back to the NuGet-installed libs. Of course, you'll check in and fetch the NuGet-related libs beforehand.

Related

Options for locally hosted client-side package management in VS2019?

A common issue I keep bumping into for web projects in .NET Core is the need to share JavaScript as easy-to-use modules from project to project. Oftentimes large quantities of code written in VS project A could very much be used in project B, sometimes in the same solution.
Restrictions:
- Must be self-hosted and not publicly exposed; only machines within the local network can access the libs/modules/packages.
- Ideally this can be driven from Visual Studio projects and make use of build tasks, PowerShell, MSBuild, or other such automation tools to package, minify, bundle, and deploy the JavaScript libraries.
- The absolute ideal is if this can all be hosted from just a network folder.
NPM/Yarn
I'm not super familiar with either of these, but is there a way we can drop JavaScript code we've built into some designated folder, perhaps modify some form of manifest (JSON or XML file or what have you), and then anyone can just npm install those packages? I guess what I'm wondering is: is there a way to tell npm, "This folder is now a source of packages you can install from"?
Bonus points if said "trust this folder" config can be set inside the VS project, so that if someone new grabs the git repo it will just work "out of the box", and they don't need to go through steps configuring npm or Yarn so it knows how to find those packages.
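For what it's worth, one way npm can treat a plain folder as a package source is the file: protocol in package.json; the package and folder names below are made up for illustration:

{
  "dependencies": {
    "shared-widgets": "file:../shared-packages/shared-widgets"
  }
}

Running npm install ../shared-packages/shared-widgets writes an entry like this automatically, and because it lives in package.json inside the repo, a fresh clone followed by npm install resolves it with no extra registry configuration.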
Libman
Same as above, but mostly I'm trying to figure out if there is any way at all to configure LibMan from VS. It's the default and what is currently in use, but it just has the four default CDNs it comes with and trusts, and I am not seeing any way at all to tell LibMan, "Here's a new source of files to trust; add it to the selectable drop-down."
But I am seeing basically zero configuration as an option for LibMan, which is quite disappointing.
Nuget
This is the other option that is already popular locally, but something about using NuGet to deliver JS files when NPM, Yarn, and LibMan already exist sets my teeth on edge. That said, I believe we already have a locally hosted NuGet server that could be used, so the infrastructure is already set up; if not, I know how to do it. I do like the fact that NuGet could definitely leverage actual projects, build steps, MSBuild, and so on for deploying.
Conclusion
What's the popular and easy way to do this nowadays? The best-case scenario is if there's a way to say, "Put a manifest.json file in the folder root that points to all the modules inside, then add it as a trusted source to your package manager, and now you can install those packages."

I made a NuGet package that I use in other projects. Do I have to wait every time I update it?

I'm building an Azure Function that depends on another project of mine that's on NuGet. Every time I update the NuGet project, I publish the updates to https://nuget.org. Then I wait for the validation. Then I update my other project to pull the latest version. Waiting for the validation is really annoying... sometimes a couple of times per day.
Is there a way I can use my nuget package without waiting for validation to complete? Keep in mind I'm developing both packages side-by-side on the same laptop.
There are three levels to doing this efficiently:
Whenever possible, do local development within the same solution using project references instead of package references. As zivkan said in the comments, this "inner loop" is fastest. Do this until you need to test the package itself (e.g., making sure it installs correctly).
Use a local NuGet feed if you are able to test the package without hosting it on a cloud repository (nuget add my-package.1.0.0.nupkg -source C:\somedirectory\localnuget). Visual Studio and the nuget CLI can both be configured to look in a local directory, which makes the testing loop much quicker (see the sketch after this list).
If your Azure Functions must get the package from a real hosted repository, use MyGet (or, if you want to DIY it, Artifactory) to host your own NuGet feed. Publishing your packages to that feed and consuming them from your function should be faster than waiting for official validation on nuget.org.
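To make level 2 concrete, a minimal sketch using the dotnet CLI; the project path is a placeholder and the feed folder just reuses the example path above:

# pack the library straight into a local folder feed
dotnet pack src/MyLibrary/MyLibrary.csproj -o C:\somedirectory\localnuget
# register that folder as a NuGet source so Visual Studio and the CLI can resolve from it
dotnet nuget add source C:\somedirectory\localnuget --name local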

Updating transitive dependencies of an npm package

Our company has a few web applications which in turn depend on a very long chain of internally created and hosted npm packages (we use JFrog Artifactory), each with its own dependencies (and so on). Whenever a bug is fixed or a feature is implemented in a low-level package, the current process requires a developer to check in their changes, wait for the CI/CD build to complete and the tests to run, update the parent package, and rinse/repeat all the way up the chain (which can be a very long process).
This can't possibly be a unique situation, yet it impacts our productivity greatly and encourages monolithic package development (to limit the number of packages to update) instead of proper code separation.
I can only think of two solutions:
1) Update the web application to use the transitive dependency directly in package.json. This, however, breaks "encapsulation", because how a direct dependency does its job shouldn't be known to the web application. If the direct dependency were to switch to some other transitive dependency later, the web application shouldn't be left referencing a now-irrelevant package.
2) Modify the web application's package-lock.json to point at the new version of the transitive dependency. This, however, seems to work only temporarily, as merge conflicts or new installs of direct dependencies tend to revert these changes.
I realize that the answer might be to optimize the build / publish process to be less painful and manual but I was hoping others might have encountered a different solution.
FYI - All dependencies are installed with '~' as a version prefix by default.
The correct way to do this (bear in mind that this feature was introduced recently, in npm 8.x) is to use the overrides section. This new feature allows replacing a package in your dependency tree with another version, or with another package entirely.
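For example, a minimal package.json sketch pinning a transitive dependency through overrides (both package names are hypothetical):

{
  "dependencies": {
    "internal-ui-kit": "~2.3.0"
  },
  "overrides": {
    "internal-logging": "~1.4.2"
  }
}

The override can also be nested under a specific direct dependency (e.g. "internal-ui-kit": { "internal-logging": "~1.4.2" }) so that only that subtree is affected.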

How to avoid a build and deployment of dependencies which have no code changes

I'm doing a proof of concept on continuous integration and whether our development team will benefit from automated builds and automated deployments to reduce human error.
I've already come quite far in the process but have some questions on how to configure our incremental builds to avoid rebuilding of dependencies that had no code changes.
In addition I’d like our deployment tool to identify and deploy only assemblies rebuilt as a result of a code change.
We already use Microsoft products like TFS for source control, Visual Studio for development and Team Foundation Build for continuous integration builds. We're currently leaning toward InRelease for deployment as it seems to integrate well with Team Foundation Build.
But first, here is our current setup...
There are 200+ C# solution files, each containing one or more projects. It is not practical in this environment to combine these projects into fewer solutions; that is by design. Projects within a solution use project references to resolve dependencies, and file references to projects in other solutions. As far as I know, this is the approach recommended by Microsoft when dealing with a large number of projects.
We use a "branch by feature" strategy, i.e. isolated development on concurrent feature branches which are merged up to a stable Main branch when complete. When it's time for a release, a release branch is created from Main and isolated for hotfixes and deployment. The feature branches and the Main branch have a CI build triggered by code check-ins. Releases will most likely be executed manually from InRelease against a selected release branch. A release will be deployed through various environments, including INTEGRATION/TEST, UAT and ultimately to all our clients. We're still fleshing out the details of the branching strategy, but that's a question for another time.
The current problems to solve:
1. Avoid rebuilding of dependencies that have no code changes...
When we deploy new functionality or a patch to a client, we want to push the absolute minimum set of files. Our company has a very large customer base (thousands of customers), sometimes with slow internet connections, so doing a full deployment of all assemblies (200+) to every customer is not an option. I've partially solved the problem by setting up incremental builds, which correctly rebuild only changed projects as expected but also rebuild all the dependent projects even though NO CODE CHANGES were made to them. This results in both the changed assemblies and their dependents having new timestamps. If we use the change of timestamp to identify which assemblies to deploy, this would result in the deployment of functionally unchanged assemblies. The goal here is to deploy only assemblies where the code has changed and assemblies affected by breaking changes.
For example:
Solution B has a project called Project B
Solution A has a project called Project A
Project B -> Project A (i.e. Project B has a file dependency on Project A)
When a non-breaking change is made in Project A, say to the interior of a method, the expected result is: only A is built and is therefore a candidate for deployment.
When a breaking change is made in Project A that breaks Project B, the expected result is: both A and B are built and are therefore candidates for deployment.
Currently MSBuild rebuilds all dependents regardless, which is not what we want.
2. Automatically identify which assemblies should be deployed...
I have a partial solution to the problem.
When a build is performed, my build process template is configured to run an MSBuild script containing a list of solutions to build in a particular order.
This operation is performed in the build agent's workspace. Every time a new build is performed, the build process template creates a unique drop folder (named according to a set format) and copies the binaries from the build agent workspace to the drop folder. This is out-of-the-box functionality taken care of by the standard build process template. The build has been configured not to clear the build agent workspace, so the first time it runs it will build all projects within a solution, but subsequent builds will only build projects that have code changes or that depend on changed projects (incremental build?). Therefore unchanged assemblies keep their original timestamps and changed assemblies get new timestamps.
We have a tool that can do folder comparisons between drop folders and output the results to a txt file. This allows us to identify which binaries have been added/changed/removed since the last deployment. It also gives us the added benefit of comparing the list of actual artefacts to a manifest of expected artefacts as defined by the developer. This ensures that no assemblies get deployed that have not been specified and proven to be unit tested.
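As a rough sketch of that kind of comparison (the drop folder paths are made up), something along these lines in PowerShell lists the assemblies that were added or changed between two drops:

# hash every assembly in the previous and the current drop folder
$previous = Get-ChildItem '\\buildserver\drops\Build_101' -Recurse -Filter *.dll | Get-FileHash
$current  = Get-ChildItem '\\buildserver\drops\Build_102' -Recurse -Filter *.dll | Get-FileHash
# anything whose hash appears only on the current side was added or changed
Compare-Object $previous $current -Property Hash -PassThru |
    Where-Object SideIndicator -eq '=>' |
    Select-Object -ExpandProperty Path |
    Set-Content changed-assemblies.txt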
The question is: how can we leverage InRelease to deploy only the required files, as per the example above, and not all files in the drop folder?
Install a TFS Proxy in front of your build machine; this reduces network traffic.
You could start with a branching strategy like Service Pack; you can read documentation about it in the ALM Rangers guidance... Then adapt your process template to build just the part of the code that changed. I think you will find more information in BRD Lite, another guidance document by the ALM Rangers.

Testing a NuGet package

We are big users of NuGet; we've got 25-30 packages which we make available on a network share.
We'd like to be able to test new packages before they're built and released in the consuming applications. Ideally, this could be done using something similar to Maven's snapshot and having a specific development package (e.g. snapshot functionality).
Has anyone else come up with an (ideally reasonably non-hacky) way of doing this?
Our favoured method is to generate the package assemblies and then manually overwrite the assemblies in the packages/ directory, i.e. to replace the actual project references, but that doesn't seem particularly clean.
Update:
We use a CI build server which creates builds on every commit and has a specific manually triggered NuGet build which works off specifically tagged versions of the codebase. We don't want to create a NuGet build off every commit, but we would like to be able to test a likely candidate in the wild before we trigger the manual NuGet package build.
I ended up writing a unit/integration testing framework to solve a similar problem. Basically, I needed to verify the content of the package, the versions and info, what would happen when I installed and uninstalled the package, what versions the assemblies in lib were, what bitness the assemblies were built for (x86 or x64), and so on - and I needed it all to run without Visual Studio installed and on my (headless) build machine as a quality gate.
Standing on the shoulders of giants like Pester, PETools, and SharpDevelop's package management module, I put together nuget-test.
Clone the project into your package directory (where your .nuspec file and package files are). If for whatever reason you want to keep the nuget-test project as a "git" repo, then simply remove "remove-item nuget-test/.git -Recurse -Force" from the command below.
git clone https://github.com/nickfloyd/nuget-test.git; remove-item nuget-test/.git -Recurse -Force
Run Setup.ps1 in the root of the nuget-test directory in an x86 instance of PowerShell.
PS> .\setup.ps1
Write tests and place them in the nuget-test/test directory using the Pester syntax.
Run the tests.
PS> Invoke-Pester
Project page: nuget-test
On github: https://github.com/nickfloyd/nuget-test
I hope this helps you get closer to what you're trying to get done.
If you're using NuGet packages to distribute your libraries, you should not limit yourself to testing only the libraries. You should test the packages themselves as well (if your binaries are OK but incorrectly installed, consumers will still have issues). The whole point is to improve this experience.
One way could be to have an additional CI or QA repository. The one you currently have is actually your "production" repository containing consumable releases, considered finished high-quality products.
Going further, you could have a logical package promotion flow (based on Continuous Integration or even using a Continuous Delivery approach), where:
- each check-in produces a package on your CI repository
- testers pick up a CI package for QA and, if it is found OK, promote it to either a QA feed or the Production feed (whatever you prefer; it depends on the quality of your testing and how well it is automated)
There are various ways of implementing this scenario: simple network shares, internal NuGet.Server or Gallery implementations, or simply using http://myget.org to give it a try with minimal cost and zero effort.
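As an illustration of the promotion step (the feed URL and package name are made up), promoting a package just means pushing the already-built .nupkg to the next feed rather than rebuilding it:

nuget push MyLibrary.2.5.0.nupkg -Source https://nuget.example.com/qa/ -ApiKey <your-qa-feed-key>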
Hope that helps!
Cheers,
Xavier