Updating transitive dependencies of an npm package

Our company has a few web applications which in turn depend on a very long chain of internally created and hosted npm packages (we use JFrog Artifactory), each with their own dependencies (and so on). Whenever a bug is fixed or a feature is implemented in a low-level package, the current process requires a developer to check in their changes, wait for the CI/CD build to complete and the tests to run, update the parent package, and repeat all the way up the chain, which can be a very long process.
This can't possibly be a unique situation, yet it greatly impacts our productivity and encourages monolithic package development, limiting the number of packages to update instead of separating code properly.
I can only think of two solutions:
1) Update the web application to use the transitive dependency directly in package.json. This, however, breaks "encapsulation": how a direct dependency does its job shouldn't be known to the web application. If the direct dependency were to switch to some other transitive dependency later, the web application shouldn't be left referencing a now-irrelevant package.
2) Modify the web application's package-lock.json to point at the new version of the transitive dependency. This, however, seems to work only temporarily, as merge conflicts or fresh installs of direct dependencies tend to revert these changes.
I realize that the answer might be to optimize the build / publish process to be less painful and manual but I was hoping others might have encountered a different solution.
FYI - All dependencies are installed with '~' as a version prefix by default.

The correct way to do this (bear in mind that this feature was introduced in npm 8.x) is to use the overrides section of package.json. This feature allows you to replace a package in your dependency tree with another version, or with another package entirely.
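For example, assuming the low-level package is named @company/logger (a placeholder name), the web application's package.json could force the whole tree onto the fixed version:

{
    "overrides": {
        "@company/logger": "1.4.2"
    }
}

After editing the overrides section, re-run npm install so package-lock.json is regenerated to match; unlike hand-editing the lock file, this survives merges and fresh installs.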

Related

How can I achieve a "hard-pin" with NPM inside my project?

I would like to hard-pin my NPM dependencies. "Hard-pinning" would mean that an automated process checks my dependency list for certain packages at certain versions, and if a package has been locally upgraded, a custom error message is shown (ideally, this should be integrated in a pre-push Git hook).
The reasons for wanting this behavior could be:
external dependencies (e.g. other teams integrating with your project, requiring certain versions)
broken or unwanted behavior because of certain issues (e.g. "wait until #124 is fixed")
known non-obvious migration effort for major upgrades
upgrade incompatibilities (e.g. newest version requires, but does not enforce, a newer peer dependency).
Normal pinning does not cut it in this case: it's trivial to update pinned packages anyway, comments do not work in package.json without extra effort, and sometimes the reasons are too important not to be displayed explicitly.
How can I achieve such "hard-pinning"?
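One possible sketch (not a full solution): keep the pins and their reasons in a separate file and verify them from a pre-push hook. The hard-pins.json file and its layout below are made up for illustration:

#!/usr/bin/env node
// .git/hooks/pre-push (hypothetical): refuse the push when a hard-pinned
// package in package.json drifts from its pinned version.
// hard-pins.json (made-up file) maps names to { version, reason }, e.g.
// { "some-lib": { "version": "1.3.0", "reason": "wait until #124 is fixed" } }
const pkg = require('./package.json');
const pins = require('./hard-pins.json');
let failed = false;
for (const [name, pin] of Object.entries(pins)) {
    const actual = { ...pkg.dependencies, ...pkg.devDependencies }[name];
    if (actual !== pin.version) {
        console.error(`${name}: pinned to ${pin.version} (${pin.reason}) but package.json has ${actual}`);
        failed = true;
    }
}
process.exit(failed ? 1 : 0);

Because the reason lives next to the pin, the custom error message requirement falls out naturally.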

Straightforward way to use your own NPM package without the NPM registry

I want to split up the code bases of several of my projects into isolated, package-like projects. Those should be easily usable via npm, but they do not seem significant enough to be published to the global npm registry.
So, my question is whether there is a middle way between handling them as locally provided packages, installed by their path, and publishing them to the global registry.
Concerns:
cluttering the npm registry with packages which don't seem significant enough to take up the name
the need to document and create tests for each package seems to be too much, and I would not sleep well publishing packages which are not well documented and tested
I would take up a name which might be more appropriate for a more sophisticated package and its maintainers
I still want others to be able to easily try / use this package, to see if it fits their needs
Alternatives:
A) creating a private npm repository (with CouchDB?)
+ is pretty much identical to the npm repository and would be easy to use
+ the versioning is identical just pure semver lookup
- every user needs to set up this repository if they want to use this package or need it as a child dependency in their (public npm) package (even though this is unlikely)
- Need to invest time into setting it up and maintaining it
B) Using my username npm namespace
+ would solve pretty much every problem
- namespaces seem to be meant for projects and their sub-packages, which wouldn't be the case for my packages since their only connection is the creator
- it seems arrogant to prefix your packages with your name, as if you are tagging them with a big sign: THIS WAS DONE BY ME
C) Using GitHub with a special detached branch which contains the (tagged) releases
+ you could use it like the global npm repository, since npm's resolution strategy allows a repository URL with a semver range in place of the version
- special case which is bound to break
- GitHub is not meant to provide npm packages; almost no developer expects a git URL instead of a version range, and tools and firewalls might have problems with this
- the workflow is really not meant to be used this way, neither for git nor for npm
D) using local packages and installing them by their path
+ easy to setup and use
- no version management
- build steps must be done manually beforehand
- cannot publish packages that depend on those packages
- all dependencies have to be installed locally
E) making those packages more useful, implementing edge cases, writing documentation and testing the whole package
+ would resolve almost all problems
- A LOT of extra work, primarily thinking about edge cases and giving the developer a good API
- sometimes you can't really get the name you want for your package (it collides with others), which results in weird names
- it is your responsibility; you have to maintain it and be responsible (test it well, cover edge cases)
- cluttering of the npm repository
So those are all the alternatives which came to mind when I tried to find a solution. Please leave a comment / answer if you have another idea, or if you can remove / reduce the importance of those contra points.
Maybe you could include your own experience, so I get a better view for the whole problem.
Currently I would just try to make the package more helpful to the greater majority but this does not work in all cases.
Thank you all for your time!
Installing from git is a pretty standard feature in package managers. npm doesn't have GitHub-specific support; it's generic support for any git repo. Unless you can find some discussion about deprecating it from npm, I wouldn't worry about it. It's used internally in many companies for private packages.
Of course, there are still some trade-offs: build artifacts, and maybe a slightly clumsier workflow. Things like npm outdated don't understand git semver. As for build artifacts, I have seen many projects commit them to the master branch to support direct git installs. If you look around older open-source projects, for example, that's quite often the case.
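For illustration (the repository and package names are placeholders), a git dependency with a semver range looks like this in package.json; npm resolves the range against the repository's version tags:

"dependencies": {
    "my-lib": "git+https://github.com/someuser/my-lib.git#semver:^1.2.0"
}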
We went for a private repository with Verdaccio running in a Docker container, which is very similar to option A. It took some setup, but for our developers all it took was a single npm command to put the private repo "in front of" npm for all packages of the namespace we created. Granted, our packages are project-specific, but in a private repository that does not really matter either way, does it?
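That single command is presumably the scoped-registry setting; with a placeholder scope and URL it looks like:

npm config set @myorg:registry https://registry.mycompany.local/

From then on, installs of @myorg/* packages resolve against the private Verdaccio registry, while everything else still comes from the public one.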
We considered the local package option at first, but the drawbacks were just too big for us, even though it's very easy to set up.
I'm not sure this helps, but this is at least the setup we decided upon when we had the same issue a few months ago.

Missing npm dependency - how it affects on app

I am considering using Hexo (the static blog generator), which is based on npm. I wonder one thing: what if one of the npm packages (dependencies) is no longer available? Each package has its own author, who can cancel support or completely remove it from npm's registry at any time.
So what do I do if a missing npm package affects running Hexo, and consequently I'm no longer able to generate my blog in the future?
Although this can happen (and has happened at least once), it is usually not a serious problem. While you wait for somebody to fix the missing dependency (which happens quickly for popular packages like Hexo), you can use an older working version. And if you want to be 100% sure, you can commit node_modules together with your web sources (see discussion here).
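Pinning a known-good version is a one-liner (the version number below is just a placeholder):

npm install hexo@3.9.0 --save-exact

The --save-exact flag records the version without a ~ or ^ prefix, so later installs won't silently float to a newer release.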

How do you deal with people removing npm package versions?

Today I found that an npm package version, Babel 6.0.15, that my application relies on had been removed from npm.
This caused a compilation failure on a new PC, and I had to go manually find the closest available version for it, and all the cascading version changes this caused in related packages.
What is the best of way of dealing with npm packages, now that I know they can go missing at any time?
Do you check your node_modules folder into source control?
Is there a rule on npm about which versions (major, minor, etc.) may be removed by the creator, and which are more 'long-term support' and must be retained?
How do you get npm to inform you locally when 'npm update' fails on a new PC, rather than failing silently?
After thinking about this for a while I wrote a blog post summarising what I think is best practice. Reproduced below:
Summary
Specify exact versions of all npm modules, e.g. "alt": "0.17.8"
Commit your node_modules folder into source control
Don't use DefinitelyTyped or any other external library TypeScript definition tools
Why?
Some of these principles might be controversial, so here’s my reasoning:
Specify exact versions of all npm modules
Semver (semantic versioning) says that breaking changes should occur only if the major version changes. So you should be able to say just "alt": "0.17"
But I've found in practice that even patch changes (bug fixes) can break your application, because libraries that rely on these libraries often expect some tiny behaviour in a particular version not to change. So for all your particular versions of particular libraries to work together, they need to rely on exact versions of other libraries.
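One way to make exact versions the default (a convenience, not something the post prescribes) is a single .npmrc entry:

save-exact=true

With that in place, npm install writes "alt": "0.17.8" instead of "alt": "^0.17.8" into package.json.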
Commit your node_modules folder into source control
I first assumed that all versions of famous npm libraries would remain there indefinitely. But I then discovered that creators often remove old versions of their software from npm – which then breaks the cascading chain of exact version number dependencies you’ve configured for your app.
Yes, committing all your npm libraries will take up space in your repository, but they're text files after all, not DLLs, so they'll compress really well. And the alternative is one day not being able to compile your app at all on a new computer, because a library has been completely removed from npm.
Don't use DefinitelyTyped or any other external library TypeScript definition tools
It’s wonderful to be given compile errors for external tools you use. But I’ve found it’s not worth the effort because:
there's no way to match the definition file's version number to the npm library's version number, so you get definitions that are out of sync with the library you are using
they often have bugs
the type bugs you catch at compile time are probably going to occur in your own app, not in how you call external libraries
Instead of using .d.ts files for external libraries, just say:
declare module 'lodash' {
    // Treat the whole library as a single untyped value, so any usage
    // of it compiles without type checking.
    let x: any;
    export = x;
}
Or use the --allowJs flag in TypeScript 1.8 onwards.
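For reference, the equivalent tsconfig.json switch (assuming you configure via tsconfig rather than the command line) is:

{
    "compilerOptions": {
        "allowJs": true
    }
}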

Is nuget appropriate for daily development workflow?

I am looking at nuget for improving automatic handling of dependencies (both internal and third party) during development.
As long as you develop through the CI Build Server, all is good:
get latest source for A and B, where B depends on A
fix bug in A
build A
check into source control
CI Build Server initiated
new nuget package is created and placed in corporate repository
build B (which will get the updated A package)
run B to verify that the bug in A was fixed
...repeat n times
However, I'm wondering if it is possible to work locally as a single developer, without having to wait for the CI Build Server to produce a new package?
NuGet has a feature called Package Restore, which will download all dependencies automatically on build. You can also list the order of repositories that Package Restore should search for packages.
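The lookup order comes from the order of sources in nuget.config; a minimal sketch with a hypothetical local folder feed listed ahead of the corporate one:

<configuration>
    <packageSources>
        <add key="local" value="C:\dev\LocalPackages" />
        <add key="corporate" value="https://nuget.mycompany.local/nuget" />
    </packageSources>
</configuration>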
If the workflow could become:
get latest source for A and B, where B depends on A
fix bug in A
build A
(building creates a local nuget package)
run B to test the (resolved) bug in A (B should now use our local NuGet package, not the one from the corporate repository)
...repeat n times
check into source control
CI Build Server initiated
new nuget package created in corporate repository
Is this possible using Visual Studio, MSBuild, a CI Build Server and nuget? I'm especially interested in the making of local packages while developing locally.
Note that I have native projects; however, except for the generation of the NuGet package post-build, this is a workflow that I hope should work for both C# and C++ projects.
The solution I have now, though far from ideal, is the best I could figure out. Oh! And it is a work in progress, so it WILL change in the coming weeks/months as I figure out how to get around the kinks.
I mostly have to deal with managed DLLs right now, but I do have some native code and, worse, multi-platform native code to deal with eventually.
Create a local repository: basically just a folder, and configure it in your list of NuGet feeds.
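Registering the folder as a feed can also be done from the command line (the feed name and path are placeholders):

nuget sources Add -Name Local -Source C:\dev\LocalPackages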
Then I created an MSBuild task that will package the project and output it in the local repository's root folder. Make sure the version of your package is always increasing; presently I do this manually by editing the assembly version.
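A post-build step along these lines (project name and path are placeholders) is one way to produce the package and drop it into the feed folder:

nuget pack MyProject.csproj -OutputDirectory C:\dev\LocalPackages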
Once built, update your other projects that reference it. I usually do this through the Package Manager Console (update-package).
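In the Package Manager Console that looks something like this, with a placeholder package ID, restricted to the local feed:

Update-Package My.Library -Source Local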
For each project that was updated, bump up its version; rinse, lather, and repeat until you get to your top-most project (the actual program).
Once everything is nice and good and you are ready to commit, the build system should do its own packaging and send it to your official repository.
The Good
No clogging of the repository and build system with intermediary development versions; that garbage remains (as it should) local.
Local repos are super easy to set up; it can even be done without changes to VS, through the global NuGet config.
This is friendly to both paradigms of package restore and checking in packages with the project. That said, I would recommend not checking in the packages you built locally, but rather ones that went through your official repository, ideally via the build system. What's built locally should remain local.
The Bad
Still much more complicated than just adding projects to a solution.
The deeper (or wider) your dependency tree the bigger the pain.
The Ugly
Makes some native NuGet behaviors quite quirky and annoying:
The update operation takes forever if your VS is connected to a version control system (Perforce, for me). I hear they "solved" the problem; I would hate to see how it was before, if it was worse than it is now!
Having NuGet change non-code references back to "never copy" is a major pain.
If Only
Configure the desired state of a content dependency (copy always, never, or if newer) directly from the nuspec and be done with it! (Oh, and the same story with ClickOnce content status: include, exclude, etc.)
Make the update operation quick; 2 minutes for a dozen projects is just insane, especially if the ultimate goal is to manage 500+.
Perhaps a hybrid mode where locally we work with project inclusion, but the build system works with NuGet dependencies (and builds them if necessary).
If you are going to parse the project, do follow MSBuild parsing rules and honor the conditional statements.
There are still issues I have yet to figure out, like how to manage multiple branches of the code in the repository, and how to handle version conflicts further up the food chain; in a large project, conflicts are expected (ultimately we have to bring 500+ separate projects together in a single application executable).
I would love to bring all the goodness of sane dependency management à la Maven, but thus far I have not found NuGet to be mature enough to even think of proposing it to the dev team.
Certainly. In our solutions, NuGet parks the libraries in the "packages" directory of the solution's hierarchy, which is ultimately kept in TFS. This allows for complete solution check-outs that include the required libraries. If it's your intention to update the libraries normally provided by NuGet, you'll need to update the dependent projects' references to point to the project containing the updated code normally provided by the NuGet process.
Prior to checking in your regular solution work (not the NuGet-related libs), make sure the solution's NuGet libs are up to date and that the references in the solution point back to the NuGet-installed libs. Of course, you'll check in and fetch the NuGet-related libs beforehand.