Missing npm dependency - how it affects the app - npm

I am considering using Hexo (the static blog generator), which is based on npm. I wonder about one thing: what if one of the npm packages (dependencies) is no longer available? Each package has its own author, who can drop support or completely remove it from npm's registry at any time.
So what do I do if one missing npm package prevents Hexo from running, and consequently I can't generate my blog in the future?

Although this can happen (and it has happened at least once), it is usually not a serious problem. While you wait for somebody to fix the missing dependency (this happens quickly for popular packages like Hexo), you can use an older working version. And if you want to be 100% sure, you can commit node_modules together with your web sources (see the discussion here).
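A minimal sketch of pinning an older, known-good release (the version number below is only an example; use whichever release actually works for you):
npm install hexo@3.2.0 --save-exact   # 3.2.0 is just a placeholder version
The --save-exact flag records the exact version in package.json instead of a ^ range, so a later npm install will not silently pull in a newer release.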

Related

Straightforward way to use your own NPM package without the NPM registry

I want to split up the code base of several of my projects into isolated, package-like projects. Those should be easily usable via npm, but they do not seem significant enough to be published to the global npm registry.
So my question is whether there is a middle way between handling them as locally provided packages, installed via their path, and publishing them to the global registry.
Concerns:
cluttering the npm registry with packages which don't seem significant enough to take up the name
the need to document and write tests for each package seems like too much, and I would not sleep well publishing packages which are not well documented and tested
I would take up a name which might be more appropriately used by a more sophisticated package and its maintainers
I still want others to be able to easily try / use such a package, to see if it fits their needs
Alternatives:
A) creating a private npm repository (with CouchDB?)
+ is pretty much identical to the npm repository and would be easy to use
+ the versioning is identical, just a pure semver lookup
- every user needs to set up this repository if they want to use this package or need it as a child dependency in their (public npm) package (even though this is unlikely)
- Need to invest time into setting it up and maintaining it
B) Using my username as an npm namespace (scope)
+ would solve pretty much every problem
- namespaces seem to be meant for a project and its sub-packages, which wouldn't be the case for my packages, since their only connection is the creator
- it seems arrogant to prefix your packages with your name, like you are tagging them with a big sign saying THIS WAS DONE BY ME
C) Using GitHub with a special detached branch which contains the (tagged) releases
+ you could use it like the global npm registry, since npm's resolution strategy allows a repository URL with a semver range in place of the version
- special case which is bound to break
- GitHub is not meant to provide npm packages; almost no developer expects a git URL instead of a version range, and tools and firewalls might have problems with this
- the workflow is really not meant to be used this way, neither for git nor for npm
D) using a local package and install package by its path
+ easy to set up and use
- no version management
- build steps must be done manually beforehand
- cannot publish packages that depend on those packages
- all dependencies have to be installed locally
E) making those packages more useful, implementing edge cases, writing documentation and testing the whole package
+ would resolve about all problems
- A LOT of extra work, primarily thinking about edge cases and giving the developer a good API
- sometimes you can't really get the name you want for your package (it collides with others), which results in weird names
- it is your responsibility; you have to maintain it and be accountable for it (test it well, cover edge cases)
- cluttering of the npm repository
So those are all the alternatives which came to mind when I tried to find a solution. Please leave a comment / answer if you have another idea, or if you can remove / reduce the weight of those contra points.
Maybe you could include your own experience, so I get a better view of the whole problem.
Currently I would just try to make each package more helpful to the greater majority, but this does not work in all cases.
Thank you all for your time!
Installing from git is a pretty standard feature in package managers. npm doesn't have GitHub-specific support; it has generic support for any git repo. Unless you can find some discussion about deprecating it from npm, I wouldn't worry about it. It's used internally in many companies for private packages.
Of course, there are still some trade-offs: build artifacts, and maybe a slightly clumsier workflow. Things like npm outdated don't understand git semver. For build artifacts, I have seen many projects commit them to the master branch to support direct git installs. If you look around older open source projects, for example, that's quite often the case.
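For illustration, a package.json dependency pointing at a git host looks roughly like this (the package and repository names here are made up):
{
  "dependencies": {
    "my-shared-utils": "github:your-user/my-shared-utils#semver:^1.2.0"
  }
}
With the #semver: suffix, npm resolves the range against the repository's version tags, so it behaves much like a registry lookup.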
We went for a private registry with verdaccio running in a Docker container, which is very similar to option A. It took some setup, but for our developers all it took was a single npm command to put the private registry "in front of" npm for all packages of the scope we created. Granted, our packages are project specific, but in a private registry that does not really matter either way, does it?
We considered the local package option at first, but the drawbacks were just too big for us, even though it's very easy to set up.
I'm not sure this helps, but this is at least the setup we decided upon when we had the same issue a few months ago.
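For reference, that one-off setup is roughly the following (the scope name is ours to choose; 4873 is verdaccio's default port, so adjust the URL to wherever the container actually runs):
npm config set @mycompany:registry http://localhost:4873/
After that, anything under @mycompany/* is fetched from (and, with the right credentials, published to) the private registry, while unscoped packages still come from the public npm registry.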

Updating transitive dependencies of an npm package

Our company has a few web applications which in turn depend on a very long chain of internally created and hosted npm packages (we use JFrog Artifactory), each with its own dependencies (and so on). Whenever a bug is fixed or a feature is implemented in a low-level package, the current process requires a developer to check in their changes, wait for the CI/CD build to complete and the tests to run, update the parent package, and rinse / repeat all the way up the chain (which can be a very long process).
This can't possibly be a unique situation, yet it impacts our productivity greatly and encourages monolithic package development (to limit the number of packages to update) instead of proper code separation.
I can only think of two solutions:
1) Update the web application to use the transitive dependency directly in package.json. This, however, breaks "encapsulation", because how a direct dependency manages its job shouldn't be known to the web application. If the direct dependency were to switch to some other transitive dependency later, the web application shouldn't be left referencing a now-irrelevant package.
2) Modify the web application's package-lock.json to point at the new version of the transitive dependency. This, however, seems to work only temporarily, as merge conflicts or fresh installs of direct dependencies tend to revert these changes.
I realize that the answer might be to optimize the build / publish process to be less painful and manual but I was hoping others might have encountered a different solution.
FYI - All dependencies are installed with '~' as a version prefix by default.
The correct way to do this (bear in mind that this feature was introduced relatively recently, in npm 8.x) is to use the overrides section of package.json. This feature allows you to replace a package in your dependency tree with another version, or with another package entirely.
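A minimal sketch of what that looks like in the web application's package.json (the package name and version are placeholders):
{
  "overrides": {
    "our-low-level-package": "1.4.2"
  }
}
Overrides can also be nested under a specific direct dependency if you only want to force the new transitive version for that one parent; after editing, run npm install again so that package-lock.json gets updated.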

How to make npm use the lowest version that matches all requirements

We're using NodeJS for some projects and are faced with an issue that must have a simple solution (seeing as nobody else seems to have the problem).
In the package.json there are a bunch of dependencies specified with a minimum version, each of which may have overlapping dependencies of its own. The default way a dependency is added uses the ^ operator, which seems to mean 'compatible with', or 'same major version, but minor versions may differ'.
The way I understand npm to work is that on npm install it takes the highest matching version available. Unfortunately 'compatible with' is not as well enforced as you'd hope.
The situation this puts us in is that, for instance, on a developer machine version 1.1.0 is installed, but between development and publishing a new version 1.2.0, which has a bug, is released. On our build machine a fresh build is made which ends up using 1.2.0, and we've introduced a bug that wasn't there in development.
We tried changing the ^ operator to =, for instance, but this gives us trouble when dependencies have subdependencies that aren't compatible with the requested version.
All in all I'm a bit confused, but this keeps biting us any time something changes, since the development machines don't do anything on npm install if the package is already there, while the build machine always gets fresh copies.
I know from NuGet that it always takes the lowest version that matches all combined requirements. Since this is always the same for a given set of dependencies, I much prefer this approach. Is there a way to make npm work like this too?
To answer my own question:
npm has introduced a new command, npm ci, which does something similar to npm install but enforces that the exact versions recorded in the package-lock file (written when a package was first added) are installed.
See https://docs.npmjs.com/cli/ci for more information
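In practice the fix on the build machine is a one-line change (assuming package-lock.json is committed to source control):
npm ci
npm ci removes any existing node_modules, installs exactly what the lock file specifies, and fails outright if package.json and package-lock.json disagree, so a fresh build can never quietly drift to a newer release.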

Can yarn be considered a viable option as a replacement for bower and npm?

I should clarify that I'm not that experienced with front-end tools, so I'm sorry in advance if I'm asking something obvious and stupid.
So far I've been using bower for the front end and npm for the server side. Although each of the tools mentioned has its advantages, namely bower's flat dependency management (which reduces the payload sent to the client) and npm's nested dependency management (which helps with versioning), it has become quite cumbersome to use so many tools (webpack, browserify, etc.). I may have been using these tools the wrong way and could have used either of them with some option not known to me; I've only been scratching the surface. I just took this answer as my rule of thumb and have been following it ever since I read it. It would be great if I could reduce at least these two tools to one.
Lately I've become curious about yarn, and with all the hype around it, it seems as if it has been doing a good job and might replace npm completely. Reading the docs I discovered the --flat option, and that made me wonder whether it would be possible to use yarn as a bower replacement as well. If so, does that mean I could have either a flat or a nested dependency manager (by just having separate JSON files for the back end and front end)?
I would really appreciate if someone could point me in the right direction!
It depends on your exact use case, but... probably.
Currently, the major trend seems to be towards module bundlers such as Webpack and Browserify (and hence either npm or Yarn) and away from Bower. You can read an excellent overview of the situation at NPM vs. Bower vs. Browserify vs. Gulp vs. Grunt vs. Webpack, along with some reasons why you might want Webpack instead of Bower.
At the minute, you're probably using HTTP/1.x, where it works out faster to have one bundled JavaScript file rather than lots of separate source files (as you would have with Bower). That's why Webpack and Browserify are so popular (among other reasons): they should increase performance and simplify development a lot.
Side note: HTTP/2 will diminish the value of module bundling, because multiple requests will become far less costly. See What is the value of using Webpack with HTTP/2 for a more detailed description of the issues involving HTTP/2.
If you use npm or Yarn, it shouldn't really matter whether the dependencies are nested: your frontend dependencies will all be bundled anyway with Webpack/Browserify, so the main cost of nested packages is that they take up more disk space and download time.
Since npm v3 and Yarn can do flat installs, there shouldn't be any issues with that anyway. In short: you probably can do it, and many other people are doing exactly that.
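If you specifically want bower-style flat resolution, the flag from the question is used like this (Yarn 1.x only; as far as I know it prompts you to pick a single version whenever two dependencies disagree, and records that choice in a resolutions field):
yarn install --flat
For most setups, though, the bundler makes this unnecessary, as described above.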
Lately there has been an upward trend in the popularity of Yarn, mainly due to a couple of things it does differently from npm.
One, it's 100% deterministic: if you run yarn from any state, at any time, a thousand times, it will work the same way every time. npm's installs are nondeterministic; if you run it from different states, it will install in different ways.
Yarn also does better caching. In fact, it does it so well that you'll see a significant reduction in your install times; for a big application you can see a 10x reduction.
Yarn also locks down your dependencies by default. It's possible to do this with the npm shrinkwrap command, but if you've ever had to maintain one of those, it can be messy.

How do you deal with people removing npm package versions?

Today I found that an npm package version, Babel 6.0.15, that my application relies on had been removed from npm.
This caused a compilation failure on a new PC, and I had to go and manually find the closest available version for it, along with all the cascading version changes this forced on related packages.
What is the best of way of dealing with npm packages, now that I know they can go missing at any time?
Do you check your node_modules folder into source control?
Is there a rule on npm about which versions (major, minor, etc.) may be removed by the creator, and which are more 'long-term support' and must be retained?
How do you get npm to inform you locally when 'npm update' fails on a new PC, rather than failing silently?
After thinking about this for a while I wrote a blog post summarising what I think is best practice. Reproduced below:
Summary
Specify exact versions of all npm modules, e.g. "alt": "0.17.8"
Commit your node_modules folder into source control
Don’t use DefinitelyTyped or any other external library Typescript definition tools
Why?
Some of these principles might be controversial, so here’s my reasoning:
Specify exact versions of all npm modules
Semver (semantic versioning) says that breaking changes should occur only when the major version changes. So you should be able to say just "alt": "0.17".
But I've found in practice that even patch changes (bugfixes) can break your application, because libraries that rely on these libraries often expect some tiny behaviour in a particular version not to change. So in order for all your particular versions of particular libraries to work together, they need to rely on exact versions of other libraries.
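One way to make exact versions the default, so you don't have to remember a flag on every install (assuming a reasonably recent npm; the package name below is just a stand-in), is:
npm config set save-exact true
With that set, npm install somelib --save writes "somelib": "0.17.8" into package.json instead of "somelib": "^0.17.8".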
Commit your node_modules folder into source control
I first assumed that all versions of well-known npm libraries would remain available indefinitely. But I then discovered that creators often remove old versions of their software from npm, which then breaks the cascading chain of exact version number dependencies you've configured for your app.
Yes, committing all your npm libraries will take up space in your repository, but they're text files after all, not .DLLs, so they'll compress really well. And the alternative is one day not being able to compile your app at all on a new computer because a library has been completely removed from npm.
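Mechanically this is just a .gitignore change plus one commit, roughly as follows (assuming node_modules was previously ignored, as it is in most templates):
# remove the node_modules entry from .gitignore, then:
git add .gitignore node_modules
git commit -m "Commit npm dependencies"
From then on a fresh clone can build without having to fetch these packages from the npm registry at all.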
Don’t use DefinitelyTyped or any other external library Typescript definition tools
It's wonderful to be given compile errors for the external tools you use. But I've found it's not worth the effort, because:
there's no way to match the definition file version number to the npm library version number, so you get definitions that are out of sync with the library you are using
they often have bugs
the type bugs you catch at compile time are probably going to occur in your own app, not in how you call external libraries
Instead of using .d.ts files for external libraries, just say:
declare module 'lodash' {
    // Treat the whole module as `any` so the compiler accepts any usage of it.
    let x: any;
    export = x;
}
Or use the --allowJs flag in TypeScript 1.8 onwards.
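If you prefer configuring this in tsconfig.json rather than on the command line, the equivalent setting (TypeScript 1.8 onwards) is:
{
  "compilerOptions": {
    "allowJs": true
  }
}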