Does the JavaScript package manager have an impact on build time?

There are now quite a few alternative package managers to npm, e.g. pnpm, yarn classic, and yarn berry.
pnpm and yarn berry try to convince users to adopt their tools by showing run-time differences for materializing node_modules, changing packages, and so on. However, I wonder whether the package manager (or, to be precise, the node_modules layout) also has an impact on build times.
Are there any numbers available?

Related

Can I safely migrate to pnpm from yarn?

I am currently staying at a location where internet and disk space are at a premium, and yarn/npm having to install modules from scratch every single time isn't the most efficient use of either my disk space or my internet data, which brings me to my question.
I recently came across pnpm, and it solves my problem perfectly (it installs modules in a central location and symlinks them into your projects). My question is this: if I completely migrate to pnpm, will that affect the project setup if I am working with someone who uses yarn/npm? And if I publish a project, will its users be forced to use pnpm, or can they use any package manager?
It is recommended to pick one package manager for a project and force its usage by everyone. pnpm, yarn, npm all have their own lockfiles.
If it is a small project and you don't commit the lockfile, then you may use different package managers. But I still don't recommend this.
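One possible way to enforce that choice, sketched here with pnpm as the chosen manager (this assumes npm >= 7.24 for npm pkg set; only-allow is the pnpm team's helper package, and the pnpm version shown is just a placeholder):

npm pkg set scripts.preinstall="npx only-allow pnpm"   # makes installs with npm or yarn fail fast
npm pkg set packageManager="pnpm@9.0.0"                # honored by Corepack, if it is enabled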

What is the prescribed way to maintain multiple VueJS, @vue/cli-dependent applications over time?

Global NPM dependencies make me a bit nervous. I've developed several applications that need to be stable and rebuildable for years. Tools like @vue/cli are great initially, but I'm concerned that as time goes by, the tool will be updated while the products I've built will still need the earlier versions. So, ideally, all of their dependencies would be specified and included within the project folders. @vue/cli is designed to be installed globally; it then generates package.json files specifying a set of additional @vue/cli dev dependencies. I tried installing @vue/cli locally, but that only works if it's installed within a parent, container directory, which doesn't work well with monorepo setups. (Lerna will hoist and link common dependencies, and @vue/cli can no longer find its dependencies in the application directories it creates, where it expects them.)
Is there a recommended way to install @vue/cli locally so that I can maintain a set of applications over a long period of time, each of which may have been initialized with a different version of @vue/cli?
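For illustration only, the kind of pinned, per-project setup the question is after might look roughly like this (the version and the build script are placeholders, not a recommendation from the thread):

npm install --save-dev @vue/cli-service@4.5.19   # the per-project build tool that generated projects depend on
npx vue-cli-service build                        # resolves the locally installed copy from node_modules/.bin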

Group project uses both NPM + Yarn. How to transition to use only one?

As the title indicates, I'm working on a project where different members have used different tools (NPM and Yarn) for handling packages, modules, etc.
We aim to transition to use ONLY Yarn (not our decision). Would anyone be able to share resources detailing how to accomplish such a thing? Or help quickly walk me through the steps?
I tried googling for answers but every single result is yet another article explaining why you should ditch NPM/Yarn and move your project to Yarn/NPM, without explaining the steps one would need to take to move from using both to just one mid-project. Thanks!
It looks like Yarn has a page talking about how to migrate to it from NPM:
https://yarnpkg.com/lang/en/docs/migrating-from-npm/
In most cases, running yarn or yarn add for the first time will just work. In some cases, the information in a package.json file is not explicit enough to eliminate dependencies, and the deterministic way that Yarn chooses dependencies will run into dependency conflicts. This is especially likely to happen in larger projects where sometimes npm install does not work and developers are frequently removing node_modules and rebuilding from scratch. If this happens, try using npm to make the versions of dependencies more explicit, before converting to Yarn.
As of Yarn 1.7.0, you can import your package-lock.json state, generated by npm to Yarn, by using yarn import.
They use many of the same files and structures. The important thing is to check in the yarn.lock file and make sure everyone installs using Yarn instead of NPM.
If you have a build server, you could probably use it to enforce those dependencies, but it would be more work.
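A rough sketch of the migration steps described above, assuming Yarn 1.x and an existing package-lock.json:

yarn import                       # generate yarn.lock from package-lock.json (Yarn >= 1.7.0)
git rm package-lock.json          # drop npm's lockfile so only yarn.lock is authoritative
git add yarn.lock
yarn install --frozen-lockfile    # sanity check that the lockfile resolves on its own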

Dependency resolution approach - comparing NPM to Homebrew?

I recently got confused and almost installed a tool via brew install when in fact it was an npm package and all I needed to do was npm install -g.
So these tools are strangely similar yet obviously different.
What's the difference, in crystal-clear terms?
NPM exists to resolve dependencies for application code, on a per-app basis, allowing an app to be self-contained and portable. This means that (in its default mode of operation) it will install the same stuff many times, uniquely, repeatedly, and separately, for every app on your system that needs the same package, inside that app's own directory and isolated from everything else.
Homebrew is not like this. The reason is that it serves the system itself, not individual apps, so it is more comparable to just the npm -g part of npm.
There is one extra bit to understand about Homebrew, though: some system packages have specific dependencies and can even conflict with each other. This means that for the global installs Homebrew provides, it also has to solve some nesting and conflict issues, which is its own kind of magic.
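To make the per-app versus system-wide distinction concrete, a small illustration (the package names are arbitrary examples):

cd my-app && npm install lodash   # a private copy lands in this app's ./node_modules only
npm install -g typescript         # one copy installed system-wide and put on your PATH
brew install jq                   # likewise system-wide, managed by Homebrew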

NPM Best Practices for Continuous Integration

I am building an HTML5 front-end using NPM-based tools (grunt).
One of the first steps of my continuous integration build process is to run an npm install.
npm install is SLOW. Even with a local NPM proxy caching artifacts (Sonatype's Nexus 3), it is still taking 4 minutes!
$> time npm install
real 4m17.427s
user 0m0.170s
sys 0m0.290s
If I follow my usual best practices for continuous integration, I would start from a pristine SCM repository and run the build. This means that each time the CI build will have to do a fresh npm install and take on the cost of 4 minutes.
This is a significant proportion of my build time. I am discontent that the build is taking so long.
The alternative seems to be to keep the node_modules around between builds. However, I've had problems with the build becoming unstable as a result.
Removing dependencies from package.json does not remove them from node_modules with a simple npm install. I can work around this with an npm prune first.
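For concreteness, that work-around amounts to:

npm prune     # remove packages no longer listed in package.json
npm install   # then install anything newly added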
What is considered to be best practice here?
Since March 5, 2018 and npm 5.7.1, you can use npm ci. This is much faster than a regular npm install because it omits some user-friendly features and installs packages for a userless CI environment.
The caveat here is that you'll need to make sure your package.json and package-lock.json files are in sync. If you install a new package, commit package.json but forget to do the same for package-lock.json, you'll get an error when running npm ci.
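A minimal CI install step using this approach might look as follows (the build script name is just an example):

npm ci          # reproducible install, strictly from package-lock.json; removes any existing node_modules first
npm run build   # then run the actual build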
Considering that in order to build you must install new packages, you have no choice but to call install. As for pristine, I strongly believe they refer to the "build" process and not the "dependency management" process.
Why are they different? Let's go through an example to make it more apparent.
As a developer, when you first start your job, you MUST "install" the software that enables you to code. This is usually done once. Afterwards, you can start coding. The latter is the "build" part, as you are generating value with each feature your code produces. From time to time, you update your tool list by removing, adding, or updating a tool.
In this example, installing your tools every day when you arrive at work, before you start coding, would be hell.
I would suggest making sure that the build process, meaning producing an artifact (like a JAR, for example), is decoupled from the dependency installation process: installation is done once, and building can then proceed without trouble. You don't mention what will be built, but grunt can surely take care of the rest.
Hence, I believe pruning and installing is a good strategy. You shouldn't worry about the first few runs; think of it as a cold start. Any system implemented as sub-components working together in a pipeline has this "issue". Take a car, for example: it will not be as fuel-efficient when you first start it as it is after you have been driving for an hour.
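One way to decouple the two steps, sketched under the assumption that the CI host has a persistent cache directory (the path below is illustrative), is to key a cached node_modules on the lockfile hash:

HASH=$(sha256sum package-lock.json | cut -c1-16)
CACHE=/var/ci-cache/node_modules-$HASH.tar.gz
if [ -f "$CACHE" ]; then
  tar -xzf "$CACHE"                          # reuse the install for this exact lockfile
else
  npm ci && tar -czf "$CACHE" node_modules   # cold start: install once, then cache
fi
npm run build                                # the build step no longer pays the install cost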
Schedule a daily job to build a docker container with your dependencies. Run your CI job in the latest container. Artifact the CI job's build.
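A sketch of that idea, assuming Docker is available on the CI host (the image tag and file names below are made up):

# daily job: bake the dependencies into an image
cat > Dockerfile.deps <<'EOF'
FROM node:20
WORKDIR /deps
COPY package.json package-lock.json ./
RUN npm ci
EOF
docker build -t ci-deps:daily -f Dockerfile.deps .

# per-build CI job: reuse the prebuilt node_modules instead of a fresh install
docker run --rm -v "$PWD":/src -w /src ci-deps:daily \
  sh -c "cp -r /deps/node_modules . && npm run build"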
You should install npm packages offline, from your local machine or local network; you can find some tips here => Offline installation of npm packages
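As a sketch of what an offline-friendly install can look like (the cache path is illustrative):

npm ci --cache /var/ci-cache/npm --prefer-offline   # hit the network only for packages missing from the cache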
Have you considered using npm link or even symlinking your entire node_modules folder?
At least npm link could be used for your dev dependencies, which you normally want a controlled version of on the server anyway. This should speed things up a bit.
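For reference, the npm link workflow this alludes to looks roughly like the following (the package and path names are placeholders):

cd ~/toolcache/grunt && npm link          # register a locally kept copy as a global symlink
cd ~/projects/my-app && npm link grunt    # symlink it into this project's node_modules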