How to set npm not to install packages that have already been installed globally? - npm

My project references mocha, phantomjs, etc., which take a lot of time to download during npm install. This is not a problem on my local machine because I only download them once and can use them indefinitely unless I decide to upgrade them manually.
However, on my CI machine, my Jenkins server needs to download them every time I do a git commit and git push, in order to run the tests and deploy.
So can I speed up that process by setting npm not to download these slow packages from the remote registry, and instead install them from a local cache, or skip them entirely if they are already installed globally?
Does anyone know how to configure that?
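For what it's worth, npm does have a flag that prefers the local cache over the network, which addresses part of this:

npm install --prefer-offline   # use cached package data when available; only hit the network for packages missing from the cache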

I found some packages that might be helpful:
npm-install-changed will run npm install only if the contents of package.json's devDependencies and dependencies have changed. Note that it assumes node_modules persists across builds, which might not help if your CI server always starts from scratch.
npm-install-cache runs npm install and then copies your current node_modules folder (to somewhere in /tmp). If you call the script again, it checks whether package.json as a whole has changed (rather than just devDependencies or dependencies); if it hasn't, it restores the node_modules folder stored in /tmp. The only limitations I see are that it's not cross-platform and that the cache lives in /tmp, which is erased on reboot (or maybe even when a process finishes!).
The second package might not work as it is but it seems like a good place to start :)
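To make the idea concrete, here is a rough sketch of that caching approach in shell (the hash file and cache path are illustrative, not what npm-install-cache actually uses):

#!/bin/sh
# Cache node_modules keyed on a hash of package.json; reuse it when unchanged.
HASH=$(md5sum package.json | cut -d' ' -f1)
CACHE="/tmp/node_modules_cache_$HASH"
if [ -d "$CACHE" ]; then
  cp -R "$CACHE" node_modules    # package.json unchanged: restore cached folder
else
  npm install                    # changed (or first run): do a real install
  cp -R node_modules "$CACHE"    # cache the result for the next build
fi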

You can specify all of the packages you want to use locally in devDependencies in package.json, and then run npm install -d to install those instead of the main dependencies.
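For context, packages end up in devDependencies when you install them with the --save-dev flag, e.g.:

npm install --save-dev mocha phantomjs   # recorded under devDependencies in package.json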

Related

Is there a way to only run npm scripts for installed packages (opposite of --ignore-scripts)

I am in a situation where I need to ship node_modules with the rest of my code because the destination machines do not have access to our private network (and our private npm repository).
My problem is that I want to execute everything that happens after npm downloads all the files so that individual packages can build themselves correctly for the target machine. Is there a way to accomplish this? Here are a couple other ways to phrase this question:
How can I run npm install, but skip the download step?
How can I run postinstall for installed node_modules only?
I finally figured it out. There were a couple of important steps to make this happen:
When we get ready to package our code for distribution, we download all of the npm dependencies with the --ignore-scripts and --no-bin-links options. This prevents any packages from building/compiling or linking any bin files; effectively, it only downloads node_modules:
npm install --omit=dev --ignore-scripts --no-bin-links
We then distribute our code to the target machine and run the following command so that any compilations and bin links happen on the target machine:
npm rebuild
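Put together, the whole flow looks roughly like this (the archive name and install path below are hypothetical, just to make the two phases concrete):

# Build machine: download dependencies only; no compiles, no bin links.
npm install --omit=dev --ignore-scripts --no-bin-links
tar -czf app.tar.gz .            # ship the code together with node_modules

# Target machine: unpack, then build and link everything locally.
tar -xzf app.tar.gz -C /opt/app
cd /opt/app
npm rebuild                      # runs package builds and creates bin links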

How can I prevent npm from deleting locally installed modules from node_modules

I have some modules which are developed in-house, and I copy them into my node_modules folder manually.
When I do this they work fine, but after I install something else via ng add or npm install, the folder is removed. How can I prevent this from happening so I don't have to copy the files again?
You need to specify your dependencies in package.json, or else you cannot rely on them being in node_modules. Various npm commands might remove them, most notably npm ci, but others as well.
If your package is not publicly published, some options are:
Use a non-public registry and publish it there.
Publish it as a scoped package with limited visibility. You will need a paid or organization account on npm for this. Individual accounts are US$7 a month.
Use npm link to "install" it from your local file system.
Use a postinstall or other life cycle script to have npm copy in your packages for you each time after npm ci or npm install is run (see the sketch after this list).
There are likely other options, but those are the ones that come to mind immediately.
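For the life cycle script option, a minimal sketch, assuming your in-house module is checked into the repo under a vendor/ directory (both the path and the module name here are hypothetical):

# Register a postinstall script (npm pkg set requires npm 7.24+):
npm pkg set scripts.postinstall="cp -R vendor/my-inhouse-module node_modules/"

# Every npm install or npm ci now ends by copying the module back into
# node_modules, so subsequent installs can no longer wipe it out.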

Avoid 'npm install'?

Context
I have a .NET project, and we use some npm packages for the UI. I use a pre-build check: if the folder node_modules does not exist, run npm install.
When someone commits an update to package.json, npm install is not triggered for other people who pull the repository, because node_modules already exists on their machines. This leads to errors like Can't resolve....
Question(s)
Is the folder check obsolete? Is npm install smart enough to download only the necessary things rather than all the dependencies? Or do I need a hash check on package.json?
There's no harm in running npm install multiple times, as it simply does nothing when there is nothing to do.
You don't need to check for the node_modules folder. npm will download and update any dependencies that are missing.
It's also important to run npm install since other machines may be running different systems and the dependencies may need to be compiled differently.
You can hash check package.json and/or package-lock.json for caching purposes, but it's not really necessary.
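If you do want the hash check for caching, a minimal sketch (the .npm-install-hash marker file is a name made up for this example):

# Re-run npm install only when package-lock.json has actually changed.
HASH=$(sha256sum package-lock.json | cut -d' ' -f1)
if [ ! -f .npm-install-hash ] || [ "$(cat .npm-install-hash)" != "$HASH" ]; then
  npm install
  echo "$HASH" > .npm-install-hash   # remember what we last installed against
fi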

Check if npm dependencies from package.json are installed without network

How can I have npm tell me if all dependencies are already installed without checking the network?
My goal is to first test locally if anything needs to be installed, and then only if something is missing, I'll run a normal npm install to install it. I'm trying to avoid the initial check across the network though if everything is already there.
This assumes a package.json file with fixed versions, since allowing auto-upgraded packages would obviously always require a remote registry check.
Update:
I've tested npm list, which doesn't seem to access the network, and it prints out "UNMET DEPENDENCY" if something is in package.json but not installed. Is that the best way to accomplish this?
I'll probably end up with something like:
npm list | grep -c 'UNMET DEPENDENCY'
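Fleshed out into a guard script, that could look like the sketch below; it relies on npm ls exiting non-zero when the local tree has problems, and npm ls only inspects node_modules on disk:

# Install only when npm's local dependency check reports a problem.
# Note: extraneous packages also make npm ls exit non-zero.
if ! npm ls >/dev/null 2>&1; then
  npm install
fi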
I'm not aware of anything that will explicitly tell you whether dependencies were installed from a remote repo or not. However, I think the shrinkpack package will help you achieve your aim.
Shrinkpack will cache your npm modules locally and only contact a remote repository when existing modules change or new modules are added.
I've used this in the past to reduce the number of network requests required for an npm install.
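If I remember the workflow correctly (check shrinkpack's README for the exact, current commands), it was roughly:

npm install -g shrinkpack   # one-time global install
npm shrinkwrap              # pin the exact dependency tree
shrinkpack .                # store dependency tarballs inside the repo

After that, npm install resolves those tarballs from disk instead of the network.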

Speeding up NPM package install

When I deploy my app to AWS, it's copied into a new directory, so npm installs all the same packages during each deploy, which can take a lot of time. Most of these packages haven't changed between builds (if at all), so having it do a full npm install seems like a waste.
My app server runs a bunch of different Node apps, so installing globally isn't an option. Instead, I'd like to have the app store its node packages in a location that isn't wiped out during deployment, while keeping the option to update packages as necessary during npm install.
Does npm have a concept of an app-specific module directory that isn't located in a subfolder of the app? That way I could delete the app folder without having to reinstall the same packages over and over again.
I could achieve this with symlinks, or by migrating the current node_modules directory.
If you lock down your dependency versions, npm is likely to serve the packages from its cache, so the installation shouldn't take long.
If you prefer not to do this, you can install dependencies globally and link them with the npm link command (which basically creates a symlink for you!). Then it'll be up to you to update the globally installed packages regularly.
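For the symlink idea from the question, a minimal sketch, assuming a persistent directory such as /var/cache/myapp_modules (a hypothetical path) that survives deployments:

# Keep the real modules outside the app folder and point node_modules at them.
mkdir -p /var/cache/myapp_modules
ln -sfn /var/cache/myapp_modules node_modules   # recreate the link after each deploy
npm install   # installs/updates into the persistent directory through the link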