NPM global install still creating a node_modules directory

On Ubuntu, I have a directory with my code, /code, which has my package.json and such. My npm prefix, per npm prefix -g, is /usr. I am trying to install packages globally so as not to create a node_modules directory in /code, but really, as long as node_modules is not in /code, that will also work.
While in /code, I run npm install -g. I see a .staging directory created under my global prefix (e.g., /usr/lib/node_modules/.staging), but then npm takes another step and begins creating a /code/node_modules directory.
If I instead just run npm install -g <package>, it correctly ends up in /usr/lib/node_modules/<package>. But I do not want to have to run npm install -g <package> for every package I need.
Why is npm doing that, and how can I make it stop? At the very least, I would like the node_modules directory NOT to be in my /code directory (this is due to some environment constraints I can't change).

Globally installed packages are (almost always) not going to be used by your code. That's not how node or npm (conventionally) works. There isn't a shared global node_modules that programs all look in. (OK, there kinda is, but you don't want to use it if you can avoid it. Read on.) Having a local node_modules helps you avoid dependency hell. Global modules are for CLI tools that end up in your PATH and things like that.
You don't explain why you don't want a local node_modules for your project, so I will mention that if it is simply because it somehow bothers you, then please please please just get over it. You will be vastly happier. Type npm install and move on.
That said, for legacy reasons, your code in /code will look in /node_modules if there is no /code/node_modules. This is true for CommonJS modules that use require(). If you are using ESM modules via import, you may need to read the docs.
So, if you are using CommonJS/require(), you could have a directory structure like /project-name/code. You could have your index.js or whatever other code in the code subdirectory, while your package.json could be in project-name. Run npm install from /project-name and it will create a node_modules directory there rather than in the code subdirectory, and your Node.js files in code will find those modules.
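To make that layout concrete, here is a sketch (the file and package names are just placeholders):

    project-name/
    ├── package.json        <- run npm install from here
    ├── node_modules/       <- npm install creates this here
    └── code/
        └── index.js        <- require('some-pkg') resolves via ../node_modules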
If you don't like that, then for other options, you can review the relevant docs. At the time of this writing, those options include a NODE_PATH environment variable, $HOME/.node_modules, $HOME/.node_libraries, and $PREFIX/lib/node. However, it all comes with this caveat that I implore you to abide by if you can:
It is strongly encouraged to place dependencies in the local node_modules folder. These will be loaded faster, and more reliably.
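If you decide to go the NODE_PATH route anyway, here is a minimal sketch, assuming a CommonJS entry point at /code/index.js and that you are free to create a /deps directory (both names are hypothetical):

    # Install the dependencies somewhere outside /code.
    mkdir -p /deps
    cp /code/package.json /deps/
    cd /deps && npm install              # creates /deps/node_modules

    # Tell Node's CommonJS resolver to also look there.
    NODE_PATH=/deps/node_modules node /code/index.js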

Related

Automatically downloading npm packages listed in package.json file

I'm working on creating a local repository that will contain all the packages I use in my project, so I can have those packages installed on a machine that does not have access to the internet. The idea is a repository that I could clone on that machine, then run yarn install to have all the packages available in the project from the local repository. How can I do that? A similar question was asked here: Using npm how can I download a package as a zip with all of its dependencies included in the package
There's not enough information in your question to fully understand your situation, but if you commit your node_modules directory to the repository, the modules will be there without the user having to run npm or yarn to install them. This assumes the user will run code from the repo workspace and that there aren't any modules that require a compilation step or other build step that may be platform-specific. But if they're all plain ol' JavaScript modules, you should be fine.
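If you take that route, a sketch of the commands involved, assuming node_modules is currently listed in your .gitignore:

    npm install                  # produce a fully populated node_modules
    git add -f node_modules      # -f overrides the .gitignore entry
    git commit -m "Vendor dependencies for offline installs"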
If you want to have all the modules as a separate repo rather than checking in node_modules, I can offhand think of two ways this might work.
Have the packages repo be a check-in of a fully installed node_modules directory. Then make that repo a Git submodule of the main repo that gets cloned as node_modules in the main repo.
Use npm pack to create .tgz files for each package you need. Store those files in the packages repo. Clone that repo into a known path on your target machine. Have the main repo install via path names. For example, if you run npm install /var/packages/foo-1.0.0.tgz, it will add a line to your package.json that might look something like this: "foo": "file:../../../var/packages/foo-1.0.0.tgz". In that case, npm install will install from that path rather than over the network.
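To illustrate the first half of that option: npm pack can fetch and bundle a published package (the package name and version here are made up):

    npm pack foo@1.0.0           # downloads foo and writes foo-1.0.0.tgz
    mv foo-1.0.0.tgz /var/packages/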

How can I prevent npm from deleting locally installed modules from node_modules

I have some local modules which were developed in-house and which I copy into my node_modules folder manually.
When I do this they work fine, but after I install some other stuff via ng add or npm install, the folder is removed. My question is: how can I prevent this from happening so I don't have to copy the files again?
You need to specify your dependencies in package.json, or else you cannot rely on them being in node_modules. Various npm commands might remove them, notably npm ci, but others as well.
If your package is not publicly published, some options are:
Use a non-public registry and publish it there.
Publish it as a scoped package with limited visibility. You will need a paid or organization account on npm for this. Individual accounts are US$7 a month.
Use npm link to "install" it from your local file system.
Use a postinstall or other life cycle script to have npm copy in your packages for you each time after npm ci or npm install is run (see the sketch after this list).
There are likely other options, but those are the ones that come to mind immediately.
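A minimal sketch of that last life-cycle-script option, assuming a Unix-like system and an in-house package kept in a vendor/ directory next to package.json (all names are hypothetical):

    {
      "scripts": {
        "postinstall": "cp -R vendor/my-inhouse-pkg node_modules/my-inhouse-pkg"
      }
    }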

Avoid 'npm install'?

Context
I have a .NET project, and we use some npm packages for the UI. I use a pre-build check: if the node_modules folder does not exist, run npm install.
When an update to package.json is committed, npm install will not be triggered for other people who pull the repository, because node_modules already exists on their machines. This leads to errors like Can't resolve....
Question(s)
Is the folder check obsolete? Is npm install smart enough to download only the necessary things and not all the dependencies? Or do I need a hash check on package.json?
There's no harm in running npm install multiple times; it will simply do nothing when there is nothing to do.
You don't need to check for the node_modules folder. npm will download and update any dependencies that are missing.
It's also important to run npm install since other machines may be running different systems and the dependencies may need to be compiled differently.
You can hash check package.json and/or package-lock.json for caching purposes, but it's not really necessary.
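If you do want such a check for caching purposes, a sketch of a hash guard in shell (the hash-file location is arbitrary):

    # Skip npm install when package-lock.json is unchanged since the last run.
    HASH_FILE=node_modules/.package-lock.hash
    NEW_HASH=$(sha256sum package-lock.json | cut -d' ' -f1)
    if [ ! -f "$HASH_FILE" ] || [ "$(cat "$HASH_FILE")" != "$NEW_HASH" ]; then
        npm install
        echo "$NEW_HASH" > "$HASH_FILE"
    fi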

Install other package.json dependencies

Simple question: Is it possible, in a package.json, to reference another package.json and install its dependencies?
Thank you.
Yes, this is possible, and this is automatically done by npm install.
If you have pkg-a that depends on pkg-b, including pkg-a in your dependencies will install both pkg-a and pkg-b when running npm install. That is because dependencies are actually references to the package.json of other packages. npm, upon running install, builds a dependency tree of all the packages directly or indirectly required by your current project, installs all of them in the node_modules directory, and keeps track of them all in package-lock.json.
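To make that concrete with the hypothetical pkg-a/pkg-b names, a package.json containing only

    {
      "dependencies": {
        "pkg-a": "^1.0.0"
      }
    }

will, on npm install, pull in pkg-a, read pkg-a's own package.json, find pkg-b listed there, and install it too, recording the whole tree in package-lock.json.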
Good question! But this is not possible: you cannot internally reference one JSON document from another (JSON is just a document format; it lacks any ability to process logic, import files, etc.). npm is configured to run using a single package.json file, so your best bet would be to put all your dependencies in a single package.json file, or to split your project into two directories with two separate package.json files and two npm installs, if for some reason you need your dependencies to be separate. You could then run your two Node projects separately and connect them via HTTP if you wish.
The only way you could come close to doing this would be to write an npm start script in the package.json that cds to another directory with a package.json and runs npm install. This would, however, only install the dependencies into the second directory's node_modules/ folder.
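A hypothetical scripts entry along those lines (the directory name is made up):

    {
      "scripts": {
        "start": "cd ../other-project && npm install"
      }
    }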

Speeding up NPM package install

When I deploy my app to AWS, it's copied into a new directory, so npm will install all the same packages during each deploy, which can take a lot of time. Most of these packages haven't changed between builds (if at all), so having it do a full npm install seems like a waste.
My app server runs a bunch of different Node apps, so installing globally isn't an option. Instead, I'd like to have the app store its node packages in a location that isn't wiped out during deployment, but still have the option to update packages as necessary during npm install.
Does NPM have a concept of an app-specific module directory that isn't located in a subfolder of an app? That way I can delete the app folder, and not have to reinstall the same packages over and over again.
I could achieve this by using symlinks, or by migrating the current node_modules directory.
If you lock down your dependency versions, npm is likely to have the packages cached, so the installation shouldn't take much time.
If you prefer not to do this, you can install dependencies globally and link them with the npm link command (which basically creates that symlink for you!). Then it'll be up to you to update the globally installed packages regularly.
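A sketch of the npm link approach; the package name is just an example:

    npm install -g some-shared-lib     # installed once, outside any app directory
    cd /path/to/app
    npm link some-shared-lib           # symlinks the global copy into ./node_modules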