I'm getting unexpected behavior when using npm. For example, when installing express with the command:
npm install express
I would expect that a folder named "express" would be created in the "node_modules" directory and that its dependencies would be installed in a "node_modules" sub-directory inside this folder.
What I am seeing is that the "express" folder is being created, but all of its dependencies are being added to the root "node_modules" directory in my project folder (at the same level as express) and not nested within the "express" folder.
Why is this happening? (using npm v3.3.5)
It's a design change in npm 3, which deduplicates dependencies by default. See:
Flat, flat, flat!
Your dependencies will now be installed maximally flat. Insofar as is
possible, all of your dependencies, and their dependencies, and THEIR
dependencies will be installed in your project's node_modules folder with no
nesting. You'll only see modules nested underneath one another when two (or
more) modules have conflicting dependencies.
https://github.com/npm/npm/blob/ff47bc138e877835f1c0f419fea5f5672110678a/CHANGELOG.md#flat-flat-flat
https://github.com/npm/npm/issues/6912
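As a rough illustration (the module names and versions below are hypothetical), if your project depends on pkg-a, which needs lodash@^4, and pkg-b, which needs lodash@^3, npm 3 hoists the version it can share and nests only the conflicting copy:
node_modules/
  lodash/           (4.x, hoisted to the top level)
  pkg-a/
  pkg-b/
    node_modules/
      lodash/       (3.x, nested because it conflicts with the hoisted copy)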
Related
I have a repository that has a bunch of peer dev dependencies (this repo is a standalone eslint config, like airbnb).
If I use this in another project as a node module (importing it through GitHub, not retrieving it through npm), will the package-lock.json file that is generated in this module be used in the caller's project?
No. Per the documentation, package-lock.json files "will be ignored if found in any place other than the toplevel package."
If you want lock-file behavior in a dependency, use an npm-shrinkwrap.json file instead.
Assuming you are still installing with the npm client, the fact that you are retrieving the package from somewhere other than the npm registry doesn't affect this behavior.
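A minimal sketch of that approach: running npm shrinkwrap in the shared config's repo promotes its package-lock.json to an npm-shrinkwrap.json, which is published with the package and honored when the package is installed as a dependency (the directory name below is hypothetical):
cd my-eslint-config
npm install        # generates/updates package-lock.json
npm shrinkwrap     # converts it to npm-shrinkwrap.json
git add npm-shrinkwrap.json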
I have built a project generator for my company. It's a globally installed npm package, that when run, takes the entire contents of a /template directory inside the package and copies it to the user's chosen destination.
Inside /template I have 2 files that npm pack refuses to include in the final published module:
/template/.gitignore
/template/.npmrc
Everything else, including other hidden files, gets packed as expected.
These 2 files are not in any root (or nested) .gitignore files, and I'm not manually specifying any files array in any package.json that npm might pick up on.
This is intentional behaviour: https://docs.npmjs.com/misc/developers#keeping-files-out-of-your-package
The workaround was to create .gitignore.template and rename it on install.
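A minimal sketch of that rename step, assuming the generator copies /template into a dest directory (the helper name and file list are only illustrative):
const fs = require('fs');
const path = require('path');

// After copying /template to the destination, restore the real dotfile names.
function restoreDotfiles(dest) {
  for (const name of ['.gitignore', '.npmrc']) {
    const templatePath = path.join(dest, name + '.template');
    if (fs.existsSync(templatePath)) {
      fs.renameSync(templatePath, path.join(dest, name));
    }
  }
}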
Explicitly adding the files to the files array in package.json will also force npm pack to include them:
"files": [
"dist",
"dist/template/.npmrc",
"dist/template/.gitignore"
]
Simple question: is it possible, in a package.json, to reference another package.json and install its dependencies?
Thank you.
Yes, this is possible, and this is automatically done by npm install.
If you have pkg-a that depends on pkg-b, including pkg-a in your dependencies will install both pkg-a and pkg-b when running npm install. That is because dependencies are actually references to the package.json files of other packages. When you run npm install, npm builds a dependency tree of all the packages that are directly or indirectly required by your current project, installs all of them in the node_modules directory, and keeps track of them all in package-lock.json.
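For example, a project's package.json only needs to list the direct dependency (pkg-a here is just the placeholder name from above):
{
  "dependencies": {
    "pkg-a": "^1.0.0"
  }
}
Running npm install then reads pkg-a's own package.json and installs pkg-b (and anything pkg-b needs) into node_modules as well.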
Good question! But this is not possible, as you cannot internally reference one JSON document from another (JSON is just a document format; it lacks any ability to process logic, import files, etc.). npm is configured to run using a single package.json file, so your best bet would be to put all your dependencies in a single package.json file, or, if for some reason you need your dependencies to be separate, to split your project into two directories with two separate package.json files and two npm installs. You could then run your two node projects separately and connect them via HTTP if you wish.
The only way you could come close to doing this would be to write an npm script in the package.json that cds into another directory containing a package.json and runs npm install there. This would, however, only install the dependencies into that second directory's node_modules/ folder.
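A minimal sketch of such a script, assuming the second package lives in a sibling directory called other-package (both the directory name and the script name are placeholders):
"scripts": {
  "install:other": "cd ../other-package && npm install"
}
Running npm run install:other would then populate ../other-package/node_modules, but nothing would be added to the current project's node_modules.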
I have a repository with an app and modules. The modules are included in the app via package.json like:
"application-module": "file:../modules/application-module"
After yarn install, that dependency is added to node_modules.
I want to make a module-based application. The app folder acts as the root module. Other modules, like admin-panel-module and account-module, should live in a modules folder. So app may have node_modules inside, but also a modules folder for the modules. Those modules will be added with git subtree from other repos, so they can be developed independently.
Is there any way to avoid adding them as dependencies and just use the local directories?
Multiple node_modules and package.json
In any node/npm project, you can have multiple package.json files across your directory tree, like:
app/
  package.json
  node_modules/
  src/...
  account_module/
    package.json
    node_modules/
    src/...
  admin_module/
    package.json
    node_modules/
    src/...
When you invoke yarn (or npm install, of course) in any of the child modules, the dependencies listed in their local package.json will be installed into their local node_modules folder.
So basically you can solve your issue by ensuring that every child has its own package.json with its own dependencies.
Still, you can place common dependencies in the root app folder. If, for instance, all your projects use lodash, you can put the lodash dependency in the app's package.json. After running yarn in the app folder, the lodash package will be installed in the app's node_modules.
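For instance, the app's package.json could declare the shared dependency (the version is just an example):
{
  "name": "app",
  "dependencies": {
    "lodash": "^4.17.21"
  }
}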
After that, if you call:
require('lodash');
in any of the children, Node will look for lodash in the app's node_modules folder whenever it doesn't find lodash in the child's own node_modules.
If you don't have a root node_modules, you can still declare a package.json local to any of the submodules, and they'll have their own node_modules.
So maybe you want to avoid common dependencies altogether, or maybe you want to store common dependencies in the app folder. npm has you covered either way.
However, if you don't want to handle common dependencies, yet are concerned about having to store a lot of duplicated packages on local machines, you may want to check out pnpm, an npm-compatible package manager that saves a lot of disk space on local development machines.
I'm having (almost) the same issue with yarn.
I have created a package (TypeScript) called "test". In this package, after building, there are mainly 3 directories: dist (the build output), node_modules and src.
Then I have created another package, "test2", and in this package have added "test" as a dependency with yarn add c:/.../.../test.
The package is indeed installed in the node_modules of test2. BUT, in "test2/node_modules/test", I can also find the node_modules of test ("test2/node_modules/test/node_modules"). Why? It increases the size of the package a lot.
In both tsconfig.json files (test and test2), node_modules is excluded...
Thanks
Does it make sense to put any modules into package.json dependencies when I use webpack?
When I want to develop a package, I use git clone <url> and then npm install; npm then installs all dependencies and devDependencies from the package.json file, and that makes sense.
When I'm an end-user and I just want to install some package into my node_modules to use it in my project, I run npm install package-name; npm then installs package-name with only its dependencies, and that makes sense too.
But does it make sense to put any modules into dependencies when I use webpack? Webpack will bundle all dependencies into e.g. bundle.js, so, for me, there is no need to install dependencies at that point (they're already included in the bundle.js file).
Let's assume that I put all necessary modules into devDependencies (keeping the dependencies object empty) for my project, my-project, bundle it with webpack and publish it:
the developer-user will use git clone <url to my_project>, then run npm install; npm will install the devDependencies from package.json (and skip the empty dependencies object), and the project is ready for development.
the end-user will use npm install my-project; npm will install my-project, will not install devDependencies (because this is a production install) and will not install dependencies (because the dependencies object in package.json remains empty). Putting anything into dependencies would double them: the same dependencies would be installed, and would also already be present in the bundle.js file.
Am I right?
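For concreteness, the setup I'm describing would look roughly like this (package names and versions are placeholders):
{
  "name": "my-project",
  "dependencies": {},
  "devDependencies": {
    "webpack": "^4.0.0",
    "lodash": "^4.17.4"
  }
}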
You are correct that there may be no runtime dependencies once the code has been bundled with webpack. However, some packages are multi-purpose and may be used in multiple ways, so in some circumstances dependencies may still be required.
If you look at the package.json specification, there are two possible entry points, 'main' and 'browser'. There is also the proposed 'module' entry point. How webpack should handle these is currently under discussion; users seem to want webpack to prioritize them as module > browser > main, whereas webpack currently uses 'browser' first.
The idea behind prioritizing them in the order module > browser > main is presumably that browsers could use the pre-transpiled code in "browser" directly, whereas another project calling require() or import on your package would use the non-transpiled code from the "module" entry. The "module" entry code could contain modern JavaScript with new features, and the project/package requiring it could then transpile it to its own targets, using "browserslist" for example.
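As an illustration of those entry points (the file paths are placeholders; only the field names main, module and browser come from the discussion above):
{
  "name": "my-package",
  "main": "dist/index.cjs.js",
  "module": "dist/index.esm.js",
  "browser": "dist/index.umd.js"
}
A bundler that understands "module" picks up the ES module build, while a consumer that only understands CommonJS falls back to "main".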
I found this question because I was wondering the same thing...