Does it make sense to put any modules into package.json dependencies when I use webpack?
When I want to develop a package, I git clone <url> and then run npm install; npm installs all dependencies and devDependencies from the package.json file, and that makes sense.
When I'm an end-user and I just want to install some package into my node_modules to use it in my project, I run npm install package-name; npm installs package-name with only its dependencies, and that makes sense too.
But does it make sense to put any modules into dependencies when I use webpack? Webpack will bundle all dependencies into e.g. bundle.js, so, as I see it, there is no need to install dependencies at all (since they're already included in the bundle.js file).
Let's assume that I put all necessary modules into devDependencies (keeping the dependencies object empty) for my project my-project, bundle it with webpack, and publish:
the developer-user will git clone <url to my_project>, then run npm install; npm will install devDependencies from package.json (and omit the empty dependencies object), and then it is ready to develop.
the end-user will run npm install my-project; npm will install my-project, will not install devDependencies (because those are only for development), and will not install dependencies (because the dependencies object in package.json remains empty). Putting anything into dependencies would duplicate them: the dependencies would be installed into node_modules, and the same code would already be included in the bundle.js file.
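In other words, my-project's package.json would look something like this (module names and version ranges are just examples):

// my-project's package.json
{
  "name": "my-project",
  "version": "1.0.0",
  "dependencies": {},
  "devDependencies": {
    "webpack": "^5.0.0",
    "lodash": "^4.17.21"
  },
  "scripts": {
    "build": "webpack"
  }
}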
Am I right?
You are correct that there may be no runtime dependencies once the package has been bundled with webpack. However, some packages are multi-purpose and may be used in multiple ways, so in some circumstances dependencies may still be required.
If you look at the package.json specification, there are two possible entry points, 'main' and 'browser'. There is also the proposed 'module' entry point. How webpack should handle these is currently under discussion; users seem to want webpack to prioritize them as module > browser > main, but webpack currently checks browser first.
The idea behind prioritizing them in the order module > browser > main is presumably that browsers could use the pre-transpiled code in "browser" directly, whereas another project calling require() or import on your package would use the non-transpiled code from the "module" entry. The "module" entry could contain modern JavaScript with new features, and the project/package requiring it could then transpile it to its own specifications, using "browserslist" for example.
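For illustration, a package exposing all three entry points might declare them like this (paths are hypothetical); "main" points at pre-built CommonJS code, "module" at untranspiled ES modules, and "browser" at a browser-ready bundle:

// hypothetical entry points in a package.json
{
  "name": "some-package",
  "main": "dist/index.cjs.js",
  "module": "src/index.js",
  "browser": "dist/index.browser.js"
}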
I found this question because I was wondering the same thing...
Related
I have a library that uses npm packages that I want as peer dependencies, where the app would have to install them (e.g. axios, react, react-dom). The goal is to avoid having the library add to the overall bundle size of the app.
Should I add these only under "peerDependencies", or is it okay to add them to "dependencies" as well? I thought it would be good to remove them from "dependencies" just to be sure that they don't get installed on top of what the app already has. However, if I exclude them from "dependencies", then my library tests start to fail, because they are only peer dependencies and npm didn't install them.
I thought of installing them manually or adding them as devDependencies just for the purpose of running tests, but I think there has to be a better way. If I add them to both "dependencies" and "peerDependencies", does npm actually ignore the duplicates in "peerDependencies"?
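For reference, the devDependencies idea I'm considering would look roughly like this (version ranges are just examples): the packages are declared as peerDependencies so the app provides them, and mirrored in devDependencies so they are installed locally for tests:

// the library's package.json
{
  "peerDependencies": {
    "react": ">=17",
    "react-dom": ">=17",
    "axios": "*"
  },
  "devDependencies": {
    "react": "^18.2.0",
    "react-dom": "^18.2.0",
    "axios": "^1.6.0"
  }
}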
Let's say there is a monorepo with packages A and B.
Package A depends on B. Package B is also published in the npm registry. So when installing dependencies, npm does not install package B from the registry; instead it symlinks to the local package B, as intended.
But is there any way to avoid this behaviour and always resolve the package from the npm registry?
Currently, it seems it's not possible out-of-the-box. For example, the nohoist option in yarn is for package dependencies which don't exist in the monorepo (react-native, for example). For same-monorepo packages being consumed by their sibling packages, nohoist does not work; I tried setting it on the root package.json and also on the package itself, and it does not work.
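For reference, this is roughly what a nohoist configuration looks like in the root package.json (a sketch; the package pattern is hypothetical), and it is this mechanism that does not apply to sibling packages:

// root package.json (yarn workspaces)
{
  "private": true,
  "workspaces": {
    "packages": ["packages/*"],
    "nohoist": ["**/react-native", "**/react-native/**"]
  }
}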
Notice I said "out-of-the-box".
I came here looking for a solution because I want to migrate my custom ESLint plugins into my npm package monorepo, which happens to consume them.
The challenge is that ESLint plugins are not ESM, they are CJS, and in a TS/pure-ESM monorepo, ESM and CJS don't play well together.
My CJS ESLint plugins would get hoisted, and ESLint would refuse to load them, completely breaking my VSCode linting.
I came up with an idea to avoid this hoisting by keeping the npm packages which should not be hoisted under a different name, for example B-temp, and then, on CI, during publishing, renaming them at the last second to B before publishing to npm.
This way, during a local yarn/npm i call, the local monorepo package B-temp would not be recognised and therefore not hoisted; yarn would fetch B from the registry. I could keep B-temp in ESM (set "type": "module" in package.json), then use esbuild to transpile to esnext-spec CJS (no transpiling down to ES5 etc.), then publish those dist/*.cjs.js builds to npm, except that before publishing I would edit the package.json to remove the "ESM-ness": "type": "module" and exports.
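A sketch of that last-second rename as a CI publishing step, assuming npm 7.24+ for the npm pkg command (package names and paths are hypothetical):

# in CI, just before publishing
cd packages/B-temp
npm pkg set name=B           # rename so consumers install "B"
npm pkg delete type exports  # strip the "ESM-ness"
npm publish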
Simple question: Is it possible, in a package.json, to reference another package.json and install its dependencies?
Thank you.
Yes, this is possible, and this is automatically done by npm install.
If you have pkg-a that depends on pkg-b, including pkg-a in your dependencies will install both pkg-a and pkg-b when running npm install. That is because dependencies are actually references to the package.json files of other packages. Upon running install, npm builds a dependency tree of all the packages that are directly or indirectly required by your current project, installs all of them in the node_modules directory, and keeps track of them all in package-lock.json.
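For example (hypothetical names and versions), your project only needs to declare pkg-a:

// your project's package.json
"dependencies": {
  "pkg-a": "^1.0.0"
}

Running npm install will then read pkg-a's own package.json, see that it lists pkg-b, and install both into node_modules.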
Good question! But this is not possible, as you cannot internally reference one JSON document from another (JSON is just a document format; it lacks any ability to process logic, import files, etc.). npm is configured to run using a single package.json file, so your best bet would be to put all your dependencies in a single package.json file, or, if for some reason you require your dependencies to be separate, split your project into two directories with two separate package.json files and two npm installs. You could then run your two Node projects separately and connect them via HTTP if you wish.
The only way that you could come close to doing this would be to write an npm start script in the package.json that cds to another directory containing a package.json and runs npm install. This would, however, only install the dependencies into the second directory's node_modules/ folder.
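For example (the path is hypothetical):

// package.json
"scripts": {
  "start": "cd ../other-project && npm install"
}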
If you have a project that depends on packageA and you yarn add packageA, but packageA has a devDependency on packageB to build, shouldn't that cause packageA not to work for you? Since packageA won't be able to build unless its devDependencies are installed too?
I guess my main question is: if a package has a devDependency on a build tool like babel, how does it get built and work when it gets yarn added by a project? Shouldn't build tools like webpack be normal dependencies?
No, they shouldn't, because the package that is yarn added has already been built in an environment where the devDependencies were available. For example, when a package needs babel or webpack to build, a built bundle is created during publishing, in a CI/CD pipeline, as valid ES5 code, and that is what you pull from npm. No build is required after that.
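A sketch of such a setup: the build tools live in devDependencies, a prepublishOnly script builds the bundle before publishing, and "main" points at the built output (versions and paths are illustrative):

// the published package's package.json
{
  "main": "dist/bundle.js",
  "scripts": {
    "build": "webpack --mode production",
    "prepublishOnly": "npm run build"
  },
  "devDependencies": {
    "webpack": "^5.88.0",
    "webpack-cli": "^5.1.0"
  }
}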
GOOD MORNING :)
If you are having problems with the dependencies in your package.json, it is very simple to solve =]
What happens is that the dependency modules that your project's modules need (dependencies) must be installed in the global npm as a node package (module), that is:
npm install -g yourPackageName
If you have already installed a module in other projects or in the current project and want to turn it into a global package, you can use the command:
npm link yourPackageName
I have a weird set of "local" npm modules that use TypeScript and depend on each other similar to:
A -> B, C
B -> C
C -> D
I need to run npm install and get all of my TypeScript compiled in order, or it won't be able to find things properly. I'm under the impression I should use prepublish scripts to handle the TypeScript compile, but it doesn't seem to cascade the prepublish request for local dependencies.
How am I supposed to set up a bunch of local modules with prepublish scripts such that they all get resolved appropriately when running npm install?
Another way to word what I am asking: how do I maintain multiple local node modules and modify them at the same time? The modules have varying dependencies on each other, and it is extremely inconvenient to modify them in isolation.
"...TypeScript compile but it doesn't seem to cascade the prepublish request for local dependencies."
Indeed, prepublish does not run when a package is installed as a dependency. Your dependencies should already be built (with prepublish) before being published to npm and installed.
I figured out how to do what I needed. After updating to npm 3.3.9 and TypeScript 1.6, I was able to use a postinstall script to build things on the fly. The prototype lives here: https://github.com/MrHen/TypeScriptNpm
But the important pieces are:
// In the module's package.json
"scripts": {
  "build": "gulp npmbuild",
  "postinstall": "npm run build"
},
And:
// In the server's package.json
"dependencies": {
  "hen-doodad": "file:../modules/hen-doodad",
  "hen-widget": "file:../modules/hen-widget"
}
And:
// In the gulpfile
gulp.task('npmbuild', function () {
  gulp_util.log('Detecting appropriate starting directory...', process.env.INIT_CWD);
  var out = process.env.INIT_CWD + '/app';
  var build = [process.env.INIT_CWD + '/**/*.ts', 'typings/tsd.d.ts', '!' + process.env.INIT_CWD + '/node_modules/**/*'];
  var typings = 'typings/tsd.d.ts';
  // ... do the TypeScript build using the above paths
});
Feels like a bit of a hack but this worked more consistently than prepublish. To run the whole thing, do an npm install inside of the server folder.
The gulp task is optional. Presumably you could use tsc directly.
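For instance, a postinstall script calling tsc directly might look like this (assuming each module has its own tsconfig.json):

// In the module's package.json
"scripts": {
  "postinstall": "tsc -p ."
}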
It should be noted that this is most certainly not how you should be packaging up npm modules. The reason I had to do this involved details from a pre-existing build system.
Pitfalls from doing it this way:
Does not fit the standard publishing patterns used by npm.
The built files only ever live inside of the server's node_modules which can be awkward depending on what else you do with them.
Running npm install twice will not grab latest changes. You need to either remove the installed modules or update the version number on the module.
Each module gets its own TypeScript build instead of building en masse. If you can just build everything all at once you should do that instead.
Requires TypeScript 1.6 or greater due to how they auto-detect typings files included inside of node_modules.
Assuming that you are using TypeScript 1.6+, which supports Node module resolution. As you are trying to keep all modules synced, I guess your project is just starting.
I think symlinks would do the job, but if you have more concerns, please let me know.
More specifically, you may either create symlinks manually or take advantage of npm link.
cd /path-of-a-module
npm link # this will create a link as a global module
cd /path-of-your-app
npm link your-module-name
Then you may just maintain these modules happily.
As for the dependencies configuration in your package.json file, it could point to a git repository. But you would probably need to leave the compiled *.js and *.d.ts files there, and then everything would just work.
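For example, a git dependency might be declared like this (the URL is hypothetical), with the compiled *.js and *.d.ts files committed to the repository:

// in the app's package.json
"dependencies": {
  "your-module-name": "git+https://github.com/your-org/your-module.git"
}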