Automatically downloading npm packages listed in a package.json file

I'm working on creating a local repository that will contain all the packages I use in my project, so that I can install those packages on a machine that does not have internet access. The idea is a repository I could clone on that machine, then run yarn install to have all the packages available in the project from the local repository. How can I do that? A similar question was asked here: Using npm how can I download a package as a zip with all of its dependencies included in the package

There's not enough information in your question to fully understand your situation, but if you commit your node_modules directory to the repository, the modules will be there without the user having to run npm or yarn to install them. This assumes the user will run code from the repo workspace and that there aren't any modules that require a compilation step or other build step that may be platform-specific. But if they're all plain ol' JavaScript modules, you should be fine.
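If node_modules is currently listed in your .gitignore (as it often is), you'll need to get past that rule before committing; a minimal sketch:

    # remove the node_modules rule from .gitignore, or force-add past it
    git add -f node_modules
    git commit -m "vendor node_modules"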
If you want to have all the modules as a separate repo rather than checking in node_modules, I can offhand think of two ways this might work.
Have the packages repo be a check-in of a fully installed node_modules directory. Then make that repo a Git submodule of the main repo that gets cloned as node_modules in the main repo.
Use npm pack to create .tgz files for each package you need. Store those files in the packages repo. Clone that repo into a known path on your target machine. Have the main repo install via path names. For example, if you run npm install /var/packages/foo-1.0.0.tgz, it will add a line to your package.json that might look something like this: "foo": "file:../../../var/packages/foo-1.0.0.tgz". In that case, npm install will install from that path rather than over the network.
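A rough sketch of that second approach, using hypothetical package names and paths:

    # On the machine with internet access: pack each package you need
    npm pack foo              # writes foo-1.0.0.tgz to the current directory
    npm pack bar@2.3.1        # a specific version works too
    # commit the .tgz files to the packages repo

    # On the target machine, with the packages repo cloned to /var/packages:
    npm install /var/packages/foo-1.0.0.tgz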

Related

Is there a way to install an npm package locally but not affect package.json or package-lock.json?

I have a project that I'm working on for a client where two private packages (which I don't have access to npm install) are inside the package.json.
I do however have access to clone the repos for those packages. If I simply run an npm install I'll get a permission denied error. Same if I run npm link to the packages.
I've been working around this by removing the packages from the package.json and then running npm install ../some-package. This works but isn't a great solution, because if I wanted to add a new package I'd have to deal with a bit of a mess in the package.json.
Is there a better way than this?
I have tried running npm link ../some-package but I still get access denied. The only way I've managed to complete an install is by removing the packages then installing them from a local dir.
I don't know the details of your situation, but I see at least two potential solutions to explore.
Option 1: Install the package from the repo
I do however have access to clone the repos for those packages.
You can install from a git repo and package.json will record that git repo as the source of the package rather than the npm registry.
From the docs at https://docs.npmjs.com/cli/v8/commands/npm-install:
npm install <git remote url>:
Installs the package from the hosted git provider, cloning it with git. For a full git remote url, only that URL will be attempted.
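For example, either of these (with a hypothetical repository URL) records the git repo in package.json rather than the registry:

    # hypothetical repository URL
    npm install git+ssh://git@github.com/some-org/some-package.git
    npm install https://github.com/some-org/some-package.git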
Option 2: Install from the local file system with --no-save
If that approach doesn't work for you, you can try npm install --no-save ../some-package as a build step. The --no-save flag keeps npm from modifying package.json.
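As a sketch, assuming the cloned package sits in a sibling directory, the build step might be:

    npm install                               # normal install from package.json
    npm install --no-save ../some-package     # add the private package without recording it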

How can I install all files from a GitHub repo with npm and ignore the "files" field in package.json?

I was using yarn in my project with two git repositories as dependencies. Yarn gets all the files from the GitHub repo and ignores the "files" field in package.json or the .npmignore file.
In particular, I need the whole repo: I really need the files that get generated when I run the build commands there. I want to migrate to npm.

Does npm install have an equivalent to pip install --no-deps?

I'm more familiar with the Python ecosystem at this point and have a question about how I can do something with npm that I'm used to doing with pip.
Let's say I have a wheel for a particular Python package, as well as a wheel file for each of the Python package's dependencies. And let's say I have all these wheel files in a folder called /path/to/wheel/files. To install this package and all of its dependencies, I could run something like pip install /path/to/wheel/files/*.whl --no-deps, where --no-deps keeps me from having to install the various dependencies in the proper order.
Does npm have an equivalent to this? I'm using npm-offline-packager to create a tarball that contains a Node package (as its own tarball) and all of its dependencies (as their own tarballs). I know I can tell npm install to install a particular tarball. However, when I do this, it tries pulling in the required dependencies from the online NPM registry instead of pulling in the dependencies from the tarballs I already have.
Ideally, I'd like npm install to use the tarballs to add the main package to my project's package.json while adding the package's dependencies to my project's package-lock.json. And of course, I'd also like the main package and all its dependencies to be installed to my project's node_modules directory as well.
TL;DR Does npm have something equivalent to pip install /path/to/wheel/files/*.whl --no-deps?
I'm responding to my own question here, but note that my answer is only applicable to my particular use case and may not be applicable in general.
For my use case, I have access to two computers: one that has access to the internet and one that doesn't. For the machine that doesn't have access to the internet, I was attempting to use Verdaccio as a way of creating a self-hosted NPM registry. However, publishing packages to Verdaccio wasn't working because it kept trying to pull in the package's dependencies from the public npm registry. The solution was to remove all references to "npmjs" in Verdaccio's config file (which, for me, Verdaccio created at ~/.config/verdaccio/config.yaml).
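For reference, the references in question are Verdaccio's uplink and proxy entries; a stock config contains lines roughly like these (details vary by version), and they're what need to go:

    uplinks:
      npmjs:
        url: https://registry.npmjs.org/

    packages:
      '**':
        access: $all
        publish: $authenticated
        proxy: npmjs    # remove this line (and the npmjs uplink above)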
So, in case anyone needs to do development on a machine that doesn't have access to the internet, the process for setting up Verdaccio looks something like this:
On the machine that has access to the internet, create an NPM project using npm init (I called my project "verdaccio_runner"). I did this because, without an NPM registry already available on the offline machine, doing a global install of Verdaccio there was hard.
Run npm install verdaccio to install Verdaccio to the NPM project that was created in the previous step.
Transfer this project over to the machine that doesn't have access to the internet.
Once it's transferred over, run Verdaccio from the project like this: npx verdaccio.
Quit out of Verdaccio.
Remove all references to "npmjs" from the config file that Verdaccio created (again, mine was at ~/.config/verdaccio/config.yaml).
Run Verdaccio again to pull in those changes.
Tell NPM where your private registry is: npm config set registry http://localhost:4873/.
Add yourself as a user by running npm adduser and by then filling out the information you're prompted for.
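Condensed into commands, the offline-machine setup looks roughly like this:

    npx verdaccio        # first run; creates ~/.config/verdaccio/config.yaml
    # stop it, remove the "npmjs" references from that config, then restart:
    npx verdaccio
    npm config set registry http://localhost:4873/
    npm adduser          # answer the prompts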
And the process for publishing packages to Verdaccio on a machine that doesn't have access to the internet looks like this:
For the package you want to install, on the machine that has access to the internet, run npo fetch <package name> --no-cache (assuming you've already done a global install of npm-offline-packager on the machine that has internet access).
Bring the tarball that npo created for you over to the machine that doesn't have internet access.
Untar the tarball.
From the directory that's created, run for file in ./*.tgz; do npm publish "$file"; done.
The published packages can now be npm installed to projects on the machine that doesn't have internet access.
Note: in order for Verdaccio to be accessible to other machines on the private network, I also had to add the following to Verdaccio's config file:
listen:
  0.0.0.0:4873

Install other package.json dependencies

Simple question: Is it possible, in a package.json, to reference another package.json and install its dependencies?
Thank you.
Yes, this is possible, and this is automatically done by npm install.
If you have pkg-a that depends on pkg-b, including pkg-a in your dependencies will install both pkg-a and pkg-b when you run npm install. That is because dependencies are actually references to the package.json files of other packages. Upon running install, NPM builds a dependency tree of all the packages your project requires directly or indirectly, installs all of them in the node_modules directory, and keeps track of them in package-lock.json.
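As a sketch with hypothetical packages: if your project's package.json declares

    "dependencies": {
      "pkg-a": "^1.0.0"
    }

and pkg-a's own package.json declares

    "dependencies": {
      "pkg-b": "^2.0.0"
    }

then a single npm install in your project puts both pkg-a and pkg-b into node_modules.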
Good question! But this is not possible, as you cannot internally reference one JSON document from another; JSON is just a document format, and it lacks any ability to process logic, import files, etc. npm is configured to run using a single package.json file, so your best bet would be to put all your dependencies in a single package.json file, or, if for some reason you need your dependencies to be separate, split your project into two directories with two separate package.json files and two npm installs. You could then run your two node projects separately and connect them via HTTP if you wish.
The only way you could come close to doing this would be to write an npm script in the package.json that cds to another directory containing its own package.json and runs npm install. This would, however, only install the dependencies into that second directory's node_modules/ folder.
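A minimal sketch of that workaround, with a hypothetical sibling directory and script name:

    "scripts": {
      "install-other": "cd ../other-project && npm install"
    }

Running npm run install-other would populate ../other-project/node_modules, not your own project's.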

Use Artifactory without a remote npm repository

I'm working in an environment where Artifactory does not have internet access. We would like to use Artifactory as an npm registry and host. Is it possible to upload external dependencies and their transitive dependencies?
For example: I'm on a computer with internet access and downloaded webpack and all its dependencies using npm install. Now I go to a different network with Artifactory access and want to upload my node_modules folder. Does that work somehow?
In addition to Artifactory's proxy/caching features, it can also host multiple local repositories (such as npm repositories). This basically means that you can create a local npm repository in Artifactory and deploy any npm *.tgz packages (your dependencies) into it, and Artifactory will generate the relevant metadata for your client. All you'll need to do is deploy the relevant packages and configure your npm clients to resolve from Artifactory.
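Pointing an npm client at Artifactory looks roughly like this; the host and the npm-local repository key are assumptions, though api/npm/<repo-key> is Artifactory's usual path convention for npm repositories:

    # assumed host and repository key
    npm config set registry https://artifactory.example.com/artifactory/api/npm/npm-local/
    npm login                        # authenticate against Artifactory
    npm publish foo-1.0.0.tgz        # deploy a packed dependency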
I have recently made a node module that should help with this problem.
You give it a list of packages that you want downloaded, and it will download those packages with all their dependencies as tar.gz files. It will then save them in the original npm folder structure and create a tar.gz with everything inside.
You can then take the tar.gz with all your dependencies and deploy it to Artifactory using the deploy wizard.
When you deploy, select the checkbox "Deploy as Bundle Artifact". This will extract the tar.gz of packages and load them into the npm repository. Artifactory will read the package.json of all packages, and will load the relevant information, allowing you to pull packages with npm.
The package is called package-bundle, and can be downloaded from npm using npm install -g package-bundle
To download packages you can run the command pb bluebird express#1.0.1, which will fetch the specified packages, and all the required dependencies.