Configure `.npmrc` to get one scoped package from npm and the others from GitHub Packages

I'm using a scoped package in my application; some of its modules are hosted on GitHub Packages and the rest are on the npm registry. Until now I was only using one module that is hosted on GitHub, but now I need to install another one that is hosted on npm.
Currently my .npmrc file looks like this:
registry=https://registry.npmjs.org/
@custompackage:registry=https://npm.pkg.github.com/
I want to tell npm to install one specific module of the scope from the npm registry and keep installing the others from GitHub Packages. Updating .npmrc like this doesn't work (npm keeps looking for the module on GitHub):
registry=https://registry.npmjs.org/
@custompackage:registry=https://npm.pkg.github.com/
@custompackage/module1:registry=https://registry.npmjs.org/
Is it possible at all to configure .npmrc so that some modules of a scope come from npm and the rest come from GitHub Packages?
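For context, registry settings in .npmrc are applied per scope (plus per-host auth lines), so the GitHub Packages side of a setup like this usually looks something like the sketch below; the scope name and token are placeholders:
registry=https://registry.npmjs.org/
@custompackage:registry=https://npm.pkg.github.com/
//npm.pkg.github.com/:_authToken=YOUR_GITHUB_TOKEN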

Related

Does npm install have an equivalent to pip install --no-deps?

I'm more familiar with the Python ecosystem at this point and have a question about how I can do something with npm that I'm used to doing with pip.
Let's say I have a wheel for a particular Python package, as well as a wheel file for each of the Python package's dependencies. And let's say I have all these wheel files in a folder called /path/to/wheel/files. To install this package and all of its dependencies, I could run something like pip install /path/to/wheel/files/*.whl --no-deps, where --no-deps keeps me from having to install the various dependencies in the proper order.
Does npm have an equivalent to this? I'm using npm-offline-packager to create a tarball that contains a Node package (as its own tarball) and all of its dependencies (as their own tarballs). I know I can tell npm install to install a particular tarball. However, when I do this, it tries pulling in the required dependencies from the online NPM registry instead of pulling in the dependencies from the tarballs I already have.
Ideally, I'd like npm install to use the tarballs to add the main package to my project's package.json while adding the package's dependencies to my project's package-lock.json. And of course, I'd also like the main package and all its dependencies to be installed to my project's node_modules directory as well.
TL;DR Does npm have something equivalent to pip install /path/to/wheel/files/*.whl --no-deps?
I'm responding to my own question here, but note that my answer is only applicable to my particular use case and may not be applicable in general.
For my use case, I have access to two computers: one that has access to the internet and one that doesn't. For the machine that doesn't have access to the internet, I was attempting to use Verdaccio as a way of creating a self-hosted NPM registry. However, publishing packages to Verdaccio wasn't working because it kept trying to pull in the package's dependencies from the public NPM repository. The solution was to remove all references to "npmjs" in Verdaccio's config file (which, for me, Verdaccio created at ~/.config/verdaccio/config.yaml).
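To illustrate what "removing all references to npmjs" means in practice: a default Verdaccio config contains an uplink to the public registry and proxy entries pointing at it. A rough before/after sketch (exact defaults vary by Verdaccio version):
# before: the default config proxies cache misses to the public registry
uplinks:
  npmjs:
    url: https://registry.npmjs.org/
packages:
  '@*/*':
    access: $all
    publish: $authenticated
    proxy: npmjs
  '**':
    access: $all
    publish: $authenticated
    proxy: npmjs

# after: drop the uplink and the proxy lines, so everything must be published locally
packages:
  '@*/*':
    access: $all
    publish: $authenticated
  '**':
    access: $all
    publish: $authenticated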
So, in case anyone needs to do development on a machine that doesn't have access to the internet, the process for setting up Verdaccio looks something like this (a condensed command recap follows the steps):
On the machine that has access to the internet, create an NPM project using npm init (I called my project "verdaccio_runner"). The reason I did this is because, without already having an NPM registry on the machine that doesn't have access to the internet, it was hard doing a global install of Verdaccio.
Run npm install verdaccio to install Verdaccio to the NPM project that was created in the previous step.
Transfer this project over to the machine that doesn't have access to the internet.
Once it's transferred over, run Verdaccio from the project like this: npx verdaccio.
Quit out of Verdaccio.
Remove all references to "npmjs" from the config file that Verdaccio created (again, mine was at ~/.config/verdaccio/config.yaml).
Run Verdaccio again to pull in those changes.
Tell NPM where your private registry is: npm config set registry http://localhost:4873/.
Add yourself as a user by running npm adduser and by then filling out the information you're prompted for.
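Condensed into commands, that setup is roughly the following (port 4873 is Verdaccio's default):
# on the machine with internet access
npm init                          # e.g. in a folder called verdaccio_runner
npm install verdaccio
# copy the project folder to the offline machine, then on that machine:
npx verdaccio                     # first run creates ~/.config/verdaccio/config.yaml; quit it
# edit the config to remove the npmjs references, then start it again:
npx verdaccio
npm config set registry http://localhost:4873/
npm adduser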
And the process for publishing packages to Verdaccio on a machine that doesn't have access to the internet looks like this (again, a condensed command sketch follows the steps):
For the package you want to install, on the machine that has access to the internet, run npo fetch <package name> --no-cache (assuming you've already done a global install of npm-offline-packager on the machine that has internet access).
Bring the tarball that npo created for you over to the machine that doesn't have internet access.
Untar the tarball.
From the directory that's created, run for file in ./*.tgz; do npm publish $file; done.
The published packages can now be npm installed to projects on the machine that doesn't have internet access.
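Condensed into commands, the fetch-and-publish side looks something like this (the package name is just an example, and the exact tarball name depends on what npo produces):
# on the machine with internet access
npo fetch express --no-cache      # example package; bundles it and its deps into one tarball

# on the offline machine, after copying the tarball over
tar -xzf <tarball-from-npo>
cd <extracted-directory>
for file in ./*.tgz; do npm publish $file; done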
Note: in order for Verdaccio to be accessible to other machines on the private network, I also had to add the following to Verdaccio's config file:
listen:
  - 0.0.0.0:4873

Using Gitlab as Proget's feed with a unique place to store packages

Is it possible to use GitLab's package registry to serve our own npm packages as well as public packages from the internet?
On ProGet it is possible to expose public npm packages and my private npm packages under the same URL using a proxy. Is it possible to do the same with GitLab, so that pointing to GitLab's registry in the .npmrc would be enough to install all the dependencies?
Yes, you can have a different registry for your personal packages and, for example, your company's packages. You can reference them as @my-gitlab-username/foo-package or @company/bar-package.
npm packages hosted on npmjs.com that are installed with npm install <package> will still be resolved if the lookup in your GitLab package registry fails. Usually you do not have to set up a separate proxy.
Multiple private/non-public registries can be targeted by using npm install @company/<package>. So there should be no issue targeting multiple ProGet and/or GitLab npm registries at the same time.
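In .npmrc that typically ends up looking something like this sketch (host and scope names are placeholders, and the exact endpoint depends on whether you use GitLab's project-, group-, or instance-level registry):
@my-gitlab-username:registry=https://gitlab.example.com/api/v4/packages/npm/
@company:registry=https://gitlab.example.com/api/v4/packages/npm/
# unscoped packages keep resolving from the public registry
registry=https://registry.npmjs.org/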
Authentication is described here: https://stackoverflow.com/a/42648251/4236831

using npm link with web3

I'm trying to patch one of the web3 packages and use the patched version in my Node script. I'm confused about which name to use with npm link: web3 or web3.js.
Here's what I did:
Cloned the web3 repo.
Executed npm run bootstrap (which linked the web3 sub-packages).
Ran npm link (which put a link to the web3.js folder into my global modules folder).
Created a project named web3test and ran npm install web3 for it.
Now I don't know how to link my project to the local copy of web3. If I run npm link web3, it puts a web3 folder in the global modules directory, which is not the same as my web3.js repository. But my project is supposed to use web3, not web3.js, so it doesn't make sense to link to web3.js.
I'm on Windows 10.
I realized that web3 is actually a package within web3.js. So I went into the \web3.js\packages\web3 directory and executed npm link from there. Of course, I also had to run npm run build for the main package so that all of the web3 packages were built.
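Put together, the workflow from this answer looks roughly like the sketch below (the bootstrap/build script names follow the question; adjust them to whatever the web3.js repo actually defines, and paths are shown POSIX-style):
# in the web3.js clone
npm run bootstrap        # link the sub-packages to each other
npm run build            # build all of the web3 packages
cd packages/web3
npm link                 # register the local web3 package globally

# in the consuming project (web3test)
npm link web3            # point node_modules/web3 at the local clone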

How to nicely include a private NPM dependency in a lerna workspace project?

To preface, the project is using lerna with yarn workspaces and we are pulling in an internal NPM package from our private npm registry (not hosted with npm).
I understand how .yarnrc and .npmrc files can authenticate to private registries, but our project already has settings committed in these files. A developer could add credentials to these files, but then they could not commit the file to GitHub.
I was hoping to find a solution where a developer on the project can put the private registry credentials in a .env file and then authenticate with those credentials in a hook before lerna runs "install".
My main goal was to make it easy to work with the project and not rely on having each developer run 'npm login' or some other commands besides just including the correct credentials in their .env file. This also makes it easy for CI/deployment pipelines.
Are there any specific lifecycle hooks that can run before install in a lerna package?
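One approach that matches what the question describes, without any lerna-specific hook, is npm's environment-variable expansion in .npmrc: the file can be committed with a ${VAR} placeholder and each developer (or the CI job) supplies the value, e.g. from a .env file. A sketch, with a placeholder registry URL and variable name (worth checking that your yarn version honours the same expansion, or mirroring it in .yarnrc):
# .npmrc (committed, contains no secret)
@company:registry=https://npm.example.com/
//npm.example.com/:_authToken=${PRIVATE_NPM_TOKEN}

# before installing, each developer or the CI job exports the token, e.g.:
export $(grep -v '^#' .env | xargs) && yarn install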

Treat project file as npm module

Is there a way, using npm, to treat a file in the project as a Node module without linking it and without it having its own package.json?
Ideally I could just have a sub-module definition within my main app package.json and be able to install things to a specific module that way.
Here's an example:
/app
  /index.js
  /file.js
  /action.js
What I'd like
npm set-as-module ./action.js "action"
Then within any file in my project I can call
var action = require("action")
Then when I want to install specific dependencies for action I could do this:
npm install underscore --save --sub=action
Does this kind of feature exist within npm? Is there anything close to it?
This would offer the following perks:
Easy to branch out or publish as a full module
Ability to require with a module string instead of a path
I created something to do this
https://github.com/reggi/npm-link-file
npm install npm-link-file -g
npm-link-file ./action.js "action"
It does not handle linked dependencies.