Is it possible to have two private registries with different authentication data in .npmrc?

There is a private npm package on artifactory_1. Because of this, the _auth property in .npmrc contains the base64-encoded authentication data for artifactory_1 and always-auth is set to true. With this setup npm install works: the private package can be downloaded from artifactory_1 because the authentication data is provided via the _auth property.
Now I have another private package on another private Artifactory instance (call it artifactory_2). I want to use the private package from artifactory_2 in the same project where artifactory_1 is already configured with its authentication data.
This means I somehow have to add the new registry and its authentication data to the .npmrc. It should be possible to use the private package from artifactory_1 and the private package from artifactory_2 at the same time.
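For reference, this is roughly the kind of .npmrc I imagine would be needed; the scope names, hostnames and repo paths below are made up:
@scope-one:registry=https://artifactory-1.example.com/api/npm/npm-repo/
//artifactory-1.example.com/api/npm/npm-repo/:_auth=BASE64_AUTH_FOR_ARTIFACTORY_1
@scope-two:registry=https://artifactory-2.example.com/api/npm/npm-repo/
//artifactory-2.example.com/api/npm/npm-repo/:_auth=BASE64_AUTH_FOR_ARTIFACTORY_2
# note: older npm versions only honour a single global _auth; per-registry
# username/_password or _authToken entries may be needed instead
always-auth=true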
It is documented here how to handle one private package in a project.
But I cannot find any documentation or example for multiple private packages from different Artifactory instances.
Any idea how this should be done? Is what I want to do even possible?

I did not manage to use repos from two different Artifactory instances. Instead, I migrated the content of the first Artifactory into a repo on the second one. After that I was able to create a virtual repository that exposes the content of the two repos as one. This virtual repo is configured as the registry in my project, and this way I can install the content of both repos.
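For anyone ending up with the same workaround, the resulting .npmrc only needs the virtual repo as its single registry; the URL and credentials below are placeholders:
registry=https://artifactory-2.example.com/api/npm/npm-virtual/
_auth=BASE64_AUTH_FOR_ARTIFACTORY_2
always-auth=true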

Related

Switching from NPM to GitHub Packages

I have an NPM package with a small user base. Yesterday I created a new version and wanted to release it. I thought I might as well make use of the new GitHub Packages, so I set everything up as GitHub suggested and released!
Now the problem is that the old NPM package is still on version 2.0.2, which everyone currently uses as their dependency, while the new GitHub package is on 2.0.4. Is there a way to 'synchronize' these two? Of course GitHub Packages uses the <USER>/<PACKAGE> naming while NPM just uses <NAME>.
Is the only option to publish on both GitHub Packages and NPM and just try to move users away from the NPM page?
If you're publishing a public package, you're better off just publishing it on NPM, as that is what most developers are used to.
I use GitHub Packages at work, and the only advantage is that it is effectively free for hosting internal packages, as we are already paying for GitHub anyway. If it weren't for the zero price we wouldn't be using it.
If you really want to force all your users to migrate to GitHub Packages, and to have to set up npm to work with it, you could mark your old version as deprecated on npm and use the deprecation message to point people to the new version.
https://docs.npmjs.com/cli/v6/commands/npm-deprecate
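For example, something along these lines (the package name, version range and message are placeholders) deprecates every release up to 2.0.2 and points users at the new location:
npm deprecate my-package@"<=2.0.2" "Moved to GitHub Packages, see https://github.com/<USER>/<PACKAGE>"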
Here is another solution, but there is a catch.
Change your registry.npmjs.org package content to
index.js
export * from '@schotsl/my-package';
Now your registry.npmjs.org package is (almost) pointing to your npm.pkg.github.com package.
Only almost, because any development directory for a project downstream of registry.npmjs.org/my-package must configure the scope-to-server mapping from @schotsl/my-package to npm.pkg.github.com in a package manager config file.
In the case of package managers 'npm' and 'yarn' (v1) that can be done in
an .npmrc file at the same level as package.json.
The required .npmrc content is
@schotsl:registry=https://npm.pkg.github.com
# GitHub PAT with the read:packages scope only is ok
//npm.pkg.github.com/:_authToken="ghp_XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
The first line is the scope to server mapping.
The second line is a GitHub personal access token (PAT) with at least the read:packages scope. It is actually pretty liberal: a PAT with read:packages issued from any GitHub account will allow read access to any GitHub account's public packages.
For 'yarn' v2, the .npmrc file does not work; instead a couple of keys need to be set in .yarnrc.yml.
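A rough sketch of the equivalent .yarnrc.yml, using yarn's npmScopes setting (same placeholder scope and token as above):
npmScopes:
  schotsl:
    npmRegistryServer: "https://npm.pkg.github.com"
    npmAuthToken: "ghp_XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"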
Unfortunately there is no way to set the scope-to-server mapping and the token inside the registry.npmjs.org/my-package package itself.
Putting the .npmrc file in there doesn't work; it is ignored. And it wouldn't be a good solution anyway, because not all package managers read .npmrc files.
That is the 'catch': using npm.pkg.github.com packages requires package-manager-specific config settings to be made by every downstream developer.
In addition, what if two different upstream packages have colliding scope names, each mapping to a different server? The current config methodology fails in that case.
Feature proposal (not current behavior)
Ideally, there would be a common interface agreed upon by all package managers inside package.json, and the scope-to-server mapping would be defined in the package that directly references the scope. For example, in the package.json of my-package on registry.npmjs.org:
{
  "dependencies": {
    "@schotsl/my-package": "1.0.0"
  },
  "registries": {
    "@schotsl/my-package": "https://npm.pkg.github.com"
  },
  "auths": {
    "https://npm.pkg.github.com": "ghp_XXXXXXXXXXXXXXXX"
  }
}
Then downstream users would not need to configure each scope, and predictable (and risky) problems with scope name or package name collisions would not occur.
But that is not the way it is. Therefore GitHub Packages (npm.pkg.github.com) doesn't really seem to be a feasible way to provide public packages that may become dependencies of other public packages. It is no problem for private packages, though.

How can I shim access to a private repository with scoped npm packages on a machine that can't access it?

Let's say I have a private, scoped NPM repository that lives behind a corporate firewall. I'd like to set my project up on another computer that will not connect to the VPN, so it will not be able to access that private repo.
How can I set up my project to easily import those dependencies from local folders and/or my local npm cache and skip the private repo?
That is, if my package.json file has...
"dependencies": {
"#privateRepo/some-library-framework": "4.2.1"
}
... and I can't get to the server, but I can get the files that are needed and would've been installed from another node_modules folder that lives on a machine that can access the repo.
I tried taking the files from the packages in @privateRepo and using npm cache add D:\path\to\lib\with\packageDotJsonInside for each of them, but still got...
Not Found - GET https://registry.npmjs.org/@privateRepo%2fsome-library-framework - Not found
... when I tried to npm i the rest.
I think that means that I need to set something up in .npmrc like is described here...
registry=https://registry.npmjs.org/
@test-scope:registry=http://nexus:8081/nexus/content/repositories/npm-test/
//nexus:8081/nexus/content/repositories/npm-test/:username=admin
//nexus:8081/nexus/content/repositories/npm-test/:_password=YWRtaW4xMjM=
email=…
... where you'd normally set up auth, but where you're also setting up the URL for a scoped package. I think I want to set up @privateRepo:registry=http://localhost/something/something here.
But I think that also implies I would at least need to create a local webserver (or npm repo?) to answer requests (and then maybe I'm looking for something like verdaccio?).
So, simplest case, is there a way to force the app to use the cached version or is there more I need to shim? If not, what's the easiest way to create a local repo to serve those packages in the place of the private repo?
Seeing nothing better, the easiest answer does seem to be setting up a local npm repo. You can then set up your .npmrc to point to localhost for the scoped private registry instead of the "real" version behind the VPN.
And as it turns out, Verdaccio does exactly this: you could also use it to host a "real" private repo, including behind your firewall, but installing it on your dev box allows you to serve your npm packages to any new codebase locally.
This is described in some detail by this video that's linked on Verdaccio's docs site. Here's the quick version:
Install verdaccio: npm install --global verdaccio
Run verdaccio: verdaccio
You can then check out its interface at http://localhost:4873/ (or elsewhere if you changed defaults)
Create a user: npm adduser --registry http://localhost:4873
Login: npm login --registry http://localhost:4873
You can now log in as that user on the web UI too, if you want.
Navigate to your packages' files and go into each package-specific folder.
That is, if you pull all of your packages from another project's node_modules, you need to go into each folder where the individual package's package.json file lives to publish it.
Publish the package: npm publish --registry http://localhost:4873
You can double-check that it "took" by refreshing the web UI.
Repeat for each additional package.
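If there are many packages to move over, a small shell loop over the copied folders can handle the repetition (the path and scope below are placeholders):
for pkg in /path/from/other-machine/node_modules/@privateRepo/*/ ; do
  (cd "$pkg" && npm publish --registry http://localhost:4873)
done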
That's it! You now have an npm repo for those packages, which removes the VPN requirement for running npm i. Just schlep new versions of the packages over to your local registry and publish them as appropriate.
You will need to set up a scoped entry for this registry in your .npmrc, but you were already doing that for your repo behind the firewall, so no big deal, right?
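That entry is just the scope-to-registry mapping (4873 being Verdaccio's default port; the auth token from npm login already lives in your user-level .npmrc):
@privateRepo:registry=http://localhost:4873/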
Ready to move the checkmark to a better answer, but this seems like it oughta work.

Why does my project require multiple npmrc registries when the artifact already includes them as upstream sources?

I have two mono repositories that use Node/NPM/Lerna to manage and distribute multiple packages.
Project X includes an .npmrc file with a single registry. This registry is a private Azure Artifacts feed that includes three upstream sources, named A, B, and C:
A - Public NPMJS
B - Private Package
C - Private Package
Project Y requires an .npmrc file with two namespaced registries.
The first is the same one that Project X uses.
The second *seems* to be required and registers the Azure feed for B directly.
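Concretely, Project Y's .npmrc looks roughly like this (the organization, feed and scope names below are made up):
registry=https://pkgs.dev.azure.com/<org>/_packaging/MainFeedWithUpstreams/npm/registry/
@scope-b:registry=https://pkgs.dev.azure.com/<org>/_packaging/FeedB/npm/registry/
always-auth=true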
My question is: if my .npmrc file is registering a feed that contains three upstream sources, why do I additionally have to register one of those upstream feeds (B) in my .npmrc file?
It was my understanding from the documentation that upstream sources are an alternative to registering multiple namespaced registries in your .npmrc file.
I'm clearly missing something. Please assist. Thanks
You are correct that Project X is using the set-up that we recommend, which is a single Azure Artifacts feed in your .npmrc's registry= line that upstreams to any other feeds you need.
However, Project Y may have chosen to use scopes so that only a limited set of packages comes from Azure Artifacts while most packages are pulled directly from npmjs.com. We generally don't recommend this, but it's a valid way to work. Note that in this configuration, Project Y does not get the benefits of upstream sources (like a saved copy of anything you use from npmjs.com, in case it's later deleted).
If you want to migrate to the recommended configuration, try ensuring that Project Y's .npmrc has a single registry= line:
registry=https://dev.azure.com/.../ProjectXFeedWithUpstreams/...
and see if it still builds (delete or rename node_modules when you run npm install).
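That is, something along the lines of:
rm -rf node_modules   # or rename it if you want a fallback
npm install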

Overriding default npm registry for single packages served from local folder

I'm adding several dependencies to a project that currently uses the default npm registry. Obviously the dependencies cannot be resolved since the packages are not found there.
I'm wondering if I can provide the packages via a folder or zip file instead and tell npm to bypass the registry for certain dependencies and take the packages directly from the folder. I want to avoid setting up my own registry.
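For instance (the package name and path here are made up), what I have in mind is being able to declare something like:
"dependencies": {
  "some-internal-lib": "file:./vendor/some-internal-lib"
}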
Sinopia seems to be a lightweight solution for the problem. It is a private repository server that lets you use private packages, cache the npmjs.org registry, and override public packages.
Disclaimer: I haven't tried it because my problem was solved by another private registry I didn't know at the time of writing the question. However, maybe it helps someone else.

What's the best way to deploy a npmjs Private Module?

npmjs recently released their private npm modules feature which looks pretty cool.
To publish or fetch a private module from npm you need an authenticated npm client: running npm login updates or creates the .npmrc file with the access token.
What is the best practice to deploy or CI an application that uses a private module?
The best way to do this is to include the .npmrc file but replace the auth token with an environment variable. Step 4 of this tutorial shows you how to do this and should work for any CI/deployment scenario.
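In practice the checked-in .npmrc then contains just the token reference, which npm expands from the environment at install time (the variable name is up to you):
//registry.npmjs.org/:_authToken=${NPM_TOKEN}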
If you are using Heroku, then you can follow Step 5 to set the environment variable. If not, just figure out how you configure env variables for the service you're using.