Azure DevOps Artifacts: disable saving packages from upstream sources - npm

I have a feed for npm packages with npmjs set as an upstream source (the default). When you install your uploaded custom package along with its other dependencies, those dependency packages get saved automatically in your feed to speed up future installs. However, I do not want that: I want my feed to host only my own packages and to download from the upstream source every time an install is made. Is there a way to do this?

What I did instead was put all our packages under a scope and modify the .npmrc file to use the Azure feed only for that scope, so the rest of the packages still get downloaded from registry.npmjs.org.
Here's what the .npmrc file would look like:
registry=https://registry.npmjs.org/
@customScope:registry=https://ourregistry.org/ourfeed
always-auth=true
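With that mapping in place, only installs under the scope hit the Azure feed; a hedged illustration (the package names below are made up):
npm install @customScope/our-internal-package   # resolved from the Azure feed
npm install lodash                              # resolved from registry.npmjs.org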

Is there a way to do this?
The answer is yes.
If you are creating a new feed, you can uncheck the Upstream sources option when creating it.
If the feed has already been created, you can delete the upstream sources: go to Settings -> Upstream sources.
Hope this helps.

Related

Permanent fix for keeping npm-shrinkwrap at lockfileVersion@1 when npm automatically makes it lockfileVersion@2?

I am getting this when my code runs through GitHub Actions or when building a Docker image.
shell: /usr/bin/bash -e {0}
npm WARN read-shrinkwrap This version of npm is compatible with lockfileVersion@1, but package-lock.json was generated for lockfileVersion@2. I'll try to do my best with it!
I tried to implement this link; it works, but after a few commits I get the same error again and have to repeat the same procedure over and over.
Any fix for that?
Check whether your .gitignore has the lines:
package-lock.json
node_modules/
If not, then add them.
After that, look in your GitHub repository and delete the package-lock.json file and the node_modules directory (if any); see the sketch below.
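A minimal sketch of those steps, assuming package-lock.json and node_modules were previously committed (the commands are illustrative, not from the original answer):
git rm -r --cached node_modules package-lock.json
git commit -m "Stop tracking node_modules and package-lock.json"
git push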
Important edit:
My bad, Kevin Martin is right: the official documentation tells us to add it to the repository for CI/CD.
This file is intended to be committed into source repositories, and serves various purposes:
- Describe a single representation of a dependency tree such that teammates, deployments, and continuous integration are guaranteed to install exactly the same dependencies.
- Provide a facility for users to "time-travel" to previous states of node_modules without having to commit the directory itself.
- To facilitate greater visibility of tree changes through readable source control diffs.
- And optimize the installation process by allowing npm to skip repeated metadata resolutions for previously-installed packages.
But in my case (Azure DevOps) I had a lot of trouble with it.
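As a hedged aside (not from the original answers): the warning comes from an older npm (v6) reading a lockfileVersion@2 file, so a longer-term fix is usually to align the npm versions between your machine and the build. Two illustrative options:
# regenerate a lockfileVersion@1 lockfile with npm 6, without reinstalling node_modules
npx npm@6 install --package-lock-only
# or upgrade npm inside the CI/Docker image so it understands lockfileVersion@2
npm install -g npm@7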

Switching from NPM to GitHub Packages

I have an npm package with a small user base. Yesterday I created a new version and wanted to release it. I thought I might as well make use of the new GitHub Packages, so I set everything up as GitHub suggested and released!
Now the problem is that the old npm page is still at version 2.0.2, which everyone currently uses as their dependency, while the new GitHub package is at 2.0.4. Is there a way to 'synchronize' these two? Of course, GitHub Packages uses the <USER>/<PACKAGE> labeling while npm just uses <NAME>.
Is the only option to publish on both GitHub Packages and npm and just try to move users away from the npm page?
If you're publishing a public package, you're better off just publishing it on npm, as that is what most developers are used to.
I use GitHub Packages at work, and the only advantage is that it is effectively free for hosting internal packages, as we are already paying for GitHub anyway. If it weren't for the zero price we wouldn't be using it.
If you really want to force all your users to migrate to GitHub Packages, and to set up npm to work with it, you could mark your old version as deprecated on npm and use that to point people to the new version.
https://docs.npmjs.com/cli/v6/commands/npm-deprecate
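A minimal sketch of that approach (the package name, version range, and URL are illustrative):
npm deprecate my-package@"<=2.0.2" "This package has moved to GitHub Packages: https://github.com/<USER>/<PACKAGE>"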
Here is another solution, but there is a catch.
Change your registry.npmjs.org package content to
index.js
export * from '@schotsl/my-package';
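For that re-export to resolve, the shim published to registry.npmjs.org presumably also needs to declare the scoped package as a dependency in its own package.json; a hedged sketch (version numbers are illustrative):
{
  "name": "my-package",
  "version": "2.0.5",
  "main": "index.js",
  "dependencies": {
    "@schotsl/my-package": "2.0.4"
  }
}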
Now your registry.npmjs.org package is (almost) pointing to your npm.pkg.github.com package.
Only almost, because any development directory for a project downstream of registry.npmjs.org/my-package must configure the scope-to-server mapping for @schotsl/my-package to npm.pkg.github.com in a package manager config file.
In the case of package managers 'npm' and 'yarn' (v1) that can be done in
an .npmrc file at the same level as package.json.
The required .npmrc content is
@schotsl:registry=https://npm.pkg.github.com
# GitHub PAT, read:packages authorization only is OK
//npm.pkg.github.com/:_authToken="ghp_XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
The first line is the scope-to-server mapping.
The second line is a GitHub personal access token (PAT) with at least the read:packages permission. It is actually pretty liberal: a PAT with read:packages issued from any GitHub account will allow read access to every GitHub account's packages.
For 'yarn' v2, the .npmrc file does not work; instead a couple of keys need to be set in .yarnrc.yml, as sketched below.
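A hedged sketch of the equivalent .yarnrc.yml keys (scope and token are illustrative):
npmScopes:
  schotsl:
    npmRegistryServer: "https://npm.pkg.github.com"
    npmAuthToken: "ghp_XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"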
Unfortunately there is no way to set the scope-to-server mapping and the token inside the registry.npmjs.org/my-package package itself.
Putting the .npmrc file in there doesn't work; it is ignored. And that wouldn't be a good solution anyway, because not all package managers read .npmrc files.
That is the 'catch' - using npm.pkg.github.com packages requires package manager specific config settings to be made by every downstream developer.
In addition, what if two different upstream packages have colliding scope names, each mapping to a different server? The current config methodology fails in that case.
Feature proposal (not current behavior)
Ideally, there would be a common interface agreed upon by all package managers inside package.json - and the scope-to-server mapping would be defined in the package that directly references the scope. For example, in the package.json of my-package on registry.npmjs.org
{
  "dependencies": {
    "@schotsl/my-package": "1.0.0"
  },
  "registries": {
    "@schotsl/my-package": "https://npm.pkg.github.com"
  },
  "auths": {
    "https://npm.pkg.github.com": "ghp_XXXXXXXXXXXXXXXX"
  }
}
Then downstream users would not need to configure each scope, and predictable (and risky) problems with scope-name or package-name collisions would not occur.
But that is not the way it is. Therefore GitHub Packages (npm.pkg.github.com) doesn't really seem to be a feasible way to provide public packages that may become dependencies of other public packages. It's no problem for private packages, though.

How can I shim access to a private repository with scoped npm packages on a machine that can't access it?

Let's say I have a private, scoped NPM repository that lives behind a corporate firewall. I'd like to set my project up on another computer that will not connect to the VPN, so it will not be able to access that private repo.
How can I set up my project to easily import those dependencies from local folders and/or my local npm cache and skip the private repo?
That is, if my package.json file has...
"dependencies": {
"#privateRepo/some-library-framework": "4.2.1"
}
... and I can't get to the server, but I can get the files that are needed and would've been installed from another node_modules folder that lives on a machine that can access the repo.
I tried taking the files from the packages in @privateRepo and using npm cache add D:\path\to\lib\with\packageDotJsonInside for each of them, but still got...
Not Found - GET https://registry.npmjs.org/@privateRepo%2fsome-library-framework - Not found
... when I tried to npm i the rest.
I think that means that I need to set something up in .npmrc like is described here...
registry=https://registry.npmjs.org/
@test-scope:registry=http://nexus:8081/nexus/content/repositories/npm-test/
//nexus:8081/nexus/content/repositories/npm-test/:username=admin
//nexus:8081/nexus/content/repositories/npm-test/:_password=YWRtaW4xMjM=
email=…
... where you'd normally set up auth, but where you're also setting up the URL for a scoped package. I think I want to set up @privateRepo:registry=http://localhost/something/something here.
But I think that also implies I would at least need to create a local webserver (or npm repo?) to answer requests (and then maybe I'm looking for something like verdaccio?).
So, simplest case, is there a way to force the app to use the cached version or is there more I need to shim? If not, what's the easiest way to create a local repo to serve those packages in the place of the private repo?
Seeing nothing better, the easiest answer does seem to be setting up a local npm repo. You can then set up your .npmrc to point to localhost for the scoped private registry instead of the "real" version behind a VPN.
And as it turns out, Verdaccio actually does exactly this -- you could also use it to host a "real" private repo, including behind your firewall, but installing on your dev box will allow you to provide your npm packages to any new codebase locally.
This is described in some detail by this video that's linked on Verdaccio's docs site. Here's the quick version:
Install verdaccio: npm install --global verdaccio
Run verdaccio: verdaccio
You can then check out its interface at http://localhost:4873/ (or elsewhere if you changed defaults)
Create a user: npm adduser --registry http://localhost:4873
Login: npm login --registry http://localhost:4873
You can now log in as that user on the web UI too, if you want.
Navigate to your packages' files, going into each package-specific folder.
That is, if you pull all of your packages from another project's node_modules, you need to go into each folder where the individual package's package.json file lives to publish it.
Publish the package: npm publish --registry http://localhost:4873
You can double-check that it "took" by refreshing the web UI.
Repeat for each additional package.
That's it! You now have an npm repo for the packages, which you can use to remove the VPN requirement for running npm i. Just schlep new versions of the packages over to your local npm and publish them as appropriate.
You will need to set up a scoped entry for this registry in your .npmrc (see the sketch below), but you were already doing that for your repo behind the firewall, so no big deal, right?
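A minimal sketch of that scoped entry, assuming Verdaccio's default port and the scope from the question:
@privateRepo:registry=http://localhost:4873/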
Ready to move the checkmark to a better answer, but this seems like it oughta work.

Why does my project require multiple npmrc registries when the artifact already includes them as upstream sources?

I have two mono repositories that use Node/npm/Lerna to manage and distribute multiple packages.
Project X includes a .npmrc file with a single registry. This registry is for a private Azure feed that includes three upstream sources, named A, B, and C:
A - Public NPMJS
B - Private Package
C - Private Package
Project Y requires a .npmrc file with two namespaced (scoped) registries.
The first is the same one that Project X uses.
The second *seems* to be required and registers the Azure feed for B under its scope.
My question is: if my .npmrc file is registering a feed that contains three upstream sources, why do I additionally have to register one of those upstream feeds (B) in my .npmrc file?
It was my understanding from the documentation that upstream feeds are an alternative approach to registering multiple namespaced registries in your .npmrc file.
I'm clearly missing something. Please assist. Thanks.
You are correct that Project X is using the set-up that we recommend, which is a single Azure Artifacts feed in your .npmrc's registry= line that upstreams to any other feeds you need.
However, Project Y may have chosen to use scopes to instead only use limited packages from Azure Artifacts while pulling most packages directly from npmjs.com. We generally don't recommend this, but it's a valid way to work. Note that in this configuration, Project Y does not get to take advantage of the benefits of upstream sources (like a saved copy of anything you use from npmjs.com, in case it's later deleted).
If you want to migrate to the recommended configuration, try ensuring that Project Y's .npmrc has a single registry= line:
registry=https://dev.azure.com/.../ProjectXFeedWithUpstreams/...
and see if it still builds (delete or rename node_modules when you run npm install).
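A hedged sketch of that single-registry .npmrc (the organization and feed names are placeholders, not taken from the question):
registry=https://pkgs.dev.azure.com/<org>/_packaging/<FeedWithUpstreams>/npm/registry/
always-auth=true
Then remove or rename node_modules and run npm install again to confirm everything resolves through the one feed.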

How to modify existing NPM package with my changes?

I installed a package from npm, but I needed to customize it. The problem is that, when the team installs or updates npm packages, the customization is overwritten.
I would like to know if there is any way to preserve this customization, or if I need to upload another package with the customization...
As semanser says, you need to fork the project, but the right way to include this in your package.json file is:
"package-name": "<your user name>/<your package name>#<your branch>"
You can find additional information here.
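For example, a hedged illustration of that format with placeholder names:
"dependencies": {
  "some-package": "your-github-username/some-package#custom-branch"
}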
Create a GitHub fork of the package that you need to customize.
Make changes that you want in your fork (don't forget to commit and push them).
Add the link to the fork to your package.json file in the following format:
"dependencies": {
"bar": "git://github.com/foo/bar.git"
}
(Optional) Create a pull request and wait until your changes are approved in the original repo.