Working with multiple npm registries and manually editing .npmrc files can be a lot of additional work, so is there a better option for saving and switching between multiple .npmrc files, or for managing the registry quickly?
I would be interested in exploring any option out there, for any operating system.
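For context, npm itself already supports quick registry switches, either per user or per scope; here is a minimal sketch, with the registry URL, scope, and package name as placeholders:
npm config set registry https://registry.example.com/   # rewrites ~/.npmrc
npm config get registry                                 # check the active registry
npm install --registry=https://registry.example.com/ some-package   # one-off override
# a scoped registry can also live in a per-project .npmrc:
#   @mycompany:registry=https://registry.example.com/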
I am getting this when I push my code through GitHub Actions or build a Docker image.
shell: /usr/bin/bash -e {0}
npm WARN read-shrinkwrap This version of npm is compatible with lockfileVersion@1, but package-lock.json was generated for lockfileVersion@2. I'll try to do my best with it!
I tried to implement this link; it works, but after a few commits I get the same error and have to repeat the same procedure again and again.
Is there a fix for that?
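For reference, the warning itself names the mismatch: npm 6 writes lockfileVersion 1, while npm 7+ writes lockfileVersion 2, so the CI is running an older npm than the one that generated package-lock.json. A hedged sketch of one way to align them (the Node version below is just an example):
# in the GitHub Actions workflow, pin a Node release whose npm understands lockfileVersion 2:
#   - uses: actions/setup-node@v4
#     with:
#       node-version: 16
# or regenerate the lockfile with the same npm major version the CI uses:
npm install --package-lock-only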
Look in your .gitignore for these lines:
package-lock.json
node_modules/
If they are not there, add them.
After that, look in your GitHub repository and delete the package-lock.json file and the node_modules directory (if any).
Important Edit:
My bad, Kevin Martin is right: the official documentation tells us to add it to the repository for CI/CD.
This file is intended to be committed into source repositories, and serves various purposes:
Describe a single representation of a dependency tree such that teammates, deployments, and continuous integration are guaranteed to install exactly the same dependencies.
Provide a facility for users to "time-travel" to previous states of node_modules without having to commit the directory itself.
To facilitate greater visibility of tree changes through readable source control diffs.
And optimize the installation process by allowing npm to skip repeated metadata resolutions for previously-installed packages.
But in my case (Azure DevOps) I had a lot of trouble with it.
I've created a web app template that I use frequently for many different projects.
I would like to create an NPM package for it so that it's easier to install for new projects, separate the template from the project files, separate the template dependencies from the project dependencies, and allow easier updating of the template across all projects.
The issue I have is that I need some files/folders to be installed in the root directory (i.e. where package.json is saved). Most can go in the node_modules folder; however, I have some files that must be placed in the root directory.
For example, the template uses Next.js with a custom _app.js file. This must be in the root directory in a folder named pages. I also have various config files that must be in the root directory.
Can this be done with NPM, or does everything need to be installed in the node_modules folder? I'm having trouble finding anything on SO or Google that answers this, so if you happen to know a guide online on how to do this or can outline things I should search for it would be much appreciated.
With pure npm, everything has to go to the node_modules folder, so you can't solve your issue this way.
Maybe going with a templating tool such as grunt init or yeoman could be a solution here, although – unfortunately – you'll then lose some of the benefits of being able to install a package via npm.
Another option might be to use GitHub template repositories, which have just been introduced recently.
Last but not least, one option might be to ship the files' contents in the npm package, but create pages/_app.js manually and have it simply require the contents from the npm module. This at least keeps the content portable, but of course it still requires you to set up the file and folder structure on your own.
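As a rough sketch of that last idea (the package name my-template and the module path inside it are hypothetical), the manual step in each consuming project could be as small as:
mkdir -p pages
cat > pages/_app.js <<'EOF'
export { default } from 'my-template/app';
EOF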
Sorry that I don't have a better answer, but I hope it helps anyway.
PS: One "solution" might also be to use the postinstall step in an npm module's package.json file to create folder structure, copy files to where they should be and so on, but at least to me this feels more like a clumsy workaround than like a real solution.
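For completeness, a hedged sketch of that postinstall workaround, with a made-up package name and file layout: the package's package.json would declare "postinstall": "sh copy-template.sh", and since npm runs the script with the package's own folder inside node_modules as the working directory, the script could look like this:
#!/bin/sh
# copy-template.sh - runs from node_modules/my-template after install;
# copies the template files two levels up into the consuming project's root
cp -r templates/pages ../../pages
cp templates/next.config.js ../../next.config.js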
For large web apps, npm install and yarn install take a lot of time, mostly in a step called Linking Dependencies. What is happening here? Is it fetching the dependencies of the dependencies? Or something completely different? Which files are created during this step?
When you call yarn install, the following things happen in order:
Resolution: Yarn starts resolving dependencies by making requests to the registry and recursively looking up each dependency.
Downloading/Fetching: Next, Yarn looks in a global cache directory to see if the package needed has already been downloaded. If it hasn't, Yarn fetches the tarball for the package and places it in the global cache so it can work offline and won't need to download dependencies more than once. Dependencies can also be placed in source control as tarballs for full offline installs (see the sketch after this list).
Linking: Finally, Yarn links everything together by copying all the files needed from the global cache into the local node_modules directory after identifying what's already there and what's not there.
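To poke at the cache and the offline tarballs mentioned in step 2, here is a small sketch using Yarn 1 (classic) commands; the mirror directory name is only an example:
yarn cache dir    # print where the global cache lives
# keep a tarball mirror inside the repo for fully offline installs (.yarnrc):
echo 'yarn-offline-mirror "./npm-packages-offline-cache"' >> .yarnrc
yarn install               # populates the mirror on the next install
yarn install --offline     # later installs can skip the network entirely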
yarn install does take a lot of time, mostly in a step called Linking Dependencies
You will notice that Step 3: Linking takes more time than Step 1: Resolution and Step 2: Fetching, where the actual download happens. By this step we already have everything we need downloaded, so why does it take so long? Did we miss anything?
Yes: the copy into the local project's node_modules folder...! The reason this is slow is that the copy is not equivalent to copying one large 4.7 GB ISO file. Instead, it's a huge number of super-small files (don't take that lightly, it can be 15k+ files :P), which take a lot of time to copy. (It is also worth noting that when you download the packages, you download one large tar file per package, whose contents then have to be extracted into the cache, which also takes time.)
It is slower due to:
Anti-virus: Your antivirus sits in the middle and does a quick inspection of every single file yarn tries to copy (in addition to yarn checking whether it already exists), cutting its speed considerably. If you are on Windows, try adding your project's parent folder as an exception to Windows Defender.
Storage medium's transfer rate: SSDs can improve this speed hugely. (Sorry, SSHDs and FireCuda hybrid drives will not help either: this copy is a one-time operation, so their caching never kicks in.)
But is this efficient? Can I have it taken from the global node_modules (after creating one)?
Nope to both questions. Because of the way Node works, each package finds its dependencies only relative to its own location. Also, each project may want to use a different version of the same package, to ensure it keeps working and is not broken by package updates.
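You can see that per-location lookup directly: Node resolves a non-relative require by walking up the directory tree and checking each node_modules along the way (foo here is just a placeholder module name):
node -e "console.log(require.resolve.paths('foo'))"
# prints the lookup chain, roughly:
#   <cwd>/node_modules, <parent>/node_modules, ..., up to the filesystem root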
Ideally, the project folder should be lean. An efficient way of doing this would be to have a global node_modules folder: any requested package is downloaded if not already present AND is used from that location. This is actually how Ruby does it: in the global gems folder (Ruby's equivalent of node_modules) you will find different versions of the same package sitting side by side for use in different projects.
But keep in mind that this would reduce project portability. It's a trade-off that any package manager (be it RubyGems or npm) has to make. I can copy a Node project folder (which may in fact take hours, because I would be copying the local node_modules folder as well) and expect it to work as long as I have just that project folder. Copying a Ruby project, by contrast, takes only seconds to a few minutes, since there is no local packages (or gems, as they call them) folder, but running the project on a different system requires those gems to be present in that system's global gems folder.
The documentation for yarn install can be found here.
You can use the command
yarn install --verbose
which shows additional logs while installing dependencies.
The output will show what the yarn/npm install is doing.
It's good for debugging in case the process is failing or taking a long time.
The linking phase works in essentially 3 big steps:
Find every file that needs to be in node_modules
Check this list against what is already there, and find what needs to be copied from the cache to node_modules
Do the copy
Maybe this issue on Github will help you out.
https://github.com/yarnpkg/yarn/issues/1496
I'm adding several dependencies to a project that currently uses the default npm registry. Obviously the dependencies cannot be resolved since the packages are not found there.
I'm wondering if I can provide the packages via a folder or zip file instead and tell npm to bypass the registry for certain dependencies and take the packages directly from the folder. I want to avoid setting up my own registry.
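As a side note, npm can install straight from a local folder or a packed tarball without touching any registry; a minimal sketch, with placeholder paths:
npm install ./vendor/some-package             # a folder containing a package.json
npm install ./vendor/some-package-1.0.0.tgz   # a tarball produced by `npm pack`
# or, equivalently, as a dependency in package.json:
#   "dependencies": { "some-package": "file:./vendor/some-package" }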
Sinopia seems to be a lightweight solution for the problem. It is a private repository server that lets you use private packages, cache the npmjs.org registry, and override public packages.
Disclaimer: I haven't tried it because my problem was solved by another private registry I didn't know at the time of writing the question. However, maybe it helps someone else.
I'm using Dropbox on a daily basis and keep my programming projects in there.
It works great, but once I had many projects, my node_modules directories started to put a strain on Dropbox: its syncing process becomes slow and eats up CPU time.
Is there any way to do a selective sync based on a directory name or a mask pattern?
It would be nice to have a .gitignore equivalent to configure this.
Is there any 3rd-party software for that task?
There is a way to selectively sync but I don't believe it has any advanced rules like you're describing:
https://www.dropbox.com/help/175/en
Two ways to resolve this problem:
You can put node_modules higher than the project directory in the file tree. For example:
Project dir: c:/prj/myProjWrapper/myProj
Put package.json in c:/prj/myProjWrapper and run npm install there; Node.js will search upwards recursively and find it.
Windows and Linux only, not for Mac! In the project dir, create a .ds_store folder (it is not synced by Dropbox). Put package.json into it and run npm install. You must set NODE_PATH=./.ds_store/node_modules;. when starting Node.js.
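A rough sketch of that second option on Linux (file names are illustrative; on Windows you would set NODE_PATH with set instead):
mkdir .ds_store                  # per the answer above, Dropbox skips this folder
cp package.json .ds_store/
(cd .ds_store && npm install)    # node_modules now lives outside Dropbox's sync scope
NODE_PATH=./.ds_store/node_modules node index.js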