Should I git-push the dist folder of an npm package?

The JS package I've prepared (using tsdx) is being used across multiple company systems. It lives in our GitLab, so the package.json entry looks like this:
"some-package": "git+ssh://git#gitlab.company.com:some-place/some-package.git#some-branch"
Now, every time a user runs npm install it takes a long time for the process to complete. Is it because I push the src/ folder instead of dist/ (which I do)? Does the package build itself every time it gets downloaded by npm install? Should I push the dist/ folder to shorten the time npm install needs to complete?

npm install downloads all of your dependencies and places them in the node_modules folder.
The dist folder is typically for your built application, so committing and pushing the dist folder wouldn't reduce the install time.
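If you want to check whether the time is going into the download or into the package building itself during install (for example via a prepare or postinstall script), one rough diagnostic, assuming nothing beyond standard npm flags, is to compare a clean install with and without lifecycle scripts:
rm -rf node_modules
time npm install
rm -rf node_modules
time npm install --ignore-scripts   # if this is much faster, the time is being spent in build scripts, not the download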

Related

Is there a way to only run npm scripts for installed packages (opposite of --ignore-scripts)

I am in a situation where I need to ship node_modules with the rest of my code because the destination machines do not have access to our private network (and our private npm repository).
My problem is that I want to execute everything that happens after npm downloads all the files so that individual packages can build themselves correctly for the target machine. Is there a way to accomplish this? Here are a couple other ways to phrase this question:
How can I run npm install, but skip the download step?
How can I run postinstall for installed node_modules only?
I finally figured it out. There were a couple of important steps to make this happen:
When we get ready to package our code for distribution, we download all of the npm dependencies with the --ignore-scripts and --no-bin-links options. This prevents any packages from building/compiling or linking any bin files; it effectively just downloads the node_modules.
npm install --omit=dev --ignore-scripts --no-bin-links
We then distribute our code to the target machine and run the following command so that any compilations and bin links happen on the target machine:
npm rebuild
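Putting the two steps together, a minimal sketch of the whole flow could look like this (the archive name and the use of tar are just illustrative):
# on the build machine, which has access to the private registry
npm install --omit=dev --ignore-scripts --no-bin-links   # download node_modules only, no builds or bin links
tar czf app-bundle.tar.gz .                              # ship the code together with node_modules
# on the target machine, which has no registry access
tar xzf app-bundle.tar.gz
npm rebuild                                              # compile native addons and create bin links locally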

Why does NPM create unwanted folder when installing packages?

Every time I install an npm package, it creates a folder called 'tmpnodejsnpm-cache'. See the picture below.
Why does this happen?
[Screenshot: the unwanted generated folder]

Install other package.json dependencies

Simple question: is it possible, in a package.json, to reference another package.json and install its dependencies?
Thank you.
Yes, this is possible, and this is automatically done by npm install.
If you have pkg-a that depends on pkg-b, including pkg-a in your dependencies will install both pkg-a and pkg-b when you run npm install. That is because dependencies are really references to the package.json files of other packages. When you run npm install, npm builds a dependency tree of all the packages directly or indirectly required by your current project, installs all of them into the node_modules directory, and records them in package-lock.json.
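As a concrete illustration (the package names are hypothetical), you only list your direct dependency and npm pulls in the rest by reading that package's own package.json:
npm install pkg-a   # also resolves and installs pkg-b, because pkg-a's package.json lists it
npm ls pkg-b        # pkg-b appears in the tree even though it is not in your own package.json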
Good question! But this is not possible, as you cannot internally reference one JSON document from another (JSON is just a document format; it has no ability to process logic, import files, etc.). npm is designed to run against a single package.json file, so your best bet would be to put all your dependencies in a single package.json file, or, if for some reason you need your dependencies to be separate, split your project into two directories with two separate package.json files and two npm installs. You could then run your two Node projects separately and connect them over HTTP if you wish.
The only way you could come close to doing this would be to write an npm script in the package.json that cds to another directory containing a package.json and runs npm install; this would, however, only install the dependencies into that second directory's node_modules/ folder (a rough sketch follows).
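Assuming the second project sits in a sibling directory called other-app (the directory and script names are just for illustration):
# in the first project's package.json, add a script such as:
#   "scripts": { "install:other": "cd ../other-app && npm install" }
npm run install:other   # installs dependencies into ../other-app/node_modules only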

Angular 5: Is it possible to link a local npm module?

I have a custom node module, that I can't publish on NPM.
I'd like to use it as any module I have on NPM.
Is it possible without the awful workaround of copying the folder into node_modules?
The answer is: YES, IT'S POSSIBLE.
Let's assume your module has a dist folder with the built source
(for example, I run gulp on my src folder and produce the dist folder).
You simply have to run npm pack ./dist in your library.
This will produce a .tgz archive of your library named your-library-version.tgz.
Then you can install your module in your project by simply running
npm i path-to/your-library-version.tgz
And you're done.
Let's say my library folder is C:\ngx-mat-lib,
so my tgz will be in this folder, since the dist folder should be a child of ngx-mat-lib.
In my project I'll run
npm i C:/ngx-mat-lib/ngx-mat-lib-0.0.1.tgz
Note: use forward slashes to avoid having to escape (double) backslashes.
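Put together, the whole flow looks roughly like this (the build command and the consuming project's path are just examples):
cd C:/ngx-mat-lib            # the library folder
gulp                         # or however you build src/ into dist/
npm pack ./dist              # creates ngx-mat-lib-0.0.1.tgz in the current folder
cd C:/path/to/your-project
npm i C:/ngx-mat-lib/ngx-mat-lib-0.0.1.tgz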

Speeding up NPM package install

When I deploy my app to AWS, it's copied into a new directory, so npm will install all the same packages during each deploy, which can take a lot of time. Most of these packages haven't changed between builds (if at all), so doing a full npm install every time seems like a waste.
My app server runs a bunch of different Node apps, so installing globally isn't an option. Instead, I'd like the app to store its node packages in a location that isn't wiped out during deployment, but still have the option to update packages as necessary during npm install.
Does NPM have a concept of an app-specific module directory that isn't located in a subfolder of an app? That way I can delete the app folder, and not have to reinstall the same packages over and over again.
I could achieve this by using symlinks or by migrating the current node_modules directory.
If you lock down your dependency versions, npm will most likely serve the packages from its cache, so the installation shouldn't take much longer.
If you prefer not to do this, you can install dependencies globally and link them with the npm link command (which basically creates the symlink for you!). Then it'll be up to you to update the globally installed packages regularly.
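A minimal sketch of the npm link approach (the package name and paths are hypothetical):
# in the shared package's own folder: register it globally once
cd /srv/shared/some-shared-package
npm link
# in each app that needs it: symlink it into the app's node_modules
cd /srv/apps/my-app
npm link some-shared-package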