yarn add <path-to-local-package> --offline does not install dependencies - npm

I am working in a very restricted environment, which means I have no internet access at all. I set up everything needed to import an Angular template project and install all the packages from an offline cache. I followed these steps:
On the online machine
configure yarn-offline-mirror to point at directory X, with yarn-offline-mirror-pruning set to false (a sample configuration is sketched below, after these steps)
create a new angular app with ng new foo --skipInstall=true
install the packages with yarn install in order to generate a yarn.lock and store all the .tgz packages in directory X
delete node_modules
Whenever I need more packages, I use npm-package-downloader with the -d argument to download all the dependencies as .tgz files as well, and copy them to directory X
On my offline machine
I import the tgz-files from directory X and put them in the same location
I configure yarn-offline-mirror like on my online machine
I import the Angular template and install the needed packages with yarn cache clean followed by yarn --offline
As long as all the .tgz packages are on my offline machine, this works like a charm (thanks to yarn.lock).
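For reference, the mirror configuration used in these steps roughly boils down to two .yarnrc settings. A minimal sketch, assuming the mirror folder is a relative path standing in for directory X, would be:
# run on both machines; writes the two settings into .yarnrc
yarn config set yarn-offline-mirror ./npm-packages-offline-cache
# keep old tarballs in the mirror instead of pruning them
yarn config set yarn-offline-mirror-pruning false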
My Problem
I want to add further packages on my offline machine. The packages are present in the offline mirror, because I copied them onto the machine. When I run
yarn cache clean
yarn add <absolute-path-to-tgz-package> --offline
two possible scenarios arise:
The package has no dependencies, in which case it works as intended
The package has dependencies: in this case, the following error is thrown for each "missing" dependency (package-name and package-version are placeholders):
Couldn't find any versions for "package-name" that matches
"package-version" in our cache. This is usually caused by a missing
entry in the lockfile, running Yarn without the --offline flag may
help fix the issue.
As far as I understand, Yarn somehow needs to know which dependencies a package requires; and it clearly does know, otherwise it couldn't throw this specific error. So my question is: how can I force Yarn to also look for those dependencies in directory X? The packages are there, since I download them together with their dependencies. And obviously I am not supposed to edit yarn.lock manually. The docs for yarn add do not list such an option (as a matter of fact, they do not even list the --offline flag).

Related

How to use Font Awesome after it has been installed with Yarn

I am using VS 2019 for .NET Core 3.1 development and I installed Font Awesome with Yarn:
yarn add @fortawesome/fontawesome-free
However, when I try to reference it in my HEAD section like this:
<script defer src="~/lib/@fortawesome/fontawesome-free/js/all.js"></script>
I get the following error:
The name 'fortawesome' does not exist in the current context
Package managers like yarn, npm, etc. just add the packages to your project. They generally aren't ready to deploy directly at that point, but rather require a build pipeline to create the actual JS/CSS resources. The @fortawesome/fontawesome repo is an exception in that 1) it's not actually a package and 2) the files are already "built". Even then, though, they still won't be in the right location.
I'm not overly familiar with yarn, but npm, for example, puts everything in a node_modules directory. That directory is not served by default, and should not be served, because it contains "raw" stuff. You'd need a separate build pipeline (using npm scripts, webpack, gulp, etc.) to build/copy assets into a directory that is served (i.e. wwwroot). That is likely the piece you are missing.
For this, since there's no build actually required, you just need to have the assets copied to something like wwwroot/lib and then you'll be able to reference it via the script tag on the page. That can be done with any tool like npm, webpack, gulp, grunt, etc. so you'll just have to research and pick the one that fits your needs best.
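As a minimal sketch of such a copy step, assuming the default node_modules location and wwwroot as the served folder (the lib/fontawesome target path is just an example), a shell script wired into an npm script could look like:
# copy the prebuilt Font Awesome assets into the folder that is actually served
mkdir -p wwwroot/lib/fontawesome
cp -r node_modules/@fortawesome/fontawesome-free/js wwwroot/lib/fontawesome/
cp -r node_modules/@fortawesome/fontawesome-free/css wwwroot/lib/fontawesome/
cp -r node_modules/@fortawesome/fontawesome-free/webfonts wwwroot/lib/fontawesome/
The script tag would then point at ~/lib/fontawesome/js/all.js. Keeping the @ scope out of the served path also avoids Razor trying to parse @fortawesome inside the .cshtml, which is likely what the "does not exist in the current context" error above is about.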

IntelliJ/WebStorm Yarn - Cannot find package unless on root

I am trying to add packages via yarn inside of IntelliJ. I can get it to install the package fine, and I can even get it to move the packages to my own custom folder via --modules-folder "ExternalLibs".
The issue I am having is that, unless I allow yarn to install at the root under the node_modules folder, it won't recognize that the package is there.
Is there a way to point the package.json to look in the custom path?
You can try setting the NODE_PATH environment variable to point to your folder location in the Node.js run configuration template: go to Run | Edit Configurations..., expand the Templates node, select the Node.js configuration, and specify NODE_PATH in the Environment variables field.
Please see https://youtrack.jetbrains.com/issue/WEB-19476#focus=streamItem-27-2819977.0-0
Note that, though the modules in require() calls will actually be resolved, you will still see warnings about non-installed packages due to WEB-25792; you have to disable the JavaScript | General | Missing module dependency inspection to get rid of the warnings.
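For illustration, the shell equivalent of that IDE setting would be something like the following (ExternalLibs is the folder name from the question; app.js is a placeholder entry point):
# let Node's module resolver also search the custom folder
export NODE_PATH="$PWD/ExternalLibs"
node app.js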

Vue CLI project doesn't hot reload after yarn -> npm switch. No config files either?

At my clients'/coworkers' request I have to switch to npm. The project was created using Vue CLI with Yarn as the default package manager.
I first thought it was no big deal, so I deleted the node_modules folder and the yarn.lock file. Then I ran npm install followed by npm run serve.
It works and compiles as normal and recompiles when I change a file; all good so far, but here is the weird part: the changes are not reflected in the browser. I have to refresh the page manually.
I tried to look at the config files for Vue or webpack, but there are none. No webpack config, no Vue CLI config, no build folder.
What I have is:
- .eslintrc.js
- .browserslistrc
- babel.config.js
- postcss.config.js
I don't know what else to look for. Does anyone have an idea what this might be?
Thanks a lot,
-J
What the cause was in my case (I am on Linux):
It seems that the value of max_user_watches in /proc/sys/fs/inotify/max_user_watches affects webpack's file watching, which in turn interferes with hot reloading.
To check your current value:
$ cat /proc/sys/fs/inotify/max_user_watches
16384
16384 was the value in my case, and it still wasn't enough.
So you will need to increase your value as well.
SOLUTION if you are on Linux:
Create the file:
sudo nano /etc/sysctl.d/90-override.conf
and populate it with:
fs.inotify.max_user_watches=200000
200000 seems to be enough for me; you can use a higher value if you want.
After you create the file and add the value, just restart the PC and you should be OK.
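If you prefer not to reboot, the same file and value can usually be applied immediately from a shell:
# write the override and load it right away
echo 'fs.inotify.max_user_watches=200000' | sudo tee /etc/sysctl.d/90-override.conf
sudo sysctl -p /etc/sysctl.d/90-override.conf
# verify the new value
cat /proc/sys/fs/inotify/max_user_watches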
Vue CLI depends on @vue/cli-service, which acts exactly like Facebook's Create React App (react-scripts).
If you are familiar with create-react-app, @vue/cli-service is roughly the equivalent of react-scripts, although the feature set is different.
https://cli.vuejs.org/guide/#cli-service
So what both of them do is "simplify" configuration of the project for you by sweeping all the bundling configs (e.g. webpack.config.js) under the carpet. This is handy in most cases, unless you decide to do something fancy (like switching the package manager). In Create React App you can bail out of this behavior by running yarn eject or npm run eject, but on the Vue CLI platform you are locked in. So there's no straightforward way to make all the underlying config files appear and fix the faulty bits in them.
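For what it's worth, the hidden webpack configuration can at least be inspected read-only with the CLI itself; this is only an inspection aid, not a way to edit it:
# dumps the resolved webpack config Vue CLI uses, for inspection only
vue inspect > webpack.resolved.js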
To be continued?..

Peer dependency that is also dev dependency of linked npm module is acting as a separate instance

In my app, I have these dependencies:
TypeORM
typeorm-linq-repository AS A LOCAL INSTALL ("typeorm-linq-repository": "file:../../../IRCraziestTaxi/typeorm-linq-repository"), which has TypeORM as both a dev dependency AND a peer dependency
The reason I use a "file:" installation of typeorm-linq-repository is that I am the developer and test changes in this app prior to pushing releases to npm.
I was previously using node ~6.10 (npm ~4), so when I used the "file:" installation, it just copied the published files over, which is what I want.
However, after upgrading to node 8.11.3 (npm 5.6.0), it now links the folder rather than copying the published files.
Note, if it matters, that my environment is Windows.
The problem is this: since both my app and the linked typeorm-linq-repository have TypeORM in their own node_modules folders, TypeORM is being treated as a separate "instance" of the module in each app.
Therefore, after creating a connection in the main app, when the code that accesses the connection in typeorm-linq-repository is reached, it throws the error Connection "default" was not found.
I have searched tirelessly for a solution to this. I have tried --preserve-symlinks, but that does not work.
The only way for me to make this work right now is to manually create the folder in my app's node_modules and copy applicable files over, which is a huge pain.
How can I either tell npm to NOT symlink the "file:" installation or get it to use the same instance of the TypeORM module?
I made it work pretty easily, although I feel like it's kind of a band-aid. I will post the answer here to help anybody else who may be having this issue, but if anybody has a more proper solution, feel free to answer and I will accept.
The trick was to link my app's installation of TypeORM to the TypeORM folder in my other linked dependency's node_modules folder.
...,
"typeorm": "file:../../../IRCraziestTaxi/typeorm-linq-repository/node_modules/typeorm",
"typeorm-linq-repository": "file:../../../IRCraziestTaxi/typeorm-linq-repository",
...
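To sanity-check that the app and the linked library now load one and the same TypeORM copy, a quick diagnostic (run from the app's root folder; the path below is the one from the question) is:
# prints which typeorm installation the app actually resolves
node -p "require.resolve('typeorm')"
# with the package.json trick above, the printed path should live inside
# ../../../IRCraziestTaxi/typeorm-linq-repository/node_modules/typeorm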

What does "Linking Dependencies" during npm / yarn install really do?

For large web apps, npm install or yarn install can take a lot of time, mostly in a step called Linking Dependencies. What is happening here? Is it fetching the dependencies of the dependencies, or something completely different? Which files are created during this step?
When you call yarn install, the following things happen in order:
Resolution: Yarn starts resolving dependencies by making requests to the registry and recursively looking up each dependency.
Downloading/Fetching: Next, Yarn looks in a global cache directory to see if the package needed has already been downloaded. If it hasn't, Yarn fetches the tarball for the package and places it in the global cache so it can work offline and won't need to download dependencies more than once. Dependencies can also be placed in source control as tarballs for full offline installs.
Linking: Finally, Yarn links everything together by copying all the files needed from the global cache into the local node_modules directory after identifying what's already there and what's not there.
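To see the global cache that steps 2 and 3 refer to on your own machine, Yarn ships a couple of commands:
yarn cache dir    # prints the path of the global cache directory that linking copies from
yarn cache list   # lists the packages currently stored in that cache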
yarn install does take a lot of time, mostly in a step called Linking Dependencies
You will notice that Step 3: Linking takes more time than Step 1: Resolution and Step 2: Fetching, where the actual download happens. By this step we already have everything we need downloaded and ready, so why is it taking so long? Did we miss anything?
Yes: the copy into the local project's node_modules folder...! The reason this is slow is that the copy is not equivalent to copying one large 4.7 GB ISO file. Instead it's a huge number of super small files (don't take "huge number" lightly, it can easily be 15k+ files :P), which take a lot of time to copy. (It is also worth noting that when you download the packages, you download one large tar file per package, whose contents then have to be extracted into the cache, which also takes time.)
It is made even slower by:
Anti-virus: your antivirus sits in the middle and does a quick inspection (in addition to Yarn checking whether the file already exists) of every single file Yarn tries to copy, cutting its speed considerably. If you are on Windows, try adding your project's parent folder as an exception in Windows Defender.
The storage medium's transfer rate: SSDs can improve this speed hugely (sorry, SSHDs and FireCudas will not help much here, since this is a one-off copy and their caching won't kick in).
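To get a feel for these numbers on your own project, a quick check from a POSIX shell (run in the project root after an install) is:
find node_modules -type f | wc -l    # how many individual files the linking step had to copy
du -sh node_modules                  # how much space they take up on disk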
But is this efficient? Can I have it taken from the global node_modules (after creating one)?
Nope, on both counts. Because of the way Node works, each package finds its dependencies only relative to its own location. Also, each project may want to use different versions of the same package, to ensure it keeps working properly and is not broken by package updates.
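You can see this resolution behaviour directly: Node searches a chain of node_modules directories, starting next to the requiring module and walking up towards the filesystem root, for example:
node -p "module.paths"    # prints the node_modules directories Node would search from the current folder, nearest first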
Ideally, the project folder would stay lean: there would be one global node_modules folder, and any requested package would be downloaded there if not already present AND used from that location. Ruby actually does it this way: its global gems folder is the equivalent of node_modules, and it can hold different versions of the same package for use in different projects.
But keep in mind that this reduces project portability; it's a trade-off that any package manager (be it RubyGems or npm/Yarn) has to make. I can simply copy a Node project folder and expect it to work on its own (the copy may take ages, because the local node_modules folder comes along with it). Copying a Ruby project takes only seconds to a few minutes, since there is no local packages (or gems, as they call them) folder, but running it on a different system requires those gems to be present in that system's global gems folder.
The documentation for yarn install can be found here.
You can use the command
yarn install --verbose
Show additional logs while installing dependencies
The output will show what the yarn/npm install is doing.
It's good for debugging in case the process is failing or taking a long time.
The linking phase works in essentially 3 big steps:
Find every file that needs to be in node_modules
Check this list against what is already there and find what needs to be copied from the cache to node_modules
Do the copy
Maybe this issue on Github will help you out.
https://github.com/yarnpkg/yarn/issues/1496