In npm/yarn workspaces, should packages consume src or dist - npm

I want to use a monorepo for our frontend app. We want to divide some react UI components into a package folder under "/packages/ui-components" and leave the app in an "/apps/app" folder and then have the app consume ui-components by importing it (simplified setup). We don't plan to release these packages to individual npm repos anytime soon, but just have the final app running.
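To make the layout concrete, the root package.json for such a workspace setup might look like this (the package names are just placeholders):
package.json (repo root):
  {
    "private": true,
    "workspaces": ["packages/*", "apps/*"]
  }
apps/app/package.json:
  {
    "name": "@acme/app",
    "dependencies": { "@acme/ui-components": "*" }
  }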
I'm starting to worry a bit about how we can have the best workflow, and for some reason I cannot find this in my research:
Should the app consume the src-files from packages or instead compile each package to the dist folder and import only these?
Workflow-wise, we would like working across different packages to be seamless, so if someone edits a package, we would like those changes to show up immediately in the app.
I see some pros and cons of using source-files compared to using a dist-output.
Pros of using src directly:
Better tree-shaking, as dependencies can be peer-dependencies and libraries that are used by multiple packages can be combined.
Smaller final bundle size, since webpack has better access to the original data, like the full dependency tree, common functions, etc.
Faster development iterations with smaller projects as there's only one build and smart webpack could potentially only recompile a few changed files.
Pros of using dist:
More independent packages as they can contain their own build-pipeline.
Will be easier to import, as fewer peer-dependencies and less special webpack config are needed
Ready to be published as a public npm package
Possibly faster build-time, as only changed packages and the main app need to recompile on changes (I assume webpack can cache, so maybe this doesn't matter much)
I'm sure I'm missing a lot of details; setting up a good development flow is quite complicated these days, and I would like to make it as simple as possible for my colleagues.
TL;DR;
Should packages in a monorepo build to their dist folder for others to consume, or is it better to import directly from src?

It is a matter of tradeoffs.
If you consume the dist of your packages, then in order to "apply" changes made inside a package you need to build it and then consume it in the app.
Some even suggest publishing the package to the registry (public or private); this gives you looser coupling between the app and the packages.
On the other hand, working on the src has the "seamless" advantages you mention, but it requires your app's setup to support it, since the app is the one that compiles the package's code.
Personally, I'm using the second method: my apps consume from src. It was not trivial to configure, since most tools ignore code inside node_modules by default (like babel-loader, which skips transpilation of code inside node_modules).
Most of my code was based on next-transpile-modules source code.
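For anyone attempting the same thing, the core of it is just widening the loader rule so the workspace package is no longer excluded, and pointing the package's main (or a webpack alias) at its src entry. A rough sketch, assuming the UI package is named @acme/ui-components (all names and paths here are placeholders):
  // webpack.config.js (excerpt)
  module.exports = {
    module: {
      rules: [
        {
          test: /\.[jt]sx?$/,
          // exclude node_modules, but make an exception for the workspace package
          // so its src gets transpiled by babel-loader as well
          exclude: /node_modules[\\/](?!@acme[\\/]ui-components)/,
          use: 'babel-loader',
        },
      ],
    },
  };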

Related

How to make a buildable publishable library with Nx that is framework agnostic for browser and node.js

I have a monorepo (Nx Workspace) that has many libraries that were generated with @nrwl/js. These libraries depend on each other. They are small utility libraries that I intend to publish on NPM as separate packages.
According to Nx docs:
@nrwl/js is particularly useful if you want to
Create framework agnostic, just plain TypeScript libraries within an existing Nx workspace (say to use in your React, Node or Angular app)
Publish TypeScript packages to NPM
Yet, I now realize that the build command associated with @nrwl/js does not take into account a library's dependencies on other libraries in order to include them as peerDependencies in its publishable package.json. It also looks like it produces just a simple CommonJS build that would be compatible with Node.js but not with browsers.
I know that @nrwl/angular takes care of all that stuff for you when you build a library with it, if you marked it as publishable. My question is how to get the same behavior for a library that is not meant for Angular but for general-purpose JavaScript use in any framework or environment.
It's still early in development so if a solution to my problem would involve regenerating the libs using some other Nx generator, it wouldn't be too much of a hassle for me to do so, and I'd consider it.
Edit
I have since changed the build executor to @nrwl/node, which has the buildable and publishable options and behaves the way I need.
I believe that @nrwl/js will eventually gain that ability, but as of 2022-01-05 it didn't have it. For now, someone looking to publish libraries on NPM should probably use @nrwl/node to generate their new library projects and not @nrwl/js.
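For reference, generating such a library with the node executor looks roughly like this (the library and scope names are placeholders, and the exact flags can differ between Nx versions):
  nx generate @nrwl/node:library my-lib --publishable --importPath=@myorg/my-lib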
If you want to publish an NPM package as a "universal package" to be consumed by Node.js and the browser, you would need to build it as UMD to cover most consumer environments.
Though modern browsers support ES modules (import and export), you might need extra setup steps to consume them.
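As an illustration, a minimal Rollup config that emits both a UMD build and an ES module build could look something like this (the entry file, output paths, and the global name MyLib are placeholders; you would still add plugins for TypeScript, minification, etc. as needed):
  // rollup.config.js
  export default {
    input: 'src/index.js',
    output: [
      { file: 'dist/index.umd.js', format: 'umd', name: 'MyLib' },
      { file: 'dist/index.esm.js', format: 'es' }
    ]
  };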

Updating transitive dependencies of a NPM package

Our company has a few web applications which in turn depend on a very long chain of internally created and hosted npm packages (we use JFrog Artifactory), each with their own dependencies (and so on). Whenever a bug is fixed or a feature is implemented in a low-level package, the current process requires a developer to check in their changes, wait for the CI/CD build to complete and tests to run, update the parent package, and rinse/repeat all the way up the chain (which can be a very long process).
This can't possibly be a unique situation, yet it impacts our productivity greatly and encourages monolithic package development (to limit the number of packages to update) instead of proper code separation.
I can only think of two solutions:
1) Update the web application to use the transitive dependency directly in package.json. This however breaks "encapsulation" because how a direct dependency manages its job shouldn't be known to the web application. If the direct dependency were to use some other transitive dependency later the web application shouldn't be left referencing a now irrelevant package.
2) Modify the web application's package-lock.json to point at the new version of the transitive dependency. This however seems to only work temporarily as merge conflicts or new installs of direct dependencies tend to revert these changes.
I realize that the answer might be to optimize the build / publish process to be less painful and manual but I was hoping others might have encountered a different solution.
FYI - All dependencies are installed with '~' as a version prefix by default.
The correct way to do this (bear in mind that this feature was introduced in npm 8.x) is to use the overrides section. This feature allows you to replace a package in your dependency tree with another version, or with another package entirely.
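Assuming the package names below are just placeholders, the relevant part of the application's top-level package.json would look roughly like this; the nested form pins the transitive dependency only underneath one specific direct dependency:
  {
    "overrides": {
      "internal-utils": "2.4.1",
      "internal-api-client": {
        "internal-utils": "2.4.1"
      }
    }
  }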

Why does Aurelia install so many dependencies?

I am curious to know why when I create a new Aurelia project, each project installs +600 node_modules. Understandably, the modules collectively don't take up a lot of space, but are all of these modules necessary? I was under the impression that Aurelia's aim was to help developers move away from depending on 3rd party libraries so it seems odd that each project comes with a massive dump of 3rd party libraries.
My guess is that you are starting your project from the CLI, which comes preset with an HTTP server, ES6/2015, SASS, live-reloading and more.
I created a clean Aurelia project and looked at the package.json: there were 5 dependencies and 34 dev dependencies. Using all of the above-mentioned tools is somewhat standard in today's JS web development, and generating a project from the CLI reduces the time needed for upfront setup. All of these features come with their own dependencies, and that's why the node_modules/ folder grows rapidly.
The bottom line is: you could start a new Aurelia project with far fewer dependencies. On their home page you can find a starter project with just three. But that also means you won't have access to most of the tools used today.
Also, and correct me if I'm wrong, I never got the impression that Aurelia aimed to move devs away from third-party libs and modules, just to be modern, fast, and unobtrusive.
All modern web frameworks have a host of tooling. The reasons, in no particular order:
1. Transpiling ESNext or TypeScript - if you want to write in Future JavaScript but have it work in all browsers, you need this step. Both Babel and TypeScript tooling comes with extra stuff too. If you want to see coverage (everyone does) there's another tool.
2. Testing - Unit test and End to End testing require testing frameworks, test runners, and if you want to write like above (esnext or TypeScript) you also need transpiling.
3. Module Loading / Bundling - Require.js, JSPM/System.js, webpack, etc. are used to allow your code to actually run in the browser. Without a module loader you could not break your code out into separate files. Without a bundler you would be loading a lot of extra files in production.
4. Serving your application - If you want to run your app locally you need a way to serve it up and watch for changes.
5. Debugging - You want to debug? Now you need a way to debug the file that gets served to the browser back to the original source.
6. Linting - Lint your code base for style consistencies.
Each of these packages usually has its own dependencies, and those get pulled down as well.
This convention of small packages with a single focus is arguably better than massive packages that do everything for you. It allows you to remove a package and replace it with one that does the same thing, but in the way you want.
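As a side note, if you ever wonder why a particular package ended up in node_modules, you can ask npm for the dependency chains that pull it in (the package name here is just an example):
  npm ls lodash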

Is nuget appropriate for daily development workflow?

I am looking at nuget for improving automatic handling of dependencies (both internal and third party) during development.
As long as you develop through the CI Build Server, all is good:
get latest source for A and B, where B depends on A
fix bug in A
build A
check into source control
CI Build Server initiated
new nuget package is created and placed in corporate repository
build B (which will get the updated A package)
run B to verify that the bug in A was fixed
...repeat n times
However, I'm wondering if it is possible to work locally as a single developer, without having to wait for the CI Build Server to produce a new package?
NuGet has a feature called Package Restore, which will download all dependencies automatically on build. You can also specify the order of repositories in which Package Restore should look for packages.
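For example, the source list (and the order in which sources are consulted) lives in NuGet.config; something along these lines, where the local folder and the corporate feed URL are placeholders:
  <?xml version="1.0" encoding="utf-8"?>
  <configuration>
    <packageSources>
      <!-- consulted in order: local dev feed first, then the corporate repository, then nuget.org -->
      <add key="LocalDev" value="C:\LocalPackages" />
      <add key="Corporate" value="https://nuget.example.com/api/v2" />
      <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
    </packageSources>
  </configuration>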
If the workflow could become:
get latest source for A and B, where B depends on A
fix bug in A
build A
(building creates a local nuget package)
run B to test the (resolved) bug in A (should now use our local nuget package, not local repository)
...repeat n times
check into source control
CI Build Server initiated
new nuget package created in corporate repository
Is this possible using Visual Studio, MSBuild, a CI Build Server and nuget? I'm especially interested in the making of local packages while developing locally.
Note that I have native projects; however, apart from the post-build generation of the NuGet package, this is a workflow that I hope should work for both C# and C++ projects.
The solution I have now, though far from ideal, is the best I could figure out. Oh, and it is a work in progress, so it WILL change in the coming weeks/months as I figure out how to get around the kinks.
I mostly have to deal with managed DLLs right now, but I do have some native code and, worse, multi-platform native code to deal with eventually.
Create a local repository, basically just a folder, and configure it in your list of NuGet feeds.
Then I created a task (MSBuild) that will package the project and output it in the local repository's root folder. Make sure the version of your package is always increasing. Presently I do this manually by editing the assembly version.
Once built, update the other projects that reference it; I usually do this through the Package Manager Console (Update-Package).
For each project that was updated, bump its version; rinse, lather, and repeat until you get to your top-most project (the actual program).
Once everything is good and you are ready to commit, the build system should do its own packaging and push it to your official repository.
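In concrete terms, that inner loop boils down to something like the following (paths, project names, and versions are illustrative):
Register the local folder as a feed (one time):
  nuget sources add -Name LocalDev -Source C:\LocalPackages
Pack the changed project into it with a bumped version:
  nuget pack MyLib\MyLib.csproj -OutputDirectory C:\LocalPackages -Version 1.2.3
Then, in the consuming solution's Package Manager Console:
  Update-Package MyLib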
The Good
No clogging of the repository and build system with intermediary development versions, that garbage remains (as it should) local.
Local repos are super easy to set up; it can even be done without changes to VS, through the global NuGet config.
This is friendly to both paradigms: package restore, or checking in packages with the project. That said, I would recommend not checking in the packages you built locally, but rather ones that were committed to your repository, ideally through the build system. What's built locally should remain local.
The Bad
Still much more complicated than just adding projects to a solution.
The deeper (or wider) your dependency tree the bigger the pain.
The Ugly
Makes some native NuGet behaviors quite quirky and annoying:
The update operation takes forever if your VS is connected to a version control system (Perforce for me). I hear they "solved" the problem; I would hate to see what it was like before if it was worse than it is now!
Having NuGet change non-code references back to "never copy" is a major pain.
If Only
Configure the desired state of a content dependency (copy always, never, or newer) directly from the nuspec and be done with it! (Oh, and the same story with ClickOnce content status: include, exclude, etc.)
Make the update operation quick; 2 minutes for a dozen projects is just insane, especially if the ultimate goal is to manage 500+.
Perhaps a hybrid mode where, locally, we work with project inclusion but the build system works with NuGet dependencies (and builds them if necessary).
If you are going to parse the project, do follow MSBuild parsing rules and honor the conditional statements.
There are still issues I have yet to figure out, like how to manage multiple branches of the code in the repository, and how to handle version conflicts further up the food chain. In a large project (ultimately we have to bring 500+ separate projects together into a single application executable), conflicts are expected.
I would love to bring in all the goodness of sane dependency management à la Maven, but thus far I have not found NuGet to be mature enough to even think of proposing it to the dev team.
Certainly. In our solutions, NuGet parks the libraries in the "packages" directory of the solution's hierarchy, which is ultimately kept in TFS. This allows for complete solution check-outs that include the required libraries. If you intend to update the libraries normally provided by NuGet, you'll need to update the dependent projects' references to point to the project containing the updated code normally provided by the NuGet process.
Prior to checking in your regular solution work (not the NuGet-related libs), make sure the solution's NuGet libs are up to date and the references in the solution point back to the NuGet-installed libs. Of course, you'll check in and fetch the NuGet-related libs beforehand.

Testing a NuGet package

We are big users of NuGet, we've got 25-30 packages which we make available on a network share.
We'd like to be able to test new packages in the consuming applications before they're built and released. Ideally, this could be done with something similar to Maven's snapshot functionality, i.e. having a specific development package.
Has anyone else come up with an (ideally reasonably non-hacky) way of doing this?
Our favoured method is to generate the package assemblies and then manually overwrite the assemblies in the packages/ directory, i.e. to replace the actual project references, but that doesn't seem particularly clean.
Update:
We use a CI build server which creates builds on every commit and has a specific manually triggered NuGet build which works off specifically tagged versions of the codebase. We don't want to create a NuGet build off every commit, but we would like to be able to test a likely candidate in the wild before we trigger the manual NuGet package build.
I ended up writing a unit/integration testing framework to solve a similar problem. Basically, I needed to verify the content of the package, the versions and info, what would happen when I installed and uninstalled the package, which versions the assemblies in lib were, what bitness the assemblies were built for (x86 or x64) and so on - and I needed it all to run without Visual Studio installed, on my build machine (headless), as a quality gate.
Standing on the shoulders of giants like Pester, PETools, and SharpDevelop's package management module, I put together nuget-test.
Clone the project into your package directory (where your .nuspec file and package files are). If for whatever reason you want to keep the nuget-test project as a "git" repo, then simply remove "remove-item nuget-test/.git -Recurse -Force" from the command below.
git clone https://github.com/nickfloyd/nuget-test.git; remove-item nuget-test/.git -Recurse -Force
Run Setup.ps1 in the root of the nuget-test directory in an x86 instance of PowerShell.
PS> .\setup.ps1
Write tests and place them in the nuget-test/test directory using the Pester syntax.
Run the tests.
PS> Invoke-Pester
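As a rough illustration of what such a Pester test can look like (written in modern Pester syntax; the package name and expected paths are made up):
  Describe "MyPackage.nupkg" {
      BeforeAll {
          # a .nupkg is just a zip archive: list its entries once for the tests below
          Add-Type -AssemblyName System.IO.Compression.FileSystem
          $nupkg = Get-ChildItem -Filter "MyPackage.*.nupkg" | Select-Object -First 1
          $zip = [System.IO.Compression.ZipFile]::OpenRead($nupkg.FullName)
          $entries = $zip.Entries | ForEach-Object { $_.FullName }
          $zip.Dispose()
      }
      It "contains the expected assembly" {
          $entries | Should -Contain "lib/net45/MyPackage.dll"
      }
      It "contains a nuspec manifest" {
          ($entries | Where-Object { $_ -like "*.nuspec" }) | Should -Not -BeNullOrEmpty
      }
  }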
Project page on GitHub: https://github.com/nickfloyd/nuget-test
I hope this helps you get closer to what you're trying to get done.
If you're using NuGet packages to distribute your libraries, you should not limit yourself to testing only the libraries. You should test the packages themselves as well (if your binaries are OK but incorrectly installed, consumers will still have issues). The whole point is to improve this experience.
One way could be to have an additional CI or QA repository. The one you currently have is actually your "production" repository containing consumable releases, considered finished high-quality products.
Going further, you could have a logical package promotion flow (based on Continuous Integration or even using a Continuous Delivery approach), where:
- each check-in produces a package on your CI repository
- testers pick up a CI package for QA and, if found OK, promote it to either a QA feed or the Production feed (whatever you prefer; it depends on the quality of your testing and how well it is automated)
There are various ways of implementing this scenario, using simple network shares, internal NuGet.Server or Gallery implementations, or simply use http://myget.org to give it a try with minimal cost and zero effort.
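Promotion itself can be as simple as pushing a vetted .nupkg from one feed to the next, for example (the feed URL and API key are placeholders):
  nuget push MyPackage.1.2.3.nupkg -Source https://nuget.example.com/qa -ApiKey <your-api-key>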
Hope that helps!
Cheers,
Xavier