Several people on the same project using different package managers? - npm

Let's say we have a Vue/React project that involves 3 people, but each of them has a different taste in package managers.
The first person already feels cozy using npm, the second one uses yarn because he thinks it has better security, and the third person loves pnpm because he thinks it saves disk space when you have multiple projects.
Is it possible for that one project, worked on by those 3 people, to run on each person's device using their chosen package manager?
Even if it is possible, is it something that is normal? Or is it something that we should avoid?

It is something that you should avoid. Even if they used the same lockfile, there would be slight differences in how they work, so people would get "works on my machine" issues. You don't want to spend your time figuring out such issues.
Each project needs to pick one package manager and stick to a given major version of it. You can go one step further and stick to an exact version of that package manager; that will make your setup the most stable. You can use the new packageManager field in package.json for that:
{
"packageManager": "<package manager name>#<version>"
}
But you need to enable corepack as it is an experimental feature of Node.js for now:
corepack enable
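For example, a minimal sketch pinning pnpm (the exact name and version here are just an illustration):
{
"packageManager": "pnpm@8.15.4"
}
With Corepack enabled, the pinned version should be fetched and used automatically, and running another Corepack-managed package manager (e.g. yarn) inside that project fails with an error, so everyone ends up on the same tool.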

Related

Is there a way to deploy 2 versions of the same package for 2 different use cases at once?

The answer seems like 'no' but I wanted to check with colleagues here.
We provide an npm package for our own sites as well as some 3rd party sites.
Our package also depends on a fairly heavy, old homegrown npm package.
We no longer need that dependency on our own sites, but the 3rd-party sites do.
We also have no way of controlling the code on those 3rd-party sites, so we need to keep the deployed bundle name and location the same for them.
Is there a way to publish, from the same repository, one version of our package without the extra dependency (for us) and one version with it (for the third parties)?
ourpackage-new.js (without the dependency)
ourpackage.js (with the dependency)
I had some success with a new package.json in a subdirectory. I would publish the original package and then the new one via a command in gitlab.yaml that cds into that directory and runs npm publish there after the first publish, as sketched below. This requires copying some dependency files down there as well, which means that if one version were updated, we'd need to remember to update the copy. Not a situation we'd want.
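As a rough sketch of that subdirectory approach (the directory name and CI wiring are hypothetical, and npm publish assumes auth is already configured):
# publish the original package, with the heavy dependency
npm publish
# then publish the variant from a subdirectory that holds its own package.json without it
cd slim
npm publish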
Even if we created a 2nd repository for the change just for us, we'd still need to update 2 repositories every time we had a new change to deploy.
I checked into aliasing as well, but we wouldn't be importing a new version and an old version, more like sister versions.
In any case, thanks for the input and thoughts. I realize npm was probably not made for this type of situation. If I remember right, I could do this with Gulp years ago, but I haven't even thought about Gulp in so long :) And back then I'd have to deploy manually via an FTP program ... wow, those were the days.
Thanks again!

Are there any tools or practices for tracking developer IDE & tool versions?

I am developing on multiple machines and keep the repository and/or project folder on a private cloud.
I would like to have a file, or something similar, that lists every tool used (NP++ v1.x.x, VS2019 v4.x.x, yEd v2, etc.).
I find the idea of npm's package.json extremely useful. Maybe there is something similar at the OS level (Win10, by the way).
Possible solutions I've thought of:
of course just track it manually
Virtual Machine (which I don't want to use and cannot host anyway)
The tool/practice/extension/whatever should only track a given set of IDEs/tools, not set up an OS from zero.
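Absent an OS-level package.json, one low-tech sketch is a small manifest committed next to the project and updated by hand whenever a tool changes (the file name, tool names, and version numbers below are only placeholders):
{
  "tools": {
    "Notepad++": "8.x.x",
    "Visual Studio 2019": "16.x",
    "yEd": "3.x"
  }
}
It doesn't enforce anything, but it at least gives every machine a single, versioned source of truth to check against.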

Ensure npm/pip dependencies are binary-preserved

My company has a policy that no project should reference any 3rd-party code servers after a release. Basically, they ask us to make local mirrors of all package servers. This is to ensure we can reproduce the release, given that there is always a risk that somebody changes the code on a server we don't control without changing the library version. It is also a security risk to blindly use external servers.
What is the proper way to fulfill this policy with npm? If I understand it correctly, package-lock.json is not enough: it will warn me if a hash has changed, but I will not be able to reproduce the build.
There is npm-mirror, but it seems old and I was not able to run it. Are there better up-to-date alternatives?
Also, I was thinking about just preserving a copy of node_modules, but this doesn't really work. We build our projects in different environments, and the node_modules folder is environment-specific and needs to be built separately for each.
We also use Python, and I assume I need to solve the same problem for pip.
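One setup commonly used for this today (a sketch, not a drop-in config; the registry URL is a placeholder): run a self-hosted npm registry/proxy such as Verdaccio inside your network, point projects at it via registry=https://npm-mirror.internal.example/ in .npmrc, and archive the proxy's storage with each release. On the Python side, pip can snapshot and reinstall dependencies without touching PyPI:
# fetch every wheel/sdist listed in requirements.txt into a folder archived with the release
pip download -r requirements.txt -d ./pip-mirror
# later, reproduce the build strictly from that folder, never contacting PyPI
pip install --no-index --find-links ./pip-mirror -r requirements.txt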

PHPUnit: local VS global install

Installing PHPUnit with composer globally seems more convenient to me for those two reasons:
1. Using it everywhere without needing an extra install.
2. Just running phpunit instead of vendor/bin/phpunit (though an alias might solve this)
Are there any reasons why a local install might be the better choice? For example, using the exact same versions every time. (I don't have a lot of experience with PHPUnit, so I'm not sure whether this really is an issue or not.)
The big disadvantage of installing packages globally is that you might end up with different versions of PHPUnit between developers in your team (unless you are the only developer). This might cause some side effects.
If you install it locally using composer.json, then every developer in your team will have exactly the same version as you do for that specific application. Also, everybody will see when you change the version in composer.json.
If you don't like typing vendor/bin/phpunit, you can use a Makefile (which also lives in your project):
test:
	vendor/bin/phpunit --configuration=test/Unit/phpunit.xml
then run it with:
make test
I like to install it via Composer and the require-dev block, but another way that comes highly recommended is to download phpunit.phar into the project and use that.
Either way, you control exactly which version is being used (and when it's updated) - which is the most important part, as you can't so easily control what people have installed globally.
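For the Composer route, a minimal sketch (the phpunit.xml path is reused from the Makefile above; adjust it to your project):
# add PHPUnit to require-dev; the resolved version is pinned in composer.lock
composer require --dev phpunit/phpunit
# run the project-local binary
vendor/bin/phpunit --configuration=test/Unit/phpunit.xml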

How do you distribute the IDE and its configuration within your team?

I'm wondering how software development teams distribute their standard IDE(s).
E.g. developing with Eclipse: custom code formatter, SVN repository, copyright header, ...
At the moment my team has a standard zip file which is then distributed among the developers.
Problem:
If one file, a plugin, or the IDE itself changes (e.g. new coding guidelines, an upgrade to Eclipse 3.5.1), the whole distribution has to be done again and every developer needs to unzip the bundle again. Imagine you're working with different workspaces (Jetty, different Tomcat versions, WTP) due to project history. That doesn't scale.
I know that there are some related articles:
A new version of Eclipse just came out. Is there anything I can do to avoid having to manually hunt down my plugins again?
Manage Your Eclipse Install With A Local Git Repository
And some commercial programs.
Eclipse also has a new update-installer approach.
But I don't see the killer app. How does your team solve this? Is there a best practice?
I guess the best solution would be a program that lets you choose your current project, downloads the configured IDE from the server, and lets you know when project config files are updated.
For Eclipse, look at Buckminster; I suppose it targets exactly your use case, though I haven't used it personally.
At my previous company they wrote a custom update agent that pulled from a centrally configured server which was updated by the team leaders. It worked well, until people wanted to install their own plugins.
Basically, a developer wanted a plugin, fought in futility to get it included in the default (managed) repo, installed it himself, then updates broke on his machine when the team lead had a sudden stroke of common sense and included it.
They never did come up with a 'good' way to manage it. But, at least they didn't put us all on terminal servers with thin clients.