Ensure npm/pip dependencies are binary-preserved

My company has a policy that, after a release, projects must not reference any 3rd-party code servers. Basically, we are asked to keep local mirrors of all package servers. This is to ensure we can reproduce the release, since there is always a risk that somebody changes the code on a server we don't control without changing the library version. Blindly using external servers is also a security risk.
What is the proper way to fulfill this policy with npm? If I understand it correctly, package-lock.json is not enough: it will warn me if a hash has changed, but it will not let me reproduce the build.
There is npm-mirror, but it seems outdated and I was not able to get it running. Are there better, up-to-date alternatives?
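To be concrete, whatever mirror we end up using, pointing a project at it would presumably just be a one-line .npmrc entry (the URL below is a placeholder for an internal mirror):

# .npmrc in the project root - all installs go through the internal mirror
registry=https://npm-mirror.internal.example.com/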
Also, I considered just preserving a copy of node_modules, but that doesn't really work: we build our projects in different environments, and the node_modules folder is environment-specific and has to be built separately for each one.
We also use Python, and I assume I need to solve the same problem for pip.
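For pip, I assume a rough sketch of one stock approach would be to download the pinned requirements into a directory we archive, and later install offline from it (paths here are placeholders):

# populate a local mirror directory from the pinned requirements
pip download -r requirements.txt -d ./pip-mirror
# later, reproduce the environment using only that directory
pip install --no-index --find-links=./pip-mirror -r requirements.txt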

Is there a way to deploy 2 versions of the same package for 2 different use cases at once?

The answer seems to be 'no', but I wanted to check with colleagues here.
We provide an npm package for our own sites as well as for some 3rd-party sites.
Our package also bundles a fairly heavy, old homegrown npm package.
We no longer need that dependency on our own sites, but the 3rd-party sites still do.
We also have no way of controlling the code on those 3rd-party sites, so we need to keep the deployed bundle name and location the same for them.
Is there a way to publish, from the same repository, one version of our package without the extra dependency (for us) and then one with it (for the third parties)?
ourpackage-new.js (without the dependency)
ourpackage.js (with the dependency)
I had some success with a new package.json in a subdirectory: a job in our GitLab CI config would publish the original package, then cd into that directory and run npm publish again for the new one (see the sketch below). This requires copying some dependency files into the subdirectory as well, which means that whenever one copy was updated we'd need to remember to update the other. Not a situation we'd want.
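Roughly, the CI step looked like this (the job and directory names are made up; the file is the usual .gitlab-ci.yml):

publish:
  script:
    - npm publish                        # ourpackage, with the dependency
    - cd lite-package && npm publish     # ourpackage-new, without it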
Even if we created a 2nd repository for the change just for us, we'd still need to update 2 repositories every time we had a new change to deploy.
I checked into aliasing as well, but we wouldn't be importing a new version and an old version, more like two sister versions.
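For reference, the alias syntax I looked at (the names and version here are invented) installs one published package under a different local name:

npm install ourpackage-legacy@npm:ourpackage@1.2.3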
In any case, thanks for the input and thoughts. I realize npm was probably not made for this type of situation. If I remember right, I could do this with Gulp years ago, but I haven't even thought about Gulp in so long :) And back then I'd have to deploy manually via an FTP program ... wow, those were the days.
Thanks again!

Versioning APIs during internal development

In our team we have a number of APIs specified using the OpenAPI Specification (formerly Swagger). We use Maven and OpenAPI Generator to generate code, build, and publish the artifact to our local Nexus. We build our code on TeamCity. The artifact is given the version specified in Maven's pom.xml.
During development we only use snapshot versions, that is, versions that can be overwritten and will eventually be cleaned up. This is in contrast to release versions, which cannot be overwritten and need administrative privileges to clean up. The reason is that a developer usually changes a little bit at a time, which is much more convenient with snapshot versions. It also makes cleaning up outdated, unreleased artifacts much easier.
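To make that concrete, the difference is just the version suffix in pom.xml (the numbers are illustrative):

<!-- snapshot: may be overwritten and cleaned up in Nexus -->
<version>1.4.0-SNAPSHOT</version>
<!-- release: immutable once deployed -->
<version>1.4.0</version>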
Our problem is that from time to time a developer makes API changes but forgets to set a new version. This works fine locally, but when the code is built on TeamCity the changed API overwrites the artifact of the older version. A developer not working on that branch will then get a compile error, because their code no longer matches the API artifact being used.
What do others do? Is there a best practice, preferably with standard tools? We have tried many things and nothing works well. At the same time, this issue is so basic that someone must have a good solution - or at least enough experience to point to the least bad one.

PHPUnit: local VS global install

Installing PHPUnit globally with Composer seems more convenient to me for two reasons:
1. Using it everywhere without needing an extra install.
2. Just running phpunit instead of vendor/bin/phpunit (using an alias might solve this).
Are there any reasons why a local install might be the better choice? For example, using the exact same version every time. (I don't have a lot of experience with PHPUnit, so I'm not sure whether this really is an issue or not.)
The big disadvantage of installing packages globally is that you might end up with different versions of PHPUnit between developers in your team (unless you are the only developer). This might cause some side effects.
If you install it locally using composer.json, then every developer in your team will have exactly the same version as you do for that specific application. Also, everybody will see when you change the version in composer.json.
If you don't like typing vendor/bin/phpunit, you can use a Makefile (kept in your project as well):
test:
	vendor/bin/phpunit --configuration=test/Unit/phpunit.xml
and then run it with:
make test
I like to install it via Composer and the require-dev block, but another approach that comes highly recommended is to download phpunit.phar into the project and use that.
Either way, you control exactly which version is being used (and when it's updated) - which is the most important part, as you can't so easily control what people have installed globally.
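A minimal sketch of the Composer route: run composer require --dev phpunit/phpunit, which records the dependency in composer.json roughly like this (the version constraint is just an example):

"require-dev": {
    "phpunit/phpunit": "^10.0"
}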

Is NuGet appropriate for a daily development workflow?

I am looking at nuget for improving automatic handling of dependencies (both internal and third party) during development.
As long as you develop through the CI Build Server, all is good:
get latest source for A and B, where B depends on A
fix bug in A
build A
check into source control
CI Build Server initiated
new nuget package is created and placed in corporate repository
build B (which will get the updated A package)
run B to verify that the bug in A was fixed
repeat the above steps as needed
However, I'm wondering if it is possible to work locally as a single developer, without having to wait for the CI Build Server to produce a new package?
NuGet has a feature called Package Restore, which downloads all dependencies automatically on build. You can also specify the order of the repositories that Package Restore should search for packages.
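For example, a NuGet.config along these lines (the names, path and URL are placeholders) should let a local folder feed be searched ahead of the corporate one:

<configuration>
  <packageSources>
    <add key="LocalDev"  value="C:\LocalNuGetFeed" />
    <add key="Corporate" value="https://nuget.internal.example.com/nuget" />
  </packageSources>
</configuration>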
If the workflow could become:
get latest source for A and B, where B depends on A
fix bug in A
build A
(building creates a local nuget package)
run B to test the (now fixed) bug in A (B should now use our locally built NuGet package, not one from the corporate repository)
...repeat as needed
check into source control
CI Build Server initiated
new nuget package created in corporate repository
Is this possible using Visual Studio, MSBuild, a CI Build Server and NuGet? I'm especially interested in producing local packages while developing locally.
Note that I have native projects as well; apart from generating the NuGet package post-build, I hope this workflow would work for both C# and C++ projects.
The solution I have now, though far from ideal, is the best I have been able to figure out. Oh, and it is a work in progress, so it WILL change in the coming weeks/months as I work out the kinks.
I mostly have to deal with managed DLLs right now, but I do have some native code and, worse, multi-platform native code to deal with eventually.
Create a local repository, which is basically just a folder, and configure it in your list of NuGet feeds.
Then I created an MSBuild task that packages the project and outputs it into the local repository's root folder. Make sure the version of your package always increases; at present I do this manually by editing the assembly version.
Once it is built, update the other projects that reference it; I usually do this through the Package Manager Console (Update-Package).
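A rough sketch of those steps (paths, project names and versions are placeholders):

# register the folder as a feed once
nuget sources add -Name LocalDev -Source C:\LocalNuGetFeed
# after a local build, pack project A straight into that feed with a bumped version
nuget pack A\A.csproj -OutputDirectory C:\LocalNuGetFeed -Version 1.0.1
# then, in the Package Manager Console of B's solution
Update-Package A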
For each project that was updated, bump its version; rinse, lather, and repeat until you get to your top-most project (the actual program).
Once everything is in good shape and you are ready to commit, the build system should do its own packaging and push it to your official repository.
The Good
No clogging of the repository and build system with intermediate development versions; that garbage remains local, as it should.
Local repos are super easy to set up; it can even be done without changes to VS, through the global NuGet config.
This is friendly to both paradigms: package restore or checking packages in with the project. That said, I would recommend not checking in the packages you built locally, but rather ones that were committed to your repository, ideally through the build system. What's built locally should remain local.
The Bad
Still much more complicated than just adding projects to a solution.
The deeper (or wider) your dependency tree the bigger the pain.
The Ugly
It makes some native NuGet behaviors quite quirky and annoying:
The update operation takes forever if your VS is connected to a version control system (Perforce for me). I hear they "solved" the problem; I'd hate to see how it was before if it was worse than it is now!
Having NuGet change non-code references back to "copy never" is a major pain.
If Only
Let me configure the desired state of a content dependency (copy always, never, or if newer) directly in the nuspec and be done with it! (And the same goes for the ClickOnce content status: include, exclude, etc.)
Make the update operation quick: 2 minutes for a dozen projects is just insane, especially if the ultimate goal is to manage 500+.
Perhaps a hybrid mode where, locally, we work with project references but the build system works with NuGet dependencies (and builds them if necessary).
If you are going to parse the project files, follow MSBuild parsing rules and honor the conditional statements.
There are still issues I have yet to figure out, like how to manage multiple branches of the code in the repository, and how to handle version conflicts further up the food chain. In a large project (ultimately we have to bring 500+ separate projects together into a single application executable), conflicts are expected.
I would love to bring in all the goodness of sane dependency management à la Maven, but so far I have not found NuGet mature enough to even think of proposing it to the dev team.
Certainly. In our solutions, NuGet parks the libraries in the "packages" directory of the solution's hierarchy, which is ultimately kept in TFS. This allows complete solution check-outs that include the required libraries. If you intend to update the libraries normally provided by NuGet, you'll need to update the dependent projects' references to point to the project containing the updated code that the NuGet process would normally provide.
Before checking in your regular solution work (not the NuGet-related libs), make sure the solution's NuGet libs are up to date and that the references in the solution point back to the NuGet-installed libs. Of course, you'll check in and fetch the NuGet-related libs beforehand.

Archivable, replicable releases when building with Maven: is there a right way?

We have a largish standalone (i.e. not Java EE) commercial Java project (10,000+ classes, four or five SVN repositories, ten or twenty third-party libraries) that's in the process of switching over to Maven. Unfortunately only one engineer (in a team of a dozen or so distributed across three countries) has any prior Maven experience, so we're kind of figuring it out as we go.
In the old Ant way of doing things, we'd:
check out source code from three or four repositories
compile it all into a single monolithic JAR
release that (as part of a ZIP file with library JARs, an installer, various config files, etc.)
check the JAR into SVN so we had a record of what the customers had actually got.
Now, we've got a Maven repository full of artifacts, and a build process that depends on Maven having access to that repository. So if we need to replicate what we actually shipped to a customer, we need to do a build against a Maven repository that has all the proper versions of everything. This is doable, I guess, if in (some version of) the (SVN-controlled) POM files we set all the dependencies to released versions?
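To make that concrete, a release build's POMs would pin plain, non-SNAPSHOT versions for every dependency, something like this (coordinates are illustrative, borrowing the foo-api example below):

<dependency>
  <groupId>com.example</groupId>
  <artifactId>foo-api</artifactId>
  <version>1.2.3</version>
</dependency>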
But it gives our release engineer the creepy-crawlies, because there doesn't seem to be any way:
to make sure that somebody doesn't clobber the copy of foo-api-1.2.3.jar on the WebDAV server by mistake (the WebDAV server has access control, but that wouldn't stop a buggy build script)
to detect it if they did
to recover afterwards
His idea is, for release builds, to use a local file system as the repository rather than the WebDAV server, and put that local repository under SVN control.
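As I understand it, that would look something like a release-only settings.xml redirecting all resolution to a file-based repository checked out from SVN (the path is a placeholder):

<mirrors>
  <mirror>
    <id>release-local</id>
    <mirrorOf>*</mirrorOf>
    <url>file:///opt/release-repo</url>
  </mirror>
</mirrors>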
Our one Maven-experienced engineer doesn't like that -- I guess because he doesn't like putting binaries under version control? -- and suggests that maybe the professional version of the Nexus server can solve the clobbering or clobber-tracking/recovery problem.
Personally, I'm not happy (sorry, Sonatype readers) with shelling out money for a non-free build system when we haven't even seen any benefit from the free version yet, and there's no guarantee it will actually solve the problem.
So our choices seem to be:
WebDAV server
Pros: only one server, also accessible by devs, ...?
Cons: easy clobbering, no clobber-tracking/recovery
Local file system
Pros: can be placed under revision control
Cons: only works with the distribution script
Frankly, both of these seem like hacks to me, and I have to wonder if there isn't a better way to do this.
So: Is there a right thing to do here?
I'm not sure I got everything, but I would:
Use the maven-release-plugin, which automates the release process, i.e. executes all the steps documented in release:prepare (a command sketch follows below).
Use WebDAV with an anonymous read-only, authenticated-write policy (so only the release engineer can actually deploy released artifacts to the corporate repo).
There is no need to put generated artifacts under version control (if you have the POMs under version control). I don't see the benefit of using the local file system instead of WebDAV (it does not provide more security; you can secure WebDAV as well), and I don't see what the commercial version of Nexus would solve here.
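For completeness, the release-plugin invocation mentioned above is just two goals run from the project root:

mvn release:prepare    # bumps the POM versions and tags the release in SCM
mvn release:perform    # checks out the tag, builds it and deploys the artifacts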
Nexus has a setting which prevents you from clobbering an already released artifact in a release repository.
For a team of about a dozen, the free version of Nexus should be enough.