Is it necessary to pay attention to npm vulnerability warnings? - npm

Most of the time, while installing new npm dependencies in a project, I get warnings and issues (either low, moderate, or even critical).
Running npm audit fix, or npm audit fix --force, doesn't resolve the issues.
Can I ignore them, or should I fix them myself? Especially when the issue is in the dependency tree.

Warnings are there to tell you that a bug has been found and will probably be corrected in a future release. Nonetheless, to avoid breaking a package, an old dependency (which can also have issues) may still be used. So the package maintainers have a choice: either rebuild their package without the problematic dependencies, or wait until those dependencies get corrected (or work on fixing them).
If you want to fix every problem of every dependency of every package you are using yourself, you will certainly need thousands and thousands of lifetimes to clean up that mess xD
The best approach is to have as few packages as possible. But here are three points I usually check before deciding whether to use a package:
Does it have a future? (stars, downloads, last update)
Are a lot of people working on it? (forks, issues, pull requests)
Does it have too many dependencies?
Choosing wisely can keep you from falling into a development nightmare.
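As a rough sketch of how to triage these warnings from the command line (the package names below are just placeholders):

    # Show the full vulnerability report, grouped by severity
    npm audit

    # Only return a non-zero exit code for high/critical findings (useful in CI)
    npm audit --audit-level=high

    # See which of your direct dependencies pulls in the vulnerable package
    npm ls some-vulnerable-pkg

    # Check how actively maintained a candidate package is before adopting it
    npm view some-candidate-pkg

The exact flags differ a bit between npm versions, so treat this as a starting point rather than a recipe.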

Related

Excluding New Dependencies in Gradle File

I have an app that runs perfectly without updating to newer dependencies like this one:
A newer version of androidx.navigation:navigation-fragment-ktx than 2.4.2 is available: 2.5.0
If I upgrade to v2.5.0, my app shows warnings about unrelated elements, for example references to menu objects.
Should I ignore Gradle warnings like the notice above and wait until another upgrade comes along, then try the new dependency?
You shouldn't need to update anything unless you have a reason to. It's often a good idea (along with new features you also get bug fixes), but it's usually not required. Specifying all your dependency versions means you get a repeatable build that should always work, as long as those versions are available.
The thing about libraries is they often have dependencies on other libraries, and updating one might introduce a requirement for other stuff to be updated (which might be why you're seeing other errors appear). That broadly shouldn't happen (making things independently updateable is part of the reason for breaking everything out into separate libraries!), but here's a blog post from when they introduced it:
Starting with the AndroidX refactor, library versions have been reset from 28.0.0 to 1.0.0. Future updates will be versioned on a per-library basis, following strict semantic versioning rules where the major version indicates binary compatibility. This means, for example, that a feature may be added to RecyclerView and used in your app without requiring an update to every other library used by your app. This also means that libraries depending on androidx may provide reasonable guarantees about binary compatibility with future releases of AndroidX -- that a dependency on a 1.5.0 revision will still work when run against 1.7.0 but will likely not work against 2.0.0.
Really you have to look at the release notes for a library to see if there are any breaking changes you need to worry about. For example, here's the one for the Activity Jetpack component; if you search for "dependency changes" you'll see where updating actually requires a specific minimum version of another library.
Also sometimes a library will pull in an old version of another library it depends on, so you might be explicitly interacting with a very old version of a component just because you never added it as a dependency yourself. Then if that first library requires a much newer version of that dependency, you might suddenly get a large jump that requires a bunch of changes to your code, even though it doesn't seem to have anything to do with what you updated!
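When that happens, a hedged way to see what is actually being resolved (":app" and the configuration name are placeholders for your own module and variant) is Gradle's built-in reporting tasks:

    # Print the full resolved dependency tree for one configuration of the module
    ./gradlew :app:dependencies --configuration debugRuntimeClasspath

    # Ask Gradle why a particular library ended up at the version it did
    ./gradlew :app:dependencyInsight --dependency androidx.navigation:navigation-fragment-ktx --configuration debugRuntimeClasspath

The dependencyInsight output shows which other dependency (or version conflict) forced the version you are seeing.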

Can a change to package-lock.json ever affect the deployment?

I'm reading the NPM docs about package-lock.json and my interpretation is that a committed change to it can never cause issues in the deployed version.
During the roll-out we run npm install, which creates (or overwrites) the lock file anyway. In my mind, the lock file is more of a receipt of the state of the current world while installing, rather than a pointer on how the installation should be performed.
However, I haven't been successful convincing my team that it is so. They feel uneasy relying on the statement above (not contradicting it nor arguing against it, just not entirely convinced to the degree that they would bet a testicle on it).
Is it at all possible that package-lock.json might affect the actual installation?
Since I'm new with the company, my track record of 10+ years has limited impact. And I'm myself humbly considering that even though the lock file never caused me any issues before, my experience might be irrelevant if the local environment is configured in a way I'm not familiar with yet. So I'm too cautious to bet my reputation as we're about to make a very important release.
In my mind, the lock file is more of a receipt of the state of the current world while installing, rather than a pointer on how the installation should be performed.
Maybe I am interpreting your statement wrong, but package-lock is a pointer for future installations in a way. See the general documentation on lock files (a different link than the one you shared); the following statement from that doc might be helpful:
This file describes an exact, and more importantly reproducible node_modules tree. Once it’s present, any future installation will base its work off this file, instead of recalculating dependency versions off package.json.
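One practical consequence, assuming a reasonably recent npm (5.7 or later, where npm ci exists): npm install may rewrite the lock file if it disagrees with package.json, while npm ci installs exactly what the lock file records. A rough sketch of the difference:

    # May update package-lock.json if it has drifted from package.json
    npm install

    # Removes node_modules and installs exactly the versions pinned in package-lock.json;
    # fails instead of silently updating if the lock file and package.json disagree
    npm ci

So if a deployment runs npm install, the lock file is indeed mostly a record; if it runs npm ci, the lock file drives the installation.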
Reading the following discussion on this topic might be helpful to you too. Thanks!

How to stop or limit indexing in IntelliJ 13?

My IntelliJ 13.1.5 constantly indexes my project, which really slows my machine down. It does it when I rebuild my project as well as when I start my Jetty server.
Does anybody know how to disable or at least limit that behavior?
The previous version didn't do that so often.
Actually, I found what was wrong.
One of my modules didn't have the target folder excluded, which caused IntelliJ to keep re-indexing it, and since that module is big it would take forever.
Solution:
Go to "Project Structure" -> "Modules" and excluded all target folders.
Starting from IntelliJ 2017.2, indexing can at least be paused.
To other unfortunate souls working for enterprise, mostly on VDIs without an SSD: IDEA actually parses/indexes a lot more than your project folders. Likely candidates that make your whole day a rant session:
Libraries and linters specified at a global level, for example "Languages & Frameworks / JavaScript / Libraries" or "TypeScript / TSLint / TSLint Packages". If you work in multiple languages, this can bloat your index quite a lot. It's usually much better to open just the one small part of a project related to what you are working on, to keep the index as small as possible.
As mentioned before: target and node_modules folders
dist, mock, and resource folders
Do not open multiple projects/modules in the same project scope. In theory this saves you time because you don't have to wait to reopen the given module in another window, but in reality you are just adding more stuff to index. If you happen to git pull a project with 5-6 different modules, your IDEA will go into stasis for half an hour to index all the changes.
Try invalidating the cache and restarting IntelliJ.
I had a similar issue; it was solved with this:
IntelliJ IDEA caches a great number of files, therefore the system cache may one day become overloaded. In certain situations the caches will never be needed again, for example, if you work with frequent short-term projects. Also, the only way to solve some conflicts is to clean out the cache.
To clean out the system caches:
On the main menu, choose File | Invalidate Caches/Restart. The Invalidate Caches message appears.
Source link.

How to break a maven build when dependencies are out of date?

I love the maven-versions-plugin but sometimes I forget to run it for a while. Is there a way to make a maven build fail (and thus have a continuous build fail) when certain important dependencies are out of date?
I think you're approaching this incorrectly. Mail yourself the output of the maven-versions-plugin if you want, but don't fail the build due to changes outside of your control.
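For the "mail yourself the output" part, a minimal sketch (assuming the versions-maven-plugin can be resolved from Maven Central; no POM changes are needed for these goals):

    # List newer versions available for your direct dependencies
    mvn versions:display-dependency-updates

    # The same idea for plugins and for properties that hold version numbers
    mvn versions:display-plugin-updates
    mvn versions:display-property-updates

If you really do want the build to go red, you could grep that output in a CI shell step, but that is a CI-side convention rather than a Maven feature.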
Even more, why would you want to needlessly update to the latest versions? I have seen many tricky problems appear due to upgrades which have brought slight changes to previous behaviour.
Updating versions automatically is, in general, a bad practice. There is no practical reason to use the latest version of any package. If the library you're using satisfies your requirements, you should stay with that version for security/stability reasons. And forever.
I think that maven-versions-plugin is an anti-pattern itself.
P.S. When and if you want to do integration testing of modules developed by different teams/programmers, that is "integration testing". Even in this case I still think that on-the-fly version updating is the wrong approach. The root project should not do this integration testing; instead, every sub-module (or JAR, in your case) has to be responsible for integration testing of itself together with the rest of the system. When a sub-module increases its version, it has to validate that everything is still fine, and only then release the new version to the repository. And while the sub-module is doing that validation, it has to depend on statically specified version numbers.

Debugging Maven's "The artifact has no valid ranges"

We're using Maven at work and quite regularly we get the error message "The artifact has no valid ranges". After a long time of Googling and experimenting I realised what this error message means: the artifact does have valid ranges, just too many of them.
For example, my master POM has a dependency on superframework v.1.0 only, but there is also a transitive dependency on superframework v.0.5-0.9.
Until now, whenever I had such a problem, I've looked at the (very cryptic) error message and sort of guessed which POM I needed to change, which is basically a lot of trial and error. The problem is that mvn dependency:tree doesn't work if you have a dependency resolution problem.
The Eclipse plugin sometimes helps a little, but sometimes it is way off.
Any tips on how to resolve these problems?
This might not be the expected answer but my advice would be to actually not use dependency ranges as they worsen build reproducibility.
I prefer to use fixed versions (which also makes dependency conflict resolution easier; see the note at the bottom of 9.4.3. Dependency Version Ranges) and to use the Dependency Convergence report intensively to manage them.
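In case it helps, a sketch of how that report and the effective build can be inspected (goal names assume the standard maven-project-info-reports-plugin and maven-help-plugin):

    # Generate the Dependency Convergence report (written under target/site/)
    mvn project-info-reports:dependency-convergence

    # Dump the fully merged POM to see which versions and ranges your build actually declares
    mvn help:effective-pom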
This isn't a direct answer to my question but rather a word of advice. I've learned something new since asking the question: the order in which dependencies are listed in the POM files, much to my surprise, does matter.
So, if you include a dependency on
superframework [0.5,1.5)
it will fetch the latest available version, say 1.1.
If you then have a transitive dependency further down that includes
superframework [0.5, 1.0)
Maven will generate this misleading error, since it will not select anything other than the 1.1 it already has, even though it could just select 0.9 without producing a version conflict. If you swap the order, weirdly, it works.
Am I right in thinking that this is a flaw in Maven's behaviour?