What dependencies does "View Dependencies" in SSMS not show? - sql

I've heard that you cannot rely on SSMS's View Dependencies feature: dependencies on objects on linked servers and dependencies buried in dynamic SQL aren't shown.
Is there anything else that is not recognized as a dependency? Which dependencies are shown reliably, and which are not?

I've never done a 'proper' investigation into the behaviour, but I've certainly come to distrust the feature. I find that even local dependencies get out of date, and I simply can't base critical decisions on the results (e.g. "can I really delete that object?").
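When I do need a sanity check, I query the catalog views directly rather than trusting the dialog. A minimal sketch (the object name is hypothetical): sys.sql_expression_dependencies tracks references by name, so it can at least surface linked-server references and names that no longer resolve:

```sql
-- Find everything that references a given object, by name.
-- referenced_id is NULL when the name could not be resolved locally,
-- and referenced_server_name is non-NULL for linked-server references.
SELECT
    OBJECT_SCHEMA_NAME(d.referencing_id) AS referencing_schema,
    OBJECT_NAME(d.referencing_id)        AS referencing_object,
    d.referenced_server_name,
    d.referenced_schema_name,
    d.referenced_entity_name,
    d.referenced_id
FROM sys.sql_expression_dependencies AS d
WHERE d.referenced_entity_name = N'MyTable';  -- hypothetical object name
```

Note this still can't see references assembled inside dynamic SQL strings, so it shares that blind spot with SSMS.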

Related

Excluding New Dependencies in Gradle File

I have an app that runs perfectly without new dependencies like this one:
A newer version of androidx.navigation:navigation-fragment-ktx than 2.4.2 is available: 2.5.0
If I upgrade to v2.5.0, my app gets warnings about seemingly unrelated things, for example references to menu objects.
Should I ignore Gradle notices like the one above for now, and try the new dependency when another upgrade comes along?
You shouldn't need to update anything unless you have a reason to. It's often a good idea (alongside new features you also get bug fixes), but it's usually not required. Specifying all your dependency versions means you get a repeatable build that should always work, so long as those versions are available.
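For example (a minimal sketch; the coordinates and versions are just illustrative), pinning exact versions in the module-level build.gradle is what gives you that repeatability:

```groovy
// Module-level build.gradle: exact versions, no dynamic '+' ranges,
// so every build resolves the same artifacts (versions illustrative).
dependencies {
    implementation 'androidx.navigation:navigation-fragment-ktx:2.4.2'
    implementation 'androidx.navigation:navigation-ui-ktx:2.4.2'
}
```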
The thing about libraries is they often have dependencies on other libraries, and updating one might introduce a requirement for other stuff to be updated (which might be why you're seeing other errors appear). That broadly shouldn't happen (making things independently updatable is part of the reason for breaking everything out into separate libraries!), but here's a blog post from when they introduced it:
Starting with the AndroidX refactor, library versions have been reset from 28.0.0 to 1.0.0. Future updates will be versioned on a per-library basis, following strict semantic versioning rules where the major version indicates binary compatibility. This means, for example, that a feature may be added to RecyclerView and used in your app without requiring an update to every other library used by your app. This also means that libraries depending on androidx may provide reasonable guarantees about binary compatibility with future releases of AndroidX -- that a dependency on a 1.5.0 revision will still work when run against 1.7.0 but will likely not work against 2.0.0.
Really you have to look at the release notes for a library to see if there are any breaking changes you need to worry about. For example, here's the one for the Activity Jetpack component, and if you search for "dependency changes" you'll see where an update actually requires a specific minimum version of something else.
Also sometimes a library will pull in an old version of another library it depends on, so you might be explicitly interacting with a very old version of a component just because you never added it as a dependency yourself. Then if that first library requires a much newer version of that dependency, you might suddenly get a large jump that requires a bunch of changes to your code, even though it doesn't seem to have anything to do with what you updated!
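If you suspect that kind of transitive bump is behind your warnings, Gradle can print the fully resolved dependency tree so you can see exactly which versions ended up in the build. Something like this (the module and configuration names depend on your project):

```
./gradlew :app:dependencies --configuration releaseRuntimeClasspath
```

Lines marked with an arrow (e.g. 2.4.2 -> 2.5.0) show where a transitive dependency forced a version upgrade.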

Unable to derive module descriptor for legacy signed JAR

I'm trying to update a software system to JDK-11 using modules, and everything was going just fine right up until I slammed head-on into the aforementioned issue.
I have a legacy signed JAR that I need to incorporate for interaction with legacy systems. There's no way to update the JAR and no way to get a new version. The JAR must be signed in order to be usable (the whole "trusted code" deal and whatnot). The problem is that the JAR contains classes in the unnamed (root) package. Yeah. Stupid. Bad practice. Blablabla. It's still there, and I still need to use it.
I've not found any documentation or answers anywhere that would remotely suggest that what I need is possible. In fact, the opposite is true: everyone is adamant that in the "new"(ish) module system, no class may reside in the unnamed package.
Needless to say, I'm unable either to modify the contents of the JAR or to get at the sources to build a new one - and that's without even considering the issue of the signature...
That said: I refuse to believe the folks at Oracle would leave such a glaring oversight with regard to legacy code. As we all know, a lot of the time we have no choice but to use it for legitimate reasons, and we can't do anything to fix/update/refactor it. I would have hoped there was a mechanism added to the module system to support this, albeit for extreme cases only, etc... etc...
Disclaimer: I do fully understand why this isn't meant to be supported. What I'm having a hard time with is the lack of a workaround...
Thanks!
I've already tried:
creating a facade module that transitively requires the offending JAR (obviously no dice, same problem)
unpacking-and-repacking the module while temporarily disabling signature validation in a test env (fails because the class is apparently referenced within many other, properly-organized classes)
finding an updated module (no luck here, either)
beheading a chicken and roasting it over a pentagram while invoking the aid of ancient pagan gods (tasty, but didn't fix it)
curling up in a ball under my desk and weeping until execution succeeds (that's where I'm typing this from)...
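The closest thing to a workaround I've found so far (a sketch only, and I'm not claiming the module system officially blesses it): leave the signed JAR on the class path, where it becomes part of the unnamed module, and run everything else from the module path. The class name below is hypothetical:

```java
// Launch with the legacy JAR on the class path, NOT the module path:
//   java --module-path mods --class-path legacy-signed.jar \
//        --module com.example.app/com.example.app.Main
package com.example.app;

public final class Main {
    public static void main(String[] args) throws Exception {
        // "LegacyClient" is a hypothetical default-package class inside the
        // signed JAR. A default-package class can never be imported from code
        // in a named package, but core reflection can still load and
        // instantiate it from the unnamed module at runtime.
        Class<?> legacy = Class.forName("LegacyClient");
        Object client = legacy.getDeclaredConstructor().newInstance();
        System.out.println("Loaded legacy class: " + client.getClass());
    }
}
```

That obviously only helps for the classes you can afford to reach reflectively, which is why I'm still typing this from under the desk.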

How to break a maven build when dependencies are out of date?

I love the maven-versions-plugin but sometimes I forget to run it for a while. Is there a way to make a maven build fail (and thus have a continuous build fail) when certain important dependencies are out of date?
I think you're approaching this incorrectly. Mail yourself the output of the maven-versions-plugin if you want, but don't fail the build due to changes outside of your control.
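If you do go the reporting route, a sketch of how that might look (the plugin version here is illustrative): bind the versions plugin's report goal into the build, so the CI log (or whatever mails it to you) always includes the list of outdated dependencies without ever failing the build:

```xml
<!-- Report (don't fail on) outdated dependencies during every build.
     Plugin version is illustrative. -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>versions-maven-plugin</artifactId>
  <version>2.8.1</version>
  <executions>
    <execution>
      <phase>validate</phase>
      <goals>
        <goal>display-dependency-updates</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```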
What's more, why would you want to needlessly update to the latest versions? I have seen many tricky problems appear due to upgrades that brought subtle changes to previous behaviour.
Updating versions automatically is, in general, a bad practice. There is no practical reason for always using the latest version of a package. If the library you're using satisfies your requirements, you should stay with that version for security/stability reasons. And forever.
I think that maven-versions-plugin is an anti-pattern itself.
P.S. When and if you want to test modules developed by different teams/programmers together, that is "integration testing". Even in this case I still think that on-the-fly version updating is the wrong approach. The root project should not do this integration testing; instead, every sub-module (or JAR, in your case) has to be responsible for integration testing of itself together with the rest of the system. When a sub-module increases its version, it has to validate that everything is still fine, and only then release the new version to the repository. And when the sub-module is doing that validation, it has to depend on statically specified version numbers.

Is com.sun.org.apache same as org.apache package?

I mean, can I use the com.sun.org.apache classes (all subpackages) the same way I use the ones from org.apache (in any Apache lib)?
Will OpenJDK keep this package up to date with Apache's updates?
And will JDK 7 keep maintaining this package?
Where can I find information about that?
It is a very bad idea to use it. Once upon a time, Sun took a copy of Xerces, chock full of bugs. They made some changes. Perhaps they subtracted some bugs. We know that there are many very serious bugs that they did not subtract.
And they renamed it to com.sun.... for one reason: to tell you not to use it. At any time, in any point release, in any patch, they can change those classes incompatibly or remove them.
Further, these classes may not be in IBM's copy of the JRE, or Apple's, or (haha) Microsoft's, or JRockit's.
If you want Xerces, use Xerces. To find information about this, read the Xerces-j mailing list archive for many stern warnings from the Xerces developers about the version forked by Sun.
The fact that the classes are formally 'public' means nothing except that Sun needed to be able to new them from some other package.
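Concretely, that means depending on the real artifact rather than the JDK's internal copy; for Maven, something like this (the version is illustrative):

```xml
<!-- The genuine Xerces parser, instead of the com.sun.org.apache fork. -->
<dependency>
  <groupId>xerces</groupId>
  <artifactId>xercesImpl</artifactId>
  <version>2.12.2</version>
</dependency>
```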
Err I wouldn't, just based on the fact that they're internal classes and there is a risk of them changing over time. Use the org.apache classes instead.
No idea about the intentions with keeping them up to date, maybe try posting a message on the openjdk forum:
http://mail.openjdk.java.net/mailman/listinfo
My understanding is that this is a fork of the Apache code. At one point they were the same, but no longer. So you can't count on the same bug fixes being present in both versions.
If the documentation for this package says that it is public, then it's OK to use.
Otherwise they can pull the floor out from under you when they decide not to support it in the future.
Usually, you should not rely on anything in the JDK other than the java and javax packages.

Debugging Maven's "The artifact has no valid ranges"

We're using Maven at work, and quite regularly we get the error message "The artifact has no valid ranges". After a long time of Googling and experimenting I realised what this error message means: the artifact does have valid ranges, just too many of them.
For example, my master POM has a dependency on superframework v.1.0 only, but there is also a transitive dependency on superframework v.0.5-0.9.
Until now, whenever I had such a problem I've looked at the (very cryptic) error message and sorta guessed which POM I needed to change - basically a lot of trial and error. The problem is that mvn dependency:tree doesn't work if you have a dependency resolution problem.
The Eclipse plugin sometimes helps a little, but sometimes it is way off.
Any tips on how to resolve these problems?
This might not be the expected answer but my advice would be to actually not use dependency ranges as they worsen build reproducibility.
I prefer to use fixed versions (which also make dependency conflict resolution easier, see the note at the bottom of 9.4.3. Dependency Version Ranges) and make intensive use of the Dependency Convergence report to manage them.
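A related option, if you want version conflicts to fail the build outright rather than just show up in a report (a sketch; the plugin version is illustrative), is the enforcer plugin's dependencyConvergence rule:

```xml
<!-- Fail the build whenever two dependency paths resolve different
     versions of the same artifact. Plugin version is illustrative. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>3.0.0</version>
  <executions>
    <execution>
      <id>enforce-convergence</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <dependencyConvergence/>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```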
This isn't a direct answer to my question but rather a word of advice. I learned something new since asking the question: the order in which dependencies are listed in the POM files, much to my surprise, does matter.
So, if you include a dependency on
superframework [0.5,1.5)
it will fetch the latest available version, say 1.1.
If you then have a transitive dependency further down that includes
superframework [0.5, 1.0)
Maven will generate this misleading error, since it will not select anything other than the 1.1 it already has, even though it could have selected 0.9 without producing a version conflict. If you swap the order, weirdly enough, it works.
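In POM terms, the scenario looks roughly like this (the groupId and versions are made up for illustration):

```xml
<!-- Declared directly in the master POM: resolves to 1.1, the newest match. -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>superframework</artifactId>
  <version>[0.5,1.5)</version>
</dependency>
<!-- Meanwhile, a transitive dependency elsewhere declares
     <version>[0.5,1.0)</version>, which the already-selected 1.1
     does not satisfy - hence "no valid ranges". -->
```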
Am I right in thinking that this is a flaw in Maven's behaviour?