IntelliJ unused @Inject fields are not detected - intellij-idea

I am trying to clean up my code and delete @Inject fields in my classes which are not used. I tried to follow this https://blog.jetbrains.com/idea/2009/04/global-unused-declaration-inspection/ in order to achieve this. Somebody in the comments of that post seems to have succeeded in making it work, but it didn't work at all for me. As soon as I add @Inject over my fields, there is no warning, even if the injected variable is private and never used!
Is this not supported for some reason, or is there somewhere else I need to change the settings? What I did was enable the option for private fields in the unused declaration inspection. I even tried to add an entry point for the @Inject annotation. Needless to say, none of this worked.
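To illustrate, here is a minimal made-up example of the kind of field I expect the inspection to flag (the class is hypothetical and only assumes javax.inject is on the classpath):

import java.util.logging.Logger;
import javax.inject.Inject;

public class ReportService {

    // Injected but never read anywhere in this class; I would expect
    // the "Unused declaration" inspection to warn about this field.
    @Inject
    private Logger auditLogger;

    public void generateReport() {
        // does its work without ever touching auditLogger
    }
}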

Related

@NotNull, @Nonnull etc. all don't work in IntelliJ IDEA

I have tried annotating a field with
org.checkerframework.checker.nullness.qual.NonNull
org.jetbrains.annotations.NotNull
javax.annotation.Nonnull
And in all cases, assigning a null to it generates no complaints from IntelliJ 2016.2.
public class GreetingController {
    @NotNull Integer x = 3;
    public void foo() { x = null; }
}
That all compiles fine according to IntelliJ.
This page from IntelliJ specifically states that "IntelliJ IDEA highlights the problems “on-the-fly”, so you can see the inspection results right in the editor." I have even copied the example code (public class TestNullable) into my editor and it produces no errors.
This other page from IntelliJ states you can change the annotations it responds to. So I chose javax.annotation.Nonnull and made sure that was the one I was using in my code, still no luck.
To be clear, what I'm hoping for, and what I understand should be provided, is that the editor window / compiler alerts me to the problem (I am not looking for a runtime check, NullPointerException already works fine at runtime.)
In case it didn't work in real time, I tried "Rebuild Project".
I'm sure this must work, what am I doing wrong?
I have uploaded an example of this not working here: ZIP download.
As I can see from your screenshots and the sample project, IntelliJ IDEA does show you the warnings. Note that these warnings are produced by code inspections, which run on the fly and are displayed in the editor or in the Analyze | Inspect Code results. These warnings are not produced by the compiler.
Note that you can configure how the warnings are highlighted if needed (for example, add the underwave effect).
You can also change the severity of the inspection (for example, to Error).
You may also want to vote for this feature request:
IDEA-78625 Provide inspection severity level that will work like validation and abort compilation
As a bonus, pay attention to the javax.annotation.Nullable annotation: it may not be what you think it is for, see this comment and the documentation. For some years IntelliJ IDEA has incorrectly suggested using this annotation, while the correct one for such cases would be javax.annotation.CheckForNull:
This annotation is useful mostly for overriding a Nonnull annotation. Static analysis tools should generally treat the annotated items as though they had no annotation, unless they are configured to minimize false negatives. Use CheckForNull to indicate that the element value should always be checked for a null value.
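A small illustrative sketch of the difference (the repository class and method names are made up; it assumes the JSR-305 javax.annotation classes are on the classpath):

import javax.annotation.CheckForNull;
import javax.annotation.Nullable;

public class UserRepository {

    // Per the quoted documentation, analysis tools may treat a @Nullable
    // element as if it had no annotation at all (it is mostly useful to
    // override an enclosing @Nonnull default).
    @Nullable
    String findNickname(String userId) {
        return null; // stand-in for a lookup that can fail
    }

    // @CheckForNull tells static analysis that every caller must check
    // the result before dereferencing it.
    @CheckForNull
    String findEmail(String userId) {
        return null; // stand-in for a lookup that can fail
    }

    void demo() {
        // A nullability inspection should flag this dereference,
        // because findEmail() is declared @CheckForNull.
        System.out.println(findEmail("42").length());
    }
}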
"Settings" > "Inspections" > "Probable Bugs" > "Constant conditions & exceptions"
Tick the first option: "Suggest @NotNull annotation for methods that possibly return null and report nullable values passed to non-annotated parameters".
Click "Configure Annotations". By default, Intellij will use their own annotations from org.jetbrains.annotation. I was using the more general (my own opinion) annotations from javax.annotation.
I set Nullable to: javax.annotation.Nullable
I set NotNull to: javax.annotation.Nonnull
In order to set these new options, you must select each annotation and then click the tiny checkmark button to the right to mark it as the one to use. Simply selecting the javax.annotation annotations and hitting "OK" will NOT lock in the new settings; you must use the checkmark button.
After successfully specifying javax.annotation.Nullable and javax.annotation.Nonnull, the code correctly highlighted the null problems.
The best this can do is offer warnings. It will not stop compilation, since the annotations do not prevent the code from compiling.
Be sure that you have the appropriate inspections enabled in your IDE, and be sure that you remain aware of what parameters you're passing into your method. The IDE can at best warn you, but it can't really stop you.
Alternatively, introduce a unit test to fail if that method receives a null parameter, and rely on that to ensure that you're not breaking code or expectations.
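For instance, a minimal sketch of such a test (JUnit 4 style; setX is a hypothetical setter standing in for whatever method must not receive null):

import org.junit.Test;

public class GreetingControllerTest {

    // Hypothetical test: passing null is expected to blow up immediately,
    // so a caller that breaks the contract fails the build instead of
    // surfacing as an NPE somewhere downstream at runtime.
    @Test(expected = NullPointerException.class)
    public void setXRejectsNull() {
        GreetingController controller = new GreetingController();
        controller.setX(null); // setX is a hypothetical setter for the x field
    }
}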

How do you Cache-bust individually rendered files while debugging?

Currently it is impossible for devs to easily work together. While debugging, our code minification and bundling are turned off, and so is the cache buster. This leads to every dev who touches JavaScript having to open every JavaScript file and force-refresh it to make sure they aren't missing changes.
I found a couple of references that I thought might work, but none of the implementations have worked out yet.
The first is to apply a transform to the individual Bundles via an IBundleTransform.
Public Class DebugCacheBuster
    Implements IBundleTransform

    Public Sub Process(context As BundleContext, response As BundleResponse) Implements IBundleTransform.Process
        ' Leave the paths alone when optimizations (bundling/minification) are on;
        ' the built-in cache buster handles that case.
        If BundleTable.EnableOptimizations Then
            Exit Sub
        End If

        ' Append a hash to each file's virtual path so the browser re-fetches
        ' files that have changed. GetPathHash (not shown) hashes the mapped file.
        For Each file As BundleFile In response.Files
            file.IncludedVirtualPath &= GetPathHash(HostingEnvironment.MapPath(file.IncludedVirtualPath))
        Next
    End Sub
End Class
This looked promising, but I have not been able to get it to work. I tried adding a new instance of this class to the constructor of each bundle, and I also tried looping over all of the bundles after they were created. My breakpoints are hit and IncludedVirtualPath appears to have been updated, but after continuing on with rendering, the paths are not updated.
I also tried to create a custom VirtualPathProvider and a custom VirtualFile and overrode VirtualPath to return the correct value but again, when it rendered, the path was bare.
Did I do something wrong with the transform? Is there some other way to implement this?
Apparently this code will not work with version 1.1.0 of System.Web.Optimization. After upgrading to version 1.1.3 (and adding an assembly binding redirect to solve a compatibility issue with WebGrease), the snippet in the question works flawlessly.

Play Framework 2.1.1: bindFromRequest() returns the correct data but ignores all data pertaining to relations

I have a form that is supposed to create an entity of type Load, but for some reason it doesn't seem to be actually passing or seeing any of the data related to associations of the entity (load.user, load.client, etc.). This all used to work fine but stopped working at some point during a bunch of refactoring (which didn't change any of the fields in any of the models). Now all of the forms in my website have broken in the same way and I have no clue where to even look to start fixing it.
From the view, I submit the form for a new Load, printing out the data everywhere I can along the way. Printing out the data being sent to the server before it's sent shows all the data is there like it should be. Printing out Form.form(Load.class).bindFromRequest() in the controller shows the form's data contains everything needed, for example, the value user.id=1 is in the data. However, there is also a validation error saying that the user is missing. How can this be?
Form(of=class models.Load, data={ a bunch of stuff, user.id=1, a bunch more stuff}, value=None, errors={=[ValidationError(,Logged in user is missing or invalid.,[])]})
The validation error is being generated by public String validate() in the Load class, which simply checks if (user == null) and returns that string if it is. I should note that every form that submits multiple entities (for example, submitting a Dock and also the Dock's Location) only saves the main entity (in this example, the Dock) and ignores all the others (the Dock's Location is never saved, even though Dock cascades in the model to also save the Location). All of our form fields are labelled correctly, and this code did work at some point before it mysteriously stopped working!
So why did all of my forms suddenly stop correctly dealing with anything but the main model for the form? It is as if they cannot even "see" the data contained in bindFromRequest(). If I print out a variable in the validation method of Load, such as this.status, it prints the correct thing. But if I try to print something like this.user.id or this.client.id I get a null pointer error. Where is the code in Play that actually interprets the data (user.id=1) and turns it into the User associated with the Load, and how could it be breaking?
Edit: Also, yes, I did try "play clean", it was the first thing I tried since usually it fixes weird errors like these! But this time, no dice.
Edit2: I'm including the html from the form, in case it is helpful.
<input type="text" id="user_id" name="user.id" value="1" class="idfield">
Edit3: The only change I made during the refactoring that might have influenced this is that I had to make some setter methods like Load.setBroker() because the ones that are supposedly generated by Play didn't work. For example, load.broker=aBroker would not have set the Load's Broker before, so I had to make a public void setBroker(Broker broker) method in Load. Does Play use the auto-generated setters to bind the data? Could overwriting them cause problems?
Whoops, I figured it out. It was the setters I had written. Some of them were accidentally declared private, and apparently this was preventing Play from setting the values when binding the data. I changed them all to public and the mystery error vanished.
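For anyone hitting the same thing, here is a rough before/after sketch (the model is simplified and partly assumed, e.g. the Ebean base class; User is the question's own model class; only the setter visibility is the point):

public class Load extends play.db.ebean.Model {

    public User user;

    // This is roughly what broke binding: a private setter.
    // Play's form binder cannot invoke it, so "user.id=1" from the request
    // is silently dropped and validate() sees user == null.
    //
    // private void setUser(User user) { this.user = user; }

    // Declaring the setter public again lets bindFromRequest()
    // populate the association from the submitted user.id field.
    public void setUser(User user) {
        this.user = user;
    }

    public String validate() {
        return (user == null) ? "Logged in user is missing or invalid." : null;
    }
}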

"No appropriate method" error generated when calling new function using class-defined object

I defined a class called "FilterCriteria" which has a bunch of function .m files (getAMask, getBMask, etc.) associated with it. When I create the FilterCriteria object and call the functions using it, I don't have any problems. However, recently I added another function (which, on a side note, is almost identical to another function that still works), and MATLAB returns the error, "No appropriate method, property, or field getHMask for class FilterCriteria."
I've searched online for this problem, but I can't find anything. The file getHMask.m is definitely in the correct folder, so I don't understand why MATLAB seems to have such a problem finding it.
Here's getHMask.m's header:
function mask = getHMask(object, quadrant, channel)
Any help would be greatly appreciated. Thanks in advance.
1) A mistake I make sometimes is not saving the file with the correct name. Make sure the capital letters are in the right places, etc.!
2) Another layer of error checking here... You can call methods(yourObject) (see here) and make sure it lists the method (function) that you are trying to add. If it doesn't show up there, you should look into the implementation of the method and make sure it's correctly being added to the class you're using for your object.
I had the same problem that's kind of suggested by Ben's bullet #2, and it was driving me crazy. It turns out MATLAB wasn't loading the latest version of my class's m-file. I vaguely remembered it gave me a warning about that earlier: because there were old instances of the class in the workspace, and to keep from invalidating them, it said it wouldn't update the class until I cleared the workspace...
So if that's the problem, restarting MATLAB will work, or you can just enter >> clear

Weird JavaCore IType cache problem

I'm developing a plugin that takes all enums in the workspace that implement a certain interface (IDomain), parses the code (using AST), does some modification to the enum, and marks it as processed with an annotation (@IDomainInfo).
For example, it takes something like this:
public enum SomeEnum implements IDomain {
    // ...
}
And generates something like this:
@IDomainInfo(domainId = 1)
public enum SomeEnum implements IDomain {
    // Some changes here...
}
The idea behind @IDomainInfo is that annotated enums should not be processed again by the plugin.
Basically, what I do to accomplish the task is run a search with the Java Search API to find all the enums implementing IDomain (easy task); as a result I get a list of IJavaElements (which are in fact instances of IType). Then I call a method that iterates through the resulting list and builds a new list of all the IType instances that are not annotated with @IDomainInfo, and then I process that list: for each non-annotated IType, do some work, annotate the IType with the @IDomainInfo annotation (using AST), and then save the result back to the file (using IFile, so I can see the changes without a refresh; in fact, if I have the enum open in the editor I see it refreshed instantly :-)
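For reference, a rough sketch of the kind of search I mean (illustrative only, not my exact plugin code):

import java.util.ArrayList;
import java.util.List;
import org.eclipse.core.runtime.CoreException;
import org.eclipse.core.runtime.NullProgressMonitor;
import org.eclipse.jdt.core.IType;
import org.eclipse.jdt.core.search.*;

public class DomainEnumFinder {

    // Find all types in the workspace that implement IDomain,
    // keeping only the enums among the matches.
    List<IType> findDomainEnums() throws CoreException {
        final List<IType> result = new ArrayList<>();
        SearchPattern pattern = SearchPattern.createPattern(
                "IDomain",
                IJavaSearchConstants.INTERFACE,
                IJavaSearchConstants.IMPLEMENTORS,
                SearchPattern.R_EXACT_MATCH | SearchPattern.R_CASE_SENSITIVE);
        new SearchEngine().search(
                pattern,
                new SearchParticipant[] { SearchEngine.getDefaultSearchParticipant() },
                SearchEngine.createWorkspaceScope(),
                new SearchRequestor() {
                    @Override
                    public void acceptSearchMatch(SearchMatch match) throws CoreException {
                        Object element = match.getElement();
                        if (element instanceof IType && ((IType) element).isEnum()) {
                            result.add((IType) element);
                        }
                    }
                },
                new NullProgressMonitor());
        return result;
    }
}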
All that works fine, but if I open an @IDomainInfo-annotated enum (just for testing), remove the @IDomainInfo, save the file (I'm sure of that), and then call the action that does all the work I've described before, when I get to the part that filters annotated ITypes from non-annotated ones, the code is something like this:
for (IType type : typeList) {
    IAnnotation annotation = type.getAnnotation("IDomainInfo");
    if (!annotation.exists()) {
        // The annotation does not exist, so add the type to the
        // list of elements to update and go on...
        ret.add(type);
        continue;
    }
    // Something else here...
}
Well, it turns out that for the file I've just saved, the IType still reports the annotation I've just removed, as if it were still there. If I close and reopen Eclipse, everything works normally.
Now, I've checked and triple-checked my code, so I'm sure I'm not keeping a stale copy of the old IType that still carries the annotation (all my ITypes come from a fresh Java search call every time I run the action).
So the question is: what might I be doing wrong? I mean, I've read the JavaCore API documentation many times to check whether I'm using it wrong or have some conceptual flaw there, but I really have no clue; it's as if Eclipse were caching the IType and ignoring the changes I've just made in the editor :-/
If anyone has an idea I would appreciate it a lot :-)
When or how is your plugin called? Did you register a resource listener, or is it a project builder, or something else? If it is called by a resource listener, your plugin may be reading the 'primary copy' for your IType, which has not been saved yet. Hence your changes are still in the working copy.
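If that turns out to be the cause, here is a rough sketch of how you might force the Java model to reflect the current contents before reading the annotation (illustrative only; whether you need reconcile(), makeConsistent(), or simply different timing depends on how your action is triggered):

import org.eclipse.core.runtime.NullProgressMonitor;
import org.eclipse.jdt.core.ICompilationUnit;
import org.eclipse.jdt.core.IType;
import org.eclipse.jdt.core.JavaModelException;

public class TypeRefresher {

    // Illustrative only: try to make sure the IType is backed by up-to-date
    // contents before its annotations are inspected.
    void refreshBeforeReading(IType type) throws JavaModelException {
        ICompilationUnit cu = type.getCompilationUnit();
        if (cu == null) {
            return; // binary type (from a class file); nothing to reconcile
        }
        if (cu.isWorkingCopy()) {
            // Re-run reconciliation so the Java model picks up the latest buffer contents.
            cu.reconcile(ICompilationUnit.NO_AST, false, null, new NullProgressMonitor());
        }
        if (!cu.isConsistent()) {
            // Bring the compilation unit back in line with its underlying buffer.
            cu.makeConsistent(new NullProgressMonitor());
        }
    }
}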