I've been experimenting with JSDT.
Under Indigo, the validation has stopped complaining about unknown fields of objects; in Helios, it does complain. This applies to all objects as far as I can tell, but here's one example.
I have a JSDT user library that, among other things, documents a log object:
log = function(){};
log.prototype = new Object();
log.debug = function(str){};
log.info = function(str){};
log.warn = function(str){};
log.error = function(str){};
And if I type
log.
into a JavaScript editor, I get a completion list that includes info, warn, etc.
If I continue and type
log.foobar()
in Helios, an error is detected and the message is about foobar not being known. This is good and what I want.
In Indigo, nothing. No error; it will happily accept any old garbage. (I know that's what JavaScript allows, but the point of JSDT is to do some inference and point out potential problems like this.)
Is there some preference or option I've missed?
I do want to use Indigo since JSDT in Helios is more than a little buggy and I'm hoping Indigo is better.
There are too many dynamic ways for a foobar property to be added to the log object at runtime, and the resulting false positives can drown out real problems, so this check was removed in Indigo.
Related
I am struggling to prevent IDEA from complaining about an assignment to a variable declared with the def keyword.
The assignment is absolutely safe as far as I can tell (I observe the same for literals of other types assigned to def-declared variables). I do not see why assigning an Integer to an Object would be a problem in the first place.
There does not seem to be any explanation in the inspection settings (Groovy > Assignment issues > Incompatible type assignments), nor any fine-grained configuration to tune this. When the inspection is turned off completely, it stops reporting real problems like Integer a = "" as well.
How do I get it to report real problems without highlighting safe assignments?
IntelliJ IDEA cannot really grok the Groovy code unless both a Groovy and a Java SDK are configured. When the project was imported, Groovy got configured automatically, but Java did not.
I tried to use the Community edition of IntelliJ IDEA for a proprietary scripting language similar to Java (I don't do actual Java development for now), and it generally works quite well for simple refactoring and type hinting.
Because of mainly two differences in the language (boolean is called bool, and the API calls don't expect java.lang.String but API-specific string types), each class file is littered with errors, which makes it useless for quickly finding actual errors in that scripting language with IDEA's inspections.
Examples (the error IDEA shows is in the comment):
bool boolTypeVariable = true; //cannot resolve symbol bool
if(boolTypeVariable) //Incompatible types: required boolean, found bool
nlvm.lang.String str = "a String"; //Incompatible types: required nlvm.lang.String, found java.lang.String
(Other errors arise when using !bool, bool && bool, and so on.)
Is it possible to ignore just these specific kinds of inspections, or preferably to make IDEA inspect these cases as desired with small modifications?
I'm aware that it's possible to define custom languages for IntelliJ, but I'm mostly aiming for a quick solution (which can be dirty, as long as it works for this case).
I would also accept a suggestion for another IDE that supports type hinting, inspections and some basic refactoring, if it's easily possible to achieve this there (I just chose IDEA Community because I'm familiar with PyCharm).
If the disable-inspection intention action is not available, it means the error is reported either by the parser (such syntax is not supported) or by a validation that cannot be disabled (e.g. the class bool must be present in the classpath, i.e. in the JDK or module libraries, or javac would report the error). To completely remove validation errors you can disable highlighting for the file.
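For illustration only, here is a minimal sketch of what "the class bool must be present in the classpath" could mean in practice. The stub below is purely an assumption, not part of the real nlvm API, and it would only silence the symbol-resolution error, not the type-mismatch warnings:

// Hypothetical stub (an assumption for illustration, not part of the real nlvm API):
// a class literally named `bool`, placed in a source folder or library on the module's
// classpath, lets IDEA resolve the symbol. Warnings such as
// "Incompatible types: required boolean, found bool" on if(...), !bool and bool && bool
// would still remain, because the Java rules require the primitive boolean there.
public final class bool {
    private bool() {
        // never instantiated; this type exists only to satisfy symbol resolution
    }
}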
I have an errors.rs file with error_chain! {}, which exports Result, ResultExt, Error and ErrorKind.
If I use self::errors::*, IntelliJ thinks that I'm using the default Result (std::result::Result, I think). However, if I explicitly import the types using use self::errors::{Result, ...}, everything works out hunky dory.
I can tell because the standard result has two type params, but the error_chain one has only one.
In either case, it still compiles.
I'm using the standard Rust IntelliJ plugin, version 0.1.0.1991.
Help! Does anyone know how to get the plugin to understand what the macro is doing?
The IntelliJ-Rust plugin uses its own code parser. This allows it to leverage all the IntelliJ platform capabilities (like code navigation, formatting, refactoring, inspections, quick documentation, markers and many others), but it requires implementing all the language features, which is not a simple task for Rust (you can find a more in-depth discussion of the Rust compiler parser versus an IDE parser in this reddit post).
Macro expansion is probably the biggest language feature not supported by the plugin parser at the moment. That is, the plugin sees this error_chain! call and can resolve it to its definition, but doesn't expand it to the actual code and hence doesn't know about the new Result that shadows the one from the stdlib. Unfortunately, in some cases this leads to such false-positive error messages.
I've converted this error annotation into an inspection, so in the next plugin version you'll be able to switch it off entirely or for a particular code block. Work on macro expansion is also in progress.
I have a new Java project that uses Gradle (version 3.4.1) and IntelliJ. I used gradle init at the command line and then imported the project into IntelliJ, using the default wrapper (which is recommended, according to the IDE).
I have the following in the build.gradle file:
jar {
    manifest {
        attributes 'Main-Class': "com.testerstories.textadv.Voxam"
    }
}
The attributes line, however, produces a consistent warning that says:
'attributes' cannot be applied to '(['Main-Class':groovy.lang.GString])'
The issue appears to be "incompatible types." But it's unclear what "attributes" is being "applied to". I would presume the manifest block. But then ... what does it want me to do exactly?
I realize this is a warning and I could just ignore it. I also realize I can block inspections. But in all the discussions about this, and I realize there are a few (although most of those seem to deal with 'groovy.lang.Closure'), I have yet to find anything that unambiguously states quite simply what the actual issue is and how to resolve it. I say "resolve it" in contrast to (1) just cover it up or (2) pretend it doesn't exist.
It concerns me that something that appears to be very basic Gradle usage is either not recognized or is indicating a problem in one of the core IDEs that support Java.
Finally, I'll note that before adding to the StackOverflow space, I had asked JetBrains support and their answer was to make sure my .iml file was indicating a "JAVA_MODULE" rather than a "WEB_MODULE", which I did verify. Their next answer was that I needed to provide (quoting directly) a "namespace to resolve the ambiguity, which should resolve erroneous type errors." Ironically, perhaps, that reply needed some ambiguity resolution of its own, but I have gotten no further responses.
UPDATE (RELATED TO ABOVE; BUT DIFFERENT CONTEXT)
If I use something like this in my Gradle:
systemProperties(System.getProperties())
I'm told that getProperties() is an ambiguous method call. Originally I thought that was unrelated to my original issue, but the IntelliJ inspector reports that this is also an issue with incompatible types.
The conclusion I'm reaching -- and what will probably be the "answer" -- is that Gradle and IntelliJ just don't play nice with each other in a variety of contexts.
I wrote a custom remote debugger for a specific environment. However, the remote environment performs several optimizations that move or delete pieces of the original code, and therefore it can't accept all breakpoints. Before a debugger session starts and connects to the remote runtime, we can't predict which of the breakpoints can't be set. I would like to keep these breakpoints as they are set in the editor, but when the debugger starts, it must somehow tell the user that certain breakpoints are invalid. I think these breakpoints should look different, but I haven't found API methods for this purpose. I tried to set IMarker attributes such as IMarker.PROBLEM and IMarker.SEVERITY, but it didn't help. What is the best way to do this?
This code snippet from the Eclipse Debugger guide is probably what you are looking for:
public PDALineBreakpoint(IResource resource, int lineNumber) throws CoreException {
    IMarker marker = resource.createMarker(
            "org.eclipse.debug.examples.core.pda.lineBreakpoint.marker");
    setMarker(marker);
    setEnabled(true);
    ensureMarker().setAttribute(IMarker.LINE_NUMBER, lineNumber);
    ensureMarker().setAttribute(IBreakpoint.ID, IPDAConstants.ID_PDA_DEBUG_MODEL);
}
https://www.eclipse.org/articles/Article-Debugger/how-to.html
I've found a solution myself, but it looks like a dirty hack. It works only with IJavaLineBreakpoint; other languages would require a different solution, but for now Java support is enough. IJavaLineBreakpoint has an isInstalled method that indicates whether the breakpoint is installed in some JVM. Unfortunately, there is no direct way to modify this flag. The internal implementation just exposes the value of the org.eclipse.jdt.debug.core.installCount attribute. So, to set the installed property of a breakpoint, you should do the following:
breakpoint.getMarker().setAttribute("org.eclipse.jdt.debug.core.installCount", 1);
Also, you can just increase/decrease this attribute the same way. However, I am not sure that this approach is compatible across versions of JDT.
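For reference, here is a minimal sketch of how this hack could be wired up. The class name and the isAcceptedByRuntime predicate are placeholders assumed for illustration; the only real API used is the JDT breakpoint and marker calls already shown above:

import java.util.function.Predicate;

import org.eclipse.core.runtime.CoreException;
import org.eclipse.debug.core.DebugPlugin;
import org.eclipse.debug.core.model.IBreakpoint;
import org.eclipse.jdt.debug.core.IJavaLineBreakpoint;

public final class RemoteBreakpointMarkerUpdater {

    private static final String INSTALL_COUNT_ATTR = "org.eclipse.jdt.debug.core.installCount";

    // Flags a Java line breakpoint as installed (or not) by writing the internal
    // installCount marker attribute directly, as described above.
    public static void setInstalled(IJavaLineBreakpoint breakpoint, boolean installed) throws CoreException {
        breakpoint.getMarker().setAttribute(INSTALL_COUNT_ATTR, installed ? 1 : 0);
    }

    // After the remote runtime reports which breakpoints it accepted, update every
    // Java line breakpoint accordingly. isAcceptedByRuntime is a placeholder for the
    // debugger-specific check against the remote runtime.
    public static void updateAll(Predicate<IJavaLineBreakpoint> isAcceptedByRuntime) throws CoreException {
        for (IBreakpoint bp : DebugPlugin.getDefault().getBreakpointManager().getBreakpoints()) {
            if (bp instanceof IJavaLineBreakpoint) {
                IJavaLineBreakpoint lineBp = (IJavaLineBreakpoint) bp;
                setInstalled(lineBp, isAcceptedByRuntime.test(lineBp));
            }
        }
    }
}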