Why are Java binaries necessary in Sonar? - testing

I'm trying to check whether there is any difference between an analysis run on source code only and one run on the source code plus the .jar generated after compiling.
If I delete the '-Dsonar.java.binaries' property I get this error:
ERROR: Error during SonarQube Scanner execution
ERROR: Your project contains .java files, please provide compiled classes with sonar.java.binaries property, or exclude them from the analysis with sonar.exclusions property.
ERROR:
ERROR: Re-run SonarQube Scanner using the -X switch to enable full debug logging.
The command I'm using:
sonar-scanner '-Dsonar.host.url=http://192.168.1.25' '-Dsonar.projectKey=org.javaProject:myProject' '-Dsonar.projectName=myProject' '-Dsonar.sourceEncoding=UTF-8' '-Dsonar.sources=src' '-Djavax.net.ssl.trustStore=/certs'
Do you know if it is possible to analyze only the source code, without any binary files?

For SonarQube to be of any significant value, it should be run as part of a build, after the code is compiled and unit tests are run. I frankly don't know if it's possible to run a scan without class files, but I don't suggest you try to pursue that.
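For reference, once the project has been compiled, the scanner invocation would look roughly like this (a sketch assuming a Maven-style layout where the classes end up in target/classes; adjust the path to match your build):
sonar-scanner '-Dsonar.host.url=http://192.168.1.25' '-Dsonar.projectKey=org.javaProject:myProject' '-Dsonar.projectName=myProject' '-Dsonar.sourceEncoding=UTF-8' '-Dsonar.sources=src' '-Dsonar.java.binaries=target/classes' '-Djavax.net.ssl.trustStore=/certs'
The other option the error message itself suggests, excluding the Java files with something like '-Dsonar.exclusions=**/*.java', would let the scan run but would also mean the Java code is not analyzed at all.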
If you really only want to look at static analysis issues, I believe there is a tool called SonarLint that runs in Eclipse and possibly other desktop tools.

Related

Executing a Java program and writing files inside Bazel's file structure

I'm using a "ctx.action.run" command to execute a Java .jar program.
The Java program takes as inputs the files under "bazel-bin/src/files", which are the result of a preprocessing step of a special C precompilation. The Java program edits the precompiled files, mixes them, and then returns a post-edited file that has to be written into the Bazel file structure.
Can the Java program not run inside the Bazel sandbox?
I tried installing the program under Bazel's directory structure, but the error persists.
How does Bazel permit Java execution in its sandbox? And how can this Java program write inside Bazel's file structure?
If you are using ctx.action.run, that is indeed how a binary target is executed. Can you elaborate on your question with more details, such as the exact error or problem you are seeing while doing the above?

creating a Minecraft PVP client: error message when running minecraft [duplicate]

What are the possible causes of a "java.lang.Error: Unresolved compilation problem"?
Additional information:
I have seen this after copying a set of updated JAR files from a build on top of the existing JARs and restarting the application. The JARs are built using a Maven build process.
I would expect to see LinkageErrors or ClassNotFound errors if interfaces changed. The above error hints at some lower level problem.
A clean rebuild and redeployment fixed the problem. Could this error indicate a corrupted JAR?
(rewritten 2015-07-28)
Summary: Eclipse had compiled some or all of the classes, and its compiler is more tolerant of errors.
Long explanation:
Eclipse's default behavior when compiling code that contains errors is to generate bytecode that throws the exception you see, which still allows the program to be run. This is possible because Eclipse uses its own built-in compiler instead of javac from the JDK, which Apache Maven uses and which fails the compilation completely on errors. This can happen if you use Eclipse on a Maven project that you are also building with the command-line mvn command.
The cure is to fix the errors and recompile, before running again.
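To make that concrete, here is a minimal sketch (a hypothetical class and error message, not the literal output of the Eclipse compiler) of what the bytecode Eclipse emits for a broken method effectively does:
// Hypothetical illustration: Eclipse replaces the body of a method that failed
// to compile with code that throws java.lang.Error the moment the method runs.
public class Broken {

    // Imagine this method's original body referenced a symbol that does not exist;
    // Eclipse still emits a class file, but the method body becomes a throw.
    public String greet() {
        throw new Error("Unresolved compilation problem: \n\tSomeMissingType cannot be resolved to a type\n");
    }

    public static void main(String[] args) {
        System.out.println("The class loads and main() starts fine...");
        new Broken().greet(); // only here is java.lang.Error thrown
    }
}
The class loads and runs normally until the broken method is actually invoked, which is why the error only shows up at runtime.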
Try cleaning the Eclipse project.
You can also just clean the Maven build with the command
mvn clean
and after that run the following command
mvn eclipse:clean eclipse:eclipse
and rebuild your project.
Your compiled classes may need to be recompiled from the source with the new jars.
Try running "mvn clean" and then rebuild
The main part of this is correctly answered by Thorbjørn Ravn Andersen.
This answer tries to shed light on the remaining question: how could the class file with errors end up in the jar?
Each build (Maven with javac, or Eclipse) signals in its own way when it hits a compile error, and will refuse to create a JAR file from the result (or will at least prominently alert you). The most likely cause for silently getting class files with errors into a jar is concurrent operation of Maven and Eclipse.
If you have Eclipse open while running a mvn build, you should disable Project > Build Automatically until mvn completes.
EDIT:
Let's try to split the riddle into three parts:
(1) What is the meaning of "java.lang.Error: Unresolved compilation problem"?
This has been explained by Thorbjørn Ravn Andersen. There is no doubt that Eclipse found an error at compile time.
(2) How can an Eclipse-compiled class file end up in a jar file created by Maven (assuming Maven is not configured to use ecj for compilation)?
This could happen either by invoking Maven with no, or incomplete, cleaning, or because an automatic Eclipse build reacted to changes in the filesystem (made by Maven) and re-compiled a class before Maven proceeded to collect the class files into the jar (this is what I meant by "concurrent operation" in my original answer).
(3) How come there is a compile error, but the mvn build succeeds?
Again several possibilities: (a) compilers don't agree whether or not the source code is legal, or (b) Eclipse compiles with broken settings like incomplete classpath, wrong Java compliance etc. Either way a sequence of refresh and clean build in Eclipse should surface the problem.
I had this error when I used a launch configuration that had an invalid classpath. In my case, I had a project that initially used Maven and thus a launch configuration had a Maven classpath element in it. I had later changed the project to use Gradle and removed the Maven classpath from the project's classpath, but the launch configuration still used it. I got this error trying to run it. Cleaning and rebuilding the project did not resolve this error. Instead, edit the launch configuration, remove the project classpath element, then add the project back to the User Entries in the classpath.
I got this error multiple times and struggled to work out the cause. Finally, I removed the run configuration and re-added the default entries. It worked beautifully.
Just try to include the package name in Eclipse, in case you forgot it.
Import all packages before using them, e.g. import java.util.Scanner before using the Scanner class.
These changes might work, and Java will no longer report "Unresolved compilation problem".
Also make sure that the compiler compliance level and the selected JDK version are the same.
As a weird case, I encountered such an exception where the exception message ("unresolved compilation" and so on) was hardcoded inside the generated class itself. Decompiling the class revealed this.
I had the same issue using Visual Studio Code. The root cause was that a backup Java file had been left in the same directory.
I removed the backup Java file.
When the build failed, I selected "Fix it", which cleaned up the cache and restarted the workspace.

Execution failed for task ':cinteropAFNetworkingIOS'. > Cannot perform cinterop processing for AFNetworking: cannot determine headers location

This error pops up in Xcode after a Gradle build, or a similar one appears in IntelliJ.
When you come across an error like this, check the org.jetbrains.kotlin.native.home property in your gradle.properties. This property is used to replace the default Kotlin/Native compiler used by the Gradle plugin with a local distribution. Since this sample is included in the K/N repo, it has this property specified so that it uses a compiler built from sources.
So you may just remove org.jetbrains.kotlin.native.home from your gradle.properties and rerun the build.
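For illustration, the line to look for in gradle.properties is something like the following (the path here is just a made-up example):
org.jetbrains.kotlin.native.home=/path/to/local/kotlin-native-dist
Delete it (or comment it out with a leading #) and run ./gradlew build again.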
And as for the error shown in IntelliJ, you can ignore it: when you run ./gradlew build in the terminal, it is bound to occur, because that part is not handled by Gradle in IntelliJ or Android Studio. It will be recognized by Xcode when you follow the Readme.md (https://github.com/JetBrains/kotlin-native/tree/master/samples/cocoapods), unlink "Pods_ios_app.framework", and re-link it by browsing to it again to make it work.

TFS Build ignores configured Code Analysis ruleset

I have a solution that uses a hybrid .csproj and project.json combination (for NuGet management purposes). So basically the project.json file works like a packages.config file with floating-version capability.
This solution uses a custom RuleSet that is distributed via a package and is imported automatically. On the dev machine, it works without a problem.
On the build machine (that is, logged into the machine itself, working as a user), the solution also compiles without a problem.
However, when a vNext build (is this the name for the new build system?) is queued, it completely ignores the custom ruleset and just uses the StyleCop one (which is also included), which gives a bunch of warnings. Those warnings should not appear, as the custom RuleSet basically suppresses them (e.g. Warning SA1404: Code analysis suppression must have justification, Warning SA1124: Do not use regions, etc.).
As far as I have checked, there is no setting to specify the ruleset, and this works with XAML Builds. What is different in this new build system that is causing this? Is there a way to force/specify the Code Analysis Rule Set from the definition?
Thanks in advance for any help or advice on the matter.
Update/Edit
After debugging back and forth with the wonderful help of jessehouwing, I must add the following detail to my initial report (which I omitted because I did not know it was relevant):
I am using SonarQube Analysis on my build definition.
I initially did not mention it because I did not know that it replaces the code analysis at build time (and not only when it "analyzes", as I had thought).
If you are using the SonarQube tasks
The SonarQube tasks generate a new Code Analysis Ruleset file on the fly and will overwrite the one configured for the projects. These rulesets will be used regardless of what you've previously specified.
There is a trick to the naming of the rulesets through which you can include your own overrides.
More information on the structure can be found in the blog post from the SonarQube/Visual Studio team. Basically when you Bind your solution to SonarQube it will generate 2 ruleset files. One which will be overwritten during build, the other containing your customizations.
There is a toolkit/SDK to generate a SonarQube plugin for custom analyzers which allow you to import your rules into SonarQube, so it will know what rules to activate for your project(s).
If you're not using SonarQube
Yes you can specify the ruleset you want to use and force Code Analysis to run. It requires a couple of MsBuild arguments:
/p:RunCodeAnalysis=true /p:CodeAnalysisRuleset="PathToRuleset"
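For example, passed to a plain msbuild invocation (the solution name and ruleset path below are placeholders for your own), this could look roughly like:
msbuild MySolution.sln /p:RunCodeAnalysis=true /p:CodeAnalysisRuleSet="C:\Build\Rules\Custom.ruleset"
In a vNext build definition, the same switches can typically go into the MSBuild Arguments field of the build step.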
Or you can use my MsBuild helper extension to configure these settings with the help of a UI template.

Error while compiling a C++ project with devenv using Coverity. (cov-build.exe)

When running a Coverity build I get the following error:
Failed to locate msbuild.exe when handling devenv template configuration. Shutting down resident msbuild processes is impossible.
Can't find it in Google!
Does anyone know what this might mean?
How do I investigate this?
When I build from command line without Coverity it works fine.
When you start cov-build devenv, one of the things it tries to do is kill off idle msbuild.exe processes, because if they are not killed, devenv will pass the build directive to msbuild without cov-build being able to see it (and seeing it is how cov-build knows how your files are built).
There are a few ways you can resolve this - it depends on how you are invoking cov-build, how your compiler configuration is set up, etc. For example, you could call cov-build msbuild directly rather than going through devenv.
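For instance, a direct msbuild capture could look something like this (the intermediate directory and solution name are placeholders):
cov-build --dir cov-int msbuild MySolution.sln /p:Configuration=Release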
I would recommend opening a support case with Coverity (since you have support if you have a license for it). E-mail them at support@coverity.com and I'm sure they can suggest additional debugging steps.