Kotlin internal members not accessible from alternative test source set in Gradle

Following https://docs.gradle.org/current/userguide/java_testing.html#sec:configuring_java_integration_tests and https://www.michael-bull.com/blog/2016/06/04/separating-integration-and-unit-tests-with-gradle, we are attempting to separate our integration tests from plain unit tests.
The problem we have is that internal members in Kotlin are not accessible from such tests. As per the Kotlin documentation, there is a visibility exception for the test source set:
The internal visibility modifier means that the member is visible
within the same module. More specifically, a module is a set of Kotlin
files compiled together:
an IntelliJ IDEA module;
a Maven project;
a Gradle source set (with the exception that the test source set can access the internal declarations of main);
a set of files compiled with one invocation of the Ant task.
Is there a way around it other than not trying to access them? That would call for a major refactoring of hundreds of tests and potentially the whole codebase.
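For illustration (the class, file, and source set names here are made up), the failure looks like this: a declaration marked internal in main compiles fine against the built-in test source set, but a custom source set such as integrationTest cannot see it.

// src/main/kotlin/PriceCalculator.kt (hypothetical example)
internal class PriceCalculator {
    fun total(net: Int, taxPercent: Int): Int = net + net * taxPercent / 100
}

// src/integrationTest/kotlin/PriceCalculatorIT.kt (hypothetical example)
// Fails with "Cannot access 'PriceCalculator': it is internal in ..." because the
// integrationTest source set is compiled as a separate Kotlin module from main.
class PriceCalculatorIT {
    fun checkTotal() {
        check(PriceCalculator().total(100, 20) == 120)
    }
}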

I was able to get a custom test sourceSet to access internal classes by adding the following code to my custom Gradle plugin.
// Associate the custom test compilation with the "main" compilation so it can see
// main's internal declarations (the same association the Kotlin plugin sets up for "test").
NamedDomainObjectContainer<KotlinWithJavaCompilation<KotlinJvmOptions>> compilations = project
        .getExtensions()
        .getByType(KotlinJvmProjectExtension.class)
        .getTarget()
        .getCompilations();
compilations.getByName(sourceSet.getName())
        .associateWith(compilations.getByName(SourceSet.MAIN_SOURCE_SET_NAME));
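If the build is configured directly in build.gradle.kts rather than through a plugin, the same association can be expressed like this (a sketch, assuming the Kotlin JVM plugin and a custom source set named integrationTest):

// build.gradle.kts -- sketch; "integrationTest" is an assumed source set name
kotlin {
    target.compilations.getByName("integrationTest")
        .associateWith(target.compilations.getByName("main"))
}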
I looked at the kotlin-gradle-plugin source code and found the following:
https://github.com/JetBrains/kotlin/blob/v1.3.61/libraries/tools/kotlin-gradle-plugin/src/main/kotlin/org/jetbrains/kotlin/gradle/plugin/KotlinPlugin.kt#L488-L490
With this change, the tests in my custom source set run just fine, but IntelliJ still shows compilation errors. I'll look further to see if I can make IntelliJ happy as well.

Related

How to use kotlin.parallel.tasks.in.project=true

Long ago, when Kotlin 1.3.20 was released (https://blog.jetbrains.com/kotlin/2019/01/kotlin-1-3-20-released/), the ability to build in parallel using Gradle workers was added. Simply adding the kotlin.parallel.tasks.in.project = true setting does not give any gain in build speed. As far as I understand, this parameter can only be useful if I have several folders with classes independent of each other within the same project. I saw this setting used when building Gradle itself, but nowhere did I see separate source sets being created for each folder.
Could you provide examples of how to correctly describe the build process in build.gradle.kts so that the mentioned option is actually used and improves build speed when there are several processor cores?
As of yet, there's no simple way to parallelize compilation of a single source set containing Kotlin code (like just the main sources), as the compiler has to analyze all of the sources together and resolve cross-references within the source set.
By default, without any additional options, Gradle runs compilation of Kotlin sources in parallel only in different subprojects. The option kotlin.parallel.tasks.in.project also allows Gradle to run parallel compilation tasks in one project, but that only works for different source sets (that don't depend on each other!), or different targets.
For example, in multiplatform projects, if you have several targets, kotlin.parallel.tasks.in.project allows Gradle to build the compilation outputs (JVM/Android classes, *.js, Kotlin/Native *.klibs and binaries) in parallel. In Android projects, if you build multiple product variants, this option also allows parallel Kotlin compilation for those variants.
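As a sketch (the plugin version and target choice are just examples, not a complete build), a layout where the flag can actually help is a multiplatform project with two targets that do not depend on each other, combined with the property in gradle.properties:

# gradle.properties
kotlin.parallel.tasks.in.project=true
# org.gradle.parallel=true is a separate, Gradle-level setting for parallelism across subprojects

// build.gradle.kts -- sketch of a multiplatform build with two independent targets
plugins {
    kotlin("multiplatform") version "1.3.61"
}
kotlin {
    jvm()   // compileKotlinJvm ...
    js()    // ... and compileKotlinJs can then be run in parallel by Gradle workers
}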
In simpler project layouts, where you only have main and test source sets and a single target, there's no way to improve Kotlin compilation speed by using multiple processors, unless you split one project into several projects.

Kotlin kapt: generating code through annotation processors: cannot find symbol (of generated classes) in stubs

I have a module with an annotation processor that generates quite a lot of classes I use throughout the project.
I used that module in one of my projects; although it was hell to make it work properly, it works as expected: I run a build, it generates the classes, and everything is fine.
In a new project I use the same module (just copied the code), the same config, etc., but when I run a build, the stubs replace all references to my "to be generated" classes with Class<Any>, and the build fails.
The workaround is to run the kaptKotlin task separately:
it generates the code, and then I have to build again (kapt runs again there, but no errors are thrown), and only then does it work.
But even in the kaptKotlin task I get lots of e: cannot find symbol errors in the generated stubs (though the code is generated).
I think those errors should be ignored while that task runs, or something along those lines.
How can I correct that behavior?
Or how can I make it work through a single build task, so that kapt references the to-be-generated code correctly in the stubs instead of replacing it with Class<Any>?
P.S. I'm using the latest version of Kotlin. I tried correctErrorTypes = true with no success.
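For reference, here is roughly what that setting looks like in build.gradle.kts (a sketch with made-up processor coordinates, not the asker's actual build file), together with the two-pass workaround described above:

// build.gradle.kts -- sketch; the processor coordinates are hypothetical
plugins {
    kotlin("jvm") version "1.3.61"
    kotlin("kapt") version "1.3.61"
}
dependencies {
    implementation("com.example:my-annotations:1.0")  // hypothetical
    kapt("com.example:my-processor:1.0")              // hypothetical
}
kapt {
    // Keep error (not-yet-generated) types in stubs instead of replacing them with
    // NonExistentClass; this is the option the asker already tried.
    correctErrorTypes = true
}
// The two-pass workaround from the command line:
//   ./gradlew kaptKotlin   -- generates sources, still logs "cannot find symbol"
//   ./gradlew build        -- the second pass now compiles against the generated classes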

Android Studio generating new DaggerComponent.java file

I have defined my Dagger 2 component in a class named LpComponent.java, so I need to instantiate things using the DaggerLpComponent class.
However, when I update the LpComponent.java file, DaggerLpComponent is not regenerated; the only way I can get it regenerated is to clean the whole project and rebuild it.
Is there a good old make-style dependency I can specify, so that DaggerLpComponent.java depends on LpComponent.java?
Also, it's not clear to me what rule generates the DaggerLpComponent.java file. I have tried ./gradlew tasks to see if there is some Dagger-specific task that generates the file, but didn't see anything.
Dagger 2 works via annotation processing, which happens at compile time. A simple compilation of your project should trigger the Dagger 2 annotation processor to run and generate new sources. With Android, that should be minimally one of the tasks starting with "compile" that has your build type and flavor in the name.
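As a sketch (the module name, build variant, and Dagger version are assumptions, not taken from the question), the pieces that tie regeneration to compilation look like this fragment of the module's build script:

// app/build.gradle.kts -- fragment; the version is only an example
dependencies {
    implementation("com.google.dagger:dagger:2.24")
    annotationProcessor("com.google.dagger:dagger-compiler:2.24")  // use kapt(...) in a Kotlin module
}
// Regenerating DaggerLpComponent.java without a full clean, assuming an "app" module
// and the debug build type:
//   ./gradlew :app:compileDebugJavaWithJavac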

Why are there no stubs for interfaces in Microsoft Fakes?

I'm about to use Microsoft Fakes in my unit tests. I read a tutorial where Microsoft Fakes creates a stub for an interface (implemented inside the solution), but in my solution stubs are available only for classes.
Can you tell me what I should do to get stubs for all the interfaces as well? Both the interfaces and the classes are defined as public.
Fakes generates stubs for both classes and interfaces by default. You may have bumped into one of the current limitations, which is causing Fakes to skip your interface. To troubleshoot,
open the .Fakes file and set the Verbosity attribute of the Fakes element to "Verbose"
open TOOLS -> Options -> Projects and Solutions -> Build and Run and change MSBuild output verbosity to "Detailed"
build the project that contains the .Fakes file
open the Output window and search for the GenerateFakes task; review its output for information that explains why a particular interface was not stubbed.
In the upcoming Quarterly Update 1 of Visual Studio 2012, this information will be reported as warnings in the Error List window, regardless of the logging settings, which should make troubleshooting much easier.
You may also not have drilled down to the proper namespace. The Fakes are generated in the same namespace as the interfaces in your assembly under test. So, for example, if you're testing MyApp.Validators.IRequestValidator, in your unit test you'll have to use new MyApp.Validators.Fakes.StubIRequestValidator() as opposed to new MyApp.Fakes.StubIRequestValidator().

How to do post-build modifications in an Eclipse builder

I'm currently working on an Eclipse plug-in to provide iPOJO manipulation support.
The principle of iPOJO is to modify the .class files generated by the Java compiler to inject some methods and to add/update an entry in the MANIFEST.MF file.
Currently, my plug-in provides a project nature and adds a Builder, appended at the end of the project's builder list, that calls the iPOJO manipulator.
I use it on PDE projects.
The complete process works, but I have a problem:
When my builder has finished its job (and the building process), the whole building process restarts, erasing the output folder and calling my builder again.
If I don't add a safety trick, it makes the building process loop over and over.
As I work on IResource, an IResourceDeltaEvent must be sent at the end of the building process, so I think the best way to avoid that kind of problem is to hide the fact that the resource has changed.
To be clear, I'm looking for a way to modify the class files after a PDE build, without inducing a new build, and without disabling the workspace auto-build property.
Thanks for answers.
I am a little unclear as to what you are describing.
You mention that you want this to work for PDE builds, but PDE builds happen largely outside of the workspace using ant scripts. They do not use IResource, Builder, or IResourceDeltaEvent.
I am guessing that you don't really mean PDE builds, but rather the building of plugin projects inside of the workspace.
In general, Eclipse (JDT in particular) expects that it has complete control over the output folders. However, there is an option in Preferences -> Java -> Building -> Output Folder called "Rebuild class files generated by others". Ensure that this is disabled. Eclipse should not try to rebuild class files that you touch. If your builder only touches class files then it will not trigger other builds after it changes the class files. The only thing is that you need to be careful not to compile things twice (and I think this is the problem that you are describing).
Alternatively, it may be easier for you to implement a CompilationParticipant (and the org.eclipse.jdt.core.compilationParticipant extension point). This will allow you to know exactly when JDT calls a compilation and exactly what it compiles.
Additionally, you will be notified of reconcile operations (i.e., changes in working copies that have not been saved). This may be useful for you if you want to manipulate files as you type.
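A minimal sketch of the CompilationParticipant route (written in Kotlin to match the rest of this page, though Eclipse plug-ins are more commonly written in Java; the nature ID and the manipulator call are placeholders): register the class below under the org.eclipse.jdt.core.compilationParticipant extension point in plugin.xml.

import org.eclipse.core.resources.IFolder
import org.eclipse.jdt.core.IJavaProject
import org.eclipse.jdt.core.compiler.CompilationParticipant

class IPojoCompilationParticipant : CompilationParticipant() {

    // Only participate in builds of projects that carry the iPOJO nature (placeholder ID).
    override fun isActive(project: IJavaProject): Boolean =
        project.project.hasNature("org.example.ipojo.nature")

    // Called by JDT once it has finished compiling the project; touching the .class
    // files here happens as part of the build, so it does not schedule another build.
    override fun buildFinished(project: IJavaProject) {
        val output: IFolder =
            project.project.workspace.root.getFolder(project.outputLocation)
        manipulateClasses(output)  // placeholder for the actual iPOJO manipulator call
    }

    private fun manipulateClasses(output: IFolder) {
        // Inject the iPOJO methods into each .class file under the output folder,
        // then refresh the folder so the workspace sees the modified files.
    }
}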