How does IntelliJ decide if you're running a test?

IntelliJ separates a project's resource files into "Resource Folders" and "Test Resource Folders". My understanding is that if you run unit tests, it uses the resources listed under "Test Resource Folders", and otherwise it uses "Resource Folders". The problem is that when running the same code, it seems to arbitrarily switch from one to the other every once in a while. The only workaround I've found is to move each directory from "Resource Folders" to "Test Resource Folders" (or vice versa) when it happens, but the project is large with lots of resources, so IntelliJ ends up taking 10-20 minutes on the "Copying resources..." step.
Why is this happening and how can I avoid this problem?

Test resource roots hold resources that are only referenced from tests, while resource roots hold resources that can be referenced from both production and test code. The compilation process copies the resources and test resources of each module into separate output folders.
When you run a test run configuration (JUnit or TestNG), the classpath will include both production and test output folders; which of them takes precedence depends on the details of the test runner. When you run any other run configuration type, the classpath will only include the production output folders.
There is no logic in IntelliJ to use one resource root type instead of the other one. Either they're both on the classpath, or only one is.
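If you want to see which copy actually wins at runtime, a quick sketch (Groovy, with "config.properties" standing in for one of your own resource files) is to ask the classloader where it resolves the resource from, both in a test and in production code:

// Prints which output folder (production or test) the resource is loaded from;
// "config.properties" is just a placeholder name for one of your resources.
def url = Thread.currentThread().contextClassLoader.getResource("config.properties")
println "config.properties resolved from: ${url}"

If the printed path flips between the production and test output folders across runs, the resource exists under both roots and the runner's classpath ordering, not IntelliJ, is deciding which copy you get.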

Related

optional artifacts download task in bamboo?

Is it possible to configure a deployment project with optional 'Artifact Download' task?
The artifact comes from another plan which has 2 stages producing 2 artifacts. If only 1 stage is executed, it will have 1 shared artifact. I want my deployment project to run even if there is only 1 artifact.
But Bamboo fails the whole execution with the error: "Unable to download artifact Shared artifact: ..." while trying to locate the 2nd artifact.
How can I tell Bamboo to ignore the missing artifact and continue the execution?
The only way I've found to handle this is, instead of naming each artifact, to put all of the artifacts into a "directory" as part of the build process, say "artifacts/", and define the artifacts as "artifacts/**". Then on the deployment side, be clever about manipulating the artifacts for deployments.
Note, in my case, I have an issue with multiple branches for the same build (think "future release", "current release", "legacy release") that may have different artifacts on them (either new features in "future release", or aged-off artifacts from "legacy release"). I had to wrap the actual deployments in a script that was "smart enough" to just iterate through the artifacts that actually existed for a given deployment environment.
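As a rough sketch of that "smart enough" script (Groovy here; the "artifacts/" directory name and the .war filter are placeholders for whatever your build actually publishes):

// Deploy only the artifacts that were actually downloaded into "artifacts/",
// silently skipping anything this particular branch did not produce.
def artifactDir = new File('artifacts')
if (artifactDir.exists()) {
    artifactDir.eachFileMatch(~/.*\.war/) { f ->
        println "Deploying ${f.name}"
        // call the real deployment step for this file here
    }
}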
I'm not completely happy with Bamboo's treatment of special cases for artifact management at all. In fact, I've found that judicious use of the "script" task in Bamboo (and managing those scripts in some external git repo) seems to be the only real way to manage larger Bamboo installations in general.

Why does gradle idea plugin configure classpath to use unprocessed/source test resources instead of filtered/generated resources

I'm trying to run my test cases in idea 12 after configuring my multi-project build with the gradle idea plugin. My project is configured to use the gradle defaults for test resources (src/test/resources). I apply some filtering to these resources:
// filter test resources
processTestResources {
    doLast {
        ant.replace(dir: sourceSets.test.output.resourcesDir, replacefilterfile: testProps, includes: "**/*.xml,**/*.properties")
    }
}
Additionally processTestResources depends on a custom task that copies some "generated" files to sourceSets.test.output.resourcesDir.
Many of my tests are failing because they rely on the filtered test resources. When I look at the classpath that is being used for the test cases, it is pointing to rootProject/out/test/targetProject. When I look in there I see my test classes and my unfiltered resources (and none of the resources that I explicitly copied over before processing the test resources). It appears they have simply been copied from src/test/resources. Is this expected behavior?
Also, why are the test classes and resources put into rootProject/out/rootProject as opposed to the Gradle defaults rootProject/targetProject/build/...?
When you build in IDEA, Gradle isn't involved. It's IDEA that's copying the resources and compiling the code. You can add Gradle-generated resources to the IDEA build, but you have to run the corresponding Gradle tasks yourself, or configure IDEA run configurations to invoke the tasks. (I can't seem to find a post-compilation hook in IDEA.)
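One way to act on that, as a sketch (the out/test/... path below is an assumption based on the classpath you observed, so adjust it to your own IDEA output location), is a small Gradle task that runs the filtering and then pushes the processed resources into the folder IDEA's test runner actually reads:

// Hypothetical helper: run Gradle's resource processing, then copy the
// filtered/generated test resources into IDEA's test output folder.
task syncFilteredTestResourcesToIdea(type: Copy, dependsOn: processTestResources) {
    from sourceSets.test.output.resourcesDir
    into "$rootDir/out/test/${project.name}"
}

Running gradle syncFilteredTestResourcesToIdea after IDEA's own make step (or invoking it from the run configuration, as described above) keeps the filtered copies in place until IDEA recompiles and overwrites them again.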

Where does Bamboo look for artifacts?

I have created a Bamboo build plan that is supposed to generate artifacts. And it does - I see the generated files on the server. Unfortunately, Bamboo does not copy the files to the desired location, so it does not treat them as artifacts that I can download from the Bamboo server.
I am working with Bamboo 4.3.3. The documentation tells me to describe the artifacts location relative to the "working directory", so I am trying to copy everything to ${bamboo.build.working.directory}.
I have tried different location / copy pattern settings, but to no avail.
Where should I put them? I have a scripting environment, and there is no Maven or Ant to help me.
I finally understood what was going on with my artifacts and test results that Bamboo did not see:
Test results: there is a known bug affecting all versions up to 4.4.5, which manifests itself in scripting environments. Fortunately, it has a workaround: JUnit Parser: Test results are not found
Bamboo uses the system property bamboo.fs.timestamp.precision to define the filesystem timestamp resolution. By default it is set to 100 (ms); set it to a higher value in order to make the file date check less strict. Bamboo does the check in the following way:
private boolean isFileRecentEnough(final File file)
{
    return file.lastModified() >= (taskStartDate.getTime() - SystemProperty.FS_TIMESTAMP_RESOLUTION_MS.getTypedValue());
}
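For example (the exact startup file and the value are environment-specific, so treat this as an illustration only), since it is a JVM system property it can be raised by passing an argument such as -Dbamboo.fs.timestamp.precision=3000 in the Bamboo startup options.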
Other items to check
Double-check the task configuration and confirm that it is configured to look for the test results file in the current working directory of the job (e.g. C:\Users\ssetayeshfar\bamboo-home-445\xml-data\build-dir\PROJECT-PLAN-JOB) and NOT a sub-directory (e.g. C:\Users\ssetayeshfar\bamboo-home-445\xml-data\build-dir\PROJECT-PLAN-JOB\test-results).
In case the test report is not produced by the build itself (it was produced earlier), use a 'touch' command right before the JUnit task.
Artifacts: at the beginning of my work with Bamboo I did not understand that the working directory is defined PER JOB and tried to copy something produced in a previous job as an artifact of the current one.

Unable to use Performance Plugin with Hudson

I have been trying to integrate JMeter test with Maven and Hudson. I came across this beautiful post on Maven JMeter and got it set up easily.
And then I got to know that Hudson has a Performance plugin, using which JMeter results can be displayed directly on the Hudson dashboard.
The problem I now face is in using the Performance plugin of Hudson.
The Performance plugin says to specify the path of the JMeter report files, where the default path is "**/*.jtl" if none is specified.
My Maven JMeter tests produce this file under "target\jmeter-reports\GoogleAdvanceSearch.xml"
Notice that an "xml" file is generated here, but it is the same as a ".jtl" file.
I provided the entire path to this xml file under the "Report files" section of Hudson, but whenever I initiate a build I encounter the following error after the performance test execution:
Performance: Recording JMeter reports 'C:\SelNG\jmeter2\target\jmeter-reports\GoogleAdvanceSearch-100905.xml'
Performance: no JMeter files matching 'C:\SelNG\jmeter2\target\jmeter-reports \GoogleAdvanceSearch-100905.xml' have been found. Has the report generated?. Setting Build to FAILURE
But I know that file is physically available at the location I specified.
To double-check whether the problem was caused by having an "xml" file rather than a ".jtl" file as expected by the Performance plugin of Hudson, I created a build file which executes the JMeter tests and generates a ".jtl" file: "C:\SelNG\jmeter\GoogleSearch.jtl". I specified this path under the "Performance Report > Report files" section of the Hudson plugin, but again encountered the same error on build execution:
Performance: Recording JMeter reports 'C:\SelNG\jmeter\GoogleSearch.jtl'
Performance: no JMeter files matching 'C:\SelNG\jmeter\GoogleSearch.jtl' have been found. Has the report generated?. Setting Build to FAILURE
Finished: FAILURE
Though I know that the file 'C:\SelNG\jmeter\GoogleSearch.jtl' is physically available at the specified location.
What is it that I am missing here? Has anyone come across such a problem while using the Hudson Performance plugin?
I hope this will help you a little.
In Hudson, select the job and choose Configure.
Let's focus on paths.
This is my path to tests
c:\Hudson\data\jobs\template-peformance-test2-mvn\workspace\trunk\src\test\jmeter\
This is my path to report files
c:\Hudson\data\jobs\template-peformance-test2-mvn\workspace\trunk\target\jmeter-reports\
For Hudson, the root starts in the job workspace.
My Build conf:
Root POM: trunk\pom.xml
Goals and options: clean verify
Then in Post Build Actions
selected Publish Performance test result report
Performance report JMeter
Report files */target/jmeter-reports/*.xml

Maven: local development deploy vs bundling for distribution

Bear with me, I'm migrating from Ant to Maven2: I think I've hit one of those little things that was easy in Ant, but not so in Maven...
How do I handle the difference between a local deployment vs. creating an archive/bundle for distribution to another machine?
Let's assume my project's output is an EAR plus some additional config files. A developer that is actively working on the project will need to deploy and re-deploy frequently to his local app-server (say JBoss), while an Integration Engineer that is building for QA/production will need only to create the final archive assembly (tar/gz).
In Ant we had two targets for this: "dev-deploy" and "bundle". Both do a complete build, but differ in the final step: "dev-deploy" copies the EAR and config files to the respective local folders, while "bundle" just puts the EAR & config files in a tar.gz assembly.
How do you do this in Maven?
I've seen that the assembly plugin can create either archives (tar, gz, etc.) or exploded directories (from the same assembly descriptor). I can invoke either assembly:assembly or assembly:directory, but for the latter, how do I copy the final output to the local JBoss deployment folders? From a related post it seems that ad-hoc copying of files is not really what Maven is about, so an antrun copy is probably the most appropriate?
Finally, since the type of assembly may differ depending on who invokes it, it doesn't seem wise to bind assembly to the build lifecycle, not so? But this means that a developer will always need to invoke 'mvn package' followed by 'mvn assembly:directory' to rebuild and test a change. Conversely, an Integration Engineer will always need to run 'mvn package' followed by 'mvn assembly:assembly' to create the distributable archive. I was hoping for a one-command solution for each, or should I just script it?
In Ant we had two targets for this: "dev-deploy" and "bundle". Both do a complete build, but differ in the final step: "dev-deploy" copies the EAR and config files to the respective local folders, while "bundle" just puts the EAR & config files in a tar.gz assembly.
Not sure what you mean by "respective local folders" for "dev-deploy", but this sounds like what mvn package is doing, and "bundle" indeed sounds like a Maven assembly.
I've seen that the assembly plugin can create either archives (tar, gz, etc.) or exploded directories (from the same assembly descriptor). I can invoke either assembly:assembly or assembly:directory, but for the latter, how do I copy the final output to the local JBoss deployment folders? From a related post it seems that ad-hoc copying of files is not really what Maven is about, so an antrun copy is probably the most appropriate?
I guess that we are talking about the Integration Engineer's tasks here. As you didn't explain exactly what the "bundle" contains, what the target application server is (my understanding is that you are using JBoss for QA/production too but, again, this is a guess), or whether this bundle has to be deployed automatically, it's hard to cover all solutions and/or alternatives to antrun. But indeed, to copy/move/unzip/whatever the assembly, the maven antrun plugin is a candidate.
Finally, since the type of assembly may differ depending on who invokes it, it doesn't seem wise to bind assembly to the build lifecycle, not so? But this means that a developer will always need to invoke 'mvn package' followed by 'mvn assembly:directory' to rebuild and test a change. Conversely, an Integration Engineer will always need to run 'mvn package' followed by 'mvn assembly:assembly' to create the distributable archive. I was hoping for a one-command solution for each, or should I just script it?
My understanding was that the Integration Engineer was building the bundle. Why would a developer need the bundle? This is confusing... Anyway, I don't really need the details to think of an answer. You could declare the maven assembly plugin in specific build profiles, one for development and one for integration, and bind either the single or the directory-single mojo to the project's build lifecycle in each profile. This would allow each of them to use only one command and avoid any scripting (really, don't go this way).
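For instance (the profile ids and the choice of the package phase here are only illustrative, not something prescribed by Maven), with the directory-single mojo bound to the package phase in a "dev" profile and the single mojo bound in a "dist" profile, the developer would run "mvn package -Pdev" and the Integration Engineer "mvn package -Pdist", each as a single command.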