How can the code coverage data from Flutter tests be displayed?

I'm working on a Flutter app using Android Studio as my IDE. I'm attempting to write tests and check the code coverage but I can't work out how to view the data in the IDE or any other application.
By running flutter test --coverage, a coverage report seems to be generated into a file /coverage/lcov.info. That file looks something like this:
SF:lib\data\Customer.g.dart
DA:9,2
DA:10,2
DA:11,2
DA:12,2
DA:13,2
DA:20,0
DA:21,0
DA:22,0
DA:23,0
DA:24,0
...
Looking at the file, it seems to contain a list of my project files with line-by-line coverage data. Is there a way to view this information in Android Studio?

You can also install lcov, convert the lcov.info file to HTML pages, and then view the result in the browser with sorting options.
1. Installation
1.1. Installing in Ubuntu
sudo apt-get update -qq -y
sudo apt-get install lcov -y
1.2. Installing in Mac
brew install lcov
2. Run tests, generate coverage files and convert to HTML
flutter test --coverage
genhtml coverage/lcov.info -o coverage/html
3. Open coverage report in browser
open coverage/html/index.html
Note: this way you can add it to CircleCI artifacts and Coveralls as well.
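As a side note, generated files (like the *.g.dart files shown in the question) can be filtered out of the trace file before rendering it. A minimal sketch, assuming lcov is installed as above and the default coverage/ layout; the glob pattern and the cleaned file name are just examples:
flutter test --coverage
# Remove generated code from the trace file (adjust the pattern to your project)
lcov --remove coverage/lcov.info 'lib/**/*.g.dart' -o coverage/lcov_cleaned.info
genhtml coverage/lcov_cleaned.info -o coverage/html
open coverage/html/index.html   # use xdg-open instead of open on Linux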

This is what you want to run to see test coverage in your browser on macOS:
flutter test --coverage
genhtml coverage/lcov.info -o coverage/html
open coverage/html/index.html

You can view the code coverage generated by flutter with the Atom editor.
You just need to install the Dart and lcov-info packages.
Then you load your project folder and press Ctrl+Alt+C; coverage will be displayed with a summary of the whole project's coverage as well as per-line highlighting.
There doesn't appear to be any plugin for Android Studio which does this yet.

Update 9/18/2021:
See the newer answer for how it's done within the editor.
Update 5/9/2020:
Turns out you can just run flutter test --coverage, then in the same terminal session run bash <(curl -s https://codecov.io/bash) -t token, where token is the repository token you get from CodeCov. That command will automatically find and upload the coverage data, which will then be visible on your CodeCov dashboard. So you don't need Bitrise.
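Put together, the non-Bitrise route is just two commands. A sketch, assuming CODECOV_TOKEN holds the repository token from your CodeCov dashboard and the report is at the default coverage/lcov.info path:
flutter test --coverage
# -t passes the repository token; -f points the uploader at the report explicitly (it can also auto-detect it)
bash <(curl -s https://codecov.io/bash) -t "$CODECOV_TOKEN" -f coverage/lcov.info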
Original:
I've been using Bitrise for continuous integration on my Flutter project, and there is an easy way to send your reports to CodeCov and visualize them there. This requires some knowledge of how to set up and use Bitrise, but a lot of it is automatic, so don't worry; also, if you're a small team you should be fine with the free tier. Here are the key points for getting CodeCov to work.
Make sure you add the --coverage flag to the Flutter Test workflow.
Add the token from CodeCov as a secret key; you will need to sign up for CodeCov and link your repository to receive a token.
Add the CodeCov workflow and select the CODECOV_TOKEN key.
After that, you should be able to fire off a build and if successful you should see your dashboard update at CodeCov.

The Flutter Enhancement Suite does exactly that. It is an Android Studio/IntelliJ plugin which generates coverage reports.
It shows the coverage per file and also highlights covered lines (red/green bars next to the line numbers).
Install the plugin from the plugin options (Preferences > Plugins > Marketplace tab > search for Flutter Enhancement Suite).
Create a new Run Configuration for testing with coverage
(Run > Edit Configurations > click the plus button to add a new configuration > Choose Flutter Test in the dropdown)
Name your configuration (e.g. "All tests"), set the scope and the file or directory containing your tests.
Run your tests with coverage from the top menu.

I just developed a simple Dart package (test_cov_console), so you can run it directly from the Android Studio terminal. The tool reads the lcov.info file generated by flutter test --coverage. See this link for the source code.
You can install the package globally, so it won't change your current project:
flutter pub global activate test_cov_console
And run it:
flutter pub global run test_cov_console
Here is a sample of the output:
flutter pub run test_cov_console
---------------------------------------------|---------|---------|---------|-------------------|
File |% Branch | % Funcs | % Lines | Uncovered Line #s |
---------------------------------------------|---------|---------|---------|-------------------|
lib/src/ | | | | |
print_cov.dart | 100.00 | 100.00 | 88.37 |...,149,205,206,207|
print_cov_constants.dart | 0.00 | 0.00 | 0.00 | no unit testing|
lib/ | | | | |
test_cov_console.dart | 0.00 | 0.00 | 0.00 | no unit testing|
---------------------------------------------|---------|---------|---------|-------------------|
All files with unit testing | 100.00 | 100.00 | 88.37 | |
---------------------------------------------|---------|---------|---------|-------------------|
The output can be saved to a CSV file:
flutter pub run test_cov_console -c --output=coverage/test_coverage.csv
Sample CSV output file:
File,% Branch,% Funcs,% Lines,Uncovered Line #s
lib/,,,,
test_cov_console.dart,0.00,0.00,0.00,no unit testing
lib/src/,,,,
parser.dart,100.00,100.00,97.22,"97"
parser_constants.dart,100.00,100.00,100.00,""
print_cov.dart,100.00,100.00,82.91,"29,49,51,52,171,174,177,180,183,184,185,186,187,188,279,324,325,387,388,389,390,391,392,393,394,395,398"
print_cov_constants.dart,0.00,0.00,0.00,no unit testing
All files with unit testing,100.00,100.00,86.07,""
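If you just want a quick console summary after every test run, the two commands from above can simply be chained (assuming the package is activated globally as described):
flutter test --coverage && flutter pub global run test_cov_console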

With the release of Flutter 2.5, you can now view test coverage within IntelliJ and Android Studio.
See this post
In addition, the latest IJ/AS plugin for Flutter allows you to see the
coverage information for both unit test and integration test runs. You
can access this from the toolbar button next to the “Debug” button:
Android Studio and IntelliJ: (screenshot of the coverage toolbar button omitted)

Coverage reporting is now available on Android Studio

You can use SonarQube with an additional plugin for Flutter; see this link: SonarQube plugin for Flutter / Dart.
I have tried it with the free version of SonarQube on Docker, and if you have configured it correctly, you just need to run the following commands in the Android Studio terminal:
# Download dependencies
flutter pub get
# Run tests with user feedback (in case some tests are failing)
flutter test
# Run tests without user feedback, regenerating tests.output and coverage/lcov.info
flutter test --machine --coverage > tests.output
# Run the analysis and publish to the SonarQube server
sonar-scanner
Here is a sample of the report; you can drill down to the individual line level.
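If you have not put the server connection into a sonar-project.properties file yet, the scanner properties can also be passed on the command line. A sketch, assuming a local SonarQube instance on the default port and a user token generated in the SonarQube UI (both values are placeholders):
sonar-scanner -Dsonar.host.url=http://localhost:9000 -Dsonar.login=YOUR_SONARQUBE_TOKEN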

So, the actual answer is no, you cannot view a coverage report within Android Studio (or in IntelliJ IDEA) at this time.
Unlike JavaScript/TypeScript, Java, and probably Python, the IntelliJ IDE (and by extension, Android Studio) does not have integrated support for showing test coverage of Flutter code in the editor. This is a shame, because being able to see your untested branches and lines highlighted in the source code of your editor is a beautiful thing. It's not clear why a plugin for this does not exist yet, since it is well supported for other languages and a standard lcov.info file is generated.
There is a bundled code coverage tool window in IntelliJ that is supposed to allow you to browse the lcov.info file in a tree/table drill-down format, but it doesn't seem to work with the coverage report generated by Flutter (flutter test --coverage). I thought the problem might be the relative paths in lcov.info combined with my multi-module app structure, but even after manually editing the file paths in lcov.info I had no luck getting the stats to show.

FOR WINDOWS
Required:
Chocolatey
Perl
LCOV
1. INSTALL CHOCOLATEY
Open PowerShell (with Admin)
Set-ExecutionPolicy Bypass -Scope Process -Force; iex ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))
2. INSTALL PERL
choco install strawberryperl
Add the Perl bin directory to the PATH environment variable
3. INSTALL LCOV
choco install lcov
LCOV OPERATION
Go to the Android Studio terminal and run flutter test --coverage.
Next, open your project root directory in PowerShell (in my case, e.g. C:\Users\CIPL\Documents\Project\Flutter\myProject)
and run this command: perl C:\ProgramData\chocolatey\lib\lcov\tools\bin\genhtml coverage/lcov.info -o coverage/html
That's it: open the html folder and click the HTML file to view the report in the browser.
NOTE: when I tried the third step, I got the error "ERROR: cannot create directory 'coverage/html'",
so I manually created the html folder and tried the third step again.
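Working around that error, the whole sequence from the project root looks roughly like this (the genhtml path assumes the default Chocolatey install location shown above):
flutter test --coverage
mkdir coverage\html
perl C:\ProgramData\chocolatey\lib\lcov\tools\bin\genhtml coverage/lcov.info -o coverage/html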
Found this Windows solution at https://blog.tech-andgar.me/flutter-test-coverage

Related

Sigasi in Eclipse

I have just installed the Sigasi Studio plugin in Eclipse (version: Eclipse IDE 2018-12). When I try to launch it, to make a new VHDL file, I get the following:
The selected wizard could not be started. org/eclipse/lsp4j/Range
(occurred in com.sigasi.hdt.vhdl.ui.VhdlExecutableExtensionFactory)
org/eclipse/lsp4j/Range
How could I solve this, please?
Thank you in advance.
Thanks to the Sigasi support, I was able to solve the problem. They wrote me:
The lsp4j plugin version is too recent for the xtext version that ships
with Sigasi Studio 4.2. This issue has been resolved in the preview
channel of release 4.3. Therefore - if you wish to use the plugin
version of Sigasi Studio - I recommend to install the 4.3 preview
following the steps explained on
http://insights.sigasi.com/tech/preview.html.
That's all. Now I would like to configure Sigasi with GHDL (as a compiler, when I run the project) and GTKWAVE (as a waveform viewer). How can I do that?
Thanks in advance.
SIGASI + GHDL + GTKWAVE (all in one)
It is a very powerful combo that you can set up. Note: I use macOS 10.13.6.
Step 1
Make sure you have both GHDL and GTKWAVE installed by typing:
$ which gtkwave
/usr/local/bin/gtkwave
$ which ghdl
/usr/local/bin/ghdl
Step 2
Open Sigasi, make a new project, and create an additional compile.sh file with:
#!/bin/sh
PROJECT_NAME="PWM_Generator"
PROJECT_NAME_TB="PWM_Generator_tb"
WORKING_DIR="/Users/imeksbank/Dropbox/UMHDL"
/usr/local/bin/ghdl -a --workdir=$WORKING_DIR/work.ghdl $WORKING_DIR/$PROJECT_NAME/$PROJECT_NAME.vhd;
/usr/local/bin/ghdl -a --workdir=$WORKING_DIR/work.ghdl $WORKING_DIR/$PROJECT_NAME/$PROJECT_NAME_TB.vhd;
/usr/local/bin/ghdl -e --workdir=$WORKING_DIR/work.ghdl $PROJECT_NAME_TB;
/usr/local/bin/ghdl -r --workdir=$WORKING_DIR/work.ghdl $PROJECT_NAME_TB --vcd=$WORKING_DIR/$PROJECT_NAME/simulation.vcd;
Now, be aware that for each project you define your own values for
PROJECT_NAME
PROJECT_NAME_TB
WORKING_DIR
I always use Dropbox for this approach because then I can access it from Windows as well.
And of course, it is possible to create custom variables in Sigasi -> External Tool Configurator -> Program -> compile_sh -> Environment and pass them in to make compile.sh independent. That part you will have to work out yourself =)
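Before wiring the script into Sigasi, it helps to make it executable and run it once by hand to confirm the GHDL analyse/elaborate/run steps succeed (a quick check, assuming you are in the directory containing compile.sh):
chmod +x compile.sh
./compile.sh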
Step 3
Set up your External Tools Configurations so that the shell script is executed by Sigasi Studio and creates the .vcd file for gtkwave:
Click on currently created Project (in my case it is the PWM_Generator).
After that click on Run -> External Tools -> External Tools Configurations ....
Then go to the left sidebar and under Program create your own anchor like compile_sh.
Finally you have your route :
Program
--compile_sh
Now extend this anchor with the shell script you created:
Main->Location gets ${workspace_loc:/PWM_Generator/compile.sh}
Main->Working Directory gets ${workspace_loc:/PWM_Generator}
Click Apply and Run, and that's it! After this you can write VHDL/Verilog and compile via Run -> External Tools -> compile_sh, which creates the .vcd file. The gtkwave file appears in your project; just double-click it and it opens. =)

Download Android APK file from Fabric Beta

Is it possible to download the Android APK file from Fabric Beta? We have multiple releases uploaded.
Mike from Fabric here. We currently don't provide a way to download the APK; they are only distributed via the Beta by Crashlytics apps.
Late answer, but someone may need this. You can download it in a hacky way from a device on which the app was installed by Beta (or in any other way):
Connect the device to your computer and run the following commands; make sure that you have configured adb correctly:
adb shell pm list packages | grep xyz # get the package name of the app
adb shell pm path app.xyz.stg # get the path of the app
adb pull /data/app/app.xyz.stg/base.apk . # pull the app to PWD
The pulled file is named base.apk; change it to xyz. This can be used for the same device.
Mesut's answer is correct. Just to make it clearer:
adb shell pm path ${package_name}
adb pull /data/app/${package_name_2}/base.apk
In the second command, the value ${package_name_2}/base.apk is from the first command. Sometimes it's not exactly the package name.
In my case, it's ${package_name}-1/base.apk
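Putting the two answers together, a minimal sketch of a pull script; the package name is a placeholder, and because the path comes straight from pm path, the ${package_name}-1 case is handled automatically:
#!/bin/sh
PKG=app.xyz.stg                                                            # your package name
APK_PATH=$(adb shell pm path "$PKG" | sed 's/^package://' | tr -d '\r')    # strip the "package:" prefix and any trailing CR
adb pull "$APK_PATH" "$PKG.apk"                                            # pull and rename in one go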
If you just want to download a specific build, say "1.0(143)" then you can choose that build in the beta app and download it.
If you need to upload multiple APKs from the same build (say, an APK for each deployment environment such as prevalidation, validation, and production), then you need to set up your Gradle build to define productFlavors for each deployment environment like this:
android {
    ...
    flavorDimensions "deploymentEnvironment"
    productFlavors {
        prevalidation {
            dimension "deploymentEnvironment"
        }
        validation {
            dimension "deploymentEnvironment"
        }
        production {
            dimension "deploymentEnvironment"
        }
    }
    ...
}
Then you publish multiple APKs from the same build (one for each target deployment environment) to the same Fabric project using the following Gradle tasks as illustrative examples; the actual tasks depend on the variants defined for your project:
./gradlew -s assemblePrevalidationRelease assembleValidationRelease
./gradlew -s crashlyticsUploadDistributionPrevalidationRelease crashlyticsUploadDistributionValidationRelease
The Fabric console Beta page does show both APKs, and you can choose to download and install one or the other. The only missing part is that both variants are listed as exactly the same (since they have the same versionName and versionCode). This could easily be solved if the Fabric console showed the actual APK name in addition to the version/build info. I would love for the awesome Fabric team to address this small feature request sometime soon.
Until then, a workaround I use is to identify the build based on its order in the Fabric Beta console (risky, but it works) and put the target deployment info in the release notes of each APK for a given build.

Using Sonarqube with Xcode

I am following this article to integrate SonarQube with Xcode and analyse Objective-C code. Though the setup is functional and I get no errors/warnings after running the shell script, no violations are shown in the dashboard. All I see are basic metrics like the number of lines of code, number of files, etc.
Has anyone tried this who can guide me further?
In addition to the article you specified above, I have a few additions. You can follow the steps below.
Prerequisites:
Sonar
Sonar-runner
SonarQube Objective-C plugin (Licensed)
XCTool
OCLint (violations) and gcovr (code coverage)
MySql and JDK
Installation Steps:
Download and install the MySQL dmg, and then start the MySQL server from System Preferences or via the command line (if the machine is restarted, it has to be the command line).
To start - sudo /usr/local/mysql/support-files/mysql.server start
To restart - sudo /usr/local/mysql/support-files/mysql.server restart
To stop - sudo /usr/local/mysql/support-files/mysql.server stop
Download and install latest JDK version.
Go to the terminal and enter the following commands to install the prerequisites. (Homebrew is the package management system for macOS; to install Homebrew, enter the command:
ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)")
Sonar - brew install sonar
Sonar-runner - brew install sonar-runner
XCTool - brew install xctool
OCLint - brew install oclint or
brew install https://gist.githubusercontent.com/TonyAnhTran/e1522b93853c5a456b74/raw/157549c7a77261e906fb88bc5606afd8bd727a73/oclint.rb for version 0.8.1 (updated)
gcovr - brew install gcovr
Configuration:
- Set the Sonar environment path:
export SONAR_HOME=/usr/local/Cellar/sonar-runner/2.4/libexec
export SONAR=$SONAR_HOME/bin
export PATH=$SONAR:$PATH
Finally, the command echo $SONAR_HOME should return the path /usr/local/Cellar/sonar-runner/2.4/libexec
- Set up MySql DB:
export PATH=${PATH}:/usr/local/mysql/bin
mysql -u root;
CREATE DATABASE sonar_firstdb;
CREATE USER 'sonar'@'localhost' IDENTIFIED BY 'sonar';
GRANT ALL PRIVILEGES ON sonar_firstdb.* TO 'sonar'@'localhost';
FLUSH PRIVILEGES;
exit
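Before moving on, you can quickly confirm the database and user were created correctly (credentials as created above); sonar_firstdb should show up in the output:
mysql -u sonar -psonar -e "SHOW DATABASES;"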
- Set Sonar configuration settings:
vi /usr/local/Cellar/sonar/5.1.2/libexec/conf/sonar.properties
You can comment out most options except credentials and mysql and make sure that you enter the correct database name.
eg:
sonar.jdbc.url=jdbc:mysql://localhost:3306/**sonar_firstdb**?useUnicode=true&characterEncoding=utf8&rewriteBatchedStatements=true
vi /usr/local/Cellar/sonar-runner/2.4/libexec/conf/sonar-runner.properties
You can comment out most options except credentials and mysql and make sure that you enter the correct database name.
eg:
sonar.jdbc.url=jdbc:mysql://localhost:3306/sonar_firstdb?useUnicode=true&characterEncoding=utf8
Start sonar using the command -
sonar start
The command will launch sonar so navigate to http://localhost:9000 in your browser of choice. Login (admin/admin) and have a look around.
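A quick sanity check that the server really is up, assuming the default port 9000:
curl -I http://localhost:9000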
Now you have to install the Objective-C or Swift plugin.
Move to Settings -> System -> Update Center -> Available Plugins (install the required plugin).
You have to restart Sonar to complete the installation once the plugin is added, and add the license key once the plugin is installed.
Through the terminal, go to the root directory of the project you want Sonar to inspect, and create a project-specific properties file with the following command:
vi sonar-project.properties
Add the following project-specific properties and edit the bolded sections (marked here with **) as per your project.
// Required configuration
sonar.projectKey=**com.payoda.wordsudoku**
sonar.projectName=**DragDrop**
sonar.projectVersion=**1.0**
sonar.language=**objc**
// Project description
sonar.projectDescription=**Sample description**
// Path to source directories
sonar.sources=**~/path to your project**
// Path to test directories (comment if no test)
//sonar.tests=testSrcDir
// Xcode project configuration (.xcodeproj or .xcworkspace)
// -> If you have a project: configure only sonar.objectivec.project
// -> If you have a workspace: configure sonar.objectivec.workspace and sonar.objectivec.project
// and use the later to specify which project(s) to include in the analysis (comma separated list)
sonar.objectivec.project=**DragDrop.xcodeproj**
// sonar.objectivec.workspace=myApplication.xcworkspace
// Scheme to build your application
sonar.objectivec.appScheme=**DragDrop**
// Scheme to build and run your tests (comment following line of you don't have any tests)
//sonar.objectivec.testScheme=myApplicationTests
/////////////////////////
// Optional configuration
// Encoding of the source code
sonar.sourceEncoding=**UTF-8**
// JUnit report generated by run-sonar.sh is stored in sonar-reports/TEST-report.xml
// Change it only if you generate the file on your own
// Change it only if you generate the file on your own
// The XML files have to be prefixed by TEST- otherwise they are not processed
// sonar.junit.reportsPath=sonar-reports/
// Cobertura report generated by run-sonar.sh is stored in sonar-reports/coverage.xml
// Change it only if you generate the file on your own
// sonar.objectivec.coverage.reportPattern=sonar-reports/coverage*.xml
// OCLint report generated by run-sonar.sh is stored in sonar-reports/oclint.xml
// Change it only if you generate the file on your own
// sonar.objectivec.oclint.report=sonar-reports/oclint.xml
// Paths to exclude from coverage report (tests, 3rd party libraries etc.)
// sonar.objectivec.excludedPathsFromCoverage=pattern1,pattern2
sonar.objectivec.excludedPathsFromCoverage=.*Tests.*
// Project SCM settings
// sonar.scm.enabled=true
// sonar.scm.url=scm:git:https://...
Save the file and you can reuse the same for other projects.
In the project root directory, run the command sonar-runner
You should try it with an older version of SonarQube (< 4.0 usually works).

Xvfb, Jenkins, Selenium tests - Capture Screenshots of all pages

I'm trying to find some clues about the following issues and haven't been able to find good help online.
I'm running Xvfb (X virtual framebuffer) and Firefox on a Linux machine in headless mode. The main Xvfb service is up and running and the DISPLAY variable is set.
/usr/bin/Xvfb :99 -ac -screen 0 1600x1200x16
I have some automated selenium based tests which I'm running using Gradle (gradle test). They run successfully and in Jenkins I'm able to get this working using Xvfb plugin. JUnit post publish report/result info and Gradle's reports/test/index.html file is showing successful test run.
I just run the following to run tests in Gradle:
gradle test -DsomePropConfigFileForEnv=SomeSourceConfigFilewithPathvalue
My questions:
1. How can I get screenshots of all the pages that this automated test run renders (i.e. the login page, the application main page after login, the user clicking here and there on the main page (opening various tabs, links, tables, buttons, etc.), and finally the logout page)?
I'm able to get a screenshot from the Xvfb_screen<N> file, which is created under the -fbdir folder (which we specify while running Xvfb via a Jenkins job), but the screenshot is either a black page if the test runs successfully (this may be due to the second point mentioned below) or a valid single-page screenshot if an error is encountered during the test run.
I'm trying to capture all the pages that the automated Selenium tests render (the config file I pass to Gradle as a -D parameter contains the URLs, user name, browser, version, etc.). PS: it's not just for some random URL that I'm trying to get an image screenshot using the Xvfb DISPLAY virtual framebuffer.
During the test, I see there's a valid virtual framebuffer file, with a valid size.
For example, while the Jenkins job is in progress running the Gradle test task and the Xvfb plugin has started a new Xvfb instance, I see:
/production/JSlaves/kobaloki2_1/xvfb-2015-02-04_01-16-37-6170319257811815857.fbdir/Xvfb_screen0
but as soon as the test is complete (or errors out), this file is deleted from this xxxx.fbdir folder and there's no file at all.
Why is this file getting deleted?
If it remained there, then I could use the xwd/xwud commands and other tools (imagemagick convert, etc.) to create an image file as a post-build action, or even within the build section after the "Invoke Gradle" step.
The following command will create a .png image file of the Firefox screen (only a single-page screenshot), assuming Xvfb is running on DISPLAY=:107:
xwd -root -display :107 | convert xwd:- /tmp/capture2.png
and the following Xvfb process is still running, containing a valid Xvfb_screen**** file, which was created by the Jenkins job where the Xvfb plugin is configured with offset base 100 and 7 is the node/build number, thus making :107 the DISPLAY number:
u10002 30717 19950 1 01:16 ? 00:00:00 Xvfb :107 -screen 0 1024x768x8 -fbdir /production/JSlaves/kobaloki2_1/xvfb-2015-02-04_01-16-37-6170319257811815857.fbdir
I'm not running Xvfb / imagemagick etc. just to get an image of a URL (e.g. www.google.com); I'm trying to get screenshots of everything a test renders behind the Xvfb in-memory virtual framebuffer/file during the test run.
Are there any other tools (simple enough to install without messing up the Linux server) which can achieve the same (capturing screenshots of all the pages that a test renders behind Xvfb/Firefox on a Linux server in a headless way)?
I also tried the Selenium Grid server, but FF is acting up there (for some reason), so I'm trying to run these tests using Jenkins, Gradle, and the Xvfb plugin on a Linux server (headless mode) with the Firefox browser, and I'm planning to have N executors to run multiple runs of these tests and finally capture the results per run.
I'm archiving the artifacts (if any) and using the Image Gallery plugin as well, but I don't have images of all the rendered pages that Selenium ran through behind Xvfb/Firefox.
Any inputs are greatly appreciated.
Thanks.
If you're running with Selenium then you could use driver.getScreenshotAs()
http://docs.seleniumhq.org/docs/04_webdriver_advanced.jsp
Set this at the end of a step or method where you want a screenshot and output it to disc.
OK, this is what I did. This approach doesn't require any change to the source code of the project.
Installed imagemagick, i.e. yum install imagemagick on RHEL.
Created a script on the target server, and it works now. All I do is this: in the Jenkins job, once the Xvfb instance has been started (using the Xvfb plugin in Jenkins), just a second before running the Selenium GUI tests via Gradle (or any build tool), I call the following script and pass the parameters (the DISPLAY variable value is available to the Jenkins job since we use the Xvfb plugin). At the end of the tests the script exits automatically (as the xwd command doesn't get any more input, so it exits gracefully), and finally I publish the images and the .mp4 (video) file on Jenkins (as a sidebar link showing test results/video) and archive the artifacts (the .png image files using the Image Gallery plugin, and the .mp4 file).
NOTE: This requires that your machine has imagemagick, xwd and ffmpeg installed. If the options passed to any of the commands differ on your OS, tweak them accordingly. The framerate value in the ffmpeg command can be a fraction, i.e. 1/5 or 0.5, or 15, or anything you want (try it and see what you get).
It's up to you whether you want to archive this large amount of data or not. You can do it if you have enough space and your Jenkins job has a sensible old-build cleanup/retention policy.
#!/bin/bash
##
## This script will capture Screenshot (every 0.1 seconds) of an automated GUI (for ex: Selenium tests) tests running behind a HEADLESS Xvfb display instance.
## Then, it'll create a mp4 format movie using the captured screenshots.
##
## Machine where you run this script, should have: Xvfb service running, a session started by Xvfb plugin via Jenkins, xwd,ffmpeg OS commands and imagemagick (utilities).
## - For ex, try this on RHEL to install imagemagick: yum install imagemagick
##
## Variables
ws=$1; ## Workspace folder location
d=$2; d=$(echo $d | tr -d ':'); ## Display number associated with the Xvfb instance started by Xvfb plugin from a Jenkins job
wscapdir=${ws}/capturebrowserss; ## Workspace capture browser's screen shot folder
if [[ -n $3 ]]; then wscapdir=${wscapdir}/$3; fi ## If a user pass a 3rd parameter i.e. a Jenkins BUILD_NUMBER, then create a child directory with that name to archive that specific run.
i=1;
rm -fr ${wscapdir} 2>/dev/null || ( echo - Oh Oh.. Cant remove ${wscapdir} folder; echo -e "-- Still exiting gracefully! \n"; exit 0);
mkdir -p ${wscapdir}
while : ; do
xwd -root -display :$d 2>/dev/null | convert xwd:- ${wscapdir}/capFile_${d}_dispId`printf "%08d" $i`.png 2>/dev/null;
if [[ ${PIPESTATUS[0]} -gt 0 || ${PIPESTATUS[1]} -gt 0 ]]; then echo -e "\n-- Something bad happened during xwd or imagemagick convert command, manually check it.\n"; break; fi   ## break (not exit) so the ffmpeg step below still runs
((i++)); sleep 0.1;
done
ffmpeg -r 5 -i ${wscapdir}/capFile_${d}_dispId%08d.png ${wscapdir}/out_byRateOf5.mp4 2>/dev/null || echo -e "\n-- Some error occurred (may be too many files opened), exiting gracefully!\n";
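Since the capture loop keeps running until the display goes away, start the script in the background from the Jenkins job right before the Gradle test step. A hedged usage sketch; the script name is an example, and WORKSPACE, DISPLAY and BUILD_NUMBER are the values the Jenkins job already provides:
./capture_xvfb_screens.sh "$WORKSPACE" "$DISPLAY" "$BUILD_NUMBER" &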

Generation of Selenium reports using Hudson (now known as Jenkins) from JUnit XML format file

For test automation of a web project we use Hudson, PHPUnit, and Selenium. The results of the build are stored in the JUnit XML format.
When I try to include generation of reports using the Hudson Publish JUnit test result report option, the build finishes with Failed status.
Below is my Hudson configuration to run the tests.
sudo -u apache phpunit --log-junit /var/lib/hudson/jobs/Work-stars-Tests/builds/${BUILD_ID}/seleniumReports/seleniumTests.xml + path to test php files
Generation of reports is enabled through the Hudson configuration option «Publish JUnit test result report», where I specify the path to the folder with PHP tests.
The user we use to run Hudson has permission to read/write files in the folder with the reports. As for the path, we've tried specifying both full and relative paths.
The error "No test report files were found. Configuration error?" is displayed in the console after the build.
How do we resolve this issue?
Found my own solution to this problem :)
In the Hudson project configuration settings, I modified the execute shell command to:
#!/bin/sh -x
phpunit --log-junit ${WORKSPACE}/zf/tests/_tmp/reports/seleniumTests.xml ${WORKSPACE}/zf/tests/selenium/
sed -i '/<testsuite name=".*\/"/D;/^ <\/testsuite>$/D' ${WORKSPACE}/zf/tests/_tmp/reports/seleniumTests.xml
and set the following path to the JUnit reports:
**/zf/tests/_tmp/reports/*.xml
Problem is solved. Yahoo!