sbt not recognizing tests

I am having difficulty getting sbt (version 0.12.1) to recognize any tests in src/test/scala.
I have tried both JUnit-style and ScalaTest-style tests, but to no avail.
To keep things simple:
I have moved my tests to the root package (src/test/scala).
I have included both org.scalatest and junit-interface in my build.sbt:
libraryDependencies ++= List(
  "org.scalatest" %% "scalatest" % "1.8" % "test",
  "com.novocode" % "junit-interface" % "0.8" % "test->default"
)
I have made the tests as simple as possible:
ScalaTest example:
import org.scalatest.FunSuite
import scala.collection.mutable.Stack

class ExampleSuite extends FunSuite {
  test("math still works") {
    assert(1 + 1 == 2)
  }
}
JUnit test example:
import org.junit.Assert._
import org.junit.Test

class SimpleTest {
  @Test
  def testPass() {
    assertEquals(1 + 1, 2)
  }
}
my test structure is:
src/test/scala
├── FunSuiteExample.scala
└── SimpleTest.scala
What am I missing?

Based on the instructions at
https://github.com/szeiger/junit-interface
I modified build.sbt:
removed "junit" % "junit" % "4.10" % "test"
added "com.novocode" % "junit-interface" % "0.11" % "test"
and put this test in src/test/scala:
import org.junit._
import org.junit.Assert._

class SimpleTeset {
  @Test
  def testTrue() {
    assertEquals(1 + 1, 2)
  }
}
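For reference, the test-framework dependencies in build.sbt should then look roughly like this; this is a sketch combining the original ScalaTest line with the junit-interface change just described, not a verified fix:

```scala
// build.sbt (sketch): test dependencies after removing the explicit junit
// dependency and bumping junit-interface to 0.11
libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest" % "1.8" % "test",
  "com.novocode" % "junit-interface" % "0.11" % "test"
)
```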


How can I make a PyTorch extension with cmake

This tutorial demonstrates how to make a C++/CUDA-based Python extension for PyTorch. But for ... reasons ... my use-case is more complicated than this and doesn't fit neatly within the Python setuptools framework described by the tutorial.
Is there a way to use cmake to compile a Python library that extends PyTorch?
Yes.
The trick is to use cmake to combine together all the C++ and CUDA files we'll need and to use PyBind11 to build the interface we want; fortunately, PyBind11 is included with PyTorch.
The code below is collected and kept up-to-date in this Github repo.
Our project consists of several files:
CMakeLists.txt
cmake_minimum_required(VERSION 3.9)
project(pytorch_cmake_example LANGUAGES CXX CUDA)

find_package(Python REQUIRED COMPONENTS Development)
find_package(Torch REQUIRED)

# Modify if you need a different default value
if(NOT DEFINED CMAKE_CUDA_ARCHITECTURES)
  set(CMAKE_CUDA_ARCHITECTURES 61)
endif()

# List all your code files here
add_library(pytorch_cmake_example SHARED
  main.cu
)
target_compile_features(pytorch_cmake_example PRIVATE cxx_std_11)
target_link_libraries(pytorch_cmake_example PRIVATE ${TORCH_LIBRARIES} Python::Python)

# Use if the default GCC version gives issues.
# Similar syntax is used if we need better compilation flags.
target_compile_options(pytorch_cmake_example PRIVATE $<$<COMPILE_LANGUAGE:CUDA>:-ccbin g++-9>)

# Use a variant of this if you're on an earlier cmake than 3.18
# target_compile_options(pytorch_cmake_example PRIVATE $<$<COMPILE_LANGUAGE:CUDA>:-gencode arch=compute_61,code=sm_61>)
main.cu
#include <c10/cuda/CUDAException.h>
#include <torch/extension.h>
#include <torch/library.h>

using namespace at;

int64_t integer_round(int64_t num, int64_t denom){
  return (num + denom - 1) / denom;
}

template <class T>
__global__ void add_one_kernel(const T *const input, T *const output, const int64_t N){
  // Grid-strided loop
  for(int i = blockDim.x * blockIdx.x + threadIdx.x; i < N; i += blockDim.x * gridDim.x){
    output[i] = input[i] + 1;
  }
}

/// Adds one to each element of a tensor
Tensor add_one(const Tensor &input){
  auto output = torch::zeros_like(input);
  // Common values:
  //   AT_DISPATCH_INDEX_TYPES
  //   AT_DISPATCH_FLOATING_TYPES
  //   AT_DISPATCH_INTEGRAL_TYPES
  AT_DISPATCH_ALL_TYPES(
    input.scalar_type(), "add_one_cuda", [&](){
      const auto block_size = 128;
      const auto num_blocks = std::min(65535L, integer_round(input.numel(), block_size));
      add_one_kernel<<<num_blocks, block_size>>>(
        input.data_ptr<scalar_t>(),
        output.data_ptr<scalar_t>(),
        input.numel()
      );
      // Always test your kernel launches
      C10_CUDA_KERNEL_LAUNCH_CHECK();
    }
  );
  return output;
}

/// Note that we can have multiple implementations spread across multiple files, though there should only be one `def`
TORCH_LIBRARY(pytorch_cmake_example, m) {
  m.def("add_one(Tensor input) -> Tensor");
  m.impl("add_one", c10::DispatchKey::CUDA, TORCH_FN(add_one));
  // c10::DispatchKey::CPU is also an option
}
Compilation
Compile it all from an empty build directory inside the project (note the trailing ..), then build with ninja:
cmake -DCMAKE_BUILD_TYPE=RelWithDebInfo -DCMAKE_PREFIX_PATH=`python -c 'import torch;print(torch.utils.cmake_prefix_path)'` -GNinja ..
test.py
You can then run the following test script.
import torch
torch.ops.load_library("build/libpytorch_cmake_example.so")
shape = (3,3,3)
a = torch.randint(0, 10, shape, dtype=torch.float).cuda()
a_plus_one = torch.ops.pytorch_cmake_example.add_one(a)

rust - trouble importing module

I have encountered a strange error that prevents one of my files from importing a module.
This is my src directory:
src/
├── functions.rs
├── main.rs
└── unit_test.rs
Here is unit_test.rs:
mod functions;

#[cfg(test)] // only compiles on test
// make module f_test
mod f_test {
    // mark function as test
    #[test]
    #[should_panic]
    fn test_basic() {
        assert_eq!();
        panic!("oh no");
    }
    #[test]
    fn test_add() {
        assert_eq!(functions::add(1, 2), 1 + 2);
    }
    #[test]
    #[should_panic]
    fn test_bad_add() {
        assert_eq!(functions::add(1, 2), 1 + 2);
    }
}
When I try to run cargo test I get this:
[vinessa@komputilo unitTest]$ cargo test
   Compiling unitTest v0.1.0 (/home/vinessa/Dev/Rust/unitTest)
error[E0583]: file not found for module `functions`
 --> src/unit_test.rs:1:5
  |
1 | mod functions;
  |     ^^^^^^^^^
  |
  = help: name the file either unit_test/functions.rs or unit_test/functions/mod.rs inside the directory "src"
The strange thing is: if I add "mod functions;" to main.rs, cargo won't complain about that file, only about unit_test.rs.
I am lost, please help.
I think you want to declare mod functions in main.rs:
mod unit_test;
mod functions; // declare this module here

fn main() {
    println!("Hello, world!");
}
I think you want this in unit_test.rs:
#[cfg(test)] // only compiles on test
// make module f_test
mod f_test {
    use crate::functions; // use the module here
    // mark function as test
    #[test]
    #[should_panic]
    fn test_basic() {
        assert_eq!(1, 1);
        panic!("oh no");
    }
    #[test]
    fn test_add() {
        assert_eq!(functions::add(1, 2), 1 + 2);
    }
    #[test]
    #[should_panic]
    fn test_bad_add() {
        assert_eq!(functions::add(1, 2), 1 + 2);
    }
}
Additionally, your #[should_panic] on test_bad_add is incorrect as it stands: its assertion succeeds, so the function never panics and the test will be reported as failed.
Obligatory link: https://doc.rust-lang.org/book/ch07-02-defining-modules-to-control-scope-and-privacy.html

Why does compiling Holden Karau's spark-testing-base generate an error?

I am trying to use Holden Karau's spark-testing-base using sbt and get 4 errors. It looks like sbt is generating invalid references to 4 jars.
The errors are:
[error] 4 not found
[error] https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common/2.6.0/hadoop-common-2.6.0.test-jar
[error] https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-hdfs/2.6.0/hadoop-hdfs-2.6.0.test-jar
[error] https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.6.0/hadoop-mapreduce-client-jobclient-2.6.0.test-jar
[error] https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-server-tests/2.6.0/hadoop-yarn-server-tests-2.6.0.test-jar
My build.sbt contains:
libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest" % "2.2.1",
  "org.scalacheck" %% "scalacheck" % "1.12.4",
  "com.holdenkarau" %% "spark-testing-base" % "1.6.1_0.3.3"
)

parallelExecution in Test := false

lazy val root = (project in file(".")).
  settings(
    name := "core",
    version := "1.0",
    scalaVersion := "2.11.8"
  )
And my test class is from Holden's wiki example:
import org.scalatest._
import com.holdenkarau.SharedSparkContext

class SampleTest extends FunSuite with SharedSparkContext {
  test("test initializing spark context") {
    val list = List(1, 2, 3, 4)
    val rdd = sc.parallelize(list)
    assert(rdd.count === list.length)
  }
}
When I execute sbt test I get a bunch of maven updates followed by the errors listed above.
It appears that sbt is generating invalid jar names - for example, https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common/2.6.0/hadoop-common-2.6.0.test-jar should be hadoop-common-2.6.0-tests.jar (i.e. the test-jar classifier should be appended as -tests.jar, not .test-jar).
Can someone guide me as to why this is occurring, and how to fix it?
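One common way to make sbt/Ivy resolve Hadoop's test artifacts is to declare them explicitly with the tests classifier, which maps to artifact names like hadoop-common-2.6.0-tests.jar. The sketch below is a hedged workaround, not a confirmed fix for spark-testing-base; the version and the exact set of artifacts needed are assumptions:

```scala
// build.sbt fragment (sketch): request the Hadoop test-jars explicitly
// via the "tests" classifier so Ivy builds a valid artifact URL.
libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-common" % "2.6.0" % "test" classifier "tests",
  "org.apache.hadoop" % "hadoop-hdfs" % "2.6.0" % "test" classifier "tests",
  "org.apache.hadoop" % "hadoop-yarn-server-tests" % "2.6.0" % "test" classifier "tests",
  "org.apache.hadoop" % "hadoop-mapreduce-client-jobclient" % "2.6.0" % "test" classifier "tests"
)
```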

sbt and play test with @RunWith

I'm dealing with a problem working on a Play project with sbt.
The problem is:
I have tests; some of them use PowerMockito, some of them don't.
For the tests using PowerMockito, I've added @RunWith(PowerMockRunner.class) at the beginning of the class.
For the tests not using it, I've added nothing.
Here is an example of a test that uses PowerMockito:
@RunWith(PowerMockRunner.class)
@PrepareForTest({SomeMockedClass.class})
public class SomeClassTest {
    @Test
    public void aTest() {
        PowerMockito
            .stub(PowerMockito.method(SomeMockedClass.class, "aMethod"))
            .toReturn(something);
        assertThat(someCall).isEqualTo(something);
    }
}
Here is an example of a test that doesn't use PowerMockito:
public class AnotherClassTest {
    @Test
    public void anotherTest() {
        assertThat(something).isEqualTo(something);
    }
}
And here is my build.sbt dep:
libraryDependencies ++= Seq(
  cache,
  ...
  // Testing
  "org.easytesting" % "fest-assert" % "1.4" % "test",
  "junit" % "junit" % "4.12" % "test",
  "org.powermock" % "powermock-mockito-release-full" % "1.6.2" % "test",
  "org.powermock" % "powermock-module-junit4-rule-agent" % "1.6.2" % "test",
  "org.easymock" % "easymock" % "3.3.1" % "test",
  "com.novocode" % "junit-interface" % "0.11" % "test"
)
Here is the problem:
When I run my tests from IntelliJ, it finds all the tests and everything is OK.
When I run my tests from activator (activator test), it does NOT find the tests annotated with @RunWith.
I've read some posts about this, e.g. about the sbt version
(even https://github.com/sbt/jacoco4sbt/issues/15).
But my sbt version is 0.13.8-M5, so that seems OK.
Play version is: 2.3.8
If you guys have any clue, that'd be great.
Thanks

How to enable play-querydsl plugin in Play 2.2?

I have a problem setting up the querydsl framework in Play 2.2.6 with Scala 2.10.3 and Java 1.7.
I have done the installation exactly as described in the documentation, but it doesn't work.
I am getting an error:
dany@dany1L:~/git/app$ playFramework-2.2.6
[info] Loading project definition from /home/dany/git/app/project
/home/dany/git/app/build.sbt:11: error: not found: value QueryDSLPlugin
val current = project.in(file(".")).configs(QueryDSLPlugin.QueryDSL)
                                            ^
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?
Here is my project/plugins.sbt:
// Comment to get more information during initialization
logLevel := Level.Warn
// The Typesafe repository
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
// Use the Play sbt plugin for Play projects
// changed to support play 2.2.4 addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.2.1")
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.2.6")
addSbtPlugin("com.code-troopers.play" % "play-querydsl" % "0.1.2")
And my build.sbt:
import com.typesafe.config._
import play.Project._
import sbt._
import Keys._
//javacOptions ++= Seq("-Xlint:unchecked")
playJavaSettings
playJavaSettings ++ QueryDSLPlugin.queryDSLSettings
val current = project.in(file(".")).configs(QueryDSLPlugin.QueryDSL)
val conf = ConfigFactory.parseFile(new File("conf/application.conf")).resolve()
name := conf.getString("app.name")
version := conf.getString("app.version")+"_("+conf.getString("app.releaseDate")+")"
libraryDependencies ++= Seq(
  javaJdbc,
  javaJpa,
  "org.hibernate" % "hibernate-entitymanager" % "3.6.9.Final",
  "mysql" % "mysql-connector-java" % "5.1.27",
  "org.mindrot" % "jbcrypt" % "0.3m",
  "org.jasypt" % "jasypt" % "1.9.2",
  "org.apache.poi" % "poi" % "3.10.1",
  "com.googlecode.genericdao" % "dao" % "1.2.0",
  "com.googlecode.genericdao" % "search-jpa-hibernate" % "1.2.0",
  "com.google.code.gson" % "gson" % "2.3.1",
  "com.googlecode.json-simple" % "json-simple" % "1.1.1",
  "javax.mail" % "javax.mail-api" % "1.5.3",
  "javax.activation" % "activation" % "1.1.1",
  "com.sun.mail" % "javax.mail" % "1.5.3",
  "com.querydsl" % "querydsl-jpa" % "4.0.2",
  "com.querydsl" % "querydsl-apt" % "4.0.2",
  cache
)
Please give me some help.
After adding:
import codetroopers._
at the top of build.sbt, I am getting an error:
[info] Loading project definition from /home/dany/git/app/project
error: bad symbolic reference. A signature in QueryDSLPlugin.class refers to type AutoPlugin
in package sbt which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling QueryDSLPlugin.class.
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?
Thanks to @Nathan
and his answer here.
Finally, after a few days of struggle, I've made it work.
Here are my configuration files:
plugins.sbt
// Comment to get more information during initialization
logLevel := Level.Warn
// The Typesafe repository
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
// Use the Play sbt plugin for Play projects
// changed to support play 2.2.4 addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.2.1")
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.2.6")
addSbtPlugin("com.code-troopers.play" % "play-querydsl" % "0.1.1")
and build.sbt
import com.typesafe.config._
import play.Project._
import sbt._
import Keys._
//javacOptions ++= Seq("-Xlint:unchecked")
playJavaSettings
val conf = ConfigFactory.parseFile(new File("conf/application.conf")).resolve()
name := conf.getString("app.name")
version := conf.getString("app.version")+"_("+conf.getString("app.releaseDate")+")"
libraryDependencies ++= Seq(
  javaJdbc,
  javaJpa,
  "org.hibernate" % "hibernate-entitymanager" % "3.6.9.Final",
  "mysql" % "mysql-connector-java" % "5.1.27",
  "org.mindrot" % "jbcrypt" % "0.3m",
  "org.jasypt" % "jasypt" % "1.9.2",
  "org.apache.poi" % "poi" % "3.10.1",
  "com.googlecode.genericdao" % "dao" % "1.2.0",
  "com.googlecode.genericdao" % "search-jpa-hibernate" % "1.2.0",
  "com.google.code.gson" % "gson" % "2.3.1",
  "com.googlecode.json-simple" % "json-simple" % "1.1.1",
  "javax.mail" % "javax.mail-api" % "1.5.3",
  "javax.activation" % "activation" % "1.1.1",
  "com.sun.mail" % "javax.mail" % "1.5.3",
  "com.querydsl" % "querydsl-jpa" % "4.0.2",
  "com.querydsl" % "querydsl-apt" % "4.0.2",
  cache
)
playJavaSettings ++ QueryDSLPlugin.queryDSLSettings
val current = project.in(file(".")).configs(QueryDSLPlugin.QueryDSL)
QueryDSLPlugin.queryDSLPackage := "models"
and build.properties
sbt.version=0.13.0