Play 2.3 subproject dependsOn

This is how I configure the subprojects in Play 2.3. However, it gives me sbt.ResolveException: unresolved dependency. What is wrong with my settings? It works in 2.2.
val model = Project(appName + "-model", file("models")).enablePlugins(play.PlayScala).settings(
  version := appVersion,
  libraryDependencies ++= modelDependencies
)

val main = Project(appName, file(".")).enablePlugins(play.PlayScala).enablePlugins(SbtWeb).settings(
  version := appVersion,
  libraryDependencies ++= appDependencies
).dependsOn(model % "test->test;compile->compile")

Try this:
lazy val model = Project(
  id = s"${appName}-model",
  base = file("models"))
  .enablePlugins(play.PlayScala)
  .settings(version := appVersion)
  .settings(scalaVersion := "2.11.1")
  .settings(libraryDependencies ++= modelDependencies)

lazy val main = Project(
  id = appName,
  base = file("webapp"))
  .enablePlugins(play.PlayScala)
  .enablePlugins(SbtWeb)
  .settings(name := "play-scala")
  .settings(version := appVersion)
  .settings(scalaVersion := "2.11.1")
  .settings(libraryDependencies ++= appDependencies)
  .dependsOn(model % "test->test;compile->compile")

override def rootProject = Some(main)
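For reference, the same idea as a plain build.sbt sketch (my rewording, not part of the original answer; it assumes Play 2.3 on sbt 0.13.5+, and that appVersion, modelDependencies and appDependencies are defined elsewhere in the build). Pinning scalaVersion on both projects is what keeps the cross-built model artifact resolvable from main:

// Sketch only; appVersion, modelDependencies and appDependencies are assumed to exist.
lazy val model = (project in file("models"))
  .enablePlugins(play.PlayScala)
  .settings(
    scalaVersion := "2.11.1",
    version := appVersion,
    libraryDependencies ++= modelDependencies
  )

lazy val main = (project in file("."))
  .enablePlugins(play.PlayScala, SbtWeb)
  .settings(
    scalaVersion := "2.11.1",
    version := appVersion,
    libraryDependencies ++= appDependencies
  )
  .dependsOn(model % "test->test;compile->compile")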

Is RawModule only for Top connections?

I'm writing an SPI-to-Wishbone component with Chisel3, and to test it on an FPGA in the real world I have to change the polarity of the reset (rstn).
To manage this I used RawModule for my top module, and withClockAndReset() to change the reset polarity:
class TopSpi2Wb extends RawModule {
  val clock = IO(Input(Clock()))
  val rstn = IO(Input(Bool()))
  // ...
  withClockAndReset(clock, !rstn) {
    val spi2Wb = Module(new Spi2Wb(dwidth, awidth))
    // module connections IO
  }
It seems to work, since I tried to instantiate a synchronous memory (SyncReadMem) on the Wishbone port with this kind of code:
class TopSpi2Wb extends RawModule {
  val clock = IO(Input(Clock()))
  val rstn = IO(Input(Bool()))
  // ...
  withClockAndReset(clock, !rstn) {
    val spi2Wb = Module(new Spi2Wb(dwidth, awidth))
    val wmem = SyncReadMem(1 << awidth, UInt(dwidth.W))
    val ackReg = RegInit(false.B)
    val datReg = RegInit(0.U(dwidth.W))
    ackReg := false.B
    datReg := 0.U(dwidth.W)
    when(spi2Wb.io.wbm.stb_o && spi2Wb.io.wbm.cyc_o) {
      when(spi2Wb.io.wbm.we_o) {
        wmem.write(spi2Wb.io.wbm.adr_o, spi2Wb.io.wbm.dat_o)
        datReg := DontCare
      }.otherwise {
        datReg := wmem.read(spi2Wb.io.wbm.adr_o,
          spi2Wb.io.wbm.stb_o & spi2Wb.io.wbm.cyc_o & !spi2Wb.io.wbm.we_o)
      }
      // ...
That compiles without error, but I can't manage to read/write the memory correctly (simulating with Icarus). The emitted Verilog seems to drop the memory instantiation.
Maybe it's discouraged to write Chisel code such as registers and memories directly in a RawModule, and one should just instantiate a single top module there instead?
Well, if I wrap a top module in the RawModule, it works much better:
// Testing Spi2Wb with a memory connection
// and reset inverted
class TopSpi2Wb extends RawModule {
  // Clock & Reset
  val clock = IO(Input(Clock()))
  val rstn = IO(Input(Bool()))
  // Simple blink
  val blink = IO(Output(Bool()))
  // SPI
  val mosi = IO(Input(Bool()))
  val miso = IO(Output(Bool()))
  val sclk = IO(Input(Bool()))
  val csn = IO(Input(Bool()))

  val dwidth = 8
  val awidth = 7

  withClockAndReset(clock, !rstn) {
    val spi2Wb = Module(new Spi2WbMem(dwidth, awidth))
    blink := spi2Wb.io.blink
    spi2Wb.io.mosi := mosi
    miso := spi2Wb.io.miso
    spi2Wb.io.sclk := sclk
    spi2Wb.io.csn := csn
  }
}
With the code above, all the connections and the register instantiation for the Wishbone memory are done in the standard module Spi2WbMem(), as sketched below.
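As a rough illustration of that structure (a sketch only, not the actual code: Spi2WbMem, its io fields and Spi2Wb are names taken from the question, the body is assumed), all synchronous logic sits in an ordinary Module, which picks up the clock and the inverted reset supplied by withClockAndReset in the RawModule wrapper:

import chisel3._

// Hypothetical sketch: registers and memories live in a regular Module,
// so they use the implicit clock/reset provided by the wrapper above.
class Spi2WbMem(dwidth: Int, awidth: Int) extends Module {
  val io = IO(new Bundle {
    val mosi  = Input(Bool())
    val miso  = Output(Bool())
    val sclk  = Input(Bool())
    val csn   = Input(Bool())
    val blink = Output(Bool())
  })
  val spi2Wb = Module(new Spi2Wb(dwidth, awidth))
  val wmem   = SyncReadMem(1 << awidth, UInt(dwidth.W))
  // ... SPI wiring and the Wishbone <-> memory glue go here,
  // exactly as in the first TopSpi2Wb attempt.
}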
As jkoenig asked, I rewrote the module to reproduce the bug and ... fixed it!
Sorry for the inconvenience.
I had some difficulty finding the bug because Icarus didn't dump the contents of the memory, so I thought it was not generated.
I think my initial module wrapping fixed the bug without my realizing it.

sbt-proguard You have to specify '-keep' options for the shrinking step

I am struggling to get the sbt-proguard plugin to work. I have a class library that I want to obfuscate, but I cannot seem to get the plugin to produce output without the above error. I have specified the keep options, or at least I think I have, but I have had no luck. I copied the keep options from the ProGuard website, which said they were meant for class libraries. In addition, I do not think the plugin is responding to the options that I have configured.
For example, I wanted more verbose output to see if it could give me a clue as to what I am doing wrong. However, whenever I look at the log files, they always show the default options. Below is my configuration. Can someone help me out with this one? I am completely lost. Thanks.
import sbt.Keys._
import com.typesafe.sbt.SbtProguard._
import ProguardKeys._
lazy val commonDependencies = Seq(
  Dependencies.Libraries.junit,
  Dependencies.Libraries.springBootLogging,
  Dependencies.Libraries.scalaMock,
  Dependencies.Libraries.joda,
  Dependencies.Libraries.scalaTestPlus,
  Dependencies.Libraries.scalaXml,
  Dependencies.Libraries.commonsCodec,
  Dependencies.Libraries.typeSafeConfig
)
val keepClasses =
"""
|-injars in.jar
|-outjars out.jar
|-libraryjars <java.home>/lib/rt.jar
|-printmapping out.map
|
|-keepparameternames
|-renamesourcefileattribute SourceFile
|-keepattributes Exceptions,InnerClasses,Signature,Deprecated,
| SourceFile,LineNumberTable,*Annotation*,EnclosingMethod
|
|-keep public class * {
| public protected *;
|}
|
|-keepclassmembernames class * {
| java.lang.Class class$(java.lang.String);
| java.lang.Class class$(java.lang.String, boolean);
|}
|
|-keepclasseswithmembernames,includedescriptorclasses class * {
| native <methods>;
|}
|
|-keepclassmembers,allowoptimization enum * {
| public static **[] values();
| public static ** valueOf(java.lang.String);
|}
|
|-keepclassmembers class * implements java.io.Serializable {
| static final long serialVersionUID;
| private static final java.io.ObjectStreamField[] serialPersistentFields;
| private void writeObject(java.io.ObjectOutputStream);
| private void readObject(java.io.ObjectInputStream);
| java.lang.Object writeReplace();
| java.lang.Object readResolve();
|}
""".stripMargin
proguardSettings
lazy val skedaddleCore = (project in file(".")).
  settings(BuildSettings.buildSettings: _*).
  settings(
    name := "core",
    resolvers := Resolvers.all,
    libraryDependencies ++= commonDependencies,
    merge in Proguard := true,
    proguardVersion in Proguard := "5.2.1",
    options in Proguard --= Seq("-dontnote", "-dontwarn", "-ignorewarnings"),
    options in Proguard ++= Seq("-verbose", "-dontshrink"),
    options in Proguard += keepClasses
  )
I finally got the plugin to do what I wanted with the following configuration. Note that the -injars/-outjars/-libraryjars lines are gone from the keep rules: the plugin's own inputs and filteredInputs keys take care of those instead.
import sbt.Keys._
import com.typesafe.sbt.SbtProguard._
lazy val commonDependencies = Seq(
  Dependencies.Libraries.junit,
  Dependencies.Libraries.springBootLogging,
  Dependencies.Libraries.scalaMock,
  Dependencies.Libraries.joda,
  Dependencies.Libraries.scalaTestPlus,
  Dependencies.Libraries.scalaXml,
  Dependencies.Libraries.commonsCodec,
  Dependencies.Libraries.typeSafeConfig
)
proguardSettings
ProguardKeys.proguardVersion in Proguard := "5.2.1"
ProguardKeys.options in Proguard ++= Seq("-dontnote", "-dontwarn", "-ignorewarnings")
ProguardKeys.inputs in Proguard <<= (dependencyClasspath in Compile) map { _.files }
ProguardKeys.filteredInputs in Proguard <++= (packageBin in Compile) map ProguardOptions.noFilter
val keepClasses =
"""
|-keepparameternames
|-keepattributes Exceptions,InnerClasses,Signature,Deprecated,
| SourceFile,LineNumberTable,*Annotation*,EnclosingMethod
|
|-keep,includedescriptorclasses interface com.** {
| <methods>;
|}
""".stripMargin
ProguardKeys.options in Proguard += keepClasses
lazy val skedaddleCore = (project in file(".")).
  settings(BuildSettings.buildSettings: _*).
  settings(
    name := "core",
    resolvers := Resolvers.all,
    libraryDependencies ++= commonDependencies
  )
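If you prefer to keep everything scoped to the project, the same ProGuard settings can also go inside the .settings(...) block instead of sitting as bare top-level expressions (a purely stylistic sketch under the same sbt 0.13 / sbt-proguard setup as above; the inputs and filteredInputs lines can stay where they are):

lazy val skedaddleCore = (project in file(".")).
  settings(BuildSettings.buildSettings: _*).
  settings(proguardSettings: _*).
  settings(
    name := "core",
    resolvers := Resolvers.all,
    libraryDependencies ++= commonDependencies,
    ProguardKeys.proguardVersion in Proguard := "5.2.1",
    ProguardKeys.options in Proguard ++= Seq("-dontnote", "-dontwarn", "-ignorewarnings", keepClasses)
  )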

Why does compiling Holden Karau's spark-testing-base generate an error?

I am trying to use Holden Karau's spark-testing-base with sbt and I get four errors. It looks like sbt is generating invalid references to four jars.
The errors are:
[error] 4 not found
[error] https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common/2.6.0/hadoop-common-2.6.0.test-jar
[error] https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-hdfs/2.6.0/hadoop-hdfs-2.6.0.test-jar
[error] https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.6.0/hadoop-mapreduce-client-jobclient-2.6.0.test-jar
[error] https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-server-tests/2.6.0/hadoop-yarn-server-tests-2.6.0.test-jar
My build.sbt contains:
libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest" % "2.2.1",
  "org.scalacheck" %% "scalacheck" % "1.12.4",
  "com.holdenkarau" %% "spark-testing-base" % "1.6.1_0.3.3"
)

parallelExecution in Test := false

lazy val root = (project in file(".")).
  settings(
    name := "core",
    version := "1.0",
    scalaVersion := "2.11.8"
  )
And my test class is from Holden's wiki example:
import org.scalatest._
import com.holdenkarau.spark.testing.SharedSparkContext

class SampleTest extends FunSuite with SharedSparkContext {
  test("test initializing spark context") {
    val list = List(1, 2, 3, 4)
    val rdd = sc.parallelize(list)
    assert(rdd.count === list.length)
  }
}
When I execute sbt test I get a bunch of maven updates followed by the errors listed above.
It appears that sbt is generating invalid jar names - for example https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common/2.6.0/hadoop-common-2.6.0.test-jar should be hadoop-common-2.6.0-test.jar (i.e. the separator after the 2.6.0 should be -, not .).
Can someone guide me as to why this is occurring, and how to fix it?
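For what it's worth (this is not from the original question, just a possible workaround): those four artifacts are published on Maven Central with a tests classifier (e.g. hadoop-common-2.6.0-tests.jar), so one thing worth trying is declaring them explicitly with that classifier so sbt fetches the correctly named files:

// Sketch of a possible workaround; versions match the ones in the error messages.
libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-common" % "2.6.0" % "test" classifier "tests",
  "org.apache.hadoop" % "hadoop-hdfs" % "2.6.0" % "test" classifier "tests",
  "org.apache.hadoop" % "hadoop-mapreduce-client-jobclient" % "2.6.0" % "test" classifier "tests",
  "org.apache.hadoop" % "hadoop-yarn-server-tests" % "2.6.0" % "test" classifier "tests"
)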

How to enable play-querydsl plugin in Play 2.2?

I have a problem setting up the QueryDSL framework in Play 2.2.6 with Scala 2.10.3 and Java 1.7.
I have done the installation exactly as described in the documentation, but it doesn't work.
I am getting an error:
dany@dany1L:~/git/app$ playFramework-2.2.6
[info] Loading project definition from /home/dany/git/app/project
/home/dany/git/app/build.sbt:11: error: not found: value QueryDSLPlugin
val current = project.in(file(".")).configs(QueryDSLPlugin.QueryDSL)
^
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?
Here is my project/plugins.sbt:
// Comment to get more information during initialization
logLevel := Level.Warn
// The Typesafe repository
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
// Use the Play sbt plugin for Play projects
// changed to support play 2.2.4 addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.2.1")
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.2.6")
addSbtPlugin("com.code-troopers.play" % "play-querydsl" % "0.1.2")
And my build.sbt:
import com.typesafe.config._
import play.Project._
import sbt._
import Keys._
//javacOptions ++= Seq("-Xlint:unchecked")
playJavaSettings
playJavaSettings ++ QueryDSLPlugin.queryDSLSettings
val current = project.in(file(".")).configs(QueryDSLPlugin.QueryDSL)
val conf = ConfigFactory.parseFile(new File("conf/application.conf")).resolve()
name := conf.getString("app.name")
version := conf.getString("app.version")+"_("+conf.getString("app.releaseDate")+")"
libraryDependencies ++= Seq(
  javaJdbc,
  javaJpa,
  "org.hibernate" % "hibernate-entitymanager" % "3.6.9.Final",
  "mysql" % "mysql-connector-java" % "5.1.27",
  "org.mindrot" % "jbcrypt" % "0.3m",
  "org.jasypt" % "jasypt" % "1.9.2",
  "org.apache.poi" % "poi" % "3.10.1",
  "com.googlecode.genericdao" % "dao" % "1.2.0",
  "com.googlecode.genericdao" % "search-jpa-hibernate" % "1.2.0",
  "com.google.code.gson" % "gson" % "2.3.1",
  "com.googlecode.json-simple" % "json-simple" % "1.1.1",
  "javax.mail" % "javax.mail-api" % "1.5.3",
  "javax.activation" % "activation" % "1.1.1",
  "com.sun.mail" % "javax.mail" % "1.5.3",
  "com.querydsl" % "querydsl-jpa" % "4.0.2",
  "com.querydsl" % "querydsl-apt" % "4.0.2",
  cache
)
Please give me some help.
After adding:
import codetroopers._
at the top of build.sbt, I am getting an error:
[info] Loading project definition from /home/dany/git/app/project
error: bad symbolic reference. A signature in QueryDSLPlugin.class refers to type AutoPlugin
in package sbt which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling QueryDSLPlugin.class.
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?
Thanks to @Nathan and his answer here.
Finally, after a few days of struggle, I've made it work.
Here are my configuration files:
plugins.sbt
// Comment to get more information during initialization
logLevel := Level.Warn
// The Typesafe repository
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
// Use the Play sbt plugin for Play projects
// changed to support play 2.2.4 addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.2.1")
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.2.6")
addSbtPlugin("com.code-troopers.play" % "play-querydsl" % "0.1.1")
and build.sbt
import com.typesafe.config._
import play.Project._
import sbt._
import Keys._
//javacOptions ++= Seq("-Xlint:unchecked")
playJavaSettings
val conf = ConfigFactory.parseFile(new File("conf/application.conf")).resolve()
name := conf.getString("app.name")
version := conf.getString("app.version")+"_("+conf.getString("app.releaseDate")+")"
libraryDependencies ++= Seq(
  javaJdbc,
  javaJpa,
  "org.hibernate" % "hibernate-entitymanager" % "3.6.9.Final",
  "mysql" % "mysql-connector-java" % "5.1.27",
  "org.mindrot" % "jbcrypt" % "0.3m",
  "org.jasypt" % "jasypt" % "1.9.2",
  "org.apache.poi" % "poi" % "3.10.1",
  "com.googlecode.genericdao" % "dao" % "1.2.0",
  "com.googlecode.genericdao" % "search-jpa-hibernate" % "1.2.0",
  "com.google.code.gson" % "gson" % "2.3.1",
  "com.googlecode.json-simple" % "json-simple" % "1.1.1",
  "javax.mail" % "javax.mail-api" % "1.5.3",
  "javax.activation" % "activation" % "1.1.1",
  "com.sun.mail" % "javax.mail" % "1.5.3",
  "com.querydsl" % "querydsl-jpa" % "4.0.2",
  "com.querydsl" % "querydsl-apt" % "4.0.2",
  cache
)
playJavaSettings ++ QueryDSLPlugin.queryDSLSettings
val current = project.in(file(".")).configs(QueryDSLPlugin.QueryDSL)
QueryDSLPlugin.queryDSLPackage := "models"
and build.properties
sbt.version=0.13.0
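A note on why the downgrade helps (my own explanation, not from the original post): the "bad symbolic reference ... AutoPlugin" error means play-querydsl 0.1.2 was compiled against a newer sbt that provides the AutoPlugin API, while this build runs sbt 0.13.0. Using 0.1.1 sidesteps that; the alternative would be to raise the sbt version in project/build.properties, roughly:

sbt.version=0.13.5

(assuming the rest of the build, including the Play 2.2 sbt plugin, still works under that sbt release, which I have not verified).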

sbt not recognizing test

I am having difficulties getting sbt (version 0.12.1) to recognize any tests in src/test/scala.
I have tried both JUnit-style tests and ScalaTest-style tests, but to no avail.
To make things simple:
I have moved my tests to the root package (src/test/scala).
I have included both org.scalatest and junit-interface in my build.sbt:
libraryDependencies ++= List(
  "org.scalatest" %% "scalatest" % "1.8" % "test",
  "com.novocode" % "junit-interface" % "0.8" % "test->default"
)
I have made the tests as simple as possible:
ScalaTest example:
import org.scalatest.FunSuite
import scala.collection.mutable.Stack

class ExampleSuite extends FunSuite {
  test("math still works") {
    assert(1+1 == 2)
  }
}
JUnit test example:
import org.junit.Assert._
import org.junit.Test

class SimpleTest {
  @Test
  def testPass() {
    assertEquals(1+1, 2)
  }
}
my test structure is:
src/test/scala
├── FunSuiteExample.scala
└── SimpleTest.scala
What am I missing?
Based on the instructions at:
https://github.com/szeiger/junit-interface
I modified build.sbt:
removed "junit" % "junit" % "4.10" % "test"
added "com.novocode" % "junit-interface" % "0.11" % "test"
and put the test in src/test/scala:
import org.junit._
import org.junit.Assert._

class SimpleTest {
  @Test
  def testTrue() {
    assertEquals(1+1, 2)
  }
}
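For completeness, the dependency block that matches those steps would look roughly like this (a sketch; the versions are the ones mentioned above, and junit-interface pulls in the JUnit runtime transitively, so no separate "junit" % "junit" entry is needed):

libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest" % "1.8" % "test",
  "com.novocode" % "junit-interface" % "0.11" % "test"
)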