Why can I not pickle my case classes? What should I do to solve this manually next time? - jvm

Edit 2: Observations and questions
I am pretty sure, along with the commenter Justin below, that the problem is due to an errant build.sbt configuration. However, this is the first time I have seen an errant build.sbt configuration that works for literally everything else except the picklers. Maybe that is because they use macros, which I as a rule avoid.
Why would it matter whether Flow.merge or Flow.map is used if the problem is with the sbt configuration?
Suspicious build.sbt extract
lazy val server = project
.dependsOn(sharedJvm, client)
Suspicious stack trace
So this is the top of the stack: it goes from a method I cannot find to the linking environment to the string encoding utils. Ok.
server java.lang.RuntimeException: stub
Huh? stub?
server at scala.sys.package$.error(package.scala:27)
server at scala.scalajs.runtime.package$.linkingInfo(package.scala:143)
server at scala.scalajs.runtime.package$.environmentInfo(package.scala:137)
HUH?
server at scala.scalajs.js.Dynamic$.global(Dynamic.scala:78)
???
server at boopickle.StringCodec$.encodeUTF8(StringCodec.scala:56)
Edit 1: My big and beautiful build.sbt might be the problem
What you cannot see is that in my project folder I have organized:
JvmDependencies.scala, which has the regular JVM dependencies
SjsDependencies.scala, which has Def.settingKeys of libraryDependencies on JsModuleIDs
WebJarDependencies.scala, which has the JavaScript and CSS assets
build.sbt
lazy val shared = (crossProject.crossType(CrossType.Pure) in file("shared"))
  .configure(_.enablePlugins(ScalaJSPlugin))
  .settings(SjsDependencies.pickling.toSettingsDefinition(): _*)
  .settings(SjsDependencies.tagsAndDom.toSettingsDefinition(): _*)
  .settings(SjsDependencies.css.toSettingsDefinition(): _*)

lazy val sharedJvm = shared.jvm
lazy val sharedJs = shared.js
lazy val cmdlne = project
  .dependsOn(sharedJvm)
  .settings(
    libraryDependencies ++= (
      JvmDependencies.commandLine ++
      JvmDependencies.logging ++
      JvmDependencies.akka ++
      JvmDependencies.serialization
    )
  )
lazy val client = project
  .enablePlugins(ScalaJSPlugin, SbtWeb, SbtSass)
  .dependsOn(sharedJs)
  .settings(
    (SjsDependencies.shapeless ++ SjsDependencies.audiovideo ++ SjsDependencies.databind ++
      SjsDependencies.functional ++ SjsDependencies.lensing ++ SjsDependencies.logging ++
      SjsDependencies.reactive).toSettingsDefinition(),
    jsDependencies ++= WebjarDependencies.js,
    libraryDependencies ++= WebjarDependencies.notJs,
    persistLauncher in Compile := true
  )
lazy val server = project
  .dependsOn(sharedJvm, client)
  .enablePlugins(SbtNativePackager)
  .settings(
    copyWebJarResources := {
      streams.value.log("Copying webjar resources")
      val `Web Modules target directory` = (resourceManaged in Compile).value / "assets"
      val `Web Modules source directory` = (WebKeys.assets in Assets in client).value / "lib"
      final class UsefulFileFilter(acceptable: String*) extends FileFilter {
        // TODO ADJUST TO EXCLUDE JS MAP FILES
        import scala.collection.JavaConversions._
        def accept(file: File) =
          (file.isDirectory && FileUtils.listFiles(file, acceptable.toArray, true).nonEmpty) ||
            acceptable.contains(file.ext) && !file.name.contains(".js.")
      }
      val `file filter` = new UsefulFileFilter("css", "scss", "sass", "less", "map")
      IO.createDirectory(`Web Modules target directory`)
      IO.copyDirectory(source = `Web Modules source directory`, target = `Web Modules target directory` / "script")
      FileUtils.copyDirectory(`Web Modules source directory`, `Web Modules target directory` / "style", `file filter`)
    },
    // run the copy after compile/assets but before managed resources
    copyWebJarResources <<= copyWebJarResources dependsOn (compile in Compile, WebKeys.assets in Compile in client, fastOptJS in Compile in client),
    managedResources in Compile <<= (managedResources in Compile) dependsOn copyWebJarResources,
    watchSources <++= (watchSources in client),
    resourceGenerators in Compile <+= Def.task {
      val files = ((crossTarget in (client, Compile)).value ** ("*.js" || "*.map")).get
      val mappings: Seq[(File, String)] = files pair rebase((crossTarget in (client, Compile)).value, ((resourceManaged in Compile).value / "assets/").getAbsolutePath)
      val map: Seq[(File, File)] = mappings.map { case (s, t) => (s, file(t)) }
      IO.copy(map).toSeq
    },
    reStart <<= reStart dependsOn (managedResources in Compile),
    libraryDependencies ++= (
      JvmDependencies.akka ++
      JvmDependencies.jarlocating ++
      JvmDependencies.functional ++
      JvmDependencies.serverPickling ++
      JvmDependencies.logging ++
      JvmDependencies.serialization ++
      JvmDependencies.testing
    )
  )
Edit 0: A very obscure chat thread has a guy saying what I am feeling: no, not **** scala, but
Mark Eibes @i-am-the-slime Oct 15 2015 09:37
@ochrons I'm still fighting. I can't seem to pickle anything anymore.
https://gitter.im/scala-js/scala-js/archives/2015/10/15
I have a rather simple requirement: I have one WebSocket route on an akka-http server, which is defined in AkkaServerLogEventToMessageHandler():
object AkkaServerLogEventToMessageHandler
    extends Directives {

  val sourceOfLogs =
    Source.actorPublisher[AkkaServerLogMessage](AkkaServerLogEventPublisher.props) map { event ⇒
      BinaryMessage(
        ByteString(
          Pickle.intoBytes[AkkaServerLogMessage](event)
        )
      )
    }

  def apply(): server.Route = {
    handleWebSocketMessages(
      Flow[Message].merge(sourceOfLogs)
    )
  }
}
This fits into a tiny set of routes in the most obvious way.
Now why is it that I cannot get boopickle, upickle, or prickle to serialize something as simple as this stupid case class?
sealed case class AkkaServerLogMessage(
  message: String,
  level: Int,
  timestamp: Long
)
No nesting
All primitive types
No generics
Only three of them
These all produced roughly the same error
Using all three of the common picklers to write
Using TextMessage instead of BinaryMessage and the corresponding upickle or prickle writeJs or whatever methods
Varying the case class down to nothing (nothing, as in no members)
Varying the input itself to the case class
Importing various permutations of Implicits and underscore stuff
... specifically, they gave me variations on the same stupid error (not the same error, but considerably similar)
server [ERROR] [04/21/2016 22:04:00.362] [app-akka.actor.default-dispatcher-7] [akka.actor.ActorSystemImpl(app)] WebSocket handler failed with stub
server java.lang.RuntimeException: stub
server at scala.sys.package$.error(package.scala:27)
server at scala.scalajs.runtime.package$.linkingInfo(package.scala:143)
server at scala.scalajs.runtime.package$.environmentInfo(package.scala:137)
server at scala.scalajs.js.Dynamic$.global(Dynamic.scala:78)
server at boopickle.StringCodec$.encodeUTF8(StringCodec.scala:56)
server at boopickle.Encoder.writeString(Codecs.scala:338)
server at boopickle.BasicPicklers$StringPickler$.pickle(Pickler.scala:183)
server at boopickle.BasicPicklers$StringPickler$.pickle(Pickler.scala:134)
server at boopickle.PickleState.pickle(Pickler.scala:511)
server at shindig.clientaccess.handler.AkkaServerLogEventToMessageHandler$$anonfun$1$Pickler$macro$1$2$.pickle(AkkaServerLogEventToMessageHandler.scala:35)
server at shindig.clientaccess.handler.AkkaServerLogEventToMessageHandler$$anonfun$1$Pickler$macro$1$2$.pickle(AkkaServerLogEventToMessageHandler.scala:35)
server at boopickle.PickleImpl$.apply(Default.scala:70)
server at boopickle.PickleImpl$.intoBytes(Default.scala:75)
server at shindig.clientaccess.handler.AkkaServerLogEventToMessageHandler$$anonfun$1.apply(AkkaServerLogEventToMessageHandler.scala:35)
server at shindig.clientaccess.handler.AkkaServerLogEventToMessageHandler$$anonfun$1.apply(AkkaServerLogEventToMessageHandler.scala:31)
This worked
Not using Flow.merge (defeats the purpose; I want to keep sending output along with the logs)
Using a static value
Other useless things
Appeal
Please let me know where and why I am stupid... I spent four hours on this problem today in different forms, and it is driving me nuts.

In your build.sbt, you have:
lazy val shared = (crossProject.crossType(CrossType.Pure) in file("shared"))
.configure(_.enablePlugins(ScalaJSPlugin))
Do not do this. You must not enable the Scala.js plugin on a cross-project, ever. This also adds it to the JVM side, which will wreak havoc. Most notably, this will cause %%% to resolve the Scala.js artifacts of your dependencies in the JVM project, and that is really bad. This is what causes your issue.
crossProject already adds the Scala.js plugin to the JS part, and only that one. So simply remove that enablePlugins line.
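For reference, a minimal sketch of what the shared definition might look like with that line removed (keeping the OP's SjsDependencies helpers as they appear above):

lazy val shared = (crossProject.crossType(CrossType.Pure) in file("shared"))
  // no .configure(_.enablePlugins(ScalaJSPlugin)) here: crossProject already enables
  // the Scala.js plugin on shared.js, and only there
  .settings(SjsDependencies.pickling.toSettingsDefinition(): _*)
  .settings(SjsDependencies.tagsAndDom.toSettingsDefinition(): _*)
  .settings(SjsDependencies.css.toSettingsDefinition(): _*)

lazy val sharedJvm = shared.jvm
lazy val sharedJs = shared.js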

Mystery solved. Thanks to @Justin du Coeur for pointing me in the right direction.
The particular reason boopickle wasn't working was that, through the dependency chain, I was including both the Scala.js and the JVM version of boopickle in the server project.
I removed the server dependsOn for client and for sharedJs and also removed boopickle from the shared dependencies. Now it works.
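In build.sbt terms, the fix amounts to something like the sketch below (JvmDependencies.serverPickling and SjsDependencies.pickling are the OP's own helpers from the build above; exactly which helper carries boopickle on each side is an assumption):

// shared no longer lists boopickle; each platform pulls in its own pickling artifact
lazy val server = project
  .dependsOn(sharedJvm)                 // dropped: dependsOn(client) and sharedJs
  .enablePlugins(SbtNativePackager)
  .settings(
    libraryDependencies ++= JvmDependencies.serverPickling  // JVM boopickle artifact
  )

lazy val client = project
  .enablePlugins(ScalaJSPlugin, SbtWeb, SbtSass)
  .dependsOn(sharedJs)
  .settings(SjsDependencies.pickling.toSettingsDefinition(): _*)  // Scala.js boopickle artifact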

Related

sbt with extra unmanagedSourceDirectories doesn't work with IntelliJ

I use sbt-projectmatrix for a project like this:
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-projectmatrix" % "0.8.0")
// build.sbt
val foo = projectMatrix
  .jvmPlatform(Seq("2.13.6", "2.12.15"))
  .settings(
    Compile / unmanagedSourceDirectories += (ThisBuild / baseDirectory).value / "shared"
  )

// foo/src/main/scala/Foo.scala
object Foo {
  val s = "foo"
  def sharedS = Shared.s
}

// shared/Shared.scala
object Shared {
  val s = "shared"
  def fooS = Foo.s
}
When I import the project into IntelliJ, it complains:
Duplicate content roots detected: Path [/tmp/bug] of module [bug] was removed from modules [shared-sources]
Then IntelliJ does not know Shared.s in Foo.scala (it marks Shared red with the error "Cannot resolve symbol Shared") and does not know Foo.s in Shared.scala.
Any workaround please?
Extra info:
In my real project, I have other vals like bar = projectMatrix.., and foo and bar share some Scala sources, which I put in the shared folder.
EDIT: Sorry. I have just found that this issue is not related to sbt-projectmatrix: replacing projectMatrix with project in build.sbt gives the same problem.

Akka - Unable to send Discriminated Unions as messages in F#

Akka - Discriminated Unions as messages in F#
I am unable to use discriminated unions as messages to akka actors. If anyone can point me at an example that does this, it would be much appreciated.
My own attempt at this is at git@github.com:Tweega/AkkaMessageIssue.git (snippets below). It is a cut-down version of a sample found at https://github.com/rikace/AkkaActorModel.git (Chat project).
Problem
The DU message never finds its target on the server actor, but is sent to the dead-letter box. If I send plain objects instead, they do arrive.
If I send a DU but set my server actor to listen for generic objects, the message does arrive, but its type is
seq [seq [seq []]
and I can't get at the underlying DU.
The DU I am trying to send as message
type PrinterJob =
| PrintThis of string
| Teardown
The client code
let system = System.create "MyClient" config
let chatClientActor =
spawn system "ChatClient" <| fun mailbox ->
let server = mailbox.Context.ActorSelection("akka.tcp://MyServer#localhost:8081/user/ChatServer")
let rec loop nick = actor {
let! (msg:PrinterJob) = mailbox.Receive()
server.Tell(msg)
return! loop nick
}
loop ""
Messages from console input are forwarded to the client actor:
while true do
    let input = Console.ReadLine()
    chatClientActor.Tell(PrintThis(input))
The server code
let system = System.create "MyServer" config
let chatServerActor =
spawn system "ChatServer" <| fun (mailbox:Actor<_>) ->
let rec loop (clients:Akka.Actor.IActorRef list) = actor {
let! (msg:PrinterJob) = mailbox.Receive()
printfn "Received %A" msg //Received seq [seq [seq []]; seq [seq [seq []]]] ???
match msg with
| PrintThis str ->
Console.WriteLine("Printing: {0} Do we get this?", str)
return! loop clients
| Teardown ->
Console.WriteLine("Tearing down now")
return! loop clients
}
loop []
Dependencies
(I am not using paket here) - PM commands below:
Install-Package Akka -Version 1.4.23
Install-Package Akka.Remote -Version 1.4.23
Install-Package Akka.FSharp -Version 1.4.23
I am hosting the application in net5.0
Constructor argument names - oddity?
When passing class instances as objects, Akka seems to be sensitive to the names of constructor parameters. The message gets handled, but the data is not copied across from client to server. If you have a property called Username, the constructor parameter cannot be, for example, uName; otherwise its value is null when it reaches the server. Code for this is in the params branch.
type DoesWork(montelimar: string) =
    member x.Montelimar = montelimar

type DoesNotWork(montelimaro: string) =
    member x.Montelimar = montelimaro
I opened an issue in the Akka.NET repository: https://github.com/akkadotnet/akka.net/issues/5194
And added a detailed reproduction for this: https://github.com/akkadotnet/akka.net/pull/5196
But it looks like Newtonsoft.Json really can't perform this deserialization without being given a type hint, which Akka.NET's network serialization does not do by default for JSON:
type TestUnion =
    | A of string
    | B of int * string

type TestUnion2 =
    | C of string * TestUnion
    | D of int

[<Fact(Skip = "JSON.NET really does not support even basic DU serialization")>]
member _.``JSON.NET must serialize DUs`` () =
    let du = C("a-11", B(11, "a-12"))
    let settings = new JsonSerializerSettings()
    settings.Converters.Add(new DiscriminatedUnionConverter())
    let serialized = JsonConvert.SerializeObject(du, settings)
    let deserialized = JsonConvert.DeserializeObject(serialized, settings)
    Assert.Equal(du :> obj, deserialized)
That test will not pass and it doesn't use any of Akka.NET's infrastructure at all - so the default JSON serializer simply won't work for real-world F# use cases.
We can try changing the defaults of our serialization system to include a type hint, but that will take a lot of validation testing (for old Akka.Persistence data serialized without one).
A better solution, which my pull request validates, is to use Hyperion for polymorphic serialization instead - it will be similarly transparent to you but it has much more robust handling for complex types than Newtonsoft.Json and is actually faster: https://getakka.net/articles/networking/serialization.html#how-to-setup-hyperion-as-default-serializer
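For reference, switching the default serializer over to Hyperion is a HOCON change along these lines (a sketch following the linked docs; it assumes the Akka.Serialization.Hyperion package is installed):

akka.actor {
  serializers {
    hyperion = "Akka.Serialization.HyperionSerializer, Akka.Serialization.Hyperion"
  }
  serialization-bindings {
    "System.Object" = hyperion
  }
}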

Problem with getting rights data [for execution / read / write] file

I'm writing a program that should output the meta-info (size, execute / read / write permissions, time of last modification) of all files in a specified directory.
I managed to retrieve all of that information except the execute / read / write permissions.
I tried to get this info using PosixFilePermissions, but when adding it to the list I get Exception in thread "main" java.lang.UnsupportedOperationException.
Maybe you should use some other library? Or did I make a mistake somewhere? I would be grateful for any advice!
import java.io.File
import java.nio.file.Files
import java.nio.file.Path
import java.nio.file.attribute.BasicFileAttributes
import java.nio.file.attribute.PosixFilePermissions
import java.util.Arrays
import org.apache.commons.io.comparator.NameFileComparator

fun long(path: Path): MutableList<String> {
    var listOfFiles = mutableListOf<String>()
    val files = File("$path").listFiles()
    var attr: BasicFileAttributes
    Arrays.sort(files, NameFileComparator.NAME_COMPARATOR)
    files.forEach {
        if (it.isFile) {
            attr = Files.readAttributes<BasicFileAttributes>(it.toPath(), BasicFileAttributes::class.java)
            listOfFiles.add("${it.name} ${attr.size()} ${attr.lastModifiedTime()}" +
                " ${PosixFilePermissions.toString(Files.getPosixFilePermissions(it.toPath()))}")
        }
        else listOfFiles.add("dir ${it.name}")
    }
    return listOfFiles
}
PosixFilePermissions are only usable for POSIX-compatible file systems (Linux etc.).
For a Windows system, the permissions have to be accessed directly:
file.canRead()
file.canWrite()
file.canExecute()
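A small sketch of the combined approach (written in Scala here, but the java.nio and java.io calls are identical from Kotlin): check whether the default file system supports the "posix" attribute view before asking for POSIX permissions, and fall back to the three File flags otherwise. The permissionString helper is hypothetical.

import java.nio.file.{FileSystems, Files, Path}
import java.nio.file.attribute.PosixFilePermissions

// Hypothetical helper: POSIX permission string where supported, rwx flags otherwise
def permissionString(path: Path): String =
  if (FileSystems.getDefault.supportedFileAttributeViews().contains("posix"))
    PosixFilePermissions.toString(Files.getPosixFilePermissions(path))
  else {
    val f = path.toFile
    (if (f.canRead) "r" else "-") + (if (f.canWrite) "w" else "-") + (if (f.canExecute) "x" else "-")
  }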

Substitutions not occurring with Typesafe Config

When I use Config.withValue to create an updated config, substitutions are not re-evaluated, even with a call to resolve:
application.conf:
zooKeeperAddr = "localhost:2181"
zooKeeperAddr2 = ${zooKeeperAddr}
Application code:
val config = ConfigFactory.load()
.withValue("zooKeeperAddr", ConfigValueFactory.fromAnyRef("abc"))
.resolve;
val zooKeeperAddr = config.getAnyRef("zooKeeperAddr")
val zooKeeperAddr2 = config.getAnyRef("zooKeeperAddr2")
println(s"zooKeeperAddr, zooKeeperAddr2 is $zooKeeperAddr, $zooKeeperAddr2")
I expect, of course, to see
zooKeeperAddr, zooKeeperAddr2 is abc, abc
But what I see instead is:
zooKeeperAddr, zooKeeperAddr2 is abc, localhost:2181
How can I get substitutions re-evaluated?
(The larger issue is, I'm trying to inject command-line arguments, in particular, Twitter Module Flags, into the Typesafe Config. Perhaps there's a better way to achieve this goal?
My actual code is:
val config = flag.getAll(false).foldLeft(ConfigFactory.load()) {
  case (conf, f) if f.isDefined => conf.withValue(f.name, ConfigValueFactory.fromAnyRef(f.get.get))
  case (conf, _) => conf
}.resolve
)
So I (the OP) ended up doing the following:
val config = flag.getAll(false).foldLeft(ConfigFactory.empty()) {
    case (conf, f) if f.isDefined => conf.withValue(f.name, ConfigValueFactory.fromAnyRef(f.get.get))
    case (conf, _) => conf
  }
  .withFallback(ConfigFactory.defaultOverrides())
  .withFallback(ConfigFactory.defaultApplication())
  .withFallback(ConfigFactory.defaultReference())
  .resolve
flag.getAll returns an Iterable[com.twitter.app.Flag]; for each flag that isDefined, we add it to an initially empty config (ConfigFactory.empty()).
Then we withFallback to, in order, the default overrides (the system properties), the application config (application.conf), and the default reference (which should include, I hope, all reference.confs in all jars).
withFallback, according to its documentation, "Returns a new value computed by merging this value with another, with keys in this value "winning" over the other one."
Finally, we resolve.
This seems to propagate the substitutions as I want, but I can't help but think the Config API provides an easier way to do this.
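For what it's worth, the reason the original attempt didn't work is that ConfigFactory.load() returns an already-resolved config, so by the time withValue is applied the ${zooKeeperAddr} substitution has already been replaced; the layered approach works because nothing is resolved until the very end. One slightly more compact variant is to collect the defined flags into a single unresolved Config with ConfigFactory.parseMap (map keys are treated as config paths) instead of folding withValue calls. A sketch, assuming the same Twitter flag object as above:

import com.typesafe.config.{Config, ConfigFactory}

// Gather the defined flags into one unresolved override config
val overrides = new java.util.HashMap[String, AnyRef]()
flag.getAll(false).foreach { f =>
  if (f.isDefined) overrides.put(f.name, f.get.get.asInstanceOf[AnyRef])
}

val config: Config = ConfigFactory.parseMap(overrides)
  .withFallback(ConfigFactory.defaultOverrides())
  .withFallback(ConfigFactory.defaultApplication())
  .withFallback(ConfigFactory.defaultReference())
  .resolve()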

Which OO programming style in R will be readable to a Python programmer?

I'm the author of the logging package on CRAN. I don't see myself as an R programmer, so I tried to make it as code-compatible with the Python standard logging package as I could, but now I have a question, and I hope it will give me the chance to learn some more R!
It's about hierarchical loggers. In Python I would create a logger and send it logging records:
l = logging.getLogger("some.lower.name")
l.debug("test")
l.info("some")
l.warn("say no")
In my R package, instead, you do not create a logger to which you send messages; you invoke a function where one of the arguments is the name of the logger. Something like:
logdebug("test", logger="some.lower.name")
loginfo("some", logger="some.lower.name")
logwarn("say no", logger="some.lower.name")
The problem is that you have to repeat the name of the logger each time you want to send it a logging message. I was thinking I might create a partially applied function object and invoke that instead, something like:
logdebug <- curry(logging::logdebug, logger="some.lower.logger")
but then I would need to do so for all the logging functions...
How would you R users approach this?
Sounds like a job for a reference class; see ?setRefClass and ?ReferenceClasses.
Logger <- setRefClass("Logger",
  fields = list(name = "character"),
  methods = list(
    log = function(level, ...)
      { levellog(level, ..., logger = name) },
    debug = function(...) { log("DEBUG", ...) },
    info = function(...) { log("INFO", ...) },
    warn = function(...) { log("WARN", ...) },
    error = function(...) { log("ERROR", ...) }
  ))
and then
> basicConfig()
> l <- Logger$new(name="hierarchic.logger.name")
> l$debug("oops")
> l$info("oops")
2011-02-11 11:54:05 NumericLevel(INFO):hierarchic.logger.name:oops
> l$warn("oops")
2011-02-11 11:54:11 NumericLevel(WARN):hierarchic.logger.name:oops
>
This could be done with the proto package. It supports older versions of R (it's been around for years), so you would not have a problem of old vs. new versions of R.
library(proto)
library(logging)
Logger. <- proto(
  new = function(this, name)
    this$proto(name = name),
  log = function(this, ...)
    levellog(..., logger = this$name),
  setLevel = function(this, newLevel)
    logging::setLevel(newLevel, container = this$name),
  addHandler = function(this, ...)
    logging::addHandler(this, ..., logger = this$name),
  warn = function(this, ...)
    this$log(loglevels["WARN"], ...),
  error = function(this, ...)
    this$log(loglevels["ERROR"], ...)
)
basicConfig()
l <- Logger.$new(name = "hierarchic.logger.name")
l$warn("this may be bad")
l$error("this definitely is bad")
This gives the output:
> basicConfig()
> l <- Logger.$new(name = "hierarchic.logger.name")
> l$warn("this may be bad")
2011-02-28 10:17:54 WARNING:hierarchic.logger.name:this may be bad
> l$error("this definitely is bad")
2011-02-28 10:17:54 ERROR:hierarchic.logger.name:this definitely is bad
In the above we have merely layered proto on top of logging but it would be possible to turn each logging object into a proto object, i.e. it would be both, since both logging objects and proto objects are R environments. That would get rid of the extra layer.
See http://r-proto.googlecode.com for more info.
Why would you repeat the name? It would be more convenient to pass the log object directly to the function, i.e.
logdebug("test",logger=l)
# or
logdebug("test",l)
A bit like the way one would use connections in a number of functions. That seems more the R way of doing it, I guess.