FAKE: build a project with the unsafe flag

I am trying to build a solution where one of the projects needs to be built with the unsafe flag on. It is set correctly in the project file; however, when building I get the error:
"Unsafe code may only appear if compiling with /unsafe"
This is my target at the moment:
Target "CompileApp" (fun _ ->
!! #"**\*.csproj"
|> MSBuildRelease buildDir "Build"
|> Log "AppBuild-Output: "
)
I tried adding MSBuildParams, but I'm not sure how to use them yet (i.e. there doesn't seem to be an option in MSBuildRelease to add something like this):
let setParams defaults =
    { defaults with
        Verbosity = Some(Quiet)
        Targets = ["Build"]
        Properties =
            [
                "AllowUnsafeBlocks", "True"
                "Configuration", "Release"
            ]
    }
Also, would the best option here be to create two different targets for projects with safe and unsafe code, or is there a better way?

I found that the AllowUnsafeBlocks=true element was only defined under the DEBUG|AnyCPU and Release|AnyCPU PropertyGroups in my project file.
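For reference, a configuration-specific PropertyGroup of that kind typically looks something like this in an old-style .csproj (an illustrative sketch, not the actual project file):
<!-- Illustrative only: AllowUnsafeBlocks is set solely for builds matching this Configuration/Platform pair -->
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
  <AllowUnsafeBlocks>true</AllowUnsafeBlocks>
</PropertyGroup>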
Using this fixed it for me:
Target "BuildApp" (fun _ ->
!! ".\**\MyApp.*.csproj"
|> MSBuild buildDir "Build" ["Configuration", "Release"; "Platform", "AnyCPU"]
|> Log "AppBuild-Output: "
)
Hope this helps.

OK, I think this might be the way:
Target "CompileUnsafe" (fun _ ->
let buildMode = getBuildParamOrDefault "buildMode" "Release"
let setParams defaults =
{ defaults with
Verbosity = Some(Quiet)
Targets = ["Build"]
Properties =
[
"Optimize", "True"
"DebugSymbols", "True"
"Configuration", buildMode
"AllowUnsafeBlocks", "True"
]
}
build setParams "./ProjectPlugins.sln"
)
If there are better solutions I'm all ears (the solution was there in the docs and I just missed it).

Related

How can I include kotlin-reflect in the classpath of the Bazel compiler?

I'm trying to get moshi-kotlin-codegen to run on some Kotlin code via Bazel. After a lot of trial and error, I managed to get the plugin to run, but it's failing due to not having kotlin-reflect on the classpath. This is needed by kotlinpoet, which is used by Moshi, so it should be transitively included, AFAICT. However, even explicitly stating the dependency in the BUILD.bazel file for moshi-kotlin-codegen doesn't make it work, so I can only assume it gets filtered out somewhere.
The WORKSPACE file:
load("#bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
http_archive(
name = "rules_jvm_external",
sha256 = "62133c125bf4109dfd9d2af64830208356ce4ef8b165a6ef15bbff7460b35c3a",
strip_prefix = "rules_jvm_external-3.0",
url = "https://github.com/bazelbuild/rules_jvm_external/archive/3.0.zip",
)
load("#rules_jvm_external//:defs.bzl", "maven_install")
maven_install(
artifacts = [
"com.github.ajalt:clikt:2.6.0",
"org.eclipse.jgit:org.eclipse.jgit:5.7.0.202003090808-r",
"io.github.microutils:kotlin-logging:1.7.8",
"ch.qos.logback:logback-classic:1.2.3",
"com.github.scribejava:scribejava-core:6.9.0",
"com.squareup.moshi:moshi:1.9.2",
"com.squareup.moshi:moshi-kotlin-codegen:1.9.2",
"org.kohsuke:github-api:1.108",
"com.github.ben-manes.caffeine:caffeine:2.8.2",
"javax.xml.bind:jaxb-api:2.3.1",
"org.junit.jupiter:junit-jupiter:5.6.0",
"org.junit.jupiter:junit-jupiter-params:5.6.0",
"com.google.truth:truth:1.0.1",
],
fetch_sources = True,
repositories = [
"https://maven.google.com",
"https://repo1.maven.org/maven2",
"https://jcenter.bintray.com/",
],
strict_visibility = True,
)
rules_kotlin_version = "legacy-1.4.0-rc3"
rules_kotlin_sha = "da0e6e1543fcc79e93d4d93c3333378f3bd5d29e82c1bc2518de0dbe048e6598"
http_archive(
name = "io_bazel_rules_kotlin",
urls = ["https://github.com/bazelbuild/rules_kotlin/releases/download/%s/rules_kotlin_release.tgz" % rules_kotlin_version],
sha256 = rules_kotlin_sha,
)
load("#io_bazel_rules_kotlin//kotlin:kotlin.bzl", "kotlin_repositories", "kt_register_toolchains")
kotlin_repositories()
kt_register_toolchains()
The BUILD.bazel for moshi-kotlin-codegen:
java_plugin(
    name = "moshi_kotlin_codegen_plugin",
    processor_class = "com.squareup.moshi.kotlin.codegen.JsonClassCodegenProcessor",
    deps = [
        "@maven//:com_squareup_moshi_moshi_kotlin_codegen",
    ],
    generates_api = True,
    visibility = ["//visibility:public"],
)
(I also tried adding a java_library and depending on that, no luck.)
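For context, the "explicitly stating the dependency" attempt mentioned at the top would have looked roughly like this; it is a hypothetical reconstruction, the @maven//:org_jetbrains_kotlin_kotlin_reflect label assumes org.jetbrains.kotlin:kotlin-reflect was added to maven_install, and as noted it did not fix the error:
# Hypothetical reconstruction of the explicit-dependency attempt; not from the actual repository.
java_plugin(
    name = "moshi_kotlin_codegen_plugin",
    processor_class = "com.squareup.moshi.kotlin.codegen.JsonClassCodegenProcessor",
    deps = [
        "@maven//:com_squareup_moshi_moshi_kotlin_codegen",
        "@maven//:org_jetbrains_kotlin_kotlin_reflect",  # explicit, yet apparently filtered out
    ],
    generates_api = True,
    visibility = ["//visibility:public"],
)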
The final BUILD file that tries to include it:
load("#io_bazel_rules_kotlin//kotlin:kotlin.bzl", "kt_jvm_binary")
kt_jvm_binary(
name = "myproject",
srcs = glob([
"**/*.kt",
]),
main_class = "my.project.MainKt",
plugins = [
"//third_party/moshi_kotlin_codegen:moshi_kotlin_codegen_plugin",
],
deps = [
"#maven//:ch_qos_logback_logback_classic",
"#maven//:com_github_ajalt_clikt",
"#maven//:com_github_ben_manes_caffeine_caffeine",
"#maven//:com_github_scribejava_scribejava_core",
"#maven//:com_squareup_moshi_moshi",
"#maven//:io_github_microutils_kotlin_logging",
"#maven//:org_eclipse_jgit_org_eclipse_jgit",
"#maven//:org_kohsuke_github_api",
"#maven//:javax_xml_bind_jaxb_api",
],
)
The exception during the compilation:
Caused by: kotlin.jvm.KotlinReflectionNotSupportedError: Kotlin reflection implementation is not found at runtime. Make sure you have kotlin-reflect.jar in the classpath
at kotlin.jvm.internal.ClassReference.error(ClassReference.kt:79)
at kotlin.jvm.internal.ClassReference.getQualifiedName(ClassReference.kt:15)
at com.squareup.kotlinpoet.ClassNames.get(ClassName.kt:49)
at com.squareup.moshi.kotlinpoet.classinspector.elements.ElementsClassInspector.<clinit>(ElementsClassInspector.kt:493)
at com.squareup.moshi.kotlin.codegen.JsonClassCodegenProcessor.process(JsonClassCodegenProcessor.kt:99)
at org.jetbrains.kotlin.kapt3.base.incremental.IncrementalProcessor.process(incrementalProcessors.kt)
at org.jetbrains.kotlin.kapt3.base.ProcessorWrapper.process(annotationProcessing.kt:147)
at jdk.compiler/com.sun.tools.javac.processing.JavacProcessingEnvironment.callProcessor(JavacProcessingEnvironment.java:980)
... 48 more
It turns out that this was indeed a bug. A fix is https://github.com/bazelbuild/rules_kotlin/pull/354.

platform dependent linker flags in bazel (for glut)

I am trying to build a C++ app with GLUT using Bazel. It should work on both macOS and Linux. The problem is that on macOS it requires passing "-framework OpenGL", "-framework GLUT" as linker flags, while on Linux I should probably do something like this in glut.BUILD:
cc_library(
    name = "glut",
    srcs = glob(["local/lib/libglut*.dylib", "lib/libglut*.so"]),
    ...
So the questions are:
1. How do I provide platform-dependent linker options to cc_library rules in general?
2. In particular, how do I link against GLUT in a platform-independent way using Bazel?
You can do this using the Bazel select() function. Something like this might work:
config_setting(
    name = "linux_x86_64",
    values = {"cpu": "k8"},
    visibility = ["//visibility:public"],
)

config_setting(
    name = "darwin_x86_64",
    values = {"cpu": "darwin_x86_64"},
    visibility = ["//visibility:public"],
)

cc_library(
    name = "glut",
    srcs = select({
        ":darwin_x86_64": [],
        ":linux_x86_64": glob(["local/lib/libglut*.dylib", "lib/libglut*.so"]),
    }),
    linkopts = select({
        ":darwin_x86_64": [
            "-framework OpenGL",
            "-framework GLUT",
        ],
        ":linux_x86_64": [],
    }),
    ...
)
Dig around in the Bazel GitHub repository; it has some good real-world examples of using select().
I had a similar problem, but with picking the right compiler depending on the platform, and @zlalanne's solution didn't work for me. After two days of frustration, I finally found the following solution:
config_setting(
    name = "darwin",
    constraint_values = ["@bazel_tools//platforms:osx"],
)

config_setting(
    name = "windows",
    constraint_values = ["@bazel_tools//platforms:windows"],
)
I didn't have any need for Linux, but adding this to your BUILD file should work:
config_setting(
    name = "linux",
    constraint_values = ["@bazel_tools//platforms:linux"],
)
Use ":darwin", ":windows" and ":linux" in your selects and you should have a solution that works.

How do you specify FAKE Target inputs and outputs?

In the build systems I'm familiar with (make and MSBuild), there's a way to specify the inputs and outputs for a target. If the timestamps on the input files are earlier than those on the outputs, the task is skipped. I can't find something similar in FAKE.
For example, if I wanted to translate this Makefile to FAKE
a.exe: a.fs
	fsharpc a.fs -o a.exe
it might look like:
Target "a.exe" (fun _ -> ["a.fs"] |> FscHelper.compile [...])
However, when I run the build command it will always execute the compiler and produce a new a.exe regardless of the modification time on a.fs. Is there a simple way to get the same behavior as the Makefile?
You could use =?> and provide a function that returns true or false depending on whether the task should run.
let fileModified f1 f2 =
    FileInfo(f1).LastWriteTime > FileInfo(f2).LastWriteTime
and then, in the target dependencies:
=?> ("a.exe", fileModified "a.fs" "a.exe")
A more complete code example to flesh out Lazydev's answer:
#r "packages/FAKE/tools/FakeLib.dll"
open Fake
open System.IO
Target "build" (fun _ ->
trace "built"
)
let needsUpdate f1 f2 =
let lastWrite files =
files
|> Seq.map (fun f -> FileInfo(f).LastWriteTime)
|> Seq.max
let t1 = lastWrite f1
let t2 = lastWrite f2
t1 > t2
let BuildTarget name infiles outfiles fn =
Target name (fn infiles)
name =?> ("build", needsUpdate infiles outfiles)
BuildTarget "compile" ["Test2.fs"; "Test1.fs"] ["Test2.dll"] (fun files _ ->
files
|> FscHelper.compile [
FscHelper.Target FscHelper.TargetType.Library
]
|> function 0 -> () | c -> failwithf "compile error"
)
RunTargetOrDefault "build"

Why can I not pickle my case classes? What should I do to solve this manually next time?

Edit 2: Observations and questions
I am pretty sure, along with the commenter Justin below, that the problem is due to an errant build.sbt configuration. However, this is the first time I have seen an errant build.sbt configuration that works for literally everything else except the picklers. Maybe that is because they use macros, which I avoid as a rule.
Why would it matter whether Flow.merge is used vs. Flow.map if the problem is with the sbt configuration?
Suspicious build.sbt extract
lazy val server = project
  .dependsOn(sharedJvm, client)
Suspicious stack trace
So this is the top of the stack: it goes from a method I cannot find to the linking environment to the string encoding utils. Ok.
server java.lang.RuntimeException: stub
Huh? stub?
server at scala.sys.package$.error(package.scala:27)
server at scala.scalajs.runtime.package$.linkingInfo(package.scala:143)
server at scala.scalajs.runtime.package$.environmentInfo(package.scala:137)
HUH?
server at scala.scalajs.js.Dynamic$.global(Dynamic.scala:78)
???
server at boopickle.StringCodec$.encodeUTF8(StringCodec.scala:56)
Edit 1: My big and beautiful build.sbt might be the problem
What you cannot see is that I organized these in my project folder:
JvmDependencies.scala, which has regular JVM dependencies
SjsDependencies.scala, which has Def.settingsKeys of libraryDependencies on JsModuleIDs
WebJarDependencies.scala, which has javascripts and css's
build.sbt
lazy val shared = (crossProject.crossType(CrossType.Pure) in file("shared"))
  .configure(_.enablePlugins(ScalaJSPlugin))
  .settings(SjsDependencies.pickling.toSettingsDefinition(): _*)
  .settings(SjsDependencies.tagsAndDom.toSettingsDefinition(): _*)
  .settings(SjsDependencies.css.toSettingsDefinition(): _*)

lazy val sharedJvm = shared.jvm
lazy val sharedJs = shared.js

lazy val cmdlne = project
  .dependsOn(sharedJvm)
  .settings(
    libraryDependencies ++= (
      JvmDependencies.commandLine ++
      JvmDependencies.logging ++
      JvmDependencies.akka ++
      JvmDependencies.serialization
    )
  )

lazy val client = project
  .enablePlugins(ScalaJSPlugin, SbtWeb, SbtSass)
  .dependsOn(sharedJs)
  .settings(
    (SjsDependencies.shapeless ++ SjsDependencies.audiovideo ++ SjsDependencies.databind ++ SjsDependencies.functional ++ SjsDependencies.lensing ++ SjsDependencies.logging ++ SjsDependencies.reactive).toSettingsDefinition(),
    jsDependencies ++= WebjarDependencies.js,
    libraryDependencies ++= WebjarDependencies.notJs,
    persistLauncher in Compile := true
  )

lazy val server = project
  .dependsOn(sharedJvm, client)
  .enablePlugins(SbtNativePackager)
  .settings(
    copyWebJarResources := {
      streams.value.log("Copying webjar resources")
      val `Web Modules target directory` = (resourceManaged in Compile).value / "assets"
      val `Web Modules source directory` = (WebKeys.assets in Assets in client).value / "lib"
      final class UsefulFileFilter(acceptable: String*) extends FileFilter {
        // TODO ADJUST TO EXCLUDE JS MAP FILES
        import scala.collection.JavaConversions._
        def accept(file: File) = (file.isDirectory && FileUtils.listFiles(file, acceptable.toArray, true).nonEmpty) || acceptable.contains(file.ext) && !file.name.contains(".js.")
      }
      val `file filter` = new UsefulFileFilter("css", "scss", "sass", "less", "map")
      IO.createDirectory(`Web Modules target directory`)
      IO.copyDirectory(source = `Web Modules source directory`, target = `Web Modules target directory` / "script")
      FileUtils.copyDirectory(`Web Modules source directory`, `Web Modules target directory` / "style", `file filter`)
    },
    // run the copy after compile/assets but before managed resources
    copyWebJarResources <<= copyWebJarResources dependsOn(compile in Compile, WebKeys.assets in Compile in client, fastOptJS in Compile in client),
    managedResources in Compile <<= (managedResources in Compile) dependsOn copyWebJarResources,
    watchSources <++= (watchSources in client),
    resourceGenerators in Compile <+= Def.task {
      val files = ((crossTarget in(client, Compile)).value ** ("*.js" || "*.map")).get
      val mappings: Seq[(File, String)] = files pair rebase((crossTarget in(client, Compile)).value, ((resourceManaged in Compile).value / "assets/").getAbsolutePath)
      val map: Seq[(File, File)] = mappings.map { case (s, t) => (s, file(t)) }
      IO.copy(map).toSeq
    },
    reStart <<= reStart dependsOn (managedResources in Compile),
    libraryDependencies ++= (
      JvmDependencies.akka ++
      JvmDependencies.jarlocating ++
      JvmDependencies.functional ++
      JvmDependencies.serverPickling ++
      JvmDependencies.logging ++
      JvmDependencies.serialization ++
      JvmDependencies.testing
    )
  )
Edit 0: A very obscure chat thread has a guy saying exactly what I am feeling (no, not "**** Scala", but this):
Mark Eibes @i-am-the-slime Oct 15 2015 09:37
@ochrons I'm still fighting. I can't seem to pickle anything anymore.
https://gitter.im/scala-js/scala-js/archives/2015/10/15
I have a rather simple requirement: I have one WebSocket route on an Akka HTTP server, defined in AkkaServerLogEventToMessageHandler():
object AkkaServerLogEventToMessageHandler extends Directives {

  val sourceOfLogs =
    Source.actorPublisher[AkkaServerLogMessage](AkkaServerLogEventPublisher.props) map { event ⇒
      BinaryMessage(
        ByteString(
          Pickle.intoBytes[AkkaServerLogMessage](event)
        )
      )
    }

  def apply(): server.Route = {
    handleWebSocketMessages(
      Flow[Message].merge(sourceOfLogs)
    )
  }
}
This fits into a tiny set of routes in the most obvious way.
Now why is it that I cannot get boopickle, upickle, or prickle to serialize something as simple as this stupid case class?
sealed case class AkkaServerLogMessage(
  message: String,
  level: Int,
  timestamp: Long
)
No nesting
All primitive types
No generics
Only three of them
These all produced roughly the same error:
Using all three of the common picklers to write
Using TextMessage instead of BinaryMessage and the corresponding upickle or prickle writeJs or whatever methods
Varying the case class down to nothing (nothing, as in no members)
Varying the input itself to the case class
Importing various permutations of Implicits and underscore stuff
... specifically, they gave me variations on the same stupid error (not the same error, but considerably similar)
server [ERROR] [04/21/2016 22:04:00.362] [app-akka.actor.default-dispatcher-7] [akka.actor.ActorSystemImpl(app)] WebSocket handler failed with stub
server java.lang.RuntimeException: stub
server at scala.sys.package$.error(package.scala:27)
server at scala.scalajs.runtime.package$.linkingInfo(package.scala:143)
server at scala.scalajs.runtime.package$.environmentInfo(package.scala:137)
server at scala.scalajs.js.Dynamic$.global(Dynamic.scala:78)
server at boopickle.StringCodec$.encodeUTF8(StringCodec.scala:56)
server at boopickle.Encoder.writeString(Codecs.scala:338)
server at boopickle.BasicPicklers$StringPickler$.pickle(Pickler.scala:183)
server at boopickle.BasicPicklers$StringPickler$.pickle(Pickler.scala:134)
server at boopickle.PickleState.pickle(Pickler.scala:511)
server at shindig.clientaccess.handler.AkkaServerLogEventToMessageHandler$$anonfun$1$Pickler$macro$1$2$.pickle(AkkaServerLogEventToMessageHandler.scala:35)
server at shindig.clientaccess.handler.AkkaServerLogEventToMessageHandler$$anonfun$1$Pickler$macro$1$2$.pickle(AkkaServerLogEventToMessageHandler.scala:35)
server at boopickle.PickleImpl$.apply(Default.scala:70)
server at boopickle.PickleImpl$.intoBytes(Default.scala:75)
server at shindig.clientaccess.handler.AkkaServerLogEventToMessageHandler$$anonfun$1.apply(AkkaServerLogEventToMessageHandler.scala:35)
server at shindig.clientaccess.handler.AkkaServerLogEventToMessageHandler$$anonfun$1.apply(AkkaServerLogEventToMessageHandler.scala:31)
This worked:
Not using Flow.merge (defeats the purpose; I want to keep sending output with the logs)
Using a static value
Other useless things
Appeal
Please let me know where and why I am stupid... I spent four hours on this problem today in different forms, and it is driving me nuts.
In your build.sbt, you have:
lazy val shared = (crossProject.crossType(CrossType.Pure) in file("shared"))
  .configure(_.enablePlugins(ScalaJSPlugin))
Do not do this. You must not enable the Scala.js plugin on a cross-project, ever. This also adds it to the JVM side, which will wreak havoc. Most notably, this will cause %%% to resolve the Scala.js artifacts of your dependencies in the JVM project, and that is really bad. This is what causes your issue.
crossProject already adds the Scala.js plugin to the JS part, and only that one. So simply remove that enablePlugins line.
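For clarity, a minimal sketch of the corrected definition; it is simply the shared project from the build.sbt above with that line dropped:
// Settings carried over unchanged from the original build.sbt;
// only the .configure(_.enablePlugins(ScalaJSPlugin)) call is removed.
lazy val shared = (crossProject.crossType(CrossType.Pure) in file("shared"))
  .settings(SjsDependencies.pickling.toSettingsDefinition(): _*)
  .settings(SjsDependencies.tagsAndDom.toSettingsDefinition(): _*)
  .settings(SjsDependencies.css.toSettingsDefinition(): _*)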
Mystery solved. Thanks to @Justin du Coeur for pointing me in the right direction.
The reason boopickle in particular wasn't working was that the dependency chain pulled both the Scala.js and the JVM version of boopickle into the server project.
I removed the server dependsOn for client and for sharedJs and also removed boopickle from the shared dependencies. Now it works.

Easiest way to set MSBuild logging verbosity in FAKE?

I have a target that looks like this:
Target "builddotnetcode" (fun _ ->
!! "../Mercury.sln"
|> MSBuildRelease null "Clean,Build"
|> Log "MercuryBuild - Output: "
)
I want to simply set the verbosity in there somewhere. As far as I can tell from the docs, you need to specify the Verbosity member of the MSBuildParams object, but build is the only MSBuildHelper function that provides a way to pass an MSBuildParams. Using build, I then need to specify the Configuration=Release property and the project list, and I lose the pipeline to Log. It seems like there ought to be a simpler way that does not require me to redefine the entire task. Am I missing something?
What I did is the following. The reason I did it this way is that I want to create a log file per solution file that I am building:
// Note: baseDir and name are defined elsewhere in the build script.
let loggerConfig : list<MSBuildFileLoggerConfig> = [
    {
        Number = 1
        Filename = Some (baseDir + name + "_build.log")
        Verbosity = Some MSBuildVerbosity.Minimal
        Parameters = Some [MSBuildLogParameter.Append]
    }
]
let setParams defaults =
    { defaults with
        Verbosity = Some MSBuildVerbosity.Minimal
        Targets = ["Build"]
        MaxCpuCount = Some (Some 4)
        FileLoggers = Some loggerConfig
        ToolsVersion = Some "12.0"
        Properties =
            [
                "Optimize", "True"
                "DebugSymbols", "True"
                "Configuration", buildMode
            ]
    }
Lastly, the only MSBuild helper that I could see that lets you override the MSBuild defaults is the plain build function:
build setParams solution
|> DoNothing
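Putting it together, a minimal sketch of how this sits inside a target (the solution path mirrors the question; buildMode, baseDir, and name are assumed to be defined earlier in the script):
// Sketch only: wraps the loggerConfig/setParams pieces above in a FAKE target,
// mirroring the "builddotnetcode" target from the question.
// buildMode could come from e.g. getBuildParamOrDefault "buildMode" "Release".
Target "builddotnetcode" (fun _ ->
    build setParams "../Mercury.sln"
    |> DoNothing
)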