How to write a short local open in ReScript?

This compiles in ReasonML:
let testFn = who => Js.(log("Hello " ++ who ++ "!"));
but not in ReScript:
FAILED: src/test.ast
Syntax error!
/xxx/src/test.res:1:25-27
1 │ let testFn = who => Js.(log("Hello " ++ who ++ "!"));
2 │
I'm not sure what to parse here when looking at "(".
Syntax error!
/xxx/src/test.res:1:25-27
1 │ let testFn = who => Js.(log("Hello " ++ who ++ "!"));
2 │
consecutive statements on a line must be separated by ';' or a newline
I didn't find any mention of this being removed in the official docs. Did I miss it? Has the syntax changed, or was it removed without being mentioned in the docs?

As pointed out by @Yawar in the comments, this shorthand is not supported at the time of writing, but is likely to be at some point in the future (see https://github.com/rescript-lang/syntax/issues/2 for discussion).
And just to save a click for those coming across this, a workaround is to rewrite it using a local scope and opening the module in that scope:
let testFn = who => {
open Js
log("Hello " ++ who ++ "!")
}
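And if the module is only needed for this one call, fully qualifying it works as well and avoids the block entirely:
let testFn = who => Js.log("Hello " ++ who ++ "!")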

Related

Rescript open is not bringing functions into scope

How are open and module prefixing supposed to work in ReScript and rescript-react? It doesn't appear to adhere to the documentation. For example, I have a file reader module:
FileReader.res
module FileReader = {
  type fileReader
  type file = {"name": string, "lastModified": int, "size": int, "type__": string}

  @new external createFileReader: unit => fileReader = "FileReader"
  @bs.send
  external readAsDataURL: (fileReader, file) => unit = "readAsDataURL"

  let onload: (fileReader, string => unit) => unit = %raw(`
    function (reader, cb) {
      reader.onload = function (e) {
        cb(e.target.result);
      }
    }
  `)

  let fileToDataUrl: (file, string => unit) => unit = (file, continue) => {
    let reader = createFileReader()
    onload(reader, continue)
    readAsDataURL(reader, file)
  }
}
That I am trying to use from a react component:
Upload.res
open FileReader

let k = x => (_ => x)
let setState = stateHook => (newVal => stateHook(_ => newVal))
let firstFileFromEvent = event => ReactEvent.Form.target(event)["files"][0]

@react.component
let make = () => {
  let (dataUrl, setDataUrl) = React.useState(k(""))
  let setDataUrlState = setState(setDataUrl)
  let fileOnChange = (event) =>
    event ->
    firstFileFromEvent ->
    FileReader.fileToDataUrl(setDataUrlState)
  <div>
    <input type_="file" onChange=fileOnChange />
    <img src=dataUrl />
  </div>
}
The only way to get the code to compile is to both open the module and reference the function with the module name prefix. My understanding is that you only need to do one or the other. If I remove the open statement, this is the output of the compiler:
We've found a bug for you!
src/Upload.res:15:13-36

13 ┆   event ->
14 ┆   firstFileFromEvent ->
15 ┆   FileReader.fileToDataUrl(setDataUrlState)
16 ┆
17 ┆

The value fileToDataUrl can't be found in FileReader
Adding the open statement back and changing the statement FileReader.fileToDataUrl(setDataUrlState) to fileToDataUrl(setDataUrlState) results in this exception:
We've found a bug for you!
src/Upload.res:15:13-25

13 ┆   event ->
14 ┆   firstFileFromEvent ->
15 ┆   fileToDataUrl(setDataUrlState)
16 ┆
17 ┆

The value fileToDataUrl can't be found
The source I posted, with both the open statement and the module prefix on the function call, compiles but has a warning:
Warning number 44
src/Upload.res:1:1-15

1 │ open FileReader
2 │
3 │ let k = x => (_ => x)

this open statement shadows the module identifier FileReader (which is later used)
I am on a Mac, using ReScript 9.1.4.
In ReScript, every code file is itself a module. By putting your code into FileReader.res you've already created a module called FileReader. And by using module FileReader = { ... } you're creating another submodule called FileReader inside it.
The simple solution would then be to just not wrap it in a submodule. Alternatively, you could also open FileReader.FileReader.
What the warning is telling you is that when you open FileReader, you're importing another module called FileReader which shadows (i.e. hides) the top-level module that's also called FileReader.
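For illustration, here is a sketch of the un-nested variant: the same bindings from the question, just without the module FileReader = { ... } wrapper, so the file itself is the FileReader module.

// FileReader.res
type fileReader
type file = {"name": string, "lastModified": int, "size": int, "type__": string}

@new external createFileReader: unit => fileReader = "FileReader"
@bs.send
external readAsDataURL: (fileReader, file) => unit = "readAsDataURL"

let onload: (fileReader, string => unit) => unit = %raw(`
  function (reader, cb) {
    reader.onload = function (e) {
      cb(e.target.result);
    }
  }
`)

let fileToDataUrl: (file, string => unit) => unit = (file, continue) => {
  let reader = createFileReader()
  onload(reader, continue)
  readAsDataURL(reader, file)
}

With this, Upload.res can drop the open FileReader line and keep calling FileReader.fileToDataUrl(...), since the file-level module now provides it directly.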

Elixir - Manipulating a 2 dimensional list

Hope everybody is having a beautiful 2019 even though we're just a day in.
I am currently working on a small Phoenix app where I'm manipulating PDF files (in the context of this question I'm splitting them) and then uploading them to S3. Later on I have to delete the temporary files created by pdftk (a PDF tool I use to split them up) and also show the S3 links in the response body, since this is an API request.
The way I have structured this is as follows:
Inside my Split module, where the core business logic lives:
filenames = []
s3_links = []

Enum.map(pages, fn(item) ->
  split_filename = item
    |> split(filename)

  link = split_filename
    |> FileHelper.result_file_bytes()
    |> ManageS3.upload()
    |> FileHelper.save_file(work_group_id, pass)

  [filenames ++ split_filename, s3_links ++ link]
end)
|> transform()

{filenames, s3_links}
The important things are split_filename and link
This is what I'm getting when I call an IO.inspect in the transform() method:
[
  ["87cdcd73-5b27-4757-a472-78aaf6cc6864.pdf", "Some_S3_LINK00"],
  ["0ab460ca-5019-4864-b0ff-343966c7d72a.pdf", "Some_S3_LINK01"]
]
The structure is [[filename, s3_link], [filename, s3_link]], whereas the desired outcome would be [[list of all filenames], [list of all s3 links]].
If anybody can lend a hand I would be super grateful. Thanks in advance!
Sidenotes:
Assigning filenames = []; s3_links = [] at the very beginning makes zero sense. Enum.map already maps the input. What you need is probably Enum.reduce/3.
Don't use the pipe |> operator when the pipe consists of only a single call; it is considered an anti-pattern by the Elixir core team.
Always start pipes with a term.
Solution:
Reduce the input directly into the result you need using Enum.reduce/3.
pages
|> Enum.reduce([[], []], fn item, [files, links] ->
  split_filename = split(item, filename)

  link =
    split_filename
    |> FileHelper.result_file_bytes()
    |> ManageS3.upload()
    |> FileHelper.save_file(work_group_id, pass)

  [[split_filename | files], [link | links]]
end)
|> Enum.map(&Enum.reverse/1)
|> IO.inspect(label: "Before transform")
|> transform()
You did not provide the input to test it, but I believe it should work.
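For reference, with the two files from your IO.inspect output, the labelled inspect above should print something shaped like this (a sketch; the exact values depend on your input):

Before transform: [
  ["87cdcd73-5b27-4757-a472-78aaf6cc6864.pdf",
   "0ab460ca-5019-4864-b0ff-343966c7d72a.pdf"],
  ["Some_S3_LINK00", "Some_S3_LINK01"]
]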
Instead of working with lists of lists, you may want to consider using a tuple of lists. Something like the following should work for you.
List.foldl(pages, {[], []}, fn(item, {filenames, links}) ->
  filename = split(item, filename)

  link =
    filename
    |> FileHelper.result_file_bytes()
    |> ManageS3.upload()
    |> FileHelper.save_file(work_group_id, pass)

  {[filename | filenames], [link | links]}
end)
This will return a value that looks like the following (note that, because the fold prepends to each list, the elements will come out in reverse page order unless you reverse the lists afterwards):
{
  ["87cdcd73-5b27-4757-a472-78aaf6cc6864.pdf",
   "0ab460ca-5019-4864-b0ff-343966c7d72a.pdf"],
  ["Some_S3_LINK00",
   "Some_S3_LINK01"]
}
Though, depending on how you are using these values, maybe a list of tuples would be more appropriate. Something like
Enum.map(pages, fn(item) ->
  filename = split(item, filename)

  link =
    filename
    |> FileHelper.result_file_bytes()
    |> ManageS3.upload()
    |> FileHelper.save_file(work_group_id, pass)

  {filename, link}
end)
would return
[
{"87cdcd73-5b27-4757-a472-78aaf6cc6864.pdf", "Some_S3_LINK00"},
{"0ab460ca-5019-4864-b0ff-343966c7d72a.pdf", "Some_S3_LINK01"}
]
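And if you later need the [[all filenames], [all links]] shape after all, Enum.unzip/1 turns a list of two-element tuples into exactly that pair of lists (shown here with the literal values from above):

# Enum.unzip/1 splits a list of {filename, link} tuples into {filenames, links}
Enum.unzip([
  {"87cdcd73-5b27-4757-a472-78aaf6cc6864.pdf", "Some_S3_LINK00"},
  {"0ab460ca-5019-4864-b0ff-343966c7d72a.pdf", "Some_S3_LINK01"}
])
# => {["87cdcd73-5b27-4757-a472-78aaf6cc6864.pdf", "0ab460ca-5019-4864-b0ff-343966c7d72a.pdf"],
#     ["Some_S3_LINK00", "Some_S3_LINK01"]}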

Turn absolute file paths and line numbers in the tool output into hyperlinks

This is an example output:
/usr/local/bin/node /usr/local/bin/elm-make src/elm/Main.elm --output=builds/main.js
-- TYPE MISMATCH ---------------------------------------------- src/elm/Main.elm
The type annotation for `init` does not match its definition.
35| init : Maybe Route.Location -> ( Model, Cmd Msg )
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The type annotation is saying:
Maybe Route.Location -> ( { route : Maybe Route.Location }, Cmd Msg )
But I am inferring that the definition has this type:
Maybe Route.Location
-> ( { route : Maybe Route.Location -> Route.Model }, Cmd a )
Detected errors in 1 module.
Process finished with exit code 1
This is the regex that I came up with:
http://regexr.com/3egqu
However, creating an output filter out of it doesn't work.
Thus far, I only know that the following works: ------ ($FILE_PATH$)
And it turns the file path into a link.
Help me find a way to include the line numbers in the links.
Here's what I've come up with.
First,
elm-make --report json
outputs the build errors in structured JSON:
$ elm-make --report json src/main.elm
[{"tag":"unused import","overview":"Module `Bootstrap.CDN` is unused.","details":"Best to remove it. Don't save code quality for later!","region":{"start":{"line":3,"column":1},"end":{"line":3,"column":28}},"type":"warning","file":"src/main.elm"}]
Now you can pipe that output through jq to reformat it:
elm make src/main.elm --report json --output ./public/app.js | \
jq '.[] | { type: .type, file: .file, line: .region.start.line|tostring, column: .region.start.column|tostring, tag: .tag, details: .details }' | \
jq --raw-output '. | "[" + (.type|ascii_upcase) + "] " + .file + ":" + .line + ":" + .column + " " + .tag + " -- " + .details + "\n"'
That gives you reformatted output:
[WARNING] src/main.elm:9:1 unused import -- Best to remove it. Don't save code quality for later!
[WARNING] src/main.elm:17:1 missing type annotation -- I inferred the type annotation so you can copy it into your code:
main : Program Never Model Main.Msg
You can pick this up in IntelliJ using the format
$FILE_PATH$:$LINE$:$COLUMN$ $MESSAGE$
You can then click on an error message to jump to the file, and see the error text in a tooltip.
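As a practical sketch (the script name and file paths here are placeholders for illustration), you can put the pipeline above into a small shell script and run it, for example as an IntelliJ External Tool, so the output filter is applied to its output:

#!/bin/sh
# elm-build.sh -- hypothetical wrapper around the jq pipeline above; adjust entry file and output path
elm make src/main.elm --report json --output ./public/app.js | \
jq '.[] | { type: .type, file: .file, line: .region.start.line|tostring, column: .region.start.column|tostring, tag: .tag, details: .details }' | \
jq --raw-output '. | "[" + (.type|ascii_upcase) + "] " + .file + ":" + .line + ":" + .column + " " + .tag + " -- " + .details + "\n"'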

Why can I not pickle my case classes? What should I do to solve this manually next time?

Edit 2: Observations and questions
I am pretty sure, along with the commenter Justin below, that the problem is due to an errant build.sbt configuration. However, this is the first time I have seen an errant build.sbt configuration that literally works for everything else except for picklers. Maybe that is because they use macros, and I as a rule avoid them.
Why would it matter whether Flow.merge is used vs Flow.map if the problem is with the sbt configuration?
Suspicious build.sbt extract
lazy val server = project
  .dependsOn(sharedJvm, client)
Suspicious stack trace
So this is the top of the stack: it goes from a method I cannot find to the linking environment to the string encoding utils. Ok.
server java.lang.RuntimeException: stub
Huh? stub?
server at scala.sys.package$.error(package.scala:27)
server at scala.scalajs.runtime.package$.linkingInfo(package.scala:143)
server at scala.scalajs.runtime.package$.environmentInfo(package.scala:137)
HUH?
server at scala.scalajs.js.Dynamic$.global(Dynamic.scala:78)
???
server at boopickle.StringCodec$.encodeUTF8(StringCodec.scala:56)
Edit 1: My big and beautiful build.sbt might be the problem
What you cannot see is what I have organized in my project folder:
JvmDependencies.scala which has regular Jvm dependencies
SjsDependencies.scala which has Def.settingsKeys of libraryDependencies on JsModuleIDs
WebJarDependencies.scala which has javascripts and css's
build.sbt
lazy val shared = (crossProject.crossType(CrossType.Pure) in file("shared"))
  .configure(_.enablePlugins(ScalaJSPlugin))
  .settings(SjsDependencies.pickling.toSettingsDefinition(): _*)
  .settings(SjsDependencies.tagsAndDom.toSettingsDefinition(): _*)
  .settings(SjsDependencies.css.toSettingsDefinition(): _*)

lazy val sharedJvm = shared.jvm
lazy val sharedJs = shared.js

lazy val cmdlne = project
  .dependsOn(sharedJvm)
  .settings(
    libraryDependencies ++= (
      JvmDependencies.commandLine ++
      JvmDependencies.logging ++
      JvmDependencies.akka ++
      JvmDependencies.serialization
    )
  )

lazy val client = project
  .enablePlugins(ScalaJSPlugin, SbtWeb, SbtSass)
  .dependsOn(sharedJs)
  .settings(
    (SjsDependencies.shapeless ++ SjsDependencies.audiovideo ++ SjsDependencies.databind ++ SjsDependencies.functional ++ SjsDependencies.lensing ++ SjsDependencies.logging ++ SjsDependencies.reactive).toSettingsDefinition(),
    jsDependencies ++= WebjarDependencies.js,
    libraryDependencies ++= WebjarDependencies.notJs,
    persistLauncher in Compile := true
  )

lazy val server = project
  .dependsOn(sharedJvm, client)
  .enablePlugins(SbtNativePackager)
  .settings(
    copyWebJarResources := {
      streams.value.log("Copying webjar resources")
      val `Web Modules target directory` = (resourceManaged in Compile).value / "assets"
      val `Web Modules source directory` = (WebKeys.assets in Assets in client).value / "lib"
      final class UsefulFileFilter(acceptable: String*) extends FileFilter {
        // TODO ADJUST TO EXCLUDE JS MAP FILES
        import scala.collection.JavaConversions._
        def accept(file: File) = (file.isDirectory && FileUtils.listFiles(file, acceptable.toArray, true).nonEmpty) || acceptable.contains(file.ext) && !file.name.contains(".js.")
      }
      val `file filter` = new UsefulFileFilter("css", "scss", "sass", "less", "map")
      IO.createDirectory(`Web Modules target directory`)
      IO.copyDirectory(source = `Web Modules source directory`, target = `Web Modules target directory` / "script")
      FileUtils.copyDirectory(`Web Modules source directory`, `Web Modules target directory` / "style", `file filter`)
    },
    // run the copy after compile/assets but before managed resources
    copyWebJarResources <<= copyWebJarResources dependsOn(compile in Compile, WebKeys.assets in Compile in client, fastOptJS in Compile in client),
    managedResources in Compile <<= (managedResources in Compile) dependsOn copyWebJarResources,
    watchSources <++= (watchSources in client),
    resourceGenerators in Compile <+= Def.task {
      val files = ((crossTarget in(client, Compile)).value ** ("*.js" || "*.map")).get
      val mappings: Seq[(File,String)] = files pair rebase((crossTarget in(client, Compile)).value, ((resourceManaged in Compile).value / "assets/").getAbsolutePath )
      val map: Seq[(File, File)] = mappings.map { case (s, t) => (s, file(t)) }
      IO.copy(map).toSeq
    },
    reStart <<= reStart dependsOn (managedResources in Compile),
    libraryDependencies ++= (
      JvmDependencies.akka ++
      JvmDependencies.jarlocating ++
      JvmDependencies.functional ++
      JvmDependencies.serverPickling ++
      JvmDependencies.logging ++
      JvmDependencies.serialization ++
      JvmDependencies.testing
    )
  )
Edit 0: A very obscure chat thread has a guy saying what I am feeling: no, not **** scala, but
Mark Eibes @i-am-the-slime Oct 15 2015 09:37
@ochrons I'm still fighting. I can't seem to pickle anything anymore.
https://gitter.im/scala-js/scala-js/archives/2015/10/15
I have a rather simple requirement - I have one WebSocket route on an akka-http server, defined in AkkaServerLogEventToMessageHandler():
object AkkaServerLogEventToMessageHandler
    extends Directives {

  val sourceOfLogs =
    Source.actorPublisher[AkkaServerLogMessage](AkkaServerLogEventPublisher.props) map { event ⇒
      BinaryMessage(
        ByteString(
          Pickle.intoBytes[AkkaServerLogMessage](event)
        )
      )
    }

  def apply(): server.Route = {
    handleWebSocketMessages(
      Flow[Message].merge(sourceOfLogs)
    )
  }
}
This fits into a tiny set of routes in the most obvious way.
Now why is it that I cannot get boopickle, upickle, or prickle to serialize something as simple as this stupid case class?
sealed case class AkkaServerLogMessage(
  message: String,
  level: Int,
  timestamp: Long
)
No nesting
All primitive types
No generics
Only three of them
These all produced roughly the same error
Using all three of the common picklers to write
Using TextMessage instead of BinaryMessage and the corresponding upickle or prickle writeJs or whatever methods
Varying the case class down to nothing (nothing, as in no members)
Varying the input itself to the case class
Importing various permutations of Implicits and underscore stuff
... specifically, they gave me variations on the same stupid error (not the same error, but considerably similar)
server [ERROR] [04/21/2016 22:04:00.362] [app-akka.actor.default-dispatcher-7] [akka.actor.ActorSystemImpl(app)] WebSocket handler failed with stub
server java.lang.RuntimeException: stub
server at scala.sys.package$.error(package.scala:27)
server at scala.scalajs.runtime.package$.linkingInfo(package.scala:143)
server at scala.scalajs.runtime.package$.environmentInfo(package.scala:137)
server at scala.scalajs.js.Dynamic$.global(Dynamic.scala:78)
server at boopickle.StringCodec$.encodeUTF8(StringCodec.scala:56)
server at boopickle.Encoder.writeString(Codecs.scala:338)
server at boopickle.BasicPicklers$StringPickler$.pickle(Pickler.scala:183)
server at boopickle.BasicPicklers$StringPickler$.pickle(Pickler.scala:134)
server at boopickle.PickleState.pickle(Pickler.scala:511)
server at shindig.clientaccess.handler.AkkaServerLogEventToMessageHandler$$anonfun$1$Pickler$macro$1$2$.pickle(AkkaServerLogEventToMessageHandler.scala:35)
server at shindig.clientaccess.handler.AkkaServerLogEventToMessageHandler$$anonfun$1$Pickler$macro$1$2$.pickle(AkkaServerLogEventToMessageHandler.scala:35)
server at boopickle.PickleImpl$.apply(Default.scala:70)
server at boopickle.PickleImpl$.intoBytes(Default.scala:75)
server at shindig.clientaccess.handler.AkkaServerLogEventToMessageHandler$$anonfun$1.apply(AkkaServerLogEventToMessageHandler.scala:35)
server at shindig.clientaccess.handler.AkkaServerLogEventToMessageHandler$$anonfun$1.apply(AkkaServerLogEventToMessageHandler.scala:31)
This worked
Not using Flow.merge (defeats the purpose, I want to keep sending output along with the logs)
Using a static value
Other useless things
Appeal
Please let me know where and why I am stupid... I spent four hours on this problem today in different forms, and it is driving me nuts.
In your build.sbt, you have:
lazy val shared = (crossProject.crossType(CrossType.Pure) in file("shared"))
  .configure(_.enablePlugins(ScalaJSPlugin))
Do not do this. You must not enable the Scala.js plugin on a cross-project, ever. This also adds it to the JVM side, which will wreak havoc. Most notably, this will cause %%% to resolve the Scala.js artifacts of your dependencies in the JVM project, and that is really bad. This is what causes your issue.
crossProject already adds the Scala.js plugin to the JS part, and only that one. So simply remove that enablePlugins line.
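In other words, the shared definition from the build.sbt above becomes something like this (the same settings, just without the enablePlugins line):

lazy val shared = (crossProject.crossType(CrossType.Pure) in file("shared"))
  .settings(SjsDependencies.pickling.toSettingsDefinition(): _*)
  .settings(SjsDependencies.tagsAndDom.toSettingsDefinition(): _*)
  .settings(SjsDependencies.css.toSettingsDefinition(): _*)

lazy val sharedJvm = shared.jvm
lazy val sharedJs = shared.js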
Mystery solved. Thanks to @Justin du Coeur for pointing me in the right direction.
The reason boopickle in particular wasn't working was that, through the dependency chain, I was including both the Scala.js and the JVM versions of boopickle in the server project.
I removed the server dependsOn for client and for sharedJs and also removed boopickle from the shared dependencies. Now it works.
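For completeness, a sketch of the trimmed server definition after that change (only the dependsOn line differs; the remaining settings stay as in the build.sbt above):

lazy val server = project
  .dependsOn(sharedJvm) // no longer depends on client, so no Scala.js artifacts end up on the server classpath
  .enablePlugins(SbtNativePackager)
  .settings(/* same settings as before */)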

noflo 0.5.13 spreadsheet example broken?

I am new to noflo and am looking at examples in order to explore it. The spreadsheet example looked interesting, but I couldn't make it run. First, it took some time and manual debugging to identify missing components; not a big deal, and I believe this will improve in the future. For now, the error message I get is:
return process.component.outPorts[port].attach(socket);
^
TypeError: undefined is not a function
Apparently, before this, isAddressable() was also undefined. I checked against a related SO issue, but I don't have noflo 0.4 as a dependency anywhere. I spent some time debugging it but got stuck, so I decided to post here.
The question is, what are the correct steps to run the spreadsheet example?
For reproduction, here is what I have done:
0) install the following components
noflo-adapters
noflo-core
noflo-couchdb
noflo-filesystem
noflo-groups
noflo-objects
noflo-packets
noflo-strings
noflo-tika
noflo-xml
i) Edit spreadsheet/parse.fbp, because the first error was:
throw new Error("No outport '" + port + "' defined in process " + proc
^
Error: No outport 'error' defined in process Read (Read() ERROR -> IN Display())
Apparently the couchdb ReadDocument component does not provide an ERROR outport, so I replaced ReadDocument with ReadFile:
18c18
< 'tika-app-0.9.jar' -> TIKA Read(ReadDocument)
---
> 'tika-app-0.9.jar' -> TIKA Read(ReadFile)
ii) At this point, I received the following:
if (process.component.outPorts[port].isAddressable()) {
^
TypeError: undefined is not a function
and I improvised by checking whether isAddressable is defined at this location in the code:
@@ -259,9 +261,11 @@
throw new Error("No outport '" + port + "' defined in process " + process.id + " (" + (socket.getId()) + ")");
return;
}
- if (process.component.outPorts[port].isAddressable()) {
+ if (process.component.outPorts[port].isAddressable && process.component.outPorts[port].isAddressable()) {
return process.component.outPorts[port].attach(socket, index);
}
return process.component.outPorts[port].attach(socket);
};
and either way it fails. Again, the question is: what are the correct steps to run the spreadsheet example?
Thanks in advance.