Sending binary data with GHCJS XHR - xmlhttprequest

I've got a ByteString that I need to send over XHR in GHCJS, but I can't for the life of me figure out how to get that ByteString into XHR's RequestData:
data RequestData = NoData
                 | StringData JSString
                 | TypedArrayData (forall e. SomeTypedArray e Immutable)
                 | FormData [(JSString, FormDataVal)]
Obviously TypedArrayData is what I need to use, but I'm having no luck at all figuring out how to convert a ByteString to fit in there. I looked at this, and I tried something like the following:
setData r bs = do
  let (b, _, _) = fromByteString $ toStrict bs
  return r { XHR.reqData = XHR.TypedArrayData $ getUint8Array b }
But for some reason, I'm getting a weird kind error:
Couldn't match kind ‘AnyK’ with ‘*’
Expected type: GHCJS.Buffer.Types.SomeBuffer
'ghcjs-base-0.2.0.0:GHCJS.Internal.Types.Immutable
Actual type: Buffer
In the first argument of ‘getUint8Array’, namely ‘b’
In the second argument of ‘($)’, namely ‘getUint8Array b’
As far as I can tell, there's no reason these types should be incompatible: Buffer is just a type synonym for SomeBuffer Immutable, so the complaint about AnyK looks like the compiler failing to instantiate a kind variable rather than a real type mismatch.


Scalding Unit Test - How to Write A Local File?

I work at a place where Scalding writes are augmented with a specific API to track dataset metadata. When converting from normal writes to these special writes, there are some intricacies with respect to Key/Value, TSV/CSV, Thrift ... datasets. I would like to verify that the binary file is the same before the conversion and after the conversion to the special API.
Given that I cannot share the specific API for the metadata-inclusive writes, I will only ask: how can I write a unit test for the .write method on a TypedPipe?
implicit val timeZone: TimeZone = DateOps.UTC
implicit val dateParser: DateParser = DateParser.default
implicit def flowDef: FlowDef = new FlowDef()
implicit def mode: Mode = Local(true)

val fileStrPath = root + "/test"
println("writing data to " + fileStrPath)

TypedPipe
  .from(Seq[Long](1, 2, 3, 4, 5))
  // .map((x: Long) => { println(x.toString); System.out.flush(); x })
  .write(TypedTsv[Long](fileStrPath))
  .forceToDisk
The above doesn't seem to write anything to local (OSX) disk.
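One detail worth noting (my observation, not something from the original post): .write only registers a sink on the implicit FlowDef, and nothing runs until that flow is actually executed, which the snippet above never does. A minimal sketch of a local write using Scalding's Execution API, which plans and runs the flow itself (the output path is hypothetical):
import com.twitter.scalding._

object LocalWriteSketch extends App {
  val fileStrPath = "/tmp/scalding-test" // hypothetical output path

  // writeExecution wraps the write in an Execution[Unit];
  // waitFor actually plans and runs the flow in the given mode.
  TypedPipe
    .from(Seq[Long](1, 2, 3, 4, 5))
    .writeExecution(TypedTsv[Long](fileStrPath))
    .waitFor(Config.default, Local(true))
    .get // surface any failure as an exception
}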
So I wonder if I need to use a MiniDFSCluster, something like this:
def setUpTempFolder: String = {
  val tempFolder = new TemporaryFolder
  tempFolder.create()
  tempFolder.getRoot.getAbsolutePath
}
val root: String = setUpTempFolder
println(s"root = $root")
val tempDir = Files.createTempDirectory(setUpTempFolder).toFile
val hdfsCluster: MiniDFSCluster = {
  val configuration = new Configuration()
  configuration.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, tempDir.getAbsolutePath)
  configuration.set("io.compression.codecs", classOf[LzopCodec].getName)
  new MiniDFSCluster.Builder(configuration)
    .manageNameDfsDirs(true)
    .manageDataDfsDirs(true)
    .format(true)
    .build()
}
hdfsCluster.waitClusterUp()
val fs: DistributedFileSystem = hdfsCluster.getFileSystem
val rootPath = new Path(root)
fs.mkdirs(rootPath)
However, my attempts to get this MiniCluster to work haven't panned out either - somehow I need to link the MiniCluster with the Scalding write.
Note: The Scalding JobTest framework for unit testing isn't going to work, because the actual data written is sometimes wrapped in a bijection codec or set up with case class wrappers before the metadata-inclusive write APIs run.
Any ideas how I can write a local file (without using the Scalding REPL), with either Scalding alone or a MiniCluster? (If using the latter, I need a hint on how to read the file back.)
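For the read-back half, a sketch (my own, under the assumption that the job's output is plain text such as TSV) using the Hadoop FileSystem handle obtained from the minicluster above:
// List whatever was written under rootPath and print it line by line.
fs.listStatus(rootPath).foreach { status =>
  val in = fs.open(status.getPath)
  try scala.io.Source.fromInputStream(in).getLines().foreach(println)
  finally in.close()
}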
Answering my own question: there is an example of how to use a minicluster for exactly this kind of reading from and writing to HDFS, so I will be able to cross-read my different writes and examine them. It is in the tests for Scalding's TypedParquet type.
HadoopPlatformJobTest is an extension for JobTest that uses a MiniCluster.
With some hand-waving over the details in the link, the bulk of the code is this:
"TypedParquetTuple" should {
"read and write correctly" in {
import com.twitter.scalding.parquet.tuple.TestValues._
def toMap[T](i: Iterable[T]): Map[T, Int] = i.groupBy(identity).mapValues(_.size)
HadoopPlatformJobTest(new WriteToTypedParquetTupleJob(_), cluster)
.arg("output", "output1")
.sink[SampleClassB](TypedParquet[SampleClassB](Seq("output1"))) {
toMap(_) shouldBe toMap(values)
}
.run()
HadoopPlatformJobTest(new ReadWithFilterPredicateJob(_), cluster)
.arg("input", "output1")
.arg("output", "output2")
.sink[Boolean]("output2")(toMap(_) shouldBe toMap(values.filter(_.string == "B1").map(_.a.bool)))
.run()
}
}

System.Text.Json adds an extra curly bracket ONLY TO THE END, causing an exception

I am sending a JSON object through a TCP socket and deserializing it after the destination receives it. Usually the first few objects are sent and deserialized without issue! And then suddenly one arrives with an extra curly bracket, only at the end, followed by a runtime exception.
Seriously, what the hell is this?
System.Text.Json.JsonException: ''}' is invalid after a single JSON value. Expected end of data. Path: $ | LineNumber: 0 | BytePositionInLine: 32.'
{"Value":3,"Name":"Blood Sugar"}}
The receiving side:
byte[] byteMessage = new byte[1024]; // allocated once, reused for every Receive
while (true)
{
    seperateSocketForEachRequest.Receive(byteMessage); // the byte count returned here is ignored
    seperateSocketForEachRequest.Send(Encoding.UTF8.GetBytes("FF"));
    string stringMessage = Encoding.UTF8.GetString(byteMessage);
    stringMessage = stringMessage.Substring(0, stringMessage.IndexOf('\0'));
    Object message = JsonSerializer.Deserialize<Object>(stringMessage);
}
The sending side:
while (Form.isGenerate)
{
    Data newData = dataType.generate(person.generatingParameters);
    Thread.Sleep(500);
    clientSocket.Send(Encoding.UTF8.GetBytes(JsonSerializer.Serialize<Data>(newData)));
    byte[] messageReceivedByte = new Byte[1024];
    clientSocket.Receive(messageReceivedByte);
}
I found the issue. It is caused by how the data is received: the same buffer is reused for every Receive call, and new data is written over the old data. When the value is a two-digit number there is no issue, but when a one-digit value arrives right after a two-digit one, the last byte of the previous, longer message is left in the buffer, and boom:
{"Value":76,"Name":"Blood Sugar"}
{"Value":99,"Name":"Blood Sugar"}
{"Value":76,"Name":"Blood Sugar"}
{"Value":1,"Name":"Blood Sugar"}}

Akka - Unable to send Discriminated Unions as messages in F#

I am unable to use discriminated unions as messages to Akka actors. If anyone can point me to an example that does this, it would be much appreciated.
My own attempt at this is at git@github.com:Tweega/AkkaMessageIssue.git (snippets below). It is a cut-down version of a sample found at https://github.com/rikace/AkkaActorModel.git (Chat project).
Problem
The DU message never finds its target on the server actor and is sent to the dead-letter box instead. If I send plain objects, they do arrive.
If I send a DU but set my server actor to listen for generic Objects, the message does arrive, but its type prints as
seq [seq [seq []]
and I can't get at the underlying DU.
The DU I am trying to send as a message:
type PrinterJob =
    | PrintThis of string
    | Teardown
The client code
let system = System.create "MyClient" config
let chatClientActor =
spawn system "ChatClient" <| fun mailbox ->
let server = mailbox.Context.ActorSelection("akka.tcp://MyServer#localhost:8081/user/ChatServer")
let rec loop nick = actor {
let! (msg:PrinterJob) = mailbox.Receive()
server.Tell(msg)
return! loop nick
}
loop ""
Messages are forwarded to the client from console input:
while true do
    let input = Console.ReadLine()
    chatClientActor.Tell(PrintThis(input))
The server code
let system = System.create "MyServer" config
let chatServerActor =
spawn system "ChatServer" <| fun (mailbox:Actor<_>) ->
let rec loop (clients:Akka.Actor.IActorRef list) = actor {
let! (msg:PrinterJob) = mailbox.Receive()
printfn "Received %A" msg //Received seq [seq [seq []]; seq [seq [seq []]]] ???
match msg with
| PrintThis str ->
Console.WriteLine("Printing: {0} Do we get this?", str)
return! loop clients
| Teardown ->
Console.WriteLine("Tearing down now")
return! loop clients
}
loop []
Dependencies
(I am not using Paket here); Package Manager commands below:
Install-Package Akka -Version 1.4.23
Install-Package Akka.Remote -Version 1.4.23
Install-Package Akka.FSharp -Version 1.4.23
The application targets net5.0.
Constructor argument names - oddity?
When passing class instances as messages, Akka seems to be sensitive to the names of constructor parameters. The message gets handled, but the data is not copied across from client to server. If you have a property called Username, the constructor parameter cannot be, for example, uName; otherwise its value is null when it reaches the server. Code for this is in the params branch.
type DoesWork(montelimar: string) =
    member x.Montelimar = montelimar

type DoesNotWork(montelimaro: string) =
    member x.Montelimar = montelimaro
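This matches how Newtonsoft.Json fills constructor parameters: it pairs each parameter with a JSON property of the same name (case-insensitively), and a parameter that matches no property is left at its default, hence the null. A sketch of keeping a differing parameter name by annotating it (my addition, assuming Json.NET's JsonProperty attribute, which can be applied to constructor parameters):
open Newtonsoft.Json

// The attribute tells Json.NET which JSON property feeds this parameter.
type AlsoWorks([<JsonProperty("Montelimar")>] montelimaro: string) =
    member x.Montelimar = montelimaro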
I opened an issue in the Akka.NET repository: https://github.com/akkadotnet/akka.net/issues/5194
And added a detailed reproduction for this: https://github.com/akkadotnet/akka.net/pull/5196
But it looks like Newtonsoft.Json really can't perform this deserialization without being given a type hint, which Akka.NET's network serialization does not do by default for JSON:
type TestUnion =
| A of string
| B of int * string
type TestUnion2 =
| C of string * TestUnion
| D of int
[<Fact(Skip="JSON.NET really does not support even basic DU serialization")>]
member _.``JSON.NET must serialize DUs`` () =
let du = C("a-11", B(11, "a-12"))
let settings = new JsonSerializerSettings()
settings.Converters.Add(new DiscriminatedUnionConverter())
let serialized = JsonConvert.SerializeObject(du, settings)
let deserialized = JsonConvert.DeserializeObject(serialized, settings)
Assert.Equal(du :> obj, deserialized)
That test will not pass and it doesn't use any of Akka.NET's infrastructure at all - so the default JSON serializer simply won't work for real-world F# use cases.
We can try changing the defaults of our serialization system to include a type hint, but that will take a lot of validation testing (for old Akka.Persistence data serialized without one).
A better solution, which my pull request validates, is to use Hyperion for polymorphic serialization instead - it will be similarly transparent to you but it has much more robust handling for complex types than Newtonsoft.Json and is actually faster: https://getakka.net/articles/networking/serialization.html#how-to-setup-hyperion-as-default-serializer
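For reference, the linked page boils down to a few lines of HOCON. A sketch of making Hyperion the default serializer (assuming the Akka.Serialization.Hyperion NuGet package is installed):
akka.actor {
  serializers {
    hyperion = "Akka.Serialization.HyperionSerializer, Akka.Serialization.Hyperion"
  }
  serialization-bindings {
    "System.Object" = hyperion
  }
}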

first `readLine` is skipped inside a `case - of` control flow in Nim-lang

I have the following code.
import lib

var stat = false

when isMainModule:
  while stat != true:
    echo("Option: ")
    var opt = readChar(stdin)
    case opt
    of 'q':
      stat = true
    of 'n':
      echo("Salu: ")
      var ss = readLine(stdin)
      echo("Nam: ")
      var nn = readLine(stdin)
      let k = prompt("Rust")
    else: discard
What I am trying to achieve is prompting for and receiving user input, one after another, for two variables. Upon choosing n, I expect Salu first and, once that input is supplied, Nam.
However, here is what I get when I compile and run the code:
~~> nim c -r src/cmdparsing.nim
...
...
...
CC: stdlib_system.nim
CC: cmdparsing.nim
Hint: [Link]
Hint: operation successful (48441 lines compiled; 2.338 sec total; 66.824MiB peakmem; Debug Build) [SuccessX]
Hint: /home/XXXXX/Development/nim_devel/mycode/src/mycode [Exec]
Option:
n
Salu:
Nam:
Salu is echoed, but readLine doesn't wait for my input and Nam is immediately echoed. Yet the stacked readLine calls in the prompt procedure appear one after the other, each waiting for user input.
I was wondering what it is that I am missing here. Could someone enlighten me?
The code for prompt lives in lib.nim, which is as follows:
proc prompt*(name: string): bool =
  echo("Salutation: ")
  var nn = readLine(stdin)
  echo(nn & "." & name)
  echo("Diesel")
  var dd = readLine(stdin)
  echo(dd)
  return true
You do a readChar to get the opt value, and then you input two characters: n and \n. The first is the opt value; the second stays buffered in stdin, waiting for further reading. The next time you try to read a line, the \n that's still pending is interpreted as a complete line and immediately assigned to ss. You don't see anything because the line is empty except for the newline character.
E.g.
var opt = readChar(stdin)
case opt
of 'n':
  var ss = readLine(stdin)
  echo ss
else:
  discard
Compile and run it, but as input type something like "ntest". The n fires the first branch of the case, and test (the remainder of stdin) is assigned to ss and echoed.
You have two options to solve the problem:
1. Read a line instead of a char, and store only its first char, with something like var opt = readLine(stdin)[0].
2. Use the rdstdin module:
import rdstdin
var ss = readLineFromStdin("Salu:")

How to pass the same parameter with different values

I am trying to call the following API using Alamofire, but the API takes multiple "to" fields. I tried passing an array of "to" emails as a parameter; it shows no error, but the message is not sent to all the emails. The API itself is correct; I tested it from the terminal. Any suggestions will be cordially welcomed.
http -a email:pass -f POST 'sampleUrl' from="email@email.com" to="ongkur.cse@gmail.com" to="emailgmail@email.com" subject="test_sub" bodyText="testing hello"
Here is my code:
class func sendMessage(message: MessageModel, delegate: RestAPIManagerDelegate?) {
    let urlString = "http://localhost:8080/app/user/messages"
    var parameters = [String: AnyObject]()
    parameters = [
        "from": message.messageFrom.emailAddress
    ]
    var array = [String]()
    for to in message.messageTO {
        array.append(to)
    }
    parameters["to"] = array
    // note: each iteration overwrites "cc"/"bcc", so only the last value survives
    for cc in message.messageCC {
        parameters["cc"] = cc.emailAddress
    }
    for bcc in message.messageBCC {
        parameters["bcc"] = bcc.emailAddress
    }
    parameters["subject"] = message.messageSubject
    parameters["bodyText"] = message.bodyText

    Alamofire.request(.POST, urlString, parameters: parameters)
        .authenticate(user: MessageManager.sharedInstance().primaryUserName, password: MessageManager.sharedInstance().primaryPassword)
        .validate(statusCode: 200..<201)
        .validate(contentType: ["application/json"])
        .responseJSON { (_, _, jsonData, error) in
            if error != nil {
                println("\n sendMessage attempt json response:")
                println(error!)
                delegate?.messageSent?(false)
                return
            }
            println("Server response during message sending:\n")
            let swiftyJSONData = JSON(jsonData!)
            println(swiftyJSONData)
            delegate?.messageSent?(true)
        }
}
First of all, if you created the API yourself, you should consider changing it to expect an array of 'to' recipients instead of repeating the same parameter name.
As back2dos states it in this answer: https://stackoverflow.com/a/1898078/672989
Although POST may have multiple values for the same key, I'd be cautious using it, since some servers can't even properly handle that, which is probably why this isn't supported ... if you convert "duplicate" parameters to a list, the whole thing might start to choke, if a parameter comes in only once, and suddenly you wind up having a string or something ...
And I think he's right.
In this case I guess this is not possible with Alamofire, just as it is not possible with AFNetworking: https://github.com/AFNetworking/AFNetworking/issues/21
Alamofire probably stores its POST parameters in a Dictionary, which doesn't allow duplicate keys.
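If the server cannot be changed, one workaround is to skip the parameters dictionary entirely and percent-encode the application/x-www-form-urlencoded body yourself, so the to key can repeat. A sketch using modern Foundation APIs (the question predates them; the addresses and URL are placeholders):
import Foundation

// Build a form-encoded body by hand so a key can appear more than once,
// which a [String: AnyObject] parameters dictionary cannot express.
func formBody(from pairs: [(String, String)]) -> Data? {
    var allowed = CharacterSet.alphanumerics
    allowed.insert(charactersIn: "-._*")
    let encoded = pairs.map { pair -> String in
        let k = pair.0.addingPercentEncoding(withAllowedCharacters: allowed) ?? pair.0
        let v = pair.1.addingPercentEncoding(withAllowedCharacters: allowed) ?? pair.1
        return "\(k)=\(v)"
    }
    return encoded.joined(separator: "&").data(using: .utf8)
}

// Usage: the "to" key repeats, one pair per recipient.
let pairs = [
    ("from", "me@example.com"),
    ("to", "first@example.com"),
    ("to", "second@example.com"),
    ("subject", "test_sub"),
    ("bodyText", "testing hello")
]
var request = URLRequest(url: URL(string: "http://localhost:8080/app/user/messages")!)
request.httpMethod = "POST"
request.setValue("application/x-www-form-urlencoded", forHTTPHeaderField: "Content-Type")
request.httpBody = formBody(from: pairs)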