I'm trying to implement HMAC in Elm,
but I can't seem to figure out what I'm doing wrong here.
Some help would be greatly appreciated 🙏
type alias HashFunction =
    String -> String

encrypt64 : HashFunction -> String -> String -> String
encrypt64 =
    encrypt 64

encrypt : Int -> HashFunction -> String -> String -> String
encrypt blockSize hasher message key =
    let
        keySize =
            String.length key

        keyWithCorrectSize =
            if keySize > blockSize then
                hexStringToUtf8String (hasher key)
            else if keySize < blockSize then
                String.padRight blockSize (Char.fromCode 0) key
            else
                key

        keyCodePoints =
            keyWithCorrectSize
                |> String.toList
                |> List.map Char.toCode

        partA =
            keyCodePoints
                |> List.map (Bitwise.xor 54 >> Char.fromCode)
                |> String.fromList

        partB =
            keyCodePoints
                |> List.map (Bitwise.xor 92 >> Char.fromCode)
                |> String.fromList
    in
        message
            |> String.append partA
            |> hasher
            |> hexStringToUtf8String
            |> String.append partB
            |> hasher
-- Utils

hexStringToUtf8String : String -> String
hexStringToUtf8String input =
    input
        |> String.toList
        |> List.Extra.greedyGroupsOf 2
        |> List.map (String.fromList >> hexStringToUtf8Char)
        |> String.fromList

hexStringToUtf8Char : String -> Char
hexStringToUtf8Char input =
    case Hex.fromString input of
        Ok v ->
            Char.fromCode v

        Err err ->
            Debug.crash err
You can find the related code here: https://github.com/icidasset/ongaku-ryoho/blob/master/src/App/Sources/Crypto/Hmac.elm (which includes doc tests)
Edit: To be more clear... my current code doesn't output a valid HMAC, and I would like to know why.
Looking at the Elm SHA library, I think the problem (or at least a problem) is that the output from the hash is hex encoded. HMAC calls the hash function twice, feeding the first output back into the second call, and this needs to be the raw SHA bytes, rather than a hex string.
So you need to decode the hex output from the first call to hasher, after it is applied to partA.
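For illustration, here's a quick way to see the size mismatch, assuming a hasher that hex-encodes its digest (e.g. SHA-1, whose raw digest is 20 bytes but whose hex form is 40 characters):

String.length (hasher "abc")                         -- 40 (hex characters)
String.length (hexStringToUtf8String (hasher "abc")) -- 20 (raw bytes)

Feeding the 40-character hex string into the second hasher call computes a digest of the wrong input, which is why the intermediate decode matters.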
I'm deconstructing a list into head and tail but later I need a proof that they give me the original list back when combined:
test : Bool -> String
test b =
    let lst = the (List Nat) ?getListFromOtherFunction in
    case lst of
        Nil => ""
        x :: xs =>
            let eq = the ((x :: xs) = lst) ?howToDoIt in ""
I'm using Idris 1.3.1.
You can do it with dependent pattern matching:
test : List Nat -> String
test lst with (lst) proof prf
    | Nil = ""
    | (x :: xs) = ?something
Here prf will hold your equality.
However, I think it's better to simply match on lst in the LHS; then your proofs will auto-simplify where needed.
I'm very new to Elm and I want to build a simple mileage counter app.
If I get "1.2" (point) from the input, String.toFloat returns through the Ok branch with 1.2 as a number.
But if I get "1,2" (comma) from the input, String.toFloat returns through the Err branch with "You can't have words, only numbers!".
This pretty much works like a real-time validator.
The code:
TypingInInput val ->
    case String.toFloat val of
        Ok success ->
            { model | inputValue = val, errorMessage = Nothing }

        Err err ->
            { model | inputValue = val, errorMessage = Just "You can't have words, or spaces, only numbers!" }
Question: So how can I force String.toFloat of "1,2" to give me 1.2 as a number?
Unfortunately, the source for toFloat is hardcoded to respect only a dot as the decimal separator. As a workaround, you can replace the comma with a dot in the string before passing it to toFloat.
String.Extra.replace can be used for the simple string replacement.
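A minimal sketch of that workaround, assuming the elm-community/string-extra package (the helper name parseLocaleFloat is just for illustration):

import String.Extra

parseLocaleFloat : String -> Result String Float
parseLocaleFloat input =
    input
        |> String.Extra.replace "," "."
        |> String.toFloat

-- parseLocaleFloat "1,2" == Ok 1.2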
The implementation of String.toFloat only supports a dot as a separator, so you should replace commas before parsing the Float.
Please see the example:
import Html exposing (text)
import String
import Regex
main =
    "1,2"
        |> Regex.replace Regex.All (Regex.regex ",") (\_ -> ".")
        |> String.toFloat
        |> toString
        |> text -- shows "Ok 1.2"
In JavaScript, parseFloat doesn't support a comma separator either.
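For reference, Elm 0.19 later added String.replace to core (and String.toFloat there returns a Maybe), so the whole workaround becomes a one-liner:

String.toFloat (String.replace "," "." "1,2") -- Just 1.2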
Elm supports [1..100], but if I try ['a'..'z'], the compiler gives me a type mismatch (it expects a number but gets a Char). Is there any way to make this work?
Just create a range of numbers and map it to chars:
List.map Char.fromCode [97..122]
Edit, or as a function:
charRange : Char -> Char -> List Char
charRange from to =
    List.map Char.fromCode [(Char.toCode from)..(Char.toCode to)]
charRange 'a' 'd' -- ['a','b','c','d'] : List Char
Edit, from Elm 0.18 and up, List.range is finally a function:

charRange : Char -> Char -> List Char
charRange from to =
    List.map Char.fromCode <| List.range (Char.toCode from) (Char.toCode to)
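Usage is the same as before:

charRange 'a' 'd' -- ['a','b','c','d'] : List Char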
I am trying to convert a string to an integer using String.toInt. However, when I want to bind the result to a variable and then do some simple math with it, I get this error:
Function add is expecting the 2nd argument to be:
Int
But it is:
Result String Int
How can I just extract the integer part of the result?
Here's how to supply the conversion with a default value in case the parsing fails.
String.toInt "5" |> Result.toMaybe |> Maybe.withDefault 0
toInt can fail in parsing. You need to check it using a case statement:
case toInt str of
    Err msg -> ... -- do something with the error message
    Ok val -> ... -- val is an Int which you can add
More about Result here
The integer can also be pulled out using
Result.withDefault 0 (String.toInt "2")
You can read more about it here
According to the Elm String reference documentation, if you are extracting a number from some raw user input, you will typically want to use Result.withDefault to handle bad data in case parsing fails. You can chain this operation using pipes for cleaner code:
String.toInt "5" |> Result.withDefault 0
(In Elm 0.19, String.toInt returns a Maybe instead, so the equivalent is Maybe.withDefault 0 (String.toInt "42").)
Use map:
answer = Result.map2 (+) (String.toInt "1") (String.toInt "2")
map2:
Apply a function to two results, if both results are Ok. If not, the first argument which is an Err will propagate through.
To have the result of the addition as a string:

resultAsString r =
    case r of
        Err msg -> msg
        Ok value -> toString value

resultAsString answer

To make things easier, you can create an addStrings function:

addStrings : String -> String -> Result String Int
addStrings a b =
    Result.map2 (+) (String.toInt a) (String.toInt b)
You can even get away with the Result type altogether:
addStrings : String -> String -> String
addStrings a b =
    let
        r =
            Result.map2 (+) (String.toInt a) (String.toInt b)
    in
        case r of
            Err msg ->
                msg

            Ok value ->
                toString value
Testing
import Html exposing (Html, text)
main : Html msg
main =
    text (addStrings "1" "2")

Output: 3
The withDefault approach forces you to define a value that can be used for calculations, but it is not always possible to establish a default that is meaningful for errors. Most often you need all the possible values, and a default does not fit. Here is a result-type check function you can use to decide whether or not to use the converted value:
isErrorResult r =
    case r of
        Err msg ->
            True

        Ok value ->
            False
You can use it like this:

r = String.toInt "20b"

if isErrorResult r then
    ... -- not a valid integer, abort or whatever
else
    let a = Result.withDefault 0 r in
    ... -- a good integer, make good use of it

The default value (0 in this case) passed to withDefault is meaningless here, because we made sure that r is not an Err.
You can do this as below.
---- Elm 0.19.0 ----------------------------------------------------------------
Read <https://elm-lang.org/0.19.0/repl> to learn more: exit, help, imports, etc.
--------------------------------------------------------------------------------
> parseInt string = String.toInt string
<function> : String -> Maybe Int
> resultParseInt string = \
| Result.fromMaybe ("error parsing string: " ++ string) (parseInt string)
<function> : String -> Result String Int
> resultParseInt "12"
Ok 12 : Result String Int
> resultParseInt "12ll"
Err ("error parsing string: 12ll") : Result String Int
>
I have changed the other answers a bit, since in Elm 0.19 String.toInt returns a Maybe:
isErrorResult : String -> Bool
isErrorResult r =
    case String.toInt r of
        Nothing -> True
        Just value -> False
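For example:

isErrorResult "20b" -- True
isErrorResult "42" -- False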
So I've decided to give F# a try and ported one of the algorithms I've written in C# to it. At one point, I noticed that the debug build ran faster than the release one. I then played with the optimization settings and got these results:
The times show the total execution time of the algorithm over 100,000 runs. I am using the F# compiler that comes with Visual Studio 2010 SP1. The target platform is Any CPU.
Opt off, tail calls off: 5.81s
Opt off, tail calls on : 5.79s
Opt on , tail calls off: 6.48s
Opt on , tail calls on : 6.40s
I am really puzzled by this: why does optimization make the code run slower? The C# version of the algorithm does not exhibit this behavior (although it is implemented in a slightly different way).
Here is a stripped-down version of the F# code; it is an algorithm that finds patterns in molecules. All the code that this F# program relies on is written in F#.
namespace Motives

module Internal =

    type Motive =
        { ResidueSet : Set<Residue>
          AtomSet : Set<IAtom> }

        member this.Atoms : IAtom seq =
            seq {
                for r in this.ResidueSet do yield! r.Atoms
                yield! this.AtomSet
            }

        static member fromResidues (residues : Residue seq) =
            residues
            |> Seq.fold (fun (m : Set<Residue>) r -> m.Add(r)) Set.empty
            |> fun rs -> { ResidueSet = rs; AtomSet = Set.empty }

        static member fromAtoms (atoms : IAtom seq) =
            atoms
            |> Seq.fold (fun (m : Set<IAtom>) a -> m.Add(a)) Set.empty
            |> fun atoms -> { ResidueSet = Set.empty; AtomSet = atoms }

        static member merge (m1 : Motive) (m2 : Motive) =
            { ResidueSet = Set.union m1.ResidueSet m2.ResidueSet
              AtomSet = Set.union m1.AtomSet m2.AtomSet }

        static member distance (m1 : Motive) (m2 : Motive) =
            Seq.min (seq { for a in m1.Atoms do for b in m2.Atoms -> a.Position.DistanceTo(b.Position) })

    type Structure with
        static member fromMotive (m : Motive) (parent : IStructure) (addBonds : bool) : IStructure =
            let atoms = AtomCollection.FromUniqueAtoms(m.Atoms)
            let bonds =
                match addBonds with
                | true -> BondCollection.Create(atoms |> Seq.map (fun a -> parent.Bonds.[a]) |> Seq.concat)
                | _ -> BondCollection.Empty
            Structure.Create(parent.Id + "_" + atoms.[0].Id.ToString(), atoms, bonds)

    // KDTree used for range queries
    // AminoChains used for regex queries
    type StructureContext =
        { Structure : IStructure
          KDTree : Lazy<KDAtomTree>
          AminoChains : Lazy<(Residue array * string) list> }

        static member create (structure : IStructure) =
            match structure.IsPdbStructure() with
            | false ->
                { Structure = structure
                  KDTree = Lazy.Create(fun () -> structure.Atoms.ToKDTree())
                  AminoChains = Lazy.CreateFromValue([]) }
            | true ->
                let aminoChains =
                    new System.Func<(Residue array * string) list>(fun () ->
                        let residues = structure.PdbResidues() |> Seq.filter (fun r -> r.IsAmino)
                        residues
                        |> Seq.groupBy (fun r -> r.ChainIdentifier)
                        |> Seq.map (fun (k, rs) -> rs |> Array.ofSeq, String.concat "" (rs |> Seq.map (fun r -> r.ShortName)))
                        |> List.ofSeq)
                { Structure = structure
                  KDTree = Lazy.Create(fun () -> structure.Atoms.ToKDTree())
                  AminoChains = Lazy.Create(aminoChains) }

    // Remember the named motives from named patterns
    type MatchContext =
        { StructureContext : StructureContext
          NamedMotives : Map<string, Motive> }

        static member merge (c1 : MatchContext) (c2 : MatchContext) =
            { StructureContext = c1.StructureContext
              NamedMotives = c2.NamedMotives |> Map.fold (fun m k v -> m.Add(k, v)) c1.NamedMotives }

    type MatchedMotive = Motive * MatchContext

    type Pattern =
        | EmptyPattern
        | GeneratingPattern of (StructureContext -> MatchedMotive seq)
        | ConstraintPattern of (MatchedMotive -> MatchedMotive option) * Pattern

        static member matches (p : Pattern) (context : StructureContext) : MatchedMotive seq =
            match p with
            | GeneratingPattern generator -> generator context
            | ConstraintPattern (transform, pattern) ->
                Pattern.matches pattern context
                |> Seq.choose (fun m -> transform m)
            | _ -> Seq.empty

    let ringPattern (names : string list) =
        let fingerprint =
            names
            |> Seq.map (fun s -> ElementSymbol.Create(s).ToString())
            |> Seq.sort
            |> String.concat ""
        let generator (context : StructureContext) =
            let rings = context.Structure.Rings().GetRingsByFingerprint(fingerprint)
            rings |> Seq.map (fun r -> Motive.fromAtoms r.Atoms, { StructureContext = context; NamedMotives = Map.empty })
        GeneratingPattern generator

open Internal

type MotiveFinder (pattern : string) =
    // I am using a hard-coded pattern here for testing purposes
    let pattern = ringPattern ["C"; "C"; "C"; "C"; "C"; "O"]
    member this.Matches (structure : IStructure) =
        Pattern.matches pattern (StructureContext.create structure)
        |> Seq.map (fun (m, mc) -> Structure.fromMotive m mc.StructureContext.Structure false)
        |> List.ofSeq
        |> List.sortBy (fun s -> s.Atoms.[0].Id)
///////////////////////////////////////////////////////////////////
// performance test
let warmUp = (new MotiveFinder("")).Matches (StructureReader.ReadPdb(filename, computeBonds = true))
printfn "%i" (List.length warmUp)
let structure = StructureReader.ReadPdb(filename, computeBonds = true)
let stopWatch = System.Diagnostics.Stopwatch.StartNew()
let nruns = 100000
let result =
    seq {
        for i in 1 .. nruns do
            yield (new MotiveFinder("")).Matches structure
    } |> Seq.nth (nruns - 1)
stopWatch.Stop()
printfn "Time elapsed: %f seconds" stopWatch.Elapsed.TotalSeconds
EDIT2:
I seem to have narrowed down the problem to the implementation of the Set type.
For this code:
let stopWatch = System.Diagnostics.Stopwatch.StartNew()
let runs = 1000000
let result =
    seq {
        for i in 1 .. runs do
            let setA = [ 1 .. (i % 10) + 5 ] |> Set.ofList
            let setB = [ 1 .. (i % 10) + 5 ] |> Set.ofList
            yield Set.union setA setB
    } |> Seq.nth (runs - 1)
stopWatch.Stop()
printfn "Time elapsed: %f seconds" stopWatch.Elapsed.TotalSeconds
printfn "%A" result
I get ~7.5s with optimization off and ~8.0s with optimization on. The target is still Any CPU (and I have an i7-860 processor).
EDIT3:
And right after I posted the previous edit, I figured I should try it on lists only. So for
let stopWatch = System.Diagnostics.Stopwatch.StartNew()
let runs = 1000000
let result1 =
    seq {
        for i in 1 .. runs do
            let list = [ 1 .. i % 100 + 5 ]
            yield list
    } |> Seq.nth (runs - 1)
stopWatch.Stop()
printfn "Time elapsed: %f seconds" stopWatch.Elapsed.TotalSeconds
printfn "%A" result1
I get ~3s with opt. off and ~3.5s with opt. on.
EDIT4:
If I remove the seq builder and just do
let stopWatch = System.Diagnostics.Stopwatch.StartNew()
let runs = 1000000
let mutable ret : int list = []

for i in 1 .. runs do
    let list = [ 1 .. i % 100 + 5 ]
    ret <- list

stopWatch.Stop()
printfn "Time elapsed: %f seconds" stopWatch.Elapsed.TotalSeconds
printfn "%A" ret
I get ~3s with optimization both on and off. So it seems that the problem is somewhere in the optimization of the seq builder code.
Strangely enough, I wrote a test app in C#:
var watch = Stopwatch.StartNew();
int runs = 1000000;
var result = Enumerable.Range(1, runs)
    .Select(i => Microsoft.FSharp.Collections.ListModule.OfSeq(Enumerable.Range(1, i % 100 + 5)))
    .ElementAt(runs - 1);
watch.Stop();
Console.WriteLine(result);
Console.WriteLine("Time: {0}s", watch.Elapsed.TotalSeconds);
And the code happens to run almost twice as fast as the F# solution, at ~1.7s.
EDIT5:
Based on the discussion with Jon Harrop, I have found the thing that is causing the optimized code to run slower (I still don't know why, though).
If I change Motive.Atoms from
member this.Atoms : IAtom seq =
    seq {
        for r in this.ResidueSet do yield! r.Atoms
        yield! this.AtomSet
    }
to
member this.Atoms : IAtom seq =
    Seq.append (this.ResidueSet |> Seq.collect (fun r -> r.Atoms)) this.AtomSet
then the program runs in ~7.1s in both the optimized and the non-optimized version. That is slower than the seq version, but at least consistent.
So it seems that the F# compiler just can't optimize computation expressions and actually makes them slower by trying to do so.
I can also observe your wrapper code and penultimate example running slightly slower with optimizations enabled, but the difference is less than 10% and, although anomalous, I am not surprised that optimizations can sometimes slightly degrade performance.
I should note that your style of coding leaves a lot of room for optimization, but without the entire source code it is not possible for me to help optimize it. Your example uses the following code:
let result1 =
    seq {
        for i in 1 .. runs do
            let list = [ 1 .. i % 100 + 5 ]
            yield list
    } |> Seq.nth (runs - 1)
when this is shorter, more idiomatic and orders of magnitude faster:
let result1 =
    Seq.init runs (fun i -> List.init ((i+1) % 100 + 5) ((+) 1))
    |> Seq.nth (runs - 1)
EDIT
In your comments below you say that you want to execute the function argument, in which case I would not assume that Seq.nth will do this for you, so I would use a for loop instead:

let mutable list = []
for i = 1 to runs do
    list <- List.init (i % 100 + 5) ((+) 1)
list
This is still 9× faster than the original.