Looking at lodash and its fp facilities, I am searching for a when or ifElse equivalent.
In Ramda, one can use when as a semi-shorthand if: check a predicate on the input data, and when it's true, do something; when it's false, return the input data unchanged.
// truncate :: String -> String
var truncate = R.when(
R.propSatisfies(R.gt(R.__, 10), 'length'),
R.pipe(R.take(10), R.append('…'), R.join(''))
);
truncate('12345'); //=> '12345'
truncate('0123456789ABC'); //=> '0123456789…'
How would this be accomplished in lodash?
I don't know how to do this in lodash/fp. (One of these days, I swear I'll spend some time learning more about it!) But do note that the version as written could well be simplified.
First, keeping it in Ramda (disclaimer: I'm one of the authors), but simplifying your functions with simple ES6-style lambdas:
// truncate :: String -> String
var truncate = R.when(
s => s.length > 10,
s => s.slice(0, 10) + '…'
);
truncate('12345'); //=> '12345'
truncate('0123456789ABC'); //=> '0123456789…'
I find this version extremely readable, and might leave it at that. But you can also remove the library altogether by replacing the when with another ES6 lambda and using a conditional expression:
// truncate :: String -> String
var truncate = s => s.length > 10 ? s.slice(0, 10) + '…' : s;
Point-free is a great technique that can often add readability. But there are few reasons to use it when it obscures meaning.
In lodash/fp you can use cond for this:
const showTen = fp.pipe(
fp.slice(0, 10),
fp.join(''),
fp.add(fp.__, '...')
);
const gtThanTen = fp.pipe(
fp.result('length'),
fp.lt(10)
);
const showOnlyTen = fp.cond([
[gtThanTen, showTen],
[fp.stubTrue, fp.identity]
]);
showOnlyTen('12345678901');
Using fp-ts, I have an Option of an array
const arrayofKeys: Option<Array<K>>,
and an option of a record
const record: Option<Record<K,V>>
I want to pick the Vs of the Record where Ks intersect with the Array and stick the result in an Option.
In ramda: R.pick(arrayOfKeys, record)
How do I solve this with fp-ts or other packages within the fp-ts ecosystem?
I'd personally avoid Ramda et al as in my experience they're not very well typed. Here's a pure fp-ts approach (Str.fromNumber is from fp-ts-std, trivially replaced):
declare const arrayOfKeyNums: Option<Array<number>>
const arrayOfKeys = pipe(arrayOfKeyNums, O.map(A.map(Str.fromNumber)))
declare const record: Option<Record<string, number>>
const keyIntersectedVals: O.Option<Array<number>> = pipe(
sequenceT(O.Apply)(arrayOfKeys, record),
O.map(([ks, rec]) =>
pipe(
rec,
R.foldMapWithIndex(Str.Ord)(A.getMonoid<number>())((k, v) =>
A.elem(Str.Eq)(k)(ks) ? [v] : [],
),
),
),
)
It's a bit verbose owing to the need to pass typeclass instances around. On the plus side, the use of typeclass instances means that this can be trivially updated to support any value type, including non-primitive types with any given Eq.
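To illustrate that genericity, here's a sketch of one way to package the same computation for any value type. The name pickValues is mine, and this assumes fp-ts >= 2.11 (for fp-ts/string and the Ord-taking foldMapWithIndex overload):
import * as A from 'fp-ts/Array'
import * as O from 'fp-ts/Option'
import * as R from 'fp-ts/Record'
import * as Str from 'fp-ts/string'
import { sequenceT } from 'fp-ts/Apply'
import { pipe } from 'fp-ts/function'
// Pick the values of `rec` whose keys appear in `keys`, for any value type V.
const pickValues = <V>(
  keys: O.Option<Array<string>>,
  rec: O.Option<Record<string, V>>,
): O.Option<Array<V>> =>
  pipe(
    sequenceT(O.Apply)(keys, rec),
    O.map(([ks, r]) =>
      pipe(
        r,
        R.foldMapWithIndex(Str.Ord)(A.getMonoid<V>())((k, v) =>
          A.elem(Str.Eq)(k)(ks) ? [v] : [],
        ),
      ),
    ),
  )
// pickValues(O.some(['a', 'c']), O.some({ a: 123, b: 456, c: 789 })) //=> O.some([123, 789])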
Here's what the body might instead look like in Haskell for comparison, where typeclass instances don't need to be passed around:
keyIntersectedVals :: Maybe [Int]
keyIntersectedVals = uncurry (M.foldMapWithKey . intersectedToList) <$> sequenceT (mkeys, mmap)
where intersectedToList ks k v
| k `elem` ks = [v]
| otherwise = []
For example, given keys O.some(["a", "c"]) and a record O.some({ a: 123, b: 456, c: 789 }), we get O.some([123, 789]).
Ramda's lift lifts a function on some values to work on a container of those values. So lift (pick) will likely do what you want, so long as fp-ts's Option supports the FantasyLand Apply specification.
const {of} = folktale.maybe
const {lift, pick} = R
const keys = of (['k', 'e', 'y', 's']) // Maybe (['k', 'e', 'y', 's'])
const record = of ({s: 1, k: 2, y: 3, b: 4, l: 5, u: 6, e: 7}) // Maybe ({s: 1, k: 2, ...})
console .log (lift (pick) (keys, record) .toString())
<script src="//cdnjs.cloudflare.com/ajax/libs/ramda/0.28.0/ramda.min.js"></script>
<script src="//cdnjs.cloudflare.com/ajax/libs/folktale/2.0.0/folktale.min.js"></script>
This is a great use case for traverseArray, an optimized version of traverse. You can also use "Do notation" and apS to get a really clean, monadic pipeline. If any of these operations returns a None, the entire flow will terminate early (this is a good thing!).
Also, lookup is a very handy function similar to get from Ramda/Lodash, but it returns an Option. Both the Record and Array modules export a version of this function (see the small demo after the links below).
declare const arrayofKeys: O.Option<Array<string>>
declare const record: O.Option<Record<string, number>>
export const result: O.Option<ReadonlyArray<number>> = pipe(
O.Do,
O.apS('keys', arrayofKeys),
O.apS('rec', record),
O.chain(({ keys, rec }) =>
pipe(
keys,
O.traverseArray(key => pipe(rec, R.lookup(key)))
)
)
)
Functions used:
https://gcanti.github.io/fp-ts/modules/Option.ts.html#do
https://gcanti.github.io/fp-ts/modules/Option.ts.html#aps
https://gcanti.github.io/fp-ts/modules/Option.ts.html#chain
https://gcanti.github.io/fp-ts/modules/Option.ts.html#traversearray
https://gcanti.github.io/fp-ts/modules/Record.ts.html#lookup
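For instance, a tiny demo of the two lookup variants (the sample values here are made up for illustration):
import * as A from 'fp-ts/Array'
import * as R from 'fp-ts/Record'
import { pipe } from 'fp-ts/function'
pipe({ a: 1, b: 2 }, R.lookup('a')) //=> O.some(1)
pipe({ a: 1, b: 2 }, R.lookup('z')) //=> O.none
pipe(['x', 'y'], A.lookup(1)) //=> O.some('y')
pipe(['x', 'y'], A.lookup(5)) //=> O.none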
There is a given function that is fixed and must not be changed:
const validate = v => v === "fred" ? "Y" : undefined
Now, because I would like to be functional and would like to avoid null checks, I've decided to use Maybe (ramda-fantasy) for the validation function:
const vv = val => Maybe(val).map(v=> validate(v)).getOrElse("N")
vv should return Y if it's called with "fred", otherwise N.
vv(null) returns N -> OK
vv("fred") returns Y -> OK
vv("ding") returns undefined -> wrong, expected N
The problem is that Maybe.map always returns a Just, which I do not understand (because I'm just learning this). It would be beneficial for me if this function behaved similarly to Maybe(val), which returns either Nothing or Just.
I have two questions:
Why Maybe.map does not handle null/undefined?
How to rewrite vv that it would return expected values in all three cases?
EDIT: I would like to explain why validate should not be changed: it's just a simple example of a function coming from an external library. I wanted to see how easy or hard it is to integrate such libraries into functional programming. So it's not about string operations, just about streaming values that at some point evaluate to null.
EDIT2:
This solves my problem:
Either.ofNullable = Either.prototype.ofNullable = function (value) {
return value == null ? Either.Left("is null") : Either.Right(value);
};
EDIT3:
I've implemented my own Either with missing functionality: https://github.com/maciejmiklas/functional-ts/blob/main/src/either.ts
Note: Ramda Fantasy is no longer maintained. The team recommends that you use other implementations of these concepts.
But we can still answer this question, as it's likely to be true of any reasonable Maybe implementation.
Basically, that's not how Maybe is designed to work. The idea is that you can have a Just holding absolutely any value. That includes the values null and undefined.
Ramda Fantasy added a convenience constructor, Maybe (val), which turns into Just (val) if val is not a nil value, and into Nothing () if it is. But that doesn't mean that you cannot create a Just (null). The main construction technique is to use the static Maybe .of. And you can note that
Maybe (null) //=> Nothing ()
Maybe.of (null) //=> Just (null)
So we're probably not going to make that technique work: we can't simply map the existing validate over our Maybe and expect a Nothing for bad input. We could, however, work with a version like this:
const validate = v => v === "fred" ? Just ("Y") : Nothing ()
Here, we still have one problem. Maybe ('fred') .map (validate) yields Just (Just ('Y')). We have extra nesting. But this is exactly what chain is for. It removes such an additional level, so that Maybe ('fred') .chain (validate) yields Just ('Y'). We can then use it like this:
const validate = v => v === "fred" ? Just ("Y") : Nothing ()
const vv = val => Maybe (val) .chain (validate) .getOrElse ('N')
console .log ([
vv (null), // 'N'
vv ('fred'), // 'Y'
vv ('ding') // 'N'
])
I want to iterate a collection of items from a specific position.
Let's say we want to start from the center and iterate the whole right half of the array:
int startFrom = arr.length / 2;
for (int i = startFrom; i < arr.length; i++)
{
String.format("Index %d value %s", i, arr[i]);
}
It's important to track real indexes and values during iteration. As an example, imagine you are implementing an in-place sorting algorithm.
I've tried to do so using drop().withIndex(), but it looks like drop() creates a new collection and I lose information about the real indexes.
It could be fixed manually if we create a variable and calculate the proper index:
val startFrom = inputData.size / 2
for ((i, item) in inputData.drop(startFrom).withIndex()){
val fixedIndex = i + startFrom
println("Index $i, fixed index $fixedIndex value $item")
}
This solution works, but I was hoping there is something that can help avoid introducing a separate fixedIndex variable and handling this problem manually.
Your original attempt is very close; just a small change makes it work. Reverse the calls of withIndex() and drop(startFrom), putting withIndex() first.
If you do not want to copy the collection, you can convert it to a sequence first using asSequence().
for ((index, item) in inputData.asSequence().withIndex().drop(startFrom)) { ... }
The test code:
val sampleData = listOf("a", "b", "c", "d", "e", "f")
val startFrom = sampleData.size / 2
for ((index, item) in sampleData.asSequence().withIndex().drop(startFrom)) {
println("[$index] => $item")
}
outputs:
[3] => d
[4] => e
[5] => f
That's it! The rest of this answer just provides you with alternatives, including a more efficient and Kotlinesque solution of creating your own extension function at the end.
If a copy of the collection is acceptable, you can use the following shorter version. The withIndex() does not cause a copy, but the drop(startFrom) does.
for ((index, item) in inputData.withIndex().drop(startFrom)) { ... }
Either the eager copy or the sequence could be faster; it depends on the size of the collection, your runtime environment, and the CPU cache.
You can also use functional forEach instead of the for loop.
sampleData.asSequence().withIndex().drop(startFrom).forEach { (index, item) ->
println("[$index] => $item")
}
That brings up the best and most efficient option: just write an extension function for Array or List, so there is no lazy evaluation through wrapper classes and no copying; simply a loop that calls your lambda with the index and value. Here are two new extensions that add a variation of forEachIndexed:
inline fun <T> Array<T>.forEachIndexed(startFrom: Int,
action: (index: Int, item: T)->Unit) {
for (i in startFrom until this.size) {
action(i, this[i])
}
}
inline fun <T> List<T>.forEachIndexed(startFrom: Int,
action: (index: Int, item: T)->Unit) {
for (i in startFrom until this.size) {
action(i, this[i])
}
}
And this can be called simply for any non-primitive array or list:
sampleData.forEachIndexed(startFrom) { index, item ->
println("[$index] => $item")
}
You could do the same if you want a withIndex(startFrom) style method as well. You can always extend Kotlin to get what you want!
If this is what you're missing, the simplest solution in my opinion is to just use a ranged for loop:
val startFrom = arr.size / 2
for (i in startFrom until arr.size) {
    println(String.format("Index %d value %s", i, arr[i]))
}
If you prefer strictly avoiding expressions like arr[i], then you can change your current solution to use a sequence instead.
Scala's List class has indexWhere methods, which return a single index for a List element that matches the supplied predicate (or -1 if none exists).
I recently wanted to gather all indices in a List which matched a given predicate, and found myself writing an expression like:
list.zipWithIndex.filter({case (elem, _) => p(elem)}).map({case (_, index) => index})
where p here is some predicate function for selecting matching elements. This seems a bit of an unwieldy expression for such a simple requirement (but I may be missing a trick or two).
I was half expecting to find an indicesWhere function on List which would allow me to write instead:
list.indicesWhere(p)
Should something like this be part of Scala's List API, or is there a much simpler expression than what I've shown above for doing the same thing?
Well, here's a shorter expression that removes some of the syntactic noise you have in yours (modified to use Travis's suggestion):
list.zipWithIndex.collect { case (x, i) if p(x) => i }
Or alternatively:
for ((x,i) <- list.zipWithIndex if p(x)) yield i
But if you use this frequently, you should just add it as an implicit method:
class EnrichedWithIndicesWhere[T, CC[X] <: Seq[X]](xs: CC[T]) {
def indicesWhere(p: T => Boolean)(implicit bf: CanBuildFrom[CC[T], Int, CC[Int]]): CC[Int] = {
val b = bf()
for ((x, i) <- xs.zipWithIndex if p(x)) b += i
b.result
}
}
implicit def enrichWithIndicesWhere[T, CC[X] <: Seq[X]](xs: CC[T]) = new EnrichedWithIndicesWhere(xs)
val list = List(1, 2, 3, 4, 5)
def p(i: Int) = i % 2 == 1
list.indicesWhere(p) // List(0, 2, 4)
You could use unzip to replace the map:
list.zipWithIndex.filter({case (elem, _) => p(elem)}).unzip._2
I'm writing a function to find triangle numbers and the natural way to write it is recursively:
function triangle (x)
if x == 0 then return 0 end
return x+triangle(x-1)
end
But attempting to calculate the first 100,000 triangle numbers fails with a stack overflow after a while. This is an ideal function to memoize, but I want a solution that will memoize any function I pass to it.
Mathematica has a particularly slick way to do memoization, relying on the fact that hashes and function calls use the same syntax:
triangle[0] = 0;
triangle[x_] := triangle[x] = x + triangle[x-1]
That's it. It works because the rules for pattern-matching function calls are such that it always uses a more specific definition before a more general definition.
Of course, as has been pointed out, this example has a closed-form solution: triangle[x_] := x*(x+1)/2. Fibonacci numbers are the classic example of how adding memoization gives a drastic speedup:
fib[0] = 1;
fib[1] = 1;
fib[n_] := fib[n] = fib[n-1] + fib[n-2]
Although that too has a closed-form equivalent, albeit messier: http://mathworld.wolfram.com/FibonacciNumber.html
I disagree with the person who suggested this was inappropriate for memoization because you could "just use a loop". The point of memoization is that any repeat function calls are O(1) time. That's a lot better than O(n). In fact, you could even concoct a scenario where the memoized implementation has better performance than the closed-form implementation!
You're also asking the wrong question for your original problem ;)
This is a better way for that case:
triangle(n) = n * (n + 1) / 2
Furthermore, supposing the formula didn't have such a neat solution, memoisation would still be a poor approach here. You'd be better off just writing a simple loop in this case. See this answer for a fuller discussion.
I bet something like this should work with variable argument lists in Lua:
local function varg_tostring(...)
local s = select(1, ...)
for n = 2, select('#', ...) do
s = s..","..select(n,...)
end
return s
end
local function memoize(f)
local cache = {}
return function (...)
local al = varg_tostring(...)
if cache[al] then
return cache[al]
else
local y = f(...)
cache[al] = y
return y
end
end
end
You could probably also do something clever with metatables and __tostring so that the argument list could just be converted with tostring(). Oh, the possibilities.
In C# 3.0, for recursive functions, you can do something like this:
public static class Helpers
{
public static Func<A, R> Memoize<A, R>(this Func<A, Func<A,R>, R> f)
{
var map = new Dictionary<A, R>();
Func<A, R> self = null;
self = (a) =>
{
R value;
if (map.TryGetValue(a, out value))
return value;
value = f(a, self);
map.Add(a, value);
return value;
};
return self;
}
}
Then you can create a memoized Fibonacci function like this:
var memoized_fib = Helpers.Memoize<int, int>((n,fib) => n > 1 ? fib(n - 1) + fib(n - 2) : n);
Console.WriteLine(memoized_fib(40));
In Scala (untested):
def memoize[A, B](f: (A)=>B) = {
var cache = Map[A, B]()
{ x: A =>
if (cache contains x) cache(x) else {
val back = f(x)
cache += (x -> back)
back
}
}
}
Note that this only works for functions of arity 1, but with currying you could make it work. The more subtle problem is that memoize(f) != memoize(f) for any function f. One very sneaky way to fix this would be something like the following:
val correctMem = memoize(memoize _)
I don't think that this will compile, but it does illustrate the idea.
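As a rough sketch of that currying idea (written in TypeScript rather than Scala, purely for illustration; memoize below has the same arity-1 shape as the Scala version):
// Memoize a unary function.
const memoize = <A, B>(f: (a: A) => B): ((a: A) => B) => {
  const cache = new Map<A, B>()
  return a => {
    if (!cache.has(a)) cache.set(a, f(a))
    return cache.get(a)!
  }
}
// A two-argument function expressed in curried form: memoize the outer
// function, and memoize each inner function it returns.
const add = (x: number) => (y: number) => x + y
const memoAdd = memoize((x: number) => memoize(add(x)))
memoAdd(2)(3) //=> 5, with both levels cached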
Update: Commenters have pointed out that memoization is a good way to optimize recursion. Admittedly, I hadn't considered this before, since I generally work in a language (C#) where generalized memoization isn't so trivial to build. Take the post below with that grain of salt in mind.
I think Luke likely has the most appropriate solution to this problem, but memoization is not generally the solution to any issue of stack overflow.
Stack overflow usually is caused by recursion going deeper than the platform can handle. Languages sometimes support "tail recursion", which re-uses the context of the current call, rather than creating a new context for the recursive call. But a lot of mainstream languages/platforms don't support this. C# has no inherent support for tail-recursion, for example. The 64-bit version of the .NET JITter can apply it as an optimization at the IL level, which is all but useless if you need to support 32-bit platforms.
If your language doesn't support tail recursion, your best option for avoiding stack overflows is either to convert to an explicit loop (much less elegant, but sometimes necessary), or to find a non-recursive solution such as the closed-form formula Luke provided for this problem.
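For the triangle-number example, the explicit-loop conversion is about as small as it gets; a sketch in TypeScript, purely for illustration:
// Iterative triangle numbers: constant stack depth, no memo table needed.
function triangleIterative(n: number): number {
  let sum = 0
  for (let i = 1; i <= n; i++) {
    sum += i
  }
  return sum
}
// triangleIterative(100000) //=> 5000050000, well within Number's safe integer range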
function memoize (f)
local cache = {}
return function (x)
if cache[x] then
return cache[x]
else
local y = f(x)
cache[x] = y
return y
end
end
end
triangle = memoize(triangle);
Note that to avoid a stack overflow, triangle would still need to be seeded, i.e., called on smaller values first so the cache is populated before any single call recurses too deeply.
Here's something that works without converting the arguments to strings.
The only caveat is that it can't handle a nil argument. But the accepted solution can't distinguish the value nil from the string "nil", so that's probably OK.
local function m(f)
local t = { }
local function mf(x, ...) -- memoized f
assert(x ~= nil, 'nil passed to memoized function')
if select('#', ...) > 0 then
t[x] = t[x] or m(function(...) return f(x, ...) end)
return t[x](...)
else
t[x] = t[x] or f(x)
assert(t[x] ~= nil, 'memoized function returns nil')
return t[x]
end
end
return mf
end
I've been inspired by this question to implement (yet another) flexible memoize function in Lua.
https://github.com/kikito/memoize.lua
Main advantages:
Accepts a variable number of arguments
Doesn't use tostring; instead, it organizes the cache in a tree structure, using the parameters to traverse it.
Works just fine with functions that return multiple values.
Pasting the code here for reference:
local globalCache = {}
local function getFromCache(cache, args)
local node = cache
for i=1, #args do
if not node.children then return {} end
node = node.children[args[i]]
if not node then return {} end
end
return node.results
end
local function insertInCache(cache, args, results)
local arg
local node = cache
for i=1, #args do
arg = args[i]
node.children = node.children or {}
node.children[arg] = node.children[arg] or {}
node = node.children[arg]
end
node.results = results
end
-- public function
local function memoize(f)
globalCache[f] = { results = {} }
return function (...)
local results = getFromCache( globalCache[f], {...} )
if #results == 0 then
results = { f(...) }
insertInCache(globalCache[f], {...}, results)
end
return unpack(results)
end
end
return memoize
Here is a generic C# 3.0 implementation, if it can help:
public static class Memoization
{
public static Func<T, TResult> Memoize<T, TResult>(this Func<T, TResult> function)
{
var cache = new Dictionary<T, TResult>();
var nullCache = default(TResult);
var isNullCacheSet = false;
return parameter =>
{
TResult value;
if (parameter == null && isNullCacheSet)
{
return nullCache;
}
if (parameter == null)
{
nullCache = function(parameter);
isNullCacheSet = true;
return nullCache;
}
if (cache.TryGetValue(parameter, out value))
{
return value;
}
value = function(parameter);
cache.Add(parameter, value);
return value;
};
}
}
(Quoted from a French blog article)
In the vein of posting memoization in different languages, I'd like to respond to #onebyone.livejournal.com with a non-language-changing C++ example.
First, a memoizer for single arg functions:
#include <map>
#include <utility>
template <class Result, class Arg, class ResultStore = std::map<Arg, Result> >
class memoizer1{
public:
template <class F>
const Result& operator()(F f, const Arg& a){
typename ResultStore::const_iterator it = memo_.find(a);
if(it == memo_.end()) {
it = memo_.insert(std::make_pair(a, f(a))).first;
}
return it->second;
}
private:
ResultStore memo_;
};
Just create an instance of the memoizer and feed it your function and argument. Make sure not to share the same memo between two different functions (but you can share it between different implementations of the same function).
Next, a driver function, and an implementation. Only the driver function needs to be public:
int fib(int); // driver
int fib_(int); // implementation
Implemented:
int total_ops = 0; // call counter, used to verify how many times fib_ actually runs
int fib_(int n){
++total_ops;
if(n == 0 || n == 1)
return 1;
else
return fib(n-1) + fib(n-2);
}
And the driver, to memoize:
int fib(int n) {
static memoizer1<int,int> memo;
return memo(fib_, n);
}
Permalink showing output on codepad.org. Number of calls is measured to verify correctness. (insert unit test here...)
This only memoizes single-input functions. Generalizing to multiple or varying arguments is left as an exercise for the reader.
In Perl, generic memoization is easy to get. The Memoize module is part of the Perl core and is highly reliable, flexible, and easy to use.
The example from its manpage:
# This is the documentation for Memoize 1.01
use Memoize;
memoize('slow_function');
slow_function(arguments); # Is faster than it was before
You can add, remove, and customize memoization of functions at run time! You can provide callbacks for custom memento computation.
Memoize.pm even has facilities for making the memento cache persistent, so it does not need to be re-filled on each invocation of your program!
Here's the documentation: http://perldoc.perl.org/5.8.8/Memoize.html
Extending the idea, it's also possible to memoize functions with two input parameters:
function memoize2 (f)
local cache = {}
return function (x, y)
if cache[x..','..y] then
return cache[x..','..y]
else
local z = f(x,y)
cache[x..','..y] = z
return z
end
end
end
Notice that parameter order matters in the caching algorithm, so if parameter order doesn't matter in the function being memoized, the odds of getting a cache hit can be increased by sorting the parameters before checking the cache.
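A sketch of that order-normalising idea (in TypeScript rather than Lua, purely for illustration; memoizeCommutative is a made-up name):
// For a commutative function, sort the two arguments before building the
// cache key so f(a, b) and f(b, a) share one cache entry.
function memoizeCommutative(f: (x: number, y: number) => number) {
  const cache = new Map<string, number>()
  return (x: number, y: number): number => {
    const [a, b] = x <= y ? [x, y] : [y, x]
    const key = a + ',' + b
    if (!cache.has(key)) cache.set(key, f(a, b))
    return cache.get(key)!
  }
}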
But it's important to note that some functions can't be profitably memoized. I wrote memoize2 to see if the recursive Euclidean algorithm for finding the greatest common divisor could be sped up.
function gcd (a, b)
if b == 0 then return a end
return gcd(b, a%b)
end
As it turns out, gcd doesn't respond well to memoization. The calculation it does is far less expensive than the caching algorithm. Even for large numbers, it terminates fairly quickly. After a while, the cache grows very large. This algorithm is probably as fast as it can be.
Recursion isn't necessary. The nth triangle number is n(n+1)/2, so...
public int triangle(final int n){
return n * (n + 1) / 2;
}
Please don't recurse this. Either use the x*(x+1)/2 formula or simply iterate the values and memoize as you go.
int[] memo = new int[n+1];
int sum = 0;
for(int i = 0; i <= n; ++i)
{
sum+=i;
memo[i] = sum;
}
return memo[n];