ferrum's Introduction

Ferrum

Features from the Rust language in JavaScript: Provides Traits & an advanced library for working with sequences/iterators in JS.

Github
API Documentation


Usage & Features

$ npm add ferrum

Hashing & Hash Tables

Ferrum features an extensible, reliable infrastructure for object hashing including an implementation of HashMap and HashSet.

It supports user-defined hash functions (e.g. blake2 instead of xxhash). Support for all of the standard types is provided out of the box, and support for user-defined or third-party types can be added via the trait infrastructure.

You could even integrate the object-hash package to add support for hashing arbitrary third party types! See "Sophisticated hasher integrating object hash" in the hasher trait documentation.

const assert = require('assert');
const { HashMap } = require('ferrum');

const m = new Map([[{}, 42], [7, "seven"]]);
assert.strictEqual(m.get(7), "seven");
assert.strictEqual(m.get({}), undefined); // Identity based lookup

const hm = new HashMap([[{}, 42], [7, "seven"]]);
assert.strictEqual(hm.get(7), "seven");
assert.strictEqual(hm.get({}), 42); // Content based lookup
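
HashSet works the same way for set membership. A minimal sketch (assuming HashSet mirrors the built-in Set interface, i.e. provides has()):

const assert = require('assert');
const { HashSet } = require('ferrum');

const s = new Set([[1, 2]]);
assert.strictEqual(s.has([1, 2]), false); // Identity based membership

const hs = new HashSet([[1, 2]]);
assert.strictEqual(hs.has([1, 2]), true); // Content based membership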

Testing of Examples

Have you ever found out that some of the examples in your API documentation or README contained bugs? You can now use the Ferrum Doctest companion package to run your examples as part of your regular test harness!

Sequence/Iterators

Feature                 Ferrum   Underscore   Lodash   wu.js
Objects as Sequences    yes      no           no       no
Reverse Currying        yes      no           no       no
Lazy Evaluation         yes      no           no       yes
Pipelining              yes      no           no       no

Ferrum provides a library for transforming lists & iterables, with all the functions you would expect, like map, filter, foldl and many others. In this regard it is very similar to libraries like wu.js, lodash or underscore. Ferrum has been written to remedy some of the issues in those libraries.

Objects as Sequences

Ferrum/Sequence has been designed with full iterator support in mind. Generally, all functions can take iterables/iterators as input and return iterators.

const {map, assertSequenceEquals} = require('ferrum');

const a = map([1,2,3,4], x => x+2); // a is an iterator
const b = map(a, x => x*2); // b is also an iterator
assertSequenceEquals(b, [6, 8, 10, 12]);

In addition to supporting iterables & iterators, Ferrum/Sequence can take objects as input:

const {map, iter, assertEquals, assertSequenceEquals} = require('ferrum');

const a = map({a: 42, b: 43}, ([k, v]) => v+2); // a is an iterator
const b = map(a, x => x*2); // b is also an iterator
assertSequenceEquals(b, [88, 90]);

const obj = {foo: 23, bar: 24};
const log = [];
for (const [key, value] of iter(obj)) {
  log.push(`${key} | ${value}`);
}

assertEquals(log, [
  'foo | 23',
  'bar | 24',
]);

Ferrum/Sequence uses lodash.isPlainObject and always prefers the iterator protocol, so plain-object iteration is only used when it really should be:

const {map, assertSequenceEquals} = require('ferrum');

const obj = {};
obj[Symbol.iterator] = function*() {
  yield 2;
  yield 3;
};

assertSequenceEquals(map(obj, x => x*2), [4, 6]);

Lodash and Underscore only support arrays as input & output; wu.js supports iterators as input & output but has no support for plain objects.

Reverse Currying

Ferrum/Sequence provides many higher order functions. These are functions that take other functions as parameters, like map() or filter().

const {map, filter, assertSequenceEquals} = require('ferrum');

// Map is used to change each value in a list/iterable
assertSequenceEquals(map([1,2,3,4], x => x*2), [2,4,6,8]);

// Filter removes elements in a list/iterable
assertSequenceEquals(filter([1,2,3,4], x => x%2 === 0), [2, 4]);

Sometimes it can be useful to create an intermediate function with just a few arguments instead of calling the function right away:

const { map, plus, list, assertSequenceEquals } = require('ferrum');

const myList = [
  [1,2,3],
  [4,5,6],
  [7,8,9]
];

// Add 2 to each number in a two dimensional list
// This example uses currying twice: in the `plus(2)`
// and in the `map()`
const a = map(myList, map(plus(2)));
assertSequenceEquals(map(a, list), [
  [3,4,5],
  [6,7,8],
  [9,10,11]
]);

// This is what the code would look like without currying:
// A lot less convenient and harder to read
const b = map(myList, (sublist) => map(sublist, (x) => plus(x, 2)));
assertSequenceEquals(map(b, list), [
  [3,4,5],
  [6,7,8],
  [9,10,11]
]);

You may have noticed that when currying is used, the arguments are given in reverse order; this is why we call it reverse currying. We decided to do currying this way because there should never be extra arguments after the function (otherwise you end up with dangling arguments multiple lines below), and because the function is usually the first parameter you want to supply when currying:

// This is not very handy because you might need to scroll down to find the last
// argument; you will also need to scroll down to determine whether the call to
// each is using currying
each(() => {
  ...
}, [1,2,3]);

// This is much more handy
each([1,2,3], () => {
  ...
});

Underscore.js does not support currying at all. Lodash provides curried variants of its functions in a separate module (lodash/fp); that is not very handy either, because it is often useful to mix curried and non-curried invocations, and the curried variant makes the function the first parameter, delivering good support for currying but not so good support for normal function invocation.
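
To make the difference concrete, here is a small sketch showing that the same Ferrum function supports both invocation styles (normal call and reverse-curried partial application):

const { map, plus, assertSequenceEquals } = require('ferrum');

// Normal invocation: the data comes first, the function last
assertSequenceEquals(map([1, 2, 3], plus(1)), [2, 3, 4]);

// Curried invocation: supply the function now, the data later
const incrementAll = map(plus(1));
assertSequenceEquals(incrementAll([1, 2, 3]), [2, 3, 4]);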

Pipelining

Ferrum provides a function called pipe() which – together with currying – can be used to build complex data processing pipelines. Pipelines are conceptually the same as the highly successful pipes in bash; a similar feature is being proposed for JavaScript itself in the form of the |> operator.

const { sqrt, floor } = Math;
const {
  pipe, filter, uniq, map, mul, mapSort, identity, take,
  prepend, takeWhile, all, range, assertSequenceEquals,
  extend, plus, any,
} = require('ferrum');

const a = pipe(
  [5,1,6,7,10,11,1,3,4],
  filter(x => x%2 === 1), // Get rid of even numbers
  uniq,                   // Get rid of duplicates
  map(mul(3)),            // Multiply each element by three
  mapSort(identity));     // Sort all numbers
assertSequenceEquals(a, [3,9,15,21,33]);

// Very simple primality test
const isPrime = (v) => v > 1 && pipe(
  range(2, floor(sqrt(v)) + 1),
  map(x => v % x !== 0), // check that v is not divisible by x
  all);

// Sequence of all prime numbers (calculated slowly)
const primes = () => pipe(
  range(0, Infinity),
  filter(isPrime));

assertSequenceEquals(take(primes(), 5), [2, 3, 5, 7, 11]);

Learning to write algorithms in this way is not always easy, but it can be very rewarding, as the pipe form is often a lot more readable. To illustrate this, let's take a look at how the prime sequence example changes as we take away features from Ferrum; let's first take away currying and the pipe() function itself:

const { sqrt, floor } = Math;
const { all, map, takeWhile, filter, range, assertSequenceEquals, take, extend, plus } = require('ferrum');

const isPrime = (v) => v > 1 && all(map(range(2, floor(sqrt(v))+1), x => v % x !== 0));
const primes = () => filter(range(0, Infinity), isPrime);

assertSequenceEquals(take(primes(), 5), [2, 3, 5, 7, 11]);

One way to work around the lack of currying and pipe() is to just put all our filter stages into one expression. Our code has become much shorter, but also much harder to read. Look at how the data flow jumps around, see how distant the map function and its argument are from each other, and note that the subexpressions cannot be properly documented any more. Let's try another way to write down these functions:

const { sqrt, floor } = Math;
const { assertSequenceEquals, all, map, takeWhile, filter, range, take } = require('ferrum');

const integers = () => range(1, Infinity);
const isPrime = (v) => {
  const candidates = range(2, floor(sqrt(v)) + 1);
  const tests = map(candidates, x => v % x !== 0);
  return v > 1 && all(tests);
};
const primes = () => filter(integers(), isPrime);

assertSequenceEquals(take(primes(), 5), [2, 3, 5, 7, 11]);

This is much better! The data flow is more clear and substeps can be documented again. In this version we used temporary variables to get around not having pipe() and currying; this is much better than just putting everything into one line.

Note how integers became its own function while candidates and tests became just local variables. Also note how all and map are still in the same expression; sometimes this is the more readable variant. We have to decide each time.

This variant still has disadvantages though; first of all, the code still looks more cluttered and the data flow still jumps around more than in the pipe variant. You also have to come up with a lot of names for temporary variables and take care not to reuse them (since they are lazily evaluated iterators, they must only be used once). This is one of the things you communicate by using pipe() over local variables: "This variable will never be used again" – knowing this & limiting the number of variables in a scope can be very useful, especially in large functions.
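
Here is a small sketch of that single-use pitfall:

const { map, list, assertSequenceEquals } = require('ferrum');

const doubled = map([1, 2, 3], (x) => x * 2);   // a lazy, single-use iterator
assertSequenceEquals(list(doubled), [2, 4, 6]); // the first use consumes it
assertSequenceEquals(doubled, []);              // the second use is silently empty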

Finally, let's implement this in classic imperative style:

const { sqrt } = Math;

const isPrime = (v) => {
  if (v < 2) {
    return false;
  }

  // Check all candidate divisors from 2 up to sqrt(v)
  for (let i = 2; i <= sqrt(v); i++) {
    if (v % i === 0) {
      return false;
    }
  }

  return true;
}

const primes = function *primes() {
  for (let i=0; true; i++) {
    if (isPrime(i)) {
      yield i;
    }
  }
}

The first thing you notice about this version is that it is more than twice as long as our variant using pipe() (not counting comment lines); it also uses two levels of nesting, while our pipelined version uses just one. The imperative version contains two for loops and three if statements, and for loops are notoriously hard to read. Finally, the imperative version forces us to think in imperative terms – to consider what happens in each step of the for loop one by one and only then come to the conclusion: ah, this for loop just gets rid of all those integers that are not prime. In the imperative version this intention must be deduced; in the pipelined version it is plain to see.

To sum it up, using pipe() and the curried functions from Ferrum has a number of advantages: you end up with fewer levels of nesting and can avoid a lot of branching (if statements) and hard-to-read for loops; pipelining lets you break your problem apart into multiple clearly defined transformation steps with obvious data flow and obvious intention.

Underscore, lodash and wu.js all allow you to do something similar with chaining, which works quite well. They do require a bit more boilerplate, since values need to be wrapped before chaining and unwrapped after chaining has finished. Pipelining will involve even less boilerplate once the |> operator becomes available, and pipelining can be used with arbitrary transformation functions, while chaining can only be used with functions supported by the library; thus pipelining is much more generic & extensible.
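
For illustration, here is a small sketch (toCelsius is a made-up helper, not part of Ferrum) showing that any plain function can slot straight into a pipeline:

const { pipe, map, assertSequenceEquals } = require('ferrum');
const { round } = Math;

// An ordinary, hand-written transformation – no wrapper type required
const toCelsius = (temps) => map(temps, (f) => (f - 32) * 5 / 9);

assertSequenceEquals(
  pipe([32, 212, 98.6], toCelsius, map(round)),
  [0, 100, 37]);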

Lazy Evaluation

Like Python iterators, sequences support lazy evaluation. They support it, because lazy evaluation is a core feature of JavaScript ES6 iterators.

This means that the values in iterators/sequences are only evaluated once they are needed:

const { map, plus, list, assertSequenceEquals } = require('ferrum');
const a = map([1,2,3], plus(2)); // At this point, no calculations have been performed
const b = list(a); // This will actually cause the values of the `a` iterator to be calculated
assertSequenceEquals(a, []); // `a` is now exhausted and can no longer be used
assertSequenceEquals(b, [3,4,5]); // `b` can be used as often as we want
assertSequenceEquals(b, [3,4,5]);

Try the above example with a couple of console.log statements and see what happens.
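
For instance, a sketch along these lines makes the evaluation order visible:

const { map, list } = require('ferrum');

const a = map([1, 2, 3], (x) => {
  console.log('computing', x); // nothing is printed yet…
  return x + 2;
});
console.log('no value has been computed so far');
list(a); // …only now "computing 1/2/3" appears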

The practical upshot of this property is that it becomes possible to work with infinite sequences, like the primes() sequence above. It can be more efficient as well, since values that are not needed are not computed.

const {take, list, assertSequenceEquals} = require('ferrum');

function* fibonacci() {
  let a=0, b=1;
  while (true) {
    yield a;
    yield b;
    a += b;
    b += a;
  }
}

// Even though fibonacci() is infinite, this just works because only the
// first five fibonacci numbers are actually generated
// Note that just list(fibonacci()) would crash the program since that would
// require infinite memory and infinite time
assertSequenceEquals(take(fibonacci(), 5), [0, 1, 1, 2, 3]);

Underscore and lodash use arrays instead of iterators, so they have no lazy evaluation support. wu.js uses iterators and thus has full lazy evaluation support.

Traits

Traits are the second big feature this library provides; the concept is borrowed from the Rust language. Traits let you declare & document a generic interface. Like the sequence concept above, they are not an entirely new idea: while Ferrum/Sequence is a library designed to make working with the JavaScript iteration protocols easier, traits standardize the creation of such JavaScript protocols themselves, thereby reducing boilerplate. Indeed, the Sequence trait is just a wrapper over the Iterable protocol of JavaScript.

const {Trait} = require('ferrum');

// Declaring a trait
/**
 * The Size trait is used for any containers that have a known size, like
 * arrays, strings, Maps…
 * Size should be used only for sizes that are relatively quick to compute, O(1) optimally…
 * @interface
 */
const Size = new Trait('Size');

// Using it
const size = (what) => Size.invoke(what);
const empty = (what) => size(what) === 0;

// Providing implementations for own types; this implementation will be
// inherited by subclasses
class MyType {
  [Size.sym]() {
    return 42;
  }
}

// Providing implementations for third party types. These won't be inherited
// by subclasses
Size.impl(Array, (x) => x.length); // Method of type Array
Size.impl(String, (x) => x.length);
Size.impl(Map, (x) => x.size);
Size.impl(Set, (x) => x.size);

// This implementation just applies to plain objects.
Size.impl(Object, (x) => {
  let cnt = 0;
  for (const _ in x) cnt++;
  return cnt;
});

// Note: The two following examples would be a bad idea in reality,
// they are just here to show the mechanism
Size.implStatic(null, (_) => 0);      // Static implementation (for a specific value, not a type)
Size.implStatic(undefined, (_) => 0); // implStatic also works for undefined
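
A short usage sketch, building on the declarations above (size() and empty() are the helpers defined at the top of the block):

const assert = require('assert');

assert.strictEqual(size([1, 2, 3]), 3);     // via Size.impl(Array, …)
assert.strictEqual(size('hello'), 5);       // via Size.impl(String, …)
assert.strictEqual(size(new MyType()), 42); // via the [Size.sym]() method
assert.strictEqual(size(null), 0);          // via Size.implStatic(null, …)
assert.strictEqual(empty(new Map()), true); // via the empty() helper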

Some of the advantages of using traits are illustrated by the code above. First of all, using traits saves us a bit of boilerplate code; by having an actual variable representing the trait, we have a good place to document it – the @interface jsdoc feature can be used for this. We can also use this documentation to specify laws that implementations of the trait should abide by, like the (soft) law that Size implementations should be quick to compute.

Trait also features machinery to implement traits for third-party types and even built-in types like Array, Object, null or undefined. The classic way of implementing protocols does not work in these cases:

const Size = Symbol('Size');

// Using just Symbols works perfectly for your own types
class MyType {
  [Size]() {
    return 42;
  }
}

// Using symbols for third party types is suboptimal,
// since we have to modify the type's prototype, which
// could lead to weirdness
// (a regular function is needed here so that `this` refers to the instance)
Array.prototype[Size] = function () { return this.length; };

// Using symbols for Object is a very bad idea, as the implementation
// will be inherited by all other classes…this implementation obviously
// is the wrong one for Set, for instance.
// This also illustrates why it is generally a bad idea to enable inheritance
// for third party types
Object.prototype[Size] = function () {
  let cnt = 0;
  for (const _ in this) cnt++;
  return cnt;
};

// Using symbols on values like null or undefined will just lead to a TypeError
// being thrown

//null[Size] = () => 0; // throws TypeError

The oldest pre-ES6 implementation just used method names; this strategy is very problematic, since two different interfaces may use the same method name:

class MyDbTable {
  size() {
    return request(`https://mydb.com/${this._tableName}/size`);
  }
};

class MyLocalTable {
  size() {
    return this._payload.length;
  }
}

In the hypothetical example above, one size() method returns an integer, while the other returns a promise resolving to an integer (which makes total sense, since it's the size of some database table). Even though each method makes sense on its own, there is no way to distinguish between them; a developer may write a function expecting an integer…

Since the method name size() is already taken, we cannot even implement the async size interface for MyLocalTable.

Using traits we can actually encapsulate this relationship well:

// dbtable.js
const { Trait, Size: SyncSize } = require('ferrum'); // rename Ferrum's Size trait on the fly

// A new, asynchronous Size trait
const Size = new Trait('Size');

class MyDbTable {
  [Size.sym]() { // async: resolves to the size
    return request(`https://mydb.com/${this._tableName}/size`);
  }
};

class MyLocalTable {
  [SyncSize.sym]() { // sync: the size is known locally
    return this._payload.length;
  }
}

// Generic adapter: every type implementing the synchronous Size (SyncSize)
// automatically implements the asynchronous Size as well
Size.implDerived([SyncSize], ([size], v) => Promise.resolve(size(v)));

The example above illustrates how – using traits – we can not only deal with name collisions by simply renaming traits on the fly (importing Size as SyncSize), but also write a generic adapter that automatically implements the asynchronous Size trait for every type that supports the synchronous one.

To sum up, using traits provides a number of advantages: traits let you avoid some boilerplate code; they allow you to specify and implement generic interfaces without the danger of name collisions; they let you provide implementations for third-party types, built-in types and even null, undefined and plain Object without modifying those types; and they even let you write generic adapters, implementing a trait for whole groups of other traits at once.

Operators as functions

Ferrum/Ops provides all of the JS operators and some extra boolean operators as curryable functions.

const { strictEqual: assertIs } = require('assert');
const { plus, and, not, is, xor, map, list, assertSequenceEquals } = require('ferrum');

assertSequenceEquals(
  map([1,2,3], plus(2)),   /* => */ [3,4,5]);
assertIs(and(true, false), /* => */ false);
assertIs(not(1),           /* => */ false);
assertIs(is(2, 2),         /* => */ true);
assertIs(xor(true, false), /* => */ true);

Typing utilities

Ferrum provides utilities for working with types that can be safely used with null and undefined.

const { strictEqual: assertIs } = require('assert');
const {isdef, type, typename} = require('ferrum');

assertIs(isdef(0),         /* => */ true);
assertIs(isdef(null),      /* => */ false);
assertIs(isdef(undefined), /* => */ false);

assertIs(type(22),        /* => */ Number);
assertIs(type(null),      /* => */ null);
assertIs(type(undefined), /* => */ undefined);

assertIs(typename(type(22)),        /* => */ "Number");
assertIs(typename(type(null)),      /* => */ "null");
assertIs(typename(type(undefined)), /* => */ "undefined");

The usual strategy of using value.constructor and value.constructor.name yields errors for null & undefined.
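
For comparison, a one-line sketch of the naive approach and where it breaks:

const typenameNaive = (v) => v.constructor.name;
typenameNaive(22);      // => "Number"
// typenameNaive(null); // TypeError – null has no .constructor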

Functional Utilities

const {
  curry, pipe, filter, isdef, uniq, map, plus,
  assertSequenceEquals, assertEquals,
} = require('ferrum');

// Using pipe() + auto currying instead of chaining
assertSequenceEquals(
  pipe(
    [0, 1, 2, null, 3, 4, null, 5, 1, 3, 2, null, 1, 4],
    filter(isdef), // Filter out every null & undefined
    uniq,          // Remove duplicates
    map(plus(2))), // Add two to each element
  /* => */ [2, 3, 4, 5, 6, 7]);

// Auto currying
const pair = curry('pair', (a, b) => [a, b]);
assertEquals(pair(1,2),  /* => */ [1,2]);
assertEquals(pair(2)(1), /* => */ [1,2]);

Change Log

Features

1.9.1

  • #193 fix: take(), tryTake(), takeWithFallback() no longer broken on plain arrays. Congrats to @tobia for finding the first bug affecting the actual code (not just documentation or CI).

1.9.0

  • Hashable Trait & Hash tables (3a86070)
  • apply1(), let_in(), call() (3a86070)
  • create(), createFrom(), builder() (3a86070)

1.8.0

  • Move many tests into the documentation examples (c033897)

1.7.0

  • Use ferrum.doctest to make sure examples are valid js code (b0f9d45)

1.6.0

1.5.0

  • Alias flatten() -> flat() (2abad3f)
  • group(), multiline() and takeUntil() (0bc0ca0)

1.4.0

1.3.0

  • Add function repeatFn() (81de232)
  • Provide chunkify functions (9ff9603)
  • Provide takeShort() & takeWithFallback() (bafa834)
  • slidingWindow now returns an empty sequence if no=0 (533cff4)

1.2.0

  • Bugfix: Support for objects with Symbol keys – before this change, most functions would disregard Symbol keys in objects; e.g. size({[Symbol()]: 42}) would return zero. Now the functions pairs(), keys(), values(), size(), empty(), shallowclone(), deepclone(), iter(), each(), replace(), setdefault(), del(), assign(), has(), get(), eq(), uneq(), assertEquals() and assertUneq() are explicitly tested with objects containing Symbol keys.

Development

Build

$ npm install

Test

$ npm test

Lint

$ npm run lint

ferrum's People

Contributors

danielmschmidt, dependabot[bot], eliot-akira, flopp, gunn, joaomoreno, koraa, marquiserosier, mrcull, prashcr, renovate-bot, renovate[bot], semantic-release-bot, trieloff, tripodsan, zaygraveyard


ferrum's Issues

skipTail(), seekTail(), splitTail()

Functions for accessing the end of a sequence…

const dropTail = (seq, n) => chain(slidingWindow(n+1), map(first));
const seekTail = (seq, n) => chain(slidingWindow(n+1), last);
example: splitTail([a, b, c, d, e], 2) -> [[a, b, c], [d, e]]

Add Try variants too (those are variants that won't complain if the seq is too short)…

Pattern matching on sequences: Providing a lazy version of [head, ...tail]

Implement the standard pattern matching structure [head, ...tail] but using iterators…so with laziness.

// Split the given sequence into head and tail
const pop = (seq) => {
  const i = iter(seq);
  const v = next(i);
  return [i, v];
};

const tryPop = ...

// Split the given sequence into an array and the rest of the iterator
const popn = (seq, headLen) => 
const tryPopn = (seq, headLen) => 

// Version of fold that will fail if the sequence has zero elements 
const foldl1 = (seq, fn) => {
  const [i, v] = pop(seq);
  return foldl(i, v, fn);
};
// Fold but without initial element; this will just use the default value in case the sequence is empty but always start folding with the first element from the sequence
const tryFoldl1 = (seq, default, fn) => {
  const None = Symbol();
  const [it, v] = tryPop(seq);
  return v === None ? default : foldl(it, v, fn);
};

const foldr1 = 
const tryFoldr1 = 

Does `uniq` preserve order?

It would be worth noting if uniq preserves the order of elements in the sequence or not. Preserving order would be better, but might be unattainable with the Set based implementation.

take, exec

Exec should be variadic.

take = (val, fn) => fn(val)
unnamed1 = (fn, val) => fn(val)
unnamed2 = (args, fn) => fn(...args)
unnamed3 = (fn, args) => fn(...args)

dropTail, seekTail, splitTail

Functions for accessing the end of a sequence…

const dropTail = (seq, n) => chain(slidingWindow(n+1), map(first));
const seekTail = (seq, n) => chain(slidingWindow(1), last);
example: splitTail([a, b, c, d, e], 2) -> [[a, b, c], [d, e]]

Add Try variants too (those are variants that won't complain if the seq is too short)…

This is symmetric with #141; the API should reflect this and both features may need to be added at the same time.

ferrum.async submodule for working with promises

We should provide a support module for working with promises.

// Can be imported by path and using destructuring
const { fork } = require('ferrum/async');
const { async: { fork } } = require('ferrum');

// Promise constructor as a free function
const makePromise = builder(Promise);

// Mostly useful as an annotation in code that this starts a new fiber
const fork = (fn) => Promise.resolve(fn());

// Fork from a sequence; useful because Promise.all cannot be used as a free function; needs a better name
const parallel = (seq) => Promise.all(iter(seq));

// Iterate over a sequence of promises in the order they resolve (multiple races). Returns async iterator (needs better name)
const race_all = 

const race = (seq) => Promise.race(iter(seq));

// Promise with resolve/reject
class Barrier extends Promise{
  constructor(fn /* maybe null */) {}
  resolve(v) {}
  reject(v) {}
  intoPromise() {}
};

// WaitEvent
const waitEvent = (emitter, eventName) => makePromise((res) => 

class Channel {
  constructor
}

Functions from: https://github.com/adobe-rnd/helix-harmonicabsorber/blob/master/src/asyncio.js

Documentation Method Signature Err

I ran into a small error in the docs: the Key and Value parameters seem to be out of place in the setdefault() signature. I checked the source, which confirms that the call is (container, k, v).


Complex pattern matching

Is it possible to find a construction for advanced pattern matching including type signatures in vanilla* js?

The following construction doesn't seem very sound to me, but maybe we can come up with something.

// Turns numbers into strings, strings into numbers, and throws errors for anything else
patternMatch(
  is_a(String), Number,
  is_a(Number), String,
  yes,          identity);
tryPatternMatch(
  is_a(String), Number,
  is_a(Number), String);

exec(Y(rematch => patternMatch(
  is_any([String, Number]), (v) => rematch([v]),
  is_a(Array), map(x => Number(x)*2 + 1),
)));

*vanilla – as in no transpilers or compilers involved

mapKey, mapValue, mapNth, mapLast, mapProp

Map on a specific element of a set or object:

const euclidRem = (n, d) => n<0 ? d+(n%d) : n%d; // for d > 0
const mapNth = (seq, idx, fn) => map(seq, (tup) => {
  const l = list(tup);
  const i = euclidRem(idx, l.length);
  l[i] = fn(l[i]);
  return l;
});
const mapKey = (seq, fn) => mapNth(seq, 0, fn);
const mapValue = (seq, fn) => mapNth(seq, 1, fn);
const mapLast = (seq, fn) => mapNth(seq, -1, fn);

const mapProp = (seq, name, fn) => map(seq, (o) => ({ ...o, [name]: fn(o[name]) }));

mapProp may be unsuitable; it would be more in Ferrum's style to use get()/set(). That may also be a suitable generalization, but it would drop the implicit coercion to list performed by mapKey/Value/Nth/Last.

AssertSequenceEquals is not a function

Description
The first example in the README uses assertSequenceEquals, which does not exist on the ferrum object.

To Reproduce
Steps to reproduce the behavior:

  1. Run the first example in the readme

Expected behavior
Two maps and an assert

Version:
ferrum: 0.3.2

TryGet trait (Get() with default value)

Rename the Get trait to TryGet; the signature should be tryGet(container, key, default).
We can derive from this:

const get = (c, k) => {
  const none = Symbol();
  const r = tryGet(c, k, none);
  assert(r !== none, "No such element");
  return r;
};

const has = (c, k) => {
  const none = Symbol();
  return tryGet(c, k, none) !== none;
}

Edit: This is a breaking change.

Add support for `await` inside `pipe()`

It would be nice if await could be used within a pipe.

const doAwait = Symbol();

const pipe = (val, ...fns) => {
  while (fns.length > 0) {
    const fn = fns.shift()
    if (fn === doAwait) {
      return Promise.resolve(val).then((val_) => pipe(val_, ...fns));
    } else {
      val = fn(val);
    }
  }
  return val;
};

This could even be generalized and use a special trait and made to work in compose…

Using a special doAwait marker instead of the normal await keyword is not terribly pretty, but it can be done.

Action Required: Fix Renovate Configuration

There is an error with this repository's Renovate configuration that needs to be fixed. As a precaution, Renovate will stop PRs until it is resolved.

Location: package.json
Error type: The renovate configuration file contains some invalid settings
Message: Invalid configuration option: author, Invalid configuration option: bugs, Invalid configuration option: homepage, Invalid configuration option: license, Invalid configuration option: main, Invalid configuration option: name, Invalid configuration option: packageRules[0].documentation, Invalid configuration option: packageRules[0].fastestsmallesttextencoderdecoder, Invalid configuration option: packageRules[0].lodash.isplainobject, Invalid configuration option: packageRules[0].xxhashjs, Invalid configuration option: packageRules[1].@semantic-release/changelog, Invalid configuration option: packageRules[1].@semantic-release/commit-analyzer, Invalid configuration option: packageRules[1].@semantic-release/git, Invalid configuration option: packageRules[1].@semantic-release/github, Invalid configuration option: packageRules[1].@semantic-release/npm, Invalid configuration option: packageRules[1].@semantic-release/release-notes-generator, Invalid configuration option: packageRules[1].ajv, Invalid configuration option: packageRules[1].codecov, Invalid configuration option: packageRules[1].commitizen, Invalid configuration option: packageRules[1].cz-conventional-changelog, Invalid configuration option: packageRules[1].docdash, Invalid configuration option: packageRules[1].eslint, Invalid configuration option: packageRules[1].eslint-config-airbnb, Invalid configuration option: packageRules[1].eslint-plugin-header, Invalid configuration option: packageRules[1].eslint-plugin-import, Invalid configuration option: packageRules[1].eslint-plugin-jsx-a11y, Invalid configuration option: packageRules[1].eslint-plugin-react, Invalid configuration option: packageRules[1].ferrum.doctest, Invalid configuration option: packageRules[1].jsdoc, Invalid configuration option: packageRules[1].junit-report-builder, Invalid configuration option: packageRules[1].lint-staged, Invalid configuration option: packageRules[1].mocha, Invalid configuration option: packageRules[1].mocha-junit-reporter, Invalid configuration option: packageRules[1].mocha-multi-reporters, Invalid configuration option: packageRules[1].mocha-parallel-tests, Invalid configuration option: packageRules[1].nyc, Invalid configuration option: packageRules[1].object-hash, Invalid configuration option: packageRules[1].semantic-release, Invalid configuration option: renovate, Invalid configuration option: scripts, Invalid configuration option: version

Tree traversal functions

We should provide functions for traversing trees made from containers that implement Get/Set:

const treeGet = (seq, tree) => foldl(seq, tree, (c, k) => get(c, k));
const treeSet = (seq, tree, value) =>
const tryTreeGet = (seq, tree, default) => // Returns default if a node didn't exist
const tryTreeSet = (seq, tree, value) => // Won't throw an exception if an intermediate node didn't exist
const treeCreateNodes = (seq, tree) => // Will create intermediate nodes (default values in sequence which is of the form [[key, defaultValue],...])

Missing dependencies for "npm run doc"

Description
Some dev dependencies are missing in package.json for "npm run doc" which is defined as a script there.

To Reproduce

$ rm -rf node_modules
$ npm install
$ npm run doc
....
FATAL: Unable to load template: Cannot find module 'jsdom'

`map` for non-sequences

Overview

This looks awesome, and makes me want to bring out the whole FP toolkit to use along with ferrum. I'm looking for thoughts and/or best practices for using ferrum with non-sequence objects, too. I.e. more general functors and monads.

Details

The README and project layout make it abundantly clear that the focus is on improving the experience of working with iterators. Also, the introduction video makes jokes (which I thoroughly enjoy) whenever the word "monad" is brought up, to the effect that no one should have to worry about what that means. Still, I am curious what thoughts there are on supporting the more mathematical basis for some of these traits, or at least on using this library with other sources that do assume that.

For example, if I start using ferrum, it's only a matter of time before I will want to use some version of type classes for things like Option/Maybe, Either, Result, etc. All of these things should have a map operation. map is hardcoded as applying a function to a sequence, so it cannot be overridden (overloaded? anyway...) with the implementation for non-sequence functors.

I think what I would do is start making a separate local lib with a Functor Trait and define an fmap function for it as the basis for other things.

That said, I think that the Sequence trait could be refactored to be based on a Functor trait. I honestly cannot foresee the fallout of that, but I know it would at least make this library significantly more complex. Not least of all because one can go nuts and add in BiFunctor and Applicative and.... ooh boy, then we have all of Haskell wrapped up in this bloated library... But seriously, I would just as quickly be looking for fmap_left and chain functions as anything else... 🤓 😋

Bonus points for any concrete examples of tying ferrum in with any other fp libraries.

Cheers! And thank you!

null/undefined safe ways of working with properties & entries

A null/undefined safe way of iterating the properties of arbitrary values; this is a reflection feature, so it applies to arbitrary types.

const entries = (o) => isdef(o) ? Object.entries(o) : [];
const propertyDescriptors = (o) => isdef(o) ? Object.getOwnPropertyDescriptors(o) : [];
const propertyNames = (o) => isdef(o) ? Object.getOwnPropertyNames(o) : [];
const properties = (o) => map(propertyNames(o), (k) => [k, o[k]]);
const getProperty = (o, k) => isdef(o) ? o[k] : undefined;

Migrate to typescript?

Given the functional nature of ferrum, using TypeScript seems like it might be a good choice…

multifold

Include a fold vector processing API: Multiple fold operations at the same time; e.g. sum and count

const multiFoldl = (seq, ini, fns) => {
  const resv = list(ini);
  each(cartesian(seq, enumerate(fns)), ([val, [idx, fn]]) => {
    resv[idx] = fn(val, resv[idx]);
  });
  return resv;
};

const multiFoldr, multFoldl1, tryMultiFoldl1, multiFoldr1, tryMultiFoldr1 = 

Run examples as tests

We should run all examples in ferrum (in the readme as well as the jsdoc) as tests in order to make sure these functions do actually work.
We can use assertEquals() instead of comments to check that the right values are returned.

composev()

Version of compose that takes a sequence. (Might need a better name).

const composev = (seq) => compose(...iter(seq));

prependSeq, appendSeq

Mostly useful inside pipe():

prependSeq = (tail, head) => concat(head, tail);
appendSeq = (head, tail) => concat(head, tail);

Should we just call this prepend/append?

Use ferrum.doctest for testing


runtime type signatures (type casting/constriction)

Can we introduce type signatures using plain javascript?

Used to cast all arguments to their respective types. (Use with casting functions like obj, etc…)

typedfn = (types, fn) => (...args) => fn(...map(zip(types, args), ([t, v]) => t(v)));
typecast = (fn, types) =>

But typedfn should correctly indicate function arity.

e.g.

addNumbers = typedfn([Number, Number], plus);
concatStrings = typedfn([String, String], plus);

This constricts arbitrary argument types by casting them into the correct type.

Custom struct infrastructure

JS structs have a number of disadvantages:

  • methods are unbound
  • construction requires new
  • this behaves weirdly
  • No explicit handling of

We could substitute our own syntax:

// Foo extends Bar
const Foo = struct('Foo', Bar, {
  // property; must be set during initialization
  'foo': undefined,
  // property with default value (supplied using deep clone)
  'foo': 42,
  // property, with value generated by calling the function
  'foo': Prop.init((self) => undefined),
  // property; with getter/setter
  'foo': Prop({
    get: (self) =>,
    set: (self) =>,
  }),
  // property; 
  // property, assigned by calling the provided function
  // Bound method syntax
  'foo': (self) => 
  // Static bound method/property/etc
  '@foo': (cls) => 
});

The only hard rule is: @ is for static members, no @ is for instance members.
Provide some intoProp trait or similar that contains normalization rules for properties, turning everything into a Property({ init(), get(), set() }). The normalization rule for undefined is to throw (to enforce explicit initialization); the normalization rule for functions is binding on initialization, hiding them from enumerable properties, and so on.

By default, .init(foo, bar, baz) and .namedInit({ foo, bar, baz }) are provided.

Highly flexible; metaprogrammable; supports reflection.

Infrastructure for metaprogramming in pipe/compose (await inside pipe())

Provide a generalized pipe/compose metaprogramming infrastructure.

pipe.do = Tuple('do', 'fn'); // One element tuple as a tag
pipe.await = pipe.do((p, rest) => Promise.resolve(p).then(rest));

Whenever pipe() (or compose) encounters do, it will evaluate all the functions to the left of the do statement, compose the functions to the right into a single function, and pass the value together with that composed function into the function stored inside do.

compose(...leftFns, pipe.do(doFn), ...rightFns)(v) <=> doFn(compose(...leftFns)(v), compose(...rightFns))

We could also use a more general meta tuple that allows for generalized rewriting:

compose(...leftFns, pipe.meta(metaFn), ...rightFns) <=> metaFn(leftFns, rightFns)

In this framework do could be implemented as a special case of meta:

pipe.do = (doFn) =>
  pipe.meta((l, r) => (v) => doFn(composev(l)(v), composev(r)));

Do alone would allow for some interesting transformations on pipe; e.g. do(ifdef) would early abort pipe execution and do(map) would actually introduce loops as part of the function composition infrastructure.

Actually, I believe this would be about as general as the Haskell do monad (hence the name) while staying in the fully functional framework.

This is different from the do syntax mostly because it uses explicit connectives instead of type-dependent connectives as monads do (on the other hand, this could be remedied with a type class).

Of course, how practical this is would have to be evaluated, but the basic use case with await is, in my opinion, definitely useful.

Fix typos and improve README

Fix the following issues in the README:

  • It contains typos
  • Update capitalization of programming languages e.g. Rust, Python, JavaScript
  • Remove references to Typeclasses, as that is a slightly different concept from Scala/Haskell, not Rust
  • Add a link to Rust Traits example for people that are unfamiliar with Rust

Typo in readme

Description
[2 4] should be [2, 4] in Reverse Currying section

typo in docs: compose

Expected behavior

- // => 18
+ // => 14

Additional context
Add any other context about the problem here.

ferrum/src/functional.js

Lines 92 to 97 in 7eb22ea

* const fn = compose(
* (x) => x+2,
* (x) => x*3
* );
*
* console.log(fn(4)); // => 18

Fill in missing documentation

Not every function of the library is currently documented and some are not up to the level of quality expected from the library.

We should first check of functions that lack API documentation entirely and then fill in cases where documentation is present but not complete.

Every function, class and other structure should have:

  • A text description
  • Examples
  • Version info

properties() and propertyNames()

function* propertyNames(o) {
  for (const k in o) yield k;
}

function* properties(o) {
  for (const k in o) yield [k, o[k]];
}

This would be useful e.g. for lazy iteration over process.env.

`mapTuple()`

Apply a separate function to each element of a list.

mapTuple(["false", "22"], [Boolean, Number]) <=> [false, 22]

Compare to rambda js

We should provide some documentation in the README on how this library compares to rambda (the two can be combined).

Heterogeneous Sequence API: Sequence functions should support async sequences and reactive streams

pipe(
  fetchData(),
  map(x => x+2),
  sum);

The above example should work, regardless of whether fetchData returns a sequence/iterable, async sequence/iterable or a reactive sequence.

The implementation could – conceptually – look something like the example below (rxjs integration to be added, woefully untested).

The idea here is to use higher-kinded types (TransformStream, ConsumeStream) so that a different implementation can be used for each type of stream (sequence, async sequence, rxjs observable). In order to avoid having to implement each function for each kind of stream, we instead implement the actual transformer functions (map, fold, etc.) as "synchronous" coroutines (not generators) which are supplied the values from the stream we're using by a special coroutine scheduler. This way the coroutine scheduler handles the particulars of the stream, while the transformer function just needs to implement the abstract transformation algorithm.

E.g. the scheduler for async iterators would reenter the transformer coroutine between async calls and resolve promises; this way constructions like map(x => Promise.resolve(x)) yield a sync sequence of promises for sync iterables and an async sequence of values for async iterables.

In the next step we need to figure out how RxJS observables should be handled. E.g. should they automatically resolve promises?

Finally, we need to figure out how functions that combine multiple sequences (concat, zip, flatten, etc.) should behave. E.g. how to zip a list and an observer? Or flatten an async sequence of a list and an observer? My current thinking is that this should just fail…but we can add conversion functions between sequence kinds and flattenAs functions that explicitly produce a specific type of sequence…

const transformStream = (stream, trans) => TransformStream.invoke(stream, trans);

const map = curryStreamTransformer('map', function* (GetNext, fn) {
  for (let { val, done } = yield GetNext; !done; ({ val, done } = yield GetNext)) {
    yield fn(val);
  }
});

const foldl = curryStreamConsumer('foldl', function* (GetNext, initial, fn) {
  let r = initial;
  for (let { val, done } = yield GetNext; !done; ({ val, done } = yield GetNext)) {
    r = fn(r, val);
  }
  return r;
});

const TransformStream = Trait();
TransformStream.implDerive([Sequence], function*(_, src, trans) {
  const GetNext = Symbol('GetNext');

  const it = iter(src);
  const co = trans(GetNext);

  let [val, done] = co.next();
  while (true) {
    if (val !== GetNext) {
      yield val;
    }

    if (done) {
      break;
    }

    const supply = val === GetNext ? it.next() : undefined;
    ([val, done] = co.next(supply));
  }
});

TransformStream.implDerive([AsyncSequence], async function*(_, src, trans) {
  const GetNext = Symbol('GetNext');

  const it = asyncIter(src);
  const co = trans(GetNext);

  let [val, done] = co.next();
  while (true) {
    if (val !== GetNext) {
      yield await Promise.resolve(val);
    }

    if (done) {
      break;
    }

    const supply = await Promise.resolve(
        val === GetNext ? it.next() : undefined);
    ([val, done] = co.next(supply));
  }
});

const ConsumeStream = Trait();
ConsumeStream.implDerive([Sequence], (_, src, cons) => {
  const GetNext = Symbol('GetNext');

  const it = iter(src);
  const co = cons(GetNext);

  let [val, done] = co.next();
  while (true) {
    if (val !== GetNext) {
      return val;
    }

    if (done) {
      break;
    }

    const supply = val === GetNext ? it.next() : undefined;
    ([val, done] = co.next(supply));
  }
});

ConsumeStream.implDerive([AsyncSequence], async (_, src, cons) => {
  const GetNext = Symbol('GetNext');

  const it = asyncIter(src);
  const co = cons(GetNext);

  let [val, done] = co.next();
  while (true) {
    if (val !== GetNext) {
      return await Promise.resolve(val);
    }

    if (done) {
      return;
    }

    const supply = await Promise.resolve(
        val === GetNext ? it.next() : undefined);
    ([val, done] = co.next(supply));
  }
});
