
Comments (18)

ivan-kleshnin commented on May 31, 2024

Btw, Ramda already supports infinite input sequences through a sophisticated reduce implementation that supports early breaks.

function _iterableReduce(xf, acc, iter) {
  var step = iter.next();
  while (!step.done) {
    acc = xf['@@transducer/step'](acc, step.value);
    if (acc && acc['@@transducer/reduced']) { // !!! guard value detected
      acc = acc['@@transducer/value'];
      break; // !!! stop reducing early
    }
    step = iter.next();
  }
  return xf['@@transducer/result'](acc);
}

Now take and other functions can return a special guard value to break the reducing process.

let transducer = R.compose(R.map(x => x + 1), R.take(5));
console.log(R.transduce(transducer, R.flip(R.append), [], ImLazy.range(0, Infinity)));
// [1, 2, 3, 4, 5]

let reducer = R.compose(R.map(x => x + 1), R.take(5))(new XWrap(R.flip(R.append)));
console.log(R.reduce(reducer, [], ImLazy.range(0, Infinity)));
// [1, 2, 3, 4, 5]
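
For illustration, here is a minimal hand-rolled sketch (not Ramda's actual source; reduced and takeXf are hypothetical names) of how a take-style transducer can wrap the accumulator in that guard value so a loop like _iterableReduce breaks early:

const reduced = value => ({
  '@@transducer/reduced': true, // flag checked by the reduce loop
  '@@transducer/value': value
});

const takeXf = n => xf => {
  let taken = 0;
  return {
    '@@transducer/init': () => xf['@@transducer/init'](),
    '@@transducer/result': acc => xf['@@transducer/result'](acc),
    '@@transducer/step': (acc, x) => {
      taken += 1;
      const next = xf['@@transducer/step'](acc, x);
      return taken >= n ? reduced(next) : next; // signal early termination once n items are taken
    }
  };
};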

But the question about producing lazy seqs from reduce / transduce-like functions remains.

benji6 commented on May 31, 2024

Hey this is all really interesting stuff. I have been thinking a lot about transducers and lazy iterables recently.

So I always thought that the point of reduce and transduce was that you specify how the new value is built up. Reduce doesn't even need to return a collection, but if you wanted to return a lazy iterable you could easily just do this:

const R = require('ramda')
const L = require('imlazy')

L.reduce(R.flip(L.append), [], [1, 2, 3]) // => lazy iterable of [1 2 3]
R.transduce(R.map(R.identity), R.flip(L.append), [], [1, 2, 3]) // => lazy iterable of [1 2 3]

Are you talking about something deeper than this that I am missing though?

Although I built this library, I am thinking that transducers really are the fundamental building block for collection processing, but I think there is still a very important place for lazy and endless collections as a conceptual tool.

There's a question as to whether the library should expose any of its own transformation functions when they can all be done using transducers, which are in fact a lot more useful and reusable.

ivan-kleshnin commented on May 31, 2024

Thanks! No, I'm just learning and I've kinda gotten stuck thinking about all this. I see you use stuff like
const B = a => b => c => a(b(c)), which I read as a sign of a heavy functional background. So I'd like to acquire a bit of your knowledge πŸ˜„

If you don't mind I have more questions.

  1. Your library defines append as:
let createIterable = generator => Object.freeze({[Symbol.iterator]: generator});

let append = curry((a, xs) => createIterable(function* () {
  yield* xs;
  yield a;
})); // return iterable

But you could "technically" do it as:

let append = curry((a, xs) => (function* () {
  yield* xs;
  yield a;
}())); // return raw generator

What does immutability mean for a lazy sequence? If we convert it with Array.from we lose it.

  2. The thing that bothers me with transducers is how they complicate signatures, making the code barely typable.
basicMap :: (a -> b) -> ([a] -> [b])

makeMappingReducer :: (a -> b) -> (a -> b -> c) -> (a -> b -> c)

mapWithReduceSupport :: (a -> b) -> [a] | (a -> b -> c) -> [b] | (a -> b -> c)

The last "transduce-friendly" map has an extremely ugly signature, and the compiler still won't prevent you from returning [b] in response to (a -> b -> c) or vice versa. It looks like transducers just transfer the expression problem from the application level to the library level.

I'm curious if it would be better to separate a standalone map from the mapping function, filter from the filtering function, etc., which would be factories for the corresponding transducers. In this case we clean up our signatures but still violate DRY, as the mapFn, filterFn applications are still used in multiple places throughout our code...
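
For what it's worth, a minimal sketch of that separation idea (mapArray and mapTransducer are illustrative names, not part of imlazy or Ramda): the collection-level function and the transducer factory live side by side and reuse the same element-level function.

const mapArray = f => xs => xs.map(f);                          // plain collection-level map
const mapTransducer = f => step => (acc, x) => step(acc, f(x)); // factory for the reducer-transforming form

const inc = x => x + 1;
mapArray(inc)([1, 2, 3]);                                       // => [2, 3, 4]
[1, 2, 3].reduce(mapTransducer(inc)((acc, x) => acc.concat([x])), []); // => [2, 3, 4]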

benji6 commented on May 31, 2024

Hey, well I'm actually fairly new to all this stuff and am learning too!

So this code:

let append = curry((a, xs) => (function* () {
  yield* xs;
  yield a;
}()));

doesn't return a generator because the generator is immediately invoked which then returns an iterator. This library could return generators instead of iterables, but I think iterables are kinda more useful. There's loads of good info on this here https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Iterators_and_Generators

If you convert an iterable to an array using Array.from, the array is of course mutable, but the original iterable will not be affected by those mutations (unless the values in the iterable are themselves mutable, I guess).

So I actually am a bit lost on the second part of your comment, if you could provide some examples that would be great (I've not done a lot of work in typed languages)

ivan-kleshnin commented on May 31, 2024

doesn't return a generator because the generator is immediately invoked which then returns an iterator

Yes, it's an iterator, which is iterable by definition. So it's unclear why you wrap the iterator in a createIterable call.

So I actually am a bit lost on the second part of your comment, if you could provide some examples that would be great (I've not done a lot of work in typed languages)

Sorry, I'll consider how to improve this question and continue then.

benji6 commented on May 31, 2024

So you can only iterate over an iterator once, which is why generators are more useful:

const generator = function * () {yield 1}
const iterator = generator()
const array1 = [...iterator] // => [1]
const array2 = [...iterator] // => []

And you cannot spread generator functions, but you can spread objects that have a Symbol.iterator property, which is one of the reasons everything is passed through the createIterable function. Hope that clears things up a bit.
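
To make the contrast concrete, here is a small sketch (using a createIterable like the one quoted above, not imlazy's exact source) showing that the wrapped form can be traversed repeatedly, because each traversal calls the generator function again and gets a fresh iterator:

const createIterable = generator => Object.freeze({[Symbol.iterator]: generator});

const ones = createIterable(function* () { yield 1 });
const array1 = [...ones]; // => [1]
const array2 = [...ones]; // => [1]  (re-iterable, unlike the raw iterator above)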

ivan-kleshnin commented on May 31, 2024

Thanks! That's clear for sure.

awto commented on May 31, 2024

I've got a pointer to the library from this ramda thread, and imlazy is the library I was looking for.

Regarding this question:

Then they should be seen as a fundamental building block for collection processing instead of generators (in dynamically typed language like JS).

I wrote a post about this which may be interesting here: transducers as functions transforming producers (iterables) do the same job that Clojure-style transducers (which transform consumers) do, but much more simply. The former kind of producer (used in this library) is described in the paper "Lazy v. Yield: Incremental, Linear Pretty-printing".
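
A minimal sketch of that "transducer as producer transformer" idea (mapGen and takeGen are illustrative helpers, not from the post or from imlazy): each stage is a generator function taking an iterable and yielding transformed values, so stages compose with plain function composition and stay lazy.

const mapGen = f => function* (xs) {
  for (const x of xs) yield f(x);
};

const takeGen = n => function* (xs) {
  let i = 0;
  for (const x of xs) {
    if (i++ >= n) return; // stop pulling from the source
    yield x;
  }
};

function* naturals() { for (let i = 0; ; i++) yield i; }

const pipeline = xs => takeGen(3)(mapGen(x => x * 2)(xs));
console.log([...pipeline(naturals())]); // => [0, 2, 4], even though the source is infinite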

ivan-kleshnin commented on May 31, 2024

@awto, @benji6 I'm fully dissatisfied with transducers.

In Clojure transducers do two things: universal iteration + the ability to break loops (a key feature for using HO functions instead of raw recursion).

In Haskell the first is solved by typeclasses, the second – by lazy evaluation.

Transducers make library code look like a mess (look at the related code of Ramda and Clojure – OMG). They are hard to type (I mean static typing).

So now I'm totally in favor of languages with lazy evaluation, where you just don't need transducers.

awto commented on May 31, 2024

@ivan-kleshnin

In Clojure transducers do two things: universal iteration + the ability to break loops (a key feature for using HO functions instead of raw recursion).

Hm, what do you mean by universal iteration? There were sequences in Clojure long before transducers. I see transducers' main goal as avoiding intermediate values.

In Haskell the first is solved by typeclasses, the second – by lazy evaluation.

How would you break the loop with lazy evaluation?

What I meant in the post is that a generator function taking one or more iterables may be called a transducer too, and the term transducer was used for this earlier than in Clojure (2012 vs 2014). The difference between Oleg's paper and Clojure transducers is just what exactly the transducer transforms: the producer or the consumer. In JavaScript, generators may be utilized to simplify both options. The producer option is simpler, while the consumer option may also cover async value generation. Breaking loops, maybe even nested ones, is just the JavaScript break statement; no special protocols needed.
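
As a small illustration of that last point (plain JavaScript, nothing library-specific): early exit from even nested iteration over generators is just a labeled break.

function* naturals() { for (let i = 0; ; i++) yield i; }

outer:
for (const x of naturals()) {
  for (const y of naturals()) {
    if (x + y > 3) break outer; // for...of closes both generator iterators on break
    console.log(x, y);
  }
}
// logs: 0 0, 0 1, 0 2, 0 3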

comraq commented on May 31, 2024

@awto Actually, the main intention of Clojure transducers isn't about intermediate values, but about pulling out the "concatenation" reducer (step) function from transformations such as map and filter.

In fact it is completely up to the reducer to manage the concatenation, and thus any data structure that has its own reducing/concatenating function can be iterated over while applying transformations through transducers.

It's just that by implementing transformations using reduce (pulling the reducing step out of the transformation functions and passing it in via HO functions), you also get the benefit of no intermediate values with eager evaluation.

comraq commented on May 31, 2024

@ivan-kleshnin The only thing I see about reduce being lazy is that it only outputs one value (i.e. the final accumulated result). Even if it is lazy, essentially it will only have ONE "next" value and then its iteration should end.

So it's not really lazy? (It needs to eagerly iterate through ALL values in the input sequence/collection to fully 'reduce' it down to ONE final value.)

awto commented on May 31, 2024

@comraq

the main intention of Clojure transducers isn't about intermediate values, but about pulling out the "concatenation" reducer (step) function from transformations such as map and filter.

no "concatenation", no intermediate value, isn't this absolutely the same?

ivan-kleshnin commented on May 31, 2024

Hm, what do you mean by universal iteration? There were sequences in Clojure long before transducers. I see transducers' main goal as avoiding intermediate values.

I wasn't talking about Clojure specifics.

How would you break the loop with lazy evaluation?

It's just not required with lazy evaluation. That's the point.

Breaking loops, maybe even nested ones, is just the JavaScript break statement; no special protocols needed.

If you fall back to imperative code – yes. If we're talking about functional composition – no.

A simple test is: can you implement find in terms of head and filter?

let find = pred => compose(head, filter(pred)) // only works if filter is lazy

Without lazy evaluation / transducers you need either an imperative loop or explicit recursion.
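
A minimal sketch of why laziness makes that composition work (filterGen and this generator-based head are illustrative, not imlazy's actual API): head pulls values one at a time, so filter stops as soon as the first match is found, even over an infinite source.

const filterGen = pred => function* (xs) {
  for (const x of xs) if (pred(x)) yield x;
};

const head = xs => {
  for (const x of xs) return x; // consumes only the first value
};

const find = pred => xs => head(filterGen(pred)(xs));

function* naturals() { for (let i = 0; ; i++) yield i; }
console.log(find(x => x > 5)(naturals())); // => 6, terminates despite the infinite source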

awto commented on May 31, 2024

@ivan-kleshnin Ah, I see, I misunderstood the "breaking loops" expression in your message. And yes, I agree: for lazy functions in Clojure, sequences are enough; transducers were introduced for eager ones. My point is that in JavaScript transducers are not needed, because there are generators to achieve the same goal in a much simpler way.

If you fall back to imperative code – yes. If we're talking about functional composition – no.

I don't see anything wrong with imperative code; the task is to split the computation into stages as small as possible without overhead, and generators fit perfectly here.

ivan-kleshnin commented on May 31, 2024

@awto Then I agree. Generators are a better fit for JS.

comraq commented on May 31, 2024

@awto "no concatenation" really is taking the concat out of transfromations such as map filter and etc...

So map, filter, take, and all those functions can now operate on any data structure (because the user supplies the reducing function that's appropriate for the data structure), i.e. linked lists, trees, and other classes with their own "concat" method (not just typical collections such as arrays).
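
A minimal sketch of that point (mapping, arrayStep and setStep are illustrative names): the same transformation is reused with whatever step/"concat" function fits the target structure.

const mapping = f => step => (acc, x) => step(acc, f(x));

const arrayStep = (acc, x) => { acc.push(x); return acc; };
const setStep = (acc, x) => acc.add(x);

const double = mapping(x => x * 2);
[1, 2, 2].reduce(double(arrayStep), []);       // => [2, 4, 4]
[1, 2, 2].reduce(double(setStep), new Set());  // => Set { 2, 4 }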

awto commented on May 31, 2024

@comraq Not the reduce function but a consumer and a producer: a Clojure reducer is just an instance of the consumer from Oleg's paper. In the paper they use them for pretty-printing a document; the computation is separated into a few stages and has linear time complexity. I also use this approach for abstract syntax tree transformations of SQL and JavaScript, so I easily join small transformations into larger ones without overhead. There is, for example, my small estree-transducers project with a very simple producer and consumers for ESTree traversal.
