
Comments (13)

nex3 commented on June 12, 2024

> Alright, SGTM. Convenience constructor as a compromise? i.e. AsyncCache.noDuration()

Maybe AsyncCache.ephemeral()?


matanlurey commented on June 12, 2024

Woohoo!


nex3 commented on June 12, 2024

I'm not sure I totally understand your example. You have three calls to expensiveCall(), but the comments indicate that "CALL MADE" is only printed twice. Does this mean the synchronizer would drop calls on the floor if a call was already in progress?


matanlurey commented on June 12, 2024

What I mean is that calls made before the pending future completes would just reuse that future's response.
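To make that concrete, here is a minimal sketch of those semantics (the name and shape are just illustrative, not a settled API), plus how the "CALL MADE" counting discussed above would play out:

class SingleFlight<T> {
  SingleFlight(this._computation);

  final Future<T> Function() _computation;
  Future<T>? _pending;

  // Returns the in-flight future if there is one, otherwise starts a new run.
  Future<T> run() {
    final pending = _pending;
    if (pending != null) return pending;
    return _pending = _runAndClear();
  }

  Future<T> _runAndClear() async {
    try {
      return await _computation();
    } finally {
      // Once this run completes (with a value or an error), later callers start fresh.
      _pending = null;
    }
  }
}

Future<void> main() async {
  final work = SingleFlight<String>(() async {
    print('CALL MADE');
    return 'Hello';
  });

  // Both calls start before the first run completes, so 'CALL MADE' prints once
  // and both callers get the same 'Hello'.
  await Future.wait([work.run(), work.run()]);

  // This call starts after the previous run finished, so the work runs again.
  await work.run();
}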


nex3 commented on June 12, 2024

That seems like it would produce a high risk of race conditions, since the difference between "use the value being computed" and "compute a new value from scratch" could just be a few microtasks—and presumably that distinction matters, or the user would just use AsyncMemoizer and always have the computed value available.

Maybe I'd understand better with a real-world example—what's the motivating use-case here?


matanlurey commented on June 12, 2024

Sure, so let's say you have an "expensive operation*"

*Read a large file from the disk, make a network request, spawn an isolate and do work

To your users, you'd like to present a simplified API:

abstract class ExpensiveWorkService {
  Future<String> compute();
}

If a user calls compute multiple times though, you don't want to read/fetch/spawn multiple times.

To simplify your implementation, you could use SynchronizedFuture:

class _ExpensiveWorkServiceImpl implements ExpensiveWorkService {
  Future<String> _doCompute() => ...

  @override
  Future<String> compute() => new SynchronizedFuture(_doCompute);
}

So if a user does something like:

// I'm bound to a UI button that is clickable.
onClickCompute() async {
  printResults(await expensiveWorkService.compute());
}

You can rest assured that the expensive call will only be made once if they click multiple times before the previous call has completed. In this way, it works like AsyncMemoizer. In fact, you could implement it with one, clearing out the memoizer and re-creating it over and over:

class _ExpensiveWorkServiceImpl implements ExpensiveWorkService {
  AsyncMemoizer<String> _doComputeMemoizer;

  Future<String> _doCompute() => ...

  @override
  Future<String> compute() {
    if (_doComputeMemoizer == null) {
      _doComputeMemoizer = new AsyncMemoizer<String>();
      // Once the memoized call completes, throw the memoizer away so the
      // next call starts a fresh computation.
      _doComputeMemoizer.runOnce(_doCompute).then((_) {
        _doComputeMemoizer = null;
      });
    }
    return _doComputeMemoizer.runOnce(_doCompute);
  }
}

This pattern appears a lot internally, and it usually ends up using lots of completers or other nonsense, so I'd like to have something I can recommend to people without lots of boilerplate.
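For reference, the hand-rolled Completer version of that pattern tends to look something like this (an illustrative sketch against the ExpensiveWorkService interface above, not actual internal code):

import 'dart:async';

class _CompleterBasedService implements ExpensiveWorkService {
  Completer<String>? _inFlight;

  Future<String> _doCompute() async => 'result'; // read/fetch/spawn here

  @override
  Future<String> compute() {
    // If a computation is already running, every caller shares its future.
    final inFlight = _inFlight;
    if (inFlight != null) return inFlight.future;

    final completer = _inFlight = Completer<String>();
    _doCompute().then((value) {
      _inFlight = null;
      completer.complete(value);
    }, onError: (Object error, StackTrace stackTrace) {
      _inFlight = null;
      completer.completeError(error, stackTrace);
    });
    return completer.future;
  }
}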


nex3 commented on June 12, 2024

In that case, why not just use an AsyncMemoizer? Why do you want to re-do the expensive work if a request comes in too late?


matanlurey commented on June 12, 2024

So if you are fetching resources from a server or reading from disk, it's pretty likely they will eventually change - if they won't, AsyncMemoizer is great. Otherwise, say I'm fetching a list of online users... a memoizer isn't so great.

(This is different from a cache, which users might add themselves - it's to prevent making multiple expensive calls or using network/file resources when they will all return the same result - not for transactions, obviously.)


nex3 commented on June 12, 2024

How do you know they'll all return the same result? That's the fundamental thing that worries me about this API. It seems like it's using the time it takes to do the computation as a heuristic for how long the computation will stay fresh, but there's no guarantee that that heuristic is accurate. If it's wrong, you've got a race condition where a call can return bogus results depending on how fast a previous call completed.

I'd be a lot more comfortable with an API that requires users to be explicit about the freshness of the result. Something like this:

class AsyncCache<T> {
  /// Creates a cache that invalidates its contents after [duration] has passed.
  AsyncCache(Duration duration);

  /// Returns a cached value or runs [callback] to compute a new one.
  ///
  /// If [callback] has been run recently enough, returns its previous return
  /// value. Otherwise, runs [callback] and returns its new return value.
  Future<T> fetch(Future<T> callback());
}

class Example {
  Future<String> expensiveCall() => _expensiveCallCache.fetch(() async {
    print('CALL MADE');
    return 'Hello';
  });
  final _expensiveCallCache = new AsyncCache<String>(new Duration(seconds: 2));
}
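Assuming an implementation of that sketch (fetch above is only a signature), usage would play out roughly like this:

Future<void> main() async {
  final example = new Example();

  await example.expensiveCall(); // Prints 'CALL MADE' and caches 'Hello'.
  await example.expensiveCall(); // Still within 2 seconds: cached, nothing printed.

  await new Future.delayed(const Duration(seconds: 3));
  await example.expensiveCall(); // Cache expired: prints 'CALL MADE' again.
}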


matanlurey commented on June 12, 2024

I think that would be a reasonable compromise, thanks for the feedback.

My only suggestion is to allow duration to be optional (default to Duration.ZERO) for users that just don't want two outgoing requests at the same time (i.e. above use case). We could clearly document that behavior and encourage a meaningful duration, though.
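Concretely, with a zero duration I'd expect this behavior (AsyncCache is still just the sketch above, and fetchOnlineUsers is a made-up stand-in for a real request):

// Hypothetical stand-in for a real network call.
Future<String> fetchOnlineUsers() async {
  print('REQUEST MADE');
  return 'alice, bob';
}

Future<void> main() async {
  final cache = new AsyncCache<String>(new Duration(seconds: 0));

  // These two fetches overlap, so only one request goes out.
  await Future.wait([cache.fetch(fetchOnlineUsers), cache.fetch(fetchOnlineUsers)]);

  // The previous result has already completed and the freshness window is zero,
  // so this fetch makes a new request.
  await cache.fetch(fetchOnlineUsers);
}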

How do you feel about a similar API for streams? (Separate PR)

EDIT: My only other question would be how users are expected to test - this likely would not honor fake async unless we use a timer to "clear" the freshness flag, which is not my favorite idea.


nex3 commented on June 12, 2024

> My only suggestion is to allow duration to be optional (default to Duration.ZERO) for users that just don't want two outgoing requests at the same time (i.e. above use case). We could clearly document that behavior and encourage a meaningful duration, though.

We can allow Duration.ZERO, but I don't think it should be the default. I want to make sure all the users of this API think about their freshness guarantees, so at least they're aware of the potential for race conditions.

> How do you feel about a similar API for streams? (Separate PR)

That sounds reasonable. We could probably do this as another method on AsyncCache. I think we'd probably want to start counting from the stream's "done" event, though—that way there's more consistency between calling fetch(() => stream.toList()) and fetchStream(() => stream).
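Sketching what that could look like on the class above (same caveats as before about naming):

class AsyncCache<T> {
  // ... constructor and fetch() as above ...

  /// Returns a cached stream of events or runs [callback] to produce a new one.
  ///
  /// Events from the underlying stream are recorded and replayed to later
  /// callers. The freshness window starts counting at the stream's "done"
  /// event, so fetch(() => stream.toList()) and fetchStream(() => stream)
  /// invalidate at about the same time.
  Stream<T> fetchStream(Stream<T> callback());
}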


matanlurey commented on June 12, 2024

Alright, SGTM. Convenience constructor as a compromise? i.e. AsyncCache.noDuration()*

*Open to alternative names

+1 to another method of AsyncCache for streams


matanlurey commented on June 12, 2024

+1

