Behavior of .return()ing, concurrency-wise
#12
Yep, makes sense.
It seems fair to me to say that a lot of real sources will be generators that queue calls, even if the transformers don't queue them. If that's true, the source would generate the remaining requested items and then return as requested, and no amount of cajoling by the helpers would convince it to do otherwise, right?
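A minimal sketch of the behavior described above, assuming the source is a plain async generator (which queues concurrent `.next()` calls by spec): even if `.return()` is requested while pulls are pending, the generator services all the queued pulls first and only then closes.

```js
// A hypothetical slow source, written as an async generator. Async
// generators queue concurrent .next() calls and service them in order.
async function* source() {
  let i = 0;
  while (true) {
    await new Promise(r => setTimeout(r, 10)); // simulate slow work
    yield i++;
  }
}

async function demo() {
  const it = source();
  // Fire several pulls "concurrently" - the generator still services
  // them one at a time, in order.
  const pulls = [it.next(), it.next(), it.next()];
  // Requesting .return() while pulls are pending: the generator queues
  // it behind them, so all three pulls still produce values.
  const ret = it.return();
  const results = await Promise.all(pulls);
  const done = await ret;
  return { values: results.map(r => r.value), done: done.done };
}

demo().then(r => console.log(r.values, r.done));
```

No helper in between can change this: the queueing happens inside the generator itself.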
Definitely agree that generators seem to be the intuitive go-to for the common mind; they are for me. I really hope that with time, more of those end-user use cases will involve already well-formed sources obtained from standard native APIs and/or ecosystem libraries, rather than having to write substantial generator code by hand. A bit off-topic, but maybe ecosystem libraries might step in here to mitigate generator DX by offering a utility that wraps generators as-are, just changing their …
See tc39/proposal-iterator-helpers#122, I think.
Also tc39/proposal-iterator-helpers#223. Both of those discussions preceded the change to support concurrency, though. Now that we are planning to allow concurrency, I've been thinking along the same lines as the OP. |
Concurrency effectively creates a request response model, doesn't it? Isn't that a pretty reasonable way of thinking about what's going on? |
In addition to forwarding to the underlying iterator, we might want to also set an internal bit which marks the helper as "done", so that subsequent calls to `.next()` resolve with `{ done: true }` without hitting the underlying iterator. It's possible that it should also immediately settle any outstanding promises from prior calls to `.next()`.
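The "internal done bit" idea above could be sketched as a small wrapper. The name `withDoneBit` and the exact shape are illustrative, not part of any proposal:

```js
// Sketch: after .return(), the wrapper answers subsequent .next() calls
// itself with { done: true } instead of forwarding them to the underlying
// iterator (it still forwards the .return() itself, so resources get freed).
function withDoneBit(inner) {
  let closed = false;
  return {
    async next(...args) {
      if (closed) return { value: undefined, done: true };
      return inner.next(...args);
    },
    async return(value) {
      closed = true;
      // Forward to the underlying iterator so it can release resources.
      if (typeof inner.return === 'function') await inner.return(value);
      return { value, done: true };
    },
    [Symbol.asyncIterator]() { return this; },
  };
}
```

With this shape, a `.next()` that arrives after `.return()` never reaches the source; whether that's desirable is exactly what's being debated below.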
Arguably all async iterators are a request-response model, yes. But you could reasonably interpret `.return()` either as "cancel any outstanding requests" or as "close the iterator without canceling outstanding requests".

I think the latter is generally more useful, and better matches real use. Consider

```js
let hasX = producer.buffered(concurrencyFactor).some(val => isX(val));
```

When `some` finishes, there is no way for the consumer to cancel the requests still outstanding in the buffer. Contrast:

```js
let controller = new AbortController();
let signal = controller.signal;
let hasX = AsyncIterator.from(urls)
  .map(url => fetch(url, { signal }))
  .cleanup(() => controller.abort())
  .buffered(concurrencyFactor)
  .some(val => isX(val));
```

When `some` finishes, the consumer can abort the outstanding fetches itself via the controller.

Incidentally you may both be interested in this (long) old thread on the original async generators repo and the many, many pingbacks.
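The abort pattern in the snippet above can be exercised without `fetch`. This is a minimal sketch where `slowTask` is a hypothetical stand-in for `fetch(url, { signal })`: it resolves after a delay unless its signal aborts first, so the consumer can cancel outstanding work itself.

```js
// Stand-in for fetch(url, { signal }): resolves after a delay, rejects
// immediately if the signal aborts first.
function slowTask(id, signal, ms = 20) {
  return new Promise((resolve, reject) => {
    const t = setTimeout(() => resolve(id), ms);
    signal.addEventListener('abort', () => {
      clearTimeout(t);
      reject(new Error('aborted'));
    });
  });
}

async function demo() {
  const controller = new AbortController();
  const { signal } = controller;
  // Two requests in flight, as if pulled ahead by a buffering helper.
  const pending = [slowTask('a', signal), slowTask('b', signal)];
  // The consumer decides it's done and cancels the outstanding work
  // itself - the "close without canceling" interpretation leaves this
  // choice entirely in the consumer's hands.
  controller.abort();
  const settled = await Promise.allSettled(pending);
  return settled.map(s => s.status);
}
```

Note that nothing in the iterator machinery does the canceling here; the helper chain merely stays out of the way.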
As much as this seems an appealing and intuitive behavior to settle on, I'm also leaning towards "no" on this, due to the many possible flexibilities it might trade away.
Worth pointing out: I think not restricting the cases expressed above might also make iterator helpers align better with the "consistency" or "structured concurrency" principles discussed by some participants of this thread across other issues, if I haven't misinterpreted those 🙂 I know many of these hypothetical usage scenarios require non-trivial, imperative implementation to nail robustly on the user side (like maintaining references inside a source iterator to/from its current pending pulled promises), but I still tend towards not disallowing them through restriction of freedom - in the same way that implementing flows with "traditional" regular functions is also quite imperative by nature, yet unrestrictive. For the latter, function composition and functional helper libraries have helped nail the small details robustly, and what we're doing here is the same anyway: designing composable helpers for iterators to relieve the same difficulties without overly restricting. And still, I'd love to see anyone coming up with more views on these dilemmas 🤞🏻
In light of all the considerations elaborated above, which are supportive of helpers' "transparency" to their underlying iterator, I'd say it's maybe better to stick to that in this regard too - to not manage any internal "done" state of their own. I agree it barely makes sense in most scenarios to keep forwarding subsequent `.next()` calls after the iterator was closed, though.
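The fully "transparent" stance above could be sketched as a minimal map helper whose `.return()` just forwards to the source and resolves with whatever the source answers, keeping no state of its own. `mapHelper` is an illustrative name, not the proposal's implementation:

```js
// Minimal sketch of a transparent helper: .return() forwards straight to
// the source and resolves when the source's promise resolves; there is no
// internal "done" bit, so subsequent .next() calls are forwarded too.
function mapHelper(source, fn) {
  return {
    async next(...args) {
      const r = await source.next(...args);
      return r.done ? r : { value: fn(r.value), done: false };
    },
    async return(value) {
      // No bookkeeping: whatever the source answers is the answer.
      return typeof source.return === 'function'
        ? source.return(value)
        : { value, done: true };
    },
    [Symbol.asyncIterator]() { return this; },
  };
}
```

A consequence worth noticing: with this shape, a `.next()` issued after `.return()` still reaches the source, which is exactly the edge case conceded at the end of the comment above.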
This is a really interesting point touched on there. I'd like to share some semantic interpretation of my own on top of this view, in the hope it might contribute something to the main subject here.
What I'm talking about is simply the difference between these two methods of consumption we're already familiar with:

```js
const iterator = getSomeIterableIterator();

(async () => {
  for await (const value of iterator) {
    console.log(value);
  }
})();

// And later, via some other code context:
await iterator.return();
```

The above shows a consumer that closes the iterator mid-action - during a prior in-progress pull - because it does this in disconnection from the consuming `for await` loop, effectively opting to close "with canceling outstanding requests", as we've coined it.

On the other hand, if the consumer wishes to indicate "I still care about the results of any outstanding prior requests", it could simply perform the closing in between iterations. Looked at with simple eyes, it becomes obvious this essentially opts to close "without canceling outstanding requests" - since clearly we've first drained any outstanding request and THEN closed (with `break`):

```js
const iterator = getSomeIterableIterator();

(async () => {
  for await (const value of iterator) {
    console.log(value);
    if (hadEnoughValues()) {
      break;
    }
  }
})();
```

The bottom line in all that: the choice between closing with or without canceling outstanding requests can be seen as purely a consumer choice, and how the source iterator is implemented isn't inherently concerned with it (apart from specifying how, or whether, to actually early-abort its work whenever the consumer opts for this).
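The "close in between iterations" pattern can be made concrete with a runnable sketch, assuming a hypothetical source generator `counted` that records its own lifecycle: `break` first lets the in-flight pull complete, and only then triggers `.return()` (observable via the generator's `finally` block).

```js
// A source that logs each value it produces and when it gets closed.
async function* counted(log) {
  try {
    for (let i = 0; ; i++) {
      log.push(`yield ${i}`);
      yield i;
    }
  } finally {
    log.push('closed'); // runs when the consumer's break calls .return()
  }
}

async function demo() {
  const log = [];
  for await (const value of counted(log)) {
    if (value >= 1) break; // close in between iterations
  }
  return log;
}

demo().then(log => console.log(log));
```

The log shows every requested value was delivered before the close: the consumer drained, then closed, exactly the "without canceling" choice described above.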
As described in the README.md's "Concurrency" section, and as has also been discussed a lot across the Add a buffering helper issue thread - the helpers are designed to NOT queue up `.next()` calls like async generators would, but instead delegate each call to the source iterator's `.next()` as soon as it is made. And I absolutely support this, of course.

Now, I'm bringing up this issue for us to consider the effect of calls to a helper's `.return()` in the same regard - which I didn't see touched on yet 😃

With every async generator today, a call to `.return()` gets queued behind whatever pending `.next()` calls were made prior to it (which IMO is inherently problematic, since there is no guarantee when those last `.next()` calls will finally resolve, until which any underlying open resources cannot be released whatsoever). From our standpoint here though, I suppose a call to a helper's `.return()` should just transparently forward to the source iterator's `.return()`, and consequently resolve (the promise it returns) as soon as the source iterator's returned promise resolves.

That makes sense, right? 🤞🏻
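The queueing complaint above is directly observable with today's async generators. A minimal sketch, assuming a hypothetical slow generator: the `.return()` promise cannot settle before the pending `.next()` does, so any cleanup in `finally` is delayed behind it.

```js
// An async generator whose pulls are slow. Its finally block is where
// underlying resources would be released.
async function* slowSource() {
  try {
    while (true) {
      await new Promise(r => setTimeout(r, 30));
      yield 'item';
    }
  } finally {
    // Resource release happens here - only after queued .next()s settle.
  }
}

async function demo() {
  const order = [];
  const it = slowSource();
  const pull = it.next().then(() => order.push('next settled'));
  // .return() is requested immediately, but the generator queues it
  // behind the pending .next() above, so its promise settles second.
  const ret = it.return().then(() => order.push('return settled'));
  await Promise.all([pull, ret]);
  return order;
}

demo().then(order => console.log(order));
```

A helper that forwards `.return()` straight to its source, as proposed, at least adds no queueing of its own on top of whatever the source does.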