(This post explains how to use generators to wrangle duplicate calls to async functions. Check out this gist for the final approach, or read on to learn more!)
JavaScript is a twisty maze of horrible asynchronous calls, all alike. We've all written code like this, but in this post, I'll talk about `async` and `await`. These are keywords that are widely supported and help you migrate that code to something much more readable.

And most importantly, I'll cover a key pitfall: how to deal with an asynchronous method being run more than once, so that it doesn't clobber other work.
Let's start with the example. This function will fetch some content, display it to the screen and wait a few seconds before drawing attention to it:
```js
function fetchAndFlash(page) {
  const jsonPromise = fetch('/api/info?p=' + page)
      .then((response) => response.json());

  jsonPromise.then((json) => {
    infoNode.innerHTML = json.html;
    setTimeout(() => {
      flashForAttention(infoNode);
    }, 5000);
  });
}
```
Now we can rewrite this with `async` and `await` like this, with no callbacks:
```js
async function fetchAndFlash(page) {
  const response = await fetch('/api/info?p=' + page);
  const json = await response.json();
  infoNode.innerHTML = json.html;

  // a bit awkward, but you can make this a helper method
  await new Promise((resolve) => setTimeout(resolve, 5000));

  flashForAttention(infoNode);
}
```
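That awkward `new Promise` line is easy to factor into a helper. The name `sleep` here is my own, just a sketch of what such a helper could look like:

```js
// resolves after the given number of milliseconds
function sleep(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}
```

With that in place, the wait line reads `await sleep(5000);`.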
Isn't that nicer? It no longer jumps around, and it's easy to see the steps top-to-bottom: fetch a resource, convert it to JSON, write to the page, wait five seconds and call another method.
It's A Trap!
But there's something here which can confuse readers. This isn't a regular function which is executed "all at once": every time we call `await`, we basically defer to the browser's event loop so it can keep working.
To put it another way: let's say you're reading code that uses `fetchAndFlash()`. If you hadn't read the title of this post, what might you expect to happen if you run this code?
```js
fetchAndFlash('page1');
fetchAndFlash('page2');
```
You might expect that one will happen after the other, or that one will cancel the other. That's not the case: both will run more or less in parallel (because JavaScript can't block while we wait), finish in either order, and it's not clear what HTML will end up on your page.
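A small simulation makes the clobbering visible. Here `sleep`, `load` and the `current` variable are illustrative stand-ins for the network fetch and the page write, not code from the real function:

```js
// two overlapping calls: the slower first call overwrites the second
let current = null;

function sleep(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

async function load(page, delayMs) {
  await sleep(delayMs); // stand-in for the fetch
  current = page;       // stand-in for writing to the page
}

load('page1', 100); // started first, but slower...
load('page2', 10);  // ...so 'page2' lands first, then 'page1' clobbers it

setTimeout(() => console.info(current), 200); // logs "page1"
```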
To be clear, the callback-based version of this method had exactly the same problem, but it was more apparent, in a very disgusting kind of way. In modernizing our code to use `async` and `await`, we make it more ambiguous.

Let's cover a few different approaches to solving this problem. Strap in!
Approach #1: The Chain
Depending on how and why you're calling an `async` method, you might be able to 'chain' calls one after another. Let's say you're handling a click event:
```js
let p = Promise.resolve(true);

loadButton.onclick = () => {
  const pageToLoad = pageToLoadInput.value;
  // wait for previous task to finish before doing more work
  p = p.then(() => fetchAndFlash(pageToLoad));
};
```
Every time you click, you add another task to the chain. We could also generalize this with a helper function:
```js
// makes any function a chainable function
function makeChainable(fn) {
  let p = Promise.resolve(true);
  return (...args) => {
    p = p.then(() => fn(...args));
    return p;
  };
}

const fetchAndFlashChain = makeChainable(fetchAndFlash);
```
Now, you can just call `fetchAndFlashChain()` and it'll happen in order after any other call to `fetchAndFlashChain()`.
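One caveat with the chain: if a chained call rejects, `p` stays rejected and every later call is silently skipped. A variant that resets the chain on failure might look like this (the `catch` handling is my own addition, not part of the original helper):

```js
// makes any function chainable, and keeps the chain alive on failure
function makeChainable(fn) {
  let p = Promise.resolve(true);
  return (...args) => {
    // swallow rejections so later calls still run; callers that need
    // to see the error would have to handle it inside fn itself
    p = p.then(() => fn(...args)).catch((err) => {
      console.error('chained task failed:', err);
    });
    return p;
  };
}
```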
But that's not the proposal of this blog post: what if we want to cancel the previous operation? Your user has just clicked on a different load button, so they probably don't care about the previous thing.
Approach #2: Barrier Checks
Inside our modernized `fetchAndFlash()`, we use the `await` keyword three times, and only really for two different reasons:
- to do the network fetch
- to flash after waiting 5 seconds
After both these points, we could stop and ask: "hey, are we still the most active task? The thing the user most recently wanted to do?"
We can do this by marking each distinct operation with a nonce. This means creating a unique object, storing it both locally and globally, and seeing if the global version diverges from the local one because another operation has started.
Here's our updated `fetchAndFlash()` method:
```js
let globalFetchAndFlashNonce;

async function fetchAndFlash(page) {
  const localNonce = globalFetchAndFlashNonce = new Object();

  const response = await fetch('/api/info?p=' + page);
  const json = await response.json();

  // IMMEDIATELY check
  if (localNonce !== globalFetchAndFlashNonce) { return; }

  infoNode.innerHTML = json.html;

  await new Promise((resolve) => setTimeout(resolve, 5000));

  // IMMEDIATELY check
  if (localNonce !== globalFetchAndFlashNonce) { return; }

  flashForAttention(infoNode);
}
```
This works fine, but it's a bit of a mouthful. It's also not easy to generalize, and you have to remember to add checks everywhere it matters!

There is one way, though: using generators to generalize for us.
Background: Generators
While `await` defers execution until the thing it's waiting for finishes (in our case, either a network request or just a timeout), a generator function basically does the opposite, handing execution back to wherever it was called from.
Confused? It's worth a quick rehash:
```js
function* myGenerator() {
  const finalOut = 300;
  yield 1;
  yield 20;
  yield finalOut;
}

for (const x of myGenerator()) {
  console.info(x);
}

// or, slightly longer (but exactly the same output)
const iterator = myGenerator();
for (;;) {
  const next = iterator.next();
  if (next.done) {
    break;
  }
  console.info(next.value);
}
```
Both versions of this program will print 1, 20 and 300. What's interesting is that I can do whatever else I like inside either `for` loop, including `break` early, and all the state inside `myGenerator` stays the same: any variables I declare, and where I'm up to.
It's not visible here, but the code calling the generator (specifically, the `.next()` function of the iterator it returns) can also resume it with a value. We'll see how soon.
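Here's a toy example (my own, not from the post) of resuming a generator with a value through `.next()`:

```js
function* doubler() {
  const a = yield 'first'; // resumes with whatever .next() passes in
  const b = yield a * 2;
  yield b * 2;
}

const it = doubler();
console.info(it.next().value);   // "first" (the first .next() just starts it)
console.info(it.next(10).value); // 20, because a === 10
console.info(it.next(21).value); // 42, because b === 21
```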
We can use these parts together to just not continue working on some task if we decide to stop, and also to resume execution with some output. Hmm, sounds perfect for our problem!
The Solution
Let's rewrite `fetchAndFlash()` for the last time. We literally just change the function type itself and swap `await` for `yield`; the caller can do the waiting for us, as we'll see next:
```js
function* fetchAndFlash(page) {
  const response = yield fetch('/api/info?p=' + page);
  const json = yield response.json();
  infoNode.innerHTML = json.html;
  yield new Promise((resolve) => setTimeout(resolve, 5000));
  flashForAttention(infoNode);
}
```
This code doesn't really make sense right now, and it'll crash if we try to use it directly. The point of yielding each `Promise` is that now, some function that calls this generator can do the `await` for us, including checking a nonce. You just don't have to care about inserting those checks whenever you want to wait for something: you just have to use `yield`.
And most importantly, because this method is now a generator, not an `async` function, the `await` keyword is actually an error. This is the absolute best way to ensure you write correct code!
What is that function we need? Well, here it is: the real magic of this post:
```js
function makeSingle(generator) {
  let globalNonce;
  return async function(...args) {
    const localNonce = globalNonce = new Object();

    const iter = generator(...args);
    let resumeValue;
    for (;;) {
      const n = iter.next(resumeValue);
      if (n.done) {
        return n.value; // final return value of passed generator
      }

      // whatever the generator yielded, _now_ run await on it
      resumeValue = await n.value;
      if (localNonce !== globalNonce) {
        return; // a new call was made
      }
      // next loop, we give resumeValue back to the generator
    }
  };
}
```
It's magic, but hopefully it also makes sense. We call the passed generator to get an iterator. We then `await` on every value it yields, resuming it with the resulting value, like a network response, until the generator is done. Importantly, this lets us generalize our ability to check the global nonce against the local one after each async operation.
An extension: return a special value if a new call was made, as it's useful to know whether an individual call was cancelled. In the sample gist I return a `Symbol`, a unique object that you can compare against.
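As a sketch of that extension (the `cancelled` name is mine; the gist may differ in detail), `makeSingle` can return a shared `Symbol` instead of `undefined`:

```js
// sentinel returned when a call is superseded by a newer one
const cancelled = Symbol('cancelled');

function makeSingle(generator) {
  let globalNonce;
  return async function(...args) {
    const localNonce = globalNonce = new Object();
    const iter = generator(...args);
    let resumeValue;
    for (;;) {
      const n = iter.next(resumeValue);
      if (n.done) {
        return n.value;
      }
      resumeValue = await n.value;
      if (localNonce !== globalNonce) {
        return cancelled; // a newer call took over
      }
    }
  };
}
```

Callers can then compare the result against `cancelled` to tell a superseded call apart from a completed one.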
Finally, we actually use `makeSingle` to wrap up our generator for others to use, so now it works just like a regular async method:
```js
// replaces fetchAndFlash so all callers use it as an async method
fetchAndFlash = makeSingle(fetchAndFlash);

// ... later, call it
loadButton.onclick = () => {
  const pageToLoad = pageToLoadInput.value;
  fetchAndFlash(pageToLoad); // will cancel previous work
};
```
Hooray! Now, you can call `fetchAndFlash()` from wherever you like, and know that any previous calls will cancel as soon as possible.
Aside: Abortable Fetch
Keen folks might note that what I've covered above just cancels a method, but doesn't abort any in-flight work. I'm talking about `fetch`, which has a somewhat-supported way to abort the network request. Aborting might save your users bandwidth if the async function is, say, downloading a really large file; cancellation alone wouldn't stop the download, we'd just bail out once the file has already eaten up precious bytes.
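The standard mechanism for this is `AbortController` with the `signal` option to `fetch`. Here's a sketch of wiring it into the generator version; the controller bookkeeping is my own addition, and `infoNode` and `flashForAttention` are the same placeholders used throughout the post:

```js
let currentController = null;

function* fetchAndFlashAbortable(page) {
  // kill the previous network request, if one is still in flight
  if (currentController) {
    currentController.abort();
  }
  currentController = new AbortController();

  const response = yield fetch('/api/info?p=' + page, {
    signal: currentController.signal,
  });
  const json = yield response.json();
  infoNode.innerHTML = json.html;
  yield new Promise((resolve) => setTimeout(resolve, 5000));
  flashForAttention(infoNode);
}
```

Note that an aborted `fetch` rejects its promise, so the wrapper from `makeSingle` would also need to catch that rejection and treat it as a cancellation.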
Done
If you've read this far, you've hopefully thought a bit more about the way JavaScript works.

JS can't block when you need to do asynchronous work, multiple calls to your methods can happen at once, and you can have strategies to deal with that: either chaining or, as the whole thesis of this post goes, cancelling previous calls.

Thanks for reading!