30: Callbacks, Events, Promises and Async
This exercise is very large, and you must watch the video to really understand it. I am going to introduce you to all the various ways that JavaScript handles events using callbacks, promises, and a new thing called async/await. This exercise is long because I am going to actually build a version of a simple function to read a file in each style. I am then going to critique and re-factor each code sample in the video so you can see how I would redo it or simply avoid the example.
I'm going to be honest with you and say that JavaScript is absolutely insane when it comes to handling computation with callbacks and events. Node.JS famously declared that events were the easiest way to perform computations and then promptly decided that every computation would be handled with callbacks. This misunderstanding of event-based processing meant that they stumbled around for years trying to come up with various solutions to the problem of handling I/O. This means whenever you see code you are going to run into as many as four styles of programming depending on how old the code is.
What I'm hoping to do is show you all the different styles you might run into, show you the problems with each style, and show you how to work around the problems or simply avoid them. This will make the exercise large but it's a good final exercise for the first half of the book. Take your time, watch the videos, and really study the code so that you understand how this is done.
Finally, I am purposefully choosing a particular API that is difficult to use because it demonstrates this flaw in JavaScript and Node.JS. I believe that most people would simply avoid this API and use any of the others, but this kind of problem comes up often enough that you need to know how to handle it.
The Requirements
We are going to do something that is very simple:
- Open a file.
- Get its size.
- Create a buffer for that size.
- Read the contents of the file into the buffer.
In most languages (including JavaScript) you would usually do this with a single function call. However, a good test of the design of a programming language is how difficult it is to do this simple set of operations. In our tests we're going to use a simple file named test.txt to do each test.
Callback Style
In the first example I'm going to show you what is called "callback style" JavaScript code. You'll notice that the sequence of callback functions becomes a sort of nested structure where you handle what is normally a straight line process. The other significant thing with this is I have to give it a callback in order to get the final operation. In the video I talk about this particular code and the implications of that.
const fs = require('fs');

const read_file = (fname, cb) => {
  fs.stat(fname, (err, stats) => {
    fs.open(fname, 'r', (err, fd) => {
      let inbuf = Buffer.alloc(stats.size);
      fs.read(fd, inbuf, 0, stats.size, null, (err, bytesRead, buffer) => {
        cb(buffer);
      });
    });
  });
}

read_file('test.txt', (result) => {
  console.log(`Result is ${result.toString()}`);
});
When you run this code you simply see the contents of the file:
$ node "cbstart.js" test.txt
Result is I am a small file.
Callback Hell
The initial code seems not too bad, but then we have to add error handling. Suddenly the code complexity is much higher, with nested if statements inside nested function calls inside nested callbacks, each level needing its own error check. In the video I attempt to re-factor this code to possibly make it a little nicer, but you'll see that it is actually very difficult to keep this code straight.
const fs = require('fs');

const read_file = (fname, cb) => {
  fs.stat(fname, (err, stats) => {
    if(err) {
      cb(err, null);
    } else {
      fs.open(fname, 'r', (err, fd) => {
        if(err) {
          cb(err, null);
        } else {
          let inbuf = Buffer.alloc(stats.size);
          fs.read(fd, inbuf, 0, stats.size, null, (err, bytesRead, buffer) => {
            if(err) {
              cb(err, null);
            } else {
              cb(err, buffer);
            }
          });
        }
      });
    }
  });
}

read_file('test.txt', (err, result) => {
  if(err) {
    console.log(err);
  } else {
    console.log(`Result is ${result.toString()}`);
  }
});
When we run this code the output is the same, but now the code can handle an error. Try getting the file name wrong.
$ node "cbhell.js" test.txt
Result is I am a small file.
Event Style
Node.JS advocates an events based style of processing data, but frequently uses the callback style above. This confusion of the difference between simply having a callback and producing/consuming events leads to quite a few bad designs in the Node.JS ecosystem. When you code in a browser you use a mostly correct event-based system. When you code in Node.JS you typically deal with the callback style of code above.
In this code I create a first version of an event style in the same way you might find in a typical web browser. In this style you give the function a set of callbacks for each event, and it calls the proper function when something happens. You should see that it's basically the same as the callback style because I'm forced to use the Node.JS callbacks. However, now I have the option of intercepting each event as things happen and my error handling is a little easier in the next step.
const fs = require('fs');

const read_file = (fname, events) => {
  let noop = () => {};
  let onStat = events.onStat || noop;
  let onOpen = events.onOpen || noop;
  let onRead = events.onRead || noop;

  fs.stat(fname, (err, stats) => {
    onStat(stats);
    fs.open(fname, 'r', (err, fd) => {
      onOpen(fd);
      let inbuf = Buffer.alloc(stats.size);
      fs.read(fd, inbuf, 0, stats.size, null, (err, bytesRead, buffer) => {
        onRead(bytesRead, buffer);
      });
    });
  });
}

read_file('test.txt', {
  onRead: (bytesRead, buffer) => {
    console.log(`Read ${bytesRead} bytes: ${buffer.toString()}`);
  },
  onStat: (stat) => {
    console.log(`Got stats, file is ${stat.size} size.`);
  },
  onOpen: (fd) => {
    console.log(`Open worked, fd is ${fd}`);
  }
});
When you run this example you get more output because we have access to more of the events inside the function.
$ node "event.js" test.txt
Got stats, file is 19 size.
Open worked, fd is 18
Read 19 bytes: I am a small file.
Event Hell
As with the callback style, we add error handling, but it takes less code to handle the errors. The reason there isn't much more code in this version is that the previous event-based system mostly had error handling built into its structure already, and we just needed to handle the error conditions from the callbacks.
const fs = require('fs');

const read_file = (fname, events) => {
  let noop = () => {};
  let onError = events.onError || noop;
  let onStat = events.onStat || noop;
  let onOpen = events.onOpen || noop;
  let onRead = events.onRead || noop;

  fs.stat(fname, (err, stats) => {
    if(err) {
      onError(err);
    } else {
      onStat(stats);
      fs.open(fname, 'r', (err, fd) => {
        if(err) {
          onError(err);
        } else {
          onOpen(fd);
          let inbuf = Buffer.alloc(stats.size);
          fs.read(fd, inbuf, 0, stats.size, null, (err, bytesRead, buffer) => {
            if(err) {
              onError(err);
            } else {
              onRead(bytesRead, buffer);
            }
          });
        }
      });
    }
  });
}

read_file('test.txt', {
  onRead: (bytesRead, buffer) => {
    console.log(`Read ${bytesRead} bytes: ${buffer.toString()}`);
  },
  onStat: (stat) => {
    console.log(`Got stats, file is ${stat.size} size.`);
  },
  onOpen: (fd) => {
    console.log(`Open worked, fd is ${fd}`);
  },
  onError: (err) => {
    console.error(err);
  }
});
When you run this version you receive the same output as the previous version but now it should be able to handle errors.
$ node "eventhell.js" test.txt
Got stats, file is 19 size.
Open worked, fd is 18
Read 19 bytes: I am a small file.
Events vs. Callbacks
When you look at the previous examples of event versus callback styles of code you should realize there isn't too much of an advantage of one over the other. They are about the same complexity, but with events you at least have access to everything that's going on. This helps with error handling because you can receive an error callback at any part of the operation. The Node.JS standard library comes with something called "streams" that support this exact method of dealing with most of your I/O.
The thing to keep in mind is that you are usually forced to use one or the other, and it is fairly arbitrary which you have to use. You'll see that even when I start trying to use the new promises feature in JavaScript I still have to do nested callbacks so I can reuse calculations from previous function calls. I get into this more in the video so you should watch that to understand this particular criticism of this style of programming and how to deal with it.
Promise Style
After all that struggle with "events over threads" the Node.JS community finally realized it was stupid and came up with the Promise API. Promises are a compromise between having callbacks (I'm sorry, "events") and dealing with asynchronous events cleanly. The Promise system helps to linearize the previously nested callbacks. A quick example using the timeout API is this:
const sleeper = (timeout) => {
  return new Promise((resolve, reject) => {
    setTimeout(() => resolve('DONE'), timeout);
  });
}

let wait1 = sleeper(100);
wait1.then(x => console.log("Done.", x));
In this example I've created a function that gives us a timeout but returns a Promise, so we can wait for the timeout to finish. Normally you would have to pass a callback, which means the code never really waits, so promises are much better. Watch the video so you can see how promises actually work and get a better understanding of this part.
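To see the waiting in action, you can return another promise from inside .then() and the chain runs the sleeps in sequence instead of nesting callbacks. A small self-contained sketch using the same sleeper idea:

```javascript
// Same sleeper as above, repeated here so this sketch runs on its own.
const sleeper = (timeout) =>
  new Promise((resolve) => setTimeout(() => resolve('DONE'), timeout));

// Returning sleeper(20) from the first .then() makes the chain
// wait for the second timeout before the next .then() runs.
let order = [];
sleeper(20)
  .then(x => { order.push('first ' + x); return sleeper(20); })
  .then(x => { order.push('second ' + x); console.log(order.join(', ')); });
```

The steps read top to bottom, which is exactly the linearization the callback versions lacked.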
Here is how we use promises to rewrite our code:
const fs = require('fs').promises;

// you have to do nested calls any time you need the result of the previous calculation
const read_file = (fname) => {
  fs.open(fname, 'r').then((fh) => {
    fh.stat().then((stat) => {
      let buf = Buffer.alloc(stat.size);
      fh.read(buf, 0, stat.size, null)
        .then((result) => {
          console.log(`Read ${result.bytesRead} bytes: ${result.buffer.toString()}`);
        }).catch((err) => {
          console.error(err);
        });
    }).catch((err) => console.error(err));
  }).catch((err) => console.error(err));
}

read_file('test.txt');
When you run this code you see this:
$ node "promises.js" test.txt
Read 19 bytes: I am a small file.
The Secret Promise Feature
However, I sent this code to my friend Thomas who said that I didn't know about this one secret feature nobody tells you about:
const fs = require('fs').promises;

const read_file = (fname) =>
  fs.open(fname, 'r')
    .then(fh => fh.stat().then(stat => [fh, stat]))
    .then(res =>
      res[0].read(Buffer.alloc(res[1].size), 0, res[1].size, null))
    .then(result =>
      console.log(`Read ${result.bytesRead} bytes: ${result.buffer.toString()}`))
    .catch(err => console.error(err));

read_file('test.txt');
My mind boggles at how much simpler this is and the feature I didn't know is this:
If a promise returns another promise, then the next .then() works on that new promise. But, if it returns anything else that returned value becomes the parameter of the next .then().
This is confusing, so in the video I will demonstrate some code that shows how promises are working, how this code works compared to my version, and how to take advantage of this feature.
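A tiny sketch of that rule, stripped of the file-reading details: returning a plain value from .then() hands that value to the next .then(), while returning a promise makes the chain wait for it and hands over its resolved value instead.

```javascript
let final;
Promise.resolve(1)
  .then(x => x + 1)                    // plain value: the next .then() gets 2
  .then(x => Promise.resolve(x + 10))  // a promise: the chain waits, next gets 12
  .then(x => {
    final = x;
    console.log(`final: ${x}`);        // prints "final: 12"
  });
```

This is why Thomas's version can do `fh.stat().then(stat => [fh, stat])` inside a .then(): the inner promise is flattened into the outer chain rather than requiring another level of nesting.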
Async/Await Style
The final style we will explore is the async/await style of programming, which uses promises but allows you to write asynchronous code as if it were synchronous. To use this style you have to add the special keyword "async" to the function, and use the "await" keyword when you call a function that returns a promise. Other than that it handles the majority of the problems related to overly nested and complicated callback code. It also allows you to use exceptions to catch errors rather than callbacks.
let fs = require('fs').promises

const read_file = async (fname) => {
  try {
    let file = await fs.open(fname, 'r');
    let stat = await file.stat();
    let buffer = Buffer.alloc(stat.size);
    let result = await file.read(buffer, 0, stat.size, null);
    console.log(`Read ${result.bytesRead} bytes: ${result.buffer.toString()}`);
  } catch(err) {
    console.log("ERROR", err);
  }
}

// see the async.mjs version
read_file('test.txt');
When you run this code you see the output is the same and you didn't have to do any nested function calls or anything. This is a major advantage over the previous ways of doing things, but it is very new so I would expect it to change in the future.
$ node "async.js" test.txt
Read 19 bytes: I am a small file.
Modules and Async/Await
At the end of the async.js code above I didn't put await read_file('test.txt'). It still works, but it's really not correct. We'd want to add the await keyword so the script waits for the read_file function, but if we do that we get this error:
$ node async.js
Error: Command failed: node "async.js" test.txt
async.js:17
await read_file('test.txt');
^^^^^
SyntaxError: await is only valid in async functions and the top level bodies of modules
There are three ways to fix this:
- If you have to stick to regular Node.JS "common modules" then convert this final line to a Promise.
- Rename the file async.mjs. The .mjs tells node "this is a module, time to grow up."
- Add to your package.json file the line "type": "module" to say everything should be a new ES6 style module. This has far reaching implications we'll explore when we start making projects, but now you have to rename any file that needs to be a "common module" to end in .cjs. You'll see how that works in later modules.
Copy the async.js file to async.mjs, then change the first and last lines like this:
import { promises as fs } from "fs";

const read_file = async (fname) => {
  try {
    let file = await fs.open(fname, 'r');
    let stat = await file.stat();
    let buffer = Buffer.alloc(stat.size);
    let result = await file.read(buffer, 0, stat.size, null);
    console.log(`Read ${result.bytesRead} bytes: ${result.buffer.toString()}`);
  } catch(err) {
    console.log("ERROR", err);
  }
}

// This only works if your package.json has "type": "module",
// or if you name the file async.mjs.
await read_file('test.txt');
Be sure to pay special attention to that import at the top and how it's different from the usual require() you use. This is modern module syntax that we'll explore in depth in later lessons.
Now when you run it there should be no errors:
$ node "async.mjs"
Read 19 bytes: I am a small file.