The following piece talks about a not-so-talked-about but very important concept in JavaScript: events, a ubiquitous term in the realm of web development.
So, let's geek out, shall we? 😊
Setting the stage
When it comes to writing and organizing code, there are a few fundamental styles, formally referred to as programming paradigms. Each paradigm, equipped with its own unique principles, highlights a different approach to structuring software.
Event-driven programming (EDP) is one such approach. Here, a program waits for an event, such as a mouse click, and then responds to it by invoking specific handlers asynchronously. This non-blocking approach is effective and convenient for interactive web apps and is at the core of server-side JS environments like Node.
Consider the following Express routing snippet:
// Part of an Express app listening at port 8080
app.get('/', (req, res) => res.status(200).send('I am root'));
The above logic is entirely event-driven!
Behind the scenes, an Express app creates a server instance using Node's built-in http module. Whenever a client hits the server, it triggers a "request" event, which in turn invokes the corresponding event handler. Express abstracts this event handling away behind its middleware system to process a request.
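To make that concrete, here's a stripped-down sketch of the underlying pattern (not Express's actual source, just the bare http module responding to its "request" event):
import http from "http";
// Roughly what Express sets up for you: a bare server whose
// "request" event fires for every incoming request
const server = http.createServer();
server.on('request', (req, res) => {
  res.statusCode = 200;
  res.end('I am root');
});
server.listen(8080);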
Let us look at another common event-driven scenario:
<!-- some is a function that is called only when the user clicks the button -->
<button type="button" onclick="some()"> Click Me </button>
In the above snippet, the onclick attribute essentially attaches an event listener to the button element for "click" events. The hardware-level logic required to intercept mouse clicks is already taken care of by the browser. Thus, when a user clicks the button, the some function (the event handler) is called.
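The same wiring can also be done from JavaScript instead of the HTML attribute; a minimal sketch, assuming the button above is already in the DOM:
// some() is just a stand-in handler; the real one could do anything
function some() {
  console.log('Button clicked!');
}
// Equivalent to the onclick attribute: attach a listener for "click" events
document.querySelector('button').addEventListener('click', some);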
I hope the above examples provide enough motivation for you to read on and learn about events :)
Events
To get a sense of how EDP works in a Node.js environment, we need an instance of the EventEmitter class:
import { EventEmitter } from "events";
const eventEmitter = new EventEmitter(); // Instantiation
Now consider the following code-block and keep an eye on the inline comments:
// Define an event listener for "greet" event
eventEmitter.on('greet', (name) => console.log('Hello', name));
// Trigger a "greet" event
eventEmitter.emit('greet', 'Rob');
// Output: Hello Rob
Explanation
- The on method is used to define (or add) an event listener for a specified event such as greet. Its callback argument is the event handler, i.e. the function to be called when the event occurs.
- The formal terminology for the occurrence of any event is event emission. That being said, the emit method, as the name suggests, is used to trigger an event.
- The subsequent argument(s) in emit are used to pass data to the event handler, e.g. in our case, 'Rob' is passed to the name parameter of the callback, as shown again in the sketch below.
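In fact, emit can forward more than one piece of data; here's a quick sketch (the 'order' event and its arguments are made up purely for illustration):
// Every extra argument to emit() is forwarded to the listener, in order
eventEmitter.on('order', (item, qty) => console.log(`Ordered ${qty} x ${item}`));
eventEmitter.emit('order', 'coffee', 2);
// Output: Ordered 2 x coffee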
What if I want my program to respond to an event only once?
Just replace on with once; the mechanics remain the same.
I think the code below is self-evident. If not, feel free to put your query in the comments :)
eventEmitter.once('start', () => console.log('I am awake'));
eventEmitter.on('end', () => console.log('I am asleep'));
eventEmitter.emit('start'); // Prints 'I am awake'
eventEmitter.emit('start'); // This "start" event is ignored
eventEmitter.emit('end'); // Prints 'I am asleep'
Hide your Emissions
As I mentioned earlier, although it is not obvious from all its fancy middleware and routing mechanisms, Express is event-driven. So, what I'll try to do next is give you a sense of how frameworks like Express hide all the event-related drama.
First, take a look at the code, and then I'll explain the not-so-obvious parts.
import { EventEmitter } from "events";

// Create a child class
class MyEmitter extends EventEmitter {
  // Define a method that emits an event
  customMethod(data) {
    console.log('This is a custom method');
    this.emit('baby', data);
  }
}

// Instantiate the child class
const myEmitter = new MyEmitter();

// Define an event listener
myEmitter.on('baby', (data) => console.log('Data received:', data));

// Call the custom method
myEmitter.customMethod('passwd');
Explanation
- The MyEmitter class inherits all the attributes of EventEmitter and also has a custom method of its own.
- Within customMethod, the this keyword gives access to the emit method from the parent class to trigger a 'baby' event and pass data to the event handler.
- Next, courtesy of inheritance, the on method defines an event listener for the 'baby' event.
- The last line calls customMethod, which emits a 'baby' event, invoking the corresponding event handler. This callback receives 'passwd' as input, which is printed to the console.
Now suppose I offer the myEmitter instance in a package to the public. Then, anyone can utilize customMethod like an ordinary function without having to worry about events.
Voila!
Your event-driven logic has been abstracted :)
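From the consumer's side, that might look something like this (a hypothetical sketch; 'my-emitter-pkg' is a made-up package name):
// Hypothetical consumer code: no EventEmitter in sight
import { myEmitter } from "my-emitter-pkg";
// Looks like an ordinary method call, but internally it emits a 'baby' event
myEmitter.customMethod('hello');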
Something Practical
Okay so, first of all, I want you to do something.
Find a 1GB text file and pass it to the readFile method as shown below:
import { readFile } from "fs/promises";

async function readBigFile(file) {
  try {
    const data = await readFile(file, { encoding: 'utf-8' });
    console.log(data);
  } catch (error) {
    console.log(error);
  }
}

readBigFile('1gb.txt');
If everything goes right, you will see a cute error message that says:
RangeError: Invalid string length
In simple words, it means that when it comes to large files, the ordinary file system methods are useless. But why?! 🤔
Well, readFile tries to load the entire file, all of 1 gigabyte, into memory as one giant string, which exceeds the maximum string length that V8 (the engine behind Node.js) allows. Hence, the error.
Alright. So, how do we read the file, then?
Streams
The idea is to read the file a little bit at a time, in chunks!
Streams in Node.js are objects that enable continuous read or write operations without having to load the entire data into memory.
Consider the following code where I'm using the createReadStream method to read my huge file in chunks:
import fs from "fs";
// Creates a stream to continuously read data
const readStream = fs.createReadStream('1gb.txt');
readStream.on('data', (chunk) => console.log(chunk));
readStream.on('end', () => console.log('Read Completed!'));
readStream.on('error', (err) => console.error(err));
The syntax looks familiar, doesn’t it?
Yup! You guessed it, this whole chunks drama is event-driven.
- Every time a chunk of the file is loaded into memory, the data event is emitted and its corresponding handler prints the chunk to the console.
- When there are no more chunks left to be read, the end event is emitted, triggering the respective handler.
- To handle any errors during the whole operation, an error event handler has also been specified.
N.B. Each chunk is actually a Buffer of raw binary data, 64 KB (65,536 bytes) by default. A Buffer is a temporary storage space for binary data.
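If you'd rather receive strings instead of raw Buffers, or tune the chunk size, createReadStream accepts an options object; a small sketch (the numbers here are only illustrative):
import fs from "fs";
// Request UTF-8 strings and 1 MB chunks instead of the default 64 KB Buffers
const readStream = fs.createReadStream('1gb.txt', {
  encoding: 'utf-8',
  highWaterMark: 1024 * 1024, // chunk size in bytes
});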
Btw, with the default options your console will be flooded with raw Buffer dumps… you're welcome! 😉
Now that we can read a large file, we can use the createWriteStream method to write the contents of the file out in chunks. The following piece of code will give you a glimpse of what I'm trying to say:
import fs from "fs";

const readStream = fs.createReadStream('1gb.txt');
const writeStream = fs.createWriteStream('new.txt');

readStream.on('data', (chunk) => {
  writeStream.write(chunk.toString().toUpperCase());
});

readStream.on('end', () => {
  console.log('Read Completed!');
  writeStream.end(); // Emits a "finish" event
});

readStream.on('error', (err) => console.error(err));

// Handler for the "finish" event
writeStream.on('finish', () => console.log('Write Completed!'));
For every data event, the write method writes the uppercase UTF-8 equivalent of the Buffer chunk to the new.txt file.
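As a side note, when no per-chunk transformation is needed, the same plumbing can be expressed with the stream's built-in pipe method, which wires up the data and end events for you and also handles backpressure; a minimal sketch (the destination file name is just an example):
import fs from "fs";
// pipe() forwards every chunk from the source to the destination and
// ends the write stream automatically once the read stream finishes
fs.createReadStream('1gb.txt').pipe(fs.createWriteStream('copy.txt'));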
THE END.
Immensely grateful for your time. I truly hope I was able to offer you something of value. See you soon!
Peace 💜