Ever wonder how JavaScript handles async code without falling apart?
That’s the event loop doing its thing. It takes care of scheduling and running tasks and callbacks in a non-blocking way, which keeps JavaScript responsive even when it’s dealing with slower operations like file I/O or network requests.
But before we dive into how the event loop actually works, let’s take a step back and look at how Node.js operates under the hood. That bit of context makes it way easier to see why the event loop is such a big deal.
So, what’s going on under the hood with Node.js?
At its core, Node.js is a runtime environment that lets us run JavaScript outside the browser. One of the coolest things about it is how it handles I/O operations: reading files, talking to databases, making network requests. These kinds of tasks are naturally slow, and in most systems they can really bog things down.
In traditional blocking I/O models, every time an I/O operation happens, the thread just… waits. It’s basically sitting around twiddling its thumbs until the task finishes. Depending on what you’re doing, that could be a few milliseconds, or it could drag on for minutes.
Now imagine you’re building a web server with that setup. If you’re using blocking I/O, the server has to stop and wait every time it handles a request. That means only one user at a time can get through. Not exactly scalable, right?
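Here's a minimal sketch of what that looks like in practice, assuming a hypothetical data.txt sitting next to the script. Because readFileSync blocks the whole thread, this server can only work through requests one at a time:

```javascript
const http = require('http');
const fs = require('fs');

// A deliberately blocking server: readFileSync halts the whole thread until
// the file has been read, so requests are handled strictly one at a time.
const server = http.createServer((req, res) => {
  const data = fs.readFileSync('data.txt', 'utf8'); // 'data.txt' is a placeholder path
  res.end(data);
});

server.listen(3000, () => {
  console.log('Blocking server listening on port 3000');
});
```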
The traditional fix? Use multiple threads to handle blocking operations simultaneously, giving each request its own thread to sit and wait in.
But here’s the catch: this model doesn’t scale all that well.
Spawning new threads eats up memory and CPU, which gets expensive fast, and you can only serve as many concurrent users as you have threads. Not great.
And to make things worse, those threads are mostly just sitting there blocked and waiting for I/O to finish, not using the CPU, not doing useful work.
The actual I/O is handled by the OS and hardware, not the thread itself. So from a performance standpoint, that’s a lot of wasted potential.
This is where non-blocking I/O comes in.
It allows us to initiate I/O operations without waiting for them to finish. Instead, we continue executing other code, and once the I/O is done, we get notified (via a callback, a promise, etc.).
That way, the CPU stays busy doing useful work while the slow I/O happens in the background, and the system can handle tons of users way more efficiently.
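A quick sketch of the non-blocking version of the same idea: the read is kicked off, a callback is handed over, and the rest of the script keeps running in the meantime (the file being read here is just the script itself):

```javascript
const fs = require('fs');

console.log('Kicking off the read...');

// Start the I/O and hand over a callback; the thread is free to keep going.
fs.readFile(__filename, 'utf8', (err, data) => {
  if (err) throw err;
  console.log(`Read ${data.length} characters once the I/O finished`);
});

// This line runs long before the file contents arrive.
console.log('Still doing useful work while the file is being read');
```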
Node.js achieves non-blocking I/O through a combination of:
- The event loop
- Callbacks, Promises, and async/await
- libuv, a C library that handles asynchronous operations under the hood
- A thread pool for certain I/O tasks that can’t be handled asynchronously at the OS level
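That last point is easy to see in action. The sketch below leans on crypto.pbkdf2, one of the operations Node hands off to libuv's thread pool (which defaults to 4 threads and can be resized with the UV_THREADPOOL_SIZE environment variable); the exact timings are illustrative, but with the default pool the fifth task has to wait for a free thread:

```javascript
const crypto = require('crypto');

// pbkdf2 is one of the operations libuv offloads to its thread pool
// (default size 4, configurable via the UV_THREADPOOL_SIZE env variable).
const start = Date.now();

for (let i = 1; i <= 5; i++) {
  crypto.pbkdf2('password', 'salt', 100000, 64, 'sha512', () => {
    // With the default pool, the first four callbacks land close together,
    // while the fifth has to wait for a thread to free up.
    console.log(`task ${i} done after ${Date.now() - start}ms`);
  });
}
```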
The Event Loop in Node.js is heavily inspired by the Reactor Pattern.
The Reactor Pattern is a software design pattern commonly used to handle non-blocking I/O operations in a single-threaded, event-driven architecture. It works by listening for multiple I/O events, then dispatching the appropriate handlers (callbacks) when events are ready to be processed without blocking the main thread.
This pattern allows high-performance systems to handle many simultaneous I/O operations efficiently, using minimal threads.
At its core, the event loop is the piece that ties it all together. It constantly checks for pending tasks, I/O events, or callbacks that need to be executed, and processes them in a way that keeps the application responsive and efficient.
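To make the pattern concrete, here's a toy reactor. It's only a simulation to show the shape of the idea, not how libuv actually implements it: handlers are registered per event, a queue stands in for the event demultiplexer, and a single dispatch loop invokes whichever handlers are ready:

```javascript
// A toy reactor: handlers are registered per event, a queue stands in for the
// event demultiplexer, and one dispatch loop invokes whichever handlers are ready.
class Reactor {
  constructor() {
    this.handlers = new Map(); // event name -> callback
    this.readyQueue = [];      // events whose I/O has completed
  }

  register(event, handler) {
    this.handlers.set(event, handler);
  }

  // Called by an event source when its I/O has finished.
  notify(event, data) {
    this.readyQueue.push({ event, data });
  }

  // One turn of the loop: dispatch everything that is currently ready.
  dispatch() {
    while (this.readyQueue.length > 0) {
      const { event, data } = this.readyQueue.shift();
      const handler = this.handlers.get(event);
      if (handler) handler(data);
    }
  }
}

const reactor = new Reactor();
reactor.register('file-read', (data) => console.log('handled:', data));

// Simulate an I/O source signalling completion, then run one loop turn.
reactor.notify('file-read', 'contents of the file');
reactor.dispatch(); // -> handled: contents of the file
```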
When JavaScript runs, synchronous code executes immediately on the call stack. Asynchronous tasks like setTimeout or I/O operations (handled via Web APIs in the browser or libuv in Node.js) are offloaded, and their callbacks are placed in the macrotask queue once they complete.
The callback passed to .then(), on the other hand, moves to the microtask queue once its promise settles. Whenever the call stack is empty, the event loop drains all microtasks before handling the next macrotask.
This cycle ensures that .then() callbacks and async/await continuations resume before timers and I/O callbacks, keeping JavaScript efficient and non-blocking.
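You can see that ordering with a few lines of code; the promise's microtask jumps ahead of the zero-delay timer every time:

```javascript
console.log('sync 1');

setTimeout(() => console.log('macrotask: setTimeout'), 0);

Promise.resolve().then(() => console.log('microtask: promise .then'));

console.log('sync 2');

// Output:
// sync 1
// sync 2
// microtask: promise .then
// macrotask: setTimeout
```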
Node.js’s event loop runs in several phases, each handling different types of callbacks in a specific order.
These phases include:
- Timers (for setTimeout and setInterval callbacks)
- Poll (where most asynchronous I/O callbacks run)
- Check (for setImmediate callbacks)
- Close callbacks (like socket closures)
Between these phases, the event loop also processes microtasks, such as process.nextTick callbacks, promise resolutions, and async/await continuations, which always run before the loop moves on to the next phase (with nextTick callbacks drained ahead of promise microtasks).
Understanding these phases helps explain why some callbacks run before others and how Node.js efficiently manages asynchronous operations without blocking.
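One consequence of the phase order is worth seeing in isolation. Inside an I/O callback you're sitting in the poll phase, so the check phase (setImmediate) always comes before the next timers phase (setTimeout), which makes the following snippet deterministic:

```javascript
const fs = require('fs');

// Inside an I/O callback we're in the poll phase, so the check phase
// (setImmediate) always runs before the next timers phase (setTimeout).
fs.readFile(__filename, () => {
  setTimeout(() => console.log('timers phase: setTimeout'), 0);
  setImmediate(() => console.log('check phase: setImmediate'));
});

// Prints:
// check phase: setImmediate
// timers phase: setTimeout
```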
Now that you understand the event loop phases, here’s another twist:
When Node.js starts, your main script runs to completion before the event loop begins its first cycle. And when that cycle does start, a zero-delay setTimeout scheduled from the main script isn't guaranteed to be ready in the timers phase (its delay is clamped to at least 1 ms), so whether it fires before or after a setImmediate from the main script can vary from run to run.
Now try to predict the order in which the numbers will be logged without scrolling down to see the answer.
This example mixes synchronous code, microtasks (process.nextTick, Promises), macrotasks (setTimeout, setImmediate), and even an I/O operation (fs.readFile), all of which run in different phases of Node.js’s event loop.
```javascript
const fs = require('fs');

console.log(1);

setTimeout(() => {
  console.log(2);
}, 0);

setImmediate(() => {
  console.log(3);
});

fs.readFile(__filename, () => {
  console.log(4);

  setTimeout(() => {
    console.log(5);
  });

  setImmediate(() => {
    console.log(6);
  });

  process.nextTick(() => {
    console.log(7);
  });

  Promise.resolve().then(() => {
    console.log(8);
  });
});

process.nextTick(() => {
  console.log(9);
});

Promise.resolve().then(() => {
  console.log(10);
});

console.log(11);
```
Alright, let’s break this down. First, the synchronous logs 1 and 11 run right away, nothing fancy there. Then we hit process.nextTick(9) and Promise.resolve().then(10), which are both microtasks, but nextTick always runs first.
After that, things get interesting. On the first pass through the timers phase, the 0 ms timer (remember, it’s clamped to at least 1 ms) usually hasn’t expired yet, so on most runs setImmediate(3) fires before setTimeout(2), though that particular pair can swap depending on how quickly the process starts up.
Now the I/O callback from readFile fires, logging 4, and inside it we’ve got another round of microtasks: nextTick(7) and a resolved promise logging 8.
Finally, we wrap up with the remaining macrotasks scheduled during that callback: setImmediate(6) first (check phase), then setTimeout(5) on the next trip through the timers phase.
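Putting it all together, a typical run looks like this (with the caveat about 3 and 2 above):

```javascript
// Typical output (3 and 2 can swap on some runs):
// 1, 11, 9, 10, 3, 2, 4, 7, 8, 6, 5
```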
Tricky, right? Especially how scheduling setTimeout and setImmediate from inside an I/O callback makes setImmediate reliably come out ahead.
Understanding the event loop is key to mastering asynchronous programming in JavaScript and Node.js.
It’s the engine that keeps your applications responsive, efficiently juggling synchronous and asynchronous tasks without blocking the main thread.
By grasping how tasks are queued and executed across different phases, you can write cleaner, more predictable code and avoid common pitfalls like callback hell or unexpected timing bugs.
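One of those timing pitfalls is worth calling out explicitly: because process.nextTick callbacks are drained before the loop is allowed to move on, scheduling them recursively starves all I/O. A sketch of the trap (kill it with Ctrl+C):

```javascript
const fs = require('fs');

// process.nextTick callbacks are drained before the event loop is allowed to
// continue, so scheduling them recursively starves I/O entirely.
function starve() {
  process.nextTick(starve);
}

fs.readFile(__filename, () => {
  // The poll phase is never reached, so this never runs.
  console.log('you will never see this');
});

starve();
```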
Keep experimenting with these concepts, and soon the event loop will feel less like magic and more like a powerful tool in your developer toolkit.
