
The NodeJS Error: EMFILE, Too Many Open Files is a common pain point for developers working with file-intensive systems. This extensive guide dives deep into the causes, real-world scenarios, and solutions—including best practices—for debugging and resolving this error efficiently.
Throughout this article, you’ll find:
- Clear explanations broken down step by step
- Annotated and copy-ready code snippets
- Actionable fixes for devs at every level
If you’d like to skip directly to solutions, feel free to scroll to the “How to Fix EMFILE in NodeJS” section—but you’ll miss valuable context.
What Is the EMFILE Error?
In Unix-like systems, every opened file, including sockets and pipes, uses a file descriptor (FD). Each process is limited to a maximum number of open FDs. When a process exceeds this OS-imposed limit, the operating system throws EMFILE: Too Many Open Files, preventing further file operations.
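The exact ceiling varies by OS and configuration. As a rough, POSIX-only sketch, you can ask the shell for the current soft limit from inside Node (the check-limit.js name is just illustrative):
// check-limit.js: rough sketch, POSIX shells only
const { execSync } = require('child_process');
// 'ulimit -n' is a shell builtin; execSync runs the command through /bin/sh
const softLimit = execSync('ulimit -n').toString().trim();
console.log(`Soft file-descriptor limit: ${softLimit}`);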
Why It Occurs in Node.js
Node.js’s asynchronous, non-blocking I/O encourages opening many files at once—for batch reads, scraping, or streaming. If you fire off hundreds or thousands of parallel file operations without controlling concurrency, you quickly hit that FD limit.
Where You’ll Encounter It
This error commonly arises in scenarios like:
- Batch file read/write loops
- Directory-wide recursive filesystem operations
- Image and log I/O heavy workloads
- Watchers (e.g., Metro, webpack, React Native)—file monitoring can overwhelm the descriptor limit.
What Does EMFILE Mean?
EMFILE literally means: your process has too many files open. It’s not specific to Node; it’s an operating-system-level error indicating your app has hit a resource limit.
Sample stack trace:
Error: EMFILE: too many open files, open 'src/data/file1.txt'
    at Object.openSync (fs.js:xxx)
    at Object.readFileSync (fs.js:xxx)
    at /project/scripts/process.js:45:13
    ...
You’ll often see file functions like fs.readFile, fs.open, or fs.watch triggering it.
Common Scenarios That Trigger EMFILE
- Untamed fs loops: spinning through hundreds of files without concurrency control.
- Hanging file descriptors: forgetting to close opened files or streams.
- Recursive directory processing: using fs.readdir deeply across thousands of nested paths.
- Massive I/O tasks: image resizing, bulk log writes, and similar workloads.
- File watchers in large codebases: React Native’s Metro or webpack watchers hitting descriptor limits.
Demonstration: Triggering EMFILE
// example-trigger.js
const fs = require('fs');

for (let i = 0; i < 10000; i++) {
  fs.readFile(`data/file${i}.txt`, (err, data) => {
    if (err && err.code === 'EMFILE') {
      console.error('EMFILE limit reached at iteration', i);
    }
    // ...
  });
}
This naive loop opens thousands of simultaneous file reads—guaranteed to hit the descriptor ceiling on most systems.
How to Fix EMFILE in NodeJS?
1. Use graceful-fs
A seamless, drop-in override that queues fs operations and retries on transient errors such as EMFILE:
const fs = require('graceful-fs'); // replaces native 'fs'

// use it exactly like fs:
fs.readFile('path/file.txt', (err, data) => {
  if (err) console.error(err);
});
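Requiring graceful-fs only changes the module you imported; third-party dependencies that require fs directly still get the native module. For that case, graceful-fs also exposes a gracefulify() helper that patches the built-in fs in place; a minimal sketch:
const realFs = require('fs');
const gracefulFs = require('graceful-fs');

// patch the native fs module once, early in startup, so every module that
// requires 'fs' picks up graceful-fs's queueing and retry behavior
gracefulFs.gracefulify(realFs);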
2. Throttle Using async or p-limit
A) async.queue
const fs = require('fs');
const async = require('async');

// files: an array of file paths to read
const q = async.queue((file, cb) => {
  fs.readFile(file, 'utf8', cb);
}, 5); // only 5 files processed at once

files.forEach(f => q.push(f, (err, data) => {
  if (err) console.error(`Error reading ${f}`, err);
}));
B) p-limit
import fs from 'fs';
import pLimit from 'p-limit';

const limit = pLimit(5); // max concurrency = 5

// files: an array of file paths to read
const tasks = files.map(file =>
  limit(() =>
    fs.promises.readFile(file, 'utf8')
      .then(data => console.log(`Read ${file}`))
  )
);

await Promise.all(tasks);
Simple and flexible.
3. Properly Close File Descriptors
Always close descriptors and streams, especially when using fs.open or createWriteStream:
const fd = fs.openSync('file.txt', 'r');
// ... use fd ...
fs.closeSync(fd); // mandatory!
Streams handle closure automatically under normal use—but monitor leaks in unusual workflows.
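With the promise-based API, a try/finally block is a tidy way to guarantee the handle is released even if the read throws; a minimal sketch using fs.promises (the 64-byte read is arbitrary):
const fsp = require('fs').promises;

async function readFirstBytes(path) {
  const handle = await fsp.open(path, 'r'); // FileHandle wraps the descriptor
  try {
    const { buffer, bytesRead } = await handle.read(Buffer.alloc(64), 0, 64, 0);
    return buffer.subarray(0, bytesRead);
  } finally {
    await handle.close(); // always release the descriptor, even on errors
  }
}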
4. Increase OS File Descriptor Limit (ulimit)
Linux/Mac:
ulimit -n 8192   # temporary, current shell only
sudo nano /etc/security/limits.conf
# add:
# youruser soft nofile 8192
# youruser hard nofile 16384
# then log out and back in (or reboot)
This raises the limits system-wide; use it only after you’ve optimized concurrency.
5. Use Streams, Not Bulk Reads
Prefer streams for large files instead of loading entire files into memory:
const fs = require('fs');

const rs = fs.createReadStream('bigfile.bin');
const ws = fs.createWriteStream('copy.bin');

rs.pipe(ws);
ws.on('finish', () => console.log('Copied successfully.'));
This approach conserves descriptors, buffers, and memory.
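For the same copy with built-in error handling, stream.pipeline (the promise variant shown here assumes Node 15+) destroys both streams and releases their descriptors if anything fails; a minimal sketch:
const fs = require('fs');
const { pipeline } = require('stream/promises');

async function copyFile(src, dest) {
  // pipeline wires the streams together and cleans both up on error or completion
  await pipeline(fs.createReadStream(src), fs.createWriteStream(dest));
  console.log('Copied successfully.');
}

copyFile('bigfile.bin', 'copy.bin').catch(err => console.error(err));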
Best Practices to Prevent EMFILE
- Prefer streams for large I/O
- Monitor FD usage (lsof -p PID; see the sketch after this list)
- Limit concurrency (async queues, p-limit)
- Always close file handles/streams
- Use graceful-fs for built-in retry and queuing
- Raise ulimit -n only as a last resort
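On Linux you can also count open descriptors from inside the process by listing /proc/self/fd; a minimal, Linux-only sketch (on macOS, stick with lsof):
const fs = require('fs');

function openFdCount() {
  // every entry in /proc/self/fd is a descriptor currently held by this process
  return fs.readdirSync('/proc/self/fd').length;
}

// log descriptor usage every 5 seconds while debugging
setInterval(() => console.log(`Open descriptors: ${openFdCount()}`), 5000);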
FAQs
Q: What is EMFILE in Node.js?
A: It’s an operating system error thrown when your app opens more file descriptors than allowed.
Q: How to fix “too many open files” error?
A: Throttle concurrency, use streams, close descriptors properly, use graceful-fs, or raise ulimit.
Q: Is increasing ulimit safe?
A: Within reason, yes. But raising the limit only increases a number; it doesn’t fix code-level leaks or unclosed resources.
Q: How to prevent EMFILE in file-intensive apps?
A: Adopt the practices above: controlled concurrency, streams, and proper resource cleanup.
Q: Can EMFILE crash the Node.js app?
A: Yes. Uncaught EMFILE errors (especially with fs.readFileSync) can bring down the entire process.
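As a minimal sketch, wrapping a synchronous read in try/catch lets an EMFILE surface as a handled error instead of an uncaught exception (the config.json path is just illustrative):
const fs = require('fs');

let config;
try {
  config = fs.readFileSync('config.json', 'utf8');
} catch (err) {
  if (err.code === 'EMFILE') {
    // descriptor limit reached elsewhere in the process: degrade gracefully
    console.error('Too many open files; retry once load drops');
  } else {
    throw err; // unrelated errors should still propagate
  }
}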
Conclusion
Node.js’s asynchronous nature opens the door to high I/O performance, but also to descriptor limits. When you encounter NodeJS Error: EMFILE, Too Many Open Files, follow this prioritized action list:
- Throttle work (async, p-limit)
- Use streams, not bulk operations
- Leverage graceful-fs for queuing
- Monitor and close FDs correctly
- Increase ulimit only after code fixes
By applying these, you’ll build more resilient, high-performance apps that scale without crashing. For deep I/O tasks or file streaming services, embracing best practices not only fixes problems but prevents them.