
Async/Await in forEach Loop: Issues & Fixes in JS

Learn why using async/await inside JavaScript's forEach doesn't wait for promises, causing unhandled rejections and early completion, and how to fix it with for...of, Promise.all, or Promise.allSettled for sequential or parallel processing.


What are the potential issues with using async/await inside a forEach loop in JavaScript? I’m trying to loop through an array of files and await on the contents of each file.

javascript
import fs from 'fs-promise'

async function printFiles () {
  const files = await getFilePaths() // Assume this works fine

  files.forEach(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
}

printFiles()

This code does work, but could something go wrong with this approach? I had someone tell me that you’re not supposed to use async/await in a higher-order function like this, so I just wanted to ask if there was any issue with this pattern.

Using async/await inside Array.prototype.forEach (an “async forEach”) runs the async callbacks, but forEach neither waits for nor collects the promises they return, so printFiles resolves before the file reads finish — and that can lead to unhandled rejections, surprising ordering, or excessive parallelism. Use a for…of loop to await each item sequentially, or collect the promises with Promise.all / Promise.allSettled (or add batching/p-limit) to run work in parallel while still handling errors and bounding concurrency.



Why async/await and forEach don’t mix

Array.prototype.forEach is synchronous: it calls your callback for each item and returns immediately. If the callback is async it returns a promise — but forEach ignores that promise. So even though each callback uses await internally, the outer function doesn’t wait for those internal awaits to complete.

Why does that matter? Because you often want to know when all reads finished (or want them to run one-by-one). With your code you start a bunch of async operations and then printFiles can resolve before any (or all) of those reads finish. Also, if one of those reads rejects and you didn’t catch it inside the callback, you can get unhandled promise rejections.
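You can see this without touching the filesystem. In the sketch below, delay stands in for fs.readFile as a simulated async read; the enclosing function resolves before any callback finishes:

```javascript
// Simulation: delay(ms, value) stands in for an async read.
const delay = (ms, value) =>
  new Promise(resolve => setTimeout(() => resolve(value), ms))

const order = []

async function demo () {
  [1, 2, 3].forEach(async (n) => {
    const value = await delay(10, n)
    order.push(value) // runs long after demo() has resolved
  })
  order.push('demo returned') // forEach came back immediately
}

demo().then(() => {
  console.log(order) // logs ['demo returned'], none of the "reads" are done yet
})
```

Only later, once the timers fire, do 1, 2, and 3 get pushed. This is exactly the gap between "printFiles resolved" and "all files were read".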



Demonstration: what happens with your code

Your original example:

javascript
import fs from 'fs-promise'

async function printFiles () {
  const files = await getFilePaths() // Assume this works fine

  files.forEach(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
}

printFiles()

Execution flow (simplified):

  • printFiles awaits getFilePaths() and gets the file list.
  • files.forEach calls the async callback for each file synchronously; each callback returns a promise.
  • forEach returns immediately (it doesn’t return/await the promises).
  • printFiles reaches its end and the Promise it returned resolves — long before the inner reads may have completed.
  • The async callbacks continue resolving in the background; logs appear later, and any thrown errors become unhandled unless caught.

So yes — the code “works” in that reads will run and logs will appear, but:

  • You don’t know when all files are processed.
  • Uncaught errors inside the callbacks may surface as unhandled rejections.
  • If you rely on ordering (first file logged first), parallel completion may break that expectation.
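The unhandled-rejection case is easy to reproduce. The sketch below registers Node's process-level hook only to make the escaping rejection visible; in real code you would catch inside the callback instead:

```javascript
// Each async callback that throws produces a rejected promise
// that nothing awaits, so it reaches the process-level hook.
const escaped = []
process.on('unhandledRejection', err => {
  escaped.push(err.message)
})

const files = ['a.txt', 'missing.txt']
files.forEach(async (file) => {
  if (file === 'missing.txt') {
    throw new Error(`cannot read ${file}`) // nobody awaits this promise
  }
})
```

Without that hook, modern Node terminates the process when the rejection surfaces.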



Correct patterns: sequential, parallel, and controlled concurrency

Pick the pattern based on your goals: do you need serial order, maximum throughput, or a middle ground?

  1. Sequential — preserve order, simple error handling
javascript
import fs from 'node:fs/promises'

async function printFilesSequential () {
  const files = await getFilePaths()
  for (const file of files) {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  }
}
  • Each file is read one after the other.
  • Simple to reason about and good when order matters or when you must limit resource use.
  2. Parallel — fastest (but beware errors and resource use)
javascript
import fs from 'node:fs/promises'

async function printFilesParallel () {
  const files = await getFilePaths()
  await Promise.all(files.map(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  }))
}
  • Promise.all waits until all reads complete, or rejects as soon as the first read fails.
  • Use this when you want maximum concurrency and can accept fail-fast behavior (or wrap each callback in try/catch).
  3. Parallel with per-item error handling — don’t fail everything for one error
javascript
import fs from 'node:fs/promises'

async function printFilesAllSettled () {
  const files = await getFilePaths()
  const results = await Promise.allSettled(files.map(file => fs.readFile(file, 'utf8')))
  for (const r of results) {
    if (r.status === 'fulfilled') console.log(r.value)
    else console.error('read failed:', r.reason)
  }
}
  4. Controlled concurrency (recommended for large arrays)
    Chunking approach (no extra deps):
javascript
import fs from 'node:fs/promises'

async function printFilesWithLimit (limit = 5) {
  const files = await getFilePaths()
  for (let i = 0; i < files.length; i += limit) {
    const chunk = files.slice(i, i + limit)
    await Promise.all(chunk.map(file =>
      fs.readFile(file, 'utf8')
        .then(contents => console.log(contents))
        .catch(err => console.error(`Failed ${file}:`, err))
    ))
  }
}

Or use a small library like p-limit or p-queue if you’d rather avoid writing the batching logic by hand.
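If you'd like to see what such a limiter does under the hood, here is a minimal hand-rolled version (a sketch of the idea behind p-limit, not a substitute for the battle-tested library):

```javascript
// Minimal concurrency limiter, roughly what p-limit does.
// createLimiter(max) returns a function that wraps any promise-returning
// task; at most `max` wrapped tasks run at once, the rest wait in a queue.
function createLimiter (max) {
  let active = 0
  const queue = []
  const runNext = () => {
    if (active >= max || queue.length === 0) return
    active++
    const { task, resolve, reject } = queue.shift()
    task().then(resolve, reject).finally(() => {
      active--
      runNext()
    })
  }
  return task => new Promise((resolve, reject) => {
    queue.push({ task, resolve, reject })
    runNext()
  })
}

// Usage sketch with the file-reading example from above:
// const limit = createLimiter(5)
// await Promise.all(files.map(file => limit(() => fs.readFile(file, 'utf8'))))
```

Unlike chunking, a queue-based limiter starts a new task the moment any slot frees up, so slow items in one "batch" don't hold back the rest.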

Which pattern to use?

  • Need order → use for…of (sequential).
  • Need speed and can tolerate one-fail-all-fail → Promise.all.
  • Need robustness per-file → Promise.allSettled or try/catch inside map.
  • Too many files → use controlled concurrency (chunking or p-limit).



Error handling and edge cases for async/await

  • Unhandled rejections: an async callback that throws or rejects produces a rejected promise. If nothing handles it (try/catch inside the async function or a .catch on the promise), the rejection becomes “unhandled”. In Node 15 and later that terminates the process by default; older versions print a warning.
  • Ordering: Promise.all runs the work in parallel, but the array it resolves with always matches the input order, regardless of which promise settled first. Only side effects such as console.log inside the callbacks appear in completion order.
  • Process lifecycle: in many environments Node will keep the process alive until asynchronous I/O finishes; but if you need to be sure everything finished before continuing (or before exiting), you must await the controlling promise (e.g., await Promise.all(…)).
  • Sudden resource exhaustion: launching thousands of file reads at once can exhaust file descriptors or thread-pool capacity — see the performance section next.

Defensive patterns:

  • Catch per-file errors inside the callback so a single failure doesn’t produce an unhandled rejection.
  • Use Promise.allSettled when you want to wait for every item and then inspect each result.
  • Return or await the generated promises from the enclosing function so callers can wait for completion.
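Combining the first and third points, a small per-item wrapper (a sketch; safeRead is a name invented here, and the read function is injected) turns failures into plain values you can inspect after Promise.all:

```javascript
// Wrap one unit of work so a failure becomes a result object
// instead of a rejection that can take down the whole batch.
async function safeRead (read, file) {
  try {
    return { file, ok: true, contents: await read(file) }
  } catch (err) {
    return { file, ok: false, error: err }
  }
}

// Usage sketch: Promise.all never rejects here, because every mapped
// element resolves to a result object.
// const results = await Promise.all(
//   files.map(file => safeRead(f => fs.readFile(f, 'utf8'), file))
// )
```

This gets you Promise.allSettled-style robustness while keeping full control over the shape of each result.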

Performance considerations

A couple of practical notes:

  • File reads are I/O-bound. Running some reads in parallel is usually faster than strictly sequential reads. But too much parallelism can backfire: it can saturate disk throughput, run out of file descriptors, or overwhelm the runtime’s thread pool for certain fs operations.
  • Node uses a libuv thread pool for some file system calls; by default that pool has only 4 threads (adjustable via UV_THREADPOOL_SIZE), so firing thousands of simultaneous fs.readFile calls may not actually run all of them concurrently — they’ll queue — and overall throughput may suffer.
  • If you need maximum throughput while keeping the system healthy, pick a reasonable concurrency limit (for example 5–20 depending on the environment) and tune it for your deployment.

Small arrays? Just use Promise.all or for…of. Thousands of files? Batch them or use a concurrency limiter.
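If profiling shows the thread pool itself is the bottleneck, its size is an environment knob set before Node starts (the script name here is just a placeholder):

```shell
# Raise libuv's worker pool from the default 4 to 16 threads.
# Must be in the environment before the process launches;
# changing it at runtime has no effect.
UV_THREADPOOL_SIZE=16 node print-files.js
```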




Conclusion

Short answer: putting an async function inside Array.forEach is allowed syntactically, but forEach won’t wait for the promises that callback returns — so your function may finish early, errors can go unhandled, and you can create too much concurrency. Use for…of for sequential awaits, await Promise.all(files.map(…)) to run-and-wait in parallel, and use Promise.allSettled or batching/p-limit for robust per-item error handling and controlled concurrency. These patterns keep your JavaScript async code predictable and safe while using async/await.
