Chunked Processing
Process items in parallel chunks for optimal throughput. Control chunk size to balance performance and resource usage.
Process in Chunks is a convenient way to process large collections of data in manageable chunks, with built-in error handling, throttling, and TypeScript support.
It offers two main functions: processInChunks, which processes items individually with parallel execution within each chunk, and processInChunksByChunk, which processes entire chunks at once. The latter is useful for batch operations such as bulk database inserts or API calls that accept multiple items.
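The difference between the two modes can be illustrated with a minimal, self-contained sketch. This is not the library's implementation, and the `chunkSize` parameter here is an assumption for illustration; it only shows the per-item vs. per-chunk semantics described above.

```typescript
// Split an array into consecutive chunks of at most `size` items.
function toChunks<T>(items: T[], size: number): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Per-item mode: each item within a chunk is handled in parallel,
// and chunks run one after another.
async function processItems<T, R>(
  items: T[],
  handler: (item: T) => Promise<R>,
  chunkSize: number,
): Promise<R[]> {
  const results: R[] = [];
  for (const chunk of toChunks(items, chunkSize)) {
    results.push(...(await Promise.all(chunk.map(handler))));
  }
  return results;
}

// Per-chunk mode: the handler receives a whole chunk at once,
// e.g. for a bulk database insert or a batch API call.
async function processChunks<T, R>(
  items: T[],
  handler: (chunk: T[]) => Promise<R[]>,
  chunkSize: number,
): Promise<R[]> {
  const results: R[] = [];
  for (const chunk of toChunks(items, chunkSize)) {
    results.push(...(await handler(chunk)));
  }
  return results;
}
```

In the per-chunk mode the handler is invoked once per chunk rather than once per item, so a collection of 1,000 items with a chunk size of 100 triggers 10 calls instead of 1,000.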
```typescript
import { processInChunks } from "process-in-chunks";

const results = await processInChunks(
  [1, 2, 3, 4, 5],
  async (item) => item * 2,
);

console.log(results); // [2, 4, 6, 8, 10]
```
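The throttling mentioned above can be sketched as a pause between chunks. This is an illustrative, self-contained sketch, not the library's documented API; the `chunkSize` and `delayMs` parameter names are assumptions.

```typescript
// Resolve after `ms` milliseconds.
function sleep(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

// Process items in parallel within each chunk, waiting `delayMs`
// between chunks to avoid overwhelming a downstream service.
// (Parameter names are assumptions, not the library's API.)
async function processWithDelay<T, R>(
  items: T[],
  handler: (item: T) => Promise<R>,
  chunkSize: number,
  delayMs: number,
): Promise<R[]> {
  const results: R[] = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    const chunk = items.slice(i, i + chunkSize);
    results.push(...(await Promise.all(chunk.map(handler))));
    // Pause before the next chunk (skip after the last one).
    if (i + chunkSize < items.length) await sleep(delayMs);
  }
  return results;
}
```

Pacing chunks this way trades total runtime for a bounded request rate, which matters when the handler calls a rate-limited API.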