High severity: Apify Actors (esp. crawlers with Crawlee/Puppeteer/Playwright)

The run fails with "The Actor hit an OOM (out of memory) condition. You can resurrect it with more memory to continue where you left off." or with a JS heap out-of-memory error. The crawler log shows a "Memory is critically overloaded" warning even though system RAM appears available (e.g. 90% reported as used while free -h shows plenty free). Concurrency drops and new requests are blocked.
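As the error message suggests, the failed run can be resurrected with a larger memory allocation. A sketch using the Apify REST API's resurrect endpoint; the run ID, token, and memory value below are placeholders, and the exact query parameters should be checked against the current API reference:

```shell
# Resurrect a failed run with 4 GB of memory instead of the original allocation.
# <RUN_ID> and <API_TOKEN> are placeholders for your run ID and Apify API token.
curl -X POST \
  "https://api.apify.com/v2/actor-runs/<RUN_ID>/resurrect?memory=4096&token=<API_TOKEN>"
```

Memory must be a power of 2 between 128 MB and the per-run maximum allowed by the subscription.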

Root cause

The crawler's memory autoscaler (e.g. in PlaywrightCrawler via Crawlee) detects JS heap usage approaching or exceeding the memory allocated to the run (configurable from 128 MB to 32 GB per run, bounded by the subscription's total quota). Common drivers are high concurrency, many open browser contexts, and large in-memory data structures. A mismatch between JS heap metrics and system memory reporting can also trigger false positives, throttling the crawler despite free system RAM. In particular, RequestList loads all of its requests into RAM, so a list of millions of URLs can by itself exhaust the heap; it is unsuitable for large crawls.
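A minimal sketch of the usual mitigations, assuming the `crawlee` npm package: replace the in-memory RequestList with a disk-backed RequestQueue, cap concurrency, and lower the memory ratio at which the autoscaler scales down. The option names reflect my reading of Crawlee's docs and the `readUrlsLineByLine` helper is hypothetical; verify both against your Crawlee version:

```javascript
import { PlaywrightCrawler, RequestQueue } from 'crawlee';

// RequestQueue persists requests to storage instead of holding them all in RAM.
const requestQueue = await RequestQueue.open();

// Enqueue URLs incrementally rather than materializing millions of Request
// objects up front. readUrlsLineByLine is a hypothetical streaming helper.
for await (const url of readUrlsLineByLine('urls.txt')) {
  await requestQueue.addRequest({ url });
}

const crawler = new PlaywrightCrawler({
  requestQueue,
  maxConcurrency: 10, // hard cap so browser contexts cannot pile up
  autoscaledPoolOptions: {
    snapshotterOptions: {
      // Scale down before memory becomes "critically overloaded".
      maxUsedMemoryRatio: 0.7,
    },
  },
});

await crawler.run();
```

If the false-positive mismatch between heap and system memory is the issue, the `CRAWLEE_MEMORY_MBYTES` environment variable can be set to tell Crawlee explicitly how much memory it may use.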

Tags: Apify, Crawlee, PlaywrightCrawler, OOM, heap out of memory, RequestList, autoscaler, concurrency
