Why Does Google's 2MB Crawl Limit Matter?
Google has resource size limits that can affect how your pages are indexed and ranked.
Google's crawler (Googlebot) has a practical limit on the size of individual resources it will fully process. Resources exceeding 2MB may be truncated or skipped entirely, which can affect how your page is rendered and indexed.
This is especially important for:
- Large images: Unoptimized hero images and infographics are common offenders
- JavaScript bundles: Heavy frameworks can produce bundles that exceed the limit
- CSS files: Concatenated stylesheets with unused rules can bloat file sizes
- Web fonts: Multiple font weights and character sets add up quickly
- HTML documents: Very long pages with inline assets can exceed limits
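Before worrying about how Googlebot sees your page, you can catch most offenders locally. The sketch below is a minimal, hypothetical example (the directory path and function name are assumptions, not part of any real tool) that walks a site's build output and flags any file over the 2MB threshold:

```python
import os

# The 2MB threshold discussed above (2 * 1024 * 1024 bytes)
LIMIT_BYTES = 2 * 1024 * 1024

def oversized_assets(root_dir):
    """Walk a local build directory and return (path, size) pairs
    for every file larger than LIMIT_BYTES.

    `root_dir` is a hypothetical path to your site's output folder.
    """
    results = []
    for dirpath, _, filenames in os.walk(root_dir):
        for name in filenames:
            path = os.path.join(dirpath, name)
            size = os.path.getsize(path)
            if size > LIMIT_BYTES:
                results.append((path, size))
    return results
```

Running this against a build folder before deploying gives an early warning for the image, bundle, and font problems listed above, without needing a live crawl.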
- Full Page Scan: Discovers all linked resources, including images, scripts, stylesheets, and fonts
- Size Analysis: Checks the file size of every resource against the 2MB threshold
- Pass/Fail Report: Provides clear status indicators for every resource on your page
- Actionable Fixes: Gives specific recommendations for reducing oversized resources
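The first two steps above (discovering linked resources, then checking their sizes) can be sketched in a few lines. This is a simplified illustration, not the actual scanner: it collects `img`/`script` `src` and `link` `href` URLs from a page, then issues a HEAD request for each and reads `Content-Length` (which some servers omit, so a real tool would fall back to downloading):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen

LIMIT_BYTES = 2 * 1024 * 1024  # the 2MB threshold

class ResourceCollector(HTMLParser):
    """Collects absolute URLs of images, scripts, and link targets
    (stylesheets, fonts via preload, etc.) from an HTML document."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.urls = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and attrs.get("src"):
            self.urls.append(urljoin(self.base_url, attrs["src"]))
        elif tag == "link" and attrs.get("href"):
            self.urls.append(urljoin(self.base_url, attrs["href"]))

def resource_sizes(page_url):
    """Fetch a page, HEAD each linked resource, and return
    (url, size_in_bytes, passes_limit) tuples."""
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    collector = ResourceCollector(page_url)
    collector.feed(html)
    report = []
    for url in collector.urls:
        try:
            resp = urlopen(Request(url, method="HEAD"))
            size = int(resp.headers.get("Content-Length") or 0)
        except OSError:
            size = 0  # unreachable resource; a real tool would flag this
        report.append((url, size, size <= LIMIT_BYTES))
    return report
```

A production scanner would also handle CSS `url()` references, `srcset` attributes, and redirects, but the pass/fail logic reduces to this size comparison.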