HTML page size checker

Check your HTML document size and get optimization recommendations

Free
No sign-up
Instant results

Check results

This check only covers HTML size. For a full picture of your page, run a page audit.

For issues across your whole site — duplicate titles, orphan pages, broken internal links — run a site audit.

Want us to fix what we found? Our team can help.

What is HTML size and why it matters

HTML size is the raw byte length of the HTML document the server sends — before CSS, JS, or images are fetched. It matters for two distinct reasons. First, Google has a documented hard cap: since February 2026, Googlebot stops fetching after the first 2 MB of an HTML response. Anything past that point is effectively invisible to Google — not indexed, internal links not followed, Schema.org markup ignored. Second, even well below the cap, oversized HTML slows page loading on mobile networks and increases memory pressure during parsing and rendering. The median page today is roughly 33 KB of HTML on mobile (Web Almanac 2025) — most pages don't need much more than that.
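The size being graded here is plain byte length, not character count. A minimal Python sketch of that measurement (the helper name and constants are illustrative, not this tool's API):

```python
GOOGLEBOT_CAP = 2 * 1024 * 1024   # 2 MB fetch cap described above
MOBILE_MEDIAN = 33 * 1024         # ~33 KB mobile median (Web Almanac 2025)

def html_size_bytes(html: str) -> int:
    """Byte length of the serialized document as sent over the wire,
    uncompressed. Encode first: len(str) counts characters, not bytes."""
    return len(html.encode("utf-8"))

doc = "<!DOCTYPE html><html><body><p>héllo</p></body></html>"
size = html_size_bytes(doc)
print(size, size > GOOGLEBOT_CAP)  # tiny document, far under the cap
```

Note that non-ASCII characters make byte length larger than character length, which is why the encode step matters.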

What this tool checks

  • HTML byte size — graded against the 2 MB Googlebot cap and the ~33 KB mobile median
  • DOM element count — flagged as excessive above 1500 elements, in line with Lighthouse's DOM-size audit
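The element count can be approximated outside a browser with the standard library. A rough sketch — a real browser inserts implied tags and recovers from malformed markup, so its count may differ slightly:

```python
from html.parser import HTMLParser

class ElementCounter(HTMLParser):
    """Count element start tags, a stand-in for DOM element count."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):      # <div>, <p>, <img>, ...
        self.count += 1

    def handle_startendtag(self, tag, attrs):   # self-closing form: <img/>
        self.count += 1

def dom_element_count(html: str) -> int:
    counter = ElementCounter()
    counter.feed(html)
    return counter.count

html_doc = "<html><body><div><p>hi</p><img src='x.png'/></div></body></html>"
print(dom_element_count(html_doc), dom_element_count(html_doc) > 1500)
```

In a browser console the equivalent one-liner is `document.querySelectorAll('*').length`.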

Size thresholds

  • Up to 100 KB — at or near the mobile median, no concern
  • 100 KB - 500 KB — within normal range for content-heavy pages
  • 500 KB - 2 MB — significantly larger than typical; worth auditing for inline JSON, unused markup, inline CSS/JS
  • Over 2 MB — past Google's indexing cap. Content after the 2 MB mark is invisible to Google
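The threshold table above can be sketched as a small grading function (the bucket labels are mine, not the tool's exact wording):

```python
def grade_html_size(size_bytes: int) -> str:
    """Map a raw HTML byte size onto the four buckets above."""
    KB, MB = 1024, 1024 * 1024
    if size_bytes <= 100 * KB:
        return "ok: at or near the mobile median"
    if size_bytes <= 500 * KB:
        return "normal: typical for content-heavy pages"
    if size_bytes <= 2 * MB:
        return "large: audit inline JSON, unused markup, inline CSS/JS"
    return "critical: past Google's 2 MB indexing cap"
```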

Good vs bad examples

Good — lean HTML under 100 KB with markup for the real content:

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Page title</title>
  <link rel="stylesheet" href="/styles.css">
  <script src="/app.js" defer></script>
</head>
<body>
  <main>... content ...</main>
</body>
</html>

Bad — huge inline JSON / Redux state / template bundle in HTML:

<script>
  window.__INITIAL_STATE__ = { /* 800 KB of server state */ };
</script>
<style>
  /* 200 KB of inline CSS */
</style>

Common mistakes

  • Server-rendering huge JSON blobs — SPA frameworks embed Redux/Vuex state, tRPC caches, or Apollo snapshots in HTML. Kilobytes add up fast
  • Inline everything — inline CSS and JS don't get cached, inflate every HTML response, and can't be loaded in parallel
  • Template bloat — hidden modals, lazy-loaded sections, and all route variants pre-rendered in HTML instead of lazy-loaded
  • Over-nested markup — dozens of wrapper divs per component add both bytes and DOM elements
  • Long lists without pagination — 10,000-row product tables pushed into HTML rather than paginated or virtualized
  • Ignoring the 2 MB cap — Google fetches only the first 2 MB of the HTML response; everything after that point is lost for SEO
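The first two mistakes usually show up as a large share of the document's bytes sitting inside inline `<script>` and `<style>` tags. A rough stdlib sketch to estimate that share (illustrative, not how this tool measures it):

```python
from html.parser import HTMLParser

class InlineBytes(HTMLParser):
    """Sum the bytes of text inside inline <script> and <style> tags."""
    def __init__(self):
        super().__init__()
        self.inline = 0
        self._in_inline = False

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._in_inline = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._in_inline = False

    def handle_data(self, data):
        if self._in_inline:
            self.inline += len(data.encode("utf-8"))

def inline_share(html: str) -> float:
    """Fraction of the document's bytes spent on inline script/style."""
    parser = InlineBytes()
    parser.feed(html)
    total = len(html.encode("utf-8"))
    return parser.inline / total if total else 0.0
```

A share well above half is a strong hint that server state or styles belong in separate cacheable files.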

Frequently asked questions

What HTML size does Google recommend?

Google recommends keeping HTML under 100 KB. However, there's no hard limit at that size: overall loading speed matters more. If a page weighs 150 KB but compresses to 30 KB with gzip and renders quickly, that's acceptable. Problems start when HTML exceeds 200-300 KB, especially on mobile devices with slow connections.

Does DOM element count affect performance?

Yes. A large number of DOM elements slows down HTML parsing, increases memory consumption, and slows JavaScript DOM operations. Google recommends no more than 1500 elements per page. If the DOM contains more than 3000 elements, it's a serious issue: simplify the markup and remove unnecessary wrappers.

Is HTML minification worth it?

HTML minification removes whitespace, line breaks, and comments, reducing size by 5-15%. It's helpful but not critical: gzip/Brotli compression already handles repetitive whitespace well, so the real-world bandwidth savings are smaller than the raw byte difference suggests. Focus first on the bigger problems: inline JSON state, inline CSS/JS, template bloat. Minification is a nice bonus, not the main tool.
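Why compression shrinks the payoff of minification can be seen by comparing savings before and after compressing. A toy sketch — the regex "minifier" only strips inter-tag whitespace, real minifiers do more, and zlib stands in for gzip's DEFLATE:

```python
import re
import zlib

# Whitespace-heavy fragment repeated many times, like templated markup.
raw = ("<div>\n    <p>  Hello,   world  </p>\n</div>\n" * 200).encode()

# Crude "minifier": collapse whitespace that sits between tags.
minified = re.sub(rb">\s+<", b"><", raw)

raw_saving = 1 - len(minified) / len(raw)
gz_saving = 1 - len(zlib.compress(minified)) / len(zlib.compress(raw))
print(f"saving before compression: {raw_saving:.0%}, after: {gz_saving:.0%}")
```

The compressor already encodes repetitive whitespace very cheaply, so the post-compression saving is a fraction of the raw one.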

What is DOM size, and how is it different from HTML size?

DOM size is the count of elements (tags) in the parsed HTML tree. Even if the raw HTML byte size is reasonable, a page with thousands of DOM elements runs slowly: parsing takes longer, memory usage grows, every JavaScript DOM query is slower, and style recalculations compound with element count. Google Lighthouse flags DOM trees over 1500 elements as excessive. Common causes are deeply nested wrapper divs (component libraries that wrap every child in extra layers), long un-paginated lists, and duplicate markup across templates.

Is HTML size different from total page weight?

Yes, very different. HTML size is only the initial document the server returns. Total page weight adds every CSS file, JavaScript bundle, image, font, video, and tracking script the page loads. A 50 KB HTML page can still pull in 5 MB of other resources. The 2 MB Googlebot cap applies specifically to the HTML document; each external file has its own separate 2 MB budget. This check focuses on HTML only; other checks cover images (image-format), lazy-loading, and render-blocking scripts.