When someone visits your website, how does the browser know what to display? The answer depends on your rendering strategy—and this choice has massive implications for SEO, performance, and AI search visibility.
Most AI website builders generate Client-Side Rendered (CSR) applications. This works fine for apps behind a login. But for marketing websites that depend on search traffic, it's a critical mistake.
The Two Approaches Visualized
[Diagram: Client-Side Rendering vs. Static Site Generation]
What Search Engines Actually See
Search engine crawlers and AI assistants don't browse like humans. They read the raw HTML response. Here's the critical difference:
<!DOCTYPE html>
<html>
<head>
<title>My Website</title>
</head>
<body>
<div id="root"></div>
<script src="/bundle.js"></script>
</body>
</html>
Googlebot sees: Empty page, no content
<!DOCTYPE html>
<html>
<head>
<title>Pricing - My Website</title>
<meta name="description" content="...">
</head>
<body>
<nav>...</nav>
<main>
<h1>Simple Pricing</h1>
<p>Start free, upgrade when ready</p>
<div class="pricing-table">...</div>
</main>
</body>
</html>
Googlebot sees: Complete content, ready to index
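You can see the difference for yourself by stripping the markup from each raw response and checking what text survives. A minimal sketch in plain Node.js, using strings that mirror the two documents above:

```javascript
// What a non-rendering crawler can extract from each raw response.
// These strings mirror the CSR and SSG documents shown above.
const csrHtml =
  '<body><div id="root"></div><script src="/bundle.js"></script></body>';
const ssgHtml =
  '<body><main><h1>Simple Pricing</h1><p>Start free, upgrade when ready</p></main></body>';

function visibleText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "") // script bodies never render as text
    .replace(/<[^>]+>/g, " ")                   // drop the remaining tags
    .replace(/\s+/g, " ")
    .trim();
}

console.log(visibleText(csrHtml)); // ""
console.log(visibleText(ssgHtml)); // "Simple Pricing Start free, upgrade when ready"
```

The CSR response yields an empty string: without executing `/bundle.js`, there is simply nothing to index.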
SEO Impact: The Data
Google's documentation confirms that Googlebot can render JavaScript, but with important caveats:
Two-Wave Indexing
Google indexes in two waves: a first pass over the raw HTML, and a second pass over the JavaScript-rendered page. The second pass happens in a separate render queue with no guaranteed timing; it can take hours, days, or weeks.
Render Budget
Google allocates limited render resources per site. Large CSR sites may not get fully rendered. Pages in Search Console showing "Discovered - currently not indexed" often suffer from render budget exhaustion.
JavaScript Errors
Any JavaScript error during rendering can prevent content from being indexed. Googlebot now runs an evergreen version of Chromium, but errors that never surface for real users (failed API calls, blocked resources, timeouts under crawler conditions) can still leave the rendered page blank.
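To make the failure mode concrete, here is a hedged sketch (hypothetical app code, not any specific framework) of how a single uncaught error leaves the crawler with the empty shell:

```javascript
// Hypothetical client-side render function: it throws if the API
// response is missing a field, and nothing is ever written to the page.
function renderApp(root, data) {
  const items = data.features.map((f) => `<li>${f}</li>`); // throws if data.features is undefined
  root.innerHTML = `<ul>${items.join("")}</ul>`;
}

const root = { innerHTML: "" }; // stand-in for document.getElementById("root")

try {
  // The API returned a slightly different shape than the code expects.
  renderApp(root, { plans: ["Free", "Pro"] });
} catch (err) {
  // In a real app this error might be silently swallowed; either way,
  // the crawler indexes the page exactly as it arrived: empty.
}

console.log(root.innerHTML === "" ? "blank page" : root.innerHTML); // "blank page"
```

One missing key, and the rendered result is indistinguishable from a site with no content at all.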
Performance: Core Web Vitals
Core Web Vitals are now a Google ranking factor. Here's how rendering strategy affects them:
| Metric | CSR (Client-Side) | SSG (Static) |
|---|---|---|
| LCP (Largest Contentful Paint) | 2.5-4.0s+: waits for JS + API | < 1.0s: HTML ready on arrival |
| INP (Interaction to Next Paint; replaced FID in 2024) | 100-300ms+: main thread blocked | < 50ms: minimal JS to execute |
| CLS (Cumulative Layout Shift) | 0.1-0.25+: content pops in | < 0.05: layout defined in HTML |
| TTFB (Time to First Byte) | Fast, but the body is empty | Fast, with full content from a CDN |
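The thresholds behind these numbers are published by Google: a page scores "good" when LCP ≤ 2.5 s, INP ≤ 200 ms (INP replaced FID as the responsiveness metric in 2024), and CLS ≤ 0.1. A quick sketch applying those thresholds; the per-page figures are illustrative assumptions, not measurements:

```javascript
// Google's published "good" thresholds for Core Web Vitals.
const GOOD = { lcpMs: 2500, inpMs: 200, cls: 0.1 };

function passesCoreWebVitals(m) {
  return m.lcpMs <= GOOD.lcpMs && m.inpMs <= GOOD.inpMs && m.cls <= GOOD.cls;
}

// Illustrative figures only, in line with the ranges above.
const csrPage = { lcpMs: 3200, inpMs: 250, cls: 0.2 };
const ssgPage = { lcpMs: 900, inpMs: 40, cls: 0.03 };

console.log(passesCoreWebVitals(csrPage)); // false
console.log(passesCoreWebVitals(ssgPage)); // true
```

A typical CSR page misses on all three; a static page clears them with room to spare.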
AI Search: The Hidden Crisis
Beyond Google, AI assistants like ChatGPT, Perplexity, and Claude are becoming major discovery channels. These systems face the same problem—but worse:
AI Crawlers Don't Execute JavaScript
Most AI web browsing tools read raw HTML without JavaScript execution. ChatGPT's browse feature, Perplexity's crawler, and other AI systems see CSR sites as completely empty. If AI can't read your site, it can't recommend your product.
- CSR content visible to ChatGPT browse: No
- CSR content visible to Perplexity: No
- SSG content visible to AI crawlers: Yes
When to Use Each Approach
CSR Makes Sense For:
- Apps behind authentication (SEO irrelevant)
- Internal dashboards and tools
- Complex interactive applications
- Prototypes and MVPs (non-production)
- Real-time collaborative tools
SSG Makes Sense For:
- Marketing websites
- Landing pages
- Blogs and content sites
- Documentation
- Any site where SEO matters
- Any site where AI discovery matters
How Pagesmith Solves This
Pagesmith generates static Astro sites that deliver pre-rendered HTML by default:
- SSG by Default: Every page pre-rendered at build time. No JavaScript required to see content.
- Islands Architecture: Interactive components hydrate on demand. Static content stays static.
- SSR When Needed: Need dynamic content? Enable server-side rendering for specific pages.
- SEO Built In: Sitemap, meta tags, Schema.org markup, and OG images are all generated automatically.
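The Astro conventions behind those points look roughly like this. This is a sketch, not Pagesmith's actual generated output, and `PricingCalculator` is a hypothetical component:

```astro
---
// Astro prerenders pages to static HTML at build time; in server
// (SSR) mode, a page opts back into prerendering explicitly:
export const prerender = true;

// Hypothetical interactive component used as an island below.
import PricingCalculator from "../components/PricingCalculator.jsx";
---
<main>
  <h1>Simple Pricing</h1>
  <p>Start free, upgrade when ready</p>
  <!-- An island: this component's JS loads only when it scrolls into
       view; everything else on the page ships as plain HTML. -->
  <PricingCalculator client:visible />
</main>
```

The `client:visible` directive (or `client:load`, `client:idle`) is what hydrates an island on demand; static content never pays that JavaScript cost.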