
Client-Side Rendering SEO

Why CSR architecture creates fundamental SEO problems, and how the rise of AI search makes it even worse.

How Client-Side Rendering Works

Client-side rendering (CSR) is an architecture where the browser, not the server, builds the page content. Here's the typical flow:

1. User Requests Page

   The browser sends a request to the server for yoursite.com/page.

2. Server Returns Empty Shell

   The server sends minimal HTML with <div id="root"></div> and a JavaScript bundle.

3. Browser Downloads JavaScript

   The browser downloads, parses, and executes the JavaScript bundle (often 100 KB+).

4. Content Finally Appears

   JavaScript renders the actual page content into the DOM, and the user finally sees the page.
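The steps above can be sketched in a few lines of JavaScript. The shell markup and the renderApp function are hypothetical, simplified stand-ins for a real framework's mount step; a real app mutates the DOM rather than a string:

```javascript
// Step 2: what the server actually sends for a CSR page — an empty shell
// (hypothetical example; real bundles and markup are far larger).
const serverHtml = [
  '<!DOCTYPE html>',
  '<html><body>',
  '  <div id="root"></div>',
  '  <script src="/bundle.js"></script>',
  '</body></html>',
].join('\n');

// Steps 3-4: the downloaded bundle builds the content in the browser.
// Simulated here with a string replacement.
function renderApp(html) {
  return html.replace(
    '<div id="root"></div>',
    '<div id="root"><h1>Your Headline</h1><p>Page content.</p></div>'
  );
}

const browserHtml = renderApp(serverHtml);

console.log(serverHtml.includes('Your Headline'));  // false: the crawler's view
console.log(browserHtml.includes('Your Headline')); // true: the user's view
```

The gap between those two strings is the entire CSR SEO problem: anything that reads only `serverHtml` never sees the headline.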

The SEO Problem

Search engine crawlers and AI systems receive the response at step 2: an empty HTML shell. Most of them don't execute JavaScript the way browsers do, so they never see the content created in step 4.

How Google Handles JavaScript

Google uses a two-wave indexing system. Understanding this is key to understanding why CSR causes SEO problems.

Wave 1: Crawl

  • Googlebot fetches the raw HTML
  • For CSR sites, sees empty <div>
  • Happens immediately
  • Can index links and basic metadata

Wave 2: Render

  • Page enters render queue
  • May wait days to weeks
  • Limited rendering resources
  • Complex JS may time out or fail

Why This Matters

  • New content takes longer to rank because it must wait for rendering
  • Content discovered via JS may get less ranking weight than HTML content
  • Complex sites may never fully render, leaving content permanently unindexed

The AI Search Problem

If Google's handling of CSR is problematic, AI search systems are even worse.

Perplexity AI

Limited JavaScript rendering. CSR content is typically invisible.

ChatGPT

Web browsing reads raw HTML. JavaScript-dependent content is missed.

AI Overviews

Google's AI summaries prioritize easily crawled static content.

As AI search becomes more prevalent, CSR sites face an increasingly severe visibility problem. Content that can't be read by AI systems simply won't be cited in AI-generated answers—regardless of how valuable it is.

CSR vs SSG vs SSR: SEO Impact

Comparing the three main rendering architectures.

Aspect               | CSR             | SSG              | SSR
---------------------|-----------------|------------------|----------------
Initial HTML         | Empty           | Complete         | Complete
Google Indexing      | Delayed         | Immediate        | Immediate
AI Search Visibility | None            | Full             | Full
Core Web Vitals      | Poor            | Excellent        | Good
Server Load          | Low             | None             | High
Best For             | Apps (internal) | Marketing, blogs | Dynamic content

Common CSR Tools (and Their SEO Risks)

These tools default to client-side rendering unless configured otherwise.

Create React App

Pure CSR with no built-in SSR/SSG, and now deprecated by the React team. Avoid for any SEO-sensitive content.

CSR only

Vite (React/Vue)

Default templates are CSR. Requires additional setup for SSR.

CSR default

Vue CLI

CSR by default. Nuxt adds SSR/SSG, but requires migration.

CSR default

Angular

CSR by default. Angular Universal provides SSR, at the cost of extra build complexity.

CSR default
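A quick way to check whether a tool's output is a CSR shell is to fetch the page's raw HTML (curl, or view-source in a browser) and look for an empty mount node. A hedged heuristic, assuming the common root/app id convention these frameworks use by default:

```javascript
// Heuristic: raw HTML whose mount node is empty is almost certainly a CSR shell.
// Assumes the conventional <div id="root"> / <div id="app"> mount point;
// other ids or whitespace variations would need a broader pattern.
function looksLikeCsrShell(rawHtml) {
  return /<div id="(?:root|app)">\s*<\/div>/.test(rawHtml);
}

console.log(looksLikeCsrShell('<body><div id="root"></div></body>'));           // true
console.log(looksLikeCsrShell('<body><h1>Hello</h1><p>Static text</p></body>')); // false
```

If the check returns true on your production pages, crawlers and AI systems are seeing the empty shell, not your content.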

Skip CSR Problems Entirely

Pagesmith generates static Astro sites that avoid CSR's SEO limitations by design. All content is in the initial HTML—no JavaScript rendering required.

  • Static HTML Generation

    Complete HTML from first byte. No JS required to see content.

  • AI Search Ready

    Perplexity, ChatGPT, and AI Overviews can read and cite your content.

  • Instant Indexing

    No waiting for Google's render queue. Content is indexed immediately.

What Crawlers See

<!DOCTYPE html>
<html>
<head>
  <title>Your Page Title</title>
  <meta name="description" content="...">
</head>
<body>
  <!-- All content here from first byte -->
  <h1>Your Headline</h1>
  <p>Your content is immediately visible...</p>
  <section>
    <h2>Section Title</h2>
    <p>More visible content...</p>
  </section>
</body>
</html>

Build SEO-First, Not SEO-Last

Stop fighting CSR's limitations. Generate static sites with AI that work perfectly for Google, AI search, and users.

Client-Side Rendering SEO FAQ

What is client-side rendering (CSR)?

Client-side rendering (CSR) is a web architecture where the browser builds the page content using JavaScript after receiving an empty HTML shell. Frameworks like React, Vue, and Angular typically use CSR by default. The server sends minimal HTML with a JavaScript bundle that renders the actual content in the browser.

Why is client-side rendering bad for SEO?

CSR is problematic for SEO because search engines receive empty HTML. While Google can render JavaScript, it does so in a delayed second pass with limited resources. Pages may time out, fail to render completely, or wait weeks in the rendering queue. AI search systems like Perplexity and ChatGPT have little to no ability to render JavaScript.

Can Google index client-side rendered pages?

Google can index CSR pages, but with significant limitations. Google uses a two-wave indexing process: crawl first (sees empty HTML), then render later. The render queue has limited capacity and may take days to weeks. Complex JavaScript may time out or fail. Content discovered after rendering may not get the same ranking weight.

What's the alternative to client-side rendering?

The main alternatives are Static Site Generation (SSG) and Server-Side Rendering (SSR). SSG pre-builds HTML at compile time—best for content sites. SSR generates HTML on each request—useful for personalized content. Both ensure crawlers receive complete HTML immediately, solving the core CSR SEO problem.
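The SSG idea can be sketched in a few lines: pages are rendered to complete HTML once, at build time, rather than in every visitor's browser. The pages array and buildPage helper below are hypothetical illustrations, not any framework's actual API:

```javascript
// Minimal SSG sketch: complete HTML is produced at build time, so the
// server or CDN returns finished markup — no client-side render step.
const pages = [
  { slug: 'about', title: 'About Us', body: 'We build static sites.' },
  { slug: 'pricing', title: 'Pricing', body: 'Simple, flat pricing.' },
];

function buildPage(page) {
  return (
    '<!DOCTYPE html><html><head>' +
    `<title>${page.title}</title>` +
    '</head><body>' +
    `<h1>${page.title}</h1><p>${page.body}</p>` +
    '</body></html>'
  );
}

// At deploy time each entry becomes a file like /about/index.html;
// crawlers receive this complete HTML on the very first request.
const built = Object.fromEntries(pages.map((p) => [p.slug, buildPage(p)]));
```

Because the expensive work happens once per build rather than once per request, SSG also keeps server load near zero, which is why it suits content that changes at publish time rather than per visitor.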

How do AI search systems handle client-side rendered sites?

AI search systems like Perplexity, ChatGPT, and Google AI Overviews have very limited JavaScript rendering capability. They primarily read the raw HTML response. CSR sites appear empty or incomplete to these systems, meaning they can't be cited in AI-generated answers—a critical visibility gap as AI search grows.
