
Why Your AI Website Isn't Ranking on Google

You built a beautiful AI-generated website. The design looks professional. But Google doesn't seem to know it exists. Here's the technical reason why—and what you can do about it.

You've done everything right: clear content, relevant keywords, good design. Your site looks great when you visit it in a browser. So why does Google Search Console show your pages as "Discovered - currently not indexed"? Why isn't your site showing up in search results?

The answer has nothing to do with your content quality. It's about how your AI builder delivered that content to the browser—and to Google's crawler.

The Hidden Problem: Client-Side Rendering

Most AI website builders—Lovable, Bolt.new, v0, and others—generate React applications that use Client-Side Rendering (CSR). This means:

What CSR Means for Your Site

  1. Your server sends a nearly empty HTML file
  2. The HTML references a JavaScript bundle (often 200KB+)
  3. The browser downloads and executes the JavaScript
  4. The JavaScript builds the page content in the browser
  5. Only then does your actual content appear
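The five steps above can be sketched in plain Node. This is an illustration, not any framework's actual code: the shell HTML and the injected content string are made up, but the shape matches what a typical CSR build produces.

```javascript
// Sketch of the CSR flow: the server sends only a shell; content exists
// solely after the client-side "render" step runs in the browser.

const serverHtml = `<!DOCTYPE html>
<html><head><title>My Startup</title></head>
<body><div id="root"></div><script type="module" src="/assets/main.js"></script></body></html>`;

// Step 4: JavaScript builds the page content in the browser.
// (A stand-in for what the 200KB+ bundle does after it loads.)
function clientRender(shell) {
  const content = '<h1>My Startup</h1><p>We solve X for Y.</p>';
  return shell.replace('<div id="root"></div>', `<div id="root">${content}</div>`);
}

const renderedHtml = clientRender(serverHtml);

console.log(serverHtml.includes('<h1>'));   // false: the raw response has no heading
console.log(renderedHtml.includes('<h1>')); // true: a browser user sees the content
```

A crawler that indexes the raw response sees the first version; a human in a browser sees the second.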

For humans with modern browsers, this works fine—you see the final result. But for Google's crawler, this creates serious problems.

What Google Actually Sees

Google's First Crawl - your-ai-site.com
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>My Startup</title>
</head>
<body>
  <div id="root"></div>
  <script type="module" src="/assets/main.js"></script>
</body>
</html>

<!-- Google's initial index entry: -->
<!-- Title: "My Startup" -->
<!-- Content: (empty) -->
<!-- Keywords found: (none) -->

Google's crawler sees an empty page. There's a title, but no content to index. No keywords. No headings. No product descriptions. Nothing.
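You can spot this pattern yourself. The heuristic below is not an official Google tool, just a rough check you can run against your homepage's raw HTML (from "view source" or a plain HTTP fetch): an empty mount point plus an external bundle and no headings usually means a CSR shell.

```javascript
// Rough heuristic for spotting a CSR shell in raw HTML (illustrative only).
function looksClientRendered(html) {
  const emptyRoot = /<div id="(root|app)">\s*<\/div>/.test(html); // bare mount point
  const loadsBundle = /<script[^>]*src=/.test(html);              // external JS bundle
  const hasHeadings = /<h[1-6][\s>]/.test(html);                  // any real content?
  return emptyRoot && loadsBundle && !hasHeadings;
}

const csrShell = '<body><div id="root"></div><script type="module" src="/assets/main.js"></script></body>';
const ssgPage  = '<body><main><h1>Build websites that rank</h1></main></body>';

console.log(looksClientRendered(csrShell)); // true
console.log(looksClientRendered(ssgPage));  // false
```

If the function returns true for your homepage's raw HTML, crawlers that don't execute JavaScript are seeing an empty page.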

Google's Render Queue Problem

"But Google can execute JavaScript!" — Yes, it can. But here's what actually happens:

Phase 1

Initial Crawl

Google fetches your HTML and parses it immediately. For CSR sites, this captures almost nothing—just the empty shell. This is what goes into the initial index.

Phase 2 (Delayed)

Render Queue

Pages requiring JavaScript get added to a separate render queue. Google allocates limited resources to this queue. Your page might wait hours, days, or weeks before being rendered.

The Render Budget Problem

Google limits how many pages it will render per site. If you have many CSR pages, Google may never render all of them. Large sites can have entire sections that never get indexed properly.

Symptoms in Google Search Console

If your AI-built site has these issues, CSR is likely the cause:

"Discovered - currently not indexed"

Google found your URL but hasn't added it to the index. Often means render queue backlog.

"Crawled - currently not indexed"

Google crawled the page but found nothing valuable to index. Empty HTML sends "low quality" signals.

Duplicate titles/descriptions across pages

CSR apps often share one set of meta tags. Google sees duplicate content issues.

Slow indexing of new content

New pages take days or weeks to appear in search, if they appear at all.

The AI Search Catastrophe

Beyond Google, there's an even bigger problem: AI assistants like ChatGPT and Perplexity are becoming major discovery channels. And they cannot execute JavaScript at all.

  • 0% of CSR content is visible to ChatGPT
  • 0% of CSR content is visible to Perplexity
  • 0% chance of an AI recommendation

When someone asks ChatGPT "What's the best tool for X?" and your site is built with CSR, you won't be mentioned. The AI literally can't read your site, so it doesn't know you exist.

How to Fix It: Static Site Generation

The solution is architectural: switch from Client-Side Rendering to Static Site Generation (SSG).

With SSG, your pages are pre-rendered at build time. When Google (or ChatGPT, or Perplexity) requests a page, they get complete HTML immediately—no JavaScript execution required.

SSG Site - What Google Sees
<!DOCTYPE html>
<html lang="en">
<head>
  <title>AI Website Builder for SEO - Pagesmith</title>
  <meta name="description" content="Build websites that rank...">
  <script type="application/ld+json">
    {"@type": "SoftwareApplication", ...}
  </script>
</head>
<body>
  <header>
    <nav>...</nav>
  </header>
  <main>
    <h1>Build websites that rank on Google</h1>
    <p>Pagesmith generates static HTML sites with
       built-in SEO. No JavaScript required to
       see your content.</p>
    <section>
      <h2>Features</h2>
      <!-- Full content available immediately -->
    </section>
  </main>
</body>
</html>

Immediate Indexing

Google sees full content on first crawl. No render queue delay. No budget concerns. AI assistants can read and cite your content immediately.
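The build-time step can be sketched like this. `renderPage` is a stand-in for what a static site generator such as Astro does per route, not its actual API; the page data is illustrative.

```javascript
// Sketch of SSG: every route is rendered to complete HTML once, at build time.
const pages = [
  { slug: 'index',
    title: 'Build websites that rank on Google',
    body: 'Pagesmith generates static HTML sites with built-in SEO.' },
];

// Stand-in for a generator's per-route render step (not Astro's real API).
function renderPage(page) {
  return `<!DOCTYPE html>
<html lang="en">
<head><title>${page.title}</title></head>
<body><main><h1>${page.title}</h1><p>${page.body}</p></main></body>
</html>`;
}

// At build time, each route becomes a finished .html file on disk. Crawlers
// receive this complete document with no JavaScript execution step.
const built = Object.fromEntries(pages.map(p => [`${p.slug}.html`, renderPage(p)]));

console.log(built['index.html'].includes('<h1>')); // true: content is in the raw HTML
```

The key difference from CSR is *when* rendering happens: once at build time on your machine, instead of on every visit in the visitor's (or crawler's) environment.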

The Solution

Pagesmith: AI Websites That Actually Rank

Pagesmith generates static Astro sites designed for SEO from the ground up:

  • Pre-rendered HTML: Every page is complete HTML. Google indexes immediately.
  • AI-Readable: ChatGPT, Perplexity, and AI assistants can read and cite you.
  • SEO Built In: Sitemap, meta tags, Schema.org, canonical URLs—all automatic.
  • Real Code: Export your site as a standard Astro project you own.

Frequently Asked Questions

Why isn't my AI-generated website ranking on Google?
Most AI website builders generate client-side rendered (CSR) React applications. When Google crawls these sites, it sees empty HTML because the content requires JavaScript to render. While Google can execute JavaScript, CSR sites face indexing delays, render budget limits, and often rank poorly compared to static HTML sites.
Can Google index JavaScript websites?
Yes, Google can render JavaScript, but with limitations. JavaScript rendering happens in a separate queue that can delay indexing by hours, days, or weeks. Google also has a "render budget" that limits how many pages it will render per site. CSR sites often show "Discovered - currently not indexed" errors in Search Console.
What is the "Discovered - currently not indexed" error?
This Search Console error means Google found your URL but hasn't added it to its index. For CSR sites, this often happens because Google's render queue is backlogged or the page didn't render properly. Static HTML sites rarely see this error because content is immediately available without JavaScript execution.
Do AI assistants like ChatGPT index JavaScript sites?
No. Most AI crawlers (including ChatGPT's browse feature and Perplexity) read raw HTML without executing JavaScript. They see CSR sites as completely empty. If you want AI assistants to recommend your product, your site must serve pre-rendered HTML that AI crawlers can read.
How do I fix my AI website's SEO problems?
The fix is architectural: switch from client-side rendering to static site generation (SSG) or server-side rendering (SSR). Tools like Pagesmith generate static Astro sites that deliver pre-rendered HTML, ensuring both search engines and AI assistants can read and index your content immediately.
