
Jacek Białas

Holds a Master’s degree in Public Finance Administration and is an experienced SEO and SEM specialist with over eight years of professional practice. His expertise includes creating comprehensive digital marketing strategies, conducting SEO audits, managing Google Ads campaigns, content marketing, and technical website optimization. He has successfully supported businesses in Poland and international markets across diverse industries such as finance, technology, medicine, and iGaming.

How client-side rendering is killing SEO

Sep 23, 2025 | SEO

The contemporary web development environment is defined by a profound architectural schism. On one side stands the user experience imperative, which drives the adoption of robust JavaScript frameworks like React or Angular to create seamless interfaces. On the other side lies the fundamental infrastructure of the search engine ecosystem, a system built primarily on the efficient ingestion of text-based HTML. This report posits that Client-Side Rendering (CSR) acts as a systemic depressant on Search Engine Optimization performance.

While superficial analysis often suggests that Google and other search engines have solved JavaScript rendering, a deep forensic examination of the crawling pipeline reveals a different reality. The friction introduced by CSR manifests not as a hard block but as a silent tax involving a series of latencies and interpretive failures that cumulatively degrade organic visibility. By offloading the rendering cost to the client and search bot, websites inadvertently place themselves in a deferred queue where they are subject to the whims of the search engine’s computational budget.

Initial HTML response vs the DOM

To understand the mechanics of how CSR kills SEO, one must first distinguish between the two states of a webpage which are the initial HTML response and the fully rendered Document Object Model. In a traditional Server-Side Rendered environment, these two states are virtually identical because the server executes the logic and sends a complete document. A search engine crawler receiving this response immediately parses the text to identify keywords and indexes the content without impediments.

In a Client-Side Rendered architecture, this pathway is severed. The server responds with an HTML shell that often contains little more than a basic boilerplate with a reference to a massive JavaScript bundle. If a crawler were to index this initial response, it would see a blank page. Content visibility is entirely contingent upon the successful download and execution of the JavaScript bundle which transforms the indexing process from a single-step operation into a complex pipeline.
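The difference between the two states can be made concrete with a few lines of code. The sketch below approximates what the first crawl wave sees: only text present in the raw HTML response, with no JavaScript executed (the sample markup and URLs are illustrative, not taken from any real site).

```python
from html.parser import HTMLParser

class FirstWaveParser(HTMLParser):
    """Collects the text a crawler's first pass can see in the raw HTML
    response, ignoring anything inside <script> or <style> tags."""
    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self._chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if self._skip_depth == 0 and data.strip():
            self._chunks.append(data.strip())

def indexable_text(raw_html: str) -> str:
    parser = FirstWaveParser()
    parser.feed(raw_html)
    return " ".join(parser._chunks)

# A typical CSR shell: nothing for the first wave to index.
csr_shell = ('<html><body><div id="root"></div>'
             '<script src="/bundle.js"></script></body></html>')
# An SSR response: the content is already in the payload.
ssr_page = '<html><body><h1>Blue Widgets</h1><p>In stock today.</p></body></html>'

print(repr(indexable_text(csr_shell)))  # ''
print(repr(indexable_text(ssr_page)))   # 'Blue Widgets In stock today.'
```

Run against the CSR shell, the parser recovers an empty string, which is exactly the "blank page" the first wave would index.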

The first wave involves the initial crawl where Googlebot fetches the URL and parses the server response. Finding no content, the bot cannot index the page for relevance. Instead, it places the URL into a Rendering Queue to wait for availability in Google’s Web Rendering Service.

The web rendering service and the time tax

The existence of the rendering queue introduces non-trivial latency into content discovery. While Google has reduced the delay between the initial crawl and the rendering pass, this delay is not zero and it is not consistent. The Web Rendering Service is a shared resource across the entire internet. During periods of high demand, the time a page spends in the rendering queue can stretch from minutes to days, which creates a distinct content invisibility window.

This latency is catastrophic for content that relies on freshness such as news or time-sensitive e-commerce promotions. While a competitor using SSR is indexed immediately upon publication, the CSR site remains in limbo while waiting for a headless browser to prove it contains content worth ranking. The process is resource-intensive because executing JavaScript requires orders of magnitude more CPU cycles than parsing static HTML.

The render budget and execution timeouts

The rendering service is not an infinite resource and operates under strict constraints regarding execution time. If a JavaScript bundle is too large or if the chain of asynchronous network requests takes too long, the service may trigger a timeout. Official discussions suggest that a rendering threshold of approximately 5 seconds exists to prevent a single page from monopolizing the bot’s resources.   

If the application fails to render its main content within this window, Googlebot takes a snapshot of the partially rendered page. In many CSR applications, this snapshot might contain only the header and a loading spinner. The search engine then indexes this incomplete state which results in a page that ranks for nothing because the main body content was effectively cloaked by the latency of the client-side execution.

Furthermore, the bot does not interact with the page in the way a human user does. It does not scroll infinitely and it does not click buttons to load more content. If a CSR application relies on these interactions to fetch and display text, that content remains permanently invisible to the search engine.
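A toy model makes the timeout failure mode explicit. This is a simulation under the article's assumptions (the ~5-second threshold and the step durations are illustrative, not measured values): render steps are applied in order until the simulated clock exceeds the budget, and whatever exists at that moment is the snapshot that gets indexed.

```python
RENDER_BUDGET_S = 5.0  # the approximate WRS threshold discussed above (assumed)

def render_snapshot(render_steps, budget=RENDER_BUDGET_S):
    """Toy model of a timed render pass: apply steps in order until the
    simulated clock exceeds the budget, then freeze whatever exists."""
    dom, elapsed = [], 0.0
    for duration_s, fragment in render_steps:
        elapsed += duration_s
        if elapsed > budget:
            break  # timeout: everything after this point is never indexed
        dom.append(fragment)
    return dom

steps = [
    (0.5, "<header>Site navigation</header>"),
    (1.0, "<div class='spinner'>Loading...</div>"),
    (4.5, "<article>The actual product copy</article>"),  # arrives too late
]
print(render_snapshot(steps))
# ['<header>Site navigation</header>', "<div class='spinner'>Loading...</div>"]
```

The snapshot contains exactly the header and the loading spinner described above; the main article content, arriving after the budget is spent, never reaches the index.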

| Feature | Server-Side Rendering (SSR) | Client-Side Rendering (CSR) | Implication for SEO |
|---|---|---|---|
| Content availability | Immediate (in initial HTML) | Delayed (requires JS execution) | CSR creates an indexing lag where content is invisible for a period. |
| Crawl dependency | HTTP response only | HTTP + JS download + execution | CSR creates multiple points of failure (network, syntax, timeout). |
| Resource cost | Low (text parsing) | High (CPU execution) | Google effectively charges CSR sites more crawl budget per page. |
| Link discovery | Instant (`<a href>`) | Delayed (injected into DOM) | CSR slows down the discovery of deep pages and new sections. |

The 15MB limit and the bloat of bundles

A specific technical constraint that disproportionately affects CSR sites is Googlebot’s 15MB crawl limit. Documentation confirms that Googlebot stops processing a file after the first 15MB. While this might seem large for a text file, modern CSR applications often rely on massive JavaScript bundles and inlined JSON data payloads that can easily approach this limit.

If the essential code required to initialize the application resides after the 15MB cutoff point, the bot sees a broken file. The script fails to execute and the page is indexed as blank. This limit applies to the initial request rather than the referenced resources, but many Single Page Applications embed data directly into the initial HTML to speed up hydration. If this state object is too large, the truncation results in a syntax error that crashes the rendering process.
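The truncation mechanics are easy to demonstrate. In this sketch the page markup and the `window.__STATE__` payload are invented for illustration; the point is that a bootstrap script placed after the 15MB cutoff simply disappears from the crawler's view.

```python
GOOGLEBOT_FETCH_LIMIT = 15 * 1024 * 1024  # 15 MB, per Google's documentation

def crawler_view(html_bytes: bytes, limit: int = GOOGLEBOT_FETCH_LIMIT) -> bytes:
    """Everything past the limit is simply never processed."""
    return html_bytes[:limit]

# A page that inlines a huge serialized state object before the bootstrap script:
page = (b"<html><body><div id='root'></div><script>window.__STATE__='"
        + b"x" * (15 * 1024 * 1024)  # bloated inline payload (illustrative)
        + b"';</script><script src='/bundle.js'></script></body></html>")

seen = crawler_view(page)
# The bootstrap reference falls after the cutoff: the crawler never sees it,
# and the truncated inline script is left syntactically broken.
print(b"/bundle.js" in page, b"/bundle.js" in seen)  # True False
```

The file the bot processes ends mid-string inside the inline script, so even the markup it did receive can no longer execute.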

The high cost of JavaScript on mobile

The cost of execution is not just a problem for Googlebot; it is also a problem for the user's device. Performance analyses highlight the stark difference in processing power between high-end and average devices: a script that parses in 4 seconds on an iPhone might take 36 seconds on a low-end Android device.

Googlebot generally renders using a capable server environment, but it uses the data from real-world users via the Chrome User Experience Report to determine ranking signals. If a CSR site relies on heavy JavaScript that chokes the CPUs of average mobile phones, the site’s Core Web Vitals scores will plummet. The cost of JavaScript causes lower rankings due to poor UX signals.

The illusion of the 200 OK status

One of the most insidious ways CSR kills SEO is through the mismanagement of HTTP status codes, which creates “Soft 404s.” In a traditional server architecture, requesting a non-existent URL triggers a 404 Not Found response. This is a clear signal to the search engine to drop the URL from its index.

In a Single Page Application, the routing logic moves from the server to the client. When a user requests a URL, the server returns the main index.html file and a 200 OK status code regardless of whether the content exists. It is then the responsibility of the JavaScript framework to inspect the URL and render an error component. To Googlebot, the 200 OK status is a green light that signals the page is valid.   

If the JavaScript then renders a “Page Not Found” message, Google effectively indexes a valid URL with the content “Page Not Found.” This creates index bloat where the search engine’s index becomes cluttered with empty or error pages. Google’s algorithms punish sites with high ratios of low-quality content.
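The mismatch between the status code and the rendered content is what an auditing tool has to detect. A minimal heuristic, with an illustrative (not exhaustive) phrase list, looks like this:

```python
ERROR_PHRASES = ("page not found", "404", "nothing here")  # illustrative list

def looks_like_soft_404(status: int, rendered_text: str) -> bool:
    """Flags a soft 404: a 200 response whose rendered content
    is actually an error message."""
    body = rendered_text.lower()
    return status == 200 and any(phrase in body for phrase in ERROR_PHRASES)

print(looks_like_soft_404(200, "Page Not Found"))         # True  -> soft 404
print(looks_like_soft_404(404, "Page Not Found"))         # False -> proper 404
print(looks_like_soft_404(200, "Blue widgets in stock"))  # False -> healthy page
```

Note that the check needs the *rendered* text as input: running it against the raw CSR shell would miss the problem entirely, which is precisely the tooling gap discussed below.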

The difficulty of detection and fixing

Fixing Soft 404s in a CSR environment is notoriously difficult because the server has already sent the header. Developers must resort to complex workarounds such as using JavaScript to inject a noindex tag into the DOM. However, this creates a race condition where Googlebot must successfully execute the JavaScript to see the tag. If the rendering fails due to a timeout, the tag is never seen.

This issue is compounded by the tooling gap. Standard SEO crawlers like Screaming Frog will simply see the 200 OK status and report the site as healthy unless they are configured to render JavaScript. This leaves site owners blind to the fact that they are generating thousands of Soft 404s until they see a warning in Google Search Console.

Interaction to Next Paint and the main thread

Google replaced First Input Delay with Interaction to Next Paint (INP) as a Core Web Vital to measure responsiveness. CSR sites are structurally disadvantaged regarding INP because they rely heavily on the browser’s main thread.

When a user loads a CSR page, the browser must download and execute the JS bundle to “hydrate” the application. During this hydration phase, the main thread is often completely blocked. If a user attempts to click a navigation link while the JavaScript engine is busy, the browser cannot respond. This input delay results in a high INP score which negatively affects rankings.

Largest Contentful Paint and the waterfall

Largest Contentful Paint (LCP) measures loading speed by tracking when the main content element appears. In a CSR architecture, LCP is inherently delayed by a dependency chain known as the “waterfall.” The browser must download the HTML, download the JavaScript, execute the JavaScript, fetch data from an API, and finally inject the image tag. This sequence adds significant latency to the LCP time.
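Because each step in the waterfall blocks the next, the latencies add serially. The numbers below are assumptions chosen for illustration, not measurements, but they show why the CSR path structurally trails the SSR path:

```python
# Illustrative latencies in milliseconds (assumed numbers, not measurements):
csr_waterfall = {
    "download HTML shell": 200,
    "download JS bundle": 800,
    "parse and execute JS": 600,
    "fetch data from API": 400,
    "render LCP element": 100,
}
ssr_path = {
    "download full HTML": 350,
    "render LCP element": 100,
}

# Each CSR step blocks the next, so the latencies sum serially.
print(sum(csr_waterfall.values()))  # 2100
print(sum(ssr_path.values()))       # 450
```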

Quantitative case studies reinforce this impact. When companies switch from client-side rendering to server-side rendering, they often see massive improvements in LCP. For instance, Tokopedia improved LCP by 55% by moving to SSR. These are not theoretical gains because they represent the concrete business cost of the latency introduced by client-side rendering.

The hydration paradox and interactivity

Server-Side Rendering with frameworks like React or Vue is often presented as the solution to CSR issues, but it introduces a complex phase known as rehydration, in which the client takes over. The server sends a fully populated HTML document, which allows the user to see the content immediately, yet the page is essentially a static corpse until the JavaScript bundle executes and attaches event handlers. This creates a phenomenon known as the uncanny valley, where the interface looks ready for interaction but does not respond to clicks or inputs.

Hydration errors and the double burden

A critical technical risk in this architecture is the potential for hydration mismatches which cause the DOM to break. If the HTML rendered by the server differs even slightly from what the client-side JavaScript expects the framework may discard the entire server-rendered tree and force a full re-render from scratch. This catastrophic failure doubles the CPU workload and negates the initial performance benefits of SSR while leading to poor Core Web Vitals.
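The fallback behavior can be sketched abstractly (this is a simplified model of the framework logic described above, not any framework's actual API): when the server output and the client's expected output disagree, the server tree is discarded and the work is done twice.

```python
def hydrate(server_html: str, client_html: str):
    """Sketch of the hydration fallback: if the trees match, the client
    simply attaches handlers; if they do not, the framework discards the
    server output and re-renders from scratch, doubling the work.
    Returns (final markup, number of render passes)."""
    if server_html == client_html:
        return server_html, 1  # one render pass (the server's)
    return client_html, 2      # mismatch: full client-side re-render

# A timestamp rendered at request time on the server but at hydration
# time on the client is a classic source of mismatch:
print(hydrate("<p>Posted 5 min ago</p>", "<p>Posted 5 min ago</p>"))  # 1 pass
print(hydrate("<p>Posted 5 min ago</p>", "<p>Posted 6 min ago</p>"))  # 2 passes
```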

Furthermore, hydration imposes a “double burden” on the network. The browser must download the HTML representing the UI and the JSON data required to recreate that UI in JavaScript. This bloating of the payload size can delay the Time to Interactive (TTI). This leaves the search engine bot waiting longer to verify that the page is actually functional and safe to index.

The Open Graph failure

While Google has the resources to attempt rendering JavaScript, most other bots do not. The crawlers used by social media platforms like Facebook and LinkedIn are essentially “dumb” HTTP clients. They fetch the HTML and parse it for Open Graph meta tags to generate the preview card.   

In a pure CSR application, these meta tags are often dynamically injected by the JavaScript framework. When the Facebook crawler visits the URL, it sees the empty HTML shell which lacks the specific metadata for that page. As a result, when a user shares a link to a CSR site, it appears as a broken link or a generic homepage preview. This is a failure for social distribution.
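A social crawler's view can be approximated with a simple tag extraction over the raw HTML, with no JavaScript execution. The sample markup below is invented for illustration:

```python
import re

OG_TAG = re.compile(r'<meta\s+property="og:(\w+)"\s+content="([^"]*)"', re.I)

def og_preview(raw_html: str) -> dict:
    """The data a social crawler has for its preview card: Open Graph
    tags present in the raw HTML, with no JavaScript executed."""
    return dict(OG_TAG.findall(raw_html))

ssr_head = ('<head><meta property="og:title" content="Blue Widgets">'
            '<meta property="og:image" content="/widget.jpg"></head>')
csr_shell = '<head><title>App</title></head><body><div id="root"></div></body>'

print(og_preview(ssr_head))   # {'title': 'Blue Widgets', 'image': '/widget.jpg'}
print(og_preview(csr_shell))  # {} -> a broken or generic share card
```

The empty dictionary for the CSR shell is the broken preview card: the tags the framework would have injected simply do not exist at fetch time.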

Bing and DuckDuckGo's strict HTML preference

Ignoring search engines other than Google is a strategic error for many businesses. Bing’s documentation explicitly states that while it can process JavaScript, it has limitations at scale. DuckDuckGo uses its own crawler alongside data from Bing and has limited JavaScript execution capabilities.

Furthermore, the rise of AI-driven search and “Answer Engines” introduces a new variable. Many of these bots prioritize speed and text ingestion. A CSR site that hides its text behind a script wall effectively opts out of being a source for these next-generation AI answers.

The fall of dynamic rendering strategy

For several years, Google offered a compromise solution for CSR sites called Dynamic Rendering. This involved setting up a middleware server to serve a pre-rendered static HTML version of the page to bots while serving the standard JS version to users. However, Google has formally deprecated this recommendation.

The official guidance now labels dynamic rendering as a “workaround” rather than a solution and cites the complexity and fragility of maintaining such systems. Google now explicitly advises developers to move toward Server-Side Rendering or Static Site Generation.
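Mechanically, dynamic rendering was a user-agent split at the edge. The sketch below shows the general shape of that middleware (the token list and file names are illustrative), and also why it was fragile: two render paths that must be kept in sync indefinitely.

```python
BOT_UA_TOKENS = ("googlebot", "bingbot", "facebookexternalhit")  # illustrative

def choose_variant(user_agent: str) -> str:
    """The deprecated dynamic-rendering split: pre-rendered static HTML
    for bots, the JS application shell for human visitors."""
    ua = user_agent.lower()
    if any(token in ua for token in BOT_UA_TOKENS):
        return "prerendered-static.html"
    return "spa-shell.html"

print(choose_variant("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # bot path
print(choose_variant("Mozilla/5.0 (iPhone; Safari)"))             # human path
```

Every page change has to propagate to both variants; any drift between them means bots and users see different content, which is the fragility Google cites.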

The implications of giving up

This shift is profound because it signals that Google is no longer willing to subsidize the computational cost of inefficient web development. By calling it a “workaround,” Google is stating that the architecture is fundamentally flawed for the open web. It implies they effectively gave up on supporting a separate rendering path for bots because it was inefficient for both the crawler and the site maintainers.   

Maintaining a dynamic rendering infrastructure creates technical debt. It requires a separate rendering cluster that can crash or cache stale content. The deprecation is a clear signal that the future of SEO is server-rendered.

Business impact and strategic fixes

The decision to build a CSR application is often driven by developer velocity. However, this creates massive technical SEO debt. Retrofitting SSR onto a mature CSR site often requires a fundamental re-architecture of the application’s routing and data-fetching layers.

Yet the cost of not remediating is often higher: lost traffic due to rendering latency, lost conversions due to poor LCP, and lost social reach. Case studies demonstrate that the ROI of moving to SSR is positive and drives improvements in core business metrics.

Strategic recommendations for 2025

In the current SEO environment, relying on pure Client-Side Rendering is a strategic error. The argument that “Google renders JS” is technically true but practically misleading because it ignores the latency and the probability of failure.

For new projects – the architecture must be Server-Side Rendering (SSR) or Static Site Generation (SSG) by default. Frameworks like Next.js or Nuxt should be employed to deliver static HTML to bots.

For existing CSR projects:

  • Vigilance – rigorous monitoring of the Rendering Queue via Log Analysis is required to detect when the WRS fails to render content.
  • Migration – plan a migration to a meta-framework that supports SSR.
  • Hybridization – if full migration is impossible, implement “Isomorphic” rendering where critical landing pages are server-rendered.