Client-side rendering (CSR)
An architecture in which the server sends a minimal HTML shell and a JavaScript bundle, and the browser builds the page in place by executing JavaScript.
Also known as: CSR
Client-side rendering (CSR) is an architecture in which the server sends a minimal HTML document plus one or more JavaScript files. The browser executes the JavaScript, which fetches data, builds the page’s content, and inserts it into the DOM. The browser does the rendering work that would otherwise be done on the server under server-side rendering (SSR).
How CSR works
A typical CSR page load:
- The browser requests the page
- The server returns a small HTML file (often just a `<div id="app"></div>` plus script tags)
- The browser downloads the JavaScript bundle
- The JavaScript runs, fetches any required data from APIs, and builds the page’s content
- The page becomes visible and interactive
Subsequent navigation within the application can be very fast because the JavaScript is already loaded; only data needs to be fetched.
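The load sequence above can be sketched as a minimal CSR bootstrap. This is an illustrative sketch, not any framework’s API: the `/api/posts` endpoint, the `Post` shape, and the `mount` function are all hypothetical; only the `#app` shell element comes from the example above.

```typescript
// Hypothetical data shape for illustration.
type Post = { title: string; body: string };

// Pure render step: turn fetched data into markup.
// (Real code should escape user-supplied content before interpolating it.)
function renderPosts(posts: Post[]): string {
  return posts
    .map((p) => `<article><h2>${p.title}</h2><p>${p.body}</p></article>`)
    .join("");
}

// Browser entry point: fetch data, then fill the empty shell the server sent.
async function mount(): Promise<void> {
  const res = await fetch("/api/posts"); // hypothetical endpoint
  const posts: Post[] = await res.json();
  document.querySelector("#app")!.innerHTML = renderPosts(posts);
}
```

Until `mount` finishes, the user sees whatever the empty shell shows, which is why CSR’s first paint lags behind SSR’s.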
Common CSR architectures
CSR is the default rendering model for traditional single-page applications (SPAs) built with frameworks like:
- React (without SSR)
- Vue (without SSR)
- Angular
- Svelte (without SSR)
- Ember
- Backbone and older Knockout-based applications
Most of the modern frameworks also support SSR or static rendering; the distinction is whether the initial HTML is built on the server or in the browser.
CSR vs SSR vs static
| Aspect | CSR | SSR | Static |
|---|---|---|---|
| Where HTML is built | Browser | Server | Build server |
| Initial page weight | Small HTML + larger JS | Full HTML | Full HTML |
| First Contentful Paint | Slower | Faster | Fastest |
| In-app navigation | Often very fast | Can require server round-trip | Full page load unless paired with client-side routing |
| SEO indexing | Requires JS-aware crawlers | Reliable | Reliable |
| Hosting | Static host or CDN | Application server | Static host or CDN |
Performance characteristics
CSR pages tend to:
- Have a slower First Contentful Paint and Largest Contentful Paint than SSR or static
- Have larger initial JavaScript payloads
- Feel responsive after the initial load, especially for app-like navigation
- Benefit from code splitting, lazy loading, and modern JavaScript build tools to reduce bundle size
Performance varies widely depending on bundle size, network conditions, and device capability.
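Code splitting, mentioned above, usually relies on dynamic `import()`: bundlers such as webpack, Rollup, and Vite emit each dynamically imported module as a separate chunk fetched on demand. One small sketch of the pattern is a lazy-once wrapper (the helper name `lazy` is made up for illustration) that ensures a chunk is fetched at most once no matter how many times a view is shown:

```typescript
// Wrap a loader so the underlying import runs at most once;
// subsequent calls reuse the same in-flight or resolved promise.
function lazy<T>(load: () => Promise<T>): () => Promise<T> {
  let cached: Promise<T> | undefined;
  return () => (cached ??= load());
}

// With a bundler, the argument would be a dynamic import, e.g.:
// const loadSettings = lazy(() => import("./settings"));
// await loadSettings(); // fetches the chunk only on first call
```

Because the heavy module is no longer referenced statically, it stays out of the initial bundle, which directly improves the first-load metrics discussed above.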
SEO considerations
Search engines must execute JavaScript to see the content of a CSR page. Google’s crawler does execute JavaScript, but rendering is queued separately from initial crawling, which can delay indexing. Other search engines and many social media link previewers do not execute JavaScript at all.
For SEO-sensitive content, common patterns include:
- Pre-rendering critical pages as static HTML (Next.js, Nuxt static export)
- Using SSR for the initial load, then hydrating with client-side JavaScript
- Server-rendering only the HTML head (titles, meta tags) and rendering the body client-side
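The last pattern in the list can be sketched as a small server-side template function. This is a hand-rolled illustration, not a framework API: `PageMeta` and `renderShell` are hypothetical names, and the body remains the empty CSR shell while the head carries real, crawler-visible metadata.

```typescript
// Hypothetical per-page metadata.
type PageMeta = { title: string; description: string };

// Server-side: emit real <title> and meta tags for crawlers and link
// previewers, while the body stays an empty shell for the CSR bundle.
function renderShell(meta: PageMeta): string {
  return [
    "<!doctype html><html><head>",
    `<title>${meta.title}</title>`,
    `<meta name="description" content="${meta.description}">`,
    "</head><body>",
    '<div id="app"></div>',
    '<script type="module" src="/bundle.js"></script>',
    "</body></html>",
  ].join("");
}
```

This keeps titles and descriptions reliable for non-JavaScript crawlers while leaving the rest of the page client-rendered.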
When CSR tends to fit
- Web applications behind a login (admin dashboards, internal tools)
- Highly interactive apps where initial render speed is less important than in-app experience
- Cases where SEO is not relevant (the content is gated or app-like)
When other approaches tend to fit better
- Public, SEO-sensitive content (marketing sites, blogs, documentation)
- Pages where time-to-content matters most (landing pages)
- Devices with slower CPUs or networks where large JS bundles are costly
Hybrid approaches
Most modern frameworks blend CSR with SSR or static rendering. The HTML is server-rendered or pre-built; the JavaScript “hydrates” the page to enable interactivity. This combines fast initial load with rich in-app behavior.
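One common ingredient of hydration is that the server embeds the initial data in the HTML (frameworks use ids like `__NEXT_DATA__`; the `__STATE__` id below is illustrative), so the client can pick up where the server left off without re-fetching. A minimal sketch of reading that state back out:

```typescript
// Recover the server-serialized initial state from the rendered HTML
// instead of re-fetching it from the API on first load.
// The script id and JSON envelope are illustrative, not a standard.
function readInitialState<T>(html: string): T | undefined {
  const match = html.match(
    /<script id="__STATE__" type="application\/json">(.*?)<\/script>/s
  );
  return match ? (JSON.parse(match[1]) as T) : undefined;
}
```

With the state recovered, the client-side code only needs to attach event listeners to the existing markup, which is what makes hydrated pages interactive quickly.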
Common misconceptions
- “CSR is faster than SSR.” CSR is typically slower for first contentful paint but can feel faster for in-app navigation after load.
- “All React apps use CSR.” React supports CSR, SSR, and static rendering; modern frameworks like Next.js and Remix default to SSR or hybrid.
- “CSR is bad for SEO.” It can be problematic, but Google handles JavaScript content; the main risks are indexing latency and non-Google crawlers.