---
title: "Why React Changed How I Think About Building"
date: 2026-03-07
description: "From P5.js canvas hacks to React Three Fiber — the detour that taught me component thinking, even though the site landed on Astro."
tags: ["react","frontend","astro","components","animations","ai-tools"]
readingTime: "10 min read"
url: https://alexmoening.com/dev-thoughts/why-react-for-everything.html
markdownUrl: https://alexmoening.com/dev-thoughts/why-react-for-everything.md
---

# Why React Changed How I Think About Building

[← Back to /dev/thoughts](/dev-thoughts/)

<p class="lead">I spent weeks migrating my site's interactive ASCII portrait from P5.js to React. The result is a 725-line React component with custom GLSL shaders, Three.js instanced meshes, and physics-based particle effects — running inside an Astro static site. The detour taught me more about building for the web than any tutorial could have.</p>

### The P5.js Problem

<p class="section-summary">P5.js is great at drawing. It's terrible at being part of a website.</p>

My interactive ASCII portrait was a self-contained sketch. It fetched a 265KB HTML file of colored spans, parsed them into a grid, and rendered them on canvas with wave effects. Beautiful in isolation. But it existed in its own universe — no awareness of the page layout, no easy way to share state, no lifecycle management beyond `setup()` and `draw()`.
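The parsing step itself was simple enough — something like this (a simplified sketch with illustrative names; the real file was 265KB of spans with per-glyph colors):

```typescript
// Sketch of the span-parsing step, with hypothetical names.
// Each row of the fetched HTML holds colored <span> elements,
// one glyph per span; rows are separated by <br> tags.

interface Cell {
  char: string;
  color: string;
}

function parseAsciiGrid(html: string): Cell[][] {
  return html
    .split(/<br\s*\/?>/)                      // one row per <br>
    .filter((row) => row.trim().length > 0)
    .map((row) => {
      const cells: Cell[] = [];
      const spanRe = /<span style="color:\s*(#[0-9a-fA-F]{6})">(.)<\/span>/g;
      let m: RegExpExecArray | null;
      while ((m = spanRe.exec(row)) !== null) {
        cells.push({ color: m[1], char: m[2] });
      }
      return cells;
    });
}
```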

When I wanted to add a gallery of homepage concepts — different layouts users could toggle between — P5.js had nothing to offer. I was hand-wiring DOM manipulation alongside canvas rendering, managing state in global variables, and praying that the timing of async fetches wouldn't break the draw loop.

It felt like duct-taping a jet engine to a bicycle.

### What React Actually Solved

<p class="section-summary">Components aren't just reusable UI — they're a way of thinking about state.</p>

The migration started as an experiment. I wrapped my ASCII canvas in a React component, expecting to spend most of my time fighting the framework. Instead, something clicked.

React doesn't just give you components. It gives you a **mental model** for how data flows through your application. The ASCII grid becomes state. Wave effects become side effects managed by `useEffect`. User clicks become events that update state, which triggers re-renders, which update the canvas. Every piece has a clear home.
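Stripped of React itself, that data flow is just pure state transitions. A rough sketch (hypothetical names, not the actual component's API):

```typescript
// Illustrative sketch of the state -> render-data flow.
// A click becomes an action; the reducer returns new state;
// the draw pass reads derived values from state alone.

interface WaveState {
  phase: number;       // wave animation phase, in radians
  clickCount: number;
}

type Action =
  | { type: "tick"; dt: number }
  | { type: "click" };

function reduce(state: WaveState, action: Action): WaveState {
  switch (action.type) {
    case "tick":
      return { ...state, phase: state.phase + action.dt * 2 };
    case "click":
      // a click resets the wave and triggers a re-render
      return { phase: 0, clickCount: state.clickCount + 1 };
  }
}

// Row offset derived purely from state — what the canvas draw reads.
function rowOffset(state: WaveState, row: number): number {
  return Math.sin(state.phase + row * 0.35) * 4;
}
```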

Here's what changed concretely:

<table class="data-table">
    <thead>
        <tr><th>Capability</th><th>Before (P5.js)</th><th>After (React)</th></tr>
    </thead>
    <tbody>
        <tr><td>State management</td><td>Scattered global variables</td><td>Wave positions, gradients, dimensions — all in React state with <code>useRef</code> and <code>useState</code></td></tr>
        <tr><td>Lifecycle control</td><td>Orphaned animation loops when navigating away</td><td><code>useEffect</code> handles setup and teardown cleanly</td></tr>
        <tr><td>Composition</td><td>One monolithic sketch</td><td>The portrait became a component I could drop into any layout with a <code>variant</code> prop</td></tr>
        <tr><td>Visual quality</td><td>Canvas 2D at 60fps</td><td>WebGL instanced meshes with custom GLSL shaders, bloom postprocessing, additive blending</td></tr>
    </tbody>
</table>

The P5.js version was 400+ lines of imperative code doing everything at once. The React version — `ParticleWaves.tsx` — separates concerns naturally: a glyph atlas builder, custom vertex and fragment shaders, a camera rig component, and a particle system. Same visual effect, completely different architecture.
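The glyph atlas builder, for instance, boils down to a few lines of index math. A sketch, assuming a 16x16 atlas laid out in ASCII order (the actual builder's layout may differ):

```typescript
// Sketch of glyph-atlas UV lookup. Assumes a 16x16 grid of glyphs
// in ASCII order starting at code 32 (space) — an illustrative
// layout, not necessarily the component's real one.

const ATLAS_COLS = 16;
const ATLAS_ROWS = 16;
const FIRST_CODE = 32;

interface UV { u: number; v: number; w: number; h: number }

function glyphUV(char: string): UV {
  const index = char.charCodeAt(0) - FIRST_CODE;
  const col = index % ATLAS_COLS;
  const row = Math.floor(index / ATLAS_COLS);
  return {
    u: col / ATLAS_COLS,      // left edge of the glyph's cell
    v: row / ATLAS_ROWS,      // top edge
    w: 1 / ATLAS_COLS,        // cell width in UV space
    h: 1 / ATLAS_ROWS,        // cell height
  };
}
```

The vertex shader then reads these per-instance UVs to sample the right glyph from the atlas texture — one texture, one draw call.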

### Going Deeper Than I Expected

<p class="section-summary">React Three Fiber turned what would have been months of raw WebGL into a week of component composition.</p>

Here's where the React ecosystem really showed its value. I didn't just port the P5.js sketch — I rebuilt it on a fundamentally different rendering stack:

<table class="data-table">
    <thead>
        <tr><th>Layer</th><th>Technology</th><th>What It Does</th></tr>
    </thead>
    <tbody>
        <tr><td>Rendering</td><td>React Three Fiber</td><td>Declarative Three.js — the 3D scene is a React component tree</td></tr>
        <tr><td>Geometry</td><td>Three.js <code>InstancedMesh</code></td><td>6,600+ ASCII characters as GPU-instanced quads — one draw call</td></tr>
        <tr><td>Shading</td><td>Custom GLSL</td><td>Vertex shader positions glyphs from a texture atlas; fragment shader handles wave color boosts, shimmer modes, and sparkle effects</td></tr>
        <tr><td>Postprocessing</td><td><code>@react-three/postprocessing</code></td><td>Bloom filter gives the cyberpunk neon glow</td></tr>
        <tr><td>Animation</td><td><code>useFrame</code> hook</td><td>60fps animation loop with scatter/reform cycles, shockwaves on click, mouse gravity</td></tr>
    </tbody>
</table>

The component supports 8 visual variants — `portrait-waves`, `ambient`, `heartbeat`, `shimmer`, `sparkle`, `shimmer-sparkle`, `shimmer-cascade`, `shimmer-ripple` — all controlled by a single `variant` prop. The homepage currently runs `shimmer-cascade`. Click anywhere and the portrait explodes into particles, then reforms.
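The scatter/reform cycle is mostly per-particle vector math run every frame. A framework-free sketch of roughly how such an effect works — illustrative constants, not the component's exact code, which does this on instanced mesh attributes inside `useFrame`:

```typescript
// Simplified per-frame particle update: a click shockwave pushes
// particles away from the click point; afterwards a damped spring
// eases each one back to its "home" position on the portrait grid.

interface Particle {
  x: number; y: number;    // current position
  hx: number; hy: number;  // home position in the portrait
  vx: number; vy: number;  // velocity
}

function applyShockwave(p: Particle, cx: number, cy: number, force: number): void {
  const dx = p.x - cx;
  const dy = p.y - cy;
  const dist = Math.hypot(dx, dy) || 1;  // avoid divide-by-zero at the click point
  p.vx += (dx / dist) * force / dist;    // impulse falls off with distance
  p.vy += (dy / dist) * force / dist;
}

function stepReform(p: Particle, dt: number): void {
  p.vx += (p.hx - p.x) * 4 * dt;  // spring toward home
  p.vy += (p.hy - p.y) * 4 * dt;
  p.vx *= 0.92;                   // heavy damping so it settles, not oscillates
  p.vy *= 0.92;
  p.x += p.vx * dt;
  p.y += p.vy * dt;
}
```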

None of this would have been practical with P5.js. Not because P5 can't do WebGL, but because React Three Fiber's component model means I'm composing a 3D scene the same way I'd compose a form — pieces that snap together with clear data flow.

### The Part Nobody Tells You

<p class="section-summary">React's learning curve is real, but it's the right kind of hard.</p>

I'm not going to pretend this was frictionless. React's model requires you to think differently about rendering. You can't just mutate a variable and expect the screen to update. The rules around hooks — call order, dependency arrays, stale closures — tripped me up more than once.

But here's the thing: every time I hit a wall, the error message or the linter told me *why*. React's constraints aren't arbitrary. They exist because the framework is solving real problems — how to efficiently update the DOM, how to avoid memory leaks, how to keep UI consistent with data.

Compare that to debugging a P5.js sketch where the draw loop runs at 60fps and your wave effect glitches because you modified the array while iterating it. No framework to catch that. Just you and `console.log`.
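For the record, that bug class looks like this (a contrived example, not my actual sketch code):

```typescript
// The classic draw-loop bug: removing items while iterating skips
// the element that shifts into the removed slot.

const buggyCleanup = (waves: number[]): number[] => {
  for (let i = 0; i < waves.length; i++) {
    if (waves[i] <= 0) waves.splice(i, 1); // shifts later items left; i still advances
  }
  return waves;
};

// Safe version: build a new array instead of mutating in place.
const safeCleanup = (waves: number[]): number[] => waves.filter((w) => w > 0);
```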

### Why the Site Runs on Astro, Not React

<p class="section-summary">React is the right tool for interactive components. Astro is the right tool for a static site.</p>

Here's the twist in the story. After building this React component and falling in love with the mental model, I didn't go all-in on a React framework like Next.js. The site runs on **Astro**.

Why? Because a personal website is mostly static content. Blog articles are Markdown files with frontmatter. The about page is HTML with CSS. The project cards are a data file rendered at build time. React would be overkill for 90% of the site — shipping a JavaScript runtime so the browser can render a paragraph of text.

Astro's architecture solves this elegantly: **zero JavaScript by default, with opt-in interactivity**. The blog pages, about page, contact page — they ship as pure HTML and CSS. No framework runtime. No hydration overhead. Fast.

But when I need interactivity — the ASCII portrait, the prototype gallery — Astro's island architecture lets me drop in a React component with `client:load`:

<pre class="terminal"><code><span class="ansi-gray">&lt;!-- src/pages/index.astro --&gt;</span>
<span class="ansi-white">&lt;</span><span class="ansi-yellow">ParticleWaves</span> <span class="ansi-cyan">variant</span><span class="ansi-white">=</span><span class="ansi-green">"shimmer-cascade"</span> <span class="ansi-magenta">client:load</span> <span class="ansi-white">/&gt;</span></code></pre>

That one line loads the full React + Three.js + postprocessing stack, but *only on the homepage*. Navigate to a blog post and none of that JavaScript ships.
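For completeness: React islands need the `@astrojs/react` integration enabled once in the config —

```javascript
// astro.config.mjs — enable React islands site-wide
import { defineConfig } from 'astro/config';
import react from '@astrojs/react';

export default defineConfig({
  integrations: [react()],
});
```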

<table class="data-table">
    <thead>
        <tr><th>Page</th><th>JavaScript Shipped</th><th>Framework</th></tr>
    </thead>
    <tbody>
        <tr><td>Homepage</td><td>~180KB (React + Three.js + shaders)</td><td>React island via Astro</td></tr>
        <tr><td>Blog articles</td><td>0KB</td><td>Pure HTML/CSS</td></tr>
        <tr><td>About, Contact, Projects</td><td>0KB</td><td>Pure HTML/CSS</td></tr>
        <tr><td>Prototype gallery</td><td>~180KB per prototype</td><td>React islands</td></tr>
    </tbody>
</table>

This is the architecture I wish someone had shown me earlier: **use React where React shines (interactive, stateful, complex rendering) and use nothing where nothing is needed (content pages)**. Astro makes that boundary explicit rather than accidental.

### The AI Design Workflow

<p class="section-summary">Claude Code skills turn "I want that pattern" into production CSS in one prompt.</p>

Here's something I didn't expect: AI design tools have become part of my workflow. Specifically, a Claude Code skill called [ui-ux-pro-max](https://github.com/nextlevelbuilder/ui-ux-pro-max-skill) that acts like having a design system consultant sitting next to you.

The way it works: you find a component you like — say, a card with a dot-grid pattern from a shadcn example site — and you feed it to the skill. It hands you back production-ready code with the component, CSS, dependencies, and implementation guidelines. Then you adapt it.

I used this workflow to upgrade the article cards on my [/dev-thoughts](/dev-thoughts/) page. The skill generated a `GridPatternCard` component with an SVG dot-grid background, Framer Motion entrance animations, and a gradient overlay. My site doesn't use Tailwind or shadcn — it's Astro with hand-written CSS and a cyberpunk palette. So instead of dropping the component in verbatim, I extracted the pattern:

<table class="data-table steps">
    <thead>
        <tr><th>Step</th><th>Action</th><th>Detail</th></tr>
    </thead>
    <tbody>
        <tr><td>1</td><td>Took the SVG grid pattern</td><td>From the component's Tailwind config (800x800 viewBox, rectangles for grid lines, circles at intersections)</td></tr>
        <tr><td>2</td><td>Adapted the colors</td><td>Swapped white strokes for cyan (<code>hsla(180,100%,50%,1)</code>) and white dots for magenta (<code>hsla(300,100%,50%,1)</code>)</td></tr>
        <tr><td>3</td><td>Encoded as CSS custom property</td><td><code>--grid-pattern-cyber</code> layered under a dark gradient overlay</td></tr>
        <tr><td>4</td><td>Applied to existing cards</td><td>No new components, no framework changes, just a CSS variable swap</td></tr>
    </tbody>
</table>
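The end state is just a custom property plus a layered background — roughly this, with an abbreviated data URI and a hypothetical class name (the real SVG is the full 800x800 pattern):

```css
/* Sketch of the extracted pattern. The data URI here is a tiny
   stand-in; the actual one encodes the full grid-and-dots SVG. */
:root {
  --grid-pattern-cyber: url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 80 80'%3E%3Cpath d='M40 0v80M0 40h80' stroke='hsla(180,100%25,50%25,0.15)'/%3E%3Ccircle cx='40' cy='40' r='1.5' fill='hsla(300,100%25,50%25,0.6)'/%3E%3C/svg%3E");
}

.article-card {
  background-image:
    linear-gradient(rgba(10, 10, 18, 0.85), rgba(10, 10, 18, 0.95)),
    var(--grid-pattern-cyber);
}
```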

The result: the same dot-grid visual language you see on polished React component libraries, but rendered in pure CSS on a static Astro page. No Framer Motion needed. No React rendering overhead. Just the design pattern, extracted and adapted.

This is what makes AI design tools useful in practice — not generating entire pages, but accelerating the "I saw something I liked, now I need to make it mine" workflow.

### What I'd Tell My Past Self

<p class="section-summary">Learn React. Ship Astro. Use each where it belongs.</p>

If I were rebuilding this site from scratch, I wouldn't skip the React phase. That exploration taught me component thinking, state management patterns, and the value of declarative rendering. Those lessons inform how I build everything now — even the parts that don't use React.

But I also wouldn't build the whole site in React. The right answer turned out to be both:

<table class="data-table">
    <thead>
        <tr><th>What I Need</th><th>Tool</th><th>Why</th></tr>
    </thead>
    <tbody>
        <tr><td>Interactive 3D portraits</td><td>React + Three.js</td><td>Component composition, shader management, 60fps animation loops</td></tr>
        <tr><td>Blog articles</td><td>Astro + Markdown</td><td>Content collections, zero JS, fast builds</td></tr>
        <tr><td>Page layouts</td><td>Astro components</td><td>HTML templates with scoped CSS, no runtime</td></tr>
        <tr><td>Design patterns</td><td>AI-assisted extraction</td><td>Borrow from React ecosystem, implement in CSS</td></tr>
    </tbody>
</table>

The cyberpunk ASCII portrait still runs at 60fps. The particles still scatter and reform. The shimmer cascade still flows like rain on glass. But now it's a React island in an Astro ocean — interactive where it needs to be, static everywhere else.

That's the shift. React didn't make my animation better. It gave me a mental model for how complex interactive systems should be structured. Astro gave me the discipline to not apply that model where it isn't needed. Together, they make a site that's both more capable and more efficient than either framework alone.

---

*Building with React islands in Astro? I'm always up for comparing notes — find me on [LinkedIn](https://www.linkedin.com/in/alexmoening/).*

---

## Navigation

- [Home](/)
- [About](/about.html)
- [Projects](/projects.html)
- [Contact](/contact.html)
- [/dev/thoughts](/dev-thoughts/)

*Copyright 2026 Alex Moening. Opinions expressed are my own.*
