From Zero to Production in Hours: A Brutally Honest Retrospective on Building an AI-Native Portfolio with Next.js, Neon, and Claude Code
Technical Build Retrospective — March 2026
What Got Built
A full-stack portfolio and admin dashboard for an AI Automation Architect persona. Public marketing site with six pages (home, articles, projects, systems, about, contact), plus a JWT-gated admin dashboard for content management and a custom analytics pipeline. The entire stack: Next.js 14 App Router, TypeScript strict mode, Tailwind CSS, Neon Postgres, Vercel, jose for JWT, and bcryptjs for password hashing.
Honest verdict on vibe coding: Viable for 80% of the build. The remaining 20% — runtime boundaries, secret management, DB connectivity, security hardening — still bites hard if you skip it.
Deploy hygiene rating: 6/10. Working in production, but with unresolved security issues that would keep any experienced engineer awake at night.
Spec vs. What Actually Shipped
The master prompt was ambitious. Here's what landed and what didn't.
Shipped clean: Public site (all pages), JWT admin auth, full article and project CRUD with slug-based routing, analytics event ingestion with client-side batching and rate limiting.
Shipped rough: Analytics dashboard stores events correctly but visualization is basic. GitHub repo metrics endpoint exists but requires manual sync — no automation.
Didn't ship at all: TOTP two-factor authentication (placeholder UI only, backend ignores the field), streaming architecture (loading.tsx / error.tsx files never created), formal database migration system, rate limiting on admin routes.
The gap between spec and production isn't unusual. What's notable is which things fell off — all of them are hardening concerns, not features. The vibe-coded path delivers features fast and leaves infrastructure debt behind.
Architecture: How It Actually Works
The Runtime Split That Broke Everything
The single most important architectural fact about this project is the Edge/Node runtime boundary. Vercel runs middleware in V8 isolates (Edge Runtime) — no Node.js standard library, no crypto module, no fs. API routes and Server Components run in full Node.js.
This means any library you use in middleware must work with Web APIs only. The AI suggested jsonwebtoken — the most popular JWT library in the Node ecosystem. It silently fails in Edge because it depends on crypto.createHmac. This cost 3–4 debug cycles and multiple commits before switching to jose, which uses the Web Crypto API and works everywhere.
The Auth Architecture
Three layers of defense, each independent:
- Middleware (Edge) — intercepts all /admin/* requests, reads the HTTP-only cookie, and verifies the JWT with jose. Redirects to login if invalid.
- Admin Layout (Server) — calls getAuthUser() on every render. If it returns null, redirects. This catches anything middleware misses.
- API Route Guards (Node) — every admin API handler calls requireAuth() before executing. Even if both previous layers fail, the data layer is protected.
Route groups (auth) and (dashboard) under app/admin/ are the structural trick that makes this work. The login page renders without the sidebar and without the auth check — without those parenthesized folders, you get an infinite redirect loop because the layout's auth check fires on the login page itself.
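Concretely, the layout that avoids the loop looks something like this (file names beyond the two route groups named above are assumed for illustration):

```
app/
  admin/
    (auth)/
      login/
        page.tsx      <- no sidebar, no auth check
    (dashboard)/
      layout.tsx      <- sidebar + getAuthUser() redirect
      page.tsx        <- /admin index
```

The parentheses keep (auth) and (dashboard) out of the URL — both subtrees still resolve under /admin/* — but each gets its own layout, so the auth check lives only where it belongs.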
Custom Analytics Pipeline
No third-party scripts. No GDPR liability from external tracking. The pipeline works like this:
The client generates a visitor fingerprint and session ID, queues events in memory with a 500ms debounce, and flushes batches to a server endpoint. The server validates event types against a whitelist, rate-limits by IP, and inserts the events into Neon's events table with JSONB metadata for flexible context.
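Stripped of transport details, the client-side batching described above reduces to a debounced queue. A sketch (the 500 ms window comes from the text; everything else is assumed — the flush function is injected so fetch/sendBeacon stay out of the picture):

```typescript
// Debounced event queue: events accumulate in memory and are flushed
// as a single batch once 500 ms pass without a new enqueue.
type AnalyticsEvent = { type: string; metadata?: Record<string, unknown> };

class EventQueue {
  private buffer: AnalyticsEvent[] = [];
  private timer: ReturnType<typeof setTimeout> | null = null;

  constructor(
    private flush: (batch: AnalyticsEvent[]) => void,
    private debounceMs = 500,
  ) {}

  enqueue(event: AnalyticsEvent): void {
    this.buffer.push(event);
    if (this.timer) clearTimeout(this.timer); // restart the debounce window
    this.timer = setTimeout(() => this.drain(), this.debounceMs);
  }

  // Also callable directly, e.g. on pagehide, to avoid losing the tail
  // of a session.
  drain(): void {
    if (this.timer) clearTimeout(this.timer);
    this.timer = null;
    if (this.buffer.length === 0) return;
    const batch = this.buffer;
    this.buffer = [];
    this.flush(batch);
  }
}
```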
What's broken: React Strict Mode double-fires effects (duplicate page views), mobile beforeunload doesn't fire reliably (lost time-on-page data), and the rate limiter uses an in-memory Map that resets on every serverless cold start.
The Debugging Retrospective (Incident-Style)
Incident 1: JWT Silently Failing in Edge Runtime
Symptom: Login API returns 200, cookie is set, browser shows cookie exists — but navigating to /admin immediately redirects back to login. Infinite loop.
Root cause: jsonwebtoken uses Node's crypto module. Edge Runtime is V8-only. The verifyToken() function threw an error that was silently caught, returning null. Middleware treated every token as invalid.
The fix was conceptually simple — migrate to jose. But diagnosing it required understanding that local next dev doesn't enforce Edge restrictions, that the error was swallowed by a try/catch, and that the library's incompatibility produces zero useful error messages. This is exactly the kind of problem that vibe coding creates: the AI confidently installs the wrong library, and if you don't know about runtime boundaries, you have no mental model for why things break.
Incident 2: Database Connection Refused
Symptom: All API routes return 500 with ECONNREFUSED 127.0.0.1:5432.
Root cause: The database connection string environment variable wasn't set in Vercel, so the connection pool fell back to localhost defaults. Additionally, Neon requires rejectUnauthorized: false in the pool's SSL config — missing this produces a separate but equally cryptic failure.
Lesson: Environment variables are deployment configuration, not code. If your DB URL is missing, your app should fail loudly at startup, not silently fall back to connecting to a database that doesn't exist.
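The "fail loudly" lesson fits in a tiny helper — a sketch, not the project's actual configuration code (the variable name in the comment is the conventional one, assumed here):

```typescript
// Fail fast on missing deployment configuration instead of letting a
// connection pool silently fall back to localhost defaults.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(
      `Missing required environment variable ${name} — refusing to start.`,
    );
  }
  return value;
}

// Call once at module load so a misconfigured deploy dies at boot,
// not on the first request, e.g.:
//   const connectionString = requireEnv("DATABASE_URL");
```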
Incident 3: Two Different JWT Secrets
Symptom: After fixing Edge runtime, tokens still failed verification — but intermittently.
Root cause: Two files had their own JWT helper functions, each with a different hardcoded fallback secret. Tokens generated by one couldn't be verified by the other.
Lesson: Single source of truth for secrets. One auth module. No fallbacks in production — fail if the environment variable is missing.
Incident 4: Login Redirect Truncation
Symptom: After successful login, the browser navigated to a truncated URL or stayed on the login page.
Root cause: router.push("/admin") is a soft navigation — it doesn't force the browser to re-read cookies. The middleware still saw the old (empty) cookie state.
Fix: window.location.replace("/admin") — a hard redirect that forces a full page reload and fresh cookie evaluation. Simple, but only obvious if you understand the difference between client-side routing and browser navigation.
Incident 5: Analytics Double-Counting
Symptom: Inflated page view counts.
Root cause: useEffect without a ref guard runs twice in React Strict Mode. HMR hot reloads compound the problem.
Lesson: Always use ref guards for one-time analytics initialization. Never assume useEffect runs exactly once.
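Stripped of React specifics, the ref-guard pattern is just idempotent initialization. A sketch of the principle (the project's actual hook isn't shown in the source):

```typescript
// Wrap one-time work so repeated invocations — Strict Mode's double
// effect run, HMR re-mounts — fire it exactly once.
function createOnce(fn: () => void): () => void {
  let fired = false;
  return () => {
    if (fired) return; // the second Strict Mode invocation lands here
    fired = true;
    fn();
  };
}
```

Inside a component, a useRef(false) plays the role of the closure flag, since refs persist across Strict Mode's duplicate effect invocation while state resets would not help.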
Incident 6: Schema Drift
Symptom: Production error — column doesn't exist.
Root cause: Two SQL files defined the same table differently. The original schema was applied to production; app code referenced columns that only existed in the patch file. Manual SQL execution in a console with no version control.
Lesson: Use a migration framework. Every schema change is a versioned file. No "fix" patches applied by hand.
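The discipline is simple enough to sketch without a framework (illustrative only — Drizzle, node-pg-migrate, and similar tools do this for real, with the ledger stored in a table): keep migrations ordered and apply only the ones the ledger hasn't seen.

```typescript
// Minimal forward-only migration runner: each migration has a unique,
// ordered id; an "applied" ledger records what already ran, so the
// same list is safe to run on every deploy.
type Migration = { id: string; up: () => void };

function runMigrations(migrations: Migration[], applied: Set<string>): string[] {
  const ran: string[] = [];
  for (const m of [...migrations].sort((a, b) => a.id.localeCompare(b.id))) {
    if (applied.has(m.id)) continue; // already at or past this version
    m.up();
    applied.add(m.id);
    ran.push(m.id);
  }
  return ran;
}
```

The schema-drift incident above is exactly what this shape prevents: the "patch" SQL file becomes migration 002, and production can never have applied 001 without it.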
Vibe Coding vs. Real Engineering: The Honest Assessment
Where AI-Assisted Development Excels
Scaffolding, boilerplate elimination, repetitive patterns. The entire public site, all CRUD forms, analytics client code, and the database schema were generated faster with AI than without. When you can describe a clear deliverable — "Route handler that validates slug uniqueness and returns 409 on conflict" — the AI delivers clean, working code.
Where It Falls Apart
Runtime boundaries. The AI doesn't inherently know that Vercel Edge is V8-only. It suggests the popular library, not the compatible one.
Deployment specifics. SSL configuration, cookie sameSite behavior on multi-domain deployments, hard vs. soft redirects after authentication — these are environment-specific problems that require deployment experience.
Security hygiene. The AI built a debug endpoint that exposed sensitive data. It added hardcoded secret fallbacks. It didn't add rate limiting to the login endpoint. It didn't add CSRF protection. The AI builds what you ask for — and you have to know to ask for the secure version.
Dead code accumulation. Old library code remained after migration. The AI fixed the immediate problem, not the technical debt surrounding it.
The Knowledge You Still Need
You need to know which runtime your code runs in, how to read error logs and what they mean, HTTP fundamentals (cookies, redirects, CORS), how to review generated SQL for injection surfaces and missing indexes, and which security measures matter for your context.
How to Apply That Knowledge
Treat every AI-generated file as a pull request. Review line by line. Ask: Is this library Edge-compatible? Does this endpoint require auth? Is this secret hardcoded?
State constraints upfront. "Use only Edge-compatible libraries for middleware" prevents entire classes of bugs. "No hardcoded secrets, even fallbacks" forces proper environment variable handling.
Validate stepwise. Generate → run → test → commit → repeat. Don't let twenty files accumulate before running anything.
What I'd Do Differently
From day one: Start with jose (never jsonwebtoken), add loading.tsx stubs at every async boundary, use a migration framework like Drizzle ORM, validate environment variables at startup with Zod (fail loudly if missing), and set up a security checklist before the first deploy.
Architectural changes: Use @neondatabase/serverless instead of pg.Pool for better serverless cold-start behavior. Use Redis (Upstash) for distributed rate limiting instead of in-memory Maps. Add proper form state management (React Hook Form + Zod) instead of fifteen useState hooks per form.
Security hardening: Rate limiting on the login endpoint, security headers in middleware (X-Frame-Options, X-Content-Type-Options, CSP), consent banner for analytics, and no debug endpoints in production — ever.
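The login rate limiting called for above can be sketched as a fixed-window counter. Shown in-memory for clarity — which, as noted earlier, resets on every serverless cold start; the same shape maps onto Redis INCR/EXPIRE for the distributed version:

```typescript
// Fixed-window rate limiter: at most `limit` attempts per key (e.g. an
// IP address) per window. The clock is injectable for testing.
class RateLimiter {
  private hits = new Map<string, { count: number; windowStart: number }>();

  constructor(
    private limit: number,
    private windowMs: number,
    private now: () => number = Date.now,
  ) {}

  allow(key: string): boolean {
    const t = this.now();
    const entry = this.hits.get(key);
    if (!entry || t - entry.windowStart >= this.windowMs) {
      this.hits.set(key, { count: 1, windowStart: t }); // fresh window
      return true;
    }
    entry.count += 1;
    return entry.count <= this.limit;
  }
}
```

A login handler would call allow(ip) before ever touching bcrypt, returning 429 on refusal — cheap rejection before expensive hashing.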
The Bottom Line
This portfolio went from zero to production in hours. The AI compressed what would have been days of boilerplate into focused generation sessions. But every hour saved on scaffolding was partially reinvested in debugging problems the AI created — wrong library choices, missing environment configuration, security gaps nobody asked it to close.
Vibe coding is a force multiplier. It is not a substitute for engineering judgment. The 80% it handles well makes you faster. The 20% it handles poorly can make you slower than if you'd written it by hand — because debugging someone else's confident-but-wrong code is harder than debugging your own uncertain-but-intentional code.
The honest takeaway: learn enough to supervise the machine, or the machine will ship your vulnerabilities to production with the same confidence it ships your features.
Built with Next.js 14, Neon Postgres, Vercel, and Claude Code. Written from source evidence only.