AI-Generated Website - A Practical Example

My experience building this website with AI-assisted development and OpenSpec instead of unstructured “vibe coding”.

TL;DR

My personal website was outdated for over four years and difficult to maintain. For the redesign, I relied on AI-assisted development with OpenSpec instead of unstructured “vibe coding”. The specification-driven approach proved more efficient: I first described the project roughly, then implemented it in small steps – from initialization through content migration to multilingual support. The openly formulated proposals allowed flexible corrections during implementation, and sub-agents in Claude Code let me automate deployment. Conclusion: AI-assisted development demands structure and documentation, but feels like productive teamwork – you review more code than you write.

Starting Point

For more than four years, my personal website sat unchanged online. The content was outdated, and maintaining the static HTML was tedious – hardly motivating. With the goal of repositioning myself professionally and increasing my visibility, I restarted the project.

Why AI – But Not “Vibe Coding”

I’ve been experimenting with AI-assisted software development for some time. For simple cases the results are usable, but on more complex tasks they are often unreliable. This experience convinced me to abandon unstructured “vibe coding” and work in a specification-driven manner instead.

Spec-Driven Development in Practice

The basic idea: instead of having the AI generate code directly, first create a precise description of what the software should do. This specification then serves the AI as a guardrail for maintainable, high-quality code.

A first experiment was a web application for summarizing and translating user feedback. The results were good, but spec-kit generated too many artifacts and consumed a disproportionate number of tokens. For my new website, I therefore use OpenSpec: noticeably lighter, faster, and significantly more token-efficient.

Describing “The Big Picture”

I started from scratch: I installed OpenSpec and initialized it in an empty project, with Claude Code as my AI tool. Right afterward, I described the project in openspec/project.md. A rough sketch was sufficient – the agent structured much of it sensibly on its own and filled in the gaps correctly.

# Project Context
## Purpose
Converting the existing static HTML website at markusgraf.ch
into a Hugo-based static site...

Proposals Instead of Rigid Specifications

OpenSpec differs from spec-kit in that specifications are deliberately formulated more openly – as proposals rather than something set in stone. During implementation, this leaner plan proved easier to adapt; with spec-kit, the same change is harder because multiple documents must be kept in sync.

Drawing on my spec-kit experience, I deliberately kept the iterations very small: first just initializing an empty site with the static generator Hugo, then migrating the content of the existing page, adding a CV page as the first extension, and finally introducing an optional English translation.

These steps remained manageable and required no changes to the initial plan. With deployment, however, I got stuck. Originally I wanted a script that uploads via FTPS – the obvious choice for reseller hosting, but outdated and unreliable in practice. I therefore switched to OpenSSH secure copy (scp). It quickly became apparent that files were transferred but not cleanly mirrored – the server threatened to descend into chaos.

The solution was to switch to rsync over SSH, which mirrors the target structure exactly. I had the proposal, specifications, and tasks adjusted accordingly, and the adapted implementation then worked smoothly. The value of small, openly formulated proposals became particularly clear here: the plan can be adapted to new insights during implementation.

Cleaning Up and Automating

After a successful rollout, I archive the implemented changes: artifacts go into an archive folder with a timestamp, while the specifications move into openspec/specs/ as the single source of truth. This keeps the current specifications clearly separated from the proposed changes in openspec/changes/.
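The archiving step just described could look roughly like this in shell – the folder layout and the change name are illustrative assumptions based on the description above, not OpenSpec’s documented behavior:

```shell
#!/bin/sh
# Sketch: move an implemented change out of openspec/changes/ into a
# timestamped archive folder. Layout and naming are assumptions.
set -eu

archive_change() {
  change=$1
  stamp=$(date +%Y-%m-%d)
  mkdir -p openspec/changes/archive
  mv "openspec/changes/$change" "openspec/changes/archive/$stamp-$change"
}
```

The point is not the mechanics but the resulting separation: open proposals live in openspec/changes/, finished ones in the archive, and openspec/specs/ always reflects the current state.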

Finally, I tested sub-agents in Claude Code – an obvious use case is deployment.

## Workflow
1. Verify current branch is 'main'
2. Build Hugo site
3. Verify build success
4. Execute scripts/deploy.sh

With a suitable prompt (“deploy changes”), the agent automatically checks the prerequisites, builds the static pages, and publishes the result. If a prerequisite is not met – an outstanding commit, for example – the agent waits and only then continues.
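The prerequisite checks the agent runs through can be sketched as plain shell. The branch name, build command, and script path follow the workflow above; the function names are hypothetical:

```shell
#!/bin/sh
set -eu

# Workflow step 1: refuse to deploy unless we are on 'main'
# with a clean working tree.
deploy_ready() {
  branch=$(git rev-parse --abbrev-ref HEAD)
  if [ "$branch" != "main" ]; then
    echo "not on main (currently on $branch)" >&2
    return 1
  fi
  if [ -n "$(git status --porcelain)" ]; then
    echo "uncommitted changes present" >&2
    return 1
  fi
}

# Workflow steps 2-4: build, verify the output exists, then publish.
deploy() {
  deploy_ready || return 1
  hugo --minify
  test -d public || return 1
  sh scripts/deploy.sh
}
```

What the sub-agent adds over a plain script is the conversational loop: instead of aborting on a failed check, it can point out what is missing and pick up where it left off once the prerequisite is met.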

Conclusion: Unfamiliar – And Precisely Therefore Exciting

AI-assisted development forces structure: specify, document, verify. That you end up predominantly reviewing code rather than writing it is unfamiliar – but productive. The dialogue with the system feels like collaborating in a team: you discuss sensible implementation paths, ask questions, and quickly arrive at reliable results. And quite honestly: it’s simply fun to chat so casually with the computer, as if it were a good friend.