Building a VST Looper with AI
How I used Claude to build CAPO in ~4 hours. The actual prompts, iterations, and lessons learned.
The Process
I didn't write most of this code manually. Here's my actual workflow:
- Describe what I want in plain language
- Claude generates code (often 80-90% correct)
- Test and iterate — describe errors, get fixes
- Refine edge cases — ask about scenarios I didn't think of
Total time: ~4 hours from zero to working VST
Key Prompts That Worked
Prompt 1: Initial Architecture
I want to build a looper pedal VST plugin in Rust using nih-plug.

Requirements:
- Record audio when I press a button
- Play it back in a loop when I press again
- Calculate BPM based on how long I recorded
- Sync with Ableton Link to set the session tempo

Start with the core looper logic: a state machine for Idle → Recording → Playing → Overdubbing.
Lesson: Being specific about the state machine upfront saved iterations later.
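The state machine from that prompt can be sketched in a few lines. This is a minimal illustration, not CAPO's actual code; the type and method names (`LooperState`, `on_press`) are my own:

```rust
// Minimal sketch of the looper state machine: each button press
// advances Idle → Recording → Playing → Overdubbing, then toggles
// between Playing and Overdubbing.
#[derive(Debug, Clone, Copy, PartialEq)]
enum LooperState {
    Idle,
    Recording,
    Playing,
    Overdubbing,
}

impl LooperState {
    fn on_press(self) -> LooperState {
        match self {
            LooperState::Idle => LooperState::Recording,
            LooperState::Recording => LooperState::Playing,
            LooperState::Playing => LooperState::Overdubbing,
            LooperState::Overdubbing => LooperState::Playing,
        }
    }
}

fn main() {
    let mut state = LooperState::Idle;
    for _ in 0..3 {
        state = state.on_press();
        println!("{:?}", state); // Recording, Playing, Overdubbing
    }
}
```

Encoding the transitions in one `match` makes invalid transitions unrepresentable, which is likely why spelling out the states upfront saved iterations.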
Prompt 2: BPM Calculation
When the user finishes recording, I need to calculate BPM.

Given:
- The loop length in samples
- The sample rate
- The number of bars the user selected (1, 2, 4, 8, or 16)

Formula needed: BPM = (bars × 4 beats) / seconds × 60

Please add this to the looper, triggered when recording ends. Clamp to the 30-300 BPM range.
Lesson: Giving the formula explicitly avoided ambiguity.
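The formula translates directly to code. A sketch under the article's assumptions (4/4 time, 30-300 BPM clamp); the function name is mine:

```rust
/// Compute loop tempo from length in samples, sample rate, and bar count.
/// Assumes 4 beats per bar (4/4 time); result clamped to 30-300 BPM.
fn calculate_bpm(loop_len_samples: usize, sample_rate: f64, bars: u32) -> f64 {
    let seconds = loop_len_samples as f64 / sample_rate;
    let beats = bars as f64 * 4.0;
    let bpm = beats / seconds * 60.0;
    bpm.clamp(30.0, 300.0)
}

fn main() {
    // A 2-second loop (88,200 samples at 44.1 kHz) recorded as 1 bar:
    // 4 beats over 2 seconds → 120 BPM.
    println!("{}", calculate_bpm(88_200, 44_100.0, 1)); // 120
}
```

The clamp matters in practice: a long recording marked as a single bar would otherwise yield an absurdly low tempo for the Link session.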
Prompt 3: Ableton Link Integration
I need to integrate Ableton Link using the rusty_link crate. When the loop starts playing:
1. Set the Link session tempo to the calculated BPM
2. Start the transport (set is_playing = true)
3. Request beat 0 so Ableton starts from the downbeat

Show me the key functions I need.
Result: Claude provided the exact API calls I needed:
```rust
session_state.set_tempo(bpm, time_micros);
session_state.set_is_playing_and_request_beat_at_time(
    true, time_micros as u64, 0.0, quantum,
);
```
Prompt 4: Fixing a Subtle Bug
Bug: When I start the loop, Ableton's transport starts but it's not aligned to beat 0. It starts at some random beat. Current code: [pasted the link.rs file] What am I missing?
Claude's insight: I was using set_is_playing() instead of set_is_playing_and_request_beat_at_time(). The "request" part is what asks Link to align to a specific beat.
Iteration Log
| Session | Focus | Time | Outcome |
|---|---|---|---|
| 1 | Core looper logic | 1h | State machine working |
| 2 | BPM + Link | 1.5h | Tempo sync working |
| 3 | UI | 1h | Basic UI done |
| 4 | Polish | 30 min | Production-ready |
What I Couldn't Delegate
- Testing in DAW: Had to manually test in Ableton
- Audio tuning: The 0.85 decay factor for overdub was trial and error
- UX decisions: How the button should feel, color choices
- Link quirks: Some timing issues required reading Ableton docs
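The overdub decay mentioned above is one of those small numbers no model can tune for you. A sketch of how such a feedback factor is typically applied (buffer and function names are illustrative, not CAPO's):

```rust
/// Mix new input into the loop while attenuating the existing layer,
/// so repeated overdubs don't accumulate without bound. The 0.85
/// decay factor is the value the article arrived at by ear.
const OVERDUB_DECAY: f32 = 0.85;

fn overdub(loop_buffer: &mut [f32], input: &[f32]) {
    for (old, new) in loop_buffer.iter_mut().zip(input) {
        *old = *old * OVERDUB_DECAY + new;
    }
}

fn main() {
    let mut buf = vec![1.0_f32; 4];
    let input = vec![0.5_f32; 4];
    overdub(&mut buf, &input);
    println!("{:?}", buf); // each sample: 1.0 * 0.85 + 0.5 = 1.35
}
```

Values too close to 1.0 let noise build up across passes; too low, and earlier layers fade audibly, which is exactly the kind of trade-off that has to be judged by listening.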
Conclusion
AI didn't replace understanding. I still needed to know:
- How audio buffers work
- What Ableton Link does conceptually
- How nih-plug structures plugins
But AI accelerated everything. What might have taken 2-3 weeks of learning and coding took 4 hours.
The key: Be specific, iterate fast, test early.