AI as Learning Accelerator: Modern Tooling for Small Teams
Claude isn't writing all our code. It's helping us understand complex systems faster: OpenTofu modules, Kubernetes manifests, GitOps debugging.
By Jurg van Vliet
Published Nov 25, 2025
What AI Actually Did for This Project
This platform—Clouds of Europe—was built in six months by a small team (mostly solo, with occasional help). That timeline included:
- Multi-cluster Kubernetes setup (local, test, production)
- GitOps with Flux v2 and SOPS encryption
- OpenTofu for infrastructure provisioning
- Complete observability stack (Prometheus, Loki, Grafana)
- Next.js application with authentication, content system, and event management
- 264 API tests with 100% endpoint coverage
- Playwright E2E tests
- Production deployment on Scaleway
Six months isn't impressive by Silicon Valley standards. For infrastructure projects, it's unusually fast. AI made the difference.
What AI Accelerated
1. Learning complex systems faster
Kubernetes Gateway API was new to all of us. Instead of reading documentation for hours, we could ask:
"How do I configure an HTTPRoute with path-based routing and header manipulation?"
Claude provided working examples with explanations. We learned by doing, with immediate feedback.
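To make that concrete, here is the shape of answer that prompt gets back. This is a sketch, not our production config; the route name, hostname, services, and ports are placeholders. It routes /api traffic to one backend with a request header set on the way through, and everything else to the web frontend:

```yaml
# Illustrative HTTPRoute: all names, hostnames, and ports are placeholders.
apiVersion: gateway.networking.k8s.io/v1
kind: HTTPRoute
metadata:
  name: app-route
  namespace: default
spec:
  parentRefs:
    - name: main-gateway          # the Gateway this route attaches to
  hostnames:
    - "app.example.com"
  rules:
    - matches:
        - path:
            type: PathPrefix
            value: /api
      filters:
        - type: RequestHeaderModifier
          requestHeaderModifier:
            set:
              - name: X-Forwarded-Prefix
                value: /api
      backendRefs:
        - name: api-service
          port: 8080
    - matches:
        - path:
            type: PathPrefix
            value: /
      backendRefs:
        - name: web-service
          port: 3000
```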
2. Boilerplate and scaffolding
OpenTofu modules have a lot of boilerplate—variables, outputs, provider configuration. AI can generate the structure in seconds:
"Create an OpenTofu module for a Scaleway Kapsule cluster with configurable node pools and autoscaling."
We review, adjust, test. But the scaffolding is instant.
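As an illustration, here is a trimmed sketch of what that scaffolding looks like. The variable shapes and defaults are made up for this post, and the Scaleway resource attributes should be verified against the provider version you pin; the point is that this structure arrives in seconds, not that this is the module we ship:

```hcl
# Illustrative module skeleton: names, versions, and defaults are placeholders.
terraform {
  required_providers {
    scaleway = {
      source = "scaleway/scaleway"
    }
  }
}

variable "cluster_name" {
  description = "Name of the Kapsule cluster"
  type        = string
}

variable "node_pools" {
  description = "Node pools keyed by pool name"
  type = map(object({
    node_type = string
    min_size  = number
    max_size  = number
  }))
}

resource "scaleway_k8s_cluster" "this" {
  name    = var.cluster_name
  version = "1.30"
  cni     = "cilium"

  delete_additional_resources = false
}

resource "scaleway_k8s_pool" "this" {
  for_each = var.node_pools

  cluster_id  = scaleway_k8s_cluster.this.id
  name        = each.key
  node_type   = each.value.node_type
  size        = each.value.min_size
  min_size    = each.value.min_size
  max_size    = each.value.max_size
  autoscaling = true
  autohealing = true
}

output "cluster_id" {
  value = scaleway_k8s_cluster.this.id
}

output "kubeconfig" {
  value     = scaleway_k8s_cluster.this.kubeconfig
  sensitive = true
}
```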
3. Debugging
When Flux reconciliation failed with cryptic errors, we could paste the full error and context:
"Flux shows 'kustomize build failed' with this error: [paste]. Here's my kustomization.yaml: [paste]. What's wrong?"
AI spots issues humans miss—indentation errors, missing fields, incompatible API versions.
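Here is a reconstructed example of that class of bug, with the two most common culprits annotated. The file names are placeholders, not a real incident from our repo:

```yaml
# Illustrative kustomization.yaml with common failure modes annotated.
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - deployment.yaml
  - service.yaml          # "kustomize build failed" if this file was never committed
patches:                  # the older `patchesStrategicMerge` field is deprecated in kustomize v5
  - path: replica-patch.yaml
```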
4. Documentation and explanation
Complex Kubernetes manifests with CRDs, policies, and networking are hard to read. AI can explain what a manifest actually does:
"Explain this Gateway API configuration: [paste]"
This is a learning tool. You don't just copy-paste and hope; you understand what you're deploying.
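As an example of what that looks like, here is a minimal Gateway manifest annotated the way a good explanation annotates it. The names and the gatewayClassName are placeholders; your controller may differ:

```yaml
# Illustrative Gateway manifest; names and gatewayClassName are placeholders.
apiVersion: gateway.networking.k8s.io/v1
kind: Gateway
metadata:
  name: main-gateway
  namespace: infra
spec:
  gatewayClassName: cilium         # which controller implements this Gateway
  listeners:
    - name: https
      protocol: HTTPS
      port: 443
      tls:
        mode: Terminate            # TLS ends here; backends receive plain HTTP
        certificateRefs:
          - name: wildcard-cert    # Secret holding the TLS certificate
      allowedRoutes:
        namespaces:
          from: All                # HTTPRoutes in any namespace may attach
```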
What AI Didn't Replace
Architecture decisions: We chose Kubernetes, Flux, Scaleway, Gateway API, Next.js. AI provided information about options, but these were human decisions based on project requirements.
Domain expertise: AI doesn't know your business, your compliance requirements, your performance constraints. We decided what needed to be built.
Critical judgment: When AI suggests three approaches, you need to evaluate tradeoffs. AI can explain pros and cons; it can't decide what matters for your context.
Debugging complex issues: AI helps with first-level debugging—syntax errors, missing configuration, common mistakes. Deep system issues still require understanding the stack.
Code review: AI-generated code needs review. Sometimes it's subtly wrong—syntactically correct but semantically flawed. Human review catches this.
Effective AI Use Patterns
Start with understanding: Before asking AI to generate code, understand what you're trying to achieve. AI amplifies your intent; unclear intent produces unclear results.
Iterate and refine: The first response is rarely perfect. Refine the prompt, add context, specify constraints. Conversation with AI is iterative.
Verify everything: Test AI-generated code. Don't assume it works. Our test coverage exists partly to catch AI mistakes.
Learn from the output: Don't just copy-paste. Read the generated code, understand why it's structured that way, learn the patterns.
Honest Assessment: Six Months vs. What?
Without AI, this project would have taken:
- 12-18 months with an experienced team: Someone who already knows Kubernetes, Flux, Gateway API, and the ecosystem could build this in a year to a year and a half.
- 24+ months learning from scratch: Learning all these technologies from documentation, experimenting, debugging—that's a multi-year journey.
AI compressed learning time. We learned Kubernetes Gateway API, Flux GitOps patterns, OpenTofu module structure, and Playwright testing in weeks instead of months. The concepts still had to be learned—AI just made the learning path shorter.
What This Means for Small Teams
You can build more with less: Projects that required dedicated DevOps engineers are now accessible to small teams. The cognitive load of learning new technology has decreased.
You need different skills: Less "memorise syntax," more "design good systems and critically evaluate solutions." The judgment layer becomes more important.
The pace of change accelerates: When learning new tools is faster, you can adopt new technologies more readily. This can be good (better tools) or bad (tool churn).
The Tool, Not the Author
We document AI's role honestly. This doesn't diminish the work—the architecture, judgment, testing, refinement, and actual building were human. AI was a powerful assistant that compressed learning cycles.
Think of AI as a very knowledgeable pair programmer who:
- Never gets tired
- Knows a broad range of technologies
- Doesn't judge stupid questions
- Can't make final decisions
- Sometimes confidently suggests wrong approaches
Used well, it's transformative. Used poorly, it produces plausible-looking broken code.
#ai #tooling #learning #development #productivity