Learning in the Age of AI
If AI can already do something—or will be able to soon—is it still worth learning how to do it ourselves?
Take grammar or spelling, for example. Tools like GPT-5 are better at the mechanics of writing than I am. When I draft with its help, I usually end up cutting hallucinations and fixing errors in the result. But if I ask GPT to write from scratch, the output is often clearer, more detailed, and easier to understand than my own attempt.
That realization can be frustrating.
The Copy-of-a-Copy Problem
If we constantly offload thinking to LLMs—for blog posts, documentation, or even reflections like this—the output risks becoming watered down. It’s like photocopying a copy of a copy: fidelity gets lost.
AI text tends toward the statistical average, and when we rely on it, we start to trust its “synthetic clarity” more than our own messy but original thoughts. That feels dangerous—like a slow erosion of independent thinking.
Do Blog Posts Even Matter?
Lately, when I sit down to write a tutorial or a “how-to” blog post, I question the point. Why would anyone read it when they could just ask Claude or ChatGPT to explain the topic interactively?
In that light, the most valuable thing isn’t the blog post itself—it’s making sure the source documentation is correct. The raw, factual ground truth (API references, endpoint descriptions, config parameters) is what LLMs rely on to generate useful explanations. Without that, the AI has nothing to build from.
So maybe the future of “teaching online” isn’t in writing detailed tutorials, but in producing accurate, structured source material that AI can remix into personalized lessons.
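To make “accurate, structured source material” concrete, here is a hypothetical sketch in Python of what machine-readable ground truth for a single endpoint might look like. Every name, path, and parameter below is invented for illustration; the point is that the facts live in a structure an LLM can quote from, not in tutorial prose:

```python
import json

# Hypothetical ground-truth record for one API endpoint. All names,
# paths, and parameters here are invented for illustration only.
endpoint_doc = {
    "path": "/v1/images/generate",
    "method": "POST",
    "description": "Generate an image from a text prompt.",
    "parameters": {
        "prompt": {
            "type": "string",
            "required": True,
            "description": "Text description of the desired image.",
        },
        "steps": {
            "type": "integer",
            "default": 20,
            "description": "Number of sampler steps to run.",
        },
    },
}

# Serialized like this, the record can be handed to an LLM (or a human)
# as the factual basis for any number of personalized explanations.
print(json.dumps(endpoint_doc, indent=2))
```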
How I Learn Now
When I started exploring ComfyUI, I had no idea what a VAE was, how samplers differed, or what model sizes meant. In the past, I might have pieced together an understanding from blog posts, courses, and Stack Overflow threads.
Instead, I used ChatGPT and Claude as tutors, in a loop like the sketch that follows this list:
• I’d explain my current understanding.
• The model would confirm or correct me.
• I’d ask follow-up questions, going deeper step by step.
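For the technically inclined, that loop is easy to wire up against an LLM API. Here is a minimal sketch using the OpenAI Python SDK; the model name, system prompt, and example question are my own illustrative assumptions, not a prescription:

```python
# Minimal tutoring-loop sketch, assuming the OpenAI Python SDK
# (pip install openai) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

# The accumulated message history is what lets the model confirm or
# correct an evolving understanding rather than answer one-off questions.
messages = [
    {
        "role": "system",
        "content": (
            "You are a patient tutor. When I explain my current "
            "understanding, confirm what is right and correct what is wrong."
        ),
    },
    {
        "role": "user",
        "content": (
            "My current understanding: in ComfyUI, the VAE decodes latent "
            "images into pixels. Is that right, and what am I missing?"
        ),
    },
]

while True:
    reply = client.chat.completions.create(model="gpt-4o", messages=messages)
    answer = reply.choices[0].message.content
    print(answer)

    # Go deeper step by step: each follow-up builds on the last answer.
    follow_up = input("\nFollow-up question (blank to stop): ")
    if not follow_up:
        break
    messages.append({"role": "assistant", "content": answer})
    messages.append({"role": "user", "content": follow_up})
```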
I’ve done the same with lawn care, learning about mowing heights, aeration schedules, fertilizer timing, and pest detection. The same tool that helps me reason through image-generation nodes also teaches me how to keep my grass healthy. That’s still astonishing.
The Discouragement
This shift leaves me discouraged about creating educational content. Tutorials, screencasts, blog posts—do they matter anymore if people will just ask an LLM instead?
Documentation, yes, that remains vital. But the “teaching” layer I once enjoyed building feels less necessary. And that hits me personally: being useful, helpful, and productive is important to me. If AI is better at that role, then what’s my role?
At the same time, these same tools excite me as a learner. So I’m caught in a tension: grateful for the power of AI as a personal tutor, but saddened by the possibility that my way of teaching is less relevant.
Where This Leaves Me
I don’t have a clean answer yet. Maybe people still want YouTube walkthroughs and blog posts. Maybe AI won’t fully replace human teaching.
But I do know this: if we’re building tools and platforms in an AI-first world, the foundation must be solid documentation. That’s the raw material that enables everything else, whether a person reads it directly, or an AI explains it in a way that feels tailored just for them.
For me, that’s both an opportunity and a loss.