The "Just Learn to Write" Trap (And Why It's Not That Simple)
What if the debate about AI writing tools is missing something bigger?
The False Choice That's Driving Me Crazy
You know what keeps coming up in my DMs? People telling me to "just learn to write" instead of using AI tools. And honestly? I get where they're coming from. There's something pure about mastering a craft the traditional way.
But here's the thing that's been eating at me lately...
Why are we pretending this is an either/or situation?
I mean, seriously. When did we decide that using AI tools and learning to write better were mutually exclusive? It's like saying you shouldn't use a calculator because you should "just learn math." (Yes, I've leaned on the calculator analogy in a previous post too, but here it genuinely fits.)
Here's what nobody seems to want to talk about: even the most brilliant writing teacher in the world can't be there for you at 3 AM when you're struggling with writer's block. They can't instantly provide feedback on seventeen different versions of your opening paragraph. They can't help you explore multiple angles of an argument in real-time.
But you know what? That's not even the most interesting part.
What a Writing Partnership Really Looks Like
The real magic happens when you stop seeing AI as a replacement and start seeing it as a sparring partner.
Let me share something that happened just last week. I was working with one of our early Chibi users - let's call her Jane - who was struggling with a technical article about blockchain. She's brilliant at the technical stuff but kept getting stuck making it accessible to beginners.
Instead of just having AI rewrite her work, we did something different. We had her write a paragraph explaining a complex concept, then used AI to generate three different versions of the same explanation - each aimed at a different expertise level. But here's the crucial part: we weren't looking for a replacement paragraph. We were looking for patterns.
What made the beginner-friendly version clearer? How did the expert version differ? Which analogies worked best?
Jane didn't just copy any of these versions. Instead, she started noticing patterns in how complex ideas could be broken down. She began seeing which technical terms needed more context and which could stand alone. She was learning - actively, intentionally - while using AI as a tool.
The next article she wrote? Completely her own voice, but with this new understanding of how to scaffold complex ideas for different audiences. The AI didn't write it for her - it helped her develop a new skill.
Think about it this way: When you're learning martial arts, you don't just learn from your sensei. You practice with other students. You try things out. You fail. You get feedback. You adjust. You grow.
That's exactly what's possible with AI writing tools (when used right).
About That Whole "Do Better" Thing
You know what's funny? Sometimes just using certain words can trigger whole arguments that weren't even meant to happen. Like the other day, I used the phrase "do better" in a post and someone immediately assumed I was pulling that condescending internet thing where people say "do better" as some kind of moral superiority flex.
(God, I hate when people do that, by the way. Nothing makes me roll my eyes harder than someone dropping a "do better" bomb and walking away like they just delivered some profound wisdom.)
But it got me thinking about something deeper: how quick we are to assume someone's trying to tell us how to do our job.
I get it. Really, I do. When you've spent years - maybe decades - mastering your craft, the last thing you want is someone waltzing in with a shiny new tool telling you how to "improve." Especially if that someone isn't from your field.
Here's the thing though - and I'm just thinking out loud here - what if we're all getting a little too precious about our territories?
Because honestly? I'm not here to tell anyone they're doing their job wrong. Hell, I'm still learning new things about writing every single day. What I am doing is pointing at this new tool sitting right there and saying "hey, what if this could be useful?"
It's like... imagine if photographers had rejected digital cameras because the people making them weren't traditional photographers. Or if musicians had ignored electronic instruments because the engineers weren't classically trained.
The Real Question We Should Be Asking
Here's where I might ruffle some feathers: The people who claim AI will make our writing worse are often the same ones who haven't explored its potential as a learning tool.
When someone tells me "AI will just replicate common errors," I can't help but wonder - have they actually tried using it as a tool for improvement rather than a crutch?
Because here's what I've seen working with thousands of writers:
- The best ones use AI to challenge their assumptions
- They use it to explore different perspectives
- They use it to understand why certain writing choices work better than others
- And most importantly: they stay in control of their voice
You might be thinking, "But Chad, what about the risks of becoming dependent on AI?"
Fair question. But let me flip that around: Isn't it riskier to ignore a tool that could help us understand our own writing better?
Finding Our Way Forward
I built Chibi specifically because I saw this false dichotomy playing out everywhere. Writers feeling like they had to choose between "pure" human writing and "soulless" AI generation.
What if there's a middle path?
What if, instead of either rejecting AI or surrendering to it, we learned to dance with it? To use it as a mirror for our own creativity? To challenge ourselves to grow while maintaining our authentic voice?
That's the conversation I think we should be having.
Because at the end of the day, it's not about whether you use AI or not. It's about how intentionally you approach your craft.
And maybe - just maybe - the best way to become a better writer isn't to reject new tools, but to learn how to use them mindfully.
What do you think? Have you found yourself caught in this either/or thinking? I'd love to hear your experiences in the comments.
Would this shift in perspective help you approach AI writing tools differently? Share this post with someone who might need a fresh take on this debate.
P.S. Still stuck on that dependence question? Here's another way to look at it: we're already dependent on Google - which is itself built on AI - in our daily lives and even our work. It's entirely acceptable to say "hang on, let me Google it" when someone isn't sure about something in a professional context. Soon, and already in some more open-minded industries, it'll be just as acceptable to say "hang on, let me ask [insert AI model by big tech, or even your own AI agent built on another model, e.g. Chibi]."
Maybe the sharper question is: what about the risks of not using AI at all and getting left behind - like the people who swore by horses as their mode of transport and refused to set foot in a motor vehicle?