Using AI for execution, not creation, in writing. The essay itself proves the argument.
I've been grappling with something that makes many writers uncomfortable: I use AI when I write. Not just for grammar checks or brainstorming—I mean I actually use it to help me compose prose. And before you close this tab or start typing your response about how AI is ruining literature, I want to tell you why I do it, what it actually looks like in practice, and then ask you a question that might complicate your assumptions.

But first, some context. In September 2019, I had brain surgery to remove a meningioma. The surgery was successful in the ways that matter most—I'm alive, I can think, I can function. But my brain doesn't work the way it used to. I describe it as functioning "seven degrees left of center." Everything still works; it just works differently. Some tasks that used to be automatic now require conscious effort. Some connections that used to fire instantly now take longer routes. Writing became one of those things.

Before surgery, I spent 20 years in the Air Force and another 15 coordinating logistics for Walmart—moving trucks, managing schedules, solving immediate problems. When brain surgery forced me out of that world, writing became how I processed what happened to me. It started as blog entries during recovery, turned into a memoir, and eventually became something I do every day. I call myself a writer now, not because someone gave me permission, but because I show up and do the work.

Here's what that work looks like: I know what I want to say. I know what story I'm trying to tell, what my characters need to do, what emotional beat a scene requires. The creative decisions—the actual art of writing—are clear in my head. But the mechanical process of translating thought into clean, coherent prose? That's where my rewired brain struggles.

So I use AI as a tool. And that makes people uncomfortable.

I get it. There's genuine fear in the writing community about AI replacing human creativity, flooding the market with generic slop, and undermining the authenticity of work produced with algorithmic assistance. Those fears aren't irrational. But I think they're often based on how we imagine these tools are used rather than how people actually use them.

Let me make a distinction that matters to me: there's a difference between using AI for creation and using it for execution.

Creation is the hard stuff. It's deciding what story to tell and why it matters. It's building characters with genuine motivations. It's choosing emotional beats, determining pacing, and finding your voice. Creation is the work of making decisions—artistic decisions that define what your writing actually means.

Execution is translation. It's taking those decisions and turning them into sentences that flow naturally on the page. It's organizing thoughts into a coherent structure. It's finding precise language for what you're trying to say. It's the craft of making your vision readable.

For me, AI helps with execution, not creation. When I work on a scene, I come to the conversation with decisions already made. I know my character needs to react a specific way because of who they are and what they've experienced. I know this moment needs to create tension here and release it there. I know the voice needs to sound like this, not that. Those choices are mine.

What AI helps me do is articulate those choices clearly. It's like having a writing partner who asks, "Is this what you mean?" and helps me find the words that actually capture what I'm trying to say.
The creative work—the decision-making that defines the writing—is all me. The typing assistance is just the tool.

Think about it this way: nobody asks a novelist, "But did you handwrite it, or did you use a word processor?" Nobody says, "Well, your editor rewrote this paragraph, so you didn't really write this book." We understand those are tools that help execute the writer's vision. They don't make the creative decisions. They help translate decisions into final form. AI is in the same category.

Now, I know what some of you are thinking: "But typing IS part of the creative process. Finding the right word, hearing the rhythm of a sentence—that happens during the physical act of writing."

You're right. For most writers, execution and creation are intertwined. The act of typing helps you think. Revision is where you discover what you're actually trying to say. But not everyone's brain works that way. Some writers dictate their entire books because typing is too slow or physically difficult. Some rely on heavy editorial collaboration, where someone else restructures their work. Some use writing software that suggests phrases or reorganizes prose. Are their books less authentic?

The real question isn't "what tool did you use?" The question is: who made the creative decisions? Can you defend every choice in your work? Can you explain why scenes happen, why characters do what they do, and what you're trying to accomplish? Can you point to the vision that shaped the work and say, "That's mine"?

If yes, the work is yours, regardless of what tool helped you get from thought to page. If no—if you typed "write me a vampire romance" into a prompt and copy-pasted the output—then AI did the creative work, not you. That's different. That's the thing people are rightfully worried about.

But here's where it gets complicated: you can't always tell from the outside which one happened. When someone says, "AI helped me write this," you don't know whether they mean "AI helped me organize my existing ideas into clear prose" or "AI generated ideas I then claimed as my own." That ambiguity creates suspicion. And I understand that.

The thing is, I'm still grappling with this myself. I don't have clean answers. Most of my typing is done by AI, and that sometimes bothers me. It feels like I'm outsourcing something essential. But then I ask myself: what's essential? The ideas or the keystrokes?

My memoir exists because I lived through brain surgery and five years of recovery, processed that experience, and decided what story to tell. AI didn't give me that narrative arc. It didn't create the voice that sounds like me talking. It didn't choose to be honest about the struggle without melodramatizing it.

My science fiction novel exists because I spent months imagining a world, building characters with real motivations, and plotting technical sequences that create dramatic tension. AI didn't invent my protagonist or decide what she wants. It didn't choose to ground the story in atmospheric detail or determine the pacing of the action sequences.

Those creative decisions—the ones that make the work what it is—came from me. The tool helped me execute them clearly.

So here's where I need to tell you something about this essay. Everything you've just read came from a conversation I had with an AI. I showed up with a problem: I needed to articulate how I actually use AI versus how people assume I use it. We talked through the distinction between creation and execution. I pushed back on ideas that didn't feel right.
I redirected when the explanation went off track. I made every decision about what to include and how to frame it. Then I asked AI to turn that conversation into this essay.

The ideas are mine. The structure reflects the choices I made during the conversation. The voice sounds like me because it IS me—shaped by decisions I made about tone, what to reveal, what to hold back. But AI did most of the typing.

So now I'm asking you: who wrote this essay?

Did I write it because I made every creative decision, directed every element, and can defend every choice? Or did AI write it because it typed most of the words?

What do you think?