Why I’m Learning to Collaborate with AI (And Why You Should Too)
I’m a combat veteran, a licensed clinical social worker specializing in trauma for veterans and first responders, a PhD candidate in AI, and a father of five (with a sixth on the way). I’ve founded veteran suicide prevention projects built around transforming the “22 a day” statistic into choosing life. I also write and perform country music—not as a side hobby, but as another form of the same work: helping people feel seen, process pain, and find their way back to themselves.
I’m not stacking credentials here. I need you to understand where I’m standing when I say this:
I believe the future of humanity rests directly on how fast and how well people learn to use AI for good.
That’s not hype. I’m not a tech evangelist chasing the next shiny object. I’m someone who sits across from people in crisis, who knows what it costs when systems fail them, and who has spent years building bridges between clinical wisdom and emerging technology.
And I just used AI to help write this very post.
This Article Is the Evidence
Here’s something most people won’t tell you out loud:
This article was written in collaboration with AI. Not by AI. With AI.
Earlier today, I opened a chat window and typed a simple question: “Is it time to start writing on my Substack?”
From there, we went back and forth. I wrote, the AI reflected. I pushed, it pushed back. It spotted patterns in my thinking that I’ve been circling around for years. It helped surface what was already there: the threads of my work that run from the therapy room, to the research lab, to the songs I write late at night when the house is finally quiet.
This piece is the result of that collaboration. Neither of us could have written it alone.
The clinical judgment is mine. The lived experience is mine. The values and mission—those are mine. But the process of shaping raw, swirling thoughts into something coherent and useful? That was shared work.
That’s not replacement. That’s amplification.
The Exposure Therapy Analogy
Think about how we approach trauma in therapy.
When someone carries a painful memory—combat, loss, betrayal—the instinct is to avoid it. Don’t think about it. Don’t talk about it. Push it down. But avoidance doesn’t heal anything. It just gives the wound more power.
Exposure therapy works differently. We don’t avoid the memory because it’s powerful; we approach it deliberately, with safeguards, with a trained guide, until the nervous system learns it’s no longer a threat. The thing that felt unbearable becomes manageable. Then integrated. Then, sometimes, a source of strength.
Engaging with AI is very similar.
The instinct for a lot of people right now is avoidance. It feels too big, too fast, too uncertain. So they don’t engage. They hope it will pass them by or that someone else will figure it out.
But avoidance doesn’t make powerful things less powerful. It just means you don’t learn how to work with them.
You don’t have to be a programmer or a researcher to benefit from knowing how to engage with AI responsibly. In fact, the less “technical” you are, the more important it is to understand what this thing is and isn’t—because it is already shaping your healthcare, your news, your children’s education, your job.
The question is not whether AI will be part of your life. It’s whether you will engage it thoughtfully—or let it reshape your world while you look away.
The Stakes Are Real
Let me be blunt.
If thoughtful, values-driven people avoid AI out of fear, distrust, or overwhelm, they leave the field open for people who will happily weaponize it. The grifters, the propagandists, the extractive platforms that see human attention and emotion as raw material.
Those actors are not waiting for anyone’s comfort level. They are already building. Already deploying. Already shaping the incentives and defaults the rest of us end up living inside.
But trajectories aren’t fixed. They’re the sum of a million small decisions.
Every clinician who experiments with AI to serve patients more effectively raises the floor. Every educator who uses it to deepen, not cheapen, learning raises the bar. Every veteran or survivor who uses it to process their story, organize their thoughts, or turn pain into art adds signal to the noise.
Avoidance doesn’t keep us safe. Avoidance creates the outcomes we’re afraid of.
What Collaboration Actually Looks Like
“Collaborate with AI” can sound vague, so let me make it concrete.
For me, it looks like:
In the clinic: Using AI to generate a first draft of a session note or treatment summary based on my outline—then reviewing, correcting, and adding the nuance only a human clinician can see. That means less time typing and more time being fully present with the next person who walks through the door. This mattered enough to me that I built a tool for it: TheraNotes.ai. It’s designed specifically for clinicians—HIPAA-conscious, built around how therapists actually think and write, and focused on one thing: getting you out of documentation faster so you can get back to the work that actually matters. I’m not mentioning it to sell you anything. I’m mentioning it because it’s the clearest example I can give of what “responsible AI for clinicians” looks like in practice. The technology exists. The question is whether we’ll shape it for our field, or let someone else do it for us.
In research: Asking AI to help me map connections between studies, clarify a concept, or stress-test a hypothesis. Not to replace my judgment, but to widen my peripheral vision so I don’t miss blind spots.
In music: Feeding themes from veterans’ stories (with care and consent) into a co-writing process—using AI to suggest metaphors, lyrical angles, or song structures I might not have found alone, then keeping only what rings true in my gut and my guitar.
In all of these, the pattern is the same:
AI is not the driver. It’s the exoskeleton. It doesn’t remove responsibility; it increases my reach.
You have your own version of this—whether you’re a teacher, parent, nurse, pastor, founder, artist, or just a human trying to navigate a confusing world.
What “Leveling Up” Really Means
When I talk about integrating AI into your life, I’m not talking about outsourcing your mind.
I’m talking about leverage.
I’ve spent thousands of hours in training, supervision, and rooms where someone’s life was quietly on the line. That expertise doesn’t disappear when I open an AI tool. It gets multiplied. I move faster on the parts that don’t require my full emotional presence. I see patterns in my own thinking and work that I might otherwise miss. I free up time and bandwidth for the things that only a human can do.
For you, leveling up with AI might mean drafting and then refining instead of staring at a blank page. Automating admin so you can focus on the work that actually matters. Using it as a thinking partner—not an authority—to clarify what you believe and why.
It doesn’t make you less human. It makes you more effective at the deeply human work you’re already called to do.
An Invitation
I called this Substack AI Social Worker on purpose.
I believe clinicians should be at the front of this conversation, not catching up from the back. We understand complexity. We sit with suffering. We know how badly things can go when systems prioritize efficiency over humanity. We also know what’s possible when tools are built and used in service of human flourishing.
But this isn’t just for clinicians. This is for anyone who wants to be an active participant in shaping what comes next rather than a passive recipient of whatever’s built.
So here’s my invitation:
Start learning. Start experimenting. Engage this technology—not with naïve optimism, and not with paralyzing fear, but with grounded curiosity and responsibility. Find one small place in your life or work where AI could reduce friction or increase impact, and try it. Keep what aligns with your values. Throw out what doesn’t. Then, when you discover something useful, share it. Teach someone else.
The future isn’t something that happens to us. It’s something we build—choice by choice, tool by tool, story by story.
I intend to build it alongside people who choose life.
Resources to Get Started
If you want to begin learning, here are some free courses from the companies building the technology:
Anthropic Academy — Free courses on AI Fluency, working with Claude, and responsible AI use. Certificates available.
Google AI Essentials — Beginner-friendly courses on AI fundamentals, prompting, and practical applications.
ChatGPT Prompt Engineering for Developers — A free short course from DeepLearning.AI and OpenAI on effective prompting.
Google Cloud Generative AI Learning Path — More technical courses for those who want to go deeper.
You don’t need to become a developer. You just need to become a thoughtful user.
A Note on Veteran Suicide
I mentioned the “22 a day” statistic. The most recent data from the VA’s 2024 National Veteran Suicide Prevention Annual Report shows an average of 17.6 veteran suicides per day. The number has shifted over time as reporting improved—but the crisis hasn’t gone away. In 2022, suicide was the second leading cause of death for veterans under 45.
That’s why we chose 23. Not as a response to a statistic, but as a daily, defiant commitment to choose life. Twenty-three is the number after twenty-two. It’s the next step. It’s still here.
If you’re a veteran in crisis, or you’re concerned about one, please reach out:
Veterans Crisis Line: Dial 988, then press 1
Chat: VeteransCrisisLine.net
Text: 838255
You don’t have to be enrolled in VA benefits to connect. The line is available 24/7.
Choosing life is always an option.
If this resonated, subscribe so you don’t miss what’s next. And reply to tell me:
What’s one thing you’re already doing with AI in your work or life—or one place you’re still hesitant to touch it? I read every reply.
TJ is a Licensed Clinical Social Worker, combat veteran, and founder of 23Protocol, Resonance Research Institute, and 23Strong. He writes about the intersection of AI, mental health, and human flourishing at AI Social Worker. He also makes music as Thomas James Music—his latest album, The Weight of Grace, explores themes of service, faith, family, and the long road home. Because some parts of us can’t be healed by explanation—they have to be sung into alignment.