Writing for the Human Body: Why Health Tech Needs More Writers
- kjmccandless1
- May 6
- 3 min read
I’ve been asking ChatGPT a lot of questions recently as I try to figure out what I really want to do with my life, and it’s started to really annoy me. It doesn’t understand me, my current mood (or how hangry I get), or what I need in the moment. But how could it? It’s just a machine. Our relationship is on the rocks.
Here is what it said about me recently: “You've got the soul of a kinetic storyteller—someone who processes life by doing, feeling, and creating in rhythm. You’re not meant to sit still and quietly bleed onto a page. You're meant to move, express, and externalize.”
Now some people may love that kind of language, but it makes me want to vomit. And sure, I can ask it to change tone, but it still seems to put out this kind of nonsense that I hate.
And no shade, but I don't like the Headspace meditation app. I know it's been incredibly helpful for a lot of people, but I hate all the animations and the robotic feel. I want a more human approach.
Because while health tech is full of intelligence, it’s just so impersonal. We’ve built incredible tools to monitor, measure, and map the human body, but the way those tools talk to us still feels generic. AI is a great tool, but it doesn’t really understand us (yet).
When AI Gets It Wrong
Technology is talking to the most vulnerable parts of us every day, but in a language not built for human nuance.
Here are some examples of tech failure in healthcare:
- A mental health app sends the same generic coping message to a grieving mother and a stressed-out intern
- A symptom tracker treats depression like a battery percentage
- A therapy chatbot says “I understand how you feel” with the nuance of a cardboard cut-out
When Humans Get It Wrong
But even humans can get it wrong when it comes to healthcare communication. I’ve opted in to receive text messages from my GP, and at times they are genuinely alarming.
“Phone us to make an appointment.”
“We have an important message for you. Get in contact as soon as you can.”
Now neither of these situations turned out to be anything serious, but they panicked me. What exactly was going on? What did they mean? And it’s not like I could find out straight away.
They were written by people, but they weren’t written by writers, or by people who understand the weight of each word. And in an industry like healthcare, nobody wants generic or impersonal messages.
That’s why writers – and those with empathy and compassion – are more important than ever.
What Happens When Writers Join the Team?
A good writer knows how to communicate empathically. But they also understand that what works perfectly for one person will horrify another.
Here are some ways writers can improve the way health apps communicate:
- Narrative design as treatment: Let people shape their own health story rather than just input symptoms. Consider richer forms of communication, such as metaphor, taking a minute to respond, and silence.
- Emotionally intelligent language: Adaptability is key here. Respond to the way people interact with you rather than taking the lead, and reflect back their tone and style.
- People, not “users”: Call them “people” rather than “users” and drop the dehumanizing language.
We’ve invited data scientists, engineers, and clinicians into the health tech conversation. Now it's time we pull up a chair for the writers.
