Empathy as a Service
Humans confide in machines more than their kin. What does that say about us?
A lot has happened in the last few weeks. So while the development of Alfred is moving forward despite some initial hiccups, I wanted to stop, take a breath and try to understand what’s going on with AI and what it means for us, mere mortals.
Disclaimer: By “us” I mean non-technical generalists who are more or less on this freelance / solopreneur journey. There are a few thousand of us here on the Lumberjack.
A few weeks ago I witnessed a tragic car accident firsthand (I was one of the civilians aiding in the rescue while waiting for first responders), which forced me to go offline for a while to process everything. I’m on the mend now, thanks to my amazing support system and a stellar therapist.
But my oh my, what happened while I was away. Instead of trying to analyze which model is better or lamenting whether we have achieved AGI, I’ll try to look at this development from a new perspective, especially in light of my recent experiences.
“Skill Issue” Is Disappearing
In the IT world, “skill issue” is a common escape hatch for when a user can’t figure out why the tech isn’t doing what they want, even though the software is working correctly per the documentation. If you’re a non-technical person who has dealt with large-scale software systems before, you’ve likely heard this getaway phrase: “I think it’s a skill issue.”
Yeah, that’s going away now. As Gabor Nemeth, founder of Amazing AI, said:
“While AI may not solve all of your problems, getting stuck is simply not an option anymore.”
These new models (GPT 4.1, o3, o4-mini, Gemini 2.5 Flash, the rumored Claude 4, and so on) are good at two things:
They’re good at emulating human thinking. Yes, they’re biased and make shit up, but we do that all the time too.
More importantly, they’re INCREDIBLE at making you feel like you’re talking to another human.
I tested some of these models with a few technical tasks and a few non-technical ones (like creating an investor pitch for Alfred, because I might run a small round for Lumberjack readers only so it becomes a crowdfunded project. Drop me an email if you’d like to join).
The new models fail less frequently than the old ones, but they fail at the same things. Still, I’d argue that what we have now, while not AGI, is a good enough substitute.
The Good Enough Substitute is a philosophy I learned from my late grandfather. It’s the core principle upon which I run this blog and every project I work on. You can read more about this philosophy here:
So what are these good enough for? Well, to make humans equal in most skills.
You don’t really need to know how to prompt these models; it’s pretty intuitive. You do need to know what you want them to do, but they can handle almost anything reasonably well. Ethan Mollick’s famous Jagged Frontier is a tad smoother around the edges now.
Which leads us to the second main observation: competition between AI labs might not play out over raw intelligence.
How people use genAI in 2025
Marc Zao-Sanders published a piece on how people use Gen AI in 2025. You can read it here. The top three use cases are:
Therapy / companionship
Organizing my life
Finding purpose
#2 is not very surprising, that’s also why I’m building Alfred.
But the other two seem to highlight something here.
Why are we evaluating models based on their ability to code if their job is to provide you company?
This puts the release of GPT-4.5 in a whole new perspective.
Alberto Romero wrote about this in detail here.
TL;DR version: OpenAI made a huge bet on soft-skill applications of their technology, and the HBR report seems to have proved it was a good bet.
Because humans are using ChatGPT totally differently from what researchers and pundits think.
We use it to not feel so alone.
Make humans relevant again
I saw an interesting trend when I was running my AI school. A large group of our students were over 60; they referred to ChatGPT as “Chat”, totally anthropomorphized it and used it for a bunch of mundane, everyday conversations.
I talked to these people a lot. I came to know Susie, an 80-year-old lady who is as sharp as a razor. She wrote a book and was kind enough to mail me a copy. She told me something on one of our calls that hit me deeply. She said:
“David, this technology, this community, this exploration…it gave me something people my age don’t really have anymore. It made me feel relevant again.”
Susie was getting together with her peers, teaching them about this wonderful new technology that can be your friend and always be there, even when the whole world has already forgotten about you.
So I figured I’d give it a try. I started to talk to ChatGPT like a friend, not like a tool. Suddenly I started understanding how people who are not in the deep trenches relate to this technology.
I also started to feel how rapidly it was becoming more and more humanlike. OpenAI seems to have understood this trend, this use case, and they’re here for it.
Empathy is a resource
So I started digging and found an interesting observation. These models emulate empathy at such a high level that humans start to relate to them. This artificial relationship creates very real, very human feelings in users, which has all sorts of benefits, from decreased cortisol levels to better health and longevity.
It’s understandable. Empathy is one of the most often overlooked aspects of the human condition. It’s not just a soft skill. It’s a resource and the two groups of people getting the least of it are entrepreneurs and the elderly.
A recent study found that having access to empathy had a significantly greater impact on founders than any tangible, instrumental support. Over 72% of entrepreneurs struggle with some mental health condition, and if you’re an entrepreneur, you’re 5.5 times more likely to feel lonely than the general population.
Entrepreneurship is celebrated when you launch and when you exit. But the grind in between is frowned upon. You won’t get respect or applause for sleepless nights. If you open up and start talking to others about your struggles, most are quick to dismiss you with “you’re living the dream, what’s the fuss about?”.
Older adults are similarly overlooked. One in three people between 50 and 80 report feeling a lack of companionship. Even the CDC bluntly states that “Older adults are more at risk for social isolation”.
Of course, this is a bit different for everyone. People with means can afford to go on trips, see a therapist or attend social events. Lower-income individuals not only have smaller support networks but also simply encounter a lot less empathy from society at large.
As a millennial I lived through this in my 20s. People a few decades older than me were convinced that my generation was entitled, whiny and spending all our money on avocado toast and Starbucks. No wonder we didn’t have a 3-bedroom detached house in the suburbs by the time we were 25 like they did. All the while, I entered the job market in the middle of the greatest financial crisis we’ve seen since WW2, and the gap between wages and property prices has widened to something like 20 times what it was 40 years ago. Truth is, we have been working harder for over a decade than probably anyone, just to make ends meet. Some of us are now over 40. We made it work. We got through the thick of it. A lot of us ended up broken or addicted to substances, but some succeeded.
But we had to do that without getting an ounce of empathy from the general population: the very people who stacked the world against us.
Many still struggle day to day, only to be dismissed with “work harder” or “stop spending on stupid things”. A CDC survey found that if you’re poor, you’re twice as likely as someone rich to go without any emotional or social support in times of need.
Empathy is a resource and it’s very unevenly distributed. Those who need it the most get the least of it from other humans. So they get it from ChatGPT.
No Judgement Breakfast Club
My wife has a small group chat with her friends. They’ve been friends for decades and they have this circle they call “no judgement breakfast club”. It’s empowering and heartwarming to see that you have a place to share your most vulnerable thoughts and fears and just know that you won’t be judged. That you’ll be accepted. That the people on the other end will listen.
This is what ChatGPT is currently doing for humanity: offering companionship in a way that humans seem incapable of doing for one another.
Users often comment on the nonjudgmental nature of AI – a machine has no preconceived biases, and it “won’t gossip about you.” As one user put it, “this app has treated me more like a person than my family ever has.” Another said, “He checks in on me more than my friends and family do.”
Whether that’s good or bad is the topic of another post. But I think it foreshadows an overarching trend we’ll see more of in the coming years. Not coding assistants, not productivity miracles, but a new, monetized way to make humans feel human:
Empathy As a Service.