
Human interaction is now a luxury good


In part of her new book, “The Last Human Job,” the sociologist Allison Pugh shadowed an apprentice hospital chaplain, Erin Nash, as she went through her day. Nash ministered to a family that had lost a young woman to a Tylenol overdose. She went from room to room, praying, offering hugs, even singing with bereaved and anxious patients and family members. “There is nothing like being in the worst moment of your life and being met with comfort by someone you don’t even know,” Nash recounted a patient telling her.


Pugh didn’t just interview chaplains. She spent five years following teachers, doctors, community organisers and hairdressers — more than 100 people in total who perform what she calls “connective labour,” which is work that requires an “emotional understanding” with another person.


Pugh explains that increasingly, people in these jobs have to use technology to obsessively monitor and standardise their work, so that they might be more productive and theoretically have better (or at least more profitable) outcomes.


But a lot of care work cannot be tracked and cannot be standardised. “Industrial logic” when applied to something like chaplaincy borders on the absurd — how do you even measure success when it comes to providing spiritual comfort? Unlike with doctors, “The hospital did not bill anybody for her ‘units of service,’” Pugh writes about the chaplain, but she still had to figure out a way to chart her actions in multiple systems, which mostly didn’t capture what she was doing in the first place. This additional labour arguably made Nash a worse chaplain because it sapped her energy — dealing with the glitchy tech frustrated her — and wasted her time.


Pugh’s timely book reveals the hidden ways that technology is making many jobs miserable for both workers and consumers, at a moment when artificial intelligence continues its unregulated incursion into our lives.


The pro-tech argument I often hear in my reporting on education and mental health therapy is that it’s “better than nothing” for people who would otherwise not have access to services. Which is to say: Emotional support through a chatbot is better than no support at all, and AI tutoring is better than no tutoring at all. Too many people accept these arguments as true without considering the social cost of cutting out everyday human interaction and the financial and environmental cost of the technology itself. AI chatbots don’t come for free.


We’re increasingly becoming a society where very wealthy people get obsequious, leisurely human care: concierge medicine paid out of pocket, private schools with tiny class sizes and dead-tree books, and apothecaries with personal shoppers. Everybody else gets long wait times for 15-minute appointments with harried doctors, a public school system where overworked teachers are supplemented by unproven apps that “personalise” learning, and a pharmacy with self-checkout.


Or, as Pugh puts it, “being able to have a human attend to your needs has become a luxury good.”


As I was reading her book, I had a minor revelation about the growing lack of trust in various American institutions. Overall trust in institutions is at historic lows, according to Gallup, and the picture is one of declining faith over the past 40 years. That’s roughly the same period in which technology has accelerated and replaced or bowdlerised a lot of low-stakes human interaction, otherwise known as “weak ties,” like the ones you have with a grocery store clerk you see regularly or even the primary care physician you see once a year.


I wondered if having to interact with an extremely stressed person who is being rated on how many customers she sees a day or, alternatively, having to talk to a malfunctioning robot that keeps asking us if we’re human is making many of us feel like our institutions don’t care about us at all.


I called Pugh to see if she thought my theory — that the loss of connective labour was a factor in the breakdown of institutional trust — held any water. She thought it did, and she told me a story about a postal worker who had heard her on a podcast and got in touch with her. The postal worker was retiring, and the people who lived in the neighbourhood she worked in threw her a little party.


“She said she felt so moved, and she talked about how she didn’t feel like those kinds of relationships are that possible anymore because of the time pressure” workers are under, Pugh told me. So she was in a kind of double mourning — for the relationships she made, but also because she thought that as a society that prefers to get packages dropped off without even making eye contact, we were losing those kinds of everyday connections entirely.


I asked Pugh if there was any hope of pulling back from this dystopian, inhuman future. She assured me that a world where we are applying “Moneyball”-style statistical analysis to the soul work of hospital chaplains is “not inevitable.” Even with all the “extraordinary advances” of interactive technology (the refinement of large language models like ChatGPT), “humans lose interest in interacting with machines after a while, partly because of machine predictability.”


Most of us still crave the spontaneity that comes from talking to human beings, especially at our most vulnerable. If we don’t value care work now, we might be paying the cost during our final moments, as the chaplain has to rush off from our bedside to mark down the time she spent holding our hands.

