The trust famine inside publishing is a business problem, not a disposable moral argument

BY LIZ MOOREHEAD, BEELER.TECH


It’s darkly funny to me that people only seem ready to get serious about trust now, after watching so many institutions collapse, rot from the inside, or hollow themselves out in public.

And even now, a lot of us are still hesitating. We know trust matters. We can feel what its absence does to a culture, a business, a public, an audience. We know what it costs when nobody believes anybody. Yet we keep clinging to the same systems that profit from distrust, confusion, exhaustion, and dependency. Because they’re familiar, and familiarity has a way of convincing us it’s safety, even as it lures us over a cliff’s edge.

So yes, we say we want something better. But many of us are still resisting the kinds of changes that would actually require us to build it. And we are paying for that resistance to trust in real time.

I come from a content marketing background, as well as a journalism background. So, what I find funny is that this whole “trust” conversation isn’t new.

Of course it isn’t. 

Trust is one of those basic human principles that keeps civilization from turning into a knife fight, even if we are often too clumsy or self-interested to define it well.

That said, “trust” was the glossy, bullshit battle cry of every business speaker with a microphone, an overpriced blazer-t-shirt combo, and a half-digested understanding of Stoicism from 2010 to about 2019.

And they all said the same thing:

Trust is the only currency that matters.

Without the trust of your buyers, you have nothing.

You are in the business of trust.

Great. Wonderful. Amazing.

Are any of those statements technically wrong? 

No. 

In fact, I’ve worked for very smart people (whom I respect deeply to this day) who champion these concepts. Not everyone who positions trust as a business strategy is pushing snake oil.

But that’s not the problem.

The problem was how these lofty ideals were framed in most conversations, and who they were framed for. These were not moral arguments in favor of trust. They were growth arguments. Trust was not being treated as something we should embody because social and institutional life breaks down without it. It was being pitched as a tactic. A lever. A means to revenue.

So people treated it that way.

They pulled the trust lever like it was a slot machine. If it paid out, great. If it didn’t, they moved on to the next tactic, the next framework, the next growth play.

That’s what made the whole thing feel so cheap.

And now, here we are.

This is what a trust famine looks like, at scale

We are living through an era where the public has less faith in institutions, less confidence in media, less certainty about what is real, and less patience for being manipulated. 

In the United States, Gallup reported in 2025 that trust in mass media fell to 28%, the lowest level in its trend data. Reuters Institute’s 2025 Digital News Report also said traditional news media are struggling with low trust and declining engagement.

So when people in publishing say trust matters, my first reaction is: 

Yes, obviously. 

My second reaction is: 

So, what are you doing about it?

Because trust is not just the story we tell about the business of publishing. Trust is what determines whether the business holds together at all. That is especially true now, when AI has made it easier than ever to produce material at scale while making it harder than ever to produce something people actually believe.

And this is where I think the conversation gets far too soft.

We are focused on the wrong AI story, and if we continue, it will be fatal

Most AI conversations stay stuck on output. 

  • How much faster can you make content? 
  • How much cheaper can you make it? 
  • How much more can you produce?

Those are real questions, but they’re not the only ones. AI also affects how your work lands with actual human beings. It shapes whether people trust what they are seeing, whether it feels inhabited by a mind, or whether it triggers that vague sense that something is off.

We have entered a more dangerous period for content because of what AI does to human perception. 

There’s a concept called the uncanny valley (first proposed by Japanese roboticist Masahiro Mori in 1970), which describes what happens when something nonhuman gets very close to looking or acting human, but not close enough.

Instead of connection, it produces discomfort. 

Then we feel repulsed.

We start running before we even bother to understand why. “Why” is a question for later, when we feel safe again. That’s why some argue this instinct is baked into our evolutionary wiring for survival.

People usually talk about the uncanny valley in visual terms, with robots or CGI faces or wax figures that look just human enough to be upsetting. But the principle does not stop at faces. It extends to communication too. 

That’s why AI writing can be so unsettling; we’re constantly making judgments about whether language feels inhabited by a person, even if we don’t feel that process in motion at a conscious level.

Now your audience can’t tell you why they don’t trust you 

We used to live in a content environment where, if someone disliked what you published, they could usually tell you why.

  • Your writing was weak.
  • Your point of view was vague.
  • Your reporting was thin.
  • Your voice was annoying.
  • You’re right, but you’re not likable.

Fine. Painful, but fine. I don’t like you, either. 

But at least those are diagnosable problems.

I can fix bad writing. I can address broken strategy. I can craft better stories that “resonate” with the right people at the right time. I can help subject matter experts get out of their own way, so they finally say something real.

Now there is another layer.

AI content sounds right at the surface: it’s fluent, it’s organized, the arguments are clear, the statements are polished. At a high level, you’re fucking nailing it, right? But while AI content gets close to approximating living, breathing, genuinely human communication, it never fully arrives there.

So, you’re left with writing that checks the boxes, makes a sound argument, and is (hopefully) factually correct. Still, somehow, your audience can feel it’s dead inside:

  • There’s no heartbeat in the words.
  • There’s no lived experience driving the narrative.
  • There’s no felt primal urge to connect in the way only humans can through storytelling.

It’s “content” that approximates humanity.

But even as terms like “AI slop” become more common, your audience may not tell you, “This feels AI-generated,” for one of two reasons:

  • They may not even know that is what they are reacting to. They will just pull back. They will trust you less. They will spend less time with you. They will feel less attached to your brand, your publication, your byline, your community.
  • They know exactly what they’re reacting to. They can smell the stench of AI all over you. And they’re still not going to tell you; they’re just going to leave.

And that’s the ballgame, folks.

This is why the AI conversation cannot stay trapped at the level of efficiency

Scale is not the whole game. The closer synthetic content gets to human expression without carrying actual human conviction, judgment, accountability, and stakes, the more dangerous it becomes as a publishing habit.

And yes, there is a bitter irony here for actual writers.

I get accused of using AI when I have not used AI. 

The accusations come from every angle, too: the types of quotation marks I use, the punctuation tropes I default to, the mixed metaphors I collapse into (because sometimes I get too high on my own supply), or the fact that I understand some paragraphs should be short while others are long. And god forbid, I use an em dash correctly.

Other writers hear the same thing. 

We are now in a market where writing too cleanly, too quickly, or too competently can make people suspicious. 

Pardon my high school French, but what the fuck, are you for real? (See, at least you know there is a human behind these words.) 

I’m a college dropout who never got anything higher than a C in my English classes for all of middle school and most of high school. I worked hard to develop my voice, my craft, and my ideas. I’ve busted my ass for years to not sound functionally illiterate. Now I’m being punished for that hard work?

In today’s publishing environment, your humanity is not assumed.

I’m furious about this, and I know I’m not alone.

Because this is where we are, living inside a damaging hellscape of manufactured homogeneity trying to pass itself off as “real enough” to get clicks and impressions. I hate it here.

And that brings me to a distinction the industry badly needs to get more serious about.

Brand equity and trust are not the same thing

They overlap. Sometimes. They are not identical.

A legacy brand can have reach, awareness, history, and market share. It can still have traffic. It can still have habitual users. It can still be typed directly into the browser bar for years. None of that guarantees trust.

Habit is not trust. Recognition is not trust. Size is not trust. Distribution is not trust.

TRUST is trust.

A lot of media institutions are learning this the hard way.

In addition to the absolute cratering of consumer trust in mass media, journalism is seeing a continued move toward individual brands and journalist-led entities. Poynter reported in 2025 that for many journalists, owning their own work and building a personal brand is tied to survival in an unstable media landscape.

That does not mean institutions are finished, but let’s not kid ourselves. 

Institutional scale can’t assume moral authority forever

If anything, the burden is heavier now.

You don’t get trust because:

  • You used to have it.
  • You are large.
  • Your logo is familiar.
  • Your traffic chart still looks decent enough to keep shareholders off your back for another quarter.

You get trust because people believe there are humans on the other side of what you make. Humans with standards. Humans with judgment. Humans with a point of view. Humans who are willing to be known, corrected, challenged, and held accountable.

Trust is the result of repeated contact with judgment you can recognize and motives you can live with.

There are no shortcuts.

For publishers, that is where the opportunity is.

Rob Beeler has been pushing on this point from a different angle, and I think he’s right.

If trust and human connection are central to the future of publishing, then the question is not whether trust is important. The question is whether publishers are willing to treat trust as leverage. And if we are, how do we build a model where the organizations that earn trust actually get paid for trust, especially in a market where AI systems may end up trading on trust publishers built first? 

That is a much harder and more useful conversation than “trust is important.”

Because if a publisher is the one earning the relationship, earning the repeat visit, earning the registration, earning the subscription, earning the data-sharing consent, earning the benefit of the doubt, then why has the market behaved for so long as if that trust should simply be handed downstream to everyone else by default?

In fact, why aren’t we actively pissed off about the fact that the people who do the hardest, slowest, most reputation-sensitive work in the chain have the least amount of leverage, when they should have the most?

Those are business questions, not esoteric, philosophical questions spouted by a college freshman trying to get laid. 

We’re paying the bill for scale-and-growth-at-all-costs thinking

For years, the market rewarded volume, speed, automation, extraction, optimization, and the ability to keep numbers moving up no matter what got flattened in the process. That logic trained companies to treat quality as optional, relationships as fungible, and audience trust as something you could burn through and replace later.

Well, guess what, everybody?

Later is here.

It turns out human beings still want human things. What a concept! They want to feel oriented. They want to feel less alone. They want to feel that somebody is actually trying to tell them the truth. They want judgment they can trust, rooms they can trust, communities they can trust, and brands that do not treat them like inventory with a pulse.

And I’m going to take a wild stab in the dark that, when you go home after a long, fulfilling day of increasing shareholder value, those are the same things you want, too.

Maybe that was easier to ignore when humanity was abundant enough to feel ambient.

It is harder to ignore now.

Scarcity clarifies things.

When hope is easier to find, you can get lazy about it

When trust feels more available, you take it for granted. When community is intact enough to feel ordinary, you stop noticing how much intellectual, emotional, and even moral nourishment you were getting from it. Then those things thin out.

Institutions wobble. Algorithms take over. Feeds get synthetic. Communities fracture. Platforms reward outrage, speed, and volume over discernment, judgment, and nuance. And when you are feeding yourself a steady diet of bullshit and humanity-adjacent inputs, you never feel full. You consume, chase dopamine hits, and become malnourished over time. 

One day, the hunger becomes visible again.

That’s what I think is happening now. We’re all starving, reaching for something with an actual pulse. Something real. Real bylines. Real communities. Real conversations. Real experiences. Real points of view. Real arguments. Real debates.

Yes, there are still plenty of people who don’t care. There are still businesses perfectly happy to profit from confusion, numbness, dependency, and audience fatigue. Some will keep treating trust deficits like a competitive advantage.

Fine. Let them. But you don’t have to be them. We’ve spent years confusing scale with strength, familiarity with trust, and brand equity with belief. We kept building for reach, volume, efficiency, extraction, and then acted surprised when audiences started pulling back from products that no longer felt meaningfully human.

We were wrong.

So if we are serious about the future of publishing, we need to stop romanticizing trust and start acting like we understand the price of losing it. Because once audiences stop trusting you, scale won’t save you.