In a Low-Trust World, Trust Becomes a Strategic Advantage

I think I'm reaching AI saturation in my news feed. Just a few recent headlines and anecdotes from friends that would have been earth-shattering 10 years ago, but today are just a blip:

  • Walmart strikes a deal with OpenAI to incorporate AI search into its e-commerce experience

  • Amazon fights with Perplexity, demanding it stop allowing its Comet web browser to make purchases on users' behalf

  • Uber partners with OpenAI to integrate with ChatGPT, using AI to explore ride options and food deliveries

  • Reports of children discussing suicidal thoughts with AI chatbots prompt Character.AI to bar users under 18 from talking to its chatbots

Add to those headlines the fact that the big AI companies need data centers that consume massive amounts of electricity - so much so that it's driving up energy prices across the country.

As a consumer, it is dizzying to follow all of this news and grasp even the surface-level implications of these moves. So, is AI good or bad for ME? It can do research for me in an instant or make that cool meme, but my electricity bill is going up? I love chatting with my favorite company's customer service chat instead of talking to a live person - but who was REALLY on the other end of that keyboard?

Since the formation of our society, trust has been foundational to a healthy society and economy: it underpins cooperation, functioning markets, public-policy effectiveness, civic engagement, and social cohesion. In the groundbreaking book "Why Nations Fail," the authors make the case that strong institutions grounded in the rule of law create the trust that allows broad sections of the population to participate in a successful economy.

Trust

Think about it for a moment: if you couldn't trust your next-door neighbor, you would over-invest in things like security, and you definitely would not ask to borrow a cup of sugar.

If you couldn't trust companies like Amazon, there would be no way you would give them your credit card information, and e-commerce would be dead.

If you couldn't trust Chase Bank, you would keep your money under lock and key, severely depressing economic activity.

Trust is the bedrock of a successful American society.

So what do I mean by trust and how is it eroding?  First, we must distinguish between three different kinds of trust:

  • Interpersonal trust — the belief that “most people can be trusted.”

  • Institutional trust — confidence in key institutions (government, media, science, business).

  • Technological trust — faith that platforms, algorithms and systems behave fairly, transparently and safely.

Trust has both an emotional dimension (belief, goodwill) and a functional dimension (competence, reliability, integrity). When these erode — when people feel institutions or systems are unpredictable, biased, opaque or ideological — trust falls.  We have seen evidence that erosion in all three forms of trust has been happening for some time. Just a few stats:

  • In the U.S., the share of adults who say "most people can be trusted" declined from 46% in 1972 to about 34% in 2018, and remained at 34% in 2023-24 (Pew Research Center).

  • Trust in institutions: In spring 2024, only ~22% of U.S. adults said they trust the federal government to do the right thing "always" or "most of the time," down from already-low levels (Pew Research Center).

  • On a global level, the 2024 global trust index hit a 23-year low, with trust in government at ~50%, business at ~59%, and NGOs at ~54% (Edelman Trust Institute).

  • Institutional trust specifically in the U.S.: A May 2025 survey found only ~31% of U.S. adults have "a lot" or "some" trust in the federal government to act in society's best interest.

Why does this erosion matter? When trust is low, collective action is harder, institutions struggle to implement policy, markets may falter, innovation can lose its social license, and society becomes more fragmented. For example, government data-collection efforts and the communities that rely on them succeed only when trust is present.

So how did we get here? I can point to two key recent drivers:

Social media & information disorder

At the core of social media platforms is a trusted user network. Facebook and other platforms started by leveraging "interpersonal trust": I would see a post from my sister about her new baby and inherently trust the information I was reading. As social media evolved, however, it enabled a flattening of authority as news exchange started to creep in: that post from my sister about a controversial subject like vaccines now seems as credible as a major news outlet, lowering the "trust premium" of expertise.

Now add in the significant increase in disinformation campaigns on platforms that do not moderate for truth, and you have the erosion of trust in online content. One survey found that Americans believe only ~41% of what they read online is accurate and human-generated; 23% believe content is completely false; and 78% say it's increasingly difficult to distinguish real from AI-generated content (New York Post). The consequence: if people can't trust what they read, the underpinnings of informed decision-making, civic discourse, and institutional legitimacy begin to crumble.

COVID-19, vaccines & expert/institutional status

When COVID-19 arrived, the ground was primed for an acceleration of conflicting messaging, changing guidance, and the politicization of science and vaccines, all of which chipped away at the credibility of experts and institutions. As trust in institutions declined, people substituted peer networks, social media, or alternative sources for information, further diluting our shared factual basis and collective trust.

Now enter additional technology shifts - especially generative AI - which are creating new risks to trust even as they require MORE trust if they are to live up to their promise. Fraudulent images, videos, and documents can be produced rapidly, making authenticity harder to gauge. Globally, 66% of people use AI regularly and 83% believe it will bring benefits, yet only 46% are willing to trust AI systems (KPMG).

Why the erosion of trust is becoming an issue

If consumers are saying they can't trust the output of AI (or, frankly, can't tell the difference between real and fake output), then how will they ever make the jump to agentic commerce? Will consumers feel comfortable that if their AI agent books an airline flight for them, the dates are correct? Or that the pricing was right? Or that the connections are correct?

Will consumers trust agents to analyze their health information and the lifestyle or supplement recommendations that come from AI?

Will consumers trust their banking information with AI agents if those agents could be hacked like their credit card transactions from retailers?

In my estimation, NO. Consumers are typically willing to trade privacy for functional benefits - think of the social media example above. I will give Facebook my information for ad targeting in order to see those cute photos of my sister's kid. AI, however, ratchets up the risk. I honestly believe that as AI becomes more functional, the risk of something going seriously wrong, with a large negative personal impact, becomes high. All it takes is one terrible AI hack in which a bot cleans out someone's checking account to destroy future consumer trust in these emerging platforms.

What does this mean and what can brands and marketers do about it?

So what does this mean going forward and how will we need to rebuild trust to enable future progress and growth of AI?  Even though trust is challenged, there’s an opportunity for brands, institutions and technologists to lead with trust as a differentiator — not just speed or novelty.

Here are some strategic levers relevant for marketing, innovation and consumer-centric business to rebuild or strengthen trust:

Be transparent & explain the “how.”

  • Whether it’s how data is used, how a product is made, or how an AI algorithm works — make visible what it is you do and why.

  • For tech/AI: Providing explainability, third-party audits, open-source schemas, data provenance, and clear communication reduces the perception of a "black box."

Embrace authenticity & human-centered signals.

  • People trust people. Use real voices, real stories, credible third-party testimonials and shift from boilerplate to human-centered narratives.

  • Even if you use AI or digital tools, emphasize human oversight, human values and accountability.

Engage relationally — not just transactionally.

  • Trust is built over time via consistency. Don’t just launch a one-off transparency initiative—embed honest communication in your marketing, operations, culture.

  • Especially when addressing communities historically under-served or distrusted: demonstrate fairness, inclusion, responsiveness.

Align with purpose & shared societal value.

  • Especially in a low-trust era, brands that show they stand for something (beyond profit) can earn a trust premium.

  • Whether in health & wellness, sustainability, or social equity, aligning brand actions with the broader public good helps.

Build for the trust economy in tech.

  • As AI becomes pervasive, design for trust: provide controls, allow people to opt in or out, explain AI's role, and disclose AI usage - in short, let users know they're not blind participants.

  • Use trust audits, fairness measurement, bias mitigation, transparency dashboards.

Leverage “little wins, big change.”

  • Trust isn't rebuilt by big announcements alone - it's built by a string of small, credible actions, consistent over time: little wins, big change.

Conclusion

Trust isn't a nice-to-have: it's foundational. It underpins functioning markets, innovation systems, healthy institutions, and a collaborative social fabric. Today, we live in a moment of depleted trust - interpersonal, institutional, and technological. That creates risk and friction, but also opportunity.

For marketers, innovators, and brand leaders, this is a defining moment. Those who build trust intentionally - through transparency, authenticity, human-centered design, and consistent purpose-driven action - will win. As emerging technologies like AI shift faster than public understanding, the brands and institutions that build the bridge of trust will occupy a stronger position.

My ask of you: What is one action you will take today to build trust with your audience? What transparency will you provide? What story will you tell? What proof will you show? Because when trust returns - and it will - organizations that have already invested will lead the way.
