
Is this your first time reading The Patient Experience Strategist?
Welcome! If a colleague forwarded you this issue, we’re glad you’re here. Don’t miss out on future insights: join our community of healthcare leaders navigating the future of care, and get these strategies delivered directly to your inbox every week.
Subscribe Now and Get the Full Experience
The Population You Are Not Planning For Yet
Every health system I work with has a diverse population strategy.
I have read the slides. I have seen the dashboards. Language access. Health equity committees. Cultural competency training. Community benefit reporting. Most of it is thoughtful. Some of it is excellent.
And almost none of it accounts for the cohort that is about to change everything about how healthcare gets chosen.
Serving a diverse population is not only a race, language, income, and access conversation. It is also a generational culture conversation. And the generation aging into the healthcare decision-making seat right now carries a set of values that most health systems have not yet figured out how to read, let alone respond to.
Gen Z is 54 percent more likely than the average consumer to care about AI's impact on the environment. That is from Numerator's 2026 consumer research. Seventy-five percent of Gen Z say sustainability matters more than brand name when they choose where to spend. Roughly 30 percent actively research a company's environmental practices before buying. A quarter have already ended relationships with businesses over unsustainable practices.
These are the patients coordinating care for aging parents, choosing pediatricians for their own kids, and deciding which health system earns their own complicated cases. They are also the generation that already distrusts institutional healthcare most, switching providers at six times the rate of older cohorts, according to AHA/Accenture research.
And they are about to watch a sector that already accounts for 8.5 percent of U.S. greenhouse gas emissions spend roughly $1.4 billion a year building AI on top of it.
Today is Earth Day. This issue is about a specific cultural competency gap most health system boards have not named yet, and what responsible AI strategy has to look like to close it.
The Carbon Math Most Boards Have Not Seen
Start with the baseline. The U.S. health sector already accounts for roughly 8.5 percent of national greenhouse gas emissions. Hospital care alone produces about 36 percent of that footprint. The United States is responsible for a quarter of all global health sector emissions, making it the single largest contributor worldwide.
That was the number before the AI buildout.
Now add data centers. The International Energy Agency projects global data center electricity consumption will approach 1,050 terawatt-hours by the end of 2026, moving them into fifth place among national-scale electricity consumers, between Japan and Russia. In the United States, data centers are on track to hit 6 percent of national electricity use by the end of this year, and roughly 8 percent by 2030.
AI is the accelerant. One peer-reviewed analysis estimates the carbon footprint of AI systems alone will reach between 32 and 80 million tons of CO2 in 2025, with a water footprint between 312 and 765 billion liters. A separate Nature Sustainability study modeling U.S. AI server deployment found the annual water footprint could range from 731 to 1,125 million cubic meters between 2024 and 2030, with additional carbon emissions of 24 to 44 million tons per year.
Every AI query has a cost. OpenAI's own disclosures put an average ChatGPT query at roughly 0.34 watt-hours of electricity and a small fraction of a teaspoon of water per prompt. Independent researchers at UC Riverside and UT Arlington have placed that number higher, estimating 10 to 50 milliliters of water per medium-length response when you include cooling and the electricity generation behind it.
The individual numbers sound small. They are not, at scale. OpenAI disclosed one billion queries per day at the end of 2024. That usage has only grown since.
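The scale-up is worth doing explicitly. A minimal back-of-envelope sketch, using only the figures cited above (OpenAI's 0.34 watt-hours per query, the UC Riverside / UT Arlington 10–50 milliliter water range, and the one-billion-queries-per-day disclosure):

```python
# Back-of-envelope scale-up of per-query AI costs.
# All inputs are the figures cited in the text, not new data.
QUERIES_PER_DAY = 1_000_000_000        # OpenAI disclosure, end of 2024
WH_PER_QUERY = 0.34                    # OpenAI's disclosed average per query
ML_WATER_LOW, ML_WATER_HIGH = 10, 50   # independent researchers' estimate range

daily_mwh = QUERIES_PER_DAY * WH_PER_QUERY / 1_000_000   # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1_000                     # MWh/day -> GWh/year

# mL/day -> millions of liters/day
water_low = QUERIES_PER_DAY * ML_WATER_LOW / 1e9
water_high = QUERIES_PER_DAY * ML_WATER_HIGH / 1e9

print(f"Electricity: {daily_mwh:.0f} MWh/day, about {annual_gwh:.0f} GWh/year")
print(f"Water: {water_low:.0f} to {water_high:.0f} million liters/day")
```

Even at the vendor's own conservative per-query number, the arithmetic lands at roughly 340 MWh of electricity per day, and tens of millions of liters of water per day at the independent estimates. "Small per prompt" stops being small three lines into the math.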
Now map that against an industry that is already 8.5 percent of the nation's carbon footprint, already adding roughly $1.4 billion a year in AI deployments, and already watching institutional patient trust cut nearly in half. Public trust in hospitals and physicians dropped from 72 percent to 40 percent between 2020 and 2024.
Every AI tool a health system turns on is a climate decision. The population aging into your decision-making seat is treating every climate decision as a trust decision.
This Is a Community Trust Fit Problem
The Trust Algorithm measures five operational signals that predict patient retention: Accessibility, Resolution, Continuity, Proactivity, and Recovery. But those signals do not mean the same thing for every population you serve.
Accessibility for an aging rural community means something different from Accessibility for a young urban immigrant population. Proactivity for a patient managing three chronic conditions means something different from Proactivity for a healthy 27-year-old choosing a primary care provider for the first time. Recovery for a Black patient whose family carries generations of medical mistrust means something different from Recovery for a patient who is trusting the system for the first time.
This is what I call Community Trust Fit. The five signals measure whether you are building trust for the specific population you serve. Not in general. Not on average. For them.
Generational culture is a Community Trust Fit dimension most health system dashboards do not track yet. For a Gen Z patient, Proactivity is not just appointment reminders and gap-in-care outreach. Proactivity is also evidence that you thought about the downstream consequences of the tools you deployed to reach them. Recovery is not just service recovery after a bad experience. Recovery also includes whether the organization can account for the environmental and ethical choices behind the automation that answered their call at 2 a.m.
The patients who already distrust institutional healthcare are not going to extend you the benefit of the doubt on AI. They are going to read it for what it is.
Responsible AI Is a Strategy Discipline, Not a Procurement Event
Every conversation I have with healthcare executives about AI follows the same arc. It starts with a vendor pitch. It ends with a budget line. And somewhere in the middle, the question that should have come first never gets asked.
Not which tool. Not which vendor. Not which use case.
What does this AI deployment teach our population about whether we can be trusted with them?
That question sits above procurement. It is the cultural competency question, the Community Trust Fit question, and the environmental question all at once. And it is the question most AI governance frameworks inside health systems still do not structure for.
Here is the honest diagnosis. You are measuring satisfaction inside a system that was never built for trust. The surveys grade the experience your current operations deliver. They do not grade whether your operations were designed to earn trust in the first place. When you add AI on top of a system that was not built for trust, you scale the gap. Automation cannot fix an absence of strategy. It can only make the consequences faster and cheaper to produce.
When AI serves the goal of building trust, it earns its carbon cost. A chatbot that routes patients to the right human faster, reduces the need for three follow-up calls, and captures the clinical context that gets surfaced at the next appointment, that tool serves Accessibility, Continuity, and Proactivity. It pays for its footprint in trust.
When AI works against those signals, it burns capital and carbon to make things worse. An automated call system that traps patients in a menu tree, a bot that gives wrong answers confidently, a black-box triage model that never explains itself, those deployments do not just fail patients. They spend compute, water, and electricity to actively teach patients not to trust you.
This is the core premise of the 3A Framework. Automate the tasks that cost trust when a human does them. Augment the humans whose work builds trust. Amplify the relational moments that only humans can create. Organizations that skip the strategy and collapse everything into Automate end up with the most expensive, most energy-intensive, lowest-trust configuration available. It is also, not coincidentally, the most environmentally costly version of AI a health system can deploy.
The destination is the PX Hub Model, a contact center and digital front door operated as a strategic trust asset rather than a cost center. Responsible AI has a physical address inside the organization. It is not a feature scattered across fifteen vendor contracts. It is a coordinated system with a strategy behind it.
The Financial Shape of the Cost of Inaction
Let me make the business case plain, because the environmental case alone will not move most boards. It needs to.
Every unstrategic AI deployment is an expense that compounds in three directions.
The first is the direct capital cost. The average hospital AI contract now runs into seven figures. Deploying AI without a trust framework to evaluate it is buying software without knowing what outcome you are buying.
The second is the retention cost. Trust fully mediates the relationship between patient satisfaction and patient loyalty. A peer-reviewed structural equation model of 1,696 patients demonstrated mathematically that satisfied patients without trust still leave. The AHA's 2024 consumer research found that nearly 90 percent of patients who switched providers cited the organization as being “hard to do business with,” which is the operational signature of broken Accessibility and Resolution. Bad AI accelerates both.
The third is the reputational cost, which is where the environmental story becomes a trust story. The 2025 Edelman Trust Barometer on Health found that no institution, including business, government, media, or NGOs, is trusted globally to address health needs. In 9 of 16 countries surveyed, a majority believe institutions are actively undermining access to care. The ground is already tilted against you. Wasting carbon and water on AI deployments that fail patients is how a tilted ground becomes a slide.
The cost of missing out on trust-centered AI strategy is not a slow drip. It is compounding, and it is visible to the patients whose loyalty the next decade of reimbursement depends on.
What This Means for You
Responsible AI is not an ESG line item. It is a strategy discipline that sits at the center of your retention case, your reputation case, and your cultural competency case for the population you are about to serve.
The practical work is plain, even if it is not easy. Every AI deployment under consideration needs to pass through two questions before it reaches procurement. Does this tool improve a Trust Algorithm signal for the specific populations we serve, including the generational cohorts that will grade us hardest on environmental responsibility? And is the carbon and capital cost defensible against that outcome?
If your team cannot answer both questions clearly, the tool is not ready. The strategy is not ready. And the carbon cost is not earned.
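The two-question gate reduces to a hard AND: a tool that clears only one question is not ready. As a minimal sketch (the class and field names here are illustrative placeholders, not a published rubric):

```python
from dataclasses import dataclass

@dataclass
class AIDeploymentReview:
    """Hypothetical pre-procurement check sketching the two-question gate."""
    tool: str
    improves_trust_signal: bool  # Q1: improves a Trust Algorithm signal
                                 # for the specific populations served?
    cost_defensible: bool        # Q2: carbon and capital cost defensible
                                 # against that outcome?

    def ready_for_procurement(self) -> bool:
        # Both answers must be a clear yes; one "not sure" blocks the buy.
        return self.improves_trust_signal and self.cost_defensible

review = AIDeploymentReview(
    tool="after-hours triage bot",
    improves_trust_signal=True,
    cost_defensible=False,
)
print(review.ready_for_procurement())  # False: the carbon cost is not earned
```

The design point is that the gate sits upstream of the vendor conversation: the review object exists before any contract does, and a single unmet condition stops the deployment rather than discounting it.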
The organizations pulling ahead in 2026 are the ones whose AI governance already bakes the Community Trust Fit question and the environmental question into the evaluation rubric, upstream of the vendor conversation. The organizations falling behind are the ones treating AI as an operational efficiency lever disconnected from retention, reputation, and the values of the patients whose loyalty the next decade of reimbursement depends on.
The conversation happens at the executive table before it happens at the procurement table. That is the work.
Why This Matters on Earth Day, and Every Day After It
Earth Day gives us an annual excuse to say out loud what should be a standing part of the strategy conversation.
Serving a diverse population means seeing the population in front of you, not the one your surveys were designed for. Generational culture is a diversity dimension. Environmental values are a cultural competency dimension. And the patients who will define the next decade of reimbursement are telling us, clearly and in advance, what they will be grading us on.
Responsible AI in healthcare is not a sustainability slogan. It is a strategy discipline. It requires asking, before every deployment, whether the tool serves trust for the specific population being served. Whether the compute is earning its keep or just spinning. Whether the patients who will be paying attention, especially the Gen Z and Millennial patients whose loyalty the next decade depends on, will read this deployment as evidence that you can be trusted with them.
You are measuring satisfaction inside a system that was never built for trust. An AI strategy built on top of that measurement gap will scale the gap faster. A trust strategy built underneath the AI decisions, sized to the actual population you serve, is how this becomes a competitive advantage instead of a risk.
The environmental case and the trust case are the same case. And on Earth Day 2026, with $1.4 billion a year flowing into healthcare AI and institutional trust at a generational low, the health systems that align those cases are the ones that will still be standing when the reimbursement model fully turns over.
Where to Go From Here
If you are evaluating an AI deployment right now and you want a structured way to pressure-test it against trust outcomes, the Trust ROI Calculator at patientexperiencestrategist.com models the retention and revenue impact of each of the five Trust Algorithm signals. It takes about fifteen minutes. It will give you a number your CFO will take seriously.
If you are further along and want to talk through how to build a trust-centered AI evaluation process into your governance, I open a small number of strategic thought partner conversations each month. No pitch. No pressure. Just a working session on where trust strategy fits in your next twelve months of AI decisions. You can book one here: Strategy Call.
And if this issue resonated with someone on your executive team, forward it. The conversations that change how health systems deploy AI start at the level of who gets the memo first.
Let’s get to work,
Ebony
About Your Strategist
My name is Ebony Langston, and I spent 20+ years leading sales and operations for Fortune 100 healthcare organizations, driving millions in revenue growth by championing client-centric solutions. Today, I use that executive-level expertise, paired with my own personal experience navigating fragmented care, to position you as the visionary who can connect the dots between financial health, operational efficiency, and a truly human-centered patient experience.
I’m here to help you become a trusted partner for your patients.

