Tackling your board's next big question

Are you sleepwalking into dark debt?

April 24 | 8 min read | By Tim Cooper

TL;DR

AI providers are investing hundreds of billions of dollars a year in data center infrastructure, much of which isn't reflected in current prices. They’ll have to recover those investments through price hikes eventually. Getting hooked on technology that becomes expensive later sounds painful. Being left behind in the AI race sounds worse. That’s the “dark debt” dilemma. So, what are CFOs doing about it?

  • Shadow liabilities. Subsidized compute costs are leaving businesses exposed to future inflation risk.

  • Agility premium. Finance chiefs are mapping tech costs directly against business outcomes, and avoiding fixed deals to maintain flexibility.

  • Layer of freedom. Building enterprise-wide agentic AI, under your control, helps avoid vendor lock-in.

Please don’t vibe code your ERP because you read about it on LinkedIn. Build these 10 apps instead.

Secret CFO is right: leave production software to engineers. But one-off tools and custom dashboards? Fair game.

Aleph's Finance Vibe Coding Playbook gives you a build-vs-buy decision guide, plus 10 apps you can build yourself, with copy-and-paste prompts. Scenario modelers, automated Slack digests, and more. 

Artificial Intelligence is being piped into our smartphones, laptops, vacuum cleaners, and anything else you can think of… whether we ask for it or not.

But have you thought about where that AI comes from? What it really costs to produce - and use - all that resource-hungry tech? And, what that could mean for your P&L in the future?

Contract risk

The non-profit Technology Business Management Council (TBMC) warned that AI spend is a looming crisis as lock-in contracts – potentially worth millions – are often signed without a way to measure the unit cost of the service or its actual business yield. Most CFOs don't see the technology value chain, the organization said.

Many IT contracts embed or allow for price rises during their term. That applies across different types of software and hardware, including AI tools. 

And the problem is that the current pricing structures don’t reflect the true cost of producing these tools. 

These hikes can compound quickly over, say, three or five years. And the huge upfront costs of AI mean a lock-in of any type could expose CFOs to stomach-churning price hikes in the future.
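To see how quickly a recurring uplift compounds, here's a minimal sketch. The base cost and uplift rates are illustrative assumptions, not figures from any vendor contract:

```python
# Hypothetical illustration: how a recurring annual price uplift compounds.
# All figures are assumptions for the sketch, not real contract data.
def compounded_cost(base: float, annual_uplift: float, years: int) -> float:
    """Cost in the given year if the price rises uniformly each renewal."""
    return base * (1 + annual_uplift) ** (years - 1)

base = 100_000  # assumed year-one subscription cost, USD
for uplift in (0.08, 0.12, 0.15):
    year5 = compounded_cost(base, uplift, 5)
    print(f"{uplift:.0%} annual uplift -> year-5 cost ${year5:,.0f}")
```

Even a "modest" 12% annual uplift leaves you paying over half again your original price by year five of a five-year lock-in.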

“Organizations are moving faster than they can forecast spend – 78% say AI adoption is outpacing the ability to manage risks. Future hazards remain hidden,” said Amanda Donohue, EY Americas finance technology sector leader. “The biggest risks we’re seeing with enterprises come from this AI rush.” 

The AI supply chain

There’s reason for concern. The cost structure for AI runs deep, and what you pay today doesn't come close to covering it. With AI model costs dominated by infrastructure, not just clever software, the true expense is hard to determine.  

It starts with the chips. Roughly 80% of the processors that power the AI supply chain come from a single manufacturer—Nvidia—which reported $130.5 billion in fiscal 2025 revenue.

A single Nvidia GPU rack costs around $3 million and draws as much electricity at any given moment as 40 American homes. The hyperscalers are buying thousands at a time.

The chips are shipped to servers in massive data centers run by cloud providers. And those data centers are expensive. The four largest AI hyperscalers alone (Amazon, Google, Meta, and Microsoft) spent approximately $413 billion on data center capital expenditures just in 2025—an 84% increase from $224 billion in 2024, according to research by IT solutions provider Brightlio. McKinsey projects that by 2030, data centers will require $6.7 trillion worldwide to meet demand – an insane amount of money. 
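The cited growth rate checks out arithmetically, as a quick back-of-envelope confirms (figures are the Brightlio estimates quoted above):

```python
# Sanity-check the capex growth figure cited from Brightlio's research.
capex_2024 = 224e9  # hyperscaler data center capex, 2024 (USD)
capex_2025 = 413e9  # hyperscaler data center capex, 2025 (USD)

growth = capex_2025 / capex_2024 - 1
print(f"year-over-year growth: {growth:.0%}")  # ~84%, matching the cited increase
```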

Next in the chain are the foundational models, such as OpenAI’s ChatGPT and Anthropic’s Claude. Their revenues, which run into the tens of billions, come chiefly from subscriptions and enterprise partnerships (OpenAI) and API use (Anthropic). 

Every interaction consumes tokens, the unit by which AI compute is measured and sold. There are a lot of costs folded into each token. So, whether a user is stress-testing a financial model or planning tonight's dinner, every interaction consumes compute—and that compute represents a real cost to providers: amortized hardware, model development spend, and raw power draw.

So, so much power: Brookings estimates that if data centers were a country, they'd be the fifth largest energy consumer in the world, right between Russia and Japan.

Right now, tokens are being sold to end users at a loss at the gross margin level. That's not a sustainable position, and something will have to give. Either unit economics improve dramatically, prices move up, or both. CFOs running meaningful AI workloads should expect the current structure to shift. Heavy enterprise users will likely feel it first.
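A toy unit-economics sketch makes the loss-making position concrete. Both the price and the serving cost below are illustrative assumptions, not any provider's actual figures:

```python
# Back-of-envelope token economics. All numbers are illustrative
# assumptions, not actual provider pricing or cost data.
def gross_margin(price_per_m_tokens: float, cost_per_m_tokens: float) -> float:
    """Gross margin as a fraction of revenue on one million tokens served."""
    return (price_per_m_tokens - cost_per_m_tokens) / price_per_m_tokens

# Assumed: tokens priced at $10 per million but costing the provider $14
# to serve (amortized hardware + model development + power).
margin = gross_margin(10.0, 14.0)
print(f"gross margin: {margin:.0%}")  # negative: every token sold deepens the loss
```

Closing a gap like that requires some combination of cheaper compute per token and higher prices, which is exactly the squeeze heavy enterprise users should plan for.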

Then there are products and applications built on these models, such as Campfire and Brex. Businesses cover the costs of models and apps, but the underlying infrastructure - the chips and data centers - is far more opaque. Ownership and payment structures here are complex, leading the OECD to identify the following risks:

  • abuse and bottlenecks caused by dominant providers

  • lock-in by cloud platforms

  • lack of transparency around investments and partnerships.

Prices will have to catch up with costs

Cloud and model providers plan to gain enough scale to cut these costs and turn a profit eventually. Continuous tech advancements could significantly boost efficiency. But it’s a massive challenge because the compute cost of reasoning‑heavy models is currently much higher than for other types of software.

Much of the impact is still to come. But we’re getting clues about where it’s heading:

  • Cloud providers are already passing costs down the value chain

  • Consumption-based pricing and AI add-ons make budgets harder to predict and control

  • Use-based pricing, AI upgrades, and shifts into higher-tier plans inflate costs unexpectedly mid-contract

  • AI exacerbates waste as unused tech becomes more expensive

  • Proactive management of software costs – including continuous visibility over use and renewal discipline – is necessary.

The forecast minefield

Michael Perica, CFO of enterprise software support provider Rimini Street, said the phrase dark debt “is absolutely appropriate and is being felt” across the tech stack. Re-platforming software is often expensive, he said, so you’re beholden to providers at renewal.

“Our clients are reporting compounded, double-digit subscription fee increases – unforecasted, unbudgeted, and more restrictive. Innovation can also become hamstrung because you’re limited to the AI in their solution,” he said. “Negotiating better contracts is only a slight mitigation.”

But CFOs do have options for avoiding the AI dark debt trap, said Perica. 

He suggested:

  • Building a layer of enterprise-wide agentic AI under your control. This can allow innovative workflows into your existing systems without committing to a single model or vendor.

  • Treating the CIO relationship as infrastructure. A strong CFO-CIO partnership makes execution seamless.

Conviction in AI

But not everyone sees a price-apocalypse on the horizon. Matthias Steinberg, CFO at audit and risk intelligence provider MindBridge, is less pessimistic about the future costs of AI. 

“We are rapidly getting more dependent on AI tooling. If those services are subsidized by investors, will they at some point turn up the pricing? I don’t know,” he said. “There’s little track record to guide our assumptions and trajectories. But trying to build a narrow, bottom-up, quantitative business and ROI case can’t work.”

“For example, we are leaning into agentic coding, taking a risk on the future cost. To forecast how it can improve developer metrics, we use complex, wide-ranging assumptions (from 20% more to 10 times more efficiency) because there’s no [more precise] data. What makes me support these risks is our high-level conviction that AI is here to stay, regardless of which providers endure.”

Market competition and continuous improvements will ensure vendors don’t unfairly push up prices, said Steinberg.

“I’m not saying it will be cheap, but if we don’t do it, our competitors will and they’ll catch up. It will evolve rapidly – we just need to continue doing the math, measuring use cases, and accept that the potential dark cost can’t be quantified now,” he said.

Killing innovation and research

But that’s not the only risk. There’s a danger that, as companies dramatically reallocate resources to AI projects, other parts of the business will suffer. 

Getting trapped in high-cost commitments can cannibalize your budget for innovation. Companies also risk taking their eye off the ball on customer experience due to their obsession with AI-led productivity and cost control.

Matthew Guarini, executive director of TBMC, said there’s a risk that companies are “reducing investment on current priorities, then not getting returns where you reallocate to AI.”

Cost of flexibility

Steinberg said his firm typically sticks with annual subscriptions to enable flexibility. “If I need to switch to a better product after six months, I’ve wasted the other half of the year. It’s annoying but not catastrophic,” he said.

To match cost to output by improving efficiency per token spent, Steinberg suggested giving teams:

  • KPIs

  • role profiles

  • training

Investing in FinOps should also help, he said.

On short versus long contracts, Steinberg said: “Things are changing by the hour. You need to accept the [added cost of more flexible contracts] until you get enough data to predict trajectories for a section of your technology. Then you can start using [cheaper long-term contracts with better terms] for that layer.”

If the board is aligned around your AI strategy and budget, including adoption speed and risk willingness, that should help offset any unseen tech cost pains you encounter.

Brian Chasin, CFO at rehabilitation center Soba New Jersey, is also employing a flexible and agile approach to avoid dark debt destroying ROI. 

“The enterprise technology world is littered with poorly-defined contracts. These result in stranded digital assets (if the volume fluctuates) because the licensing structure is not tied to actual use,” he said. “To prevent cloud or AI technology costs from increasing rapidly, I don’t sign multi-year, fixed-volume contracts. I require consumption-based contracts that correlate directly with our monthly clinical case numbers. If a tool doesn’t meet its target, say to reduce baseline cost per admission, within 90 days of implementation… we terminate the contract.”
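The consumption-linked discipline Chasin describes can be sketched as a simple monthly check. The function name, figures, and thresholds below are hypothetical, not Soba New Jersey's actual numbers:

```python
# Sketch of a consumption-linked contract review (hypothetical figures;
# not Soba New Jersey's actual contract terms or thresholds).
def keep_tool(monthly_spend: float, admissions: int,
              baseline_cost_per_admission: float) -> bool:
    """Keep the contract only if the tool beats the baseline cost per admission."""
    if admissions == 0:
        return False  # a stranded asset: paying for volume that never arrived
    return monthly_spend / admissions < baseline_cost_per_admission

# Assumed 90-day review: $18k/month spend, 120 admissions, $200 baseline.
print(keep_tool(18_000, 120, 200))  # 150 < 200 -> True, contract survives
```

Because spend scales with admissions, a volume dip automatically shrinks the bill rather than stranding a fixed-volume license.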

Reading the Room…

  • Stack exposure. Which existing software contracts allow vendors to reprice as AI is embedded?

  • Strategic conviction. Where do we back AI on conviction, and where do we demand proof of return?

  • Vendor lock-in. What is our plan if a key AI provider doubles its prices tomorrow?

  • Budget cannibalization. What non-AI investment are we deprioritizing to fund this AI adoption?

  • Flexibility strategy. Are we favoring consumption-based contracts until AI cost trajectories are clearer?

  • One-way doors. What flags a commitment as irreversible, and who has the authority to pause and review?

Boardroom Brief is presented by The Secret CFO Network

Dive into last week's Playbook for The Secret CFO’s advice on whether to buy, build, or borrow AI tools.

If you found this helpful, please forward it to your fellow finance leaders (and maybe even your Board). If this was forwarded to you, you can make sure you receive the next edition by subscribing here.