Until recently, tracking AI usage costs was like trying to do FinOps blindfolded. You could see the bill at the end of the month, but you had no way to drill down into which team, model, or workload was driving spend.
OpenAI changed that by exposing a Usage API and Cost API, and this week Anthropic followed suit with a brand new Usage & Cost Admin API. For FinOps practitioners, this is big news. Let’s break it down.
OpenAI’s APIs are already fairly mature. With an Admin key, you can fetch daily token usage and daily costs, grouped by project, model, or API key.
It’s straightforward, stable, and already battle-tested by enterprises. Many teams use it today to run daily jobs that pull spend and token usage directly into their cost dashboards.
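A daily pull job along those lines can be sketched in a few lines of Python. This is a minimal sketch, assuming the OpenAI organization Costs endpoint and its `start_time` / `bucket_width` / `group_by` parameters; `OPENAI_ADMIN_KEY` is an assumed environment-variable name, and the response shape in `total_spend` is the Costs API's bucketed format.

```python
import os
import time

OPENAI_BASE = "https://api.openai.com/v1/organization"

def build_costs_request(days_back: int = 1, group_by: str = "project_id") -> dict:
    """Build a request spec for the Costs endpoint: daily buckets,
    grouped by project (an assumed default for team attribution)."""
    start = int(time.time()) - days_back * 86400
    return {
        "url": f"{OPENAI_BASE}/costs",
        "params": {"start_time": start, "bucket_width": "1d", "group_by": [group_by]},
        # Admin key, not a regular project API key
        "headers": {"Authorization": f"Bearer {os.environ.get('OPENAI_ADMIN_KEY', '')}"},
    }

def total_spend(cost_response: dict) -> float:
    """Sum the dollar amounts across all daily buckets in one response page."""
    return sum(
        result["amount"]["value"]
        for bucket in cost_response.get("data", [])
        for result in bucket.get("results", [])
    )
```

From there it’s one `requests.get(spec["url"], params=spec["params"], headers=spec["headers"])` per page, following the pagination cursor until it’s exhausted, then loading the buckets into whatever dashboard you already use.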
Anthropic’s new Admin API is fresh out of the oven, but it’s surprisingly robust: usage and cost reports arrive in daily buckets, with grouping options such as workspace, model, and service tier.
One nuance: Priority Tier usage doesn’t show up in cost reports – you’ll only see it via the usage endpoint. That’s something FinOps teams will need to stitch together themselves.
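That stitching job is mostly a join on the bucket date. A minimal sketch: the `x-api-key` / `anthropic-version` headers and the admin-key requirement match Anthropic's Admin API, but `ANTHROPIC_ADMIN_KEY` is an assumed variable name, and the flat bucket dicts below are simplified illustrations, not the API's exact response schema.

```python
import os

ANTHROPIC_BASE = "https://api.anthropic.com/v1/organizations"

def admin_headers() -> dict:
    """Admin endpoints authenticate with an admin key via x-api-key."""
    return {
        "x-api-key": os.environ.get("ANTHROPIC_ADMIN_KEY", ""),
        "anthropic-version": "2023-06-01",
    }

def stitch_priority_usage(usage_buckets: list, cost_buckets: list) -> list:
    """Join per-day token usage (which includes Priority Tier) onto per-day
    cost rows. Priority Tier days have no cost row, so cost stays None and
    the FinOps pipeline must price those tokens itself."""
    costs_by_date = {b["date"]: b["amount"] for b in cost_buckets}
    return [
        {
            "date": u["date"],
            "tokens": u["tokens"],
            "tier": u.get("service_tier", "standard"),
            "cost": costs_by_date.get(u["date"]),  # None when unpriced
        }
        for u in usage_buckets
    ]
```

The `None` cost is deliberate: it surfaces exactly the gap described above instead of silently reporting Priority Tier traffic as free.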
Both APIs now let you track token consumption and daily costs, and attribute spend across teams or projects. The differences? Mostly maturity and vocabulary: OpenAI’s endpoints are battle-tested and group spend by project, while Anthropic’s are brand new, group by workspace, and carry quirks like the Priority Tier gap above.
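Because both providers expose the same basic shape (a dated bucket, a team-like dimension, a model, an amount), a FinOps pipeline can normalize them into one record type. This is an illustrative schema, not either vendor’s exact response format; the field names and input shapes are assumptions.

```python
from dataclasses import dataclass

@dataclass
class AiCostRecord:
    """Provider-agnostic daily cost record (illustrative schema)."""
    provider: str
    date: str
    team: str        # OpenAI project / Anthropic workspace
    model: str
    amount_usd: float

def from_openai(bucket: dict) -> AiCostRecord:
    # Assumed simplified bucket: {"date", "project_id", "model"?, "amount": {"value"}}
    return AiCostRecord("openai", bucket["date"], bucket["project_id"],
                        bucket.get("model", "all"), bucket["amount"]["value"])

def from_anthropic(bucket: dict) -> AiCostRecord:
    # Assumed simplified bucket: {"date", "workspace_id", "model"?, "amount"}
    return AiCostRecord("anthropic", bucket["date"], bucket["workspace_id"],
                        bucket.get("model", "all"), float(bucket["amount"]))
```

Once both feeds land in `AiCostRecord`, the rest of the pipeline (dashboards, allocation, anomaly detection) no longer cares which vendor the tokens came from.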
The important part: the direction is clear. Every serious AI provider will need to expose billing APIs if they want enterprise adoption at scale.
As AI adoption grows, so do the bills. These APIs finally let FinOps teams treat AI services like any other cloud resource: metered, attributed to owners, budgeted, and optimized.
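“Like any other cloud resource” concretely means the same showback and budget checks you’d run on EC2 bills or Kubernetes namespaces. A minimal sketch, with illustrative record and budget shapes:

```python
from collections import defaultdict

def showback(records: list, budgets: dict) -> dict:
    """Roll up AI spend per team and flag budget overruns.
    records: [{"team": ..., "amount_usd": ...}], budgets: {team: limit}."""
    totals = defaultdict(float)
    for r in records:
        totals[r["team"]] += r["amount_usd"]
    return {
        team: {
            "spend": spend,
            # Teams without a configured budget are never flagged
            "over_budget": spend > budgets.get(team, float("inf")),
        }
        for team, spend in totals.items()
    }
```

Feed it yesterday’s normalized records on a schedule and the overrun flags become alerts, exactly as they would for any other line item.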
FinOps is about accountability and optimization. Without programmatic access to usage and cost, you’re flying blind. With it, you’re back in control.
OpenAI set the pace. Anthropic is catching up fast. And the FinOps community wins either way.
If your teams are building with GPT or Claude, now is the time to bring their usage into the same FinOps workflows you already use for AWS or Kubernetes. Because AI costs aren’t “special” anymore – they’re just another line item in your infrastructure bill, and they deserve the same level of visibility and discipline.