Last year I pulled up a vendor contract because someone in operations was complaining about response times. The SLA said four-hour response for priority incidents. I asked the vendor for their performance reports. They sent over a PDF that showed 99.2% compliance.
Sounded great. Except it wasn’t.
When I matched their report against our internal ticket logs, the numbers didn’t line up. They were counting “response” as an automated acknowledgment email, not an actual human looking at the issue. By our definition (and by the contract’s, if you read the clause carefully), they’d missed the SLA on priority incidents 23 times in one quarter. Each miss triggered a service credit. Nobody had ever claimed a single one.
That was one vendor. I had 40 others with SLA terms I wasn’t tracking at all.
The problem isn’t the contract. It’s the Tuesday after.
I’ve written before about how most of the contract lifecycle happens after signature. But SLA tracking is where this gets concrete and expensive.
A 2026 report from World Commerce & Contracting and Ironclad put the average post-signature value leakage at 11% of total contract value. For a company with $500 million in contracted spend, that’s $55 million a year. The report breaks that down into categories: unauthorized scope changes (2 to 3%), missed price adjustments (1 to 2%), and dormant clauses that nobody ever activates (another 1 to 2%). Service-level failures and weak enforcement account for a meaningful chunk of the rest.
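The arithmetic behind those numbers is worth making concrete. Here's a minimal sketch using the report's figures; the category midpoints are my own illustrative choice from the quoted ranges, not numbers from the report:

```python
# Post-signature value leakage, using the report's headline figures.
# Category rates are midpoints of the quoted ranges (my assumption).
contracted_spend = 500_000_000   # $500M in contracted spend
leakage_rate = 0.11              # 11% average post-signature leakage

total_leakage = contracted_spend * leakage_rate
print(f"Total annual leakage: ${total_leakage:,.0f}")  # $55,000,000

categories = {
    "unauthorized scope changes": 0.025,  # midpoint of 2-3%
    "missed price adjustments": 0.015,    # midpoint of 1-2%
    "dormant clauses": 0.015,             # midpoint of 1-2%
}
explained = sum(categories.values())
# What's left is largely service-level failures and weak enforcement.
print(f"SLA failures and other enforcement gaps: {leakage_rate - explained:.1%}")
```

Even at the midpoints, roughly half the leakage sits outside the named categories, which is why the enforcement bucket matters so much.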
That last category is the one that kept me up at night, because it was entirely within my control to fix and I just… hadn’t.
What I actually built
I want to be honest about this: the thing that saved us $200K started as a spreadsheet. Not a dashboard. Not a CLM module. A Google Sheet with columns.
Here’s what was in it:
Vendor name. Contract number. SLA metric. Target. Measurement method. Reporting frequency. Credit/penalty if missed. Who checks it.
That last column was the important one. Before the spreadsheet, nobody checked anything. The contracts had SLA terms. The vendors (theoretically) tracked them. But nobody on our side was comparing what was promised against what was delivered. Ever.
I started with our 10 highest-value vendor contracts and just… read the SLA sections. I know that sounds pathetically basic. It is pathetically basic. But I found penalty clauses nobody had ever invoked, credit thresholds nobody was monitoring, and two contracts where the vendor had quietly stopped sending quarterly performance reports and nobody noticed.
Within the first month of actually tracking, I identified $48,000 in unclaimed service credits across three vendors. Within six months, that number was over $200,000 if you included the pricing adjustments we caught (two vendors had missed contractual rate reductions that should have kicked in based on volume thresholds).
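The core check behind those findings is trivially simple: compare each incident's actual response time against the contractual target and count the misses. A sketch of that comparison, with entirely hypothetical ticket data and an illustrative per-miss credit (the real tracking lives in a spreadsheet, not code):

```python
# Hypothetical incident log: (ticket ID, hours until a *human* response).
# The automated acknowledgment email does not count.
SLA_TARGET_HOURS = 4      # contract target for priority incidents
CREDIT_PER_MISS = 2_000   # illustrative service credit per miss

incidents = [
    ("INC-101", 3.5),
    ("INC-102", 6.0),  # miss
    ("INC-103", 4.5),  # miss
    ("INC-104", 2.0),
]

misses = [ticket for ticket, hours in incidents if hours > SLA_TARGET_HOURS]
owed = len(misses) * CREDIT_PER_MISS
print(f"{len(misses)} misses -> ${owed:,} in unclaimed credits: {misses}")
```

The only subtle part is the definition of "response," which is exactly where my vendor's 99.2% figure fell apart.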
Why nobody does this
This isn’t complicated work. Reading a contract, writing down the obligations, checking whether they’re met. Any reasonably organized person can do it. So why doesn’t it happen?
The Hackett Group’s 2025 Key Issues Study found that procurement workloads increased by 10% while budgets grew just 1%, creating a 9% efficiency gap. Everyone is doing more with less. And the “more” is usually the urgent stuff: new deals, renewals coming due, compliance fires. SLA monitoring is important but never urgent. Until it is, and by then you’ve already lost the money.
The other reason is more psychological. Claiming service credits feels adversarial. You’re going to your vendor and saying “you didn’t deliver what you promised, and now I want money back.” That’s uncomfortable, especially when you have a good relationship. But here’s the thing: the SLA is in the contract for exactly this reason. The vendor agreed to it. They priced it in. Not claiming your credits isn’t being a good partner. It’s leaving money on the table that was explicitly built into the deal.
The spreadsheet grew up (a little)
I still use a spreadsheet for the actual tracking. But the workflow around it has matured.
Monthly check-ins on the top 10. Every month I pull the performance data for our highest-value contracts and compare it against SLA targets. This takes about two hours. It’s not glamorous. But those two hours have consistently been the highest-ROI time on my calendar.

Quarterly reviews for the next 20. Medium-value contracts get a quarterly look. Same process, lower frequency.
Date alerts in ContractSafe for credit claim windows. A lot of SLA credit clauses have time limits. You might only have 30 or 60 days after a miss to file a claim. Miss that window and the credit expires, even if the failure is well-documented. I set these up as alerts in ContractSafe so I don’t have to remember them. The system reminds me.
A one-page “SLA cheat sheet” for each major vendor. Not the full contract. Just the key metrics, the targets, the penalty structure, and who to contact. I keep these in a shared folder so that if someone else on the team needs to flag an issue, they know exactly what to look for.
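The claim-window math is mechanical enough to sketch. This is a hypothetical check, not how ContractSafe works internally; the window lengths, vendor names, and dates are all illustrative:

```python
from datetime import date, timedelta

# Many SLA credit clauses expire 30 or 60 days after a miss.
CLAIM_WINDOW_DAYS = 30  # illustrative; check each contract's clause

def claim_deadline(miss_date: date, window_days: int = CLAIM_WINDOW_DAYS) -> date:
    """Last day a service credit can still be filed for a given miss."""
    return miss_date + timedelta(days=window_days)

def days_remaining(miss_date: date, today: date) -> int:
    """Days left to file a claim; negative means the credit has expired."""
    return (claim_deadline(miss_date) - today).days

today = date(2025, 3, 15)  # fixed so the example is reproducible
for vendor, miss in [("Vendor A", date(2025, 3, 1)),
                     ("Vendor B", date(2025, 2, 1))]:
    left = days_remaining(miss, today)
    status = f"{left} days left" if left >= 0 else "EXPIRED"
    print(f"{vendor}: miss on {miss}, claim window: {status}")
```

The point of the alerts is that "EXPIRED" should never happen to a documented miss. A well-documented failure is worth nothing once the window closes.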
The $200K breakdown
People always ask what the $200K was made of. Fair question. Here’s the rough split from our first year of tracking:
Unclaimed service credits from missed response-time SLAs: about $48,000. These were straightforward. The vendor missed the target, the contract specified a credit, nobody had ever asked for it.
Volume-based pricing adjustments that hadn’t been applied: about $87,000. Two vendors owed us lower rates once we hit certain usage thresholds. We’d hit them months earlier. The lower rates never showed up on invoices because nobody told the vendor to apply them. (They’re not going to tell you.)
One auto-renewal we caught and renegotiated instead: about $65,000. This isn’t technically SLA-related, but the spreadsheet is what surfaced the renewal date in time. The existing deal had escalation clauses that would have added $65,000 over the next term. We renegotiated flat pricing because we started the conversation 90 days out instead of 9.
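The split above adds up cleanly, which is worth a quick sanity check (rounded figures from this post):

```python
# Year-one savings breakdown, rounded to the figures quoted above.
savings = {
    "unclaimed service credits": 48_000,
    "volume-based pricing adjustments": 87_000,
    "auto-renewal renegotiation": 65_000,
}
total = sum(savings.values())
print(f"Total: ${total:,}")  # $200,000
```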
None of this was adversarial. Every conversation was “hey, per section 4.3 of our agreement, this credit applies” or “it looks like the volume tier adjustment hasn’t been reflected yet.” Every vendor honored it without pushback. They expected us to track this. They were just surprised nobody had been.
You don’t need a system for this. You need a habit.
Deloitte calls post-signature the phase where contract value is “won or lost.” That phrase stuck with me because it captures something most of us know intuitively but don’t act on: all the negotiation effort in the world is worthless if nobody enforces what was agreed.
You can build this tracking in a spreadsheet. You can build it in your CLM. You can build it on sticky notes, though I wouldn’t recommend it. The tool genuinely does not matter. What matters is that somebody, once a month, sits down and asks: are our vendors delivering what they promised? And are we collecting what we’re owed?
I’ve been doing this for about 18 months now. The savings haven’t stopped at the year-one $200K. Not the same $200K (we fixed the original issues), but new findings keep surfacing. A late-delivery penalty here, a missed discount there, a quarterly report that shows a 97% compliance rate when the contract requires 99%.
The boring, tedious, manual act of reading a contract and comparing it to reality is the most valuable thing I do. And I’m never going to stop being a little embarrassed that I didn’t start doing it sooner.

