Traditional SaaS revenue recognition is already nuanced. Now add usage-based pricing, custom model training as a deliverable, outcome-based fees, and ongoing model updates that may or may not constitute new performance obligations — and you have a category of accounting problem that IFRS 15 and ASC 606 were not specifically designed to address.
That does not mean the standards do not apply. They do. But applying them to AI business models requires judgment calls that controllers and revenue accountants at AI startups encounter daily. This article walks through the most common pressure points, with plain-English explanations and practical examples.
Educational disclaimer: This is educational content, not professional accounting advice. Revenue recognition judgments depend on specific contract terms and facts. Users should consult a qualified accountant or auditor before making accounting policy decisions.
Why AI Business Models Are Different
Standard SaaS contracts are relatively tidy: a customer pays a monthly fee for access to software, and you recognize that fee ratably over the service period. The performance obligation is clear (software access), the transaction price is fixed, and the recognition period matches the subscription term.
AI startups rarely get that luxury. A few characteristics make AI revenue recognition structurally harder:
- Usage-based or consumption pricing. Many AI products charge per API call, per token, per inference, or per output generated. The total transaction price is not known at contract inception — and may vary significantly month to month.
- Custom model training as a deliverable. Some contracts include a distinct phase where the vendor trains a bespoke model on the customer's data. Is that training a separate performance obligation? Does the customer control the resulting model? These questions change when revenue is recognized.
- Ongoing model updates and retraining. AI models degrade without maintenance. Contracts that promise "the model will stay current" or "we'll retrain quarterly" may contain implicit performance obligations that need to be carved out and valued.
- Outcome-based or success fees. Some AI vendors charge based on results — a percentage of cost savings achieved, a fee per successful prediction, or a bonus tied to accuracy thresholds. These are variable consideration arrangements with constrained estimates.
- Data licensing bundled into the contract. If the AI product only works because the vendor provides proprietary training data — and that data access is part of what the customer is paying for — then data licensing may be a separate performance obligation.
None of these are unsolvable. But each requires a deliberate analysis under the five-step model. Working through that analysis is exactly what ClearRevenue's interactive guide is built for.
Step 2: Identifying Performance Obligations When AI Is Bundled
The most common place AI contracts go wrong from a revenue recognition standpoint is Step 2: identifying performance obligations.
Under both IFRS 15 and ASC 606, a performance obligation exists when a promised good or service is distinct — meaning the customer can benefit from it on its own (or together with other readily available resources), and it is separately identifiable from other promises in the contract.
In a bundled AI offering, common elements that practitioners evaluate for distinctness include:
- Software platform access — typically a stand-ready obligation recognized over time
- Initial model setup or configuration — may or may not be distinct depending on whether the customer could use the platform without it
- Custom model training — more likely distinct if the customer receives and can use the trained model independently; less likely distinct if the model only runs on the vendor's platform
- Training data access — consider whether the customer is paying for the data itself or just for what the model does with it
- Ongoing support and retraining — a series of distinct services if they are substantially the same and have the same pattern of transfer
A practical starting point: read the contract and ask what the customer would lose if you removed each element. If removing it would leave the customer with something materially less valuable that they could not readily substitute, that element is likely a distinct performance obligation.
Getting this right matters because how you count performance obligations determines how you allocate the transaction price — and therefore when revenue hits.
Step 3: Variable Consideration in Usage-Based and Outcome-Based Pricing
Usage-based AI pricing creates a variable consideration problem. The transaction price is not fixed at contract inception — it depends on how much the customer actually uses the product.
Under IFRS 15 (paragraph 53) and ASC 606 (ASC 606-10-32-8), variable consideration should be estimated using either the expected value method (a probability-weighted average across scenarios) or the most likely amount method, whichever better predicts the amount the vendor will be entitled to. That estimate is then constrained: revenue is included only to the extent it is highly probable (IFRS 15.56) or probable (ASC 606-10-32-11) that a significant reversal will not occur when the uncertainty is resolved.
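The two estimation methods can be illustrated with a short sketch. The scenarios, amounts, and probabilities below are hypothetical assumptions, not guidance on what a reasonable estimate looks like:

```python
# Illustrative sketch (not accounting advice): estimating variable
# consideration for a usage-based contract under the two methods.
# All scenario amounts and probabilities are hypothetical.

scenarios = [
    # (estimated annual usage revenue, probability)
    (120_000, 0.25),  # low usage
    (180_000, 0.50),  # expected usage
    (260_000, 0.25),  # high usage
]

# Expected value method: probability-weighted average across scenarios.
expected_value = sum(amount * p for amount, p in scenarios)

# Most likely amount method: the single most probable outcome.
most_likely = max(scenarios, key=lambda s: s[1])[0]

print(f"Expected value: ${expected_value:,.0f}")  # $185,000
print(f"Most likely:    ${most_likely:,.0f}")     # $180,000
```

Note that the constraint is applied after this estimate: the amount actually recognized may be lower than either figure if a significant reversal is a real possibility.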
For many AI companies, the practical approach looks like this:
- Tiered or minimum-commitment contracts: Recognize the guaranteed minimum over the commitment period as the stand-ready service is provided; recognize usage above the minimum as the customer uses the product. The variable consideration allocation guidance for a series (ASC 606-10-32-40; IFRS 15.85) or the right-to-invoice practical expedient (ASC 606-10-55-18; IFRS 15.B16) may support this pattern.
- Pure pay-as-you-go with no minimums: Recognize revenue as usage occurs, since the transaction price for each increment of usage is known at the time of delivery.
- Outcome-based or success fees: These are highly constrained variable consideration. Practitioners typically do not recognize success fees until the outcome is determined, or until meeting the constraint is clearly supportable, because including an estimate earlier would likely result in a significant revenue reversal.
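For the minimum-commitment pattern, the monthly mechanics reduce to simple arithmetic. The sketch below assumes a hypothetical contract with a $10,000 monthly minimum, 5,000,000 included API calls, and overage billed at $2 per 1,000 additional calls; none of these figures come from a real contract:

```python
# Hypothetical sketch: monthly revenue under a minimum-commitment
# usage contract. The minimum is recognized as the stand-ready
# service is provided; overage is recognized as usage occurs.

MONTHLY_MINIMUM = 10_000            # dollars, recognized each month
INCLUDED_CALLS = 5_000_000          # API calls covered by the minimum
OVERAGE_RATE_PER_1000 = 2           # dollars per 1,000 calls above that

def monthly_revenue(api_calls: int) -> int:
    """Revenue for one month: minimum plus any overage."""
    overage_calls = max(0, api_calls - INCLUDED_CALLS)
    return MONTHLY_MINIMUM + overage_calls * OVERAGE_RATE_PER_1000 // 1000

print(monthly_revenue(4_200_000))   # under the minimum -> 10000
print(monthly_revenue(6_500_000))   # 1.5M overage calls -> 13000
```

The point of the sketch is the shape of the pattern, not the numbers: the fixed floor follows the stand-ready service, and only the increment above it behaves like pure pay-as-you-go.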
One thing worth documenting carefully in your revenue memo: the rationale for your variable consideration estimate and the constraint applied. Auditors will ask. ClearRevenue's revenue memo template includes a dedicated section for this.
Step 3 Continued: SSP Estimation Without Market Comparables
If a contract has multiple performance obligations, the transaction price must be allocated to each based on relative standalone selling prices (SSP). For most enterprise software, SSP is observable — you can look at what you charge customers when you sell that element on its own.
AI startups frequently lack observable SSP data because:
- Custom model training is rarely sold as a standalone service — it's almost always bundled
- Ongoing retraining and maintenance have no established market rate for novel AI models
- The AI product category is new enough that there are few or no market comparables
When observable SSP is not available, IFRS 15.79 and ASC 606-10-32-34 allow estimation using approaches such as:
- Adjusted market assessment: Look at prices charged by competitors for similar services, adjusted for your product's specific characteristics
- Expected cost plus margin: Estimate the cost to fulfill the obligation and add an appropriate margin
- Residual approach: Subtract the SSP of all other obligations from the total transaction price — but this is only allowed when SSP is highly variable or uncertain for one element
In practice, most AI startups use a combination of cost-plus (for custom training) and adjusted market assessment (for platform access and support). The key is documentation: your SSP estimates need to be supportable and consistently applied.
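Two of these methods reduce to straightforward calculations once the inputs are estimated. The sketch below uses entirely hypothetical figures to show expected cost plus margin and the residual approach side by side:

```python
# Hypothetical sketch of two SSP estimation approaches for a bundled
# AI contract. All figures are illustrative assumptions.

# Expected cost plus margin: estimate the cost to fulfill the
# obligation, then add a target margin.
training_cost = 90_000        # estimated cost to train the custom model
TARGET_MARGIN_PCT = 40        # assumed margin on services
ssp_training = training_cost * (100 + TARGET_MARGIN_PCT) // 100

# Residual approach (only permitted when SSP for one element is highly
# variable or uncertain): total transaction price minus the SSP of the
# other performance obligations.
total_price = 400_000
ssp_platform = 220_000        # observable from standalone platform sales
ssp_retraining = 60_000       # cost-plus estimate
ssp_residual_training = total_price - ssp_platform - ssp_retraining

print(ssp_training)           # 126000
print(ssp_residual_training)  # 120000
```

The adjusted market assessment method has no formula to sketch: it is a judgment about comparable prices, adjusted for product differences, and lives mostly in the documentation of that judgment.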
ClearRevenue's SSP Estimator walks through all three methods with live calculators and helps you document the rationale behind your chosen approach.
Step 5: Over-Time vs. Point-in-Time for AI Model Development
Step 5 requires determining whether each performance obligation is satisfied over time or at a point in time. For custom model training, this is where the analysis gets genuinely complex.
A performance obligation is satisfied over time if any of three criteria are met (IFRS 15.35, ASC 606-10-25-27):
- The customer simultaneously receives and consumes the benefits as the vendor performs
- The vendor's performance creates or enhances an asset that the customer controls as it is created
- The vendor's performance does not create an asset with an alternative use to the vendor, and the vendor has an enforceable right to payment for performance completed to date
For AI model training, the analysis often turns on criterion 3. Consider the following scenarios:
- Model trained on customer's proprietary data, hosted on customer's infrastructure: The model has limited alternative use to the vendor (it was built for this customer's data), and if the contract provides an enforceable right to payment for performance completed to date (for example, through progress payments), over-time recognition is likely supportable.
- Model trained on the vendor's general data, deployed on the vendor's platform: The model has significant alternative use (the vendor can use the training work to improve their general model), so criterion 3 likely does not apply. Recognition would be at the point in time the model is delivered and the customer accepts it.
- Ongoing model updates and retraining: If these are a series of distinct services that are substantially the same, practitioners typically apply the series guidance and recognize revenue over the service period as each update is delivered.
Getting this wrong can cause material revenue timing differences — recognizing a large training contract at a single point in time versus ratably over a development period can significantly affect quarterly financials.
A Practical Example
Consider an AI startup that signs a $500,000 annual contract with a mid-market logistics company. The contract includes:
- Access to a demand forecasting platform ($200K/year)
- Custom model training on the customer's historical shipment data ($150K one-time)
- Quarterly model retraining and maintenance ($50K/year)
- A success fee of $100K if forecast accuracy exceeds 92% by month 12
A practitioner working through this contract would typically:
- Identify three performance obligations: platform access (stand-ready, over time), custom training (point-in-time or over-time depending on the control analysis), and quarterly retraining (a series of distinct services). The success fee is variable consideration attached to those obligations, likely fully constrained, rather than a separate performance obligation.
- Estimate SSP for each: Platform access may have observable SSP from other contracts. Training and retraining would use cost-plus. The success fee estimate would likely be constrained to zero until the accuracy threshold is probable.
- Allocate $400K (excluding constrained variable consideration) across the three non-constrained obligations based on relative SSP.
- Recognize revenue: Platform access ratably over 12 months; custom training at delivery (or over the training period if criterion 3 is met); retraining ratably over each quarter; success fee when (and if) the threshold is met.
This is a simplified illustration — actual contracts are messier. But the logic holds across most AI revenue arrangements.
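The allocation step in this example can be sketched numerically. The SSP figures below are assumptions added for illustration; the $200K stated contract price for platform access happens to match its allocated amount only because of the numbers chosen:

```python
# Sketch of the relative-SSP allocation for the example contract.
# SSP figures are assumed; the success fee is constrained to zero
# and excluded from the allocation.

transaction_price = 400_000   # fixed consideration, success fee excluded

ssp = {
    "platform_access": 220_000,  # observable from other contracts (assumed)
    "custom_training": 160_000,  # cost-plus estimate (assumed)
    "retraining": 60_000,        # cost-plus estimate (assumed)
}

total_ssp = sum(ssp.values())    # 440,000

# Allocate the fixed price in proportion to each obligation's SSP.
allocation = {
    obligation: round(transaction_price * price / total_ssp)
    for obligation, price in ssp.items()
}

print(allocation)
# {'platform_access': 200000, 'custom_training': 145455, 'retraining': 54545}
```

Each allocated amount is then recognized under that obligation's own pattern: ratably for platform access, at delivery or over the training period for custom training, and over each quarter for retraining.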
Checklist: Common AI Revenue Recognition Pressure Points
When reviewing an AI contract, consider whether each of the following has been addressed:
- Have all bundled elements been evaluated for distinctness as separate performance obligations?
- Is custom model training over-time or point-in-time, based on the control analysis?
- Is usage-based pricing recognized as usage occurs, or does a minimum commitment affect the pattern?
- Are outcome-based fees appropriately constrained in the variable consideration estimate?
- Is SSP observable, or does it require an estimation method — and is that method documented?
- Do ongoing model updates represent distinct performance obligations that need to be allocated separately?
A structured review process matters here. ClearRevenue's 25-point audit checklist covers each step of the five-step model and flags the pressure points most commonly raised in SaaS and AI contract reviews.
Where to Go From Here
AI startup revenue recognition is not going to get simpler. As business models evolve — multi-model arrangements, AI agents that act autonomously on behalf of customers, hybrid licensing-plus-inference pricing — the accounting questions will multiply.
The tools that help most are also the simplest: a clear process, consistent judgment, and documentation that can withstand auditor scrutiny.
If you are building out your revenue recognition process for the first time or reviewing it ahead of an audit:
- Interactive 5-Step Guide — walk through the IFRS 15 / ASC 606 framework with SaaS and AI examples at each step
- Audit Checklist — 25 items across all five steps, with common AI pitfalls flagged
- Revenue Memo Template — a structured template for documenting your policy decisions in audit-ready format
- SSP Estimator — live calculators for all three SSP estimation methods
For finance teams handling more complex arrangements — M&A, restatements, or structures outside the standard SaaS playbook — specialized consultants remain the right call. For the 80% of AI startup contracts that fall within familiar patterns, a structured internal review process covers most of the ground.
Educational disclaimer: This article is educational content only and does not constitute professional accounting, legal, or tax advice. Revenue recognition judgments depend heavily on specific contract terms, facts, and applicable accounting standards. Users should consult a qualified accountant or auditor before making accounting policy decisions.