Why Your QoE Report Will Surface Data Problems (And How to Prevent It)

A quality of earnings report is supposed to validate your financials. It often ends up exposing your data infrastructure instead.

Here is how it usually goes. The sell-side QoE is almost done. The accounting firm asks for a reconciliation between your billing system and your GL. Your controller pulls the numbers. They do not match. Not by a lot, but by enough. Now everyone is chasing the variance, the timeline slips, and the buyer starts wondering what else does not tie out.

The accounting firms that prepare QoE reports are very good at finding numerical discrepancies. What they typically do not do is explain why the numbers diverge. They flag the adjustment. They do not diagnose the root cause.

That is the gap this piece addresses. Not what QoE adjustments look like, but why they happen and what you can do about the underlying data problems before someone else finds them.

What a QoE report actually tests

A QoE report validates the quality and sustainability of your reported earnings. The accounting firm is answering a simple question for the buyer: are these numbers real, and will they continue?

To do that, they test several things.

Revenue quality. Is revenue recognized correctly? Is it recurring or one-time? Are there large customer concentrations? Can revenue be verified against source documents?

Expense normalization. Which expenses are one-time, owner-related, or non-recurring? What is the real run-rate cost structure?

Working capital. What are the trends? Are there seasonal patterns? Is inventory properly valued?

EBITDA adjustments. What is the gap between reported EBITDA and adjusted EBITDA? Are the adjustments supportable?

Each of these tests requires pulling data from your systems and checking whether it is consistent, complete, and correct. That is where data infrastructure problems surface.

Why data infrastructure is the root cause most firms miss

When a QoE report flags an adjustment, the immediate response is usually to fix the number. Adjust the revenue recognition. Reclassify the expense. Write off the inventory.

But the number is the symptom. The root cause is almost always one of three things.

Systems that do not agree. Your CRM says you have 500 active customers. Your billing system shows 480 invoiced accounts. Your GL has revenue allocated across 12 revenue codes that do not map cleanly to either system. Each system is internally consistent. They just do not agree with each other.

Manual processes that introduce drift. Someone exports data from one system, transforms it in Excel, and imports a summary into another system. This works fine for months. Then someone changes a formula. Or forgets to run the export one month. Or includes a different set of accounts. Over time, small differences compound.

Definitions that changed without documentation. Two years ago, you defined “active customer” as anyone with a transaction in the last 12 months. Last year, marketing started using “active” to mean anyone who logged in. The board deck uses one definition. Finance uses another. Neither is wrong. But when the QoE team asks for active customer count, the answer depends on who you ask.
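The "active customer" split above can be made concrete with a short sketch. The records, dates, and 12-month window here are hypothetical; the point is that two reasonable definitions applied to the same data produce two different counts.

```python
from datetime import date, timedelta

# Hypothetical customer records with last transaction and last login dates.
customers = [
    {"id": 1, "last_transaction": date(2024, 3, 1), "last_login": date(2025, 1, 10)},
    {"id": 2, "last_transaction": date(2023, 5, 20), "last_login": date(2025, 2, 1)},
    {"id": 3, "last_transaction": date(2024, 11, 5), "last_login": date(2024, 11, 5)},
]

as_of = date(2025, 3, 1)
window = timedelta(days=365)

# Finance's definition: a transaction in the trailing 12 months.
active_finance = {c["id"] for c in customers if as_of - c["last_transaction"] <= window}

# Marketing's definition: any login in the trailing 12 months.
active_marketing = {c["id"] for c in customers if as_of - c["last_login"] <= window}

# Same source data, two defensible answers. Neither team is wrong,
# but a QoE request for "active customer count" now has two answers.
print(len(active_finance), len(active_marketing))
```

When both definitions live in code (or at least in a written data dictionary), the divergence is visible before a diligence team asks about it.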

These are not accounting problems. They are data problems. And they create QoE adjustments because the accounting firm cannot verify what they cannot reconcile.

The 5 most common data issues that show up in QoE reports

1. Revenue does not reconcile across systems

This is the most common finding and the most damaging. If the buyer’s accounting firm cannot tie your revenue from booking to billing to recognition, every revenue assumption in the model becomes suspect.

The root cause is usually one of two things. Either your systems were never designed to reconcile (they serve different purposes and were implemented at different times), or there is a manual step in the middle that introduces error.

Fix it by building a monthly reconciliation between your booking system, billing system, and GL. Document the expected differences (timing, recognition rules, currency) and track unexplained variances.
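A minimal version of that monthly reconciliation can be sketched as below. The system totals, the expected-difference categories, and the amounts are all made up for illustration; real figures would come from your own exports.

```python
# Monthly revenue reconciliation sketch: booking system -> billing system -> GL.
booking_total = 1_250_000   # bookings recognized this month, per booking system
billing_total = 1_238_500   # invoiced amounts, per billing system
gl_total      = 1_231_000   # revenue posted to the GL

# Documented, expected differences (timing, recognition rules, currency).
expected = {
    "deferred_to_next_month": 11_500,   # booked but not yet invoiced
    "fx_translation":          7_500,   # invoiced in EUR, posted at month-end rate
}

# Unexplained variance = raw gap minus the documented, expected differences.
unexplained_booking_vs_billing = (
    (booking_total - billing_total) - expected["deferred_to_next_month"]
)
unexplained_billing_vs_gl = (
    (billing_total - gl_total) - expected["fx_translation"]
)

# Anything non-zero here is the variance to chase before a QoE team finds it.
print("booking vs billing unexplained:", unexplained_booking_vs_billing)
print("billing vs GL unexplained:", unexplained_billing_vs_gl)
```

The value is less in the arithmetic than in the `expected` dictionary: writing down the known, legitimate differences each month is what separates an explainable variance from an adjustment.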

2. Customer metrics do not match financial data

Your investor deck says you have 92% retention. Your QoE report shows revenue declined in three of the last four quarters from existing customers. Both might be technically correct (logo retention vs. revenue retention, different time periods, different definitions) but the inconsistency creates a narrative problem.

The QoE team will dig in. If the story does not hold together, the buyer adjusts their growth assumptions downward.

Fix it by aligning your customer metric definitions with your financial data. If you report logo retention, make sure it is calculated from the same customer master your finance team uses. If you report revenue retention, make sure the cohort definitions match.
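To show why both metrics must come from the same customer master, here is a sketch computing logo retention and revenue retention from one shared cohort. The cohort data is hypothetical.

```python
# One customer master, two retention metrics.
cohort = {
    # customer_id: (revenue last year, revenue this year)
    "A": (100_000, 110_000),
    "B": ( 80_000,  20_000),   # logo retained, but heavily downsold
    "C": ( 60_000,       0),   # churned
    "D": ( 40_000,  45_000),
}

# Logo retention: share of customers still generating any revenue.
retained_logos = [c for c, (_, now) in cohort.items() if now > 0]
logo_retention = len(retained_logos) / len(cohort)

# Revenue retention: this year's revenue from the cohort vs. last year's.
start_rev = sum(prev for prev, _ in cohort.values())
end_rev   = sum(now for _, now in cohort.values())
revenue_retention = end_rev / start_rev

print(f"logo retention: {logo_retention:.0%}")        # 75% in this example
print(f"revenue retention: {revenue_retention:.0%}")  # 62% in this example
```

Both numbers are "retention," both are correct, and they differ by 13 points. If the investor deck quotes one and the financials imply the other, computing them side by side from the same source is what keeps the narrative consistent.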

3. Cost allocations are not documented or consistent

Many mid-market companies allocate costs across business units, product lines, or geographies using methodologies that live in someone’s head. When the QoE team asks how shared services costs are allocated, the answer is often “our controller handles that.”

If the allocation methodology changed, or if it is inconsistent, or if it produces results that seem convenient rather than accurate, the QoE team will flag it.

Fix it by documenting your allocation methodology, applying it consistently, and reviewing it quarterly. The methodology does not need to be sophisticated. It needs to be consistent and explainable.

4. Historical data has gaps or format changes

System migrations, acquisitions, and process changes create breaks in historical data. The QoE team needs 36 months of consistent data to analyze trends. If month 18 looks different from month 19 because you changed ERP systems, they need to understand why and adjust their analysis.

Gaps are worse than format changes. If three months of data are missing because a system was being migrated, the QoE team cannot analyze that period. They will note it, and the buyer will assume the missing months were the bad ones.

Fix it by bridging any data gaps before the QoE process starts. Map old system data to new system formats. Document what changed and when. Fill gaps where possible and clearly flag where they cannot be filled.

5. Adjustments are supported by spreadsheets, not systems

EBITDA adjustments are expected in every QoE. The problem is when the supporting documentation is a spreadsheet that only one person understands, with formulas that reference other spreadsheets, some of which have been moved or renamed.

The QoE team needs to trace each adjustment back to source data. If the trail goes through five layers of Excel workbooks, they will spend time (your time and your money) untangling it. And they will have less confidence in the result.

Fix it by maintaining adjustment documentation in a structured format with clear links to source transactions. Each adjustment should have a description, dollar amount, supporting evidence, and the person who approved it.
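One way to make that structure concrete is a simple typed record per adjustment. The field names and IDs below are illustrative, not a prescribed schema; the point is that each adjustment carries its own audit trail instead of living in a spreadsheet formula.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class EbitdaAdjustment:
    """One EBITDA adjustment with its full audit trail."""
    description: str
    amount: float                   # positive = add-back to EBITDA
    source_transactions: list[str]  # GL entry or invoice IDs the number ties to
    approved_by: str
    approved_on: date
    notes: str = ""

adjustments = [
    EbitdaAdjustment(
        description="One-time legal settlement",
        amount=250_000,
        source_transactions=["GL-2024-0187", "GL-2024-0191"],
        approved_by="CFO",
        approved_on=date(2024, 6, 30),
    ),
]

# A QoE team can trace every line back to source: no orphaned workbooks.
total_addbacks = sum(a.amount for a in adjustments)
print(f"total add-backs: {total_addbacks:,.0f}")
```

Whether this lives in code, a database table, or a disciplined shared sheet matters less than the fields themselves: description, amount, source references, approver, and date, for every adjustment.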

How to run a pre-QoE data audit

You do not need to hire an accounting firm to find these issues. You need someone who understands your systems to stress-test the connections between them.

Step 1. Map the data flow. Draw a diagram showing how financial data moves from source (transactions, contracts, timesheets) through intermediate systems to your GL and financial statements. Identify every manual step, every export/import, every spreadsheet.

Step 2. Test the reconciliation points. Pick three months from the last year. Pull revenue from your booking system, billing system, and GL. Do they match? If not, can you explain why? Do the same for customer counts, headcount, and any KPIs in your board deck.

Step 3. Audit your definitions. Pull every KPI definition your company uses. Check whether finance, sales, marketing, and operations are using the same definitions. Check whether the board deck definitions match the financial statement definitions.

Step 4. Check your adjustment trail. For every EBITDA adjustment in the last four quarters, trace it back to source data. Can you follow the trail? Does it make sense? Could someone who has never seen it before follow it?

Step 5. Document what you find. The goal is not to fix everything immediately. The goal is to know what is there. Prioritize based on dollar impact and complexity to fix.
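The definitions audit in step 3 is mechanical enough to sketch: collect each team's definition per KPI and flag the ones where teams disagree. Team names and definitions here are made up for illustration.

```python
# KPI definition audit: which metrics mean different things to different teams?
definitions = {
    "active_customer": {
        "finance":    "transaction in trailing 12 months",
        "marketing":  "login in trailing 12 months",
        "board_deck": "transaction in trailing 12 months",
    },
    "arr": {
        "finance":    "month-end MRR x 12",
        "board_deck": "month-end MRR x 12",
    },
}

# A KPI conflicts if its teams do not all use the same definition.
conflicts = {
    kpi: teams
    for kpi, teams in definitions.items()
    if len(set(teams.values())) > 1
}

for kpi, teams in conflicts.items():
    print(f"{kpi}: teams disagree -> {teams}")
```

Even as a one-off exercise in a spreadsheet, the output is the same: a short list of metrics where the board deck, finance, and operations are not saying the same thing, ranked by how visible each metric is to a buyer.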

This exercise takes one to two weeks for a focused team. The investment is small compared to the cost of discovering these issues during a live QoE engagement.

When to start

The honest answer is earlier than you think.

If your exit is 12 or more months away, start the pre-QoE audit now. You have time to fix the root causes, not just the symptoms.

If your exit is 6 to 12 months away, start immediately. Focus on the high-dollar reconciliation issues and documentation gaps. You may not fix everything, but you can fix the items that would trigger the largest adjustments.

If your exit is less than 6 months away and you have not done this work, you have two options. Rush a focused assessment on the top three risk areas (revenue reconciliation, customer metrics, adjustment documentation) or accept that the QoE will surface issues and prepare your responses in advance.

The worst outcome is being surprised. Even if you cannot fix every issue before the QoE, knowing what is there and having a response ready changes the conversation from “we did not know” to “we identified this, here is the impact, and here is our plan.”

For the full set of questions buyers ask during data diligence, start with The Complete Data Diligence Guide.

For a structured checklist of what data readiness looks like across your organization, see PE Exit Readiness: The Data Checklist Most Teams Miss.

For a weekly brief on data diligence, QoE preparation, and practical frameworks for PE-backed teams, subscribe to Inside the Data Room.