
Reverse Due Diligence: Audit Your Own Data Before Buyers Do

A former CFO I respect, David Dean, ran two PE-backed companies through successful exits. His operating principle was simple: operate like you are always six months from an exit.

Not because he was always selling. Because a company that can withstand buyer scrutiny at any moment is a company that operates with discipline. The data is clean. The definitions are documented. The answers come fast.

Most companies do the opposite. They run the business, then scramble to get data in order when the banker calls. The scramble costs time, creates risk, and often surfaces problems that would have needed months of lead time to fix.

Reverse due diligence is the practice of looking at your own business the way a buyer will. Before they ask the questions, you ask them yourself. Before they find the gaps, you find them first. The result is not a perfect data room. It is a management team that has answers ready and a data foundation that holds up under pressure.

I have identified five areas that cover 80% of what buyers test in the first two weeks of diligence. For each one, I will walk through the specific questions a buyer asks, what a good answer looks like, and a focused fix you can execute in two weeks.

Area 1. Revenue reconciliation across systems

This is always the first thing a buyer validates. Revenue by customer, by product, by geography, by channel. All should tie back to the GL. When they do not, the conversation shifts from growth story to data integrity.

The questions a buyer will ask

  • Can you show monthly revenue for the last 36 months, broken out by your top 20 customers? Does it tie to your GL?
  • What is the timing difference between when you book revenue and when you recognize it?
  • Are there manual journal entries that adjust revenue between systems? How many? How large?
  • Can you reconcile revenue by customer and revenue by product to the same total?

What good looks like

Revenue data exports cleanly from the billing or ERP system in under an hour. The total ties to the GL within an explainable variance (timing, credit memos, FX adjustments). The variance is documented monthly, not reconstructed during diligence. Customer names are consistent across systems. A new team member could reproduce the reconciliation without guidance from the person who built it.

The two-week fix

Week 1. Pull revenue by customer and revenue by product from your systems for the last 12 months. Compare both to your GL totals. Document every difference. Categorize them: timing, credit memos, manual adjustments, unexplained. Quantify the unexplained bucket.

Week 2. For the unexplained variances, trace each one to a root cause. Build a reconciliation template that your team can run monthly going forward. Establish the process: who runs it, when, what the tolerance threshold is, and who reviews it. Run it once to confirm it works.
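The Week 1 bucketing lends itself to a simple script. A minimal sketch in Python, assuming you have already pulled a subledger total, a GL total, and a list of documented variance items (the function and category names are illustrative, not a prescribed tool):

```python
from collections import defaultdict

# Hypothetical variance categories; swap in your own taxonomy.
KNOWN_CATEGORIES = {"timing", "credit_memo", "fx", "manual_je"}

def reconcile_revenue(subledger_total, gl_total, variance_items):
    """Bridge subledger revenue to the GL and size the unexplained bucket.

    variance_items: list of (category, amount) tuples documenting known
    differences. Returns the bucketed variances plus the residual that the
    documented items do not cover.
    """
    buckets = defaultdict(float)
    for category, amount in variance_items:
        key = category if category in KNOWN_CATEGORIES else "unexplained"
        buckets[key] += amount
    explained = sum(buckets.values())
    # Whatever the documented items do not cover is, by definition, unexplained.
    buckets["unexplained"] += (gl_total - subledger_total) - explained
    return dict(buckets)

# Example: billing shows 10.20M, the GL shows 10.00M, two documented items.
bridge = reconcile_revenue(
    subledger_total=10_200_000,
    gl_total=10_000_000,
    variance_items=[("credit_memo", -150_000), ("timing", -30_000)],
)
print(bridge)
# {'credit_memo': -150000.0, 'timing': -30000.0, 'unexplained': -20000.0}
```

The point of the residual line is that the unexplained bucket is computed, not asserted: if someone adds a documented item, the residual shrinks automatically, which is exactly the behavior you want from a monthly template.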

Total effort: 40 to 60 hours across your finance and data team. The output is a monthly reconciliation process and a 12-month historical bridge that you can hand to a diligence team on day one.

Area 2. Customer metrics consistency

Customer count, retention rate, churn, CAC, LTV. These metrics drive the growth story in every deal model. When the buyer recalculates them using their methodology and gets different numbers, trust erodes. Not because your numbers were wrong, but because the gap raises questions about everything built on top of them.

The questions a buyer will ask

  • How many active customers do you have? What is your definition of active?
  • What is your gross retention rate? Net retention rate? Show the calculation and the underlying data.
  • What is your customer concentration? Revenue from top 5, top 10, top 20 customers?
  • Can you show a cohort analysis of customer behavior over time?
  • What is your CAC by acquisition channel?

What good looks like

One definition of “customer” that finance, sales, and operations all use. Retention calculated quarterly with a documented methodology that aligns with what buyers expect (typically calendar year or trailing 12-month, inclusive of all paying customers). A customer master that is deduplicated and reconciled to billing records. Concentration analysis available at the click of a button, not after a week of manual assembly.

The two-week fix

Week 1. Convene finance, sales, and operations for a 90-minute meeting. Agree on one definition of “active customer.” Document it. Pull customer count from CRM, billing, and GL. Reconcile the differences and document expected variances (prospects, free tier, parent/child accounts). Calculate retention using the buyer’s methodology: all paying customers, calendar year, both gross and net.
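The gross/net calculation described above can be pinned down in a few lines. A sketch assuming per-customer revenue snapshots at the start and end of the period (names are illustrative; the convention of capping gross retention at starting revenue matches the buyer methodology described here):

```python
def retention_rates(start_revenue, end_revenue):
    """Gross and net revenue retention for a cohort of paying customers.

    start_revenue / end_revenue: dicts of {customer_id: revenue} at the
    start and end of the period. Only customers present at the start count;
    new logos are excluded from both calculations.
    """
    starting = sum(start_revenue.values())
    gross = net = 0.0
    for cust, start_amt in start_revenue.items():
        end_amt = end_revenue.get(cust, 0.0)
        gross += min(end_amt, start_amt)  # cap at starting: no credit for expansion
        net += end_amt                    # expansion and contraction both count
    return gross / starting, net / starting

start = {"a": 100.0, "b": 50.0, "c": 50.0}
end = {"a": 130.0, "b": 40.0}          # c churned, a expanded, b contracted
gross, net = retention_rates(start, end)
print(round(gross, 2), round(net, 2))  # 0.7 0.85
```

Running the same function over your CRM extract and your billing extract is also a fast way to surface the definition mismatches the alignment meeting is meant to resolve.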

Week 2. Build the cohort view. Tag each customer with their first transaction date. Create a quarterly cohort matrix showing retention and expansion by acquisition period. Calculate concentration ratios. Document the methodology for all metrics in a single reference page. This becomes your customer metrics appendix for the data room.

Total effort: 30 to 50 hours. The big unlock is the definition alignment meeting. Once everyone agrees on what “customer” means, the metrics follow naturally.

Area 3. EBITDA adjustment documentation

Every company has EBITDA adjustments. Owner compensation, one-time expenses, non-recurring items. Buyers expect them. QoE teams scrutinize them. The problem is never the existence of adjustments. It is the quality of the supporting documentation.

I have seen a $2.5M adjustment supported by a spreadsheet that referenced four other spreadsheets across two shared drives, maintained by one person over three years. The QoE team spent four days untangling it. They confirmed $2.1M and rejected $400K. That rejection flowed straight to the EBITDA line and was then multiplied by the exit multiple. A documentation failure that cost the seller $2M to $3M in enterprise value.

The questions a buyer will ask

  • Walk me through each EBITDA adjustment above $100K. What is the dollar amount, the business rationale, and the supporting evidence?
  • Are these truly non-recurring? How do you define non-recurring?
  • Can someone other than the person who prepared these adjustments explain and verify them?
  • What is the trend in adjustments over the last 12 quarters? Are they growing?

What good looks like

Each adjustment has a one-page summary: dollar amount, business rationale, source transaction(s), calculation methodology, and approver. The summaries are organized in a single folder, not scattered across the finance team’s hard drives. Any member of the finance team can explain any adjustment without referencing the person who built it. The QoE team can trace from the summary to the source data in under 30 minutes.

The two-week fix

Week 1. List every EBITDA adjustment from the last eight quarters. For each one above $100K, create a one-page summary using a consistent template: description, amount, rationale, source reference, approver. If the supporting workbook is complex, simplify it or extract the relevant calculation into a standalone document.

Week 2. Have someone who was not involved in creating the adjustments attempt to verify each one using only the documentation. Where they get stuck, improve the documentation. File everything in a single organized folder. Test the trail: can you go from summary to source transaction for every adjustment? If not, fix the gaps.
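Both the one-page template and the verification pass can be made mechanical. A sketch, assuming the fields listed above (the class and function names are illustrative, and the threshold mirrors the $100K cut-off used in this section):

```python
from dataclasses import dataclass, fields

# Hypothetical template mirroring the one-page summary described above.
@dataclass
class AdjustmentSummary:
    description: str
    amount: float          # dollars
    rationale: str
    source_reference: str  # GL entry, invoice, or workbook location
    approver: str

def documentation_gaps(adjustments, threshold=100_000):
    """Return adjustments above the threshold with any blank field."""
    gaps = []
    for adj in adjustments:
        if adj.amount >= threshold and any(
            not getattr(adj, f.name) for f in fields(adj)
        ):
            gaps.append(adj.description or "<undescribed>")
    return gaps

adjustments = [
    AdjustmentSummary("Owner salary normalization", 450_000.0,
                      "Replace owner comp with market-rate CEO salary",
                      "GL 6010, payroll register FY23", "CFO"),
    AdjustmentSummary("Office relocation", 180_000.0,
                      "One-time move to consolidated HQ",
                      "Invoice batch 2023-114", ""),  # approver missing
]
print(documentation_gaps(adjustments))  # ['Office relocation']
```

A completeness check like this does not replace the independent verification pass, but it catches the blank-field gaps before a human spends time on them.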

Total effort: 20 to 40 hours, depending on how many adjustments you carry. This is the highest-ROI work in the entire reverse diligence process because every dollar of rejected adjustment gets multiplied by the exit multiple.

Area 4. Data lineage and source system mapping

Buyers want to understand how data moves from a customer transaction to the financial statements. When this flow is undocumented (and it usually is), the diligence team has to reverse-engineer it. That takes time, introduces risk, and often reveals surprises that should have been found earlier.

The questions a buyer will ask

  • Can you show us how a customer transaction flows through your systems to the P&L?
  • How many manual steps are in your data pipeline? Where are the integration points?
  • What happens to historical data when you migrate systems? Is it bridged?
  • Where does your most critical business logic live? In a system or in a spreadsheet?

What good looks like

A one-page data flow diagram showing source systems, integration points, manual steps, and reporting outputs. Not a full enterprise architecture document. A practical map that shows where data originates, how it moves, and where it lands. Updated within the last six months. Key person dependencies identified. Manual steps flagged with an indication of automation priority.

The two-week fix

Week 1. Sit down with the people who actually touch the data: the controller, the analyst who builds the board deck, the operations lead who maintains the KPI tracker. Map the flow from transaction to financial statement. Use a whiteboard, a slide, a simple drawing tool. Identify every system, every manual export/import, every spreadsheet in the chain. Mark the manual steps.

Week 2. Clean up the diagram. For each manual step, document who performs it, how often, what happens if they are unavailable, and what the error rate is. Create a single reference document that a new team member or a diligence team could follow. If you had a system migration in the last three years, add a section showing how historical data was bridged (or flag that it was not, with a plan to address it).
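The diagram itself belongs on a slide, but the step inventory behind it can live as data, which makes the manual-step and key-person flags trivial to regenerate. A sketch with made-up steps, systems, and owners:

```python
# A minimal machine-readable stand-in for the one-page flow diagram.
# Step names, systems, and owners here are illustrative, not prescriptive.
PIPELINE = [
    {"step": "POS export",         "system": "billing", "manual": True,  "owner": "controller"},
    {"step": "ERP import",         "system": "ERP",     "manual": False, "owner": None},
    {"step": "KPI workbook",       "system": "Excel",   "manual": True,  "owner": "ops analyst"},
    {"step": "Board deck refresh", "system": "Excel",   "manual": True,  "owner": "ops analyst"},
]

def risk_report(pipeline):
    """Flag manual steps and single-person dependencies for the diligence doc."""
    manual = [s["step"] for s in pipeline if s["manual"]]
    owners = {}
    for s in pipeline:
        if s["manual"] and s["owner"]:
            owners.setdefault(s["owner"], []).append(s["step"])
    key_person = {o: steps for o, steps in owners.items() if len(steps) > 1}
    return manual, key_person

manual, key_person = risk_report(PIPELINE)
print(manual, key_person)
# 3 manual steps; 'ops analyst' is a key-person dependency on two of them
```

Keeping the inventory as data means the "updated within the last six months" requirement is a five-minute refresh, not a re-interview of the whole team.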

Total effort: 20 to 30 hours. The value here is twofold. The diagram accelerates every diligence request because the buyer’s team knows where to find things. And the exercise itself often surfaces integration issues that nobody realized existed.

Area 5. Reporting speed and automation level

How fast you can produce numbers signals how mature your data operations are. A buyer who hears “our flash report takes three weeks” is mentally budgeting for a finance function overhaul. A buyer who hears “five business days, fully reconciled” sees a company that can be operated without heroic effort.

The questions a buyer will ask

  • How many days after month end can you produce a preliminary flash report?
  • How many days for a fully closed set of financials?
  • What percentage of your monthly close process is automated versus manual?
  • If your controller is on vacation during month end, does the close still happen?

What good looks like

Preliminary flash within two to three business days of month end. Full close within five to seven business days. The close process is documented step by step, with each step assigned to a role (not a person). At least one backup is trained for every critical close task. Automation handles data collection and aggregation. Manual effort is reserved for review, analysis, and judgment calls.

The two-week fix

Week 1. Map your month-end close process from start to finish. For each step, document the elapsed time, the person responsible, whether it is manual or automated, and what its dependencies are. Identify the three longest bottlenecks. In most mid-market companies, these are: waiting for data from a subsidiary or department, manual reconciliation, and sequential steps that could be parallel.

Week 2. For each bottleneck, design a specific fix. Some fixes are process changes (running two reconciliations in parallel instead of sequentially). Some are automation (scheduling a data export instead of running it manually). Some are cross-training (having a second person who can perform the bank reconciliation). Implement at least one fix this week and document the target close timeline.
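The Week 1 mapping reduces to a small table of steps and durations. A sketch with illustrative numbers, showing how to rank bottlenecks and compute the sequential elapsed time (step names and durations are made up):

```python
# Illustrative close steps; durations are in business days.
steps = [
    ("subsidiary data received", 5, "wait"),
    ("bank reconciliation",      3, "manual"),
    ("revenue reconciliation",   2, "manual"),
    ("accrual entries",          1, "manual"),
    ("management review",        1, "review"),
]

def bottlenecks(steps, top_n=3):
    """Rank close steps by elapsed days to pick targets for the Week 2 fixes."""
    return sorted(steps, key=lambda s: s[1], reverse=True)[:top_n]

sequential_days = sum(d for _, d, _ in steps)  # everything run back-to-back
print(sequential_days, [name for name, _, _ in bottlenecks(steps)])
# 12 ['subsidiary data received', 'bank reconciliation', 'revenue reconciliation']
```

The gap between the sequential total and your actual close length is a rough measure of how much waiting and rework is hiding between the steps, which is where the parallelization fixes usually live.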

Total effort: 25 to 40 hours. You will not go from a 20-day close to a 5-day close in two weeks. But you will have a documented process, identified bottlenecks, and a credible improvement plan. When the buyer asks about close speed, “we are currently at 12 days and tracking to 7 by Q3” is a much stronger answer than “about three weeks.”

The 80-hour argument

Add up the total effort across all five areas: roughly 135 to 220 hours. Focus on the top three (revenue reconciliation, customer metrics, and EBITDA documentation) and the highest-impact items within them, and the work comes down to roughly 80 focused hours.

That is one person working half-time for a month. Or a small team working focused hours across two weeks.

The return on that investment is not abstract. In a deal where data issues cause even a quarter-turn of multiple compression on a $10M EBITDA company, the cost is $2.5M. On a $50M EBITDA company, it is $12.5M. The 80 hours of self-audit is the most leveraged work your team can do before going to market.
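The arithmetic behind those figures is worth making explicit. A quarter-turn of compression means the buyer pays 0.25x less EBITDA than they otherwise would (the function name here is made up; the math matches the figures above):

```python
def compression_cost(ebitda, multiple_turns=0.25):
    """Enterprise value lost when data issues shave turns off the exit multiple."""
    return ebitda * multiple_turns

print(compression_cost(10_000_000))  # 2500000.0
print(compression_cost(50_000_000))  # 12500000.0
```

Even at the small end, the downside of skipping the self-audit is measured in millions against an investment measured in hours.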

David Dean was right. Operating like you are always six months from an exit is not paranoia. It is discipline. And the discipline shows up in the data room as fast answers, clean reconciliations, and a management team that has already asked itself the hard questions.

Where to start

If you have not done this before, start with the 48-Hour Test. Send the basic diligence questions to your team and see what comes back. That will tell you which of the five areas need the most attention. See The 48-Hour Test: Can Your Team Answer These Questions Before Diligence? for the full question set.

For the complete picture of what buyers test during data diligence, read The Complete Data Diligence Guide.

For a weekly brief on data readiness, diligence preparation, and practical frameworks for PE-backed teams, subscribe to Inside the Data Room.