7 Data Red Flags That Kill Deals in Diligence

Bad data kills deals. Sometimes before you buy. Sometimes when you try to sell.

I have watched deals reprice, stall, and collapse over data problems that could have been fixed six months earlier. Not big dramatic failures. Small, specific things that eroded buyer confidence one question at a time.

Here are seven red flags that show up in diligence and what they actually signal to a buyer.

1. Your revenue by customer does not match your revenue by product

The buyer asks for revenue two ways. By customer and by product line. Both should total to the same number. When they do not, and they frequently do not, it tells the buyer that your data model has allocation problems.

What it looks like. Revenue by customer totals $47.2M. Revenue by product totals $48.1M. Nobody can explain the $900K difference without spending a week in Excel.

Why buyers care. If you cannot slice revenue cleanly, the buyer cannot validate the growth story by segment. They cannot tell which products are growing, which customers are profitable, or where the real margin is.

What it signals. Your systems were not designed to report both dimensions consistently. There are manual allocations happening somewhere, and the logic is not documented.

How to fix it. Build a reconciliation that ties revenue by customer and revenue by product to your GL total. Do this monthly. Document the allocation logic. The goal is not zero variance. The goal is explained variance.
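A minimal sketch of what that monthly reconciliation can look like, using the illustrative figures from above (the data sources, customer names, and product names here are hypothetical):

```python
# Reconcile two revenue views against the GL total. The GL is the anchor;
# the goal is explained variance, not zero variance.

# Revenue as reported in each dimension (e.g. exported from your BI tool).
revenue_by_customer = {"Acme Co": 20_000_000, "Beta LLC": 15_000_000, "Gamma Inc": 12_200_000}
revenue_by_product = {"Platform": 30_000_000, "Services": 18_100_000}
gl_total = 47_200_000  # general ledger total for the period

def reconcile(label, detail, anchor):
    """Sum one view of revenue and compute its variance to the anchor."""
    total = sum(detail.values())
    return {"view": label, "total": total, "variance": total - anchor}

report = [
    reconcile("by customer", revenue_by_customer, gl_total),
    reconcile("by product", revenue_by_product, gl_total),
]

for row in report:
    print(f"{row['view']}: total={row['total']:,} variance={row['variance']:,}")
```

Each nonzero variance line then gets a documented explanation (deferred revenue timing, unallocated credits, and so on) instead of a week in Excel.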

2. Nobody can explain the logic in your Excel adjustments

Every company has adjustment spreadsheets. The problem is when those spreadsheets contain formulas that reference cells in other workbooks, use hardcoded overrides, and have been maintained by three different people over four years.

What it looks like. The QoE team asks how a $2.3M adjustment was calculated. Your controller opens a workbook with 14 tabs, navigates to a cell that references another file, and says “let me walk you through this.”

Why buyers care. If the adjustment trail requires an expert guide, it is not verifiable. Unverifiable adjustments get discounted or rejected.

What it signals. The financial reporting process depends on tribal knowledge. Key person risk is high. The numbers might be right, but nobody except the person who built the spreadsheet can confirm it.

How to fix it. For every material adjustment, create a one-page summary showing the dollar amount, the business rationale, the source data, and the calculation. If the underlying logic is in Excel, simplify it or move it into a format that someone new could follow.

3. Your CRM and accounting system show different customer counts

Sales says 1,200 customers. Finance says 1,050 billable accounts. The board deck says “over 1,000 customers.”

What it looks like. The diligence team asks how many customers the company has. Three people give three numbers. Each is defensible with their own definition. None of them align.

Why buyers care. Customer count drives retention analysis, concentration analysis, and growth assumptions. If the base number is contested, every metric built on it is suspect.

What it signals. There is no single source of truth for customer data. Definitions vary by department. Nobody has reconciled the master data across systems.

How to fix it. Define “customer” once. Document the definition. Reconcile your CRM, billing, and GL customer counts monthly. Explain the expected differences (prospects vs. active, parent vs. child accounts, free tier vs. paid).
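A sketch of that reconciliation as code, with hypothetical account IDs and a single agreed definition (an account with at least one paid invoice). The categories of expected difference are illustrative:

```python
# Reconcile customer counts across systems against one definition of "customer".

crm_accounts = {"A1", "A2", "A3", "A4", "P1", "P2"}   # CRM includes prospects P1, P2
billing_accounts = {"A1", "A2", "A3", "A4", "F1"}     # billing includes free-tier F1
paying_customers = {"A1", "A2", "A3", "A4"}           # accounts with a paid invoice

# The agreed definition: a customer is a paying account.
customer_base = paying_customers

# Categorize the expected differences so each system's count is explainable.
differences = {
    "crm_only (prospects, churned, duplicates)": crm_accounts - customer_base,
    "billing_only (free tier, internal test accounts)": billing_accounts - customer_base,
}

print(f"customers: {len(customer_base)}")
for reason, ids in differences.items():
    print(f"{reason}: {sorted(ids)}")
```

With this in place, "sales says 1,200, finance says 1,050" becomes a documented walk from one number to the other.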

4. Historical data has unexplained discontinuities

A system migration 18 months ago broke the continuity of your data. Revenue categories changed. Customer classifications shifted. KPI definitions were updated. None of this was documented.

What it looks like. The diligence team charts revenue by category over 36 months and sees a spike in one category and a drop in another at exactly the same time. When asked, the team says “oh, that was the NetSuite migration.”

Why buyers care. Discontinuities make trend analysis unreliable. If the buyer cannot see a clean 36-month trend, they cannot project forward with confidence. They will apply a higher discount rate to account for the uncertainty.

What it signals. System changes were treated as IT projects, not data governance events. The downstream impact on reporting was not managed.

How to fix it. Bridge the data. Map old categories to new ones. Create a documented crosswalk that allows someone to analyze 36 months of consistent data. This is tedious work, but it is far cheaper than the valuation impact of unexplainable trend breaks.
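The crosswalk itself can be as simple as a maintained mapping table applied to the legacy data. A sketch with made-up category names and amounts:

```python
from collections import defaultdict

# Pre-migration category -> post-migration category (hypothetical mapping).
crosswalk = {
    "Software": "Platform",
    "Maintenance": "Platform",
    "Consulting": "Services",
}

# Historical monthly revenue recorded under the old scheme.
legacy_rows = [
    ("2023-01", "Software", 900_000),
    ("2023-01", "Maintenance", 200_000),
    ("2023-01", "Consulting", 400_000),
]

# Restate legacy months in the current categories.
bridged = defaultdict(int)
for month, old_cat, amount in legacy_rows:
    bridged[(month, crosswalk[old_cat])] += amount

for key in sorted(bridged):
    print(key, bridged[key])
```

The tedious part is agreeing and documenting the mapping, not running it; once it exists, anyone can produce a consistent 36-month series.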

5. Your flash report takes more than ten business days

The buyer asks how quickly you can produce a flash financial report after month end. The answer is three weeks.

What it looks like. Month end is the first. Preliminary numbers are available around the 20th. Final numbers, after all the adjustments and reconciliations, land closer to the end of the following month.

Why buyers care. A slow close process signals manual work, reconciliation challenges, or insufficient staff. Post-close, the buyer needs timely reporting for their LPs. If the company cannot produce numbers quickly, the buyer knows they will need to invest in the finance function.

What it signals. The accounting and data processes are not automated. There are likely manual steps, workarounds, and dependency on specific people. The finance function is doing data integration work that should be handled by systems.

How to fix it. Map the close process end to end. Identify bottlenecks. Automate the data collection steps. The goal is a five to seven day close with a preliminary flash available in two to three days. You do not need to get there overnight, but showing a documented improvement plan helps.

6. You cannot segment revenue by new versus existing customers

The buyer asks what percentage of revenue comes from new customer acquisition versus expansion of existing accounts. The team cannot answer without building a custom analysis.

What it looks like. “We know total revenue is growing 15% year over year. We think most of that is from existing customers, but we would need to pull the data to confirm.”

Why buyers care. New customer acquisition and existing customer expansion are valued very differently. A company growing through expansion has a different risk profile than one dependent on new logo acquisition. If you cannot separate them, the buyer will assume the less favorable mix.

What it signals. Customer lifecycle is not tracked systematically. The company does not have a clear view of its own growth composition. Revenue analysis is done at the aggregate level, which hides important dynamics.

How to fix it. Tag each customer in your system with their first transaction date. Build a monthly view showing revenue from customers acquired in each period versus revenue from customers who existed before that period. This is a one-time setup that pays dividends well beyond diligence.
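The logic above can be sketched in a few lines: derive each customer's first transaction month, then bucket revenue as "new" in that month and "existing" after it. Transaction data here is illustrative:

```python
from collections import defaultdict

transactions = [
    # (month, customer, amount) -- hypothetical data
    ("2024-01", "Acme", 10_000),
    ("2024-01", "Beta", 5_000),
    ("2024-02", "Acme", 10_000),
    ("2024-02", "Gamma", 7_000),  # Gamma's first month -> "new" in 2024-02
]

# Tag each customer with their first transaction month.
first_month = {}
for month, cust, _ in sorted(transactions):
    first_month.setdefault(cust, month)

# Split each month's revenue into new vs. existing.
split = defaultdict(lambda: {"new": 0, "existing": 0})
for month, cust, amount in transactions:
    bucket = "new" if first_month[cust] == month else "existing"
    split[month][bucket] += amount

for month in sorted(split):
    print(month, split[month])
```

In practice you would run this against your billing data and roll "new" forward over a cohort window (first 12 months, say), but even this simple cut answers the buyer's question.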

7. Data security and PII handling are described as “best practices” with no documentation

The diligence team asks about data security policies. The answer is “we follow industry best practices.”

What it looks like. There is no written data classification policy. PII (names, emails, payment information) is stored in multiple systems with varying access controls. Nobody has done a formal assessment of where PII lives across the organization.

Why buyers care. Data breaches are expensive. Regulatory exposure (GDPR, CCPA, state privacy laws) creates liability. If the company does not know where its sensitive data is, the buyer is inheriting unknown risk.

What it signals. Data governance is informal. The company has not invested in understanding its data risk profile. If an incident occurred, the response would be reactive rather than planned.

How to fix it. Conduct a PII inventory. Document where sensitive data lives, who has access, and how it is protected. Create a data classification policy (it does not need to be 50 pages, two pages will do). If you handle payment data, ensure PCI compliance. If you have European customers, address GDPR basics.
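The inventory does not need tooling; it needs to exist as a reviewable record. A sketch of one as structured data, with hypothetical systems, fields, and access levels, plus a simple check that flags entries needing remediation:

```python
# A minimal PII inventory: where sensitive data lives, who can see it,
# and whether it is protected. Entries are illustrative.
pii_inventory = [
    {"system": "CRM", "fields": ["name", "email"], "access": "sales", "encrypted_at_rest": True},
    {"system": "Billing", "fields": ["name", "payment_token"], "access": "finance", "encrypted_at_rest": True},
    {"system": "Support desk", "fields": ["email"], "access": "all staff", "encrypted_at_rest": False},
]

# Flag anything unencrypted or open to everyone for review before diligence.
flags = [
    entry["system"]
    for entry in pii_inventory
    if not entry["encrypted_at_rest"] or entry["access"] == "all staff"
]
print("review needed:", flags)
```

Even a two-page spreadsheet version of this, kept current, turns "we follow best practices" into an answer a diligence team can verify.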

The pattern

Notice what all seven red flags have in common. None of them are about having bad numbers. They are about being unable to prove your numbers are good.

Buyers expect imperfection. Every company has data issues. What separates a smooth diligence from a painful one is whether the company knows about its issues, can explain them, and has a plan.

The fix for most of these is not a technology project. It is documentation, reconciliation, and honest assessment. A team that spends 40 hours over two months addressing these items will save weeks during diligence and potentially points on the multiple.

For a full walkthrough of what buyers test, see The Complete Data Diligence Guide. To test your own readiness right now, try The 48-Hour Test.

For a weekly brief on what actually breaks in diligence and how to fix it, subscribe to Inside the Data Room.