I've been on both sides of due diligence.
I've been the operator whose company was being assessed - watching consultants ask questions that confirmed what the deck said and miss what was actually broken. And I've been the operator who had to fix what those DDs missed - the tech debt that wasn't pitched as debt, the founder dependencies that weren't called out, the AI claims that turned out to be GPT wrappers.
Now I do DD for VCs. And I do it differently because I've been the company being assessed.
Why Pure-Analyst DD Misses Things
Most pre-investment DD follows a familiar pattern: management interviews, document review, customer calls, expert calls, and a polished deck of findings. It's thorough on what's documentable - financials, contracts, market data, customer concentration.
It's weak on what isn't documented:
- How the team actually makes decisions
- Whether the architecture will scale or buckle
- Whether the founders will execute or freeze under pressure
- Where critical knowledge is concentrated in one person
- Whether the AI strategy is real or marketing
Pure analysts work from frameworks. Operators work from scars.
The best DD comes from people who've been the company being assessed - and who got it wrong themselves before they learned to spot the patterns in others.
The 5-Phase Operator DD Framework
Phase 1: Operational Diagnostic (Days 1-3)
Most "operational DD" reports stop at the org chart. That's where mine starts.
What I actually look for:
- Decision flow: Who really makes which decisions, regardless of titles? Founder bottlenecks become deal-breakers post-Series B.
- Process maturity: Do "processes" exist as documented artifacts nobody follows? Or as actual operating rhythm? The gap between the two predicts post-close pain.
- Tribal knowledge mapping: What lives only in one person's head? Quantify the bus factor on critical functions.
- Communication patterns: Do teams talk to each other or only through leadership? Cross-functional friction is invisible in interviews but visible in workflow analysis.
- Cultural cracks: Companies look great when revenue's growing. Cultural problems surface during the first hiring crunch, the first big customer loss, or the first AI competitor.
Output: Operational maturity score across 8 dimensions, with prioritized risks.
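To make the scoring concrete, here is a minimal sketch of how a weighted maturity score could be computed. The dimension names and weights below are hypothetical illustrations, not the author's actual rubric:

```python
# Hypothetical sketch of an operational maturity score: 8 dimensions,
# each rated 1-5, weighted and normalized to a 0-100 scale.
# Dimension names and weights are illustrative assumptions.
DIMENSIONS = {
    "decision_flow": 0.20,
    "process_maturity": 0.15,
    "knowledge_distribution": 0.15,
    "communication": 0.10,
    "culture": 0.10,
    "hiring": 0.10,
    "financial_ops": 0.10,
    "governance": 0.10,
}

def maturity_score(ratings: dict[str, int]) -> float:
    """Weighted average of 1-5 ratings, scaled to 0-100."""
    assert set(ratings) == set(DIMENSIONS), "rate every dimension"
    weighted = sum(DIMENSIONS[d] * r for d, r in ratings.items())
    return round(weighted / 5 * 100, 1)

ratings = {d: 3 for d in DIMENSIONS}  # a uniformly average company
print(maturity_score(ratings))  # 60.0
```

The point of weighting rather than averaging is that a founder-bottlenecked decision flow should drag the score harder than, say, a thin governance setup at seed stage.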
Phase 2: Technical Architecture Review (Days 3-7)
Most VCs outsource technical DD to ex-CTOs who've been out of the building for 5 years. They evaluate against frameworks from when they were last hands-on. Tech moves fast - DD evaluators need to be moving with it.
What I actually look for:
- Architecture vs. business plan alignment: Does the architecture support the next 18 months? Or only the last 18?
- Tech debt that's been rebranded: "Fast iteration" sometimes means "we never refactored." Decode the language.
- Real engineering practices: Code review culture, deployment frequency, incident response - these signal engineering maturity better than headcount.
- Build vs. buy decisions: Companies that built what they should have bought reveal poor judgment. Companies that bought what they should have built reveal weak technical leadership.
- Hidden infrastructure costs: Cloud bills, third-party APIs, data licensing - costs that grow with usage and surprise teams at scale.
Phase 3: AI Readiness Assessment (Days 5-9)
This is where I see the biggest gap in current DD practice. Companies pitch AI capabilities they're not positioned to deliver.
Pure-analyst DD reads the AI section of the deck, takes management's word, and moves on. Operator-led AI DD evaluates against the 5 dimensions of AI readiness:
- Data foundation: Is the data clean enough for AI to produce useful output?
- Team capability: Does the team understand AI well enough to use it correctly?
- Infrastructure: Can existing systems support AI integration?
- Strategy & governance: Is there an AI strategy beyond "we use ChatGPT"?
- Use case identification: Are AI use cases tied to measurable business outcomes?
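One way to operationalize the five dimensions is a simple gating rubric, where any foundational gap flags the AI claim as unverified. This is a hypothetical sketch of that logic, not the author's actual scoring system:

```python
# Hypothetical sketch: gate AI claims on the five readiness dimensions,
# each rated 1-5. Any dimension below 2 flags the claim as unverified,
# because one foundational gap (e.g. dirty data) undermines the rest.
READINESS_DIMENSIONS = [
    "data_foundation",
    "team_capability",
    "infrastructure",
    "strategy_governance",
    "use_case_linkage",
]

def ai_readiness_flag(scores: dict[str, int]) -> str:
    missing = [d for d in READINESS_DIMENSIONS if d not in scores]
    if missing:
        raise ValueError(f"unrated dimensions: {missing}")
    if min(scores.values()) < 2:
        return "unverified"   # at least one foundational gap
    if min(scores.values()) < 4:
        return "partial"      # claims plausible, not yet proven
    return "credible"

scores = dict.fromkeys(READINESS_DIMENSIONS, 3)
scores["data_foundation"] = 1   # dirty data undermines everything else
print(ai_readiness_flag(scores))  # unverified
```

Gating on the minimum rather than the average reflects the reasoning above: strong infrastructure cannot compensate for data that AI can't produce useful output from.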
The most common pattern: "AI-powered" companies whose AI is a thin GPT wrapper without proprietary data, model fine-tuning, or a differentiated workflow. It looks impressive in a demo - and disappears as soon as a competitor builds the same wrapper.
Phase 4: Team & Founder Evaluation (Days 7-11)
This is the highest-stakes module - and the one most underdeveloped in traditional DD.
Pure-analyst DD does management interviews, captures management's stated philosophy, and submits a write-up. Operator DD looks at decision patterns under pressure.
- Founder decision archeology: What are the 3 hardest decisions you made in the last 12 months? How did you make them? What would you do differently? The answers reveal more than any psychometric.
- Conflict resolution: When co-founders disagree, how does it actually resolve? Companies with unresolved founder conflict rarely survive scaling pressure.
- Hiring philosophy: Show me the last 5 senior hires. Who interviewed them? Who decided? The answer reveals whether founders can let go.
- What scares them: Ask founders what would kill the company. Vague answers ("execution") reveal weak strategic clarity. Specific answers ("we have 90 days to fix our churn before LTV inverts") reveal operators.
- Bus factor: If the founder disappeared for 60 days, what breaks? The answer measures the company's actual independence.
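The "churn before LTV inverts" answer above is the kind of specificity worth verifying. As a hypothetical illustration (the numbers and the simple subscription model are my assumptions, not figures from any real deal), a small churn increase can push LTV below CAC:

```python
# Hypothetical illustration: a simple subscription LTV model,
#   LTV = ARPU * gross_margin / monthly_churn
# showing how a churn increase can invert LTV relative to CAC.
def ltv(arpu: float, gross_margin: float, monthly_churn: float) -> float:
    """Expected lifetime gross profit per customer."""
    return arpu * gross_margin / monthly_churn

cac = 3000.0  # assumed customer acquisition cost
print(ltv(arpu=200, gross_margin=0.75, monthly_churn=0.03))  # 5000.0, healthy vs. CAC
print(ltv(arpu=200, gross_margin=0.75, monthly_churn=0.06))  # 2500.0, now below CAC
```

A founder who can walk through this arithmetic for their own numbers is the "operator" answer; one who can't is the "execution" answer.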
Phase 5: Risk Inventory & Post-Close Roadmap (Days 11-14)
This is where operator DD differs most from pure-analyst DD. Pure-analyst DD ends with findings. Operator DD ends with an actionable post-close roadmap.
What goes in the inventory:
- Key-person dependencies that need redundancy planning
- Process gaps that will fracture at the next growth stage
- Technical debt requiring immediate attention vs. eventual refactor
- AI claims requiring deal terms (e.g., milestone-based payments)
- Cultural risks needing hiring or coaching intervention
- Governance gaps requiring board attention
The output: A go/caution/no-go signal, plus a prioritized 90-day post-investment value creation plan. If you invest, here's what to fix first.
The Red Flags Pure-Analyst DD Misses
Some patterns I've learned to spot - usually invisible to pure-analyst DD:
1. The "Architectural Debt" Hidden in Velocity
Companies that brag about deploy frequency sometimes achieve it because nothing is actually tested. Investigate: do they have automated tests? A code review culture? Or is "ship fast" a euphemism for "ship without rigor"?
2. The Founder-As-Hero Dependency
If the founder personally closes every enterprise deal, personally interviews every senior hire, personally makes every product trade-off - the company has no business without the founder. Look for what the founder has successfully delegated.
3. AI as Marketing, Not Capability
Test: ask the head of engineering to explain how AI is integrated into the core product. If the answer is "we use ChatGPT for X" without describing data flows, model selection, fine-tuning approach, or governance - the AI claim is marketing.
4. Process Theater
Companies show you their Notion workspace with 47 documented processes. Ask people on the ground: do you actually follow this? In 70% of cases, the documented processes are theater - performed for investors, ignored in practice.
5. Customer Concentration Risk Hidden in Logos
"We have 50 customers" sometimes means "1 customer is 60% of revenue, 49 are 40%." Look at revenue concentration, not logo count.
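The logo-count arithmetic is easy to sketch. A minimal illustration of the 50-customer example above, with made-up revenue figures:

```python
# Hypothetical sketch: logo count vs. revenue concentration.
# 50 customers, but one "whale" contributes 60% of revenue.
revenues = [600_000] + [400_000 / 49] * 49  # one whale, 49 small logos

def top_n_share(revenues: list[float], n: int = 1) -> float:
    """Fraction of total revenue from the n largest customers."""
    total = sum(revenues)
    return sum(sorted(revenues, reverse=True)[:n]) / total

print(len(revenues))                     # 50 logos
print(f"{top_n_share(revenues):.0%}")    # 60% from a single customer
```

The same function with n=5 or n=10 gives a fuller concentration curve than any slide of customer logos.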
6. The Fast-Growth Cultural Crack
Companies that grew 10x in 18 months almost always have unresolved cultural debt - hires made too fast, processes broken to ship, leadership stretched thin. The crack isn't visible until growth slows. Plan for it.
What Modern Deal Pace Requires
Big consulting firms quote 6-8 week DD timelines. That doesn't match how deals actually move today.
Modern DD timelines I work to:
- Light DD (specific risk area): 3-5 days
- Standard DD (operational + technical): 1-2 weeks
- Comprehensive DD (full coverage): 2-3 weeks
- Express DD (time-sensitive deals): 24-72 hours
Speed is a feature, not a compromise. The depth of analysis isn't lower - the framework just runs faster because there's no committee approval, no junior associate doing the first draft, no 12-person team meeting to align on findings. Operator-led DD is one experienced person plus their network of specialists when needed.
Why I Often Continue Post-Investment
The most successful pattern I've seen: VCs hire me to do DD. I find the issues. They invest. I continue with the portfolio company to fix the issues I diagnosed.
Why this works:
- No re-onboarding cost. I already understand the company.
- Credibility with founders. The DD report wasn't a hit job - it was a roadmap.
- Continuity for the VC. They get a single point of contact for the highest-risk asset.
- Aligned incentives. I'm motivated to be honest in DD because I'll have to live with the findings.
This pattern doesn't fit every VC's model. But for funds investing in companies that need operational support to reach the next stage - it's substantially more efficient than serial consulting engagements.
Have a Deal in Motion?
Operator-led DD for VCs, angel investors, and family offices. 1-2 week delivery. Honest signals. Starting from $1,500.
See DD Service Details
Frequently Asked Questions
What's operator-led DD vs traditional DD?
Traditional DD applies frameworks from an analyst's perspective. Operator-led DD applies pattern recognition from someone who has built and scaled the systems being assessed. Both have value - operator-led is faster, catches things frameworks miss, and produces actionable post-close roadmaps.
Should I always do operator DD or sometimes pure analyst?
Pure-analyst DD wins when: financial rigor matters most, the deal is in a sector you're not familiar with, or regulatory complexity dominates. Operator DD wins when: scaling or execution risk is the main question, AI capabilities need verification, or post-investment value creation is part of the thesis.
How honest can DD really be when fees come from the VC?
The same way audit honesty works - reputation is the currency. A DD consultant who hides issues to please a VC loses every future fund engagement once the issues surface post-close. The economic incentive aligns with honesty over flattery.
Can DD findings actually kill a deal?
Yes - and they should sometimes. About 20% of my DDs have produced "caution" or "no-go" signals that resulted in deal renegotiation, expanded warranties, or pulled deals. The other 80% became "go with these post-close priorities" - which is also valuable.
About the author: May Mor is a Scale Architect, AI Builder, and competitive poker player. She runs Scale with May, providing operator-led due diligence for VCs and post-investment support for portfolio companies.