A recent Wall Street Journal article argued that workers worried about AI taking their jobs may be missing an even bigger danger (cue the escalation that sells headlines): AI systems absorbing institutional knowledge and shifting control away from individuals.
The fear is real. Surveys show it across industries, and not just among routine workers; knowledge professionals feel it too. That anxiety shouldn't be dismissed.
But the idea that AI’s “capture” of institutional knowledge represents some unprecedented rupture? That’s overstated.
What we are seeing is not a new danger. It’s the acceleration of something companies have always tried to do: move critical process knowledge out of individuals and into institutions.
And properly understood, that strengthens durable enterprises rather than undermining them.
This Has Happened Before
Corporations have always worked to reduce dependency on undocumented expertise. Scientific management codified workflows that once lived inside individual craftsmen. ERP systems standardized accounting and operational processes across global companies. Sarbanes-Oxley formalized internal controls and audit trails. Excel decentralized modeling power that once sat exclusively inside IT.
I wasn’t around for the assembly line. But I was around for ERP, SOX, and Excel. And none of those destroyed the prospects of great accountants. If anything, they elevated them.
Each step reduced information asymmetry inside the company. That’s the trend. AI is simply the next acceleration point.
What’s different now isn’t intent. It’s scale and speed. AI can extract and translate tacit workflows at a level that was previously impractical. But the direction is entirely consistent with corporate evolution.
Companies have always preferred institutional memory over personal memory. Any serious sales manager rides along with their top salesperson, not to steal relationships but to institutionalize them.
That’s not sinister. It’s governance.
The Real Issue: Internal Information Monopolies
The debate right now blurs two very different concepts: innovation and opacity.
In many companies, undocumented systems have created internal information monopolies.
- Codebases only one developer understands.
- Revenue logic embedded in spreadsheets known to a single analyst.
- Operational workflows that exist only in tribal memory.
That isn’t innovation. That’s asymmetry.
From an economic standpoint, information asymmetry creates rents. When only one person understands how the numbers are generated or how systems connect, that knowledge becomes leverage. Sometimes accidental. Sometimes strategic.
AI threatens that opacity.
- It can map dependencies.
- It can explain legacy code.
- It can surface inconsistencies.
- It can generate documentation retroactively.
It does not eliminate expertise. It reduces the premium derived from exclusivity. And that distinction matters.
Core Competency vs. Administrative Friction
Every organization has two layers.
The first is core differentiation: research, product architecture, proprietary algorithms, distribution strength, sales execution. This is where competitive advantage lives.
The second is the administrative backbone: documentation, reconciliation, integration logic, compliance translation, reporting infrastructure.
As companies grow, that backbone accumulates complexity. And complexity accumulates leverage. Individuals become gatekeepers of “how things work.” Over time, opacity becomes protection.
AI compresses that second layer.
- It lowers the cost of documentation.
- It lowers the cost of system interpretation.
- It lowers the barrier to understanding process flow.
What it does not compress is genuine differentiation.
- The inventive engineer still invents.
- The breakthrough researcher still creates.
- The salesperson who lands the major contract still drives growth.
AI doesn’t replace core competency. It removes friction around it. That’s pro-innovation.
Why This Strengthens Financial Foundations
Finance exists to institutionalize reality. Numbers must be reproducible. Controls must be auditable. Processes must survive turnover. Risk must be observable.
When critical systems depend on undocumented logic, finance signs off on outputs it cannot independently interrogate. That’s not empowerment. That’s structural fragility.
AI changes the control architecture. It gives controllers and CFOs visibility into system logic. It enables documentation at scale. It reduces key-person exposure. It accelerates scenario modeling.
It restores symmetry between capital allocation and operational execution. If a company cannot function because one engineer leaves, that is not worker leverage. It is governance failure.
Durable enterprises require institutional continuity. AI enhances that continuity.
The Technologist’s Concern — and the Line That Matters
A serious technologist will push back, and fairly so. Not all tacit knowledge is laziness. Some complexity reflects hard-won architectural judgment. Over-optimizing for control can suppress experimentation. Breakthrough innovation often starts messy.
All true. But we need a boundary. There’s a difference between creative depth and undocumented maintenance monopolies.
AI does not eliminate architectural talent. It does not replace research. It does not invent breakthrough products on its own. What it reduces is dependency on undocumented intermediaries who sit between the firm and its own systems. The prolific inventor still thrives. In fact, AI likely amplifies that person’s productivity.
What declines is the leverage derived from being the only person who understands the mess.
That’s normalization. Not oppression.
The Implication for Founders, Owners, and Private Equity
This matters most outside the Fortune 100. Large enterprises historically had governance advantages because they could afford infrastructure: ERP systems, audit functions, compliance teams, internal controls.
Small and mid-sized firms operated with thinner margins and higher fragility. AI lowers the administrative barrier to disciplined operation. A 20-person company can now operate with levels of documentation, modeling sophistication, and system transparency that once required significant overhead. A founder can interrogate their own data. A controller can map operational dependencies without hiring a consulting firm. Compliance interpretation becomes less resource-intensive.
This democratizes institutional discipline.
Lower overhead devoted to administrative opacity means more capital allocated to actual differentiation. That is pro-competition.
Caution Against Apocalypse Framing
Fear of displacement is understandable. Every technological wave produces it. But framing institutional knowledge capture as uniquely dangerous misunderstands how firms are designed.
Companies are not built to depend on undocumented individuals. They are built to institutionalize processes so they survive growth, turnover, and scale. AI accelerates that institutionalization. It does not mark the end of worker agency. It marks the decline of internal information monopolies. It compresses rent derived from opacity and reallocates value toward true contribution.
For organizations serious about durable financial foundations, that’s not an apocalypse. It’s overdue discipline. And discipline — not fragility — is what ultimately sustains innovation.