16 Dec Artificial Experience (Q1-25)
While artificial intelligence is undeniably reshaping large portions of the financial landscape, especially in areas where data is abundant and decisions can be modeled with high frequency and consistency, it is important to draw a sharp distinction between those segments of the market and the work we do in special situations and opportunistic credit. The tools and technologies being applied in traditional or sponsored direct lending – automated credit scoring, AI-driven document parsing, even predictive models trained on historical defaults – can offer speed and efficiency in what are often standardized, templated transactions. These are environments where repetition and comparability dominate. However, our approach operates in an entirely different domain.
In our world, each investment is a unique situation, shaped by a highly specific set of circumstances that defy algorithmic generalization. The analysis is neither formulaic nor mechanical. It demands something else entirely: discretion informed by context, a capacity to synthesize the idiosyncratic, and judgment honed through the repetition of non-repeatable events. The very nature of special situations investing means we are engaging where others have stepped away: with companies in transition, in distress, or at an inflection point where conventional financing is no longer accessible. There is no dataset large enough to encode how a founder’s or management team’s motivations will evolve when facing a cash crunch, or how a key supplier’s trust will influence a turnaround. Nor can AI models reliably predict the behavioral dynamics of teams under duress, or the strategic thinking of regulatory authorities, creditors, and stakeholders in moments of dislocation.
The process of assessing credit risk in these situations is deeply interpretive. It is grounded in a nuanced understanding of capital structure, asset value, and downside protection – but equally in an appreciation of context: What are the incentives of the owners? Is the management team capable of adapting under stress? Is the path to recovery defensible not just contractually, but practically? We often encounter incomplete information, contested narratives, and legal ambiguity. These are not obstacles to be sidestepped but rather the terrain. Our methodology is not about extrapolating from clean data but about evaluating under uncertainty, diagnosing root causes of business stress, and identifying what is salvageable and what is not.
Recent commentary from venture capitalist Marc Andreessen captures this distinction well¹. He notes that the most valuable skills in his business are “psychological” – understanding how people behave under pressure, anticipating their reactions, and guiding them through ambiguity. He describes venture investing not merely as capital allocation, but as psychological navigation. The same holds true in our domain. What distinguishes a successful special situations investor is not access to a better model, but a more accurate reading of motives, incentives, and real-world constraints.
It would be misguided to suggest that AI cannot augment this work at the margins. We already employ technology to support many of the mechanical aspects of due diligence – for example, in processing large document sets, modeling downside cases, and analyzing comparative trends. But these are tools in service of a fundamentally human exercise: constructing a view of reality from fragmented, often conflicting information, and deciding what action, if any, has merit. The essence of our strategy remains interpretive and interventionist, not predictive, because the investment decisions we face are not reducible to spreadsheet logic or sentiment analysis. Consider, for example, the restructuring of a complex capital stack in a jurisdiction where creditor protections are uncertain, asset values are contested, and management alignment is fragile. Or the assessment of a family-owned enterprise undergoing succession amidst balance sheet distress and strategic dislocation. These are decisions where capital flows not on the basis of data completeness, but on trust, control levers, and an ability to navigate what is left unsaid.
Some have asked whether AI’s ascendancy in direct lending suggests a similar trajectory for the broader credit market. We see it differently. In fact, the expanding use of AI in conventional credit may well increase the value of human discretion in complex investing. As more capital flows toward standardized risk, guided by AI-enhanced underwriting tools, the opportunity set in our segment only widens. What we offer is not speed or low-cost execution, but a willingness to engage in the difficult, the mispriced, and the misunderstood. That willingness is underpinned by specialization more than scale and by the kind of insight that comes only from having worked through past cycles, restructurings, and recoveries, often where information was limited and pressure was acute.
We see this dynamic already playing out in the divergence between sponsored and non-sponsored lending. In sponsored transactions, loan structures are increasingly pro forma, reliant on sponsor-calibrated EBITDA adjustments and covenant-lite packages. These deals lend themselves to AI underwriting, because the inputs are normalized and the outputs – primarily spread and loss assumptions – are modeled from a wide base of historical data. In contrast, in the non-sponsored, special situations market where we operate, the capital structure is often bespoke, the documentation is negotiated not from a template but around a strategy, and the exit depends on active intervention.
We do not believe AI will replace what we do. Not because we resist change, but because the work itself is, by its nature, resistant to commodification. Our value lies in discernment, in negotiation, in the orchestration of outcomes that would not occur without deliberate human engagement. These are not skills that can be learned from data alone. They are learned through experience.
Importantly, the real risk in complex credit is not modelable risk, but unpriced fragility: shifting collateral realities, hidden intercreditor disputes, or counterparty behavior under stress. It is worth recalling that in the lead-up to the 2008 financial crisis, it was the structured products – AAA-rated, algorithmically priced – that misfired most catastrophically. The lesson was not that models were useless, but that they were blind to emergent behavior. In the same way, AI may replicate credit ratings and improve underwriting throughput, but it cannot substitute for the practitioner’s judgment in assessing unstated risks, opaque governance structures, or adversarial restructurings.
This is not just theoretical. Recent default episodes, such as the Serta Simmons capital structure controversy or the Envision Healthcare uptiering dispute, reveal a market increasingly characterized by litigation risk, document arbitrage, and first-mover advantage among lenders. These scenarios require investors to think not only about cash flows, but about process, control, and recovery paths. As Harvard Law Professor Jared Ellias and Duke Law Professor Elisabeth de Fontenay recently chronicled in their article about the meteoric rise of private credit, credit investing today is as much legal strategy as it is financial analysis.² This is precisely the space we inhabit.
In a financial world increasingly shaped by automation, we believe our strategy represents a durable and necessary counterpoint: one grounded in the conviction that complexity still requires judgment, and that outcomes still depend on the quality of decisions made by people with the experience to navigate what others overlook. As parts of the credit market become increasingly commoditized, we remain focused on opportunities that cannot be packaged, traded, or automated. The value we create is not found in higher leverage or broader distribution. It is found in being able to identify and resolve complexity that others cannot or will not engage. That is not something artificial intelligence is equipped to do. It is something only experience can.
[1] https://fortune.com/article/mark-andreessen-venture-capitalism-ai-automation-a16z/
[2] Ellias, Jared A., and Elisabeth de Fontenay, “The Credit Markets Go Dark,” The Yale Law Journal, January 2025.
