Published on INFORMS Analytics Magazine (Joseph Byrum)
Powerful software and hardware have made the creation of idiosyncratic AI models possible, but this is still a new frontier for the financial services industry.
To measure business performance, build a model that accounts for the complexity of financial markets and individual differences.
The future of finance is idiosyncratic.
For decades, financial analysts would break out charts, statistics and formulae that assessed one company’s fiscal performance in exactly the same way as another’s. In many respects, this type of financial modeling becomes an exercise in plugging variables into a spreadsheet.
There is no doubt that simple calculations such as the price-to-earnings ratio, earnings per share and so on can offer at-a-glance insight into a given organization’s prospects. Yet oversimplifying also risks drawing the wrong conclusion. The numbers might look the same for a manufacturer whose major product launch flopped as for one that made a mistake but has the next must-have product in its development pipeline. Noticing subtle differences can be critical when it comes to estimating the future value of a company’s stock.
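To make the point concrete, here is a minimal sketch in Python with entirely made-up figures: two manufacturers whose headline ratios are indistinguishable, even though only one of them has a promising pipeline.

```python
# A minimal sketch with invented figures: two companies whose headline
# ratios are identical even though their prospects differ sharply.

def eps(net_income: float, shares_outstanding: float) -> float:
    """Earnings per share: net income divided by shares outstanding."""
    return net_income / shares_outstanding

def pe_ratio(share_price: float, eps_value: float) -> float:
    """Price-to-earnings ratio: share price divided by EPS."""
    return share_price / eps_value

# Both manufacturers report the same income, share count and price ...
for name in ("flopped_launch_co", "strong_pipeline_co"):
    e = eps(net_income=50e6, shares_outstanding=100e6)   # EPS = 0.50
    print(name, "EPS:", e, "P/E:", pe_ratio(20.0, e))    # P/E = 40.0
# ... so the spreadsheet view cannot tell them apart; only the
# company-specific story (the pipeline) can.
```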
Building a Better Model
Accurate analysis of the behaviors of companies and financial markets has always been the Holy Grail of the investment management business. The more accurate the predictive model, the more likely the firm is to succeed. Conversely, firms that cling to inferior models will be left in the dust.
There’s a way to build a more complete picture of business performance using artificial intelligence (AI), one that respects and accounts for the inherent complexity of financial markets and the individual differences of the various companies being examined. With AI, it is possible to build and maintain a company-specific view that is updated daily – and to manage such a view for thousands of separate companies. But the AI view is not at all like the tools commonly in use in financial analysis today. A different way of thinking is needed.
Nineteenth-century philosopher Wilhelm Windelband built on the work of Immanuel Kant to give us the vocabulary to describe two competing methods of analysis: the nomothetic and the idiographic. From the Greek word for lawgiver, nomothetic refers to the pursuit of general scientific laws through a process of generalization, such as reducing performance to numbers that can be placed in a spreadsheet. The idiographic approach, on the other hand, zeroes in on particulars, seeking to understand phenomena in the context of their occurrence. The common term “idiosyncratic” is the label applied to accounts of phenomena (outcomes and behaviors) built using the idiographic view.
Important differences between these approaches have developed over time. Current financial models are firmly rooted in a desire to generalize business performance. That should come as no surprise considering modern thought in general has been dominated by the nomothetic quest for that one set of equations or formulae able to explain all phenomena, from the movement of planets on down to the inner workings of cells and atoms – the theory of everything.
Modern mathematics uses nomothetic methods to capture the general properties of objects and their interactions. The natural sciences likewise rely on nomothetic methods, exploring a relatively small set of general properties within highly uniform groups. These disciplines apply mathematical models to groups that are highly homogeneous, which is to say, their members all share similar properties. The success of these models in the natural sciences has been impressive.
Real-world environments, however, are rarely so neat and tidy. So, beginning in the late 19th century, the natural sciences accounted for some of the variability found in nature by borrowing mathematical techniques from games of chance. Modern statistical theory framed heterogeneous groups in terms of a set of general properties that allowed “random” variation in the commonly observed values of those properties.
This technique uses the generalizing concept of “population,” which assumes that all members of a given population share general properties and interactions that can be measured and grouped together. This made it possible to calculate means, medians, variance and skew. The larger the population, the more exact the estimates for the population’s properties could become.
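A short illustration of that point, using synthetic numbers: as the group grows, the estimate of its mean tightens, because the standard error of the mean falls as one over the square root of the group’s size.

```python
# A hedged illustration with synthetic data: estimates of a group's
# mean sharpen as the group grows.
import random
import statistics

random.seed(42)
true_mean, true_sd = 100.0, 15.0

for n in (10, 1_000, 100_000):
    sample = [random.gauss(true_mean, true_sd) for _ in range(n)]
    est = statistics.mean(sample)
    # Standard error of the mean falls as 1/sqrt(n).
    se = true_sd / n ** 0.5
    print(f"n={n:>7}: estimated mean={est:7.2f}, standard error={se:5.2f}")
```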
Conversely, accuracy diminishes with smaller populations, especially when you hit a population of one. This makes sense if you consider there was only one England in the 16th century and only one occurrence of the year 1588. Generalizations do little to aid understanding of the events of that moment. When you are analyzing individuals as individuals, it is necessary to enter the realm of the specific, the idiosyncratic account of outcomes.
Idiosyncratic models haven’t enjoyed the success and recognition in many scientific fields that nomothetic models have. Until recently, paper, pen and natural language were the only tools available for creating idiosyncratic models. No wonder the hard sciences have looked down on “soft science” disciplines such as history and psychology: they lacked mathematical rigor.
The best these disciplines could do to counter the charge of low rigor was to characterize some observations with statistics to bolster their scientific credibility. The fatal flaw of this approach was that the gain in apparent rigor was offset by the loss of any ability to explain causality, that is, to state why the measured population parameters unfolded as they did. There was no way to answer with precision, “Why is the variance so large?” or, “Why does the variance grow over time?”
An excellent historical example of this challenge played out in the field of clinical psychology between 1950 and 1970. The early contributions of the pioneers in this field such as Sigmund Freud were decidedly idiosyncratic. Each patient’s case was its own individual analysis. Only a loosely defined, case-based set of categories of abnormal psychological adaptation could be created. By the 1950s, this approach was under heavy criticism, and many attempts were made to make the framework for psychoanalysis more formal and general. However, these attempts provided virtually no effective guidance on successful treatment of troubled patients. Eventually, the community decided that it was not only acceptable, but actually superior to apply idiosyncratic analysis to patients to guide their therapies.
Bringing Scientific Rigor to Idiosyncratic Models
By the late 20th century, computers had changed everything. Increasingly powerful machines and software opened the possibility, for the first time, of developing rigorous idiosyncratic models through simulation and artificial intelligence.
Simulation can create an artificial population around a set of unique events and entities. By positing hypothetical causal relationships and showing that these relationships could produce the actually observed events, it is possible to test and verify theories of idiosyncratic behavior. In many hard-science disciplines, such as electromagnetism and structural mechanics, simulation has supplanted nomothetic mathematical models because of its superior ability to explore the behaviors of highly specific designs. No designer today would rely on the equations of classical electrodynamics to design a cell phone. Instead, the circuits and antennas within the phone are simulated to refine the specific details of their design before even a prototype is built.
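As a rough sketch of how simulation can vet an idiosyncratic causal theory, consider the following toy model. The variable names, coefficients and tolerance are all invented for illustration, not drawn from any real analysis.

```python
# A minimal sketch of the simulation idea: posit a causal relationship,
# simulate it many times, and check whether it could plausibly generate
# the single event actually observed. All numbers here are invented.
import random

random.seed(0)

def simulate_quarter(rd_spend: float) -> float:
    """Hypothetical causal model: revenue growth driven by R&D spend,
    plus noise standing in for everything left unmodeled."""
    return 0.8 * rd_spend + random.gauss(0.0, 2.0)

observed_growth = 7.5   # the one actual, idiosyncratic outcome
hypothesized_rd = 8.0   # the causal input we believe was at work

# Build an artificial population around the single observed event.
runs = [simulate_quarter(hypothesized_rd) for _ in range(100_000)]
near_misses = sum(abs(g - observed_growth) < 0.5 for g in runs)
print(f"{near_misses / len(runs):.1%} of simulated quarters land near "
      "the observed outcome, so the causal hypothesis is tenable.")
```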
AI algorithms use highly individualized computer models that leverage both causality and qualitative properties to understand complex individual behaviors over time. In many ways, AI is a simulation of what we consider thought, belief, reasoning and action. There’s no need to force population generalities on individual entities when an AI program can build and maintain individual, interacting models of literally millions of separate, distinct entities.
Dispensing with broad generalizations pays off in the world of finance. With idiosyncratic models, the unique aspects of each company can be modeled as an individual case. You don’t have to treat Walmart and 7-Eleven as if they were just “retailers” that differ from one another only in size. Greater specificity allows more thorough and accurate analysis that can explain why one company might outperform the other.
A second part of the value of AI-based idiosyncratic models is their ability to correctly address the effects of different time periods, or epochs. While the nomothetic view tries to treat time as a homogeneous parameter, it is simply not the case that every year or fiscal quarter is the same. Interpreting “year over year” or “quarter over quarter” data correctly means understanding the important differences in context between each period and the management decisions acting within it. Not only is Walmart different from 7-Eleven, but Walmart in 2016 is different from Walmart in 2019.
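A minimal sketch of the bookkeeping this implies, with hypothetical fields and purely illustrative figures: each company-period pair gets its own model, so Walmart in 2016 and Walmart in 2019 are distinct entities compared in context.

```python
# Sketch only, with hypothetical fields and illustrative numbers:
# rather than one pooled "retailer" model, keep a separate model per
# company per period, so each epoch carries its own context.
from dataclasses import dataclass

@dataclass
class CompanyPeriodModel:
    company: str
    fiscal_year: int
    # Idiosyncratic context that a population model would average away.
    strategy_notes: str
    revenue_growth: float  # percent; figures below are made up

models: dict[tuple[str, int], CompanyPeriodModel] = {}

def register(m: CompanyPeriodModel) -> None:
    models[(m.company, m.fiscal_year)] = m

register(CompanyPeriodModel("Walmart", 2016, "store-led expansion", 0.8))
register(CompanyPeriodModel("Walmart", 2019, "e-commerce push", 2.8))

# Year-over-year comparison is made in context, not as a bare delta.
a, b = models[("Walmart", 2016)], models[("Walmart", 2019)]
print(f"{b.fiscal_year} vs {a.fiscal_year}: growth {a.revenue_growth}% -> "
      f"{b.revenue_growth}% while strategy shifted from "
      f"'{a.strategy_notes}' to '{b.strategy_notes}'")
```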
Causality is King
In cases where individual differences matter, idiosyncratic models are essential. The classic example is found in television dramas featuring a detective who solves a crime by establishing motive, means and opportunity. The world’s most detailed population statistics for crimes aren’t going to tell investigators much more than where a crime is likely to happen. They are no help at all in identifying an individual perpetrator and building a compelling case for that person’s guilt.
Individuals are not different randomly; they are different as the result of different actions and a variety of causes. By modeling the universe of causes and effects, a small idiosyncratic model can account for (or generate, in a simulation sense) an extremely large set of possible observable behaviors.
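A toy illustration of that claim: a handful of causal factors, each with only a few possible states, already spans a combinatorially large space of distinct observable behaviors. The factor names below are invented for the example.

```python
# A toy illustration: four invented causal factors, three states each,
# generate a combinatorially large set of distinct behavior profiles.
from itertools import product

causal_factors = {
    "demand_shift":   ["falling", "flat", "rising"],
    "pricing_move":   ["cut", "hold", "raise"],
    "supply_shock":   ["none", "mild", "severe"],
    "management_bet": ["expand", "maintain", "retrench"],
}

behaviors = list(product(*causal_factors.values()))
print(f"{len(causal_factors)} factors with 3 states each yield "
      f"{len(behaviors)} distinct behavior profiles.")   # 81
```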
Building successful AI-based idiosyncratic models requires dedicated effort to untangle the causality within the behaviors of companies and of the investors who choose to buy and sell their securities. Fortunately, there is a large literature of formal knowledge and theory about company operations and investment strategies. Mining this literature, along with the experience of investment analysts, for successful accounts of causality is the practice of knowledge engineering, a specialty within the field of AI.
A powerful consequence of knowledge-based idiosyncratic AI models is their ability to not only predict, but also explain their predictions. The explanations draw upon the knowledge of causality used by the model. This enables the AI model, like our skilled crime detective, to build a compelling case for its conclusions.
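One way to picture this, as a hedged sketch rather than any particular production system: a tiny forward-chaining rule engine that records every rule it fires, so each conclusion arrives with the chain of reasoning behind it. The rules and facts are placeholders.

```python
# A minimal sketch of an explanation-capable model: each causal rule
# that fires is recorded, so the prediction comes with its reasoning.
# Rules and facts here are invented placeholders.

RULES = [
    ("inventory rising while sales flat", "margin pressure ahead"),
    ("margin pressure ahead",             "earnings likely to miss"),
    ("new flagship product in pipeline",  "revenue rebound plausible"),
]

def infer(facts: set[str]) -> tuple[set[str], list[str]]:
    """Forward-chain over the rules, keeping a trace of every firing."""
    conclusions, trace = set(facts), []
    changed = True
    while changed:
        changed = False
        for condition, conclusion in RULES:
            if condition in conclusions and conclusion not in conclusions:
                conclusions.add(conclusion)
                trace.append(f"because '{condition}', conclude '{conclusion}'")
                changed = True
    return conclusions, trace

_, explanation = infer({"inventory rising while sales flat"})
for step in explanation:
    print(step)   # the model's "compelling case" for its conclusion
```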
While powerful software and hardware have made the creation of idiosyncratic AI models possible, this is still a new frontier for the financial services industry. Therein lies the opportunity, as the companies that embrace the idiosyncratic approach are more likely to hold the upper hand against competitors that just can’t bring themselves to ditch the weaker population models.

Joseph Byrum is an accomplished executive leader, innovator, and cross-domain strategist with a proven track record of success across multiple industries. With a diverse background spanning biotech, finance, and data science, he has earned over 50 patents that have collectively generated more than $1 billion in revenue. Dr. Byrum’s groundbreaking contributions have been recognized with prestigious honors, including the INFORMS Franz Edelman Prize and the ANA Genius Award. His vision of the “intelligent enterprise” blends his scientific expertise with business acumen to help Fortune 500 companies transform their operations through his signature approach: “Unlearn, Transform, Reinvent.” Dr. Byrum earned a PhD in genetics from Iowa State University and an MBA from the Stephen M. Ross School of Business, University of Michigan.