CHARLES PERROW ALMOST HAD TO FIGHT HIS WAY into the Federal Reserve Bank of New York last September. Occupy Wall Street was in full swing nearby, and security at the door was unusually tight. The weather and the mood of the city were decidedly stormy.
Perrow had been invited to speak to a hundred senior New York Fed officials as part of the bank’s off-the-record “Listen to Our Critics” series, and he didn’t exactly brighten their day. The 86-year-old Yale University professor emeritus of sociology has written widely about the dangers posed by complex systems, including his 1984 book Normal Accidents: Living with High-Risk Technologies, and he told the Fed audience that technological breakdowns in financial markets had become too commonplace — too “normal” — to be considered aberrations. In particular, he warned about the dangers of unleashing whiplash-fast trading algorithms and abstruse financial products on a “tightly coupled” global financial system.
“In complex, nonlinear systems it is inevitable that at some time, two or more failures — perhaps trivial, individually — will interact in a way no designer could have anticipated and no operator can understand,” he explained, squinting at his notes in a dim upstairs ballroom in the Fed’s neo-Renaissance headquarters. “If the system is also tightly coupled, the failures will propagate and cascade through the system, even when everyone tries very hard to play safe.”
Perrow may seem an unlikely market seer, but his theories have won him a following in financial circles. Richard Bookstaber, a former hedge fund risk manager now working in the U.S. Treasury’s Office of Financial Research, drew from Perrow’s notions of normal accidents and tight coupling — and the domino effect of failures cascading through interconnected systems — in A Demon of Our Own Design: Markets, Hedge Funds, and the Perils of Financial Innovation, his 2007 book that presciently laid out the elements of the global crisis that was about to unfold. According to Bookstaber, complexity breeds opacity, which, at best, can make it harder to guarantee orderly markets and, at worst, erect a smokescreen that can ruin transparency and abet fraud. “Each innovation adds layers of complexity [that] cannot be easily disarmed through oversight or regulation,” he says. “As complexity increases, so do the odds of something unanticipated going wrong.”
Against this backdrop, a central question comes to the fore: Can the technology be tamed in a way that ensures safe and efficient markets? Judging by a series of calamities that included the granddaddy of high-frequency markets run amok, the May 2010 “flash crash,” the accidents are too normal — and the prognosis is getting worse. Although cognizant of the havoc it has caused, the financial industry is too competitive and innovative to lay down its high-tech arms. And regulators admit that the technologies and trading innovations are advancing too quickly for them to supervise, at least in any traditional sense.
It sounds like a recipe for disaster. But new thinking — on both the trading and the regulatory sides — suggests that technology and systems theory might be applied to inject simplicity into the chaos. No one is under any illusion that the risks will be eliminated, but there is hope for making the accidents less normal.
Certainly the white-knuckle events of August 1, the day that errant Knight Capital Group software sent the New York Stock Exchange into a 45-minute tailspin and almost destroyed the market-making firm, do not inspire confidence. What happened followed Bookstaber’s description of a tightly coupled disaster to the letter: “When things go wrong, the error propagates, linked from start to finish with no emergency stop button to hit.”
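In software terms, the missing “emergency stop button” is a kill switch that sits between a firm’s trading engine and the exchange. The sketch below is purely illustrative (it describes no actual firm’s systems, and every name in it is hypothetical), but it shows the shape of the idea: every outbound order passes through a gate that tracks cumulative exposure and halts all trading the moment a hard limit is breached.

```python
# Illustrative kill-switch sketch: a hard gate between a strategy and the market.
# All names (Order, MAX_GROSS_EXPOSURE, send_to_exchange) are hypothetical.
from dataclasses import dataclass

MAX_GROSS_EXPOSURE = 50_000_000  # dollars; breach this and all trading stops


@dataclass
class Order:
    symbol: str
    quantity: int
    price: float


class KillSwitchGateway:
    def __init__(self, limit: float = MAX_GROSS_EXPOSURE):
        self.limit = limit
        self.gross_exposure = 0.0
        self.halted = False

    def submit(self, order: Order, send_to_exchange) -> bool:
        """Send the order only if the firm-wide exposure limit has not been hit."""
        if self.halted:
            return False  # the "emergency stop" has been pressed; drop everything
        notional = abs(order.quantity) * order.price
        if self.gross_exposure + notional > self.limit:
            self.halted = True  # stop the cascade instead of propagating it
            return False
        self.gross_exposure += notional
        send_to_exchange(order)
        return True


gateway = KillSwitchGateway()
gateway.submit(Order("XYZ", 100, 20.0), send_to_exchange=print)  # hypothetical ticker
```

The point is less the particular limit than the decoupling: the gate can refuse orders even when the strategy behind it remains convinced it is right.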
Not only was there no brake on Knight’s problem, but nobody could see the glitch coming, least of all the regulators. Even as Securities and Exchange Commission chairman Mary Schapiro called the debacle “unacceptable,” she went on to insist, not entirely convincingly, that the U.S. equity market is “the most resilient, efficient and robust in the world.” The SEC has much more on its investigative plate, including the botched Facebook IPO in May. “Do we treat these as isolated incidents or as part of a broader pattern that can be addressed systemically through new rules?” says SEC director of public affairs John Nester. “Those are the types of questions our regulatory staff is asking.”
“No one wants something like this to happen,” states Jamil Nazarali, who headed Knight’s electronic trading group before joining the $13 billion hedge fund firm Citadel last year as head of market making for its retail securities unit. He considers the August 1 blowout “very serious” and expresses doubt that the SEC will be able to keep pace with the market’s breakneck technological advances. (Nazarali knows how far the consequences of a minor misstep can spread: His predecessor, Andrew Kolinsky, left Citadel in June after Nasdaq’s Facebook IPO disaster, which led to estimated losses at Citadel and Knight of $30 million to $35 million each and sparked a series of lawsuits across Wall Street.)
Some market participants are latching on to the catchphrase “normalization of deviance,” coined by Columbia University sociologist Diane Vaughan: Little things go wrong and can snowball out of control; time passes; incidents repeat themselves. Eventually those incidents are seen as a normal condition. Known for her study of events leading up to the 1986 space shuttle Challenger disaster, Vaughan says humans are particularly good at letting a “problematic, can-do attitude” get in the way of noticing mounting dangers. “In these types of situations, there are usually long incubation periods of warning signs that are misinterpreted or ignored,” she says. “Incrementally, humans often will accept more and more damage and repeatedly reassure themselves that it’s normal, until it is too late.”
Is it already too late for financial markets? The incubation period has been roughly a decade and a half since regulatory changes began transforming equity market structures and computer programmers tailored systems to take advantage of them. This had the effect of fragmenting trading across an increased number of exchanges and alternative venues. The trends accelerated to the point where, as NYSE Euronext CEO Duncan Niederauer noted in May, more than half of the trading in about 1,300 U.S.-listed securities no longer takes place on exchanges.
Like other trading platforms, exchanges have morphed into high-tech operations vulnerable to operational failure. Mark Cuban, the Dallas Mavericks owner who made a fortune selling Broadcast.com to Yahoo in 1999 for $5.7 billion, looks at the exchange business through a technology lens: “As a platform, it is no different than any other network. The faster and greater the iterations of any software, the more failures it will have.”
Over the past five years, high-frequency trading, where transaction speeds are measured in microseconds, has come to drive more than 50 percent of U.S. equities volume. That activity is generated by only 2 percent of the estimated 20,000 hedge funds, brokerages, mutual funds and other market participants in the U.S. equity trading universe, according to market research firm TABB Group.
The same five-year period has seen unprecedented volatility. Before 2007 the Standard & Poor’s 500 index rarely shifted up or down more than 2 percent in a single day; it happened only twice in 2006. There were 72 such swings in 2008 — a banner year for HFT and the height of the credit crisis — and 35 last year.
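The volatility claim is easy to check from a daily price series. A minimal sketch, assuming a hypothetical CSV of S&P 500 closing prices with “date” and “close” columns, counts the days in each year on which the index moved more than 2 percent:

```python
# Count days per year with an S&P 500 move of more than 2 percent.
# Assumes a hypothetical CSV of daily closes with 'date' and 'close' columns.
import pandas as pd

prices = pd.read_csv("sp500_daily.csv", parse_dates=["date"]).set_index("date")
daily_return = prices["close"].pct_change()     # day-over-day percentage change
big_moves = daily_return.abs() > 0.02           # True on days that moved more than 2%
per_year = big_moves.groupby(big_moves.index.year).sum()
print(per_year)  # e.g., 2 days in 2006 and 72 in 2008, per the figures cited above
```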
“It’s more volatile than ever,” says Alexander Fleiss, chairman and chief investment officer of Rebellion Research, an upstart $17 million quantitative hedge fund in New York. “The thinking was, with all this technology we’d have more liquid, efficient markets. But instead, markets are now correlated that never were correlated before, more people are following each other, and more machines are bound up with one another.”
High-performance technology and analytics hold sway in two key stages of the investment and trading cycle: the pretrade functions of portfolio management and strategy, and trade execution. If a firm falls short on its investment strategy, often only it — and perhaps its client base — suffers. But if a trade execution system goes wrong, the consequences can be systemic.
As they grapple with these issues, institutional investors, academics and regulators are coming around to the view that technology, for all the problems it brings, might be harnessed to improve market functioning by enhancing transparency and reducing complexity.
David Mechner, CEO of trading systems developer Pragma Securities, says he believes the next quantum leap may be meshing human traders with their technology to create better trade execution strategies. New York-based Pragma offers institutional investors about a dozen algorithmic tools with built-in logic that let clients take the measure of any number of markets — public exchanges and dark pools — simultaneously, to get the lay of the land before placing trades. The goal is to swoop in largely undetected and exit after nabbing the best price.
“The algo market is now relatively mature and the tools are well known, but the tool choice isn’t always clear,” adds Mechner, who spent a year in Japan mastering the military strategy game Go before studying neurobiology at New York University. “What we’re trying to do now is work with the portfolio managers and trading desks to help them make better decisions about which tools to use and when.”
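Pragma does not publish its algorithms, but the general shape of a liquidity-seeking tool can be sketched. In the toy router below, the venue names, quote fields and per-slice cap are all invented for illustration: the idea is to survey several venues at once and take displayed liquidity, best price first, in slices small enough to avoid advertising the full order.

```python
# Toy liquidity-seeking router: survey venues, then work the order in small slices.
# Venue data, field names and fill behavior are hypothetical illustrations.
from typing import Dict


def route_order(total_shares: int, quotes: Dict[str, dict], max_slice: int = 500) -> Dict[str, int]:
    """Allocate a buy order across venues, best offer first, never exceeding
    the displayed size or the per-slice cap (to limit information leakage)."""
    allocations: Dict[str, int] = {}
    remaining = total_shares
    # Sort venues by ask price so the cheapest displayed liquidity is taken first.
    for venue, quote in sorted(quotes.items(), key=lambda kv: kv[1]["ask"]):
        if remaining <= 0:
            break
        take = min(remaining, quote["ask_size"], max_slice)
        if take > 0:
            allocations[venue] = take
            remaining -= take
    return allocations


quotes = {
    "EXCHANGE_A": {"ask": 20.01, "ask_size": 300},
    "EXCHANGE_B": {"ask": 20.02, "ask_size": 1200},
    "DARK_POOL_X": {"ask": 20.00, "ask_size": 400},
}
print(route_order(1500, quotes))
# {'DARK_POOL_X': 400, 'EXCHANGE_A': 300, 'EXCHANGE_B': 500}, with 300 shares left to work later
```

A production system layers far more on top of this skeleton, such as dark-pool probing, anti-gaming logic and real-time feedback, which is precisely where the complexity the critics worry about creeps in.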
Ben Sylvester, head of the U.S. equity trading group at J.P. Morgan Asset Management in New York, says today’s split-second poker games have forever changed the way his desk executes its $250 billion of trades a year. No one can see what is going on in the market anymore without the help of sophisticated tools, and the buy side has made great strides in adopting them to control more of the trading it used to cede to sell-side banks. J.P. Morgan, for example, deploys visualization tools that let its traders see real-time analytics and track parts of the market that would otherwise go unnoticed.
“Today liquidity is hidden,” Sylvester observes. “Almost nothing is openly displayed, or it’s fleeting and flickering. We used to react to the market; now the market reacts to us.”
What are the regulators to do? “The SEC should think hard about the market structure it has created and do its utmost to rein it in,” recommends Larry Tabb, founder and CEO of TABB Group. The agency likely cannot slow the market down, he says, but it can impose measures to “reduce price and venue fragmentation.”
The lack of visibility into fragmented markets that change at lightning speed is of particular concern to Bookstaber, who is helping open regulators’ eyes and minds to nontraditional theories that may point toward new, technology-aided solutions.
A Massachusetts Institute of Technology Ph.D. in economics who has worked variously as a proprietary trader, quantitative analyst and risk manager at Morgan Stanley, Salomon Brothers and Moore Capital Management, Bookstaber left Wall Street to join the SEC in 2009 as a senior policy adviser and has spent the last year in the Treasury’s Office of Financial Research. The OFR was established under the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010 as the data aggregation and analysis arm of the Financial Stability Oversight Council, the superregulatory panel charged with monitoring systemic risk. Bookstaber is looking into whether modeling the disparate behaviors of influential agents in the markets can help expose broader, systemic vulnerabilities.
Scientists have used analogous agent-based models to predict or determine the outcomes of events such as traffic jams, stampedes and epidemics. Agent-based modeling’s application to financial markets is in its experimental stages; the logic is that “the market is inhabited by people, heterogeneous and context-sensitive, who do not live up to the lofty assumptions of mathematical optimization and Aristotelian logic that underlie these approaches — and do not do so for good reasons,” Bookstaber noted in A Demon of Our Own Design. “The nature of complexity also is different in the economic realm from that in physical systems because it can stem from people gaming, from changing the rules and assumptions of the system.”
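The flavor of agent-based modeling is easy to convey with a toy example. The sketch below is not the OFR’s model, just a two-agent-type illustration: trend followers chase the last price move while value traders lean against deviations from a fixed fundamental value, and that interaction alone is enough to generate bubble-and-crash dynamics that a single representative agent cannot.

```python
# Toy agent-based market: trend followers chase the last move, value traders
# lean against deviations from a fixed fundamental value. Entirely illustrative.
import random

FUNDAMENTAL = 100.0
prices = [100.0, 100.5]  # seed price history

for _ in range(250):
    trend = prices[-1] - prices[-2]
    momentum_demand = 60 if trend > 0 else -60        # trend followers pile onto the last move
    value_demand = -2.0 * (prices[-1] - FUNDAMENTAL)   # value traders fade the mispricing
    noise = random.gauss(0, 10)                        # idiosyncratic order flow
    # Simple price-impact rule: net demand moves the next price.
    prices.append(prices[-1] + 0.01 * (momentum_demand + value_demand + noise))

print(f"final {prices[-1]:.2f}, peak {max(prices):.2f}, trough {min(prices):.2f}")
```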
The regulators and the regulated alike are in pursuit of antidotes to opacity and complexity.
“All things being equal, we like a strategy to be simple. If something has to be very complicated, we really do need to have a very good reason to pursue that,” states Matthew Beddall, chief investment officer of Winton Capital Management, a $29 billion systematic macro fund based in London.
Aaron Brown, chief risk officer at Greenwich, Connecticut–based AQR Capital Management, a $57 billion asset manager that employs quantitative methods in most of its investment strategies, agrees: “Most of the advances I expect in risk management involve more-sophisticated application of simple, robust techniques rather than fancier mathematics,” he says. AQR calculates millions of probabilistic paths to study even highly unlikely combinations of events. “It gives you finer-grained answers,” Brown explains. “I expect we will be processing billions of paths in the not-too-distant future. A lot of things we are looking at now involve getting better data and running the probabilities. Not all of our improvements require new theory — we can get more out of existing theory with more processing power.”
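Brown does not spell out AQR’s machinery, but the brute-force idea of processing probabilistic paths can be illustrated with an ordinary Monte Carlo simulation. In the sketch below, every return, volatility and correlation figure is invented; the point is that tail risk is read directly off the simulated distribution rather than from a closed-form formula, and finer-grained answers come simply from running more paths.

```python
# Monte Carlo sketch of "probabilistic paths": simulate many joint return paths
# for a two-asset portfolio and read tail risk off the resulting distribution.
# All return, volatility and correlation figures are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_paths, horizon = 250_000, 21                 # one-month horizon; scale paths up with more hardware
mu = np.array([0.0003, 0.0002])                # daily expected returns
vol = np.array([0.012, 0.018])                 # daily volatilities
corr = 0.4
cov = np.array([[vol[0]**2, corr * vol[0] * vol[1]],
                [corr * vol[0] * vol[1], vol[1]**2]])
weights = np.array([0.6, 0.4])

daily = rng.multivariate_normal(mu, cov, size=(n_paths, horizon))  # paths x days x assets
path_returns = (1.0 + daily @ weights).prod(axis=1) - 1.0          # compound each path
var_99 = np.percentile(path_returns, 1)                            # 99% one-month value at risk
print(f"99% one-month VaR: {var_99:.2%}")
```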
Today’s reality is that the humans who once provided the brainpower behind trades are now setting parameters for the computers that do the work. That hardly takes humans out of the picture, and they might ultimately even contribute some correctives.
“Sadly, the computers we buy don’t come with a money-generating algorithm,” notes Winton’s Beddall. “You can dream up a wild science fiction scenario where the computers go mad and humans get kicked out, but it all comes down to humans doing the hard work and analysis. The inputs come from them. You need computers as a tool, but we are systematic traders and we make the decisions.”
Sorting through millions of trading possibilities requires that the data be immaculate — and that, too, requires manpower. “If you have twice as much data, you have to spend more time checking it,” says AQR principal and portfolio manager Michael Mendelson.
In fact, information overload may be one of the stumbling blocks. IBM Corp. estimates that more than 90 percent of the world’s data was created within the past two years, most of it unstructured, and that 2.5 quintillion new bytes are produced every day. “We’re in a big-data revolution, and we’re really anal about getting high-quality data,” says Beddall.
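What checking the data means in practice is mostly unglamorous screening. A minimal sketch, with hypothetical column names and thresholds, flags the usual suspects in a table of ticks: missing prices, out-of-order timestamps and tick-to-tick jumps too large to be believable.

```python
# Minimal data-quality screen for tick data; thresholds and column names are hypothetical.
import pandas as pd


def screen_ticks(ticks: pd.DataFrame, max_jump: float = 0.10) -> pd.DataFrame:
    """Return rows that look suspect: missing prices, out-of-order timestamps,
    or tick-to-tick price jumps larger than max_jump (10 percent by default)."""
    missing = ticks["price"].isna()
    out_of_order = ticks["timestamp"].diff() < pd.Timedelta(0)
    jump = ticks["price"].pct_change().abs() > max_jump
    return ticks[missing | out_of_order | jump]
```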
Data-quality issues simply raise the stakes in the trading game. A distinct information overclass and underclass are already developing, say Joseph Saluzzi and Sal Arnuk, co-owners of Themis Trading, a Chatham, New Jersey, institutional brokerage that has taken the lead in the debate over whether exchanges and brokers should be allowed to leak information on private investor order flows to select clients — for a price. In Broken Markets: How High Frequency Trading and Predatory Practices on Wall Street Are Destroying Investor Confidence and Your Portfolio, published this spring, Saluzzi and Arnuk question the ethics of selling individuals’ private data by asking: “Would it be OK if Visa/MasterCard sold information about what you bought?”
Beddall says Winton has been offered a private data feed showing live U.S. credit card sales. The firm, which derives 70 percent of its investment ideas from tick-by-tick market data and 30 percent from unique proprietary data sets, has been pitched other unusual information products. “People are putting devices in the cabs of tractors so we can see how crops are doing before the U.S. Department of Agriculture’s reports come out,” he explains. Beddall says he usually turns away salespeople hawking “sexy new data sets. We need at least five to ten years of historical data to put it into context.”
While sexy data sets and brute force are all well and good, IBM is looking into what could be the next great leap in big-data analysis: application of its Watson artificial intelligence technology — which is renowned for defeating two human champions of the game show “Jeopardy” — to Wall Street. IBM signed an agreement in March with Citigroup to explore uses of Watson’s pattern recognition capabilities in “identifying opportunities, evaluating risks and exploring alternative actions that are best suited for their clients,” according to Citigroup. “Analytics is the new core of competitive investing and banking,” says Robert Jewell, director of worldwide business development at IBM Watson Solutions. “We know where we’re heading, but all of the answers aren’t known yet. We are only in the early phases of a massive transition.”
IBM will offer a version of Watson that not only crunches vast amounts of data — for example, earnings releases or conference calls — but also performs “machine learning,” drawing from the knowledge of its users. “The more you use it, the more valuable it becomes,” Jewell says.
Something even IBM has yet to attempt is the creation of a machine that uses artificial intelligence to pick a hedge fund’s investments. Such has been the labor of Rebellion Research, a fund that launched in January 2007 with a dishwasher-sized supercomputer nicknamed “Star.” Capable of 5 million calculations a second, Star shoots out investment picks every morning to a troop of 20-something Ivy League quants and math whizzes, who adjust the firm’s global equities portfolio accordingly.
Despite his faith in the power of the supercomputer, Rebellion’s Fleiss argues that any market takeover by the machines will, by its very nature, invite humans back in: “I believe you can never have a machine-dominated market, because whenever the reaction gets to be too much, there will always be room for a person to come in and say, ‘This is ridiculous, there is no reason this should be trading at this level — I’m going to buy this stock!’” • •