© The Financial Times Ltd 2013 FT and 'Financial Times' are trademarks of The Financial Times Ltd.
October 13, 2011 3:21 am
On September 12, a small yet highly portentous US markets statistic was smashed out of sight.
At one point during the day no fewer than 6.1m data messages per second were being pinged around US trading venues, according to industry data compiler MarketDataPeaks.com. That was nearly 600,000 messages per second higher than the previous record, which had stood only since August, as global markets convulsed amid US and European growth and debt fears.
Yet that small footnote contains a neat picture of the speed and market complexity that regulators around the world will have to monitor and judge in coming years.
Data messages draw little attention but they are the lifeblood of modern markets. Each electrical pulse is either a trade or a price quote, and they are being sent around markets faster and in greater quantities than ever before. Perhaps most importantly, there is no sign that the trend will plateau any time soon.
The figures collected by MarketDataPeaks come from the biggest US exchanges such as the New York Stock Exchange, Nasdaq, Direct Edge, BATS and CME Group. Deregulation enabling competition among trading platforms has created a glut of venues and liquidity has fragmented to several sources, from equity venues to derivatives venues and futures markets.
But the fragmentation has also hastened the emergence of high-frequency traders, who use super-fast technology and telecommunications links, and automated trading systems, to trade in and out of positions in microseconds. Often the technology allows them to exploit price differentials in the same asset on different venues, in a trading practice known as statistical arbitrage.
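The cross-venue price differential described above can be illustrated with a toy sketch. This is not any firm's actual strategy – real statistical arbitrage relies on modelled relationships and executes in microseconds – but it shows the basic check: the same asset quoted at different prices on different venues. The venue names and prices are hypothetical.

```python
# Toy illustration of a cross-venue price-differential check.
# All venue quotes are hypothetical: (best bid, best ask) for one asset.

def find_differential(quotes, min_edge=0.01):
    """Return (buy_venue, sell_venue, edge) for the widest profitable
    differential: buy at one venue's ask, sell at another's higher bid."""
    best = None
    for buy_venue, (_, ask_b) in quotes.items():
        for sell_venue, (bid_s, _) in quotes.items():
            if buy_venue == sell_venue:
                continue
            edge = bid_s - ask_b  # sell at the higher bid, buy at the lower ask
            if edge >= min_edge and (best is None or edge > best[2]):
                best = (buy_venue, sell_venue, edge)
    return best

# Hypothetical snapshot of one stock across three venues
snapshot = {"NYSE": (10.00, 10.02), "BATS": (10.03, 10.05), "DirectEdge": (9.99, 10.01)}
print(find_differential(snapshot))  # buy on DirectEdge at 10.01, sell on BATS at 10.03
```

In practice such differentials close in microseconds, which is precisely why the traders chasing them generate so much quote traffic.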
The September 12 case illustrates how fragmentation has led to a wave of data. MarketDataPeaks measured that at the very peak of the surge, US order books were sending out 1.45m messages per second as market makers and electronic traders updated their books. There was a knock-on effect in derivatives markets as quotes on thousands of contracts were changed simultaneously to reflect the changes in the price of the underlying asset. Here, more than 4.2m messages per second were sent out. The NYSE and a prominent inter-dealer broker both said terabytes of data had been created by August’s stock market volatility. The truly eye-opening part is the speed of growth – only a year ago, the record for US markets stood at 1.5m.
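The figures above can be cross-checked with simple arithmetic. Equity and derivatives feeds together account for 5.65m of the 6.1m messages per second; the remainder came from other feeds, and the overall record represents roughly a fourfold rise in a year.

```python
# A worked check of the figures reported above (all rates in messages per second)
equities_peak = 1_450_000      # US equity order books at the peak of the surge
derivatives_peak = 4_200_000   # knock-on re-quoting across derivatives contracts
overall_record = 6_100_000     # the September 12 record across all venues

accounted = equities_peak + derivatives_peak
print(accounted)               # 5,650,000 - the rest came from other feeds

prior_year_record = 1_500_000
print(overall_record / prior_year_record)  # roughly a fourfold rise in one year
```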
Automated technology and the business models of some exchanges have allowed some traders to flood the market with orders in small, concentrated bursts. According to Nanex, a US data provider, a single second on another September day saw 19,000 quotations and 3,000 individual trade executions in Yahoo, the internet company.
At the same time, high-frequency traders are in a race to trade at the speed of light. Trades are now routinely measured in microseconds – millionths of a second – and are increasingly pushing at the nanosecond barrier. The Yahoo trade cited by Nanex had another curious feature. Based on official timestamps, Nanex says, Yahoo trades were executed on quotes that did not exist until 190 milliseconds later. It looked as if trading had broken the speed of light, though it is more an indication that official records are struggling to keep up.
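The anomaly Nanex describes amounts to a trade whose official timestamp precedes the quote it executed against. A minimal sketch of the check, with hypothetical timestamps in microseconds since midnight:

```python
# Illustrative sketch of the timestamp anomaly described above: a trade stamped
# before the quote it executed against. Timestamp values are hypothetical,
# expressed in microseconds since midnight.

def causality_gap_us(quote_ts_us, trade_ts_us):
    """Negative result: the trade is officially stamped before its quote existed."""
    return trade_ts_us - quote_ts_us

trade_ts = 34_200_000_000   # the trade's official timestamp ...
quote_ts = 34_200_190_000   # ... and its quote, stamped 190 ms later
gap = causality_gap_us(quote_ts, trade_ts)
print(gap)  # -190000 microseconds: clock skew in the records, not faster-than-light trading
```

Sorting such records by official timestamp would put effect before cause, which is why consolidated audit trails need synchronised clocks.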
Regulatory change will only add to the burden. Sweeping reforms of the $600tn over-the-counter derivatives market by the G20 group of nations will result in more off-exchange contracts being traded on exchanges or electronic trading venues. The intention is to create an electronic audit trail, but it will add to the data tsunami.
The regulator stepping into this maelstrom is charged with ensuring market integrity, monitoring for potential abuse and ensuring there is no repeat of a “flash crash”.
The response has been threefold. First, many authorities are upgrading their own surveillance systems. Scott O’Malia, a Republican commissioner at the Commodity Futures Trading Commission, the US regulator, has long been a vocal critic of the CFTC’s existing technology, arguing that it needs heavy weaponry such as automated surveillance tools, real-time monitoring and analysis of trade data, and vast data storage. Second, authorities are pushing greater responsibility on to the industry, setting higher standards for banks, exchanges and inter-dealer brokers.
Third, authorities need to acknowledge that monitoring everything is a near-impossible job. However, officials have been given increased powers to investigate high-frequency trading firms and greater access to trade data held in repositories.
The Financial Industry Regulatory Authority, the US securities industry regulator, has admitted to asking firms for their software code, albeit as a last resort. In Europe, the latest draft of the Markets in Financial Instruments Directive proposed that investment firms using algorithmic trading should “at least annually” provide a description of the nature of their trading strategies to their home regulator. It also proposed that the domestic regulator be allowed to request further information about an algorithm.
But these last proposals have been controversial, since investment companies spend heavily to attract the brains behind their secret algorithms. Regulators are learning that there is no easy solution.