
Direct Market Access

While trading has occurred electronically for many years, it has been in the past five years that
equity markets globally have undergone the most profound technological change. Advances in
technology have dramatically altered the way in which orders are executed. Rather than orders
being entered into the market manually by broker operators, there has been an increasing trend
towards orders being generated by computer algorithms seeking to minimize market impact costs
for clients and to capture proprietary trading opportunities. These technological advances have
also driven the demand for new access arrangements (such as direct market access (DMA) and
co-location facilities) as both brokers and end users seek to improve their speed of trade
execution. This has fuelled a rapid growth in the algorithmic trading, especially in North America
and Europe, which has had tremendous impact on liquidity fragmentation across competing
market operator environments and has raised complex supervisory issues. Regulators in those
jurisdictions are only now beginning to address the inter-relationship between market access and
microstructure, the growth in algorithmic trading, and the potential impact on market quality and
integrity. As is well understood, “Direct Market Access” (hereafter referred to as DMA for
brevity) plays a crucial role in the development of new trading technologies and market
regulatory policies. In fact, DMA is seen as part of the continuum of using technology in trading
to lower costs and empower the buy-side as a whole. Thus, it is timely that DMA and its impact on trading activity are subjected to thorough scrutiny.

What is Direct Market Access (DMA)?


Definition of DMA (aka “Automated Order Processing” or AOP)

"DMA refers to electronic facilities, often supplied by independent firms, which allow buy side firms to access liquidity for securities they may wish to buy or sell. Buy side firms are customers of sell side firms - brokerages and banks which may act as market makers in a security. Buy side firms will still use the trading infrastructure of sell side firms, but have more control over how the trade is executed."
AOP (or DMA) is a broader concept than algorithmic trading. To illustrate the distinction,
consider online broker systems. These require AOP certification because their business model
permits clients to enter orders which are transmitted to the trading platform without being re-
keyed by an authorized broker staff member. In place of the manual order entry process, many
exchanges require a Participant to have adequate filters in place to detect trades that may breach
the law or the trading rules. Online brokers therefore offer DMA, but their clients are not
generally engaged in algorithmic trading. In fact, the majority of their clients are retail investors
who manually input orders into their computers. Participants or clients who use algorithms to
trade also (currently) use DMA because it is the fastest way to connect to the trading platform.

Broker and client access requirements and objectives have evolved over time, with the most
recent requirements being a need for minimal latency in market data and order messaging to
support execution and situational algorithms. Clients increasingly demanded additional control over their order flow, and Participants responded with the provision of DMA services. The Market Rule framework supported the placing of trading terminals in the offices of clients. This achieved the client objectives of increased order control and reduced costs, while satisfying the Participant requirements for risk control and operational integration. These services were extended to a full
range of clients, and formed the basis of the internet trading capability that retail users are now so
familiar with.

In the US, DMA has been primarily adopted by buy-side traders to aggregate liquidity that is
fragmented across U.S. execution venues. US, Canadian and European market regulations all
facilitate trading of securities on multiple trading platforms. In the US, there are approximately
100 such platforms. Canada has a much smaller number of trading platforms (7); whereas Europe
is somewhere in the middle with around 50 equity trading venues (exchanges and MTFs) as of
September 2009. DMA offers investors a direct and efficient way of accessing these electronic exchanges through internet trading. Traders execute on multiple venues directly, without broker intervention: orders are entered directly by individuals and routed to specific destinations (market makers, exchanges, ECNs). Furthermore, DMA trading is cheap: in 2004 the average cost of a DMA trade was 1 cent, compared with 2 cents for program trades and 5 cents for block trades (Source: NYSE Technologies).

Naked Access

A common form of market access in North America is unfiltered access, colloquially referred to as 'naked access'. Unfiltered access is the practice of allowing clients to enter orders
directly into a trading platform without any form of control, pre-vetting or filtering by the market
Participant. Clients are sponsored by a Participant of a trading platform, but their order flow,
which is usually generated by algorithms, does not pass through any IT infrastructure controlled
by the broker. Instead, the client effectively connects directly to the trading platform. It is
estimated that unfiltered access accounts for around 38% of equity volumes in the US. The SEC recently proposed to impose a number of minimum controls around client access, effectively prohibiting unfiltered access. The SEC's concerns with the practice include increased risk of trading error, possible increased systemic
risk, and ‘fair access’ considerations, because unfiltered access clients can achieve lower latency
and gain an advantage over other traders who are required to use pre-trade order filters.

Conventional Sponsored Market Access


[Diagram: a conventional sponsored market access (SMA) arrangement, distinguished by the Risk Management Gateway (i.e. filter layer) installed between the client and the trading platform.]

Exchanges favor a DMA model where a client is able to manage its own connection to the
exchange but is required to direct its order-flow through either an exchange-provided or a third-
party provided risk management system (commonly referred to as sponsored access). The
Participant would retain control over certain filters, such as in relation to credit limits, and would
receive real-time information on client orders and trades processed by the exchange. Both the
exchange and the Participant would maintain the ability to instantly and electronically remove a
client's access to the trading platform. Some exchanges require Participants to nominate in advance whether they will be trading algorithmically themselves or whether their clients will be. In many instances, a Participant does not know whether its clients are engaged in algorithmic trading – in the same way that it is unlikely to know whether an investor's order flow is premised on technical or fundamental analysis. Some other exchanges have also
made it a requirement for Participants to identify specific algorithms that are in use either by
themselves or by their clients. Furthermore, stress-testing of these algorithms over many months
could also be a requirement of the exchange.
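
To make the role of the risk management gateway concrete, below is a minimal sketch in Python of the kind of pre-trade checks a sponsoring Participant might apply before forwarding a client order to the trading platform. The order fields, the limit values and the kill_switch flag are illustrative assumptions, not any particular exchange's or vendor's actual interface.

```python
from dataclasses import dataclass

@dataclass
class Order:
    client_id: str
    symbol: str
    side: str        # "buy" or "sell"
    quantity: int
    price: float     # limit price

class RiskGateway:
    """Illustrative pre-trade filter layer sitting between the client and the exchange."""

    def __init__(self, credit_limit: float, max_order_value: float, price_band: float):
        self.credit_limit = credit_limit          # total open exposure allowed for the client
        self.max_order_value = max_order_value    # single-order notional cap
        self.price_band = price_band              # max fractional deviation from a reference price
        self.exposure = 0.0                       # running open exposure
        self.kill_switch = False                  # Participant or exchange can flip this to cut access

    def check(self, order: Order, reference_price: float) -> bool:
        """Return True if the order may be forwarded to the trading platform."""
        if self.kill_switch:
            return False                          # client's access has been removed
        notional = order.quantity * order.price
        if notional > self.max_order_value:
            return False                          # fat-finger / oversize order
        if self.exposure + notional > self.credit_limit:
            return False                          # would breach the sponsored credit limit
        if abs(order.price - reference_price) / reference_price > self.price_band:
            return False                          # price too far from the reference price
        self.exposure += notional                 # accepted: update open exposure
        return True

# Usage: a 10,000-share order at $25.10 against a $25.00 reference price.
gateway = RiskGateway(credit_limit=5_000_000, max_order_value=1_000_000, price_band=0.10)
order = Order("client-42", "XYZ", "buy", 10_000, 25.10)
print(gateway.check(order, reference_price=25.00))   # True: passes all filters
```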

Case 1: A recently released finding of the NYSE Hearing Board highlights the issue of testing of algorithmic trading programs. A firm was found to have breached NYSE rules by failing to "adequately supervise development, deployment and operation of proprietary algorithm, including failure to implement procedures to monitor certain modifications made to algorithm." An "unforeseen programming issue" stemming from revisions to a proprietary algorithm led to aberrant trading activity (see further below) which disrupted trading in 975 securities and delayed the close of trading.

Naked Sponsored Market Access


Sponsored access allows firms that are not members of exchanges to gain access to trading
venues using the market participant identification of a member broker-dealer, also known as a
sponsoring broker. This is similar to direct market access, although the access infrastructure used
by the sponsored participant is not necessarily owned by the sponsoring broker. One industry study estimates that sponsored access currently accounts for 50% of trading volume in the US equities market.
What do DMA users want?

1. Wider Access to sources of liquidity


With such a large proportion of future order flow arriving this way, and despite modest commission rates and only a modest prospect of offering value-added algorithmic trading, a DMA platform would seem a necessity for competing in dealing, at least in liquid securities. The level of investment needed to build a good DMA platform is substantial: a global solution would have several hundred concurrent FIX connections, a presence on all the major third-party order routing networks, and integration with the major buy-side order management systems, such as Charles River, MacGregor and Longview. While client connectivity is critical for the DMA provider, the attractiveness of DMA to the client is directly proportional to the breadth of markets that can be reached. Even for a European DMA platform this can require connections to 12 European markets.

2. Capacity
Capacity refers to the amount of data that can be sent or received electronically at the same time. For many DMA users it is a prerequisite that their trading system can reliably sustain
large volumes of data. For example, electronic market makers constantly check market prices and
adjust their quotes accordingly. Statistical arbitrageurs are constantly looking for mispricing
opportunities. At a very simplistic level, both strategies require the ability to send and receive
large amounts of data simultaneously, in the form of requesting and receiving current bids/offers,
sending their own bid/offer and then receiving confirmation that the trading engine has received
their order – and any subsequent amendment or cancellation of that order.
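
As a rough and purely illustrative back-of-the-envelope estimate (the symbol count and update rates below are assumptions, not figures from any venue), the message load of an electronic market maker can be sketched as follows:

```python
# Hypothetical capacity estimate for a two-sided electronic market maker.
symbols = 500              # instruments quoted (assumed)
updates_per_second = 5     # quote revisions per instrument per second (assumed)
sides = 2                  # a bid and an offer must both be maintained

outbound = symbols * updates_per_second * sides   # new/amend/cancel messages sent
inbound = outbound                                # acknowledgements received back
market_data = symbols * 20                        # assumed top-of-book updates consumed per second

total = outbound + inbound + market_data
print(f"outbound ~{outbound:,}/s, inbound ~{inbound:,}/s, market data ~{market_data:,}/s")
print(f"total ~{total:,} messages per second to sustain")
```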

3. Co-Location
Co-location is the practice of locating broker or client trading software and hardware in close
proximity to the trading platform’s trading engine. The aim is to minimize propagation and
transmission latency. Exchanges and third party providers charge a fee for offering co-location.

4. Latency
Latency refers to the amount of time taken to electronically send or receive information. High
frequency traders rely on very low latency (i.e. very quick data transmission) for their trading
strategies. References to trading in microseconds or milliseconds are references to latency. (Refer to the FX paper.)
Impact of DMA and Potential Threats

Liquidity Fragmentation and Order Proliferation


The impact of algorithmic trading and liquidity fragmentation in a smaller market such as
India has the potential to be greater than in a larger market with deeper liquidity. The
average trade size has been decreasing continually over time, as is evident from the following graph. In the last decade, financial markets globally have experienced the same trend: a
larger number of trades being executed, and decreasing trade size. This trend is largely
attributed to the increased use of algorithms. The buy and sell orders are entered into
algorithmic trading systems designed to achieve optimal execution. To reduce market
impact costs, large orders are typically broken down into small parcels and sprayed to all
available liquidity pools (which would fragment liquidity in our case). Orders that are
not immediately executed may be cancelled or replaced. Black box trading has the effect
of increasing overall trading volume as smaller trading opportunities are exploited.

[Graph: average trade size declining over time. Source: Algorithmic Trading and Market Access Arrangements, ASX Review]


Traditionally, the order-to-trade ratio for an exchange has been as low as 3:1. ASX currently exhibits a relatively low order-to-trade ratio of approximately 7:1, which is indicative of relatively low levels of HFT activity. Averages in the North American and European markets are in the region of 100:1 (with peaks of 250:1), and demonstrate the extent of order proliferation resulting from HFT/DMA. The driver of this change in the order-to-trade ratio is not clear. The differences between ASX's ratio and those of overseas trading platforms suggest that market microstructure plays a significant role in influencing the ratio. New trade execution rules, the introduction of additional trading venues, competitive pricing structures and competition for posting the best bid/offer at a given time to attract liquidity potentially all contribute to the higher order-to-trade ratios.
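
The ratio itself is simple arithmetic: total order messages (new orders, amendments and cancellations) divided by executed trades. A minimal sketch with purely hypothetical daily message counts:

```python
# Hypothetical daily message counts; the figures are illustrative only.
new_orders = 400_000
amendments = 250_000
cancellations = 50_000
trades = 100_000

order_messages = new_orders + amendments + cancellations
ratio = order_messages / trades
print(f"order-to-trade ratio = {ratio:.0f}:1")   # 7:1, comparable to the ASX figure cited above
```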

Trading Aberrations
One of the often-cited concerns with non-human trading is that an aberrant algorithm could disrupt an orderly market. These concerns generally stem from the 1987 stock market crash, and take two forms. The first is the risk of a 'feedback loop', whereby
an algorithm does not receive or correctly interpret messages from the trading engine,
resulting in a flood of orders and potentially unwanted trades or network capacity issues.
The second is the 'domino' scenario, where an algorithm submits orders into the
market, which in turn triggers other algorithms to submit orders, and these feed off each
other, ultimately moving prices away from their perceived value.

1. Feedback Loop:
The NYSE Regulatory action referred to in case 1 above is an example of an
algorithm not operating as intended, and creating a feedback loop. The algorithm
in question sent hundreds of thousands of 'cancel/replace' messages for orders that could not be located on the trading system, creating a large amount of message traffic which interfered with the operation of the market. The trading
platform responded to those messages with ‘reject-unmatched cancel’ messages.
However, the algorithm was not programmed to properly respond to the ‘reject-
unmatched cancel’ message and therefore continued to repeatedly re-send the
cancel/replace requests. The firm was unaware of the malfunction until advised by
NYSE Regulation.
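
A minimal sketch of the defect and a possible remedy is shown below; the message names mirror the 'cancel/replace' and 'reject-unmatched cancel' responses described above, but the loop-guard logic and the retry threshold are illustrative assumptions rather than the firm's actual fix.

```python
# Sketch of a cancel/replace loop guard. A naive algorithm that resends on every
# reject recreates the feedback loop; treating the reject as terminal breaks it.

MAX_RETRIES = 3

def handle_response(order_id: str, response: str, retries: dict) -> str:
    """Decide what to do with the exchange's response to a cancel/replace request."""
    if response == "ack":
        retries.pop(order_id, None)
        return "done"
    if response == "reject-unmatched-cancel":
        # The order is not on the book; resending the cancel/replace can never succeed.
        return "stop"                     # terminal condition: do NOT resend
    # Transient problem (e.g. throttling): retry a bounded number of times.
    retries[order_id] = retries.get(order_id, 0) + 1
    if retries[order_id] > MAX_RETRIES:
        return "alert"                    # escalate to a human instead of flooding the venue
    return "resend"

retries = {}
print(handle_response("ORD-1", "reject-unmatched-cancel", retries))  # stop
print(handle_response("ORD-2", "busy", retries))                     # resend (first retry)
```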

2. Domino Scenario:
The domino scenario has already been experienced by many exchanges over the
past decade, usually as a result of retail investor use of online broker stop loss
facilities. In some circumstances, particularly where investors have nominated the
same sale price (often a round-figure amount such as $3.00 or $3.50), a trade can
trigger one stop loss which in turn results in a sell order moving the price down,
which triggers other stop losses, leading to a rapid sell-down in that security.
Depending on the quantity of sell orders, the depth of the market, and the liquidity
of the security at that point in time, some stop loss sales may occur well below the
stop loss price nominated by the investor. The domino scenario is not confined to
stop-loss orders. In a recent US example of the domino effect, the price of a US-listed biotech was sold down from $24 to $7.50 in just over one minute, before
trading was halted. The price subsequently rebounded to previous levels later that
day.
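
A toy simulation illustrates how clustered stop-loss triggers can cascade; the fixed price-impact rule and the stop levels below are simplified assumptions, not a model of any real market.

```python
# Toy stop-loss cascade: each triggered stop sells into a thin book and pushes
# the price down by a fixed amount, which can trigger the next cluster of stops.

def cascade(last_price: float, stop_levels: list[float], impact_per_sale: float) -> float:
    """Return the price after every stop at or above the falling price has fired."""
    remaining = sorted(stop_levels, reverse=True)   # highest stops trigger first
    while remaining and last_price <= remaining[0]:
        remaining.pop(0)                            # this stop fires...
        last_price -= impact_per_sale               # ...and its sell order pushes the price lower
    return last_price

# Many investors have clustered their stops at round figures ($3.50 and $3.00).
stops = [3.50] * 5 + [3.00] * 5
print(f"{cascade(last_price=3.50, stop_levels=stops, impact_per_sale=0.12):.2f}")
# A single trade at $3.50 fires all ten stops; the price ends well below $3.00.
```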

3. Overload of Data:
Increases in the order-to-trade ratio pose operational and supervisory capacity issues, and result in additional costs for supervisors, brokers and clients. Essentially this is because there is a much larger quantity of data to manage. At worst, an overload of data could result in trading platform breakdown, as occurred on one Asian exchange in recent years. (See case 3 below.) In a supervisory context, the large increase in orders that are subsequently amended or cancelled rather than executed means that new forms of market surveillance are required to detect 'micro-manipulation' – i.e. manipulation of the bid/offer through entering, amending and deleting orders (a simple detection sketch follows the list of causes below). Supervisor, client and broker infrastructure and telecom costs increase because they must be able to handle, store and manage larger volumes of data.
The causes of data overloading are as follows:

• Direct electronic market access
• Lower transaction costs due to increased liquidity provider automation
• Increased competition amongst liquidity providers, multiplication of trading venues, and fragmentation of liquidity pools
• Explosive growth in hedge funds
• Rapid growth in black box and algorithmic trading
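
As an illustration of the new surveillance forms mentioned above, the following is a minimal sketch of how a supervisor might flag clients whose amend/cancel traffic dwarfs their executions. The event format and the 50:1 threshold are assumptions made for illustration only, not any regulator's actual rule.

```python
from collections import Counter

# Each event: (client_id, action) where action is "new", "amend", "cancel" or "trade".
# Flag clients whose non-executing message traffic dwarfs their executions.

def flag_possible_micro_manipulation(events, max_ratio: float = 50.0):
    messages, trades = Counter(), Counter()
    for client, action in events:
        if action == "trade":
            trades[client] += 1
        else:
            messages[client] += 1
    flagged = []
    for client, msg_count in messages.items():
        ratio = msg_count / max(trades[client], 1)     # avoid division by zero
        if ratio > max_ratio:
            flagged.append((client, ratio))
    return flagged

# Hypothetical feed: client "A" amends and cancels constantly but rarely trades.
events = (
    [("A", "amend")] * 200 + [("A", "cancel")] * 100 + [("A", "trade")] * 2
    + [("B", "new")] * 20 + [("B", "trade")] * 10
)
print(flag_possible_micro_manipulation(events))   # [('A', 150.0)]
```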

Case 2: New York Stock Exchange


A heavy day of stock trading on Feb. 27, 2007 resulted in clogged networks. Systems that
usually process an order in seconds were taking up to a minute and sometimes longer.
Soon after the monitoring firm Keynote noticed the slowdown, the Dow Jones system that calculates the
industrial average stopped receiving the underlying stock price information in a timely
manner, which led to a pricing backlog that took 70 minutes to be recognized. After it
was recognized by the exchange, an alternate redundant system was activated which
made it appear that the Dow Jones industrial average had dropped 178 points in one
minute and 240 points in three minutes. This incident clearly exposed the exchange's inability to cope with spikes in trading volume as electronic trading increased. Dow Jones had neither the trigger mechanisms to detect that things were going wrong nor the appropriate notification mechanisms. Furthermore, large swings in the
market trigger computerized algorithmic trading from any number of fund managers,
spitting out reams of orders at once. The NYSE's systems are designed to handle three to
four times the normal message traffic. However, traffic was already up before the Dow
trigger (thanks largely to market activity in China and comments by former Fed
Chairman Alan Greenspan), and the stocks hit immediately by algorithmic trading were
isolated rather than spread out across the market. That created a backlog of requests at the
NYSE's Designated Order Turnaround system, which routes orders to trading floor
specialists, causing further chaos by keeping orders in a queue or routing them to
different exchanges. Late in the day, the NYSE instituted trading curbs and even
suspended electronic trading altogether.

Case 3: Tokyo Stock Exchange


The Tokyo Stock Exchange faced two embarrassing breakdowns in 2005-06. The second breakdown occurred when the number of orders put into the market exceeded the total number of orders that could be safely processed. A raid on an internet-services company, compounded by an early announcement by exchange officials that order volume was expected to reach the system's capacity of 4.5 million trades a day, forced a breakdown of the system and brought the exchange to an early close. After normally
instantaneous trades started to take five minutes and more to complete, exchange officials
closed the market 20 minutes before the normal 3 p.m. closing time. This was the first
time the exchange had been forced to halt trading because of insufficient computer
capacity.

Market Manipulation:
FINRA and IIROC have recently identified the increasing fragmentation of the market, and the quotation of securities in multiple jurisdictions, as reasons why it is becoming harder to detect and investigate manipulative trading.

Macro Manipulation:
Macro-manipulation is used to refer to traditional share price manipulation. There
are various manipulative techniques – including ‘pump and dump’ (placing buy
orders to increase the price of a security and attract other traders, before selling at
the inflated price), ‘pooling’ and ‘churning’ (variations on wash trades, where
pre-arranged trades give the appearance of additional liquidity in a security), and
‘ramping’ (spreading incorrect information to increase a share price).
One concern in relation to share price manipulation relates to momentum
algorithms, which in certain circumstances could distort price discovery for a
security. Within a multi-market operator environment, where liquidity has been
fragmented and where maker-taker pricing encourages algorithms to ‘chase’ one
another to receive incentives, the risk of price distortion increases significantly.
Micro Manipulation:
The term ‘micro-manipulation’ was used by one regulator in discussions with
ASX about trading activity (i.e. entering, amending, cancelling orders) designed
to move or influence the bid/offer. This is a form of manipulation which is
arguably more likely to be exacerbated by HFT due to the speed and intelligence
with which HFTs can operate.
It is problematic to distinguish between trading activity that is manipulative, and
trading activity that is evidence of competition for the smartest, fastest, most
profitable algorithm. Public debate in the US has focused to a large degree on
whether certain types of HFT activity are manipulative. The SEC has sought
comment on a range of trading strategies in its January 2010 concept release. In
the Australian context, any new rules about market manipulation will presumably be made by ASIC, while any new laws are the responsibility of the Federal
Government. As a consequence of the transfer of market supervision from ASX to
ASIC, decisions as to what trading constitutes market misconduct will be made
solely by ASIC in the future.
Issues with Risk Management in a DMA Environment:

Although electronic trading fuels efficiency, reduces out-trades and generally offers traders more safeguards (courtesy of the clear audit trail it provides), the emphasis on speed can challenge traditional approaches to risk management.

Clearing firms traditionally assessed risk on a pre-trade basis—that is, as trades passed
through the firm's order routing infrastructure and before they reached an exchange. More
conventional ‘filtered’ forms of sponsored access use risk controls and connectivity that
are typically provided by third-party vendors or service bureaus recommended by the
sponsoring broker. For some European trading venues the order must pass through an additional risk management 'layer', managed by the venue, before being admitted for execution.

However, naked sponsored access does not include real-time risk monitoring by the
sponsoring broker. Filtered sponsored access allows firms to access markets in 550-750
microseconds, according to Aite Group, while naked access has the lowest range of
latency, from 250-300 microseconds. By comparison, a typical DMA service introduces between four and eight milliseconds of latency, which is quite unacceptable for 'pure' DMA clients. Thus, the risk assessment for pure DMA clients has to be done after the trade is executed. Such a situation is potentially quite dangerous. For instance, a quantitative
trading system might contain a programming error that could cause it to fall into a loop,
automatically resending the same order over and over to an exchange, hundreds of times
a second. If no one intervenes, a huge amount of money could be lost very quickly.
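
One common mitigation is a simple message-rate throttle on the broker or FCM side that halts a runaway strategy before the losses accumulate. The sketch below, with an assumed limit of 100 orders per second, is illustrative only and does not represent any firm's actual control.

```python
import time
from collections import deque

class RateThrottle:
    """Kill-switch that trips when a client exceeds a maximum order rate."""

    def __init__(self, max_orders_per_second: int = 100):
        self.max_rate = max_orders_per_second
        self.timestamps = deque()
        self.tripped = False

    def allow(self, now: float | None = None) -> bool:
        if self.tripped:
            return False
        now = time.monotonic() if now is None else now
        self.timestamps.append(now)
        while self.timestamps and now - self.timestamps[0] > 1.0:
            self.timestamps.popleft()                  # keep only the last second of activity
        if len(self.timestamps) > self.max_rate:
            self.tripped = True                        # looks like a runaway loop: cut access
            return False
        return True

# A looping algorithm resending the same order hundreds of times a second trips the switch.
throttle = RateThrottle(max_orders_per_second=100)
results = [throttle.allow(now=0.001 * i) for i in range(500)]
print(results.count(True), "orders passed before the kill-switch tripped")
```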

The current state of post-trade risk management no longer suffices, as it comes too late for a futures commission merchant (FCM) to take action when customers are trading in milliseconds. The speed of post-
trade cleared data varies among exchanges. The Chicago exchanges, Eurex and
Euronext.liffe have a reputation for being fast—essentially real-time. However, the
Chicago Mercantile Exchange's cleared trades can take up to eight minutes if there's a
logjam of trades. But even that beats Asian exchanges, which are typically slower. Eurex
is now contemplating setting "pre-execution position limits" on clients' trading for those
connecting directly to the exchange, while acknowledging that the additional latency
involves a tradeoff between execution efficiency and risk mitigation.

Another related problem is that sophisticated post-trade risk calculations may struggle to keep up with the volume of trading. And in the case of a rogue trader or an
algorithm that's malfunctioning, a clearing firm may not be able to cut off the client's
trading as quickly as necessary. Another ongoing risk management concern in a world of
direct access to exchanges involves unfilled customer orders. These orders must be risk
managed so an FCM can be comfortable with the trading activity and risk exposures of
its customers.

Euronext.liffe follows a different approach to risk management. It has effectively made members of clearing or non-clearing firms responsible for all the order flow that occurs under their individual trading mnemonic. Consequently, it is up to the exchange member firm to decide how many people use the mnemonic and how that trading occurs; thus, the exchange has kept "full responsibility" for all client orders entered through sponsored access with the members of the exchange, even if pre- or post-trade controls and measures are carried out in conjunction with other parties. More recently, Liffe Connect
developed a "master trader mnemonic" that enables the member firm's so-called
monitoring trader to log on and review in real time various sub-user mnemonics. If
necessary, the monitoring trader can stop trading by those users. This functionality,
which is currently being tested, would be included in the next version of the software that
Euronext.liffe rolls out.
