Algorithmic Trading on Interactive Brokers using Python

Consulting Services

Are you interested in developing an automated algorithm to trade on Interactive Brokers? Have a successful strategy already that you want automated in order to monitor a large number of data streams? Want your strategy backtested and optimized? We offer algorithmic trading consulting services for Interactive Brokers, including: algorithm implementation in python or C++, data analysis, backtesting and machine learning. Please get in touch to learn more!

About Interactive Brokers

Interactive Brokers is a large US-based brokerage firm dealing in stocks, ETFs, options, bonds and forex. Since 2021, it also offers cryptocurrency spot and futures trading.

Initial setup

Although Interactive Brokers also supports C#, C++, Java and VB, most people will probably prefer the convenience of python unless they are doing high frequency trading or their strategy is computationally demanding.

Of course, the first step is always to open an account with the broker which you can do here.

Getting set up to do algo trading on Interactive Brokers requires a few more steps than with many other brokers. You’ll need to download and install their python API and their Trader Workstation app (TWS). The latter must be running in the background while you run your algo from your favourite python IDE.

After creating an ordinary account with Interactive Brokers, TWS gives you the option to log in with a paper trading account to allow you to test your algo code without making live trades. Paper accounts (otherwise known as test or demo accounts) usually have the limitation that the order book is not simulated, so they are useful for testing your code and learning how to use the API, but not always for evaluating the profitability of your strategy.

Before trying to connect using your python IDE, make sure “Enable ActiveX and Socket Clients” is ticked under File > Global Configuration > API > Settings.

Python initialization

The next step is to start reading IB’s API documentation to learn about the functionality.

These are some import statements you want at the beginning of your code:

from ibapi.client import EClient
from ibapi.wrapper import EWrapper
from ibapi.contract import Contract
from ibapi.order import Order

And these statements will initialize the connection to the broker.

app = IBapi()
app.connect('127.0.0.1', 7497, 123)

The number 7497 is the socket port which can be found in the API settings in TWS mentioned above. The client ID can be an arbitrary positive integer. The EClientSocket class is used to send data to the TWS application, while the EWrapper interface is used to receive data from the TWS application.
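One practical detail worth adding: the EClient message loop must be running for any of the callbacks described below to fire, so app.run() is usually launched on its own thread straight after connecting. A minimal sketch (IBapi is the class defined in the next section):

import threading
import time

# Run the EClient message loop on a background thread so callbacks can fire
# while the rest of the script keeps executing.
api_thread = threading.Thread(target=app.run, daemon=True)
api_thread.start()

time.sleep(1)  # give the connection a moment to establish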

Historical and streaming data

Using Interactive Brokers’ API is slightly more complicated than for some other exchanges. As they explain in the API documentation here, the EWrapper interface needs to be implemented/overridden by you to specify what should happen with the data you request. So certain functions in the IBapi class will need to be overridden to send the data where you want it to go. For example, to request historical data, I like to override the historicalData function like this:

class IBapi(EWrapper, EClient):
    def __init__(self):
        EClient.__init__(self, self)

    def historicalData(self, reqId, bar):
        data.append(bar)

data = []

def grab_historical_data():
    app.reqHistoricalData(1, eurusd_contract, '', '90 D', '1 day', 'MIDPOINT', 0, 2, False, [])
    return data

This will grab daily price data for the last 90 days. It appears to use business days rather than calendar days. The interval of 90 days is calculated from the prior day’s close, so the last datapoint should be yesterday’s. Note that each day’s data contains open, close, high and low values. The variable eurusd_contract is actually a contract object which we’ll explain shortly.
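If you prefer working with the results as a pandas DataFrame, the bar objects collected by the callback can be unpacked once the request has completed. A sketch (assuming the message loop is running on a background thread as described earlier, and using a crude sleep to wait for the callbacks):

import time
import pandas as pd

bars = grab_historical_data()
time.sleep(2)  # wait for the historicalData callbacks to populate the list

# Each bar carries date, open, high, low and close attributes.
df = pd.DataFrame([{'date': b.date, 'open': b.open, 'high': b.high,
                    'low': b.low, 'close': b.close} for b in bars])
print(df.tail())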

If instead of getting historical data you wish to stream the latest price data as it becomes available, you want to override the tickPrice function and use the function reqMktData as follows.

class IBapi(EWrapper, EClient):
    def __init__(self):
        EClient.__init__(self, self)

    def tickPrice(self, reqId, tickType, price, attrib):
        if tickType == 2 and reqId == 1:
            print('The current ask price is: ', price)
            latest_ask.append(price)

latest_ask = []
app.reqMktData(1, eurusd_contract, '', False, False, [])

The tickType of 2 here filters for the ask price. If you want bid, high or low you can use 1, 6 and 7 respectively.

Creating contracts and placing orders

Contract objects specify the underlying for which you wish to obtain price data or place orders. They can be created as follows:

eurusd_contract = Contract()
eurusd_contract.symbol = 'EUR'
eurusd_contract.secType = 'CASH'
eurusd_contract.exchange = 'IDEALPRO'
eurusd_contract.currency = 'USD'

In order to place an order on a given contract you must also create an order object, such as this:

eurusd_order = Order()
eurusd_order.action = 'SELL'
eurusd_order.totalQuantity = 500
eurusd_order.orderType = 'LMT'
eurusd_order.lmtPrice = 1.1389

To place the order, you can use the placeOrder function:

app.placeOrder(1, eurusd_contract, eurusd_order)

The first number is an arbitrary integer used to ID the order.
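In practice, rather than hard-coding this ID, TWS tells you the next usable order ID via the nextValidId callback, which fires automatically on connection and on request via reqIds. A sketch of how you might track it (add the callback to the IBapi class, and initialise self.nextorderId = None in __init__):

# Inside the IBapi class: store the next usable order ID whenever TWS sends it.
def nextValidId(self, orderId):
    self.nextorderId = orderId

# When placing an order:
app.reqIds(-1)                 # ask TWS to send the next valid ID
# ... wait briefly for the callback to set app.nextorderId ...
app.placeOrder(app.nextorderId, eurusd_contract, eurusd_order)
app.nextorderId += 1           # increment locally for subsequent orders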

Cryptocurrency Derivatives – Options and Futures

While cryptocurrency exchanges have been offering various kinds of delta one derivatives for a number of years now (such as perpetual futures), the availability of vanilla European call/put options (let alone more complex derivatives) is still nascent. Understandably, traders entering the crypto space would like access to the same tools that they are used to in more traditional and developed markets. Even Goldman Sachs is onboard with the development of a bitcoin options market. Yet, the high volatility of cryptocurrencies produces some unique challenges for the creation of derivatives markets.

See also our main article on cryptocurrency consulting services.

Perpetual futures contracts

Perpetual futures (also called perpetual swaps) on crypto underlyings like Bitcoin are a derivative product first offered by Bitmex and now offered by many cryptocurrency exchanges. Cryptocurrency exchanges typically offer them with up to 100x leverage. While they are similar in some ways to ordinary futures contracts, there are some significant differences.

Firstly, and giving rise to the name, they have no expiry and can instead be closed out at any time by the holder. They could also be closed out by the exchange if the holder gets liquidated, which we’ll discuss shortly.

Secondly there is a mechanism called the funding rate. In an ordinary futures market, the futures price is always related to the spot price. This is ensured by the fact that the futures price converges to the spot price as time approaches the expiry. Since a perpetual futures market has no expiries, this characteristic, if desired, must be created artificially. The funding rate is set by the exchange and is used to ensure the futures price does not diverge too far from the spot market. It is paid directly between market participants and not to the exchange. When the futures price is above the index price, the rate is positive, and traders long perpetual futures must pay the funding rate to those who are short. When the futures price is below the index price, the rate is negative, shorts must pay longs. Note that the index price here could be a weighted average of the spot price among multiple exchanges. In many cases the funding rate is paid every eight hours.
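As a concrete illustration with entirely hypothetical numbers, the funding payment for one interval is just the position notional multiplied by the funding rate:

# Hypothetical example: long $10,000 notional of a perpetual future,
# funding rate of 0.01% paid every 8 hours.
position_notional = 10_000    # USD value of the long position
funding_rate = 0.0001         # 0.01% per 8-hour funding interval

payment = position_notional * funding_rate
print(payment)   # 1.0, i.e. the long pays $1 to the shorts this interval
# Over a full day (three funding intervals) that is $3, assuming the rate holds.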

Bitmex also offers a so-called inverse perpetual futures contract.

Liquidation

The high volatility of cryptocurrencies combined with the high leverage offered by many exchanges creates challenges for the operation of margin accounts. Because of this, crypto exchanges tend to liquidate positions well before the participants actually run out of margin. If they are able to close out the position at better than the bankruptcy price, this extra money goes into an insurance fund. This is a buffer the exchange uses to ensure it is able to pay traders who have profited from price moves.

Options – calls and puts

Cryptocurrency exchanges are increasingly interested in branching out into options. While some exchanges already offer versions of European and American call/put options, other exchanges are rapidly trying to develop them. More exotic options such as barrier options and Asian options will presumably become common eventually.

It’s well-known that vanilla option prices increase with increasing volatility. This is because higher volatility means increased upside potential, yet the holder is protected from the increased downside risk by the optionality. Thus crypto options are expected to have considerably higher premiums than those on equity or FX markets.

As a few examples of the current options offerings of various exchanges:

  • Binance offers American options on BTCUSDT futures with expiries from ten minutes up to one day (note that USDT is a cryptocurrency with value tethered to the US dollar). Binance offers only ATM call and put options, that is, there is only one available strike which is equal to the most recent traded perpetual futures price.
  • Deribit offers European options with a variety of strikes and one expiry per week.
  • Bitmex does not currently provide options, but is keen to develop this capability.

How to price cryptocurrency options

A reasonable starting point for pricing European options on cryptocurrencies is the Black-Scholes framework. In the case of American options, one can apply the binomial tree method. In either case, all of the pricing parameters such as spot and time to expiry are straightforward to determine except one – the volatility.
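As a sketch of that starting point, here is a standard Black-Scholes European call price in python. The only non-trivial input is the volatility, discussed below; the spot, strike, expiry and rate used here are purely illustrative:

import numpy as np
from scipy.stats import norm

def black_scholes_call(S, K, T, sigma, r=0.0):
    # Standard Black-Scholes European call price with continuous rate r.
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Hypothetical BTC option: spot 50,000, strike 55,000, 30 days, 80% vol.
print(black_scholes_call(S=50_000, K=55_000, T=30/365, sigma=0.80))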

As is well-known, in developed options markets including equity, FX and interest rate options, the volatility is not really an input but is inferred from existing market prices. When calculated in this way, the volatility is inconsistent between options of differing strike and expiry, leading to a smile or volatility surface.

However, in the case of a nascent cryptocurrency options market, the market is unlikely to be sufficiently liquid, especially at the beginning, to obtain these market prices. In fact, many crypto exchanges do not allow options to be traded. Instead, the exchange functions as a market maker and simply sets the price itself.

Of course, one can always use an empirical calculation of historical volatility as a starting point, but a smile of some kind would need to be imposed upon it. One way to do this would be to start with the smile for an equity or FX rate which is believed to be in some sense similar to bitcoin, and scale it by the ratio of the cryptocurrency’s empirical volatility to the reference market’s ATM vol.
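A rough sketch of that scaling idea, with entirely hypothetical numbers:

import numpy as np

# Hypothetical reference smile (e.g. from an FX pair), as moneyness -> vol.
reference_moneyness = np.array([0.8, 0.9, 1.0, 1.1, 1.2])
reference_smile = np.array([0.22, 0.18, 0.16, 0.18, 0.23])

reference_atm_vol = reference_smile[2]   # ATM vol of the reference market
crypto_hist_vol = 0.80                   # empirical BTC volatility (assumed)

# Scale the whole reference smile so its ATM level matches the crypto vol.
crypto_smile = reference_smile * (crypto_hist_vol / reference_atm_vol)
print(dict(zip(reference_moneyness, crypto_smile.round(3))))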

Introductory Guides to Algorithmic Trading

Each exchange (or broker) provides slightly different services and features to traders wishing to automate their strategies as algos.

  • Some exchanges provide their own high-level trading language allowing people unfamiliar with conventional coding languages to implement and test simple algorithms. I don’t favor this approach as I would rather take advantage of the powerful features of a language like python.
  • Some exchanges provide a simple API key allowing you to interface with the exchange from your favorite language. Others require that you download additional software in order to interface (and authenticate) with the exchange.
  • Some exchanges provide a test account allowing you to test your code without having to risk money on live trades.
  • Some exchanges do not allow customers to connect to their API unless they meet certain requirements. For example, TradeStation requires that your account have $10,000 of cash deposited before they will email you your API key. This is very bothersome if you initially intend to just develop and test your algorithm, and only invest your money at an appropriate time in the future.

Here we provide a few exchange specific guides, outlining how to get started interfacing with the exchange, grabbing price histories and posting buy/sell orders:

Algo Trading Crypto on Binance Using Python

Consulting Services

Are you interested in developing an automated algorithm to trade crypto on Binance? Have a successful strategy already that you want automated in order to monitor a large number of data streams 24/7? Want your strategy backtested and optimized? We offer algorithmic trading consulting services for spot, futures and option trading on Binance, including: trading bot implementation in python or C++, data analysis, backtesting and machine learning. Please get in touch to learn more!

About Binance

Binance is one of the world’s largest cryptocurrency exchanges, offering:

  • Spot trading on around 100 digital currencies including Bitcoin and Ethereum
  • Up to 125x leverage on perpetual futures contracts
  • At the money American call and put options with 5 minute to 1 day expiries

Note that Binance has been banned by regulators in some countries such as the US and the UK due to concerns about the compliance of cryptocurrency exchanges with anti-money-laundering laws (competitor Bitmex is in a similar situation). Binance.US is an alternative which is designed to comply with US regulations.

Crypto exchanges are keen to develop cryptocurrency derivative products such as futures and European or American options. But keep in mind that some countries like Australia, Germany, Italy and the Netherlands only allow trading in spot, as they have banned derivatives including futures, options and leverage. Regulators are concerned that retail investors may be unaware of the risk involved in derivative products given the high volatility of cryptocurrencies.

Initial setup

Since cryptocurrency markets do not close overnight, algorithmic trading using a crypto bot is the only practical way to monitor your positions 24/7.

First you need to make sure you have an installation of python. I recommend downloading the Anaconda distribution which comes with the Spyder IDE. In addition, you’ll want the python library python-binance, which can be obtained by opening an anaconda prompt from the start menu and typing

pip install python-binance

In addition, an API key is needed to give your installation of python permission to trade on your Binance account. After creating a Binance account, simply go to the profile icon in the top right corner and click "API Management". Then just place these lines at the top of your python code:

from binance import Client, ThreadedWebsocketManager, ThreadedDepthCacheManager

client = Client(api_key, api_secret)

Here, api_key and api_secret are of course the two keys you obtained from Binance, stored as python strings.
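Hard-coding the keys in a script is risky if you ever share or commit the code. A common alternative (a sketch, assuming you have stored the keys in environment variables whose names are your own choice) is to read them at runtime:

import os
from binance import Client

# Read the keys from environment variables instead of embedding them in code.
# The names BINANCE_API_KEY / BINANCE_API_SECRET are our own convention.
api_key = os.environ['BINANCE_API_KEY']
api_secret = os.environ['BINANCE_API_SECRET']

client = Client(api_key, api_secret)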

Backtesting data

Binance market data for backtesting purposes can be downloaded here. Spot and futures data are available in three file types: AggTrades, Klines and Trades. Note that the raw CSV files come without header rows.
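Since the raw CSV files have no header row, you can supply the column names yourself when loading them. As a sketch for the Klines files, using what we believe to be the standard Binance kline column layout (treat the names, and the example filename, as assumptions to check against the documentation):

import pandas as pd

# Assumed column layout of a Binance Klines CSV (verify against the docs).
kline_columns = [
    'open_time', 'open', 'high', 'low', 'close', 'volume', 'close_time',
    'quote_asset_volume', 'number_of_trades', 'taker_buy_base_volume',
    'taker_buy_quote_volume', 'ignore']

df = pd.read_csv('BTCUSDT-1m-2021-09.csv', header=None, names=kline_columns)
df['open_time'] = pd.to_datetime(df['open_time'], unit='ms')
print(df.head())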

Basic commands to get you started

From there, one can start reading the Binance API documentation to learn the basic commands.

To get the current price of BTCUSDT you can use

client.get_symbol_ticker(symbol="BTCUSDT")
Out: {'symbol': 'BTCUSDT', 'price': '51096.07000000'}

If you want to receive an updated price only when it has changed, you can stream prices by creating a threaded websocket manager. The function “update_price” defines what to do whenever some new information “info” is received from the exchange. In this case it appends it onto a vector of historical prices and prints it out to the console.

btc_history = []

def update_price(info):
    btc_latest = {}
    btc_latest['last'] = info['c']
    btc_latest['bid'] = info['b']
    btc_latest['ask'] = info['a']
    btc_history.append(btc_latest)
    print(btc_history[-1])

twm = ThreadedWebsocketManager()
twm.start()
twm.start_symbol_ticker_socket(callback=update_price, symbol='BTCUSDT')
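The stream runs on its own thread, so your script keeps executing after the call above. When you are finished with it (or in a shutdown handler), stop the manager:

import time

time.sleep(30)   # let the stream run for a while (example only)
twm.stop()       # shut down the websocket manager and its thread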

To buy/sell, simply use client.create_order:

order = client.create_order(
    symbol='BTCUSDT',
    side='BUY',
    type='LIMIT',
    timeInForce='GTC',
    quantity=100,
    price=51097)
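While developing, python-binance also exposes Binance’s test endpoint via create_test_order, which validates the order parameters without executing anything. A sketch with deliberately small, hypothetical values:

# Validate order parameters against the test endpoint; nothing is executed.
client.create_test_order(
    symbol='BTCUSDT',
    side='BUY',
    type='LIMIT',
    timeInForce='GTC',
    quantity=0.001,
    price=51097)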

Market Making Algorithms and Models

Our PhD quant consulting service can canvass the academic literature on market making models for you, and help you design, backtest and optimize your strategy. Contact us to let us know how we can supercharge your trading.

A market maker provides liquidity to the market by standing ready to both buy and sell an asset at stated bid and ask prices. They are common for both Forex markets and stock exchanges, and there are even many firms acting as market makers for bitcoin and other cryptocurrencies. The value of market making to traders is that they are able to execute trades immediately, rather than having to wait for a matching order to appear. In exchange, the market maker generates a profit by setting an appropriate spread between the bid and ask prices. A market making algorithm must determine appropriate bid and ask prices to maximise profits. There are two trade offs that a market maker must consider when trying to achieve optimal market behaviour.

Firstly, there is a trade off between volume and margin. If the market maker’s bid ask spread is too conservative, few of his trades will be fulfilled. On the other hand, if his spread is too aggressive, many trades will be fulfilled but he will make very little money from each trade. So the bid ask spread must be sufficiently attractive to other market participants while still remaining profitable for the market maker.

Secondly, while market makers can profit from the bid ask spread, they are exposed to risk due to price changes on the inventory of the asset that they must hold. If the price drops, the inventory may have to be sold at less than it was acquired for. The market maker must therefore design a quoting algorithm which optimally sets bid and ask prices to generate a profit, while also minimising inventory risk. A market maker may hope to buy and sell in approximately equal quantities to avoid accumulating a large inventory. Market making algorithms are relevant not just to genuine market makers, but to any market participant that both buys and sells an asset. One mechanism a market making algorithm can use to reduce inventory risk is to provide more conservative bid estimates when it is already long a significant inventory.

Market making strategies differ from more general trading strategies in that the latter may take on a large position based on some view of the direction the market will move in, while the market maker attempts to avoid this risky bet as much as possible.

See also our pages on optimal execution algorithms and algorithmic trading consulting services.

The Avellaneda-Stoikov model

The Avellaneda-Stoikov model is a simple market making model that can be solved for the bid and ask quotes the market maker should post at each time \(t\).

We consider the case of a market maker in a single asset whose price trajectory \(S_t\) evolves under Brownian motion

\[ dS_t = \sigma dW_t.\]

While this implies a normally distributed price rather than lognormally distributed, the difference is not significant over small time horizons where \(S_t\) does not move too much from its original value.

Let \(S_t^b\) and \(S_t^a\) represent the bid and ask quotes of the market maker at time \(t\), and let \(N_t^b\) and \(N_t^a\) represent the total number of market participants who have bought and sold from the market maker respectively. The model assumes that buyers arrive to purchase from the market maker at random, with an average frequency that decreases as the ask price \(S_t^a\) rises further above \(S_t\). Similarly, sellers arrive to sell to the market maker with an average frequency that decreases as the bid price \(S_t^b\) drops further below \(S_t\). This means that the more conservatively the market maker sets his bid and ask quotes, the less likely he is to make trades.

Furthermore, the model assumes that the market maker must keep his inventory \(q_t\) between some values \(-Q\) and \(Q\). He does this by not posting a bid quote when his inventory reaches \(Q\), and similarly for an ask quote.

For simplicity, the model assumes that each buyer purchases exactly one unit. Since the market maker earns \(S_t^a\) whenever a buyer arrives, and spends \(S_t^b\) whenever a seller arrives, his cash account satisfies the equation

\[dX_t = S_t^a dN_t^a - S_t^b dN_t^b.\]

We assume that the market maker wishes to optimize his behavior over some time interval \([0,T]\). We want to find functions of time \(S_t^b\) and \(S_t^a\) which maximise the expected value of his final holdings of cash and inventory

\[X_T + q_TS_T.\]

However, in such problems it is also typical to penalise the variance of this quantity in the optimization to factor in risk aversion. One can optimize such a function using stochastic control theory. For the exact form of the solutions and for more details see The Financial Mathematics of Market Liquidity by Gueant.
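For reference, the approximate closed-form quotes from the original Avellaneda-Stoikov paper are often written in terms of a reservation price and an optimal spread. A sketch with hypothetical parameters (here \(\gamma\) is the risk-aversion parameter and \(k\) the order-arrival decay parameter; check the exact formulas against Gueant or the original paper before relying on them):

import numpy as np

def avellaneda_stoikov_quotes(s, q, t, T, sigma, gamma, k):
    # Reservation price: the mid price shifted against the current inventory q.
    reservation = s - q * gamma * sigma**2 * (T - t)
    # Total optimal spread placed symmetrically around the reservation price.
    spread = gamma * sigma**2 * (T - t) + (2 / gamma) * np.log(1 + gamma / k)
    return reservation - spread / 2, reservation + spread / 2

# Hypothetical parameters: long 5 units of inventory, one day of trading left.
bid, ask = avellaneda_stoikov_quotes(s=100.0, q=5, t=0.0, T=1.0,
                                     sigma=2.0, gamma=0.1, k=1.5)
print(bid, ask)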

Optimal Liquidation Algorithms – the Almgren-Chriss Model

Unwinding or liquidating a position is a trade off. Liquidate too quickly and you may suffer price slippage as the market order walks the book. Liquidate too slowly with more conservative limit orders, and you are exposed to the risk of adverse price moves. The concept of splitting a large order into a number of smaller orders to be executed over a certain time period is well-known to traders. Exchanges and many other market participants are therefore motivated to develop liquidation algorithms which behave optimally. In this post we’ll discuss the Almgren-Chriss model. For more details consult The Financial Mathematics of Market Liquidity by Gueant.

We assume a trader wants to unwind a position of \(q_0\) shares in a time interval \([0,T]\). Writing \(q_t\) for the trader’s inventory at time \(t\), we write

\[dq_t = v_t dt, \]

where \(v_t < 0\) is the rate of liquidation. If the trades were exercised in a finite number of discrete blocks, then \(v_t\) would be a sum of delta functions, for example. The mid price of the stock is modelled as

\[ dS_t = \sigma dW_t + kv_t dt\]

for \(k>0\). The first term is simply Brownian motion, although note that the decision is made for simplicity to assume a normally distributed price instead of the usual lognormally distributed price. The second term means that the price drops by an amount proportional to the number of stocks our trader executes. This is the permanent market impact.

But the most significant equation here is the equation representing how the rate of liquidation \(v_t\) affects the price obtained for the shares. This is the instantaneous part of the market impact, which in the model has no permanent impact on the market price. We assume that the price obtained for the shares executed at time \(t\) is

\[S_t + g\left(\frac{v_t}{V_t}\right),\]

where \(V_t\) represents the total market volume and \(g<0\) when \(v_t < 0\). The choice of increasing function \(g\) is actually the key to the model. It quantifies how much worse the average price obtained for the shares traded at time \(t\) is when the rate of liquidation \(v_t\) is higher (i.e. more negative). The original model of Almgren and Chriss chose the function \(g\) to be linear. This means that if the trader liquidates twice as many shares at time \(t\), the average price obtained for those shares will be twice as far from the mid price. The cash earned by the trader is then simply the number of shares liquidated multiplied by the average price obtained, i.e.

\[dX_t = -v_t\left( S_t + g\left(\frac{v_t}{V_t}\right) \right) dt.\]

If the midprice were assumed to be close to constant over time, the optimal strategy would be to liquidate as slowly as possible. This would mean that the shares would all be sold at close to the mid price. However, liquidators are not only unwilling to wait forever, but also typically wish to liquidate the portfolio at close to the current market price. Liquidating over a longer time interval means that the price may fluctuate away from the current price. Some kind of “risk appetite” consideration must therefore be included in the model.

This requirement is not actually encoded in the differential equation for \(X_t\) above. Rather, it is encoded in the quantity we wish to optimize. The way this is done is to not simply optimize the final cash holding \(X_T\), but also to penalise its variance. This can be done by choosing the function to be optimized as something like \(\mathbb{E}(X_T) – \frac{\gamma}{2} \mathbb{V}(X_T)\) or \(\mathbb{E}(-e^{- \gamma X_T})\), for some constant \(\gamma > 0\). How much one penalises variance by choosing \(\gamma\) is essentially an arbitrary decision in the model. Of course, longer trading horizons give rise to more variance in \(X_T\) because \(S_t\) becomes less predictable when allowed more time to drift. Thus this parameter will determine the rate of liquidation based on risk appetite.

Finding the optimal trading strategy \(q(t)\) is a variational problem which requires minimising the function

\[J(q) = \int_0^T{\left(V_tL\left(\frac{q'(t)}{V_t}\right) + \frac{1}{2} \gamma \sigma^2 q(t)^2\right)dt},\]

where \(L(\rho) = \rho g(\rho) \).
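For the classic case of a linear impact function and a mean-variance objective, the optimal inventory trajectory has a well-known closed form, a hyperbolic-sine schedule. A sketch with hypothetical parameters (treat the exact constant inside \(\kappa\) as an assumption to check against Gueant):

import numpy as np

def almgren_chriss_trajectory(q0, T, sigma, gamma, eta, n_steps=100):
    # q0: initial inventory, T: horizon, sigma: volatility,
    # gamma: risk aversion, eta: linear temporary impact coefficient.
    kappa = np.sqrt(gamma * sigma**2 / eta)   # "urgency" parameter (assumed form)
    t = np.linspace(0.0, T, n_steps)
    # Higher kappa (more risk aversion or cheaper impact) front-loads the selling.
    return t, q0 * np.sinh(kappa * (T - t)) / np.sinh(kappa * T)

t, q = almgren_chriss_trajectory(q0=1_000_000, T=1.0, sigma=0.3,
                                 gamma=1e-6, eta=1e-7)
print(q[:5])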

Gueant also discusses several extensions of the model, including:

  • Incorporating a drift term into the equation for the evolution of the stock price to allow the trader an opinion on the future trajectory of the stock
  • Placing a lower and/or upper bound on the liquidation rate
  • Considering the liquidation of portfolios of multiple stocks

The Almgren-Chriss model implemented in practice

If you attempted to implement the Almgren-Chriss model in practice, there are a number of issues that would arise. In particular, you would need to specify the parameters of the model, which may be difficult to determine.

The first is the shape of the market impact function, which represents the manner in which the price moves as you execute a certain volume of the asset. A simple assumption is a linear market impact function. However, it depends on the structure of the order book, which could take many different shapes, and may change over time. If you have access to the order book data, you could investigate whether the order book shape is sufficiently constant over time to warrant doing some kind of backtest/fitting. But your execution strategy would cease to be optimal if the shape of the order book deviated from your assumptions. And if you don’t have access to the order book data, this is going to be much harder.

The second is the risk appetite parameter, or how much one penalizes the variance in the final PnL. There are two competing factors in the optimal solution. First, the slower you liquidate the better the price you get. Second, the slower you liquidate the more likely the price will move. It’s pretty much arbitrary how to choose to balance these two competing factors. And, of course, there may be other reasons why you need to liquidate your entire inventory within a certain amount of time, regardless.

The third is your view on the likely future movement of the asset. Clearly, this will have a profound impact on your execution strategy. For example, if you believed the price was going to drop significantly soon, you’d want to use a high rate of liquidation to make sure you had liquidated your inventory before the asset drops too much. But if you had no view on the future asset trajectory, you could neglect this issue.

And finally, something not considered in the model is the need to make sure your execution strategy is unpredictable so other market participants can’t anticipate your trades. A predictable rate of execution is a great way to get taken advantage of.

Despite the above, studying this model is a great way to clarify your thinking before designing an execution strategy suited to your own specific application.

Volatility smoothing algorithms to remove arbitrage from volatility surfaces

Need help building a volatility smoothing algorithm? Our quant consulting service can help. Contact us today.

See also our article on generating volatility surfaces from options data in C++.

Implied volatility surfaces and smiles constructed by fitting a cubic spline to raw market data may contain arbitrage. In fact, even if the market data points used do not contain arbitrage, cubic interpolation between data points may introduce it. It is therefore usually desirable to find the best fit of a cubic spline to the data points, under the restriction that the result be arbitrage free. Unlike the basic interpolation approach, the spline need not pass through the data points. This is called volatility smoothing.

We recommend the approach of M. R. Fengler in his paper Arbitrage-Free Smoothing of the Implied Volatility Surface. Instead of fitting a spline to the graph of volatility vs moneyness, Fengler uses call price vs moneyness. An advantage of this is that the no-arbitrage restrictions take a simpler form in terms of call price.

The surface fitting is done using a least-squares fit, subject to a number of constraints. The heart of the algorithm is therefore a constrained quadratic optimization procedure. In python, this can be achieved using scipy.optimize.minimize with the parameter method='SLSQP'. The mathematical difficulty is mainly around understanding the constraints and implementing them accurately.
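A highly simplified sketch of that constrained optimization step is below. It fits call prices that are non-increasing and convex in strike (the core no-arbitrage conditions) as closely as possible to the observed prices in a weighted least-squares sense; Fengler’s full method fits a smoothing spline, so this only illustrates the idea:

import numpy as np
from scipy.optimize import minimize

def smooth_call_prices(strikes, prices, weights, smoothing=1e-4):
    # Weighted least-squares fit plus a small roughness penalty.
    def objective(c):
        return np.sum(weights * (c - prices)**2) + smoothing * np.sum(np.diff(c, 2)**2)

    constraints = [
        # Call prices must be non-increasing in strike.
        {'type': 'ineq', 'fun': lambda c: -np.diff(c)},
        # Call prices must be convex in strike (no butterfly arbitrage).
        {'type': 'ineq', 'fun': lambda c: np.diff(c, 2)},
    ]
    res = minimize(objective, prices, method='SLSQP', constraints=constraints)
    return res.x

# Hypothetical raw call prices containing an out-of-line point.
strikes = np.array([80.0, 90.0, 100.0, 110.0, 120.0])
prices = np.array([21.0, 12.5, 9.0, 2.8, 1.0])
print(smooth_call_prices(strikes, prices, np.ones_like(prices)))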

We’ve implemented Fengler’s algorithm in python. The algorithm runs very quickly on a single vol surface. However, since historical volatility data has, for each date, a large number of vol surfaces (one for each tenor), the number of surfaces to be processed can easily proliferate into the millions. In this case one may wish to consider a C++ implementation or at least a multicore implementation in python.

To illustrate the algorithm, we start with 8 pillar points (moneyness/volatility pairs) which make up the raw data of a vol surface. We’ve deliberately chosen data which contains significant arbitrage. We’ve calculated the Black-Scholes call prices corresponding to these points and plotted them as the blue dots in the below graph.

The orange line is the arbitrage free cubic spline generated by our implementation of Fengler’s approach. You can see that it very effectively solves the problem of the out of place at-the-money data point which is entirely inconsistent with an arbitrage free surface.

We can also convert the call prices back to implied volatilities, yielding the following graph. For this graph, we have simply joined the data points by straight lines for illustration purposes.

We found we had to make one addition to Fengler’s approach as described in his paper. Fengler considers a set of weights for each data point in the fitting. We found we had to weight each data point by 1/vega to achieve an accurate result. This is because at the wings of the volatility surface, where vega is very small, a small change in call price corresponds to a huge change in volatility. This means that when converting the fitted call prices back to volatilities, the surface will otherwise be a very poor fit in the wings.

Fengler’s paper is not limited to one dimensional volatility surfaces (that is, smiles). It can also be used for two dimensional volatility surfaces which incorporate both moneyness and maturity. His paper details how to extend the method to include maturity.

We provide volatility smoothing consulting, along with a wide range of quantitative finance consulting services.

You may also wish to check out our article on converting volatility surfaces between moneyness and delta.

Does barrier option valuation depend on volatility and interest rate term structure?

It’s well-known that vanilla option valuation does not depend on the term structure of volatility and interest rates. This means that the price depends only on the average volatility and average interest rate between the valuation date and maturity, not on how those quantities are distributed within the interval.

A way to visualize this and understand it intuitively is as follows. Consider a large set of paths of the underlying which have been generated by a Monte Carlo routine. The value of the option is the average over all paths of the quantity \(\max(S(T) - K, 0)\). Now, imagine stretching and compressing the paths in different places as if they were plasticine, corresponding to concentrating volatility more in some places than others. It’s as if the underlying were moving faster in some regions, and slower in others, yet \(S(T)\) remains the same for each path. Thus, the price remains the same.

Interest rates affect the underlying’s drift term. Yet, as for volatility, \(S(T)\) depends only on the total proportional increase that the drift term bestows on the underlying, not on where in the interval this increase occurs.
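A quick numerical illustration of this point (a sketch with illustrative parameters): price the same vanilla call by Monte Carlo under two volatility term structures that share the same total variance, and the prices agree to within sampling error.

import numpy as np

rng = np.random.default_rng(0)

def mc_call_price(vols, S0=100.0, K=100.0, T=1.0, n_paths=200_000):
    # Vanilla call under zero rates, with the interval split into
    # len(vols) periods of piecewise-constant volatility.
    dt = T / len(vols)
    S = np.full(n_paths, S0)
    for vol in vols:
        z = rng.standard_normal(n_paths)
        S *= np.exp(-0.5 * vol**2 * dt + vol * np.sqrt(dt) * z)
    return np.mean(np.maximum(S - K, 0.0))

# Same total variance (0.2**2 over one year) distributed differently in time.
print(mc_call_price([0.2, 0.2]), mc_call_price([np.sqrt(2) * 0.2, 0.0]))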

What about barrier options? There are a few cases to consider.

First, we consider the case of a full barrier option. This means that the barrier is monitored for the full length of the deal from the valuation date to maturity, as opposed to only being monitored for a subset of it. We also assume that the underlying’s drift term is zero (this typically occurs when interest rates are zero, for example). In this case, valuation is actually still independent of volatility term structure. This can be understood by realizing that stretching or compressing the paths in different places does not change whether they breach the barrier, but only when they breach the barrier. Thus whether a given path has knocked-in or knocked-out remains unchanged.

Next, we consider the case of a partial or window barrier option. This means that the barrier is only monitored some of the time, with the monitoring period starting after the valuation date and/or ending before maturity. We still assume that the underlying drift is zero. As mentioned above, while a different volatility term structure does not change whether a path breaches the barrier, it does change when it does. Thus, it can affect whether the path breaches the barrier inside the monitoring window or outside, thus changing whether the path knocks in/out or not. Thus, for partial and window barrier options, valuation is not independent of volatility term structure.

Finally, let’s consider the case of a non-zero drift term. In this case, valuation is not independent of volatility or interest rate term structure, regardless of whether it is a full barrier option or a partial/window barrier option. To understand this, consider that the movements in the underlying due to volatility are proportional to the current underlying price. If the underlying is monotonically drifting upwards throughout the monitoring window, then volatility applied early on will cause smaller changes in the underlying than the same volatility applied towards the end of the monitoring window. Thus, if the volatility term structure concentrates volatility towards the end of the interval, after the underlying has had time to drift upwards, the underlying is more likely to rise above an upper barrier. Volatility term structure and interest rate term structure therefore affect knock-out / knock-in probabilities and thus affect valuation.

GPS consulting – mathematics and software development for global positioning systems

GPS satellites and receivers are being applied in a huge number of industries including aviation, agriculture, financial fraud identification, robotics (navigation), and landscape surveying.

Developing software to process GPS data requires an understanding of the mathematics involved in GPS coordinate systems, including coordinate transformations between latitude/longitude/height and ECEF coordinates. GPS data often must be combined with other sensor data and run through a mathematical calculation to produce the required output data or system behaviour.
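As an example of the kind of transformation involved, here is the standard geodetic (latitude/longitude/height) to ECEF conversion on the WGS84 ellipsoid, sketched in python:

import numpy as np

# WGS84 ellipsoid constants
A = 6378137.0               # semi-major axis (m)
F = 1 / 298.257223563       # flattening
E2 = F * (2 - F)            # eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    # Convert latitude/longitude (degrees) and height (m) to ECEF coordinates (m).
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    N = A / np.sqrt(1 - E2 * np.sin(lat)**2)   # prime vertical radius of curvature
    x = (N + h) * np.cos(lat) * np.cos(lon)
    y = (N + h) * np.cos(lat) * np.sin(lon)
    z = (N * (1 - E2) + h) * np.sin(lat)
    return x, y, z

print(geodetic_to_ecef(51.4779, -0.0015, 45.0))  # roughly Greenwich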

Our consultants can assist you in formulating the correct mathematical equations for your GPS application, and in implementing them in a variety of languages such as python or C++.

Financial Computation using Nvidia GPUs

While GPUs were originally invented for image processing, their powerful capabilities are now being applied to computation problems that have nothing to do with graphics. As GPUs have about 20x as many cores as CPUs, they can be up to 100x faster for highly parallelizable computations such as machine learning and data analysis.

Did you know that Google has used Nvidia GPUs to train its Google Translate machine learning algorithms?

In particular, Nvidia GPUs find many applications in the financial services industry, which is increasingly making use of massive data sets and AI / deep learning. GPU computation is ideal for Monte Carlo simulations, used extensively in the finance industry, as each path can be processed independently and simultaneously.

CUDA is a program development environment from Nvidia which allows users to execute the highly parallelizable part of their code on an Nvidia GPU.
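As a small illustration of the style of code involved, here is a Monte Carlo European call pricer written with the CuPy library (a sketch; it assumes a CUDA-capable GPU and the cupy package installed). Replacing numpy arrays with cupy arrays moves the whole simulation onto the GPU:

import math
import cupy as cp

def gpu_mc_call_price(S0, K, T, sigma, r, n_paths=10_000_000):
    # Generate and evaluate all Monte Carlo paths on the GPU via CuPy.
    z = cp.random.standard_normal(n_paths, dtype=cp.float32)
    ST = S0 * cp.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
    payoff = cp.maximum(ST - K, 0.0)
    return math.exp(-r * T) * float(payoff.mean())

print(gpu_mc_call_price(S0=100.0, K=110.0, T=1.0, sigma=0.2, r=0.01))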