This paper focuses on the liquidity of electronic stock markets, applying a sequential estimation approach to models of volume durations with increasing threshold values. A modified ACD model with a Box–Tukey transformation and a flexible generalized Beta distribution is proposed to capture the changing cluster structure of the duration processes. Estimation results on German XETRA data reveal the market's absorption limit for high volumes of shares and the increased time cost of illiquidity incurred when trading these quantities.
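To make the duration dynamics concrete, here is a minimal sketch of the baseline ACD(1,1) recursion that the paper extends; exponential innovations stand in for the generalized Beta distribution, the Box–Tukey transformation is omitted, and all parameter values are illustrative.

```python
import numpy as np

def simulate_acd(n, omega=0.1, alpha=0.1, beta=0.8, seed=0):
    """Basic ACD(1,1): durations x_i = psi_i * eps_i with conditional
    mean psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}.
    Exponential eps_i stands in for the generalized Beta innovations."""
    rng = np.random.default_rng(seed)
    psi, x = np.empty(n), np.empty(n)
    psi[0] = omega / (1 - alpha - beta)      # unconditional mean duration
    x[0] = psi[0] * rng.exponential()
    for i in range(1, n):
        psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]
        x[i] = psi[i] * rng.exponential()
    return x, psi

durations, _ = simulate_acd(10_000)
print(durations.mean())   # close to omega / (1 - alpha - beta) = 1.0
```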
I analyze the dynamic trading behavior of market participants by developing a bivariate modeling framework for the arrival process of buy and sell orders in a limit order book. The model combines an extended autoregressive conditional duration (ACD) model with a flexible generalized Beta distribution for the duration process with a dynamic logit model capturing traders' order-submission strategies. I find that both the state of the order book and the speed of order arrival have a significant influence on order placement, inducing temporally asymmetric market movements.
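The order-submission component can be illustrated with a toy logit for the direction of the next order; the covariates and coefficients below are hypothetical stand-ins for the paper's specification.

```python
import numpy as np

def buy_probability(imbalance, duration, b0=-0.1, b1=1.5, b2=-0.3):
    """Hypothetical dynamic logit: probability that the next order is a
    buy, driven by book imbalance (signed, bid-side minus ask-side depth)
    and the preceding duration (small durations = fast arrivals)."""
    z = b0 + b1 * imbalance + b2 * np.log(duration)
    return 1.0 / (1.0 + np.exp(-z))

print(buy_probability(imbalance=0.2, duration=0.5))   # ~0.60
```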
This paper discusses the possibility of recovering normality of asset returns through a stochastic time change, where the appropriate economic time is determined through a simple parametric function of the cumulative number of trades and/or the cumulative volume. The existing literature argues that the re-centred cumulative number of trades could be used as the appropriate stochastic clock of the market under which asset returns are virtually Gaussian. Using tick data for FTSE-100 futures, we show that normality is not always recovered by conditioning on the re-centred number of trades. However, simply extending the approach to a nonlinear function can provide a better stochastic clock of the market.
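The resampling mechanics can be sketched as follows, on synthetic ticks rather than the FTSE-100 data; the paper's parametric clock generalizes this fixed-count rule.

```python
import numpy as np

def trade_clock_returns(prices, trades_per_bin=50):
    """Resample log-returns in transaction time: one return per fixed
    number of trades rather than per fixed calendar interval."""
    logp = np.log(np.asarray(prices))
    return np.diff(logp[::trades_per_bin])

rng = np.random.default_rng(1)
ticks = 100 * np.exp(np.cumsum(0.0001 * rng.standard_t(5, 100_000)))
r = trade_clock_returns(ticks)
kurt = np.mean((r - r.mean()) ** 4) / r.var() ** 2 - 3
print(kurt)   # excess kurtosis near 0 indicates approximately Gaussian returns
```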
In this paper, we apply meshfree radial basis function (RBF) interpolation to numerically approximate zero-coupon bond prices and survival probabilities in order to price credit default swap (CDS) contracts. We assume that the interest rate follows a Cox–Ingersoll–Ross process while the default intensity is described by the Exponential-Vasicek model. Several numerical experiments are conducted to evaluate the RBF approximations for one- and two-factor models, and the results are compared with those obtained by the finite difference method (FDM). We find that the RBF interpolation is more accurate and computationally efficient than the FDM. Our results also suggest that the correlation between the factors does not have a significant impact on CDS spreads.
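The interpolation step itself fits in a few lines with a Gaussian kernel in one dimension; the discount-curve-like target and shape parameter below are illustrative, not the paper's CIR/Exponential-Vasicek setup.

```python
import numpy as np

def rbf_interpolate(x_nodes, f_nodes, x_eval, eps=2.0):
    """Gaussian RBF interpolation: s(x) = sum_j w_j exp(-(eps (x-x_j))^2),
    with the weights w solving the interpolation system at the nodes."""
    phi = lambda r: np.exp(-(eps * r) ** 2)
    A = phi(np.abs(x_nodes[:, None] - x_nodes[None, :]))
    w = np.linalg.solve(A, f_nodes)
    return phi(np.abs(x_eval[:, None] - x_nodes[None, :])) @ w

nodes = np.linspace(0, 5, 12)
vals = np.exp(-0.03 * nodes)              # toy zero-coupon price curve
grid = np.linspace(0, 5, 101)
approx = rbf_interpolate(nodes, vals, grid)
print(np.max(np.abs(approx - np.exp(-0.03 * grid))))   # small interpolation error
```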
We propose a nonlinear filter to estimate time-varying default risk from the term structure of credit default swap (CDS) spreads. Based on the numerical solution of the Fokker–Planck equation (FPE) using a meshfree interpolation method, the filter jointly estimates the risk-neutral default intensity and the CIR model parameters. Because the FPE can accommodate nonlinear functions and non-Gaussian errors, the proposed framework provides outstanding flexibility and accuracy. We test the nonlinear filter on simulated spreads and apply it to daily CDS data of the Dow Jones Industrial Average component companies from 2005 to 2010, with supporting results.
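As a rough illustration of the filtering problem, the sketch below tracks a latent CIR-type intensity from noisy spread-like observations with a bootstrap particle filter, used here as a generic stand-in for the paper's FPE-based density propagation; all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
kappa, theta, sigma, dt = 0.5, 0.03, 0.1, 1 / 250   # assumed CIR parameters
N, T, obs_sd = 2000, 500, 0.002

# simulate a latent CIR intensity and noisy spread-like observations
lam = np.empty(T); lam[0] = theta
for t in range(1, T):
    lam[t] = abs(lam[t-1] + kappa * (theta - lam[t-1]) * dt
                 + sigma * np.sqrt(lam[t-1] * dt) * rng.standard_normal())
obs = lam + obs_sd * rng.standard_normal(T)

# bootstrap particle filter: propagate, weight by likelihood, resample
p, est = np.full(N, theta), np.empty(T)
for t in range(T):
    p = np.abs(p + kappa * (theta - p) * dt
               + sigma * np.sqrt(p * dt) * rng.standard_normal(N))
    w = np.exp(-0.5 * ((obs[t] - p) / obs_sd) ** 2)
    p = p[rng.choice(N, N, p=w / w.sum())]
    est[t] = p.mean()

print(np.mean(np.abs(est - lam)))   # mean tracking error of the filter
```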
We provide group invariant solutions to two nonlinear differential equations associated with the valuation of real options under utility pricing theory. We achieve these through the Lie theory of continuous groups, namely the classical Lie point symmetries. These group invariant solutions, constructed using symmetries that also leave the boundary conditions invariant, are consistent with the results in the literature. It can thus be shown that Lie symmetry algorithms underlie many ad hoc methods used to solve differential equations in finance, even when the equation is nonlinear.
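As a minimal illustration of the machinery on a standard linear example (not the paper's nonlinear utility-pricing equations): the Black–Scholes equation

\[ u_t + \tfrac{1}{2}\sigma^2 S^2 u_{SS} + r S u_S - r u = 0 \]

admits, among others, the Lie point symmetries \(X_1 = \partial_t\) (time translation) and \(X_2 = S\,\partial_S\) (scaling in \(S\)); solutions invariant under such a generator reduce the partial differential equation to an ordinary differential equation, which is the mechanism exploited here for the nonlinear case.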
We provide the solutions for the Heston model of stochastic volatility when the parameters of the model are constant and when they are functions of time. In the former case, the solution follows immediately from the determination of the Lie point symmetries of the governing 1+1 evolution partial differential equation. This is not the situation in the latter case, but we are able to infer the essential structure of the required nonlocal symmetry from that of the autonomous problem and hence present the solution to the non-autonomous problem. As in the case of the standard Black–Scholes problem, the presence of time-dependent parameters is no hindrance to the demonstration of a solution.
Numerous studies present strong empirical evidence that certain financial assets may exhibit mean reversion, stochastic volatility or jumps. This paper explores the valuation of European options when the underlying asset follows a mean reverting log-normal process with stochastic volatility and jumps. A closed form representation of the characteristic function of the process is derived for the computation of European option prices via the fast Fourier transform. We show that this method is considerably faster than the corresponding Monte Carlo simulation.
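The FFT pricing step can be sketched in the Carr–Madan form; the Black–Scholes characteristic function below is a stand-in for the paper's closed-form mean-reverting jump-diffusion characteristic function, and the damping and grid parameters are illustrative.

```python
import numpy as np

def bs_cf(u, s0=100.0, r=0.05, sigma=0.2, T=1.0):
    """Characteristic function of ln(S_T) under Black-Scholes (stand-in)."""
    mu = np.log(s0) + (r - 0.5 * sigma**2) * T
    return np.exp(1j * u * mu - 0.5 * sigma**2 * u**2 * T)

def carr_madan_calls(cf, r=0.05, T=1.0, alpha=1.5, N=4096, eta=0.25):
    """Carr-Madan FFT: call prices on a log-strike grid, computed from a
    characteristic function of the terminal log-price."""
    v = np.arange(N) * eta
    lam = 2 * np.pi / (N * eta)            # log-strike spacing
    b = N * lam / 2                        # grid covers log-strikes [-b, b)
    psi = np.exp(-r * T) * cf(v - (alpha + 1) * 1j) / (
        alpha**2 + alpha - v**2 + 1j * (2 * alpha + 1) * v)
    w = np.full(N, eta); w[0] *= 0.5       # trapezoid-style quadrature weights
    k = -b + lam * np.arange(N)            # log strikes
    calls = np.exp(-alpha * k) / np.pi * np.real(
        np.fft.fft(np.exp(1j * b * v) * psi * w))
    return np.exp(k), calls

strikes, calls = carr_madan_calls(bs_cf)
i = np.argmin(np.abs(strikes - 100))
print(strikes[i], calls[i])   # near the Black-Scholes ATM value of about 10.45
```

One FFT call prices the whole strike grid at once, which is where the speed advantage over Monte Carlo comes from.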
Average pricing is one of the main ingredients in determining the payoff of an option of Asian design. In the fixed-strike case, much has been written on the European-style Asian option since its beginnings in 1980. In this article, we extend the work of Zhu to this exotic option and present an analytic formula for pricing an American-style Asian option of floating-strike type. In passing, we identify the property that an exotic option must possess in order to be extended to this case. We also extend a symmetry result established by Henderson and Wojakowski.
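For reference, with running average \(A_T = \frac{1}{T}\int_0^T S_t\,dt\), the floating-strike Asian call treated here pays \(\max(S_T - A_T, 0)\), whereas the fixed-strike contract pays \(\max(A_T - K, 0)\).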
This paper describes a new concept called market quakes, which measures the significance or insignificance of price changes in the foreign exchange market. The Scale of Market Quakes (SMQ) is part of the Olsen trading tools. It objectively measures the impact of political and economic events on the currency markets and is a decision-support tool for traders, retail and institutional investors, government officials and commentators. The development of the SMQ was inspired by the Richter scale, which measures the intensity of earthquakes; financial markets are likewise subject to seismic shocks caused by political, economic and other events. The SMQ, computed on a tick-by-tick basis, measures the price impact of these events, provides a clear metric of their relevance, and contributes to reducing the uncertainty faced by decision makers. The SMQ service developed for the currency markets is part of a larger project to build a comprehensive global information system; Olsen invites companies, non-governmental and governmental organizations, universities and private individuals to participate.
We have discovered 12 independent new empirical scaling laws in foreign exchange data series that hold for close to three orders of magnitude and across 13 currency exchange rates. Our statistical analysis crucially depends on an event-based approach that measures the relationship between different types of events. The scaling laws give an accurate estimate of the length of the price-curve coastline, which turns out to be surprisingly long. The new laws substantially extend the catalogue of stylised facts and sharply constrain the space of possible theoretical explanations of the market mechanisms.
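The event-based measurement can be sketched as a directional-change counter: an event is recorded whenever the price reverses by a threshold from the last extreme, and threshold times event count gives a crude coastline estimate. The code below is a simplified version of the event definition, run on a synthetic random walk rather than foreign exchange data.

```python
import numpy as np

def directional_changes(prices, delta=0.005):
    """Count directional-change events: alternating relative moves of at
    least `delta` away from the running extreme price."""
    ext, mode, count = prices[0], None, 0
    for p in prices:
        if mode in (None, "up"):
            ext = max(ext, p)
            if p <= ext * (1 - delta):      # downturn event
                mode, ext, count = "down", p, count + 1
        if mode == "down":
            ext = min(ext, p)
            if p >= ext * (1 + delta):      # upturn event
                mode, ext, count = "up", p, count + 1
    return count

rng = np.random.default_rng(0)
px = np.exp(np.cumsum(0.0005 * rng.standard_normal(100_000)))
n = directional_changes(px, delta=0.005)
print(n, n * 0.005)   # event count and a crude coastline-length estimate
```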
This paper provides empirical evidence that the particular intra-day seasonality observed in the Foreign Exchange market is indeed due to the different geographical locations of its traders. Analysing more than 2 years of real transactions from a microscopic perspective, we design a procedure that accounts for the time zones from which traders operate. The resulting normalized intra-day seasonality shows a pattern akin to those observed in regulated exchanges where traders are more active at the beginning and at the end of their session.
To understand financial markets and prevent crises, we need to analyze market microstructure. This paper formalizes the market process in the context of a simple double auction market. The purpose of this calculus is to analyze market dynamics and feedback loops, such as cascading margin calls, with the objective of better understanding risk scenarios, not of forecasting exogenous order flow. The price trajectory is determined by the present market state and the new orders arriving in the market. By studying the market microstructure, we can compute the impact of an order of any size, or how big a sell order has to be to cause the market to fall by a certain percentage. Using a definite formalism reduces ambiguity and enables rigorous reasoning, and an algorithm for assessing risk is proposed. Real markets are more complex than the models presented here; this paper is a step towards building a solid foundation for studying market models.
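A minimal sketch of the impact computation described above: walking a market sell order down a hypothetical bid book yields both the average fill price and the new best bid, and inverting the same loop answers how large a sell order must be to move the market down by a given percentage.

```python
def sell_impact(bids, qty):
    """Walk a market sell order down the bid side of a limit order book.
    bids: list of (price, size) levels, best price first.
    Returns (average fill price, best bid after the trade)."""
    book = [[price, size] for price, size in bids]
    remaining, cost, filled = qty, 0.0, 0.0
    for level in book:
        take = min(level[1], remaining)
        cost += take * level[0]
        filled += take
        level[1] -= take
        remaining -= take
        if remaining == 0:
            break
    new_best = next((p for p, s in book if s > 0), None)
    return cost / filled, new_best

# hypothetical book: best bid 100, with depth at lower levels
bids = [(100.0, 500), (99.5, 800), (99.0, 1500), (98.0, 3000)]
avg, best = sell_impact(bids, 2000)
print(avg, best)   # average fill 99.45; best bid falls from 100.0 to 99.0
```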
Seminal work showed that empirically observed universal features of financial markets can be replicated by simulations in which traders act neither rationally nor adaptively, concluding that institutional features explain macro-behaviour. We show that not all stylized facts can be explained in this way, and we introduce a new hybrid model which incorporates features from both econometric modelling and agent-based simulation. We systematically catalogue which properties are replicated by the hybrid model at different frequencies. This work later led to an analysis of scaling laws in agent-based finance (DOI:10.1002/isaf.1346).
This paper studies cooperation in dynamic social networks and is the first to show that their empirical properties can be explained by learning dynamics. It was invited for presentation at the Konrad Lorenz Institute in Vienna (http://www.konradlorenz.de/Modules/Assets/events/85/Network_Theory_for_Living_Systems_.pdf) by the biologist Ronald Noe, and has led to an international cross-disciplinary collaboration of biologists, economists and computer scientists looking for evidence of both forms of reciprocity in biological field data.
This paper arose out of broader work in the field of evolutionary, or automated, mechanism design, to which the author contributed one of the earliest papers and which has become an active area of research in multi-agent systems. The software used therein is available as open source (http://jasa.sourceforge.net) and is widely used; it was invited for presentation at a workshop organised by Doyne Farmer of the INET Institute, Oxford University, which invited the "most prominent macroeconomic agent-based scholars to work together to the development of a project aimed to build a common open-source ABM platform".
Power laws in size distributions and autocorrelation functions are well-known empirical phenomena in many complex adaptive systems, including financial markets. This paper is one of the first to take a systematic approach to modelling them, by showing which properties of an agent-based model of a financial market contribute to particular scaling laws.
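As a concrete example of measuring such a power law, here is a Hill-type estimate of a tail exponent; the estimator and the Pareto sample are illustrative, not taken from the paper.

```python
import numpy as np

def hill_exponent(x, k=200):
    """Hill-type estimator of the tail exponent alpha, using the largest
    k observations (power-law tail: P(X > x) ~ x^-alpha)."""
    tail = np.sort(np.abs(x))[-k:]
    return k / np.sum(np.log(tail / tail[0]))

rng = np.random.default_rng(0)
sample = rng.pareto(3.0, 100_000) + 1     # exact power law with alpha = 3
print(hill_exponent(sample))              # estimate close to 3
```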
This paper reports the application of advanced technology to help British Telecom improve the scheduling of its service teams. Its first contribution is to formulate a very complex operation as a scheduling problem, which allows the schedulers to identify opportunities to improve the operation. The work has been deployed in operations at Openreach within British Telecom; it continues the work of BT's project leader Chris Voudouris (Voudouris et al. (ed.), Service Chain Management, Springer, 2008) and was later developed into BT's cloud-based supply chain management service in 2012. Although this work is motivated by British Telecom's operation, the model and techniques developed are general to field workforce scheduling, where service teams with various skills have to serve jobs under time constraints.
Bargaining is an important area in game theory. Given a bargaining situation, researchers are interested in finding the subgame perfect equilibrium. Classical (mathematical) approaches typically assume perfect rationality of the players, and the derivations are typically laborious. This paper shows that evolutionary computation allows us to approximate subgame perfect equilibria more efficiently. Instead of assuming perfect rationality, evolutionary computation assumes reinforcement learning, which is probably more realistic. This approach allows us to compute approximate equilibria for much more complex situations than those reported in the literature, establishing evolutionary computation as a practical method for approximating subgame perfect equilibria.
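A minimal co-evolutionary sketch of the idea on a two-round truncation of alternating-offers bargaining; the paper uses a much richer strategy representation, and the discount factors, population size and mutation scale here are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
d1, d2, POP = 0.9, 0.9, 50                # assumed discount factors

def payoffs(x, y):
    """x: share player 1 demands when proposing first; y: share player 2
    demands if she rejects and counter-proposes. Play is truncated after
    two rounds; rejected counter-offers yield breakdown payoffs of 0."""
    if 1 - x >= d2 * y:                   # player 2 accepts the opening offer
        return x, 1 - x
    if 1 - y >= d1 * x:                   # player 1 accepts the counter-offer
        return d1 * (1 - y), d2 * y
    return 0.0, 0.0

def evolve(pop, fitness):
    """Keep the fitter half, refill with mutated copies (a basic ES step)."""
    elite = pop[np.argsort(fitness)[::-1][: len(pop) // 2]]
    children = np.clip(elite + rng.normal(0, 0.02, elite.shape), 0, 1)
    return np.concatenate([elite, children])

A = rng.uniform(0, 1, POP)                # player-1 demands
B = rng.uniform(0, 1, POP)                # player-2 demands
for _ in range(2000):
    fa = np.array([payoffs(x, rng.choice(B))[0] for x in A])
    fb = np.array([payoffs(rng.choice(A), y)[1] for y in B])
    A, B = evolve(A, fa), evolve(B, fb)

print(A.mean())                           # should drift toward the SPE share
print((1 - d2) / (1 - d1 * d2))           # Rubinstein benchmark, about 0.526
```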
This paper presents CHASM (Co-evolutionary Heterogeneous Artificial Stock Market), a framework for research in artificial markets and one of the most comprehensive artificial-market frameworks published so far. With this framework, the paper identifies a condition under which an artificial market can generate realistic prices, i.e. prices exhibiting the stylized facts. Following this work, Martinez brought agent-based modelling and simulation into the Mexican Central Bank, where his and his co-authors' work led to the development of central-banking policies that influenced other central banks. Some of the results were reported in Alexandrova-Kabadjova B., S. Martinez-Jaramillo, A. L. Garcia-Almanza & E. Tsang (ed.), Simulation in Computational Finance and Economics: Tools and Emerging Applications, IGI Global, 2012, which was launched with high profile by the Mexican Central Bank in October 2012.
This paper explains how problems in container terminals can be formulated as constraint satisfaction problems. It covers all the major operations in modern container terminals and is the most comprehensive formulation in this application area. This is a significant contribution because constraint satisfaction techniques can now be applied to these problems. The paper partially led to a KTP project with Felixstowe, the busiest port in the UK, and a TSB-funded logistics project with Transfaction.
To understand financial markets and prevent unnecessary crises, markets should be studied scientifically. With event calculus, this paper formalizes market clearance dynamics under simple market models. The immediate future is determined by the present state plus the orders in the market. The purpose of this calculus is not to forecast orders, but to provide a means to reason about market dynamics and assess risk. Using logic reduces ambiguity and enables rigorous reasoning. An algorithm for assessing risk is proposed. Real markets are more complex than the models presented in this paper; nevertheless, the paper lays a solid foundation for studying market models.
Page maintained by Edward Tsang; updated 2013.12.10