My warmest congratulations to the CCFEA on reaching its 10th birthday in such robust good health. It is immensely gratifying that what started out as no more than a gleam in the eye of Michael Dempster, Sheri Markose and Edward Tsang has, as a result of their vision and determination, become one of the world's leading centres for teaching and research in computational finance. It is wonderful that in the relatively short period of ten years so many students from across the globe have graduated from the Centre and gone on to careers of influence and innovation in the financial, IT and academic sectors. I see nothing but a bright future for the CCFEA, one in which it will continue to be as pioneering and dynamic in the next decade as it has been in the last.
Financial advisors and private wealth managers have effectively ignored academically established consumption-investment theory for individual household life-cycle financial planning. Indeed, industry best practice is a combination of static Markowitz buy-and-hold optimization applied to segregated funds for different goals, such as educating children and retirement. This talk advocates instead the use of dynamic stochastic optimization as currently applied by leading institutional asset liability managers. The focus is on an analysis of CSA's iALM (TM) system's recommendations for a representative UK household. iALM is fully cash flow based, incorporates all uncertainties faced by households -- e.g. market, income, inflation, illness and death -- and utilizes CSA's patented STOCHASTICS development system. Evaluation of financial plan sensitivity to changes in individual circumstances, preferences or market environments, such as the adjustments necessary through the financial crisis, will be discussed.
In this talk, I shall give an overview of research in computational finance and economics. Computational intelligence has changed the landscape of finance and economics research. It enables us to study finance and economics in ways that were previously impossible. For example, advances in evolutionary computation allow us to discover patterns in the market, bargaining strategies and economic models; advances in optimization, especially multi-objective optimization, allow us to search for portfolios that balance return against risk. Modelling and simulation allow us to study markets scientifically. Machine learning helps us to explore market mechanisms and evolve strategies. Algorithms and heuristics define bounded rationality in an agent.
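The return/risk trade-off mentioned above can be illustrated with a toy mean-variance sketch; the two assets, their returns and their covariance below are invented purely for illustration and are not taken from any model discussed in the talk:

```python
import numpy as np

# Hypothetical two-asset example: trace the return/risk trade-off
# by sweeping the weight of asset A from 0 to 1.
mu = np.array([0.08, 0.12])            # assumed expected annual returns
cov = np.array([[0.04, 0.01],
                [0.01, 0.09]])         # assumed covariance of returns

frontier = []
for w in np.linspace(0.0, 1.0, 21):    # weight on asset A
    weights = np.array([w, 1.0 - w])
    ret = weights @ mu                       # portfolio expected return
    risk = np.sqrt(weights @ cov @ weights)  # portfolio volatility
    frontier.append((risk, ret))

# The multi-objective view keeps only non-dominated portfolios:
# no other portfolio offers both higher return and lower risk.
pareto = [p for p in frontier
          if not any(q[0] <= p[0] and q[1] >= p[1] and q != p
                     for q in frontier)]
```

The surviving `pareto` set is the (discretized) efficient frontier; a multi-objective optimizer searches for exactly this set directly, without fixing a single risk/return weighting in advance.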
After the subprime and European sovereign debt crises, systemic risk assessment has become a main concern for central banks and financial supervisors. However, crises also represent important opportunities to improve the prevailing framework in order to prevent future crises, just as the crisis of the 1930s served as a vehicle for consolidating the theory of monetary policy. In the case of systemic risk measurement, we are still in the drawing-board phase, in which there are many approaches but as yet no consensus.
Some intrinsic characteristics of systemic crises make the already challenging task of systemic risk measurement even more difficult: systemic crises are low-frequency, high-impact events. Furthermore, there is an important data gap, which became evident during the recent crisis, and measurement without data is simply impossible.
Some academics have pointed out that the current proposals have important methodological shortcomings; one such argument even questions whether systemic risk is a measurable concept at all. Fortunately, consensus is growing and some of the approaches are now widely used, but the search remains open for a simple yet meaningful model of systemic risk measurement.
In this talk, besides providing a broad view of the problem and the current approaches, we present relevant evidence on the tools that have been developed at the Mexican central bank for systemic risk measurement and monitoring.
Many problems in economics in general, and finance in particular, include an optimization component. Sometimes this is quite obvious, e.g., when one wants to maximize the expected return of a portfolio or to minimize its risk. Sometimes it is less obvious, e.g., when selecting models or estimating parameters. Stating an optimization problem is one thing, solving it another. More often than not, strong assumptions are introduced or the models are over-simplified in order to make them mathematically tractable -- if not analytically, then at least numerically. However, this often comes at the cost that the results are not very robust or reliable for the original, unconstrained problem.
One way out of this dilemma is the use of alternative methods. Heuristic methods are non-deterministic approaches that serve this purpose well: for one, they are much more flexible in incorporating constraints and are not limited to specific types of objective functions. Furthermore, they include features that allow the evasion of local optima and prevent premature convergence. Therefore, they can deal with the inherent complexity of real-world financial optimization problems.
This talk presents how heuristic methods can be used for more realistic optimization models with fewer -- if any -- restricting assumptions. Unfortunately, these methods, too, come at a cost: they might require some calibration and adaptation to the problem at hand (and not, as traditionally, the other way round). Fortunately, these costs can be reduced: scientific analysis of these methods helps in understanding their behavior (and, hence, how to implement them efficiently), users find that applying them becomes increasingly efficient with experience, and otherwise intractable problems can be solved. This presentation highlights some of these methods from an operational perspective.
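As a concrete illustration of the kind of heuristic meant here, the sketch below applies threshold accepting, a non-deterministic local-search method that temporarily accepts worsening moves in order to escape local optima, to a cardinality-constrained minimum-variance problem. All data, the equal-weight simplification and the parameter choices are invented for this example and are not drawn from the talk:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical problem: minimise portfolio variance while holding exactly
# k of n assets -- a cardinality constraint that classical QP solvers
# cannot handle directly.
n, k = 20, 5
A = rng.standard_normal((n, n))
cov = A @ A.T / n                      # a random positive-definite covariance

def variance(assets):
    w = np.full(len(assets), 1.0 / len(assets))   # equal weights, for simplicity
    sub = cov[np.ix_(assets, assets)]
    return w @ sub @ w

# Threshold accepting: like local search, but a move that worsens the
# objective by less than a (shrinking) threshold is still accepted,
# which lets the search climb out of local optima.
current = list(rng.choice(n, size=k, replace=False))
best, best_val = current[:], variance(current)
for threshold in np.linspace(0.05, 0.0, 10):      # simple cooling schedule
    for _ in range(200):
        cand = current[:]
        out = rng.integers(k)                     # swap one held asset
        cand[out] = rng.choice([a for a in range(n) if a not in current])
        if variance(cand) - variance(current) < threshold:
            current = cand
            if variance(current) < best_val:
                best, best_val = current[:], variance(current)
```

The only problem-specific ingredients are the neighbourhood (a one-asset swap, which preserves the constraint by construction) and the objective; this is the flexibility with respect to constraints and objective functions referred to above.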
The landscape for quantitative equity investing has changed significantly over the last 10 years. The events of August 2007 (the "quantmare") made evident how crowded quant strategies had become. Then, during the financial crisis, some popular quant strategies that had been very profitable suffered large losses, and many investors left the space. Surviving quant managers had to innovate. In this presentation we will outline some of the recent developments, and discuss the challenges and opportunities that lie ahead.
Incremental Risk Charge (IRC) is a regulatory capital model required to capture default and credit migration risk. The IRC measures risk at the 99.9th percentile for a one-year capital horizon. The presentation gives a brief history of the regulatory requirements, discusses the methodology and provides information regarding the calibration and the testing of the HSBC IRC model.
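For intuition only, a 99.9th-percentile one-year loss of the kind the IRC targets can be estimated by Monte Carlo. The sketch below uses a generic one-factor default model; the portfolio size, default probability and correlation are invented for the example and have nothing to do with HSBC's actual IRC model:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(7)

# Hypothetical portfolio: 100 positions of unit exposure, 2% annual
# default probability, correlated through one systematic factor.
n_loans, n_sims = 100, 50_000
pd_ = 0.02                             # assumed annual default probability
rho = 0.2                              # assumed asset correlation
thresh = NormalDist().inv_cdf(pd_)     # default barrier for asset returns

z = rng.standard_normal((n_sims, 1))            # systematic factor
eps = rng.standard_normal((n_sims, n_loans))    # idiosyncratic shocks
asset = np.sqrt(rho) * z + np.sqrt(1 - rho) * eps

loss = (asset < thresh).sum(axis=1)    # defaults per scenario (unit exposure)
irc_like = np.percentile(loss, 99.9)   # loss at the 99.9th percentile
```

Because the 99.9th percentile over one year is such an extreme quantile of a low-frequency event, the estimate is driven by a handful of tail scenarios, which is precisely why the calibration and testing of a production IRC model deserve the attention the presentation gives them.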
Computational finance will transform the global economy and our society at large over the coming ten years. The talk will discuss the structural changes that will occur in the financial markets and how these changes will leverage the impact of computational finance. I will give an overview of the major challenges that computational finance faces and of the types of services and new financial products that will develop. The talk will cover the key concepts, from intrinsic time to the theory of fractals, complexity and agent-based modelling, that will shape computational finance.