**All seminars took place 11:00AM to 12:30PM in 639 Evans Hall at UC Berkeley (unless otherwise noted)**

**January 2016:**

**19: Jeff Bohn**, State Street Global Exchange: *Simplicity and complexity in risk modeling: When is a risk model too simple?*

Most financial risk modelers (like most scientific modelers) attempt to develop models that are as simple as possible while still remaining useful. That said, models can become too simple, thereby losing their signalling power. This circumstance can leave a risk manager without guidance as to the material risks to which a financial institution may be exposed. As financial markets, financial securities, and regulations have proliferated, trading off simplicity and complexity has become a particularly difficult challenge for financial risk modelers. This presentation introduces some thoughts and raises questions with respect to how a risk modeler decides when a risk model is too simple. The discussion will lean toward practical (as opposed to theoretical) considerations.

**26: Roger Craine**, UC Berkeley: *Safe Capital Ratios for Bank Holding Companies*

This paper gives three quantitative answers to Fischer’s question “at what level should capital ratios be set?”, based on (1) the Fed Stress Tests 2015, (2) VLab’s systemic risk measures, and (3) our (Craine-Martin) estimates.

This paper compares safe capital ratios for 18 bank holding companies. The Craine-Martin (CM) implied safe capital ratios are the highest, averaging 22%, followed by VLab’s, averaging 16%, with the Fed Stress Tests the lowest at 11%. We (CM) find higher implied safe capital ratios than VLab because our specification allows losses at one bank holding company to affect the others. In a crisis, accounting for the covariance among bank holding companies’ returns gives much larger losses, since their returns and asset values are positively correlated. Both CM and VLab find larger implied safe capital ratios than the Fed Stress Tests because they calculate the loss to the market value of bank equity during a crisis, while the Fed stress tests calculate the loss to the book value of bank equity. Book equity values do not respond much to a crisis, even one as large as the Great Recession, so the implied book-value safe capital ratios are not as large. Paper link: http://eml.berkeley.edu/~craine/2009/Capital%20Ratios%2001-05-16.pdf

**February 2016:**

**2: Roger Stein**, MIT: *A simple hedge for longevity risk and reimbursement risk using research-backed obligations*

Longevity risk is the risk that the promised recipient of lifetime cashflows ends up living much longer than originally anticipated, thus causing a shortfall in funding. A related risk, reimbursement risk, is the risk that providers of health insurance face when new and expensive drugs are introduced and the insurer must cover their costs. Longevity and reimbursement risks are particularly acute in domains in which scientific breakthroughs can increase the speed of new drug development. An emerging asset class, research-backed obligations or RBOs (cf. Fernandez et al., 2012), provides a natural mechanism for hedging these risks: RBO equity tranches gain value as new life-extending therapies are developed, and do so in proportion to the number of successful therapies introduced. We use the stylized case of annuity underwriting to show how RBO equity could be used to hedge some forms of longevity risk on a retirement portfolio. Using the same framework, we then show how RBO securities may be used to hedge a much broader class of reimbursement risks faced by health insurance firms. We demonstrate how to compute hedge ratios to neutralize specific exposures. Although our analytic results are stylized, our simulation results suggest substantial potential for this asset class to reduce financial uncertainty for those institutions exposed to either longevity or reimbursement risks.
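As a rough, hypothetical sketch of the hedge-ratio idea in this abstract (not the authors' model; the driver, distributions, and scales below are invented), a minimum-variance hedge of a longevity-sensitive liability with an RBO-equity-like payoff can be simulated directly:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical driver: number of successful life-extending therapies.
therapies = rng.poisson(2.0, size=n)

# Annuity liability grows with successful therapies (stylized, made-up scale).
liability = 100.0 + 8.0 * therapies + rng.normal(0.0, 5.0, size=n)

# RBO equity tranche pays off roughly in proportion to successful therapies.
rbo_payoff = 10.0 * therapies + rng.normal(0.0, 3.0, size=n)

# Minimum-variance hedge ratio: units of RBO equity per unit of liability.
h = np.cov(liability, rbo_payoff)[0, 1] / np.var(rbo_payoff)

hedged = liability - h * rbo_payoff
print(h, liability.std(), hedged.std())
```

The minimum-variance ratio Cov(L, R)/Var(R) is a generic textbook choice, not necessarily the talk's; the point is only that a payoff rising with successful therapies can offset a liability that rises with them too.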

**9: Nathan Tidd**, Tidd Labs: *Predicting Equity Returns with Valuation Factor Models*

This presentation summarizes the motivation, methodology, and initial test results of Equity Valuation Factor Models, an adaptation of popular multi-factor modeling techniques that seeks to explain the price and payoff of business ownership as a precursor to explaining equity returns. A departure from traditional returns-based models, the approach produces new information, such as current factor prices, that informs both risk and return expectations, with a number of potential applications for real-world investment decisions.

**The February 16 Risk Seminar will take place in 1011 Evans Hall**

**16: Ezra Nahum**, Goldman Sachs: *The Life of a Quant 1995-2015*

In this lecture, I will review various practical risk management and modeling challenges that I encountered over the course of my career (including my Ph.D. years). My review will highlight how the landscape has changed. For instance, while modeling exotic derivatives was the main activity for quants in the late 90s, capital optimization is the most important consideration today.

**23: Yaniv Konchitchki**, UC Berkeley (Haas): *Accounting and the Macroeconomy: The Housing Market*

This study introduces a new approach for financial statement analysis—the geographic analysis of firms’ financial statements.

**March 2016:**

**1: Will Fithian**, UC Berkeley: *Semiparametric Exponential Families for Heavy-Tailed Data*

We propose a semiparametric method for fitting the tail of a heavy-tailed population given a relatively small sample from that population and a larger sample from a related background population. We model the tail of the small sample as an exponential tilt of the better-observed large-sample tail, using a robust sufficient statistic motivated by extreme value theory. In particular, our method induces an estimator of the small-population mean, and we give theoretical and empirical evidence that this estimator outperforms methods that do not use the background sample. We demonstrate substantial efficiency gains over competing methods in simulation and on data from a large controlled experiment conducted by Facebook.

This is joint work with Stefan Wager.
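A minimal numerical sketch of the exponential-tilting idea (not the authors' estimator; the threshold, tilting statistic, and distributions here are all illustrative choices): reweight the background tail so that a chosen tail statistic matches the small sample.

```python
import numpy as np

rng = np.random.default_rng(1)

# Large background sample and a small related sample, both heavy-tailed.
background = rng.pareto(3.0, size=20_000) + 1.0
small = rng.pareto(2.5, size=300) + 1.0

u = np.quantile(background, 0.9)              # tail threshold
T = np.log(background[background > u] / u)    # background log-exceedances
target = np.log(small[small > u] / u).mean()  # small-sample tail statistic

def gap(theta):
    # Weighted mean of T under the tilt exp(theta * T), minus the target.
    w = np.exp(theta * T)
    return (w * T).sum() / w.sum() - target

# gap is increasing in theta, so solve gap(theta) = 0 by bisection; the
# tilted weights then make the background tail resemble the small one.
lo, hi = -2.0, 2.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if gap(mid) < 0 else (lo, mid)
theta = 0.5 * (lo + hi)
```

Any tail functional (here, the mean log-exceedance) can then be estimated under the tilted weights rather than from the small sample alone.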

**8: Alex Papanicolaou**, Integral Development Corporation: *Background Subtraction for Pattern Recognition in High Frequency Financial Data*

Financial markets produce massive amounts of complex data from multiple agents, and analyzing these data is important for building an understanding of markets, their formation, and the influence of different trading strategies. I introduce a signal processing approach to deal with these complexities by applying background subtraction methods to high frequency financial data so as to extract significant market making behavior. In foreign exchange, for prices in a single currency pair from many sources, I model the market as a low-rank structure with an additive sparse component representing transient market making behavior. I consider case studies with real market data, showing both in-sample and online results, for how the model reveals pricing reactions that deviate from prevailing patterns. I place this study in context with alternative low-rank models used in econometrics as well as in high frequency financial models and discuss the broader implications of the melding of background subtraction, pattern recognition, and financial markets as it relates to algorithmic trading and risk. To my knowledge this is the first use of high-dimensional signal processing methods for pattern recognition in complex automated electronic markets.

**15: Kellie Ottoboni,** UC Berkeley: *Model-based matching for causal inference in observational studies*

Drawing causal inferences from nonexperimental data is difficult due to the presence of confounders, variables that affect both the selection into treatment groups and the outcome. Post-hoc matching and stratification can be used to group individuals who are comparable with respect to important variables, but commonly used methods often fail to balance confounders between groups. We introduce model-based matching, a nonparametric method which groups observations that would be alike aside from the treatment. We use model-based matching to conduct stratified permutation tests of association between the treatment and outcome, controlling for other variables. Under standard assumptions from the causal inference literature, model-based matching can be used to estimate average treatment effects. We give examples of model-based matching to test the effect of packstock use on endangered toads and of salt consumption on mortality at the level of nations.
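A toy sketch of the idea described above (a simplification, not the speakers' procedure; the data-generating process, linear outcome model, and quintile strata are all invented for illustration): fit the outcome from covariates alone, stratify on the fitted values, and permute treatment within strata.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500

# Synthetic observational data: confounder x drives both treatment and outcome.
x = rng.normal(size=n)
treat = (x + rng.normal(size=n) > 0).astype(int)
y = 2.0 * x + 1.0 * treat + rng.normal(size=n)   # true treatment effect = 1

# Model the outcome from covariates alone, then stratify on the fit.
beta = np.polyfit(x, y, 1)
y_hat = np.polyval(beta, x)
resid = y - y_hat
strata = np.digitize(y_hat, np.quantile(y_hat, [0.2, 0.4, 0.6, 0.8]))

def stat(t):
    # Average within-stratum difference in residuals, treated minus control.
    return float(np.mean([resid[(strata == s) & (t == 1)].mean()
                          - resid[(strata == s) & (t == 0)].mean()
                          for s in np.unique(strata)]))

obs = stat(treat)
null = []
for _ in range(1000):
    t = treat.copy()
    for s in np.unique(strata):
        idx = np.where(strata == s)[0]
        t[idx] = rng.permutation(t[idx])   # shuffle treatment within stratum
    null.append(stat(t))
p = float(np.mean(np.abs(null) >= abs(obs)))
```

Permuting treatment only within strata of similar predicted outcomes is what controls for the confounder in this sketch.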

**22:** *Spring Break (UC Berkeley Campus Closed) The Risk Seminar will reconvene on March 31, 2016.*

**29: Keith Sollers**, UC Davis: *Recent Developments in Optimal Placement of Trades*

Optimal placement of trades has received increasing attention recently, particularly in high-frequency trading. We define a formulation of the optimal placement problem and present a closed-form solution to this problem in the discrete-time case. We then discuss the continuous-time case, where optimal solutions exist but no closed-form solution is known. After tuning the models using high-frequency market data, we present numerical solutions in continuous time and exact solutions in discrete time.

**April 2016:**

**5: Alex Shkolnik**, UC Berkeley: *Dynamic Importance Sampling for Compound Point Processes*

We develop efficient importance sampling estimators of certain rare event probabilities involving compound point processes. Our approach is based on the state-dependent techniques developed in (Dupuis & Wang 2004) and subsequent work. The design of the estimators departs from past literature to accommodate the point process setting. Namely, the state-dependent change of measure is updated not at event arrivals but over a deterministic time grid. Several common criteria for the optimality of the estimators are analyzed. Numerical results illustrate the advantages of the proposed estimators in an application setting.
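For flavor, here is a static exponential tilt for a compound Poisson rare-event probability (the talk's estimators are state-dependent and updated over a time grid; this fixed-tilt sketch with made-up parameters only illustrates the change-of-measure idea):

```python
import numpy as np

rng = np.random.default_rng(3)

# Compound Poisson over [0, T]: N ~ Poisson(lam*T), i.i.d. Exp(mu) jumps.
lam, T, mu, b = 1.0, 1.0, 1.0, 10.0   # rare event: {S > b}

# Tilting by theta < mu turns the jumps into Exp(mu - theta) and scales the
# intensity to lam * mu / (mu - theta); the saddlepoint choice targets b.
theta = mu - np.sqrt(lam * T * mu / b)

n = 200_000
lam_t = lam * mu / (mu - theta)
N = rng.poisson(lam_t * T, size=n)
S = np.where(N > 0, rng.gamma(np.maximum(N, 1), 1.0 / (mu - theta)), 0.0)

# Likelihood ratio dP/dQ = exp(-theta*S + lam*T*(mu/(mu - theta) - 1)).
lr = np.exp(-theta * S + lam * T * (mu / (mu - theta) - 1.0))
est = float(np.mean((S > b) * lr))   # importance-sampling estimate of P(S > b)
```

Under the plain measure almost no sample hits {S > b}; under the tilt the event is common and the likelihood ratio corrects the bias.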

**12: Alex Shkolnik**, UC Berkeley: *Identifying Financial Risk Factors with a Low-Rank Sparse Decomposition*

Factor models of security returns aim to decompose an asset return covariance matrix into a systematic component and a specific risk component. Standard approaches such as PCA and maximum likelihood suffer from several drawbacks, including a lack of robustness and strict assumptions on the underlying model of returns.

We survey some modern, robust methods to uniquely decompose a return covariance matrix into a low-rank component and a sparse component. Surprisingly, the identification of the unique low-rank and sparse components is feasible under mild assumptions. We apply the method of Chandrasekaran, Parrilo and Willsky (2012) for latent graphical models to decompose a security return covariance matrix. The low-rank component includes the market and other broad factors that affect most securities. The sparse component includes thin factors such as industry and country, which affect only a small number of securities, but in an important way. We illustrate the decomposition on simulated data as well as on an empirical data set drawn from 25,000 global equities.
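A back-of-the-envelope sketch of the low-rank-plus-sparse idea (a naive alternating heuristic, not the convex program of Chandrasekaran et al.; the dimensions, known rank, and threshold are invented):

```python
import numpy as np

rng = np.random.default_rng(4)
p, k = 60, 2

# Synthetic "return covariance": rank-k factor part plus a sparse
# (here diagonal) specific-risk part; the rank k is assumed known.
B = rng.normal(size=(p, k))
L_true = B @ B.T
S_true = np.diag(rng.uniform(0.5, 1.5, size=p))
Sigma = L_true + S_true

# Alternate a low-rank step (truncated eigendecomposition) with a sparse
# step (hard-thresholding the residual).
L = np.zeros_like(Sigma)
S = np.zeros_like(Sigma)
for _ in range(50):
    w, V = np.linalg.eigh(Sigma - S)         # eigenvalues in ascending order
    L = (V[:, -k:] * w[-k:]) @ V[:, -k:].T   # keep the top-k eigenpairs
    R = Sigma - L
    S = np.where(np.abs(R) > 0.1, R, 0.0)    # hard threshold the residual
```

Here `L` recovers the broad-factor part and `S` the sparse specific part; the cited paper instead solves a convex relaxation with identifiability guarantees.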

**19: Johan Walden**, UC Berkeley (Haas): *Trading, Profits, and Volatility in a Dynamic Information Network Model*

We introduce a dynamic noisy rational expectations model, in which information diffuses through a general network of agents. In equilibrium, agents’ trading behavior and profits are determined by their position in the network. Agents who are more closely connected have more similar period-by-period trades, and an agent’s profitability is determined by a centrality measure that is closely related to eigenvector centrality. In line with the Mixture of Distributions Hypothesis, the market’s network structure influences aggregate trading volume and price volatility. Volatility after an information shock is more persistent in less central networks, and in markets with a higher degree of private information. Similar results hold for trading volume. The shapes of the autocorrelation functions of volatility and volume are related to the degree of asymmetry of the information network. Altogether, our results suggest that these dynamics contain important information about the underlying information diffusion process in the market.
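Eigenvector centrality, which the profitability result above is related to, is cheap to compute on any adjacency matrix (the four-node network below is made up and unrelated to the paper):

```python
import numpy as np

# Made-up symmetric information network (adjacency matrix).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)

# Eigenvector centrality: entries of the leading eigenvector of A.
w, V = np.linalg.eigh(A)            # eigenvalues in ascending order
centrality = np.abs(V[:, -1])       # leading eigenvector, sign-normalized
centrality /= centrality.sum()      # normalize to sum to 1
```

Node 1, connected to all the others, comes out most central, matching the intuition that better-connected agents see information sooner.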

**26: Yang Xu**, UC Berkeley: *Intervention to Mitigate Contagion in a Financial Network*

Systemic risk in financial networks has received attention from academics since the 2007-2009 financial crisis. We analyze a financial network from the perspective of a regulator who aims to minimize the fraction of defaults under a budget constraint. Unlike the majority of literature in this field, the connections between financial institutions (hereafter, banks) are assumed unknown in the beginning, but are revealed as the contagion process unfolds. We focus on the case in which the number of initial defaults is small relative to the total number of banks. We analyze the optimal intervention policy first for a regular network consisting of “vulnerable banks”. We then discuss the optimal intervention problem in a more general network setting.