Events

Yang Xu: Intervention to Mitigate Contagion in a Financial Network

Systemic risk in financial networks has received considerable attention from academics since the 2007-2009 financial crisis. We analyze a financial network from the perspective of a regulator who aims to minimize the fraction of defaults under a budget constraint. Unlike most of the literature in this field, we assume the connections between financial institutions (hereafter, banks) are unknown at the outset and are revealed only as the contagion process unfolds. We focus on the case in which the number of initial defaults is small relative to the total number of banks. We analyze the optimal...
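
The cascade mechanism underlying this setting can be illustrated with a minimal sketch (hypothetical and simplified: the talk's model reveals edges only as defaults occur, whereas this toy version assumes the exposure network is fully known):

```python
def cascade(exposures, capital, initial_defaults):
    """Threshold contagion on a known interbank network (toy model).

    exposures[i][j] -- amount bank i is owed by bank j; a default of j
    wipes out that claim.  Bank i defaults once its accumulated losses
    reach its capital buffer.
    """
    defaulted = set(initial_defaults)
    changed = True
    while changed:
        changed = False
        for i in range(len(capital)):
            if i in defaulted:
                continue
            loss = sum(exposures[i][j] for j in defaulted)
            if loss >= capital[i]:
                defaulted.add(i)
                changed = True
    return defaulted
```

In this toy setting a single seed default can topple a chain of banks; a regulator's budget could instead be spent recapitalizing an intermediate bank to break the chain.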

Jeff Bohn, State Street Global Exchange: Simplicity and complexity in risk modeling: When is a risk model too simple? 

Most financial risk modelers (like most scientific modelers) attempt to develop models that are as simple as possible while still remaining useful. That said, models can become too simple, thereby losing their signaling power. This circumstance can leave a risk manager without guidance as to the material risks to which a financial institution may be exposed. As financial markets, financial securities, and regulations have proliferated, trading off simplicity against complexity has become a particularly difficult challenge for financial risk modelers. This presentation introduces some thoughts and raises...

Alex Shkolnik, UC Berkeley: Dynamic Importance Sampling for Compound Point Processes

We develop efficient importance sampling estimators of certain rare event probabilities involving compound point processes. Our approach is based on the state-dependent techniques developed in (Dupuis & Wang 2004) and subsequent work. The design of the estimators departs from past literature to accommodate the point process setting. Namely, the state-dependent change of measure is updated not at event arrivals but over a deterministic time grid. Several common criteria for the optimality of the estimators are analyzed. Numerical results illustrate the advantages of the proposed...
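
As a hedged illustration of the general idea (not the paper's state-dependent scheme, which updates the change of measure over a deterministic time grid), the sketch below estimates a tail probability of a compound Poisson sum using a single static exponential tilt; all parameter names are illustrative:

```python
import math
import random

def is_tail_prob(lam, mu, T, b, theta, n_paths, seed=0):
    """Estimate P(S_T > b) for a compound Poisson sum S_T with arrival
    rate lam and Exp(mu) jump sizes, by exponential tilting (theta < mu).

    Under the tilted measure, arrivals occur at rate lam*M and jumps are
    Exp(mu - theta), where M = mu/(mu - theta) is the jump-size MGF.
    """
    rng = random.Random(seed)
    M = mu / (mu - theta)          # jump-size MGF at theta
    tilted_mean = lam * M * T      # tilted Poisson mean for the count
    total = 0.0
    for _ in range(n_paths):
        # Knuth's method for a Poisson(tilted_mean) draw
        n, p, thresh = 0, 1.0, math.exp(-tilted_mean)
        while True:
            p *= rng.random()
            if p < thresh:
                break
            n += 1
        s = sum(rng.expovariate(mu - theta) for _ in range(n))
        if s > b:
            # likelihood ratio dP/dQ evaluated on this path
            total += math.exp(-theta * s + lam * T * (M - 1.0))
    return total / n_paths
```

Each path that clears the threshold is down-weighted by its likelihood ratio, so the estimator remains unbiased while the tilt makes the rare event common under the sampling measure.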

Roger Craine, UC Berkeley: Safe Capital Ratios for Bank Holding Companies

This paper gives three quantitative answers to Fischer’s question “at what level should capital ratios be set?” based on (1) the Fed’s 2015 stress tests, (2) VLab’s systemic risk measures, and (3) our (Craine-Martin) estimates. The paper compares safe capital ratios for 18 bank holding companies. The Craine-Martin (CM) implied safe capital ratios are the highest, averaging 22%, followed by VLab’s at 16%, with the Fed stress tests the lowest at 11%. We (CM) find higher implied safe capital ratios than VLab because our specification allows losses at one bank holding company to affect the...

Roger Stein, MIT: A simple hedge for longevity risk and reimbursement risk using research-backed obligations 

Longevity risk is the risk that the promised recipient of lifetime cashflows ends up living much longer than originally anticipated, causing a funding shortfall. A related risk, reimbursement risk, is the risk that providers of health insurance face when new and expensive drugs are introduced and the insurer must cover their costs. Longevity and reimbursement risks are particularly acute in domains in which scientific breakthroughs can increase the speed of new drug development. An emerging asset class, research-backed obligations, or RBOs (cf. Fernandez et al., 2012), provides a...

Ezra Nahum, Goldman Sachs: The Life of a Quant 1995-2015

In this lecture, I will review different practical risk management and modeling challenges that I encountered during the course of my career (inclusive of my Ph.D. years). My review will highlight how the landscape has changed. For instance, while modeling exotic derivatives was the main activity for quants in the late 90s, capital optimization is the most important consideration today.

Start date: 2016-02-16 11:00:00 End date: 2016-02-16 12:30:00 Venue: 639 Evans Hall at UC Berkeley Address: 639 Evans Hall, Berkeley, CA, 94720

Nathan Tidd, Tidd Labs: Predicting Equity Returns with Valuation Factor Models 

This presentation summarizes the motivation, methodology, and initial test results of Equity Valuation Factor Models, an adaptation of popular multi-factor modeling techniques that seeks to explain the price and payoff of business ownership as a precursor to explaining equity returns. A departure from traditional returns-based models, the approach produces new information, such as current factor prices, that informs both risk and return expectations, with a number of potential applications for real-world investment decisions.

Start date: 2016-02-09 11:00:00 End date: 2016-02-09 12:30:...

Yaniv Konchitchki, UC Berkeley (Haas): Accounting and the Macroeconomy: The Housing Market

This study introduces a new approach for financial statement analysis: the geographic analysis of firms’ financial statements.

Start date: 2016-02-23 11:00:00 End date: 2016-02-23 12:30:00 Venue: 639 Evans Hall at UC Berkeley Address: 639 Evans Hall, Berkeley, CA, 94720

Alex Shkolnik, UC Berkeley: Identifying Financial Risk Factors with a Low-Rank Sparse Decomposition

Factor models of security returns aim to decompose an asset return covariance matrix into a systematic component and a specific risk component. Standard approaches such as PCA and maximum likelihood suffer from several drawbacks, including a lack of robustness and strict assumptions on the underlying model of returns. We survey some modern, robust methods that uniquely decompose a return covariance matrix into a low-rank component and a sparse component. Surprisingly, identifying the unique low-rank and sparse components is feasible under mild assumptions. We apply the...
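
One family of such methods is principal component pursuit, sketched below via an augmented-Lagrangian iteration (a simplification for illustration; the talk may rely on different algorithms or conditions):

```python
import numpy as np

def pcp(M, lam=None, mu=None, max_iter=1000, tol=1e-7):
    """Decompose M into L (low rank) + S (sparse) by alternating
    singular-value thresholding and entrywise soft thresholding."""
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))   # standard PCP weight
    if mu is None:
        mu = 0.25 * m * n / np.abs(M).sum()
    mu_bar = mu * 1e7
    S = np.zeros_like(M)
    Y = np.zeros_like(M)
    for _ in range(max_iter):
        # low-rank update: shrink singular values of M - S + Y/mu
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = U @ np.diag(np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # sparse update: entrywise soft thresholding
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        Y += mu * (M - L - S)
        mu = min(mu * 1.5, mu_bar)       # grow penalty for faster convergence
        if np.linalg.norm(M - L - S) <= tol * np.linalg.norm(M):
            break
    return L, S
```

On an easy synthetic instance (a rank-one matrix plus a few sparse spikes), the iteration separates the two components to good accuracy.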

Kellie Ottoboni, UC Berkeley: Model-based matching for causal inference in observational studies

Drawing causal inferences from nonexperimental data is difficult due to the presence of confounders: variables that affect both selection into treatment groups and the outcome. Post-hoc matching and stratification can be used to group individuals who are comparable with respect to important variables, but commonly used methods often fail to balance confounders between groups. We introduce model-based matching, a nonparametric method that groups observations that would be alike aside from the treatment. We use model-based matching to conduct stratified permutation tests of association...
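
A hedged sketch of the stratified permutation step (illustrative only; the strata are assumed to come from a fitted outcome model's predictions, and all names are hypothetical):

```python
import random

def stratified_permutation_pvalue(residual, treat, strata, n_perm=999, seed=0):
    """Permutation test of association between treatment and model
    residuals, permuting treatment labels only within strata of units
    judged comparable by the outcome model."""
    rng = random.Random(seed)
    groups = {}
    for i, s in enumerate(strata):
        groups.setdefault(s, []).append(i)

    def statistic(tr):
        # sum over strata of |mean treated residual - mean control residual|
        total = 0.0
        for idx in groups.values():
            r1 = [residual[i] for i in idx if tr[i] == 1]
            r0 = [residual[i] for i in idx if tr[i] == 0]
            if r1 and r0:
                total += abs(sum(r1) / len(r1) - sum(r0) / len(r0))
        return total

    observed = statistic(treat)
    exceed = 0
    for _ in range(n_perm):
        perm = list(treat)
        for idx in groups.values():
            vals = [perm[i] for i in idx]
            rng.shuffle(vals)
            for i, v in zip(idx, vals):
                perm[i] = v
        if statistic(perm) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)
```

Because labels are shuffled only within strata, the reference distribution respects the grouping of comparable units, which is what lets the test control for the modeled confounding.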