Events

Ram Akella, University of California, Berkeley School of Information and TIM/CITRIS/UCSC: Dynamic Multi-modal and Real-Time Causal Predictions and Risks

There are three major trends in prediction and risk analytics. We describe our research on two fronts and speculate on the third. We do this in the context of healthcare analytics and computational advertising at Silicon Valley firms. We first describe prediction and risk analytics using combined multi-modal numerical data (from vitals and labs) and text data (from notes written by doctors and nurses). We describe and analyze dynamic models of patient mortality probabilities, integrate novel topic modeling to account for topic constraints, and demonstrate superior performance on Intensive...
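
A minimal sketch of one way such multi-modal fusion can be set up: topic proportions extracted from clinical notes are concatenated with numeric vitals and fed to a classifier. The toy data, feature choices, and the static logistic model are all illustrative assumptions, not the authors' dynamic pipeline.

```python
# Hypothetical sketch: fuse numeric vitals with topic features from notes,
# then fit a mortality classifier. Toy data; not the authors' pipeline.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression

notes = ["pt stable, pain controlled", "sepsis suspected, lactate rising",
         "extubated, breathing comfortably", "hypotensive despite fluids"]
vitals = np.array([[80, 120], [110, 85], [75, 118], [125, 70]])  # HR, SBP
died = np.array([0, 1, 0, 1])

counts = CountVectorizer().fit_transform(notes)
topics = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(counts)

X = np.hstack([vitals, topics])           # multi-modal feature matrix
clf = LogisticRegression().fit(X, died)   # static stand-in for the dynamic model
print(clf.predict_proba(X)[:, 1])         # per-patient mortality probabilities
```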

Mark Flood, Office of Financial Research: Measures of Financial Network Complexity: A Topological Approach

We present a general definition of complexity appropriate for financial counterparty networks and derive several topologically based implementations. These range from simple and obvious metrics to others that are more mathematically subtle. It is important to tailor a complexity measure to the specific context in which it is used. This paper introduces measures of the complexity of search and netting in dealer markets. We define measures of line graph homology and collateral line graph homology that are sensitive to network interactions, such as collateral commingling and interdependent...
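
As a concrete, if crude, illustration of topological complexity on a counterparty network and its line graph (this construction is mine, not the paper's exact metrics): the first Betti number b1 = E - V + C counts the independent cycles in a graph, and the line graph turns each trading relationship into a node so that interactions between relationships become visible.

```python
# Illustrative sketch: b1 = E - V + C counts independent cycles, a crude
# proxy for topological complexity of a counterparty network and of its
# line graph (whose nodes are the original trading relationships).
import networkx as nx

G = nx.Graph([("DealerA", "DealerB"), ("DealerB", "DealerC"),
              ("DealerC", "DealerA"), ("DealerC", "HedgeFund")])

def betti_1(H):
    return H.number_of_edges() - H.number_of_nodes() + nx.number_connected_components(H)

L = nx.line_graph(G)        # edges of G become nodes of L
print(betti_1(G), betti_1(L))  # 1 cycle in G, 2 in its line graph
```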

Samim Ghamami, Office of Financial Research: Does OTC Derivatives Reform Incentivize Central Clearing?

Joint work with Paul Glasserman. Abstract: The reform program for the over-the-counter (OTC) derivatives market launched by the G-20 nations in 2009 seeks to reduce systemic risk from OTC derivatives. The reforms require that standardized OTC derivatives be cleared through central counterparties (CCPs), and they set higher capital and margin requirements for non-centrally cleared derivatives. Our objective is to gauge whether the higher capital and margin requirements adopted for bilateral contracts create a cost incentive in favor of central clearing, as intended. We introduce a model of...
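
A stylized sketch of the kind of comparison at stake (the numbers and the two-term cost structure are invented for illustration; the paper's model is richer): a dealer weighs the funding cost of margin plus the cost of regulatory capital under bilateral trading against the same costs under central clearing.

```python
# Stylized cost comparison with invented parameters, not the paper's model:
# cost of a trade = margin funding cost + cost of regulatory capital.
funding_spread, cost_of_equity = 0.005, 0.10

def trade_cost(margin, capital):
    return margin * funding_spread + capital * cost_of_equity

bilateral = trade_cost(margin=12.0, capital=1.0)  # higher uncleared margin
cleared = trade_cost(margin=8.0, capital=0.4)     # CCP netting lowers both
print(bilateral, cleared,
      "clearing favored" if cleared < bilateral else "bilateral favored")
```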

Daniel Mantilla-Garcia, Optimal Asset Management: Disentangling the Volatility Return: A Predictable Return Driver of Any Diversified Portfolio

Abstract: The long-term performance of any portfolio can be decomposed as the sum of the weighted average long-term return of its assets plus the volatility return of the portfolio. The volatility return represents a larger proportion of the total return of portfolios with more homogeneous assets, such as stock factor portfolios. We unveil a direct relationship between the volatility return and the cross-sectional variance of stock returns, as well as the average idiosyncratic variance of the stocks in the portfolio. Furthermore, we introduce a strategy that maximizes the volatility...
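
A numerical illustration of the decomposition (my toy example, not the paper's derivation): the gap between a rebalanced portfolio's growth rate and the weighted average of its assets' growth rates, the volatility return, is approximately half the difference between average asset variance and portfolio variance.

```python
# Toy check of the volatility-return decomposition on simulated returns.
import numpy as np

rng = np.random.default_rng(0)
R = rng.normal(0.0, 0.05, size=(5000, 10))  # 10 assets, 5000 periods
w = np.full(10, 0.1)                        # equal-weight, rebalanced

g_assets = np.log(1 + R).mean(axis=0)       # per-asset growth rates
g_port = np.log(1 + R @ w).mean()           # portfolio growth rate

vol_return = g_port - w @ g_assets
approx = (w @ R.var(axis=0) - (R @ w).var()) / 2
print(vol_return, approx)                   # the two are close
```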

Danny Ebanks, Federal Reserve: The Network of Large-Value Loans in the US: Concentration and Segregation

In this joint project with Anton Badev, we analyze the universe of large-value loans intermediated through Fedwire, the primary U.S. real-time, gross settlement service provided by the Federal Reserve System, for the period from 2007 to 2015. We embed banks' bilateral lending relationships and interest rate quotes in a game on a graph following Badev (2013), for which we approximate the equilibrium play via a k-player dynamic. We document a series of fundamental changes in the topology of the bilateral loan network and propose a framework to study the evolution of the concentration of large...
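
A simple illustration of one way lending concentration can be measured on such a network (my construction, not the authors' framework): a Herfindahl-Hirschman index over each bank's share of total loan volume.

```python
# Toy interbank lending network; concentration via a Herfindahl index
# of lender volume shares. Not the authors' framework.
import networkx as nx

G = nx.DiGraph()
G.add_weighted_edges_from([("A", "B", 50), ("A", "C", 30),
                           ("B", "C", 15), ("D", "A", 5)])  # lender, borrower, $

volume = {b: sum(d["weight"] for _, _, d in G.out_edges(b, data=True)) for b in G}
total = sum(volume.values())
hhi = sum((v / total) ** 2 for v in volume.values())
print(volume, round(hhi, 3))  # shares: A 0.8, B 0.15, D 0.05 -> HHI = 0.665
```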

Ryan Copus and Hannah Laqueur, UC Berkeley: Machines Learning Justice: A New Approach to the Problems of Inconsistency and Bias in Adjudication

Abstract: We offer a two-step algorithmic approach to the problems of inconsistency and bias in legal decision making. First, we propose a new tool for reducing inconsistency: Judgmental Bootstrapping Models (“JBMs”) built with machine learning methods. JBMs, by providing judges with recommendations generated from statistical models of themselves, can help those judges make better and more consistent decisions. To illustrate these advantages, we build a JBM of release decisions for the California Board of Parole Hearings. Second, we describe a means to address systematic biases that are...
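
A hedged sketch of the judgmental-bootstrapping idea: fit a statistical model to a decision maker's own past rulings, then surface its recommendation for a new case. The case features, toy data, and choice of gradient boosting are mine for illustration, not the authors' parole model.

```python
# Model-of-the-judge sketch: learn from past decisions, recommend on a new case.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# columns: age, prior offenses, years served (invented case features)
X_past = np.array([[55, 0, 20], [30, 4, 5], [48, 1, 15], [25, 6, 3]])
granted = np.array([1, 0, 1, 0])   # the judge's own past decisions

jbm = GradientBoostingClassifier(random_state=0).fit(X_past, granted)
new_case = np.array([[50, 1, 18]])
print(jbm.predict_proba(new_case)[0, 1])  # the JBM's grant probability
```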

Thomas Idzorek, CFA, Head of Investment Methodology and Economic Research at Morningstar: Popularity: A Unifying Asset Pricing Framework?

In a 2014 article, Thomas Idzorek and Roger Ibbotson introduced popularity as an asset pricing framework. Popularity seems to provide a transcendent principle or insight that explains return premiums that are consistent with equilibrium efficient market asset pricing explanations (traditional risk and return framework) as well as so-called anomalies that...

Jim Hawley & Hendrik Bartel, TruValue Labs: Big Data Analytics and ‘Non-Financial’ Sustainability Information—uses of and initial findings from TruValue Labs’ first years

We present an overview of the current state of ESG (environmental, social, and corporate governance) data in the context of the value of so-called non-financial information. TruValue Labs generates real-time ESG/sustainability data using natural language processing, machine learning, and elements of AI to quantify unstructured (text) information sources. We present an overview of how this quantification process works and then examine a variety of techniques used to analyze this data output. We present some of those techniques and initial results, including TVL (TruValue Labs...
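
As a toy illustration of turning unstructured text into a quantitative ESG signal, a simple lexicon score is shown below; TruValue Labs' actual NLP/ML pipeline is far richer and is not reproduced here, and the word lists are invented.

```python
# Lexicon-based ESG scoring sketch (illustrative only).
POSITIVE = {"renewable", "diversity", "transparency"}
NEGATIVE = {"spill", "lawsuit", "violation"}

def esg_score(text: str) -> float:
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / max(pos + neg, 1)  # score in [-1, 1]

print(esg_score("regulator cites violation after oil spill"))   # -1.0
print(esg_score("new renewable capacity and board diversity"))  # 1.0
```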

Farzad Pourbabaee, UC Berkeley: Portfolio selection: Capital at risk minimization under correlation constraint

We study the portfolio optimization problem in the Black-Scholes setup, subject to certain constraints. Capital at Risk (CaR) resolves many of the shortcomings of Value at Risk, and it is therefore taken as the objective of the optimization problem in this presentation. The CaR-minimizing portfolio is then found under a general correlation constraint between the terminal value of the wealth and an arbitrary financial index. Results are derived for both complete and incomplete markets, and finally, simulations are performed to present the qualitative behavior of the optimal...
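
A Monte Carlo sketch of the setup under my own simplifications: CaR is taken here as initial wealth minus a low quantile of terminal wealth in a Black-Scholes market, and a single risky weight is scanned; the talk's correlation constraint is omitted.

```python
# CaR of a constant-mix Black-Scholes portfolio, by simulation.
import numpy as np

rng = np.random.default_rng(1)
x0, r, mu, sigma, T, alpha = 1.0, 0.01, 0.06, 0.2, 1.0, 0.05
Z = rng.standard_normal(100_000)

for pi in (0.0, 0.5, 1.0):  # weight in the risky asset
    drift = r + pi * (mu - r) - 0.5 * (pi * sigma) ** 2
    X_T = x0 * np.exp(drift * T + pi * sigma * np.sqrt(T) * Z)
    car = x0 - np.quantile(X_T, alpha)  # capital at risk at level alpha
    print(f"pi={pi:.1f}  CaR={car:.4f}")
```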

Robert M. Anderson, CDAR Co-Director: PCA with Model Misspecification

In this project with UC Berkeley Ph.D. candidate Farzad Pourbabaee, we revisit Principal Component Analysis (PCA), which relies on the assumption that the data being analyzed are IID over the estimation window. PCA is frequently applied to financial data, such as stock returns, despite the fact that these data exhibit obvious and substantial changes in volatility. We show that the IID assumption can be substantially weakened; we require only that the return data be generated by a single distribution with a possibly variable scale parameter. In other words, we assume that the return is R_t = v_t φ_t, where the...
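
An illustrative sketch in the spirit of this weakened assumption (the data-generating process and the crude scale estimator are my assumptions): generate returns R_t = v_t φ_t with a time-varying scale v_t, rescale each period by an estimate of v_t, and run PCA on the rescaled data.

```python
# Rescale-then-PCA sketch for returns with a variable scale parameter.
import numpy as np

rng = np.random.default_rng(0)
T, N = 2000, 5
B = rng.normal(size=(N, 2))                     # factor loadings
phi = rng.normal(size=(T, 2)) @ B.T + 0.1 * rng.normal(size=(T, N))
v = np.where(np.arange(T) < T // 2, 1.0, 3.0)   # volatility regime shift
R = v[:, None] * phi                            # returns R_t = v_t * phi_t

v_hat = np.sqrt((R ** 2).mean(axis=1))          # crude per-period scale estimate
R_tilde = R / v_hat[:, None]

eigvals = np.linalg.eigvalsh(np.cov(R_tilde.T))[::-1]
print(eigvals)                                  # two dominant factors visible
```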