Spring 2023

SEM217: Ben Davis, Paraport: Tax Loss Harvesting Optionality

Tuesday, April 25th @ 11:00-12:30 PM, 639 Evans Hall (RECORDING)

The tax law confers upon the investor a timing option – to realize capital losses and defer capital gains. Investment managers offer tax loss harvesting strategies that systematically exploit this option to deliver after-tax returns in excess of tax-oblivious strategies. The question of the optimal exercise policy for this option was first taken up by Constantinides (1984), who showed that, when long- and short-term tax rates are equal, perfect substitute securities exist, and trading occurs at one-year intervals, the optimal policy is to realize losses immediately and defer unrealized gains. More recently, Khang, Cummings and Paradise (2022) examined the impact of loss harvesting frequency, from annual to daily, on the performance of passive-indexed tax loss harvesting. The authors conclude that daily frequency is most effective in all volatility environments, assuming the existence of perfect substitute securities. On the other hand, Israelov and Lu (2022) argue that the IRS wash-sale rule creates a barrier to re-investment, such that investors should be selective about when to harvest a loss. They present a trigger-based loss harvesting policy, with the loss depth of the trigger determined by stock volatility, supported by extensive Monte Carlo simulation. Here we extend this line of reasoning by applying stochastic process theory to prove that such a trigger-based policy is optimal for a stylized, continuous-time model of tax loss harvesting.
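
To fix ideas before the formal analysis, the sketch below simulates a volatility-scaled, trigger-based harvesting rule of the kind Israelov and Lu advocate: a loss is realised only once the drawdown from the tax basis exceeds a trigger depth tied to the stock's volatility, after which harvesting pauses for a 30-day wash-sale window. The trigger formula and all parameter values are assumptions for illustration, not the optimal policy derived in the talk.

```python
# Illustrative Monte Carlo sketch of a volatility-scaled, trigger-based loss
# harvesting rule (the trigger form and all parameters are assumptions).
import numpy as np

def average_tax_benefit(sigma=0.25, mu=0.06, k=1.0, tax_rate=0.35,
                        days=252, n_paths=2000, wash_days=30, seed=0):
    """Average first-year tax benefit per $1 invested under a trigger rule.

    A loss is harvested when the price drops below basis * (1 - depth), where
    depth = k * sigma * sqrt(wash_days / 252) scales the trigger with volatility
    over the wash-sale window. After a harvest, the basis resets (re-investment
    in a substitute) and further harvesting is blocked for `wash_days` days.
    """
    rng = np.random.default_rng(seed)
    dt = 1.0 / 252
    depth = k * sigma * np.sqrt(wash_days / 252)
    benefit = np.zeros(n_paths)
    for p in range(n_paths):
        price, basis, blocked = 1.0, 1.0, 0
        for _ in range(days):
            price *= np.exp((mu - 0.5 * sigma**2) * dt
                            + sigma * np.sqrt(dt) * rng.standard_normal())
            if blocked > 0:
                blocked -= 1
            elif price < basis * (1.0 - depth):
                benefit[p] += tax_rate * (basis - price)  # realised loss x tax rate
                basis, blocked = price, wash_days         # re-buy substitute, reset basis
    return benefit.mean()

if __name__ == "__main__":
    for k in (0.5, 1.0, 2.0):
        print(f"trigger multiple k={k}: avg tax benefit per $1 = {average_tax_benefit(k=k):.4f}")
```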

SEM217: Gary Kazantsev, Bloomberg:

Tuesday, May 2nd @ 11:00-12:30 PM 

SEM217: Hubeyb Gurdogan, CDAR: A propagation model to quantify business interruption losses in supply chain networks

Tuesday, January 24th @ 11:00-12:30 PM

Today's supply chains are global, highly interconnected, and increasingly digital. These three attributes compound the effects of disruptions in production. For a company comprising many factories, a disruption in production at one site can impact production at other locations as well as production at other companies linked through the supply chain. Quantifying the financial impact of business interruption, such as production loss at a factory caused by a natural catastrophe (NatCat) like an earthquake or hurricane, is challenging. The difficulty in quantification is due to complex risk propagation dynamics and complications related to the allocation of business profit to specific sites of production. Complex risk propagation dynamics reflect product and supplier dependencies and the interconnectivity of related risks.


The aim of this research is to estimate production losses at company locations to enable the quantification of exposed business interruption values (i.e., potential gross profit/earnings losses), taking into account interdependencies between the company and the supplying partners within its supply chain network. This approach can provide insurers and reinsurers with the financial metrics required to better address these risks. In this paper, after defining the adapted stochastic fully decomposed supply chain network (FDSN), we propose a new methodology to model the production rate potential at each site of production as a stochastic process via a recursive procedure. Finally, we use the HAZUS Earthquake Model (HAZUS-EM) to estimate downtime and to quantify the impact of business interruption, which is propagated through the FDSN given an interruption in production anywhere in the supply chain network.
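
As a rough, purely hypothetical illustration of the propagation idea (not the FDSN methodology itself), the sketch below pushes a site-level disruption downstream through a tiny supplier network, caps each site's production rate by the dependency-weighted availability of its inputs, and converts the shortfall into a business interruption loss over an assumed downtime. All network data, weights, and profit figures are made up.

```python
# Minimal, hypothetical illustration of downstream propagation of a
# production disruption in a supplier network (not the FDSN model itself).
import networkx as nx

# Directed edge u -> v means "u supplies v"; weight = share of v's input from u.
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("factory_A", "assembler_C", 0.6),
    ("factory_B", "assembler_C", 0.4),
    ("assembler_C", "brand_D", 1.0),
])

# Post-event production rates at directly hit sites (1.0 = full capacity).
rate = {n: 1.0 for n in G.nodes}
rate["factory_A"] = 0.3   # e.g. earthquake downtime estimated from a NatCat model

# Assumed propagation rule: a site's rate is capped by the dependency-weighted
# availability of its inputs (substitution is ignored for simplicity).
for node in nx.topological_sort(G):
    preds = list(G.predecessors(node))
    if preds:
        supplied = sum(G[u][node]["weight"] * rate[u] for u in preds)
        rate[node] = min(rate[node], supplied)

annual_gross_profit = {"factory_A": 5.0, "factory_B": 3.0,
                       "assembler_C": 8.0, "brand_D": 20.0}  # $m, hypothetical
downtime_years = 0.25                                        # hypothetical 3-month interruption

bi_loss = {n: annual_gross_profit[n] * (1 - rate[n]) * downtime_years for n in G.nodes}
print(bi_loss)
```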

SEM217: Stephen Kealhofer and Simon Cenci, Blackstone: A causal approach to test empirical capital structure regularities

Tuesday, January 31st @ 11:00-12:30 PM (RECORDING)

Capital structure theories are often formulated as causal narratives to explain which factors drive financing choices. These narratives are usually examined by estimating cross-sectional relations between leverage and its determinants. However, the limitations of causal inference from observational data are often overlooked. To address this issue, we use structural causal modeling to identify how classic determinants of leverage are causally linked to capital structure and how this causal structure influences the effect-estimation process. The results support the causal role of variables that measure the potential for information asymmetry concerning firms’ market values. Overall, our work provides a crucial step toward connecting capital structure theories with their empirical tests beyond simple correlations.

SEM217: David Romer and Christina Romer, UC Berkeley: Inflation and monetary policy

Tuesday, February 7th @ 11:00-12:30 PM

Drawing on work in progress, we examine the sources of the recent inflation, the Federal Reserve’s response, and the likely implications. We present evidence that the sharp rise in inflation reflects a mix of supply factors (particularly in the early stages) and an overheated economy (particularly later). The overheated economy, in turn, appears to have been caused in part by highly expansionary fiscal policy and the Federal Reserve’s slowness in tightening policy. We show that the sharp tightening of monetary policy since mid-2022 is similar to numerous anti-inflationary shifts in monetary policy since World War II. The evidence from those episodes suggests that little of the impact of the recent tightening has yet occurred, that the impact on real activity is likely to be substantial, and that the impact on inflation is highly uncertain.

SEM217: William Zame, UCLA: Index Funds, Asset Prices and the Welfare of Investors

Tuesday, February 14th @ 11:00-12:30 PM (RECORDING)

We present an equilibrium model in which heterogeneous investors choose among bonds, stocks, and an Index Fund holding the market portfolio. We show that, under standard assumptions, an equilibrium exists. We then derive predictions for equilibrium asset prices, investor behavior, and investor welfare. The presence of the index fund (or a decrease in the fee charged by the index fund) tends to increase stock market participation and thus increase asset prices and decrease expected returns from investing in the stock market. As a result, few - if any - investors benefit from the availability of cheap market indexing.

SEM217: Zachary Feinstein, Stevens Institute of Technology: Endogenous Network Valuation Adjustment and the Systemic Term Structure in a Dynamic Interbank Model

Tuesday, February 21st @ 11:00-12:30 PM (RECORDING)

In this talk we introduce an interbank network with stochastic dynamics in order to study the yield curve of bank debt under an endogenous network valuation adjustment. This entails a forward-backward approach in which the future probability of default is required to determine the present value of debt. As a consequence, the systemic model presented herein provides the network valuation adjustment to the term structure for free, with no additional steps required. Time permitting, we present this problem in two parts: (i) a single-maturity setting that closely matches the traditional interbank network literature and (ii) a multiple-maturity setting that considers the full term structure. Numerical case studies are presented throughout to demonstrate the financial implications of this systemic risk model.
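
For intuition on the forward-backward structure (the value of a bank's debt today depends on the payments it receives, which in turn depend on other banks' solvency), here is a stripped-down, single-period clearing sketch in the spirit of Eisenberg and Noe. It illustrates only the fixed-point logic, not the stochastic, multi-maturity model of the talk, and the balance sheets are hypothetical.

```python
# Single-period clearing illustration: debt values are a fixed point, since
# what a bank can pay depends on the payments it receives from the network.
import numpy as np

L = np.array([[0.0, 4.0, 2.0],   # L[i, j]: nominal liability of bank i to bank j
              [3.0, 0.0, 3.0],
              [1.0, 1.0, 0.0]])
external = np.array([1.0, 2.0, 4.0])   # external (non-interbank) assets, hypothetical

p_bar = L.sum(axis=1)          # total nominal obligations of each bank (all > 0 here)
Pi = L / p_bar[:, None]        # relative liability matrix

p = p_bar.copy()
for _ in range(100):           # Picard iteration toward the clearing payment vector
    p_next = np.minimum(p_bar, external + Pi.T @ p)  # limited liability of each bank
    if np.allclose(p_next, p):
        break
    p = p_next

print("clearing payments:", p)
print("recovery rate on each bank's debt:", p / p_bar)
```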

SEM217: Gerald Garvey, BlackRock: Industry Winners and Losers in a Lower-Carbon Economy: A Structural Model

Tuesday, February 28th @ 11:00-12:30 PM (RECORDING)

This paper models a viable low-carbon economy using global input-output tables along with emissions data for 54 industries in 57 countries. Some high-emitting industries, such as Air Transport and Retailing, support a wide range of otherwise low-carbon goods and services and are predicted to fare well. Industries such as Health Care and Banks that appear green based on direct emissions are nonetheless vulnerable due to their reliance on high-emission sectors such as Construction. To test the model, the authors use periods of high historical energy prices as a proxy for more stringent carbon regulation. Industries that the model classifies as resilient perform well, while industries that are commonly viewed as green significantly underperform in the face of high energy costs.
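
A worked miniature of the underlying input-output logic: direct emissions understate an industry's exposure because purchased inputs carry embodied emissions, which the Leontief inverse captures. The three-sector table below is entirely hypothetical; the paper works with 54 industries in 57 countries.

```python
# Hypothetical 3-sector example of total (direct + upstream) emissions intensity
# via the Leontief inverse, the standard input-output device.
import numpy as np

sectors = ["banks", "construction", "air_transport"]
# A[i, j]: dollars of input from sector i needed per dollar of output of sector j.
A = np.array([[0.05, 0.10, 0.05],
              [0.02, 0.15, 0.10],
              [0.01, 0.05, 0.10]])
direct_intensity = np.array([0.01, 0.60, 1.20])   # tCO2e per $ of output (made up)

# Total requirements: (I - A)^{-1} gives the output needed across all sectors,
# directly and indirectly, per $ of final demand for each sector.
leontief_inverse = np.linalg.inv(np.eye(3) - A)
total_intensity = direct_intensity @ leontief_inverse

for s, d, t in zip(sectors, direct_intensity, total_intensity):
    print(f"{s:>14s}: direct {d:.2f}, total {t:.2f} tCO2e/$")
```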

SEM217: Jordan Lekeufack, UC Berkeley: Volatility is (Mostly) Path-Dependent

Tuesday, March 7th @ 11:00-12:30 PM

We learn from data that volatility is mostly path-dependent: up to 90% of the variance of the implied volatility of equity indexes is explained endogenously by past index returns, and up to 65% for (noisy estimates of) future daily realized volatility. The path-dependency that we uncover is remarkably simple: a linear combination of a weighted sum of past daily returns and the square root of a weighted sum of past daily squared returns with different time-shifted power-law weights capturing both short and long memory. This simple model, which is homogeneous in volatility, is shown to consistently outperform existing models across equity indexes and train/test sets for both implied and realized volatility. It suggests a simple continuous-time path-dependent volatility (PDV) model that may be fed historical or risk-neutral parameters. The weights can be approximated by superpositions of exponential kernels to produce Markovian models. In particular, we propose a 4-factor Markovian PDV model which captures all the important stylized facts of volatility, produces very realistic price and volatility paths, and jointly fits SPX and VIX smiles remarkably well. We thus show that a continuous-time Markovian parametric stochastic volatility (actually, PDV) model can practically solve the joint SPX/VIX smile calibration problem.
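
In symbols, the model described above predicts volatility as sigma_t ≈ beta_0 + beta_1 * R_{1,t} + beta_2 * sqrt(R_{2,t}), with R_1 a kernel-weighted sum of past returns and R_2 a kernel-weighted sum of past squared returns. A minimal sketch of the feature construction follows; the kernel parameters and coefficients are placeholders rather than the fitted values reported in the paper.

```python
# Sketch of the path-dependent volatility (PDV) predictors described above.
# Kernel and coefficient values are illustrative, not fitted.
import numpy as np

def tspl_kernel(lags, alpha, delta):
    """Time-shifted power-law weights K(tau) proportional to (tau + delta)^(-alpha)."""
    w = (lags + delta) ** (-alpha)
    return w / w.sum()

def pdv_features(returns, max_lag=252, alpha1=1.5, delta1=5.0,
                 alpha2=1.2, delta2=5.0):
    """R1 (trend) and R2 (activity) features on the most recent date."""
    r = np.asarray(returns)[-max_lag:][::-1]        # most recent return first
    lags = np.arange(1, len(r) + 1, dtype=float)
    R1 = np.sum(tspl_kernel(lags, alpha1, delta1) * r)
    R2 = np.sum(tspl_kernel(lags, alpha2, delta2) * r**2)
    return R1, R2

# Illustrative use with simulated daily returns and made-up coefficients.
rng = np.random.default_rng(1)
rets = 0.01 * rng.standard_normal(1000)
R1, R2 = pdv_features(rets)
beta0, beta1, beta2 = 0.04, -0.5, 1.0               # hypothetical values
sigma_hat = beta0 + beta1 * R1 + beta2 * np.sqrt(R2)
print(f"predicted vol: {sigma_hat:.3f}")
```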

SEM217: Marielle de Jong, Grenoble Ecole de Management: Portfolio Optimization in an Uncertain World

Tuesday, March 14th @ 11:00-12:30 PM (RECORDING)

Mean-variance efficient portfolios are optimal, as Modern Portfolio Theory alleges, only if risk is foreseeable, that is, under the hypothesis that price (co)variances are known with certainty. Admitting uncertainty changes the picture. If portfolios are presumed to be vulnerable to unforeseen price shocks as well, risk optimality is no longer obtained by minimizing variance alone but also pertains to the diversification of the portfolio, which provides protection against unforeseen events.

Generalizing MPT in this respect leads to a double risk objective: minimize variance and maximize diversification. We demonstrate that a series of portfolio construction techniques developed as alternatives to MPT in fact address this double objective, among them Bayesian optimization, entropy-based optimization, risk parity, and covariance shrinkage. We give an analytical demonstration and thereby provide new theoretical backing for these techniques.
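
One concrete reading of the double objective (a hypothetical formulation for illustration, not necessarily the one presented in the talk) is a penalised problem that trades portfolio variance off against an entropy-style diversification bonus; as the penalty grows, the solution moves from the minimum-variance portfolio toward equal weights.

```python
# Illustrative penalised problem: minimise variance minus an entropy-style
# diversification bonus. The functional form is an assumption for illustration.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 5
B = rng.standard_normal((n, n))
Sigma = B @ B.T / n + 0.05 * np.eye(n)       # made-up covariance matrix

def objective(w, lam):
    w = np.abs(w) / np.abs(w).sum()          # long-only, fully invested
    variance = w @ Sigma @ w
    entropy = -np.sum(w * np.log(w + 1e-12)) # higher = more diversified
    return variance - lam * entropy

for lam in (0.0, 0.01, 0.1):
    res = minimize(objective, x0=np.ones(n) / n, args=(lam,), method="Nelder-Mead")
    w = np.abs(res.x) / np.abs(res.x).sum()
    print(f"lambda={lam:<5} weights={np.round(w, 3)}")
```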

SEM217: Lea Tschan, University of St.Gallen: Green Finance and Top Income Inequality

Tuesday, March 21st @ 11:00-12:30 PM

This paper provides empirical evidence, from a broad panel of countries, for a significant positive association between green finance and top income inequality. This relationship is strongest for countries with low GDP or low levels of financial development. Moreover, we find evidence for a significant positive long-term effect on inequality that persists for more than five years. We argue that the association between green finance and inequality is at least partially driven by an innovation channel. Using a moderated mediation setup, we show that innovation mediates the positive relationship between green finance and top income inequality.

SEM217: Cancelled

Tuesday, April 4th @ 11:00-12:30 PM 

SEM217: Youhong Lee, UC Santa Barbara: Regularized Estimators in High Dimensional PCA

Tuesday, April 11th @ 11:00-12:30 PM (RECORDING)

The idea of regularization that combines a simply structured target with a classical estimator is popular in high-dimensional data analysis. We propose a new regularization method, and a fast machine learning algorithm for it, called direction-regularized principal component analysis (drPCA). The method solves a PCA problem that seeks the direction of maximum variance of the data subject to some prior target direction. An asymptotic analysis of the solution under the high-dimension, low-sample-size framework yields an optimal tuning parameter that minimizes an asymptotic loss function, and the estimator corresponding to this optimal tuning parameter is learned quickly from the data. We also show that, under some specific covariance structures, our estimator is equivalent to the Ledoit-Wolf constant correlation shrinkage estimator and to a recently proposed James-Stein estimator for the first principal component.
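
To make the flavour of the construction concrete, here is one possible (guessed) form of direction regularisation: blend the sample leading eigenvector with a prior target direction and renormalise, with a tuning parameter controlling the blend. This is only an illustrative stand-in; the actual drPCA estimator and its optimal tuning parameter are defined in the talk.

```python
# A guessed-at sketch of direction-regularised PCA: blend the sample leading
# eigenvector with a prior target direction. The actual drPCA estimator may differ.
import numpy as np

def dr_pca_first_pc(X, target, gamma):
    """Return a unit vector interpolating between the sample first PC and `target`.

    gamma = 0 -> pure sample PCA; gamma = 1 -> pure prior target direction.
    """
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / (len(X) - 1)
    eigvals, eigvecs = np.linalg.eigh(S)
    pc1 = eigvecs[:, -1]
    target = target / np.linalg.norm(target)
    pc1 = pc1 if pc1 @ target >= 0 else -pc1      # fix sign ambiguity
    v = (1 - gamma) * pc1 + gamma * target
    return v / np.linalg.norm(v)

# Example: high dimension, low sample size, with an equal-weight ("market") target,
# which is where a constant-correlation-style prior is natural.
rng = np.random.default_rng(3)
p, n = 200, 20
market = rng.standard_normal(n)
X = np.outer(market, np.ones(p)) + rng.standard_normal((n, p))
target = np.ones(p) / np.sqrt(p)
for gamma in (0.0, 0.5, 1.0):
    v = dr_pca_first_pc(X, target, gamma)
    print(f"gamma={gamma}: alignment with target = {v @ target:.3f}")
```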

SEM217: David Buckle: When a dearth of active management affects market performance either the Grossman-Stiglitz paradox is explained, or the asset management industry needs a levy

Tuesday, April 18th @ 11:00-12:30 PM, 639 Evans Hall (RECORDING)

We hypothesise that the market Sharpe ratio depends on price discovery. We adapt Treynor and Black’s model, making the market Sharpe ratio endogenous to the proportion of investment that is actively managed. If investors recognise this endogeneity and improve their utility by investing such that price discovery ensures an optimal market Sharpe ratio, they will allocate to active strategies – even if these strategies underperform the market – thereby explaining the Grossman-Stiglitz paradox. If investors instead consider the market Sharpe ratio exogenous, they will not allocate to active management, minimising the market Sharpe ratio. We show that a self-financing levy maximises investor utility in this case.
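
A toy numerical reading of the endogeneity argument (the link from the actively managed share to the market Sharpe ratio, and all parameter values, are assumptions made purely for illustration): if the market Sharpe ratio rises, with diminishing returns, in the share of assets under active management, an investor who internalises this link can prefer a positive active allocation despite the fee drag, whereas one who treats the Sharpe ratio as exogenous allocates nothing to active.

```python
# Toy numerical reading of the endogeneity argument. The functional form of
# market_sharpe(x) and every parameter value are assumptions for illustration only.
import numpy as np

sigma_m = 0.18        # market volatility
risk_aversion = 3.0
active_fee = 0.02     # annual fee drag paid on the actively managed sleeve

def market_sharpe(x):
    """Assumed concave link from the actively managed share x to the market Sharpe ratio."""
    return 0.10 + 0.15 * np.sqrt(x)

def utility(x):
    """Mean-variance utility of holding the market when a share x is actively managed."""
    mu = market_sharpe(x) * sigma_m - active_fee * x
    return mu - 0.5 * risk_aversion * sigma_m**2

xs = np.linspace(0.0, 1.0, 1001)
best = xs[np.argmax(utility(xs))]
print(f"utility with no active management (Sharpe treated as exogenous): {utility(0.0):.4f}")
print(f"utility-maximising active share when endogeneity is recognised:  {best:.2f}")
print(f"utility at that share: {utility(best):.4f}")
```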