All seminars are held in 1011 Evans Hall at UC Berkeley, unless otherwise notified.
Drawing causal inferences from observational studies is a central pillar of many disciplines. One sufficient condition for identifying the causal effect is that the treatment-outcome relationship is unconfounded conditional on the observed covariates. It is often believed that the more covariates we condition on, the more plausible this unconfoundedness assumption becomes. This belief has had a huge impact on practical causal inference, suggesting that we should adjust for all pretreatment covariates. However, when there is unmeasured confounding between the treatment and outcome, an estimator adjusting for some pretreatment covariate may have greater bias than one that does not adjust for it. Such a covariate is called a bias amplifier; examples include instrumental variables that are independent of the confounder and affect the outcome only through the treatment. Previously, theoretical results for this phenomenon had been established only for linear models. We fill this gap in the literature by providing a general theory, showing that the phenomenon arises under a wide class of models satisfying certain monotonicity assumptions.
Read the paper this talk is based on: Ding et al (2017)
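The linear-model case of bias amplification can be seen in a small simulation (illustrative only, not from the paper): with an unmeasured confounder U and an instrument Z, adjusting for Z moves the treatment-effect estimate further from the truth.

```python
import numpy as np

rng = np.random.default_rng(0)
n, tau = 200_000, 0.5                 # sample size and true treatment effect
Z = rng.normal(size=n)                # instrument: affects Y only through T
U = rng.normal(size=n)                # unmeasured confounder
T = Z + U + rng.normal(size=n)        # treatment
Y = tau * T + U + rng.normal(size=n)  # outcome

def ols(covariates, y):
    """Least-squares coefficients with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), *covariates])
    return np.linalg.lstsq(X, y, rcond=None)[0]

tau_unadj = ols([T], Y)[1]     # no covariate adjustment
tau_adj = ols([T, Z], Y)[1]    # adjusting for the instrument Z

# Both estimates are biased, but the Z-adjusted one is biased more.
print(abs(tau_unadj - tau), abs(tau_adj - tau))
```

With these parameters the unadjusted bias is about 1/3 while the Z-adjusted bias is about 1/2, matching the amplification phenomenon described in the talk.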
We introduce two approaches to computing and minimizing the risk measure Conditional Expected Drawdown (CED) of Goldberg and Mahmoud (2016). The first is a continuous-time formulation that yields a partial differential equation (PDE) solution for computing and minimizing CED; the second is a sampling-based approach that uses a linear program (LP) to minimize CED.
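In the sampling-based spirit, CED at confidence level alpha is the expected maximum drawdown conditional on it exceeding its alpha-quantile. A minimal sketch, with an illustrative path model that is not from the talk:

```python
import numpy as np

rng = np.random.default_rng(1)

def max_drawdown(path):
    """Largest peak-to-trough decline of a cumulative-return path."""
    running_peak = np.maximum.accumulate(path)
    return np.max(running_peak - path)

# 10,000 simulated one-year paths of daily returns (toy dynamics)
paths = np.cumsum(rng.normal(0.0, 0.01, size=(10_000, 252)), axis=1)
mdd = np.array([max_drawdown(p) for p in paths])

alpha = 0.90
threshold = np.quantile(mdd, alpha)   # alpha-quantile of max drawdown
ced = mdd[mdd >= threshold].mean()    # CED: tail mean of max drawdown
print(ced)
```

By construction CED sits above both the quantile and the average maximum drawdown, which is what makes it a tail-sensitive risk measure.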
The identification of factors that predict the cross-section of stock returns has been a focus of asset pricing theory for decades. We address this challenging problem for both equity performance and risk, the latter through the maximum drawdown measure. We test a variety of regression-based models from the field of supervised learning, including penalized linear regression, tree-based models, and neural networks. Using empirical data from the US market from January 1980 to June 2018, we find that a number of firm characteristics succeed in explaining the cross-sectional variation of active returns and maximum drawdown, and that the latter is substantially more predictable. Non-linear models materially add to the predictive power of linear models. Finally, environmental, social, and governance impact enhances predictive power for non-linear models when the number of variables is reduced.
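A minimal sketch of one model class named above, penalized (ridge) linear regression of cross-sectional returns on firm characteristics. The data here are synthetic and the setup illustrative, not the talk's actual specification:

```python
import numpy as np

rng = np.random.default_rng(2)
n_stocks, n_chars = 500, 20
X = rng.normal(size=(n_stocks, n_chars))   # standardized firm characteristics
beta = np.zeros(n_chars)
beta[:3] = [0.5, -0.3, 0.2]                # only a few true signals
y = X @ beta + rng.normal(0, 1.0, n_stocks)  # cross-section of active returns

lam = 10.0
# Ridge closed form: (X'X + lam*I)^{-1} X'y shrinks noisy coefficients
b_ridge = np.linalg.solve(X.T @ X + lam * np.eye(n_chars), X.T @ y)
b_ols = np.linalg.solve(X.T @ X, X.T @ y)
print(np.linalg.norm(b_ridge), np.linalg.norm(b_ols))  # ridge norm is smaller
```

The penalty trades a little bias for less variance, which is why such models can outperform plain OLS when many characteristics are candidates.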
We study the spread of losses and defaults through financial networks, focusing on two important elements of regulatory reform: collateral requirements and bankruptcy stay rules in over-the-counter (OTC) markets. Under "segregated" collateral requirements, one firm can benefit from the failure of another: the failure frees the committed collateral of the surviving firm, giving it additional resources to make other payments. Similarly, in OTC derivatives markets, one firm may obtain additional resources upon the failure of another if it terminates its in-the-money derivatives with the failed entity. Studying contagion in the presence of this real-world phenomenon is challenging. Our proposed model deviates from existing network models in order to capture collateral and accelerated contract-termination payments. The model also incorporates fire-sale externalities when collateral is held in illiquid assets. We show that asset fire sales increase the risk of contagion if illiquid collateral is seized and sold immediately upon default. We also analyze the impact of different bankruptcy stay rules on contagion. Some of our results contrast with the post-crisis stay rules: for instance, we show that when banks are not highly leveraged in their OTC derivatives transactions, which is now the case due to regulatory reforms, symmetric contract termination in the absence of automatic stays can reduce the risk of contagion.
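The baseline network-clearing mechanism that models of this kind build on (in the Eisenberg-Noe tradition, without the collateral and termination features the talk adds) can be sketched as a fictitious-default iteration; the three-bank numbers below are invented for illustration:

```python
import numpy as np

L = np.array([[0., 10., 5.],    # L[i, j]: nominal liability of bank i to j
              [4., 0., 6.],
              [3., 2., 0.]])
e = np.array([2.0, 1.0, 8.0])   # outside assets of each bank
p_bar = L.sum(axis=1)           # total nominal obligations per bank
Pi = np.divide(L, p_bar[:, None],
               out=np.zeros_like(L), where=p_bar[:, None] > 0)

# Fictitious-default iteration toward the clearing vector p = min(p_bar, e + Pi' p)
p = p_bar.copy()
for _ in range(100):
    p_new = np.minimum(p_bar, e + Pi.T @ p)
    if np.allclose(p_new, p):
        break
    p = p_new
print(p)  # entries strictly below p_bar indicate defaulting banks
```

In this example banks 1 and 2 default and their shortfalls propagate through the relative-liability matrix; the paper's model replaces parts of this payoff map with collateral seizure and accelerated termination payments.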
Diversification is a fundamental concept in financial economics, risk management, and decision theory. From a broad perspective, it conveys the idea of introducing variety into a set of objects. Today there is general consensus that some form of diversification is beneficial in asset allocation; however, its definition is context-dependent, and there is no widely accepted, mathematically concise and economically sound notion of diversification. Indeed, there is an ongoing debate about what the “best” level of diversification should be, and a recent trend of labeling certain diversifying heuristics “anomalous” and “irrational”. In this talk, I shall approach the notion of diversification from a foundational perspective by asking how elementary it really is. I take the view that diversification is a behavioural choice heuristic and an evolutionary cognitive adaptation that is selectively advantageous in many economic and financial circumstances. The talk digs deeper into the roots of this paradigm: first through an experimental study asking whether children diversify in a sequence of gambles designed to replicate typical portfolio choice scenarios, and then by formulating an evolutionary theory of diversification.
We present a novel explanation for the prevalence of foreign-currency borrowing in emerging markets. First, under limited liability, foreign-currency-denominated debt acts as a state-contingent claim: borrowers maximizing profits in local currency are partly shielded, through bankruptcy, from large devaluations, when repaying foreign-currency debt is expensive, but pay higher rates in non-devaluation states, when repayment is relatively cheap. Second, foreign-currency borrowing can improve firms’ incentives and reduce agency problems at the cost of higher systemic risk. The resulting trade-off between average performance and systemic stability, which becomes stronger when bankruptcies entail externalities, lends support to regulation limiting currency mismatches.
Read the paper this talk is based on: Dell'Ariccia et al (2018)
Any asset can use a portfolio of similar assets to insure against its own factor risks, even if the identities of the factors are unknown. A long position in an asset combined with a short position in this portfolio forms an asset insurance premium (AIP) that is distinct from the equity risk premium. We estimate the AIP by projecting a stock’s return onto the span of all asset returns using a machine-learning method. Stocks least (most) synchronized with other stocks earn a monthly AIP of 0.976% (0.305%). Asset synchronicity is countercyclical: high consumption growth correlates with a low average asset insurance premium.
Read the paper this talk is based on: 20190319 Leung paper
Download the slides from this presentation: 20190319 Leung slides
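The projection step can be sketched with ridge regression as a stand-in for the paper's machine-learning method, on synthetic factor-driven returns (all names and parameters here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
T_periods, n_assets = 240, 50                 # months, stocks (synthetic)
factors = rng.normal(size=(T_periods, 3))     # unknown common factors
load = rng.normal(size=(3, n_assets))
R = factors @ load + 0.5 * rng.normal(size=(T_periods, n_assets)) + 0.005

r_i, R_rest = R[:, 0], R[:, 1:]               # target stock vs all others
lam = 1.0
# Ridge projection of the stock's return onto the span of the other assets
w = np.linalg.solve(R_rest.T @ R_rest + lam * np.eye(n_assets - 1),
                    R_rest.T @ r_i)           # replicating-portfolio weights
aip = np.mean(r_i - R_rest @ w)               # long stock, short replica
print(aip)
```

The replicating portfolio hedges the stock's factor exposures without ever identifying the factors, which is the sense in which the AIP isolates insurance against factor risk.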
We consider the experimentation dynamics of a decision maker (DM) in a two-armed bandit setup, where the agent holds ambiguous beliefs about the distribution of the return process of one arm and is certain about the other. The DM entertains multiplier preferences à la Hansen and Sargent, so we frame the decision-making environment as a two-player differential game against nature in continuous time. We characterize the DM's value function and her optimal experimentation strategy, which turns out to follow a cut-off rule with respect to her belief process. The belief threshold for exploring the ambiguous arm is found in closed form and is shown to be increasing in the ambiguity-aversion index. We then study the effect of providing an unambiguous information source about the ambiguous arm. Interestingly, we show that the exploration threshold rises unambiguously as a result of this new information source, leading to more conservatism. This analysis also sheds light on the efficient time to seek an expert opinion.
Read the paper this talk is based on: Pourbabaee (2019)
A 10-K is an annual report about a publicly traded company's financial performance, required by the U.S. Securities and Exchange Commission (SEC). 10-Ks are fairly long and tend to be complicated, but they are among the most comprehensive and most important documents a public company publishes each year. The Global Industry Classification Standard (GICS) is an industry taxonomy developed in 1999 by MSCI and S&P Dow Jones Indices, designed to classify a company according to its principal business activity. The GICS hierarchy begins with 11 sectors, followed by 24 industry groups, 68 industries, and 157 sub-industries. We ask two questions. First, can a classifier be trained to recognize a firm's GICS sector from the textual information in its 10-K? Second, can we extract from the classifier embeddings (low-dimensional vectors) for 10-Ks that respect their GICS sectors, so that firms within the same sector have embeddings that are close (as measured by cosine similarity)? We report on a series of experiments with a Convolutional Neural Network (CNN) for text classification, trained on two variants of document representations: one using pre-trained word vectors, the other based on a simple bag-of-words model.
Download the slides from this presentation: tgan190416risk
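The cosine-similarity criterion for embeddings can be illustrated with the simpler of the two representations, bag-of-words. The toy "filings" below are invented; the talk trains CNNs on real 10-K text:

```python
import numpy as np

docs = {
    "bank_a": "loans deposits interest credit risk capital loans",
    "bank_b": "deposits loans capital interest margin credit",
    "biotech": "clinical trials drug fda approval patients pipeline",
}
vocab = sorted({w for text in docs.values() for w in text.split()})

def bow(text):
    """Term-count vector over the shared vocabulary."""
    return np.array([text.split().count(w) for w in vocab], dtype=float)

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

sim_same = cosine(bow(docs["bank_a"]), bow(docs["bank_b"]))
sim_diff = cosine(bow(docs["bank_a"]), bow(docs["biotech"]))
print(sim_same, sim_diff)  # same-sector similarity is higher
```

Good sector-respecting embeddings should reproduce exactly this pattern: high cosine similarity within a sector, low similarity across sectors, while being far lower-dimensional than raw term counts.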
This research studies the role of an information network in market quality and in the market's reaction to public announcements. We propose a three-period noisy rational expectations equilibrium model that takes both public and private information into account, with an information network structure embedded among market traders. Closed-form expressions for market reaction and market quality are derived as functions of the topological structure of the network, and several novel results emerge. Trading volume and price change respond differently to network connectedness. As network connectedness increases, price change trends downward, and the decline flattens, reflecting that market efficiency cannot grow without bound in practice. The change in trading volume, however, is ambiguous, because it depends on two attributes of the network, uniformity and connectedness, and neither systematically dominates the other. As for market quality, higher information precision increases market liquidity and market efficiency and decreases the cost of capital; network connectedness plays the same role for market efficiency and the cost of capital, while its influence on market liquidity is non-monotone. The network also dampens the effect of disclosure.