Working papers 2018


  • Notes on “Refinements and Higher Order Beliefs”

    The abstract of our 1997 survey paper Kajii and Morris (1997b) on "Refinements and Higher Order Beliefs" reads:


    This paper presents a simple framework that allows us to survey and relate some different strands of the game theory literature. We describe a “canonical” way of adding incomplete information to a complete information game. This framework allows us to give a simple “complete theory” interpretation (Kreps 1990) of standard normal form refinements such as perfection, and to relate refinements both to the “higher order beliefs literature” (Rubinstein 1989; Monderer and Samet 1989; Morris, Rob and Shin 1995; Kajii and Morris 1997a) and the “payoff uncertainty approach” (Fudenberg, Kreps and Levine 1988; Dekel and Fudenberg 1990).


    In particular, this paper provided a unified framework relating the notion of equilibria robust to incomplete information, introduced in Kajii and Morris (1997a) [hereafter, KM1997], to the classic refinements literature. It followed Fudenberg, Kreps and Levine (1988) and Kreps (1990) in relating refinements of Nash equilibrium to a “complete theory” in which behavior is rationalized by explicit incomplete information about payoffs, rather than by action trembles or other exogenous perturbations. It followed Fudenberg and Tirole (1991, chapter 14) in providing a unified treatment of refinements and the literature on higher-order beliefs, rather than proposing a particular solution concept.


    The primary purpose of the survey paper was to promote the idea of robust equilibria in KM1997, and we did not try to publish it as an independent paper. Fortunately, since we wrote this paper, there have been many developments in the literature on robust equilibria. But there has been little work emphasizing a unified perspective, and consequently the paper seems more relevant than ever. We are therefore very happy to publish it twenty years later. In the notes that follow, we describe relevant developments in the literature and how they relate to the survey. These notes assume familiarity with the basic concepts introduced in the survey paper and in KM1997.




  • Stability of Sunspot Equilibria under Adaptive Learning with Imperfect Information


    This paper investigates whether sunspot equilibria are stable under agents’ adaptive learning with imperfect information about exogenous variables. Each exogenous variable is observable to some agents and unobservable to others, so that agents’ forecasting models are heterogeneously misspecified. The paper finds that the stability conditions for sunspot equilibria are relaxed or unchanged under imperfect information. In a basic New Keynesian model with highly imperfect information, sunspot equilibria are stable if and only if the nominal interest rate rule violates the Taylor principle. This result is in contrast to the literature, in which sunspot equilibria are stable only if policy rules follow the principle, and is consistent with observations during past business cycle fluctuations.


    Sunspot-driven business cycle models are popular tools for accounting for features of macroeconomic fluctuations that are not explained by fundamental shocks. US business cycles in the pre-Volcker period are considered to have been driven by self-fulfilling expectations, so-called “sunspots” (see Benhabib and Farmer, 1994; Farmer and Guo, 1994). These non-fundamental expectations are thought to stem from the Fed’s passive stance toward inflation (see Clarida, Gali, and Gertler, 2000; Lubik and Schorfheide, 2004). More recently, the global financial turmoil of the last decade had a historic magnitude that could not be explained by fundamentals alone, and hence has been analyzed in models with sunspot expectations (see Benhabib and Wang, 2013; Gertler and Kiyotaki, 2015).



  • Predicting Adverse Media Risk using a Heterogeneous Information Network


    The media plays a central role in monitoring powerful institutions and identifying activities harmful to the public interest. In the investing sphere, comprising 46,583 officially listed domestic firms on stock exchanges worldwide, there is growing interest in “doing the right thing”, i.e., in putting pressure on companies to improve their environmental, social and governance (ESG) practices. But how can the sparsity of ESG data from non-reporting firms be overcome, and how can the relevant information be identified in the annual reports of this large universe? Here, we construct a vast heterogeneous information network that covers the necessary information surrounding each firm, assembled from seven professionally curated datasets and two open datasets, resulting in about 50 million nodes and 400 million edges in total. Exploiting this heterogeneous information network, we propose a model that can learn from past adverse media coverage patterns and predict the occurrence of future adverse media coverage events over the whole universe of firms. Our approach is tested on adverse media coverage data for more than 35,000 firms worldwide from January 2012 to May 2018. Compared with state-of-the-art methods with and without the network, we show that predictive accuracy is substantially improved when the heterogeneous information network is used. This work suggests new ways to consolidate the diffuse information contained in big data in order to monitor dominant institutions on a global scale, enabling more socially responsible investment, better risk management, and the surveillance of powerful institutions.
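    As a concrete (and much simplified) illustration of the kind of network feature such a model can exploit, the sketch below builds a toy heterogeneous graph in plain Python and computes, for a given firm, the share of its network peers with past adverse coverage. All node names, edge types, and adverse-coverage labels are hypothetical, not drawn from the paper's datasets.

```python
from collections import defaultdict

# Hypothetical miniature heterogeneous information network: nodes have
# types (firm, sector, country), and edges are typed relations.
nodes = {
    "FirmA": "firm", "FirmB": "firm", "FirmC": "firm",
    "Mining": "sector", "CountryX": "country",
}
edges = [
    ("FirmA", "Mining", "in_sector"),
    ("FirmB", "Mining", "in_sector"),
    ("FirmA", "CountryX", "domiciled_in"),
    ("FirmC", "CountryX", "domiciled_in"),
]
adverse = {"FirmB"}  # toy labels: firms with past adverse media coverage

adj = defaultdict(set)
for u, v, _ in edges:
    adj[u].add(v)
    adj[v].add(u)

def adverse_neighbour_share(firm):
    """Fraction of firms sharing an intermediate node (sector, country, ...)
    with `firm` that had adverse coverage -- one simple network feature."""
    peers = set()
    for mid in adj[firm]:
        peers |= {w for w in adj[mid] if nodes.get(w) == "firm" and w != firm}
    if not peers:
        return 0.0
    return len(peers & adverse) / len(peers)

print(adverse_neighbour_share("FirmA"))  # peers FirmB, FirmC -> 0.5
```

    A real model would feed many such relation-specific features (or learned graph embeddings) into a classifier; this sketch only shows how heterogeneous node types make peers reachable through shared intermediate nodes.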


    Adverse media coverage sometimes leads to fatal results for a company. In the press release sent out by Cambridge Analytica on May 2, 2018, the company wrote that “Cambridge Analytica has been the subject of numerous unfounded accusations, ... media coverage has driven away virtually all of the company’s customers and suppliers” [5]. This is just one recent example highlighting the impact of adverse media coverage on a firm’s fate. In another example, the impact of adverse media coverage on Swiss bank profits was estimated to be 3.35 times the median annual net profit of small banks and 0.73 times that of large banks [3]. These numbers are significant, indicating how adverse media coverage can cause huge damage to a bank. Moreover, a new factor, priced as the “no media coverage premium” [10], has been identified to help explain financial returns: stocks with no media coverage earn higher returns than stocks with high media coverage. Within the rational-agent framework, this may result from impediments to trade and/or from lower investor recognition leading to lower diversification [10]. Another mechanism could be associated with the fact that most of the coverage of mass media is negative [15, 23].



  • Term Structure Models During the Global Financial Crisis: A Parsimonious Text Mining Approach


    This work develops and estimates a three-factor term structure model with explicit sentiment factors over a period including the global financial crisis, during which market confidence was said to have eroded considerably. It utilizes a large dataset of real-time, relatively high-frequency market news and takes account of the difficulties in incorporating market sentiment into such models. To the best of our knowledge, this is the first attempt to use this category of data in term-structure models.

    Although market sentiment, or market confidence, is often regarded as an important driver of asset markets, it is not explicitly incorporated into traditional empirical factor models for daily yield curve data because it is unobservable. To overcome this problem, we use a text mining approach to generate observable variables that are driven by the otherwise unobservable sentiment factors. Then, applying the Monte Carlo filter as the filtering method in a Bayesian state-space approach, we estimate the dynamic stochastic structure of the latent sentiment factors from the observable variables they drive.
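    The Monte Carlo (particle) filter used here can be sketched on a toy one-factor state-space model: a latent AR(1) "sentiment" factor observed through a noisy indicator. The dynamics and noise parameters below are illustrative placeholders, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy state-space model (parameters are illustrative):
#   latent sentiment:   x_t = phi * x_{t-1} + w_t,  w_t ~ N(0, q^2)
#   observed indicator: y_t = x_t + v_t,            v_t ~ N(0, r^2)
phi, q, r, T, N = 0.95, 0.3, 0.5, 100, 2000

x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + q * rng.standard_normal()
    y[t] = x[t] + r * rng.standard_normal()

# Bootstrap particle filter (a basic Monte Carlo filter):
particles = rng.standard_normal(N)
est = np.zeros(T)
for t in range(T):
    particles = phi * particles + q * rng.standard_normal(N)  # propagate
    w = np.exp(-0.5 * ((y[t] - particles) / r) ** 2)          # likelihood
    w /= w.sum()                                              # normalize
    est[t] = w @ particles                                    # filtered mean
    particles = particles[rng.choice(N, size=N, p=w)]         # resample

rmse = np.sqrt(np.mean((est - x) ** 2))  # filter error vs. true latent path
```

    The filtered mean recovers the latent factor with an error well below the observation noise, which is the sense in which the latent sentiment structure can be estimated from its observable proxies.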

    As a result, the three-factor model with text mining is able to distinguish (1) a spread-steepening factor, driven by pessimists’ views, which explains the spreads related to ultra-long-term yields, from (2) a spread-flattening factor, driven by optimists’ views, which influences long- and medium-term spreads. The three-factor model with text mining also fits the observed yields better than the model without text mining.

    Moreover, we collect market participants’ views about specific spreads in the term structure and find that the movements of the identified sentiment factors are consistent with those views, and thus with market sentiment.


    Although “market sentiment” is often regarded as an important driver of asset markets, it is not explicitly incorporated in traditional empirical factor models for the term structure of interest rates. This is because (1) it is not clear what sentiment factors mean, and moreover, (2) there are scant observations, if any, of these sentiment factors. This work formulates and estimates a factor model with explicit sentiment factors over a period including the global financial crisis, during which uncertainty was said to have heightened considerably. It utilizes a large dataset of real-time, relatively high-frequency market news and takes account of difficulties (1) and (2). To the best of our knowledge, this is the first attempt to use this category of data in term-structure models.



  • The Demand for Money at the Zero Interest Rate Bound


    This paper estimates a money demand function using US data from 1980 onward, including the period of near-zero interest rates following the global financial crisis. We conduct cointegration tests to show that the substantial increase in the money-income ratio during the period of near-zero interest rates is captured well by the money demand function in log-log form, but not by that in semi-log form. Our result is the opposite of that obtained by Ireland (2009), who, using data up until 2006, found that the semi-log specification performs better. The difference mainly stems from the difference in the observation period employed: our observation period contains 24 quarters with interest rates below 1 percent, while Ireland’s (2009) contains only three such quarters. We also compute the welfare cost of inflation based on the estimated money demand function and find that it is very small: the welfare cost of 2 percent inflation is only 0.04 percent of national income, which is of a similar magnitude to the estimate obtained by Ireland (2009) but much smaller than that of Lucas (2000).
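    The welfare-cost calculation can be sketched under the standard Lucas (2000) approach: with log-log money demand m(r) = A r^(-eta), the consumer-surplus welfare cost as a fraction of income is w(r) = A * eta/(1-eta) * r^(1-eta). The parameter values below are hypothetical placeholders, not the paper's estimates.

```python
# Hedged sketch of the Lucas (2000) consumer-surplus welfare cost under a
# log-log money demand m(r) = A * r**(-eta). A and eta are illustrative.
def welfare_cost(r, A, eta):
    """Welfare cost w(r) = A*eta/(1-eta) * r**(1-eta), as a fraction of
    income, for a nominal interest rate r expressed as a decimal."""
    return A * eta / (1.0 - eta) * r ** (1.0 - eta)

A, eta = 0.05, 0.1          # hypothetical scale and interest elasticity
r_low, r_high = 0.02, 0.05  # nominal rates under low vs. higher inflation
extra_cost = welfare_cost(r_high, A, eta) - welfare_cost(r_low, A, eta)
```

    With a small interest elasticity, the cost is a tiny fraction of income even at moderate interest rates, which is the mechanism behind estimates of the order of hundredths of a percent.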


    In regression analyses of money demand functions, there is no consensus on whether the nominal interest rate, as an independent variable, should enter in linear or log form. For example, Meltzer (1963), Hoffman and Rasche (1991), and Lucas (2000) employ a log-log specification (i.e., regressing the log of real money balances, or of real money balances relative to nominal GDP, on the log of the nominal interest rate), while Cagan (1956), Lucas (1988), Stock and Watson (1993), and Ball (2001) employ a semi-log specification (i.e., the nominal interest rate enters in levels rather than logs).
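    The practical difference between the two specifications can be sketched with synthetic data: if the money-income ratio is generated from a log-log demand curve, the log-log regression fits markedly better than the semi-log one, especially when the sample includes very low interest rates. The demand curve, sample size, and coefficients below are illustrative assumptions, not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: log money-income ratio generated from a log-log demand
# curve over a rate range that includes near-zero interest rates.
r = rng.uniform(0.001, 0.10, size=200)                      # nominal rate
log_m = -1.0 - 0.1 * np.log(r) + 0.02 * rng.standard_normal(200)

def ols_r2(x, y):
    """R-squared of a univariate OLS regression of y on x (with constant)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_loglog = ols_r2(np.log(r), log_m)   # log-log: log m on log r
r2_semilog = ols_r2(r, log_m)          # semi-log: log m on r in levels
```

    The semi-log fit deteriorates because, near the zero bound, log-log demand implies ever-larger money balances as r falls, a curvature a linear-in-r regressor cannot match.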



  • The Formation of Consumer Inflation Expectations: New Evidence From Japan’s Deflation Experience


    Using a new micro-level dataset, we investigate the relationship between the inflation experience and inflation expectations of households in Japan. We focus on the period after 1995, when Japan began its era of deflation. Our key findings are fourfold. First, inflation expectations tend to increase with age. Second, the measured inflation rates of items purchased also increase with age. However, age and inflation expectations remain positively correlated even after controlling for the household-level rate of inflation. Further analysis suggests that the positive correlation between age and inflation expectations is driven to a significant degree by the correlation between cohort and inflation expectations, which we interpret as the effect of historical inflation experience on expectations of future inflation.


    Since at least the time of Keynes (1936), economic agents’ expectations of future inflation rates have played a pivotal role in macroeconomics. Woodford (2003) describes the central importance of inflation expectations in modern macroeconomic models, owing to the intertemporal nature of economic problems, while Sargent (1982) and Blinder (2000) highlight the dependence of monetary policy on these expectations. However, despite their important role, the formal inclusion of inflation expectations in macroeconomic models is usually ad hoc, with little empirical justification.