Working Papers FY2019


  • The Responses of Consumption and Prices in Japan to the COVID-19 Crisis and the Tohoku Earthquake

    Abstract

    This note compares the responses of consumption and prices to the COVID-19 shock and another large-scale natural disaster that hit Japan, the Tohoku earthquake in March 2011. The comparison shows that the responses of supermarket sales and prices at a daily frequency during the two crises are quite similar: (1) the year-on-year rate of sales growth increased quickly and reached a peak of 20 percent two weeks after the outbreak of COVID-19 in Japan, which is quite similar to the response immediately after the earthquake; (2) the items consumers purchased at supermarkets in these two crises are almost identical; (3) the year-on-year rate of consumer price inflation for goods rose by 0.6 percentage points in response to the coronavirus shock, compared to 2.2 percentage points in the wake of the earthquake. However, evidence suggests that whereas people expected higher inflation for goods and services in the wake of the earthquake, they expect lower inflation in response to the coronavirus shock. This difference in inflation expectations suggests that the economic deterioration due to COVID-19 should be viewed as driven mainly by an adverse aggregate demand shock to face-to-face service industries such as hotels and leisure, transportation, and retail, rather than as driven by an aggregate supply shock.

    1 The Spread of COVID-19 in Japan and the World

    The spread of COVID-19 is still gaining momentum. The number of those infected in Japan started to rise from the last week of February, and the spread of the virus began to gradually affect everyday life, as exemplified by increasingly empty streets in Ginza. In March, the outbreak spread to Europe and the United States, and stock markets in the United States and other countries began to drop sharply on a daily basis, leading to market turmoil reminiscent of the global financial crisis. At the time of writing (March 29), the Dow Jones Index of the New York Stock Exchange had dropped by 35%, while the Nikkei Index had fallen by 30%.

     

    WP020

  • Incomplete Information Robustness

    Abstract

    Consider an analyst who models a strategic situation in terms of an incomplete information game and makes a prediction about players’ behavior. The analyst’s model approximately describes each player’s hierarchies of beliefs over payoff-relevant states, but the true incomplete information game may have correlated duplicated belief hierarchies, and the analyst has no information about the correlation. Under these circumstances, a natural candidate for the analyst’s prediction is the set of belief-invariant Bayes correlated equilibria (BIBCE) of the analyst’s incomplete information game. We introduce the concept of robustness for BIBCE: a subset of BIBCE is robust if every nearby incomplete information game has a BIBCE that is close to some BIBCE in this set. Our main result provides a sufficient condition for robustness by introducing a generalized potential function of an incomplete information game. A generalized potential function is a function on the Cartesian product of the set of states and a covering of the action space which incorporates some information about players’ preferences. It is associated with a belief-invariant correlating device such that a signal sent to a player is a subset of the player’s actions, which can be interpreted as a vague prescription to choose some action from this subset. We show that, for every belief-invariant correlating device that maximizes the expected value of a generalized potential function, there exists a BIBCE in which every player chooses an action from the subset of actions prescribed by the device, and that the set of such BIBCE, which can differ from the set of potential-maximizing BNE, is robust.

    Introduction

    Consider an analyst who models a strategic situation in terms of an incomplete information game and makes a prediction about players’ behavior. He believes that his model correctly describes the probability distribution over the players’ Mertens-Zamir hierarchies of beliefs over payoff-relevant states (Mertens and Zamir, 1985). However, players may have observed signals generated by an individually uninformative correlating device (Liu, 2015), which allows the players to correlate their behavior. In other words, the true incomplete information game may have correlated duplicated belief hierarchies (Ely and Peski, 2006; Dekel et al., 2007). Then, a natural candidate for the analyst’s prediction is the set of outcomes that can arise in some Bayes Nash equilibrium (BNE) of some incomplete information game with the same distribution over belief hierarchies. Liu (2015) shows that this set of outcomes can be characterized as the set of belief-invariant Bayes correlated equilibria (BIBCE) of the analyst’s model. A BIBCE is a Bayes correlated equilibrium (BCE) in which a prescribed action does not reveal any additional information to the player about the opponents’ types and the payoff-relevant state, thus preserving the player’s belief hierarchy.

     

     

    WP019

  • LQG Information Design

    Abstract

    A linear-quadratic-Gaussian (LQG) game is an incomplete information game with quadratic payoff functions and Gaussian information structures. It has many applications, including Cournot games, Bertrand games, beauty contest games, and network games. LQG information design is the problem of finding, from a given collection of feasible Gaussian information structures, an information structure that maximizes a quadratic objective function when players follow a Bayes Nash equilibrium. This paper studies LQG information design by formulating it as semidefinite programming, a natural generalization of linear programming. Using this formulation, we provide sufficient conditions for the optimality and suboptimality of no and full information disclosure. In the case of symmetric LQG games, we characterize the optimal symmetric information structure, and in the case of asymmetric LQG games, we characterize the optimal public information structure, each of which is obtained in closed form.
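
    To fix ideas, here is a stylized sketch, under illustrative notation and a symmetric payoff specification that is not taken from the paper, of why the design problem becomes a semidefinite program: with Gaussian information, equilibrium actions and the state are jointly normal, so an outcome is summarized by a covariance matrix over which the designer optimizes.

```latex
% Stylized symmetric LQG payoff (illustrative, not the paper's exact notation):
\[
  u_i(a,\theta) = -\bigl(a_i - r\,\bar a_{-i} - s\,\theta\bigr)^2,
  \qquad \theta \sim N(0,\sigma_\theta^2),
\]
% where \bar a_{-i} is the average opponent action. Under a Gaussian information
% structure, Bayes Nash equilibrium actions are jointly normal with \theta, so an
% equilibrium outcome is summarized by the covariance matrix
\[
  V = \operatorname{Cov}(a_1,\dots,a_n,\theta) \succeq 0 .
\]
% A quadratic designer objective is linear in V, and the players' best-response
% conditions impose linear equality constraints on V, giving a semidefinite program:
\[
  \max_{V \succeq 0}\ \langle C, V \rangle
  \quad\text{subject to}\quad \operatorname{tr}(A_k V) = b_k,\ \ k=1,\dots,K .
\]
```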

    Introduction

    An equilibrium outcome in an incomplete information game depends not only upon a payoff structure, which consists of payoff functions together with a probability distribution of a payoff state, but also upon an information structure, which maps a payoff state to possibly stochastic signals of players. Information design analyzes the influence of an information structure on equilibrium outcomes and, in particular, characterizes an optimal information structure that induces an equilibrium outcome maximizing the expected value of an objective function of an information designer, who is assumed to be able to choose and commit to the information structure. General approaches to information design are presented by Bergemann and Morris (2013, 2016a,b, 2019), Taneva (2019), and Mathevet et al. (2020). A rapidly growing body of literature has investigated economic applications of information design in areas such as matching markets (Ostrovsky and Schwarz, 2010), voting games (Alonso and Camara, 2016), congestion games (Das et al., 2017), auctions (Bergemann et al., 2017), contests (Zhang and Zhou, 2016), and stress testing (Inostroza and Pavan, 2018), among others.

     

    WP018

  • Gaussian Hierarchical Latent Dirichlet Allocation: Bringing Polysemy Back

    Abstract

    Topic models are widely used to discover the latent representation of a set of documents. The two canonical models are latent Dirichlet allocation and Gaussian latent Dirichlet allocation: the former uses multinomial distributions over words, and the latter uses multivariate Gaussian distributions over pre-trained word embedding vectors as the latent topic representations. Compared with latent Dirichlet allocation, Gaussian latent Dirichlet allocation is limited in the sense that it does not capture the polysemy of a word such as “bank.” In this paper, we show that Gaussian latent Dirichlet allocation can recover the ability to capture polysemy by introducing a hierarchical structure in the set of topics that the model can use to represent a given document. Our Gaussian hierarchical latent Dirichlet allocation significantly improves polysemy detection compared with Gaussian-based models and provides more parsimonious topic representations compared with hierarchical latent Dirichlet allocation. Our extensive quantitative experiments show that our model also achieves better topic coherence and held-out document predictive accuracy over a wide range of corpora and word embedding vectors.
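
    For readers unfamiliar with the second model, the following is a minimal generative sketch of Gaussian latent Dirichlet allocation with synthetic, illustrative dimensions and priors (the hierarchical extension proposed in the paper is not shown): each topic is a Gaussian over the embedding space rather than a multinomial over the vocabulary.

```python
# Minimal generative sketch of Gaussian latent Dirichlet allocation.
# Dimensions, priors, and the toy corpus are illustrative assumptions only;
# the paper's hierarchical extension over the topic set is not shown.
import numpy as np

rng = np.random.default_rng(0)

K, D_emb, alpha = 5, 50, 0.1          # number of topics, embedding dim, Dirichlet prior
doc_lengths = [120, 80, 200]          # hypothetical corpus of three documents

# Each topic is a multivariate Gaussian over the word-embedding space,
# instead of a multinomial distribution over the vocabulary as in plain LDA.
topic_means = rng.normal(size=(K, D_emb))
topic_covs = np.stack([np.eye(D_emb) for _ in range(K)])

corpus = []
for n_words in doc_lengths:
    theta = rng.dirichlet(alpha * np.ones(K))   # document-level topic proportions
    doc = []
    for _ in range(n_words):
        z = rng.choice(K, p=theta)              # topic assignment for this token
        v = rng.multivariate_normal(topic_means[z], topic_covs[z])  # embedding draw
        doc.append((z, v))
    corpus.append(doc)

print(len(corpus), len(corpus[0]), corpus[0][0][1].shape)   # -> 3 120 (50,)
```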

    Introduction

    Topic models are widely used to identify the latent representation of a set of documents. Since latent Dirichlet allocation (LDA) [4] was introduced, topic models have been used in a wide variety of applications. Recent work includes the analysis of legislative text [24], detection of malicious websites [33], and analysis of the narratives of dermatological disease [23]. The modular structure of LDA, and graphical models in general [17], has made it possible to create various extensions to the plain vanilla version. Significant works include the correlated topic model (CTM), which incorporates the correlation among topics that co-occur in a document [6]; hierarchical LDA (hLDA), which jointly learns the underlying topic and the hierarchical relational structure among topics [3]; and the dynamic topic model, which models the time evolution of topics [7].

     

     

    WP017

  • Decentralizability of Efficient Allocations with Heterogenous Forecasts

    Abstract

    Do price forecasts of rational economic agents need to coincide in perfectly competitive complete markets in order for markets to allocate resources efficiently? To address this question, we define an efficient temporary equilibrium (ETE) within the framework of a two-period economy. Although an ETE allocation is intertemporally efficient and is obtained by perfect competition, it can arise without the agents’ forecasts being coordinated on a perfect foresight price. We show that there is a one-dimensional set of such Pareto efficient allocations for generic endowments.

    Introduction

    Intertemporal trade in complete markets is known to achieve Pareto efficiency when the price forecasts of agents coincide and are correct. The usual justification for this coincidence of price forecasts is that if agents understand the market environment perfectly, they ought to reach the same conclusions, and hence in particular, their forecasts must coincide. But it is against the spirit of perfect competition to require that agents should understand the market environment beyond the market prices they commonly observe; we therefore study intertemporal trade without requiring that price forecasts of heterogeneous agents coincide.

     

     

    WP016

  • Search and Matching in Rental Housing Market   

    Abstract

    This paper builds a model of the rental housing market. With search and matching frictions in the rental market, the entry of new houses is endogenized over the business cycle. Price negotiation takes place only when an owner and a tenant are newly matched and sign a contract for the rental price; once the contract is signed, the rent is fixed until the contract ends. Simulations show that the variation of the rental price and of market tightness changes with the degree of search friction in the housing market, the speed of the housing cycle, and the bargaining power between owner and tenant in price setting. The extensive margin effect brought about by housing entry contributes substantially to price variation, and this effect changes significantly with the parameters.
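
    As a schematic reference, the standard search-and-matching building blocks that the model draws on can be written as follows; the functional forms and symbols below are generic textbook choices rather than the paper's calibration.

```latex
% Generic search-and-matching building blocks (textbook forms, not the paper's model):
\[
  m_t = \mu\, u_t^{1-\alpha} v_t^{\alpha}, \qquad \theta_t = \frac{v_t}{u_t},
\]
% where u_t is the mass of searching tenants, v_t the mass of vacant rental houses
% (with entry of new houses endogenous), m_t the flow of new matches, and \theta_t
% market tightness. The rent of a newly matched pair is set once by Nash bargaining,
\[
  p_t = \arg\max_{p}\ \bigl(S^{T}_t(p)\bigr)^{\eta}\,\bigl(S^{O}_t(p)\bigr)^{1-\eta},
\]
% with S^T and S^O the tenant's and owner's match surpluses and \eta the tenant's
% bargaining weight, and then remains fixed until the contract ends.
```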

    Introduction

    Earlier studies, such as Wheaton (1990), focus on search behavior in the housing market and show the advantage of a search model in explaining the housing market.
    Non-homeownership rates are at a nontrivial level for business cycle analysis across countries. In Japan, the Statistics Bureau of Japan (2018) shows that the non-homeownership rate has remained at about 40 percent for many years. The Australian Bureau of Statistics reports that the proportion of Australian households renting their home was 32 percent in 2017–18. In the U.S., the Census Bureau releases national non-homeownership rates, which have been about 35 percent in the last few years. As well as the buying and selling of houses, rental housing behavior can thus contribute to the business cycle.

     

     

    WP015

  • Debt Intolerance: Threshold Level and Composition

    Abstract

    Fiscal vulnerabilities depend on both the level and composition of government debt. This study investigates the threshold level of debt and its composition to understand the non-linear behavior of the long-term interest rate by developing a novel approach: a panel smooth transition regression with a general logistic model (i.e., a generalized panel smooth transition regression). Our main findings are threefold: (i) the impact of the expected public debt on the interest rate would increase exponentially and significantly as the foreign private holdings ratio exceeds approximately 20 percent; otherwise, strong home bias would mitigate the upward pressure of an increase in public debt on the interest rate; (ii) if the expected public debt-to-GDP ratio exceeds a certain level that depends on the funding source, an increase in foreign private holdings of government debt would cause a rise in long-term interest rates, offsetting the downward effect on long-term interest rates of expanding market liquidity; and (iii) the out-of-sample forecasts of our novel non-linear model are more accurate than those of previous methods. As such, the composition of government debt plays an important role in the highly non-linear behavior of the long-term interest rate.
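
    The mechanics of a panel smooth transition regression can be sketched as follows; the regressors, transition variable, and the exact generalized logistic function used in the paper may differ from this stylized textbook form.

```latex
% Stylized panel smooth transition regression (textbook form, not the paper's exact model):
\[
  y_{it} = \mu_i + \beta_0' x_{it} + \beta_1' x_{it}\, g(q_{it};\gamma,c) + \varepsilon_{it},
\]
% with a logistic transition function in a threshold variable q_{it}
% (e.g., the foreign private holdings ratio),
\[
  g(q;\gamma,c) = \Bigl(1 + \exp\bigl(-\gamma\,(q-c)\bigr)\Bigr)^{-1}, \qquad \gamma > 0,
\]
% so the marginal effect of x_{it} moves smoothly from \beta_0 to \beta_0+\beta_1 as q
% crosses the threshold c. A generalized logistic transition adds shape parameters
% that allow the transition to be asymmetric around c.
```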

    Introduction

    As argued by Reinhart et al. (2003), fiscal vulnerabilities depend on both the level and composition (foreign vs. domestic) of government debt. They describe the “debt intolerance” phenomenon, in which interest rates in developing economies can spike above the “tolerance ceiling,” even though the debt levels could be considered manageable by advanced country standards. Long-term interest rates in advanced economies have been lower than those in emerging markets, although debt levels in advanced economies such as Japan, the United Kingdom, and the United States are much higher than in emerging markets (Figures 1 and 2). While significant research has been devoted to estimating the marginal impact of public debt on long-term interest rates, the estimated impacts differ, even after controlling for fundamental variables such as inflation expectations and growth rates.

     

    WP014

  • How Large is the Demand for Money at the ZLB? Evidence from Japan

    Abstract

    This paper estimates a money demand function using Japanese data from 1985 to 2017, which includes the period of near-zero interest rates over the last two decades. We compare a log-log specification and a semi-log specification by employing the methodology proposed by Kejriwal and Perron (2010) on cointegrating relationships with structural breaks. Our main finding is that there exists a cointegrating relationship with a single break between the money-income ratio and the interest rate in the case of the log-log form but not in the case of the semi-log form. More specifically, we show that the substantial increase in the money-income ratio during the period of near-zero interest rates is well captured by the log-log form but not by the semi-log form. We also show that the demand for money did not decline in 2006 when the Bank of Japan terminated quantitative easing and started to raise the policy rate, suggesting that there was an upward shift in the money demand schedule. Finally, we find that the welfare gain from moving from 2 percent inflation to price stability is 0.10 percent of nominal GDP, which is more than six times as large as the corresponding estimate for the United States.

    Introduction

    There is no consensus about whether the interest rate variable should enter in logs or in levels when estimating the money demand function. For example, Meltzer (1963), Hoffman and Rasche (1991), and Lucas (2000) employ a log-log specification (i.e., the log of real money balances is regressed on the log of the nominal interest rate), while Cagan (1956), Lucas (1988), Stock and Watson (1993), and Ball (2001) employ a semi-log form (i.e., the log of real money demand is regressed on the level of the nominal interest rate). The purpose of this paper is to specify the functional form of money demand using Japanese data covering the recent period with nominal interest rates very close to zero.
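
    Written out, the two specifications being compared are, schematically (the estimated equations in the paper additionally allow for structural breaks):

```latex
% Log-log versus semi-log money demand (schematic):
\[
  \text{log-log:}\qquad \ln\frac{M_t}{P_t Y_t} = a - \eta \ln i_t + u_t ,
\]
\[
  \text{semi-log:}\qquad \ln\frac{M_t}{P_t Y_t} = a - \xi\, i_t + u_t .
\]
% As i_t approaches zero, the log-log form implies an ever-rising money-income ratio,
% whereas the semi-log form implies a finite satiation level e^{a}; this is why the
% near-zero-rate period is informative about which functional form fits better.
```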

     

    WP013

  • Dynamic Productivity Decomposition with Allocative Efficiency

    Abstract

    We propose a novel approach to decomposing aggregate productivity growth into changes in technical efficiency, allocative efficiency, and variety of goods as well as relative efficiency of entrants and exiters. We measure technical efficiency by the aggregate production possibility frontier and allocative efficiency by the distance from the frontier. Applying our approach to establishment- and firm-level datasets from Japan, we find that the allocative efficiency among survivors declined during the banking crisis period, while the technical efficiency declined during the Global Financial Crisis period. Furthermore, we find that both entrants and exiters were likely to be more efficient than survivors.  

    Introduction

    Growth in aggregate productivity is key to economic growth in both developing and developed economies. Past studies have proposed various methods of analysis to gain further insight into its driving forces. These include aggregating producer-level productivity to economy-wide productivity, and decomposing changes in aggregate productivity. This decomposition consists of changes in technology, allocation of resources across producers, and the relative productivity of entrants and survivors. However, as far as we know, no preceding study decomposes aggregate productivity into technical efficiency in terms of the aggregate production possibility frontier, and allocative efficiency in terms of distance from the frontier, although this decomposition is straightforward from a microeconomic viewpoint.

     

    WP012

  • Who Needs Guidance from a Financial Adviser? Evidence from Japan

    Abstract

    Using individual family household data from Japan, we find that households prefer financial institutions, family and friends, and financial experts as actual sources of financial information, and financial institutions, neutral institutions not reflecting the interests of a particular industry, and financial experts as desirable sources of financial information. We find that households choosing actual sources of financial information involving financial experts have better financial knowledge, as measured in terms of knowledge about the Deposit Insurance Corporation of Japan, than those selecting family and friends for the same purpose. These same households are also more willing to purchase high-yielding financial products entailing the possibility of a capital loss within one to two years. We also find that households choosing desirable sources of financial information involving financial experts and neutral institutions also have better financial knowledge. Conditional on the choice of financial institutions as the actual source, households that regard neutral institutions as a more desirable source tend to have better financial knowledge. However, it is unclear whether households that seek the guidance of a financial expert have higher ratios of stock and investment trusts to financial assets than those selecting family and friends as their source of financial information. 

    Introduction

    The prolonged period of low economic growth and interest rates that has accompanied rapid population aging in Japan over the past two decades requires ever more Japanese households to decide more carefully how much to save and where to invest. For example, many Japanese corporations have begun to implement defined contribution corporate pension plans, such that workers must take much more responsibility for their own saving. However, the Japanese flow of funds accounts show that riskier (higher yielding) assets, such as stocks or investment trusts, represent just 16% of all household financial assets as of December 2018. Observing this rapidly changing landscape for retirement savings, the Financial Services Agency (FSA) of Japan has been actively promoting investment in FSA-selected no-load and simple investment trusts, through tax exemptions on dividend and interest earnings on securities. However, it remains for households to choose from the products approved by the FSA, and they still need sufficient financial knowledge for this purpose.

     

    WP011

  • Detecting Stock Market Bubbles Based on the Cross‐Sectional Dispersion of Stock Prices

    Abstract

    A statistical method is proposed for detecting stock market bubbles that occur when speculative funds concentrate on a small set of stocks. A bubble is defined as a stock price diverging from fundamentals. A firm’s financial standing is certainly a key fundamental attribute of that firm. The law of one price would dictate that firms of similar financial standing share similar fundamentals. We investigate the variation in market capitalization normalized by fundamentals, which is estimated by Lasso regression on a firm’s financial standing. The market capitalization distribution has a substantially heavier upper tail during bubble periods; that is, the market capitalization gap opens up within a small subset of firms with similar fundamentals. This phenomenon suggests that speculative funds concentrate in this subset. We demonstrate that this phenomenon could have been used to detect the dot-com bubble of 1998-2000 on different stock exchanges.
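
    A minimal sketch of the normalization step, using synthetic data and hypothetical financial-standing variables (the paper's actual regressors, sample, and tail statistics may differ), might look as follows:

```python
# Sketch: normalize market capitalization by fundamentals via Lasso regression and
# inspect the upper tail of the residuals. Synthetic data, hypothetical regressors.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_firms = 2000

# Hypothetical financial-standing variables (e.g., log net assets, log sales, ...).
X = rng.normal(size=(n_firms, 3))
log_mcap = 1.0 + X @ np.array([0.6, 0.3, 0.1]) + rng.normal(scale=0.3, size=n_firms)

# Bubble-like episode: speculative funds concentrate on a small subset of firms,
# inflating their market capitalization beyond what fundamentals would imply.
hot = rng.choice(n_firms, size=60, replace=False)
log_mcap[hot] += rng.exponential(scale=1.0, size=hot.size)

# Lasso regression of log market capitalization on fundamentals; the residual is
# (log) market capitalization normalized by fundamentals.
model = LassoCV(cv=5).fit(X, log_mcap)
residuals = log_mcap - model.predict(X)

# A substantially heavier upper tail of the residuals than a normal benchmark
# indicates that valuations have detached from fundamentals for some firms.
q95, q99 = np.quantile(residuals, [0.95, 0.99])
print("95th / 99th percentile of normalized log market cap:", round(q95, 3), round(q99, 3))
```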

    Introduction

    It is common knowledge in macroeconomics that, as Federal Reserve Board Chairman Alan Greenspan said in 2002, “...it is very difficult to identify a bubble until after the fact; that is, when its bursting confirms its existence.” In other words, before a bubble bursts, there is no way to establish whether the economy is in a bubble or not. In economics, a stock bubble is defined as a state in which speculative investment flows into a firm in excess of the firm’s fundamentals, so that the market capitalization (= stock price × number of shares issued) becomes excessively high compared to the fundamentals. Unfortunately, it is exceedingly difficult to precisely measure a firm’s fundamentals, and this has made it nearly impossible to detect a stock bubble by simply measuring the divergence between fundamentals and market capitalization [1–3]. On the other hand, we empirically know that the market capitalization and PBR (= market capitalization / net assets) of some stocks increase during bubble periods [4–7]. However, they are also buoyed by rising fundamentals, so it is not always possible to determine whether such increases can be attributed to an emerging bubble.

     

    WP010

  • Product Cycle and Prices: a Search Foundation

    Abstract

    This paper develops a price model with a product cycle. Through a frictional product market with search and matching frictions, an endogenous product cycle is accompanied by a price cycle in which the price of a new good and the price of an existing good are set in different manners. The model nests a New Keynesian Phillips curve with Calvo price adjustment as a special case and generates several new phenomena. Our simple model captures observed facts in Japanese product-level data, such as the pro-cyclicality among product entry, demand, and prices. In a general equilibrium model, endogenous product entry increases the variation of the inflation rate in Japan by 20 percent. This number rises to 72 percent when price discounting after the first price is allowed for.
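
    For reference, the special case mentioned above is the standard New Keynesian Phillips curve under Calvo pricing, shown in its textbook form below; the paper's version, with an endogenous product cycle, nests this as a limiting case.

```latex
% Standard New Keynesian Phillips curve under Calvo pricing (textbook form):
\[
  \pi_t = \beta\,\mathbb{E}_t[\pi_{t+1}] + \kappa\,\widehat{mc}_t ,
  \qquad
  \kappa = \frac{(1-\phi)(1-\beta\phi)}{\phi},
\]
% where \phi is the Calvo probability that a firm cannot reset its price in a given
% period, \beta is the discount factor, and \widehat{mc}_t is real marginal cost in
% log deviation from steady state.
```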

    Introduction

    "We have all visited several stores to check prices and/or to find the right item or the right size. Similarly, it can take time and effort for a worker to find a suitable job with suitable pay and for employers to receive and evaluate applications for job openings. Search theory explores the workings of markets once facts such as these are incorporated into the analysis. Adequate analysis of market frictions needs to consider how reactions to frictions change the overall economic environment: not only do frictions change incentives for buyers and sellers, but the responses to the changed incentives also alter the economic environment for all the participants in the market. Because of these feedback effects, seemingly small frictions can have large effects on outcomes."

     

    Peter Diamond

     

    WP009

  • House Price Dispersion in Boom-Bust Cycles: Evidence from Tokyo

    Abstract

    We investigate the cross-sectional distribution of house prices in the Greater Tokyo Area for the period 1986 to 2009. We find that size-adjusted house prices follow a lognormal distribution except for the period of the housing bubble and its collapse in Tokyo, for which the price distribution has a substantially heavier upper tail than that of a lognormal distribution. We also find that, during the bubble era, sharp price movements were concentrated in particular areas, and this spatial heterogeneity is the source of the fat upper tail. These findings suggest that, during a bubble, prices increase markedly for certain properties but to a much lesser extent for other properties, leading to an increase in price inequality across properties. In other words, the defining property of real estate bubbles is not the rapid price hike itself but an increase in price dispersion. We argue that the shape of cross-sectional house price distributions may contain information useful for the detection of housing bubbles. 
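
    A minimal sketch of this kind of distributional check, on synthetic size-adjusted prices and with illustrative thresholds rather than the paper's procedure, might look as follows:

```python
# Sketch: compare the upper tail of size-adjusted house prices with a lognormal
# benchmark fitted to the body of the distribution. Synthetic data, illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# "Normal" period: size-adjusted prices are roughly lognormal.
normal_prices = rng.lognormal(mean=3.0, sigma=0.4, size=20_000)

# "Bubble" period: sharp price increases concentrate in a subset of areas,
# producing a heavier upper tail than a lognormal distribution.
bubble_prices = normal_prices.copy()
hot = rng.choice(bubble_prices.size, size=1_000, replace=False)
bubble_prices[hot] *= 1.0 + rng.pareto(1.5, size=hot.size)

def upper_tail_excess(prices, q=0.99):
    """Empirical q-quantile relative to the q-quantile of a lognormal fitted
    robustly to the body of the distribution (median and IQR of log prices)."""
    logp = np.log(prices)
    mu = np.median(logp)
    sigma = (np.quantile(logp, 0.75) - np.quantile(logp, 0.25)) / 1.349
    fitted_q = np.exp(mu + stats.norm.ppf(q) * sigma)
    return np.quantile(prices, q) / fitted_q

print("normal period:", round(upper_tail_excess(normal_prices), 2))   # roughly 1
print("bubble period:", round(upper_tail_excess(bubble_prices), 2))   # above 1
```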

    Introduction

    Property market developments are of increasing importance to practitioners and policymakers. The financial crises of the past two decades have illustrated just how critical the health of this sector can be for achieving financial stability. For example, the recent financial crisis in the United States reared its head in its early stages in the form of the subprime loan problem. Similarly, the financial crises in Japan and Scandinavia in the 1990s were all triggered by the collapse of bubbles in the real estate market. More recently, the rapid rise in real estate prices - often supported by a strong expansion in bank lending - in a number of emerging market economies has become a concern for policymakers. Given these experiences, it is critically important to analyze the relationship between property markets, finance, and financial crises.

     

    WP008

  • The Lucas Imperfect Information Model with Imperfect Common Knowledge

    Abstract

    In the Lucas Imperfect Information model, output responds to unanticipated monetary shocks. We incorporate more general information structures into the Lucas model and demonstrate that output also responds to (dispersedly) anticipated monetary shocks if the information is imperfect common knowledge. Thus, the real effects of money consist of the unanticipated part and the anticipated part, and we decompose the latter into two effects, an imperfect common knowledge effect and a private information effect. We then consider an information structure composed of public and private signals. The real effects disappear when either signal reveals monetary shocks as common knowledge. However, when the precision of private information is fixed, the real effects are small not only when a public signal is very precise but also when it is very imprecise. This implies that a more precise public signal can amplify the real effects and make the economy more volatile.  

    Introduction

    In the Lucas Imperfect Information model (Lucas, 1972, 1973), which formalizes the idea of Phelps (1970), markets are decentralized and agents in each market have only limited information about prices in other markets. As a consequence, output responds to unanticipated monetary shocks; that is, imperfect information about prices generates real effects of money. However, if monetary shocks are anticipated, no real effects arise. This implies that monetary shocks cannot have lasting effects, which is considered to be a serious shortcoming of the Lucas model.

     

    WP007
