Working papers 2015


  • Consumption Taxes and Divisibility of Labor under Incomplete Markets

    Abstract

    We analyze lump-sum transfers financed through consumption taxes in a heterogeneous-agent model with uninsured idiosyncratic wage risk and endogenous labor supply. The model is calibrated to the U.S. economy. We find that consumption inequality and uncertainty decrease with transfers much more substantially under divisible than under indivisible labor. Increasing transfers by raising the consumption tax rate from 5% to 35% decreases the consumption Gini by 0.04 under divisible labor, whereas it has almost no effect on the consumption Gini under indivisible labor (a sketch of the Gini computation appears below). The divisibility of labor also affects the relationship among consumption-tax-financed transfers, aggregate saving, and the wealth distribution.

    Introduction

    What is the effect of government transfers on inequality and risk sharing when households face labor income uncertainty? Previous studies, such as Flodén (2001) and Alonso-Ortiz and Rogerson (2010), find that increasing lump-sum transfers financed through labor and/or capital income taxes substantially decreases consumption inequality and uncertainty in a general equilibrium model with uninsured earnings risk. However, little is known about the impact of increasing consumption-tax financed transfers. Does it help people smooth consumption and does it reduce inequality? What is the impact on efficiency? The present paper analyzes these questions quantitatively.
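
    A minimal sketch, in Python, of the Gini computation referenced in the abstract above; the consumption draws are hypothetical stand-ins for the model's stationary distribution, not the paper's data.

        import numpy as np

        def gini(x):
            """Gini coefficient via the sorted-index formula:
            G = sum_i (2i - n - 1) x_i / (n * sum_i x_i), x ascending."""
            x = np.sort(np.asarray(x, dtype=float))
            n = x.size
            i = np.arange(1, n + 1)
            return float(np.sum((2 * i - n - 1) * x) / (n * np.sum(x)))

        # Hypothetical example: log-normal consumption draws
        rng = np.random.default_rng(0)
        consumption = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)
        print(f"consumption Gini: {gini(consumption):.3f}")

    In the paper's experiment, this statistic is compared across steady states under the 5% and 35% consumption tax rates.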

  • The Mechanism of Inflation Expectation Formation among Consumers

    Abstract

    How do we determine our expectations of inflation? Because inflation expectations greatly influence the economy, researchers have long considered this question. Using a survey with randomized experiments among 15,000 consumers, we investigate the mechanism of inflation expectation formation. Learning theory predicts that once people obtain new information on future inflation, they update their expectations, which then equal a weighted average of their prior belief and the new information (the updating formula is stated below). We confirm that the weight on the prior belief is a decreasing function of the degree of uncertainty. Our results also show that information from the monetary authority affects consumers to a greater extent when expectations are updated: with such information, consumers change their inflation expectations by 32% from the average. This finding supports efforts to improve monetary policy communication.

    Introduction

    Expectations vis-à-vis future inflation are very important for economic decision-making. People contemplate the future on many occasions, including when they consider how much to save or whether to postpone the purchase of a house. Thus, economists have been discussing what inflation expectations are, how they influence the overall dynamics of the economy, and how they are formed. Occasionally, such expectations become central to policy debates because the effectiveness of some types of monetary policies crucially depends upon how these are formed (Blinder, 2000; McCallum, 1984; Sargent, 1982). In spite of their long history, inflation expectations have also been renowned for being difficult to measure (Mankiw et al., 2004; Schwarz, 1999).
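
    The weighted-average updating described in the abstract is standard Bayesian learning with a normal prior and a normal signal; a minimal statement in generic notation (ours, not necessarily the paper's):

        \mathbb{E}[\pi \mid s] = \lambda \mu_0 + (1 - \lambda)\, s,
        \qquad
        \lambda = \frac{\sigma_s^2}{\sigma_0^2 + \sigma_s^2},

    where \mu_0 and \sigma_0^2 are the mean and variance of the prior belief about inflation \pi, and s is new information observed with noise variance \sigma_s^2. The weight \lambda on the prior falls as prior uncertainty \sigma_0^2 rises, which is the decreasing relationship the survey confirms.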

  • Parameter Bias in an Estimated DSGE Model: Does Nonlinearity Matter?

    Abstract

    How can parameter estimates be biased in a dynamic stochastic general equilibrium model that omits nonlinearity in the economy? To answer this question, we simulate data from a fully nonlinear New Keynesian model with the zero lower bound constraint and estimate a linearized version of the model. Monte Carlo experiments show that significant biases are detected in the estimates of monetary policy parameters and the steady-state inflation and real interest rates. These biases arise mainly from neglecting the zero lower bound constraint rather than from linearizing the equilibrium conditions. When the two models share the same parameter values, the variance-covariance matrix and impulse response functions of observed variables implied by the linearized model differ substantially from those implied by its nonlinear counterpart. However, we find that the biased parameter estimates in the estimated linear model can make most of these differences small.

    Introduction

    Following the development of Bayesian estimation and evaluation techniques, many economists have estimated dynamic stochastic general equilibrium (DSGE) models using macroeconomic time series. In particular, estimated New Keynesian models, which feature nominal rigidities and monetary policy rules, have been extensively used by policy institutions such as central banks. Most estimated DSGE models are linearized around a steady state because a linear state-space representation, along with the assumption of normality of exogenous shocks, enables us to evaluate the likelihood efficiently using the Kalman filter (a sketch of this computation appears below). However, Fernández-Villaverde and Rubio-Ramírez (2005) and Fernández-Villaverde, Rubio-Ramírez, and Santos (2006) demonstrate that the level of likelihood and parameter estimates based on a linearized model can be significantly different from those based on its original nonlinear model. Moreover, in the context of New Keynesian models, Basu and Bundick (2012), Braun, Körber, and Waki (2012), Fernández-Villaverde, Gordon, Guerrón-Quintana, and Rubio-Ramírez (2015), Gavin, Keen, Richter, and Throckmorton (2015), Gust, López-Salido, and Smith (2012), Nakata (2013a, 2013b), and Ngo (2014) emphasize the importance of considering nonlinearity in assessing the quantitative implications of the models when the zero lower bound (ZLB) constraint on the nominal interest rate is taken into account.
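
    A minimal self-contained sketch (not the authors' code) of the Kalman-filter likelihood evaluation mentioned above, for a linear state-space model x_t = A x_{t-1} + w_t, y_t = C x_t + v_t with Gaussian shocks w ~ N(0, Q) and v ~ N(0, R):

        import numpy as np

        def kalman_loglik(y, A, C, Q, R, x0, P0):
            """Gaussian log-likelihood of observations y (T x m)."""
            x, P = x0, P0
            ll, m = 0.0, y.shape[1]
            for t in range(y.shape[0]):
                # Predict the state and its covariance
                x = A @ x
                P = A @ P @ A.T + Q
                # Innovation and innovation covariance
                e = y[t] - C @ x
                S = C @ P @ C.T + R
                ll += -0.5 * (m * np.log(2 * np.pi)
                              + np.linalg.slogdet(S)[1]
                              + e @ np.linalg.solve(S, e))
                # Update with the Kalman gain
                K = P @ C.T @ np.linalg.inv(S)
                x = x + K @ e
                P = P - K @ C @ P
            return ll

    The bias the paper documents arises when a likelihood of this form, built from the linearized model, is maximized over data simulated from the nonlinear model with the ZLB.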

  • Strategic Central Bank Communication: Discourse and Game-Theoretic Analyses of the Bank of Japan’s Monthly Report

    Abstract

    We conduct a discourse analysis of the Bank of Japan’s Monthly Report and examine its characteristics in relation to business cycles. We find that the difference between the number of positive and negative expressions in the reports leads the leading index of the economy by approximately three months, which suggests that the central bank’s reports have some superior information about the state of the economy. Moreover, ambiguous expressions tend to appear more frequently with negative expressions. Using a simple persuasion game, we argue that the use of ambiguity in communication by the central bank can be seen as strategic information revelation when the central bank has an incentive to bias the reports (and hence beliefs in the market) upwards. (A sketch of the lead-lag computation appears below.)

    Introduction

    Central banks not only implement monetary policy but also provide a significant amount of information to the market (Blinder [2004], Eijffinger and Geraats [2006]). Indeed, most publications of central banks are not solely about monetary policy but provide data and analyses on the state of the economy. It has been widely recognized that central banks use various communication channels to influence market expectations so as to enhance the effectiveness of their monetary policy. Meanwhile, it is not readily obvious whether central banks reveal all the information they have exactly as it stands. In particular, although central banks cannot make untruthful statements owing to accountability and fiduciary requirements, they may communicate strategically and can be selective about the types of information they disclose. This concern takes on special importance when central banks’ objectives (e.g., keeping inflation/deflation under control and achieving maximum employment) may not be aligned completely with those of market participants and, possibly, governments.
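
    The lead-lag relation in the abstract above (the positive-minus-negative expression count leading the leading index by about three months) can be checked with correlations at monthly displacements; a sketch with two hypothetical aligned monthly series:

        import numpy as np

        def lead_profile(sentiment, index, max_lag=12):
            """corr(sentiment_t, index_{t+k}) for k = 0..max_lag; a peak
            at k = 3 means sentiment leads the index by three months."""
            s = (sentiment - sentiment.mean()) / sentiment.std()
            x = (index - index.mean()) / index.std()
            profile = {}
            for k in range(max_lag + 1):
                a = s[:-k] if k > 0 else s
                profile[k] = float(np.corrcoef(a, x[k:])[0, 1])
            return profile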

  • Liquidity Trap and Optimal Monetary Policy Revisited

    Abstract

    This paper revisits history-dependent easing, the conventional wisdom about optimal monetary policy in a liquidity trap. We show that, in an economy where the rate of inflation exhibits intrinsic persistence, monetary tightening comes earlier as inflation becomes more persistent. We refer to this property as early tightening; when inflation is sufficiently persistent, the central bank front-loads tightening so that it terminates the zero interest rate policy even before the natural rate of interest turns positive. As a prominent feature of policy in a liquidity trap, forward guidance that smooths the change in inflation rates thus contributes to an early termination of the zero interest rate policy (a generic statement of the policy problem appears below).

    Introduction

    The theory of monetary policy has developed since the 1990s on the basis of the New Keynesian model, as represented by Clarida et al. (1999) and Woodford (2003). In particular, Woodford (2003) identifies history dependence as a general property of optimal monetary policy: the optimal policy rule explicitly includes lagged endogenous variables, so current monetary policy reflects the past economic environment.
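
    In generic notation (not necessarily the paper's), the class of problems studied here is a central bank minimizing a quadratic loss subject to a Phillips curve with intrinsic inflation persistence and the zero bound on the nominal rate:

        \min \; \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^t \left( \pi_t^2 + \lambda x_t^2 \right)
        \quad \text{s.t.} \quad
        \pi_t = \gamma \pi_{t-1} + \kappa x_t + \beta \mathbb{E}_t \pi_{t+1},
        \qquad i_t \ge 0,

    where x_t is the output gap and \gamma indexes intrinsic inflation persistence (an IS relation linking x_t to the nominal rate i_t and the natural rate is omitted here for brevity). The abstract's "early tightening" result concerns how the optimal date for exiting i_t = 0 moves forward as \gamma rises.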

  • Transmission Channels and Welfare Implications of Unconventional Monetary Easing Policy in Japan

    Abstract

    This paper examines the effects of the Bank of Japan's (BOJ) Quantitative and Qualitative Monetary Easing Policy (QQE, 2013-present) by transmission channel, comparing them with the effects of the Comprehensive Monetary Easing Policy (CE) and the subsequent easing policies (2010-2012), based on an event study using financial market data (a sketch of the event-study logic appears below). Under the QQE, conducted in normal market conditions, the portfolio balance channel, working through depreciation of the foreign exchange rate, functions quite strongly, whereas under the CE the signaling channel through commitment and the credit easing channel in dysfunctional markets were at work. The direct inflation expectation channel is weak for both the QQE and the CE, even though the QQE adopted various means of exerting a direct, strong influence on inflation expectations. The gradual rise in inflation expectations thus appears to come mainly from other channels, such as the depreciation of the yen. The most distinctive characteristic of the QQE is that it maximizes the potential effects of easing by explicitly doubling, and later tripling, the amount of JGBs purchased and, in proportion, the monetary base. The BOJ's JGB purchases exceed JGB issuance, thereby reducing the amount of JGBs outstanding in the markets. A shortage of safe assets would raise the convenience yield, which in theory would reduce economic welfare and would not pass through to the yields of other, risky assets. The paper therefore examines the impact of the reduction in JGBs on the yield spreads between corporate bonds and JGBs using a money-in-utility-type model applied to JGBs, and finds that severe scarcity of JGBs as safe assets has so far been avoided, since Japan's outstanding public debt is the largest in the world. Even so, the event study shows no clear evidence that the decline in long-maturity JGB yields induced by the QQE has passed through to corporate bond yields. Demand for JGBs has recently been increasing, both from domestic and foreign investors seeking collateral after the Global Financial Crisis and from financial institutions that must meet strengthened global liquidity regulations, while the Government of Japan plans to consolidate the public debt. These recent changes, along with market expectations about the future path of the stock of JGBs, should also be taken into account in assessing the scarcity of safe assets in the event of further massive JGB purchases.

    Introduction

    Facing the zero lower bound on short-term interest rates, the Bank of Japan (BOJ) conducted the Quantitative Easing Policy (QEP) from 2001 to 2006, well in advance of other developed countries. At the time there were heated discussions about its effects (Ugai (2007)). After the Global Financial Crisis (GFC) in 2008, most major central banks have likewise faced the zero lower bound (Graph 1), and the Federal Reserve pursued the Large Scale Asset Purchases (LSAPs), followed by the Bank of England and the BOJ. Recently, although the Federal Reserve has terminated the LSAPs, the European Central Bank has newly adopted unconventional monetary policy, including an expanded asset purchase program. Researchers have begun to assess the effects and side effects of these unconventional easing policies theoretically and empirically (IMF (2013)), but no consensus has emerged so far.
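
    A minimal sketch, with hypothetical data, of the event-study logic used in the paper: compare daily changes in a market variable (say, the yen exchange rate or a JGB yield) around policy announcement dates with its unconditional daily changes.

        import numpy as np
        import pandas as pd

        def event_study(series, event_dates):
            """Welch t-test of announcement-day changes in a daily
            series against all other daily changes."""
            changes = series.diff().dropna()
            on_event = changes.index.isin(pd.DatetimeIndex(event_dates))
            event, rest = changes[on_event], changes[~on_event]
            t = (event.mean() - rest.mean()) / np.sqrt(
                event.var() / len(event) + rest.var() / len(rest))
            return event.mean(), t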

  • Conservatism and Liquidity Traps

    Abstract

    In an economy with an occasionally binding zero lower bound (ZLB) constraint, the anticipation of future ZLB episodes creates a trade-off for discretionary central banks between inflation and output stabilization. As a consequence, inflation systematically falls below target even when the policy rate is above zero. Appointing Rogoff’s (1985) conservative central banker mitigates this deflationary bias away from the ZLB and enhances welfare by improving allocations both at and away from the ZLB (a generic statement of the delegation problem appears below).

    Introduction

    Over the past few decades, a growing number of central banks around the world have adopted inflation targeting as a policy framework. The performance of inflation targeting in practice has been widely considered a success. However, some economists and policymakers have voiced the need to re-examine central banks’ monetary policy frameworks in light of the liquidity trap conditions currently prevailing in many advanced economies. As shown in Eggertsson and Woodford (2003) among others, the zero lower bound (ZLB) on nominal interest rates severely limits the ability of inflation-targeting central banks to stabilize the economy absent an explicit commitment technology. Some argue that the ZLB is likely to bind more frequently and that liquidity trap episodes might hit the economy more severely in the future than they have in the past. Understanding the implications of the ZLB for the conduct of monetary policy is therefore of the utmost importance for economists and policymakers alike.
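
    The delegation result can be stated with Rogoff's (1985) device in generic notation (ours, not the paper's): society's loss is

        L = \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^t \left( \pi_t^2 + \lambda x_t^2 \right),

    but policy is delegated to a discretionary central banker who minimizes the same loss with a smaller relative weight \bar\lambda < \lambda on output-gap stabilization. Because the anticipation of future ZLB episodes pushes inflation below target under discretion, a banker who cares relatively more about inflation leans against that deflationary bias, which is the welfare improvement described in the abstract.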

  • Short- and Long-Run Tradeoff of Monetary Easing

    Abstract

    In this study, we illustrate a tradeoff between the short-run positive and long-run negative effects of monetary easing by using a dynamic stochastic general equilibrium model embedding endogenous growth with creative destruction and sticky prices due to menu costs. While a monetary easing shock increases the level of consumption because of price stickiness, it lowers the frequency of creative destruction (i.e., product substitution) because inflation reduces the reward for innovation via menu cost payments. The model calibrated to the U.S. economy suggests that the adverse effect dominates in the long run.

    Introduction

    The Great Recession during 2007–09 prompted many central banks to conduct unprecedented levels of monetary easing. Although this helped prevent an economic catastrophe such as the Great Depression, many economies have experienced only slow and modest recoveries since then (i.e., they have faced secular stagnation). Japan has fallen into an even longer stagnation, namely the lost decades: firm entry and productive investment have been sluggish since the bursting of the asset market bubble around 1990, despite a series of monetary easing measures (Caballero et al. (2008)).

  • Money creation at the zero lower bound on interest rates in Japan (in Japanese)

    Abstract

    This study empirically analyzes the behavior of Japanese banks, particularly their lending behavior, under nominal interest rates at their lower bound. Ahead of other countries, Japan adopted policies that expand the monetary base when there is no room to lower interest rates. However, there is little evidence that the money stock increased in response. That is, the credit creation process weakened, and the money multiplier (in the marginal sense) appears to have vanished. This is consistent with standard macroeconomic theory. Nevertheless, and although the point is debatable, these policies appear to have had certain effects on output, prices, and other variables. What is the source of these effects? This study pursues the possibility that the money multiplier has not in fact fallen completely to zero and that the massive supply of the monetary base has contributed, if only slightly, to growth in the money stock. We construct panel data from the financial statements of individual banks and test whether banks that held more excess reserves at the end of the previous period tend to increase lending during the current period. The results show that, under zero interest rates, such a tendency is observed on average. Further examination reveals heterogeneity across banks in this tendency: the response of lending to excess reserves is stronger for banks holding more nonperforming loans, and differences are also observed across types of banks. This suggests that the recent rapid expansion of the monetary base may be flowing into the credit creation process through a subset of the banking sector rather than through the sector as a whole.

    Introduction

    In this study, we construct panel data from the financial statements of individual Japanese banks and conduct an empirical analysis of bank behavior, particularly lending behavior. Our main interest is whether, with nominal interest rates at their lower bound, banks that receive additional excess reserves show any tendency to channel even part of them into lending. (A sketch of this panel regression appears below.)
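
    A minimal sketch, in Python with hypothetical column and file names, of the kind of bank-level panel regression described above: loan growth regressed on the lagged excess-reserve ratio with bank and year fixed effects, and an interaction to probe the heterogeneity finding.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical panel: one row per bank-year, with loan growth,
        # the excess-reserve ratio at the previous period end, and the
        # nonperforming-loan ratio.
        df = pd.read_csv("bank_panel.csv")

        model = smf.ols(
            "loan_growth ~ excess_reserves_lag * npl_ratio"
            " + C(bank) + C(year)",
            data=df,
        ).fit(cov_type="cluster", cov_kwds={"groups": df["bank"]})

        # A positive coefficient on excess_reserves_lag means banks with
        # more excess reserves lend more; the interaction term captures
        # a stronger response at banks with more nonperforming loans.
        print(model.params[["excess_reserves_lag",
                            "excess_reserves_lag:npl_ratio"]])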

  • Payment Instruments and Collateral in the Interbank Payment System

    Abstract

    This paper presents a three-period model to analyze why banks need bank reserves for interbank payments despite the availability of other liquid assets like Treasury securities. The model shows that banks need extra liquidity if they settle bank transfers without the central bank. In this case, each pair of banks sending and receiving bank transfers must determine the terms of settlement between them bilaterally in an over-the-counter transaction. As a result, a receiving bank can charge a sending bank a premium for the settlement of bank transfers, because depositors’ demand for timely payments causes a hold-up problem for a sending bank. In light of this result, the large-value payment system operated by the central bank can be regarded as an interbank settlement contract to save liquidity. A third party like the central bank must operate this system because a custodian of collateral is necessary to implement the contract. This result implies that bank reserves are not independent liquid assets, but the balances of collateral submitted by banks to participate in a liquidity-saving contract. The optimal contract is the floor system. Whether a private clearing house can replace the central bank depends on the range of collateral it can accept.

    Introduction

    Base money consists of cash and bank reserves. Banks hold bank reserves not merely to satisfy a reserve requirement, but also to make interbank payments to settle bank transfers between their depositors. In fact, the daily transfer of bank reserves in a country tends to be as large as a sizable fraction of annual GDP. Also, several countries have abandoned a reserve requirement. Banks in these countries still use bank reserves to settle bank transfers.

  • Novel and topical business news and their impact on stock market activities

    Abstract

    We propose an indicator to measure the degree to which a particular news article is novel, as well as an indicator to measure the degree to which a particular news item attracts attention from investors (a sketch of both measures appears below). The novelty measure is obtained by comparing the extent to which a particular news article is similar to earlier news articles, and an article is regarded as novel if there was no similar article before it. On the other hand, we say a news item receives a lot of attention, and thus is highly topical, if it is simultaneously reported by many news agencies and read by many investors who receive news from those agencies. The topicality measure for a news item is obtained by counting the number of news articles whose content is similar to the original news article but which are delivered by other news agencies. To check the performance of the indicators, we empirically examine how they are correlated with intraday financial market indicators such as the number of transactions and price volatility. Specifically, we use a dataset consisting of over 90 million business news articles reported in English and a dataset consisting of minute-by-minute stock prices on the New York Stock Exchange and the NASDAQ Stock Market from 2003 to 2014, and show that stock prices and transaction volumes exhibit a significant response to a news article when it is novel and topical.

    Introduction

    Financial markets can be regarded as a non-equilibrium open system. Understanding how they work remains a great challenge to researchers in finance, economics, and statistical physics. Fluctuations in financial market prices are sometimes driven by endogenous forces and sometimes by exogenous forces. Business news is a typical example of exogenous forces. Casual observation indicates that stock prices respond to news articles reporting on new developments concerning companies’ circumstances. Market reactions to news have been extensively studied by researchers in several different fields [1]–[13], with some researchers attempting to construct models that capture static and/or dynamic responses to endogenous and exogenous shocks [14], [15]. The starting point for neoclassical financial economists typically is what they refer to as the “efficient market hypothesis,” which implies that stock prices respond at the very moment that news is delivered to market participants. A number of empirical studies have attempted to identify such an immediate price response to news but have found little evidence supporting the efficient market hypothesis [16]–[21].
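
    A minimal sketch of the two measures defined in the abstract above, using TF-IDF cosine similarity as the similarity notion (a common choice; the paper's exact measure may differ). Here `articles` is a hypothetical chronologically ordered list of (agency, text) pairs.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        def novelty_topicality(articles, threshold=0.6):
            """Novelty of article i: 1 - max similarity to any earlier
            article. Topicality of article i: count of similar later
            articles that come from *other* agencies."""
            agencies = [a for a, _ in articles]
            tfidf = TfidfVectorizer().fit_transform([t for _, t in articles])
            sim = cosine_similarity(tfidf)
            n = len(articles)
            novelty = [1.0] + [1.0 - sim[i, :i].max() for i in range(1, n)]
            topicality = [
                sum(1 for j in range(i + 1, n)
                    if sim[i, j] >= threshold and agencies[j] != agencies[i])
                for i in range(n)
            ]
            return novelty, topicality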

  • Replicating Japan’s CPI Using Scanner Data

    Abstract

    We examine how precisely one can reproduce the CPI constructed based on price surveys using scanner data. Specifically, we closely follow the procedure adopted by the Statistics Bureau of Japan when we sample outlets, products, and prices from our scanner data and aggregate them to construct a scanner data-based price index. We show that the following holds the key to precise replication of the CPI. First, the scanner data-based index crucially depends on how often the sampled products are replaced. The scanner data index deviates substantially from the actual CPI when one chooses a value for the parameter associated with product replacement such that replacement occurs frequently, but the deviation becomes much smaller if one picks a parameter value such that product replacement occurs only infrequently. Second, even when products are replaced only infrequently, the scanner data index differs significantly from the actual CPI in terms of volatility. The standard deviation of the scanner data-based monthly inflation rate is 1.54 percent, more than three times as large as that of actual CPI inflation. We decompose the difference in volatility between the two indexes into various factors, showing that it stems mainly from the difference in price rigidity for individual products. We propose a filtering technique to make individual prices in the scanner data stickier, thereby making scanner data-based inflation less volatile (a simple illustration of such a filter appears below).

    Introduction

    Scanner data has started to be used by national statistical offices in a number of countries, including Australia, the Netherlands, Norway, Sweden, and Switzerland, for at least part of the production of their consumer price indexes (CPIs). Many other national statistical offices have also started preparing for the use of scanner data in constructing their CPIs. The purpose of this paper is to empirically examine whether price indexes based on scanner data are consistent with price indexes constructed using the traditional survey-based method.
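
    The filtering idea mentioned in the abstract above can be illustrated with a simple threshold filter (a hypothetical variant, not necessarily the paper's exact procedure): the recorded price of a product is carried forward until the observed price deviates from it by more than a band, so temporary sales and small fluctuations do not move the index.

        import numpy as np

        def sticky_filter(prices, band=0.25):
            """Carry the last recorded price forward until the observed
            price deviates from it by more than `band` in log terms."""
            out = np.empty(len(prices), dtype=float)
            ref = float(prices[0])
            for t, p in enumerate(prices):
                if abs(np.log(p / ref)) > band:
                    ref = float(p)   # re-price: deviation left the band
                out[t] = ref
            return out

        # Hypothetical daily path with a temporary sale (100 -> 80 -> 100)
        prices = [100, 100, 80, 100, 100, 110, 110, 100, 100, 100]
        print(sticky_filter(prices))   # the sale is filtered out

    Widening or narrowing the band trades off how much genuine re-pricing is kept against how much sale-driven volatility is removed.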

  • Structure of global buyer-supplier networks and its implications for conflict minerals regulations

    Abstract

    We investigate the structure of global inter-firm linkages using a dataset that contains information on business partners for about 400,000 firms worldwide, including all the firms listed on the major stock exchanges. Among the firms, we examine three networks, based on customer-supplier, licensee-licensor, and strategic alliance relationships. First, we show that these networks all have scale-free topology and that the degree distribution for each follows a power law with an exponent of 1.5; the shortest path length is around six for all three networks (a sketch of these network statistics appears below). Second, we show through community structure analysis that firms tend to form communities with firms in the same industry but different home countries, indicating the globalization of firms’ production activities. Finally, we discuss what such production globalization implies for the proliferation of conflict minerals (i.e., minerals extracted from conflict zones and sold to firms in other countries to perpetuate fighting) through global buyer-supplier linkages. We show that a limited number of firms belonging to a few specific industries and countries play an important role in the global proliferation of conflict minerals. Our numerical simulation shows that regulations on purchases of conflict minerals by those firms would substantially reduce their worldwide use.

    Introduction

    Many complex physical systems can be modeled and better understood as complex networks [1, 2, 3]. Recent studies show that economic systems can also be regarded as complex networks in which economic agents, like consumers, firms, and governments, are closely connected [4, 5]. To understand the interaction among economic agents, we must uncover the structure of economic networks.
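
    A minimal sketch, assuming a hypothetical edge-list file of firm pairs, of how two of the abstract's statistics (the power-law degree exponent and the average shortest path length) can be computed with networkx:

        import networkx as nx
        import numpy as np

        # Hypothetical input: one "firm_a,firm_b" link per line
        G = nx.read_edgelist("buyer_supplier_edges.txt", delimiter=",")

        # Crude power-law exponent: slope of the degree distribution in
        # log-log coordinates (maximum-likelihood fits are preferable).
        degrees = np.array([d for _, d in G.degree()])
        ks, counts = np.unique(degrees, return_counts=True)
        slope, _ = np.polyfit(np.log(ks), np.log(counts / counts.sum()), 1)
        print(f"estimated power-law exponent: {-slope:.2f}")

        # Average shortest path on the largest connected component
        # (exact computation is expensive on networks of this size;
        # sampling node pairs is a common shortcut).
        giant = G.subgraph(max(nx.connected_components(G), key=len))
        print(f"average shortest path: "
              f"{nx.average_shortest_path_length(giant):.2f}")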
