Working papers 2016


  • Extracting fiscal policy expectations from a cross section of daily stock returns

    Abstract

    The "Fiscal foresight problem" poses a challenge to researchers who wish to estimate macroeconomic impacts of fiscal policies. That is, as much of the policies are pre-announced, the traditional identification strategy which relies on the timing and the amount of actual spending changes could be misleading. In Shioji and Morita (2015), we addressed this problem by constructing a daily indicator of surprises about future public investment spending changes for Japan. Our approach combined a detailed analysis of newspaper articles with information from the stock market. The latter was represented by a weighted average of stock returns across companies from the sector deeply involved with public work, namely the construction industry. A potential shortcoming with this approach is that any shock that has an industry-wide consequence, which happened to arrive on the same day that a news about policy arrived will be reflected in this average return. In contrast, in this paper, we propose a new indicator which takes advantage of heterogeneity across firms within the same industry. Degrees of dependence on public procurement differ markedly between construction companies. For some firms, over 80% of their work is government-related. Others essentially do all their work for the private sector. Yet they share many other features, such as large land ownership and a heavy reliance on bank finance. By looking at differences in the reactions of stock returns between those firms, we should be able to come up with a more purified measure of changes in the private agents' expectations about policies. Based on this idea, we propose two new indicators. One is simply the difference in the average excess returns between two groups of firms characterized by different degrees of dependence on public investment. The other one is more elaborate and is based on the "Target Rotation" approach in the factor analysis.

    Introduction

    This paper is a sequel to Shioji and Morita (2015). In that paper, we sought to overcome a common difficulty faced by researchers who try to estimate the macroeconomic effects of fiscal policies, known as the "fiscal foresight" problem. Recognition of the presence and importance of this issue has arguably been one of the most noteworthy developments in empirical studies of fiscal policy in recent years. As Ramey (2011) argues, government spending increases, especially major ones, are typically announced long before the actual spending is made. Forward-looking agents start changing their behavior based on those expectations as soon as the news arrives. In such circumstances, an empirical macroeconomist who uses only the conventional indicator of policy, namely the actual amount of spending, is unlikely to capture the entire impact of the policy correctly. This is why we need to know when news about policy changes was perceived by the private sector, as well as how large the surprise was.

  • Price Rigidity at Near-Zero Inflation Rates: Evidence from Japan

    Abstract

    A notable characteristic of Japan’s deflation since the mid-1990s is the mild pace of price decline, with the CPI falling at an annual rate of only around 1 percent. Moreover, even though unemployment increased, prices hardly reacted, giving rise to a flattening of the Phillips curve. In this paper, we address why deflation was so mild and why the Phillips curve flattened, focusing on changes in price stickiness. Our first finding is that, for the majority of the 588 items constituting the CPI, making up about 50 percent of the CPI in terms of weight, the year-on-year rate of price change was near-zero, indicating the presence of very high price stickiness. This situation started during the onset of deflation in the second half of the 1990s and continued even after the CPI inflation rate turned positive in spring 2013. Second, we find that there is a negative correlation between trend inflation and the share of items whose rate of price change is near zero, which is consistent with Ball and Mankiw’s (1994) argument based on the menu cost model that the opportunity cost of leaving prices unchanged decreases as trend inflation approaches zero. This result suggests that the price stickiness observed over the last two decades arose endogenously as a result of the decline in inflation. Third and finally, a cross-country comparison of the price change distribution reveals that Japan differs significantly from other countries in that the mode of the distribution is very close to zero for Japan, while it is near 2 percent for other countries including the United States. Japan continues to be an “outlier” even if we look at the mode of the distribution conditional on the rate of inflation. This suggests that whereas in the United States and other countries the “default” is for firms to raise prices by about 2 percent each year, in Japan the default is that, as a result of prolonged deflation, firms keep prices unchanged.
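    The key statistic behind the second finding, the weighted share of items whose year-on-year price change is near zero, can be sketched as follows. The item-level changes, the equal weights, and the ±0.5 percent "near-zero" band are all illustrative assumptions:

    ```python
    import numpy as np

    # Hypothetical year-on-year price changes (percent) for ten CPI items.
    yoy_changes = np.array([0.0, 0.1, -0.2, 1.5, -1.0, 0.3, 0.0, 2.1, -0.4, 0.05])

    # CPI weights; equal weights are used here purely for illustration.
    weights = np.full(len(yoy_changes), 1.0 / len(yoy_changes))

    # An item counts as "near-zero" if its price change is within the band.
    near_zero = np.abs(yoy_changes) <= 0.5

    # Weighted share of near-zero items, the statistic tracked against
    # trend inflation in the paper.
    share_near_zero = weights[near_zero].sum()
    ```

    Tracking this share over time against trend inflation would reproduce the negative correlation the abstract describes.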

    Introduction

    From the second half of the 1990s onward, Japan suffered a period of prolonged deflation, in which the consumer price index (CPI) declined as a trend. During this period, both the government and the Bank of Japan (BOJ) tried various policies to escape from deflation. For instance, from 1999 to 2000, the BOJ adopted a “zero interest rate policy” in which it lowered the policy interest rate to zero. This was followed by “quantitative easing” from 2001 until 2006. More recently, in January 2013, the BOJ adopted a “price stability target” with the aim of raising the annual rate of increase in the CPI to 2 percent. In April 2013, it announced that it was aiming to achieve the 2 percent inflation target within two years and, in order to achieve this, introduced Quantitative and Qualitative Easing (QQE), which seeks to double the amount of base money within two years. Further, in February 2016, the BOJ introduced a “negative interest rate policy,” in which the BOJ applies a negative interest rate of minus 0.1 percent to current accounts held by private banks at the BOJ, followed, in September 2016, by the introduction of “yield curve control,” in which the BOJ conducts JGB operations so as to keep the 10-year JGB yield at zero percent. See Table 1 for an overview of recent policy decisions made by the BOJ.

  • The Optimum Quantity of Debt for Japan

    Abstract

    Japan's net government debt reached 130% of GDP in 2013. The present paper analyzes the welfare implications of this large debt for Japan. We use an Aiyagari (1994)-style heterogeneous-agent, incomplete-market model with idiosyncratic wage risk and endogenous labor supply. We find that under the utilitarian welfare measure, the optimal government debt for Japan is -50% of GDP, and the current level of debt incurs a welfare cost of 0.22% of consumption. Decomposing the welfare cost by the method of Flodén (2001) reveals substantial welfare effects arising from changes in the level, inequality, and uncertainty of consumption. The level and inequality costs are 0.38% and 0.52%, respectively, whereas the uncertainty benefit is 0.68%. Adjusting consumption taxes instead of factor income taxes to balance the government budget substantially reduces the overall welfare cost.
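    The 0.22% figure is a consumption-equivalent measure. As a generic sketch (not necessarily the paper's exact formulation), such a welfare cost is the uniform consumption compensation $\lambda$ that makes the utilitarian planner indifferent between the current-debt allocation $(c_t, l_t)$ and the optimal-debt allocation $(c_t^{*}, l_t^{*})$:

    ```latex
    \mathbb{E}\sum_{t=0}^{\infty}\beta^{t}\,u\big((1+\lambda)\,c_{t},\,l_{t}\big)
    \;=\;
    \mathbb{E}\sum_{t=0}^{\infty}\beta^{t}\,u\big(c_{t}^{*},\,l_{t}^{*}\big)
    ```

    The Flodén (2001) decomposition then splits $\lambda$ into components attributable to the level, inequality, and uncertainty of consumption.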

    Introduction

    Japan's net government debt reached 130% of GDP in 2013, the highest debt-to-GDP ratio among developed countries. A large number of papers, including Hoshi and Ito (2014), Imrohoroğlu, Kitao, and Yamada (2016), and Hansen and Imrohoroğlu (2016), analyze Japan's debt problem. However, the welfare effect of the large government debt is less well understood. Flodén (2001) finds that the optimal government debt for the United States is 150% of GDP. Is the optimal debt for Japan similar, and hence should Japan accumulate more debt? Or does the current debt exceed the optimal level? How large is the welfare benefit of having the optimal level of debt instead of the current debt?

  • The Effectiveness of Consumption Taxes and Transfers as Insurance against Idiosyncratic Risk

    Abstract

    We quantitatively evaluate the effectiveness of a consumption tax and transfer program as insurance against idiosyncratic earnings risk. Our framework is a heterogeneous-agent, incomplete-market model with idiosyncratic wage risk and indivisible labor. The model is calibrated to the U.S. economy. We find a weak insurance effect of the transfer program. Extending the transfer system beyond its current scale raises consumption uncertainty, which increases aggregate savings and reduces the interest rate. Furthermore, consumption inequality shows only a small decrease.

    Introduction

    Households face substantial idiosyncratic labor income risk, and private insurance against such risk is far from perfect. The presence of uninsurable idiosyncratic earnings risk implies a potential role for government policies. The present paper examines the effectiveness of a consumption tax and transfer system as insurance against idiosyncratic earnings risk in an Aiyagari (1994)-style model with endogenous labor supply. We find that the transfer program is ineffective in terms of risk sharing. Expanding the current transfer scheme in the United States increases consumption volatility and precautionary savings. Thus, aggregate savings increase and the interest rate falls.

  • Product Turnover and Deflation: Evidence from Japan

    Abstract

    In this study, we evaluate the effects of product turnover on a welfare-based cost-of-living index. We first present several facts about price and quantity changes over the product cycle, employing scanner data for Japan for the years 1988-2013, which cover the deflationary period that started in the mid-1990s. We then develop a new method to decompose price changes at the time of product turnover into those due to the quality effect and those due to the fashion effect (i.e., the higher demand for products that are new). Our main findings are as follows: (i) the price and quantity of a new product tend to be higher than those of its predecessor at its exit from the market, implying that Japanese firms use new products as an opportunity to take back the price declines that occurred during the lives of their predecessors under deflation; (ii) a considerable fashion effect exists, while the quality effect is slightly declining; and (iii) the discrepancy between the cost-of-living index estimated based on our methodology and the price index constructed only from a matched sample is not large. Our study provides a plausible story to explain why Japan’s deflation during the lost decades was mild.
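    Finding (i) can be illustrated with the raw statistic behind it: the log price jump between a predecessor at exit and its successor at entry. The records below are hypothetical, and the computation is a stylized sketch, not the authors' quality/fashion decomposition:

    ```python
    import numpy as np

    # Hypothetical turnover records: predecessor's price at market exit
    # paired with the successor's price at market entry (same product line).
    exit_prices = np.array([98.0, 150.0, 210.0])
    entry_prices = np.array([105.0, 158.0, 230.0])

    # Log price jump at turnover. In the paper's story this jump mixes a
    # quality effect and a fashion effect, which the authors' method
    # separates; here we only measure the total jump.
    log_jump = np.log(entry_prices) - np.log(exit_prices)
    mean_jump = log_jump.mean()
    ```

    A positive mean jump is consistent with firms using new products to "take back" price declines that accumulated over the predecessors' lives.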

    Introduction

    Central banks need to have a reliable measure of inflation when making decisions on monetary policy. Often, it is the consumer price index (CPI) they refer to when pursuing an inflation targeting policy. However, if the CPI entails severe measurement bias, monetary policy aiming to stabilize the CPI inflation rate may well bring about detrimental effects on the economy. One obstacle lies in frequent product turnover; for example, supermarkets in Japan sell hundreds of thousands of products, with new products continuously being created and old ones being discontinued. The CPI does not collect the prices of all these products. Moreover, new products do not necessarily have the same characteristics as their predecessors, so that their prices may not be comparable.

  • Payment Instruments and Collateral in the Interbank Payment System

    Abstract

    This paper presents a three-period model to analyze the need for bank reserves in the presence of other liquid assets like Treasury securities. If a pair of banks settles bank transfers without bank reserves, they must prepare extra liquidity for interbank payments, because depositors’ demand for timely payments causes a hold-up problem in the bilateral settlement of bank transfers. In light of this result, the interbank payment system provided by the central bank can be regarded as an implicit interbank settlement contract to save liquidity. The central bank is necessary for this contract as the custodian of collateral. Bank reserves can be characterized as the balances of liquid collateral submitted by banks to participate in this contract. This result explains the rate-of-return dominance puzzle and the need for substitution between bank reserves and other liquid assets simultaneously. The optimal contract is the floor system, not only because it pays interest on bank reserves, but also because it eliminates the over-the-counter interbank money market. The model indicates that it is efficient for all banks to share the same custodian of collateral, which justifies the current practice whereby a public institution provides the interbank payment system.

    Introduction

    Base money consists of currency and bank reserves. Banks hold bank reserves not merely to satisfy a reserve requirement, but also to make interbank payments to settle bank transfers between their depositors. In fact, the daily transfer of bank reserves in a country tends to be as large as a sizable fraction of annual GDP. Also, several countries have abandoned a reserve requirement. Banks in these countries still use bank reserves to settle bank transfers.

  • Who buys what, where: Reconstruction of the international trade flows by commodity and industry

    Abstract

    We developed a model to reconstruct the international trade network by considering both commodities and industry sectors, in order to study the effects of reduced trade costs. First, we estimated trade costs that reproduce the WIOD and NBER-UN data. Using these costs, we estimated the trade costs of sector-specific trade by type of commodity. We then successfully reconstructed sector-specific trade for each type of commodity by maximizing the configuration entropy given the estimated costs. In WIOD, trade is conducted actively between the same industry sectors, whereas in NBER-UN, trade is conducted actively between neighboring countries; this appears contradictory. We therefore conducted community analysis on the reconstructed sector-specific trade network by type of commodity. The community analysis showed that products are actively traded among the same industry sectors in neighboring countries. The observed features of the community structure in WIOD and NBER-UN are therefore complementary.
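    The reconstruction step can be illustrated with a standard analogue: an entropy-maximizing (doubly constrained gravity) model in which trade costs enter through an impedance factor and the matrix is fitted to known export and import totals by iterative proportional fitting. All numbers, and the use of IPF itself, are illustrative assumptions rather than the paper's exact procedure:

    ```python
    import numpy as np

    # Known margins: total exports (row sums) and imports (column sums)
    # for three hypothetical countries; totals balance at 240.
    exports = np.array([100.0, 80.0, 60.0])
    imports = np.array([90.0, 70.0, 80.0])

    # Illustrative bilateral trade costs and cost-sensitivity parameter.
    cost = np.array([[0.0, 1.0, 2.0],
                     [1.0, 0.0, 1.0],
                     [2.0, 1.0, 0.0]])
    beta = 0.5

    # Start from the impedance matrix exp(-beta * cost), then alternately
    # rescale rows and columns until both margins are matched (IPF).
    T = np.exp(-beta * cost)
    for _ in range(200):
        T *= (exports / T.sum(axis=1))[:, None]  # match export totals
        T *= imports / T.sum(axis=0)             # match import totals
    ```

    The fitted `T` is the maximum-entropy trade matrix consistent with the margins and the cost structure; lower-cost (here, nearer) country pairs receive systematically larger flows.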

    Introduction

    In the era of economic globalization, most national economies are linked by international trade, which consequently forms a complex global economic network. It is believed that greater economic growth can be achieved through free trade, based on the establishment of Free Trade Agreements (FTAs) and Economic Partnership Agreements (EPAs). However, there are limits to the resolution of currently available trade data. For instance, NBER-UN records trade amounts between country pairs without industry sector information for each type of commodity [1], and the World Input-Output Database (WIOD) records sector-specific trade amounts without commodity information [2]. This limited resolution makes it difficult to analyze community structures in detail and to systematically assess the effects of reduced trade tariffs and trade barriers.

  • Power laws in market capitalization during the Dot-com and Shanghai bubble periods

    Abstract

    The distributions of market capitalization across stocks listed on the NASDAQ and Shanghai stock exchanges have power law tails. The power law exponents associated with these distributions fluctuate around one, but show a substantial decline during the dot-com bubble in 1997-2000 and the Shanghai bubble in 2007. In this paper, we show that the observed decline in the power law exponents is closely related to the deviation of the market values of stocks from their fundamental values. Specifically, we regress the market capitalization of individual stocks on financial variables, such as sales, profits, and asset size, using the entire sample period (1990 to 2015) in order to identify variables with substantial contributions to fluctuations in fundamentals. Based on the regression results for stocks listed on the NASDAQ, we argue that the fundamental value of a company is well captured by the value of its net assets, so that the price-to-book ratio (PBR) is a good measure of the deviation from fundamentals. We show that the PBR distribution across stocks listed on the NASDAQ has a much heavier upper tail in 1997 than in other years, suggesting that stock prices deviate from fundamentals for a limited number of stocks constituting the tail part of the PBR distribution. However, we fail to obtain a similar result for Shanghai stocks.

    Introduction

    Since B. Mandelbrot identified the fractal structure of price fluctuations in asset markets in 1963 [1], statistical physicists have been investigating the economic mechanisms through which a fractal structure emerges. Power laws are an important characteristic of fractal structures. For example, some studies found that the size distribution of asset price fluctuations follows a power law [2,3]. It has also been shown that firm size distributions (e.g., the distribution of sales across firms) follow a power law [4–8]. The power law exponent associated with firm size distributions has been close to one over the last 30 years in many countries [9, 10]. The situation in which the exponent is equal to one is special in that it is the critical point between the oligopolistic phase and the pseudo-equal phase [11]. If the power law exponent is less than one, a finite number of top firms occupies a dominant share of the market even if there is an infinite number of firms.
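    Estimating such a tail exponent from a cross section of market capitalizations can be sketched with the Hill estimator, one standard choice (not necessarily the authors' method), applied here to synthetic Pareto data with a true exponent of one:

    ```python
    import numpy as np

    # Synthetic market capitalizations: 1/U with U uniform on (0,1) has a
    # Pareto tail with exponent exactly one, mimicking the near-critical
    # firm size distributions discussed above.
    rng = np.random.default_rng(1)
    cap = 1.0 / rng.uniform(size=10_000)

    def hill_exponent(x, k):
        """Hill estimator of the power law tail exponent from the top k values."""
        xs = np.sort(x)
        threshold = xs[-k - 1]                   # (k+1)-th largest value
        return k / np.log(xs[-k:] / threshold).sum()

    alpha = hill_exponent(cap, k=500)            # should be close to 1
    ```

    Re-estimating `alpha` year by year on real market capitalizations is what reveals the declines during the bubble episodes described in the abstract.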

  • Puzzles in the Tokyo Fixing in the Forex Market: Order Imbalances and Bank Pricing

    Abstract

    “Fixing” in the foreign exchange market, in Tokyo at 10am and in London at 4pm, is a market practice that determines the bid-ask mid-point exchange rate at a scheduled time of the day. The fixing exchange rate is then applied to the settlement of foreign exchange transactions between banks and retail customers, including broker dealers, institutional investors, insurance companies, exporters, and importers, with varying bid-ask spreads. The findings for the Tokyo fixing are summarized as follows. (1) Price spikes are more frequent than in the London fixing. (2) Customer orders are biased toward buying the foreign currencies, and this bias is predictable. (3) Trading volumes and liquidity concentrate on the USD/JPY. (4) Before 2008, the fixing price set by banks was biased upward and higher than the highest transaction price during the fixing time window; the banks were earning monopolistic profits, but this gap disappeared after 2008. (5) The fixing price is still above the average transaction price in the fixing window, suggesting that banks make profits, but this can be understood as compensation for the risk of maintaining the fix for the rest of the business day. (6) Calendar effects also matter for the determination of the fixing rate and price fluctuations around the fixing.

    Introduction

    “Fixing” in the foreign exchange market is a market practice that determines the bid-ask mid-point exchange rate around a pre-announced time of the day. The fixing exchange rate is then applied to the settlement of foreign exchange transactions between banks and retail customers including broker dealers, institutional investors, insurance companies, exporters and importers, with varying bid-ask spreads.

  • The gradual evolution of buyer-seller networks and their role in aggregate fluctuations

    Abstract

    Buyer–seller relationships among firms can be regarded as a longitudinal network in which the connectivity pattern evolves as each firm receives productivity shocks. Based on a data set describing the evolution of buyer–seller links among 55,608 firms over a decade, and on structural equation modeling, we find some evidence that interfirm networks evolve to reflect firms’ local decisions to mitigate adverse effects from neighboring firms, while enjoying positive effects from them, through interfirm linkages. As a result, link renewal tends to have a positive impact on firms’ growth rates. We also investigate the role of networks in aggregate fluctuations.

    Introduction

    The interfirm buyer–seller network is important from both the macroeconomic and the microeconomic perspectives. From the macroeconomic perspective, this network represents a form of interconnectedness in an economy that allows firm-level idiosyncratic shocks to propagate to other firms. Previous studies have suggested that this propagation mechanism interferes with the averaging-out process of shocks and possibly has an impact on macroeconomic variables such as aggregate fluctuations (Acemoglu, Ozdaglar and Tahbaz-Salehi (2013), Acemoglu et al. (2012), Carvalho (2014), Carvalho (2007), Shea (2002), Foerster, Sarte and Watson (2011) and Malysheva and Sarte (2011)). From the microeconomic perspective, a network at a particular point in time is the result of each firm’s link renewal decisions, made in order to avoid negative shocks from, or share positive shocks with, its neighboring firms. These two views of a network are related by the fact that both concern the propagation of shocks. The former view stresses that idiosyncratic shocks propagate through a static network, while the latter provides a more dynamic view in which firms can renew their link structure in order to share or avoid shocks. It is not clear, however, how the latter affects the former. Does link renewal increase aggregate fluctuations because firms form new links that convey positive shocks, decrease them because firms sever links that convey negative shocks, or does it have some other effect?
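    The static propagation view can be illustrated with a toy input-output network in which idiosyncratic shocks reach other firms through the Leontief inverse, the standard mechanism in the literature cited above. The network and shocks below are illustrative:

    ```python
    import numpy as np

    # Toy buyer-seller network: W[i, j] is the share of firm i's inputs
    # purchased from firm j (each row sums to at most one). Firm 2 is a
    # supplier to firm 1, which in turn supplies firm 0.
    W = np.array([
        [0.0, 0.5, 0.2],
        [0.0, 0.0, 0.6],
        [0.0, 0.0, 0.0],
    ])

    # Idiosyncratic productivity shocks: a negative shock to firm 2 only.
    eps = np.array([0.0, 0.0, -0.1])

    # Total impact on each firm solves x = eps + W @ x, i.e. the shock is
    # amplified through the Leontief inverse (I - W)^{-1}.
    n = W.shape[0]
    total_impact = np.linalg.solve(np.eye(n) - W, eps)
    ```

    Here the shock to firm 2 is passed on to its direct customer (firm 1) and, attenuated, to the indirect customer (firm 0), so the shock does not simply average out across firms.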

  • A Double Counting Problem in the Theory of Rational Bubbles

    Abstract

    In a standard overlapping generations model, the unique equilibrium price of a Lucas’ tree can be decomposed into the present discounted value of dividends and the stationary monetary equilibrium price of fiat money, the latter of which is a rational bubble. Thus, the standard interpretation of a rational bubble as the speculative component in an asset price double-counts the value of pure liquidity that is already part of the fundamental price of an interest-bearing asset.
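    The decomposition can be sketched under simplifying assumptions (a constant dividend $d$ and gross interest rate $R$, both illustrative). The no-arbitrage condition $p_t = (d + p_{t+1})/R$ admits solutions of the form:

    ```latex
    p_{t}
    \;=\;
    \underbrace{\sum_{s=1}^{\infty} \frac{d}{R^{s}}}_{\text{present discounted value of dividends}}
    \;+\;
    \underbrace{b_{t}}_{\text{bubble}},
    \qquad
    b_{t+1} = R\, b_{t}.
    ```

    Any nonnegative $b_t$ satisfying the second equation is consistent with no arbitrage; a stationary bubble ($b_{t+1} = b_t$) requires $R = 1$, as in the stationary monetary equilibrium where, in the paper's overlapping generations setting, the bubble term coincides with the price of fiat money.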

    Introduction

    A rational bubble is usually modeled as an intrinsically useless asset. As shown by Tirole (1985), it attains a positive market value if it is expected to be exchangeable for goods in the future. It becomes worthless if it is expected to be worthless in the future, given that it has no intrinsic use. This property of self-fulfilling multiple equilibria has been used to explain a large boom-bust cycle in an asset price, as a stochastic transition between the two equilibria can generate a boom-bust cycle without any associated change in asset fundamentals.
