Working Papers, FY2012

  • Repos in Over-the-Counter Markets

    Abstract

    This paper presents a dynamic matching model featuring dealers and short-term investors in an over-the-counter bond market. The model illustrates that bilateral bargaining in an over-the-counter market results in an endogenous bond-liquidation cost for short-term investors. This cost creates short-term investors’ need for repurchase agreements when they buy long-term bonds. It also explains the existence of a margin specific to repurchase agreements held by short-term investors, if repurchase agreements must be renegotiation-proof. Without repurchase agreements, short-term investors do not buy long-term bonds. In this case, the bond yield rises unless dealers have enough capital to buy and hold bonds.

    Introduction

    Repurchase agreements, or repos, are one of the primary instruments in the money market. In a repo, a short-term investor buys bonds with a future contract in which the seller of the bonds, typically a bond dealer, promises to buy them back at a later date. This observation raises the question of why investors need such a promise when they could simply buy and resell bonds in a series of spot transactions. In this paper, I present a model to illustrate that a bond-liquidation cost due to over-the-counter (OTC) trading can explain short-term investors’ need for repos in bond markets. It is not necessary to introduce uncertainty or asymmetric information to obtain this result.

  • Estimating Quality Adjusted Commercial Property Price Indexes Using Japanese REIT Data

    Abstract

    We propose a new method to estimate quality adjusted commercial property price indexes using real estate investment trust (REIT) data. Our method is based on the present value approach, but the way the denominator (i.e., the discount rate) and the numerator (i.e., cash flows from properties) are estimated differs from the traditional method. We estimate the discount rate based on the share prices of REITs, which can be regarded as the stock market’s valuation of the set of properties owned by the REITs. As for the numerator, we use rental prices associated only with new rental contracts rather than those associated with all existing contracts. Using a dataset with prices and cash flows for about 500 commercial properties included in Japanese REITs for the period 2003 to 2010, we find that our price index signals turning points much earlier than an appraisal-based price index; specifically, our index peaks in the first quarter of 2007, while the appraisal-based price index exhibits a turnaround only in the third quarter of 2008. Our results suggest that the share prices of REITs provide useful information in constructing commercial property price indexes.

    Introduction

    Looking back at the history of economic crises, there are a considerable number of cases in which a crisis was triggered by the collapse of a real estate price bubble. For example, it is widely accepted that the collapse of Japan’s land/stock price bubble in the early 1990s played an important role in the subsequent economic stagnation, and in particular in the banking crisis that started in the latter half of the 1990s. Similarly, the Nordic banking crisis in the early 1990s occurred in tandem with the collapse of a property bubble, while the global financial crisis that began in the U.S. in 2008 and the recent European debt crisis were also triggered by the collapse of bubbles in the property and financial markets.
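
    As a complement to the abstract above, the following is a minimal numerical sketch of the present value approach it describes: back out a discount rate from a REIT’s stock-market valuation and capitalize the cash flow from newly signed rental contracts. All names and figures below (implied_discount_rate, property_value, the Gordon-growth capitalization, the yen amounts) are illustrative assumptions, not the paper’s actual estimator or data.

        # Minimal sketch of a present-value property valuation.
        # All inputs are hypothetical placeholders, not the paper's data.

        def implied_discount_rate(reit_market_cap, net_debt, annual_cash_flow, growth=0.0):
            """Back out a discount rate from a REIT's stock-market valuation,
            assuming a Gordon-growth relation V = CF / (r - g)."""
            enterprise_value = reit_market_cap + net_debt
            return annual_cash_flow / enterprise_value + growth

        def property_value(new_contract_rent, discount_rate, growth=0.0):
            """Capitalize cash flow from new rental contracts only."""
            return new_contract_rent / (discount_rate - growth)

        # Hypothetical example: a REIT with 100bn yen of equity, 50bn yen of debt,
        # and 9bn yen of annual cash flow from its properties.
        r = implied_discount_rate(reit_market_cap=100e9, net_debt=50e9, annual_cash_flow=9e9)
        print(f"implied discount rate: {r:.3f}")                          # 0.060
        print(f"value of a property earning 0.6bn yen/yr: {property_value(0.6e9, r):,.0f}")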

  • The Emergence of Different Tail Exponents in the Distributions of Firm Size Variables

    Abstract

    We discuss a mechanism through which inversion symmetry (i.e., invariance of a joint probability density function under the exchange of variables) and Gibrat’s law generate power-law distributions with different tail exponents. Using a dataset of firm size variables, that is, tangible fixed assets K, the number of workers L, and sales Y, we confirm that these variables have power-law tails with different exponents, and that inversion symmetry and Gibrat’s law hold. Based on these findings, we argue that there exists a plane in the three-dimensional space (log K, log L, log Y), with respect to which the joint probability density function for the three variables is invariant under the exchange of variables. We provide empirical evidence suggesting that this plane fits the data well, and argue that the plane can be interpreted as the Cobb-Douglas production function, which has been extensively used in various areas of economics since it was first introduced almost a century ago.

    Introduction

    In various phase transitions, it is universally observed that physical quantities near critical points obey power laws. For instance, in magnetic substances, the specific heat, magnetization, and magnetic susceptibility follow power laws in temperature or the magnetic field. It is also known that the distribution of spin cluster sizes follows a power law. The renormalization group approach has been employed to confirm that such power laws arise as critical phenomena of phase transitions [1].
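
    To make the abstract’s procedure concrete, here is a small sketch, on synthetic data, of the two empirical steps it mentions: estimating power-law tail exponents (via a simple Hill estimator, one common choice, not necessarily the authors’) and fitting a plane in (log K, log L, log Y), which corresponds to a Cobb-Douglas surface. The data-generating parameters are assumptions for illustration only.

        # Illustrative sketch on synthetic firm data: Hill estimates of the
        # power-law tail exponents of K, L, Y and an OLS fit of the plane
        # log Y = c + a*log K + b*log L (a Cobb-Douglas surface in logs).
        import numpy as np

        rng = np.random.default_rng(0)
        n = 50_000
        # Hypothetical firm-size variables with Pareto tails.
        K = rng.uniform(size=n) ** (-1 / 1.2)          # tail exponent ~ 1.2
        L = rng.uniform(size=n) ** (-1 / 1.5)          # tail exponent ~ 1.5
        Y = np.exp(0.3) * K**0.4 * L**0.6 * np.exp(0.2 * rng.standard_normal(n))

        def hill_exponent(x, tail_fraction=0.01):
            """Hill estimator of the Pareto tail exponent from the largest observations."""
            x = np.sort(x)[::-1]
            k = max(int(len(x) * tail_fraction), 10)
            return 1.0 / np.mean(np.log(x[:k - 1] / x[k - 1]))

        print("tail exponents (K, L, Y):", [round(hill_exponent(v), 2) for v in (K, L, Y)])

        # Fit the plane in (log K, log L, log Y) by least squares.
        X = np.column_stack([np.ones(n), np.log(K), np.log(L)])
        coef, *_ = np.linalg.lstsq(X, np.log(Y), rcond=None)
        print("Cobb-Douglas exponents (a, b):", coef[1].round(2), coef[2].round(2))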

  • High quality topic extraction from business news explains abnormal financial market volatility

    Abstract

    Understanding the mutual relationship between information flows and social activity is one of the cornerstones of the social sciences. In financial economics, the key issue in this regard is understanding and quantifying how news of all possible types (geopolitical, environmental, social, financial, economic, etc.) affects trading and the pricing of firms in organized stock markets. In this paper we address this issue by analyzing more than 24 million news records provided by Thomson Reuters and their relationship with trading activity for 205 major stocks in the S&P US stock index. We show that the whole landscape of news that affects stock price movements can be automatically summarized via simple regularized regressions between trading activity and news items decomposed, with the help of simple topic modeling techniques, into their “thematic” features. Using these methods, we are able to estimate and quantify the impact of news on trading. We introduce network-based visualization techniques to represent the whole landscape of news information associated with a basket of stocks. An examination of the words that are representative of the topic distributions confirms that our method is able to extract the significant pieces of information influencing the stock market. Our results show that one of the most puzzling stylized facts in financial economics, namely that at certain times trading volumes appear to be “abnormally large,” can be explained by the flow of news. In this sense, our results show that there is no “excess trading” if the news is genuinely novel and provides relevant financial information.

    Introduction

    Neoclassical financial economics based on the “efficient market hypothesis” (EMH) considers price movements to be almost perfect instantaneous reactions to information flows. Thus, according to the EMH, price changes simply reflect exogenous news. Such news, of all possible types (geopolitical, environmental, social, financial, economic, etc.), leads investors to continuously reassess their expectations of the cash flows that firms’ investment projects could generate in the future. These reassessments are translated into readjusted demand and supply functions, which then push prices up or down, depending on the net imbalance between demand and supply, towards a fundamental value. As a consequence, observed prices are considered the best embodiments of the present value of future cash flows. In this view, market movements are purely exogenous, without any internal feedback loops. In particular, the most extreme losses, those occurring during crashes, are considered to be triggered solely exogenously.
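
    The pipeline sketched in the abstract, topic decomposition of news followed by a regularized regression on trading activity, can be illustrated in a few lines. The snippet below is a toy version under stated assumptions: a handful of made-up headlines and volumes stand in for the Thomson Reuters corpus and the 205-stock universe, scikit-learn’s LatentDirichletAllocation stands in for the paper’s topic model, and Lasso for its regularized regression.

        # Toy sketch: decompose news into topics, then regress trading activity
        # on daily topic exposures with an L1-regularized (Lasso) regression.
        import numpy as np
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation
        from sklearn.linear_model import Lasso

        news = [  # hypothetical headlines, one per trading day
            "central bank raises rates amid inflation concerns",
            "tech earnings beat expectations on cloud growth",
            "oil prices surge after supply disruption",
            "regulator probes bank over trading losses",
        ]
        volume = np.array([1.2e6, 0.8e6, 1.5e6, 1.1e6])     # hypothetical traded volumes

        counts = CountVectorizer(stop_words="english").fit_transform(news)
        lda = LatentDirichletAllocation(n_components=2, random_state=0)
        topic_exposure = lda.fit_transform(counts)           # day-by-topic weight matrix

        # The L1 penalty keeps only the topics with explanatory power for volume.
        y = (volume - volume.mean()) / volume.std()
        model = Lasso(alpha=0.1).fit(topic_exposure, y)
        print("topic coefficients:", model.coef_.round(3))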

  • How Fast Are Prices in Japan Falling?

    Abstract

    The consumer price inflation rate in Japan has been below zero since the mid-1990s. However, despite the presence of a substantial output gap, the rate of deflation has been much smaller than that observed in the United States during the Great Depression. Given this, doubts have been raised regarding the accuracy of Japan’s official inflation estimates. Against this background, the purpose of this paper is to investigate to what extent estimates of the inflation rate depend on the methodology adopted. Our specific focus is on how inflation estimates depend on the method of outlet, product, and price sampling employed. For the analysis, we use daily scanner data on prices and quantities for all products sold at about 200 supermarkets over the last ten years. We regard this dataset as the “universe” and send out (virtual) price collectors to conduct sampling following more than sixty different sampling rules. We find that the officially released outcome can be reproduced when employing a sampling rule similar to the one adopted by the Statistics Bureau. However, we obtain numbers quite different from the official ones when we employ different rules. The largest rate of deflation we find using a particular rule is about 1 percent per year, which is twice as large as the official number, suggesting the presence of substantial upward bias in the official inflation rate. Nonetheless, our results show that the rate of deflation over the last decade is still small relative to that in the United States during the Great Depression, indicating that Japan’s deflation is moderate.

    Introduction

    The consumer price index (CPI) inflation rate in Japan has been below zero since the mid-1990s, clearly indicating the emergence of deflation over the last 15 years. However, the rate of deflation measured by the headline CPI in each year was around 1 percent, which is much smaller than the rates observed in the United States during the Great Depression. Some suggest that this simply reflects the fact that although Japan’s deflation is persistent, it is only moderate. Others, both inside and outside the country, however, argue that something must be wrong with the deflation figures, questioning Japan’s price data from a variety of angles. One of these angles is that, from an economic perspective, given the huge and persistent output gap in Japan, the rate of deflation should be higher than the numbers released by the government suggest. Fuhrer et al. (2011), for example, estimating a NAIRU model for Japan, conclude that it would not have been surprising if the rate of deflation had reached 3 percent per year. Another argument focuses on the statistics directly. Broda and Weinstein (2007) and Ariga and Matsui (2003), for example, maintain that there remains non-trivial mismeasurement in the Japanese consumer price index, so that the officially released CPI inflation rate over the last 15 years contains substantial upward bias.
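
    The “virtual price collector” exercise described in the abstract can be illustrated with a toy scanner dataset: apply two different sampling rules to the same data and compare the resulting year-on-year inflation rates. The DataFrame, the two rules, and all prices below are assumptions for illustration only; the paper applies more than sixty rules to data from roughly 200 supermarkets.

        # Toy sketch of applying different sampling rules to scanner data and
        # measuring how the resulting inflation rate depends on the rule.
        import pandas as pd

        scanner = pd.DataFrame({   # hypothetical scanner records
            "date":     pd.to_datetime(["2011-06-01"] * 4 + ["2012-06-01"] * 4),
            "category": ["milk", "milk", "bread", "bread"] * 2,
            "product":  ["A", "B", "C", "D"] * 2,
            "price":    [200, 180, 150, 160, 190, 182, 151, 158],
            "quantity": [150, 120, 200, 30, 140, 125, 210, 25],
        })
        base = scanner[scanner.date == "2011-06-01"]
        comp = scanner[scanner.date == "2012-06-01"]

        def inflation(sampled):
            """Average year-on-year price change for the sampled products."""
            merged = sampled.merge(comp, on=["category", "product"], suffixes=("_0", "_1"))
            return (merged.price_1 / merged.price_0).mean() - 1.0

        # Rule 1: in each category, track the best-selling product.
        rule1 = base.sort_values("quantity", ascending=False).drop_duplicates("category")
        # Rule 2: in each category, track the cheapest product.
        rule2 = base.sort_values("price").drop_duplicates("category")
        print("inflation under rule 1:", round(inflation(rule1), 4))
        print("inflation under rule 2:", round(inflation(rule2), 4))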

  • 「ゼロ金利下の長期デフレ」(Long-Term Deflation under Zero Interest Rates)

    Abstract

    In Japan, the policy interest rate has been at zero since the late 1990s, while the inflation rate has also hovered near zero. This “two zeros” phenomenon characterizes the monetary side of the Japanese economy during this period, and forms a pair with the prolonged stagnation of growth that characterizes the real side. This paper surveys the research conducted to date to uncover the causes of the “two zeros” phenomenon.
    Regarding the zero interest rate phenomenon, there are two views: one holds that it was triggered by a fall in the natural rate of interest (the real interest rate that equates saving and investment) to a negative level; the other holds that firms and households came, for some reason, to hold strong deflationary expectations, which then led the economy into a self-fulfilling deflationary equilibrium. According to estimates, Japan’s natural rate of interest has been quite low since the late 1990s and at times fell into negative territory. Meanwhile, households expecting prices to fall are a minority. These facts suggest that the negative natural rate of interest is the leading explanation for Japan’s zero interest rates. However, the possibility cannot be ruled out that strong expectations of yen appreciation among firms and households served as the starting point for a self-fulfilling deflationary equilibrium. As for prices, more than 90 percent of firms report that they do not immediately change their selling prices even when costs or demand change, indicating the presence of price stickiness. Furthermore, analysis using POS data shows that since the late 1990s the frequency of price changes has risen while the size of each change has become smaller. Such small, incremental price changes have made the decline in prices gradual. Behind these incremental price changes may be growing mutual restraint among stores and firms, in the sense that a firm changes its price if its rivals do and leaves it unchanged if they do not.
    The “two zeros” phenomenon is closely related to the ideas of the “liquidity trap” and “price stickiness” put forward by Keynes. However, the liquidity trap received little serious study after Keynes, and research exploring the causes of price stickiness in the data has gotten fully under way only in the past ten years. These circumstances lie behind the confused debate over the “two zeros” phenomenon and the delayed policy response. Researchers are called upon to tackle in earnest the homework Keynes left behind.

    Introduction

    If macroeconomic phenomena are divided into a real side and a monetary side, the most important phenomenon on the real side after the collapse of the bubble in the early 1990s was the decline in growth. The decline in growth and the accompanying loss of employment were pressing problems for many people, and researchers have conducted a variety of studies on the “lost decade.” The monetary side, by contrast, attracted little attention, at least immediately after the bubble burst, and rarely drew researchers’ interest. In fact, however, important changes were also under way on the monetary side during this period.
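
    The abstract above refers to a POS-data finding that price changes have become more frequent but smaller. As a purely illustrative sketch (assuming a made-up daily price series for a single product, not the survey’s data), the two statistics in question, the frequency of price changes and their average absolute size, can be computed like this:

        # Illustrative computation of price-change frequency and size from a
        # hypothetical daily price series for one product.
        import numpy as np

        prices = np.array([198, 198, 188, 188, 198, 198, 198, 178, 198, 198])

        log_changes = np.diff(np.log(prices))
        changed = log_changes != 0
        frequency = changed.mean()                            # share of days with a change
        mean_abs_size = np.abs(log_changes[changed]).mean()   # average size of a change

        print(f"frequency of price changes: {frequency:.2f} per day")
        print(f"mean absolute size of a change: {mean_abs_size:.3f} (absolute log change)")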
