Takayuki Mizuno

  • Detecting Stock Market Bubbles Based on the Cross‐Sectional Dispersion of Stock Prices

    Abstract

    A statistical method is proposed for detecting stock market bubbles that occur when speculative funds concentrate on a small set of stocks. A bubble is defined as a stock price that diverges from fundamentals. A firm’s financial standing is a key fundamental attribute of the firm. The law of one price dictates that firms of similar financial standing should share similar fundamentals. We investigate the variation in market capitalization normalized by fundamentals, where the fundamentals are estimated by Lasso regression on measures of a firm’s financial standing. The market capitalization distribution has a substantially heavier upper tail during bubble periods; that is, a market capitalization gap opens up within a small subset of firms with similar fundamentals. This phenomenon suggests that speculative funds concentrate on this subset. We demonstrate that this phenomenon could have been used to detect the dot-com bubble of 1998-2000 on different stock exchanges.

    Introduction

    It is common knowledge in macroeconomics that, as Federal Reserve Board Chairman Alan Greenspan said in 2002, “...it is very difficult to identify a bubble until after the fact; that is, when its bursting confirms its existence.” In other words, before a bubble bursts, there is no way to establish whether the economy is in a bubble or not. In economics, a stock bubble is defined as a state in which speculative investment flows into a firm in excess of the firm’s fundamentals, so that the market capitalization (= stock price × number of shares issued) becomes excessively high compared to the fundamentals. Unfortunately, it is exceedingly difficult to precisely measure a firm’s fundamentals, and this has made it nearly impossible to detect a stock bubble by simply measuring the divergence between fundamentals and market capitalization [1–3]. On the other hand, we know empirically that the market capitalization and PBR (= market capitalization / net assets) of some stocks increase during bubble periods [4–7]. However, these are also buoyed by rising fundamentals, so it is not always possible to determine whether such increases can be attributed to an emerging bubble.
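    The approach summarized in the abstract, normalizing market capitalization by Lasso-estimated fundamentals, can be sketched on synthetic data with a hand-rolled coordinate-descent Lasso. This is a minimal illustration only; the variable names and toy data-generating process are assumptions, not the paper's actual specification:

```python
import random

def soft_threshold(z, g):
    return z - g if z > g else z + g if z < -g else 0.0

def lasso(X, y, lam, iters=200):
    """Coordinate-descent Lasso for (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # residual with feature j's contribution removed
            r = [y[i] - sum(beta[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            denom = sum(X[i][j] ** 2 for i in range(n)) / n
            beta[j] = soft_threshold(rho, lam) / denom
    return beta

# Toy data: log market cap driven by log sales and log net assets;
# the third regressor is irrelevant and should be shrunk toward zero.
random.seed(0)
X, y = [], []
for _ in range(400):
    sales, assets, irrelevant = (random.gauss(0, 1) for _ in range(3))
    X.append([sales, assets, irrelevant])
    y.append(1.0 * sales + 0.5 * assets + random.gauss(0, 0.1))

beta = lasso(X, y, lam=0.05)
print([round(b, 2) for b in beta])  # irrelevant coefficient near 0
# The residual y_i - X_i . beta is then a firm's log deviation of
# market capitalization from its estimated fundamentals.
```

    The cross-sectional distribution of these residuals is the object whose upper tail the paper examines.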

     

    WP010

  • House Price Dispersion in Boom-Bust Cycles: Evidence from Tokyo

    Abstract

    We investigate the cross-sectional distribution of house prices in the Greater Tokyo Area for the period 1986 to 2009. We find that size-adjusted house prices follow a lognormal distribution except for the period of the housing bubble and its collapse in Tokyo, for which the price distribution has a substantially heavier upper tail than that of a lognormal distribution. We also find that, during the bubble era, sharp price movements were concentrated in particular areas, and this spatial heterogeneity is the source of the fat upper tail. These findings suggest that, during a bubble, prices increase markedly for certain properties but to a much lesser extent for other properties, leading to an increase in price inequality across properties. In other words, the defining property of real estate bubbles is not the rapid price hike itself but an increase in price dispersion. We argue that the shape of cross-sectional house price distributions may contain information useful for the detection of housing bubbles. 
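    One simple way to operationalize "lognormal except for a heavier upper tail" is to examine the moments of log prices. The sketch below contrasts a synthetic quiet-period sample with a synthetic bubble-period sample in which a small subset of properties receives a heavy-tailed price multiplier; the data-generating process is an assumption for illustration, not the paper's estimator:

```python
import math
import random
import statistics

def log_excess_kurtosis(prices):
    """Excess kurtosis of log prices: near 0 for lognormal data,
    clearly positive when the upper tail is heavier than lognormal."""
    logs = [math.log(p) for p in prices]
    mu, sd = statistics.fmean(logs), statistics.pstdev(logs)
    return sum(((x - mu) / sd) ** 4 for x in logs) / len(logs) - 3.0

random.seed(1)
# "Quiet period": size-adjusted prices drawn from a lognormal law.
quiet = [math.exp(random.gauss(0.0, 0.3)) for _ in range(20_000)]
# "Bubble period": the same base, but 5 percent of properties get a
# large heavy-tailed multiplier, mimicking concentrated price hikes.
bubble = [p * (random.paretovariate(1.5) if random.random() < 0.05 else 1.0)
          for p in quiet]

print(log_excess_kurtosis(quiet) < 1.0 < log_excess_kurtosis(bubble))  # True
```

    The same comparison could be run on upper quantile ratios; the point is that the bubble shows up in the shape of the cross-sectional distribution, not in its location.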

    Introduction

    Property market developments are of increasing importance to practitioners and policymakers. The financial crises of the past two decades have illustrated just how critical the health of this sector can be for achieving financial stability. For example, the recent financial crisis in the United States in its early stages reared its head in the form of the subprime loan problem. Similarly, the financial crises in Japan and Scandinavia in the 1990s were all triggered by the collapse of bubbles in the real estate market. More recently, the rapid rise in real estate prices - often supported by a strong expansion in bank lending - in a number of emerging market economies has become a concern for policymakers. Given these experiences, it is critically important to analyze the relationship between property markets, finance, and financial crisis.

     

    WP008

  • Predicting Adverse Media Risk using a Heterogeneous Information Network

    Abstract

    The media plays a central role in monitoring powerful institutions and identifying activities harmful to the public interest. In the investment sphere, comprising 46,583 officially listed domestic firms on stock exchanges worldwide, there is a growing interest in “doing the right thing”, i.e., in pressuring companies to improve their environmental, social and governance (ESG) practices. However, how can one overcome the sparsity of ESG data from non-reporting firms, and how can one identify the relevant information in the annual reports of this large universe? Here, we construct a vast heterogeneous information network that covers the necessary information surrounding each firm, assembled from seven professionally curated datasets and two open datasets, resulting in about 50 million nodes and 400 million edges in total. Exploiting this heterogeneous information network, we propose a model that can learn from past adverse media coverage patterns and predict the occurrence of future adverse media coverage events across the whole universe of firms. Our approach is tested using the adverse media coverage data of more than 35,000 firms worldwide from January 2012 to May 2018. Compared with state-of-the-art methods with and without the network, we show that predictive accuracy is substantially improved when the heterogeneous information network is used. This work suggests new ways to consolidate the diffuse information contained in big data in order to monitor dominant institutions on a global scale for more socially responsible investment, better risk management, and the surveillance of powerful institutions.
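    A heterogeneous information network of this kind can be represented as typed nodes joined by labeled edges. The sketch below, with purely hypothetical firm and relation names, shows the sort of neighborhood feature such a model can exploit; it is not the paper's actual data schema:

```python
from collections import defaultdict

# Toy heterogeneous information network: nodes carry a type, edges carry
# a relation label. All names and labels here are illustrative only.
nodes = {
    "firm_a": "firm", "firm_b": "firm", "firm_c": "firm",
    "article_1": "news", "article_2": "news",
}
edges = defaultdict(set)

def add_edge(u, v, relation):
    edges[u].add((v, relation))
    edges[v].add((u, relation))

add_edge("firm_a", "firm_b", "supplier_of")
add_edge("firm_b", "firm_c", "supplier_of")
add_edge("firm_b", "article_1", "adverse_coverage")
add_edge("firm_c", "article_2", "adverse_coverage")

def adverse_neighbors(firm):
    """Count firms one business link away that already have adverse
    coverage -- the kind of feature a network model can learn from."""
    count = 0
    for neighbor, relation in edges[firm]:
        if relation != "supplier_of":
            continue
        if any(r == "adverse_coverage" for _, r in edges[neighbor]):
            count += 1
    return count

print(adverse_neighbors("firm_a"))  # firm_b already has adverse coverage
```

    At the paper's scale (tens of millions of nodes), such features would be computed with a graph engine rather than Python dictionaries, but the structure is the same.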

    Introduction

    Adverse media coverage sometimes leads to fatal results for a company. In the press release sent out by Cambridge Analytica on May 2, 2018, the company wrote that “Cambridge Analytica has been the subject of numerous unfounded accusations, ... media coverage has driven away virtually all of the company’s customers and suppliers” [5]. This is just one recent example highlighting the impact of adverse media coverage on a firm’s fate. In another example, the impact of adverse media coverage on Swiss bank profits was estimated to be 3.35 times the median annual net profit of small banks and 0.73 times that of large banks [3]. These numbers are significant, indicating how adverse media coverage can cause huge damage to a bank. Moreover, a new factor, priced as the “no media coverage premium” [10], has been identified to help explain financial returns: stocks with no media coverage earn higher returns than stocks with high media coverage. Within the rational-agent framework, this may result from impediments to trade and/or from lower investor recognition leading to lower diversification [10]. Another mechanism could be associated with the fact that most of the coverage of mass media is negative [15, 23].

     

    WP004

  • Power Laws in Market Capitalization during the Dot-com and Shanghai Bubble Periods

    Abstract

    The distributions of market capitalization across stocks listed on the NASDAQ and Shanghai stock exchanges have power-law tails. The power-law exponents associated with these distributions fluctuate around one but show a substantial decline during the dot-com bubble in 1997-2000 and the Shanghai bubble in 2007. In this paper, we show that the observed decline in the power-law exponents is closely related to the deviation of the market values of stocks from their fundamental values. Specifically, we regress the market capitalization of individual stocks on financial variables, such as sales, profits, and asset sizes, using the entire sample period (1990 to 2015) in order to identify variables with substantial contributions to fluctuations in fundamentals. Based on the regression results for stocks listed on the NASDAQ, we argue that the fundamental value of a company is well captured by the value of its net assets, and therefore that the price book-value ratio (PBR) is a good measure of the deviation from fundamentals. We show that the PBR distribution across stocks listed on the NASDAQ has a much heavier upper tail in 1997 than in the other years, suggesting that stock prices deviate from fundamentals for a limited number of stocks constituting the tail part of the PBR distribution. However, we fail to obtain a similar result for Shanghai stocks.

    Introduction

    Since B. Mandelbrot identified the fractal structure of price fluctuations in asset markets in 1963 [1], statistical physicists have been investigating the economic mechanisms through which a fractal structure emerges. Power laws are an important characteristic of fractal structures. For example, some studies found that the size distribution of asset price fluctuations follows a power law [2,3]. It has also been shown that firm size distributions (e.g., the distribution of sales across firms) follow power laws [4–8]. The power-law exponent associated with firm size distributions has been close to one over the last 30 years in many countries [9, 10]. The situation in which the exponent equals one is special in that it is the critical point between the oligopolistic phase and the pseudo-equal phase [11]. If the power-law exponent is less than one, a finite number of top firms occupies a dominant share of the market even if there is an infinite number of firms.
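    Because the exponent's position relative to one separates the oligopolistic from the pseudo-equal phase, estimating the tail exponent is the key computation. A minimal sketch of one standard technique, the Hill estimator, on synthetic Zipf-distributed data (this illustrates the general method, not the papers' exact estimation procedure):

```python
import math
import random

def hill_estimator(sample, tail_fraction=0.05):
    """Hill estimate of the power-law exponent from the largest
    tail_fraction of the sample."""
    ordered = sorted(sample, reverse=True)
    k = max(2, int(len(ordered) * tail_fraction))
    threshold = ordered[k]  # (k+1)-th largest observation
    logs = [math.log(x / threshold) for x in ordered[:k]]
    return k / sum(logs)

# Synthetic firm sizes from a Pareto law with exponent mu = 1
# (Zipf's law), drawn by inverse-transform sampling.
random.seed(42)
mu = 1.0
sizes = [(1.0 - random.random()) ** (-1.0 / mu) for _ in range(100_000)]

print(round(hill_estimator(sizes), 2))  # close to 1 for Zipf data
```

    A decline of this statistic over time, as reported for the dot-com and Shanghai bubble periods, would show up directly in such an estimate.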

  • The Gradual Evolution of Buyer-Seller Networks and Their Role in Aggregate Fluctuations

    Abstract

    Buyer–seller relationships among firms can be regarded as a longitudinal network in which the connectivity pattern evolves as each firm receives productivity shocks. Based on a dataset describing the evolution of buyer–seller links among 55,608 firms over a decade, and on structural equation modeling, we find some evidence that interfirm networks evolve to reflect firms’ local decisions to mitigate adverse effects from neighboring firms through interfirm linkages, while enjoying positive effects from them. As a result, link renewal tends to have a positive impact on the growth rates of firms. We also investigate the role of networks in aggregate fluctuations.

    Introduction

    The interfirm buyer–seller network is important from both the macroeconomic and the microeconomic perspectives. From the macroeconomic perspective, this network represents a form of interconnectedness in an economy that allows firm-level idiosyncratic shocks to propagate to other firms. Previous studies have suggested that this propagation mechanism interferes with the averaging-out process of shocks and possibly has an impact on macroeconomic variables such as aggregate fluctuations (Acemoglu, Ozdaglar and Tahbaz-Salehi (2013), Acemoglu et al. (2012), Carvalho (2014), Carvalho (2007), Shea (2002), Foerster, Sarte and Watson (2011) and Malysheva and Sarte (2011)). From the microeconomic perspective, a network at a particular point in time is the result of each firm’s link renewal decisions, made in order to avoid negative shocks from, or share positive shocks with, its neighboring firms. These two views of a network are related by the fact that both concern the propagation of shocks. The former stresses that idiosyncratic shocks propagate through a static network, while the latter provides a more dynamic view in which firms can renew their link structure in order to share or avoid shocks. It is not clear, however, how the latter view affects the former. Does link renewal increase aggregate fluctuations because firms form new links that convey positive shocks, does it decrease aggregate fluctuations because firms sever links that convey negative shocks, or does it have a different effect?

  • Novel and topical business news and their impact on stock market activities

    Abstract

    We propose an indicator to measure the degree to which a particular news article is novel, as well as an indicator to measure the degree to which a particular news item attracts attention from investors. The novelty measure is obtained by comparing the extent to which a particular news article is similar to earlier news articles; an article is regarded as novel if no similar article preceded it. On the other hand, we say a news item receives a lot of attention, and thus is highly topical, if it is simultaneously reported by many news agencies and read by many investors who receive news from those agencies. The topicality measure for a news item is obtained by counting the number of news articles whose content is similar to an original news article but which are delivered by other news agencies. To check the performance of the indicators, we empirically examine how they are correlated with intraday financial market indicators such as the number of transactions and price volatility. Specifically, we use a dataset consisting of over 90 million business news articles reported in English and a dataset consisting of minute-by-minute stock prices on the New York Stock Exchange and the NASDAQ Stock Market from 2003 to 2014, and show that stock prices and transaction volumes exhibit a significant response to a news article when it is novel and topical.
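    The novelty indicator, comparing each article against earlier ones, can be sketched with a bag-of-words cosine similarity. This is a toy illustration; the actual similarity measure and preprocessing used in the paper may differ:

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def novelty(article: str, earlier: list) -> float:
    """Novelty = 1 minus the highest similarity to any earlier article."""
    vec = Counter(article.lower().split())
    sims = [cosine(vec, Counter(e.lower().split())) for e in earlier]
    return 1.0 - max(sims, default=0.0)

history = [
    "central bank keeps interest rates unchanged",
    "automaker recalls vehicles over brake defect",
]
breaking = "regulator opens probe into chipmaker merger"
repeat = "central bank keeps interest rates unchanged again"

print(novelty(breaking, history) > novelty(repeat, history))  # True
```

    The topicality measure would apply the same similarity in the other direction: counting how many articles from other agencies are similar to a given original.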

    Introduction

    Financial markets can be regarded as non-equilibrium open systems. Understanding how they work remains a great challenge to researchers in finance, economics, and statistical physics. Fluctuations in financial market prices are sometimes driven by endogenous forces and sometimes by exogenous forces. Business news is a typical example of an exogenous force. Casual observation indicates that stock prices respond to news articles reporting new developments concerning companies’ circumstances. Market reactions to news have been extensively studied by researchers in several different fields [1]–[13], with some attempting to construct models that capture static and/or dynamic responses to endogenous and exogenous shocks [14], [15]. The starting point for neoclassical financial economists is typically what they refer to as the “efficient market hypothesis,” which implies that stock prices respond at the very moment that news is delivered to market participants. A number of empirical studies have attempted to identify such an immediate price response to news but have found little evidence supporting the efficient market hypothesis [16]–[21].

  • Structure of global buyer-supplier networks and its implications for conflict minerals regulations

    Abstract

    We investigate the structure of global inter-firm linkages using a dataset that contains information on business partners for about 400,000 firms worldwide, including all the firms listed on the major stock exchanges. Among these firms, we examine three networks, based on customer-supplier, licensee-licensor, and strategic alliance relationships. First, we show that these networks all have scale-free topology and that the degree distribution for each follows a power law with an exponent of 1.5. The shortest path length is around six for all three networks. Second, we show through community structure analysis that firms tend to form communities with firms that belong to the same industry but have different home countries, indicating the globalization of firms’ production activities. Finally, we discuss what such production globalization implies for the proliferation of conflict minerals (i.e., minerals extracted from conflict zones and sold to firms in other countries to perpetuate fighting) through global buyer-supplier linkages. We show that a limited number of firms belonging to specific industries and countries play an important role in the global proliferation of conflict minerals. Our numerical simulation shows that regulations on the purchases of conflict minerals by those firms would substantially reduce their worldwide use.

    Introduction

    Many complex physical systems can be modeled and better understood as complex networks [1, 2, 3]. Recent studies show that economic systems can also be regarded as complex networks in which economic agents, like consumers, firms, and governments, are closely connected [4, 5]. To understand the interaction among economic agents, we must uncover the structure of economic networks.

  • Buyer-Supplier Networks and Aggregate Volatility

    Abstract

    In this paper, we investigate the structure and evolution of customer-supplier networks in Japan using a unique dataset that contains information on customer and supplier linkages for more than 500,000 incorporated non-financial firms for the five years from 2008 to 2012. We find, first, that the number of customer links is unequal across firms; the customer link distribution has a power-law tail with an exponent of unity (i.e., it follows Zipf’s law). We interpret this as implying that competition among firms to acquire new customers yields winners with a large number of customers, as well as losers with fewer customers. We also show that the shortest path length for any pair of firms is, on average, 4.3 links. Second, we find that link switching is relatively rare. Our estimates indicate that the survival rate per year for customer links is 92 percent and for supplier links 93 percent. Third and finally, we find that firm growth rates tend to be more highly correlated the closer two firms are to each other in a customer-supplier network (i.e., the smaller is the shortest path length for the two firms). This suggests that a non-negligible portion of fluctuations in firm growth stems from the propagation of microeconomic shocks – shocks affecting only a particular firm – through customer-supplier chains.
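    The 4.3-link average shortest path reported above is an average over pairs of firms. On a toy customer-supplier network the computation looks like this (illustrative firm names; a sketch of the statistic, not the paper's pipeline):

```python
from collections import deque
from itertools import combinations

def shortest_path_length(graph, src, dst):
    """Breadth-first search for the number of links between two firms."""
    seen, frontier = {src}, deque([(src, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if node == dst:
            return dist
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, dist + 1))
    return None  # no supply-chain path

# Toy customer-supplier network, treated as undirected for path lengths.
links = {
    "steelmaker": {"parts_maker"},
    "parts_maker": {"steelmaker", "automaker_a", "automaker_b"},
    "automaker_a": {"parts_maker", "dealer"},
    "automaker_b": {"parts_maker"},
    "dealer": {"automaker_a"},
}

lengths = [shortest_path_length(links, a, b)
           for a, b in combinations(links, 2)]
print(sum(lengths) / len(lengths))  # 1.8
```

    The paper's finding that growth-rate correlations decay with this distance uses exactly this notion of path length between firm pairs.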

    Introduction

    Firms in a modern economy tend to be closely interconnected, particularly in the manufacturing sector. Firms typically rely on the delivery of materials or intermediate products from their suppliers to produce their own products, which in turn are delivered to other downstream firms. Two recent episodes vividly illustrate just how closely firms are interconnected. The first is the recent earthquake in Japan. The earthquake and tsunami hit the Tohoku region, the north-eastern part of Japan, on March 11, 2011, resulting in significant human and physical damage to that region. However, the economic damage was not restricted to that region and spread in an unanticipated manner to other parts of Japan through the disruption of supply chains. For example, vehicle production by Japanese automakers, which are located far away from the affected areas, was stopped or slowed down due to a shortage of auto parts supplies from firms located in the affected areas. The shock even spread across borders, leading to a substantial decline in North American vehicle production. The second episode is the recent financial turmoil triggered by the subprime mortgage crisis in the United States. The adverse shock originally stemming from the so-called toxic assets on the balance sheets of U.S. financial institutions led to the failure of these institutions and was transmitted beyond entities that had direct business with the collapsed financial institutions to those that seemed to have no relationship with them, resulting in a storm that affected financial institutions around the world.

  • Analytical Derivation of Power Laws in Firm Size Variables from Gibrat’s Law and Quasi-inversion Symmetry: A Geomorphological Approach

    Abstract

    We start from Gibrat’s law and quasi-inversion symmetry for three firm size variables (i.e., tangible fixed assets K, number of employees L, and sales Y) and derive a partial differential equation to be satisfied by the joint probability density function of K and L. We then transform K and L, which are correlated, into two independent variables by applying surface openness used in geomorphology and provide an analytical solution to the partial differential equation. Using worldwide data on the firm size variables for companies, we confirm that the estimates on the power-law exponents of K, L, and Y satisfy a relationship implied by the theory.

    Introduction

    In econophysics, it is well known that the cumulative distribution functions (CDFs) of capital K, labor L, and production Y of firms obey power laws at large scales exceeding certain size thresholds K_0, L_0, and Y_0:

    P(x > K) ∝ K^(−μ_K) for K > K_0,   P(x > L) ∝ L^(−μ_L) for L > L_0,   P(x > Y) ∝ Y^(−μ_Y) for Y > Y_0.

  • The Structure and Evolution of Buyer-Supplier Networks

    Abstract

    In this paper, we investigate the structure and evolution of customer-supplier networks in Japan using a unique dataset that contains information on customer and supplier linkages for more than 500,000 incorporated non-financial firms for the five years from 2008 to 2012. We find, first, that the number of customer links is unequal across firms; the customer link distribution has a power-law tail with an exponent of unity (i.e., it follows Zipf’s law). We interpret this as implying that competition among firms to acquire new customers yields winners with a large number of customers, as well as losers with fewer customers. We also show that the shortest path length for any pair of firms is, on average, 4.3 links. Second, we find that link switching is relatively rare. Our estimates indicate that the survival rate per year for customer links is 92 percent and for supplier links 93 percent. Third and finally, we find that firm growth rates tend to be more highly correlated the closer two firms are to each other in a customer-supplier network (i.e., the smaller is the shortest path length for the two firms). This suggests that a non-negligible portion of fluctuations in firm growth stems from the propagation of microeconomic shocks – shocks affecting only a particular firm – through customer-supplier chains.

    Introduction

    Firms in a modern economy tend to be closely interconnected, particularly in the manufacturing sector. Firms typically rely on the delivery of materials or intermediate products from their suppliers to produce their own products, which in turn are delivered to other downstream firms. Two recent episodes vividly illustrate just how closely firms are interconnected. The first is the recent earthquake in Japan. The earthquake and tsunami hit the Tohoku region, the north-eastern part of Japan, on March 11, 2011, resulting in significant human and physical damage to that region. However, the economic damage was not restricted to that region and spread in an unanticipated manner to other parts of Japan through the disruption of supply chains. For example, vehicle production by Japanese automakers, which are located far away from the affected areas, was stopped or slowed down due to a shortage of auto parts supplies from firms located in the affected areas. The shock even spread across borders, leading to a substantial decline in North American vehicle production. The second episode is the recent financial turmoil triggered by the subprime mortgage crisis in the United States. The adverse shock originally stemming from the so-called toxic assets on the balance sheets of U.S. financial institutions led to the failure of these institutions and was transmitted beyond entities that had direct business with the collapsed financial institutions to those that seemed to have no relationship with them, resulting in a storm that affected financial institutions around the world.

  • Why are product prices in online markets not converging?

    Abstract

    Why are product prices in online markets dispersed in spite of very small search costs? To address this question, we construct a unique dataset from a Japanese price comparison site, which records price quotes offered by e-retailers as well as customers’ clicks on products, which occur when they proceed to purchase the product. We find that the distribution of prices retailers quote for a particular product at a particular point in time (divided by the lowest price) follows an exponential distribution, showing the presence of substantial price dispersion. For example, 20 percent of all retailers quote prices that are more than 50 percent higher than the lowest price. Next, comparing the probability that customers click on a retailer with a particular rank and the probability that retailers post prices at a particular rank, we show that both decline exponentially with price rank and that the exponents associated with the two probabilities are quite close. This suggests that the reason why some retailers set prices substantially higher than the lowest price is that they know some customers will choose them even at that high price. Based on these findings, we hypothesize that price dispersion in online markets stems from heterogeneity in customers’ preferences over retailers; that is, customers choose a set of candidate retailers based on their preferences, which are heterogeneous across customers, and then pick a particular retailer among the candidates based on the price ranking.
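    The exponential shape of quotes divided by the lowest price can be checked as follows on synthetic data; the price level, sample size, and rate parameter below are assumptions for illustration, not estimates from the paper's dataset:

```python
import random

def price_dispersion(quotes):
    """Normalize quotes by the lowest price and fit an exponential law
    to the relative margins m = p / p_min - 1 (rate = 1 / mean)."""
    p_min = min(quotes)
    margins = [p / p_min - 1.0 for p in quotes]
    mean_margin = sum(margins) / len(margins)
    rate = 1.0 / mean_margin
    share_50_above = sum(m > 0.5 for m in margins) / len(margins)
    return rate, share_50_above

# Synthetic quotes for one product: relative margins over the lowest
# price drawn from an exponential law with rate 3.
random.seed(7)
base = 10_000  # lowest price, in yen
quotes = [base * (1.0 + random.expovariate(3.0)) for _ in range(500)]

rate, share = price_dispersion(quotes)
print(round(rate, 1), round(share, 2))
```

    Under an exponential with rate 3, the share of quotes more than 50 percent above the lowest price is exp(-1.5), roughly 22 percent, of the same order as the 20 percent figure reported in the abstract.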

    Introduction

    The number of internet users worldwide is 2.4 billion, constituting about 35 percent of the global population. The number of users has more than doubled over the last five years and continues to increase [1]. In the early stages of the internet boom, observers predicted that the spread of the internet would lead the retail industry toward a state of perfect competition, or a Bertrand equilibrium [2]. For instance, The Economist stated in 1990 that “[t]he explosive growth of the Internet promises a new age of perfectly competitive markets. With perfect information about prices and products at their fingertips, consumers can quickly and easily find the best deals. In this brave new world, retailers’ profit margins will be competed away, as they are all forced to price at cost” [3]. Even academic researchers argued that online markets would soon be close to perfectly competitive markets [4][5][6][7].

  • Detecting Real Estate Bubbles: A New Approach Based on the Cross-Sectional Dispersion of Property Prices

    Abstract

    We investigate the cross-sectional distribution of house prices in the Greater Tokyo Area for the period 1986 to 2009. We find that size-adjusted house prices follow a lognormal distribution except for the period of the housing bubble and its collapse in Tokyo, for which the price distribution has a substantially heavier right tail than that of a lognormal distribution. We also find that, during the bubble era, sharp price movements were concentrated in particular areas, and this spatial heterogeneity is the source of the fat upper tail. These findings suggest that, during a bubble period, prices go up prominently for particular properties, but not so much for other properties, and as a result, price inequality across properties increases. In other words, the defining property of real estate bubbles is not the rapid price hike itself but an increase in price dispersion. We argue that the shape of cross-sectional house price distributions may contain information useful for the detection of housing bubbles.

    Introduction

    Property market developments are of increasing importance to practitioners and policymakers. The financial crises of the past two decades have illustrated just how critical the health of this sector can be for achieving financial stability. For example, the recent financial crisis in the United States in its early stages reared its head in the form of the subprime loan problem. Similarly, the financial crises in Japan and Scandinavia in the 1990s were all triggered by the collapse of bubbles in the real estate market. More recently, the rapid rise in real estate prices - often supported by a strong expansion in bank lending - in a number of emerging market economies has become a concern for policymakers. Given these experiences, it is critically important to analyze the relationship between property markets, finance, and financial crisis.

  • The Emergence of Different Tail Exponents in the Distributions of Firm Size Variables

    Abstract

    We discuss a mechanism through which inversion symmetry (i.e., invariance of a joint probability density function under the exchange of variables) and Gibrat’s law generate power-law distributions with different tail exponents. Using a dataset of firm size variables, that is, tangible fixed assets K, the number of workers L, and sales Y, we confirm that these variables have power-law tails with different exponents, and that inversion symmetry and Gibrat’s law hold. Based on these findings, we argue that there exists a plane in the three-dimensional space (log K, log L, log Y), with respect to which the joint probability density function for the three variables is invariant under the exchange of variables. We provide empirical evidence suggesting that this plane fits the data well, and argue that the plane can be interpreted as the Cobb-Douglas production function, which has been extensively used in various areas of economics since it was first introduced almost a century ago.
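    The plane and its Cobb-Douglas interpretation can be written compactly (standard notation, with A, α, and β the production-function parameters):

```latex
% Invariance plane in (\log K, \log L, \log Y) space
\log Y = \log A + \alpha \log K + \beta \log L
\quad\Longleftrightarrow\quad
Y = A K^{\alpha} L^{\beta}
```

    Invariance of the joint density with respect to this plane is what ties together the distinct tail exponents of K, L, and Y.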

    Introduction

    In various phase transitions, it is universally observed that physical quantities near critical points obey power laws. For instance, in magnetic substances, the specific heat, magnetic dipole density, and magnetic susceptibility follow power laws in temperature or the magnetic field. It is also known that the cluster-size distribution of spins follows power laws. The renormalization group approach has been employed to confirm that power laws arise as critical phenomena of phase transitions [1].

  • High quality topic extraction from business news explains abnormal financial market volatility

    Abstract

    Understanding the mutual relationships between information flows and social activity in society today is one of the cornerstones of the social sciences. In financial economics, the key issue in this regard is understanding and quantifying how news of all possible types (geopolitical, environmental, social, financial, economic, etc.) affects trading and the pricing of firms in organized stock markets. In this paper we address this issue by analyzing more than 24 million news records provided by Thomson Reuters and their relationship with trading activity for 205 major stocks in the S&P US stock index. We show that the whole landscape of news affecting stock price movements can be automatically summarized via simple regularized regressions between trading activity and news information pieces decomposed, with the help of simple topic modeling techniques, into their “thematic” features. Using these methods, we are able to estimate and quantify the impact of news on trading. We introduce network-based visualization techniques to represent the whole landscape of news information associated with a basket of stocks. Examination of the words that are representative of the topic distributions confirms that our method is able to extract the significant pieces of information influencing the stock market. Our results show that one of the most puzzling stylized facts in financial economics, namely that at certain times trading volumes appear to be “abnormally large,” can be explained by the flow of news. In this sense, our results show that there is no “excess trading” when the news is genuinely novel and provides relevant financial information.

    Introduction

    Neoclassical financial economics based on the “efficient market hypothesis” (EMH) considers price movements as almost perfect instantaneous reactions to information flows. Thus, according to the EMH, price changes simply reflect exogenous news. Such news - of all possible types (geopolitical, environmental, social, financial, economic, etc.) - lead investors to continuously reassess their expectations of the cash flows that firms’ investment projects could generate in the future. These reassessments are translated into readjusted demand/supply functions, which then push prices up or down, depending on the net imbalance between demand and supply, towards a fundamental value. As a consequence, observed prices are considered the best embodiments of the present value of future cash flows. In this view, market movements are purely exogenous without any internal feedback loops. In particular, the most extreme losses occurring during crashes are considered to be solely triggered exogenously.

  • Emergence of power laws with different power-law exponents from reversal quasi-symmetry and Gibrat’s law

    Abstract

    To explore the emergence of power laws in social and economic phenomena, the authors discuss the mechanism whereby reversal quasi-symmetry and Gibrat’s law lead to power laws with different power-law exponents. Reversal quasi-symmetry is invariance under the exchange of variables in the joint PDF (probability density function). Gibrat’s law means that the conditional PDF of the growth rate of variables does not depend on the initial value. By employing empirical worldwide data for firm size, from categories such as plant assets K, the number of employees L, and sales Y in the same year, reversal quasi-symmetry, Gibrat’s laws, and power-law distributions were observed. Relations between the power-law exponents and the parameter of reversal quasi-symmetry in the same year are confirmed here for the first time. Reversal quasi-symmetry is considered not only for two variables but also for three. The authors claim the following. There is a plane in 3-dimensional space (log K, log L, log Y) with respect to which the joint PDF PJ(K, L, Y) is invariant under the exchange of variables. The plane accurately fits empirical data (K, L, Y) that follow power-law distributions. This plane is known as the Cobb-Douglas production function, Y = AK^α L^β, which is frequently hypothesized in economics.
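    The claimed plane in (log K, log L, log Y) space can be visualized with ordinary least squares of log Y on log K and log L, which recovers the Cobb-Douglas exponents. The sketch below is an illustration on synthetic data with hypothetical parameter values, not the paper's symmetry-based procedure:

```python
import math
import random

random.seed(0)

# Hypothetical Cobb-Douglas parameters, used only to generate test data.
A, alpha, beta = 2.0, 0.3, 0.7
n = 5_000
K = [math.exp(random.gauss(0.0, 1.0)) for _ in range(n)]
L = [math.exp(random.gauss(0.0, 1.0)) for _ in range(n)]
Y = [A * k ** alpha * l ** beta * math.exp(random.gauss(0.0, 0.1))
     for k, l in zip(K, L)]

def ols_plane(K, L, Y):
    """OLS of log Y on log K and log L: the best-fitting plane in
    (log K, log L, log Y) space; returns [log A, alpha, beta]."""
    rows = [(1.0, math.log(k), math.log(l)) for k, l in zip(K, L)]
    ys = [math.log(v) for v in Y]
    # Normal equations X'X b = X'y, solved by Gauss-Jordan elimination.
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * v for r, v in zip(rows, ys)) for i in range(3)]
    for i in range(3):
        p = xtx[i][i]
        xtx[i] = [v / p for v in xtx[i]]
        xty[i] /= p
        for j in range(3):
            if j != i:
                f = xtx[j][i]
                xtx[j] = [a - f * b for a, b in zip(xtx[j], xtx[i])]
                xty[j] -= f * xty[i]
    return xty

logA_hat, a_hat, b_hat = ols_plane(K, L, Y)
print(round(a_hat, 2), round(b_hat, 2))  # close to alpha = 0.3, beta = 0.7
```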

    Introduction

    In various phase transitions, it has been universally observed that physical quantities near critical points obey power laws. For instance, in magnetic substances, the specific heat, magnetization, and magnetic susceptibility follow power laws in temperature or magnetic field. We also know that the cluster-size distribution of spins follows a power law. Renormalization group methods explain these power laws as critical phenomena of phase transitions [1].

  • A New Method for Measuring Tail Exponents of Firm Size Distributions

    Abstract

    We propose a new method for estimating the power-law exponents of firm size variables. Our focus is on how to empirically identify the range over which a firm size variable follows a power-law distribution. As is well known, a firm size variable follows a power-law distribution only beyond some threshold. On the other hand, in almost all empirical exercises, the right end of the distribution deviates from a power law due to finite size effects. We modify the method proposed by Malevergne et al. (2011) so that we can identify both the lower and the upper thresholds and then estimate the power-law exponent using only the observations in the range defined by the two thresholds. We apply this new method to various firm size variables, including annual sales, the number of workers, and tangible fixed assets for firms in more than thirty countries.
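    As a rough, simplified illustration of the general idea of data-driven threshold selection (not the authors' actual test, which modifies Malevergne et al. 2011), the sketch below picks the lower cutoff by minimizing the Kolmogorov-Smirnov distance between the empirical tail and a fitted power law, in the spirit of Clauset et al. (2009); the upper-threshold step is omitted. All names and parameters are ours:

```python
import math
import random

def hill(tail, x_min):
    """Maximum-likelihood power-law exponent for observations above x_min."""
    return len(tail) / sum(math.log(x / x_min) for x in tail)

def ks_distance(tail, x_min, mu):
    """Kolmogorov-Smirnov distance between empirical and fitted tail CDFs."""
    tail = sorted(tail)
    n = len(tail)
    return max(abs((i + 1) / n - (1.0 - (x_min / x) ** mu))
               for i, x in enumerate(tail))

def pick_lower_threshold(xs, candidates):
    """Return (ks, x_min, mu) for the candidate cutoff with the best fit."""
    best = None
    for x_min in candidates:
        tail = [x for x in xs if x > x_min]
        if len(tail) < 50:
            continue
        mu = hill(tail, x_min)
        d = ks_distance(tail, x_min, mu)
        if best is None or d < best[0]:
            best = (d, x_min, mu)
    return best

# Synthetic firm sizes: a non-power-law body below 2.0, a Pareto tail above it.
random.seed(1)
body = [random.uniform(0.0, 2.0) for _ in range(2_000)]
tail = [2.0 * (1.0 - random.random()) ** (-1.0 / 1.5) for _ in range(8_000)]
d, x_min, mu = pick_lower_threshold(body + tail, [0.5, 1.0, 1.5, 2.0, 2.5])
print(x_min, round(mu, 2))  # cutoff at (or above) 2.0, mu near 1.5
```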

    Introduction

    Power-law distributions are frequently observed in social phenomena (e.g., Pareto (1897); Newman (2005); Clauset et al. (2009)). One of the most famous examples in economics is the fact that personal income follows a power law, which was first found by Pareto (1897) about a century ago and is thus referred to as the Pareto distribution. Specifically, the probability that personal income x is above x0 is given by

    P_>(x) ∝ x^(−µ)   for x > x0,

    where µ is referred to as the Pareto exponent or the power-law exponent.
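    As an illustration, the exponent µ above can be estimated from the observations exceeding the threshold by maximum likelihood (the Hill estimator). The sketch below uses synthetic Pareto data with a known x0; the variable names are ours, for illustration only:

```python
import math
import random

def hill_estimator(xs, x0):
    """Maximum-likelihood (Hill) estimate of mu for P_>(x) ~ x^(-mu), x > x0."""
    tail = [x for x in xs if x > x0]
    return len(tail) / sum(math.log(x / x0) for x in tail)

# Synthetic Pareto sample: if U ~ Uniform(0,1], then x0 * U^(-1/mu) is
# Pareto-distributed with exponent mu.
random.seed(0)
mu_true, x0 = 1.5, 1.0
sample = [x0 * (1.0 - random.random()) ** (-1.0 / mu_true)
          for _ in range(100_000)]

print(round(hill_estimator(sample, x0), 2))  # close to mu_true = 1.5
```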

  • The Shape of Mega-Firms’ Production Functions: Analytical Method and Applications

    Abstract

    This paper proposes a method for selecting the shape of the production function. Firms of all sizes exist, from micro-enterprises run by a handful of employees to giant corporations employing hundreds of thousands. The firm size distribution describes how many firms of each size exist, and each of the firm-size variables Y (output), K (capital), and L (labor) is known to follow a power-law distribution. We propose a method that identifies the shape of the production function by exploiting the relationship between two functions: the firm size distribution and the production function. Specifically, starting from the functional forms of the distributions of K and L observed in the data, we derive the distribution of Y that would obtain if the production function took a particular shape, and compare it with the distribution of Y observed in the data. Applying this method to 25 countries including Japan, we find that in most countries and industries the Cobb-Douglas form is consistent with the distributions of Y, K, and L. Moreover, the firms that form the tail of the distribution of Y, i.e., the giant firms, tend to have an extremely large Y because their inputs of K and L are extremely large; we find no tendency for Y to be extremely large because total factor productivity is extremely high.

    Introduction

    Various shapes have been proposed for firms’ production functions, such as the Cobb-Douglas and Leontief forms, and they are widely used by both micro and macro researchers. For example, studies of macro-level productivity widely use the Cobb-Douglas production function and estimate total factor productivity from it. But why can the relationship between output Y, capital K, and employment L be represented by the particular functional form of the Cobb-Douglas type? Under what circumstances is it appropriate? Few studies go so far as to examine these questions. In many empirical studies, the choice is made expediently by trying several shapes of production function and selecting the one with the best regression fit.

  • Are Online Auction Prices Unfair?

    Abstract

    Why are prices sticky? Arthur Okun explained that customers regard raising prices when demand increases as unfair, so firms and stores, fearing their customers’ anger, do not raise prices. For example, switching the price tags on shovels to exploit the surge in demand on a snowy day is unfair. To test whether this fairness hypothesis also applies to online auction markets, this paper analyzes changes in the price of face masks on the Yahoo! Auctions market during the 2009 swine flu scare. The ratio of successful bids for masks (the number of successful bids divided by the number of listings) rose above 80 percent in early May and in late August, indicating that demand was concentrated in those periods. The former is when the first “suspected infection” case appeared in Japan, and the latter is when the government declared that a full-scale epidemic had begun. In the May episode, sellers raised both the “starting” price (the price at which bidding begins) and the “buy-it-now” price (the price at which a bidder can win the item immediately without an auction). In particular, the buy-it-now price was raised substantially more than the starting price, apparently with the intention of steering the final price upward. In the August episode, by contrast, starting prices were raised modestly, but buy-it-now prices were not. The difference between May and August stems from differences in the sellers’ attributes: in the May episode the sellers were mainly individuals, whereas in the August episode they were mainly firms. Firms, mindful of their reputation among buyers, can be interpreted as having refrained from raising prices to exploit the increase in demand. Okun stressed the importance of distinguishing between customer markets, in which sellers and buyers have long-term relationships, and auction markets, which lack such relationships, and argued that the fairness hypothesis applies only to the former. Our results show that, from the viewpoint of fairness, online auction markets have properties close to those of customer markets.

    Introduction

    According to a survey of firms conducted in the spring of 2008 by the Research Center for Price Dynamics at Hitotsubashi University, 90 percent of respondents answered that they do not immediately change their shipping prices in response to fluctuations in demand or costs. Microeconomics teaches that when the demand or supply curve shifts, the equilibrium moves to a new intersection and the price changes immediately. In reality, however, firms do not adjust their prices instantly even when the demand and cost environment surrounding them changes. This phenomenon is known as price rigidity or price stickiness. Price rigidity is a concept at the core of macroeconomics: it is because prices do not adjust instantaneously that fluctuations in unemployment and capacity utilization arise.

  • On the Evolution of the House Price Distribution

    Abstract

    Is the cross-sectional distribution of house prices close to a (log)normal distribution, as is often assumed in empirical studies on house price indexes? How does the distribution evolve over time? To address these questions, we investigate the cross-sectional distribution of house prices in the Greater Tokyo Area. We find that house prices (Pi) are distributed with much fatter tails than a lognormal distribution and that the tail is quite close to that of a power-law distribution. We also find that house sizes (Si) follow an exponential distribution. These findings imply that size-adjusted house prices, defined by lnPi − aSi, should be normally distributed. We find that this is indeed the case for most of the sample period, but not for the bubble era, during which the price distribution has a fat upper tail even after adjusting for size. The bubble was concentrated in particular areas in Tokyo, and this is the source of the fat upper tail.
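    The logic of the abstract can be reproduced on synthetic data: if house sizes Si are exponential and lnPi equals aSi plus Gaussian noise, then lnPi inherits an exponential (hence fat) upper tail, so Pi is power-law distributed, while the size-adjusted lnPi − aSi is Gaussian. The parameter values below are hypothetical, chosen only for illustration:

```python
import math
import random

random.seed(2)
a, mu, sigma = 0.1, 3.0, 0.4   # hypothetical parameters
mean_size = 20.0               # hypothetical mean house size

# S_i ~ exponential, ln P_i = a * S_i + Gaussian noise.
S = [random.expovariate(1.0 / mean_size) for _ in range(50_000)]
lnP = [a * s + random.gauss(mu, sigma) for s in S]
adj = [p - a * s for p, s in zip(lnP, S)]   # size-adjusted log price

def excess_kurtosis(xs):
    """Zero for a Gaussian; positive for fat-tailed distributions."""
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m) ** 2 for x in xs) / n
    k4 = sum((x - m) ** 4 for x in xs) / n
    return k4 / v ** 2 - 3.0

print(round(excess_kurtosis(lnP), 1))   # clearly positive: fat tails
print(round(excess_kurtosis(adj), 1))   # near zero: Gaussian after adjustment
```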

    Introduction

    Researchers on house prices typically start their analysis by producing a time series of the mean of prices across different housing units in a particular region by, for example, running a hedonic or repeat-sales regression. In this paper, we pursue an alternative research strategy: we look at the entire distribution of house prices across housing units in a particular region at a particular point in time and then investigate the evolution of such cross-sectional distributions over time. We seek to describe price dynamics in the housing market not merely by changes in the mean but by changes in some key parameters that fully characterize the entire cross-sectional price distribution.

  • Closely Competing Firms and Price Adjustment: Some Findings from an Online Marketplace

    Abstract

    We investigate retailers’ price setting behavior using a unique dataset containing by-the-second records of prices offered by closely competing retailers on a major Japanese price comparison website. First, we find that, when the average price of a product across retailers falls rapidly, the frequency of price adjustments increases, and the size of price adjustments becomes larger. Second, we find positive autocorrelation in the frequency of price adjustments, implying that there tends to be clustering where price adjustments occur in succession. In contrast, there is no such autocorrelation in the size of price adjustments. These two findings indicate that the behavior of competing retailers is characterized by state-dependent pricing rather than time-dependent pricing.

    Introduction

    Since the seminal study by Bils and Klenow (2004), there has been extensive research on price stickiness using micro price data. One vein of research along these lines concentrates on price adjustment events and examines the frequency with which such events occur. An important finding of such studies is that price adjustment events occur quite frequently. Using raw data of the U.S. consumer price index (CPI), Bils and Klenow (2004) report that the median frequency of price adjustments is 4.3 months. Using the same U.S. CPI raw data, Nakamura and Steinsson (2008) report that when sales are excluded, prices are adjusted with a frequency of once every 8 to 11 months. Similar studies focusing on other countries include Dhyne et al. (2006) for the euro area and Higo and Saita (2007) for Japan.

  • On the Evolution of the House Price Distribution

    Abstract

    Is the cross-sectional distribution of house prices close to a (log)normal distribution, as is often assumed in empirical studies on house price indexes? How does it evolve over time? What does it look like during periods of housing bubbles? To address these questions, we investigate the cross-sectional distribution of house prices in the Greater Tokyo Area. Using a unique dataset containing individual listings in a widely circulated real estate advertisement magazine from 1986 to 2009, we find the following. First, the house price, Pit, is characterized by a distribution with much fatter tails than a lognormal distribution, and the tail part is quite close to that of a power-law or a Pareto distribution. Second, the size of a house, Si, follows an exponential distribution. These two findings about the distributions of Pit and Si imply that the price distribution conditional on house size, i.e., Pr(Pit | Si), follows a lognormal distribution. We confirm this by showing that size-adjusted prices indeed follow a lognormal distribution, except for periods of the housing bubble in Tokyo, when the price distribution remains asymmetric and skewed to the right even after controlling for the size effect.

    Introduction

    Research on house prices typically starts by producing a time series of the mean of prices across housing units in a particular region by, for example, running a hedonic regression or adopting a repeat-sales method. In this paper, we propose an alternative research strategy: we look at the entire distribution of house prices across housing units in a particular region at a particular point in time, and then investigate the evolution of such cross-sectional distributions over time. We seek to describe price dynamics in a housing market not merely by changes in the mean but by changes in some key parameters that fully characterize the entire cross-sectional price distribution. Our ultimate goal is to produce a new housing price index based on these key parameters.

  • Sales Distribution of Consumer Electronics

    Abstract

    Using the uniformly most powerful unbiased test, we examine the daily sales distribution of consumer electronics in Japan and report that it follows either a lognormal or a power-law distribution, depending on the state of the market, and that switches between the two occur quite often. The underlying sales dynamics in both regimes nicely match a multiplicative process. However, while the multiplicative term in the process displays a size-dependent relationship when a steady lognormal distribution holds, it shows a size-independent relationship when the power-law distribution holds. This difference in the underlying dynamics is responsible for the difference between the two observed distributions.
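    One standard toy mechanism behind this lognormal/power-law dichotomy (not necessarily the authors' model) is a multiplicative process: multiplicative growth alone yields a lognormal distribution, while adding a lower reflecting barrier makes the stationary distribution of log sizes exponential, i.e., a power law in sizes. A minimal simulation with hypothetical parameters:

```python
import math
import random

def gibrat(n, steps, barrier=None, drift=-0.02, vol=0.2, seed=3):
    """Multiplicative (Gibrat) growth x -> x * exp(g), with an optional
    lower reflecting barrier that produces a power-law steady state."""
    rng = random.Random(seed)
    xs = [1.0] * n
    for _ in range(steps):
        xs = [x * math.exp(rng.gauss(drift, vol)) for x in xs]
        if barrier is not None:
            xs = [max(x, barrier) for x in xs]
    return xs

def skewness(xs):
    """Zero for Gaussian log sizes; about 2 for exponential log sizes."""
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m) ** 2 for x in xs) / n
    return sum((x - m) ** 3 for x in xs) / n / v ** 1.5

# Log sizes are Gaussian without a barrier (lognormal sizes) but
# right-skewed exponential (power-law sizes) with one.
free = [math.log(x) for x in gibrat(4_000, 800)]
refl = [math.log(x) for x in gibrat(4_000, 800, barrier=0.5)]
print(round(skewness(free), 1))  # near 0
print(round(skewness(refl), 1))  # clearly positive
```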

    Introduction

    Since Pareto pointed out in 1896 that the distribution of income exhibits a heavy-tailed structure [1], many papers have argued that such distributions can be found in a wide range of empirical data describing not only economic phenomena but also biological, physical, ecological, sociological, and various man-made phenomena [2]. The list of quantities whose distributions have been conjectured to obey such distributions includes firm sizes [3], city populations [4], frequencies of unique words in a given novel [5, 6], biological genera by number of species [7], scientists by number of published papers [8], web files transmitted over the internet [9], book sales [10], and product market shares [11]. Along with these reports, the argument over the exact distribution, that is, whether these heavy-tailed distributions obey a lognormal or a power-law distribution, has likewise continued for many years [2]. In this paper we use the statistical techniques developed in this literature to clarify the sales distribution of consumer electronics.

  • Competing Firms and Price Adjustment: Evidence from an Online Marketplace

    Abstract

    We investigate retailers’ price setting behavior, and in particular strategic interaction between retailers, using a unique dataset containing by-the-second records of prices offered by competing retailers on a major Japanese price comparison website. First, we find that, when the average price of a product across retailers falls rapidly, the frequency of price adjustments is high, while the size of adjustments remains largely unchanged. Second, we find a positive autocorrelation in the frequency of price adjustments, implying that there tends to be a clustering where once a price adjustment occurs, such adjustments occur in succession. In contrast, there is no such autocorrelation in the size of price adjustments. These two findings indicate that the behavior of competing retailers is characterized by state-dependent pricing, rather than time-dependent pricing, especially when prices fall rapidly, and that strategic complementarities play an important role when retailers decide to adjust (or not to adjust) their prices.

    Introduction

    Since Bils and Klenow’s (2004) seminal study, there has been extensive research on price stickiness using micro price data. One vein of research along these lines concentrates on price adjustment events and examines the frequency with which such events occur. An important finding of such studies is that price adjustment events occur quite frequently. For example, using raw data of the U.S. consumer price index (CPI), Bils and Klenow (2004) report that the median frequency of price adjustments is 4.3 months. Using the same U.S. CPI raw data, Nakamura and Steinsson (2008) report that when sales are excluded, prices are adjusted with a frequency of once every 8 to 11 months. Similar studies focusing on other countries include Dhyne et al. (2006) for the euro area and Higo and Saita (2007) for Japan.

  • Real Price Rigidity: Measurement Methods and Applications

    Abstract

    This paper proposes a method for measuring, via autocorrelation coefficients, the price stickiness that arises when firms imitate each other’s price-setting behavior, and measures its degree using data from an online marketplace. Studies since Bils and Klenow (2004) have estimated price stickiness as the average time elapsed between one price change and the next; for the LCD televisions analyzed in this paper, that value is 1.9 days. By contrast, measurement using autocorrelation coefficients shows that price adjustment events exhibit history dependence of up to six days. In other words, each store makes an average of three price revisions before a price adjustment is complete. As a result of imitative behavior across stores, the size of each individual price change becomes smaller, and the time required to complete a price adjustment therefore becomes longer. Because previous studies have ignored the history dependence of price adjustment events, they may have underestimated price stickiness.
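    The history dependence of price-adjustment events can be measured as the autocorrelation of a binary event series (1 if the price changed in a given period, 0 otherwise). A minimal sketch on a synthetic series with built-in clustering; the clustering mechanism and probabilities are ours, for illustration only:

```python
import random

def autocorr(xs, lag):
    """Sample autocorrelation of a series at a given lag."""
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m) ** 2 for x in xs) / n
    c = sum((xs[i] - m) * (xs[i + lag] - m) for i in range(n - lag)) / (n - lag)
    return c / v

# Synthetic daily adjustment events: an adjustment today raises the
# probability of another adjustment tomorrow (clustering).
random.seed(5)
events = [0]
for _ in range(50_000):
    p = 0.6 if events[-1] == 1 else 0.1
    events.append(1 if random.random() < p else 0)

print(round(autocorr(events, 1), 2))  # positive (≈ 0.5): adjustments cluster
```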

    Introduction

    Since Bils and Klenow (2004), research measuring price stickiness with micro price data has been active. This line of research focuses on the fact that prices do not change continuously from moment to moment but are changed infrequently, say once every few weeks or months, and examines the frequency with which such price adjustment events occur. The main finding is that price adjustment events occur quite frequently. For example, Bils and Klenow (2004), using raw U.S. CPI data, report that prices are changed once every 4.3 months. Nakamura and Steinsson (2008), using the same raw U.S. CPI data, estimate that once temporary sales are accounted for, prices are changed once every 8 to 11 months. Studies of European countries by Dhyne et al. (2006) and of Japan by Higo and Saita (2007) likewise report that prices are adjusted roughly once every few months.

  • Real Rigidities: Evidence from an Online Marketplace

    Abstract

    Are prices sticky due to the presence of strategic complementarity in price setting? If so, to what extent? To address these questions, we investigate retailers’ price setting behavior, and in particular strategic interaction between retailers, using a unique dataset containing by-the-second records of prices offered by retailers on a major Japanese price comparison website. We focus on fluctuations in the lowest price among retailers, rather than the average price, examining how quickly the lowest price is updated in response to changes in marginal costs. First, we find that, when the lowest price falls rapidly, the frequency of changes in the lowest price is high, while the size of downward price adjustments remains largely unchanged. Second, we find a positive autocorrelation in the frequency of changes in the lowest price, and that there tends to be a clustering where once a change in the lowest price occurs, such changes occur in succession. In contrast, there is no such autocorrelation in the size of changes in the lowest price. These findings suggest that retailers imitate each other when deciding to adjust (or not to adjust) their prices, and that the extensive margin plays a much more important role than the intensive margin in such strategic complementarity in price setting.

    Introduction

    Since Bils and Klenow’s (2004) seminal study, there has been extensive research on price stickiness using micro price data. One vein of research along these lines concentrates on price adjustment events and examines the frequency with which such events occur. An important finding of such studies is that price adjustment events occur quite frequently. For example, using raw data of the U.S. consumer price index (CPI), Bils and Klenow (2004) report that the median frequency of price adjustments is 4.3 months. Using the same U.S. CPI raw data, Nakamura and Steinsson (2008) report that when sales are excluded, prices are adjusted with a frequency of once every 8 to 11 months. Similar studies focusing on other countries include Dhyne et al. (2006) for the euro area and Higo and Saita (2007) for Japan.

  • A statistical analysis of product prices in online markets

    Abstract

    We empirically investigate fluctuations in product prices in online markets using tick-by-tick price data collected from a Japanese price comparison site, and find some similarities and differences between product and asset prices. The average price of a product across e-retailers behaves almost like a random walk, although the probability of a price increase/decrease is higher conditional on multiple preceding price increases/decreases. This is quite similar to the property reported by previous studies for asset prices. However, we fail to find a long memory property in the volatility of product price changes. Also, we find that the price change distribution for product prices is close to an exponential distribution, rather than a power-law distribution. These two findings are in sharp contrast with previous results regarding asset prices. We propose the interpretation that these differences may stem from the absence of speculative activity in product markets; namely, e-retailers, unlike traders in asset markets, seldom repeatedly buy and sell a product.
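    The contrast between an exponential and a power-law tail can be made visible with a simple quantile-ratio diagnostic on synthetic data: the ratio of extreme quantiles is modest when the tail decays exponentially and much larger when it decays as a power law. This is only an illustrative discriminator, not the tests used in the paper:

```python
import random

random.seed(4)

# Synthetic absolute "price changes": an exponential tail (as found here for
# product prices) versus a Pareto tail (as often reported for asset prices).
expo = [random.expovariate(1.0) for _ in range(20_000)]
pareto = [(1.0 - random.random()) ** (-1.0 / 3.0) for _ in range(20_000)]

def tail_ratio(xs, q1=0.99, q2=0.999):
    """Ratio of the q2 to the q1 empirical quantile."""
    xs = sorted(xs)
    n = len(xs)
    return xs[int(q2 * n)] / xs[int(q1 * n)]

print(round(tail_ratio(expo), 2))    # modest for an exponential tail
print(round(tail_ratio(pareto), 2))  # larger for a power-law tail
```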

    Introduction

    In recent years, price comparison sites have attracted the attention of internet users. In these sites, e-retailers update their selling prices every minute, or even every second. Those who visit the sites can compare prices quoted by different e-retailers, thus finding the cheapest one without paying any search costs. E-retailers seek to attract as many customers as possible by offering good prices to them, and this sometimes results in a price war among e-retailers.

  • A Statistical Analysis of Price Fluctuations in Online Markets

    Abstract

    Using a new dataset that records, at one-second intervals, the prices posted by virtual stores on the price comparison site Kakaku.com (価格.com) and consumers’ click behavior in response to them, this paper analyzes stores’ price-setting behavior and consumers’ purchasing behavior. The main findings are as follows. First, even when a store’s price rank (how low its price is at that moment relative to other stores) is not first, the probability that clicks occur is not zero. However, the click probability declines as the price rank falls, and there is a nearly linear relationship between the price rank and (the logarithm of) the click probability. This linear relationship suggests that consumers have preferences over stores and choose the store offering the lowest price within their preferred set of stores. Second, the average of the prices posted by the stores follows a random walk with drift. This indicates that most price fluctuations are driven by random changes in the inventories held by the stores. However, deviations from the random walk are observed in phases such as sharp price declines, suggesting that strategic complementarity in the stores’ pricing may trigger price collapses.

    Introduction

    The prediction that the spread of the Internet would fundamentally change our lives appears to be rapidly losing support. It is true that consumer and firm behavior has changed in the online society and will continue to change, but the change has not been as great as was expected when the Internet first took hold.
