Ryohei Hisano

  • Gaussian Hierarchical Latent Dirichlet Allocation: Bringing Polysemy Back

    Abstract

Topic models are widely used to discover the latent representation of a set of documents. The two canonical models are latent Dirichlet allocation and Gaussian latent Dirichlet allocation: the former uses multinomial distributions over words as the latent topic representations, while the latter uses multivariate Gaussian distributions over pre-trained word embedding vectors. Compared with latent Dirichlet allocation, Gaussian latent Dirichlet allocation is limited in the sense that it does not capture the polysemy of a word such as “bank.” In this paper, we show that Gaussian latent Dirichlet allocation can recover the ability to capture polysemy by introducing a hierarchical structure in the set of topics that the model can use to represent a given document. Our Gaussian hierarchical latent Dirichlet allocation significantly improves polysemy detection compared with Gaussian-based models and provides more parsimonious topic representations compared with hierarchical latent Dirichlet allocation. Our extensive quantitative experiments show that our model also achieves better topic coherence and held-out document predictive accuracy across a wide range of corpora and word embedding vectors.
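The polysemy problem described above can be seen in a toy sketch. Here a topic in the Gaussian setting is a multivariate Gaussian fitted to word embedding vectors; the embeddings, cluster locations, and the word “bank” placed between two senses are all hypothetical illustrations, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D "embeddings": finance words cluster near (1, 0), river words near (0, 1).
finance_words = rng.normal(loc=[1.0, 0.0], scale=0.1, size=(50, 2))
river_words = rng.normal(loc=[0.0, 1.0], scale=0.1, size=(50, 2))

def gaussian_topic(vectors):
    """A topic in Gaussian LDA: a multivariate Gaussian over embeddings."""
    mu = vectors.mean(axis=0)
    cov = np.cov(vectors, rowvar=False) + 1e-6 * np.eye(vectors.shape[1])
    return mu, cov

def log_density(x, mu, cov):
    """Log-density of embedding x under one Gaussian topic."""
    d = len(mu)
    diff = x - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d * np.log(2 * np.pi) + logdet
                   + diff @ np.linalg.solve(cov, diff))

finance_topic = gaussian_topic(finance_words)
river_topic = gaussian_topic(river_words)

# A polysemous "bank" sits between the two senses: it scores poorly under
# both tight Gaussian topics, which is the limitation the paper addresses.
bank = np.array([0.5, 0.5])
print(log_density(bank, *finance_topic), log_density(bank, *river_topic))
```

Because the single vector for “bank” cannot sit inside both sense clusters at once, a flat set of Gaussian topics assigns it low probability everywhere; the paper's hierarchical structure is one way to let topics share such words.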

    Introduction

    Topic models are widely used to identify the latent representation of a set of documents. Since latent Dirichlet allocation (LDA) [4] was introduced, topic models have been used in a wide variety of applications. Recent work includes the analysis of legislative text [24], detection of malicious websites [33], and analysis of the narratives of dermatological disease [23]. The modular structure of LDA, and graphical models in general [17], has made it possible to create various extensions to the plain vanilla version. Significant works include the correlated topic model (CTM), which incorporates the correlation among topics that co-occur in a document [6]; hierarchical LDA (hLDA), which jointly learns the underlying topic and the hierarchical relational structure among topics [3]; and the dynamic topic model, which models the time evolution of topics [7].
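The generative process of plain LDA [4], which the extensions above build on, can be sketched in a few lines. The vocabulary, hyperparameters, and document length below are hypothetical toy choices:

```python
import numpy as np

rng = np.random.default_rng(1)
vocab = ["market", "stock", "price", "gene", "cell", "protein"]
K, V, alpha, beta = 2, len(vocab), 0.5, 0.1

# Topic-word distributions: phi_k ~ Dirichlet(beta), one per topic.
phi = rng.dirichlet([beta] * V, size=K)

def generate_document(n_words):
    # Per-document topic mixture: theta ~ Dirichlet(alpha).
    theta = rng.dirichlet([alpha] * K)
    words = []
    for _ in range(n_words):
        z = rng.choice(K, p=theta)    # draw a topic assignment
        w = rng.choice(V, p=phi[z])   # draw a word from that topic
        words.append(vocab[w])
    return words

doc = generate_document(10)
print(doc)
```

The modular structure mentioned above is visible here: CTM replaces the Dirichlet over theta with a logistic normal, hLDA replaces the flat topic set with a tree, and the dynamic topic model lets phi drift over time.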

    WP017

  • Predicting Adverse Media Risk using a Heterogeneous Information Network

    Abstract

The media plays a central role in monitoring powerful institutions and identifying any activities harmful to the public interest. In the investing sphere, comprising the 46,583 officially listed domestic firms on stock exchanges worldwide, there is a growing interest in “doing the right thing”, i.e., in putting pressure on companies to improve their environmental, social and governance (ESG) practices. However, how can we overcome the sparsity of ESG data from non-reporting firms, and how can we identify the relevant information in the annual reports of this large universe? Here, we construct a vast heterogeneous information network that covers the necessary information surrounding each firm, assembled from seven professionally curated datasets and two open datasets, resulting in about 50 million nodes and 400 million edges in total. Exploiting this heterogeneous information network, we propose a model that can learn from past adverse media coverage patterns and predict the occurrence of future adverse media coverage events across the whole universe of firms. Our approach is tested using the adverse media coverage data of more than 35,000 firms worldwide from January 2012 to May 2018. Compared with state-of-the-art methods with and without the network, we show that the predictive accuracy is substantially improved when the heterogeneous information network is used. This work suggests new ways to consolidate the diffuse information contained in big data in order to monitor dominant institutions on a global scale for more socially responsible investment, better risk management, and the surveillance of powerful institutions.
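A heterogeneous information network of the kind described above is simply a graph with typed nodes and typed edges. The node names and edge types in this minimal sketch are hypothetical stand-ins, not the paper's actual schema:

```python
from collections import defaultdict

# Typed edges (source, edge_type, target); node and edge names are illustrative.
edges = [
    ("FirmA", "supplies", "FirmB"),
    ("FirmB", "supplies", "FirmC"),
    ("FirmA", "located_in", "CountryX"),
    ("FirmC", "located_in", "CountryX"),
    ("FirmB", "covered_by", "AdverseArticle1"),
]

# Adjacency lists keyed by node; inverse edges get a suffixed type so that
# paths can be traversed in both directions.
adj = defaultdict(list)
for src, etype, dst in edges:
    adj[src].append((etype, dst))
    adj[dst].append((etype + "_inv", src))

def neighbors(node, etype):
    """Nodes reachable from `node` over one edge of the given type."""
    return [dst for t, dst in adj[node] if t == etype]

# Firms one "supplies" hop away from FirmA:
print(neighbors("FirmA", "supplies"))
```

Chaining such typed hops (e.g. firm, supplies, firm, covered_by, article) yields meta-paths, the basic features that network-aware predictive models of adverse coverage can learn from.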

    Introduction

    Adverse media coverage sometimes leads to fatal results for a company. In the press release sent out by Cambridge Analytica on May 2, 2018, the company wrote that “Cambridge Analytica has been the subject of numerous unfounded accusations, ... media coverage has driven away virtually all of the company’s customers and suppliers” [5]. This is just one recent example highlighting the impact of adverse media coverage on a firm’s fate. In another example, the impact of adverse media coverage on Swiss bank profits was estimated to be 3.35 times the median annual net profit of small banks and 0.73 times that of large banks [3]. These numbers are significant, indicating how adverse media coverage can cause huge damage to a bank. Moreover, a new factor, priced as the “no media coverage premium” [10], has been identified to help explain financial returns: stocks with no media coverage earn higher returns than stocks with high media coverage. Within the rational-agent framework, this may result from impediments to trade and/or from lower investor recognition leading to lower diversification [10]. Another mechanism could be associated with the fact that most of the coverage of mass media is negative [15, 23].


    WP004

  • The Gradual Evolution of Buyer-Seller Networks and Their Role in Aggregate Fluctuations

    Abstract

Buyer–seller relationships among firms can be regarded as a longitudinal network in which the connectivity pattern evolves as each firm receives productivity shocks. Based on a data set describing the evolution of buyer–seller links among 55,608 firms over a decade and on structural equation modeling, we find some evidence that interfirm networks evolve to reflect firms’ local decisions to mitigate adverse effects from neighboring firms through interfirm linkages, while enjoying positive effects from them. As a result, link renewal tends to have a positive impact on the growth rates of firms. We also investigate the role of networks in aggregate fluctuations.

    Introduction

The interfirm buyer–seller network is important from both the macroeconomic and the microeconomic perspectives. From the macroeconomic perspective, this network represents a form of interconnectedness in an economy that allows firm-level idiosyncratic shocks to propagate to other firms. Previous studies have suggested that this propagation mechanism interferes with the averaging-out process of shocks and may affect macroeconomic variables such as aggregate fluctuations (Acemoglu, Ozdaglar and Tahbaz-Salehi (2013), Acemoglu et al. (2012), Carvalho (2014), Carvalho (2007), Shea (2002), Foerster, Sarte and Watson (2011) and Malysheva and Sarte (2011)). From the microeconomic perspective, a network at a particular point in time is the result of each firm's link renewal decisions, made in order to avoid negative shocks from, or share positive shocks with, its neighboring firms. These two views are related in that both concern the propagation of shocks. The former stresses that idiosyncratic shocks propagate through a static network, while the latter provides a more dynamic view in which firms can renew their link structure in order to share or avoid shocks. It is not clear, however, how the latter view affects the former. Does link renewal increase aggregate fluctuations because firms form new links that convey positive shocks, does it decrease them because firms sever links that convey negative shocks, or does it have a different effect altogether?
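The static, macroeconomic view above can be sketched with a Leontief-type propagation, x = (I - W)^(-1) eps, where W is the input-output network and eps the idiosyncratic shocks. The two 100-firm economies below are hypothetical, chosen only to show when the averaging-out of shocks fails:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100

def aggregate_volatility(W, trials=2000):
    """Std. dev. of economy-wide average output when idiosyncratic
    shocks eps propagate through the network: x = (I - W)^{-1} eps."""
    L = np.linalg.inv(np.eye(n) - W)
    eps = rng.normal(size=(trials, n))
    agg = (eps @ L.T).mean(axis=1)
    return agg.std()

# Balanced network: every firm buys a little from every other firm,
# so shocks average out across the n firms.
W_balanced = np.full((n, n), 0.5 / n)

# Star network: every firm buys half its inputs from firm 0,
# so firm 0's shocks hit the whole economy and do not average out.
W_star = np.zeros((n, n))
W_star[:, 0] = 0.5

print(aggregate_volatility(W_balanced), aggregate_volatility(W_star))
```

In this sketch the network is fixed; the paper's microeconomic view adds the missing ingredient, namely that firms rewire W itself in response to the shocks they receive.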

  • High quality topic extraction from business news explains abnormal financial market volatility

    Abstract

Understanding the mutual relationships between information flows and social activity in society today is one of the cornerstones of the social sciences. In financial economics, the key issue in this regard is understanding and quantifying how news of all possible types (geopolitical, environmental, social, financial, economic, etc.) affects trading and the pricing of firms in organized stock markets. In this paper we seek to address this issue by performing an analysis of more than 24 million news records provided by Thomson Reuters and of their relationship with trading activity for 205 major stocks in the S&P US stock index. We show that the whole landscape of news affecting stock price movements can be automatically summarized via simple regularized regressions between trading activity and news information pieces decomposed, with the help of simple topic modeling techniques, into their “thematic” features. Using these methods, we are able to estimate and quantify the impact of news on trading. We introduce network-based visualization techniques to represent the whole landscape of news information associated with a basket of stocks. The examination of the words that are representative of the topic distributions confirms that our method is able to extract the significant pieces of information influencing the stock market. Our results show that one of the most puzzling stylized facts in financial economics, namely that at certain times trading volumes appear to be “abnormally large,” can be explained by the flow of news. In this sense, our results show that there is no “excess trading” when the news is genuinely novel and provides relevant financial information.
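A regularized regression of trading activity on topic features, of the kind mentioned above, can be sketched on synthetic data. The data, the number of topics, and the choice of ridge regression (rather than whatever regularizer the paper actually uses) are all assumptions of this sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
n_days, n_topics = 300, 20

# Hypothetical data: daily topic intensities extracted from news, and a
# trading-volume series driven by only a few of those topics plus noise.
X = rng.normal(size=(n_days, n_topics))
true_beta = np.zeros(n_topics)
true_beta[[2, 7]] = [1.5, -1.0]          # only topics 2 and 7 move volume
y = X @ true_beta + 0.1 * rng.normal(size=n_days)

# Ridge regression, closed form: beta = (X'X + lam I)^{-1} X'y.
lam = 1.0
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_topics), X.T @ y)

# The largest coefficients identify which news themes explain trading.
top = np.argsort(-np.abs(beta_hat))[:2]
print(sorted(top.tolist()))
```

The point of the sketch is the interpretability step: once topics are estimated, a sparse or shrunk regression singles out the handful of themes whose arrival coincides with “abnormally large” volumes.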

    Introduction

    Neoclassical financial economics based on the “efficient market hypothesis” (EMH) considers price movements as almost perfect instantaneous reactions to information flows. Thus, according to the EMH, price changes simply reflect exogenous news. Such news - of all possible types (geopolitical, environmental, social, financial, economic, etc.) - lead investors to continuously reassess their expectations of the cash flows that firms’ investment projects could generate in the future. These reassessments are translated into readjusted demand/supply functions, which then push prices up or down, depending on the net imbalance between demand and supply, towards a fundamental value. As a consequence, observed prices are considered the best embodiments of the present value of future cash flows. In this view, market movements are purely exogenous without any internal feedback loops. In particular, the most extreme losses occurring during crashes are considered to be solely triggered exogenously.

  • Sales Distribution of Consumer Electronics

    Abstract

Using the uniformly most powerful unbiased (UMPU) test, we observed the sales distribution of consumer electronics in Japan on a daily basis and report that it follows either a lognormal distribution or a power-law distribution, depending on the state of the market. We show that switches between these two distributions occur quite often. The underlying sales dynamics found in both periods nicely match a multiplicative process. However, while the multiplicative term in the process displays a size-dependent relationship when a steady lognormal distribution holds, it shows a size-independent relationship when the power-law distribution holds. This difference in the underlying dynamics is responsible for the difference between the two observed distributions.

    Introduction

Since Pareto pointed out in 1896 that the distribution of income exhibits a heavy-tailed structure [1], many papers have argued that such distributions can be found in a wide range of empirical data that describe not only economic phenomena but also biological, physical, ecological, sociological, and various man-made phenomena [2]. The list of the measurements of quantities whose distributions have been conjectured to obey such distributions includes firm sizes [3], city populations [4], frequency of unique words in a given novel [5, 6], the biological genera by number of species [7], scientists by number of published papers [8], web files transmitted over the internet [9], book sales [10], and product market shares [11]. Alongside these reports, the debate over the exact distribution, namely whether these heavy-tailed distributions obey a lognormal distribution or a power-law distribution, has likewise continued for many years [2]. In this paper we use the statistical techniques developed in this literature to clarify the sales distribution of consumer electronics.
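The lognormal-versus-power-law tension above can be reproduced with two textbook multiplicative processes. The drift, volatility, and reflecting barrier below are hypothetical choices for illustration, not estimates from the paper's data:

```python
import numpy as np

rng = np.random.default_rng(4)
n_firms, steps = 5000, 400

# Pure multiplicative (Gibrat) growth with a slight negative drift:
# log-sizes follow a random walk, so the cross-section is lognormal.
s_gibrat = np.ones(n_firms)
for _ in range(steps):
    s_gibrat *= np.exp(-0.02 + 0.1 * rng.normal(size=n_firms))

# The same size-independent multiplicative shocks plus a reflecting lower
# bound (a Kesten-type mechanism): the stationary tail becomes a power law.
s_kesten = np.ones(n_firms)
for _ in range(steps):
    s_kesten *= np.exp(-0.02 + 0.1 * rng.normal(size=n_firms))
    s_kesten = np.maximum(s_kesten, 0.5)   # reflecting barrier

def skew(x):
    """Sample skewness: symmetric log-sizes suggest lognormal, a heavy
    right tail in log-sizes suggests a power law in levels."""
    z = (x - x.mean()) / x.std()
    return (z ** 3).mean()

print(skew(np.log(s_gibrat)), skew(np.log(s_kesten)))
```

Log-sizes come out roughly symmetric (Gaussian) in the first process and strongly right-skewed (exponential in logs, i.e. power-law in levels) in the second, mirroring how a change in the underlying multiplicative dynamics can switch the observed sales distribution between the two families.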
