Causality detection based on information-theoretic approaches in time series analysis
Hlaváčková-Schindler, Katarina; Paluš, Milan; Vejmelka, Martin and Bhattacharya, Joydeep. 2007. Causality detection based on information-theoretic approaches in time series analysis. Physics Reports, 441(1), pp. 1-46. ISSN 03701573 [Article]
Abstract
Synchronization, a basic nonlinear phenomenon, is widely observed in diverse complex systems studied in the physical, biological and other natural sciences, as well as in the social sciences, economics and finance. When studying such complex systems, it is important not only to detect synchronized states but also to identify causal relationships (i.e. who drives whom) between the (sub)systems concerned. Information-theoretic measures (e.g. mutual information, conditional entropy) are essential for analysing the information flow between two systems or between the constituent subsystems of a complex system. However, estimating these measures from a finite set of samples is not trivial. The extensive current literature on entropy and mutual information estimation offers a wide variety of approaches, ranging from statistical approximation methods that study the rate of convergence or consistency of an estimator for a general distribution, through learning algorithms operating on a partitioned data space, to heuristic approaches. The aim of this paper is to provide a detailed overview of information-theoretic approaches for measuring causal influence in multivariate time series and to focus on diverse approaches to entropy and mutual information estimation.
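To illustrate why finite-sample estimation is the central difficulty, the following is a minimal sketch of the simplest plug-in estimator of mutual information, based on equal-width histogram binning. This is a common baseline, not a method taken from the paper; the function name, bin count and test data are illustrative assumptions, and the estimator's small-sample bias is precisely the kind of problem the surveyed approaches try to address.

import numpy as np

def mutual_information_hist(x, y, bins=16):
    # Naive plug-in estimate of I(X;Y) in nats from finite samples,
    # using an equal-width 2-D histogram as the joint distribution.
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()            # estimated joint probability
    px = pxy.sum(axis=1, keepdims=True)  # marginal of X
    py = pxy.sum(axis=0, keepdims=True)  # marginal of Y
    nonzero = pxy > 0                    # avoid log(0) terms
    return float(np.sum(pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero])))

# Example: two linearly coupled noisy series (hypothetical test data)
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.6 * x + 0.8 * rng.normal(size=5000)
print(mutual_information_hist(x, y))

The estimate depends strongly on the bin count and sample size, which is why the paper devotes much of its attention to alternative estimators (adaptive partitioning, kernel and nearest-neighbour methods) with better convergence properties.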
Item Type: Article
Item ID: 4954
Date Deposited: 21 Feb 2011 14:03
Last Modified: 30 Jun 2017 13:22
Peer Reviewed: Yes, this version has been peer-reviewed.