dc.description.abstract |
Interactions between the foreign exchange market and the stock market of a country are considered an important internal force of the markets in a financially liberalized environment. If no causal relationship from one market to the other is detected, then the other market is informationally efficient, whereas the existence of causality implies that hedging exposure to one market by taking a position in the other will be effective. The temporal relationship between the forex market and the stock market of developing and developed countries has been studied, especially after the East Asian financial crisis of 1997–98, using methods such as cross-correlation, cross-spectrum analysis, and error correction models, but these methods identify only linear relations. A statistically rigorous approach to detecting interdependence between time series, including non-linear dynamic relationships, is provided by tools based on the information-theoretic concept of entropy. Entropy is the amount of disorder in a system; equivalently, it is the amount of information needed to predict the next measurement with a certain precision. The mutual information between two random variables X and Y with joint probability mass function p(x,y) and marginal mass functions p(x) and p(y) is defined as the relative entropy between the joint distribution p(x,y) and the product distribution p(x)p(y). Mutual information is the reduction in the uncertainty of X due to the knowledge of Y, and vice versa. Since mutual information measures the deviation of the variables from independence, it has been proposed as a tool to measure the relationship between financial market segments. However, mutual information is a symmetric measure and carries neither dynamic information nor a directional sense. Even time-delayed mutual information does not distinguish information actually exchanged from information shared due to a common input signal or common history, and therefore does not quantify the actual overlap of the information content of two variables. Another information-theoretic measure, transfer entropy, was introduced by Thomas Schreiber (2000) to study the relationship between dynamic systems; the concept has also been applied by some authors to study the causal structure between financial time series. |
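For concreteness, the definition of mutual information paraphrased in the abstract corresponds to the standard Kullback–Leibler form; the notation below is the conventional one, not anything quoted from the thesis itself:

    I(X;Y) = \sum_{x} \sum_{y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)} = D_{\mathrm{KL}}\big( p(x,y) \,\|\, p(x)\,p(y) \big)

Mutual information vanishes exactly when X and Y are independent, which is why it measures deviation from independence.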
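Schreiber's transfer entropy from Y to X, with embedding dimensions k and l, is conventionally written as

    T_{Y \to X} = \sum p\big(x_{t+1}, x_t^{(k)}, y_t^{(l)}\big) \log \frac{p\big(x_{t+1} \mid x_t^{(k)}, y_t^{(l)}\big)}{p\big(x_{t+1} \mid x_t^{(k)}\big)}

The sketch below is a minimal histogram-based estimator with k = l = 1; the function name, equal-width binning scheme, and toy data are illustrative assumptions, not the estimator used in the study.

    import numpy as np

    def transfer_entropy(x, y, bins=8):
        """Histogram estimate of transfer entropy T_{Y->X} in bits (k = l = 1)."""
        # Discretize both series into equal-width bins.
        xd = np.digitize(x, np.histogram_bin_edges(x, bins=bins))
        yd = np.digitize(y, np.histogram_bin_edges(y, bins=bins))
        x_next, x_past, y_past = xd[1:], xd[:-1], yd[:-1]

        # Empirical probability mass function over one or more columns.
        def pmf(*cols):
            keys, counts = np.unique(np.column_stack(cols), axis=0,
                                     return_counts=True)
            return {tuple(k): c / counts.sum() for k, c in zip(keys, counts)}

        p_xxy = pmf(x_next, x_past, y_past)  # p(x_{t+1}, x_t, y_t)
        p_xx  = pmf(x_next, x_past)          # p(x_{t+1}, x_t)
        p_xy  = pmf(x_past, y_past)          # p(x_t, y_t)
        p_x   = pmf(x_past)                  # p(x_t)

        # T_{Y->X} = sum p(x+, x, y) * log2[ p(x+ | x, y) / p(x+ | x) ]
        te = 0.0
        for (a, b, c), p in p_xxy.items():
            te += p * np.log2((p / p_xy[(b, c)]) / (p_xx[(a, b)] / p_x[(b,)]))
        return te

    # Toy check: y drives x with a one-period lag, so the estimate of
    # T_{Y->X} should exceed that of T_{X->Y}.
    rng = np.random.default_rng(0)
    y = rng.standard_normal(2000)
    x = np.empty_like(y)
    x[0] = rng.standard_normal()
    x[1:] = 0.6 * y[:-1] + 0.8 * rng.standard_normal(1999)
    print(transfer_entropy(x, y), transfer_entropy(y, x))

Unlike mutual information, this quantity is asymmetric in X and Y, which is what gives it the directional sense needed for causal analysis of market linkages. Equal-width binning is only the simplest discretization; kernel or symbolic estimators are common alternatives for heavy-tailed financial returns.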
|