Proceedings Volume 6601

Noise and Stochastics in Complex Systems and Finance


Volume Details

Date Published: 12 June 2007
Contents: 7 Sessions, 30 Papers, 0 Presentations
Conference: SPIE Fourth International Symposium on Fluctuations and Noise 2007
Volume Number: 6601

Table of Contents

  • Front Matter: Volume 6601
  • Structure and Communities in Networks
  • Network Structure and Function
  • Financial Fluctuations
  • Interacting Economic Systems
  • Economics and Networks
  • Poster Session
Front Matter: Volume 6601
Front Matter: Volume 6601
This PDF file contains the front matter associated with SPIE Proceedings Volume 6601, including the Title Page, Copyright information, Table of Contents, and the Conference Committee listing.
Structure and Communities in Networks
Community dynamics in social networks
Gergely Palla, Albert-László Barabási, Tamás Vicsek
We study the statistical properties of community dynamics in large social networks, where the evolving communities are obtained from subsequent snapshots of the modular structure. Such cohesive groups of people can grow by recruiting new members, or contract by losing members; two (or more) groups may merge into a single community, while a large enough social group can split into several smaller ones; new communities are born and old ones may disappear. We find a significant difference between the behaviour of smaller collaborative or friendship circles and larger communities, e.g. institutions. Social groups containing only a few members persist longer on average when the fluctuations in membership are small. In contrast, we find that the condition for stability of large communities is continuous change in their membership, allowing for the possibility that after some time practically all members are exchanged.
Origin of scaling on networks, structural inhomogeneity, and preference in dynamical behaviour
Bernard Kujawski, Bosiljka Tadić, G. J. Rodgers
We examine the fluctuation properties of packet traffic on scale-free networks and random graphs using two different dynamical rules for moving packets: random diffusion and a locally navigated diffusive motion with preferred edges. We find that preferential behaviour in either the topology or in the dynamics leads to the scaling of fluctuations of the number of packets passing nodes and the number of packets flowing along edges, respectively. We show that the absence of any preference results in the absence of scaling, and when scaling occurs it is non-universal, with the scaling exponents depending on the acquisition time window, the network structure and the diffusion rule.
Quality functions in community detection
Community structure represents the local organization of complex networks and is the single most important feature for extracting functional relationships between nodes. In recent years, the problem of community detection has been reformulated in terms of the optimization of a function, the Newman-Girvan modularity, that is supposed to express the quality of the partitions of a network into communities. Starting from a recent critical survey on modularity optimization, pointing out the existence of a resolution limit that poses severe limits to its applicability, we discuss the general issue of the use of quality functions in community detection. Our main conclusion is that quality functions are useful for comparing partitions with the same number of modules, whereas the comparison of partitions with different numbers of modules is not straightforward and may lead to ambiguities.
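For reference, the Newman-Girvan modularity discussed above can be written in its standard form (notation ours) as

$$ Q = \sum_{c} \left[ \frac{l_c}{m} - \left( \frac{d_c}{2m} \right)^{2} \right], $$

where l_c is the number of edges inside module c, d_c the total degree of its nodes, and m the total number of edges in the network. The resolution limit arises because the null-model term (d_c/2m)^2 becomes vanishingly small for small modules in large networks, so merging small modules can spuriously increase Q.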
Structure of LiveJournal social network
The structural properties of the LiveJournal social network have been studied. The power-law region in the in- and out-degree distributions has been revealed and analyzed. A large, highly isolated social cluster corresponding to the Russian-speaking users was discovered and treated separately, and the peculiarities of its structure are discussed. An opinion dynamics simulation on the LiveJournal network was conducted, and stable states with multiple consensuses were found, reflecting the impact of the social network geometry on opinion formation.
The role of edge weights in social networks: modelling structure and dynamics
The structure of social networks influences dynamic processes of human interaction and communication, such as opinion formation and spreading of information or infectious diseases. To facilitate simulation studies of such processes, we have developed a weighted network model to resemble the structure of real social networks, in particular taking into account recent observations on weight-topology correlations. The model iterates on a fixed size network, reaching a steady state through processes of weighted local searches, global random attachment, and random deletion of nodes. There are essentially two parameters which can be used to tune network properties. The generated networks display community structure, with strong internal links and weak links connecting the communities. Similarly to empirical observations, strong ties correlate with overlapping neighbourhoods, and under edge removal, the network becomes fragmented faster when weak ties are removed first. As an example of the effects that such structural properties have on dynamic processes, we present early results from studies of social dynamics describing the competition of two non-excluding opinions in a society, showing that the weighted community structure slows down the dynamics as compared to randomized references.
Network Structure and Function
Emerging behavior in online bidding
I. Yang, B. Kahng
With the advancement of the information age, people are using electronic media more frequently for commercial transactions. Online auctions are a prototypical example. In online auctions, bidders or agents can easily participate in many different transactions simultaneously, and the number of bidders participating in a given transaction is not bounded. Owing to such features, distinct characteristics emerge compared with traditional auctions, which are reviewed here. A number of bidders emerge who are responsible for a significant fraction of the total bidding activity, due to the online characteristics. We show that they exert strong influence on the final prices in distinct auctions. This domination of online auctions by such an unusually active minority may be a generic feature of all online mercantile processes. On the other hand, the bidding process in the auction systems is described by using a master equation with the transition probability determined from empirical data. We show that bidding at the last moment is a rational and effective strategy to win in an eBay auction. Finally, the bidding pattern emerging from the interactions between individual bidders or items is analyzed from the perspective of graph theory.
Financial Fluctuations
Measuring volatility and correlations with high-frequency data
We review applications, published in three separate papers, of a recently proposed method to estimate volatility and correlation when prices are observed at high frequency. The method is based on Fourier analysis and does not require any data manipulation, leading to less noisy estimates than the traditional methodologies proposed so far.
The limit order book on different time scales
Financial markets can be described on several time scales. We use data from the limit order book of the London Stock Exchange (LSE) to compare how the fluctuation dominated microstructure crosses over to a more systematic global behavior.
A method for detecting complex correlation in time series
V. Alfi, A. Petri, L. Pietronero
We propose a new method for detecting complex correlations in time series of limited size. The method is derived from Spitzer's identity and is shown to work successfully on different model processes, including the ARCH process, in which pairs of variables are uncorrelated but the three-point correlation function is nonzero. The application to financial data allows one to discriminate between dependent and independent stock price returns where standard statistical analysis fails.
Modeling the Epps effect of cross correlations in asset prices
Bence Tóth, János Kertész
We review the decomposition method of stock return cross-correlations, presented previously for studying the dependence of the correlation coefficient on the resolution of data (the Epps effect). Through a toy model of random walk/Brownian motion and a memoryless renewal process (i.e. a Poisson point process) of observation times, we show that in analytically treatable cases, decomposing the correlations yields the exact result for the frequency dependence. We also demonstrate that our approach produces a reasonable fit of the dependence of correlations on the data resolution for empirical data. Our results indicate that the Epps phenomenon is a product of the finite time decay of lagged correlations of high resolution data, which does not scale with activity. The characteristic time is due to a human time scale, the time needed to react to news.
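As a hedged illustration of the Epps effect itself (not of the authors' decomposition method), the sketch below samples two correlated random walks, observed asynchronously at random times, with a previous-tick scheme and shows the measured correlation shrinking as the sampling interval decreases; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Underlying correlated Brownian motions on a fine grid (illustrative parameters).
T, rho = 50_000, 0.6
z1 = rng.normal(size=T)
z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.normal(size=T)
p1, p2 = np.cumsum(z1), np.cumsum(z2)

# Asynchronous observation times (memoryless, Poisson-like sampling).
obs1 = np.sort(rng.choice(T, size=T // 20, replace=False))
obs2 = np.sort(rng.choice(T, size=T // 20, replace=False))

def previous_tick(price, obs_times, grid):
    """Last observed price at or before each grid point."""
    idx = np.searchsorted(obs_times, grid, side="right") - 1
    idx = np.clip(idx, 0, None)
    return price[obs_times[idx]]

for dt_sample in (10, 50, 200, 1000):
    grid = np.arange(dt_sample, T, dt_sample)
    r1 = np.diff(previous_tick(p1, obs1, grid))
    r2 = np.diff(previous_tick(p2, obs2, grid))
    corr = np.corrcoef(r1, r2)[0, 1]
    print(f"sampling interval {dt_sample:5d}: correlation {corr:.3f}")
```

With these settings the measured correlation approaches the underlying value only at coarse resolutions, which is the frequency dependence the decomposition method is designed to explain.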
Statistics of extreme values in time series with intermediate-term correlations
We discuss the statistics of extreme values in time series characterized by finite-term correlations with non-exponential decay. Specifically, we consider the results of numerical analyses concerning the return intervals of extreme values of the fluctuations of resistance and defect fraction displayed by a resistor with granular structure in a nonequilibrium stationary state. The resistance and defect fraction are calculated as a function of time by Monte Carlo simulations using a resistor network approach. We show that when the auto-correlation function of the fluctuations displays a non-exponential and non-power-law decay, the distribution of the return intervals of extreme values is a stretched exponential, with an exponent largely independent of the threshold. Recently, a stretched exponential distribution of the return intervals of extreme values has been identified in long-term correlated time series by Bunde et al. (2003) and Altmann and Kantz (2005). Thus, the present results show that the stretched exponential distribution of the return intervals is not an exclusive feature of long-term correlated time series.
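For reference, the stretched exponential form referred to above can be written (notation ours) as

$$ P(r) \sim \exp\!\left[-\left(r/r_0\right)^{\gamma}\right], \qquad 0 < \gamma \le 1, $$

where r is the return interval between consecutive extreme events above the threshold, r_0 a characteristic scale, and γ the stretching exponent; γ = 1 recovers the simple exponential expected for uncorrelated data.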
Statistics of extremes, traffic jams, and natural disasters
Analyzing probability distributions from water-level time series and calculating first passage time distributions gives the probability of first exceeding a given threshold, corresponding to a flood or a general disaster event. The method is applied to the water-level recordings of the Danube river, for which very accurate records have existed for 100 years. The method is then transferred to time series of traffic volumes, interpreting traffic breakdowns as extreme events. Three different traffic situations can be distinguished: (a) stable traffic flow, where any fluctuations decay over time; (b) metastable traffic flow, where fluctuations neither decay nor grow; and (c) unstable traffic flow, where a breakdown can be expected with certainty if the observation time is long enough. The traffic dynamics is translated into a first passage time distribution, which describes the distribution of times until a traffic jam of a certain length or number of vehicles is first observed. The distribution contains a time lag, a maximum corresponding to the time a drifting Brownian motion needs to reach the critical jam length, and a tail describing exceptionally long waiting times for jam formation. The cumulative first passage time distribution can be interpreted as a breakdown probability distribution: it indicates when the breakdown probability reaches a given value within an assumed observation time. This leads directly to a probabilistic definition of capacity as the traffic volume that leads to an unstable traffic pattern with a given probability within a given observation time. This definition can replace existing definitions and opens the possibility of quantitatively describing the influence of traffic control systems on capacity.
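A minimal numerical sketch of the first-passage construction described above, using a drifted random walk as a stand-in for the jam-length (or water-level) dynamics; drift, noise level, threshold and observation horizons are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def first_passage_time(drift, sigma, threshold, max_steps=10_000):
    """Steps until a drifted random walk first exceeds the threshold."""
    x, t = 0.0, 0
    while x < threshold and t < max_steps:
        x += drift + sigma * rng.normal()
        t += 1
    return t

# Distribution of first passage times (breakdown / flood analogue).
samples = np.array([first_passage_time(0.05, 1.0, 20.0) for _ in range(5000)])

# Cumulative distribution: probability of a breakdown within an observation time.
for horizon in (100, 300, 600, 1000):
    prob = np.mean(samples <= horizon)
    print(f"P(first passage <= {horizon:4d} steps) = {prob:.2f}")
```

The printed cumulative values play the role of the breakdown probability as a function of the assumed observation time.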
Avalanche correlation in power spectra
Roberto Eggenhöffner, Edvige Celasco, Marcello Celasco
An outstanding topic in noise phenomena is the occurrence of peaks in the wide frequency range from mHz to above MHz in the power spectra of many natural systems. Recently, interest has focused on spectral peaks superimposed on the 1/f noise. Until now, existing theories have failed to explain peaked spectra. Here we highlight the role of correlation among avalanches as the main source of the observed noise peaks. The present theory is based on first-principles statistics of elementary events clustered in time- and amplitude-correlated avalanches. A spectral power master equation suitable for explaining peaked noise spectra arising from avalanche correlations is derived analytically. Excellent agreement with our experiments in superconductors and with experiments on Escherichia coli, on single DNA molecules and on single-electron tunneling is reported. Our statistical model shows that avalanche correlation gives wide peaks in the power spectrum superimposed on the 1/f behavior with high slope, a typical signature of avalanche processes.
Effect of random failures on traffic in complex networks
We study the effect of random removal of nodes in networks on the maximum capability to deliver information in communication processes. Measuring the changes in the onset of congestion, we observe different behaviors depending on the network structure, governed by the distribution of the algorithmic betweenness (the number of paths traversing a node given a communication protocol) of the nodes, and particularly by the node with the highest betweenness. We also compare the robustness of networks from a topological and a dynamical point of view. We find that for certain values of traffic load, after suffering a random failure, the network can be physically connected but the nodes are unable to communicate due to congestion. These results highlight the necessity of including dynamical considerations in studies of the resilience of complex networks.
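A hedged sketch of how a congestion onset can be estimated from the algorithmic betweenness, here specialized to shortest-path routing; the estimate rho_c ≈ C(N-1)/B_max is a commonly used formula for that particular protocol and is our assumption, not necessarily the protocol studied in the paper.

```python
import random
import networkx as nx

def congestion_onset(G, capacity=1.0):
    """Estimate of the critical packet-generation rate for shortest-path
    routing: rho_c ~ capacity * (N - 1) / B_max, where B_max is the
    largest (unnormalized) betweenness in the network."""
    bet = nx.betweenness_centrality(G, normalized=False)
    b_max = max(bet.values())
    n = G.number_of_nodes()
    return capacity * (n - 1) / b_max

random.seed(2)
G = nx.barabasi_albert_graph(500, 3, seed=2)
print("intact network      :", round(congestion_onset(G), 4))

# Random failure: remove some nodes and keep the giant component.
removed = random.sample(list(G.nodes()), 50)
H = G.copy()
H.remove_nodes_from(removed)
giant = H.subgraph(max(nx.connected_components(H), key=len)).copy()
print("after random failure:", round(congestion_onset(giant), 4))
```

Whether the onset rises or falls after a failure depends on how the removal redistributes betweenness, which is the structure-dependent behavior the abstract describes.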
Interacting Economic Systems
Evolutionary and adaptive learning in complex markets: a brief summary
We briefly review some work on expectations and learning in complex markets, using the familiar demand-supply cobweb model. We discuss and combine two different approaches to learning. According to the adaptive learning approach, agents behave as econometricians, using time series observations to form expectations and updating the parameters as more observations become available. This approach has become popular in macroeconomics. The second approach has an evolutionary flavor and is sometimes referred to as reinforcement learning. Agents employ different forecasting strategies and evaluate these strategies based upon a fitness measure, e.g. past realized profits. In this framework, boundedly rational agents switch between different, but fixed, behavioral rules. This approach has become popular in finance. We combine evolutionary and adaptive learning to model complex markets and discuss whether this theory can match empirical facts and forecasting behavior in laboratory experiments with human subjects.
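As a concrete illustration of the evolutionary switching step, a commonly used discrete-choice rule in this literature (not necessarily the exact specification combined in the paper) sets the fraction of agents using forecasting rule h at time t to

$$ n_{h,t} = \frac{\exp(\beta\, U_{h,t-1})}{\sum_{h'} \exp(\beta\, U_{h',t-1})}, $$

where U_{h,t-1} is the fitness of rule h (e.g. past realized profit) and β ≥ 0 is the intensity of choice: β = 0 spreads agents uniformly over the rules, while β → ∞ makes all agents switch to the best-performing rule.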
Cascades of failure and extinction in dynamically evolving complex systems
There is a rapidly growing literature on cascades in networks whose topology is fixed. This paper considers networks whose topology evolves over time. It extends the concept of 'robust yet fragile' to evolving networks. Such networks can be robust in the sense that the average fitness of the system rises over time. But they are also fragile: the proportion of extinction events which are very large increases.
Economics and Networks
Macro-economic models with non-zero dispersion
Masanao Aoki
Using two simple stochastic dynamic models, this paper demonstrates that the coefficient of variation of aggregate output, GDP, does not necessarily go to zero when the number of sectors or economic agents goes to infinity. This paper shows that this phenomenon, known as non-self-averaging in physics, occurs in the two-parameter Poisson-Dirichlet models and in certain balanced triangular urn models of growth. This implies that the standard microeconomic functions for aggregate output based on representative agent models have little value, since these models do not provide us with a better picture of the long-run behavior of the model. The paper also shows that both models have a generalized Mittag-Leffler density function, which has a power-law tail.
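Stated as a formula (notation ours), non-self-averaging means that the coefficient of variation of aggregate output X_N does not vanish as the number of sectors or agents N grows:

$$ \mathrm{CV}(X_N) = \frac{\sqrt{\operatorname{Var}(X_N)}}{\mathbb{E}[X_N]} \not\to 0 \qquad (N \to \infty), $$

so the aggregate does not concentrate around its mean and a representative-agent description misses this persistent dispersion.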
Economic sector identification in a set of stocks traded at the New York Stock Exchange: a comparative analysis
C. Coronnello, M. Tumminello, F. Lillo, et al.
We review some methods recently used in the literature to detect the existence of a certain degree of common behavior of stock returns belonging to the same economic sector. Specifically, we discuss methods based on random matrix theory and hierarchical clustering techniques. We apply these methods to a set of stocks traded at the New York Stock Exchange. The investigated time series are recorded at a daily time horizon. All the considered methods are able to detect economic information and the presence of clusters characterized by the economic sector of stocks. However, different methodologies provide different information about the considered set. Our comparative analysis suggests that the application of just a single method may not be able to extract all the economic information present in the correlation coefficient matrix of a set of stocks.
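A hedged sketch of the random-matrix-theory step mentioned above: compare the eigenvalues of an empirical correlation matrix with the Marchenko-Pastur bulk expected for purely random returns (synthetic data here; the paper uses daily NYSE returns).

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for a returns matrix: T observations of N stocks,
# with one common "market mode" added so that a few eigenvalues deviate.
N, T = 100, 1000
market = rng.normal(size=(T, 1))
returns = 0.3 * market + rng.normal(size=(T, N))

# Correlation matrix and its eigenvalues.
R = np.corrcoef(returns, rowvar=False)
eigvals = np.linalg.eigvalsh(R)

# Marchenko-Pastur bounds for a purely random correlation matrix (Q = T/N).
Q = T / N
lam_min = (1 - np.sqrt(1 / Q)) ** 2
lam_max = (1 + np.sqrt(1 / Q)) ** 2

deviating = eigvals[eigvals > lam_max]
print(f"MP bulk: [{lam_min:.2f}, {lam_max:.2f}]")
print(f"eigenvalues above the bulk (carrying structure): {np.round(deviating, 2)}")
```

Eigenvalues above the random bulk are the ones interpreted as carrying market- or sector-level information; hierarchical clustering is then a complementary way of reading the same correlation matrix.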
The Italian Interbank Network: statistical properties and a simple model
We use the theory of complex networks to quantitatively characterize the structure of reciprocal exposures of Italian banks in the interbank money market. We observe two main strategies of banks: small banks tend to be the lenders of the system, while large banks are borrowers. We propose a model to reproduce the main statistical features of this market. Moreover, the network analysis allows us to investigate the robustness properties of this system.
Free zero-range processes on networks
L. Bogacz, Z. Burda, W. Janke, et al.
A free zero-range process (FZRP) is a simple stochastic process describing the dynamics of a gas of particles hopping between neighboring nodes of a network. We discuss three different cases of increasing complexity: (a) FZRP on a rigid geometry where the network is fixed during the process, (b) FZRP on a random graph chosen from a given ensemble of networks, (c) FZRP on a dynamical network whose topology continuously changes during the process in a way which depends on the current distribution of particles. Case (a) provides a very simple realization of the phenomenon of condensation, which manifests as the appearance of a condensate of particles on the node with maximal degree. A particularly interesting example is condensation on scale-free networks. Here we model it by introducing a single-site inhomogeneity to a k-regular network. This simplified situation can be easily treated analytically and, on the other hand, shows quantitatively the same behavior as the case of scale-free networks. Case (b) is very interesting since the averaging over typical ensembles of graphs acts as a kind of homogenization of the system, which makes all nodes identical from the point of view of the FZRP. In effect, the partition function of the steady state becomes invariant with respect to permutations of the particle occupation numbers. This type of symmetric system has been intensively studied in the literature. In particular, such systems undergo a phase transition to the condensed phase, which is caused by a mechanism of spontaneous symmetry breaking. In case (c), the distribution of particles and the dynamics of the network are coupled to each other. The strength of this coupling depends on the ratio of two time scales: for changes of the topology and of the FZRP. We discuss a specific example of that type of interaction and show that it leads to an interesting phase diagram. Case (b) mentioned above can be viewed as a limiting case where the typical time scale of topology fluctuations is much larger than that of the FZRP.
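A toy simulation in the spirit of case (a): a free zero-range process on a k-regular network with a single high-degree inhomogeneity, showing the condensate forming on the hub. The parameters and the simple parallel update are illustrative assumptions, not the authors' exact setup.

```python
import random
import networkx as nx

random.seed(4)

# k-regular network with a single high-degree inhomogeneity (toy version of
# the single-site inhomogeneity discussed above; parameters are illustrative).
G = nx.random_regular_graph(4, 200, seed=4)
hub = 0
G.add_edges_from((hub, v) for v in range(1, 60) if not G.has_edge(hub, v))

nodes = list(G.nodes())
occupancy = {v: 3 for v in nodes}   # initial density: 3 particles per node

# Free ZRP dynamics: every non-empty node sends one particle per sweep
# to a uniformly chosen neighbor (hop rate independent of occupation).
for sweep in range(5000):
    moves = []
    for v in nodes:
        if occupancy[v] > 0:
            moves.append((v, random.choice(list(G.neighbors(v)))))
    for src, dst in moves:
        occupancy[src] -= 1
        occupancy[dst] += 1

total = sum(occupancy.values())
print(f"hub degree {G.degree(hub)}, hub holds {occupancy[hub] / total:.1%} of all particles")
```

With these settings the hub ends up holding a macroscopic fraction of the particles, which is the condensation phenomenon described for case (a).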
An analytical approach to cascades on random networks
James P. Gleeson, Diarmuid J. Cahalane
The expected steady-state fraction of active nodes in Watts' model of threshold dynamics on random networks is determined analytically. The analysis applies to random graphs with arbitrary degree distributions, and includes the effect of finite seed fractions. The seed fraction is shown to have a strong impact upon the existence of global cascades and Watts' cascade condition is extended to include these effects.
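A hedged numerical sketch of the kind of tree-based iteration described above, written from the published literature on this approach; the exact equations and notation in the paper may differ. It computes the expected final active fraction for Watts' threshold model on a Poisson random graph with a finite seed fraction.

```python
import numpy as np
from scipy.stats import poisson, binom

def cascade_size(z, phi, rho0, kmax=60, iters=200):
    """Iterative estimate of the final active fraction for Watts' threshold
    model on a Poisson (Erdos-Renyi-like) random graph.
    z: mean degree, phi: threshold, rho0: seed fraction."""
    k = np.arange(kmax + 1)
    pk = poisson.pmf(k, z)

    def F(m, kk):  # a node of degree kk activates if a fraction phi of neighbors is active
        return (m >= phi * np.maximum(kk, 1)).astype(float)

    q = rho0
    for _ in range(iters):
        s = 0.0
        for kk in range(1, kmax + 1):
            m = np.arange(kk)              # active neighbors among the k-1 children
            s += kk * pk[kk] / z * np.sum(binom.pmf(m, kk - 1, q) * F(m, kk))
        q = rho0 + (1 - rho0) * s

    rho = rho0
    for kk in range(0, kmax + 1):
        m = np.arange(kk + 1)
        rho += (1 - rho0) * pk[kk] * np.sum(binom.pmf(m, kk, q) * F(m, kk))
    return rho

for seed in (0.001, 0.01, 0.05):
    print(f"z=4, phi=0.25, seed={seed}: final active fraction ~ {cascade_size(4, 0.25, seed):.3f}")
```

Varying the seed fraction in the final loop illustrates the paper's point that the seed size strongly affects whether a near-global cascade occurs.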
Volatility and serial correlation: revisiting the LeBaron effect
According to the LeBaron effect, serial correlation is low when volatility is high, and vice versa. We show that this is true only for the predictable part of the volatility, while volatility that cannot be forecast is positively linked to serial correlation. Since the mechanism of price formation can be very different in small and large markets, we investigate the effect of volatility on intraday serial correlation in Italy (a small market) and the U.S. (a large market). We find substantial differences in the impact of volatility in the two markets.
Poster Session
Diffusion covariation and co-jumps in bidimensional asset price processes with stochastic volatility and infinite activity Lévy jumps
Fabio Gobbi, Cecilia Mancini
In this paper we consider two processes driven by diffusions and jumps. The jump components are Lévy processes, and they can have either finite or infinite activity. Given discrete observations, we estimate the covariation between the two diffusion parts and the co-jumps. The detection of the co-jumps allows one to gain insight into the dependence structure of the jump components and has important applications in finance. Our estimators are based on a threshold principle allowing us to isolate the jumps. This work follows Gobbi and Mancini (2006), where the asymptotic normality of the estimator of the covariation, with convergence speed √h, was obtained when the jump components have finite activity. Here we show that the speed is √h only when the activity of the jump components is moderate.
Statistics of level crossing intervals: discretized version and comparison with experimental studies
Nobuko Fuchikami, Shunya Ishioka
Previously we derived the probability density function (PDF) of the zero-crossing interval for 1/f^α noise and found that the PDF L(t) obeys a power law of the form 1/t^c, whose exponent c relates to the exponent α of the power spectral density as c = 3 − α when 0 < α < 1 and c = (5 − α)/2 when 1 < α < 2 (Proc. SPIE Vol. 5471, p. 29, 2004). This analytical result agreed with numerical experiments by Mingesz et al. (Proc. SPIE Vol. 5110, p. 312, 2003) for 0.7 ≲ α < 2, but not for 0 < α ≲ 0.7; the experimental PDF deviates from the power law in the latter range. We present here a discretized version of the previous theory by noting that the experimental time interval takes discrete values. The present result agrees well with the experiment for the whole range of α and explains the deviation of the PDF from the power law in the range of small α.
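A hedged numerical sketch of the continuum setting that the discretized theory refines: generate 1/f^α noise spectrally, collect zero-crossing intervals, and crudely fit the interval PDF to a power law 1/t^c. Parameters are illustrative, and the crude fit only roughly approaches the quoted exponents.

```python
import numpy as np

rng = np.random.default_rng(5)

def one_over_f_noise(n, alpha):
    """Generate 1/f^alpha noise by shaping white noise in the frequency domain."""
    freqs = np.fft.rfftfreq(n, d=1.0)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-alpha / 2.0)
    phases = np.exp(1j * rng.uniform(0, 2 * np.pi, size=freqs.size))
    x = np.fft.irfft(amp * phases, n=n)
    return x - x.mean()

def zero_crossing_intervals(x):
    """Discrete intervals between sign changes of the signal."""
    signs = np.sign(x)
    crossings = np.where(np.diff(signs) != 0)[0]
    return np.diff(crossings)

alpha = 1.5
tau = zero_crossing_intervals(one_over_f_noise(2**20, alpha))

# Crude power-law fit of the interval PDF over an intermediate range.
counts, edges = np.histogram(tau, bins=np.logspace(0, 3, 30), density=True)
centers = np.sqrt(edges[:-1] * edges[1:])
mask = counts > 0
slope = np.polyfit(np.log(centers[mask]), np.log(counts[mask]), 1)[0]
print(f"alpha = {alpha}: fitted exponent c ~ {-slope:.2f} (theory: (5 - alpha)/2 = {(5 - alpha)/2})")
```

Repeating this for small α is where the continuum power law breaks down and the discretized treatment in the paper becomes necessary.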
Finding keywords amongst noise: automatic text classification without parsing
Andrew G. Allison, Charles E. M. Pearce, Derek Abbott
The amount of text stored on the Internet, and in our libraries, continues to expand at an exponential rate. There is a great practical need to locate relevant content. This requires quick automated methods for classifying textual information according to subject. We propose a quick statistical approach which can distinguish between 'keywords' and 'noisewords', like 'the' and 'a', without the need to parse the text into its parts of speech. Our classification is based on an F-statistic, which compares the observed Word Recurrence Interval (WRI) with a simple null hypothesis. We also propose a model to account for the observed distribution of WRI statistics, and we subject this model to a number of tests.
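A minimal sketch of the word-recurrence-interval idea, using a simple dispersion score as a stand-in for the paper's F-statistic (the exact statistic is not reproduced here); the input file name is hypothetical.

```python
from collections import defaultdict
import numpy as np

def recurrence_intervals(tokens):
    """Word Recurrence Intervals: gaps (in token positions) between
    successive occurrences of each word."""
    positions = defaultdict(list)
    for i, w in enumerate(tokens):
        positions[w.lower()].append(i)
    return {w: np.diff(p) for w, p in positions.items() if len(p) > 5}

def burstiness(intervals):
    """Crude clustering score: squared coefficient of variation of the WRI.
    Near 1 for memoryless (noiseword-like) recurrence, larger for the bursty,
    clustered recurrence typical of keywords. A stand-in for the paper's
    F-statistic, not its definition."""
    return {w: d.var() / d.mean() ** 2 for w, d in intervals.items() if d.mean() > 0}

text = open("document.txt").read().split()   # hypothetical input file
scores = burstiness(recurrence_intervals(text))
for w, s in sorted(scores.items(), key=lambda kv: -kv[1])[:10]:
    print(f"{w:15s} dispersion {s:6.2f}")
```

Words whose recurrence intervals are far more dispersed than the memoryless null are the keyword candidates; function words like 'the' and 'a' stay near the baseline.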
A Bayesian estimation of a stochastic predator-prey model of economic fluctuations
Ghassan Dibeh, Dmitry G. Luchinsky, Daria D. Luchinskaya, et al.
In this paper, we develop a Bayesian framework for the empirical estimation of the parameters of one of the best known nonlinear models of the business cycle: The Marx-inspired model of a growth cycle introduced by R. M. Goodwin. The model predicts a series of closed cycles representing the dynamics of labor's share and the employment rate in the capitalist economy. The Bayesian framework is used to empirically estimate a modified Goodwin model. The original model is extended in two ways. First, we allow for exogenous periodic variations of the otherwise steady growth rates of the labor force and productivity per worker. Second, we allow for stochastic variations of those parameters. The resultant modified Goodwin model is a stochastic predator-prey model with periodic forcing. The model is then estimated using a newly developed Bayesian estimation method on data sets representing growth cycles in France and Italy during the years 1960-2005. Results show that inference of the parameters of the stochastic Goodwin model can be achieved. The comparison of the dynamics of the Goodwin model with the inferred values of parameters demonstrates quantitative agreement with the growth cycle empirical data.
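A hedged sketch of a stochastic, periodically forced Goodwin-type predator-prey system of the kind described above, integrated with an Euler-Maruyama step. The functional forms (a linearized Phillips curve) and all parameter values are our own illustrative assumptions, not the estimates reported for France or Italy.

```python
import numpy as np

rng = np.random.default_rng(6)

# u: labor's share, v: employment rate (illustrative Goodwin-type dynamics).
sigma_k = 3.0                     # capital-output ratio
alpha0, beta0 = 0.02, 0.01        # baseline productivity and labor-force growth
gamma, rho = 0.9, 1.0             # linearized Phillips curve: wage growth = -gamma + rho*v
eps, omega = 0.005, 2 * np.pi / 8.0   # periodic forcing of the growth rates
noise = 0.002
dt, T = 0.01, 100.0

u, v = 0.8, 0.9
t_grid = np.arange(0.0, T, dt)
path = np.empty((t_grid.size, 2))
for i, t in enumerate(t_grid):
    alpha = alpha0 + eps * np.sin(omega * t)
    beta = beta0 + eps * np.cos(omega * t)
    du = u * (-gamma + rho * v - alpha) * dt + noise * u * np.sqrt(dt) * rng.normal()
    dv = v * ((1 - u) / sigma_k - alpha - beta) * dt + noise * v * np.sqrt(dt) * rng.normal()
    u, v = u + du, v + dv
    path[i] = (u, v)

print("mean labor share %.3f, mean employment rate %.3f" % tuple(path.mean(axis=0)))
```

The deterministic part cycles around its equilibrium like a Lotka-Volterra system, while the noise and the periodic forcing perturb the closed orbits; Bayesian estimation, as in the paper, would infer the parameters from observed (u, v) trajectories.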
Limited resolution and multiresolution methods in complex network community detection
Detecting community structure in real-world networks is a challenging problem. Recently, it has been shown that the resolution of methods based on optimizing a modularity measure or a corresponding energy is limited; communities with sizes below some threshold remain unresolved. One possibility to get around this problem is to vary the threshold by using a tuning parameter and to investigate the community structure at variable resolutions. Here, we analyze the resolution limit and multiresolution behavior for two different methods: a q-state Potts method proposed by Reichardt and Bornholdt, and a recent multiresolution method by Arenas, Fernandez, and Gomez. These methods are studied analytically and applied to three test networks using simulated annealing.
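For the q-state Potts method, the tuning parameter enters through a Hamiltonian of the standard form (γ = 1 recovers Newman-Girvan modularity)

$$ \mathcal{H}(\{\sigma\}) = -\sum_{i \neq j} \left( A_{ij} - \gamma\, p_{ij} \right) \delta(\sigma_i, \sigma_j), $$

where A_{ij} is the adjacency matrix, p_{ij} the expected link probability under a null model (e.g. k_i k_j / 2m), σ_i the community label of node i, and γ the resolution parameter; small γ favors a few large communities, large γ favors many small ones.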
Time-frequency analysis of econometric time series
Sharif Corinaldi, Leon Cohen
We review the basic concepts of time-frequency analysis, a set of methods that indicate not only which frequencies are present in a time series but also when they occurred. A number of examples are given to illustrate the possible application of these methods to econometric series. The methods are applied to the Beveridge Wheat Price Series.
International tourism network
Interest in tourism has always been strong, owing to its important role in economic flows among nations. In this study we analyze the arrivals of international tourists (edges) over 206 countries and territories (nodes) around the world in the year 2004. International tourist arrivals reached a record of 763 million in 2004. We characterize analytically the topological and weighted properties of the resulting network. International tourist arrivals are analyzed in terms of in-strength and out-strength flows, resulting in a highly directed network with very heterogeneous weights and strengths. The inclusion of edge weights and directions in the analysis of network architecture allows a more realistic insight into the structure of the network. Centrality, assortativity and disparity are measured for the topological and weighted structure. Assortativity measures the tendency of high-weight edges to connect nodes with similar degrees; the international tourism network is disassortative, in contrast to social networks. Disparity quantifies how similar the flows in a node's neighborhood are, measuring the heterogeneity of weights for in-flows and out-flows of tourism. These results provide an application of recent methods for weighted and directed networks, showing that weights are relevant and that, in general, the modeling of complex networks must go beyond topology. The network structure may influence how tourism hubs, the distribution of flows, and centralization can be exploited in countries' strategic positioning and policy making.
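A small sketch of the weighted, directed measures mentioned above (strength and disparity) on a toy flow network; the country codes and weights are illustrative only, not the 2004 data.

```python
import networkx as nx

def strengths_and_disparity(G, direction="out"):
    """Node strength s_i = sum of edge weights and disparity
    Y_i = sum_j (w_ij / s_i)^2 for a weighted directed network.
    Y_i ~ 1/k_i means flows spread evenly over neighbors;
    Y_i ~ 1 means a single dominant flow."""
    result = {}
    for v in G.nodes():
        edges = G.out_edges(v, data="weight") if direction == "out" \
            else G.in_edges(v, data="weight")
        weights = [w for _, _, w in edges if w is not None]
        s = sum(weights)
        result[v] = (s, sum((w / s) ** 2 for w in weights) if s > 0 else 0.0)
    return result

# Toy directed, weighted "tourism flow" network (illustrative numbers only).
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("FR", "ES", 12.0), ("FR", "IT", 8.0), ("US", "FR", 3.0),
    ("DE", "ES", 10.0), ("ES", "FR", 6.0), ("IT", "FR", 5.0),
])
for country, (s, y) in strengths_and_disparity(G, "out").items():
    print(f"{country}: out-strength {s:5.1f}, disparity {y:.2f}")
```

Computing the same quantities for the in-direction gives the in-strength and in-flow disparity, the two sides of the directed analysis described in the abstract.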