Understanding and predicting complex turbulent flows such as the weather is one of the greatest challenges of our time. The art of modeling such complex systems has made great strides over the last 20 years. Remarkably, until quite recently real knowledge about the weather was based on an extremely small number of low-precision monitoring stations spread unevenly around the world. A revolution in the field followed the launch of Earth-observing satellites, which allowed data collection from a high density of surface, air, and sea sensors. High-precision visualization of the weather state led to the understanding of fundamental features such as the life cycle of El Niño, among many others. A similar revolution is taking place in many fields of present-day science: in astrophysics, gigantic sky surveys and other high-resolution data-collection projects give us a much more accurate picture of the present and past of our Universe; in medicine, various tomography techniques help us visualize processes inside the human body.
In many ways, a valid analogy can be drawn between the weather and the transient behavior of the Internet. To visualize and understand the dynamics of the Internet, its topology should be continuously monitored and the traffic of data packets should be measured with high temporal precision and good spatial resolution. During the last few years many efforts have been made to monitor network topology and to develop traffic measurement techniques (the Cooperative Association for Internet Data Analysis (CAIDA), the National Laboratory for Applied Network Research (NLANR), and the Réseaux IP Européens (RIPE) Test Traffic Measurement program). While these efforts are important steps toward the dynamical characterization of the Internet, they do not meet the needs of complexity research aiming to understand the dynamics of these networks. The measurement boxes in current efforts send packets with very low temporal resolution (seconds apart) and in simple flow patterns, such as periodic streams, in which there are no interactions between the probes. In contrast, the fast-growing literature on active probing involves sending packets whose size and spacing are precisely modulated in order to generate, and exploit, non-linear effects as the probes traverse network elements. The aims of such `probe design' and analysis are more ambitious than a simple direct measurement of end-to-end delay: the data processing seeks to invert the `filtering' that the network imposes on the probe stream, in order to infer the structure of, and conditions within, the network itself. More specifically, techniques exist for making inferences on link bandwidths and on the loading of network links via the measurement of dynamic `available bandwidth'. There are also new network tomography proposals in which some information on the spatial structure of traffic characteristics can be gained.
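To make the bandwidth-inference idea concrete, the following is a minimal sketch of one classical technique from the active-probing literature, packet-pair dispersion: two back-to-back probes of known size are serialized one after the other at the bottleneck link, so the gap they acquire encodes its capacity. This is an illustrative example only, not the system described here; all function names and numbers are hypothetical.

```python
# Hypothetical packet-pair sketch: two back-to-back probes of size P bytes
# leave the bottleneck link separated by the time it takes to serialize one
# packet, so capacity C = 8 * P / dispersion (in bits per second).

def capacity_from_pair(packet_size_bytes, dispersion_s):
    """Estimate bottleneck capacity (bit/s) from one packet-pair dispersion."""
    return 8 * packet_size_bytes / dispersion_s

def estimate_capacity(packet_size_bytes, dispersions_s):
    """Aggregate many pairs; the median resists distortion by cross-traffic,
    which widens or compresses individual dispersions."""
    estimates = sorted(capacity_from_pair(packet_size_bytes, d)
                       for d in dispersions_s)
    n = len(estimates)
    mid = n // 2
    return estimates[mid] if n % 2 else 0.5 * (estimates[mid - 1] + estimates[mid])

# Example: 1500-byte probes arriving 120 microseconds apart imply
# 1500 * 8 / 120e-6 = 1e8 bit/s, i.e. a ~100 Mbit/s bottleneck.
```

The nanosecond timestamping discussed below matters precisely because these dispersions are on the order of microseconds, so millisecond-resolution clocks cannot resolve them.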
The desired high spatio-temporal resolution can only be achieved by integrating these methods into a single measurement network in which high temporal resolution is available between the measurement hosts. To this end, we are building a hardware-based traffic observatory consisting of high temporal resolution measurement boxes and a central management system.
We are building a measurement infrastructure in Europe that can carry out globally synchronized, active measurements between the measurement boxes with high temporal resolution (~10 nanoseconds). It will provide a high-resolution, spatially extended, dynamic picture of fast changes in network traffic. This opens up the possibility of a new kind of network tomography in which cross-correlations between measurement flows are measured on a fine timescale, so that the internal state of the network, far from the edges where the measurement devices are located, can be reconstructed, its time evolution studied, and the data analyzed with methods developed in the complexity-science literature.
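As an illustration of the cross-correlation idea (a sketch under assumed inputs, not the project's actual analysis pipeline), consider one-way delay series recorded simultaneously on two probe paths: a pronounced correlation peak at some lag suggests that the two paths share a congested internal link, which is exactly the kind of inference fine-timescale tomography aims at. The function below is a plain normalized cross-correlation; all names are illustrative.

```python
# Illustrative normalized cross-correlation between two equal-length delay
# series x and y at a non-negative integer lag. Values near +1 indicate that
# congestion episodes on the two paths coincide (shifted by `lag` samples),
# hinting at a shared internal bottleneck.

def cross_correlation(x, y, lag):
    """Pearson-style cross-correlation of x[i] against y[i + lag]."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sx = sum((v - mx) ** 2 for v in x) ** 0.5
    sy = sum((v - my) ** 2 for v in y) ** 0.5
    num = sum((x[i] - mx) * (y[i + lag] - my) for i in range(n - lag))
    return num / (sx * sy)
```

Scanning `lag` over a window and locating the peak would, in this toy setting, estimate the differential propagation delay from the shared link to the two receivers; doing this meaningfully requires the globally synchronized, fine-grained timestamps described above.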