RESEARCH VISION & AMBITION
Motivation
Recent advances in experimental techniques, such as detectors, sensors, and scanners, have opened up new vistas into physical, biological, and social processes at many levels of detail. The complete cascade from individual components to integrated systems crosses many orders of magnitude in temporal and spatial scales. These complex adaptive systems display endless signatures of order, disorder, self-organization, and self-annihilation. Understanding, quantifying, and handling this complexity is one of the biggest scientific challenges of our time.
Example
A prototypical example comes from biomedicine, where we have data from virtually every level between 'molecule and man', yet no models in which we can study these processes as a whole. It is a truly complex system: from a biological cell, built of thousands of different molecules working together, to the billions of cells that form our tissues, organs, and immune system, to our society of six billion unique, interacting individuals. The complete cascade from genome, proteome, metabolome, and physiome to health constitutes a multi-scale, multi-science system, crossing many orders of magnitude in temporal and spatial scales.
Theory
The basic problem underlying the theoretical questions is that the processes studied by natural scientists involve systems that are continuous, stochastic, spatially extended, or any combination of these, and thus fall strictly outside the scope of discrete computation theory. The study of information processing in such complex dynamical systems is, therefore, still in its infancy. The basic questions to answer are: 'Can we detect and describe the computational structure in natural processes, and can we provide a quantitative characterization of essential aspects of this structure?'
This simple question leads to a plethora of theoretical challenges related to information processing, automata theory, information theory and complexity, synchronous versus asynchronous versus evolutionary computing, and so on. My aim is to study such questions in the context of dynamic complex systems, modeled, for instance, as cellular automata. Under the strong assumption that, e.g., physical and biological processes are nothing but instances of universal computation, we have in the past constructed automata for specific tasks.
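To make the cellular-automaton setting concrete, the following is a minimal sketch of an elementary cellular automaton of the kind alluded to above. The choice of Rule 110 (a rule known to be computationally universal), the lattice width, and the single-seed initial condition are illustrative assumptions, not details taken from this text.

```python
def step(cells, rule=110):
    """Apply one synchronous update of an elementary CA with periodic boundaries.

    The 8-bit rule number is unpacked into a lookup table indexed by the
    3-cell neighborhood (left, center, right) read as a binary number.
    """
    n = len(cells)
    table = [(rule >> i) & 1 for i in range(8)]
    return [table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)]


def evolve(width=31, steps=15, rule=110):
    """Evolve a single-seed initial condition; return the space-time diagram."""
    row = [0] * width
    row[width // 2] = 1
    history = [row]
    for _ in range(steps):
        row = step(row, rule)
        history.append(row)
    return history


if __name__ == "__main__":
    for row in evolve():
        print("".join("#" if c else "." for c in row))
```

Printing the rows as a space-time diagram is the usual way to inspect the structures (particles, domains) whose information processing the questions above concern.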
A possible approach comes from recent work on epsilon-machines, which reveals the group and semi-group symmetries possessed by spatial patterns and indicates the minimum amount of memory required to reproduce the configuration ensemble, a quantity known as the statistical complexity. It has been shown that the excess entropy, a form of mutual information, can serve as an information-theoretic measure of the apparent spatial memory required to describe such complex systems. However, there is no unique indicator of complexity in the way that, e.g., entropy characterizes disorder, nor are any successful mean-field approaches or renormalization theories known that capture the observed hierarchy of information processing. The evolution of space-time processes in parallel cellular automata is studied through (epsilon, tau)-entropy models.
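As a rough illustration of the excess-entropy idea mentioned above, the sketch below estimates block entropies H(L) from a symbol sequence and forms the crude finite-L estimate E ≈ H(L) − h·L, with the entropy rate h approximated by H(L) − H(L−1). This is a simplified empirical estimator under stated assumptions, not the epsilon-machine construction itself; the block length Lmax is an illustrative parameter.

```python
from collections import Counter
from math import log2


def block_entropy(seq, L):
    """Shannon entropy (in bits) of the length-L blocks occurring in seq."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())


def excess_entropy(seq, Lmax=8):
    """Crude excess-entropy estimate E ~ H(Lmax) - h * Lmax.

    The entropy rate h is approximated by the last entropy gain
    H(Lmax) - H(Lmax - 1); for short or strongly correlated data this
    finite-L estimate is biased, so treat it as indicative only.
    """
    H = [0.0] + [block_entropy(seq, L) for L in range(1, Lmax + 1)]
    h = H[Lmax] - H[Lmax - 1]
    return H[Lmax] - h * Lmax
```

For a perfectly periodic sequence the estimate approaches the memory needed to synchronize to the phase (1 bit for period two), while for a constant sequence it is zero, matching the intuition that excess entropy measures apparent stored structure rather than randomness.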
Challenges
The sheer complexity and range of spatial and temporal scales defy any existing numerical model and computational capacity. The only way forward is to combine data at all levels of detail with large-scale particle-based, stochastic, and continuum models; this is an open research area. Conceptual, theoretical, and methodological foundations are needed to understand these multi-scale processes and dynamic networks, and the associated predictability limits of such large-scale computer simulations.
Ambition
To be a school of thought in the field of computational science and to lead the way in understanding how nature processes information.
The Scientific Method
With Robert Pirsig I think that ‘the real purpose of the scientific method is to discover that nature has not misled you into thinking you know something you actually don’t know’.
It is instructive to read that ancient philosophers such as Plato and Aristotle held science to be about 'organizing facts such that we can find a coherent explanation of reality'.
What else is computational science about?! The art and the science lie in abstracting what we observe, modeling it, studying its behavior through explicit computer simulations, and validating it against experimentation.