What is Simulation?
What exactly is meant by a simulation?
This article discusses the nature of simulations, how they are constructed and validated, and whether a computer is necessary. The final part of the article discusses whether the act of simulation is a necessary part of the process, or whether the rules and structures themselves are sufficient to define a simulation. This relates to Stephen Hawking's musing: "What is it that breathes fire into the equations and makes a universe for them to describe?"
There is no doubt that simulation is now a hugely important and widely-used research methodology, covering fields as diverse as cell biology, cosmology and manufacturing processes. Some idea of the range and scope of simulation can be obtained by following up some of the links to Web-Based Simulation Forums and Societies.
It is useful to try to understand what exactly we mean by a simulation. The term itself implies that we are simulating, or mimicking, a target object, probably a real or imagined world. Sometimes the term is used synonymously with 'computer model', although simulations have an important history which pre-dates the computer era. Role-Playing, Wargames, and even the Big Brother House can all be considered types of simulation. However, it was with the advent of computers, and especially when computing power increased dramatically in the 70s and 80s, enabling large-scale simulations of global events (e.g. climate change, macroeconomic forecasting) and gaming scenarios (e.g. Sim City, Populous), that the term 'simulation' began to take on its current meaning as a computer 'model' of some aspect of real life.
We should not, however, automatically assume that the term 'simulation' necessarily signifies an attempt to mimic reality, let alone to incorporate all the features we see around us. Taken on its own terms, a simulation is simply an attempt to create some sort of viable artificial world, with its own rules, its own logic, its own events, artifacts and possibly its own characters, which interact in a fashion vaguely analogous to the experience we label as 'reality'. Nor should we assume that these artificial worlds, even in the far future, would necessarily contain conscious entities. Extrapolating from some of the current offerings, a sufficiently complex artificial world might admit conscious beings, but again these need not necessarily be human - such beings might be aliens, anthropomorphisations, or even superintelligent shades of the colour blue.
In addition, having opened up the possibility of 'artificial worlds', many other types of computer-generated environments now fall under this umbrella. There have been many attempts to create Artificial Life scenarios, from Conway's Game of Life to Gene Pool. Clearly these have a long way to go before they achieve any realistic mimicry of living creatures, let alone any form of intelligence.
Some Features of Simulations
Computer simulations are normally used where deterministic solutions are impossible or intractable. Such uses can range from simple Monte Carlo methods for integrating mathematical functions to modelling bird migration behaviour. These simulations typically have stochastic features built into their models; this is normally to mimic the 'random' events which occur from time to time, but can simply be a device for obtaining a more reliable or statistically valid answer.
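As an illustration of the first of these, the sketch below estimates an integral by averaging a function at randomly sampled points. The function, interval and sample count are arbitrary choices for illustration, not taken from any particular application mentioned here.

```python
import random

def monte_carlo_integrate(f, a, b, n=100_000):
    """Estimate the integral of f over [a, b] by sampling f at n random points."""
    total = sum(f(random.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

# Example: integrate x^2 over [0, 1]; the exact answer is 1/3.
print(monte_carlo_integrate(lambda x: x * x, 0.0, 1.0))
```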
It is not actually strictly necessary for a computer to be involved, even in a mathematical simulation. In the 1960s and 70s many simulations were paper-based, involving large numbers of calculations done by hand and recorded in tables. A computer merely makes the calculations easier and facilitates the whole process. In essence, what is required in a simulation is a set of clearly-defined events (normally randomised), a set of rules governing the interactions between events, and a systematic method of recording the results.
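As a rough sketch of these three ingredients, the fragment below simulates a single-server queue of the kind once tabulated by hand: the randomised events are dice-roll arrival and service times, the rule is first come, first served, and the record is a simple table of rows. The scenario and numbers are invented purely for illustration.

```python
import random

random.seed(1)  # fix the 'dice' so the table is reproducible

rows, clock, server_free_at = [], 0, 0
for customer in range(1, 6):
    clock += random.randint(1, 6)           # event: next arrival time (a die roll)
    start = max(clock, server_free_at)      # rule: wait if the server is still busy
    service = random.randint(1, 6)          # event: service time (another die roll)
    server_free_at = start + service
    rows.append((customer, clock, start, service, server_free_at, start - clock))

print("cust  arrive  start  service  finish  wait")   # record: the tabulation
for row in rows:
    print("  ".join(f"{value:>5}" for value in row))
```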
Traditionally, those simulations which involve sequences of events are said to be either discrete- or continuous-time based. Discrete simulations 'time-slice': they move time on in small, clearly-defined increments, and events occur at the point at which the clock 'ticks'. Continuous simulations are event-based: events are executed in order, and in principle the time between events is as long as it takes - this could be years or nanoseconds. However, because a computer is a finite state machine, all computer simulations are in essence based on discrete units of time, and simulated time by default must, in some sense, be quantised. To that extent, if time is continuous, a computer simulation of an external reality, by its very nature, can only ever be an approximation.
Fishman, G.S., 2001, Discrete-Event Simulation: Modeling, Programming, and Analysis, Berlin: Springer-Verlag, ISBN 0-387-95160-1
Douverstein, W., 2006, Continuous Simulation, http://www.cs.uu.nl/docs/vakken/sim/continuous.pdf, accessed February 2007
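The two time-advance schemes can be contrasted in a small sketch. Both fragments below simulate the same toy decay process, one by ticking the clock forward in fixed slices and one by jumping straight to the next event; the process and its rate are arbitrary illustrations rather than anything from the references above.

```python
import random

RATE = 0.1  # illustrative decay probability per atom per unit time

def time_sliced(n_atoms=1000, dt=1.0, horizon=50.0):
    """Discrete 'time-slice' advance: the clock ticks forward in steps of dt."""
    t, remaining = 0.0, n_atoms
    while t < horizon:
        remaining -= sum(random.random() < RATE * dt for _ in range(remaining))
        t += dt
    return remaining

def event_based(n_atoms=1000, horizon=50.0):
    """Event-based advance: the clock jumps to whenever the next decay happens."""
    t, remaining = 0.0, n_atoms
    while remaining > 0 and t < horizon:
        t += random.expovariate(RATE * remaining)  # time until the next decay
        if t < horizon:
            remaining -= 1
    return remaining

print(time_sliced(), event_based())
```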
How Simulations Work
The way that a computer simulation is normally developed is by extracting essential features from the world and finding logical and mathematical relationships between them. These relationships can normally be expressed in the form of algorithms, and work best when changes in the variables can be clearly quantified. The model is then created and tested, to see to what extent it produces results that conform to expectations, normally by attempting to match historical data. Initially the modelling will be quite crude, and the results will bear only a passing resemblance to reality; however, as more features are included and further logical and mathematical relationships are incorporated, a further round of testing can ensue, and the process begins again. This is called the modelling cycle, and is described in detail on Peter Ball's Website.
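A compressed sketch of that cycle, with invented 'historical' data and two candidate models of increasing detail, might look like the following; the data, the models and the tolerance are purely illustrative.

```python
def modelling_cycle(historical, candidate_models, tolerance):
    """Try increasingly detailed candidate models until the fit to
    historical data is judged good enough, then stop."""
    for name, model in candidate_models:
        predictions = [model(t) for t in range(len(historical))]
        error = sum(abs(p - h) for p, h in zip(predictions, historical)) / len(historical)
        print(f"{name}: mean absolute error = {error:.3f}")
        if error < tolerance:
            return name            # good enough: stop pedalling round the cycle
    return name                    # best available; more detail gave little benefit

# Invented 'historical' observations and two candidate models of increasing detail.
historical = [0.0, 0.9, 2.1, 2.9, 4.2, 5.0]
candidates = [("constant", lambda t: 2.5), ("linear", lambda t: 1.0 * t)]
print("selected:", modelling_cycle(historical, candidates, tolerance=0.3))
```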
This is by no means an exact science. The level of detail involved in the modelling can vary enormously. For example, early global weather forecasting systems used blocks of the order of 100 km squares and above. The descriptions of the various climate models from The Canadian Centre for Climate Modelling and Analysis illustrate this well; the most complex model comes in two versions, one with a grid size of 3.75 degrees lat/long and the other with a grid size of 1.85 degrees lat/long. Current forecasting can sometimes use 1 km cubes and smaller. Clearly more detail can potentially mean greater accuracy, but there is a trade-off here between accuracy and speed: the more detail a simulation incorporates, the slower it will run.
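A back-of-the-envelope calculation makes the trade-off concrete: roughly halving the grid spacing of the coarser model above quadruples the number of surface cells to be updated at every time step, and finer grids usually demand shorter time steps as well. The arithmetic below is only indicative.

```python
# Approximate surface cell counts for the two grid resolutions mentioned above.
for res in (3.75, 1.85):
    cells = round(360 / res) * round(180 / res)   # longitude bands x latitude bands
    print(f"{res} degree grid: about {cells:,} cells per atmospheric layer")
```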
The trade-off can be countered by increasing the computing power, but there is a natural limit to this. If we were to simulate our current 'reality', it would necessarily either run at a slower rate, or involve a loss of detail, or, more probably, both. In any mathematical modelling, the Law of Diminishing Returns eventually kicks in: it takes ever more effort to incorporate a greater level of detail, with little appreciable difference in the output. Mathematical modellers instinctively know when to stop pedalling round the modelling cycle, and simulation modellers, too, tend to stop when good results only get slightly better for huge amounts of extra effort.
Typically, simulations which model real life are 'trained' to approximate the aspect of life they are modelling as part of the validation process. Many runs of the simulation are carried out, and variables are changed and relationships tweaked in order to match an already-existing set of data gleaned from the real world. The simulation will then be tested on some new data, making predictions, to see whether these match what we would expect. There is a whole raft of formal validation techniques to ensure that the simulation actually does what it is supposed to do; see, for example, the collection of papers from Robert Sargent, a regular contributor to the Winter Simulation Conference. None of these methods is precise, and there will always be anomalies and 'freak' events which occur, exactly as in real life. A major difficulty is to distinguish between natural freak events and an error in the algorithm or the state of the variables.
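A heavily simplified sketch of this train-then-test procedure is given below: a toy stochastic growth model is calibrated against a 'historical' period by trying a handful of parameter values, and the calibrated model is then judged against held-back data. The model, the parameter grid and the error measure are all invented for illustration and are not taken from the validation literature cited here.

```python
import random

random.seed(0)

def simulate(growth_rate, steps, start=100.0):
    """A toy stochastic simulation: noisy exponential growth (illustrative only)."""
    value, path = start, []
    for _ in range(steps):
        value *= 1 + growth_rate + random.gauss(0, 0.01)
        path.append(value)
    return path

def mean_abs_error(path, observed):
    return sum(abs(p - o) for p, o in zip(path, observed)) / len(observed)

observed = simulate(0.03, 20)                 # stands in for real-world measurements
train, test = observed[:15], observed[15:]    # calibration period / held-back period

# 'Training': tweak the growth_rate parameter until the calibration period matches.
_, best_rate = min((mean_abs_error(simulate(r, 15), train), r)
                   for r in (0.01, 0.02, 0.03, 0.04, 0.05))

# Validation: run the calibrated model over the whole span and check the held-back period.
calibrated = simulate(best_rate, 20)
print(f"calibrated rate = {best_rate}, "
      f"test-period error = {mean_abs_error(calibrated[15:], test):.2f}")
```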
At some point the simulation is accepted as 'valid', and it is put to work, making predictions. As simulations normally contain stochastic features, different runs will result in different predictions. A simple way of countering this is to carry out multiple runs, collect data and perform statistical analyses, creating confidence intervals and undertaking statistical tests. In most cases, this is sufficient to get a fairly reliable answer; however there are cases, for example in weather simulations, where the outcomes of different runs vary so wildly that no reliable forecast can be given. These are the situations where 'the butterfly effect' is so pronounced that prediction is deemed impossible.
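A minimal sketch of that statistical step might look like the following, where the simulation itself is reduced to a noisy placeholder returning a single outcome per run.

```python
import random
import statistics

def one_run():
    """Stand-in for a full stochastic simulation run returning a single outcome."""
    return 50 + random.gauss(0, 5)

runs = [one_run() for _ in range(100)]
mean = statistics.mean(runs)
sem = statistics.stdev(runs) / len(runs) ** 0.5   # standard error of the mean
print(f"mean outcome = {mean:.2f}, approximate 95% confidence interval = "
      f"({mean - 1.96 * sem:.2f}, {mean + 1.96 * sem:.2f})")
```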
Sargent, R.G., 2005, Verification and Validation of Simulation Models, Proceedings of the 37th Conference on Winter Simulation, Orlando, Florida
The Fire in The Equations
An important question to ask is whether the simulation is actually the set of entities, algorithms and relationships, or whether it is the act of simulating itself. In other words, if the world around us is a simulation, what exactly drives it, and what is the essential element that makes it work? Is it the algorithm, the state of the variables and objects, the processing of the algorithm, or some combination of these? This is the parallel question to that asked by Hawking (1996), and further explored by Ferguson in her book "The Fire in the Equations": what essence makes the mathematical equations that describe the universe change from being symbols on a piece of paper to become particles and energy?
There is a clear difference between an algorithm written on paper, or stored electronically as a program, and what happens when the program is run. Processing makes the simulation 'come alive'. The algorithm and the processing can be regarded as two separate features of the simulation, with very different outcomes. Stepping through the algorithm is sufficient to cause it to function, to change the states of objects and to mimic real-world processes. Sequentially following the logic and changing the state of objects in a step-by-step manner appears to breathe life into the system. This not only requires action, it also requires interpretation. A sequence of memory-states alone is not meaningful; it is the interpretation that we put on those memory-states that makes the simulation real.
One interesting conclusion to all this is that if it were possible to simulate conscious minds by a computer program, then that program could be written on paper and stepped through. In stepping through the program, consciousness would somehow be created - but how, and where? This starkly highlights the dilemma: how can it be possible for a set of algorithms to be anything more than just a collection of symbols or a sequence of memory-states? Do we not need a conscious entity to make sense of these, and if so, how could it be possible that a conscious entity itself could ever arise from the same process?
Ferguson, K., 2004, The Fire in the Equations: Science, Religion & the Search for God, Templeton Books
Hawking, S., 1996, A Brief History of Time, Bantam