Validity and Assurance of Simulation-Based Research

Computer models and simulations have come a long way toward becoming accepted, even familiar, research tools. Novel programming paradigms and modelling techniques have already allowed researchers to push back the complexity barrier of mathematical models; object-oriented programming and agent-based modelling are instances of such computer science contributions that are now part of the scientific vocabulary. Moreover, simulation-based research may be the only means of studying exotic topics, e.g. artificial life or other virtual realities. Simulation-based research nevertheless faces two major, entwined challenges: validity and interpretation. Constructing "valid" simulators is difficult, especially when validity itself is not well understood and the topics being studied are complex in nature. Interpreting simulation results is also problematic: the extent to which an abstraction can be said to adequately reflect a complex reality is hard to fully evaluate.

This research has two main aims: firstly, to build understanding and capacity in the validation of simulation-based research; and secondly, to contribute to the CoSMoS process for developing scientific simulations with documented validity.

This research brings together three scientific fields: computer science, Safety Critical Systems (SCS) and ecology. While its foundations lie in computer science, SCS serves as a source of insight into the use of effective argumentation techniques. CoSMoS has already criticised the often naive modelling efforts that fail to provide an adequate measure of their validity; SCS offers a broad repertoire of techniques for constructing large-scale, tractable arguments, and we believe simulation-based research needs such capabilities. As such, we have been probing SCS for philosophical and pragmatic contributions. We are also evaluating the use of HAZOP deviational analysis to better understand research assumptions (both modelling and simulation assumptions).
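HAZOP proceeds by pairing guide words with elements of a design to prompt a systematic search for deviations. As a minimal sketch of how this might carry over to research assumptions (the guide words are the standard HAZOP set, but the example assumption and the pairing logic are purely illustrative, not part of CoSMoS or of our actual analysis):

```python
# Sketch: applying HAZOP-style guide words to a modelling assumption
# to generate candidate deviations for an analyst to review.
# The assumption text below is hypothetical, for illustration only.

GUIDE_WORDS = ["No", "More", "Less", "As well as",
               "Part of", "Reverse", "Other than"]

def deviations(assumption):
    """Pair each guide word with the assumption, yielding prompts
    that an analyst then interprets (or discards) case by case."""
    return [f"{word}: {assumption}" for word in GUIDE_WORDS]

if __name__ == "__main__":
    for prompt in deviations("seed dispersal is uniform across the grid"):
        print(prompt)
```

Each generated prompt (e.g. "No: seed dispersal is uniform across the grid") is only a starting point; the analytical work lies in deciding which deviations are credible and what their consequences for validity would be.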

Ecology is a domain that has not only employed computer models and simulations in its research, but has also actively contributed to simulation science by proposing new modelling, validation and documentation techniques. For our purposes, ecology is both a source of inspiration and a domain of applicability: this work is grounded in a plant ecology case study. For experimentation we are using four simulators implementing the plant ecology model: the original one developed in C (cplants), an occam-pi simulation (occplants), a NetLogo version (nplants) and, finally, a distributed version of the occam-pi simulation (occplants2, currently under development).

The scientific literature is prolific in uncertainty, robustness and sensitivity analysis techniques, with more recent efforts directed towards analysing complex systems simulations. Our work on structured argumentation and the analysis of research assumptions is intended to complement and draw on such techniques.

0. Beginnings: Literature Review and Qualifying Dissertation

1. A first use of structured argumentation in a scientific context: arguing the equivalence of complex systems simulations

2. Making validity and structured argumentation central to the research process: argument-driven validation

3. Assurance of complex systems simulations

Five decades ago, validation was described as "the most elusive of all the unresolved problems associated with computer simulation techniques" [1]. The situation is much the same in the context of modern, large-scale, complex systems simulations. Data limitations, ungrounded assumptions and incomplete testing are just some of the factors that make simulation-based research hard to validate. Validity cannot be demonstrated or proved to an absolute extent. Researchers need to provide compelling evidence and argumentation capable of building confidence in the quality of their work: they need to provide assurance.
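One SCS technique for keeping such assurance arguments tractable is the Goal Structuring Notation (GSN), which decomposes a top-level claim into strategies, sub-goals and solutions (evidence). The toy data structure below is our own illustration of the idea, not the CoSMoS process or any of our actual arguments; all node texts are hypothetical:

```python
# Sketch: a GSN-like argument fragment as a tree, with a helper that
# flags undeveloped goals (claims not yet supported by evidence).
from dataclasses import dataclass, field

@dataclass
class Node:
    kind: str                      # "Goal", "Strategy" or "Solution"
    text: str
    children: list = field(default_factory=list)

def undeveloped(node):
    """Collect goals with no supporting children: these mark where
    further evidence or argument is still needed."""
    found = []
    if node.kind == "Goal" and not node.children:
        found.append(node)
    for child in node.children:
        found.extend(undeveloped(child))
    return found

# Hypothetical fragment of a simulation-validity argument.
top = Node("Goal", "Simulation results adequately reflect the ecological model")
strategy = Node("Strategy", "Argue over calibration, testing and assumptions")
top.children.append(strategy)
strategy.children.append(Node("Goal", "Calibration data is representative"))
strategy.children.append(Node("Solution", "Regression tests against cplants output"))
```

Even in this tiny fragment, `undeveloped(top)` would flag the calibration sub-goal as still lacking evidence; at realistic scale, such bookkeeping is what makes a large argument reviewable.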

4. Assumptions analysis

The use of computer models and simulations provides unique advantages, but is also affected by two main classes of uncertainty: epistemic and aleatory. The simulation literature documents a number of analysis techniques: uncertainty analysis, sensitivity analysis, robustness analysis, etc. This ongoing research phase investigates the use of argumentation techniques in conjunction with these existing analysis techniques, in order to obtain a better understanding of model performance and of the impact of its assumptions.
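To make these analyses concrete, the sketch below propagates an uncertain input through a toy growth function (standing in for a simulator run; it is not the plant ecology model) and computes a one-at-a-time sensitivity. The function, parameter names and ranges are all invented for illustration:

```python
# Sketch: basic uncertainty analysis (Monte Carlo input sampling) and
# one-at-a-time sensitivity analysis on a hypothetical model output.
import random

def growth(rate, steps=10, seed_mass=1.0):
    """Toy, hypothetical growth response; a stand-in for a simulator
    output, not the actual plant ecology model."""
    mass = seed_mass
    for _ in range(steps):
        mass *= (1.0 + rate)
    return mass

def uncertainty_analysis(n=1000, lo=0.05, hi=0.15, rng=None):
    """Aleatory-style uncertainty: propagate a sampled input
    distribution through the model and summarise the output spread."""
    rng = rng or random.Random(42)
    outputs = [growth(rng.uniform(lo, hi)) for _ in range(n)]
    return sum(outputs) / n, min(outputs), max(outputs)

def sensitivity(rate=0.10, delta=0.01):
    """One-at-a-time sensitivity: finite-difference response of the
    output to a small perturbation of one input."""
    return (growth(rate + delta) - growth(rate - delta)) / (2 * delta)
```

The epistemic side is harder to capture in code: whether `growth` is the right model at all is exactly the kind of assumption that the argumentation techniques above are meant to expose and examine.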
