ABSTRACT
This paper discusses validation and verification of simulation
models. The different approaches to deciding model validity
are presented; how model validation and verification relate
to the model development process is discussed; various
validation techniques are defined; conceptual model validity,
model verification, operational validity, and data validity
are described; ways to document results are given; and a
recommended procedure is presented.
1 INTRODUCTION
Sargent
[Figure: Model confidence versus cost and value; both the cost of achieving model confidence and the value of the model to the user increase as confidence goes from 0% to 100%.]
VALIDATION PROCESS
[Figure: Simplified version of the modeling process. The problem entity is related to the conceptual model through analysis and modeling (conceptual model validity), the conceptual model is related to the computerized model through computer programming and implementation (computerized model verification), and the computerized model is related back to the problem entity through experimentation (operational validity); data validity applies throughout.]
VALIDATION TECHNIQUES
Multistage Validation: Naylor and Finger (1967) combined the historical methods of rationalism, empiricism, and positive economics into a multistage process of validation. This validation method consists of (1) developing the model's assumptions on theory, observations, general knowledge, and function, (2) validating the model's assumptions where possible by empirically testing them, and (3) comparing (testing) the input-output relationships of the model to the real system.
Operational Graphics: Values of various performance
measures, e.g., number in queue and percentage of servers
busy, are shown graphically as the model moves through
time; i.e., the dynamic behaviors of performance indicators
are visually displayed as the simulation model moves through
time.
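To make the idea concrete, here is a minimal sketch (not from the paper) of the data behind such a display: a small M/M/1 queue simulation in Python's standard library that records the number-in-system over time, i.e., the dynamic series an operational-graphics display would animate. The queue model, rates, and function name are illustrative assumptions.

```python
import random

def mm1_number_in_system(arrival_rate, service_rate, horizon, seed=0):
    """Simulate an M/M/1 queue and record (time, number-in-system) pairs,
    the kind of dynamic performance data an operational-graphics display
    would plot as the simulation moves through time."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    next_arrival = rng.expovariate(arrival_rate)
    next_departure = float("inf")
    history = [(0.0, 0)]
    while t < horizon:
        if next_arrival <= next_departure:
            t = next_arrival                       # arrival event
            n += 1
            if n == 1:                             # server was idle; start service
                next_departure = t + rng.expovariate(service_rate)
            next_arrival = t + rng.expovariate(arrival_rate)
        else:
            t = next_departure                     # departure event
            n -= 1
            next_departure = (t + rng.expovariate(service_rate)) if n > 0 else float("inf")
        history.append((t, n))
    return history

history = mm1_number_in_system(arrival_rate=0.8, service_rate=1.0, horizon=100.0)
```

Each `(time, count)` pair would be fed to the graphical display as the run advances.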
Parameter Variability - Sensitivity Analysis: This technique consists of changing the values of the input and internal parameters of a model to determine the effect upon the model's behavior and its output. The same relationships should occur in the model as in the real system. Those parameters that are sensitive, i.e., cause significant changes in the model's behavior or output, should be made sufficiently accurate prior to using the model. (This may require iterations in model development.)
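A minimal sketch of this one-at-a-time approach, using the closed-form M/M/1 mean number in system as a stand-in for a model output (an assumption for illustration; in practice the simulation itself would be rerun at the perturbed parameter values):

```python
def mean_number_in_system(arrival_rate, service_rate):
    """Steady-state mean number in an M/M/1 queue, standing in for a
    model's output measure."""
    rho = arrival_rate / service_rate
    assert rho < 1.0, "queue must be stable"
    return rho / (1.0 - rho)

def sensitivity(base_arrival_rate, delta, service_rate=1.0):
    """One-at-a-time sensitivity: relative change in output per unit
    relative change in the arrival-rate input."""
    y0 = mean_number_in_system(base_arrival_rate, service_rate)
    y1 = mean_number_in_system(base_arrival_rate * (1 + delta), service_rate)
    return ((y1 - y0) / y0) / delta

# Near saturation the output is far more sensitive to the input, so that
# parameter would have to be estimated more accurately before use.
low = sensitivity(0.5, 0.01)
high = sensitivity(0.9, 0.01)
```

The comparison of `low` and `high` shows why sensitive parameters deserve the extra estimation effort the text calls for.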
Predictive Validation: The model is used to predict (forecast) the system behavior, and then comparisons are made between the system's behavior and the model's forecast to determine if they are the same. The system data may come from an operational system or from experiments performed on the system, e.g., field tests.
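A minimal sketch of such a comparison (the paired data, the mean-absolute-error metric, and the tolerance are illustrative assumptions; the accuracy required should come from the model's intended purpose):

```python
def predictive_validation(system_obs, model_forecast, tolerance):
    """Compare model forecasts against paired system observations and
    report the mean absolute error plus a pass/fail against a tolerance
    chosen from the model's intended purpose."""
    assert len(system_obs) == len(model_forecast)
    errors = [abs(s - m) for s, m in zip(system_obs, model_forecast)]
    mae = sum(errors) / len(errors)
    return mae, mae <= tolerance

# Hypothetical field-test data and matching forecasts.
system = [10.2, 11.5, 9.8, 12.0]
forecast = [10.0, 11.0, 10.5, 12.4]
mae, ok = predictive_validation(system, forecast, tolerance=1.0)
```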
Traces: The behavior of different types of specific entities in the model is traced (followed) through the model to determine if the model's logic is correct and if the necessary accuracy is obtained.
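A sketch of the idea: instrument the model logic so each decision an entity encounters is appended to a trace that can be audited step by step. The toy routing logic and names below are invented for illustration.

```python
def process_customer(customer_id, queue_len, server_busy, trace):
    """Toy single-server routing logic; 'trace' accumulates every decision
    made for the entity so the model's logic can be checked."""
    trace.append((customer_id, "arrive", queue_len))
    if server_busy:
        queue_len += 1
        trace.append((customer_id, "join_queue", queue_len))
    else:
        server_busy = True
        trace.append((customer_id, "begin_service", queue_len))
    return queue_len, server_busy, trace

trace = []
q, busy, trace = process_customer(1, 0, False, trace)
q, busy, trace = process_customer(2, q, busy, trace)
```

Reading the trace, one confirms customer 1 went straight into service and customer 2 correctly joined the queue.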
Turing Tests: People who are knowledgeable about
the operations of a system are asked if they can discriminate between system and model outputs. (Schruben (1980)
contains statistical tests for use with Turing tests.)
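One simple statistical treatment of such a study (a sketch, not necessarily one of Schruben's tests): if reviewers cannot discriminate, their identifications are guesses, so the count of correct answers is binomial with p = 1/2, and an exact tail probability can be computed.

```python
from math import comb

def p_value_guessing(n_trials, n_correct):
    """Probability of at least n_correct right answers out of n_trials if
    reviewers were guessing (p = 1/2): an exact binomial test for a
    Turing-style discrimination study."""
    return sum(comb(n_trials, k) for k in range(n_correct, n_trials + 1)) / 2 ** n_trials

# 9 correct identifications out of 10 would be strong evidence that
# reviewers can tell model output from system output.
p = p_value_guessing(10, 9)
```

A small p-value here is bad news for the model: it means the outputs are distinguishable.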
CONCEPTUAL MODEL VALIDITY
Conceptual model validity is determining that (1) the theories and assumptions underlying the conceptual model are correct, and (2) the model representation of the problem entity and the model's structure, logic, and mathematical and causal relationships are reasonable for the intended purpose of the model. The theories and assumptions underlying the model should be tested using mathematical analysis and statistical methods on problem entity data. Examples of theories and assumptions are linearity, independence, stationarity, and Poisson arrivals. Examples of applicable statistical methods are fitting distributions to data, estimating parameter values from the data, and plotting the data to determine if they are stationary. In addition, all theories used should be reviewed to ensure they were applied correctly; for example, if a Markov chain is used, does the system have the Markov property, and are the states and transition probabilities correct?
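Two of these assumption checks can be sketched with simple diagnostics (illustrative only; they are not the specific statistical methods the paper prescribes): a lag-1 autocorrelation for independence and a comparison of first- and second-half means for stationarity.

```python
def lag1_autocorrelation(xs):
    """Lag-1 sample autocorrelation; values near 0 are consistent with
    independence, values near 1 suggest strong serial dependence."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs)
    cov = sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(n - 1))
    return cov / var

def halves_mean_shift(xs):
    """Relative difference between first- and second-half means; a large
    value suggests the data are not stationary."""
    h = len(xs) // 2
    m1 = sum(xs[:h]) / h
    m2 = sum(xs[h:]) / (len(xs) - h)
    return abs(m1 - m2) / abs(m1)

trend = [float(i) for i in range(1, 101)]  # clearly non-stationary, dependent data
```

Applied to the trending series, both diagnostics flag violations, which would send the modeler back to revise the assumptions.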
Next, each submodel and the overall model must be
evaluated to determine if they are reasonable and correct
for the intended purpose of the model. This should include
determining if the appropriate detail and aggregate relationships have been used for the model's intended purpose,
and if the appropriate structure, logic, and mathematical and
causal relationships have been used. The primary validation
techniques used for these evaluations are face validation and
traces. Face validation has experts on the problem entity
evaluate the conceptual model to determine if it is correct and
reasonable for its purpose. This usually requires examining
the flowchart or graphical model, or the set of model equations. The use of traces is the tracking of entities through
each submodel and the overall model to determine if the
logic is correct and if the necessary accuracy is maintained.
If errors are found in the conceptual model, it must be
revised and conceptual model validation performed again.
DATA VALIDITY
MODEL VERIFICATION
OPERATIONAL VALIDITY
[Figure: Classification of operational validity approaches. With an objective approach, an observable system allows comparison using statistical tests and procedures, while a non-observable system requires comparison to other models using statistical tests and procedures.]
Graphs are the most commonly used approach, and confidence intervals are next.
7.1 Graphical Comparison of Data
The behavior data of the model and the system are graphed
for various sets of experimental conditions to determine
if the model's output behavior has sufficient accuracy for
its intended purpose. Three types of graphs are used:
histograms, box (and whisker) plots, and behavior graphs
using scatter plots. (See Sargent (1996a) for a thorough
discussion on the use of these for model validation.) An
example of a box plot is given in Figure 3, and examples
of behavior graphs are shown in Figures 4 and 5. A variety
of graphs using different types of (1) measures such as the
mean, variance, maximum, distribution, and time series of
a variable, and (2) relationships between two measures of a
single variable (see Figure 4) and between measures of two
variables (see Figure 5) are required. It is important that
appropriate measures and relationships be used in validating
a model and that they be determined with respect to the
model's intended purpose. See Anderson and Sargent (1974)
for an example of a set of graphs used in the validation of
a simulation model.
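The measures and relationships behind such graphs can be computed directly; a minimal sketch (the data and bin edges are invented for illustration) producing the summary measures and histogram counts one would plot side by side for model and system:

```python
def summary_measures(xs):
    """Measures of a variable commonly graphed for validation:
    mean, sample variance, and maximum."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)
    return {"mean": mean, "variance": var, "max": max(xs)}

def histogram(xs, edges):
    """Bin counts for a histogram comparison of model vs. system data;
    edges define half-open bins [edges[i], edges[i+1])."""
    counts = [0] * (len(edges) - 1)
    for x in xs:
        for i in range(len(counts)):
            if edges[i] <= x < edges[i + 1]:
                counts[i] += 1
                break
    return counts

# Hypothetical observations of the same output variable.
system = [4.1, 5.0, 5.2, 6.3, 4.8]
model = [4.0, 5.1, 5.5, 6.0, 4.9]
```

Plotting `histogram(system, edges)` against `histogram(model, edges)`, or the paired summary measures, gives exactly the kind of graphical comparison described above.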
These graphs can be used in model validation in different
ways. First, the model development team can use the graphs
in the model development process to make a subjective
judgment on whether a model possesses sufficient accuracy
for its intended purpose. Second, they can be used in the face
validity technique where experts are asked to make subjective
judgments on whether a model possesses sufficient accuracy
for its intended purpose. Third, the graphs can be used
in Turing tests. Another way they can be used is in IV&V.
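Confidence intervals, noted earlier as the next most common approach, can complement these subjective uses. A minimal large-sample sketch (the data are invented, and z = 1.96 assumes an approximately normal sampling distribution at roughly 95% confidence):

```python
from math import sqrt

def diff_mean_ci(system, model, z=1.96):
    """Normal-approximation confidence interval for the difference between
    the system mean and the model mean (large-sample sketch)."""
    def mean_var(xs):
        m = sum(xs) / len(xs)
        v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
        return m, v
    ms, vs = mean_var(system)
    mm, vm = mean_var(model)
    d = ms - mm
    half = z * sqrt(vs / len(system) + vm / len(model))
    return d - half, d + half

lo, hi = diff_mean_ci([10.1, 9.9, 10.3, 10.0], [10.0, 10.2, 9.8, 10.1])
# The model's accuracy is acceptable when the entire interval lies within
# the range of accuracy required for the model's intended purpose.
```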
[Figure 3: Box plot comparing system and model data.]
DOCUMENTATION
Documentation on model verification and validation is usually critical in convincing users of the correctness of a
model and its results, and should be included in the simulation model documentation. (For a general discussion on
documentation of computer-based models, see Gass (1984).)
Both detailed and summary documentation are desired. The
detailed documentation should include specifics on the tests,
evaluations made, data, results, etc. The summary documentation should contain a separate evaluation table for data
validity, conceptual model validity, computer model verification, operational validity, and an overall summary. See
Table 2 for an example of an evaluation table of conceptual
model validity. (See Sargent (1994, 1996b) for examples
of two of the other evaluation tables.) The columns of the
table are self-explanatory except for the last column, which
refers to the confidence the evaluators have in the results
or conclusions, and this is often expressed as low, medium,
or high.
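The summary evaluation table is a simple tabular data structure and can be generated mechanically; a sketch using the column headings described above, with entirely hypothetical row content:

```python
import csv
import io

# Column headings follow the paper's description of a summary evaluation
# table; the single example row is invented purely for illustration.
columns = ["Category", "Technique(s) Used", "Justification for Technique Used",
           "Reference to Supporting Report", "Result/Conclusion", "Confidence in Result"]
rows = [["Assumption: Poisson arrivals", "Face validity", "Experts on the system available",
         "Report A-1 (hypothetical)", "Reasonable", "medium"]]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(columns)
writer.writerows(rows)
table_csv = buf.getvalue()  # CSV text for the summary documentation
```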
[Table 2, flattened in extraction: an evaluation table with columns Technique(s) Used, Justification for Technique Used, Reference to Supporting Report, Result/Conclusion, and Confidence in Result; example techniques include face validity, historical, accepted approach, derived from empirical data, and theoretical derivation. The table closes with strengths, weaknesses, and an overall evaluation (here for computer model verification) giving an overall conclusion, the justification for the conclusion, and the confidence in the conclusion.]
ulation: The Diversity of Applications, ed. T. Hensen, Society for Computer Simulation, San Diego, CA, pp. 245-250.
Sargent, R. G. 1979. Validation of Simulation Models, Proc. of the 1979 Winter Simulation Conf., San Diego, CA, pp. 497-503.
Sargent, R. G. 1981. An Assessment Procedure and a Set of Criteria for Use in the Evaluation of Computerized Models and Computer-Based Modelling Tools, Final Technical Report RADC-TR-80-409.
Sargent, R. G. 1982. Verification and Validation of Simulation Models, Chapter IX in Progress in Modelling and Simulation, ed. F. E. Cellier, Academic Press, London, pp. 159-169.
Sargent, R. G. 1984. Simulation Model Validation, Simulation and Model-Based Methodologies: An Integrative View, ed. Oren, et al., Springer-Verlag.
Sargent, R. G. 1985. An Expository on Verification and Validation of Simulation Models, Proc. of the 1985 Winter Simulation Conf., pp. 15-22.
Sargent, R. G. 1986. The Use of Graphic Models in Model Validation, Proc. of the 1986 Winter Simulation Conf., Washington, D.C., pp. 237-241.
Sargent, R. G. 1988. A Tutorial on Validation and Verification of Simulation Models, Proc. of 1988 Winter Simulation Conf., pp. 33-39.
Sargent, R. G. 1990. Validation of Mathematical Models, Proc. of Geoval-90: Symposium on Validation of Geosphere Flow and Transport Models, Stockholm, Sweden, pp. 571-579.
Sargent, R. G. 1991. Simulation Model Verification and Validation, Proc. of 1991 Winter Simulation Conf., Phoenix, AZ, pp. 37-47.
Sargent, R. G. 1994. Verification and Validation of Simulation Models, Proc. of 1994 Winter Simulation Conf., Lake Buena Vista, FL, pp. 77-87.
Sargent, R. G. 1996a. Some Subjective Validation Methods Using Graphical Displays of Data, Proc. of 1996 Winter Simulation Conf., pp. 345-351.
Sargent, R. G. 1996b. Verifying and Validating Simulation Models, Proc. of 1996 Winter Simulation Conf., pp. 55-64.
Sargent, R. G. 1998. Verification and Validation of Simulation Models, Proc. of 1998 Winter Simulation Conf., pp. 121-130.
Schlesinger, et al. 1979. Terminology for Model Credibility, Simulation, 32, 3, pp. 103-104.
Schruben, L. W. 1980. Establishing the Credibility of Simulations, Simulation, 34, 3, pp. 101-105.
Shannon, R. E. 1975. Systems Simulation: The Art and the Science, Prentice-Hall.
Whitner, R. B. and O. Balci. 1989. Guidelines for Selecting and Using Simulation Model Verification Techniques,