
Practical and Technical Challenges in Verification and Validation

William L. Oberkampf, PhD


Sandia National Laboratories (retired); Consulting Engineer, Austin, Texas; wloconsulting@gmail.com

American Society of Mechanical Engineers Verification and Validation Symposium, Las Vegas, Nevada, May 2-4, 2012

Outline

• History of the development of V&V standards
• Ideas for resolution of conflicting concepts
• Technical challenges in modeling and simulation V&V
• Conclusions

Goals of Verification and Validation

• Assessment and improvement of the credibility, accuracy, and trustworthiness of products or services

• Assessment procedures:
  - Principles and procedures must be applicable to broad classes of products or services
  - Must be able to make objective measurements of the trustworthiness of contributing elements

• Improvement procedures:
  - Procedures must be applicable to contributing elements, sub-elements, sub-sub-elements, etc.
  - Procedures must be relevant to the trustworthiness of the product or service

Institute of Electrical and Electronics Engineers

• 1012-1986 IEEE: Standard for Software Verification and Validation Plans
  - Provided minimum requirements for software V&V plans
  - Definition-Verification: The process of evaluating the products of a software development phase to provide assurance that they meet the requirements defined for them by the previous phase
  - Definition-Validation: The process of testing a computer program and evaluating the results to ensure compliance with specific requirements

• 1059-1993 IEEE: Guide for Software Verification and Validation Plans
  - Recommended approaches for improved software V&V planning

American Nuclear Society

• ANS-10.4-1987 (Renewed 1998): Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry
  - Recommended guidelines for V&V of scientific and engineering computer programs
  - Definition-Verification: The process of evaluating the products of a software development phase to provide assurance that they meet the requirements defined for them by the previous phase
  - Definition-Validation: The process of testing a computer program and evaluating the results to ensure compliance with specified requirements

These guidelines are based on a software V&V perspective.

U. S. Department of Defense

• In the early 1990s the Defense Modeling and Simulation Office (DMSO) was tasked to study V&V concepts:
  - Are the concepts of V&V established by IEEE appropriate for DoD needs?

• In 1994, fundamentally different concepts were codified:
  - Definition-Verification: The process of determining that a model implementation accurately represents the developer's conceptual description of the model
  - Definition-Validation: The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model

The emphasis shifted away from software reliability and toward modeling and simulation credibility.

American Institute of Aeronautics and Astronautics

• In 1992 the AIAA Committee on Standards for Computational Fluid Dynamics began studying varying terminology and concepts of V&V

• In 1998 the Committee produced the first engineering standard for V&V based on M&S concepts: Guide for the V&V of Computational Fluid Dynamics Simulations (AIAA-G-077-1998, renewed 2009)
  - Definition-Verification: The process of determining that a model implementation accurately represents the developer's conceptual description of the model and the solution to the model
  - Definition-Validation: Same as the DoD definition

• Codified the concepts of:
  - The importance of solution verification
  - A validation hierarchy
  - Prediction distinguished from validation

American Society of Mechanical Engineers

• In 2001 the first V&V standards committee was formed in ASME: the Committee on V&V in Computational Solid Mechanics

• In 2006 the Committee produced ASME V&V 10-2006: Guide for V&V in Computational Solid Mechanics

• Codified the concepts of:
  - Conceptual model, mathematical model, and computational model
  - Code verification distinguished from solution verification
  - A comprehensive view of validation, i.e., the prediction for the conditions of the application of interest must also satisfy the specified accuracy requirements of the model
  - Uncertainty quantification explicitly required in V&V

Institute of Electrical and Electronics Engineers

• 1278.4-1997 IEEE (Trial-Use): Recommended Practice for Distributed Interactive Simulation - Verification, Validation and Accreditation
  - Recommended how-to guidelines for VV&A of distributed interactive simulation exercises

• 1012-1998 IEEE (revision of 1012-1986): Standard for Software Verification and Validation
  - Recommended software V&V processes to assess conformance to requirements and to the intended use of the software

• 1012-2004 IEEE (revision of 1012-1998): Standard for Software Verification and Validation
  - Software V&V includes management, acquisition, supply, development, operation, and maintenance of software

Institute of Electrical and Electronics Engineers

• 1516.4-2007 IEEE: Recommended Practice for VV&A of a Federation: An Overlay to the High Level Architecture Federation Development and Execution Process
  - Recommends VV&A practices and procedures for a high-level federation of software

• 1597.1-2008 IEEE: Standard for Validation of Computational Electromagnetics Computer Modeling and Simulations
  - Recommends procedures to validate modeling and simulation techniques, codes, and models

• 1597.2-2010 IEEE: Recommended Practice for Validation of Computational Electromagnetics Computer Modeling and Simulation
  - Shows how to validate solutions using measurements, alternate codes, canonical, or analytical methods

The 1597 standards clearly apply to M&S of physical processes.



American Nuclear Society / International Organization for Standardization

• ANS 10.4-2008 (revision of 10.4-1998): Verification and Validation of Non-Safety-Related Scientific and Engineering Computer Programs for the Nuclear Industry

• ISO 14064-3:2006, Part 3: Specification with Guidance for the Validation and Verification of Greenhouse Gas Assertions

• ISO 14065:2007: Greenhouse Gases - Requirements for Greenhouse Gas Validation and Verification Bodies for Use in Accreditation or Other Forms of Recognition

• ISO 16730:2008: Fire Safety Engineering - Assessment, Verification and Validation of Calculation Methods

• ISO 10303-1488:2010: Industrial Automation Systems and Integration - Product Data Representation and Exchange - Part 1488: Verification and Validation

• ISO 14066:2011: Greenhouse Gases - Competence Requirements for Greenhouse Gas Validation Teams and Verification Teams

American Society of Mechanical Engineers / American Society of Civil Engineers

• ASME V&V 20-2009: Standard for Verification and Validation in Computational Fluid Dynamics and Heat Transfer

• ASCE Standard (2009): Verification and Validation of 3D Free-Surface Flow Models

• ASME V&V 10.1-2012: An Illustration of the Concepts of Verification and Validation in Computational Solid Mechanics

These standards take an M&S V&V perspective.

20 conceptually conflicting V&V standards have been produced. Houston, we have a problem.


Ideas for Resolution of Conflicting Views of V&V

• What are the primary goals of software V&V vs. M&S V&V?
  - Software V&V is focused on software reliability
  - M&S V&V is focused on simulation credibility

• What does software V&V vs. M&S V&V produce?
  - Software V&V produces software, i.e., a product
  - M&S V&V produces information, i.e., a service


Four Critical Challenges in M&S V&V: Development of a V&V Plan

• Examples of questions that should be answered in the V&V plan (one way of recording the answers is sketched after this list):
  - What is the application domain over which the model is expected to make predictions?
  - What system response quantities (SRQs) is the model expected to predict?
  - What are the code and solution verification requirements?
  - What validation hierarchy is appropriate for the system of interest?
  - What is the validation domain for each tier of the validation hierarchy?
  - What validation metrics are to be used?
  - What are the accuracy requirements for the model in the validation domain?
  - What are the accuracy requirements for the model in the application domain?
  - What are the cost, schedule, and manpower requirements to complete the V&V plan?
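As an illustration only (not part of the original slides), the answers to these planning questions can be recorded in a simple structured form so they are reviewable and traceable. The field names and example values below are hypothetical and would be tailored to each program.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class VVPlan:
    """Hypothetical container for the answers to the V&V planning questions."""
    application_domain: Dict[str, Tuple[float, float]]   # input parameter -> (min, max)
    srqs: List[str]                                       # system response quantities to predict
    code_verification_reqs: str                           # e.g., required order-of-accuracy tests
    solution_verification_reqs: str                       # e.g., allowed discretization error
    validation_hierarchy: List[str]                       # tiers of the validation hierarchy
    validation_domain: Dict[str, Dict[str, Tuple[float, float]]]  # tier -> parameter ranges
    validation_metrics: List[str]                         # e.g., "area metric"
    accuracy_req_validation: float                        # allowed disagreement in the validation domain
    accuracy_req_application: float                       # allowed disagreement in the application domain
    resources: Dict[str, str] = field(default_factory=dict)  # cost, schedule, staffing notes

# Illustrative instance (all numbers and names are invented)
plan = VVPlan(
    application_domain={"Mach": (0.3, 2.0), "AoA_deg": (-5.0, 20.0)},
    srqs=["lift coefficient", "surface pressure"],
    code_verification_reqs="method of manufactured solutions, observed order >= 1.9",
    solution_verification_reqs="estimated discretization error < 2% for each SRQ",
    validation_hierarchy=["unit problem", "benchmark case", "subsystem", "complete system"],
    validation_domain={"benchmark case": {"Mach": (0.3, 0.9)}},
    validation_metrics=["area metric"],
    accuracy_req_validation=0.05,
    accuracy_req_application=0.10,
)
```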

Where Do We Stand with Regard to Constructing and Using V&V Plans?

Validation Metrics

• What is a validation metric?
  - A measure of agreement between computational results and experimental measurements for the SRQs of interest

• Steps to evaluate a validation metric result (a minimal numerical sketch follows this list):
  1) Choose a system response quantity of interest
  2) Experimentally measure, if possible, all input quantities needed for the code
  3) Experimentally measure the system response quantity of interest
  4) Using the code and all of the input data provided, compute the system response quantity of interest
  5) Compute a difference between the experimental measurements and the computational results
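To make step 5 concrete, here is a minimal sketch (not from the slides) in the spirit of the Oberkampf and Barone (2006) metric cited in the references: the estimated model error is the difference between the mean of repeated measurements and the simulation result, with a confidence half-width reflecting the finite number of experiments. The function name and the hard-coded t value are illustrative assumptions.

```python
import math
import statistics

def validation_metric(y_sim, y_exp, t_crit=2.776):
    """Estimated model error and its confidence half-width for one SRQ.

    y_sim is the computed SRQ, y_exp a list of repeated measurements of the
    same SRQ. t_crit is the Student-t critical value for the chosen confidence
    level and len(y_exp) - 1 degrees of freedom (2.776 corresponds to 95% with
    five measurements); in practice it would be looked up, not hard-coded.
    """
    n = len(y_exp)
    y_bar = statistics.mean(y_exp)
    s = statistics.stdev(y_exp)              # sample standard deviation of the measurements
    error = y_bar - y_sim                    # estimated model error (step 5 above)
    half_width = t_crit * s / math.sqrt(n)   # confidence half-width on that error
    return error, half_width

# Example: five repeated measurements of one SRQ vs. one simulation result
err, hw = validation_metric(y_sim=101.3, y_exp=[98.7, 99.5, 100.2, 101.0, 99.1])
print(f"estimated error = {err:.2f} +/- {hw:.2f}")
```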


Model Accuracy Assessment, Calibration and Prediction

(from Oberkampf and Barone, 2006)



Approaches to Constructing Validation Metrics

1. Hypothesis testing methods
2. Comparing the statistical mean of the simulation and the mean of the experimental measurements
3. Bayesian methods
4. Assessing whether the simulation passes through the scatter in the experimental data
5. Comparison of cumulative distribution functions from the simulation and the experimental measurements (the area metric; a sketch follows)

What are the goals of using a validation metric?
• Estimating model form uncertainty
• Assessing model adequacy with respect to application requirements
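For approach 5, the area metric discussed by Ferson, Oberkampf, and Ginzburg (2008) in the reference list, here is a minimal sketch under the assumption that both the simulation and the experiment are represented by samples of the SRQ; the function name and the example numbers are illustrative.

```python
import numpy as np

def area_metric(sim_samples, exp_samples):
    """Area between the empirical CDFs of simulation and experiment.

    The metric is the integral of the absolute difference between the two
    empirical CDFs, so it has the same units as the SRQ. In practice the
    simulation side may instead be a full distribution or a p-box.
    """
    sim = np.sort(np.asarray(sim_samples, dtype=float))
    exp = np.sort(np.asarray(exp_samples, dtype=float))
    # Evaluate both step CDFs on the union of sample values.
    grid = np.union1d(sim, exp)
    f_sim = np.searchsorted(sim, grid, side="right") / sim.size
    f_exp = np.searchsorted(exp, grid, side="right") / exp.size
    # Integrate |F_sim - F_exp| over the grid (both CDFs are constant between breakpoints).
    widths = np.diff(grid)
    return float(np.sum(np.abs(f_sim - f_exp)[:-1] * widths))

# Example: simulation predictions vs. a handful of measurements of one SRQ
rng = np.random.default_rng(0)
d = area_metric(rng.normal(10.0, 1.0, 1000), np.array([11.2, 10.8, 11.5, 10.9]))
print(f"area metric = {d:.3f} (same units as the SRQ)")
```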

Extrapolation of Models

• At each validation point (marked "V" in the original figure) in the validation domain, one can:
  - Calibrate parameters
  - Compute a validation metric result

• To approximate the validation metric over the validation domain, one can use:
  - An interpolation function
  - A regression fit

• Beyond the validation domain, one must extrapolate the model (a regression-based sketch follows)

(adapted from Trucano et al, 2002)
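Not from the slides: a minimal one-dimensional illustration of representing validation metric results over the validation domain with a regression fit and then evaluating that fit at an application condition outside the domain. The parameter names and numbers are invented, and in practice the extrapolated metric would carry additional epistemic uncertainty.

```python
import numpy as np

def fit_metric_trend(x_valid, d_valid, degree=1):
    """Least-squares polynomial fit of validation metric results vs. one input parameter.

    x_valid holds the input-parameter values of the validation points and
    d_valid the corresponding validation metric results. The returned callable
    can be evaluated inside the validation domain (regression/interpolation)
    or outside it (extrapolation); extrapolated values should be treated as
    increasingly uncertain the farther they are from the data.
    """
    coeffs = np.polyfit(x_valid, d_valid, deg=degree)
    return np.poly1d(coeffs)

# Validation metric results at four validation points (illustrative numbers)
x_valid = np.array([0.3, 0.5, 0.7, 0.9])      # e.g., Mach number
d_valid = np.array([0.02, 0.03, 0.05, 0.08])  # metric result at each point

trend = fit_metric_trend(x_valid, d_valid)
print("within the validation domain, d(0.6) ~", round(float(trend(0.6)), 3))
print("extrapolated to an application point, d(1.4) ~", round(float(trend(1.4)), 3))
```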



Prediction Within the Validation Domain: Interpolation

• Traditionally, model form uncertainty was not estimated
• As a result, model form uncertainty was ignored
• For high-dimensional input spaces, it is difficult to determine whether one is interpolating or extrapolating (a convex-hull check is sketched below)

(from Oberkampf and Roy, 2010)
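As an aside (not in the slides), one common way to decide whether a new prediction point is an interpolation or an extrapolation is a convex-hull membership test over the validation points; a SciPy-based sketch follows. For very high-dimensional input spaces this particular construction becomes impractical, and a linear-programming formulation of the same test is often used instead.

```python
import numpy as np
from scipy.spatial import Delaunay

def inside_validation_hull(x_valid, x_query):
    """True where a query point lies inside the convex hull of the validation points.

    Points outside the hull require extrapolation of the model and of the
    validation metric; points inside can be treated as interpolation.
    """
    hull = Delaunay(x_valid)                 # triangulation of the validation points
    return hull.find_simplex(x_query) >= 0   # find_simplex returns -1 outside every simplex

# Four validation points in a 2-D input space (illustrative)
x_valid = np.array([[0.3, 0.0], [0.9, 0.0], [0.3, 10.0], [0.9, 10.0]])
x_query = np.array([[0.6, 5.0],    # inside  -> interpolation
                    [1.4, 5.0]])   # outside -> extrapolation
print(inside_validation_hull(x_valid, x_query))   # [ True False]
```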



Prediction Far Outside the Validation Domain: Large Extrapolation

• Extrapolations can occur in terms of:
  - Input quantities
  - Non-parametric spaces

• Extrapolation may require:
  - Large changes in coupled physics, e.g., heating effects on structural dynamics
  - Large changes in geometry or subsystem interactions, e.g., partially melted fuel rods in a reactor

• Large extrapolations should result in large increases in uncertainty

(from Oberkampf and Roy, 2010)


Predictive Capability

y = f(x), where x = {x1, x2, ..., xm} is the vector of model input quantities and y = {y1, y2, ..., yn} is the vector of predicted system response quantities

(from Oberkampf and Roy, 2010)



Sources of Uncertainty

• Uncertainty in input parameters (mathematical and numerical):
  - Input data parameters (independently measurable and non-measurable)
  - Uncertainty modeling parameters
  - Numerical algorithm parameters

• Numerical solution error:
  - Round-off error
  - Iterative error
  - Spatial and temporal discretization error (a Richardson-extrapolation sketch appears below)

• Model form uncertainty:
  - Estimated over the validation domain using a validation metric
  - Extrapolation of the validation metric outside of the validation domain
  - Estimated at the application conditions using competing models

What is included in Predictive Capability?
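As a concrete aside on the discretization-error item above (not part of the slides), a standard estimate comes from Richardson extrapolation using solutions on systematically refined grids; the drag-coefficient numbers below are invented.

```python
import math

def observed_order(f1, f2, f3, r):
    """Observed order of accuracy from three solutions on uniformly refined grids.

    f1 is the finest-grid solution and f3 the coarsest; r is the constant
    refinement factor between successive grids.
    """
    return math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)

def richardson_error_estimate(f_fine, f_coarse, r, p):
    """Estimated discretization error remaining in the fine-grid solution.

    f_fine and f_coarse are the SRQ computed on two grids whose spacing differs
    by the refinement factor r > 1, and p is the (observed or formal) order of
    accuracy.
    """
    return (f_fine - f_coarse) / (r**p - 1.0)

# Illustrative drag-coefficient values on three grids refined by a factor of 2
f1, f2, f3 = 0.02510, 0.02542, 0.02671           # fine, medium, coarse
p = observed_order(f1, f2, f3, r=2.0)            # observed order of accuracy
err = richardson_error_estimate(f1, f2, r=2.0, p=p)
print(f"observed order ~ {p:.2f}, estimated error in fine-grid solution ~ {err:.5f}")
```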



Example of Probability-box with a Mixture of Aleatory and Epistemic Uncertainty

(from Roy and Oberkampf, 2011)
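Since the figure itself cannot be reproduced here, the following generic double-loop sampling sketch (not the example from Roy and Oberkampf, 2011) shows how such a p-box arises when an interval-valued (epistemic) input is combined with aleatory scatter; all distributions and numbers are invented.

```python
import numpy as np

def pbox_bounds(n_epistemic=50, n_aleatory=2000, seed=0):
    """Lower and upper CDF bounds (a p-box) for an SRQ via double-loop sampling.

    The mean of a normally distributed input is known only to lie in an
    interval (epistemic), while the scatter about that mean is aleatory.
    The outer loop samples the epistemic interval, the inner loop propagates
    the aleatory distribution, and the envelope of the resulting CDFs forms
    the p-box.
    """
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.0, 20.0, 200)                   # SRQ values where the CDFs are evaluated
    cdfs = []
    for mu in np.linspace(8.0, 12.0, n_epistemic):       # epistemic interval for the input mean
        samples = rng.normal(mu, 1.5, n_aleatory)        # aleatory variability about that mean
        srq = 0.5 * samples + 3.0                        # stand-in for the model y = f(x)
        cdfs.append(np.searchsorted(np.sort(srq), grid, side="right") / n_aleatory)
    cdfs = np.array(cdfs)
    return grid, cdfs.min(axis=0), cdfs.max(axis=0)      # p-box: lower and upper CDF bounds

grid, lower, upper = pbox_bounds()
i = np.searchsorted(grid, 8.0)
print("p-box width at SRQ = 8.0:", float(upper[i] - lower[i]))
```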


Example Showing Total Uncertainty Using Alternate Competing Models

Predicted Track of Hurricane Emily 2005

(from Green, 2007)



Concluding Remarks

• Conflicting views between software V&V and M&S V&V can be resolved if:
  - It is accepted that we are interested in different deliverables
  - Turf is considered secondary to the advancement of technology and the public good

• Technical progress is critically needed in:
  - Improved guidance and practice in constructing V&V plans
  - Construction, interpretation, and use of validation metrics
  - Approaches to extrapolating various types of uncertainties
  - Approaches to estimating and interpreting predictive uncertainty

Quote from William H. Press: "Simulation and mathematical modeling will power the 21st Century the way steam powered the 19th."


References
 Ayyub, B. M. and G. J. Klir (2006). Uncertainty Modeling and Analysis in Engineering
and the Sciences, Boca Raton, FL, Chapman & Hall.

 Bayarri, M. J., J. O. Berger, R. Paulo, J. Sacks, J. A. Cafeo, J. Cavendish, C. H. Lin, and


J. Tu (2007), A Framework for Validation of Computer Models, Technometrics, Vol. 49, No. 2, pp. 138-154.

 Coleman, H. W. and F. Stern (1997), Uncertainties and CFD Code Validation, Journal
of Fluids Engineering, Vol. 119, pp. 795-803.

 Chen, W., L. Baghdasaryan, T. Buranathiti, and J. Cao (2004), Model Validation via
Uncertainty Propagation and Data Transformations, AIAA Journal, Vol. 42, No. 7, pp. 1406-1415.

 Chen, W., Y. Xiong, K-L Tsui, and S. Wang (2008), A Design-Driven Validation
Approach Using Bayesian Prediction Models, Journal of Mechanical Design, Vol. 130, No. 2.

 Dowding, K., R. G. Hills, I. Leslie, M. Pilch, B. M. Rutherford, and M. L. Hobbs (2004),


Case Study for Model Validation: Assessing a Model for Thermal Decomposition of Polyurethane Foam, Sandia National Laboratories, SAND2004-3632, Albuquerque, NM.

 Ferson, S., W. L. Oberkampf, and L. Ginzburg (2008), Model Validation and Predictive
Capability for the Thermal Challenge Problem, Computer Methods in Applied Mechanics and Engineering, Vol. 197, pp. 2408-2430.

 Ferson, S. and W. L. Oberkampf (2009), Validation of Imprecise Probability Models,


International Journal of Reliability and Safety, Vol. 3, No. 1-3, pp. 3-22.


References (continued)
 Ferson, S. and W. T. Tucker (2006), Sensitivity Analysis Using Probability Bounding,
Reliability Engineering and System Safety, Vol. 91, No. 10-11, pp. 1435-1442.

 Green, L. L. (2007), Uncertainty Analysis of Historical Hurricane Data, American Institute


of Aeronautics and Astronautics, Paper 2007-1101.

Helton, J. C., J. D. Johnson, C. J. Sallaberry, and C. B. Storlie (2006), Survey of Sampling-Based Methods for Uncertainty and Sensitivity Analysis, Reliability Engineering and System Safety, Vol. 91, No. 10-11, pp. 1175-1209.

 Hasselman, T. K. (2001), Quantification of Uncertainty in Structural Dynamic Models,


Journal of Aerospace Engineering, Vol. 14, No. 4, pp. 158-165.

 Hills, R. G. (2006), Model Validation: Model Parameter and Measurement Uncertainty,


Journal of Heat Transfer, Vol. 128, No. 4, pp. 339-351.

 Hills, R. G. and T. G. Trucano (2002), "Statistical Validation of Engineering and Scientific


Models: A Maximum Likelihood Based Metric," Sandia National Laboratories, SAND2001-1783, Albuquerque, NM.

 Kaplan, S. and B. J. Garrick (1981). "On the Quantitative Definition of Risk." Risk Analysis.
1(1), 11-27.


References (continued)

Kennedy, M. C. and A. O'Hagan (2001), Bayesian Calibration of Computer Models, Journal of the Royal Statistical Society, Series B (Statistical Methodology), Vol. 63, No. 3, pp. 425-450.

 Morgan, M. G. and M. Henrion (1990). Uncertainty: A Guide to Dealing with Uncertainty


in Quantitative Risk and Policy Analysis. 1st Ed., Cambridge, UK, Cambridge University Press.

O'Hagan, A. (2006), Bayesian Analysis of Computer Code Outputs: A Tutorial, Reliability Engineering and System Safety, Vol. 91, No. 10-11, pp. 1290-1300.

 Oberkampf, W. L. and M. F. Barone (2006), "Measures of Agreement Between


Computation and Experiment: Validation Metrics," Journal of Computational Physics, Vol. 217, No. 1, pp. 5-36; also, Sandia National Laboratories, SAND2005-4302.

 Oberkampf, W. L. and T. G. Trucano (2002), Verification and Validation in


Computational Fluid Dynamics, Progress in Aerospace Sciences, Vol. 38, No. 3, pp. 209-272.

 Oberkampf, W. L., T. G. Trucano and C. Hirsch (2004). "Verification, Validation, and


Predictive Capability in Computational Engineering and Physics." Applied Mechanics Reviews. 57(5), 345-384.

 Oberkampf, W. L. and T. G. Trucano (2008). "Verification and Validation Benchmarks."


Nuclear Engineering and Design. 238(3), 716-743.

 Oberkampf, W.L. and C. J. Roy (2010), Verification and Validation in Scientific


Computing, Cambridge University Press, Cambridge, UK.


References (continued)
 Roache, P. (2009). Fundamentals of Verification and Validation, Socorro, New Mexico,
Hermosa Publishers.

 Roy, C. J. (2005). "Review of Code and Solution Verification Procedures for


Computational Simulation." Journal of Computational Physics. 205(1), 131-156.

 Roy, C. J. and W. L. Oberkampf (2011). "A Comprehensive Framework for Verification,


Validation, and Uncertainty Quantification in Scientific Computing." Computer Methods in Applied Mechanics and Engineering. 200(25-28), 2131-2144.

 Saltelli, A., M. Ratto, T. Andres, F. Campolongo, J. Cariboni, D. Gatelli, M. Saisana, S.


Tarantola (2008), Global Sensitivity Analysis: The Primer, Wiley, Hoboken, NJ.

 Sprague, M. A. and T. L. Geers (1999), Response of Empty and Fluid-Filled, Submerged


Spherical Shells to Plane and Spherical, Step-Exponential Acoustic Waves, Shock and Vibration, Vol. 6, No. 3, pp. 147-157.

 Stern, F., R. V. Wilson, H. W. Coleman and E. G. Paterson (2001), Comprehensive


Approach to Verification and Validation of CFD Simulations-Part 1: Methodology and Procedures, Journal of Fluids Engineering, Vol. 123, No. 4, pp. 793-802.

 Trucano, T. G., M. Pilch and W. L. Oberkampf. (2002). "General Concepts for Experimental
Validation of ASCI Code Applications." Sandia National Laboratories, SAND2002-0341, Albuquerque, NM.

 Zhang, R. and S. Mahadevan (2003), Bayesian Methodology for Reliability Model


Acceptance, Reliability Engineering and System Safety, Vol. 80, No. 1, pp. 95-103.

