Features of Software Solutions in the Field of Verification and Validation of Numerical Models



Bibliographic details
Date: 2025
Authors: Гейко О. О., Варава І. А.
Format: Article
Language: Ukrainian
Published: Інститут проблем реєстрації інформації НАН України, 2025
Online access: https://drsp.ipri.kiev.ua/article/view/354592
Journal: Data Recording, Storage & Processing

Abstract: Numerical modeling increasingly replaces expensive experiments and physical prototypes in engineering and research, but the practical value of simulations depends on the demonstrated credibility of their results. This study systematizes contemporary approaches and software capabilities for verification and validation (V&V) of numerical models in the context of widely used industry standards and guidance. The discussion covers standardization of the model life cycle and evidence requirements for V&V, including NASA practices (NASA-STD-7009), U.S. Department of Defense VV&A guidance, and the ASME V&V/VVUQ family of standards, which also introduces uncertainty quantification (UQ) as an integral element of decision-grade modeling. The core functions implemented in modern V&V-oriented software solutions are summarized: (1) libraries of verification test problems, including benchmarks with reference solutions and systematic grid/time refinement studies; (2) support for the Method of Manufactured Solutions (MMS) to uncover implementation errors when exact analytical solutions for real problems are unavailable; (3) repositories of validation datasets and workflows for comparison with experiments, including statistical metrics of agreement; (4) automation of regression checks and “continuous validation” integrated into development pipelines (CI/CD), enabling repeatable pass/fail criteria across code versions and computing environments; (5) UQ and sensitivity-analysis modules (e.g., Monte Carlo-based propagation, confidence intervals, and multidimensional validation indicators) to separate model-form inadequacy from input and numerical uncertainties; and (6) risk-informed planning of V&V depth based on model criticality, including emerging digital-twin scenarios where models are updated and validated against operational data streams.
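The verification techniques named in items (1) and (2) can be sketched concretely. The following minimal Python example (an illustration, not code from the paper) applies the Method of Manufactured Solutions to a simple 1-D finite-difference Poisson solver and uses grid refinement to estimate the observed order of accuracy; the solver, grid sizes, and manufactured solution u(x) = sin(πx) are all assumptions chosen for the sketch.

```python
# Hypothetical MMS + grid-refinement sketch: verify a 1-D solver for -u'' = f
# on (0, 1) with u(0) = u(1) = 0. The manufactured solution u(x) = sin(pi x)
# implies f(x) = pi^2 sin(pi x); comparing errors on two grids estimates the
# observed order of accuracy, which should match the scheme's formal order (2).
import math

def solve_poisson(n):
    """Solve -u'' = f with n interior points using second-order central
    differences; the tridiagonal system is solved by the Thomas algorithm.
    Returns the max-norm error against the manufactured solution."""
    h = 1.0 / (n + 1)
    x = [(i + 1) * h for i in range(n)]
    d = [math.pi**2 * math.sin(math.pi * xi) * h * h for xi in x]
    a, b, c = -1.0, 2.0, -1.0  # constant tridiagonal coefficients
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c / b, d[0] / b
    for i in range(1, n):                 # forward sweep
        m = b - a * cp[i - 1]
        cp[i] = c / m
        dp[i] = (d[i] - a * dp[i - 1]) / m
    u = [0.0] * n
    u[-1] = dp[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        u[i] = dp[i] - cp[i] * u[i + 1]
    return max(abs(ui - math.sin(math.pi * xi)) for ui, xi in zip(u, x))

e_coarse = solve_poisson(32)
e_fine = solve_poisson(64)
# Refinement roughly halves h, so the observed order is log2 of the error ratio.
observed_order = math.log(e_coarse / e_fine, 2)
print(f"observed order ~ {observed_order:.2f}")  # close to 2 for this scheme
```

If the observed order falls well below the formal order of the scheme, the MMS run has exposed an implementation error, which is exactly the role such test libraries play in V&V-oriented software.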
Based on this synthesis, the study proposes a practical set of selection criteria for V&V tools and outlines requirements for a modular V&V subsystem integrated with numerical solvers and project infrastructure. Key requirements include provenance management for traceability and reproducibility (inputs, solver settings, code/environment versions, logs, artifacts), scalable orchestration of test campaigns locally and on HPC resources, standardized reporting suitable for audit, and formalized acceptance thresholds aligned with the intended use of the model. The contribution of the work is a consolidated view of functional requirements and an implementation template that helps teams build evidence-based, maintainable V&V processes for modern computational models. Tables: 1. Figs: 2. Refs: 10 titles.
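The provenance and acceptance-threshold requirements above can be illustrated with a minimal sketch. All names here (the record fields, the benchmark case, the commit id, the 0.05 threshold) are hypothetical examples, not details from the paper; they show the general shape of a CI-integrated "continuous validation" check, not any particular tool's API.

```python
# Hypothetical sketch: a provenance record for traceability/reproducibility and
# a formalized pass/fail acceptance check of the kind a CI/CD validation step
# could apply to every code version. Field names and values are illustrative.
import hashlib
import json
import platform
import sys
from dataclasses import asdict, dataclass

@dataclass(frozen=True)
class ProvenanceRecord:
    case_name: str         # validation-case identifier
    solver_settings: dict  # discretization, tolerances, model options
    code_version: str      # e.g. a git commit hash
    environment: str       # interpreter / platform fingerprint

    def fingerprint(self) -> str:
        """Stable short hash over the full record, for audit-ready reports."""
        payload = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()[:12]

def acceptance_check(metric: float, threshold: float) -> bool:
    """Pass iff the agreement metric (e.g. relative error vs. experiment)
    stays within the threshold set for the model's intended use."""
    return metric <= threshold

record = ProvenanceRecord(
    case_name="cavity_flow_benchmark",       # illustrative case name
    solver_settings={"grid": 128, "rtol": 1e-6},
    code_version="deadbeef",                 # placeholder commit id
    environment=f"python{sys.version_info.major}.{sys.version_info.minor}"
                f"-{platform.system()}",
)
verdict = "PASS" if acceptance_check(0.031, 0.05) else "FAIL"
print(record.fingerprint(), verdict)
```

Persisting such records alongside each test campaign is one way to satisfy the reproducibility requirement: any reported pass/fail verdict can be traced back to the exact inputs, solver settings, and environment that produced it.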
DOI: 10.35681/1560-9189.2025.27.3.354592