Quantifying Assurance in Learning-enabled Systems

Bibliographic Details
Title: Quantifying Assurance in Learning-enabled Systems
Authors: Asaadi, Erfan; Denney, Ewen; Pai, Ganesh
Publication Year: 2020
Collection: Computer Science
Subject Terms: Computer Science - Software Engineering, Computer Science - Artificial Intelligence, Computer Science - Machine Learning, Electrical Engineering and Systems Science - Systems and Control
More Details: Dependability assurance of systems embedding machine learning (ML) components---so-called learning-enabled systems (LESs)---is a key step for their use in safety-critical applications. In emerging standardization and guidance efforts, there is a growing consensus on the value of using assurance cases for that purpose. This paper develops a quantitative notion of assurance that an LES is dependable, as a core component of its assurance case, extending our prior work that applied to ML components. Specifically, we characterize LES assurance in the form of assurance measures: a probabilistic quantification of confidence that an LES possesses system-level properties associated with functional capabilities and dependability attributes. We illustrate the utility of assurance measures by applying them to a real-world autonomous aviation system, also describing their role in i) guiding high-level, runtime risk mitigation decisions and ii) serving as a core component of the associated dynamic assurance case.
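Note: the abstract above defines an assurance measure as a probabilistic quantification of confidence that a system-level property holds. The following is a minimal, hypothetical sketch (not the authors' method) of how such a confidence value could be computed from assumed trial data, using a simple Beta-Binomial posterior; all names, counts, and thresholds are illustrative assumptions.

```python
# Illustrative sketch only, NOT the paper's technique: one simple way to turn
# observed outcomes into a probabilistic confidence ("assurance measure").
from scipy.stats import beta

# Hypothetical data for a system-level property (e.g., the aircraft stays
# within lateral bounds during autonomous taxi).
n_trials = 200          # assumed number of observed trials
n_successes = 196       # assumed number of trials satisfying the property
required_level = 0.95   # assumed required probability of satisfying the property

# Posterior over the unknown success probability p, with a uniform Beta(1, 1) prior.
posterior = beta(1 + n_successes, 1 + (n_trials - n_successes))

# Confidence that p exceeds the required level: one possible assurance measure.
assurance = 1.0 - posterior.cdf(required_level)
print(f"Confidence that the property holds at >= {required_level:.0%}: {assurance:.3f}")
```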
Comment: Author's pre-print version of manuscript accepted for publication in the Proceedings of the 39th International Conference on Computer Safety, Reliability, and Security (SAFECOMP 2020)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2006.10345
Accession Number: edsarx.2006.10345
Database: arXiv