Distribution-aware Validation and Verification of Neural Networks
Matthew Dwyer
Department of Computer Science, University of Virginia
Time: 2021-10-22, 11 AM - 12 PM
Location: Rice 340 and Zoom
Abstract
Neural networks (NNs) are trained to produce a statistically accurate function approximation for data drawn from a target data distribution. The data distribution can be thought of as an implicit precondition on the use of the network. In safety- or mission-critical software, validation and verification (V&V) processes need only focus on inputs satisfying the software's precondition, since attempts to invoke a function on inputs violating the precondition constitute errors that must be mitigated at runtime. We propose that the use of NNs in critical software should adopt a similar approach.
In this talk, we describe how generative models of the data distribution can be exploited to focus NN testing, verification, and falsification on that distribution. Incorporating such a model into NN V&V processes can reduce their cost and increase their effectiveness. We demonstrate this by speeding up the generation of realistic NN test inputs and by producing more realistic counterexamples when verifying properties of NNs. We conclude by describing a number of open questions related to V&V of neural networks.
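To make the idea concrete, the following is a minimal sketch of how a generative model can steer testing toward in-distribution inputs. It assumes a decoder G from a generative model (e.g., a VAE or GAN) trained on the target data distribution and a network under test f; both are untrained stand-ins here, and all names and dimensions are hypothetical, not drawn from the talk itself.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins: in practice G would be the decoder of a generative
# model trained on the target data distribution, and f the network under test.
latent_dim, input_dim, num_classes = 8, 32, 4
G = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, input_dim))
f = nn.Sequential(nn.Linear(input_dim, 64), nn.ReLU(), nn.Linear(64, num_classes))

def sample_in_distribution_tests(num_tests: int) -> torch.Tensor:
    """Draw latent codes from the generative model's prior and decode them,
    yielding test inputs that (approximately) lie on the data distribution."""
    z = torch.randn(num_tests, latent_dim)  # sample from the latent prior
    with torch.no_grad():
        return G(z)

# Exercise the network under test only on (approximately) in-distribution inputs,
# rather than on arbitrary points of the input space.
tests = sample_in_distribution_tests(100)
with torch.no_grad():
    predictions = f(tests).argmax(dim=1)
print(predictions[:10])
```

The design point the sketch illustrates is that test generation samples in the generative model's latent space rather than the raw input space, so the resulting inputs respect the implicit precondition described above.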