3/27/2022

Strong mixed-integer programming formulations for trained neural networks

R. Anderson, J. Huchette, W. Ma, C. Tjandraatmadja, and J.P. Vielma, Strong mixed-integer programming formulations for trained neural networks, Mathematical Programming, 2020, 183(1-2):3-39, ISSN 1436-4646, URL http://dx.doi.org/10.1007/s10107-020-01474-5.

We present strong mixed-integer programming (MIP) formulations for high-dimensional piecewise linear functions that correspond to trained neural networks. These formulations can be used for a number of important tasks, such as verifying that an image classification network is robust to adversarial inputs, or solving decision problems where the objective function is a machine learning model. We present a generic framework, which may be of independent interest, that provides a way to construct sharp or ideal formulations for the maximum of d affine functions over arbitrary polyhedral input domains. We apply this result to derive MIP formulations for a number of the most popular nonlinear operations (e.g. ReLU and max pooling) that are strictly stronger than other approaches from the literature. We corroborate this computationally, showing that our formulations are able to offer substantial improvements in solve time on verification tasks for image classification networks.
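For context, the formulations in the paper strengthen the standard "big-M" MIP encoding of a single ReLU neuron y = max(0, w·x + b). Below is a minimal sketch of that baseline encoding, written with the open-source PuLP modeling library; the weights, bias, input box, and pre-activation bounds L and U are illustrative placeholders, and the paper's sharp/ideal formulations add further valid inequalities beyond these.

```python
# Sketch: standard big-M MIP encoding of one ReLU neuron y = max(0, w.x + b).
# This is the baseline that stronger formulations improve on. Bounds L <= w.x + b <= U
# are assumed known (e.g. from interval arithmetic); all numbers here are hypothetical.
import pulp

w, b = [1.0, -2.0], 0.5          # example weights and bias (placeholders)
L, U = -5.0, 5.0                 # assumed valid pre-activation bounds

prob = pulp.LpProblem("relu_bigM", pulp.LpMinimize)
x = [pulp.LpVariable(f"x{i}", lowBound=-1, upBound=1) for i in range(len(w))]
y = pulp.LpVariable("y", lowBound=0)      # post-activation output, y >= 0
z = pulp.LpVariable("z", cat="Binary")    # 1 if the neuron is "active"

pre = pulp.lpSum(w[i] * x[i] for i in range(len(w))) + b   # pre-activation w.x + b
prob += y >= pre                   # y >= w.x + b
prob += y <= pre - L * (1 - z)     # with z = 1 this pins y to w.x + b
prob += y <= U * z                 # with z = 0 this forces y = 0
prob += y                          # placeholder objective: minimize y

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.value(y), pulp.value(z))
```

With z fixed to 0 these constraints force y = 0, and with z = 1 they force y = w·x + b; the paper's contribution is a tighter relaxation of exactly this disjunction, generalized to the maximum of d affine functions over polyhedral input domains.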
