Deep Learning Toolbox Verification Library

Ensure Robustness and Reliability of Deep Neural Networks

As deep neural networks become part of engineered systems, particularly in safety-critical applications, it is crucial to ensure their reliability and robustness. Deep Learning Toolbox Verification Library lets you rigorously assess and test deep neural networks.

With Deep Learning Toolbox Verification Library, you can:

  • Verify properties of your deep neural network, such as robustness to adversarial examples
  • Estimate how sensitive your network predictions are to input perturbations
  • Create a distribution discriminator that separates data into in-distribution and out-of-distribution for runtime monitoring
  • Deploy a runtime monitoring system alongside your network to oversee its performance
  • Walk through a case study on verifying an airborne deep learning system

Verify Deep Neural Network Robustness for Classification

Boost your network’s robustness against adversarial examples (subtly altered inputs designed to mislead the network) using formal methods. This approach tests an infinite collection of inputs at once, proving that predictions remain consistent under bounded perturbations and guiding training enhancements that improve the network’s reliability and accuracy.
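
The sketch below illustrates this workflow in MATLAB. It is a minimal example, assuming a trained dlnetwork named net, a test image XTest with label labelTest, and the library's verifyNetworkRobustness function; treat the exact names and signature as assumptions and check the documentation for your release.

    % Verify robustness to L-infinity perturbations of size epsilon around a
    % test image (sketch; variable names are assumptions).
    epsilon = 0.01;
    X = dlarray(single(XTest), "SSCB");   % formatted image input
    XLower = X - epsilon;                 % lower bound of the perturbed input set
    XUpper = X + epsilon;                 % upper bound of the perturbed input set
    result = verifyNetworkRobustness(net, XLower, XUpper, labelTest);
    summary(result)                       % each result is verified, violated, or unproven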

Estimate Deep Neural Network Output Bounds for Regression

Estimate the lower and upper output bounds of your network for given input ranges using formal methods. This gives you insight into the network’s possible outputs under the specified input perturbations, helping ensure reliable performance in scenarios such as control systems and signal processing.
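
A comparable sketch for the regression case is shown below, assuming a trained dlnetwork named net, a nominal feature vector XNominal, and the library's estimateNetworkOutputBounds function; the exact signature and variable names are assumptions.

    % Estimate output bounds when the input varies within +/- delta of a
    % nominal operating point (sketch).
    delta = 0.05;
    X = dlarray(single(XNominal), "CB");  % formatted feature input
    XLower = X - delta;
    XUpper = X + delta;
    [YLower, YUpper] = estimateNetworkOutputBounds(net, XLower, XUpper);
    % YLower and YUpper bound the outputs the network can produce for any
    % input inside [XLower, XUpper]
    disp([extractdata(YLower), extractdata(YUpper)])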

Build Safe Deep Learning Systems with Runtime Monitoring

Incorporate runtime monitoring with out-of-distribution detection to build safe deep learning systems. Continuously evaluating whether incoming data aligns with the training data helps you decide whether to trust the network’s output or redirect it for safe handling, enhancing system safety and reliability.
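
The sketch below outlines one way to wire this up, assuming a trained dlnetwork named net, in-distribution training data XTrain, and the library's networkDistributionDiscriminator and isInNetworkDistribution functions; the method name "energy" and the variable names are assumptions.

    % Build an out-of-distribution discriminator from training data, then
    % gate each runtime prediction on its decision (sketch).
    XID = dlarray(single(XTrain), "SSCB");
    discriminator = networkDistributionDiscriminator(net, XID, [], "energy");

    % At run time, only trust predictions on data the discriminator accepts
    XNew = dlarray(single(incomingImage), "SSCB");
    if isInNetworkDistribution(discriminator, XNew)
        Y = predict(net, XNew);           % trust the network output
    else
        % Redirect for safe handling, e.g., a conventional fallback or operator review
    end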

Case Study: Verifying an Airborne Deep Learning System

Explore a case study on verifying an airborne deep learning system in line with aviation industry standards such as DO-178C and ARP4754A, as well as prospective EASA and FAA guidelines. The case study provides a comprehensive view of the steps needed to fully comply with industry standards and guidelines for deep learning systems.
