Machine Learning for Verification and Testing
Introduction
Coverage is the primary concern when designing a test set for verification and testing problems. However, the definition of "coverage" differs from problem to problem. For digital design, the test set is supposed to cover as many states of the finite state machine (FSM), or as many input situations, as possible. For analog/RF design, since the input is continuous and the system can be very sensitive to environmental disturbances, a sampling strategy is needed that covers most input values and operating conditions. As for the testing of a semiconductor technology, each test point is a design that must be synthesized or fabricated, and the test set needs to cover the whole technology library.
Test Set Redundancy Reduction for Digital Design Verification
The verification of a digital design is carried out at each stage of the EDA flow, and the verification space of a design under test (DUT) usually contains a huge number of situations. Thus, manually designing the test set requires rich expertise and is not scalable. Traditionally, the test set is generated by a biased random test generator with constraints that can be configured by setting a series of directives. Later on, Coverage-Directed test Generation (CDG) techniques were introduced to optimize the test set generation process. The basic idea of CDG is to simulate, monitor, and evaluate the coverage contribution of different combinations of inputs and initial states, and then use the derived results to guide the generation of the test set. Many CDG works are based on various ML algorithms, such as Bayesian networks, Markov models, genetic algorithms (GA), rule learning, SVMs, and NNs.
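To make this feedback loop concrete, here is a minimal Python sketch of the CDG idea. The toy 8-state FSM used as the DUT, the 16-value input alphabet, and the directive format (one weight per input value) are illustrative assumptions, not any particular tool's interface.

import random
from collections import Counter

NUM_STATES, NUM_INPUTS = 8, 16

def next_state(state, stim):
    return (3 * state + stim) % NUM_STATES     # toy DUT transition function

def simulate(directives, initial_state, cycles=10):
    """Run one test and return the set of (state, input) pairs it exercises."""
    covered, state = set(), initial_state
    for _ in range(cycles):
        stim = random.choices(range(NUM_INPUTS), weights=directives)[0]
        covered.add((state, stim))
        state = next_state(state, stim)
    return covered

def coverage_directed_generation(rounds=10, tests_per_round=50):
    directives = [1.0] * NUM_INPUTS            # start with an unbiased generator
    total = set()
    for r in range(rounds):
        hits = Counter()
        for _ in range(tests_per_round):
            init = random.randrange(NUM_STATES)
            cov = simulate(directives, init)
            total |= cov
            hits.update(stim for _, stim in cov)
        # Feedback: bias the generator toward input values whose coverage
        # contribution so far is small.
        directives = [1.0 / (1 + hits[v]) for v in range(NUM_INPUTS)]
        print(f"round {r}: {len(total)} (state, input) pairs covered")
    return total

if __name__ == "__main__":
    coverage_directed_generation()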
GA can be applied to CDG problems by combining it with biased random test generation. First, a constraint model is described and encoded; then a set of constraint models with different configurations is sent to the simulator to evaluate their coverage performance. The GA is then used to search for a configuration with higher coverage.
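The sketch below illustrates this flow under the same hypothetical setup as the previous sketch: each individual encodes one directive configuration, its fitness is the coverage achieved in simulation, and selection, crossover, and mutation search for higher-coverage configurations. The population size, mutation rate, and encoding are illustrative choices, not those of any specific CDG work.

import random

NUM_STATES, NUM_INPUTS = 8, 16

def fitness(directives, num_tests=50, cycles=10):
    """Coverage achieved by biased random tests generated under one configuration."""
    covered = set()
    for _ in range(num_tests):
        state = random.randrange(NUM_STATES)
        for _ in range(cycles):
            stim = random.choices(range(NUM_INPUTS), weights=directives)[0]
            covered.add((state, stim))
            state = (3 * state + stim) % NUM_STATES   # same toy DUT as above
    return len(covered)

def crossover(a, b):
    cut = random.randrange(1, NUM_INPUTS)      # single-point crossover
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.2):
    # Randomly rescale a few weights to explore nearby configurations.
    return [w * random.uniform(0.5, 2.0) if random.random() < rate else w
            for w in ind]

def ga_search(pop_size=12, generations=10):
    pop = [[random.uniform(0.1, 2.0) for _ in range(NUM_INPUTS)]
           for _ in range(pop_size)]
    for g in range(generations):
        pop.sort(key=fitness, reverse=True)    # rank configurations by coverage
        parents = pop[: pop_size // 2]         # keep the fitter half
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
        print(f"generation {g}: best coverage = {fitness(pop[0])}")
    return max(pop, key=fitness)

if __name__ == "__main__":
    best_directives = ga_search()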
Test Complexity Reduction for Digital Design
Recently, graph convolutional networks (GCNs) have been used to solve the observation point insertion problem in the testing stage. Inserting an observation point between the output of module 1 and the input of module 2 makes the test results of module 1 observable and the test inputs of module 2 controllable. A GCN can be used to insert fewer test observation points while maximizing the fault coverage. More specifically, the netlist is first mapped to a directed graph, in which nodes represent modules and edges represent wires. Then, the nodes are labeled as easy-to-observe or difficult-to-observe, and a GCN classifier is trained. Compared with commercial test tools, this GCN-based method can reduce the number of observation points by a fair amount.
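A minimal sketch of such a node classifier is given below, assuming PyTorch is available. The 4-dimensional node features, the two-layer architecture, and the toy 5-node netlist graph are illustrative assumptions, not the exact model or feature set of the work described above.

import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, a_norm, h):
        return torch.relu(self.lin(a_norm @ h))

class ObservabilityGCN(nn.Module):
    def __init__(self, in_dim=4, hidden=16):
        super().__init__()
        self.gc1 = GCNLayer(in_dim, hidden)
        self.gc2 = GCNLayer(hidden, hidden)
        self.out = nn.Linear(hidden, 2)        # easy- vs. difficult-to-observe

    def forward(self, a_norm, x):
        return self.out(self.gc2(a_norm, self.gc1(a_norm, x)))

def normalize_adjacency(adj):
    """Symmetric normalization of the netlist graph's adjacency matrix."""
    a_hat = adj + torch.eye(adj.shape[0])
    d_inv_sqrt = torch.diag(a_hat.sum(dim=1).pow(-0.5))
    return d_inv_sqrt @ a_hat @ d_inv_sqrt

# Toy example: 5 nodes (modules) connected by wires, with random features
# and hand-assigned labels (1 = difficult-to-observe).
adj = torch.zeros(5, 5)
for src, dst in [(0, 1), (1, 2), (2, 3), (3, 4), (1, 4)]:
    adj[src, dst] = 1.0
    adj[dst, src] = 1.0                        # treat wires as undirected here
x = torch.randn(5, 4)                          # per-node structural features
y = torch.tensor([0, 0, 1, 1, 0])

model = ObservabilityGCN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
a_norm = normalize_adjacency(adj)
for epoch in range(200):
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(model(a_norm, x), y)
    loss.backward()
    optimizer.step()

# Nodes predicted as difficult-to-observe are candidates for insertion.
candidates = model(a_norm, x).argmax(dim=1).nonzero().flatten()

In practice, the node labels would come from a testability analysis tool or from simulation, and the trained classifier would be applied to new netlists to select insertion candidates.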
Overall, there are two main ways of accelerating the verification and testing process: 1) reducing the test set redundancy; 2) reducing the complexity of the testing, verification, and diagnosis process. To reduce test set redundancy, or to optimize the generation of test instances, coverage-directed test generation has been studied for a long time and can be aided by many ML algorithms. Recently, test set redundancy reduction for analog/RF design, and even for the testing of semiconductor technologies, has attracted a lot of attention, and more ML methods are being applied to these problems. As for reducing the verification and test complexity, some studies adopt low-cost tests for analog/RF design, while others focus on fast bug classification and localization.