Master's Thesis: Gradient-Based Test Pattern Generation for Digital Logic Circuits
Chair: CDNC
Contact person:
Background
Test pattern generation is a fundamental step in validating the correctness of digital logic circuits. Traditional Automatic Test Pattern Generation (ATPG) methods rely on Boolean logic and heuristic search. This thesis explores a cross-disciplinary approach: leveraging gradient-based optimization, a technique rooted in deep learning, to generate test vectors for digital circuits.
By approximating logic gates (e.g., AND, OR, NOT) with differentiable functions, we can treat a logic circuit similarly to a neural network. This opens up the possibility of applying tools such as PyTorch to optimize inputs that trigger specific fault conditions, following a process much like backpropagation in neural network training. This project seeks to build a prototype system that bridges the worlds of digital design and machine learning.
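One way to realize such smooth approximations is to interpret signal values as probabilities in [0, 1] and replace each gate with a soft counterpart (product t-norm for AND, probabilistic sum for OR); at the corner values 0 and 1 the exact Boolean behaviour is recovered. The PyTorch sketch below is only illustrative: the concrete relaxation is part of the thesis work, and the small example circuit is made up for demonstration.

```python
import torch

# Illustrative smooth relaxation: signals are treated as probabilities in [0, 1].
# Exact Boolean behaviour is recovered at the corner values 0 and 1.

def soft_and(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Differentiable AND: product t-norm."""
    return a * b

def soft_or(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Differentiable OR: probabilistic sum."""
    return a + b - a * b

def soft_not(a: torch.Tensor) -> torch.Tensor:
    """Differentiable NOT."""
    return 1.0 - a

# Tiny example circuit (for demonstration only): y = OR(NOT(AND(x0, x1)), x2)
def example_circuit(x: torch.Tensor) -> torch.Tensor:
    return soft_or(soft_not(soft_and(x[..., 0], x[..., 1])), x[..., 2])

x = torch.tensor([0.9, 0.8, 0.1], requires_grad=True)
y = example_circuit(x)
y.backward()                      # gradients of the output w.r.t. the inputs
print(y.item(), x.grad)
```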
Thesis Objectives
• Develop a differentiable model of a digital logic circuit using smooth approximations of logic gates.
• Implement a gradient-based method for generating test vectors that expose faults.
• Benchmark the approach against classical ATPG tools (e.g., Synopsys TetraMAX) using standard circuits such as ISCAS’85/’89.
Tasks and Work Packages
Literature Review
• Study classical ATPG techniques and digital fault models.
• Analyze differentiable logic approaches and neural surrogate modeling.
Circuit Modeling
• Implement logic gates using differentiable approximations in PyTorch.
• Convert Verilog-based logic circuits into ML-compatible representations (a minimal conversion sketch follows below).
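To give an idea of what an ML-compatible representation could look like, the sketch below assumes the Verilog design has already been flattened to a gate-level netlist (for instance with a synthesis tool). The netlist format, the `SoftCircuit` class, and the tiny example circuit are hypothetical placeholders, not a fixed interface.

```python
import torch
import torch.nn as nn

# Hypothetical intermediate representation: a gate-level netlist as a list of
# (output_net, gate_type, input_nets) tuples in topological order, obtained
# e.g. by flattening a Verilog design with a synthesis tool.
NETLIST = [
    ("n1", "AND", ("a", "b")),
    ("n2", "NOT", ("n1",)),
    ("y",  "OR",  ("n2", "c")),
]
PRIMARY_INPUTS = ["a", "b", "c"]
PRIMARY_OUTPUTS = ["y"]

class SoftCircuit(nn.Module):
    """Evaluates a gate-level netlist with smooth gate approximations."""

    GATE_FUNCS = {
        "AND": lambda ins: ins[0] * ins[1],
        "OR":  lambda ins: ins[0] + ins[1] - ins[0] * ins[1],
        "NOT": lambda ins: 1.0 - ins[0],
    }

    def __init__(self, netlist, inputs, outputs):
        super().__init__()
        self.netlist, self.inputs, self.outputs = netlist, inputs, outputs

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, num_primary_inputs), values in [0, 1].
        nets = {name: x[:, i] for i, name in enumerate(self.inputs)}
        for out_net, gate, in_nets in self.netlist:
            nets[out_net] = self.GATE_FUNCS[gate]([nets[n] for n in in_nets])
        return torch.stack([nets[o] for o in self.outputs], dim=1)

circuit = SoftCircuit(NETLIST, PRIMARY_INPUTS, PRIMARY_OUTPUTS)
print(circuit(torch.tensor([[1.0, 1.0, 0.0]])))   # -> [[0.0]]
```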
Test Pattern Generation
• Design a custom loss function to represent fault coverage.
• Apply gradient descent or related optimizers to find high-coverage input vectors (sketched below).
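A possible formulation of this optimization step is sketched below. It assumes that a fault-free differentiable model and a copy with an injected fault (e.g., a stuck-at fault) are available as callables; `good_circuit` and `faulty_circuit` are placeholder names, and the loss simply rewards disagreement between the two models, so that gradient descent drives the relaxed inputs toward a pattern that exposes the fault.

```python
import torch

# Minimal sketch: maximise the divergence between the fault-free circuit and a
# copy with an injected fault, then round the relaxed inputs to Boolean values.
# `good_circuit` and `faulty_circuit` are assumed to be differentiable models
# such as the SoftCircuit sketched above; fault injection itself is not shown.

def generate_test_vector(good_circuit, faulty_circuit, num_inputs,
                         steps: int = 500, lr: float = 0.1) -> torch.Tensor:
    # Optimise unconstrained logits; a sigmoid keeps the relaxed inputs in [0, 1].
    logits = torch.randn(1, num_inputs, requires_grad=True)
    optimizer = torch.optim.Adam([logits], lr=lr)

    for _ in range(steps):
        optimizer.zero_grad()
        x = torch.sigmoid(logits)
        # Negative output difference, so minimising the loss *increases*
        # the disagreement between fault-free and faulty behaviour.
        loss = -(good_circuit(x) - faulty_circuit(x)).abs().sum()
        loss.backward()
        optimizer.step()

    # Round the relaxed solution to a Boolean test pattern.
    return (torch.sigmoid(logits) > 0.5).float().detach()
```

Rounding the relaxed inputs back to Boolean values is the simplest choice; whether the rounded pattern still detects the fault has to be confirmed by discrete simulation, which belongs to the evaluation part of the thesis.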
Evaluation & Benchmarking
• Compare with standard ATPG tools in terms of fault coverage, test length, and computational cost (the fault-coverage metric is illustrated below).
• Use industry-standard benchmark circuits such as ISCAS’85/’89.
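For reference, the fault-coverage figure used in such comparisons is simply the fraction of modeled faults detected by the generated pattern set. The helper below is a hypothetical sketch: `detects(pattern, fault)` stands in for an actual fault simulator.

```python
# Illustrative fault-coverage bookkeeping for the comparison with classical ATPG.
# `detects(pattern, fault)` is a hypothetical fault-simulation predicate that
# returns True if the Boolean pattern exposes the given fault.

def fault_coverage(patterns, faults, detects) -> float:
    detected = {f for f in faults if any(detects(p, f) for p in patterns)}
    return len(detected) / len(faults) if faults else 1.0
```

Reported alongside test length (number of patterns) and CPU time, this yields the same metrics used to judge classical ATPG tools.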
Skills Required for the Thesis
• Basic knowledge of digital logic design and computer architecture
• Background in deep learning (e.g., PyTorch)
• Programming skills: Python, Verilog, C/C++
Skills Acquired Within the Thesis
• Hands-on experience applying ML to digital logic validation
• Proficiency in modeling hybrid logic/ML systems using PyTorch
• Exposure to modern ATPG benchmarks and tools
• Opportunity to publish at top-tier Design/Test/EDA conferences or journals
References
• Automatic Structural Test Generation for Analog Circuits using Neural Twins
• Deep Differentiable Logic Gate Networks (NeurIPS 2022)