Scalable Trace Signal Selection Using Machine Learning

K. Rahmani, P. Mishra, and S. Ray

In G. Byrd, K. Schneider, N. Chang, and S. Ozev, editors, Proceedings of the 31st IEEE International Conference on Computer Design (ICCD 2013), Asheville, NC, USA, October 2013, pages 384-389. IEEE.

© 2013 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.


A key problem in post-silicon validation is to identify a small set of traceable signals that are effective for debug during silicon execution. Traditional signal selection techniques based on structural analysis of the circuit typically have poor restoration quality; simulation-based selection techniques provide superior restorability but incur significant computation overhead. In this paper, we propose an efficient signal selection technique using machine learning to take advantage of simulation-based signal selection while significantly reducing the simulation overhead. Our approach uses (1) bounded mock simulations to generate training vector sets for the machine learning technique, and (2) an elimination approach to identify the most profitable signal set. Experimental results indicate that our approach can provide a speed-up of up to 60X in selection time (16X on average) and can improve restorability by up to 66.9% (14.1% on average).
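The elimination approach mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `evaluate` callback and the toy per-signal weights are hypothetical stand-ins for the paper's bounded mock simulations and its restorability metric.

```python
def select_signals(candidates, budget, evaluate):
    """Elimination-based selection sketch: start from all candidate
    signals and repeatedly drop the signal whose removal reduces the
    estimated restorability (as reported by `evaluate`) the least,
    until only `budget` signals remain to fit the trace buffer."""
    selected = set(candidates)
    while len(selected) > budget:
        # The signal whose removal leaves the highest score is the
        # least profitable one; eliminate it.
        least_profitable = max(selected,
                               key=lambda s: evaluate(selected - {s}))
        selected.remove(least_profitable)
    return selected

# Toy evaluator standing in for a bounded mock simulation:
# restorability here is simply the sum of per-signal weights.
weights = {"a": 5, "b": 1, "c": 3, "d": 2}
chosen = select_signals(weights, 2,
                        lambda sel: sum(weights[s] for s in sel))
print(sorted(chosen))  # the two highest-weight signals remain
```

In the actual technique, each `evaluate` call would be far more expensive (a simulation of the design), which is why the paper trains a machine-learning model on bounded mock simulations to approximate it cheaply.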

Relevant files