The traditional approach to worst-case static timing analysis uses deterministic delays for gates and wires. These fixed delay values are extracted from best- and worst-case process parameter sets obtained through repeated device simulations. However, with the advent of deep-submicron technology, intra-chip variations of process parameters and interconnect geometries are expected to increase by as much as 65%. This, coupled with the presence of variable correlations between delay elements, causes further deviation from the fixed-delay model. As a consequence, typical worst-case timing analysis produces overly conservative results, since it assumes worst-case delay characteristics for every element and fails to recognize the inherent statistical variation. The end product suffers from lost performance and expensive over-design.
To design modern ICs reliably, the timing model must be freed from deterministic approximations, and the underlying randomness must be exposed through a coherent probabilistic abstraction. The objective of this research is to propose a framework and methodology for statistical timing analysis. In particular, two questions drive the work: (1) choosing an appropriate probabilistic model for delays, trading off accuracy against computational complexity, and (2) implementing an efficient algorithm that computes the delay distributions of combinational circuits under the chosen model. Incorporating such a probabilistic framework into timing analysis narrows the gap between the timing constraints predicted by tools and actual silicon performance, obviates over-design, and relieves the designer of unnecessary iterations to attain timing closure.
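To make the contrast between worst-case and statistical analysis concrete, the following is a minimal Monte Carlo sketch, not the method proposed in this work. It assumes a hypothetical three-gate circuit (two parallel gates feeding a third), models each gate delay as an independent Gaussian with illustrative nominal and sigma values, and compares the sampled 99th-percentile circuit delay against the deterministic 3-sigma worst case. The gate names and numbers are invented for illustration only.

```python
import random
import statistics

# Hypothetical circuit: gates g1 and g2 in parallel feed gate g3.
# Output arrival time = max(delay(g1), delay(g2)) + delay(g3).
# Each delay ~ Normal(nominal, sigma); the values are illustrative.
GATES = {
    "g1": (1.0, 0.10),  # (nominal delay in ns, std-dev in ns)
    "g2": (1.2, 0.15),
    "g3": (0.8, 0.08),
}

def sample_circuit_delay(rng: random.Random) -> float:
    """Draw one Monte Carlo sample of the circuit's output arrival time."""
    d1 = rng.gauss(*GATES["g1"])
    d2 = rng.gauss(*GATES["g2"])
    d3 = rng.gauss(*GATES["g3"])
    # Statistical max at the reconvergence point, then sum along the path.
    return max(d1, d2) + d3

rng = random.Random(42)
samples = sorted(sample_circuit_delay(rng) for _ in range(10_000))

mean = statistics.fmean(samples)
p99 = samples[int(0.99 * len(samples))]

# Deterministic worst case: every gate pessimistically at nominal + 3 sigma.
worst = (max(GATES["g1"][0] + 3 * GATES["g1"][1],
             GATES["g2"][0] + 3 * GATES["g2"][1])
         + GATES["g3"][0] + 3 * GATES["g3"][1])

print(f"mean={mean:.3f} ns  99th percentile={p99:.3f} ns  "
      f"3-sigma worst case={worst:.3f} ns")
```

Even in this toy example, the 99th-percentile delay falls well below the deterministic 3-sigma corner, because it is statistically unlikely that every gate sits at its worst-case delay simultaneously; this is the pessimism that a statistical framework removes.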