
A Methodology to Measure the Verification Gap in Analog and Mixed-Signal Designs

Jonathan B. David, Scintera Networks, Inc.

Mrityunjay (Jay) Singh, Plato Networks, Inc.

Presented at CDNLive-Silicon Valley 2007, Session 9.13


ABSTRACT

A high fraction of the design bugs identified in mixed-signal SOCs are attributed to the analog and mixed-signal sections. Traditional approaches to verifying these circuits depend on circuit simulation and waveform review, with the hope that enough corner cases can be run prior to tapeout to catch all the problems. This paper introduces a new verification metric for the analog space, and demonstrates an environment to capture and evaluate this metric throughout the design verification cycle, so that, as with coverage in the digital design space, designers can understand the gap between what has been verified and what could be, and use this information to expand the verification in the most important areas. Verification elements for this environment are described in Verilog-A/AMS, and are built from simple descriptions using Perl and Python scripts. Simulation and results analysis are automated using OCEAN scripts, and the verification data for determining the metric is collected and visualized using scripts. Portions of the example design have been fabricated at TSMC and Tower at the 0.18um node. The example design uses the Cadence Generic CDK at the 180nm node to demonstrate the environment and methodology. Provided example files will assist design teams in replicating this methodology.

1. INTRODUCTION

Many factors drive the push to integrate analog and mixed-signal circuits with ASIC designs. A major impediment, however, is the risk that an undiscovered bug in the analog section, or on its interface, will require a debug-and-fix cycle, with the consequent extra mask costs and schedule slips. Thus mixed-signal design verification methodology improvements have the potential for a large return on the invested effort. While much progress has been made in the last decade, the lack of a methodology to keep track of verification status and direct the remaining effort leaves that risk largely unmeasured.

1.1 The State of Mixed-Signal Design Verification

While circuit simulation and HDLs have a history spanning more than three decades [2,3], only the most recent decade has seen the introduction and adoption of analog/mixed-signal extensions [4,5,9] to popular design languages such as Verilog [6,8] and VHDL. Given a focus on simulation, the EDA industry has developed several approaches (including UltraSim) to speeding up simulation at the transistor level, and these tools are in fairly widespread use and compatible with the existing family of design languages. At the beginning of the current decade the term "verification" started being used by analog design teams in contexts other than DRC and LVS. Today, the use of AMS behavioral modeling for system-level verification [11,14,15], with interface checks coded into the models [17,19], is common. Analog simulators are supporting "assertions" to allow the design team to customize the problem regions for which they get warnings. Behavioral models can easily be combined with transistor-level designs to accelerate portions of the verification effort [18]. Co-simulation environments exist even for system-level tools like Simulink®. Commercial environments provide management of simulation test benches [7] and the associated simulation results, and provide scripting languages such as OCEAN [20], or MDL when the needs are more specialized. Either approach allows automation for running multiple simulation jobs based on available hardware and licenses. Discussion has even started on formal methods for analog design [16]. In spite of all these (valuable) improvements, it is still nearly impossible for a design team to be sure they have run ALL the right simulations on today's large designs to catch as many design bugs as possible. While careful planning and good "lists" help (i.e., lists of simulations, lists of specifications, and lists of simulation results), there is always the possibility that a late change to the design was not simulated sufficiently, or that a subcircuit was simply left off the list. The amount of double checking and comparison that usually occurs is reminiscent of doing LVS on a light table, which makes it a bottleneck in the AMS design process. We need a metric, akin to "coverage", for analog design.

1.2 Requirements for a Verification Scoreboard

In determining the requirements for enhancements to our mixed-signal design verification methodology, we focus on the issue of determining whether enough of the right simulations have been run, and on directing additional simulation efforts. It is assumed that other aspects of the methodology [18,19] will validate correct circuit function and measure required performance where appropriate to the analog side of the design. The first requirement is that we discover ALL design elements, without depending on existing test benches, so that we can determine whether any simulations have been run on each element. For our prototype methodology this implies a "netlister" type of program that traverses the design from the top (or highest level assembled) to initialize our verification database (discussed in section 2.4). Next, for each block being validated, we need the simulation conditions captured consistently, no matter which simulator or test bench is used, or even which view of the block is used in that simulation. This information needs to be passed to the verification database in a manner that clearly identifies which block it concerns and what the conditions and results for that simulation are. We discuss this data format in more detail in section 2.2, and its generation in [1]. Third, we need to determine which parameters to track in the database. For each simulation we need to know the view used and, if it was not the top device in the test bench, which block called it. We need to know which PVT corner was simulated, and the values of control signals. We probably also want to keep track of some key measurements, like static current, or maybe even some simple signal characteristics like gain or delay. We discuss these in more detail in section 2.1. Finally, we need to determine how to keep score: how to calculate the value of our metric, and even what to call it. In the digital design verification methodology, the metric used is functional coverage. Motivated by the more complicated nature of this metric in the analog design space, we'll move up a dimension from Area to Volume, and call our metric Enclosure. We discuss the mechanics of calculating design enclosure in more detail in section 2.3.

1.3 Technologies to Build On

As in most enterprises, to get the maximum leverage from our efforts we look for existing technologies on which to build our application. We need technologies for discovering the design, capturing the simulation data, passing it to our database, calculating the scores, and displaying them. Design discovery is similar to the process used for netlist generation. While many examples exist, especially in SKILL®, one of the authors (Jay) has previously written a design hierarchy display tool, which we'll adapt for this purpose (see section 2.4). Since it provides much of our motivation, we look to the assertion-based design verification world [10] for how to capture the data we need. The prototype for this flow, IBM's Sugar, parsed the Sugar specification file to build "checker" modules for use in logic simulators. A similar model seems appropriate here. While we won't assert any properties, we will specify parameters that we want to observe, and measurements we want to make, in a similar fashion to PSL statements, and parse those to produce a Verilog-A/AMS wrapper module which will capture the information we need. For passing the observations and measurements to our database, especially in the prototype phase, we want a message syntax that is easy to extend and easy to debug. From the internet world, we can borrow the technology used in all the blog readers (a.k.a. RSS aggregators) we want more time to "waste" on. This underlying technology is XML, the eXtensible Markup Language [12], which allows a nearly arbitrary "tree" structure for the data being sent, with free (or commercial) parsers on everyone's desktop (your browser); and when things go wrong, it is really just tagged ASCII text, readable by the humans who have to fix what went wrong. The fun part is getting Verilog-A print statements to create valid XML output or, better yet, automatically generating that Verilog-A from our short description [1]. Interestingly, XML points to another technology to help us display the results for our users: the eXtensible Stylesheet Language and XSL Transformations [21]. But, as hackers with too much to learn, we do our first prototype in Perl, dumping our results to CSV files for display in a tool found on nearly as many desktops: the common spreadsheet.

1.4 The Challenges

While there are a number of difficulties to be overcome in this effort, there are two major ones that we address here: development of a metric and its calculation, and automating the initialization of the verification database. A third challenge, building the required wrappers from a simple description similar to PSL assertions, is covered in the companion paper [1]. The proposed metric we will call "Enclosure", as an analog to the Coverage used in the digital verification methodology, considering that analog verification is a problem with additional performance dimensions beyond speed, power, and function. Much as a coverage percentage is determined for a design in a verification environment, as a ratio of what is covered to the total design, we will calculate a multi-dimensional "volume" of enclosure and compare this to the total volume of the design space.
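Stated as a formula (our notation, chosen to be consistent with Equation 1 in section 2.3; the paper gives this definition only in prose here):

\mathrm{Enclosure} \;=\; \frac{V_{\mathrm{enclosed}}}{V_{\mathrm{total}}}, \qquad V_{\mathrm{total}} = \prod_{d=0}^{N} \mathrm{length}_d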


Figure 1: Collecting verification data into a common scoreboard. [Figure graphic: a design hierarchy, EAGLET eaglet_top (PLL, BIASGEN, ADC, VCO, PD-CP, RCVR, LNA, MIXER), with per-block scores; simulations through ADE test benches, OCEAN scripts, an AMSD Tcl script, and the Spectre and UltraSim simulators all feed a common Scoreboard.]

2. METHODOLOGY OVERVIEW

For the block-level observations, it is important that the information from each simulation is captured in the same way, no matter which test bench is being used to run the simulation or which representation of the design is being simulated. This requires that all simulations feed this data to a verification database, or scoreboard, and that all information about a block has a consistent set of observation elements. In order to accommodate both AMS and analog simulation flows, we implement this as a Verilog-A/AMS wrapper around each verified block. This wrapper is built automatically from a simple description "hidden" in the comments of a Verilog-A/AMS shell view, which we'll call the vamsunit (similar to the vunit used in the assertion-based verification methodology [10]). The wrappers record their observations in XML files, which are passed to the database for tracking the results and determining the enclosure score.

2.1 Observation Information

For each observed block, in each simulation run, we need to record a consistent set of information. Extending the Volume concept, our approach is to consider each element of this list as one axis whose value points to a location in N-space. Thus each observation occurs at a point in the design space, and the Enclosure is the "volume" of the space defined by connecting the set of observed points in that space.


Table 1 lists the observation elements. Unfortunately, it is easy to see that while all the observations for one block will have a consistent set, each block may have some unique elements, depending on the types of devices used, the types and quantity of supply and bias signals, and the number of control signals. Since it is possible that not all of these will be relevant for our metric, we classify them into Major and Minor dimensions, and expect each category to have at least one major dimension, except Controls, as some blocks will not have any. The sample major dimensions are identified with bold text in Table 1.

Table 1: Categories of Observation Dimensions

Category       Dimensions
Environment    Temperature; Supply Voltage(s); Bias Current(s)
Process        MOS corner (both N & P); Passive (RES, IND, CAP); DIODE/BJT
Controls       Logical (pwrdn, en); Digital (set[3:]); Analog
Application    Inst name; View used; Lib & Cell; Analysis name; Version?
Measurements   Iddq, Vib, Tlh_in_out, Vcm_out
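To make the grouping concrete, the sketch below (our Python rendering, not code from the paper) restates one observation point for the CML buffer that appears later in Listings 1 and 2, organized by the Table 1 categories. Which entries are treated as Major versus Minor dimensions is a per-block choice captured in the vamsunit.

# Illustrative sketch (ours): one observation point for the CML buffer of
# Listings 1-2, grouped by the Table 1 categories.
observation_point = {
    "environ":  {"Tsub": 50.0, "Vgnd_a": 0.0, "Vdd_a": 1.5, "Ibref500u": 500e-6},
    "process":  {"mos.tox.p": 28.5, "mos.tox.n": 28.1,
                 "mos.dVth.p": 0.0, "mos.dVth.n": 0.0,
                 "captol": 1, "indtol": 1, "restol": 1},
    "control":  {"pwrdn": 1},
    "applic":   {"instance": "DUT", "lib": "CHRONOS_top_sim",
                 "cell": "CML_demobuf", "view": "schematic"},
    "measures": {"Iq_vdd_a": 1.25283e-6},   # 0.00125283 mA, from Listing 2
}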

Distinguishing between an analog signal, an analog control, and an environment parameter is largely a matter of intent. If the signal is actively varied in operation, it's a signal. If it's static but intentionally adjustable, it's a control. If it doesn't fit those cases, it's probably an environment parameter. Signals, at least analog ones, are not used to define our design volume, but are what we measure IN that volume. Since every measurement is packaged with its observation point, it's possible to identify exactly which simulation must be re-run to validate a problematic result. An example vamsunit description is shown in Listing 1, for a CML buffer with a bias current and a power-down control pin.


2.2 Creating the Wrapper Modules

Once we have the vamsunit description, the next step is to create the wrapper module using the process described in [1], add it to the library, and then use it in one or more simulations. The resulting XML observation file from one of the simulations is shown in Listing 2.

Listing 1 CML_demobuf.vamsunit: example verification unit written in OSL

// VerilogA for CHRONOS_top_sim, CML_demobuf, vaunit
// msdv dut CHRONOS_top_sim.CML_demobuf:schematic ;
`include "constants.vams"
`include "disciplines.vams"
module CML_demobuf(out_n, out_p, gnd_a, vdd33, vdd_a, in_n, in_p, iref500u, pwrdn);
output out_n, out_p;
inout gnd_a, vdd33, vdd_a;
input in_n, in_p, iref500u, pwrdn;
electrical out_n, out_p;
electrical gnd_a, vdd33, vdd_a;
electrical in_n, in_p, iref500u, pwrdn;
// msdv Tsub: observe environ.temperature;
// msdv Vgnd_a: observe environ.reference #(.units("V")) (gnd_a);
// msdv Vdd_a: observe environ.supply #(.units("V")) (vdd_a);
// msdv Iq_vdd_a: measure dcamps #(.units("A"),.scalar("m"))
//      when(analysis("static","tran")) (vdd_a);
// msdv Ibref500u: observe environ.bias #(.units("A"),.scalar("u"))
//      (iref500u);
// msdv Vbref500u: measure dcvolts #(.units("V")) when(analysis("static"))
//      (iref500u, gnd_a);
// msdv mos: observe process.cmos #(.tox_units("Ang"),.cj_units("pf/um^2"),
//      .dvth_units("V")) mos_pmonitor;
// msdv scn_vars: observe process.passive #(.count(3),
//      .what({captol,indtol,restol})) passives_monitor;
// msdv pwrdn: observe control.binsig #(.vth(0.75)) (pwrdn);
// msdv out_in: measure sigpath.gain_delay #(.samples(25),.diff('TRUE),
//      .inv('FALSE),.firstsample(10), .units("none"), .timeaccy(5p),
//      .dly_units("s"),.ampl_units("V"),.dscalar("p"),.stdy_dly(50p),
//      .start_time(100p)) while(!pwrdn) from(in_p,in_n) to(out_p,out_n);
endmodule
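The wrapper generator itself is the subject of the companion paper [1]; purely to illustrate the shape of its input, the short Python sketch below (ours, with a hypothetical function name, not the actual generator) pulls the "// msdv" observation directives out of a vamsunit file such as Listing 1, joining the "//" continuation lines until the closing semicolon:

# Sketch (ours): extract the "// msdv ..." directives from a vamsunit file.
def msdv_directives(path):
    directives, current = [], ""
    for line in open(path):
        if "//" not in line:
            continue                          # code lines are not directives
        comment = line.split("//", 1)[1].strip()
        if comment.startswith("msdv "):
            current = comment[len("msdv "):]  # start of a new directive
        elif current:
            current += " " + comment          # continuation of a directive
        else:
            continue
        if current.rstrip().endswith(";"):
            directives.append(current.strip())
            current = ""
    return directives

# e.g. msdv_directives("CML_demobuf.vamsunit")[0]
#      -> 'dut CHRONOS_top_sim.CML_demobuf:schematic ;'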

Listing 2 Sample XML observation data point

<?xml version='1.0'?>
<observation>
  <dut>
    <library>CHRONOS_top_sim</library>
    <cell>CML_demobuf</cell>
    <view>schematic</view>
  </dut>
  <instance> DUT </instance>
  <vwrapper> $Header: /../CML_demobuf_vwrp/..va,vs 1.6 2007/07/20 jbdavid $ </vwrapper>
  <point count="0" >
    <environ>
      <temperature name="Tsub" units="C"> 50.0 </temperature>
      <reference name="Vgnd_a" units="V" atport="gnd_a" > 0 </reference>
      <supply name="Vdd_a" units="V" atport="vdd_a" > 1.5 </supply>
      <bias name="Ibref500u" units="A" atport="iref500u" > 500 u </bias>
    </environ>
    <process>
      <section type="cmos" name="mos">
        <tox units="Ang"><p> 28.5 </p><n> 28.1 </n></tox>
        <Cj units="pf/um^2"><p> 0.00136925 </p><n> 0.001376 </n></Cj>
        <dVth units="V"><p> 0 </p><n> 0 </n></dVth>
      </section>
      <section type="passive" name="scn_vars">
        <captol> 1 </captol>
        <indtol> 1 </indtol>
        <restol> 1 </restol>
      </section>
    </process>
    <control>
      <signal name="pwrdn" type="binlogic" size="1" > 1 </signal>
    </control>
    <measures>
      <Iq name="Iq_vdd_a" analysis="static tran" units="A" > 0.00125283 m </Iq>
      <Vdc name="Vbref500u" analysis="static" units="V" > 47.1933 </Vdc>
      <amplitude_static name="out_in_in" analysis="static" atport="in" units="V" diff="true" ><diff> 0 </diff><comn> 1.35 </comn></amplitude_static>
      <amplitude_static name="out_in_out" analysis="static" atport="out" units="V" diff="true" ><diff> 0 </diff><comn> 1.49981 </comn></amplitude_static>
      <gain_static name="out_in" analysis="static" fromport="in" toport="out" units="none" diff="true" > undef </gain_static>
    </measures>
  </point >
  <point count="1" >
    … the only thing different for this point is …
    <control>
      <signal name="pwrdn" type="binlogic" size="1" > 0 </signal>
    </control>
    <measures>
      <Iq name="Iq_vdd_a" analysis="static tran" units="A" > 0.14608 m </Iq>
      <Vdc name="Vbref500u" analysis="static" units="V" > 47.1934 </Vdc>
      <gain name="out_in" analysis="tran" fromport="in" toport="out" units="none" samples="26" diff="true" ><r> 1.35985 </r><f> 1.35993 </f></gain>
      <delay name="out_in" analysis="tran" fromport="in" toport="out" units="s" samples="26" diff="true" ><r> 21.5284 p </r><f> 21.5278 p </f></delay>
      <amplitude name="out_in_in" analysis="tran" atport="in" units="V" samples="26" diff="true" ><diff><r> 0.294222 </r><f> -0.294222 </f></diff><comn><r> 1.34987 </r><f> 1.34987 </f></comn></amplitude>
      <amplitude name="out_in_out" analysis="tran" atport="out" units="V" samples="26" diff="true" ><diff><r> 0.400097 </r><f> -0.400122 </f></diff><comn><r> 1.18602 </r><f> 1.18597 </f></comn></amplitude>
    </measures>
  </point >
</observation>
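Once the wrappers write files like Listing 2, the scoreboard has to read them back in. The short Python sketch below is ours (the paper's prototype back-end is Perl); the function names and the dotted-key convention are assumptions made for illustration, but the element and attribute names match Listing 2.

# Minimal sketch (ours, not the paper's Perl prototype): read one observation
# XML file like Listing 2 into flat {dotted_name: value} records per point.
import xml.etree.ElementTree as ET

def flatten(elem, prefix=""):
    """Yield (dotted_name, text) for every leaf element below elem."""
    for child in elem:
        name = child.get("name", child.tag)
        path = prefix + "." + name if prefix else name
        if len(child):                        # wrapper element: recurse into it
            for item in flatten(child, path):
                yield item
        else:
            yield path, (child.text or "").strip()

def load_observation(path):
    """Return (dut_id, {point_index: {dotted_name: value}})."""
    root = ET.parse(path).getroot()           # <observation>
    dut = root.find("dut")
    dut_id = "{}.{}:{}".format(dut.findtext("library"),
                               dut.findtext("cell"),
                               dut.findtext("view"))
    points = {int(pt.get("count")): dict(flatten(pt))
              for pt in root.findall("point")}
    return dut_id, points

# dut_id, points = load_observation("CML_demobuf_obs.xml")   # hypothetical file name
# points[0]["environ.Tsub"]  -> "50.0"
# points[1]["control.pwrdn"] -> "0"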


2.3 Calculating the Volume of Enclosure

Now that we have several observation points, we need to calculate a value for our metric. One approach that appears to have merit on the surface is to just count the number of unique corners run and compare this to the number of vertices of the design space. However, this would only work if we could guarantee that all the simulations were run at a limited number of vertices, plus perhaps midpoints. Except for the application and the digital controls, the variables used in our "space" are real-valued, with an infinite number of possible values, so we need a more sophisticated way of measuring the progress toward our goal. Additionally, using a count rather than a volume loses the connection to the specific combinations of all the important factors of the design space that we need to check. For these reasons, we argue that a volume measurement is a better way to measure the progress of our verification efforts. Determining the volume of enclosure would require only a fairly simple calculation if we could guarantee that all the vertices existed, as shown in Equation 1.

V_{\mathrm{total}} = \prod_{d=0}^{N} \mathrm{length}_d

Equation 1

In fact this is useful for calculating the total volume of the design space, so we can determine the fraction enclosed by the observed points. Back in calculus we learned that doing a triple integration over the region where the object exists would do the trick. A very "analog" type of solution, but we don't even do real simulations this way.

V = \iiint_{X,Y,Z} f_{\mathrm{shape}}(x,y,z)\, dx\, dy\, dz

Equation 2

As in the case of simulation, we approximate the solution, here with a recursive summation as indicated in Equation 3.

V \approx \sum_{i_0=0}^{N_0} \sum_{i_1=0}^{N_1} \cdots \sum_{i_M=0}^{N_M} \Delta D_{i_0} \cdot \Delta D_{i_1} \cdots \Delta D_{i_M} \cdot \left\{ (d_{i_0}, d_{i_1}, \ldots, d_{i_M}) \in \mathrm{Observed} \right\}

Equation 3

Of course, to complicate matters, to get the right answer we'll want to use a trapezoidal approximation, as we'll show in the following example.

\int_Y x\, dy \;\approx\; \sum_{i=0}^{N-1} \frac{x_{i+1} + x_i}{2} \cdot \left( y_{i+1} - y_i \right)

Equation 4

2.3.1 An Example Volume Calculation in Two Dimensions

A volume in two dimensions is just an area, but simple examples are instructive. We take three points as shown in the table below. These define a right triangle with base and height equal to 1, for an area of ½·B·H = ½.

Point   0   1   2
Xval    0   1   1
Yval    1   1   0

We take one of the dimensions and, for each of its values, determine the span in the other dimension; 0 is an allowed span as long as there is at least one point. Then we use the resulting span in the trapezoidal approximation to the integral in Equation 4. For Y = 0, we have only one point, at X = 1, for a span of 0. For Y = 1, we have points at X = 0 and X = 1, for a span of 1. Taking the span in X at y_i as our x_i, we get ½ · 1 = ½ as our area, matching the well-known formula. A Perl implementation is shown in Listing 3.

Listing 3: calcVol_major: perl subroutine to calculate a volume of enclosure

sub calcVol_major {
  my ( $obsdata, $dut, $points, $dimtype, @confdims) = @_;
  my @volpoints = ( 0, 0); # no volume, no points
  my $comparedim = pop(@confdims);
  my $span;
  my @range;
  my @cmnpoints;
  my %vec = ();    # vector of values to integrate over
  my %numvec = (); # vector of values to integrate over - indexed by the numeric value..
  if (@confdims) {
    # There are still elements, so we need to call this routine again:
    # find the points in common for each value, if any, and get the volume.
    foreach $confval (keys (%{$$obsdata{$dut}{"majordims"}{$dimtype}{$comparedim}})) {
      @cmnpoints = grep( exists($$obsdata{$dut}{"majordims"}{$dimtype}{$comparedim}{$confval}{$_}), @$points);
      @volinfo = calcVol_major( $obsdata, $dut, \@cmnpoints, $dimtype, @confdims);
      $vec{$confval} = $volinfo[0] if ($volinfo[1]>0); # zero if points in common but no span
      $numvec{sprintf("%+.6e",eng2number($confval))} = $volinfo[0]
        if (($volinfo[1]>0) && (eng2number($confval) ne "NaN"));
    }
    # If we are going to sort by these values we have to have an exact match.
    @range = sort { $a <=> $b } keys( %numvec);
    if (scalar(@range)>1) {
      $span = $range[$#range] - $range[0];
      foreach $ipoint ($range[1]..$range[$#range-1]) { # this would be an empty list otherwise
        if ( $numvec{sprintf("%+.6e",$ipoint)} == 0 ) { # if it's zero.. and not an end point
          # we can delete it..
          delete($numvec{sprintf("%+.6e",$ipoint)});
        }
      }
    }
    # Now we can do our summation.
    @range = sort { $a <=> $b } keys( %numvec); # new array with zeros removed..
    if (scalar(@range)>1) {
      $testvol = 0;
      foreach $segment (0 .. $#range -1) { # at worst it's 0..0
        $span = $range[$segment+1] - $range[$segment];
        $trap = 0.5*($numvec{sprintf("%+.6e",$range[$segment+1])} + $numvec{sprintf("%+.6e",$range[$segment])});
        $testvol = $testvol + $trap*$span;
      }
      $volpoints[0] = $testvol; # the volume to return
    }
    $volpoints[1] = scalar(@range); # the number of points to return..
  } else {
    # We are on the innermost loop:
    # find the points in common, get the span of them, and sum over the value.
    $testvol = 0;
    foreach $confval (keys( %{$$obsdata{$dut}{"majordims"}{$dimtype}{$comparedim}})) {
      @cmnpoints = grep( exists( $$obsdata{$dut}{"majordims"}{$dimtype}{$comparedim}{$confval}{$_}), @$points);
      $vec{$confval}++ if (@cmnpoints);
    }
    @range = sort { $a <=> $b } map(eng2number($_), keys( %vec));
    if (scalar(@range)>1) {
      $span = $range[$#range] - $range[0];
      $volpoints[0] = $span;
    }
    $volpoints[1] = scalar(@range);
  }
  return @volpoints;
}
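The same recursion reads a little more compactly in Python. The sketch below is ours (not the production Perl above), assumes purely numeric dimension values, and omits the eng2number conversion and the pruning of interior zero-volume values that Listing 3 performs; it reproduces the ½ result of the two-dimensional triangle example.

# Sketch (ours) of the enclosure recursion of Listing 3: at each level, compute
# the enclosed "volume" of the remaining dimensions for every observed value of
# the current dimension, then integrate those volumes with the trapezoidal rule
# of Equation 4. Returns (volume, number_of_distinct_values_seen).
def enclosure(points, dims):
    """points: list of {dim_name: numeric value}; dims: ordered dimension names."""
    dim, rest = dims[-1], dims[:-1]
    values = sorted(set(p[dim] for p in points))
    if not rest:                                   # innermost loop: span only
        span = values[-1] - values[0] if len(values) > 1 else 0.0
        return span, len(values)
    vec = {}
    for v in values:                               # sub-volume at each value
        sub = [p for p in points if p[dim] == v]
        vol, npts = enclosure(sub, rest)
        if npts > 0:
            vec[v] = vol
    xs = sorted(vec)
    vol = sum(0.5 * (vec[xs[i + 1]] + vec[xs[i]]) * (xs[i + 1] - xs[i])
              for i in range(len(xs) - 1))         # trapezoidal summation
    return vol, len(xs)

# The triangle of section 2.3.1: points (0,1), (1,1), (1,0) enclose area 0.5
pts = [{"X": 0, "Y": 1}, {"X": 1, "Y": 1}, {"X": 1, "Y": 0}]
print(enclosure(pts, ["X", "Y"]))                  # -> (0.5, 2)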

2.3.2 Determining the Dimensions of Enclosure

Given the limited observation count for many simulation efforts, the likelihood that the volume of enclosure is zero is high. Nonetheless, we want to provide useful feedback on what else needs to be done, for example which additional simulations to run. The verifier needs to know which dimensions are causing the zero result. To provide this information, we examine each dimension in turn. The first test rejects a dimension immediately if the span over all points in that dimension is zero; these are reported as "invariant" dimensions, since they cannot contribute to a volume. Then, for each remaining dimension, we calculate a test volume with the other confirmed dimensions, accepting the dimension under test if the result is non-zero. The ratio of the confirmed dimensions to the total dimensions tracks the additional parameters that must be varied in simulation to increase the Enclosure.
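A sketch of that screening step, reusing the enclosure() helper from the previous sketch (again ours, not the authors' Perl, and again assuming numeric dimension values):

# Sketch (ours) of the dimension screen of section 2.3.2: reject invariant
# dimensions outright, then accept each remaining dimension only if adding it
# to the already-confirmed set still yields a non-zero test volume.
def confirm_dimensions(points, dims):
    invariant = [d for d in dims
                 if len(set(p[d] for p in points)) < 2]     # zero span
    confirmed = []
    for d in dims:
        if d in invariant:
            continue
        vol, _ = enclosure(points, confirmed + [d])          # test volume
        if vol > 0:
            confirmed.append(d)
    return confirmed, invariant

# For the triangle example: confirm_dimensions(pts, ["X", "Y"]) -> (["X", "Y"], [])

The ratio len(confirmed)/len(dims) corresponds to the percentage of dimensions enclosed that is reported in Figure 3.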

2.3.3 Presenting the Results

The results for an example cell are shown in Figure 3. The enclosure ratio and the dimensions enclosed are calculated for the overall results and for each category of results. For the minor dimensions, the span of the dimension is reported; if there is no span, a value of "Invariant" is listed. Based on these results, we'd first want to set up some simulations where this cell is used inside other blocks. We'd also want to run some additional process corners: the current corners, SS, TT, and FF, lie along a line in the N and P "dimension", so including both the dVth.p and dVth.n dimensions results in a zero enclosed volume, and adding some SF and FS simulations would improve that score. Finally, we notice that only one value of bias current has been used in simulations so far. This should probably be promoted to a Major dimension, and certainly should have some additional values used in some simulations.

Figure 3: Sample Verification Score Sheet

MSDV verification Enclosure for CHRONOS_top_sim.CML_demobuf:schematic

Major dimensions:

Dimension category            | Vol Enclosed | Vol Total | % Enclosure | Dims Enclosed | Dims | % Dims Enclosed | Dimensions lacking Enclosure
All Major Dimensions          | 0.25         | 1         | 25%         | 4             | 7    | 57.14%          | applic.instance
Application Major Dimensions  | 0            | 1         | 0%          | 0             | 1    | 0%              | instance
Environment Major Dimensions  | 20           | 30        | 66.67%      | 2             | 2    | 100%            |
Process Major Dimensions      | 0.0025       | 0.01      | 25%         | 2             | 3    | 66.67%          | section.mos.dVth.p
Control Major Dimensions      | 1            | 1         | 100%        | 1             | 1    | 100%            |

Minor dimensions (span and observed values):

Category | Minor dimension     | Span               | Values
applic   | vwrapper            | Invariant          | $Header: vs 1.6 2007/07/20 11:58:35 jbdavid Exp $
process  | section.scv_vars    | 0.2                | 0.9, 1.1, 1
process  | section.mos_33      | 12                 | 71.5, 65.5, 77.5
process  | section.mos.Cj.p    | 0.0001369          | 0.0014377, 0.0013008, 0.00136925
process  | section.mos.Cj.n    | 0.0001376          | 0.0014448, 0.0013072, 0.001376
process  | section.mos_33      | 12                 | 71.5, 65.5, 77.5
process  | section.mos_33      | 9.09999999999999e- | 0.0008645, 0.0009555, 0.00091
process  | section.mos_33      | 0.0001267          | 0.0013304, 0.0012037, 0.001267
process  | section.mos_33      | 0.2                | 0.12, -0.08, 0.02
process  | section.mos.tox     | 1.34               | 29.17, 27.83, 28.5
process  | section.mos_33      | 0.16               | -0.08, 0.08, 0
process  | section.mos.tox     | 1.34               | 27.43, 28.1, 28.77
process  | section.scv_vars    | 0.1                | 1.1, 1
process  | section.dis_rpoly   | Invariant          | 260.09
process  | section.dis_rpoly   | Invariant          | 319.55
process  | section.dis_rpoly   | Invariant          | 319.55
process  | section.dis_rpoly   | Invariant          | 260.09
environ  | supply.Vdd33        | 0.4                | 3.3, 3.1, 3.5
environ  | bias.Ibref500u      | Invariant          | 500 u
environ  | reference.Vgnd_a    | Invariant          | 0

2.4 Scoring an Entire Design

The results shown so far are interesting at the single-block level, but the impact will be much higher when we can present a summary like that shown in Figure 2. To get there we need to know which lower-level instrumented blocks are used in each major design block. Rather than assembling this information by hand, we'll use a netlister-style SKILL program like that shown in Listing 4. It explores the design from the top, discovering all vamsunit cells in the design, and uses this for scoring the first level of the verification effort: instrumenting the design. This information is written as an XML file with links to the score sheets of all instances in all views of the design, which allows an XML parser (and XSLT) to follow the links and retrieve the information so that all the remaining scores can be rolled up when the results are viewed. This script will need to be run periodically as the design matures, so that results created from obsolete versions of the design can be removed from the results and new elements added to the design can be scored. Work on this aspect of the methodology continues.

Figure 2: Desired top-level summary

• Starting from any point in the design (eventually the top), show:
  • EAGLET.eaglet_top:schematic
    • xxx transistors, xxx instrumented (20%)
    • xxx blocks, xxx have observations recorded (10%)
    • Design Volume is NN dimensions, 3% enclosed
    • Observation Density Ratio is XX:1
• Per-instance rollup (Inst Lib.Cell Score Rollup):
  – DTMF (Lib.Cell) 50% observed
  – PLL (Lib.Cell) 30% enclosed
  – BIAS (Lib.Cell) 8:1 density …
  – Receiver (Lib.Cell)
  – ADC
• 5% Instrumented (Lib.Cell), 0% instrumented

Listing 4 MSDVbuildTree: traverses the design to initialize the verification database

;------------------------------------------------------------------------;
; Schematic Tree.
;------------------------------------------------------------------------;
procedure( MSDVbuildTree( @key (cvId geGetEditCellView())
                               (tab "")
                               (viewList "schematic cmos_sch symbol")
                               (vunitList "vaunit vamsunit")
                               (scoreBoardDir "../work_hdl/jbdscoretest")
                          )
    MSDVRecurseCv( cvId nil
        ?viewList viewList
        ?vunitList vunitList
        ?scoreBoardDir scoreBoardDir )
)

procedure( MSDVoutfile( scoreboardDir libName cellName viewName fileName )
    fpathlcv = buildString( list(scoreboardDir libName cellName viewName) "/")
    unless( isFile( fpathlcv )
        fpathlc = buildString( list(scoreboardDir libName cellName) "/")
        unless( isFile( fpathlc)
            fpathl = buildString( list(scoreboardDir libName) "/")
            unless( isFile( fpathl)
                createDir(fpathl) )
            createDir(fpathlc) )
        createDir(fpathlcv) )
    fname = buildString( list(scoreboardDir libName cellName viewName fileName) "/")
    outfile( fname )
)

procedure( MSDVRecurseCv( cvId vamsunitId @key
                              (viewList "schematic cmos_sch symbol")
                              (vunitList "vaunit vamsunit")
                              scoreBoardDir
                          )
    let(( inst contents tmp tmpvunit port line)
        contents = append1(contents "<?xml version='1.0'?>")
        contents = append1(contents
            sprintf( nil "<designblock><lib>%s</lib><cell>%s</cell><view> %s</view> "
                cvId~>libName cvId~>cellName cvId~>viewName
            )
        )
        if( vamsunitId
            contents = append1(contents
                sprintf( nil "<vunit><lib>%s</lib><cell>%s</cell><view> %s</view></vunit> "
                    vamsunitId~>libName vamsunitId~>cellName vamsunitId~>viewName
                )
            )
        )
        contents = append1(contents "<contains>")
        foreach( hdr cvId~>instHeaders
            if( (length(hdr~>instances) > 0)
                && (inst = car( hdr~>instances ))
                && (tmp = dbGetAnyInstSwitchMaster( inst viewList )) then
                unless( member( tmp->cellName list("ipin" "opin" "iopin") )
                    contents = append1(contents
                        sprintf( nil "<inst><lib>%s</lib><cell>%s</cell><view> %s</view><qty> %d</qty></inst>"
                            tmp->libName
                            tmp->cellName
                            tmp->viewName
                            length(hdr~>instances)
                        )
                    )
                )
                if( tmpvunit = dbGetAnyInstSwitchMaster( inst vunitList )
                    tmpvunit->viewName tmp->cellName tmp->viewName)
                )
                unless( tmp->cellViewType == "schematicSymbol"
                    MSDVRecurseCv( tmp tmpvunit
                        ?viewList viewList ?vunitList vunitList
                        ?scoreBoardDir scoreBoardDir
                    )
                )
            )
        )
        contents = append1(contents "</contains>")
        contents = append1(contents "</designblock>")
        port = MSDVoutfile( scoreBoardDir
            cvId~>libName cvId~>cellName cvId~>viewName "Contents.xml")
        foreach( line contents
            fprintf( port "%s\n" line);
        )
        close(port)
    )
)
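To show what the roll-up described in section 2.4 might look like, the Python sketch below (ours; the paper proposes XSLT and notes that this part of the flow is still in progress) walks the Contents.xml files written by MSDVbuildTree and counts how many cellviews in the hierarchy are instrumented with a vunit; rolling up enclosure scores would extend the same traversal. The directory layout and file name follow Listing 4; everything else is an assumption for illustration.

# Sketch (ours) of the roll-up described in section 2.4: walk the Contents.xml
# files written by MSDVbuildTree and count how many cellviews in the hierarchy
# are instrumented (have a <vunit> entry). Assumed layout, following Listing 4:
#   <scoreboard>/<lib>/<cell>/<view>/Contents.xml
import os
import xml.etree.ElementTree as ET

def rollup(scoreboard, lib, cell, view, seen=None):
    """Return (instrumented_cellviews, total_cellviews) under lib/cell/view."""
    seen = set() if seen is None else seen
    if (lib, cell, view) in seen:             # count each cellview only once
        return 0, 0
    seen.add((lib, cell, view))
    path = os.path.join(scoreboard, lib, cell, view, "Contents.xml")
    if not os.path.isfile(path):
        return 0, 0
    block = ET.parse(path).getroot()          # <designblock>
    instrumented = 1 if block.find("vunit") is not None else 0
    total = 1
    for inst in block.findall("./contains/inst"):
        sub = [(inst.findtext(tag) or "").strip() for tag in ("lib", "cell", "view")]
        i, t = rollup(scoreboard, sub[0], sub[1], sub[2], seen)
        instrumented, total = instrumented + i, total + t
    return instrumented, total

# e.g. rollup("../work_hdl/jbdscoretest", "EAGLET", "eaglet_top", "schematic")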


3. CONCLUSIONS

A metric for scoring the analog and mixed-signal design verification effort, Enclosure, has been defined. Leveraging the new methodology for adding monitors to the design described in [1], a procedure for determining the enclosure of a design block has been demonstrated, and the value of the results in guiding additional verification efforts has been shown. While the next steps for the development of this methodology are defined, the results that are now being generated are already useful in their current form.

4. ACKNOWLEDGMENTS

Our thanks go out to the rest of the Bldg. Zero working group for their fruitful discussions, input, and insights: Mahesh, Nandu, John, John, Eric, Volker, Venky, Tony, Cliff, Teng-Kiat, ND, Kevin, Stephen, Julian, and the others we cannot mention.

5. REFERENCES

[1] J. David, "Automatic Mixed-Signal Design Verification Instrumentation with Observation Specification Language", to be presented at the 2007 IEEE International Behavioral Modeling and Simulation Workshop, Sept. 2007, http://www.bmas-conf.org

[2] L. W. Nagel, "SPICE2: A Computer Program to Simulate Semiconductor Circuits," University of California, Berkeley, Memo no. ERL-M520, May 1975.

[3] Y. Chu, D. L. Dietmeyer, J. R. Duley, F. J. Hill, M. R. Barbacci, C. W. Rose, G. Order, B. Johnson, and M. Roberts, "Three Decades of HDLs – Part I: CDL through TI-HDL," IEEE Design & Test of Computers, vol. 9, pp. 69-81, June 1992.

[4] K. Kundert, O. Zinke, The Designer's Guide to Verilog-AMS, Boston: Kluwer, 2004.

[5] D. Fitzpatrick, I. Miller, Analog Behavioral Modeling with the Verilog-A Language, Boston: Kluwer, 1998.

[6] S. Palnitkar, Verilog HDL, Mountain View: SunSoft, 1996.

[7] Virtuoso® Specification-driven Environment User Guide, Product Version 4.1, San Jose: Cadence Design Systems, 2004, sourcelink.cadence.com

[8] IEEE Standard Verilog® Hardware Description Language, IEEE Std. 1364-2001 rev C, New York, 2001.

[9] Verilog-AMS Language Reference Manual, Version 2.2, Napa: Accellera, November 2004.

[10] Assertion-Based Verification for Simulation Using PSL; Lecture Manual, Version 5.3, San Jose: Cadence, October 2004.

[11] G. Bonfini, M. Chiavacci, R. Mariani, R. Saletti, "A New Verification Approach for Mixed-Signal Systems", BMAS 2005 web-only publications: http://www.bmas-conf.org/2005/web-only-pubs/BMAS2005_21.pdf

[12] E. T. Ray, Learning XML (Second Edition), Sebastopol: O'Reilly, 2003.

[13] D. Leenaerts, G. Gielen, R. A. Rutenbar, "CAD solutions and outstanding challenges for mixed-signal and RF IC design", ICCAD 2001, IEEE/ACM International Conference on Computer-Aided Design, pp. 270-277.

[14] K. Kundert, H. Chang, D. Jefferies, G. Lamant, E. Malavasi, F. Sendig, "Design of mixed-signal systems-on-a-chip", IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 19, no. 12, Dec. 2000, pp. 1561-1571.

[15] T. E. Bonnerud, B. Hernes, T. Ytterdal, "A mixed-signal, functional level simulation framework based on SystemC for system-on-a-chip applications", IEEE Conference on Custom Integrated Circuits, 2001, pp. 541-544.

[16] S. Gupta, B. H. Krogh, R. A. Rutenbar, "Towards formal verification of analog designs", ICCAD-2004, IEEE/ACM International Conference on Computer Aided Design, 7-11 Nov. 2004, pp. 210-217.

[17] R. O. Peuzzi, "Verification of Digitally Calibrated Analog Systems with Verilog-AMS Behavioral Models", Proceedings of the 2006 IEEE International Behavioral Modeling and Simulation Workshop, Sept. 2006, pp. 7-16.

[18] J. David, "Efficient functional verification for mixed signal IP", Proceedings of the 2004 IEEE International Behavioral Modeling and Simulation Conference (BMAS 2004), Oct. 2004, pp. 53-58.

[19] J. David, "Verification of CML circuits used in PLL contexts with Verilog-AMS", Proceedings of the 2006 IEEE International Behavioral Modeling and Simulation Workshop, Sept. 2006, pp. 97-102.

[20] J. David, "Functional Verification of a Differential Operational Amplifier", Proc. International Cadence Usergroup Conference 2001, paper F50, Dec. 2001.

[21] Extensible Stylesheet Language (XSL) Version 1.1, W3C Recommendation, 05 December 2006, http://www.w3.org/TR/2006/REC-xsl11-20061205/
