

Flowtracer /TEM - Test Environment Manager

A Collaborative Integrated SystemC-based work flow

tool for Distributed ASIC Design and Verification Teams

Charles Hart

Envision Systems, 1803 Marabu Way,

Fremont CA, 94539 /USA

[email protected]

Abstract

While recent advancements in process technologies and sophisticated design methods enable true system-on-a-chip (SoC) designs, the increased complexity of hardware and software in these designs levies increasingly demanding verification requirements on design teams. Verification already consumes over half the time and resources dedicated to today's 130-90nm SoC designs. SoC and ASIC engineers can expect even greater verification requirements with each new process generation. (V. Berman [1])

At least 80 percent of hardware designed today with hardware description languages (HDLs), such as Verilog and VHDL, is designed at the register-transfer level (RTL). Many design teams are finding that RTL descriptions in HDLs are at their limit for handling the size and complexity of today's designs. The issues include simulation times that are too long, verification solutions that are too complex, and designs that are too large to describe at the RTL level. (V. Berman [2])

A 'higher-level' design verification model is required, one that allows the faster simulation cycle times of a 'higher-level' simulation language. One solution is to implement architectural trade-offs on a high-level SystemC model. SystemC is a candidate for the language that will be used at all levels of system and chip design, and using SystemC in conjunction with RTL/behavioral design can accelerate this transition. SystemC opens the door for a new method of developing hardware systems, and it can benefit current SoC design flows.

In this paper, we describe the need for a higher-level verification language and the use of more integrated design verification tools. If these tools can be shared between Design and Verification teams, they help to create visualization of processes and test results. This visualization can speed up implementation and management of the Verification Test Plan and the Design Verification Process.

Also described is the implementation of an integrated SystemC-based verification environment that addresses these issues by using Runtime Design Automation's Flowtracer/EDA and its application layer, the Test Environment Manager (TEM). This environment uses a session-based verification methodology, combining Assertion-based Verification and Coverage-driven Verification behind a Web-browser interface, for distributed Design and Verification teams.

1 Introduction

While recent advancements in process technologies and sophisticated design methods enable true system-on-a-chip (SoC) designs, the increased complexity of hardware and software in these designs levies increasingly demanding verification requirements on design teams. Verification already consumes over half the time and resources dedicated to today's 130-90nm SoC designs. SoC and ASIC engineers can expect even greater verification requirements with each new process generation (V. Berman [1]). Moore's Law predicted that the number of transistors per integrated circuit (gate count) would double every 18 to 24 months. Moore's Law has held for thirty years, and it looks like it will hold for at least another ten. However, front-end verification requirements grow approximately as the square of the gate count, a sobering fact as designers move to next-generation process technologies capable of supporting 100M gates. (V. Berman [2]) (See Figure 1)

Figure 1: Relationship between increasingly complex design flows and cell geometries. Note the increase in the number of process steps and handoffs as cell geometries get smaller.

At least 80 percent of hardware designed today with hardware description languages (HDLs), like Verilog and VHDL, is designed at the register-transfer level (RTL). Many design teams are finding that RTL descriptions in HDLs are at or beyond their limit for handling the size and complexity of today's designs. The issues include simulation times that are too long, verification solutions that are too complex, and designs that are too large to describe at the RTL level. Along with rising gate counts, engineers face a shift toward software-intensive designs that significantly exacerbates verification complexity. (V. Berman [2]) As a result of this increasing verification complexity and concurrently shortened design and verification cycles, today's HDL-only verification methodologies are not likely to succeed.


2 EDA Environment Reality - Design Challenges (in 2005)

Six convergent factors are changing the way EDA tools are used in SoC design.

Diverse Tools – Tools from differing EDA vendors are often used together during development. These EDA tools, despite best intentions, were not built to work together. “Many developers and engineering teams using these same design tools get stuck in a rut. They're doing things the same way because it's easier not to change or..{omissis}..because the process works – somewhat.” (A. Raynaud [3]) EDA vendors must move beyond 'point-tool' technologies and offer customers EDA tools with deeper integration, broader tool functionality and interoperability.

Aging Project Methodologies – Engineers stitch these EDA tools together using ad-hoc and legacy scripts, without accounting for intra-tool dependency information. “56% of tools budgets are spent internally – 80% of which is spent stitching tools together. Few vendors currently provide seamless design flows. We view this as one of the biggest opportunities for the EDA industry to accelerate growth.” (Deutsche Bank/EE Times [4])

Shortening Project Schedules – Shorter design and verification cycles with increasing design complexity require more communication between design teams. More communication requires more design time, particularly between distributed design teams. Also, there are more interrelated and concurrent design processes that overlap in time, and when things go wrong there is less time for corrective action. Today's project schedule management requires multiple views of interrelated design and verification processes, including HDL source revision control, test simulation environment and verification test status, and bug (or issue dependency) tracking, to name a few.

Cost and Time-to-Market (CTTM) – “CTTM issues are reaching breaking point. Exploding chip complexity is testing traditional EDA tool flows. Customers are demanding deeper integration with fewer bugs in order to deliver SoC designs for the growing wireless and consumer electronics market.” (Deutsche Bank [4]) “Increasing gate counts and greater software content are fueling a dramatic increase in system hardware states. They are driving a need for more effective system-level verification methodologies.” (V. Berman [1])

Design Outsourcing – According to the recent EE Times 2005 EDA Survey, “39% of the respondents indicated that their company outsources some portion of chip design to a third-party provider. {Omissis}.. Historically, a pickup in outsourcing was a reasonable leading indicator for increased design activity—companies utilize third-party vendors to offload incremental demand until the “upturn” looks sustainable, at which point full-time design engineers are hired.” (Deutsche Bank [4]) Distributed design and verification teams, in different locations or time zones and working on different pieces of the design, are becoming the norm for SoC and ASIC projects. Design and verification managers and engineers need tools that track large amounts of data in order to communicate and manage a project's progress effectively.

Collaborative Products – “Development teams can leverage a variety of collaborative products, such as e-mail and messaging products, discussion forums, workflow process engines, software configuration management, version control and bug-tracking tools, and portals. One problem corporate IT often faces is the 'islands' of development tools and methodologies that may exist within a large company. When building teams across business units or different companies, this can be an obstacle. But there are a few available solutions {omissis}, both in terms of the design building community -- and in working in distributed teams across the Internet.” (C. Fyre [10])


Design Verification Challenges and Alternatives

The rest of the paper proceeds as follows:

First, we describe the need for a higher-level verification language and the use of more integrated design verification tools. If these tools can be shared between Design and Verification teams, they help to create visualization of processes and test results. This visualization can speed up implementation and management of the Verification Test Plan and the Design Verification Process.

Second, we lay out the software development requirements for a Design/Compile/Simulate/Debug environment using a virtual prototyping flow based on SystemC. The initial step of functional verification is the creation of an executable specification using SystemC. This bit-true model is bus-cycle accurate (or quasi cycle-accurate) and provides a flexible software development platform for test and verification. Through the use of assertion-based testing and code coverage, this model simultaneously implements both design specification validation and verification.

Further, we prove the efficacy of a SystemC-based design flow by using Flowtracer/EDA to implement an integrated browser-based verification environment (TEM) and its management tools. The Web browser's scripts, when used in conjunction with CVS and a SQL database, become a test management infrastructure (knowledge base): a tracking system allowing each design or verification engineer to learn how their work integrates with the work of others, while working independently. This shared knowledge helps to remove human variability in the build and test environment and increases design verification productivity while minimizing rework.

Finally, we summarize the results of the TEM implementation.

3 A Set of Alternative Solutions – The Need for a Change of Approach in Design Verification

Goal: Reduce the verification/design gap – Moore's Law: increasing gate count vs. verification effort.

Productivity (measured in transistors/staff-month): the effort spent on design verification has been increasing for years.

Problem: Verification has reached 70% of the total development effort, leaving only 30% for the actual design. [5]

System-level design: Challenges and Requirements


One solution to exploding chip gate counts and the associated verification complexity is to raise the level of abstraction used in verification and use 'higher-level' models, like C/C++, during system-level simulations.

But each solution has its own limitations. Basic C lacks the three fundamental concepts necessary to model hardware designs: concurrency, connectivity (hierarchy) and time. Basic C++ also lacks the features necessary to support the HVL productivity cycle: randomization, constrainability, temporal expressions and functional coverage measurement (J. Bergeron [7]).

SystemC and Transaction-Level Modeling (TLM)

SystemC is a de facto industry-standard language implemented as a C++ class library, providing both system-level and HDL modeling capabilities; it provides hardware-oriented constructs within the context of standard C++. Its use spans design and verification from concept to implementation in hardware and software. SystemC provides an open-source, interoperable modeling platform which enables the development and exchange of very fast system-level C++ models. It also provides a stable platform for the development of system-level tools.

SystemC “extends” the C++ language without changing its syntax, by using classes. Models are introduced as a means of encapsulating software and hardware behavior and describing hierarchy. The SystemC library is a set of C++ classes developed around the following concepts:

Concurrency – The ability to describe actions that occur at the same time, independently of each other. Hardware systems are inherently concurrent: their components operate in parallel (as events in time) and are defined and executed within the SystemC simulator. In SystemC, parallelism is implemented via processes (SC_METHOD) and software threads (SC_THREAD).

Connectivity – The ability to describe a design using simpler blocks and then connecting them together. In SystemC, the simpler components are modules, ports, processes, interfaces, channels and events. A module may represent a system, a block, a board, or a chip, and the ports may represent the interface, pins, etc.

Notion of Time – The ability to represent how the internal state of a design block evolves over time. A model of computation (MOC) is broadly defined by a model of time (real, integer, untimed) and event-ordering constraints (globally ordered, partially ordered, etc.), together with methods of communication between processes and rules for process activation.

Hardware Data Types – These describe arbitrary-precision integers, fixed-point numbers and four-valued logic (i.e., the data type used for describing a tri-state logic value (0, 1, X, Z) in RTL hardware gates).
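SystemC provides the four-valued logic type above as sc_logic. As an illustration only, the sketch below shows what such a type involves in plain C++; the names (Logic4, resolve, and4) are hypothetical and this is not the OSCI implementation.

```cpp
#include <cassert>

// Hypothetical four-valued logic type: 0, 1, X (unknown), Z (high-impedance).
// SystemC provides this as sc_logic; this sketch only illustrates the idea.
enum class Logic4 { L0, L1, X, Z };

// Resolution of two drivers on a shared (tri-state) wire:
// Z yields to the other driver; conflicting 0/1 drivers resolve to X.
Logic4 resolve(Logic4 a, Logic4 b) {
    if (a == Logic4::Z) return b;
    if (b == Logic4::Z) return a;
    if (a == b) return a;
    return Logic4::X;   // conflict or two unknowns
}

// AND with pessimistic unknown propagation: 0 dominates; any X or Z
// that cannot be decided by a dominant 0 produces X.
Logic4 and4(Logic4 a, Logic4 b) {
    if (a == Logic4::L0 || b == Logic4::L0) return Logic4::L0;
    if (a == Logic4::L1 && b == Logic4::L1) return Logic4::L1;
    return Logic4::X;
}
```

The pessimistic X propagation shown here is what distinguishes hardware data types from ordinary booleans: an undriven or conflicting wire must not silently read as 0 or 1.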

SystemC is controlled by an industry consortium called OSCI (the Open SystemC Initiative). SystemC is currently in the process of becoming an IEEE standard, with an originally targeted completion date of the first quarter of 2005. OSCI was formed in 2000 and took on the development and standardization of SystemC beginning with Version 1.0, which provided RTL and behavioral HDL modeling capabilities.

The SystemC 2.0 library, the current version, added general system-level modeling capabilities with channels, interfaces, and events. In 2003, the SystemC Verification Library (SCV 1.0) was released. SCV, originally based on Cadence's TestBuilder, was rebuilt on top of SystemC and submitted to the Open SystemC Initiative. TestBuilder provides a C++ signal class, interfacing C++ to an HDL design at the signal level. Layered on SystemC, SCV is a set of C++ classes that provides support for randomization, constraints and temporal expressions. [2]
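The randomization-with-constraints idea that SCV adds can be pictured, in plain C++ terms, as drawing candidate stimulus values and rejecting those that violate a user-supplied constraint. The sketch below uses only the standard library; the function name and rejection-sampling approach are illustrative assumptions, not the SCV API (SCV ships a real constraint solver).

```cpp
#include <random>
#include <functional>

// Constrained random stimulus, sketched with the standard library only.
// Draw uniformly from [lo, hi] and keep the first value that satisfies
// the constraint predicate. SCV solves constraints far more cleverly;
// this shows only the concept of "randomize under a constraint".
int randomize_in_range(int lo, int hi,
                       const std::function<bool(int)>& constraint,
                       unsigned seed) {
    std::mt19937 gen(seed);
    std::uniform_int_distribution<int> dist(lo, hi);
    for (;;) {
        int v = dist(gen);
        if (constraint(v)) return v;   // accept only constraint-satisfying values
    }
}
```

A testbench would call this per stimulus item, e.g. to generate only word-aligned addresses: `randomize_in_range(0, 255, [](int x){ return x % 4 == 0; }, seed)`.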

Additional information about the SystemC language, tools, and the OSCI (Open SystemC Initiative) organization can be found at www.systemc.org.


Transaction-Level Modeling – Simulation Advantages (Speed and Code Size)

In SystemC, Transaction-Level Models (TLM) are used for modeling an executable platform and typically describe hardware, or behavioral simulations of hardware. The TLM model interface is 'timed functional' and may or may not be cycle-accurate (in TEM, the TLMs are cycle-accurate). One of SystemC's key advantages in using TLM over RTL is its simulation speed.

Typically, in a head-to-head comparison of simulation cycle times:

C has a 25:1 speed advantage over RTL

SystemC TLM has a 100-1000:1 speed advantage over RTL

Also, in terms of simulation code size (lines of code), TLM models are 10x smaller than RTL models. Consequently, they are easier to write and faster to simulate.
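The speed gap comes from abstraction: a TLM model completes a bus transfer in one function call, while an RTL model must be evaluated on every clock cycle of that transfer. The hedged plain-C++ sketch below contrasts the two styles; the class names, the event counter, and the 4-cycles-per-write figure are illustrative assumptions, not actual SystemC TLM interfaces.

```cpp
#include <cstdint>
#include <map>

// Transaction-level memory model: one call per bus write, no clocking.
struct TlmMemory {
    std::map<uint32_t, uint32_t> mem;
    long events = 0;                       // units of simulator work
    void write(uint32_t addr, uint32_t data) {
        mem[addr] = data;                  // whole transfer in one "event"
        ++events;
    }
};

// RTL-style model of the same write: address/data are driven over several
// clock cycles (setup, enable, capture), so the simulator evaluates the
// model once per clock edge instead of once per transfer.
struct RtlMemory {
    std::map<uint32_t, uint32_t> mem;
    long events = 0;
    void write(uint32_t addr, uint32_t data, int cycles_per_write = 4) {
        for (int c = 0; c < cycles_per_write; ++c) {
            ++events;                      // one evaluation per clock edge
            if (c == cycles_per_write - 1)
                mem[addr] = data;          // data captured on the last cycle
        }
    }
};
```

Both models end up with the same memory contents, but the RTL-style one does strictly more simulator work per transfer; in a real pin-accurate model the per-cycle cost is far higher than a counter increment, which is where the 100-1000:1 figures come from.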

Flowtracer/EDA: The Workflow Manager

Flowtracer/EDA is a resource management tool that tracks flows consisting of jobs, files and the dependencies between them. It can direct the parallel execution of jobs to bring files up to date (like Make) using any of the popular batch queuing systems. It has command-line, X-Windows GUI and Web-based interfaces to review and control flows.

The Flowtracer/EDA usage model has two main steps:

Plan out a flow by executing a TCL configuration file,

Execute that flow by triggering an update process.

As the flow executes, it is possible to monitor the progress as jobs pass, fail or wait to be executed. Flowtracer/EDA is based on a dependency technology called “Runtime Tracing”, which is the ability to dynamically discover (tool, job or file) dependencies via the operating system, without explicitly being told what they are. This ability to intercept dependencies between tools and compute jobs is a very important one within an integrated framework of tools like TEM, and it essentially automates TEM's dependency management.
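The update step described above can be pictured as Make-style propagation over a dependency graph: a job re-runs when any of its inputs is newer than its output, and re-running it makes downstream jobs out of date in turn. The sketch below is a minimal plain-C++ model of that idea, with logical integer timestamps standing in for file mtimes; the structure names are hypothetical and this is not the Flowtracer API.

```cpp
#include <algorithm>
#include <map>
#include <set>
#include <string>
#include <vector>

// Make-like update over a dependency graph: a target is out of date when
// any dependency has a newer timestamp; "running the job" bumps the
// target's timestamp so downstream targets become out of date in turn.
struct Flow {
    std::map<std::string, std::vector<std::string>> deps;  // target -> inputs
    std::map<std::string, int> mtime;                      // logical timestamps

    bool out_of_date(const std::string& t) const {
        for (const auto& d : deps.at(t))
            if (mtime.at(d) > mtime.at(t)) return true;
        return false;
    }

    // Update targets in the given (topological) order; return the rebuilt set.
    std::set<std::string> update(const std::vector<std::string>& order) {
        std::set<std::string> rebuilt;
        for (const auto& t : order)
            if (deps.count(t) && out_of_date(t)) {
                int newest = mtime[t];
                for (const auto& d : deps[t]) newest = std::max(newest, mtime[d]);
                mtime[t] = newest + 1;     // "run the job": output is now fresh
                rebuilt.insert(t);
            }
        return rebuilt;
    }
};
```

Runtime Tracing's contribution is that the `deps` map does not have to be written by hand: it is discovered by watching which files each job actually touches.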

What is Flowtracer/TEM? The Test Environment Manager

Flowtracer/TEM is an application built on top of Flowtracer/EDA. This application has been designed to support the design, test and verification engineer. Flowtracer/TEM attempts to unify, under a common browser interface, the daily activities of distributed Design and Verification teams related to the verification of complex hardware and software systems, including:

the interaction with the revision control system, like CVS, Perforce, SOS, etc.

the interaction with batch processing on the CPU compute farm, like LSF, SGE, or Flowtracer/NC

the interaction with a large number of simulation and test runs – of various configurations

the interaction with the SQL database used to store historical data about the testing activity

Main attributes: TEM + SystemC for Verification

Rapid prototyping and Modeling environment

High Level Architectural Modeling

Implementable modeling – Executable Specification


A modular system that integrates the disparate tool components needed to manage test sessions. These components include:

A relational database: MySQL or Oracle

A code revision control system: CVS

A compute farm (local servers): Flowtracer/NC, LSF or SGE

A bug tracking and requirements system: Bugzilla

Integrated design team e-mailing lists


Goals and Benefits of TEM:

To implement a browser interface, effectively a control panel for Distributed Design Teams, for performing the verification process (via test sessions): verification jobs can be launched, suspended, rerun or terminated, and coordinated on a matrix of jobs vs. test processes, both on a module-by-module basis and on an overall simulation test bench basis.

Allow visualization of project data for the team developers, project leads and managers: bringing together a development team that may cross departmental or company boundaries or be distributed around the world.

Promote clearer communication between verification developers and design teams, where the need for enhanced collaboration and teamwork is paramount.

Modularize verification testing into a workflow process construct, consisting of a build system, code revision control, bug tracking and archive/retrieval of previous test sessions.

Automatically handle, via a configurable workflow system: control of differing simulations, the simulation environment, the compute resources and test bench files used, and test regression and code coverage options.

Handle change control, which refers to a number of issues:

Code changes – (code revision control) store/checkout of the simulation and test bench test files.

Archiving – file revisions (CVS) vs. test results, to keep track of the relevant changes in files that could cause changes in outcomes.

Relate test failures to causes for known bugs, and automatically record unknown bugs for later broadcast to verification testers and design block owners via email.

Handle the Simulation Process:

Launching the simulation onto the CPU server farm (via a load-sharing mechanism)

Collecting the Simulation output

Parsing the output logs and evaluating the test pass/fail criteria

Relaunching jobs with the “exact environment” of previous test session failures

Relate test failures to causes for known bugs, and automatically record unknown bugs for later broadcast to verification testers and design block owners via email.
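The log-parsing step above amounts to scanning each simulation log against pass/fail criteria. A minimal sketch follows; the specific criteria (any "Error"/"FATAL" line fails, and a completion marker is required) and the marker string are hypothetical assumptions, not TEM's actual parser.

```cpp
#include <sstream>
#include <string>

// Evaluate a simulation log against a simple pass/fail criterion:
// fail on any line containing "Error" or "FATAL", and require an
// explicit completion marker so that truncated logs also fail.
bool test_passed(const std::string& log) {
    std::istringstream in(log);
    std::string line;
    bool completed = false;
    while (std::getline(in, line)) {
        if (line.find("Error") != std::string::npos ||
            line.find("FATAL") != std::string::npos)
            return false;                  // any error line fails the test
        if (line.find("Simulation complete") != std::string::npos)
            completed = true;
    }
    return completed;                      // missing marker => crashed/killed job
}
```

Requiring the completion marker matters on a compute farm: a job killed by the queuing system produces a truncated, error-free log that must not be counted as a pass.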


Test History – Test Analysis: Spanning multiple Sessions

Test Results – File Change Analysis between Sessions

As part of the change analysis process, Flowtracer/TEM archives (CVS) file code revisions against (test session) results, to keep track of the relevant changes in files that could cause changes in outcomes. In fact, the Change Analysis page, below, is one of the most interesting and powerful features of Flowtracer/TEM.

The Change Analysis Process brings together:

information about file revisions

information about file history (Test file results)

information about file dependencies (test build and workflow, like Makefiles)

First, TEM finds out which files have been modified between test sessions. Then it finds the files that are “upstream” dependencies of the specific test under consideration. Finally, TEM finds the intersection of these two sets, to determine which changes are the most likely to be relevant for the change of test behavior that has been observed.
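The three steps above reduce to a set intersection: files changed between the two sessions, intersected with the upstream dependency closure of the failing test. A hedged sketch, assuming the first two sets are already computed (the function name and file names are illustrative, not TEM internals):

```cpp
#include <algorithm>
#include <iterator>
#include <set>
#include <string>

// Steps 1 and 2 are assumed done: "modified" holds the files changed
// between the two sessions, "upstream" holds the dependency closure of
// the test under consideration. Step 3: their intersection is the set
// of changes most likely to explain the observed pass -> fail transition.
std::set<std::string> relevant_changes(const std::set<std::string>& modified,
                                       const std::set<std::string>& upstream) {
    std::set<std::string> out;
    std::set_intersection(modified.begin(), modified.end(),
                          upstream.begin(), upstream.end(),
                          std::inserter(out, out.begin()));
    return out;
}
```

Files that changed but are not upstream of the test (say, a README) drop out, as do upstream files that did not change, leaving a short suspect list for the engineer.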


Figure: Given the RSA test history above, find the relevant file changes for a given change in test behavior (pass > fail, etc.) of the RSA cryptography test between any two test sessions.


Test Management - Adding a New Test Session

The instantiation of a new test session adds an entry into the session database and also does a local CVS checkout of the SystemC libraries, the test session flow, and the test harness/case files required to do the simulation. Prior to the 'Add new session' instantiation, a typical user (or team member) would choose the details of the test run and simulation type by completing the New Session form.

Normally, test users add a new test session by filling out the form below, using the browser interface.


For a given new session, the CVS Check-Out View determines which set of files will be checked out by CVS and what type of system simulation and tests will be run.

In the future, many other views may be possible, using different simulators or co-simulators, like Verilog or a Direct-C interface. The following page shows only the SystemC 2.1 View, its tests, the SystemC 2.1 library and the SCV 1.0 library (renamed here SCV 2.1).

4 Implementation Phase – System Model (Executable Specification – SystemC)


These models are an executable specification of the system and are useful for architecture exploration and for algorithm determination and proof. They typically describe both hardware and software components.

Here, we lay out the requirements for a Design/Compile/Simulate/Debug environment using a virtual prototyping flow based on SystemC. The page below describes the virtual prototyping components and their execution order.

5 TEM Architecture – using SystemC Libraries

Because SystemC uses behavioral Transaction-Level Modeling (TLM), we construct a transaction-level layer, a test harness [tests], common to all test benches for the design under verification (DUV). The simulation test functions required to implement the test cases identified in the Verification Plan are built on top of the test harness. A test function and the test harness together form a test bench. We use as templates for the simulation test benches 18 real-world “examples” supplied with the OSCI SystemC 2.1 library. TEM consists of 20 test functions and 18 test harnesses, making 18 test benches in total.

Figure N: The page below displays 20 classes of tests, which are determined by slicing up a test session based on the test structure. Each test is further divided into a number of CPU jobs, denoted in Status.


5.1 Hardware and Software Configuration

We ran our solution on commodity Linux operating systems, such as Red Hat Linux and SUSE on x86 processors, and on Unix, Solaris 10, on Sun Opteron servers with AMD64. Solaris 10 required an AMD64 port of the SystemC 2.1 libraries. [14] Since SystemC is a C++ development environment, the standard C/C++ development environment is used. A SystemC reference simulator and class library (SystemC 2.1) were downloaded from the OSCI Website.

The OSCI SystemC 2.1 library and test Makefiles supplied within the original release (oct_12_2004.beta) were first modified to support the modified source code of the AMD64 port of the SystemC 2.1 libraries. They were then reorganized to support GNU Gcov-based code coverage, and later all the SystemC 2.1 Makefiles were translated into TCL flows, via both manual (hand) translation and automatic translation of Makefiles into TCL flows (via vmake). The Tool Command Language (TCL) is a scripting and programming language. The flows were finally assembled into a tools framework, consisting of many CGI modules written in TCL, that drive the various component tools of TEM: CVS (revision control), Gcov (code coverage), Bugzilla (bug tracking), MySQL (the SQL database) and the CPU compute farm.

TEM Configuration Management

Much of the work in setting up Flowtracer/TEM goes into writing a good configuration file. The configuration file describes how each TEM component tool controls its environment. The configuration file defines:

Which TEM component subsystems are used (e.g. CVS for code revision control, MySQL as the database, the bug tracking system, and the CPU compute farm for batch processing).

How the tests are defined and which workflow description file is used (../flows/SCSessionFlow.tcl). Flowtracer/TEM looks at a set of jobs and determines which test these jobs implement.

Example of the page that shows the Configuration file for Flowtracer/TEM

Our TEM experiment soon proved that building the SystemC 2.1 libraries (oct_12_2004.beta) and test benches from scratch was more effective than we previously thought possible. Using the GNU C/C++ compiler GCC 3.4.3, the SystemC 2.1 libraries build to completion in under 10 minutes on a P4 x86, and in under 5 minutes on the Sun compute server farm. The Sun CPU farm consisted of 2-4 Sun Opteron CPUs running Solaris 10.

We used as templates for the simulation test benches 18 real-world “examples” supplied with the OSCI SystemC 2.1 library. TEM consists of 20 test functions and 18 test harnesses, making 18 test benches in total. A few of the test functions and harnesses were manually modified to accept parametric input (command-line arguments). The 18 test harnesses finally break down into a total of 105 verification test cases. Each test case is submitted, as a series of single jobs, to the CPU farm for execution. This concludes our discussion of software modifications.

Figure 7: The TEM algorithm, compared with the other applications.

6 Conclusions: TEM Simulation Results

We have taken great pains to describe how TEM was set up; now, the payoff is to discuss our results.

Seizing upon this ideal configuration, using SystemC for simulation with no synthesized RTL, we ran four novel experiments: (1) we measured test code coverage, disk usage (per test session), the number of assertions over time, and verification test simulation throughput on our Sun CPU farm; (2) we ran on 3 CPU nodes spread throughout an intranet network, and compared the results against running locally; (3) we ran 500+ test sessions with a stand-alone Sun CPU farm at DAC 2005, and compared the results against running locally; and (4) we asked (and answered) what would happen if standard web browsers were used instead of local Flowtracer project windows.

In general, we were pleased and encouraged by the outcomes of the TEM experiments, which showed that by using a higher-level approach to verification and simulation test, simulation cycle-time throughput could be increased while simultaneously giving distributed Design and Verification teams an environment for sharing and archiving results.

Future Work – with Test Environment Manager

While TEM's integration of open-source tools and the SystemC 2.1 libraries has been successfully used to speed up the front-end verification process, it is by no means limited to stitching together just the functionality of front-end verification tools. Given the high level of tool integration required throughout a complex SoC design flow, particularly in the 130-90nm and below range, TEM could be used by chip design and layout teams to stitch together into one flow the following tool functionality: physical synthesis, timing analysis, power and floor planning, and place and route.

Acknowledgments:

I would like to thank the RTDA Development Team for their support in the design and implementation of the TEM tool components integration needed to manage the test sessions. Runtime Design Automation deserves much of the credit for the final completion of the idea of TEM. Also, Gilles Descamps, PhD, ATI Technologies, should be acknowledged as the originator of the idea (in 2002) of using local web-based Flowtracer/EDA scripts to run jobs and present results to distributed design teams, which later became the basis for the TEM framework.

Runtime Design Automation – CTO, Andrea Casotto, PhD (Flowtracer/TEM's primary developer)

RTDA Website - Santa Clara CA http://www.rtda.com

I would also like to thank Frank A. Kingswood, http://www.kingswood-consulting.co.uk, for his assistance in porting the SystemC 2.1 libraries onto the AMD64 platform.

Credits & Trademarks:

Open SystemC Initiative (OSCI) – SystemC is a trademark or registered trademark of Open SystemC Initiative, Inc. in the United States and other countries and is used with permission.

© 2004 Cadence Design Systems, Inc. All rights reserved. Cadence, the Cadence logo, Verilog & TestBuilder are registered trademarks of Cadence Design Systems, Inc.


© 2004 Synopsys, Inc. Synopsys is a registered trademark of Synopsys, Inc. in the United States and/or other jurisdictions.

© 2005 Runtime Design Automation, Inc. Flowtracer/EDA & Flowtracer/TEM are trademarks of Runtime Design Automation, Inc. in the United States and other countries and are used with permission.

© 2005 Advanced Micro Devices, Inc. All rights reserved. AMD, the AMD Arrow logo, AMD Opteron, and combinations thereof, are trademarks of Advanced Micro Devices, Inc. Sun and Solaris 10 are registered trademarks of Sun Microsystems Corporation in the United States and/or other jurisdictions. Intel and Pentium 4 are trademarks of Intel Corporation.

© 2004 Forte Design Systems. All rights reserved. Forte trademarks and/or service marks may be registered in the United States and/or other jurisdictions. Oracle is a registered trademark of Oracle Corporation in the United States and/or other jurisdictions. Other product and company names used in this presentation are for identification purposes only and may be trademarks of their respective companies.

References:

[1] Victor Berman, Cadence Design Systems, Inc., "A Tale of SystemC and SystemVerilog", Methods: System Languages, Chip Design, June/July 2005, pp. 19-21.

[2] Victor Berman, Cadence Design Systems, Inc., "Raising the Level of Abstraction for Design & Verification: SystemC & SystemVerilog in a Multi-language Environment", DVCon 2004.

[3] Alain Raynaud, "The New Gate Count: What is Verification's Real Cost?", Electronic Design, ED Online ID #5954, 27 October 2003.

[4] EE Times EDA Survey Results, fittedasurvey-fullreport.pdf, Deutsche Bank F.I.T.T., 23 June 2005.

[5] Ilkka Tuomi, "The Lives and Death of Moore's Law", First Monday, http://www.firstmonday.org/issues/issue7_11/tuomi/#2, 23 July 2005.

[6] Gartner DataQuest, www.gartner.com, June 2005.

[7] Janick Bergeron, "Writing Testbenches: Functional Verification of HDL Models", 2nd Edition, Kluwer, 2004, p. 189.

[8] Mark Creamer, "Nine reasons to adopt SystemC ESL design", EE Times, 16 Sept 2004. URL: http://www.eetimes.com/showArticle.jhtml?articleID=47212187

[9] Alain Clouard, Kshitiz Jain, Frank Ghenassia, Laurent Maillet-Contoz, Jean-Philippe Strassen, "Using Transactional Level Models in a SoC Design Flow", Experimental Results (Simulation Figures), p. 57, STMicroelectronics, France, in W. Müller, W. Rosenstiel and J. Ruf (eds.), SystemC: Methodologies and Applications, Kluwer, 2003.

[10] Colleen Frye, "Can IT developers work together?", Application Development Trends Magazine, 1 Nov. 2002.

[11] Charles S. Hart, "Flowtracer/TEM – Integrated Workflow Tool for ASIC Design & Verification Teams", RTDA DAC2005_Flowtracer_TEM.ppt, MS PowerPoint 2000, June 2005.

[12] Tom Katsioulas, "Design Resource Management for Complex SoC Design", RTDA_PitchTomKat.ppt, MS PowerPoint 2000, Dec. 2003.

[13] Ole Blaurock, "A SystemC-Based Modular Design and Verification Framework for C-model Reuse in a HW-SW Co-Design Flow", ICDCSW, vol. 7, no. 7, pp. 838-843, 2004. Dept. of Computer Science, University of Hamburg, Hamburg, Germany.

[14] Frank A. Kingswood, "SystemC: Bug 409 - AMD64 patch for SystemC 2.1 (Solaris Opteron)" (systemc_2_1.oct_12_2004.beta/Makefile, systemc-amd64/Makefile). http://www.kingswood-consulting.co.uk/frank/sc-amd64.patch

Charles S. Hart, “SystemC : Bug 447 Fix SystemC 2.1 - AMD64 (x86_64 port) on Solaris10 w/ gcc 3.4.3",https://www.systemc.org/tracker/index.php?aid=447, 10 Jun. 2005.

[15] Open SystemC Initiative website: www.systemc.org. SystemC Language Reference Manual and SystemC User Guide, included with the SystemC 2.0.1 download from www.systemc.org. SystemC 2.1 Library (oct_12_2004.beta) download from www.systemc.org, 2004.

[16] Forte Design Systems, SystemC Training Course, www.forteds.com/SystemC/training/index.asp, 2004.

[17] Cadence TestBuilder 1.3, www.testbuilder.net, Cadence Design Systems, Inc., Sept 2003.