Flowtracer/TEM - Test Environment Manager
A Collaborative Integrated SystemC-based Workflow Tool for Distributed ASIC Design and Verification Teams
Charles Hart
Envision Systems, 1803 Marabu Way,
Fremont, CA 94539, USA
hartc_2000@yahoo.com
Abstract
While recent advancements in process technologies and sophisticated design methods enable true system-on-a-chip (SoC) designs, the increased complexity of hardware and software in these designs levies increasingly demanding verification requirements on design teams. Verification already consumes over half the time and resources dedicated to today's 130-90nm SoC designs. SoC and ASIC engineers can expect even greater verification requirements with each new process generation (V. Berman [1]).
At least 80 percent of the hardware designed today with hardware description languages (HDLs), like Verilog and VHDL, is designed at the register-transfer level (RTL). Many design teams are finding that RTL descriptions in HDLs are at their limit for handling the size and complexity of today's designs. The issues include simulation times that are too long, verification solutions that are too complex, and designs that are too large to describe at the RTL level (V. Berman [2]).
A higher-level design verification model is required, one that exploits the faster simulation cycle times of a higher-level simulation language. One solution is to explore architectural trade-offs on a high-level SystemC model. SystemC is a candidate for the language that will be used at all levels of system and chip design, and using SystemC in conjunction with RTL/behavioral design can accelerate this transition. SystemC opens the door to a new method of developing hardware systems, and it can benefit current SoC design flows.
In this paper, we describe the need for a higher level verification language and the use of more integrated design
verification tools. If these tools can be shared between Design and Verification teams, they help to create
visualization of processes and test results. This visualization of processes and test results can speed up
implementation and management of the Verification Test Plan and the Design Verification Process.
Also described is the implementation of an integrated SystemC-based verification environment that addresses these issues by using Runtime Design Automation's Flowtracer/EDA and its application layer, the Test Environment Manager (TEM). This environment applies a session-based verification methodology, combining assertion-based verification and coverage-driven verification behind a Web-browser interface for distributed design and verification teams.
1 Introduction
While recent advancements in process technologies and sophisticated design methods enable true system-on-a-chip (SoC) designs, the increased complexity of hardware and software in these designs levies increasingly demanding verification requirements on design teams. Verification already consumes over half the time and resources dedicated to today's 130-90nm SoC designs. SoC and ASIC engineers can expect even greater verification requirements with each new process generation (V. Berman [1]). Moore's Law predicted that the number of transistors per integrated circuit (gate count) would double every 18 to 24 months. Moore's Law has held for thirty years, and it looks like it will hold for at least another ten. However, front-end verification requirements grow approximately as the square of the gate count - a sobering fact as designers move to next-generation process technologies capable of supporting 100M gates (V. Berman [2]). (See Figure 1.)
Figure 1: Relationship between Increasingly Complex Design Flows and Cell Geometries. Note the increase in the
number of process steps and handoffs as cell geometries get smaller.
At least 80 percent of the hardware designed today with hardware description languages (HDLs), like Verilog and VHDL, is designed at the register-transfer level (RTL). Many design teams are finding that RTL descriptions in HDLs are at or beyond their limit for handling the size and complexity of today's designs. The issues include simulation times that are too long, verification solutions that are too complex, and designs that are too large to describe at the RTL level. Along with rising gate counts, engineers face a shift toward software-intensive designs that significantly exacerbates verification complexity (V. Berman [2]). As a result of the increasing verification complexity and concurrently shortened design and verification cycles, today's verification methodologies, based on HDLs alone, are not likely to succeed.
2 EDA Environment Reality - Design Challenges (in 2005)
Six convergent factors are changing the way EDA tools are used in SoC design.
Diverse Tools – Tools from differing EDA vendors are often used together during development. These EDA tools, despite best intentions, were not built to work together. “Many developers and engineering teams using these same design tools get stuck in a rut. They're doing things the same way because it's easier not to change or..{omissis}..because the process works – somewhat.” (A. Raynaud [3]) EDA vendors must move beyond 'point-tool' technologies and offer customers EDA tools with deeper integration, broader functionality, and interoperability.
Aging Project Methodologies – Engineers stitch these EDA tools together using ad-hoc and legacy scripts, without accounting for intra-tool dependency information. “56% of tools budgets are spent internally – 80% of which is spent stitching tools together. Few vendors currently provide seamless design flows. We view this as one of the biggest opportunities for the EDA industry to accelerate growth.” (Deutsche Bank/EE Times [4])
Shortening Project Schedules – Shorter design and verification cycles with increasing design complexity require more communication between design teams. More communication requires more design time, particularly between distributed design teams. There are also more interrelated and concurrent design processes that overlap in time, and when things go wrong there is less time for corrective action. Today's project schedule management requires multiple views of interrelated design and verification processes, including HDL source revision control, test simulation environment and verification test status, and bug (or issue dependency) tracking, to name a few.
Cost and Time-to-Market (CTTM) – “CTTM issues are reaching breaking point. Exploding chip complexity is testing traditional EDA tool flows. Customers are demanding deeper integration with fewer bugs in order to deliver SoC design for the growing wireless and consumer electronics market.” (Deutsche Bank [4]) “Increasing gate counts and greater software content are fueling a dramatic increase in system hardware states. They are driving a need for more effective system-level verification methodologies.” (V. Berman [1])
Design Outsourcing – According to the recent EE Times 2005 EDA Survey, “39% of the respondents indicated that their company outsources some portion of chip design to a third-party provider. {Omissis}.. Historically, a pickup in outsourcing was a reasonable leading indicator for increased design activity - companies utilize third-party vendors to offload incremental demand until the “upturn” looks sustainable, at which point full-time design engineers are hired.” (Deutsche Bank [4]) Distributed design and verification teams, whether in different locations or time zones, working on different pieces of the design are becoming the norm for SoCs and ASICs. Design and verification managers and engineers need tools that track large amounts of data to communicate and manage a project's progress effectively.
Collaborative Products - “Development teams can leverage a variety of collaborative products, such as e-
mail and messaging products, discussion forums, workflow process engines, software configuration
management, version control and bug-tracking tools, and portals. One problem corporate IT often faces is
the 'islands' of development tools and methodologies that may exist within a large company. When building
teams across business units or different companies, this can be an obstacle. But there are a few available
solutions {omissis}, both in terms of the design building community -- and in working in distributed teams
across the Internet.” (C. Frye [10])
Design Verification Challenges and Alternatives
The rest of the paper proceeds as follows:
First, we describe the need for a higher level verification language and the use of more integrated design
verification tools. If these tools can be shared between Design and Verification teams, they help to create
visualization of processes and test results. This visualization of processes and test results can speed up
implementation and management of the Verification Test Plan and the Design Verification Process.
Second, we lay out the software development requirements for a design/compile/simulate/debug environment using a virtual prototyping flow based on SystemC. The initial step of functional verification is the creation of an executable specification using SystemC. This bit-true model is bus-cycle accurate (or quasi cycle-accurate) and provides a flexible software development platform for test and verification. Through the use of assertion-based testing and code coverage, this model simultaneously implements both design specification validation and verification.
Further, we demonstrate the efficacy of a SystemC-based design flow by using Flowtracer/EDA to implement an integrated browser-based verification environment (TEM) and its management tools. The Web browser's scripts, when used in conjunction with CVS and a SQL database, become a test management infrastructure (knowledge base): a tracking system allowing each design or verification engineer to learn how their work integrates with the work of others, while working independently. This shared knowledge helps to remove human variability in the build and test environment and increases design verification productivity while minimizing rework.
Finally, we summarize the results of the TEM implementation.
3 Set of Alternative Solutions – A Need for a Change in Approach to Design Verification
Goal: Reduce the verification/design gap – Moore's Law: increasing gate count vs. verification effort.
Productivity: Design verification effort (measured in transistors/staff-month) has been increasing for years.
Problem: Verification has reached 70% of the total development effort, leaving only 30% for the actual design [5].
System level design: Challenges and Requirements
System level design: Challenges and Requirements
One solution to exploding chip gate counts and the associated verification complexity is to raise the level of abstraction used in verification and use higher-level models, like C/C++, during system-level simulations.
But each solution has its own limitations. Basic C lacks the three fundamental concepts necessary to model hardware designs: concurrency, connectivity (hierarchy), and time. Basic C++ also lacks the features necessary to support the HVL productivity cycle: randomization, constrainability, temporal expressions, and functional coverage measurement (J. Bergeron [7]).
SystemC and Transaction-Level Modeling (TLM)
SystemC is a de facto industry-standard language implemented as a C++ class library, providing both system-
level and HDL modeling capabilities. SystemC provides hardware-oriented constructs within the context of C++ as
a class library implemented in standard C++. Its use spans design and verification from concept to implementation
in hardware and software. SystemC provides an open-source, inter-operable modeling platform which enables the
development and exchange of very fast system-level C++ models. It also provides a stable platform for
development of system-level tools.
SystemC “extends” the C++ language without changing its syntax, by using classes. Models are introduced as a means of encapsulating software and hardware behavior and describing hierarchy. The SystemC library is a set of C++ classes developed to support the following concepts:
Concurrency – The ability to describe actions that occur at the same time, independently of each other. Hardware systems are inherently concurrent and their components operate in parallel (as events in time); these are defined and executed within the SystemC simulator. In SystemC, parallelism is implemented via processes (SC_METHOD) and software threads (SC_THREAD).
Connectivity – The ability to describe a design using simpler blocks and then connecting them together. In SystemC, the simpler components are modules, ports, processes, interfaces, channels, and events. A module may represent a system, a block, a board, or a chip, and the ports may represent the interface, pins, etc.
Notion of Time – The ability to represent how the internal state of a design block evolves over time. A model of computation (MOC) is broadly defined by a model of time (real, integer, untimed) and event-ordering constraints (globally ordered, partially ordered, etc.), with methods of communication between processes and rules for process activation.
Hardware Data Types – These describe arbitrary-precision integers, fixed-point numbers, and 4-valued logic (i.e. the data type used for describing a tri-state logic value (0, 1, X, Z) in RTL hardware gates).
SystemC is controlled by an industry consortium called OSCI (the Open SystemC Initiative). SystemC is currently in the process of becoming an IEEE standard, with an originally targeted completion date of the first quarter of 2005. OSCI was formed in 2000 and took on the development and standardization of SystemC beginning with Version 1.0, which provided RTL and behavioral HDL modeling capabilities.
The SystemC 2.0 library, the current version, added general system-level modeling capabilities with channels, interfaces, and events. In 2003, the SystemC Verification Library (SCV 1.0) was released. SCV, originally based on Cadence's TestBuilder, was rebuilt on top of SystemC and submitted to the Open SystemC Initiative. TestBuilder provides a C++ signal class, interfacing C++ to an HDL design at the signal level. Layered on SystemC, SCV is a set of C++ classes that provides support for randomization, constraints, and temporal expressions. [2]
Additional information about the SystemC language, tools, and OSCI (Open SystemC Initiative) organization can
be found at: www.systemc.org .
Transaction-Level Modeling – Simulation Advantages (Speed and Code Size)
In SystemC, Transaction-Level Models (TLMs) are used for modeling an executable platform and typically describe hardware or behavioral simulations of hardware. The TLM model interface is 'timed functional' and may or may not be cycle-accurate (in TEM, the TLMs are cycle-accurate). One of SystemC's key advantages in using TLM over RTL is its simulation speed.
Typically, in a head-to-head comparison of simulation cycle times:
C has a 25:1 speed advantage over RTL
SystemC TLM has a 100-1000:1 speed advantage over RTL
Also, in terms of simulation code size (lines of code), TLM models are 10x smaller than RTL models. Consequently, they are easier to write and faster to simulate.
Flowtracer/EDA: The Workflow Manager
Flowtracer/EDA is a resource management tool that tracks flows consisting of jobs, files and the dependencies
between them. It can direct the parallel execution of jobs to bring files up-to-date (like Make) using any of the
popular batch queuing systems. It has command-line, X-Windows GUI and Web-based interfaces to review and
control flows.
The Flowtracer/EDA usage model has two main steps:
Plan out a flow by executing a TCL configuration file,
Execute that flow by triggering an update process.
As the flow executes, it is possible to monitor the progress as jobs pass, fail, or wait to be executed. Flowtracer/EDA is based on a dependency technology called “Runtime Tracing”, which is the ability to dynamically discover (tool, job, or file) dependencies via the operating system, without explicitly being told what they are. This feature of intercepting behavioral dependencies (between tools and compute jobs) is a very important one within an integrated framework of tools like TEM, and essentially automates TEM's dependency management.
What is Flowtracer/TEM? The Test Environment Manager
Flowtracer/TEM is an application built on top of Flowtracer/EDA. This application has been designed to support the design, test, and verification engineer. Flowtracer/TEM attempts to unify, under a common browser interface, the daily activities of distributed design and verification teams related to the verification of complex hardware and software systems, including:
the interaction with the revision control system, like CVS, Perforce, SOS, etc.
the interaction with batch processing on the CPU compute farm, like LSF, SGE, or Flowtracer/NC
the interaction with a large number of simulation and test runs, of various configurations
the interaction with the SQL database used to store historical data about the testing activity
Main attributes: TEM + SystemC for Verification
Rapid prototyping and Modeling environment
High Level Architectural Modeling
Implementable modeling – Executable Specification
Modular system that integrates the disparate tool components needed to manage test sessions. These components include:
A relational database (MySQL or Oracle)
A code revision control system (CVS)
A compute farm (local servers) (Flowtracer/NC, LSF, or SGE)
A bug tracking and requirements system (Bugzilla)
Integrated design team e-mailing lists
Goals and Benefits of TEM:
To implement a browser interface, effectively a control panel for distributed design teams, to perform the verification process (via test sessions): verification jobs can be launched, suspended, rerun, or terminated, and coordinated on a matrix of jobs vs. test processes, on a module-by-module basis and on an overall simulation test bench basis.
Allow visualization of project data for team developers, project leads, and managers: bring together a development team that may cross departmental or company boundaries or be distributed around the world.
Promote clearer verification developer and design team communication, where the need for enhanced
collaboration and teamwork is paramount.
Modularize verification testing into a workflow process construct consisting of a build system, code revision control, bug tracking, and archival/retrieval of previous test sessions.
Automatically handle, via a configurable workflow system: control of differing simulations, the simulation environment, compute resources and test bench test files used, and test regression and code coverage options.
Handle change control, which refers to a number of issues:
Code changes – (code revision control) store/checkout of the simulation and test bench test files.
Archiving – file revisions (CVS) vs. test results – to keep track of the relevant changes in files that could cause changes in outcomes.
Relate test failures to causes for known bugs and automatically record unknown bugs for later
broadcast to Verification testers and Design block owners via email.
Handle the Simulation Process:
Launching the Simulation onto the CPU server farm (via load sharing mechanism)
Collecting the Simulation output
Parsing the output logs and evaluation of the test Pass/Fail Criteria
Relaunching jobs with the “exact environment” of previous test session failures
Test History – Test Analysis: Spanning Multiple Sessions
Test Results – File Change Analysis between Sessions
As part of the Change Analysis process, Flowtracer/TEM archives (CVS) file code revisions against (test session) results, to keep track of the relevant changes in files that could cause changes in outcomes. In fact, the Change Analysis page, below, is one of the most interesting and powerful features of Flowtracer/TEM.
The Change Analysis Process brings together:
information about file revisions
information about file history (Test file results)
information about file dependencies (test build and workflow, like Makefiles)
First, TEM finds out which files have been modified between test sessions. Then it finds the files that are “upstream” dependencies of the specific test under consideration. Finally, TEM computes the intersection of these two sets to determine which changes are the most likely to be relevant to the change in test behavior that has been observed.
Figure: Given the RSA test history above, find the relevant file changes for a given change in test behavior (pass to fail, etc.) of the RSA Cryptography test between any two test sessions.
Test Management - Adding a New Test Session
The instantiation of a new test session adds an entry to the session database and also does a local CVS checkout of the SystemC libraries, test session flow, and test harness/case files required to run the simulation. Prior to the 'Add new session' instantiation, a typical user (or team member) chooses the details of the test run and simulation type by completing the New Session form. Normally, test users add a new test session by filling out the form below, using the browser interface.
For a given new session, the CVS Check-Out View determines which set of files will be checked out by CVS and what type of system simulation and tests will be run.
In the future, many other views may be possible, using different simulators or co-simulators, like Verilog or a Direct-C interface. The following page shows only the SystemC 2.1 view, its tests, the SystemC 2.1 library, and the SCV 1.0 library (renamed here SCV 2.1).
4 Implementation Phase – System Model (Executable Specification - SystemC)
These models are an executable specification of the system and are useful for architecture exploration and for algorithm determination and proof. They typically describe both hardware and software components.
Here, we lay out the requirements for a design/compile/simulate/debug environment using a virtual prototyping flow based on SystemC. The page below describes the virtual prototyping components and their execution order.
5 TEM Architecture – using SystemC Libraries
Because SystemC uses behavioral Transaction-Level Modeling (TLM), we construct a transaction-level layer, a test harness, common to all test benches for the design-under-verification (DUV). Simulation test functions, required to implement the test cases identified in the verification plan, are built on top of the test harness. The test function and the test harness together form a test bench. As a template for the simulation test benches, we use 18 real-world “examples” supplied with the OSCI SystemC 2.1 library. TEM consists of 20 test functions and 18 test harnesses, making 18 test benches in total.
Figure N: The page below displays 20 classes of tests, which are determined by slicing up a test session based on the test structure. Each test is further divided into a number of CPU jobs, denoted in Status.
5.1 Hardware and Software Configuration
We ran our solution on commodity Linux operating systems, such as Red Hat and SUSE on x86 processors, and on Unix (Solaris 10 on Sun Opteron with AMD64). Solaris 10 required an AMD64 port of the SystemC 2.1 libraries [14]. Since SystemC is a C++ class library, the standard C/C++ development environment is used. A SystemC reference simulator and class library (SystemC 2.1) were downloaded from the OSCI website.
The OSCI SystemC 2.1 library and the tests' Makefiles supplied with the original release (oct_12_2004.beta) were first modified to support the changed source code of the AMD64 port of the SystemC 2.1 libraries. They were then reorganized to support GNU Gcov-based code coverage, and later all the SystemC 2.1 Makefiles were translated into TCL flows, via both manual (hand) translation and automatic translation (via vmake). The Tool Command Language (TCL) is a scripting and programming language. The flows were finally assembled into a tools framework, consisting of many CGI modules written in TCL, that drives the various component tools of TEM: CVS (revision control), Gcov (code coverage), Bugzilla (bug tracking), MySQL (the SQL database), and the CPU compute farm.
TEM Configuration Management
Much of the work in setting up Flowtracer/TEM goes into writing a good configuration file. The configuration file describes how each TEM component tool controls its environment. The configuration file defines:
Which TEM component subsystems are used (e.g. CVS for code revision control, the MySQL database, the bug tracking system, and the CPU compute farm for batch processing).
How the tests are defined and which workflow description file is used (../flows/SCSessionFlow.tcl). Flowtracer/TEM looks at a set of jobs and determines which test these jobs implement.
Example of the page that shows the configuration file for Flowtracer/TEM
Our TEM experiment soon proved that building the SystemC 2.1 libraries (oct_12_2004.beta) and test benches from scratch was more effective than we previously thought possible. Using the GNU C/C++ compiler GCC 3.4.3, the SystemC 2.1 libraries build to completion in under 10 minutes on a P4 x86, and in under 5 minutes on the Sun compute server farm. The Sun CPU farm consisted of 2-4 Sun Opteron CPUs running Solaris 10.
As a template for the simulation test benches, we used 18 real-world “examples” supplied with the OSCI SystemC 2.1 library. TEM consists of 20 test functions and 18 test harnesses, making 18 test benches in total. A few of the test functions and harnesses were manually modified to accept parametric input (command-line arguments). The 18 test harnesses finally break down into a total of 105 verification test cases. Each test case is submitted, as a series of single jobs, to the CPU farm for execution. This concludes our discussion of software modifications.
Figure 7: TEM compared with other applications.
6 Conclusions: TEM Simulation Results
We have taken great pains to describe how TEM was set up; now the payoff is to discuss our results.
Seizing upon this ideal configuration, using SystemC for simulation with no synthesized RTL, we ran four experiments: (1) we measured test code coverage, disk usage (per test session), the number of assertions over time, and verification test simulation throughput on our Sun CPU farm; (2) we ran on 3 CPU nodes spread throughout an intranet network and compared them against running locally; (3) we ran 500+ test sessions with a stand-alone Sun CPU farm at DAC 2005 and compared results against running locally; and (4)
Kluwer, 2003.
[10] Colleen Frye, “Can IT developers work together?”, Application Development Trends Magazine, 1 Nov. 2002.
[11] Charles S. Hart, “Flowtracer/TEM – Integrated Workflow Tool for ASIC Design & Verification Teams”, RTDA DAC2005_Flowtracer_TEM.ppt, MS PowerPoint 2000, June 2005.
[12] Tom Katsioulas, “Design Resource Management for Complex SoC Design”, RTDA _PitchTomKat.ppt, MS PowerPoint 2000, Dec. 2003.
[13] Ole Blaurock, “A SystemC-Based Modular Design and Verification Framework for C-model Reuse in a HW-SW Co-Design Flow”, Proceedings of the 24th International Conference on Distributed Computing Systems Workshops (ICDCSW), vol. 07, pp. 838-843, 2004. Dept. of Computer Science, University of Hamburg, Hamburg, Germany.
[14] Frank A. Kingswood, “SystemC: Bug 409 - AMD64 patch for SystemC 2.1 (Solaris Opteron)”, (systemc_2_1.oct_12_2004.beta/Makefile systemc-amd64/Makefile), http://www.kingswood-consulting.co.uk/frank/sc-amd64.patch. Charles S. Hart, “SystemC: Bug 447 Fix SystemC 2.1 - AMD64 (x86_64 port) on Solaris10 w/ gcc 3.4.3”, https://www.systemc.org/tracker/index.php?aid=447, 10 Jun. 2005.
[15] Open SystemC Initiative website: www.systemc.org. SystemC Language Reference Manual and SystemC User Guide, included with the SystemC 2.0.1 download from www.systemc.org. SystemC 2.1 library (oct_12_2004.beta) download from www.systemc.org, 2004.
[16] Forte Design Systems, SystemC Training Course, www.forteds.com/SystemC/training/index.asp, 2004.
[17] Cadence TestBuilder 1.3, www.testbuilder.net, Cadence Design Systems, Inc., Sept. 2003.