Deploying Functional Qualification at STMicroelectronics
Methodologies & Case Studies

Olivier HALLER
STMicroelectronics, IP&Design, Functional Verification Group

DAC 2008 - Anaheim, June 9th 2008
Agenda

- Motivation and history
  - Initial engagement
  - Deployment of functional qualification within ST
- Functional Qualification Methodology @ ST
  - Use Models & Methodology
  - Sharing Best Practices within ST
  - Detection Strategies
  - Future directions
- Case studies
  - Measuring 3rd party IP quality
  - Detection strategy on a Video Codec



ST & Certess - Motivation and history

- Code & functional coverage were not enough
  - They only measure the capacity of the stimuli to exercise the design
  - Nothing guarantees proper verification of the covered areas
- The interest of mutation-based technology was obvious (see the toy example below)
  - It fills a major hole in functional verification
  - It is the first objective metric to measure functional correctness
- Started to collaborate with Certess in 2004
  - Service model, on a Certitude prototype
  - Very promising results
    - Highlighted verification environment weaknesses
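To make concrete why coverage alone is not enough, here is a toy illustration (mine, not from the talk) of a surviving mutant: the weak test fully covers the code yet never distinguishes the mutated design from the original.

```python
# Toy illustration of the coverage-vs-checking gap (not from the talk).

def alu_add(a, b):
    return a + b              # original design intent

def alu_add_mutant(a, b):
    return a | b              # mutation: '+' replaced by '|'

def weak_test(dut):
    # 100% code coverage of the function, but a poorly chosen vector:
    assert dut(1, 0) == 1     # 1+0 == 1|0, so the mutant slips through

def strong_test(dut):
    assert dut(1, 1) == 2     # 1+1 = 2 but 1|1 = 1, so the mutant is caught

weak_test(alu_add)
weak_test(alu_add_mutant)     # passes: coverage alone proved nothing
strong_test(alu_add)          # passes on the real design
# strong_test(alu_add_mutant) would raise AssertionError: fault detected
```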




Deployment

- Main factors of Certitude adoption
  - Copes with heterogeneous environments, non-intrusively
  - Quality gains: the first objective measure of functional correctness
  - Productivity gains: helps focus verification efforts
  - Tool robustness & ease of use
- Limitations
  - CPU time consumption
  - Analysis of results: getting from a non-detected fault to a bug
  - A methodology was missing to maximize ROI on qualification runs
- Deployment
  - Limitations reduced with proper methodology
  - Now covering 80% of ST's IPs
  - Certitude metrics are our most important verification-completeness indicators
    - Code coverage has been replaced by the activation ratio



Certitude Use Models

- A means to measure the functional completeness of verification environments (and thus of IPs)
  - Certitude-Metric mode
  - A criterion for deciding which IP to focus verification effort on
  - A criterion for selecting an IP provider
- And a means to improve the quality of verification environments, and thus of IPs
  - Certitude-Improvement mode
  - Many RTL bugs found using this technology
  - A detection strategy is key to reducing CPU consumption and easing results analysis

Certitude Metric Mode - Global Quality Indicators

Certitude provides 3 orthogonal quality metrics (sketched in code below):

- Activation score (A/F): quality of the stimuli and their ability to control (reach and activate) mutations
- Propagation score (P/A): quality of the stimuli and their ability to propagate mutation effects to observable outputs
- Detection score (D/P): quality of the checking system in the VE
- Global metric (D/F): a single figure summarizing all of the above

(The original slide shows these as a nested Activation > Propagation > Detection funnel.)
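A minimal sketch of how the four ratios compose (hypothetical fault counts, not data from the talk):

```python
# Illustrative only: how Certitude-style scores compose from fault counts.
# F = injected faults, A = activated, P = propagated, D = detected.
F, A, P, D = 1000, 950, 760, 684             # hypothetical counts

activation  = A / F    # stimuli reach and control the mutated code
propagation = P / A    # mutation effects reach observable outputs
detection   = D / P    # checkers actually flag the difference
global_metric = D / F  # overall quality: the product of the three

assert abs(global_metric - activation * propagation * detection) < 1e-12
print(f"A/F={activation:.0%}  P/A={propagation:.0%}  "
      f"D/P={detection:.0%}  D/F={global_metric:.0%}")
```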

Deploying best practices within ST

- Verification cockpit
  - Factorization of the setup of all verification tools
- Internal qualification workshop focusing on
  - Detection strategy
  - Results analysis
- Knowledge sharing
  - Certitude User's Group, run quarterly in ST
  - Intranet portal: documents, bug trackers, mailing list, Wiki, Subversion



Verification Cockpit - Productivity gains in setup

(The original slide is a diagram: the Verification Cockpit wraps the whole
verification flow - DUT, testbench, simulator setup, regression manager setup,
coverage collection, test regression, and qualification tool setup - around
shared services for VIP access, log filtering, job dispatching, and repository
and versioning management. This factorization captures best practices plus
company- and site-specific settings and the latest updates in a single VC setup.)
Verification Cockpit - Benefits for Certitude Users

- Robustness of the certitude_execute script (see the retry sketch below)
  - Encapsulates workarounds for common LSF and ClearCase issues
  - Tolerant to IT stability issues (efficient use of retry)
- Central maintenance
  - No certitude_execute to write!
  - Benefit from best practices and avoid common mistakes (e.g. end-of-test handling)
  - Reduced need for support
- Usage of advanced features
  - Checker analysis
  - Status history
  - Metric collection script
  - Certess's Hierarchical Dropping & Incremental Detection (aka HBID)
  - Analysis hints scripts
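ST's certitude_execute wrapper is internal, so here is only a minimal sketch of the retry idea it encapsulates (command and parameters hypothetical):

```python
# Hypothetical sketch: retry a simulation job so that transient farm or
# revision-control hiccups (LSF, ClearCase) do not pollute qualification results.
import subprocess
import time

def run_with_retry(cmd, attempts=3, backoff_s=60):
    """Run a job command, retrying on failure before declaring a real error."""
    for attempt in range(1, attempts + 1):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return result                    # clean pass, no retry needed
        if attempt < attempts:
            time.sleep(backoff_s * attempt)  # let the infrastructure recover
    return result                            # still failing: a genuine failure

# Example (hypothetical command):
# run_with_retry(["make", "run_test", "TEST=smoke"])
```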



Methodology evolution - future

- Using Certitude to measure "assertion density"
  - A prototype Certitude & IFV integration is already in the Verification Cockpit
  - Experimented on two blocks, with very encouraging results
  - Mutation-based metrics become a common metric for formal & dynamic verification
- Certitude-C
  - To qualify C reference models and HLS models
- Integration within Cadence's Enterprise Manager
- Linking mutations to functionality and the verification plan




Case studies

The following case studies illustrate the two kinds of environments we run:

- Case study 1: Third-party IP qualification
- Case study 2: Video Codec incremental qualification




Case study 1 : 3rd Party - IP qualification

- Case study 1:
  - Application: 3rd party IP
  - HDL directed environment
  - ~300 tests, 30 minutes
  - Code coverage ~100%

Scores (shown as a radar chart on the original slide):

| Metric                  | ST Ref | ST Avg | 3rd Party IP |
|-------------------------|--------|--------|--------------|
| Activation Score (A/F)  | 95%    | 97%    | 97%          |
| Propagation Score (P/A) | 80%    | 90%    | 80%          |
| Detection Score (D/P)   | 90%    | 93%    | 85%          |
| Global Metric (D/F)     | 75%    | 80%    | 66%          |

- Challenges
  - Convince the 3rd party IP provider
  - High-revenue, high-visibility chip; reduce respin risk
- Results
  - Helped us push the IP provider to improve its verification environment, and to monitor progress
  - The low detection score highlighted manual waveform checks
Case study 2 : Video Codec

- Case study 2:
  - Application: Video CODEC
  - 1.5 million gates
  - 23,000 tests considered
  - Experiment done in 2006
- Environment
  - SystemC-TLM platform
  - Regression: ~1 CPU year
- Challenges
  - Scalability: a very complex IP with a very long regression suite
  - Full detection was out of reach for this IP; the detection strategy needed to be adapted
- Results
  - One major weakness in the verification environment was highlighted
  - The "Detection Strategy" was born: hierarchical dropping & incremental detection (sketched below)
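A conceptual sketch of the incremental-detection half of that strategy, assuming a per-fault ranking of test cases: each pass grants every still-live fault a larger test budget (matching the tc/fault figures in the pass table on the next slide), and detected faults are dropped from later passes. The hierarchical side, which drops related faults together, is not modeled here, and Certitude's real scheduling is more sophisticated.

```python
# Conceptual sketch only: incremental detection with a rising tc/fault budget.
# ranked_tests[f] is a list of test callables, ordered by their estimated
# chance of detecting fault f; each runs the fault-injected design and
# returns True if a checker caught the mutation.

def qualify(faults, ranked_tests, budgets=(7, 12, 17, 22, 200)):
    detected, live = set(), set(faults)
    tried = {f: 0 for f in faults}        # simulations already spent per fault
    for pass_no, budget in enumerate(budgets, start=1):
        for fault in list(live):
            # Incremental: only run the tests the new budget adds.
            new_tests = ranked_tests[fault][tried[fault]:budget]
            tried[fault] = max(tried[fault], budget)
            if any(test(fault) for test in new_tests):
                detected.add(fault)
                live.discard(fault)       # dropped: never re-simulated
        print(f"Pass {pass_no} ({budget} tc/fault): "
              f"{len(detected)} detected, {len(live)} live")
    return detected
```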




Case study 2 : Hierarchical Dropping - Evolution through passes

(The original slide charts the fault counts, 0 to 60000, and the scores, 0.0% to 100.0%, across the passes tabulated below.)

| Status / score  | Start [04-Jul-06] | Pass 1, 7 tc/fault [25-Jul-06] | Pass 2, 12 tc/fault [30-Jul-06] | Pass 3, 17 tc/fault [07-Aug-06] | Pass 3 with DBU, 17 tc/fault [07-Aug-06] | Pass 4 with DBU, 22 tc/fault [11-Aug-06] | Pass 5, 200 tc/fault on 143 faults [16-Aug-06] |
|-----------------|-------|-------|-------|-------|-------|-------|-------|
| F-DBU           | 52938 | 52938 | 52938 | 52938 | 51626 | 51626 | 51626 |
| NA              | 2773  | 2773  | 2773  | 2773  | 2725  | 2725  | 2725  |
| NP (inc. weaks) | 1603  | 19389 | 15410 | 14216 | 13530 | 12661 | 12552 |
| D               | 0     | 21528 | 25061 | 25701 | 25693 | 26044 | 26096 |
| ND              | 0     | 1952  | 2103  | 2192  | 2056  | 2089  | 2146  |
| Dr              | 0     | 7296  | 7591  | 8047  | 7613  | 8094  | 8092  |
| NYQ             | 48562 | 0     | 0     | 9     | 9     | 13    | 15    |
| A/(F-DBU)       | 94.8% | 94.8% | 94.8% | 94.8% | 94.7% | 94.7% | 94.7% |
| P/A             | 0.0%  | 46.8% | 54.1% | 55.6% | 56.7% | 57.5% | 57.8% |
| D/P             | 0.0%  | 91.7% | 92.3% | 92.1% | 92.6% | 92.6% | 92.4% |
| D/(F-DBU)       | 0.0%  | 40.7% | 47.3% | 48.5% | 49.8% | 50.4% | 50.5% |

(Reading the status rows, which sum to F-DBU: NA = not activated, NP = not propagated, D = detected, ND = not detected, Dr = dropped, NYQ = not yet qualified; DBU faults are excluded from the qualified total F-DBU.)




Summary

- Introducing functional qualification over the last 3 years brought major benefits
  - A major increase in overall chip quality
  - A good productivity increase in the verification process
  - Objective, common metrics
- Methodology was key to enabling these benefits
- Lots of potential future applications
  - Application to formal methods
  - Application to C/C++ based models
  - An industry-shared criterion for measuring 3rd-party IPs
- Missing: a competitor to give more credit to the technology ;-)
  - The industry should align on a mutation model



BACKUP Slides




Certitude Metrics - ST References

- Global Metric
  - Represents the overall quality of the verification environment
  - ST reference: 75%, but usually higher (see the worked check after this list)
- Activation Score
  - Measures the ability of the test suite to exercise all the RTL of the IP
  - Similar to code coverage
  - ST reference: 95%, with 100% explained
  - The missing % should be deeply studied and either fixed or explained
- Propagation Score
  - Measures the ability of the test suite to propagate mutations to the outputs of the IP
  - ST reference: 80%, but should probably be raised to 90% by adding more test scenarios
- Detection Score
  - Measures the ability of the environment to catch errors
  - ST reference: 90%, but usually higher
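As a consistency check (my arithmetic, using the 3rd Party IP column from case study 1), the three scores multiply out to the global metric:

```latex
\frac{D}{F} \;=\; \frac{A}{F}\cdot\frac{P}{A}\cdot\frac{D}{P}
\qquad\text{e.g.}\quad 0.97 \times 0.80 \times 0.85 \approx 0.66 = \text{Global Metric}
```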
