Scan and ATPG Process Guide


Scan and ATPG Process Guide (DFTAdvisor, FastScan and FlexTest)

Software Version 8.2008_3, August 2008

© 1999-2008 Mentor Graphics Corporation. All rights reserved.

This document contains information that is proprietary to Mentor Graphics Corporation. The original recipient of this document may duplicate this document in whole or in part for internal business purposes only, provided that this entire notice appears in all copies. In duplicating any part of this document, the recipient agrees to make every reasonable effort to prevent the unauthorized use and distribution of the proprietary information.

This document is for information and instruction purposes. Mentor Graphics reserves the right to make changes in specifications and other information contained in this publication without prior notice, and the reader should, in all cases, consult Mentor Graphics to determine whether any changes have been made.

The terms and conditions governing the sale and licensing of Mentor Graphics products are set forth in written agreements between Mentor Graphics and its customers. No representation or other affirmation of fact contained in this publication shall be deemed to be a warranty or give rise to any liability of Mentor Graphics whatsoever.

MENTOR GRAPHICS MAKES NO WARRANTY OF ANY KIND WITH REGARD TO THIS MATERIAL, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.

MENTOR GRAPHICS SHALL NOT BE LIABLE FOR ANY INCIDENTAL, INDIRECT, SPECIAL, OR CONSEQUENTIAL DAMAGES WHATSOEVER (INCLUDING BUT NOT LIMITED TO LOST PROFITS) ARISING OUT OF OR RELATED TO THIS PUBLICATION OR THE INFORMATION CONTAINED IN IT, EVEN IF MENTOR GRAPHICS CORPORATION HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.

    RESTRICTED RIGHTS LEGEND 03/97

U.S. Government Restricted Rights. The SOFTWARE and documentation have been developed entirely at private expense and are commercial computer software provided with restricted rights. Use, duplication or disclosure by the U.S. Government or a U.S. Government subcontractor is subject to the restrictions set forth in the license agreement provided with the software pursuant to DFARS 227.7202-3(a) or as set forth in subparagraph (c)(1) and (2) of the Commercial Computer Software - Restricted Rights clause at FAR 52.227-19, as applicable.

Contractor/manufacturer is: Mentor Graphics Corporation

8005 S.W. Boeckman Road, Wilsonville, Oregon 97070-7777.
Telephone: 503.685.7000

Toll-Free Telephone: 800.592.2210
Website: www.mentor.com

SupportNet: supportnet.mentor.com/
Send Feedback on Documentation: supportnet.mentor.com/user/feedback_form.cfm

TRADEMARKS: The trademarks, logos and service marks ("Marks") used herein are the property of Mentor Graphics Corporation or other third parties. No one is permitted to use these Marks without the prior written consent of Mentor Graphics or the respective third-party owner. The use herein of a third-party Mark is not an attempt to indicate Mentor Graphics as a source of a product, but is intended to indicate a product from, or associated with, a particular third party. A current list of Mentor Graphics trademarks may be viewed at: www.mentor.com/terms_conditions/trademarks.cfm.


    Table of Contents

Chapter 1 Overview . . . 15
  What is Design-for-Test? . . . 15
  DFT Strategies . . . 15
  Top-Down Design Flow with DFT . . . 16
  DFT Design Tasks and Products . . . 19

Chapter 2 Understanding Scan and ATPG Basics . . . 23
  Understanding Scan Design . . . 23
  Internal Scan Circuitry . . . 24
  Scan Design Overview . . . 24
  Understanding Full Scan . . . 26
  Understanding Partial Scan . . . 27
  Choosing Between Full or Partial Scan . . . 28
  Understanding Wrapper Chains . . . 29
  Understanding Test Points . . . 31
  Test Structure Insertion with DFTAdvisor . . . 33
  Understanding ATPG . . . 34
  The ATPG Process . . . 34
  Mentor Graphics ATPG Applications . . . 35
  Full-Scan and Scan Sequential ATPG with FastScan . . . 35
  Non- to Full-Scan ATPG with FlexTest . . . 36
  Understanding Test Types and Fault Models . . . 37
  Test Types . . . 38
  Fault Modeling . . . 40
  Fault Detection . . . 47
  Fault Classes . . . 49
  Testability Calculations . . . 57

Chapter 3 Understanding Common Tool Terminology and Concepts . . . 59
  Scan Terminology . . . 59
  Scan Cells . . . 59
  Scan Chains . . . 63
  Scan Groups . . . 64
  Scan Clocks . . . 65
  Scan Architectures . . . 65
  Mux-DFF . . . 65
  Clocked-Scan . . . 66
  LSSD . . . 66
  Test Procedure Files . . . 67


  Model Flattening . . . 68
  Understanding Design Object Naming . . . 68
  The Flattening Process . . . 69
  Simulation Primitives of the Flattened Model . . . 70
  Learning Analysis . . . 73
  Equivalence Relationships . . . 73
  Logic Behavior . . . 74
  Implied Relationships . . . 74
  Forbidden Relationships . . . 75
  Dominance Relationships . . . 75
  ATPG Design Rules Checking . . . 76
  General Rules Checking . . . 76
  Procedure Rules Checking . . . 77
  Bus Mutual Exclusivity Analysis . . . 77
  Scan Chain Tracing . . . 78
  Shadow Latch Identification . . . 78
  Data Rules Checking . . . 79
  Transparent Latch Identification . . . 79
  Clock Rules Checking . . . 80
  RAM Rules Checking . . . 80
  Bus Keeper Analysis . . . 80
  Extra Rules Checking . . . 81
  Scannability Rules Checking . . . 81
  Constrained/Forbidden/Block Value Calculations . . . 81

Chapter 4 Understanding Testability Issues . . . 83
  Synchronous Circuitry . . . 84
  Synchronous Design Techniques . . . 84
  Asynchronous Circuitry . . . 85
  Scannability Checking . . . 85
  Scannability Checking of Latches . . . 86
  Support for Special Testability Cases . . . 86
  Feedback Loops . . . 86
  Structural Combinational Loops and Loop-Cutting Methods . . . 86
  Structural Sequential Loops and Handling . . . 93
  Redundant Logic . . . 95
  Asynchronous Sets and Resets . . . 95
  Gated Clocks . . . 96
  Tri-State Devices . . . 96
  Non-Scan Cell Handling . . . 97
  Clock Dividers . . . 102
  Pulse Generators . . . 103
  JTAG-Based Circuits . . . 104
  Testing RAM and ROM . . . 104
  Incomplete Designs . . . 111


Chapter 5 Inserting Internal Scan and Test Circuitry . . . 113
  The DFTAdvisor Process Flow . . . 113
  DFTAdvisor Inputs and Outputs . . . 115
  Test Structures Supported by DFTAdvisor . . . 116
  Invoking DFTAdvisor . . . 119
  Preparing for Test Structure Insertion . . . 120
  Selecting the Scan Methodology . . . 120
  Defining Scan Cell and Scan Output Mapping . . . 120
  Enabling Test Logic Insertion . . . 121
  Specifying Clock Signals . . . 124
  Specifying Existing Scan Information . . . 125
  Handling Existing Boundary Scan Circuitry . . . 128
  Changing the System Mode (Running Rules Checking) . . . 129
  Identifying Test Structures . . . 129
  Selecting the Type of Test Structure . . . 129
  Setting Up for Full Scan Identification . . . 130
  Setting Up for Clocked Sequential Identification . . . 130
  Setting Up for Sequential Transparent Identification . . . 131
  Setting Up for Wrapper Chain Identification . . . 131
  Setting Up for Sequential (ATPG, Automatic, SCOAP, and Structure) Identification . . . 133
  Setting Up for Test Point Identification . . . 135
  Manually Including and Excluding Cells for Scan . . . 137
  Reporting Scannability Information . . . 139
  Automatic Recognition of Existing Shift Registers . . . 141
  Running the Identification Process . . . 143
  Reporting Identification Information . . . 143
  Inserting Test Structures . . . 144
  Setting Up for Internal Scan Insertion . . . 144
  Setting Up for Test Point Insertion . . . 147
  Buffering Test Pins . . . 147
  Running the Insertion Process . . . 148
  Merging Scan Chains with Different Shift Clocks . . . 149
  Saving the New Design and ATPG Setup . . . 151
  Writing the Netlist . . . 152
  Writing the Test Procedure File and Dofile for ATPG . . . 152
  Running Rules Checking on the New Design . . . 153
  Exiting DFTAdvisor . . . 153
  Inserting Scan Block-by-Block . . . 153
  Verilog Flow Example . . . 154

Chapter 6 Generating Test Patterns . . . 157
  FastScan and FlexTest Basic Tool Flow . . . 157
  FastScan and FlexTest Inputs and Outputs . . . 160
  Understanding the FastScan ATPG Method . . . 161
  Understanding FlexTest's ATPG Method . . . 167


  Performing Basic Operations . . . 170
  Invoking the Applications . . . 170
  Setting the System Mode . . . 173
  Setting Up Design and Tool Behavior . . . 173
  Setting Up the Circuit Behavior . . . 173
  Setting Up Tool Behavior . . . 180
  Setting the Circuit Timing (FlexTest Only) . . . 185
  Defining the Scan Data . . . 188
  Checking Rules and Debugging Rules Violations . . . 191
  Running Good/Fault Simulation on Existing Patterns . . . 192
  Fault Simulation . . . 192
  Good Machine Simulation . . . 196
  Running Random Pattern Simulation (FastScan) . . . 197
  Changing to the Fault System Mode . . . 197
  Setting the Pattern Source to Random . . . 197
  Creating the Faults List . . . 198
  Running the Simulation . . . 198
  Setting Up the Fault Information for ATPG . . . 198
  Changing to the ATPG System Mode . . . 198
  Setting the Fault Type . . . 199
  Creating the Faults List . . . 199
  Adding Faults to an Existing List . . . 200
  Loading Faults from an External List . . . 200
  Writing Faults to an External File . . . 201
  Setting Self-Initialized Test Sequences (FlexTest Only) . . . 201
  Setting the Fault Sampling Percentage . . . 201
  Setting the Fault Mode . . . 202
  Setting the Hypertrophic Limit (FlexTest Only) . . . 202
  Setting DS Fault Handling (FlexTest Only) . . . 202
  Setting the Possible-Detect Credit . . . 203
  Performing ATPG . . . 203
  Setting Up for ATPG . . . 204
  Creating Patterns with Default Settings . . . 219
  Compressing Patterns (FlexTest Only) . . . 220
  Approaches for Improving ATPG Efficiency . . . 220
  Saving the Test Patterns . . . 224
  Creating an IDDQ Test Set . . . 225
  Creating a Selective IDDQ Test Set . . . 226
  Generating a Supplemental IDDQ Test Set . . . 228
  Specifying Leakage Current Checks . . . 229
  Creating a Static Bridge Test Set . . . 230
  Bridge Fault Test Pattern Prerequisites . . . 230
  Creating a Static Bridge Fault Test Set . . . 231
  Creating the Bridge Fault Definition File . . . 231
  Creating the Bridge Fault Definition File with Calibre . . . 235
  Creating a Delay Test Set . . . 237
  Creating a Transition Delay Test Set . . . 238
  Transition Fault Detection . . . 239
  Basic Procedure for Generating a Transition Test Set . . . 243


  Timing for Transition Delay Tests . . . 243
  Preventing Pattern Failures Due to Timing Exception Paths . . . 246
  Creating a Path Delay Test Set (FastScan) . . . 251
  At-Speed Test Using Named Capture Procedures . . . 261
  Mux-DFF Example . . . 269
  Generating Test Patterns for Different Fault Models and Fault Grading . . . 275
  Generating Patterns for a Boundary Scan Circuit . . . 277
  Dofile and Explanation . . . 277
  TAP Controller State Machine . . . 278
  Test Procedure File and Explanation . . . 279
  Creating Instruction-Based Test Sets (FlexTest) . . . 283
  Instruction-Based Fault Detection . . . 283
  Instruction File Format . . . 284
  Using FastScan MacroTest Capability . . . 286
  The MacroTest Process Flow . . . 287
  Qualifying Macros for MacroTest . . . 289
  When to Use MacroTest . . . 290
  Defining the Macro Boundary . . . 292
  Defining Test Values . . . 296
  Recommendations for Using MacroTest . . . 298
  MacroTest Examples . . . 300
  Verifying Test Patterns . . . 306
  Simulating the Design with Timing . . . 306
  Debugging Simulation Mismatches in FastScan . . . 307
  When, Where, and How Many Mismatches? . . . 309
  DRC Issues . . . 309
  Shadow Cells . . . 310
  Library Problems . . . 310
  Timing Violations . . . 311
  Analyzing the Simulation Data . . . 311
  Automatically Analyzing Simulation Mismatches . . . 313
  Analyzing Patterns . . . 315
  Checking for Clock-Skew Problems with Mux-DFF Designs . . . 315

Chapter 7 Test Pattern Formatting and Timing . . . 317
  Test Pattern Timing Overview . . . 318
  Timing Terminology . . . 319
  General Timing Issues . . . 319
  Generating a Procedure File . . . 320
  Defining and Modifying Timeplates . . . 321
  Saving Timing Patterns . . . 324
  Features of the Formatter . . . 324
  Pattern Formatting Issues . . . 325
  Saving Patterns in Basic Test Data Formats . . . 328
  Saving in ASIC Vendor Data Formats . . . 334


Appendix A Getting Help . . . 337
  Documentation . . . 337
  Online Command Help . . . 337
  Mentor Graphics Support . . . 337

Appendix B Getting Started with ATPG . . . 339
  Preparing the Tutorial Data . . . 339
  Full Scan ATPG Tool Flow . . . 339
  Running DFTAdvisor . . . 342
  Running FastScan . . . 344

Appendix C Clock Gaters . . . 347
  Basic Clock Gater Cell . . . 347
  Two Types of Embedding . . . 347
  Ideal Case (Type-A) . . . 348
  Potential DRC Violator (Type-B) . . . 349
  Cascaded Clock Gaters . . . 351
  Understanding a Level-2 Clock Gater . . . 351
  Example Combinations of Cascaded Clock Gaters . . . 351
  Summary . . . 352

Appendix D Debugging State Stability . . . 353
  Understanding State Stability . . . 353
  Displaying the State Stability Data . . . 353
  State Stability Data Format . . . 354
  State Stability Examples . . . 356

Appendix E Running FastScan as a Batch Job . . . 379
  Commands and Variables for the dofile . . . 379
  Command Line Options . . . 381
  Starting a Batch Job . . . 382
  Example . . . 382

Appendix F User Interface Overview . . . 385
  Command Line Window . . . 386
  Control Panel Window . . . 388
  Running Batch Mode Using Dofiles . . . 389
  Generating a Log File . . . 390
  Running UNIX Commands . . . 390
  Conserving Disk Space . . . 390
  Interrupting the Session . . . 391


  Exiting the Session . . . 391
  DFTAdvisor User Interface . . . 392
  FastScan User Interface . . . 393
  FlexTest User Interface . . . 395

    Index

    Third-Party Information

    End-User License Agreement


    List of Figures

Figure 1-1. Top-Down Design Flow Tasks and Products . . . 18
Figure 1-2. ASIC/IC Design-for-Test Tasks . . . 20
Figure 2-1. DFT Concepts . . . 23
Figure 2-2. Design Before and After Adding Scan . . . 25
Figure 2-3. Full Scan Representation . . . 26
Figure 2-4. Partial Scan Representation . . . 27
Figure 2-5. Full, Partial, and Non-Scan Trade-offs . . . 29
Figure 2-6. Example of Partitioned Design . . . 30
Figure 2-7. Wrapper Chains Added to Partition A . . . 31
Figure 2-8. Uncontrollable and Unobservable Circuitry . . . 32
Figure 2-9. Testability Benefits from Test Point Circuitry . . . 32
Figure 2-10. Manufacturing Defect Space for a Design . . . 37
Figure 2-11. Internal Faulting Example . . . 41
Figure 2-12. Single Stuck-At Faults for AND Gate . . . 42
Figure 2-13. IDDQ Fault Testing . . . 44
Figure 2-14. Transition Fault Detection Process . . . 46
Figure 2-15. Fault Detection Process . . . 48
Figure 2-16. Path Sensitization Example . . . 48
Figure 2-17. Example of "Unused" Fault in Circuitry . . . 50
Figure 2-18. Example of "Tied" Fault in Circuitry . . . 50
Figure 2-19. Example of "Blocked" Fault in Circuitry . . . 51
Figure 2-20. Example of "Redundant" Fault in Circuitry . . . 52
Figure 2-21. Fault Class Hierarchy . . . 56
Figure 3-1. Common Tool Concepts . . . 59
Figure 3-2. Generic Scan Cell . . . 60
Figure 3-3. Generic Mux-DFF Scan Cell Implementation . . . 60
Figure 3-4. LSSD Master/Slave Element Example . . . 61
Figure 3-5. Dependently-clocked Mux-DFF/Shadow Element Example . . . 62
Figure 3-6. Independently-clocked Mux-DFF/Shadow Element Example . . . 62
Figure 3-7. Mux-DFF/Copy Element Example . . . 63
Figure 3-8. Generic Scan Chain . . . 63
Figure 3-9. Generic Scan Group . . . 64
Figure 3-10. Scan Clocks Example . . . 65
Figure 3-11. Mux-DFF Replacement . . . 66
Figure 3-12. Clocked-Scan Replacement . . . 66
Figure 3-13. LSSD Replacement . . . 67
Figure 3-14. Design Before Flattening . . . 69
Figure 3-15. Design After Flattening . . . 69
Figure 3-16. 2x1 MUX Example . . . 71
Figure 3-17. LA, DFF Example . . . 71


Figure 3-18. TSD, TSH Example . . . 72
Figure 3-19. PBUS, SWBUS Example . . . 72
Figure 3-20. Equivalence Relationship Example . . . 74
Figure 3-21. Example of Learned Logic Behavior . . . 74
Figure 3-22. Example of Implied Relationship Learning . . . 75
Figure 3-23. Forbidden Relationship Example . . . 75
Figure 3-24. Dominance Relationship Example . . . 76
Figure 3-25. Bus Contention Example . . . 77
Figure 3-26. Bus Contention Analysis . . . 78
Figure 3-27. Simulation Model with Bus Keeper . . . 80
Figure 3-28. Constrained Values in Circuitry . . . 81
Figure 3-29. Forbidden Values in Circuitry . . . 82
Figure 3-30. Blocked Values in Circuitry . . . 82
Figure 4-1. Testability Issues . . . 83
Figure 4-2. Structural Combinational Loop Example . . . 86
Figure 4-3. Loop Naturally-Blocked by Constant Value . . . 87
Figure 4-4. Cutting Constant Value Loops . . . 87
Figure 4-5. Cutting Single Multiple-Fanout Loops . . . 88
Figure 4-6. Loop Candidate for Duplication . . . 88
Figure 4-7. TIE-X Insertion Simulation Pessimism . . . 89
Figure 4-8. Cutting Loops by Gate Duplication . . . 89
Figure 4-9. Cutting Coupling Loops . . . 90
Figure 4-10. Delay Element Added to Feedback Loop . . . 92
Figure 4-11. Sequential Feedback Loop . . . 93
Figure 4-12. Fake Sequential Loop . . . 94
Figure 4-13. Test Logic Added to Control Asynchronous Reset . . . 95
Figure 4-14. Test Logic Added to Control Gated Clock . . . 96
Figure 4-15. Tri-state Bus Contention . . . 97
Figure 4-16. Requirement for Combinationally Transparent Latches . . . 98
Figure 4-17. Example of Sequential Transparency . . . 99
Figure 4-18. Clocked Sequential Scan Pattern Events . . . 100
Figure 4-19. Clock Divider . . . 102
Figure 4-20. Example Pulse Generator Circuitry . . . 103
Figure 4-21. Long Path Input Gate Must Go to Gates of the Same Type . . . 104
Figure 4-22. Design with Embedded RAM . . . 105
Figure 4-23. RAM Sequential Example . . . 107
Figure 5-1. Internal Scan Insertion Procedure . . . 113
Figure 5-2. Basic Scan Insertion Flow with DFTAdvisor . . . 114
Figure 5-3. The Inputs and Outputs of DFTAdvisor . . . 115
Figure 5-4. DFTAdvisor Supported Test Structures . . . 117
Figure 5-5. Test Logic Insertion . . . 122
Figure 5-6. Example Report from Report Dft Check Command . . . 140
Figure 5-7. Lockup Cell Insertion . . . 151
Figure 5-8. Hierarchical Design Prior to Scan . . . 153
Figure 5-9. Final Scan-Inserted Design . . . 155


Figure 6-1. Test Generation Procedure . . . 157
Figure 6-2. Overview of FastScan/FlexTest Usage . . . 158
Figure 6-3. FastScan/FlexTest Inputs and Outputs . . . 160
Figure 6-4. Clock-PO Circuitry . . . 163
Figure 6-5. Cycle-Based Circuit with Single Phase Clock . . . 167
Figure 6-6. Cycle-Based Circuit with Two Phase Clock . . . 168
Figure 6-7. Example Test Cycle . . . 169
Figure 6-8. Data Capture Handling Example . . . 183
Figure 6-9. Efficient ATPG Flow . . . 204
Figure 6-10. Circuitry with Natural "Select" Functionality . . . 206
Figure 6-11. Single Cycle Multiple Events . . . 209
Figure 6-12. Flow for Creating a Delay Test Set . . . 238
Figure 6-13. Transition Delay . . . 239
Figure 6-14. Transition Launch and Capture Events . . . 240
Figure 6-15. Events in a Broadside Pattern . . . 240
Figure 6-16. Basic Broadside Timing . . . 241
Figure 6-17. Events in a Launch Off Shift Pattern . . . 241
Figure 6-18. Basic Launch Off Shift Timing . . . 241
Figure 6-19. Broadside Timing Example . . . 244
Figure 6-20. Launch Off Shift (Skewed) Timing Example . . . 245
Figure 6-21. Multicycle Path Example . . . 246
Figure 6-22. Setup Time and Hold Time Violations . . . 247
Figure 6-23. Across Clock Domain Hold Time Violation . . . 248
Figure 6-24. Effect Cone of a Non-specific False Path Definition . . . 251
Figure 6-25. Path Delay Launch and Capture Events . . . 252
Figure 6-26. Robust Detection Example . . . 253
Figure 6-27. Non-robust Detection Example . . . 254
Figure 6-28. Functional Detection Example . . . 255
Figure 6-29. Example Use of Transition_condition Statement . . . 257
Figure 6-30. Example of Ambiguous Path Definition . . . 259
Figure 6-31. Example of Ambiguous Path Edges . . . 259
Figure 6-32. On-chip Clock Generation . . . 263
Figure 6-33. PLL-Generated Clock and Control Signals . . . 264
Figure 6-34. Cycles Merged for ATPG . . . 266
Figure 6-35. Cycles Expanded for ATPG . . . 267
Figure 6-36. Mux-DFF Example Design . . . 270
Figure 6-37. Mux-DFF Broadside Timing, Cell to Cell . . . 270
Figure 6-38. Broadside Timing, Clock Pulses in Non-adjacent cycles . . . 272
Figure 6-39. Mux-DFF Cell to PO Timing . . . 273
Figure 6-40. Mux-DFF PI to Cell Timing . . . 274
Figure 6-41. State Diagram of TAP Controller Circuitry . . . 278
Figure 6-42. Example Instruction File . . . 285
Figure 6-43. Conceptual View of MacroTest . . . 286
Figure 6-44. Basic Scan Pattern Creation Flow with MacroTest . . . 288
Figure 6-45. Mismatch Diagnosis Guidelines . . . 308


Figure 6-46. Simulation Transcript . . . 312
Figure 6-47. ModelSim Waveform Viewer Display . . . 315
Figure 6-48. Clock-Skew Example . . . 316
Figure 7-1. Defining Basic Timing Process Flow . . . 317
Figure B-1. Tool Flow . . . 340
Figure B-2. Scan and ATPG Tool and Command Flow . . . 341
Figure B-3. DFTAdvisor dofile dfta_dofile.do . . . 342
Figure B-4. FastScan dofile fs_dofile.do . . . 345
Figure C-1. Basic Clock Gater Cell . . . 347
Figure C-2. Two Types of Embedding for the Basic Clock Gater . . . 348
Figure C-3. Type-B Clock Gater Causes Tracing Failure . . . 350
Figure C-4. Sample EDT Test Procedure Waveforms . . . 350
Figure C-5. Two-level Clock Gating . . . 351
Figure D-1. Design Used in State Stability Examples . . . 357
Figure D-2. Typical Initialization Problem . . . 359
Figure D-3. Three-bit Shift Register (Excerpted from Figure D-1) . . . 361
Figure D-4. Initialization with a Non-Shift Clock . . . 362
Figure D-5. Clocking ff20 with a Pulse Generator . . . 365
Figure F-1. Common Elements of the DFT Graphical User Interfaces . . . 385
Figure F-2. DFTAdvisor Control Panel Window . . . 393
Figure F-3. FastScan Control Panel Window . . . 395
Figure F-4. FlexTest Control Panel Window . . . 397


    List of Tables

Table 2-1. Test Type/Fault Model Relationship . . . 40
Table 4-1. FastScan and FlexTest RAM/ROM Commands . . . 109
Table 5-1. Test Type Interactions . . . 119
Table 5-2. Scan Direction and Active Values . . . 123
Table 6-1. ATPG Constraint Conditions . . . 207
Table 6-2. ATPG Accelerator Variables . . . 214
Table 6-3. Bridge Definition File Keywords . . . 232
Table 6-4. Pin Value Requirements for ADD Instruction . . . 284
Table C-1. Clock Gater Summary . . . 352
Table F-1. Session Transcript Popup Menu Items . . . 387
Table F-2. Command Transcript Popup Menu Items . . . 387


Chapter 1
Overview

The Scan and ATPG Process Guide gives an overview of ASIC/IC Design-for-Test (DFT) strategies and shows the use of Mentor Graphics ASIC/IC DFT products as part of typical DFT design processes. This document discusses the following DFT products: DFTAdvisor, FastScan, and FlexTest.

What is Design-for-Test?

Testability is a design attribute that measures how easy it is to create a program to comprehensively test a manufactured design's quality. Traditionally, design and test processes were kept separate, with test considered only at the end of the design cycle. In contemporary design flows, however, test merges with design much earlier in the process, creating what is called a design-for-test (DFT) process flow. Testable circuitry is both controllable and observable: in a testable design, setting specific values on the primary inputs results in values on the primary outputs that indicate whether or not the internal circuitry works properly. To ensure maximum design testability, designers must employ special DFT techniques at specific stages in the development process.

DFT Strategies

At the highest level, there are two main approaches to DFT: ad hoc and structured. The following subsections discuss these DFT strategies.

Ad Hoc DFT

Ad hoc DFT implies using good design practices to enhance a design's testability, without making major changes to the design style. Some ad hoc techniques include:

    Minimizing redundant logic

    Minimizing asynchronous logic

    Isolating clocks from the logic

    Adding internal control and observation points

Using these practices throughout the design process improves the overall testability of your design. However, using structured DFT techniques with the Mentor Graphics DFT tools yields far greater improvement. Thus, the remainder of this document concentrates on structured DFT techniques.


Structured DFT

Structured DFT provides a more systematic and automatic approach to enhancing design testability. Structured DFT's goal is to increase the controllability and observability of a circuit. Various methods exist for accomplishing this. The most common is the scan design technique, which modifies the internal sequential circuitry of the design. You can also use the Built-in Self-Test (BIST) method, which inserts a device's testing function within the device itself. Another method is boundary scan, which increases board testability by adding circuitry to a chip. Understanding Scan and ATPG Basics describes these methods in detail.

Top-Down Design Flow with DFT

Figure 1-1 shows the basic steps and the Mentor Graphics tools you would use during a typical ASIC top-down design flow. This document discusses those steps shown in grey; it also mentions certain aspects of other design steps, where applicable. This flow is just a general description of a top-down design process flow using a structured DFT strategy. The next section, DFT Design Tasks and Products, gives a more detailed breakdown of the individual DFT tasks involved.

As Figure 1-1 shows, the first task in any design flow is creating the initial RTL-level design, through whatever means you choose. In the Mentor Graphics environment, you may choose to create a high-level VHDL or Verilog description using ModelSim, or a schematic using Design Architect. You then verify the design's functionality by performing a functional simulation, using ModelSim or another vendor's VHDL/Verilog simulator.

If your design is in VHDL or Verilog format and it contains memory models, at this point you can add built-in self-test (BIST) circuitry. MBISTArchitect creates and inserts RTL-level customized internal testing structures for design memories.

Also at the RTL level, you can insert and verify boundary scan circuitry using BSDArchitect (BSDA). Then you can synthesize and optimize the design using either Design Compiler or another synthesis tool, followed by a timing verification with a static timing analyzer such as PrimeTime.

At this point in the flow you are ready to insert internal scan circuitry into your design using DFTAdvisor. You may then want to re-verify the timing because you added scan circuitry. Once you are sure the design is functioning as desired, you can generate test patterns. You can use FastScan or FlexTest (depending on your scan strategy) to generate a test pattern set in the appropriate format.
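To make these two steps concrete, each tool is typically driven by a short dofile. The sketch below shows what the pair might look like for a simple full-scan design; the clock, file, and group names (clk, scanned_core.v, core_scan, grp1, and so on) are placeholders, and the exact option lists are documented in each tool's reference manual.

    // Hypothetical DFTAdvisor dofile: insert mux-DFF scan circuitry
    add clocks 0 clk
    set system mode dft
    run
    insert test logic
    write netlist scanned_core.v -verilog
    write atpg setup core_scan
    exit

    // Hypothetical FastScan dofile: generate patterns for the scanned design
    add scan groups grp1 core_scan.testproc
    add scan chains chain1 grp1 sc_in sc_out
    add clocks 0 clk
    set system mode atpg
    add faults -all
    run
    save patterns core_patterns.v -verilog
    exit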

Now you should verify that the design and patterns still function correctly with the proper timing information applied. You can use ModelSim, QuickPath, or some other simulator to achieve this goal. You may then have to perform a few additional steps required by your ASIC vendor before handing the design off for manufacture and testing.


Note
It is important for you to check with your vendor early on in your design process for specific requirements and restrictions that may affect your DFT strategies. For example, the vendor's test equipment may only be able to handle single scan chains (see page 24), have memory limitations, or have special timing requirements that affect the way you generate scan circuitry and test patterns.


    Figure 1-1. Top-Down Design Flow Tasks and Products

[Figure 1-1 depicts the flow as a sequence of steps, each paired with the supporting products: Create Initial Design (ModelSim, text editor); Verify Functionality (ModelSim); Insert/Verify Built-in Self-Test Circuitry (MBISTArchitect, LBISTArchitect); Insert/Verify Boundary Scan Circuitry (BSDArchitect); Synthesize/Optimize Design (Design Compiler, BuildGates); Insert Internal Scan Circuitry (DFTAdvisor); Re-verify Timing, optional (ModelSim); Generate/Verify Test Patterns (FastScan, FlexTest); Hand off to Vendor.]

Understanding Common Tool Terminology and Concepts

Scan Terminology

their values to the scan cell's output. The following subsections describe the different memory elements a scan cell may contain.

Master Element

The master element, the primary memory element of a scan cell, captures data directly from the output of the previous scan cell. Each scan cell must contain one and only one master element. For example, Figure 3-3 shows a mux-DFF scan cell, which contains only a master element. However, scan cells can contain memory elements in addition to the master. Figures 3-4 through 3-7 illustrate examples of master elements in a variety of other scan cells.

The shift procedure in the test procedure file controls the master element. If the scan cell contains no additional independently-clocked memory elements in the scan path, this procedure also observes the master. If the scan cell contains additional memory elements, you may need to define a separate observation procedure (called master_observe) for propagating the master element's value to the output of the scan cell.

Slave Element

The slave element, an independently-clocked scan cell memory element, resides in the scan chain path. It cannot capture data directly from the previous scan cell. When used, it stores the output of the scan cell. The shift procedure both controls and observes the slave element. The value of the slave may be inverted relative to the master element. Figure 3-4 shows a slave element within a scan cell.

    Figure 3-4. LSSD Master/Slave Element Example

In the example of Figure 3-4, Aclk controls scan data input. Activating Aclk, with sys_clk (which controls system data) held off, shifts scan data into the scan cell. Activating Bclk propagates scan data to the output.

Shadow Element

The shadow element, either dependently- or independently-clocked, resides outside the scan chain path. It can be inside or outside of a scan cell. Figure 3-5 gives an example of a scan cell with a dependently-clocked, non-observable shadow element with a non-inverted value.


    Figure 3-5. Dependently-clocked Mux-DFF/Shadow Element Example

    Figure 3-6 shows a similar example where the shadow element is independently-clocked.

    Figure 3-6. Independently-clocked Mux-DFF/Shadow Element Example

You load a data value into the dependently-clocked shadow element with the shift procedure. If the shadow element is independently clocked, you use a separate procedure called shadow_control to load it. You can optionally make a shadow observable using the shadow_observe procedure. A scan cell may contain multiple shadows, but only one may be observable, because the tools allow only one shadow_observe procedure. A shadow element's value may be the inverse of the master's value.

The definition of a shadow element is based on the shadow having the same (or inverse) value as the master element it shadows. A variety of interconnections of the master and shadow will accomplish this. In Figure 3-5, the shadow's data input is connected to the master's data input, and both FFs are triggered by the same clock edge. The definition would also be met if the shadow's data input was connected to the master's output and the shadow was triggered on the trailing edge, and the master on the leading edge, of the same clock.

Copy Element

The copy element is a memory element that lies in the scan chain path and can contain the same (or inverted) data as the associated master or slave element in the scan cell. Figure 3-7 gives an example of a copy element within a scan cell in which a master element provides data to the copy.


    Figure 3-7. Mux-DFF/Copy Element Example

The clock pulse that captures data into the copy's associated scan cell element also captures data into the copy. Data transfers from the associated scan cell element to the copy element in the second half of the same clock cycle.

During the shift procedure, a copy contains the same data as that in its associated memory element. However, during system data capture, some types of scan cells allow copy elements to capture different data. When the copy's value differs from its associated element, the copy becomes the observation point of the scan cell. When the copy holds the same data as its associated element, the associated element becomes the observation point.

Extra Element

The extra element is an additional, independently-clocked memory element of a scan cell. An extra element is any element that lies in the scan chain path between the master and slave elements. The shift procedure controls data capture into the extra elements. These elements are not observable. Scan cells can contain multiple extras. Extras can contain inverted data with respect to the master element.

Scan Chains

A scan chain is a set of serially linked scan cells. Each scan chain contains an external input pin and an external output pin that provide access to the scan cells. Figure 3-8 shows a scan chain, with scan input sc_in and scan output sc_out.

    Figure 3-8. Generic Scan Chain


The scan chain length (N) is the number of scan cells within the scan chain. By convention, the scan cell closest to the external output pin is number 0, its predecessor is number 1, and so on. Because the numbering starts at 0, the number for the scan cell connected to the external input pin is equal to the scan chain length minus one (N-1).

Scan Groups

A scan chain group is a set of scan chains that operate in parallel and share a common test procedure file. The test procedure file defines how to access the scan cells in all of the scan chains of the group. Normally, all of a circuit's scan chains operate in parallel and are thus in a single scan chain group.

    Figure 3-9. Generic Scan Group

You may have two clocks, A and B, each of which clocks different scan chains. You often can clock, and therefore operate, the A and B chains concurrently, as shown in Figure 3-9. However, if two chains share a single scan input pin, these chains cannot be operated in parallel. Regardless of operation, all defined scan chains in a circuit must be associated with a scan group. A scan group is a concept used by Mentor Graphics DFT and ATPG tools.

Scan groups are a way to group scan chains based on operation. All scan chains in a group must be able to operate in parallel, which is normal for scan chains in a circuit. However, when scan chains cannot operate in parallel, such as in the example above (sharing a common scan input pin), the operation of each must be specified separately. This means the scan chains belong to different scan groups.
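For example, the following setup commands define one group containing the two chains of Figure 3-9 (a sketch; the group name grp1 and the file name group1.testproc are placeholders):

    add scan groups grp1 group1.testproc
    add scan chains chain1 grp1 sci1 sco1
    add scan chains chain2 grp1 sci2 sco2

Two chains that shared a scan input pin would instead be declared in two separate groups, each with its own test procedure file.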


Scan Clocks

Scan clocks are external pins capable of capturing values into scan cell elements. Scan clocks include set and reset lines, as well as traditional clocks. Any pin defined as a clock can act as a capture clock during ATPG. Figure 3-10 shows a scan cell whose scan clock signals are shown in bold.

    Figure 3-10. Scan Clocks Example

In addition to capturing data into scan cells, scan clocks, in their off state, ensure that the cells hold their data. Design rule checks ensure that clocks perform both functions. A clock's off-state is the primary input value that results in a scan element's clock input being at its inactive state (for latches) or at the state prior to a capturing transition (for edge-triggered devices). In the case of Figure 3-10, the off-state for the CLR signal is 1, and the off-states for CK1 and CK2 are both 0.
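In the tools, you declare each scan clock together with its off-state. For the pins of Figure 3-10, the declarations would look like this sketch:

    // Set/reset line: off-state is 1
    add clocks 1 CLR
    // Edge-triggered clocks: off-state is 0
    add clocks 0 CK1 CK2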

Scan Architectures

You can choose from a number of different scan types, or scan architectures. DFTAdvisor, the Mentor Graphics internal scan synthesis tool, supports the insertion of mux-DFF (mux-scan), clocked-scan, and LSSD architectures. Additionally, DFTAdvisor supports all standard scan types, or combinations thereof, in designs containing pre-existing scan circuitry. You can use the Set Scan Type command (see page 120) to specify the type of scan architecture you want inserted in your design.
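For example, a DFTAdvisor dofile line selecting the mux-DFF architecture might read as follows (a sketch; clocked_scan and lssd would select the other two architectures):

    set scan type mux_scan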

Each scan style provides different benefits. Mux-DFF or clocked-scan is generally the best choice for designs with edge-triggered flip-flops. Additionally, clocked-scan ensures data hold for non-scan cells during scan loading. LSSD is most effective on latch-based designs.

    The following subsections detail the mux-DFF, clocked-scan, and LSSD architectures.

Mux-DFF

A mux-DFF cell contains a single D flip-flop with a multiplexed input line that allows selection of either normal system data or scan data. Figure 3-11 shows the replacement of an original design flip-flop with mux-DFF circuitry.


    Figure 3-11. Mux-DFF Replacement

In normal operation (sc_en = 0), system data passes through the multiplexer to the D input of the flip-flop, and then to the output Q. In scan mode (sc_en = 1), scan input data (sc_in) passes to the flip-flop, and then to the scan output (sc_out).
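This behavior can be sketched in a few lines of Verilog. The model below is illustrative only, with assumed port names; actual scan cells come from the vendor library:

    // Behavioral sketch of a mux-DFF scan cell (port names assumed)
    module mux_dff (
        input      data,   // normal system data
        input      sc_in,  // scan data from the previous cell in the chain
        input      sc_en,  // 0 = system mode, 1 = scan shift mode
        input      clk,
        output reg q       // feeds the system logic and serves as sc_out
    );
        // The multiplexer in front of the D input selects system or scan data
        always @(posedge clk)
            q <= sc_en ? sc_in : data;
    endmodule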

Clocked-Scan

The clocked-scan architecture is very similar to the mux-DFF architecture, but uses a dedicated test clock to shift in scan data instead of a multiplexer. Figure 3-12 shows an original design flip-flop replaced with clocked-scan circuitry.

    Figure 3-12. Clocked-Scan Replacement

In normal operation, the system clock (sys_clk) clocks system data (data) into the circuit and through to the output (Q). In scan mode, the scan clock (sc_clk) clocks scan input data (sc_in) into the circuit and through to the output (sc_out).

LSSD

LSSD, or Level-Sensitive Scan Design, uses three independent clocks to capture data into the two polarity-hold latches contained within the cell. Figure 3-13 shows the replacement of an original design latch with LSSD circuitry.


    Figure 3-13. LSSD Replacement

In normal mode, the master latch captures system data (data) using the system clock (sys_clk) and sends it to the normal system output (Q). In test mode, the two clocks (Aclk and Bclk) trigger the shifting of test data through both master and slave latches to the scan output (sc_out).

There are several varieties of the LSSD architecture, including single latch, double latch, and clocked LSSD.

Test Procedure Files

Test procedure files describe, for the ATPG tool, the scan circuitry operation within a design. Test procedure files contain cycle-based procedures and timing definitions that tell FastScan or FlexTest how to operate the scan structures within a design. In order to utilize the scan circuitry in your design, you must:

    Define the scan circuitry for the tool.

Create a test procedure file to describe the scan circuitry operation. DFTAdvisor can create test procedure files for you.

Perform the DRC process. This occurs when you exit from Setup mode.

Once the scan circuitry operation passes DRC, FastScan and FlexTest processes assume the scan circuitry works properly.

If your design contains scan circuitry, FastScan and FlexTest require a test procedure file. You must create one before running ATPG with FastScan or FlexTest.

For more information on the new test procedure file format, see Test Procedure File in the Design-for-Test Common Resources Manual, which describes the syntax and rules of test procedure files, gives examples for the various types of scan architectures, and outlines the checking that determines whether the circuitry is operating correctly.
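As an illustration only, a minimal test procedure file for a single mux-DFF scan group might look like the following sketch in the new format; the timeplate values, the group name grp1, and the chain length of 16 are assumptions:

    timeplate tp1 =
        force_pi 0;
        measure_po 10;
        pulse clk 20 10;
        period 40;
    end;

    procedure shift =
        scan_group grp1;
        timeplate tp1;
        cycle =
            force_sci;      // force values on all scan chain inputs
            measure_sco;    // measure all scan chain outputs
            pulse clk;      // one shift clock pulse
        end;
    end;

    procedure load_unload =
        scan_group grp1;
        timeplate tp1;
        cycle =
            force clk 0;    // hold the clock off
            force sc_en 1;  // enter shift mode
        end;
        apply shift 16;     // shift for the full chain length
    end;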


Model Flattening

To work properly, FastScan, FlexTest, and DFTAdvisor must use their own internal representations of the design. The tools create these internal design models by flattening the model and replacing the design cells in the netlist (described in the library) with their own primitives. The tools flatten the model when you initially attempt to exit the Setup mode, just prior to design rules checking. FastScan and FlexTest also provide the Flatten Model command, which allows flattening of the design model while still in Setup mode.

If a flattened model already exists when you exit the Setup mode, the tools will only reflatten the model if you have since issued commands that would affect the internal representation of the design. For example, adding or deleting primary inputs, tying signals, and changing the internal faulting strategy are changes that affect the design model. With these types of changes, the tool must re-create or re-flatten the design model. If the model is undisturbed, the tool keeps the original flattened model and does not attempt to reflatten.
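For example, issuing Setup mode commands such as the following makes the existing flattened model stale, so the tool reflattens on the next exit from Setup mode (the net names here are placeholders):

    add primary inputs /core/test_en
    add tied signals 0 /core/spare_net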

For a list of the specific DFTAdvisor commands that cause flattening, refer to the Set System Mode command page in the DFTAdvisor Reference Manual. For FastScan and FlexTest related commands, see below:

Related Commands

Flatten Model - creates a primitive gate simulation representation of the design.

Report Flatten Rules - displays either a summary of all the flattening rule violations or the data for a specific violation.

    Set Flatten Handling - specifies how the tool handles flattening violations.
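A typical Setup mode inspection sequence using these commands might look like this sketch:

    set system mode setup
    flatten model
    report flatten rules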

Understanding Design Object Naming

DFTAdvisor, FastScan, and FlexTest use special terminology to describe different objects in the design hierarchy. The following list describes the most common:

Instance - a specific occurrence of a library model or functional block in the design.

Hierarchical instance - an instance that contains additional instances and/or gates underneath it.

Module - a Verilog functional block (module) that can be repeated multiple times. Each occurrence of the module is a hierarchical instance.


The Flattening Process

The flattened model contains only simulation primitives and connectivity, which makes it an optimal representation for the processes of fault simulation and ATPG. Figure 3-14 shows an example of circuitry containing an AND-OR-Invert cell and an AND gate, before flattening.

    Figure 3-14. Design Before Flattening

    Figure 3-15 shows this same design once it has been flattened.

    Figure 3-15. Design After Flattening

After flattening, only naming preserves the design hierarchy; that is, the flattened netlist maintains the hierarchy through instance naming. Figures 3-14 and 3-15 show this hierarchy preservation. /Top is the name of the hierarchy's top level. The simulation primitives (two AND gates and a NOR gate) represent the flattened instance AOI1 within /Top. Each of these flattened gates retains the original design hierarchy in its naming, in this case /Top/AOI1.

The tools identify pins from the original instances by hierarchical pathnames as well. For example, /Top/AOI1/B in the flattened design specifies input pin B of instance AOI1. This


naming distinguishes it from input pin B of instance AND1, which has the pathname /Top/AND1/B. By default, pins introduced by the flattening process remain unnamed and are not valid fault sites. If you request gate reporting on one of the flattened gates, the NOR gate for example, you will see a system-defined pin name shown in quotes. If you want internal faulting in your library cells, you must specify internal pin names within the library model. The flattening process then retains these pin names.
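For example, an ATPG library model for the AOI cell of Figure 3-14 could name the internal nets that connect its primitives; those names then survive flattening and become valid fault sites. The sketch below is illustrative only; the model name, the internal pin names N1 and N2, and the exact primitive syntax should be checked against the DFT library documentation:

    model AOI (B, C, D, E, Y) (
        input  (B, C, D, E) ()
        output (Y) (
            primitive = _and (B, C, N1);  // N1, N2 are named internal pins
            primitive = _and (D, E, N2);
            primitive = _nor (N1, N2, Y);
        )
    )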

You should be aware that in some cases, the design flattening process can appear to introduce new gates into the design. For example, when flattening decomposes a DFF gate into a DFF simulation primitive, the Q and Q' outputs require buffer and inverter gates, respectively. If your design wires together multiple drivers, flattening would add wire gates or bus gates. Bidirectional pins are another special case that requires additional gates in the flattened representation.

Simulation Primitives of the Flattened Model

DFTAdvisor, FastScan, and FlexTest select from a number of simulation primitives when they create the flattened circuitry. The simulation primitives are multiple-input (zero to four), single-output gates, except for the RAM, ROM, LA, and DFF primitives. The following list describes these simulation primitives:

PI, PO - primary inputs are gates with no inputs and a single output, while primary outputs are gates with a single input and no fanout.

    BUF - a single-input gate that passes the values 0, 1, or X through to the output.

FB_BUF - a single-input gate, similar to the BUF gate, that provides a one-iteration delay in the data evaluation phase of a simulation. The tools use the FB_BUF gate to break some combinational loops and provide more optimistic behavior than when TIEX gates are used.

Note
There can be one or more loops in a feedback path. In Atpg mode, you can display the loops with the Report Loops command. In Setup mode, use Report Feedback Paths.

The default loop handling is simulation-based, with the tools using the FB_BUF to break the combinational loops. In Setup mode, you can change the default with the Set Loop Handling command. Be aware that changes to loop handling will have an impact during the flattening process.

ZVAL - a single-input gate that acts as a buffer unless Z is the input value. When a Z is the input value, the output is an X. You can modify this behavior with the Set Z Handling command.

INV - a single-input gate whose output value is the opposite of the input value. The INV gate cannot accept a Z input value.


AND, NAND - multiple-input gates (two to four) that act as standard AND and NAND gates.

OR, NOR - multiple-input (two to four) gates that act as standard OR and NOR gates.

XOR, XNOR - 2-input gates that act as XOR and XNOR gates, except that when either input is an X, the output is an X.

    MUX - a 2x1 mux gate whose pins are order dependent, as shown in Figure 3-16.

    Figure 3-16. 2x1 MUX Example

The sel input is the first defined pin, followed by the first data input and then the second data input. When sel=0, the output is d1. When sel=1, the output is d2.

Note
FlexTest uses a different pin naming and ordering scheme, which is the same ordering as the _mux library primitive; that is, in0, in1, and cnt. In this scheme, cnt=0 selects in0 data and cnt=1 selects in1 data.

LA, DFF - state elements whose order-dependent inputs include set, reset, and clock/data pairs, as shown in Figure 3-17.

    Figure 3-17. LA, DFF Example

Set and reset lines are always level-sensitive, active-high signals. DFF clock ports are edge-triggered, while LA clock ports are level-sensitive. When set=1, out=1. When reset=1, out=0. When a clock is active (for example, C1=1), the output reflects its associated data line value (D1). If multiple clocks are active and the data they are trying to place on the output differs, the output becomes an X.

TLA, STLA, STFF - special types of learned gates that act as, and pass the design rule checks for, a transparent latch, sequential transparent latch, or sequential transparent flip-flop. These gates propagate values without holding state.


TIE0, TIE1, TIEX, TIEZ - zero-input, single-output gates that represent the effect of a signal tied to ground or power, or a pin or state element constrained to a specific value (0, 1, X, or Z). The rules checker may also determine that state elements exhibit tied behavior and replace them with the appropriate tie gates.

    TSD, TSH - a 2-input gate that acts as a tri-state driver, as shown in Figure 3-18.

    Figure 3-18. TSD, TSH Example

When en=1, out=d. When en=0, out=Z. The data line, d, cannot be a Z. FastScan uses the TSD gate, while FlexTest uses the TSH gate for the same purpose.

SW, NMOS - a 2-input gate that acts like a tri-state driver but can also propagate a Z from input to output. FastScan uses the SW gate, while FlexTest uses the NMOS gate for the same purpose.

BUS - a multiple-input (up to four) gate whose drivers must include at least one TSD or SW gate. If you bus more than four tri-state drivers together, the tool creates cascaded BUS gates. The last bus gate in the cascade is considered the dominant bus gate.

    WIRE - a multiple-input gate that differs from a bus in that none of its drivers are tri-statable.

PBUS, SWBUS - a 2-input pull-bus gate, for use when you combine strong bus and weak bus signals together, as shown in Figure 3-19.

    Figure 3-19. PBUS, SWBUS Example

The strong value always goes to the output, unless the value is a Z, in which case the weak value propagates to the output. These gates model pull-up and pull-down resistors. FastScan uses the PBUS gate, while FlexTest uses the SWBUS gate.

ZHOLD - a single-input buskeeper gate (see page 80 for more information on buskeepers) associated with a tri-state network that exhibits sequential behavior. If the input is a binary value, the gate acts as a buffer. If the input value is a Z, the output


depends on the gate's hold capability. There are three ZHOLD gate types, each with a different hold capability:

o ZHOLD0 - When the input is a Z, the output is a 0 if its previous state was a 0. If its previous state was a 1, the output is a Z.

o ZHOLD1 - When the input is a Z, the output is a 1 if its previous state was a 1. If its previous state was a 0, the output is a Z.

o ZHOLD0,1 - When the input is a Z, the output is a 0 if its previous state was a 0, or the output is a 1 if its previous state was a 1.

    In all three cases, if the previous value is unknown, the output is X.

RAM, ROM - multiple-input gates that model the effects of RAM and ROM in the circuit. RAM and ROM differ from other gates in that they have multiple outputs.

OUT - gates that convert the outputs of multiple-output gates (such as RAM and ROM simulation gates) to a single output.

Learning Analysis

After design flattening, FastScan and FlexTest perform extensive analysis on the design to learn behavior that may be useful for intelligent decision making in later processes, such as fault simulation and ATPG. You have the ability to turn learning analysis off, which may be desirable if you do not want to perform ATPG during the session. For more information on turning learning analysis off, refer to the Set Static Learning command or the Set Sequential Learning command reference pages in the ATPG and Failure Diagnosis Tools Reference Manual.
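For example, to skip static learning in a session where you do not intend to run ATPG, the command would be along these lines (a sketch; see the reference page for the full option list):

    set static learning off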

The ATPG tools perform static learning only once, after flattening. Because pin and ATPG constraints can change the behavior of the design, static learning does not consider these constraints. Static learning involves gate-by-gate local simulation to determine information about the design. The following subsections describe the types of analysis performed during static learning.

Equivalence Relationships

During this analysis, simulation traces back from the inputs of a multiple-input gate through a limited number of gates to identify points in the circuit that always have the same values in the good machine. Figure 3-20 shows an example of two of these equivalence points within some circuitry.


    Figure 3-20. Equivalence Relationship Example

Logic Behavior

During logic behavior analysis, simulation determines a circuit's functional behavior. For example, Figure 3-21 shows some circuitry that, according to the analysis, acts as an inverter.

    Figure 3-21. Example of Learned Logic Behavior

During gate function learning, the tool identifies the circuitry that acts as gate types TIE (tied 0, 1, or X values), BUF (buffer), INV (inverter), XOR (2-input exclusive OR), MUX (single select line, 2-data-line MUX gate), AND (2-input AND), and OR (2-input OR). For AND and OR function checking, the tool checks for busses acting as 2-input AND or OR gates.