Ch1

Hardware Functional Verification
By: John Goss, Verification Engineer, IBM
[email protected]


Transcript of Ch1


  • Other References
    Text references:
    Writing Testbenches: Functional Verification of HDL Models by Janick Bergeron
    A Designer's Guide to VHDL by Peter Ashenden
    Additional information can be found on the web at:
    http://janick.bergeron.com/wtb
    http://www.vhdl.org

  • Introduction

  • What is this course about?
    60%-80% of design effort is dedicated to verification.
    Unlike synthesizable code, there are no strict coding styles for verification (a free-for-all).
    The absence of constraints, and the lack of available expertise and references in verification, have resulted in ad hoc approaches.
    Most HDL books (VHDL or Verilog) deal with design, not verification.
    Over the years, these HDL books have been refined as synthesis tools have been refined.

  • What is this course about? (cont.)
    To teach the necessary concepts for the tools of verification
    To describe a process for carrying out effective functional verification
    To present techniques for applying stimulus and monitoring the response of a design using bus functional models
    To present the importance of behavioral modeling

  • Prior Knowledge
    This class focuses on functional verification of hardware designs using either VHDL or Verilog.
    Students are expected to have a basic knowledge of one of these languages.
    Students are expected to have a basic understanding of digital hardware design.
    The class will focus more on VHDL.

  • VHDL vs. Verilog
    What language should I use?
    This is usually dictated by one's experience and personal preference.
    Typically, when working with a language, you do not notice the things that are simple to do; instead you notice the frustrations, and how easy it would be if you were using the other language.
    Both languages are inadequate for verification (by themselves).
    Both languages are equal in terms of the area under the learning curve: VHDL's learning curve is steeper, but Verilog's goes on much further.

  • Why HDL Verification?
    I mentioned that 60%-80% of time is spent in verification. WHY??
    Product time-to-market
    Hardware turn-around time
    Volume of "bugs"
    Development costs
    "Early User Hardware" (EUH)

  • Why HDL Verification? (cont.)
    Cost of bugs over time: the longer a bug goes undetected, the more expensive it is.
    A bug found early (designer sim) has little cost.
    Finding a bug at the chip/system level has moderate cost: it requires more debug and isolation time, and could require a new algorithm, which could affect the schedule and cause board rework.
    Finding a bug in System Test (test floor) requires a new spin of the chip.
    Finding a bug in the customer's environment can cost hundreds of millions and, worst of all, reputation.

  • What is Verification?
    Not a testbench
    Not a series of testbenches

  • Verification is a process used to demonstrate the functional correctness of a design. Also called logic verification or simulation.

  • What is a testbench?
    A testbench usually refers to the code used to create a pre-determined input sequence to a design, and then optionally observe the response.
    It is a generic term used differently across the industry.
    It always refers to a testcase.
    Most commonly (and appropriately), a testbench refers to code written (in VHDL, Verilog, etc.) at the top level of the hierarchy. The testbench is often simple, but may have some elements of randomness.
    It is a completely closed system: no inputs or outputs; effectively a model of the universe as far as the design is concerned.
    The verification challenge: what input patterns to supply to the Design Under Verification, and what output is expected from a properly working design.
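As a concrete illustration, here is a minimal sketch of such a closed-system VHDL testbench. The DUV is a hypothetical 2-input AND gate; the entity name `and2` and its port names are assumptions for the example, not from the course material:

```vhdl
library ieee;
use ieee.std_logic_1164.all;

-- A testbench is a completely closed system: the top-level
-- entity has no ports.
entity tb_and2 is
end entity tb_and2;

architecture sim of tb_and2 is
  signal a, b, y : std_logic := '0';
begin
  -- Design Under Verification (the and2 entity is assumed to
  -- exist elsewhere in the work library).
  duv : entity work.and2 port map (a => a, b => b, y => y);

  -- Apply a pre-determined input sequence, then observe the
  -- response and check it against the expected output.
  stim : process
  begin
    for i in 0 to 3 loop
      if i / 2 = 1   then a <= '1'; else a <= '0'; end if;
      if i mod 2 = 1 then b <= '1'; else b <= '0'; end if;
      wait for 10 ns;
      assert y = (a and b)
        report "and2: unexpected output" severity error;
    end loop;
    report "test done";
    wait;  -- suspend forever; the simulation ends here
  end process stim;
end architecture sim;
```

Because the design is so small, this testbench can be exhaustive (all four input combinations); the real challenge, as noted above, is choosing input patterns when exhaustion is impossible.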

  • Importance of Verification
    Most books focus on syntax, semantics, and the RTL subset.
    Given the amount of literature on writing synthesizable code vs. writing verification testbenches, one would think that the former is the more daunting task. Experience proves otherwise.
    70% of design effort goes to verification.
    Properly staffed design teams have dedicated verification engineers.
    Verification engineers usually outnumber designers 2-to-1.
    80% of all written code is in the verification environment.

  • Verification is on the critical path

  • Want to minimize Verification Time!

  • Ways to reduce verification time
    Verification time can be reduced through:
    Parallelism: add more resources.
    Abstraction: work at a higher level of abstraction (e.g. C vs. Assembly). Beware: this means a reduction in control.
    Automation: tools to automate standard processes. This requires standard processes, and not all processes can be automated.

  • Reconvergence Model
    A conceptual representation of the verification process.
    The most important question: What are you verifying?
    (Diagram: a Transformation path and a Verification path reconverging)

  • Human Factor in the Verification Process
    An individual (or group of individuals) must interpret the specification and transform it into the correct function.
    (Diagram: Specification → Interpretation → RTL Coding, with Verification closing the loop)

  • Ways to reduce human-introduced errors
    Automation: take human intervention out of the process.
    Poka-Yoke: make human intervention fool-proof.
    Redundancy: have two individuals (or groups) check each other's work.

  • Automation
    The obvious way to eliminate human-introduced errors: take the human out.
    Good in concept, but reality dictates that this is not feasible:
    Processes are not defined well enough.
    Processes require human ingenuity and creativity.

  • Poka-Yoke
    A term coined in Total Quality Management circles.
    It means to mistake-proof the human intervention.
    Typically the last step toward complete automation.
    Same pitfalls as automation: verification remains an art; it does not lend itself to well-defined steps.

  • Redundancy
    Duplicate every transformation. Every transformation made by a human is either:
    Verified by another individual (simplest), or
    Performed twice, completely and separately, with the outcomes compared to verify that both produced the same or equivalent result (most costly, but still cheaper than redesign and replacement of a defective product).
    The designer should NOT be in charge of verification!

  • What is being verified?
    Choosing a common origin and reconvergence points determines what is being verified and what type of method to use.
    The following types of verification all have different origin and reconvergence points:
    Formal verification
    Model checking
    Functional verification
    Testbench generators

  • Formal Verification
    Once the end points of a formal verification reconvergence path are understood, you know exactly what is being verified.
    Two types of formal verification:
    Equivalence checking
    Model checking

  • Equivalence Checking
    Compares two models to see if they are equivalent.
    Proves mathematically that the origin and output are logically equivalent.
    Examples:
    RTL to gates (post-synthesis)
    Post-synthesis gates to post-PD gates

  • Equivalence Reconvergence Model
    (Diagram: RTL → Synthesis → Gates, with an equivalence Check closing the loop)

  • Model Checking
    A form of formal verification.
    Characteristics of a design are formally proven or disproved.
    Looks for generic problems or violations of user-defined rules about the behavior of the design.

  • Model Checking Reconvergence Model
    (Diagram: Specification, via Interpretation, yields Assertions and RTL; Model Checking reconverges the two)

  • Functional Verification
    Verifies design intent.
    Without it, one must trust that the transformation from specification to RTL was performed correctly.
    It can prove the presence of bugs, but cannot prove their absence.

  • Functional Reconvergence Model
    (Diagram: Specification → RTL, with Functional Verification closing the loop)

  • Testbench Generators
    A tool that generates stimulus to exercise code or expose bugs.
    Designer input is still required.
    RTL code is the origin, and there is no reconvergence point.
    The verification engineer is left to determine whether the testbench applies valid stimulus.
    If used with parameters, the generator can be controlled to focus the testbenches on more specific scenarios.

  • Testbench Generation Reconvergence Model
    (Diagram: RTL, via Testbench Generation guided by code coverage / proof metrics, yields the Testbench)

  • Functional Verification Approaches
    Black-box approach
    White-box approach
    Grey-box approach

  • Black-Box
    The black box has inputs, outputs, and performs some function. The function may be well documented... or not.
    To verify a black box, you need to understand the function and be able to predict the outputs based on the inputs.
    The black box can be a full system, a chip, a unit of a chip, or a single macro.

  • White-Box
    White-box verification means that the internal facilities are visible and utilized by the testbench stimulus.
    Example: unit/module-level verification

  • Grey-Box
    Grey-box verification means that a limited number of internal facilities are utilized in a mostly black-box environment.
    Example: most environments! Prediction of correct results at the interface is occasionally impossible without viewing an internal signal.
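One way to peek at an internal signal from a mostly black-box environment is a VHDL-2008 external name, which reaches into the hierarchy without adding ports to the design. This is a sketch only; the hierarchical path `.tb_top.duv.fifo_full` and the signal itself are hypothetical:

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity grey_monitor is
end entity grey_monitor;

architecture sim of grey_monitor is
  -- VHDL-2008 external name: observe an internal DUV signal.
  -- The path and signal name are assumptions for illustration.
  alias fifo_full is << signal .tb_top.duv.fifo_full : std_logic >>;
begin
  watch : process
  begin
    wait until fifo_full = '1';
    -- The checker can now predict results that depend on this
    -- internal condition.
    report "internal FIFO went full" severity note;
    wait;
  end process watch;
end architecture sim;
```

The trade-off is the same one noted for white-box verification: the monitor now depends on the DUV's internal structure and must change if that structure changes.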

  • Perfect Verification
    To fully verify a black box, you must show that the logic works correctly for all combinations of inputs. This entails:
    Driving all permutations on the input lines
    Checking for proper results in all cases
    Full verification is not practical on large designs, but the principles are valid across all verification.

  • Verification vs. Test
    The two are often confused.
    The purpose of test is to verify that the design was manufactured properly.
    The purpose of verification is to ensure that the design meets its functional intent.

  • Verification and Test Reconvergence Model
    (Diagram: Specification → HW Design → Netlist → Fabrication → Silicon; Verification reconverges Specification and Netlist, Test reconverges Netlist and Silicon)

  • Verification and Design Reuse
    Won't use what you don't trust. How do you trust it? Verify it.
    For reuse, designs must be verified against stricter requirements:
    All claims, possible combinations, and uses must be verified,
    not just how the design is used in one specific environment.

  • Cost of Verification
    A necessary evil: it always takes too long and costs too much.
    Verification does not generate revenue, yet it is indispensable:
    To create revenue, a design must be functionally correct and provide benefits to the customer.
    Proper functional verification demonstrates the trustworthiness of the design.

  • When is Verification Done?
    It is never truly done on complex designs.
    Verification can only show the presence of errors, not their absence.
    Given enough time, errors will be uncovered.
    The question: is the error likely to be severe enough to warrant the effort spent to find it?

  • When is Verification Done? (cont.)
    Verification is similar to statistical hypothesis testing.
    The hypothesis: Is the design functionally correct?

  • Hypothesis Matrix

  • Verification Terminology
    EDA: Electronic Design Automation tool vendors, e.g. Synopsys, ModelTech, Cadence, etc.
    Behavioral: code written to perform the function of logic on the interface of the DUV.
    Macro: 1) a behavioral; 2) a piece of logic.
    Driver/Agitator/Stimulator/Generator/Bus Functional Model (BFM): code written to manipulate the inputs of the DUV. Typically this is behavioral code; it understands the interface protocols.
    Checker: code written to verify the outputs of the DUV. A checker may have some knowledge of what the driver has done. A checker must also verify interface protocol compliance.

  • Verification Terms (cont.)
    Snooper/Monitor: code that watches interfaces or internal signals to help the checkers perform correctly. It can also be used by drivers to be more stressful and adaptive.
    Architecture: the design criteria as seen by the customer. A design's architecture is specified in documents, usually a specification, with which the design must be compliant (and against which it is verified).
    Microarchitecture: the design's implementation. It refers to the constructs used in the design (e.g. pipelines, caches, etc.).
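To make the driver/BFM terminology concrete, here is a minimal sketch of a bus functional model as a VHDL procedure. The valid/data handshake, signal names, and data width are all assumptions invented for the example; the point is that the testcase calls one abstract operation while the BFM handles the cycle-by-cycle protocol:

```vhdl
library ieee;
use ieee.std_logic_1164.all;

package bfm_pkg is
  -- Hypothetical BFM for a simple valid/data interface: one
  -- procedure call hides the interface protocol from the testcase.
  procedure bus_write (
    signal   clk   : in  std_logic;
    signal   valid : out std_logic;
    signal   data  : out std_logic_vector(7 downto 0);
    constant val   : in  std_logic_vector(7 downto 0));
end package bfm_pkg;

package body bfm_pkg is
  procedure bus_write (
    signal   clk   : in  std_logic;
    signal   valid : out std_logic;
    signal   data  : out std_logic_vector(7 downto 0);
    constant val   : in  std_logic_vector(7 downto 0)) is
  begin
    wait until rising_edge(clk);  -- align to the bus clock
    valid <= '1';                 -- present the transaction
    data  <= val;
    wait until rising_edge(clk);  -- hold for one cycle
    valid <= '0';                 -- return the bus to idle
  end procedure bus_write;
end package body bfm_pkg;
```

A testcase then simply calls `bus_write(clk, valid, data, x"5A")`. If the interface protocol changes, only the BFM body changes, not the testcases built on it.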

    Most HDL books deal with design, not verification; maybe 1 or 2 chapters that gloss over verification.
    The techniques for applying stimulus and monitoring the response of a design work by abstracting the operations of an interface; these abstractions are called bus functional models, or BFMs.
    Behavioral modeling is used to parallelize the implementation and verification of a design and to perform more efficient simulations.
    Most people associate "behavioral model" with synthesizable or RTL modeling. The context in which I use "behavioral model" describes any model that adequately emulates the functionality of a design; the code may or may not be synthesizable.
    Ideally, students should have experience in writing synthesizable models and be familiar with running simulation tools (either VHDL or Verilog).
    I recommend the VHDL book for reference, as I will be touching on utilizing the VHDL language to its fullest. I prefer VHDL and have more experience with it.
    I mentioned that 60%-80% of time is spent in verification. WHY?? The answer, simply put: cost. We are out there to deliver products and make money. The more it costs, the less we make.
    Chapter 1 begins here!
    Test question: Reuse is a key term in today's short time-to-market environment, and verification is a key to design reuse.
    Test question: the verification challenge.
    Examples:
    Parallelism: Digging a hole in the ground can be parallelized by providing more workers armed with shovels. In verification, it is necessary to be able to write and debug testbenches in parallel with each other as well as in parallel with the design implementation. But remember, there is a limit on the number of resources to apply: given 9 men and one woman, it still takes 9 months to make a baby.
    Abstraction: Instead of adding more resources with shovels, invest in a backhoe to dig the hole. Caution: using a backhoe suffers from some loss of control. The worker no longer directly interacts with the dirt; digging happens much faster, but with less precision. Also consider a novice vs. an expert operating the backhoe.
    Automation: Holes must be dug in a variety of shapes, sizes, depths, locations, and soil types. Verification faces similar challenges: a variety of functions, interfaces, protocols, and transformations must be verified, so it is not possible to provide general-purpose automation. But some areas can be automated; for example, there exist trench-digging machines to lay cable.
    The purpose of verification is to ensure that the result of some transformation is as intended or as expected. For example, the purpose of verifying your checkbook is to ensure that all transactions have been recorded accurately and to confirm that the balance in the register reflects the actual amount of available funds.
    Human-introduced errors are introduced by interpretation. When designers verify their own design, they verify their own interpretation, not the specification.
    Automation: not an option, because hardware design is not a well-defined process; certain steps can be automated, but not the complete process.
    Poka-Yoke: simple foolproof steps; since the verification process is an art, it does not lend itself to well-defined steps.

    Redundancy: the simplest remedy, but the most costly; still cheaper than respinning a defective ASIC. It requires every transformation to be duplicated. In the industry, there should be a 2-to-1 ratio of verifiers to designers.
    This defines which transformation is being verified.
    Formal verification is often misunderstood. Engineers believe that by using mathematical exploration, the need for testbenches is eliminated. This is FALSE.

    Equivalence checking checks whether a transformation is correct. Why do this?? EDA tools are written by humans; they make mistakes too! It checks process points to ensure that functionality is maintained, e.g. clock-tree insertion, scan-chain connections, etc.
    Problems with formal verification:
    Knowing which assertions to prove, and expressing them correctly, is the most difficult part.
    There are very few tools for model checking, and the tools that do exist can only check small designs (< 1000 latches/registers).
    Unless a spec is written in a formal language with precise semantics, it is impossible to prove that a design meets the intent of its specification. Specs are written in natural language by individuals with varying abilities to communicate.
    A testbench generator uses metrics, or the results of some proof, on the RTL to generate testbenches (or stimulus) to increase the metric or to exercise the design into violating some property.
    Example: for a 2-input design (4 combinations) a generator can be exhaustive. Now think of an 8-input design: 256 combinations. Are all of them correct?
    Black-box: no knowledge of the actual implementation.
    Disadvantages: you can only control inputs and observe outputs, so it lacks controllability and observability; it is difficult to locate the source of a problem.
    Advantages: independent of the implementation, which could be VHDL, Verilog, a C model, an architectural model, behavioral, RTL, etc.
    White-box: the opposite of black-box. Full visibility and controllability of internals. Interesting conditions can be set up quickly (e.g. set up a counter to roll over). Tightly integrated with the implementation: the testbench must change if the DUV changes.
    Testing verifies that internal nodes can be toggled. The thoroughness of testing depends on the controllability and observability of internal nodes.

    The thoroughness of test depends on the controllability and observability of internal nodes. One method is scan: linking all registers together into a chain that is accessible from the chip pins, so that the internal nodes can be controlled and observed. This puts restrictions on the design, but it keeps down the cost of fabrication problems. Janick's book contains references for more info.
    If restrictions are imposed for test, why shouldn't some be imposed for verification!!!!
    Verification should be taken into consideration during architecture. Architects should ask the following 2 questions:
    What is it supposed to do?
    How is it going to be verified?
    As the number of bugs found decreases, the cost and time of finding the remaining ones increase: the law of diminishing returns. So when is verification done? Two questions:
    How much is enough?
    When will I be done?