
F8: Introduction to an Advanced Verification Methodology

1
Simulation/Verification Flow

[Flow diagram: the design is simulated in Modelsim, synthesized with
Quartus/Vivado, and prototyped in an FPGA]

2
The need for Verification…
• ~70% of effort when developing RTL
– trend: growing
• Testbenches are more complex than RTL models
• Growth of testbench complexity is more than linear with
RTL complexity.
– example:
• 10 state machines with 4 states: 4^10 total states
• 20 state machines with 4 states: 4^20 total states
• 10 state machines with 8 states: 8^10 total states

• We have to test the RTL model under situations similar to
those that the manufactured chip will encounter during
use (we have to develop a “model of the universe”)
3
This lecture
• We will study an Advanced Verification
Methodology for ASICs, i.e., we will learn
– how to develop high-quality verification environments
– how to use advanced verification techniques such as
assertions and coverage for digital systems

4
What is verification?
Verification ensures that the RTL performs the correct
function.

What defines the correct function of the RTL?

[Diagram: Verification ← Specification → Design; both verification and
design derive from the specification]

What is the format of the specification?


1. Paper document
2. Executable spec
a) SystemC
b) C++
c) Matlab
d) etc.

5
Why Verify?
The later in the product cycle a bug is found, the more
costly it is to fix.

6
Needs of a Verification Language
• Very different needs from RTL
– RTL: the language must be simple enough for the
“stupid” synthesis tools to understand it and know
how to synthesize it
• ex – fifo: use a RAM, with pointer A and pointer B, increment
the pointers when reading or writing

– Verification: the language must be super-powerful to be able to
implement the “model of the universe” quickly and efficiently; it
does not need to be understood by synthesis tools
• ex – fifo: virtual storage with push and pop

7
Hardware Description Languages
• VHDL/Verilog
– Developed in the 1980s
– Good languages for RTL design – synthesizable code
– Very few non-synthesizable constructs for writing
testbenches

8
Hardware Verification languages

• e/Vera
– Developed in the 1990s
– Only high-level constructs, impossible to write RTL
code
– People had to mix VHDL/Verilog RTL models with
e/Vera testbenches
• HVLs cover three main features lacking in HDLs:
– Random constrained stimuli generation
– Assertions
– Functional coverage

9
System Verilog
• Hardware Description and Verification language
• Superset of Verilog – all Verilog systems work in
System Verilog
• First standardized by IEEE in 2005. In 2013: IEEE
standard 1800-2012
– Download the standard
– “Holy book of SystemVerilog”
– Answers to all of your questions are inside
– Good for reference, not for learning
• System Verilog “is” the standard IEEE 1800-2012
– Simulator/synthesizer implementations might be
incomplete

10
System Verilog (ctd.)
• RTL subset of the language:
– Small superset of the Verilog RTL subset – some
constructs have been inserted to simplify different
things
• Verification subset of the language:
– Very very very big superset of Verilog
• Object-oriented constructs
• Random constrained stimuli generation
• Assertions
• Functional Coverage

11
System Verilog vs VHDL-2019
• System Verilog testbenches can be used to test
System Verilog RTL models, but can also be
used to test VHDL/Verilog RTL models – mixed-
language simulation

• VHDL-2019 has adopted many of the System Verilog
concepts to better support verification.
– Time will tell if it is enough…

12
Today’s lecture
• What is verification?
• Why verify?
• The verification process
• Verification plan exercise
• Directed testing
• Testbench exercise
• Randomization
• Functional coverage
• Testbench components
• Black/white/gray box verification methods
• HW 1

13
Verification Basics

14
Software design flow
• Goes through the following steps (in order):
– Specifications
– High-level language (C, ...)
– Low-level language (assembly)
– Machine language

• Only the specs-to-high-level-language translation is made
by humans (most of the time)
– If the high-level language is correct, the machine
language will work

15
Software implementation flow
• The high-level language to machine-language flow is:
– Fast
– Inexpensive
• Software flow:
1) Write HL language
2) Translate HL language to machine language
3) Run machine language and find bugs
4) Fix HL language
5) Translate HL language to machine language
6) Run machine language and find bugs
7) Repeat from 4 until finished

16
FPGA implementation flow

• The FPGA implementation flow is similar to software:
1) Write RTL language
2) Translate RTL language to gates and implement in FPGA
3) Run on FPGA and find bugs
4) Fix RTL language
5) Translate RTL language to gates and implement in FPGA
6) Run on FPGA and find bugs
7) Repeat from 4 until finished

17
ASIC design flow
• Goes in order through the following steps:
– Specifications/System Design
– Register Transfer Level
– Netlist
– Layout
– Physical chip

18
ASIC design flow (ctd.)
• From specs to RTL:
– human translation
• From RTL to Netlist to Layout to Physical:
– mostly automated translation (considered a solved, or
mostly solved, problem)
– If RTL is bug-free, the physical chip will work

• The main source of bugs is the specs-to-RTL translation

19
ASIC implementation flow (ctd.)

1) Write RTL language
2) Simulate RTL and find bugs
3) Fix RTL language
4) Simulate RTL and find bugs
5) Repeat from 3 until finished
6) Synthesize, make layout, implement chip

20
Validation vs Verification
• Validation:
– are we making the right thing, the thing that will
answer the needs of the user?
– are the specs right?

• Verification:
– are we making what we wanted to make?
– is the model equivalent to the specs?

(Actually, I don’t fully agree on this definition. In my opinion, validation checks for
presence of faults, while verification checks for the absence of faults)

21
The Verification Process

Specification

[Diagram: two parallel paths start from the Specification.
The Verifier interprets the spec → creates a verification plan → creates tests.
The Designer interprets the spec → creates a design spec → creates the logic.
Where the tests meet the logic: BUGS!]

22
Checking RTL to spec equivalence
• The person (team) who wrote the RTL must not
be the one who checks its correctness
– Else, bugs might not be found due to a double mistake in
interpreting the specs
– In case of errors, the mistake in interpreting the specs
might be due to either the designer or the verifier, or
both.

23
Testing at Different Levels
Block Level
- Maximum control
- Need to emulate peripheral blocks
- Might need to emulate DUT I/O
- Easiest to debug
- Highest simulation performance

System Level
- Minimum control
- Need to emulate system I/O
- Hardest to debug
- Lowest simulation performance
24
The Verification Plan
1. Binds a requirement in the specification to the test(s)

Specification: “The I2C bus will read/write the config registers”

Test Plan:
- Check reset values
- Write a writable config reg
- Write a read-only config reg
- Read a config reg
- Read a memory mapped input
- Stall an I2C transaction
2. Pick a Technique for testing feature
a. Block, hybrid, or System level
b. Directed or random
c. Assertions
d. Emulation (FPGA, accelerator, etc)
e. Verification IP
f. Self-checking or visual

25
The Verification plan (ctd.)
• A plan of everything that should be verified in a DUT
– Should include all possible features and potential
sources of bugs
– When all tests in the verification plan have been run
and no bugs were found, the verification work is
over

26
Formal verification vs
Simulation-based verification
• Formal verification is a new paradigm:
– Tools prove that the RTL is equivalent to a high-level
model or that it satisfies certain properties
– No need for simulation
– Might totally replace simulation-based verification
one day…

• We consider only simulation-based verification

27
Evolution of the Verification Plan

[Diagram: the verification plan is continuously updated throughout the project]

Design and Verification follow the “Waterfall” flow


28
Source: Dr. Meeta Yadav, ASIC Verification, North Carolina State University, Fall 2007; Wile, Goss, Roesner: Comprehensive Functional Verification, Elsevier
Basic Testbench Functionality

1. Generate stimulus
2. Apply stimulus to DUT
3. Capture the responses
4. Check for correctness
5. Measure progress

29
Directed Testing
Most (all) designers probably specify directed testing in their
test plan
- Steady progress
- Little up-front infrastructure development
- Small design changes could mean massive test changes

30
Directed tests
• Testbenches without randomness, targeting a
specific item in the verification plan.
– Example:
write to the fifo for 16 consecutive cycles, check that the fifo is full,
then read all 16 elements and check that it is empty

• If the design is complex enough, it is impossible to cover
all features with directed testbenches

31
Methodology Basics
Our verification environments will use the following principles
1. Constrained random stimulus
2. Functional coverage
3. Layered testbench using transactors
4. Common testbench for all tests
5. Test-specific code kept separate from testbench

32
Random Verification
• Random Verification Procedure:
1) Generate random tests using random constrained stimuli
generation
2) Check for bugs and correct them if there are any
3) Check the coverage values. If not satisfactory, add constraints
and repeat from 1

• Note: some directed testbenches might be necessary to
cover the corner cases

33
Random Constrained Stimuli
Generation
• Not to be confused with the “random” construct in
Verilog/VHDL
• Main idea:
– define random variables and constraints
– ask the “random solver” to find a random set of
variables that satisfies the constraints
– constraints can be added and/or disabled to create
different tests

34
Random Constrained Stimuli
Generation
• Example:
– random bit a
– random integers b, c between 0 and 255
– constraint CA: if a=1 then (b+c)!=256
– constraint CB: b>c
• It would be hard to make a routine to randomly
generate one of the legal combinations using only
direct randomization of variables; the constraint solver
does this for us (see the sketch below)
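As an illustration, this example maps directly onto a SystemVerilog class
with rand variables and constraint blocks (a minimal sketch; the class and
constraint names are only illustrative):

  class stim;
    rand bit a;
    rand int unsigned b, c;

    constraint range_bc { b inside {[0:255]}; c inside {[0:255]}; }
    constraint CA { a == 1 -> (b + c) != 256; }  // implication operator
    constraint CB { b > c; }
  endclass

  module tb;
    initial begin
      stim s = new();
      repeat (5) begin
        if (!s.randomize()) $error("randomization failed");
        $display("a=%0d b=%0d c=%0d", s.a, s.b, s.c);
      end
    end
  endmodule

Each call to randomize() asks the solver for a new legal combination, and
individual constraints can be switched off with constraint_mode(0) to
create different tests.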

35
What should you randomize?

Much more than data


1. Device configuration
2. Environment configuration
3. Protocol exceptions
4. Errors and violations
5. Delays
6. Test order
7. Seed for the random test

36
$random
• Available since Verilog 1995.
• $random creates a 32-bit signed random number.
• If your range is a power of 2, use implicit bit selection:

    logic [3:0] addr;
    addr = $random;

• If your range is not a power of 2, use modulus (%).
  Example: generate a random addr between 0 and 5:

    addr = $unsigned($random) % 6;

• New to System Verilog are $urandom and $urandom_range:

    $urandom_range(5,0);

• Unless your code changes, the random generation will be the same. Use
  a seed to change the generation:

    integer r_seed = 2;
    addr = $unsigned($random(r_seed)) % 6;
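A minimal self-contained sketch combining the fragments above (the module
and variable names are illustrative):

  module rand_demo;
    logic [3:0] addr4;
    logic [2:0] addr6;
    integer r_seed = 2;
    initial begin
      addr4 = $random;                         // truncates to 4 bits
      addr6 = $unsigned($random(r_seed)) % 6;  // 0..5, reproducible via the seed
      $display("addr4=%0d addr6=%0d ur=%0d",
               addr4, addr6, $urandom_range(5,0));
    end
  endmodule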
37
Functional Coverage
How do you know your random testbench is doing anything useful?
• Functional coverage measures how many items in your test plan have
been tested.
• For the ALU example:
– Have all opcodes been exercised?
– Have operands taken values of max pos, max neg, 0?
– Have all permutations of operands been exercised?

• Functional coverage can be collected manually or by writing System
Verilog coverage statements (see the covergroup sketch below).
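For instance, the ALU questions above map naturally onto a covergroup
(a minimal sketch; the signal names clk, opcode and op_a are assumptions):

  module alu_cov_example(input logic clk,
                         input logic [3:0] opcode,
                         input logic signed [7:0] op_a);
    covergroup alu_cg @(posedge clk);
      cp_op : coverpoint opcode;          // have all opcodes been exercised?
      cp_a  : coverpoint op_a {           // operand corner values
        bins zero    = {0};
        bins max_pos = {127};
        bins max_neg = {-128};
      }
      op_x_a : cross cp_op, cp_a;         // opcode/operand combinations
    endgroup
    alu_cg cg = new();                    // instantiate to start sampling
  endmodule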

38
Assertions
• Tools for automatic checking of properties
– “automated waveform checker”
– Example:
• when req goes to one, grant must be 1 between 2 and 3 cycles later
• req must never be at one for more than two consecutive cycles

• Can be used in testbenches or “bundled” with the RTL to check input
correctness (a sketch of the two example properties follows)
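The two example properties could be written as SystemVerilog assertions
roughly like this (a minimal sketch; clk, req and grant are assumed
signal names):

  module req_grant_checks(input logic clk, req, grant);
    // grant must be 1 between 2 and 3 cycles after req rises
    a_grant: assert property (@(posedge clk) $rose(req) |-> ##[2:3] grant);
    // req must never be 1 for three consecutive cycles
    a_req:   assert property (@(posedge clk) not (req [*3]));
  endmodule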

39
Feedback from Functional Coverage
• Tests might need to be modified or additional tests written
due to feedback from functional coverage.

[Diagram: Driver → DUT → Monitor, with Assertions attached to the DUT;
manual or automatic feedback from coverage adjusts the stimulus]

40
Code Coverage
• Besides functional coverage, there are other coverage
metrics that are tool features and can be used
independently of the language
– These do not require writing code to enable coverage
– Statement coverage: has every statement in the DUT been
executed?
– Path coverage: have all paths been followed?
– Expression coverage: have all causes for control-flow change
been tried?
– FSM coverage: has every state in an FSM been visited? Every
transition followed?

41
Code coverage –
Statement coverage
y   if (a>1 || b>1) begin
y     c <= d;
y     d <= d+1;
y   end
y   else begin
y     if (a==2) begin
n       d <= d-1;   // never executed: a is at most 1 here, so a==2 is impossible
y     end
y     else
y       d <= d-2;
y   end

42
Code coverage – Path coverage
Run 1:

  if (a>1 && b>1) begin
    c <= d;
    d <= d+1;
  end
  if (a>2) begin
    if (a==3) begin
      d <= d-1;
    end
    else
      d <= d-2;
  end

43
Code coverage – Path coverage
Run 2:

  if (a>1 && b>1) begin
    c <= d;
    d <= d+1;
  end
  if (a>2) begin
    if (a==3) begin
      d <= d-1;
    end
    else
      d <= d-2;
  end

44
Code coverage – Path coverage
At the end of all runs, we find out this legal path was not exercised:

  if (a>1 && b>1) begin
    c <= d;
    d <= d+1;
  end
  if (a>2) begin
    if (a==3) begin
      d <= d-1;
    end
    else
      d <= d-2;
  end

45
Code coverage –
Expression coverage
  if ((a>1 && b>1) || (a<0) || (b<0)) begin
    c <= d;
    d <= d+1;
  end

• All statements were executed (100% statement coverage), but not every
sub-expression that can make the condition true was exercised

46
Code coverage –
FSM coverage
• Usually checks if all states in a state machine
were exercised
– But you should also check that all transitions in the
state machine were exercised

47
Testbench Components
• Testbench wraps around the Design Under Test
- Generate stimulus
- Capture response
- Check for correctness
- Measure progress through coverage numbers
• Features of an effective testbench:
– Reusable and easy to modify for different DUTs <- Object oriented
– Layered to enable reuse
- Flat testbenches are hard to expand.
- Separate code into smaller pieces that can be developed separately
and combine common actions together
– Catches bugs and achieves coverage quickly <- Randomize!

48
Testbenches

• Basic model: give inputs, read outputs

[Diagram: Inputs generator → RTL model → Outputs checker]

• The element to test is called DUT (Design Under Test)
or DUV (Design Under Verification)

49
Verification models
• Basic model:
– Give inputs, check outputs - black box verification
– We might use white-box verification (give inputs,
check internal signals)
– Or grey-box verification (give inputs, check some
internal signals specifically inserted for debug purposes)

50
Black/White/Grey Box testing
methods
• Blackbox verification
• Use I/O only for determining bugs
• Difficult to fully verify and debug

• Whitebox verification
• Use I/O and signals in the DUT for determining bugs
• Easy to debug

• Greybox verification
• A hybrid of blackbox and whitebox
verification
51
Testbench Structuring
• It is important to conceptually divide testbenches into blocks,
depending on the function:
– Generator of high-level input data
(example: we decide to send out 4 packets of 1024 bytes followed by 2 packets of
64 bytes)
– Driver: read the high-level input data and drive the DUT input ports.
– Output monitor: read data from the DUT’s output ports.
– Output checking: check correctness of the output

• This allows easier readability and reuse. If the DUT input protocol
changes, only the driver must change.
• This requirement is most important for big systems, with a lot of reuse
and many people working on them.

52
Layered Testbench
Flat testbench:

  // Write 32'hFFFF to 16'h50
  @(posedge clk);
  PAddr   <= 16'h50;
  PWData  <= 32'hFFFF;
  PWrite  <= 1'b1;
  PSel    <= 1'b1;
  // Toggle PEnable
  @(posedge clk);
  PEnable <= 1'b1;
  @(posedge clk);
  PEnable <= 1'b0;

Layered testbench:

  task write(reg [15:0] addr,
             reg [31:0] data);
    @(posedge clk);
    PAddr   <= addr;
    PWData  <= data;
    PWrite  <= 1'b1;
    PSel    <= 1'b1;
    // Toggle PEnable
    @(posedge clk);
    PEnable <= 1'b1;
    @(posedge clk);
    PEnable <= 1'b0;
  endtask
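With the task in place, the whole flat sequence collapses to a single
reusable call for this APB-style write:

  write(16'h50, 32'hFFFF);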
53
Structured testbenches

[Diagram: Generator of high-level inputs → Inputs driver (gives the inputs
to the DUT) → RTL model → Outputs monitor (reads the outputs and translates
them into HL) → Outputs checker (checks the outputs against the expected
HL outputs)]
54
Signal and Command Layers
• Signal Layer
- DUT and all pin level connections
• Command Layer
- Driver converts low level commands to inputs of DUT
- Assertions look at I/O or internal signals of DUT
- Monitor converts outputs of DUT to low level transaction results

55
Testbenches

• Often a Golden model can be used:

[Diagram: the Generator of high-level inputs feeds both a Golden model
(Matlab, TLM, timed, untimed, …) and the Inputs driver → RTL model →
Outputs monitor; the Outputs checker compares the monitored outputs
against the expected HL outputs from the Golden model]

• Golden model and RTL must be developed by different
teams; errors might be in both
56
Functional Layer
• Agent
- Breaks down higher level commands into low level commands
- Informs the Scoreboard of commands sent to the DUT
• Scoreboard
- Predicts the results of the commands
• Checker
- Compares the result predicted by the scoreboard with the result
collected by the Monitor

57
Scenario Layer
• Generator
- Breaks down a scenario into high level commands
• Scenario examples for an MP3 player
- Play music from storage
- Download new music
- Adjust volume
- Change tracks, etc.

58
Test Layer and Functional Coverage
• Test block determines:
- What scenarios to run
- Timing of scenarios
- Random constraints

• Functional Coverage:
- Measures progress of tests
- Changes throughout project
59
SoC Verification
• Collection of IPs
– Each IP must first be verified at block-level
• Then top-level verification follows
– Verification systems for IPs are packaged into VIPs (Verification
IPs), with drivers, monitors, assertions to check input correctness,
high-level models, etc.
– A scoreboard keeps track of which tests have been run and
coverage
• Possible to build a chip in which only some components
are RTL, the others are golden models
– Using VIPs it is easy to build fast complex models of what is
around a block or a chip

60
UVM
• Universal Verification Methodology
– Methodology on top of System Verilog that automates all this
– Supported by VHDL-2019
• Key focus: reuse
– Components are enclosed into agents, containing checkers,
monitors, drivers, etc.
– A chip can be built connecting together the different VIPs

• We do not consider UVM in this course; it is only suited
to complex systems
– For the testbench complexity level used in the course labs, some
form of “conceptual” division between testbench blocks is
considered sufficient.
61
Summary

62
Simulation Environment Phases
• Build
- Generate DUT configuration and environment configuration
- Build testbench environment
- Reset the DUT
- Configure the DUT
• Run
- Start environment
- Run the test
• Wrap-up
- Sweep
- Report

63
Maximum Code Reuse
• Put your effort into your testbench, not into your
tests.
• Write 100s or 1000s of directed tests, or far fewer
random tests.
• Use directed tests to cover missed functional coverage

64
Testbench Performance
• Directed tests run quicker than random tests
- Random testing explores a wide range of the state space
- Simulation time is cheap, your time is not

• Avoid visually verified tests

• Test maintenance can be a huge effort

65
IL2203 Rest of the course
• Remaining parts of the course (3 ECTS)
– SEM1 – Seminars (1.5 ECTS)
– LAB2 – System Verilog Labs (1.5 ECTS)

• Seminars (two groups, half-class in each group)
– Study the System Verilog Language Guide
– Read the verification course book
– Solve exercises from the book

66
Verification Course Book
• We use the following books:

– SystemVerilog for Verification,
Chris Spear, Greg Tumbush
(online via the KTH library)

– SystemVerilog Standard, IEEE Std 1800-2012
(download the standard)

67
Reversed Classroom Pedagogics
• You will study the chapters and solve the
exercises at home before the lecture
– Exercises are associated with the chapters in the
electronic book (on the home page).

• You will present your solutions during class
– We randomly pick a student to present each task
– Your performance in class will decide your grade (P/F)

68
IL2203 Part 2 - Seminar & Lab Order
• Seminar 1 – Solving/discussing selected problems from
– Chapter 2 – Data Types
– Chapter 3 – Procedural Statements and Routines
– Chapter 4 - Connecting the Testbench and Design
– Chapter 5 - Basic OOP

• Verification Lab 1 – Testing of an unknown function

• Seminar 2
– Chapter 6 – Randomization
– Chapter 7 - Threads and Interprocess Communication
– Chapter 8 - Advanced OOP and Testbench Guidelines
– Chapter 9 - Functional Coverage and Assertions (Self-study H)

• Verification Lab 2 – Make an advanced testbench in System Verilog for
the Microcontroller FSM you developed in Lab 3

69
Seminar Schedule
• Seminar slots allocated
– 3/10 10-12 (before the VHDL exam; replace with a VHDL exercise instead?)
– 31/10 8-10 and 1/11 13-15 (two days in a row does not make sense;
remove one of these two)
– 5/11 10-12 (keep?)

• Laboration slots
– 6/11 Testing Unknown functionality
– 13/11 Advanced testbench in System Verilog

70
Labs & Tools - QuestaSim 10.0b 64-bit

71
