AMS System-Level Verification and Validation using UVM in SystemC and SystemC-AMS: Automotive Use Cases

M. Barnasconi, Fellow, IEEE, K. Einwich, T. Vörtler, F. Pêcheux, M.-M. Louërat, J.-P. Chaput, Z. Wang, P. Cuenot, I. Neumann, T. Nguyen, R. Lucas and E. Vaumorin

IEEE Design & Test of Computers

Abstract—The trend in the automotive industry is to increase the amount of electronics inside vehicles. The complexity of such systems is rising with the number of components involved on the one hand, and with the tighter interaction between these components, be they analog or digital hardware or software, on the other. The verification of Electronic Control Units (ECUs) is becoming more and more challenging. In this paper we show that the Universal Verification Methodology (UVM), initially developed for digital systems, which consists in clearly distinguishing between the test scenario, described in an abstract way, the Device Under Test (DUT) and the test environment that translates the test to the DUT interface, can be extended to analog and mixed-signal systems. We introduce the UVM-SystemC-AMS library, implemented on top of SystemC and its AMS extension SystemC-AMS. This approach is used to verify two ECUs from the automotive industry. The first use case shows how UVM can be used for the simulation-based verification of a complex mixed-signal design. The second use case shows how UVM can be used on a Hardware In the Loop (HIL) system to verify/validate an FPGA prototype.

Index Terms— Design Under Test (DUT), Electronic Control Unit (ECU), Electronic System Level (ESL), Hardware In the Loop (HIL), system simulation, SystemC, SystemC-AMS, Timed Data Flow (TDF), Transaction Level Modeling (TLM), Universal Verification Methodology (UVM), system verification, Virtual Prototyping (VP).

I. INTRODUCTION

The complexity of electronic systems for the automotive industry is increasing. Such systems are designed to exploit

Manuscript received 24 October 2014. This work was supported in part by the project Verification for Heterogeneous Reliable Design and Integration (VERDI), which is funded by the European Commission within the 7th Framework Program (FP7/ICT 287562).

M. Barnasconi is with NXP Semiconductors, Eindhoven, The Netherlands ([email protected]).

K. Einwich and T. Vörtler are with Fraunhofer IIS, Design Automation Division EAS, Dresden, Germany ([email protected]).

F. Pêcheux, M.-M. Louërat, J.-P. Chaput, Z. Wang are with the LIP6 lab, SU-UPMC, CNRS, Paris, France ([email protected]).

P. Cuenot is with Continental Automotive France, Toulouse, France ([email protected])

I. Neumann is with Continental Teves AG & Co. oHG, Germany ([email protected]).

T. Nguyen is with Infineon Technologies Austria AG, Villach, Austria ([email protected]).

R. Lucas and E. Vaumorin are with Magillem Design Services, Paris, France ([email protected])

tight interaction between the physical world, captured or controlled through sensors and actuators, and the digital hardware (HW) and software (SW) world. Electronic Control Units (ECUs) in cars are fully heterogeneous systems, implementing analog power electronics and low-voltage electronics, controlled by software running on an embedded processor. To master the complexity of ECU design, a variety of Electronic System Level (ESL) design methods and languages have arisen. To that purpose, the SystemC language standard [1][2] has been extended with powerful Analog and Mixed Signal (AMS) modeling capabilities [3], addressing the functional and architectural levels.

Yet, while great effort has been made towards efficient analog and mixed-signal system-level design and modeling technologies [4][5][6][7][8][9], system-level verification has been left behind. Coverage-driven verification of digital IP has become more mature since the introduction of the Universal Verification Methodology (UVM) standard [10], implemented in SystemVerilog [11]. The principle is to build structured test benches with reusable verification components. Following the UVM approach, the tests are designed in a hierarchical and modular way, using abstraction levels similar to those of the device under test (DUT) itself. This includes abstractions such as Transaction Level Modeling (TLM) for the test sequences [12], combined with an accurate signal-level interface to the DUT. We have introduced the UVM-SystemC and AMS extensions, called here UVM-SystemC-AMS, to enable the development of virtual prototypes at TLM abstraction with a structured test bench methodology like UVM. The goal is to perform extensive verification of the embedded application. Yet, one application run will often require a high computational effort, which can take up to hours or days. When the DUT model includes RTL-level descriptions, simulation-based verification may become impractical.

This is where the UVM approach helps to efficiently translate the test bench built for simulation-based verification into real hardware prototyping validation using laboratory measurements. Using UVM-SystemC-AMS we were able to establish a tight coupling between the virtual prototyping simulation and the Hardware In the Loop (HIL) laboratory validation, leading to faster verification and improved bug detection.

Two use cases will show the applicability of UVM-SystemC-AMS for industrial-scale applications. The first use case shows how UVM can be used for the simulation-based


verification of a complex SystemC-AMS design, which uses Timed Data Flow (TDF) interfaces. The second use case shows how UVM can be used on a HIL system to verify/validate an FPGA prototype.

The paper is organized as follows. Section II describes related work in the areas of methodologies and verification tools for AMS-ESL. Section III presents our approach in terms of a layered verification environment architecture based on UVM, SystemC and SystemC-AMS. Section IV introduces the UVM-SystemC-AMS library and its main features. Section V describes the constrained randomization of real values as well as the functional coverage API, illustrated with an SPI-controlled filter example. Section VI presents the creation of a verification environment in UVM-SystemC-AMS for an automotive verification use case. Section VII presents an automotive validation use case. Section VIII concludes the paper.

II. STATE OF THE ART AND CONTRIBUTION

The trend to start earlier with system verification in the

design cycle, in combination with the growing complexity of the actual design implementation (and its virtual prototype counterpart), emphasizes the need to apply a proven verification methodology. Therefore there have been many attempts in the past to create structured and reusable verification environments in SystemC/C++ (Figure 1). In [13], the Open Verification Methodology (OVM) formed the basis for creating a SystemC-equivalent verification environment. The System Verification Methodology [14] was based on OVM-SystemC, donated by Cadence to the community [15]. Mentor Graphics released a SystemC version as part of their Advanced Verification Methodology (AVM) [16]. Synopsys also introduced a class library in SystemC as part of their Verification Methodology Manual (VMM) [17][18]. More recent studies address the need to support a true multi-language verification environment in SystemVerilog, SystemC and e [19].

Figure 1: Evolution of Verification Methodologies

However, none of these initiatives fully complies with the methods defined in the UVM standard, primarily because they are built on the former AVM, OVM, and VMM technologies. The consolidation into a single UVM standard resulted in major changes. As a consequence, the user has to deal with the

incompatibilities related to simulation semantics and language constructs. In particular, the move from OVM to UVM significantly changed the way components deal with the phasing mechanism and how the end-of-test is managed. To avoid legacy concepts and constructs in modern test benches, migration to UVM standard compatible implementations should be encouraged.

Alternative solutions are proposed in [20][21][22] to address the multi-language integration challenges found in today's verification environments, by defining a set of coding guidelines centered around TLM communication. However, the creation of reusable verification components and their integration in a test bench is much more than having an agreed communication method; additional elements like test bench configuration and reuse of test sequences require a more holistic view of UVM and its principles, and justify making these concepts available in other languages.

Therefore an up-to-date and UVM standard compliant language definition and reference implementation is needed in SystemC/C++, which not only gives the user community a semantically and syntactically correct implementation of UVM, but also the same user experience in terms of the UVM "look & feel". Especially the latter aspect would make it easier for UVM users to start using SystemC for system-level and hardware/software co-verification, and would help SystemC or software experts become more familiar with the powerful UVM concepts and advance their verification practice [23].

The recent C++ based extension for analog mixed-signal (AMS) simulation in SystemC-AMS [3] has opened the perspective of electronic mixed-domain system simulation and of an extension to the verification domain and test coverage. There is a real need to extend the current digital-oriented UVM methodology and implementation with new constructs that support the analog domain and the Models of Computation (MoC) of SystemC-AMS.

Ultimately, this will benefit the entire design community, where verification and system-level design practices come closer together. It will serve the universal objectives of the UVM, addressing the need for having a common verification platform, including hardware prototyping in which C-based test sequences or verification components in UVM-SystemC are reused in HIL simulation or Rapid Control Prototyping (RCP) [24].

UVM is a verification framework that allows the creation of test benches based on a constrained random stimulus principle. Instead of testing the DUT with directed test sequences, random stimulus is applied, which is shaped by constraints so that the randomly generated values are valid stimulus. As the input stimulus is randomly generated, it is very important to observe which data has been sent to the DUT, to make sure that all design corners have been exercised during a verification regression run. For this purpose, functional coverage can be used, which allows user-defined coverage goals to be specified.

The UVM standard and associated class library implementation in SystemVerilog does not define the constructs for randomization and functional coverage, because these concepts are intrinsically part of the SystemVerilog


standard, defined in IEEE Std. 1800 [11]. In a similar way, UVM in SystemC will not introduce such constructs for randomization and coverage, but will make use of dedicated libraries for this purpose. Several good attempts have been made to support constrained randomization for SystemC, such as the SystemC Verification library (SCV) [25] and the Constrained Random Verification Environment for SystemC (CRAVE) [26]. Different libraries that add functional coverage to SystemC have also been proposed in [12][27][28][29]. Furthermore, various commercial, proprietary or vendor-specific solutions are available.

In the following we present our contributions: a methodology based on UVM and SystemC-AMS that allows mixed-signal designs to be tested at different abstraction levels. We illustrate how the test descriptions can also be reused for laboratory and prototype validation. To support real-valued analog quantities, we introduce an API for randomization, which supports constrained randomization with continuous distribution functions, as well as an API for functional coverage of real values.

III. UVM PRINCIPLES FOR AMS SYSTEMS

The AMS system-level verification methodology is based

on three basic UVM principles that have been extended to AMS systems since the first presentation of the UVM-SystemC library [23]. The first one consists of a clear separation between the tests, the test bench and the AMS DUT implementation. The second one is the definition of abstract and reusable test scenarios; creating a sequence of commands, which can drive an analog signal pattern, forms a test scenario. The third one concerns the test bench, or verification environment: it is the translator between the abstract test scenario and the DUT.

A. Layered approach

UVM-SystemC-AMS follows a layered approach, inspired

by [17], where levels of abstraction are introduced to distinguish the test scenario from the actual verification environment on which sequences are executed. Figure 2 illustrates the three-level top-down refinement of test sequence stimuli as well as the bottom-up reconstruction of performance indicators for verification. The virtual sequences are extracted from a scenario database (the test) and propagated to the test bench. The virtual sequencer controls all sequences at the top level. The sequencers control the sequences of data transactions and send them to drivers. The drivers translate the sequences of data transactions into Discrete Event (DE) signals for the digital IPs and into SystemC-AMS TDF samples for the AMS IPs, and send them to the DUT ports. The monitors collect the output signals from the DUT and store them. The analysis utility functions translate a relevant set of output signals into transactions. Figures of merit are computed from these output signal transactions as well as from the associated collected input stimuli transactions. The scoreboard compares the results of the DUT with those of the golden reference model.

Figure 2: UVM based layering

B. UVM components

We take advantage of the already existing principles of

UVM used in the digital domain to clearly separate the analog/mixed-signal test, the test bench and the actual implementation of the DUT exhibiting AMS behaviors. Universal Verification Components (UVCs) are the primitive building blocks used to create the test bench or verification environment. Figure 3 illustrates a typical example. At the top level of the hierarchy, two UVCs (called here VIP1 and VIP2) are connected to the virtual sequencer using TLM communication (see Figure 2 and Figure 3).

A UVC contains one or more agents. The cornerstone of a UVC is the agent (Figure 3), which instantiates the sequencer, the driver (Figure 2 and Figure 3) and the monitor (Figure 2 and Figure 3). The agent receives sequence requests and converts them into low-level data at the DUT interface. Agents may be active or passive. Active agents drive signals to the DUT and thus instantiate a sequencer and a driver. Passive agents do not drive the DUT and thus do not instantiate any sequencer or driver. Whether active or passive, agents may instantiate monitors to collect results.

Figure 3: UVM-SystemC-AMS test environment


The sequences encapsulate sequence items, also called transactions. The sequences are not part of the test environment hierarchy; they are UVM-SystemC-AMS objects that are mapped onto one or more sequencers. The sequencer reacts to requests from the driver by getting a sequence item and delivering it to the driver. The driver requests transactions (sequence items) and translates them into one or more physical signals.

All components can be configured using the UVM configuration database.

Functional coverage can be collected by adding coverage models at different levels of abstraction, shown in green in Figure 3. Coverage requiring signal values is collected in a monitor (Figure 3). More abstract coverage is collected based on transactions, which a monitor provides through an analysis port (Figure 2 and Figure 3). The subscribers (Figure 3) are supplied with this collected information. This allows functional coverage related to the overall checking of the verification goals to be performed by the scoreboard.

C. Reuse

UVM test environments in SystemC can be reused for

verification at different levels of abstraction as well as for validation in the laboratory. A UVM-SystemC-AMS test environment can be used to test a design from a system-level description down to an RTL description, since SystemC can be coupled with different simulators [30][31]. Furthermore, as C++ code can easily be compiled for different hardware platforms, it is also possible to reuse UVM test environments for laboratory-based validation; SystemC can run, e.g., on Hardware-in-the-Loop simulator systems [32][33]. An example of the possible reuse of a test environment is shown in Figure 4. The application use case shown in Section VII demonstrates the reuse possibilities of UVM for SystemC.

Figure 4: Reuse of a UVM-SystemC test environment at different levels of abstraction

D. Test execution and verification features

In order to have a consistent test bench execution flow, the

UVM-SystemC and UVM-SystemC-AMS use phases to order the major steps that take place during simulation.

Figure 5: UVM-SystemC-AMS common and runtime phases

There are three groups of phases, which are executed in the following order:

1. Pre-run phases (build_phase, connect_phase): The build_phase is executed at the start of the UVM test bench simulation and its overall purpose is to construct and configure the test bench component hierarchy. During the build_phase, UVM components are indirectly constructed using the factory pattern. The connect_phase is used to interconnect all the components.

2. Runtime phase (run_phase): The test bench stimulus is generated and executed during the runtime phases, which follow the build phases. In the run_phase and the other run-time phases, we execute the test scenarios by configuring the DUT and applying the primary test stimulus to the DUT. These runtime phases are all spawned as SystemC processes and can consume time, unlike the other phases, which are untimed function calls. All UVM components using the run-time schedule are synchronized with respect to the pre-defined phases in the schedule. A minimal sketch of a timed run_phase is given after this list.

3. Post-run phases (extract_phase, check_phase, report_phase and final_phase): the results of the test case are collected and reported. These four post-run phases are used to post-process the results after the execution of the test scenario on the DUT.
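As an illustration of the run-time schedule, the following minimal sketch shows what a time-consuming run_phase of a hypothetical component could look like. The component name and the 10 us delay are purely illustrative, and the objection calls assume that the UVM-SystemC phase API mirrors its SystemVerilog counterpart; the usual headers and using-directives are omitted, as in the other listings.

class stimulus_component : public uvm_component
{
 public:
  UVM_COMPONENT_UTILS(stimulus_component);

  stimulus_component( uvm_name name ) : uvm_component( name ) {}

  virtual void run_phase( uvm_phase& phase )
  {
    // Keep the test running while stimulus is applied
    phase.raise_objection(this);

    // Run-time phases are spawned SystemC processes and may consume time
    sc_core::wait(10.0, sc_core::SC_US);

    // Allow the phasing mechanism to proceed to the post-run phases
    phase.drop_objection(this);
  }
};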

E. TDF and Discrete Event synchronisation

Virtual sequences, sequences, and sampled signals represent the actual data sent to or received from the DUT at different time scales. To correctly synchronize the UVCs involved in the test environment and the DUT, it is necessary to synchronize the related SystemC and SystemC-AMS modules.

In Figure 6, a UVM-SystemC-AMS driver component consisting of two modules is presented. The first module is a SystemC TLM adaptor. A process of this module reads the timestamps of the transactions and writes (schedules) the data of the transaction in the second delta cycle, one resolution time step before the timestamp of the transaction. This guarantees that the data will be read at the correct time by the TDF module, since SystemC-AMS always reads event-driven SystemC signals at the first delta cycle of the current time.

Figure 6: UVM-SystemC-AMS driver, from DE to TDF
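To make this hand-over concrete, the following is a minimal, self-contained sketch of such a DE-to-TDF coupling. It is not the library implementation: the module names, the 1 us TDF time step and the adaptor helper function are illustrative assumptions.

#include <systemc>
#include <systemc-ams>

// TDF side of the driver: converts the value scheduled on a discrete-event
// signal into TDF samples for the DUT input.
SCA_TDF_MODULE(de_to_tdf_driver)
{
  sca_tdf::sca_de::sca_in<double> in_de;  // converter port reading the DE signal
  sca_tdf::sca_out<double>        out;    // TDF samples towards the DUT

  void set_attributes() { set_timestep(1.0, sc_core::SC_US); }  // assumed cluster time step

  void processing()
  {
    // SystemC-AMS samples the DE signal at the first delta cycle of the
    // current time, so a value written by the adaptor one resolution time
    // step earlier is visible in this TDF sample.
    out.write( in_de.read() );
  }

  SCA_CTOR(de_to_tdf_driver) : in_de("in_de"), out("out") {}
};

// DE side of the driver: a thread process of the TLM adaptor schedules the
// transaction payload just before the transaction timestamp.
SC_MODULE(tlm_to_de_adaptor)
{
  sc_core::sc_out<double> data_de;  // bound to the sc_signal read by in_de

  // Called from a thread process for each incoming (value, timestamp) pair.
  void schedule_value(double value, const sc_core::sc_time& timestamp)
  {
    sc_core::wait( timestamp - sc_core::sc_get_time_resolution()
                   - sc_core::sc_time_stamp() );
    data_de.write(value);
  }

  SC_CTOR(tlm_to_de_adaptor) : data_de("data_de") {}
};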


This pipelined synchronization method is presented in Figure 7.

Figure 7: Pipelined synchronization process for scoreboard

Between each sequence item, the sequencer/driver waits for a period T0 (which can be variable). The monitoring thread waits for the same period, so that the emission of stimuli and the reconstruction of performance indicators are kept synchronous during the simulation, for scoreboard comparisons. Once a new sequence item has been propagated to the driver (at the very beginning of the sequence item period), the corresponding TDF driver module uses the transaction description to generate the TDF stimuli for the DUT. For instance, if the sequence item defines a new operating frequency for the wave generator, the corresponding TDF processing function reacts immediately (i.e. within a TDF period indicated by the TDF cluster time step) and generates the appropriate low-level samples. The sequence item high-level information data remain valid during the whole T0 period. At the very end of the T0 period, all the TDF stimuli samples corresponding to a given sequence item have been sent to the DUT, and all the monitored values coming from the DUT have been aggregated in order to compute the values of the performance indicators. When scheduled, the thread associated to the monitor gains direct access to these performance values, which will remain valid for the next T0 period.
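As a companion sketch (again with illustrative names, an assumed 1 us TDF time step and a 1 ms T0 window, not the actual monitor implementation of the library), a TDF monitor can aggregate the DUT output over one sequence-item period and publish the resulting performance indicator to the discrete-event side at the window boundary, where the UVM monitor thread, having waited for the same period T0, simply reads it.

#include <systemc>
#include <systemc-ams>
#include <algorithm>
#include <cmath>

SCA_TDF_MODULE(window_monitor)
{
  sca_tdf::sca_in<double>          in;       // DUT output samples
  sca_tdf::sca_de::sca_out<double> peak_de;  // performance indicator towards the UVM monitor

  void set_attributes() { set_timestep(1.0, sc_core::SC_US); }

  void processing()
  {
    peak = std::max(peak, std::fabs(in.read()));
    if (++count == samples_per_window) {   // end of the T0 window
      peak_de.write(peak);                 // stays valid during the next T0 period
      peak  = 0.0;
      count = 0;
    }
  }

  SCA_CTOR(window_monitor)
  : in("in"), peak_de("peak_de"),
    samples_per_window(1000),              // T0 / TDF time step = 1 ms / 1 us
    count(0), peak(0.0) {}

 private:
  const unsigned samples_per_window;
  unsigned count;
  double   peak;
};

// On the discrete-event side, the UVM monitor thread waits for the same
// period T0 and then reads the indicator, e.g.:
//   sc_core::wait(sc_core::sc_time(1.0, sc_core::SC_MS));
//   double indicator = vif->peak_sig.read();  // hypothetical virtual interface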

IV. UVM-SYSTEMC-AMS LIBRARY IMPLEMENTATION

In this section we present the UVM-SystemC-AMS library and describe its main features with code snippets, thereby extending the UVM-SystemC library version of [23].

A. The agent

Listing 1 below shows the creation of a UVM component

with the user-defined name vip_agent in UVM-SystemC-AMS. An agent encapsulates the components that are necessary to drive and monitor the (physical) signals to (or from) the DUT. Typically, it contains three components: a sequencer, a driver and a monitor. The agent can also contain analysis functionality for basic coverage and checking, but this is not shown in this simple example.

 1  class vip_agent : public uvm_agent {
 2   public:
 3    vip_sequencer<vip_trans>* sequencer;
 4    vip_driver<vip_trans>*    driver;
 5    vip_monitor*              monitor;
 6
 7    UVM_COMPONENT_UTILS(vip_agent)
 8
 9    vip_agent( uvm_name name )
10    : uvm_agent( name ), sequencer(0), driver(0), monitor(0) {}
11
12    virtual void build_phase( uvm_phase& phase ) {
13      uvm_agent::build_phase(phase);
14      if ( get_is_active() == UVM_ACTIVE ) {
15        sequencer =
16          vip_sequencer<vip_trans>::type_id::create("sequencer", this);
17        assert(sequencer);
18        driver =
19          vip_driver<vip_trans>::type_id::create("driver", this);
20        assert(driver);
21      }
22      monitor = vip_monitor::type_id::create("monitor", this);
23      assert(monitor);
24    }
25
26    virtual void connect_phase( uvm_phase& phase )
27    {
28      if ( get_is_active() == UVM_ACTIVE )
29      {
30        // connect sequencer to driver
31        driver->seq_item_port.connect(sequencer->seq_item_export);
32      }
33    }
34  };

Listing 1: UVM-SystemC-AMS agent

B. The sequences

The UVM sequence item is used as a template argument for the creation of the actual sequence, as shown in Listing 2. A sequence is derived from the template class uvm_sequence (Line 3). The macro UVM_OBJECT_PARAM_UTILS supports the registration of template classes with multiple arguments, which are derived from uvm_object (Line 11).

 1  template <typename REQ = uvm_sequence_item, typename RSP =
 2                      REQ>
 3  class sequence : public uvm_sequence<REQ,RSP>
 4  {
 5   public:
 6
 7    sequence( const std::string& name )
 8    : uvm_sequence<REQ,RSP>( name ) {}
 9
10
11    UVM_OBJECT_PARAM_UTILS(sequence<REQ,RSP>);
12
13    virtual void pre_body() {
14      if ( starting_phase != NULL )
15        starting_phase->raise_objection(this);
16    }
17
18    virtual void body() {
19      REQ* req;
20      RSP* rsp;
21      ...
22      start_item(req);
23      // req->randomize(); // optional randomization call
24      finish_item(req);
25      get_response(rsp);
26    }
27
28
29
30    virtual void post_body() {
31      if ( starting_phase != NULL )
32        starting_phase->drop_objection(this);
33    }
34  };

Listing 2: UVM-SystemC-AMS sequence item

The callback function body is used to implement the user-specific test scenario (Line 18). The member functions start_item and finish_item are called to negotiate with the sequencer and then send the sequence item to it (Lines 22 and 24). The member function randomize is defined as part of the compatibility layer to the SCV or CRAVE library (Line 23, see Section V). The member function body is called from a higher level in the test bench, for example by explicitly calling the member function start in a virtual sequence. Alternatively, a sequence can be started implicitly by defining a default sequence in a test, along with specifying the component and phase where it should be executed. The latter approach is presented in Section IV.D, Listing 4.

The callback functions pre_body and post_body (Lines 13 and 30) are called before and after the callback body, respectively, to raise and drop an objection, but only when the sequence has no parent sequence. For this purpose, the data member starting_phase is used, offering a handle to the phase in which the sequence was started as a default sequence (Lines 15 and 32).

C. The test bench

The test bench is defined as the complete verification environment, which instantiates and configures the universal verification components (UVCs), the scoreboard, and the virtual sequencer if necessary. The UVCs are sub-environments in the test bench, which contain one or more agents.

Listing 3 shows the implementation of the test bench. It uses the base class uvm_env (Line 1). The UVCs and the scoreboard are instantiated using the factory (Lines 15, 17, 21 and 24). This facilitates component overriding from the test scenario. The configuration database is used to configure each agent in the UVC as being active or passive, by means of the global function set_config_int (Lines 19 and 20).

 1  class testbench : public uvm_env
 2  {
 3   public:
 4    vip_uvc* uvc1;
 5    vip_uvc* uvc2;
 6    virt_sequencer* virtual_sequencer;
 7    scoreboard* scoreboard1;
 8    UVM_COMPONENT_UTILS(testbench);
 9    testbench( uvm_name name )
10    : uvm_env( name ), uvc1(0), uvc2(0),
11      virtual_sequencer(0), scoreboard1(0) {}
12    virtual void build_phase( uvm_phase& phase )
13    {
14      uvm_env::build_phase(phase);
15      uvc1 = vip_uvc::type_id::create("uvc1", this);
16      assert(uvc1);
17      uvc2 = vip_uvc::type_id::create("uvc2", this);
18      assert(uvc2);
19      set_config_int("uvc1.*", "is_active", UVM_ACTIVE);
20      set_config_int("uvc2.*", "is_active", UVM_PASSIVE);
21      virtual_sequencer = virt_sequencer::type_id::create(
22        "virtual_sequencer", this);
23      assert(virtual_sequencer);
24      scoreboard1 =
25        scoreboard::type_id::create("scoreboard1", this);
26      assert(scoreboard1);
27    }
28    virtual void connect_phase( uvm_phase& phase )
29    {
30      virtual_sequencer->vip_seqr = uvc1->agent->sequencer;
31      uvc1->agent->monitor->item_collected_port.connect(
32        scoreboard1->xmt_listener_imp);
33      uvc2->agent->monitor->item_collected_port.connect(
34        scoreboard1->rcv_listener_imp);
35    }
36  };

Listing 3: UVM-SystemC-AMS test bench

D. The test

Each UVM test is defined as a dedicated test class derived from the class uvm_test, as shown in Listing 4 (Line 1). It instantiates the test bench (Line 12) and defines the default sequence which will be executed on the virtual sequencer in the run_phase (Lines 14-17 and Section IV.B). The factory member function set_type_override_by_type is used to override the original UVM driver in the agent by a new driver (Listing 4, Lines 18-20). The types of the original and the new driver are obtained by calling the static member function get_type. The result of the scoreboard checking is extracted in the extract_phase and updates the local data member test_pass. The actual pass/fail reporting is done in the callback report_phase using the available UVM macros for reporting information and errors.

 1  class test : public uvm_test
 2  {
 3   public:
 4    testbench* tb;
 5    bool test_pass;
 6    test( uvm_name name ) : uvm_test( name ),
 7      tb(0), test_pass(true) {}
 8    UVM_COMPONENT_UTILS(test);
 9    virtual void build_phase( uvm_phase& phase )
10    {
11      uvm_test::build_phase(phase);
12      tb = testbench::type_id::create("tb", this);
13      assert(tb);
14      uvm_config_db<uvm_object_wrapper*>::set( this,
15        "tb.uvc1.agent.sequencer.run_phase",
16        "default_sequence",
17        vip_sequence<vip_trans>::type_id::get());
18      set_type_override_by_type(
19        vip_driver<vip_trans>::get_type(),
20        new_driver<vip_trans>::get_type() );
21    }
22    virtual void run_phase( uvm_phase& phase )
23    {
24      UVM_INFO( get_name(),
25        "** UVM TEST STARTED **", UVM_NONE );
26    }
27    virtual void extract_phase( uvm_phase& phase )
28    {
29      if ( tb->scoreboard1->error )
30        test_pass = false;
31    }
32    virtual void report_phase( uvm_phase& phase )
33    {
34      if ( test_pass )
35        UVM_INFO( get_name(), "** UVM TEST PASSED **",
36          UVM_NONE );
37      else
38        UVM_ERROR( get_name(), "** UVM TEST FAILED **" );
39    }
40  };

Listing 4: UVM-SystemC-AMS test definition

E. The top-level

The main program, also called the top level, uses the SystemC sc_main function and contains the DUT, the interfaces connected to the DUT, and the definition of the test, as shown in Listing 5. The interfaces are stored in the configuration database to be used by the UVC drivers and monitors to connect to the DUT (Lines 6-9).

 1  int sc_main(int, char*[])
 2  {
 3    dut* my_dut = new dut("my_dut");
 4    vip_if* vif_uvc1 = new vip_if;
 5    vip_if* vif_uvc2 = new vip_if;
 6    uvm_config_db<vip_if*>::set(0, "*.uvc1.*",
 7      "vif", vif_uvc1);
 8    uvm_config_db<vip_if*>::set(0, "*.uvc2.*",
 9      "vif", vif_uvc2);
10    my_dut->in(vif_uvc1->sig_a);
11    my_dut->out(vif_uvc2->sig_a);
12    run_test("test");
13    return 0;
14  }

Listing 5: UVM-SystemC-AMS main program

The test to be executed is either defined by explicitly instantiating the test object (of type uvm_test) or implicitly by specifying the test as an argument of the function run_test (Line 12). The latter method makes it possible to pass the test name as a string via the command line, available via the arguments of the function sc_main, and to pass it to the run_test function. The simulation is started using the SystemC function sc_start. It is recommended not to specify the simulation stop time, nor to use the SystemC function sc_stop, as the end-of-test is automatically managed by the UVM-SystemC-AMS phasing mechanism.
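A possible way to exploit this, sketched here under the assumption that run_test accepts the test name as a string (as in Listing 5) and that the command-line handling is left to the user, is to forward the first program argument to run_test:

#include <string>

int sc_main(int argc, char* argv[])
{
  // ... DUT, interfaces and configuration database entries as in Listing 5 ...

  // Take the test name from the command line if given, otherwise fall back
  // to a default test registered with the factory.
  std::string test_name = (argc > 1) ? argv[1] : "test";
  run_test(test_name);

  return 0;
}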

F. IP-XACT

The IEEE 1685 IP-XACT standard [34][35] enables an efficient assembly and configuration of the structural layers of the test environment (test bench, test and top-level elements) by generating the relevant SystemC and SystemC-AMS views necessary to conduct verification. It also provides a unified repository to exchange and share compatible components from multiple companies or services. In order to capture the UVM properties, we have introduced some extensions in the schema of the IP-XACT standard. These extensions concern in particular:

- The identification of the different UVM objects (test, test-bench, UVC, agent, scoreboard, monitor, driver and sequencer)

- The interconnection of the DUT to the UVCs with the help of virtual interfaces

- The configuration of the platform using the UVM configuration database and configuration agents

The assembly described in IP-XACT reproduces the same architecture as the UVM test environment and thus offers the same level of reusability. In the end, the generator provides the SystemC and SystemC-AMS code (header and cpp files) of the different layers (top, test and test environment), as well as the header of the virtual sequencer containing references to the sequencers instantiated in the different UVCs. The generation is based on templates, one per output file, to introduce some flexibility. Text parts (header, comments, fixed library) are dumped without any modification, and keywords are replaced by meta-data elaborated in an internal model from the IP-XACT description. Figure 8 illustrates an example of a generator: the left column shows the template used and the right column shows the generated code.

Figure 8: Template and generated code of the top level with IP-XACT

V. CONSTRAINED RANDOMIZATION AND COVERAGE

To cope with the complexity of ESL verification, random sequences/stimuli are used to increase the coverage of the DUT. As several attempts exist to bring constrained randomization to SystemC-based digital verification, such as SCV [25] or CRAVE [26], we have proposed an API for randomization and coverage as a compatibility layer, which can use either SCV or CRAVE as a back-end solver. To correctly handle the analog signals found in AMS systems, new functions have been introduced to generate random real values, following continuous distribution functions, which are subject to constraints. Moreover, the functional coverage API provides the covergroup, coverpoint and bins functions offered by SystemVerilog, which are extended to handle real values using intervals (Table I).

Functionality                          UVM-SystemC-AMS (scvx)
Coverage model                         scvx_covergroup
Coverage point                         scvx_coverpoint
Coverage state bins                    bins()
Illegal bins                           illegal_bins()
Ignore bins                            ignore_bins()
Triggers sampling of the covergroup    sample()

Table I: Basic language constructs for functional coverage in UVM-SystemC-AMS (scvx)

In addition to discretized weighted values, a set of continuous distributions has been made available. The randomization features and distribution functions of C++11 [37][38] have been used to build the API, as shown in Table II.


Distribution function                                  UVM-SystemC-AMS (scvx)
Normal distribution                                    scvx_normal_distribution
Uniform distribution                                   scvx_uniform_distribution
Bernoulli distribution                                 scvx_bernouilli_distribution
Piece-wise linear probability distribution function    scvx_piecewise_linear_probability_distribution
Discretized probability distribution function          scvx_discrete_probability_distribution

Table II: Continuous distribution functions available for real values in UVM-SystemC-AMS (scvx)

Figure 9 shows the histogram of a normal distribution with real values. To estimate the coverage, the range of each random variable is divided into intervals. The number of draws (tries that have fulfilled the constraints) is counted in each interval. The functional coverage is based on these figures.

Figure 9: Histogram of a Normal distribution featuring real values
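Since the randomization API is built on the C++11 random number facilities [37][38], the interval-based coverage estimate described above can be illustrated with plain standard C++. The distribution parameters, interval layout and coverage threshold below are illustrative assumptions, not the scvx implementation.

#include <random>
#include <vector>
#include <iostream>

int main()
{
  std::mt19937 gen(12345);                          // fixed seed for reproducibility
  std::normal_distribution<double> dist(20.0, 5.0); // mean 20, std-dev 5 (illustrative)

  const double lo = 0.0, hi = 40.0;                 // constrained range
  const int    n_bins = 12;                         // intervals used as coverage bins
  std::vector<int> hits(n_bins, 0);

  for (int i = 0; i < 10000; ++i) {
    double v = dist(gen);
    if (v < lo || v >= hi) continue;                // keep only draws fulfilling the constraint
    hits[static_cast<int>((v - lo) / (hi - lo) * n_bins)]++;
  }

  // A bin counts as covered once it has received a minimum number of draws.
  int covered = 0;
  for (int h : hits) if (h >= 10) ++covered;
  std::cout << "coverage = " << 100.0 * covered / n_bins << "%\n";
  return 0;
}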

The random variable a is declared in Listing 6. As an example, the member function set_distribution defines the distribution uniform_real, requiring the parameters min (15.3) and max (40.2) for the random variable a.

scvx::scvx_rand<real> a;
a.set_distribution(scvx_distribution::uniform_real(15.3, 40.2));

Listing 6: Setting a distribution function for a random variable

To illustrate the randomization of real values with constraints, a simple case is presented in Figure 10. The DUT is an SPI-controlled programmable filter. In order to verify the DUT with UVM-SystemC-AMS, four UVCs have been instantiated in the test environment: a sinusoidal signal generator (U1), an input monitor (U2), a power supply generator (U3) and an SPI driver to generate SPI-compliant commands (U4).

Figure 10: Test of a programmable filter with the proposed UVM-SystemC-AMS

The connections between the UVCs and the DUT follow the interface configuration mechanism provided by UVM-SystemC-AMS. The scoreboard is responsible for comparing the expected performance of the filter (here the gain) with the actual one.

Figure 11: Simulation result, showing SPI commands, power supply, input/output of the DUT and performance indicator (gain). The input signal is a sinusoidal one with randomized frequency. The performance indicator is the gain of the filter. The green crosses show the DUT gain after each sequence item. The red curve shows the reference value computed in the scoreboard.

Figure 11 presents a simulation result of the SPI-controlled programmable filter. Note that the first four signals at the top of Figure 11 use a dedicated timescale, showing the setting of the SPI commands. The next curves use another timescale, dedicated to illustrating the randomization of real values. The input stimulus of the programmable filter is a sinusoidal signal with randomized frequency. Each frequency corresponds to a certain sequence item. At the end of each sequence item, the gain of the filter is computed from a set of collected input and output signals and the result (green crosses in Figure 11) is sent to the scoreboard. To estimate the coverage of the test, the frequency range has been divided into 12 intervals and a minimum number of frequencies is required in each interval. At the beginning of the test execution the coverage is low, as shown in Listing 7, whereas at the end of the test the coverage has increased, although it remains below 100% in our example to illustrate the UVM report.


Wavefunc item sequence finished here... freq = 31278.9
frequence [100000] ± 10% is not covered!
frequence [90000] ± 10% is not covered!
frequence [70000] ± 10% is not covered!
frequence [50000] ± 10% is not covered!
frequence [40000] ± 10% is not covered!
frequence [25000] ± 10% is not covered!
frequence [19000] ± 10% is not covered!
frequence [14000] ± 10% is not covered!
frequence [10000] ± 10% is not covered!
frequence [7000] ± 10% is not covered!
frequence coverage is 16.6667%
received gain = 1.51867
383645113 ps: test.tb.scoreboard: Successfully match the expected gain value ( 1.51867 ) ± 5%

Wavefunc item sequence finished here... freq = 34988.7
frequence [25000] ± 10% is not covered!
frequence [10000] ± 10% is not covered!
frequence [7000] ± 10% is not covered!
frequence coverage is 75%
9621905335 ps: test.tb.scoreboard: Successfully match the expected gain value ( 1.50686 ) ± 5%

Listing 7: Output of the functional coverage at the beginning of the test execution and at the end, respectively.

VI. VERIFICATION USE CASE

In order to evaluate the applicability of the methodology and technology to the verification of a complex heterogeneous automotive system, it has been applied to an electric braking system design. The DUT consists of the electrical and the mechanical components of a brake actuator, an analog/mixed-signal ASIC for brake actuator control, and external circuitry.

A major step in system architecture design is to define the components of the system and the requirements for these components. To verify that the component requirement specifications will enable the complete system to work as intended, i.e. to meet the system requirements, we have developed an executable specification. The executable specification uses abstract behavioral simulation models, allowing a significantly higher simulation performance than RTL and transistor-level representations.

The digital components of the system under test are modeled in SystemC. The analog components and the mechanical components are modeled in SystemC-AMS either as TDF data flow models or as electrical linear network models depending on the accuracy required.

The test environment is implemented using the UVM-SystemC-AMS methodology and technology. The system is stimulated by driver modules and supervised by dedicated monitors. Besides digital agents for SPI interfaces and for bit vector interfaces, the test bench contains analog interface agents for creating piecewise linear waveforms and for creating arbitrary analog waveforms.
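As an illustration of what the TDF side of such an analog interface agent could look like (a minimal sketch with assumed names and a 1 us time step, not the project's actual driver), a piecewise linear waveform can be generated by interpolating a list of (time, value) points received, for example, as the payload of a sequence item.

#include <systemc>
#include <systemc-ams>
#include <vector>
#include <utility>

SCA_TDF_MODULE(pwl_driver)
{
  sca_tdf::sca_out<double> out;  // piecewise linear stimulus towards the DUT

  void set_attributes() { set_timestep(1.0, sc_core::SC_US); }

  void processing()
  {
    double t = get_time().to_seconds();
    out.write(value_at(t));
  }

  // Called, e.g., by the driver logic when a new sequence item arrives.
  void set_points(const std::vector<std::pair<double, double> >& p) { points = p; }

  SCA_CTOR(pwl_driver) : out("out") {}

 private:
  std::vector<std::pair<double, double> > points;  // (time [s], value) pairs

  double value_at(double t) const
  {
    if (points.empty()) return 0.0;
    if (t <= points.front().first) return points.front().second;
    for (std::size_t i = 1; i < points.size(); ++i)
      if (t <= points[i].first) {
        // linear interpolation within segment [i-1, i]
        double dt = points[i].first - points[i-1].first;
        double a  = (dt > 0.0) ? (t - points[i-1].first) / dt : 1.0;
        return points[i-1].second + a * (points[i].second - points[i-1].second);
      }
    return points.back().second;                   // hold the last value
  }
};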

The test stimuli are defined by sequences and sequence items, which are sent to the drivers via TLM connections.

In addition to mechanisms for normal control operations, safety-critical systems typically contain various mechanisms for detecting and handling system errors. Some examples are the detection of electrical shorts, broken wires, over-voltage/under-voltage scenarios and mechanical blocking.

An important step in the verification of such systems is to verify that these mechanisms operate as intended. In order to enable tests simulating the corresponding failure scenarios, the test bench implements several drivers for injecting failures into the system.

Figure 12 shows the overall system and test bench including error injection for emulating a short.

Figure 12: Automotive system under test and test bench

The availability of a well-structured library of verification components based on a standardized methodology made it significantly easier to develop further tests and test benches, thereby reducing the time and cost of test development.

Furthermore, the UVM methodology enabled the construction of fully automated regression test suites.

The tests have been written in a way that enables their reuse for the validation of the RTL design when it becomes available, by coupling an HDL simulator simulating the system with a SystemC simulator simulating the test bench.

VII. VALIDATION USE CASE

The methodology was applied to an airbag SoC, as an example of a highly complex and fully integrated AMS product, to show how UVM can be used on a HIL system to verify an FPGA prototype.

Figure 13: Verification and laboratory evaluation overview

For an effective HIL simulation a tight coupling between


the verification of the virtual prototype and the laboratory validation of the DUT with the HIL had to be established, as depicted in the workflow in Figure 13. Thereby the reuse of the test environment, with its test and check sequences, is critical to allow an early move of the implementation onto the prototype to accelerate and aid verification and validation.

The DUT evolves from a behavioral SystemC-AMS description to a mixed abstraction model. It is then synthesized towards an FPGA prototype and results in the first hardware setup [39].

While the DUT evolves, the test environment should remain as similar as possible to guarantee functionally correct testing throughout the steps. This constancy also ensures that a minimal effort is spent on test environment creation. Figure 14 illustrates the implemented software and hardware layers of the verification and laboratory validation architecture and emphasizes the reuse.

Figure 14: Software and Hardware implementation

As depicted in Figure 14, the test environment is based on UVM-SystemC, represented by the three uppermost layers. The high-abstraction test description provides test vectors to the specific UVM agent via the virtual sequencer, which in turn provides specific driver and monitor components (not shown) to connect the test environment to the corresponding DUT implementation. During verification, the monitors and drivers are function implementations, connecting and wrapping the modeled DUT via interfaces to the UVM-SystemC test environment. In the validation case, the connection has to go through the specific hardware of the HIL tester.

The HIL tester can directly build and run the native UVM-SystemC based test environment. It was implemented using an ARM-based Zynq SoC [40], built on a Zedboard [41], running a real-time Linux that schedules the test environment.

Figure 15 shows the laboratory setup of the airbag SoC DUT FPGA prototype. To realize the HIL tester concept, the main microcontroller is now replaced by the Zedboard with the Xilinx Zynq-based ARM SoC, on which

the UVM-SystemC test bench is re-used to emulate the physical stimulus to drive the DUT FPGA-prototype.

Figure 15: Real laboratory setup for the evaluation of the new verification/laboratory evaluation strategy

With this implementation, it was possible to shorten and focus the effort of root-cause analysis in the virtual prototype by reusing, within the computer-based verification, the test sequences obtained from the HIL simulation. As the test sequences are reused, including the test vectors, a direct mapping between the verification and validation activities becomes possible.

Including the HIL in the verification process increased the speed of verification and the bug detection possibilities at an early point in time, thus reducing development costs.

The UVM-SystemC-AMS test bench can be directly reused and it can additionally be extended for long-term tests and stress tests, which are impractical with classical computer-based system simulation.

With the new setup, the whole prototype system is checked at real-time speed against the real sensor network used later in the application, instead of against simulated sensor models.

VIII. CONCLUSION

We have explained how a unified methodology for the verification of systems having interwoven AMS/RF, HW/SW and non-electrical sub-systems and functions can be achieved. As of today, only verification methodologies addressing either AMS/RF or digital HW/SW functions exist. The clear separation of the DUT and its verification description, which has been established over the last years for complex digital systems, has been extended to analog mixed-signal ones. Thus the essential unification of analog and digital verification is made possible. We have shown that the components and the scenarios designed for simulation-based verification can be reused for the validation, based on measurements, of the complex DUT.

To support the new methodology inspired by UVM, we have introduced language constructs and generic verification components in SystemC and its AMS extensions to create application scenarios consisting of stimuli generation and response checking. We have called this library UVM-SystemC-AMS. The generic verification components with


their reuse features have been illustrated by two industrial use cases: the first one showing UVM-SystemC-AMS based verification of a complex SystemC-AMS design, the second one showing verification/validation with HIL using an FPGA prototype. Moreover, by extending the IP-XACT packaging process to facilitate its exploitation in industrial design flows, we aim to further promote reuse.

The UVM-SystemC-AMS library introduced in this paper is under standardization within the Accellera Systems Initiative.

REFERENCES

[1] T. Wieman, B. Bhattarchary, T. Jeremiassen, C. Schröder, and B. Vanthournout, "An Overview of SystemC Initiative Standards Development", IEEE Design & Test of Computers, 14-22, 2012, doi:10.1109/MDT.2012.21845518.

[2] IEEE Standards Association, “SystemC”, 2011, http://standards.ieee.org/getieee/1666/download/1666-2011.pdf

[3] Accellera Systems Initiative, “SystemC AMS 2.0 extensions”, 2013, http://www.accellera.org/downloads/standards/systemc/

[4] E. Viaud, F. Pêcheux, and A. Greiner. “An efficient TLM/T modeling and simulation environment based on conservative parallel discrete event principles”, Design, Automation & Test in Europe Conference (DATE), 2006, 94-99.

[5] Wolfgang Ecker, Volkan Esen, Robert Schwencker, Thomas Steininger, Michael Velten, TLM+ modeling of embedded HW/SW systems, Design, Automation & Test in Europe Conference (DATE), March 2010.

[6] F. Cenni, S. Scotti, E. Simeu, "SystemC AMS behavioral modeling of a CMOS video sensor," International Conference on VLSI and System-on-Chip (VLSI-SoC), Oct. 2011, pp.380-385, doi: 10.1109/VLSISoC.2011.6081614

[7] Y. Zaidi, C. Grimm, J. Haase, "Simulation based tuning of system specification," Design, Automation & Test in Europe Conference (DATE), 2011, March 2011, doi: 10.1109/DATE.2011.5763204

[8] A. Lévêque, F. Pêcheux, M-M. Louërat, H. Anoushady, F. Cenni, S. Scotti, A. Massouri, L. Clavier, “Holistic Modeling of Embedded Systems with Multi-Discipline Feedback: Application to a Precollision Mitigation Braking System”, Design, Automation & Test in Europe Conference (DATE), March 2012, 739-744.

[9] W. Li, D. Zhou, M. Li, B.P. Nguyen, X. Zeng, "Near-Field Communication Transceiver System Modeling and Analysis using SystemC/SystemC-AMS with the Consideration of Noise Issues", IEEE Trans. VLSI Systems, Vol. 21, No. 12, Dec. 2013, 2250-61.

[10] Accellera Systems Initiative, “Universal verification Methodology standard (UVM)”, 2014, http://www.accellera.org/downloads/standards/uvm/

[11] IEEE Standards Association, “System Verilog”, 2012, http://standards.ieee.org/getieee/1800/download/1800-2012.pdf

[12] M. F.S. Oliveira, C. Kuznik, H. M. Le, D. Groβe, F. Haedicke, W. Mueller, R. Drechsler, W. Ecker, and V. Esen., “The system verification methodology for advanced TLM verification”. Int. Conference on Hardware/software codesign and system synthesis,(CODES+ISSS), 2012, 313-322. doi=10.1145/2380445.2380497

[13] A. Koczor, and W. Sakowski, “SystemC library supporting OVM compliant verification methodology”, IP Embedded System Conference and Exhibition (IP-SPOC), December 2011

[14] M. F. S. Oliveira, C. Kuznik, W. Mueller, W. Ecker and V. Esen, “A SystemC Library for Advanced TLM Verification”, Design and Verification Conference & Exhibition (DVCON), February 2012.

[15] “Cadence Extends the Open Verification Methodology Beyond System Verilog to Include SystemC and e Language Support”, http://community.cadence.com/cadence_blogs_8/b/fv/archive/2009/02/27/ovm-multi-language-libraries-a-closer-look.

[16] Mentor Graphics, “Advanced Verification Methodology Cookbook”, 2012, http://www.mentor.com/products/fv/methodologies/uvm-ovm/cb_details

[17] Synopsys, “Verification Manual for System Verilog”, http://vmm-sv.org

[18] A. Sarkar, “Verification in the trenches : A SystemC implementation of VMM1.2”, 2010, https://www.vmmcentral.org/vmartialarts/2010/12/verification-in-the-trenches-a-systemc-implementation-of-vmm1-2/

[19] IEEE Standards Association “1647-2011 - IEEE Standard for the Functional Verification Language e”, August 2011, doi: 10.1109/IEEESTD.2011.6006495

[20] J. Aynsley, SystemVerilog Meets C++: Re-use of Existing C/C++ Models Just Got Easier, Design and Verification Conference & Exhibition (DVCON), February 2010.

[21] Mentor Graphics, “UVM Connect - a SystemC TLM interface for UVM/OVM - v2.2, http://forums.accellera.org/files/file/92-uvm-connect-a-systemc-tlm-interface-for-uvmovm-v22, accessed October 2014.

[22] A. Jafri, N. Naser,” Interoperable testbenches using VMM TLM”, Synopsys User Group, April 2010.

[23] M. Barnasconi, F. Pêcheux, T. Vörtler, “Advancing system-level verification using UVM in SystemC“, Design and Verification Conference & Exhibition (DVCON), March 2014

[24] K. Einwich, T. Vörtler, T. Arndt, “New technological opportunities coming along with SystemC / SystemC AMS for AMS IP Handling and Simulation”, European Nanoelectronics Design Technology Conference (DTC), June 2012

[25] Accellera Systems Initiative, “SystemC Verification Library” http://www.accellera.org/downloads/standards/systemc/scv-2.0.0.tgz

[26] F. Haedicke, H. M. Le, D. Große, R. Drechsler, “CRAVE: An Advanced Constrained Random Verification Environment for SystemC”, Int. Symposium on System on Chip (SoC), Oct. 2012.

[27] R. Siegmund, U. Hensel, A. Herrholz, and I. Volt, “A functional coverage prototype for SystemC-based verification of chipset designs”, ESCUG Meeting, Design Automation and Test in Europe Conference (DATE), February 2004.

[28] K. Schwartz, “A technique for adding functional coverage to SystemC”, Design and Verification Conference & Exhibition (DVCON) Feb. 2007.

[29] C. Kuznik and W. Müller, “Functional coverage-driven verification with SystemC on multiple level of abstraction”, Design and Verification Conference & Exhibition (DVCON), March 2011.

[30] A. Hoffman, T. Kogel, and H. Meyr. “A framework for fast hardware-software co-simulation.” Design, automation and test in Europe Conference (DATE), 2001..

[31] M. Bombana and F. Bruschi, “SystemC-VHDL co-simulation and synthesis in the HW domain”, Design, Automation & Test in Europe Conference (DATE), March 2003, 101-105, doi: 10.1109/DATE.2003.1186679

[32] J.A. Ledin, "Hardware-in-the-Loop Simulation", Embedded Systems Programming, February 1999, 42-60.

[33] T.A. Johansen, T.I. Fossen, B. Vik, “Hardware-in-the-loop Testing of DP systems”, Dynamic Positioning Conference, Nov 2005.

[34] Accellera Systems Initiative, “IP-XACT standard”, 2009, http://www.accellera.org/downloads/standards/ip-xact

[35] IEEE Standards Association, “IP-XACT”, 2014, http://standards.ieee.org/getieee/1685/download/1685-2014.pdf

[36] T. Vörtler, T. Klotz, K. Einwich, Y. Li, Z. Wang, M.-M. Louërat, J.-P. Chaput, F. Pêcheux, R. Iskander and M. Barnasconi, "Enriching UVM in SystemC with AMS extensions for randomization and functional coverage", Design & Verification Conference & Exhibition (DVCON Europe), October 2014.

[37] The C++ Standards Committee, 2011 “C++11”, http://www.open-std.org/jtc1/sc22/wg21/

[38] W. E. Brown, Random Number Generation in C++11, Document WG21 N3551, 2013

[39] T. Nguyen and S. Wooters, " Mixed-abstraction Modeling Approach with Fault Injection for Hardware-Firmware Co-design and Functional Co-verification of an Automotive Airbag System on Chip Product," SAE Int. J. Passeng. Cars – Electron. Electr. Syst. 7(1):125-132, 2014, doi:10.4271/2014-01-0240.

[40] Xilinx – Zynq-7000 All Programmable SOC, http://www.xilinx.com/products/silicon-devices/soc/zynq-7000/, accessed Oct. 2014

[41] Zedboard – Product page, http://zedboard.org/product/zedboard, accessed October 2014