
    Design verification and validation in product lifecycle

    P.G. Maropoulos (1)a,*, D. Ceglarek (1)b

    a Department of Mechanical Engineering, University of Bath, Claverton Down, Bath BA2 7AY, UK
    b Warwick Digital Laboratory, University of Warwick, Coventry, UK

    CIRP Annals - Manufacturing Technology 59 (2010) 740-759

    Keywords: Design; Validation; Verification; Lifecycle management

    Abstract: The verification and validation of engineering designs are of primary importance as they directly influence production performance and ultimately define product functionality and customer perception. Research in aspects of verification and validation is widely spread, ranging from tools employed during the digital design phase to methods deployed for prototype verification and validation. This paper reviews the standard definitions of verification and validation in the context of engineering design and progresses to provide a coherent analysis and classification of these activities from preliminary design, to design in the digital domain and the physical verification and validation of products and processes. The scope of the paper includes aspects of system design and demonstrates how complex products are validated in the context of their lifecycle. Industrial requirements are highlighted and research trends and priorities identified. © 2010 CIRP.

    1. Introduction

    Globalisation coupled with product customisation and short

    time to market has spearheaded new levels of competition among manufacturers. In CIRP, the needs for design adaptability

    [1], the ability to develop products and services for the e-

    commerce era [2] and the issues of dealing with design

    complexity [3] have been recognised. To be successful in the

    global market, manufacturing companies are increasingly

    expanding simulation models from product and process based

    (value chains) to service based (value networks) by focusing on

    lifecycle simulations and design for product variation [4] to

    obtain both quality of product and robustness of processes, and

    to enable the validation and verification of products and

    processes to 6-sigma. These methods are vital to reduce process

    faults and facilitate efficient and effective engineering changes.

    Current validation and verification-based approaches mainly

    focus on product conformance to specifications, product func-

    tionality and process capability. However, even the most robust

    systems can be subject to failures during product verification and

    validation.

    This paper presents the concepts of validation and verification

    in the product lifecycle by including analysis and review of

    literature and state-of-the-art in: (i) preliminary design; (ii) digital

    product and process development; (iii) physical product and

    process realisation; (iv) system and network design; and (v)

    complex product verification and validation.

    The paper starts with a summary of the scientific motivation for

    the review of design verification and validation. The definitions of

    verification and validation are then covered, including concepts

    and definitions arising from ISO standards as well as software

    development. The paper also defines the design application areas

    in terms of products, processes and systems and reviews main-

    stream methods and systems.

    2. Motivation, scope and definitions of verification and

    validation methods and technologies

    2.1. Motivation

    The current product and production system requirements that

    influence the way products are developed and verified include:

    - Mass customisation and personalisation.
    - Reconfigurability and flexibility of production systems.
    - Responsive factories.

    Products and processes need to be designed, verified and

    validated in a manner that is compatible with the above industrial

    requirements. Fig. 1 shows a representation of validating products

    and processes after the digital modelling phase, clearly identifying

    the research questions and business drivers.

    Validation in the digital space is a key objective and industrial

    requirement that drives research and development. If this were to

    be feasible, the results would be reduced lead times and,

    critically, fewer failures and better perceived product quality by

    the customers. Fig. 2 shows the closed-loop nature of the process

    required for managing the lifecycle data capture for design

    validation. This ability presupposes:

    - Integrated and holistic views of design, in order to be able to validate in an integrated manner.
    - Digital modelling and representation ability for both the product and the process (function and specification testing).
    - A time horizon that includes the product lifecycle.


    The following observations are valid in relation to the present

    industrial practice for design verification and validation:

    - Such activities are usually executed when the design process is almost complete, during prototyping and first-off testing and development. This results in frequent deviations from the required form, dimensions or function, extending development times and increasing the compliance cost.
    - This problem is both procedural (stage or time of execution of such activities and requirement for different skills) and theoretical (lack of robust verification and validation methods for deployment during the digital design stages).
    - The aim is to execute verification and validation as early as possible during the design process, by developing new-generation digital or virtual testing methods.
    - Complexity in design makes verification and validation even more difficult to apply as part of the design process.

    2.2. Scope of the keynote paper

    2.2.1. A framework for design verification and validation

    Fig. 3 shows the scope of the new framework for engineering design verification and validation which is lifecycle based, tracking

    the progression of engineering designs across four key stages: (i)

    from the preliminary design stage that sets the requirements, (ii) to

    the digital design domain, (iii) the physical product and process

    development and prototyping phase, and (iv) the consequent

    design of the production system and network for the realisation of

    complex products and processes.

    Product and process designs are developed in the digital

    domain and the final validation usually requires the execution of

    physical trials to confirm the product properties, dimensions and

    overall functionality at component, subsystem and complete

    product level. Processes are also validated at each one of their

    physical levels so as to provide the required physical attributes of

    components, sub-assemblies and the overall product. The system

    and network design and development also includes a digital phase

    and major considerations are confirmed by validating real system

    performance. Product lifecycle aspects are best exemplified by

    considering how complex products are validated in the context of

    lifecycle considerations. The framework shown in Fig. 3 puts a

    coherent structure to the multiplicity of digital analyses, manu-

    facturing processes and metrology technologies needed for the

    verification and validation of complex products in their lifecycle.

    These techniques and methods and their relevance to design

    verification and validation are analysed herein.

    2.2.2. Keynote scope

    The scope for this keynote is outlined in Fig. 4. The main focus of

    the paper is on product and process verification and validation.

    System perspectives are also included for completeness and

    lifecycle aspects are covered by reviewing standards and practices

    in relation to the verification and validation of complex products.

    The paper principally deals with mechanical engineering design from meso-scale to large-scale, and the corresponding processes,

    typical of high complexity and value industry sectors such as

    aerospace, marine and automotive.

    2.3. Definitions of verification and validation

    Verification and validation are the methods that are used for

    confirming that a product, service, or system meets its respective

    specifications and fulfils its intended purpose. In general terms,

    verification is a quality control process that is used to evaluate whether or not a product, service, or system complies with regulations, specifications, or conditions imposed at the start of a development phase [5,6]. Validation, on the other hand, is a quality assurance process of establishing evidence that provides a high degree of assurance that a product, service, or system accomplishes its intended use requirements [5,6].

    Fig. 1. Validation and verification requirements in the product lifecycle.

    Fig. 2. Closed-loop validation and verification.

    Fig. 3. A conceptual framework for design verification and validation.

    Verification and validation

    have been defined in various ways that do not necessarily comply

    with standard definitions. For instance, journal articles and

    textbooks use the terms verification and validation inter-

    changeably [7,8], or in some cases there is reference to verifica-

    tion, validation, and testing (VV&T) as if it were a single concept,

    with no discernible distinction among the three terms [9]. Table 1

    shows definitions of verification and validation as provided by

    international and national bodies.

    The definitions given by ISO 9000 [16] originate from the

    general field of quality and focus on the provision of objective

    evidence that specified requirements have been fulfilled. The

    verification process according to ISO is broadly defined, and

    validation is focused on fulfilling an intended use or application.

    The Global Harmonisation Task Force defines verification in a

    manner compatible with ISO, and process validation is based on

    consistent generation of results that satisfy predetermined

    requirements [19]. However, such generic definitions evolved

    due to the specific demands of application domains. For example,

    in the field of metrology, the Joint Committee for Guides in

    Metrology defines verification on the basis that a target measurement uncertainty has been met [17]. The definition of

    validation is much less specific, referring to the adequacy of

    requirements for an intended use. The verification definition by the International Organisation of Legal Metrology [18] is based on the interpretation of the word "accurate", and it clearly creates a direct link with metrology in the process of establishing how different the

    real artefact is from its modelling representation.

    There are extensive definitions of verification and validation in

    the context of digital design and these definitions also cover aspects

    of modelling and simulation. These include the IEEE Standard 610

    [10] and the definitions of the US Department of Defence (DoD) [12], as shown in Table 1. The US Department of Navy [13] and the CFD

    Committee of AIAA [14] provide definitions for modelling and

    simulation software systems that are derivatives of those provided

    by the US DoD. The US Food and Drug Administration has given

    definitions of digital systems verification and validation [15], which explicitly include references to the consistency and correctness of the software. SAE Aerospace [20] and Sargent [21] reported a

    variety of design verification aspects, as shown in Fig. 5.

    In summary, the generic definitions for design verification and validation are given by ISO 9000 [16]. As the digital stages of design become increasingly important, the verification of the modelling and simulation aspects [10,12] will become increasingly applicable. The overall process for integrated digital and physical prototype verification and validation is exemplified by SAE Aerospace [20], see Fig. 5, and the metrological practice governing the physical prototypes is given by VIM [17].

    Fig. 4. Scope of the keynote paper.

    Table 1
    Definitions of verification and validation in the digital and physical domains.

    V&V processes in the digital design phase

    Verification: The process of evaluating software to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase [10].
    Validation: The process of evaluating software during or at the end of the development process to determine whether it satisfies specified requirements [10].

    Verification: The process of determining that a computational model accurately represents the underlying mathematical model and its solution [11].
    Validation: The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model [11].

    Verification: The process of determining that a computer model, simulation, or federation of models and simulations implementations and their associated data accurately represent the developer's conceptual description and specifications [12].
    Validation: The process of determining the degree to which a model, simulation, or federation of models and simulations, and their associated data are accurate representations of the real world from the perspective of the intended use(s) [12].

    Verification: The process of determining that an M&S implementation and its associated data accurately represent the developer's conceptual description and specifications [13].
    Validation: The process of determining the degree to which a modelling and simulation (M&S) system and its associated data are an accurate representation of the real world from the perspective of the intended uses of the model [13].

    Verification: The process of determining that a model accurately represents the developer's conceptual description of the model and the solution to the model [14].
    Validation: The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model [14].

    Verification: Providing objective evidence that the design outputs of a particular phase of the software development lifecycle meet all of the specified requirements for that phase [15].
    Validation: Confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled [15].

    V&V processes in the physical world

    Verification: Confirmation, through the provision of objective evidence, that specified requirements have been fulfilled [16].
    Validation: Confirmation, through the provision of objective evidence, that the requirements for a specific intended use or application have been fulfilled [16].

    Verification: Provision of objective evidence that a given item fulfils specified requirements, such as confirmation that a target measurement uncertainty can be met [17].
    Validation: Verification, where the specified requirements are adequate for an intended use [17].

    Verification: Pertains to the examination and marking and/or issuing of a verification certificate for a measuring system [18].

    Verification: Confirmation by examination and provision of evidence that the specified requirements have been fulfilled [19].
    Validation: Objective evidence that a process consistently produces a result or product meeting its predetermined requirements [19].

    Verification: The verification process ensures that the system implementation satisfies the validated requirements [20].
    Validation: Validation of requirements and specific assumptions is the process of ensuring that the specified requirements are sufficiently correct and complete so that the product will meet applicable airworthiness requirements [20].

    3. International standards related to product and process

    design in the lifecycle perspective

    International standards play an important role in preserving the designers' intent and seamlessly utilising the associated informa-

    tion and manufacturing practices in a heterogeneous manufactur-

    ing environment. The transition of the designers' intent from the

    digital design specification to the actual product and associated

    service realisation is illustrated in Fig. 5. Today, as each phase of the product's lifecycle is globally dispersed in supply and knowledge

    chains [2], international standards are essential to deploy

    standardised manufacturing execution protocols in order to

    establish an unambiguous definition language throughout a global supply chain and ensure consistent product performance in

    the service phase. Hence, the provisions of the standards most relevant to product and process verification and validation are analysed herein.

    3.1. Standards for representing product information

    Computer interpretable representation of product information

    is utilised within a variety of CAx applications for design

    verification and validation. The majority of these standards

    represent geometric information and evolved to cover other

    aspects. Standards such as Geometrical Product Specification (GPS) [22], ASME Y14.5: Geometric Dimensioning and Tolerancing (GD&T) [23], and the STandard for Exchange of Product model data (STEP) [24] have thus evolved for modelling and preserving other aspects

    of product related information such as tolerances, kinematics,

    dynamics and manufacturing processes. For example, the STEP and

    GPS standards have evolved, providing product specific informa-

    tion constructs, known as application protocols in STEP and the GPS matrix in GPS.

    Current GPS standards define global guidelines along with

    fundamental principles for capturing designers' intent and

    expressing design requirements. Product and process design

    characteristics such as size, angle, orientation and surface texture

    are considered as individual chains as shown in Fig. 6. The

    information regarding each characteristic is categorised according

    to its relevance in the product lifecycle. Each category is called a

    link within the GPS masterplan [22]. Thus, a comprehensive chain-link matrix (Fig. 6) has resulted in a number of GPS

    standards which address how product specific characteristics can

    be represented and utilised throughout the design, manufacture

    and verification phases of the product. For example, the designers' intent regarding the size of the product's feature is preserved in the

    size chain of the GPS matrix.

    Mathieu and Dantan [25] proposed to ISO a new model for

    Geometric Specification and Verification called GeoSpelling as a

    basis for GPS standards rebuilding. The merits of GPS standards

    have been exploited in a variety of digital product design

    applications such as coherent tolerancing process[26], evaluation

    of measurement uncertainty [27] and quantitative characterisa-

    tion of surface texture [28,29]. Srinivasan [30] identified the merits

    of unifying and standardising ad hoc approaches practiced by industry. GPS allows such unification and standardisation through

    global guidelines described in the GPS masterplan [22]. More

    recent GPS standards [31] introduced the concepts of specification

    uncertainty and correlation uncertainty that directly influence

    validation and verification.

    A symbolic language called GD&T [23] has been developed for

    describing nominal geometry of parts and assemblies and

    allowable variation in the product design and verification phase.

    GD&T brings significant benefits in design and inspection activities

    as a correct GD&T representation captures design intent and shows the functional requirements of the part as well as the method for its inspection [23]. Arguably, the most important benefit of the GD&T

    approach lies in ensuring, at the design phase, that component

    parts will assemble into the final product and function as intended [32]. Shen et al. [33] proposed a semantic GD&T representation

    model, named the constraint-tolerance-feature-graph that is

    claimed to satisfy all tolerance analysis needs. Kong et al. [34]

    formulated an approach for the analysis of non-stationary

    tolerance variation during a multi-station assembly process with GD&T considerations.

    Fig. 5. Verification in the digital and physical world (adapted from Refs. [20,21]).

    Fig. 6. Transition of designers' intent to physical realisation through GPS guidelines.

    The application of GD&T for mechanical

    design has gained widespread acceptance by industry [35].

    However, several organisations have attempted to implement

    the method without a fundamental understanding of how the

    design process is impacted [36]. Poorly applied GD&T, ambiguous

    plus/minus location or orientation controls, and sometimes no

    variation specifications are commonly encountered [37]. The need to capture functional requirements and improve the design of parts, as well as to consider the cost and quality issues defined by GD&T,

    makes this subject an even more important element of mechanical

    engineering design [38].

    In summary, the GPS [22,31] and GD&T [23] standards are vital

    for the correct and efficient verification of mechanical engineering

    designs. There are exciting new research opportunities arising

    from the utilisation of these standards to automate the bi-

    directional relationships between design specifications, process

    capability and measurement uncertainty.
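    As an illustration of the kind of bi-directional link between a GD&T specification and measurement uncertainty discussed above, the following Python sketch checks a measured hole position against a diametral position tolerance using a simple guard-banded decision rule. The function names, numeric values and the specific guard-banding convention are illustrative assumptions, not taken from the paper.

```python
import math

def position_error(dx_mm: float, dy_mm: float) -> float:
    # Diametral true-position error of a hole axis from its nominal location.
    return 2.0 * math.hypot(dx_mm, dy_mm)

def verify_position(dx_mm, dy_mm, tolerance_mm, expanded_uncertainty_mm):
    # Accept only if the measured error lies inside the tolerance zone reduced
    # by the expanded measurement uncertainty (a simple guard-banded decision rule).
    error = position_error(dx_mm, dy_mm)
    acceptance_limit = tolerance_mm - expanded_uncertainty_mm
    return error <= acceptance_limit, error

ok, err = verify_position(dx_mm=0.03, dy_mm=0.04, tolerance_mm=0.20, expanded_uncertainty_mm=0.02)
print(f"true-position error = {err:.3f} mm, conforms = {ok}")
```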

    The STEP project was launched with the objective of conserving

    the manufacturing context and developing information bridges

    between segregated CAx domains [24]. EXPRESS [39] is used to

    specify requirements on information content as it consists of

    language elements that allow an unambiguous data definition and

    specification of constraints on the data defined. The development

    of the STEP standard was governed by industry's need to overcome

    interoperability problems. The standard established a neutral data

    file format that is used for developing domain specific applications

    using application protocols (APs). For example, AP 219 [40]

    provides information requirements for analysing the dimensional

    inspection data and results of solid parts and assemblies. Fig. 7

    shows a selected set of application protocols that are vitally

    important for the communication and sharing of data required in

    design verification and validation of mechanical components.

    3.2. Standards for representing manufacturing processes

    A process in a manufacturing context is defined as a

    combination of activities that occur over a period of time in which objects participate [41]. The National Institute of Standards

    and Technology (NIST) in the USA developed the Process

    Specification Language (PSL) [42] to create a generic, neutral

    and high-level language for specifying processes and the integra-

    tion of multiple process-related applications. PSL uses the ontology

    based Knowledge Interchange Format to specify concepts,

    terminology and relationships for processes. Similarly, a data

    model for representing manufacturing processes was developed by

    NIST, which later became a part of the international standard ISO

    16100 for exchanging information between design and manufac-

    turing process planning software systems for mechanical products

    [43].

    The need for comprehensive information regarding specific

    manufacturing processes and the verification of components compelled practitioners to develop process-specific international

    standards such as DMIS [44], DML [45] and I++DME [46] for the

    exchange of inspection process information and measurement

    results in the production environment. Similarly, the BS EN ISO

    8062 series [47] and the BS EN ISO 10135 [48] series of standards

    within the GPS framework cover the requirements for casting and

    moulding processes. Another set of process specific standards is

    the ISO 14649 series [49], with parts corresponding to different

    processes; for instance, part 16 [50] for performing inspection

    operations in a STEP-NC manufacturing environment.

    3.3. Standards for representing manufacturing resources

    A typical manufacturing system consists of a range of resources

    such as machine tools, material handling systems, fixtures, robotic

    arms, and measurement systems[51]. Each resource has a distinct

    purpose and thus provides specific capabilities that are utilised in

    manufacturing decision-making. A variety of international stan-

    dards have evolved in order to utilise and exchange the

    information regarding manufacturing resources and their cap-

    abilities in a digital environment [52]. For example, ISO 13584 [53]

    with the acronym PLIB is a series of standards for the computer-

    based representation and exchange of part library data. PLIB is fully

    inter-operable with STEP [24]. Resource specific standards have

    evolved to satisfy business needs. For example, ISO 13399 [54]

    deals with the representation and exchange of cutting tool data

    and ASME B5.59-2[55]is an information model for machine tools.

    Measurement equipment related GPS standards [56,57] were

    developed to describe the acceptance tests for co-ordinate

    measuring machines and general requirements for GPS measuring

    equipment respectively.

    3.4. Standards for preserving design verification knowledge

    International standards are used to preserve and seamlessly

    transfer context specific knowledge obtained through design

    verification, within a heterogeneous manufacturing environment.

    Business sectors such as aerospace manufacturing, defence, ship

    building and military equipment manufacturing intensively invest in research and development activities and have a strong

    requirement to conserve and reuse knowledge acquired through

    the design verification processes. Consequently, ISO 10303 AP 209

    [58] has been developed by aerospace and commercial research

    organisations for associating engineering analysis data with

    geometric data. ISO 10303 AP 237 deals with the exchange of

    computational fluid dynamics (CFD) information, including

    product geometry, associated meshes defining the computational

    details and CFD boundary conditions[59].

    4. Verification and validation in the early stages of design: capture intent and confirm requirements

    The early design stages are vitally important for the correct capture of technical and lifecycle requirements arising from

    understanding and interpreting market needs. Verification is

    inherent in methods deployed during these important early stages,

    although this is not always appreciated by designers and

    manufacturing practitioners. This section outlines methods for

    design idea validation and quality function deployment (QFD) as

    well as the more technical aspects of ensuring that consistency in

    terms of key design objectives is maintained using key character-

    istics (KCs) and Design for X (DFX) techniques.

    4.1. Product idea validation and market analysis

    There are three key considerations that are applied in the early

    stages of design: (1) to prioritise customer needs (CNs) in a quantitative manner based on market analysis; (2) to select the

    best design schema; and (3) to improve communication at all

    levels of the organisation. Methods such as matrix prioritisation

    and analytical hierarchy process [60] are applied to help the

    enterprise determine where to invest the development resources to achieve maximum payoff.

    Fig. 7. Integration of designers' intents within the STEP framework.

    The traditional way is to analyse CNs systematically and to

    transform them into the appropriate product features. However, it

    is difficult to assess the performance of the transformation process

    with an accurate quantitative evaluation. Buyukozkan et al. [61]

    presented a fuzzy group decision-making approach to better align

    CNs with objectives of product development in QFD. This

    prioritisation of customer needs creates a set of criteria that is

    used for validating the final product, i.e., assessing whether the

    enterprise is building the right product, service or system.
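    As a concrete illustration of the matrix prioritisation and analytical hierarchy process methods mentioned above, the following sketch derives priority weights for three hypothetical customer needs from a pairwise comparison matrix; the needs, comparison values and consistency check are illustrative only.

```python
import numpy as np

# Pairwise comparisons of three hypothetical customer needs on the usual 1-9 scale
# (rows/columns: reliability, ease of assembly, purchase cost).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
k = np.argmax(eigenvalues.real)                 # principal eigenvalue
weights = np.abs(eigenvectors[:, k].real)
weights /= weights.sum()                        # normalised priority vector

consistency_index = (eigenvalues.real[k] - len(A)) / (len(A) - 1)
consistency_ratio = consistency_index / 0.58    # random index for a 3 x 3 matrix

print("priorities:", np.round(weights, 3))
print("consistency ratio:", round(consistency_ratio, 3))
```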

    4.2. Quality function deployment

    QFD is a customer-driven methodology for product design and

    development that underpins quality systems and has found

    extensive applications in industry via the development of a

    multiplicity of tools and systems that aid an enterprise in

    understanding the voice of the customer [60]. QFD efficiently

    translates CNs into design requirements and parts deployment

    [62]. As shown in Fig. 8, a generic QFD process consists of four phases in order to relate the voice of the customer to product design requirements (phase 1), and then translate these into parts characteristics (phase 2), manufacturing operations (phase 3), and production requirements (phase 4) [63]. During early design,

    the first and second phases of the four QFD phases are

    implemented [63] and part characteristics are defined. In

    summary, QFD is critical to design validation as it translates

    customer needs into part characteristics and production controls

    that can then be used for design verification, by forming the set of

    criteria against which product and process compliance can be

    assessed.
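    A minimal sketch of the phase-1 translation described above: a relationship matrix maps weighted customer needs onto design requirements and ranks the requirements by technical importance. The weights, scores and matrix size are hypothetical.

```python
import numpy as np

customer_weights = np.array([0.5, 0.3, 0.2])   # prioritised customer needs (from market analysis)
relationship = np.array([[9, 3, 0],            # strong / medium / no relationship scores
                         [3, 9, 1],
                         [0, 1, 9]])           # rows: needs, columns: design requirements

technical_importance = customer_weights @ relationship
ranking = technical_importance.argsort()[::-1]

print("technical importance of design requirements:", technical_importance)
print("priority order (most important first):", ranking)
```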

    4.3. Functional decomposition and flow analysis

    The verification and validation process of a function can be viewed as functional decomposition and flow analysis which aim

    to break overall functionalities down to functionally independent

    sub-functions as finely as possible [64]. A functional structure can be validated by considering both logical and physical dependencies

    and confirming matching inputs and outputs among sub-functions

    [65]. Several flow analysis methods such as bond graph and Petri

    nets [66] and modularity methods such as function structure

    heuristic method [67], design structure matrix [68] and modular

    function deployment [69] are applicable to the verification and

    validation of functional structures.
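    The input/output matching check mentioned above can be sketched in a few lines: every flow a sub-function consumes must be produced by another sub-function or supplied externally. The sub-functions and flows below are invented for illustration.

```python
# Each sub-function lists the flows it consumes ("in") and produces ("out").
sub_functions = {
    "convert energy":  {"in": {"electric power"}, "out": {"torque"}},
    "transmit torque": {"in": {"torque"},         "out": {"rotation"}},
    "guide tool":      {"in": {"rotation"},       "out": {"machined surface"}},
}
external_inputs = {"electric power"}

def unmatched_inputs(functions, externals):
    # Inputs that no other sub-function (and no external source) provides.
    produced = set().union(*(f["out"] for f in functions.values())) | externals
    return {name: f["in"] - produced for name, f in functions.items() if f["in"] - produced}

print(unmatched_inputs(sub_functions, external_inputs))   # {} means the structure is consistent
```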

    In an era of increasing product sophistication, engineered

    systems are likely to become more complicated, increasing the

    functional requirements [3]. Suh [3] defined complexity as the

    measure of uncertainty in achieving the functional requirements of a complex system and outlined how axiomatic design can be used

    to reduce design complexity while satisfying the functional

    requirements within given constraints. As such, axiomatic design

    can enhance the functional validation of designs.

    4.4. The use of key characteristics in early design

    Variability in production and measurement procedures can

    result in lower than expected quality levels, compromised product

    performance and increased rectification costs. Key characteristics

    (KCs) are being used to help identify and reduce important root

    causes of variability [70]. Research focused on KCs has had a

    significant impact in improving product and process performance

    in the context of the lifecycle [71,72]. KC methodologies have been introduced into the product development practices of world-class

    companies [73]. Thornton [74] categorised product related KCs

    according to the level of the product model as KCs belonging to:

    product, subsystem, component, feature and feature face. Thorn-

    ton [75] proposed a method for variation risk management in

    aircraft and automotive production by establishing a direct link

    between KCs and the type of inspection process used for

    verification.

    The use of KCs for manufacturing planning during early design

    enhances process verification. Dai and Tang [76] defined verification parameters by prioritizing KCs. Whitney [77] proposed a KC-oriented

    method for assembly planning by selecting the necessary part

    features, tools and machine capabilities. Wang and Ceglarek [78]

    developed a KC based methodology for quality-driven sequence

    planning. Suri et al. [79] introduced a technique based on key

    inspection characteristics to enhance process capability. Maropou-

    los et al. [80] proposed the use of aggregate product models as a

    method for the early integration of dimensional verification and

    process planning for complex product design and assembly.

    Maropoulos et al. [81] outlined the verification and validation

    related benefits arising from the integration of measurement and

    assembly using a digital enterprise framework that links key

    elements of the product, process and resource models.
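    As a simple, hypothetical illustration of variation risk management with KCs, the sketch below ranks key characteristics by expected loss, i.e. the probability that the KC drifts out of specification multiplied by the cost of that nonconformance; the KC names and figures are invented.

```python
# Hypothetical key characteristics with an estimated out-of-specification
# probability and the cost incurred when that KC is nonconforming.
key_characteristics = [
    {"kc": "wing-spar hole pattern", "p_out_of_spec": 0.020, "cost": 50_000},
    {"kc": "door gap and flush",     "p_out_of_spec": 0.080, "cost": 2_000},
    {"kc": "bearing bore diameter",  "p_out_of_spec": 0.005, "cost": 15_000},
]

for kc in key_characteristics:
    kc["expected_loss"] = kc["p_out_of_spec"] * kc["cost"]

for kc in sorted(key_characteristics, key=lambda k: k["expected_loss"], reverse=True):
    print(f'{kc["kc"]:<24} expected loss per unit: {kc["expected_loss"]:8.1f}')
```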

    4.5. Design for X

    Design for X (DFX) is an umbrella term used to denote design

    philosophies and methodologies which aim to improve designs by

    raising the designer's awareness of a certain product lifecycle value or characteristic represented by X [82]. The design

    considerations applied in DFX have a direct relationship to the

    verification methods for the X objective.

    Design for Manufacture (DFM) [77,83] includes a wide range of

    design rules and guidelines defined from the perspective of

    improving the manufacturability of parts. For example, the design

    guidelines for end milling stipulate that milled features should be

    designed in such a way that the end mill required is limited to a 3:1 length-to-diameter ratio; the reason being that longer end

    mills are prone to chatter that deteriorates surface quality.

    Applying this DFM guideline will impact directly on end milling

    process capability in terms of surface quality and this will influence

    the process verification procedure, such as the sampling method

    deployed and the method of surface roughness measurement.

    The impact of Design for Assembly (DFA) [77,83] on verification

    is also direct. For instance, the part reduction of an electro-

    mechanical sub-assembly as a consequence of applying DFA may

    result in more complex parts that have additional features. This

    will directly change the inspection plan in terms of the number,

    type and sequence of measurement operations, the measurement

    points per operation and the selection of the measuring device.

    Also, DFA for automated assembly stipulates design methods so

    that parts can be supplied in the right orientation and do not tangle

    with other parts [84]. This again increases process yield and

    influences the sampling method deployed for assembly verifica-

    tion data collection and analysis.

    Design for Ergonomics is important in labour intensive

    industries [85] and has a noticeable and positive effect on process verification, as controls and displays are re-designed so that

    readings cannot be misinterpreted. Design for changeover is vital

    in high-variety environments [86] and improves process verifica-

    tion as a consequence of high repeatability set-ups.

    Fig. 8. Four-phase process planning by QFD [63].

    Design for 6-sigma (DFSS) is a design activity that aims to

    generate high-capability, 6σ processes before production commences. DFSS is usually deployed within QFD and is also referred to as Define-Measure-Analyse-Design-Verify [87]. This is an

    explicit reflection of the inherent ability of DFSS to enhance the

    verification and validation of processes.

    There are considerable research challenges in developing new

    methodologies that link DFSS with KCs, so that key product

    features and dimensions are specified and evaluated by applying process capability criteria. Such methods would need to be directly

    integrated with the definition of GD&T, so that datum points, key

    dimensions, inspection methods and process capability are

    interlinked in an unambiguous manner.
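    To make the link between specified tolerances and process capability concrete, the following sketch computes the Cp and Cpk indices for a key dimension from sample data; the measurements and limits are illustrative, and 6-sigma capability is conventionally associated with Cp = 2.0.

```python
import statistics

def process_capability(samples, lsl, usl):
    # Cp and Cpk of a key dimension, assuming approximately normal data.
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

measurements = [10.01, 9.98, 10.02, 10.00, 9.99, 10.03, 9.97, 10.01]   # mm
cp, cpk = process_capability(measurements, lsl=9.90, usl=10.10)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```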

    5. Designverificationand validation in the digital environment

    Digital prototyping helps manufacturers to virtually simulate a product and its associated lifecycle phases such as product

    manufacture, assembly and functionality, before the product is

    physically realised. This gives manufacturers an excellent

    opportunity to visualise and anticipate aspects of the physical

    performance of a design with less reliance on costly physical

    experimentation. Physical prototyping and testing is still a

    requirement, especially for complex products. However, the clear

    current industry trend is toward reducing physical testing by

    replacing suitable aspects by virtual testing and verification. The

    digital verification results are compared with the experimenta-

    tion results; this validates and certifies computational code

    embedded in a digital prototype. Thus, a validated digital

    prototype can be utilised for verifying the physical performance

    of the product manufactured in the globally dispersed supply

    chain.

    5.1. Digital mock-up

    A digital mock-up (DMU), sometimes referred to as a virtual

    prototype, is essentially a digital simulation of a physical

    prototype and is increasingly used for the verification of product functionality. DMU is emerging as the core design collaboration tool, around which different engineering teams verify the product

    through its entire lifecycle, from production planning to func-

    tional testing, maintenance and recycling [88,89]. Multiple

    engineering teams can now operate in parallel, working on the

    same DMU, and this facilitates the enterprise wide application of

    concurrent engineering practice. Recently, the usage of DMU has

    increased, mainly among aerospace and automotive companies,

    owing in a large part to the availability of more robust models and enhanced computing resources. For instance, the Chrysler Corporation used DMU to reduce the automobile development cycle by half, while resolving 1200 potential issues before the first physical mock-up was built [90]. Using proprietary DMU systems, Boeing was able to reduce errors and rework on its 777 airliner by 70-80%, saving 100,000 design hours and millions of dollars [90].

    Similarly, Airbus is also increasingly exploiting the advantages of

    DMU[91].

    For complex engineering products, the use of DMU is not

    without problems, the largest of which is ensuring data quality

    between all of its suppliers, customers and design offices. For

    instance, data loss when transferring from one CAD format to

    another remains a major issue[91].

    In summary, DMU is a powerful verification tool and research

    for its development should be based on: (i) enhanced capabilities

    to simulate functional performance using functional mock-up

    methods, and (ii) the solid foundation of international standards.

    The existing STEP (ISO 10303) standard captures geometric data adequately, while data pertaining to history-based modelling [92], assembly [93], and kinematic linkages are less well represented [94]. ISO 10303-105 [95] is a good base for kinematic structure representation and supports case studies for machine tool modelling [96].

    5.2. Tolerance analysis and optimisation

    The primary function of tolerance setting is to balance the

    product functionality with economic factors [97]. Excessively tight

    tolerances will add cost due to more complex processing stages

    whereas inadequately wide tolerances will result in insufficient

    quality and costly rework. Tolerances are vitally important in the

    process of dimensional verification of mechanical parts and

    assemblies as the uncertainty of the measurement instrument needs to be an order of magnitude smaller than the tolerance value.

    Historically, tolerances are decided on the basis of legacy practice

    within a company and as Maropoulos et al. [81] suggest, many

    tolerances are set based on process capability and not on the study

    of tolerance build-up during assembly. A review of tolerancing

    methods by Singh et al. [98] identifies the main academic and

    industrial practices dealing with tolerancing as belonging to either

    tolerance analysis or tolerance synthesis. In essence, tolerance

    analysis attempts to estimate the assembly tolerance stack-up,

    while synthesis considers the assembly and product requirements

    and distributes the assembly tolerances accordingly [99].

    5.2.1. Modelling assembly tolerances

    Dantan and Qureshi [100] describe statistical tolerance analysis

    as a 2D method that computes the probability that the product can

    be assembled and will function under a given set of tolerances. The

    assembly response function can be expressed as a function of the

    individual and independent component dimensions [101]. As

    shown in Fig. 9, there are two basic approaches to tolerance

    analysis, the worst-case method and the root sum square method

    [98]. The worst-case method assumes that the tolerances are at

    their respective extremities and the stack-up is consistently

    accumulative (i.e., there is no tolerance cancellation). This is a

    pessimistic estimate, but due to its simplicity it is still relevant

    today; however, it can only be employed in one dimension at a time [102]. The root sum square (RSS) method conversely gives a

    rather optimistic assembly tolerance estimate, as it is a simple

    statistical model based on the normal distribution. As before, the

    RSS method is only suited to single-dimensional tolerance problems [103].
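    The two estimates can be stated in one line each: for symmetric contributor tolerances t_i, the worst-case stack-up is the sum of the t_i, and the RSS stack-up is the square root of the sum of the squared t_i. A minimal sketch with invented values:

```python
import math

tolerances = [0.10, 0.05, 0.08, 0.12]   # symmetric tolerances (mm) of the chain contributors

worst_case = sum(tolerances)                            # all contributors at their extremes
rss = math.sqrt(sum(t ** 2 for t in tolerances))        # statistical estimate (normal, independent)

print(f"worst-case stack-up: +/-{worst_case:.3f} mm")
print(f"RSS stack-up:        +/-{rss:.3f} mm")
```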

    A more advanced method that is somewhat more indicative of

    tolerance stack-up in the physical world is the Spotts' modified approach [104]; this is essentially an average of the worst-case and the RSS models. The correction factor approach is also experimentally based, scaling the RSS estimate to make it a more realistic

    figure. However, this method has particular limitations if the

    tolerances/dimensions in the stack-up vary greatly and/or are of

    small quantities [98].

    More complex assembly response functions and non-normal

    tolerance distributions can cause difficulties when using tradi-

    tional analytical techniques as a high number of samples is

    required to create an accurate estimation of the assembly

    response. In such cases, Monte Carlo Simulation (MCS) has become a viable solution. MCS can be applied when the assembly response

    function cannot be expressed analytically as a linear model and

    also when dealing with the effects of tolerance stack-up within

    kinematic systems [105]. In the kinematic approach [106], the

    tolerance chain is treated as a kinematic loop, with the under-

    standing that the movements of the links are actually small

    displacements within prescribed tolerance zones.

    Fig. 9. Tolerance analysis [98].

    This approach involves modelling the small displacements using small displacement torsors [107] and modelling the effects that local small displacements have on the remote functional requirement using

    Jacobian transforms [108]. Desrochers et al. [109] proposed a

    unified Jacobian-torsor model for statistical or worst-case toler-

    ance analysis or synthesis [110].
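    A minimal Monte Carlo sketch of the situation described above, where the assembly response is non-linear and an analytical stack-up is awkward; the response function, distributions and values are invented for illustration.

```python
import math, random, statistics

def assembly_response(housing_mm, pin_mm, tilt_deg):
    # Clearance of a tilted pin in a housing: non-linear in the tilt angle.
    return housing_mm - pin_mm / math.cos(math.radians(tilt_deg))

random.seed(1)
samples = []
for _ in range(100_000):
    housing = random.gauss(10.05, 0.010)   # mm
    pin = random.gauss(10.00, 0.008)       # mm
    tilt = random.gauss(0.0, 0.5)          # degrees
    samples.append(assembly_response(housing, pin, tilt))

mean = statistics.fmean(samples)
sigma = statistics.stdev(samples)
p_interference = sum(s < 0.0 for s in samples) / len(samples)
print(f"clearance = {mean:.4f} +/- {sigma:.4f} mm, P(interference) = {p_interference:.5f}")
```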

    5.2.2. Digital tolerancing methods and tolerance optimisation

    Optimizing tolerances aims to maximise the functional

    performance and economic factors associated with tolerances. The economic factor is often expressed in a quality loss function [111] and in most applications the Taguchi loss function is used.

    Govindaluri et al. [97] consider the quality loss from the

    perspective of the customer and the manufacturing and rejection

    costs by the manufacturer. When incorporating Taguchi's quality loss function, Cheng and Maghsoodloo [112] found that when a component's mean varies, only the quality loss associated with that component will be changed; whereas when a component's

    variance shifts, the optimal allowance, tolerance costs, and quality

    losses associated with each component will be affected. Tolerance

    optimisation methods are classed as either deterministic or

    stochastic; the former considers the nominal values of design

    variables with respect to given input values, using a single point for evaluation, whereas the latter considers the statistical variation of the design variables [113,114].
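    The quadratic (Taguchi) loss mentioned above has a simple expected value, E[L] = k(sigma^2 + (mu - m)^2), with k set by the cost of a part at the tolerance limit. A hypothetical sketch:

```python
def expected_taguchi_loss(target, mean, variance, cost_at_limit, tolerance):
    # E[L] = k * (variance + (mean - target)^2), with k = cost_at_limit / tolerance^2.
    k = cost_at_limit / tolerance ** 2
    return k * (variance + (mean - target) ** 2)

# Illustrative numbers: tightening the process (smaller variance) lowers the expected
# loss, which can then be traded off against the extra cost of the tighter tolerance.
loss = expected_taguchi_loss(target=10.0, mean=10.01, variance=0.02 ** 2,
                             cost_at_limit=5.0, tolerance=0.1)
print(f"expected loss per part: {loss:.2f}")
```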

    Computer Aided Tolerancing systems can provide a simulation

    platform for modelling the effects of tolerance setting within a

    manufacturing process or assembly[115,116]. Tolerance analysis

    and synthesis are considered within a DMU to include aspects of

    tolerance build-up and assembly clashes[117]. Tolerance design

    methods have been summarised by Singh et al. [99], as shown in Fig. 10, including traditional and advanced methods.

    5.3. Features for machining CAD/CAM/CAPP verification

    In the last two decades, extensive research efforts in various

    segments of CAx integration using feature technology have been

    reported especially for the integration of CAD and CAM. Salomons

    et al. [118], and Subrahmanyam and Wozny [119] have identified three major approaches of feature technology, namely: interactive

    feature definition, automatic feature recognition and design by

    features.

    In interactive feature definition, features are defined with

    human assistance after creating the geometric model. Automatic

    feature recognition involves the comparison of the geometric

    model with pre-defined generic features. Many approaches for

    feature recognition have been reported; Lin et al. [120] extracted manufacturing features present in a feature-based design model, while ElMaraghy and ElMaraghy [121] introduced the concept of

    functional and manufacturing features.

    Presently, the design by features approach has become the

    core technology for product modelling. Feature definitions

    (templates) are placed in the feature library, from which features are instantiated by specifying dimension parameters, location

    parameters and application related attributes. Feature-based

    design has made a direct and very positive impact on part

    verification as it has helped to codify and standardise both the

    manufacturing processes and the inspection methods used for

    types of features, thus improving design verification. Research is

    still required to provide coherence in relating inspection systems

    and methods to processes, especially in cases where there is a wide

    range of measurement options available, such as the verification of

    machined features, or complex assembly features.
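    A small sketch of the "design by features" idea described above: a feature template fixes the dimension, location and application attributes, and those attributes drive a standardised verification routine. The class, attributes and operations are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class HoleFeature:
    # A feature template instantiated with dimension, location and application attributes.
    diameter_mm: float
    depth_mm: float
    location_xy: tuple
    position_tolerance_mm: float = 0.2
    process: str = "drilling"   # a capability-proven, standardised process for this feature type

    def inspection_operations(self):
        # The feature's attributes define a repeatable verification routine.
        return [f"measure diameter ({self.diameter_mm} mm nominal)",
                f"check true position within {self.position_tolerance_mm} mm"]

hole = HoleFeature(diameter_mm=8.0, depth_mm=20.0, location_xy=(35.0, 12.5))
print(hole.inspection_operations())
```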

    Case [122] used methods associated with external approach directions for features to enhance process capability, and Wong and Wong [123] used volumetric machining features for part model-

    ling in their feature-based design system. Several feature-based

    design systems are reported with a focus on prismatic machining

    process. In the case of machining, feature-based design allows the

    corresponding definition of standardised machining processes

    that are proven in terms of process capability. This is of major

    significance, as it allows rapid verification of a design in terms of its

    modelling entities and the corresponding machining process.

    Feature-based methods had a profound effect on computer

    automated process planning (CAPP) for machining. Gu et al. [124]

    identified the sequence of the machining process in four stages, namely:

    feature extraction, feature prioritisation, clustering of operations

    and the identification of precedence relationships. Laperriere and ElMaraghy used precedence graphs for assembly sequence planning [125]. Qiao et al. [126] used a genetic algorithm method to sequence

    the machining operations for prismatic parts. Li et al. [127] and Ong

    et al. [128] tried to solve the process planning problems by

    combining the non-traditional optimisation techniques, namely

    genetic algorithm and simulated annealing. Azab and ElMaraghy

    used quadratic assignment for reconfiguring process plans [129].
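    The precedence relationships mentioned above are naturally handled as a directed graph, and any topological order of that graph is a feasible operation sequence. A minimal sketch with invented operations, using Python's standard graphlib:

```python
from graphlib import TopologicalSorter   # Python 3.9+

# Each operation lists the operations that must be completed before it (invented example).
precedence = {
    "rough mill top face": [],
    "drill datum holes": ["rough mill top face"],
    "finish mill pocket": ["rough mill top face"],
    "tap datum holes": ["drill datum holes"],
    "inspect pocket": ["finish mill pocket"],
}

sequence = list(TopologicalSorter(precedence).static_order())
print(sequence)   # one feasible machining/inspection sequence respecting all constraints
```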

    The common problems and characteristics of these CAPP approaches

    for machining are one or more of the following:

    - Feature recognition is used in most of the approaches. Hence, the feature-based databases of commercial software are not utilised.
    - After recognition, the features (mostly design oriented) are converted into application (manufacturing) features using a knowledge base or heuristic rules. The common attributes are not directly transferred to application features.
    - The process plans produced by these systems consider only a single machine set-up. But, in the factory environment, several machines may be used in different set-ups.
    - The precedence constraints in the component are represented with respect to features and not with respect to low-level entities, namely operations.
    - The set-ups were optimised with respect to the tool approach directions. This in turn reduces the search space or loses feasible design points.

    To conclude, process planning research has not as yet reached

    the maturity of key methods to focus on verification and validation

    in an integrated manner. The feature recognition approach is theoretically the most generic approach to process planning, but it

    partly negates the design and process standardisation and

    verification benefits of feature-based design.

    5.4. Virtual assembly modelling and simulation

    Virtual or digital assembly modelling is a powerful and effective

    technology for the verification of assemblies during the digital

    design phase. Assembly process planning (APP) is a core

    component of virtual assembly modelling as it deals with assembly

    constraint identification, equipment selection and sequence

    generation [130]. Wang and Ceglarek [131] proposed an assembly

    sequence planning method which comprises: (1) sequence

    generation for predetermined line configurations using k-piece mixed-graph representation of assembly; (2) dimensional quality

    model of variation propagation for assembly processes with

    compliant parts; and (3) evaluation of sequences based on the

    multivariate process capability index.

    Fig. 10. Tolerance design methods [99].

    Using Virtual Reality (VR), the 3D digital mock-up of the

    product can be manipulated with the assistance of VR interactive

    devices. It has, therefore, attracted great interest from researchers

    dealing with assembly planning. The advantages of applying

    virtual engineering for assembly process planning were sum-

    marised by Jun et al. [132]. From the concurrent engineering

    perspective, it is preferable to implement the assembly and

    disassembly process in a virtual environment at an early stage of

    design, when only the geometric forms are determined and the

    functions can still be defined [132,133].

    The Virtual Assembly Design Environment (VADE) was created

    to demonstrate the potential and the challenges involved in the

    design and manufacturing processes [134]. Fig. 11 illustrates the

    usage scenario of VADE. The VADE system allows the user to

    perform assembly processes by hand and with assembly tools on the virtual product, using data imported from a parametric CAD

    system. By maintaining a dynamic correlation with a CAD system,

    the design information created during the virtual assembly process

    is updated at the end of using VADE.

    Banerjee et al. [135] studied the effectiveness of VR in assembly

    planning by comparing: blueprints, a non-immersive desktop VR

    environment and an immersive projection-based VR environment.

    The results showed that the completion time of the assembly

    process was approximately halved by utilising VR. An Augmented

    Reality (AR) based human-computer interface was developed by

    Ong et al. [136] to provide an immersive and intuitive environment. Unlike VR, the assembly design and planning using AR can be

    verified by manipulating the virtual prototypes in the real

    assembly environment, which will decrease the possibility of re-

    designing and re-planning.

    5.4.1. Digital tooling and fixturing for assembly

    Digital assembly modelling is now well established in the

    advanced engineering industries, like aerospace and automotive,

    for the design of assemblies and their integration with the design

    of tooling and the associated jigs and fixtures. Commercial

    software systems allow the seamless integration of product,

    process and resource models [137]. The data generated during

    assembly tolerance analysis can be utilised by tool designers to

    define appropriate tooling tolerances. Such systems are also beingdeployed within ITER the nuclear fusion project to model the

    manipulation of cassette tooling, the loading of which is robot

    controlled [138]. Additionally, the digital mock-up of tooling can

    simulate accessibility issues and lines of sight for an optical

    measurement system[139].

    Digital fixturing is a key enabling technology for low cost

tooling that will enhance industry's capability for batch production

    and customisation of products [140]. As an extension from the

    established methods of rapid prototyping (RP) from a DMU to a

    physical mock-up, a range of rapid tooling applications are being

    developed [141]. An alternative to rapid tooling is to employ

    reconfigurable tooling; this generally requires modular compo-

    nents that allow a virtually unlimited number of tooling

configurations. Ceglarek et al. [142] extended the N-2-1 fixture layout design methodology by introducing a movability restraint

    condition which is essential for material handling fixture design.

    Kong and Ceglarek[143]addressed a fixture workspace synthesis

    method for reconfigurable assembly systems. Phoomboplab and

    Ceglarek [144] proposed a GA and low-discrepancy sampling

    technique-based optimal design space reduction method to

    optimise the locator positions in a multi-station assembly system

    while ensuring the robustness of the fixturing system in terms of

the product's dimensional quality.

    5.4.2. Stream-of-variation modelling and design synthesis

    Stream-of-Variation Analysis (SOVA) is a mathematical model

to describe the relation between final product quality and process parameters of complex multistage assembly [145,146]. SOVA can

    predict potential downstream assembly problems, based on

    evaluations of the design using a large array of process variables.

    By integrating product and process design in a pre-production

    simulation, SOVA can head off individual assembly errors that

    contribute to an accumulating set of dimensional variations, which

    ultimately result in out-of-tolerance parts and products. Once in

    the ramp-up stage of production, SOVA can compare predicted

    misalignments with actual measurements to determine the degree

    of mismatch in the assemblies and diagnose the root causes of the

    errors[145,146].
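Although the cited papers should be consulted for the exact formulation, stream-of-variation models are commonly written in a linear state-space form of the following kind (the notation here is illustrative and varies across the literature):

```latex
x_k = A_{k-1}\,x_{k-1} + B_k\,u_k + w_k, \qquad
y_k = C_k\,x_k + v_k, \qquad k = 1,\ldots,N
```

where x_k collects the part deviations after station k, u_k the fixture and process deviations introduced at station k, y_k the measurements taken at that station, and w_k, v_k unmodelled disturbances and sensor noise. Propagating variation through the A_k, B_k and C_k matrices is what allows downstream dimensional problems to be predicted before production and, during ramp-up, root causes to be estimated from the measured y_k.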

    Individual design tasks must be integrated in order to optimise

    the design of the entire system. Phoomboplab and Ceglarek [147]

    proposed a design synthesis method based on a hybrid design

    structure matrix which integrates design tasks with design

    configurations of key control characteristics, especially for

    dimensional management in multistage assembly systems. The

method can generate design task sequences to minimise

    simulation time as well as benchmark design task sequences in

    terms of dimensional quality improvement.
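As a purely illustrative sketch of the sequencing idea (not the authors' hybrid design structure matrix method, which also resolves coupled task groups and benchmarks sequences by quality improvement and simulation time), ordering design tasks so that each is executed after its prerequisites can be expressed as a topological sort over a hypothetical dependency structure:

```python
from graphlib import TopologicalSorter

# Hypothetical design tasks and their prerequisites (a simple stand-in for a
# design structure matrix of key control characteristics).
tasks = {
    "datum scheme definition": set(),
    "tolerance allocation": {"datum scheme definition"},
    "fixture layout design": {"datum scheme definition", "tolerance allocation"},
    "joining sequence selection": {"fixture layout design"},
}

# One feasible task sequence that respects all dependencies.
print(list(TopologicalSorter(tasks).static_order()))
```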

    5.5. Digital measurement modelling and planning

    5.5.1. Measurement and inspection planning techniques

    The measurement process, often called inspection process, is

    now a vital element of integrated design and manufacturing[148].

    Computer Aided Inspection Planning (CAIP) systems have been

    developed to accomplish the measurement planning task by the

following generic procedures: (1) CAD interface and feature recognition, (2) determination of the inspection sequence of the

    features of a part, (3) determination of the number of measuring

    points and their locations, (4) determination of the measuring

    paths, and (5) simulation and verification [149]. The stages of CAIP

for Co-ordinate Measuring Machines (CMMs) are defined as:

    establish the best sequence of inspection steps, the detailed

    inspection procedure of each feature, feature accessibility by

    probes, probe path planning and collision checking, generating the

    CMM control commands, and the post-processing of measured

    data such as statistical and cost analysis [150].
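To make step (2) above concrete, the following is a deliberately simplified, hypothetical sketch of ordering inspection features by a greedy nearest-neighbour rule to shorten the probe path; production CAIP systems additionally account for probe accessibility, probe changes and collision avoidance:

```python
import numpy as np

def greedy_inspection_sequence(feature_points, start=np.zeros(3)):
    """Illustrative greedy ordering of inspection features: repeatedly visit
    the nearest unvisited feature to shorten the overall probe path."""
    remaining = list(range(len(feature_points)))
    current = start
    order = []
    while remaining:
        nearest = min(remaining,
                      key=lambda i: np.linalg.norm(feature_points[i] - current))
        order.append(nearest)
        current = feature_points[nearest]
        remaining.remove(nearest)
    return order

# Hypothetical feature locations (mm) on a prismatic part.
features = np.array([[120.0, 40.0, 10.0],
                     [10.0, 15.0, 5.0],
                     [60.0, 80.0, 5.0],
                     [15.0, 70.0, 20.0]])
print(greedy_inspection_sequence(features))
```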

    The first generation of inspection planning systems was

    developed by Hopp[151]and ElMaraghy and Gu[152]. Automatic

inspection planning for dimensional and geometric inspections has

two distinct levels: macro- and micro-level planning [153,154]. Subsequently, Lee et al. [155] divided the planning

    process into two steps: global inspection planning that is focused

    on the generation of an optimum inspection sequence and local

inspection planning that is focused on minimising errors and times

    throughout the measurement process.

    Research in CAIP falls into two categories: (a) tolerance-driven

    inspection process planning and (b) geometry-based inspection

    process planning [148]. The former considers inspections on

    features with allocated tolerance requirements while the latter

    aims to conduct an entire geometry inspection by comparing the

    obtained complete geometric description of a part or product with

    the design model. The geometry-based CAIP systems theoretically

    offer a more coherent inspection process but at a high time and

cost [148]. Recent research has aimed at automating the recognition of inspection features by extracting them

    from the CAD model directly. Similar research concerning feature

    clustering, probe accessibility and orientation analysis dominated

    research interest for CMM-based inspection planning carried out

by Limaiem and ElMaraghy [156], Zhang et al. [157] and Hwang et al. [158].

Fig. 11. The VADE usage scenario [134].

    With the rapid development of artificial intelligence and

    knowledge-based techniques, Expert Systems, Neural Networks

    and Fuzzy Logic were used to automate the measurement planning

    process. The expert system developed by Moroni et al. [159]

    tackles the problem of selecting touch probes and generating the

    measurement configurations. Lu et al. [160] and Hwang et al. [158]

    employed artificial neural network techniques to obtain the

    optimum inspection sequence while Beg and Shunmugam

[161,162] achieved the same objective utilising Fuzzy Logic on a

    prismatic part inspection process. Mohib et al. [163] used

    knowledge rules to select the most appropriate probe type and

    optimised the planned inspection tasks using a hybrid laser/CMM

    for complex geometries.

    5.5.2. Metrology process modelling for verification planning

    Process modelling is an essential technology for design

    evaluation and process planning based on the codification of

    engineering knowledge and analytical methods [164,165]. There is

a scarcity of metrology process models for measurement planning and this may be due to the traditional industrial perception of

    metrology simply being a verification step, rather than being an

    essential element of the production process[166]. Moreover, new

    frameless metrology systems have been integrated with produc-

    tion and assembly, enhancing the need for developing a process

    model to codify their capabilities [80,81].

    Maropoulos et al. [166]proposed a theoretical framework for

    the development of metrology process models for integrating

    product design with assembly planning, based on the Digital

Enterprise Technology methodology [167,168]. Fig. 12 shows the

    metrology framework, with the metrology process model posi-

    tioned central to the integration of the design verification process

    with the verification of assembly operations and the subsequent

    deployment of measurement systems that support measurement-assisted automation. The framework explicitly recognises the need

to co-ordinate the digital verification aspects (left part of Fig. 12),

    with those that involve the physical deployment of measurement

    equipment for product and process verification (right part of

    Fig. 12)[166,168].

    Industry requires the definition of new research projects

    addressing the development and evaluation of integrated metrol-

    ogy and assembly methods and systems that offer superior

    positional and orientation accuracy, with in-built verification

    capability. Such systems must be fully compliant with relevant

standards and best practice guides including: ISO GUM [169],

    ASME B89.4.19[170]and STEP (ISO 10303) [24].

5.5.3. Measurement and inspection equipment selection

A vitally important stage in digital verification planning is

    the identification and selection of inspection equipment. This

    largely refers to measuring systems deployed for dimensional and

    shape validation of parts and assemblies. There is a very wide

    spectrum of physical scale and accuracy requirements for which

    such systems need to be selected covering industrial production

    from small parts (measured in millimeters) to large, complex

    products such as aircraft, ships, and wind turbines[166,171,172].

    New techniques such as absolute length measuring interferometry

    and six-degrees-of-freedom probes are frequently combined with

    more traditional systems such as CMMs to cover the dimensional

    and shape verification needs of modern products[171,172]. The

    selection process needs to be based on metrology process models

    and employs multiple criteria with a key requirement being the

    definition and minimisation of measurement uncertainty

    [163,171]. Cai et al. [168,173] proposed an approach for large

volume metrology instrument selection based on measurability

    characteristics (MCs) analysis. Inspired by the concept of quality

    characteristics, MCs can be used for instrument selection on the

    basis of measurement capability, cost and technology readiness.

Muelaner et al. [174] proposed an approach employing a data filtering technique for instrument selection, while Cuypers et al. [175] specified the task requirements and part restrictions before selecting instruments manually.
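The following is a much-simplified, hypothetical sketch of the filtering-and-ranking idea behind such selection approaches; the instrument data and the 1:4 uncertainty-to-tolerance rule of thumb are illustrative assumptions, whereas MC-based selection uses a richer set of measurability characteristics:

```python
from dataclasses import dataclass

@dataclass
class Instrument:
    name: str
    max_range_m: float      # largest usable working volume
    uncertainty_mm: float   # expanded uncertainty over that range (illustrative)
    cost_per_hour: float

# Hypothetical candidate list and task requirements, for illustration only.
candidates = [
    Instrument("laser tracker", 40.0, 0.05, 120.0),
    Instrument("iGPS network", 40.0, 0.25, 90.0),
    Instrument("portable CMM arm", 3.5, 0.03, 60.0),
]
required_range_m = 15.0
feature_tolerance_mm = 0.5

# Filter by measurement capability: sufficient range and an uncertainty no
# worse than a quarter of the feature tolerance (a common rule of thumb),
# then rank the feasible instruments by cost.
feasible = [c for c in candidates
            if c.max_range_m >= required_range_m
            and c.uncertainty_mm <= feature_tolerance_mm / 4.0]
best = min(feasible, key=lambda c: c.cost_per_hour)
print("Selected instrument:", best.name)
```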

There are exciting new research challenges in generic measurement systems modelling and capability derivation that

    are essential for instrument selection and measurement planning

    within CAIP. Research is also needed for the integration of CAIP

    with CAPP, based on the coherent modelling of capabilities.

    5.6. Computational and virtual methods for functional product

    verification and optimisation

    5.6.1. Structural function verification and finite elements analysis

The growing interest in reducing reliance on testing and cutting the

    cost and time of certification of structural systems has pushed the

    academic and industrial world toward the development of Virtual

    Testing Labs (VTL) where the Finite Element Analysis (FEA)

technique is employed to predict the possible behaviour of real-world structures until failure (Fig. 13). However, to reduce and

    replace physical testing by virtual FEA testing, procedures must be

    put in place to demonstrate that the virtual tests are able to

    replicate actual tests and to generate the necessary confidence

    within the design and certification communities.

    The first stage of FEA is the idealisation process which takes

the real-life structural design problem and turns it into an idealised

    mathematical model, the Finite Element Model (FEM). The second

    stage involves selecting appropriate finite elements, mesh layouts

    and solution algorithms to define the structural behaviour of the

idealised mechanical system.

Fig. 12. Overview of the theoretical framework for integrating measurement with assembly planning.

Fig. 13. Virtual testing procedure.

The creation of an error-control procedure to facilitate the use of FEA in solving structural

    design problems has been extensively studied. Other methods for

    creating error-free FE models may involve the use of sensitivity

    analyses[176]. Besides these intrinsic FEA errors, other uncertain-

    ties are present such as the experimental boundary conditions,

    exact panel geometry and presence of initial imperfections that

    affect the accuracy of the virtual testing. Such issues are more

    pronounced for structures made of newly developed materials

such as hybrid materials and fibre reinforced plastics (composites), due to the high dimensional variability of the resulting products. This is becoming

    an important issue for thick large-scale structures where

the measurement of residual stresses and distortion is a challenging task. To solve these issues, upstream 3D digital measurement and

    quality control techniques need to be employed in a synergistic

    manner with the finite element method for accurate representa-

    tion of structural and material behaviour under in-service loads

    (static, vibration, cyclic loads and impact).

    While classical computational stress analyses provide good

    predictions in the elastic regime, they have not previously

    achieved predictive accuracy in the presence of damage and

    fracture. This limitation is starting to be overcome by new

    simulation strategies, which combine advances in the generality

    and physical realism of damage formulations with new experi-

    mental techniques for probing the physics of failure at the micron

    and nanometer scales. These research advances are making

    possible high-fidelity virtual tests, where the mechanical beha-

    viour of a structure up to ultimate failure is computed through

    simulations of the physical processes involved at the atomic [177],

    microscopic and structural scales[178].
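To make the two-stage idealisation-and-solution process described above concrete, the sketch below idealises a bar under axial load into a one-dimensional finite element model and solves it. It is a minimal illustration only, far removed from the damage-capable, multi-scale virtual tests discussed above; the material and load values are assumed.

```python
import numpy as np

# Minimal 1D bar FEA sketch: two linear elements, fixed at the left end,
# axial point load at the right end (illustrative values).
E = 210e9        # Young's modulus (Pa), assumed steel
A = 1e-4         # cross-sectional area (m^2)
L = 1.0          # total bar length (m)
n_elem = 2
n_nodes = n_elem + 1
le = L / n_elem  # element length

# Assemble the global stiffness matrix from identical element matrices.
K = np.zeros((n_nodes, n_nodes))
k_e = (E * A / le) * np.array([[1.0, -1.0], [-1.0, 1.0]])
for e in range(n_elem):
    K[e:e + 2, e:e + 2] += k_e

F = np.zeros(n_nodes)
F[-1] = 1000.0   # 1 kN axial tip load

# Apply the fixed boundary condition at node 0 by reducing the system,
# then solve for the free nodal displacements.
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K[1:, 1:], F[1:])
print("Nodal displacements (m):", u)
print("Analytical tip displacement (m):", F[-1] * L / (E * A))
```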

    5.6.2. Design function verification using computational fluid

    dynamics

    With the increasing availability of affordable access to

    substantial computing resources, computational fluid dynamics

    (CFD) is now becoming established as a viable tool for computer

    aided engineering and design, in spite of uncertainties that

    continue to surround the topics of automated mesh generation,

solution sensitivity to mesh size and distribution, and the verification and realism of turbulence models. CFD software offers

    increasingly sophisticated (and computationally demanding)

    analysis features such as free-surface modelling, fluid-structure

    interaction (FSI) and large eddy simulation (LES).

    The turbomachinery and aircraft industries have made use of

    CFD for many years to study flows around smooth-shaped

    aerodynamic surfaces. Calibrated physical models are used for

    these flows using highly structured curvilinear (body-fitted)

meshes to make best use of available resources. CFD has resulted in

    significant improvements to the design of compressor and turbine

    blades [179], including the use of inverse design and multi-

    objective optimisation techniques[180], with the attention of the

    industry and researchers now turning ever more assiduously to

improving the use of valuable compressor bleed air in gas-turbine internal-air cooling systems [179,181].

    In aircraft design, the requirement to carry out large-scale

    computations of complete aircraft configurations motivated the

    development of empirical one-equation models of turbulence for

    computational economy [182]. Following a period in which

    turbulence models tended to move toward more complicated,

multiple-equation closures (such as shear-stress, v2-f or the even more substantial Reynolds stress models), the robustness and relative economy of one-equation models, such as that of Spalart and Allmaras [182], are enjoying a return to more widespread favour, and developments of such models to account for more complicated

    flow situations are now being proposed and introduced [183].

    An important issue with the handling of complex geometries

such as car body surfaces is the efficient translation from solid model geometry (CAD) representations into a form suitable for

    automated mesh generation for CFD. Dawes [184] proposes a

    tightly integrated approach in which a pre-defined mesh also acts

    as the surface geometry detection mechanism (using algorithms

    derived from medical imaging). This also lends itself to boundary

    surface adaptation in response to the flow, a process known as

    sculpting. Similar modelling of the interface between flexible

    membranes or solid surfaces and the forces exerted on them by a

    fluid medium is the basis of FSI, where finite element modelling

    can be integrated with CFD to calculate structural deformation in

    response to varying fluid dynamics loads.

LES offers the prospect of reduced reliance of solutions on the often

    incomplete representation of flow physics using turbulence

    models. In LES, an unsteady turbulent flow is simulated in full

three-dimensional and time-accurate detail, with the sole exception of very small-scale (so-called sub-grid) energy dissipation

    processes. The matching of LES techniques to more traditional

modelling methods in low-turbulence regions, such as near walls, offers the prospect of high-fidelity numerical experiments being conducted, replacing the need for large-scale physical testing. The

    unsteady information provided by the LES technique also lends

    itself naturally to the unsteady aerodynamics of separated flows,

    for example around wind turbine blades or around aircraft at very

high angles of attack as shown in Fig. 14, as well as providing the

    fluctuating pressure information that is vital for controlling

    unsteady vibrations or acoustic signatures.

    6. Physical product and process verification and validation

    6.1. Product design physical verification and validation

    Before digital prototyping and testing became the prerequisites

    of rapid product development, physical prototyping techniques

    were prevalent in industry and have influenced product perfor-

    mance, quality and competitiveness in the global markets. Physical

    testing is still an expected industry practice, frequently linked to

    product certification. For example, aerospace products undergo

    strict testing to pass certification criteria and automobile

    manufacturers are required to test their prototypes following

    combustion and safety standards. Moreover, physical testing

    generates valuable knowledge and data that can be utilised to

    enhance the design of future products or variants.

    6.1.1. Dimensional and shape verification and validation

    Component verification is the process of assessing the

    conformance of key features and characteristics of a manufactured

    component to the specifications prescribed by the product

    designers, as these are captured by the GD&T notations. The

scope of this paper follows the GPS standard [186], which prescribes the surface, geometric and dimensional characteristics involved in verification, as shown in Fig. 15.

Fig. 14. Isosurface of instantaneous vorticity over an F-18C aircraft at 30° angle of attack [185].


    Fig. 15. Dimensional and shape characteristics of GPS standards [186].


    Designers define tolerances on core models that are intended to

    describe the maximum allowable variation from the nominal size.

    Tolerances do not include any allowance for, or knowledge of, the

    measurement uncertainty of the equipment used to verify

    the dimensions. The standard ISO 14253[187]makes it clear that

    the onus is on the supplier of the measurement data to guarantee

    the conformance to specification (tolerance) of the measurements,

    and that the data takes account of measurement uncertainty.

There are several ways of carrying out dimensional and shape verification [171], including direct or indirect measurements, and

    measuring either all the parts (100% inspection) or a selection of

    parts. Direct measurements are taken off the part itself by deploying

    metrology systems suitable for the physical size and scale of the

artefacts and these systems are outlined in the enabling technologies Section 6.4. Indirect dimensional verification requires taking inferred

    dimensions from something other than the part, for example by

measuring the jig that is used to assemble the part. Verification may

    also be inferred statistically through controlling and measuring the

    process, as outlined in Section 6.3, and this can bring significant cost

    benefits through improvements to process capability.

The level of inspection required for any given feature is dictated

    by the risk of non-conformance. Depending on the industry sector,

    design risk is driven by performance, safety and fit. Process and

    inspection risks are dictated by the capability of the process and

    inspection systems. Due to the criticality of aerospace components,

    high-risk features will always be subject to 100% inspection.

    Features that can be effectively controlled by validating the

    manufacturing process can be subjected to a reduced inspection

regime, typically yielding a 50-75% reduction in final inspection

    load, reducing measurement time per part.

    A freeform surface, also known as a complex or sculpted

    surface, is classified in ISO 17450-1:2007 [186] as a complex

    feature with no invariance degree. Existing technologies for

    measuring free form surfaces are detailed by Savio et al. [188].

    Photogrammetry and laser scanning are mature technologies for

    surface characterisation with measurement accuracies of 5 parts in

10^5 [189] and 1 part in 10^4 respectively. Structured light devices are less mature technologies, with accuracy of 1 part in 10^5, but they have potential for achieving higher accuracy than laser line

    scanners due to the fundamental limits imposed by speckle effects

    [190,191]. This is where a hybrid system [163] would be

advantageous. While the ISO GPS standards cover the verification of form tolerances such as straightness [192], roundness [193] and cylindricity [194], there is no standard for the verification of freeform surfaces. Multiple instruments are applicable for surface verification, as shown in Fig. 16.

    The production uncertainties of a free form surface, com-

    pounded by the edge trimming and the assembly processes that

    freeform surfaces typically are involved in, eventually manifest

    themselves in gaps, steps and interferences between the surfaces.

    Gap and flush problems on a fluid dynamic device, such as an

aircraft wing, are detrimental to its performance, while the fit of automotive panels is indicative of the build quality of the product.

The assembly methods used to minimise freeform surface interface problems can be classified as follows:

    Build to nominal: the assembled product tolerance is met by

    simply making the key features of the parts as accurately as

    possible. Typically used for small products with features that can

    be accurately produced.

    Measure and adjust: the assembled product tolerance is met by

measuring the interfaces and adjusting some of the parts' position and/or orientation to minimise interface problems. For

    larger parts which can be difficult and expensive to produce to

    tight tolerances (such as door panels in the automotive industry),

    the position and orientation may be manipulated manually or

automatically to minimise the overall interface problems [195,196]; a best-fit alignment sketch illustrating this step is given after this list.

    Measure for production: the assembled product tolerance is met

    by measuring one side of the interface and producing the other

    side using the measured data. For very large freeform shapes

    such as wings and wind turbine blades, it is very difficult and

    expensive to produce parts to tight tolerances. It is often

    preferable to tailor parts to fit the specific physical assembly by

    producing parts directly using measurements from the assembly

    [90,188].
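The sketch below illustrates the alignment calculation that a "measure and adjust" strategy typically relies on: a least-squares rigid best fit (here the Kabsch/SVD method) of measured interface points onto their nominal positions, from which the required adjustment of part position and orientation follows. The geometry and numbers are hypothetical.

```python
import numpy as np

def best_fit_rigid(measured, nominal):
    """Least-squares rigid alignment (Kabsch method): returns R, t such that
    R @ p + t best maps each measured point p onto its nominal counterpart."""
    cm = measured.mean(axis=0)
    cn = nominal.mean(axis=0)
    H = (measured - cm).T @ (nominal - cn)      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0] * (H.shape[0] - 1) + [d]) @ U.T
    t = cn - R @ cm
    return R, t

# Toy example: a panel edge measured slightly rotated and shifted (mm).
nominal = np.array([[0.0, 0.0], [100.0, 0.0], [100.0, 20.0], [0.0, 20.0]])
theta = np.radians(0.5)
Rtrue = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
measured = nominal @ Rtrue.T + np.array([0.3, -0.2])

R, t = best_fit_rigid(measured, nominal)
aligned = measured @ R.T + t
print("Residual gap after adjustment (mm):", np.abs(aligned - nominal).max())
```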

    6.1.2. Design structure mapping and hidden features

    Hidden features can be defined as those which do not easily

    provide line-of-sight access, as occurs commonly in cluttered

    assembly environments and complex and enclosed products.

Measurement of these features generally requires an ability to see

    around corners or measure directly through opaque objects.

    Possible approaches include; networks of line-of-sight instru-

    ments; mirrors; articulated CMM arms; through-skin sensing

    (using Hall effect sensors to locate holes, fitted with magnets, on

    components hidden by other components); and six-degrees-of-

    freedom probing. A key issue with networks of line-of-sight

    instruments is closing the metrological loop and including

    sufficient common points from one instrument to the next, so

    as to minimise error buildup.

    6.1.3. Measurement equipment deployment

    Production metrology begins with the set-up of systems and

    continues through the in-process measurement and metrology

    enabled automation [80,81]. Metrology must be seen as a

    manufacturing process and Muelaner et al. [174] developed a

    method for measurement planning and instrument deployment.

    Specification of the environmental conditions in which the

    measurement is to be carried out should include the average

    temperature, temperature gradients, pressure, humidity and

    carbon dioxide content [197]. Accuracy, properly defined as

    measurement uncertainty [169], is a key performance indicator

    for metrology. Much work has already been carried out to model

measurement uncertainty in industrial measurement processes, especially for large volume applications [171], using models

created for laser-based spherical co-ordinate measurement systems, such as laser trackers and laser radar [170,197].

Fig. 16. Examples of freeform surface verification applications.

Co-ordinate measurements may be calculated from a number of

    angular measurements obtained using cameras, theodolites, and

    iGPS[198]. Calculating the measurement is a complex task, since

    measurement uncertainty impacts on part rejection rates

    [173,174]and the accuracy of manufacturing processes.

    Decision rules for proving conformance or non-conformance

    with specifications are clearly defined by international standards.

    A component dimension must be accompanied by a tolerance

    [199] giving a lower specification limit (LSL) and an upper

    specification limit (USL) while a measurement result must be

    accompanied by an estimate of measurement uncertainty (U)

    [169]. Product conformance can be proven by a measurement

result that is greater than LSL + U and less than USL - U [187].
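Expressed directly from the decision rule above, a minimal sketch of the conformance check (with illustrative numbers) is:

```python
def proves_conformance(measured, lsl, usl, U):
    """ISO 14253-1 style decision rule: conformance is proven only when the
    measured value lies inside the specification zone reduced by the
    expanded measurement uncertainty U on both sides."""
    return (lsl + U) < measured < (usl - U)

def proves_nonconformance(measured, lsl, usl, U):
    """Non-conformance is proven when the result lies outside the
    specification zone enlarged by U on both sides."""
    return measured < (lsl - U) or measured > (usl + U)

# Example: a 50 mm dimension toleranced at +/-0.05 mm, measured with an
# expanded uncertainty of 0.01 mm.
print(proves_conformance(50.030, 49.95, 50.05, 0.01))  # True
print(proves_conformance(50.045, 49.95, 50.05, 0.01))  # False: within tolerance
                                                       # but inside the uncertainty band
```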

    6.2. Product testing and validation

    6.2.1. Mechanical design testing

    The effective mechanical design of a stand-alone product or a

    structural component is predicated on key stages of development

which are summarised in Fig. 17. As already described in Section

    5.6.1, the output of FEA modelling depends on the construction of

    accurate meshed or meshless continua and the correct assignment

    of materials properties. In many cases such materials property

    information is available from materials textbooks [200]or in the

    form of software[201]but if new materials or bespoke composite

materials are to be used, materials evaluation is needed to define mechanical properties.

    Using a range of test coupon geometries, materials evaluation

    performs the dual role of firstly confirming the correct selection of

    materials and secondly providing materials properties for FE

    modelling. Mechanical tests are published by standards bodies such

    as ASTM International and BSI British Standards. The mechanical

    testing of fibre composites is given by Hodgkinson [202]. Some

    materials parameters and materials tests are given in Table 2.

    Materials tests will determine elastic properties and the onset

    of yield and will determine whether a linear or a non-linear FE

    model is required to model the mechanical behaviour of parts. A

    key feature of the measurement of materials parameters is the

    effective use of instrumentation. Strain measurement devices such

    as strain gauges, extensometers and lasers are well known but

    techniques such as Electronic Speckle Pattern Interferometry

    (ESPI), Holographic Interferometry and Digital Image Correlation

(DIC) [203] provide more accurate 2D and 3D information on strain

    distributions around stress concentrations.

    An obvious method of evaluating products and components is

    to perform static structural tests in tension, compression and shear

    to destruction. Performance under cyclic load (fatigue), constant

    stress (creep) and constant strain (stress relaxation) will allow the

    determination of parameters such as fatigue life (constant

    amplitude and complex load or strain), fatigue limit, creep

    compliance and stress relaxation modulus. The observation and

    understanding of fracture is achieved by the application of optical,

    electron and atomic force microscopy. Non-destructive evaluation

    (NDE) includes a plethora of techniques, often used to locate

    defects. Some NDE methods are summarised inTable 3.

    6.2.2. Flow related physical verification and validation

The validation of CFD analysis deals with the assessment of comparisons between computational and experimental results

[14,204], as shown in Fig. 18, and this generates valuable data for

    improving the convergence of Large Eddy Simulation and

    experimental tests. The key parameters in CFD validation tests

    deal with the aerodynamic forces that consist of three force

    components (lift, drag, side force) and three moments (pitching,

    yawing, rolling). The static aerodynamic forces and moments can

    be measured indirectly by integrating the surface pressure

    distribution [204] or directly by strain gauge balance, internal

    spring balance and load cell. The unsteady aerodynamic forces and

    moments acting on a maneuvering air vehicle [205] can be

measured by using strain gauge balance and load cell.

The external flow structure of an air vehicle can be illustrated

    qualitatively by flow pattern images and quantitatively by

    measuring flow velocities. Qualitative flow patterns can be

    Table 2

    Selected materials parameters and associated test methods.

Property                           Parameter   Test method
Strength (maximum, yield, etc.)    σ (MPa)     Tension, compression, flexure, etc.
Strain (maximum, yield, etc.)      ε           Tension, compression, flexure, etc.