
Chapter 32

Robust DOE

Introduction

• The experiment procedures proposed by Genichi Taguchi (Taguchi and Konishi 1987; Ross 1988) have provoked both acclaim and criticism. Some nonstatisticians like the practicality of the techniques, while statisticians have noted problems that can lead to erroneous conclusions. However, most statisticians would agree that Taguchi has increased the visibility of DOE. In addition, most statisticians and engineers would probably agree with Taguchi that more direct emphasis should have been given in the past to the reduction of process variability and the reduction of cost in product design and manufacturing processes.


Introduction

• In this book, the term robust DOE is used to describe the S4/IEE implementation of key points from the Taguchi philosophy.
• Robust DOE is an extension of previously discussed DOE design techniques that focuses not only on mean factor effects but also on differences in expected response variability between the levels of factors.
• Robust DOE offers a methodology in which focus is given to creating a process or product design that is robust to, or desensitized to, inherent noise input variables.

Introduction

• This chapter gives a brief overview of the basic Taguchi philosophy as it relates to the concepts discussed in this book.
• The loss function is also discussed, along with an approach that can be used to reduce variability in the manufacturing process.
• In addition, the analysis of 2^k residuals is discussed for assessing potential sources of variability reduction.


32.1 S4/IEE Application Examples: Robust DOE

• Transactional 30,000-foot-level metric: An S4/IEE project was to reduce days sales outstanding (DSO) for invoices. Wisdom of the organization and passive analysis led to the creation of a robust DOE experiment that considered the factors: size of order (large versus small), calling back within a week after mailing the invoice (yes versus no), prompt-paying customer (yes versus no), origination department (from passive analysis: lowest DSO average versus highest DSO average), and stamping “past due” on the envelope (yes versus no). The DSO time for 10 transactions for each trial will be recorded. The average and standard deviation of these responses will be analyzed in the robust DOE.

32.1 S4/IEE Application Examples: Robust DOE

• Manufacturing 30,000-foot-level metric: An S4/IEE project was to improve the process capability/performance metrics for the diameter of a plastic part from an injection-molding machine. Wisdom of the organization and passive analysis led to the creation of a DOE experiment that considered the factors: temperature (high versus low), pressure (high versus low), hold time (long versus short), raw material (high side of tolerance versus low side of tolerance), machine (from passive analysis: best-performing versus worst-performing), and operator (from passive analysis: best versus worst). The diameter for 10 parts manufactured for each trial will be recorded. The average and standard deviation of these responses will be analyzed in the robust DOE.


32.1 S4/IEE Application Examples: Robust DOE

• Product DFSS: An S4/IEE project was to improve the process capability/performance metrics for the number of daily problem phone calls received within a call center. Passive analysis indicated that product setup was the major source of calls for existing products/services. A DOE test procedure assessing product setup time was added to the test process for new products. Wisdom of the organization and passive analysis led to the creation of a DOE experiment in which the factors and their levels would be various features of the product/service, including, as one factor, a special setup instruction sheet in the box (sheet included versus no sheet included). The setup time for three operators was recorded for each trial. The average and standard deviation of these responses will be analyzed in the robust DOE.

32.2 Test Strategies

• Published Taguchi (Taguchi and Konishi 1987) orthogonal arrays and linear graphs contain both 2- and 3-level experiment design matrices.
• The basic 2-level Taguchi design matrices are equivalent to those in Table M, where there are n trials with n − 1 contrast column considerations for the 2-level designs of 4, 8, 16, 32, and 64 trials (see the sketch following this list). Table N contains the 2-factor interaction confounding for the design matrices found in Table M.
• Taguchi experiment analysis techniques do not normally dwell on interaction considerations that are not anticipated before the start of the test.
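
To illustrate the n-trial, (n − 1)-contrast-column structure, the following is a minimal Python sketch of an 8-trial, 2-level array; the column names and construction are illustrative and are not the published Taguchi layout.

```python
# Minimal sketch (illustrative, not the published array layout): an 8-trial,
# 2-level design has n - 1 = 7 orthogonal contrast columns -- three base
# columns A, B, C plus their products (interaction contrasts).
from itertools import product

import pandas as pd

base = pd.DataFrame(list(product([-1, 1], repeat=3)), columns=["A", "B", "C"])
design = base.copy()
for cols in (("A", "B"), ("A", "C"), ("B", "C"), ("A", "B", "C")):
    design["".join(cols)] = base[list(cols)].prod(axis=1)  # interaction contrasts

print(design)  # 8 trials x 7 contrast columns; assigning extra factors to the
               # product columns is what creates the confounding listed in Table N
```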


32.2 Test Strategies

• The book suggests first considering what initial experiment resolution is needed and manageable with the number of 2-level factors. After the first experiment analysis, one of several actions may be appropriate:
  • The test may yield dramatic conclusions that answer the question of concern. A simple confirmation experiment would then be appropriate.
  • The results may lead to a follow-up experiment that considers other factors in conjunction with those factors that appear statistically significant.
  • The results may suggest a follow-up experiment of the statistically significant factors at a higher resolution.

32.2 Test Strategies

• If interactions are not managed properly in an experiment, confusion and erroneous action plans can result.
• In addition, the management of these interactions is much more reasonable when only 2-level factors are involved.


32.3 Loss Function

• The loss function is a contribution of Genichi Taguchi (1978).
• This concept can bridge the language barrier between upper management and those involved with technical details.
• The loss function describes the loss that occurs when a process does not produce a product that meets a target value.
• Loss is minimized when there is “no variability” and the “best” response is achieved in all areas of the product design.

32.3 Loss Function

• Traditionally, manufacturing has considered all parts that are outside the specification limits to be equally nonconforming, and all parts within specification to be equally conforming.
• In the Taguchi approach, loss relative to the specification limits is not assumed to be a step function.

[Figure: step-function (“goalpost”) loss relative to the lower and upper specification limits]


32.3 Loss Function

• Taguchi addresses variability in the process using a loss function. A common form of the loss function is the quadratic equation

  L = k(y − m)²

  where L is the loss associated with a particular value y, m is the specification nominal (target) value, and k is a constant.
• When this loss function is applied, more emphasis is put on achieving the target as opposed to just meeting specification limits.

[Figure: quadratic loss function relative to the lower and upper specification limits]

32.4 Example 32.1: Loss Function

• Given that the cost of scrapping a part is $10.00 when it deteriorates from the target by 0.5 mm, the quadratic loss function, given a nominal value m of 0.0, is

  $10.00 = k(0.5 − 0.0)²

  Hence,

  k = $40.00 per mm²

  and the loss function becomes

  L = 40(y − 0)²

  (a short numeric sketch follows this list).
• The loss function can yield different conclusions from decisions based on classical “goalpost” specification limits.
• In addition, this loss function can help make economic decisions about process improvement.
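
As a minimal Python sketch of Example 32.1 (variable names are illustrative), the constant k is derived from the scrap cost and the loss is then evaluated at a few deviations from target:

```python
# Minimal sketch of Example 32.1: quadratic loss L = k*(y - m)^2, with k chosen
# so that a 0.5 mm deviation from target carries the $10.00 scrap cost.
scrap_cost = 10.00       # $ lost when the part is scrapped
scrap_deviation = 0.5    # mm deviation from target at which the part is scrapped
m = 0.0                  # nominal (target) value, mm

k = scrap_cost / scrap_deviation**2   # $40.00 per mm^2

def quadratic_loss(y: float) -> float:
    """Taguchi quadratic loss for a measured value y."""
    return k * (y - m) ** 2

# Unlike goalpost limits, loss grows continuously as y moves away from target.
for y in (0.0, 0.1, 0.25, 0.5):
    print(f"y = {y:4.2f} mm -> loss = ${quadratic_loss(y):5.2f}")
```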


32.5 Robust DOE Strategy

• Most practitioners agree with Taguchi that it is important to reduce variability in the manufacturing process.
• To do this, Taguchi suggests using an inner and outer array (fractional factorial design structure) to address the issue.
• The inner array addresses the items that can be controlled (e.g., part tolerance), while the outer array addresses factors that cannot necessarily be controlled (e.g., ambient temperature).
• To analyze the data, Taguchi devised a signal-to-noise ratio technique, which Box et al. (1988) show can yield debatable results. However, Box states that use of the signal-to-noise ratio concept can be equivalent to an analysis that uses the logarithm of the data.

32.5 Robust DOE Strategy

• The fractional factorial designs included in this book can be used to address reducing manufacturing variability with the inner/outer array experimentation strategy.
• To do this, categorize the factors listed into controllable and non-controllable factors.
• The controllable factors can be fit into a design structure using Tables M1 to M5, while the non-controllable factors can be set to levels determined by another fractional factorial design.
• All the non-controllable factor experimental design trials would be performed for each trial of the controllable factor experimentation design, as sketched below. (A traditional design of 16 trials might now contain 64 trials if the outer design contains 4 trials.)
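
A minimal Python sketch of this crossing, with illustrative factor names and a full 2^4 inner array (pandas ≥ 1.2 is assumed for the cross merge):

```python
# Minimal sketch (illustrative factor names): cross a 2-level inner array of
# controllable factors with a small outer array of noise factors.  Every
# inner-array trial is run at every outer-array condition, so a 16-trial inner
# design crossed with a 4-trial outer design gives 16 x 4 = 64 runs.
from itertools import product

import pandas as pd

controllable = ["A", "B", "C", "D"]          # inner array: 2^4 = 16 trials
inner = pd.DataFrame(list(product([-1, 1], repeat=4)), columns=controllable)

noise = ["N1", "N2"]                         # outer array: 2^2 = 4 trials
outer = pd.DataFrame(list(product([-1, 1], repeat=2)), columns=noise)

runs = inner.merge(outer, how="cross")       # requires pandas >= 1.2
print(len(inner), len(outer), len(runs))     # 16 4 64
```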


32.5 Robust DOE Strategy

• Now both a mean and a standard deviation value can be obtained for each trial and analyzed independently (see the sketch following this list).
• The trial mean value can be analyzed directly using the DOE procedures. The standard deviation (or variance) for each trial should be given a logarithm transformation, which tends to normalize standard deviation data.
• A practitioner is not required to use the inner/outer array experimental design approach when investigating variability. It may be appropriate to construct an experiment design in which each trial is repeated and the variance between repetitions is considered a trial response. The data may need a log transformation, and the sample size needs to be large enough.
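
A minimal, self-contained Python sketch of this summarization follows; it re-creates the crossed design from the previous sketch and simulates a response y only so that the code runs end to end (all names are illustrative):

```python
# Minimal sketch (illustrative): summarize each inner-array trial of a crossed
# inner/outer design by its mean and its log standard deviation.
from itertools import product

import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

controllable = ["A", "B", "C", "D"]
inner = pd.DataFrame(list(product([-1, 1], repeat=4)), columns=controllable)
outer = pd.DataFrame(list(product([-1, 1], repeat=2)), columns=["N1", "N2"])
runs = inner.merge(outer, how="cross")                     # 64 runs

# Simulated response: mean depends on A, spread depends on B (illustration only)
runs["y"] = 10 + 2 * runs["A"] + rng.normal(scale=1.0 + 0.5 * (runs["B"] > 0),
                                            size=len(runs))

summary = (
    runs.groupby(controllable)["y"]
        .agg(trial_mean="mean", trial_std="std")
        .reset_index()
)
summary["log_s"] = np.log(summary["trial_std"])   # log transform normalizes s

# trial_mean and log_s are now two separate responses for the 16 inner-array
# trials, each analyzed with the usual factorial-effects (DOE) procedures.
print(summary.head())
```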

32.6 Analyzing 2^k Residuals for Sources of Variability Reduction

• A study of residuals from a single replicate of a 2^k design can give insight into process variability, because residuals can be viewed as observed values of noise or error (Montgomery 1997; Box and Meyer 1986).
• When the level of a factor affects variability, a plot of residuals versus the factor levels will indicate which level causes more variability.
• The magnitude of contrast column dispersion effects can be tested by calculating

  Fᵢ* = ln[s²(i+) / s²(i−)],   i = 1, 2, …, n

  where n is the number of contrast columns.


32.6 Analyzing 2^k Residuals for Sources of Variability Reduction


• The variance of the residuals for each group of signs in each contrast column is designated as s²(i+) and s²(i−).
• The statistic Fᵢ* is approximately normally distributed if the two variances are equal.
• A normal probability plot of the dispersion effects for the contrast columns can be used to assess the significance of a dispersion effect (see the sketch below).
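
A minimal Python sketch of this dispersion calculation; the function name and data layout are assumptions for illustration, not from the text:

```python
# Minimal sketch (assumed data layout): compute F*_i = ln( s^2(i+) / s^2(i-) )
# for every contrast column of a 2^k design from the residuals of the fitted
# location model.
import numpy as np
import pandas as pd


def dispersion_effects(contrast_columns: pd.DataFrame, residuals: pd.Series) -> pd.Series:
    """Return F*_i for each +/-1 contrast column, using sample variances of residuals."""
    f_star = {}
    for col in contrast_columns:
        plus = residuals[contrast_columns[col] == 1]
        minus = residuals[contrast_columns[col] == -1]
        f_star[col] = np.log(plus.var(ddof=1) / minus.var(ddof=1))
    return pd.Series(f_star)

# A normal probability plot of the returned values highlights contrast columns
# whose level changes the residual variability (dispersion effects).
```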

32.7 Example 32.2: Analyzing 2^k Residuals for Sources of Variability Reduction

• The present defect rate of a process producing internal panels is too high (5.5 defects per panel). A 4-factor, 16-trial, 2^k single-replicate design was conducted.

  Factors            (−1) Level   (+1) Level
  A: Temperature     295          325
  B: Clamp time      7            9
  C: Resin flow      10           20
  D: Closing time    15           30

  Trial    A    B    C    D   Response
    1     -1   -1   -1   -1      5.0
    2      1   -1   -1   -1     11.0
    3     -1    1   -1   -1      3.5
    4      1    1   -1   -1      9.0
    5     -1   -1    1   -1      0.5
    6      1   -1    1   -1      8.0
    7     -1    1    1   -1      1.5
    8      1    1    1   -1      9.5
    9     -1   -1   -1    1      6.0
   10      1   -1   -1    1     12.5
   11     -1    1   -1    1      8.0
   12      1    1   -1    1     15.5
   13     -1   -1    1    1      1.0
   14      1   -1    1    1      6.0
   15     -1    1    1    1      5.0
   16      1    1    1    1      5.0


32.7 Example 32.2: Analyzing 2^k Residuals for Sources of Variability Reduction

Minitab: Stat → DOE → Factorial → Analyze … → Graphs → Effect Plots → Normal

[Minitab output: normal probability plot of the effects]

32.7 Example 32.2: Analyzing 2^k Residuals for Sources of Variability Reduction

Minitab: Stat → DOE → Factorial → Analyze … → Graphs → Residuals vs Variables

[Minitab output: plots of residuals versus the factor levels]


32.7 Example 32.2: Analyzing 2^k Residuals for Sources of Variability Reduction

  Trial   A   B   C   D  AB  AC  BC ABC  AD  BD ABD  CD ACD BCD ABCD  Residual
    1    -1  -1  -1  -1   1   1   1  -1   1   1  -1   1  -1  -1    1   -0.9375
    2     1  -1  -1  -1  -1  -1   1   1  -1   1   1   1   1  -1   -1   -0.6875
    3    -1   1  -1  -1  -1   1  -1   1   1  -1   1   1  -1   1   -1   -2.4375
    4     1   1  -1  -1   1  -1  -1  -1  -1  -1  -1   1   1   1    1   -2.6875
    5    -1  -1   1  -1   1  -1  -1   1   1   1  -1  -1   1   1   -1   -1.1875
    6     1  -1   1  -1  -1   1  -1  -1  -1   1   1  -1  -1   1    1    0.5625
    7    -1   1   1  -1  -1  -1   1  -1   1  -1   1  -1   1  -1    1   -0.1875
    8     1   1   1  -1   1   1   1   1  -1  -1  -1  -1  -1  -1   -1    2.0625
    9    -1  -1  -1   1   1   1   1  -1  -1  -1   1  -1   1   1   -1    0.0625
   10     1  -1  -1   1  -1  -1   1   1   1  -1  -1  -1  -1   1    1    0.8125
   11    -1   1  -1   1  -1   1  -1   1  -1   1  -1  -1   1  -1    1    2.0625
   12     1   1  -1   1   1  -1  -1  -1   1   1   1  -1  -1  -1   -1    3.8125
   13    -1  -1   1   1   1  -1  -1   1  -1  -1   1   1  -1  -1    1   -0.6875
   14     1  -1   1   1  -1   1  -1  -1   1  -1  -1   1   1  -1   -1   -1.4375
   15    -1   1   1   1  -1  -1   1  -1  -1   1  -1   1  -1   1   -1    3.3125
   16     1   1   1   1   1   1   1   1   1   1   1   1   1   1    1   -2.4375

          A     B     C     D     AB     AC     BC    ABC     AD     BD    ABD     CD    ACD    BCD   ABCD
  s(+)  2.25  2.72  1.91  2.24  2.211  1.808  1.802  1.798  2.052  2.276  1.972  1.926  1.518  2.086  1.615
  s(−)  1.85  0.82  2.2   1.55  1.86   2.236  2.259  2.244  1.926  1.609  2.112  1.58   2.163  1.889  2.334
  F*    0.39  2.39 -0.29  0.74  0.346 -0.425 -0.45  -0.44   0.126  0.693 -0.14   0.396 -0.71   0.199 -0.74
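
The F* row can be reproduced with a short Python sketch. It assumes, consistent with the effects analysis, that only factors A and C are retained in the fitted location model; all names are illustrative:

```python
# Sketch (assumption: only A and C are kept in the location model, as the effects
# analysis of this example suggests).  Residuals from that fit are used to compute
# F* = ln( s^2(i+) / s^2(i-) ) for all 15 contrast columns.
from itertools import combinations

import numpy as np
import pandas as pd

levels = pd.DataFrame(
    [(a, b, c, d) for d in (-1, 1) for c in (-1, 1) for b in (-1, 1) for a in (-1, 1)],
    columns=list("ABCD"),
)
y = pd.Series([5.0, 11.0, 3.5, 9.0, 0.5, 8.0, 1.5, 9.5,
               6.0, 12.5, 8.0, 15.5, 1.0, 6.0, 5.0, 5.0])

# Fitted values: grand mean plus the A and C main-effect contributions
fit = y.mean()
for col in ("A", "C"):
    half_effect = (y[levels[col] == 1].mean() - y[levels[col] == -1].mean()) / 2
    fit = fit + half_effect * levels[col]
residuals = y - fit

# All 15 contrast columns: the four main effects plus every interaction column
contrasts = levels.copy()
for r in (2, 3, 4):
    for combo in combinations("ABCD", r):
        contrasts["".join(combo)] = levels[list(combo)].prod(axis=1)

f_star = pd.Series({c: np.log(residuals[contrasts[c] == 1].var(ddof=1)
                              / residuals[contrasts[c] == -1].var(ddof=1))
                    for c in contrasts})
print(f_star.round(2))   # B stands out at about 2.39, matching the table above
```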
