
EFCOG Contractor Guide

for

Performance Analysis

Revision: 0

April 8, 2008

EFCOG Contractor Guide 2008-1

Integrated Safety Management & QA Working Group

Table of Contents

Introduction …………………………………………………………….. 1

Discussion ………………………………………………………………. 1

Definitions ……………………………………………………………… 2

Purpose ………………………………………………………………… 3

Performance Analysis Process …………………………………………... 4

References ……………………………………………………………… 10

Appendix A ..…………………………………………………………... 11

Appendix B …………………………………………………………….. 12


PERFORMANCE ANALYSIS

1.0 Introduction

This document provides guidance for conducting performance analysis of events, such as occurrences, and of conditions, such as those found during assessments.

Performance analysis is a systematic review of selected data of interest to identify

common causes and to determine which conditions have occurred with an unacceptably

high frequency such that corrective actions can be developed to prevent recurrence. The

goal is to prevent a more serious event or an unacceptable repeat of events or conditions

and to drive continuous improvement.

This methodology identifies recurring events that may be reportable and identifies

recurring conditions that may warrant additional analysis and corrective action. This

document provides a methodology (including key process elements) for successfully

completing and documenting the analysis and results. This guidance meets the requirements in DOE Manual 231.1-2, Occurrence Reporting and Processing of Operations Information, and in DOE Order 210.2, DOE Operating Experience Program.

This methodology identifies repeat and programmatic nuclear safety (PAAA) and worker

safety and health (WSH) non-compliances. These PAAA and WSH non-compliances are

then evaluated for reporting to the DOE Office of Enforcement Noncompliance Tracking

System (NTS). NTS-reportable non-compliances include non-compliances that are

repetitive, and non-compliances that indicate a common breakdown in a program or

program area and are caused by systemic problems having a common underlying cause.

The overall process of performance analysis, from data gathering to reporting the results,

is diagramed in Appendix A.

2.0 Discussion

There are many business reasons for a formal performance analysis process. These

include ongoing protection of the workers, the public and the environment; improving the

cost effectiveness of operations; avoiding the costs resulting from recurring or repetitive

events; preventing violations and civil penalties associated with the failure to effectively

correct and prevent problems; increasing the margin of regulatory and customer

confidence; improving quality and reducing rework.

The DOE has several requirements for performance analysis:

• DOE Manual 231.1-2, Occurrence Reporting and Processing of Operations Information, requires that the contractor perform ongoing quarterly performance analysis of events that occurred during the previous four quarters to


look for trends. It also requires that the performance analysis evaluate occurrences of all significance categories plus non-reportable occurrences.

• The DOE Office of Enforcement's Enforcement Process Overview requires tracking and reporting of repetitive and programmatic noncompliances.

• DOE Order 210.2, Operating Experience Program, requires review and analysis of operating experience information.

3.0 Definitions

1. Apparent cause – The most probable cause of a condition or event based upon readily available information. This term is associated with occurrence reporting.

2. Condition – Any as-found state, whether or not identified by an event or assessment,

that may have adverse safety, health, quality assurance, operational or environmental

implications. In this guidance, the term condition can be errors in calculation,

anomalies associated with design or performance, items indicating a weakness in the

management process, or identified noncompliances.

3. Control Charts – A control chart is a graphical tool for monitoring changes that occur within a process, by distinguishing variation that is inherent in the process from variation that indicates a change to the process.

4. Event - Something significant and real-time that happens (e.g., pipe break, valve

failure, loss of power, environmental spill, earthquake, tornado, flood). Events that

are occurrences are reported and tracked in the ORPS or in local non-ORPS

reportable databases.

5. Groupings – Selected elements on which data searches and sorts are performed.

Groupings can be made up of one or more elements.

6. Non-ORPS Reportable Occurrence - An event or condition identified by the site for

tracking that falls below any threshold for a reportable occurrence.

7. Normalize – To express data as a rate in order to create an equal basis for comparison; for example, events per effort hour, events per work order, or violations per inspection.

8. NTS – DOE Office of Enforcement’s centralized noncompliance tracking system for

reporting and tracking PAAA and worker safety and health noncompliances.

9. Occurrence Reporting and Processing System (ORPS) - An unclassified,

centralized DOE database containing Occurrence Reports from the DOE community.


10. Occurrence Report - A documented evaluation of an occurrence event or condition

that is prepared in sufficient detail to enable the reader to assess its significance,

consequences, or implications, and to evaluate the actions being proposed or

employed to correct the condition or to avoid recurrence. Occurrences are tracked in

the DOE Occurrence Reporting and Processing System.

11. Occurrence – A single or recurring event or condition that adversely affects, or may adversely affect, DOE (including NNSA) or contractor personnel, the public, property, the environment, or the DOE mission. Events or conditions meeting the criteria threshold identified in DOE M 231.1-2, or determined to be recurring through performance analysis, are occurrences.

12. Performance Analysis – A systematic review of data of interest to identify common causes and conditions and then to identify those with an unacceptably high frequency such that corrective actions can be developed and implemented. The goal is to prevent a more serious event or an unacceptable repeat of the events or conditions.

13. PAAA – The Price-Anderson Amendments Act, which authorizes DOE to establish and enforce nuclear safety rules.

14. Pareto Charts – A Pareto chart displays the frequency of events, conditions or causes to identify those items that are most frequent and require more in-depth analysis or action to eliminate defects.

15. Recurring occurrence – A series of two or more events determined by performance

analysis to have an unacceptably high frequency and severity, for which previous

corrective actions failed to prevent repetition.

16. Root cause – The most basic cause that can reasonably be identified, that management has the control to fix, and for which effective recommendations for corrective action(s) for preventing recurrence can be generated.

17. Trend Chart – Trend charts compare the number of events over time.

18. Watch list – Conditions or events identified through performance analysis that do not meet the criteria for further reporting but that the analyst considers an indication that additional attention or action may be necessary to prevent them from becoming reportable events or conditions in the future. The watch list is a tool for managers.

4.0 Purpose

This document establishes an approach for conducting performance analysis.

Performance analysis is a systematic review of events reported in ORPS and internally-

(site-) reportable events to identify possible recurring occurrences. Repeated events are


not necessarily recurring occurrences and vice versa. Performance analysis is also a

systematic review of internally- (site-) reportable PAAA and WSH non-compliances to

identify repetitive, systemic or programmatic PAAA and WSH non-compliances for

more in-depth causal analysis. These conditions and events can be analyzed by several

methods to identify the underlying cause and possible latent organizational weaknesses.

Performance analysis is also used to analyze events and conditions for potential lessons learned.

Events may recur when the apparent cause analysis for an individual occurrence was not

effective in determining the cause, or when the corrective actions taken to resolve the

causes were not effective. In each case, repeated events should be reviewed to determine

if they should be classified as potential recurring occurrences. Apparently dissimilar events (events with different consequences) may be recurring if they have a similar apparent cause. If a series of two or more events with common causes or conditions is

determined by performance analysis to have an unacceptably high frequency, a recurring

occurrence is reported. The common causes or conditions are then analyzed such that

appropriate corrective actions can be developed and implemented to prevent more serious

events.

DOE has a graded approach to analyzing occurrences and non-compliances. As part of

this process, DOE requires contractors to determine and report the apparent cause of all

occurrences above significance category 4. A root cause analysis, which is more in-depth,

is also required for occurrences at the higher significance categories of OE, 1 and R.

There are many causal analysis methods that can be used to evaluate an event or

condition. Apparent cause analysis is the method for analyzing occurrences that are

significance category 2 or 3, which are a major portion of all occurrences. For most

occurrences, knowing the apparent cause is sufficient to determine the appropriate

corrective action. A more in-depth analysis of a single event may not yield additional

insight and would not change the corrective action. In addition, reporting cause categories

from the cause analysis tree provides DOE with the capability to analyze events across

the complex.

Performance analysis is also part of the graded approach to implementation. The

performance analysis process identifies additional events or conditions that should be

analyzed using root cause analysis.

5.0 Performance Analysis Process

The performance analysis process includes five basic steps:

(1) Gathering the data

(2) Analyzing the data

(3) Identifying potential problems

(4) Documenting the results of the performance analysis

(5) Reviewing the results with senior management
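The five steps can be outlined as a minimal pipeline. This is an illustrative sketch only; the class, function, and field names are assumptions of this sketch, not terms from the guide or from DOE requirements.

```python
from dataclasses import dataclass, field

@dataclass
class Analysis:
    """Illustrative container for one quarterly performance-analysis cycle."""
    events: list = field(default_factory=list)      # step 1: gathered data
    findings: list = field(default_factory=list)    # step 3: potential problems
    report: str = ""                                # step 4: documented results

def run_performance_analysis(raw_events):
    a = Analysis(events=[e for e in raw_events if e])      # (1) gather, drop empties
    counts = {}
    for e in a.events:                                     # (2) analyze: count by cause
        counts[e["cause"]] = counts.get(e["cause"], 0) + 1
    a.findings = [c for c, n in counts.items() if n >= 2]  # (3) flag repeat causes
    a.report = f"{len(a.events)} events; recurring causes: {a.findings}"  # (4) document
    return a                                               # (5) ready for management review

result = run_performance_analysis(
    [{"cause": "A4B3"}, {"cause": "A4B3"}, {"cause": "A2B1"}]
)
```

Each step is elaborated in the subsections that follow; a real implementation would substitute the site's own data sources, grouping logic, and acceptability thresholds.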


5.1 Gathering Data

The parameters for the data should be established based on the analysis objectives and

used consistently to allow for comparisons between analysis results.

5.1.1 Collect and Compile Data

For analysis of occurrences, the DOE Manual 231.1-2 requires analysis of all events

during the previous four-quarter period. The time period should remain consistent for each of the quarterly analyses. If conditions from assessments are included in the analysis,

then the variations caused by assessment frequencies, subjects or types need to be

considered and addressed in the analysis. In cases with variation, the data may need to be

normalized.

DOE requires contractors to include site-specific data for non-reportable events in the

analysis. Examples of non-reportable events may include illness and injury reports,

radiological doses and events, minor spills, and emergency response logs or conditions

identified by assessment results.

The analyst will need to determine the minimum set of information needed for successful

analysis. Contractors may choose to establish additional internal reporting requirements

in order to have the information they need to conduct the analysis. For example, the

significance category 4 occurrences, the internally- (site-) reportable occurrences and

conditions found by assessments may not have cause codes, unless required by the

contractor.

Once the data needs are identified and collected for the occurrences and the selected conditions, the data needs to be entered into one or more databases as it becomes available and formatted to facilitate analysis. The data will need to be reviewed and sorted to ensure it is complete, accurate, and does not include duplicate entries.
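The compile-and-review step can be sketched as follows, assuming simple dictionary records merged from several sources; the field names (report_id, cause_code, and so on) are illustrative assumptions, not prescribed by DOE Manual 231.1-2.

```python
# Minimal sketch: merge event records from several sources, dropping
# duplicates and routing incomplete records back for correction.
REQUIRED = ("report_id", "date", "cause_code", "organization")

def compile_events(*sources):
    """Merge records, keep the first copy of each report_id, and
    separate out records missing any required field."""
    seen, merged, incomplete = set(), [], []
    for source in sources:
        for rec in source:
            key = rec.get("report_id")
            if key in seen:
                continue                      # duplicate entry: keep first copy
            seen.add(key)
            if all(rec.get(f) for f in REQUIRED):
                merged.append(rec)
            else:
                incomplete.append(rec)        # incomplete: review before analysis
    return merged, incomplete

orps = [{"report_id": "EM-2008-001", "date": "2008-01-15",
         "cause_code": "A3B1", "organization": "Maintenance"}]
site = [{"report_id": "EM-2008-001", "date": "2008-01-15",
         "cause_code": "A3B1", "organization": "Maintenance"},
        {"report_id": "SITE-044", "date": "2008-02-02",
         "cause_code": "", "organization": "Operations"}]
clean, needs_review = compile_events(orps, site)
```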

5.1.2 Determine Elements and Groupings

The analyst will sort the data into appropriate groupings based on reporting objectives

and occurrence attributes that are identified by the contractor and the elements to be

analyzed. Selected groupings may include elements such as cause codes, deficiency description types, affected items and equipment, DOE Headquarters keywords, reporting criteria, significance category, organization, location, work group, and time period.

The selection of the groupings is dependent on the reporting objectives. Analysis results

will be more definitive and revealing if the groupings include comparable and directly

related attributes. For example, if the groupings include only Final reports, then the

attributes will be final causes and conclusions. The analyst may want to look at corrective

actions in the analysis to identify common elements. The contractor determines which

groupings should be used based on current reporting objectives and previous analyses.
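Sorting the data into groupings keyed on one or more elements can be sketched as below; the element and record names are illustrative assumptions.

```python
from collections import defaultdict

def group_events(events, *elements):
    """Sort event records into groupings keyed by one or more elements
    (e.g., cause code, organization, significance category)."""
    groups = defaultdict(list)
    for e in events:
        groups[tuple(e[el] for el in elements)].append(e)
    return dict(groups)

events = [
    {"cause_code": "A4B2", "org": "Ops",   "sig_cat": "3"},
    {"cause_code": "A4B2", "org": "Ops",   "sig_cat": "4"},
    {"cause_code": "A2B6", "org": "Maint", "sig_cat": "3"},
]
by_cause_org = group_events(events, "cause_code", "org")
```

Changing the elements passed in changes the grouping, which is how the same compiled data set can serve different reporting objectives.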


5.2 Analyze Data

The techniques discussed below include both statistical and qualitative methods and

provide a framework to assist in conducting performance analysis. While the methods

described below can be used to analyze event data, other site-specific processes can also

be used to support this analysis.

The data may be analyzed by personnel with expertise in statistical analysis. The overall performance analysis process, however, will likely include many different health, safety and environmental protection specialists and managers. This, of course, will depend on the contractor's organizational structure and assigned roles and responsibilities.

The analyst examines trends and data considering “Who”, “What”, “Where”, “When,”

“Why,” and “How” the event occurred. The first step is to identify which events appear to

be different from the other events. The analyst then determines the relationships among the elements within the context of the subject or grouping. The count of items in itself does not yield a useful result; the analyst will need to understand the occurrences and their significance.

5.2.1 Pareto Charts

The Pareto principle states that 80% of the impact of the problem will show up in 20% of

the causes. A bar chart that displays by frequency or quantity, in descending order, will

identify the most important defects. This chart type is used to identify whether the Pareto

principle is evident in the data. If the Pareto principle is evident, about 20% of the

categories on the far left will have about 80% of the impact on the problem. A Pareto

chart is used to graphically summarize and display the relative importance of the

differences between groups of data.

In a Pareto chart, the analyst graphs the number of items (events, cause codes, facilities/operations/organizations) within a chosen grouping. Pareto charts can be used to

visually display the major contributor to a grouping and help identify areas for further

analysis.

The data should be “normalized” so that comparisons are made that are useful to the

analyst. For example, when preparing a Pareto chart on electrical shocks, it would not

add value to compare the maintenance organization to administrative organizations,

because the maintenance organization has more frequent opportunity for electrical shock.

It may be valuable, however, to compare programmatic organizations with similar expected frequencies.

An example of a Pareto chart is displayed in Appendix B.
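The Pareto ranking described above can be sketched in a few lines: sort categories by frequency and take the smallest leading set that accounts for most of the observations. The cause codes and the 80% cutoff value below are illustrative only.

```python
from collections import Counter

def pareto(items, cutoff=0.80):
    """Rank categories by frequency (descending) and return the smallest
    leading set accounting for `cutoff` of all observations."""
    counts = Counter(items).most_common()          # descending frequency
    total = sum(n for _, n in counts)
    vital, cum = [], 0
    for category, n in counts:
        vital.append(category)
        cum += n
        if cum / total >= cutoff:                  # "vital few" reached
            break
    return counts, vital

# Illustrative cause-code data (not from the guide)
codes = ["A6B2"] * 8 + ["A4B2"] * 6 + ["A3B1"] * 2 + ["A5B2"] * 2 + ["A1B3"] * 2
ranked, vital_few = pareto(codes)
```

If the Pareto principle holds in the data, `vital_few` will be a small fraction of all categories; those are the candidates for further analysis.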

Questions to ask when reviewing Pareto charts may include:


• What is the most frequent common underlying cause for a series or group of

events?

• Do related series or groups of events, having the same underlying cause, occur

most frequently within a single organization, facility or operation?

• Have multiple control failures taken place within the boundaries of a single

grouping indicating a common breakdown in a program or area of a program?

• Do the causal factors of a series or group of events indicate possible systemic

problems?

5.2.2 Trend Charts

Trend charts compare the number of events over time. They can be used to measure the

significance of performance in a single point of time compared to the past and to project

future performance. Using a trend chart, the analyst can determine the impact of actions

taken and whether corrective actions are effective. In evaluating trend charts, one must consider what variables may affect the number of events identified. If the definition of the event is changed, or an activity has been added that increases the likelihood of identification and reporting, the trend results will be affected.

An example of a trend chart is displayed in Appendix B.

Questions to ask when reviewing trend charts may include:

• Did the trending data for the series or grouping of events indicate a positive trend?

• Did the trending data for the series or group of events indicate a negative

trend?

• Can the trend be correlated to specific activities, such as assessments, changes

in processes, changes in requirements, etc?
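One simple way to quantify the direction of a trend is a least-squares slope over the per-period counts; a positive slope means event counts are rising. The quarterly figures below are illustrative only, not from the guide.

```python
def trend_slope(counts):
    """Least-squares slope of event counts per period; for event data,
    a positive slope indicates an adverse trend."""
    n = len(counts)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(counts) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, counts))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

quarterly_events = [12, 10, 9, 7]   # illustrative four-quarter data
slope = trend_slope(quarterly_events)
```

Here the slope is negative, consistent with the improving trend the chart would show; the same caveats about changed definitions or reporting activity apply to any fitted trend.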

5.2.3 Control Charts

Control charts are designed to look for specific variations outside of the normal range. A

control chart is a graphical tool for monitoring changes that occur within a process, by

distinguishing variation that is inherent in the process from variation that indicates a

change to the process. This change may be a single point or a series of points in time -

each is a signal that something is different from what was previously observed and

measured.

Control charts are used to identify changed levels of variability. By establishing control

charts, one can analyze process variability and determine when something unusual is

present. If the change is significant, the analyst then looks for the source(s) of the

variability. The use of control charts can show if the process is in “statistical” control so


that an over-reaction to events can be avoided. Process control charts can show gradual

trends toward the established limits so that those in direct control of the operation can

avoid an abnormal event. In the example of electrical shocks, the frequency of electrical

shocks at different energy levels can be graphed and compared to the control limits

expected in “normal” operations. When these limits are exceeded, we can expect to find

sources of the variation. These should be investigated. Control limits are established

based on the historical data and current expectations. Control limits set at a chosen number of standard deviations from the mean determine how likely it is that normal variation will fall outside the limits.

An example of a control chart is displayed in Appendix B.
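For occurrence counts per period, control limits are commonly set three standard deviations from the mean. The sketch below assumes Poisson-distributed counts (a c-chart, where the standard deviation is the square root of the mean); a site may choose a different chart type, and the monthly counts are illustrative.

```python
import math

def c_chart_limits(counts):
    """Center line and 3-sigma control limits for a count-per-period
    (c) chart, assuming Poisson-distributed counts."""
    mean = sum(counts) / len(counts)
    sigma = math.sqrt(mean)
    return mean, max(0.0, mean - 3 * sigma), mean + 3 * sigma

monthly = [4, 6, 5, 3, 7, 5, 4, 6, 5, 4, 14, 5]   # illustrative monthly counts
center, lcl, ucl = c_chart_limits(monthly)
signals = [i for i, c in enumerate(monthly) if c > ucl or c < lcl]
```

Only the month with 14 occurrences falls outside the limits; the month-to-month variation among the other points is within what "normal" operation would produce, which is exactly the over-reaction the chart helps avoid.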

Questions to ask when reviewing control charts may include:

• Is the frequency of an event or cause acceptable?

• Is the source of the variability of concern?

• Is there an undesirable change in the frequency of an event or cause?

5.2.4 Relationship Analysis

The analysis methods described in this section complement the statistical analysis

methods described above. The statistical analysis methods above lead the analyst to

specific areas with common elements for further relationship analysis. In addition to

looking at events, the apparent causes of dissimilar events may be related and should be

analyzed. This is the point at which the relationship analysis is applied to a smaller set of

events and conditions to provide insight into management, operations and programs. For

the most part, reporting is required based on the consequences of an event or the

seriousness of the noncompliance. This relationship analysis looks at underlying causes

of multiple events or conditions and of multiple noncompliances. Events in which the

consequences appear unrelated may in fact have common causes that would be identified

by this analysis. The analysis can determine the causes of the events or conditions using

the “Why-Because,” “5-Whys,” or “Why-staircase” method, or another suitable method.

The results can be diagramed such that the common causes are connected. The goal of

relationship analysis is for the analyst to look for indications of an ineffective or weak

program, a common underlying cause, or indications of a breakdown in management

control. In cases where the cause of the problem involves human performance, it is

necessary to look beyond human behavior and to examine and determine the error

precursors and latent organizational weaknesses that may have contributed to the human

performance problem.
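The search for common underlying causes across dissimilar events can be sketched by intersecting each event's chain of "why" answers; any cause shared by two or more events is a candidate latent weakness. The events and cause chains below are illustrative only.

```python
def common_underlying_causes(event_why_chains):
    """Given each event's chain of 'why' answers (most immediate first),
    return causes shared by two or more otherwise dissimilar events."""
    cause_to_events = {}
    for event, chain in event_why_chains.items():
        for cause in chain:
            cause_to_events.setdefault(cause, set()).add(event)
    return {c: sorted(ev) for c, ev in cause_to_events.items() if len(ev) >= 2}

# Illustrative data: three dissimilar events with their 5-Whys chains
chains = {
    "valve failure":  ["seal degraded", "PM deferred", "resource management LTA"],
    "missed survey":  ["schedule slipped", "resource management LTA"],
    "training lapse": ["course cancelled", "budget cut"],
}
shared = common_underlying_causes(chains)
```

The valve failure and the missed survey have unrelated consequences, yet the intersection surfaces a single underlying resource-management weakness, which is the kind of finding relationship analysis is meant to produce.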

Questions to ask when looking for commonality across dissimilar events may include:

• Do the apparently isolated series or groups of events indicate a series of

common work process breakdowns or a series of common issues?

• Do apparently isolated series or groups of events collectively indicate a

program weakness?


5.3 Identifying Potential Problems

If a series of two or more events with common causes or conditions under the control of

the contractor is determined by performance analysis to have an unacceptably high

frequency, a recurring occurrence is identified.

If the data analysis results in the identification of a series of noncompliances that are

considered repetitive or if the analysis indicates a common breakdown in a program or

program area or systemic problems having a common underlying cause, the results

should be reviewed for reporting requirements.

If the analysis reveals operating experience that would be beneficial to share with others, a lessons learned report could be prepared.
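The recurring-occurrence screen described above can be sketched as a simple filter: two or more events with a common cause whose frequency over the analysis window exceeds the site's acceptability threshold. The threshold value below is illustrative, not a DOE requirement.

```python
def identify_recurring(events_by_cause, per_quarter_threshold, quarters=4):
    """Flag cause groupings with two or more events whose rate over the
    four-quarter analysis window exceeds the acceptability threshold."""
    recurring = {}
    for cause, events in events_by_cause.items():
        rate = len(events) / quarters
        if len(events) >= 2 and rate > per_quarter_threshold:
            recurring[cause] = rate
    return recurring

flagged = identify_recurring(
    {"lockout/tagout": ["e1", "e2", "e3", "e4", "e5"], "spill": ["e6"]},
    per_quarter_threshold=1.0,
)
```

What counts as "unacceptably high" remains a judgment for the analyst and management; the code only makes the chosen threshold explicit and repeatable.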

5.4 Management Review

If the data analysis results in the identification of non-compliances that are considered

repetitive or if the analysis indicates a common breakdown in a program or program area

or systemic problems having a common underlying cause, the results are presented for

senior managers to review. The senior managers discuss the results of the performance

analysis and either validate the analysis or request additional investigation, as needed.

Often, the recurring issues identified through performance analysis will be at the

contractor level and can only be addressed by senior management commitment (a single

department cannot “solve the problem”). In addition, a robust management review process builds awareness of emerging trends, giving management the opportunity to act proactively and possibly prevent recurring issues.

The management review is an important element of the overall performance analysis

process. Having a thorough and dynamic management review not only ensures that there

are appropriate focuses and proper balances in the analysis process, but also confirms that

management owns the process and the results of the process.

5.5 Document the Performance Analysis

The performance analyst documents the results of the analysis in a report, including the identification of events or conditions that are considered repetitive and any indication of a common breakdown in a program or program area or of systemic problems having a common underlying cause.

DOE requires contractors to identify and report recurring occurrences in ORPS and to

identify and report repeated non-compliances in the nuclear safety and worker safety and

health Noncompliance Tracking System. The reporting requirements should be consulted.


6.0 References

DOE Manual 231.1-2, Occurrence Reporting and Processing of Operations Information

DOE Guide 231.1-1, Occurrence Reporting and Performance Analysis Guide

DOE Office of Enforcement, Enforcement Process Overview

DOE Order 210.2, Operating Experience Program


Appendix A

Performance Analysis Process


Appendix B

Statistical Charts and Matrices

Pareto Charts

[Pareto chart (SAMPLE DATA): quantity and percentage of occurrences by Occurrence Reporting Criterion, in descending order — e.g., 2A(5) Personnel exposure; 10(5) Prompt notification reporting option; 2B(1) Any unplanned fire or explosion; 3A(2) TSR; 4A(1) Performance degradation of any Safety Class; 4B(4) Any facility evacuation; 4B(5) A facility operational event caused by deviation from procedure; 5A(4) Release of hazardous substance; 6A(2) Loss of radioactive material; 6D(3) Any onsite contamination; 10(4) Inquiries to Headquarters; 2C(2) Lockout/Tagout; 4C(3) Discovery of any defective item; 10(3) A near miss; 1(1) An Operational Emergency; 9(2) Noncompliance notifications; 3B(2) Potential inadequate safety analysis; 4A(2) Safety class SSC; 2A(6) Occupational Illnesses/Injuries; 4C(2) Suspect/counterfeit; 3B(1) Positive unreviewed safety question; 10(2) Mgmt Concerns; 6B(4) Identification of onsite legacy. Axes: Quantity; Percentage.]

[Pareto chart (SAMPLE DATA): quantity and percentage of occurrences by Cause Code Category, in descending order — e.g., Design Verification / Installation Verification LTA; External Phenomena; Operability of Design / Environment LTA; Supervisory Methods LTA; Verbal Communication LTA; Calibration for Instruments Less Than Adequate; Inspection/Testing LTA; Written Communication Method of Presentation LTA; Periodic/Corrective Maintenance LTA; Resource Management LTA; Work Practices LTA; Skill Based Errors; Change Management LTA; Design/Documentation LTA; Radiological/Hazardous Material Problem; No Training Provided; Training Methods Less Than Adequate; Design Output LTA; Defective, Failed or Contaminated; Knowledge Based Error; Material Control LTA; Work Organization & Planning LTA; Written Communication Content LTA; Management Methods Less Than Adequate. Axes: Quantity; Percentage.]


Control Chart

[Control chart (SAMPLE DATA): Total Number of ORPS Occurrences, Significance Categories (OE, SC1, R, SC2, SC3, SC4), plotted monthly from Jan-04 through Dec-04; series shown are Total Occurrences, Mean, and Upper Confidence Limit. Vertical axis: Quantity.]


Trend Chart

[Trend chart: Cumulative Collective Dose (person-rem), 1998 through 2006.]

[Trend chart: Annual Doses (person-rem), 1988 through 2004.]


ASSOCIATED HUMAN FACTOR COUPLETS

Associated Couplet Human Factor A3Bx Node

Cause Code B1-Skill B2-Rule B3-Knowledge B4-Work Pract. Total

A2-Equip/Mat'l B2-PM 1 1

B4-Mat'l Cont 2 2

A4-Mgmt B1-Mthds 1 1

B2-Res. Mgmt 6 3 9

A5-Comm B1-Method 1 1

B2-Content 1 2 3

A6-Training B2-Training LTA 8 4 1 2 15

A7-Other B1-External 1 1

None 2 3 2 7

Total 21 7 7 5 40

SAMPLE DATA


CROSS CUTS OF REPORTING CRITERIA AND APPARENT CAUSE CODES

Columns (reporting criteria): 1(1) OE (Operational Emergency); 10(2) Mgmt Concerns; 10(3) Near Miss; 2A(6) Occupational Illnesses/Injuries; 2B(1) Fire; 2C(2) Lockout/Tagout; 3A(2) TSR; 3B(1) USQ/PISA; 3B(2) Potential LTA safety analysis; 4A(1) Performance degradation; 4B(4) Facility evacuation; 6A(2) Loss of Rad Material; 6B(4) Onsite Legacy; 6D(3) Onsite Contamination.

CAUSE CODES 1(1) 10(2) 10(3) 2A(6) 2B(1) 2C(2) 3A(2) 3B(1) 3B(2) 4A(1) 4B(4) 6A(2) 6B(4) 6D(3) Total

A1-Design

B2-Output 1 1 1 1 4

B3-Doc LTA 1 3 4

B4-Install Verif. 1 1

A2-Equip/Mat'l

B1-Calib. 1 1 2

B2-Maint. 1 2 3

B3-Insp/Test 1 1 2

B4-Mat'l Control 1 2 1 4

B6-Defect 1 2 1 1 1 6

A3-Human

B1-Skill 7 9 7 9 9 6 13

B3-Knowledge 1 8 5 6 1 1 1 6

B4-Work 2 1 3

A4-Mgmt

B1-Methods 1 3 1 2 1 8

B2-Resource 2 1 3

B3-Planning 1 2 1 1 1 6

B5-Change Mgmt 1 1 2

A5-Comm.

B1-Methods 1 1

B2-Content 2 1 2 2 1 2 10

B4-Verbal 1 1

A6-Training

B1-No Training 1 2 1 4

B2-Methods 5 5

A7-Other

B1-External 1 1

B2-Rad/Hazardous 2 1 3

Total 11 16 6 15 3 2 10 11 8 1 2 1 3 1 92

SAMPLE DATA