Detecting Improper Laboratory Practices Presentation


Detecting Improper Laboratory Practices

A Toolbox for Assessors

Module 1

Orientation

Module 1 - Topics

• Impact of Improper Practices
• Purpose and Scope of Training
• Assessment Process Flow Chart
• Assessor Roles and Responsibilities
• Assessor Skills and Qualifications
• Responding to Improper Practices
• Definitions and Examples

Impact of Improper Practices

• Sound decision-making requires reliable data of known and documented quality

• Laboratories are coming under increasing scrutiny for improper practices

• Improper practices result in millions of dollars' worth of unreliable data

• Impacts: public health, regulatory programs, industry, laboratories, and individuals

Purpose and Scope of Training

• This course presents tools for identifying improper laboratory practices during routine assessments

• Presumes knowledge of laboratory assessments such as:
  National Environmental Laboratory Accreditation Program (NELAP)
  Good Laboratory Practices (GLP)
  Drinking Water Certification program
  Pre-qualification (e.g., project-specific assessments)

Purpose and Scope of Training

• This is supplemental training for experienced assessors

• Determining intentional wrongdoing is a job for the investigator

The Assessment Process Flow Chart

Assessor Roles and Responsibilities

• Verify adherence to specifications (quality system documents and SOPs)

• Be alert to deficiencies (non-conformance to a standard)

• Probe further when a deficiency is observed
• Have contacts from whom to seek advice
• Follow your assessment plan and observe your assessment program's policies

Assessor Roles and Responsibilities

• Be alert to patterns of behavior that could indicate a systematic weakness

• Document findings and gather sufficient objective evidence to support deficiencies

• Inform management of deficiencies
• Recognize good practices

Assessor Skills & Qualifications

• Assessment training and experience
• Verbal and written communication skills
  Interviewing
  Documenting deficiencies
• Expertise in areas being assessed
  Laboratory instrumentation (including computer skills)
  Data collection, reduction, analysis, and reporting
  Record-keeping

Assessor Skills & Qualifications

• Knowledge of applicable specifications
  Relevant federal and state regulations
  Laboratory assessment and quality systems standards
  Methods
  Minimum QA and QC requirements

Flexibility is essential: Assessors must adapt as they go!

Responding to Improper Practices

• Act impartially and observe due process (innocent until proven guilty)

• DO NOT take actions that could compromise a future investigation

• DO NOT overstep your authority (e.g., by interrogating laboratory analysts until they “confess”)

Responding to Improper Practices (cont’d)

Important Reminders
• There may be a number of explanations for an observation
• When in doubt, gather the objective evidence and records and seek advice
• Be candid with laboratory management during the exit briefing and encourage open dialog about your observations

Responding to Improper Practices (cont’d)

Circumstances that require follow-up:
• You have been approached by a “whistle-blower” or otherwise suspect deliberate wrongdoing

• You observe practices that pose a potential threat to public health and safety

• You observe a pattern of improper practices that raises concerns about data integrity

Responding to Improper Practices (cont’d)

• Your organization should have specific procedures for managing assessment findings, including making referrals

• Before you go on-site, make sure you have identified points of contact for both technical assistance and referrals

Terms Defined

Red flag
Deficiency
Improper practice
Fraud
System vulnerability
Objective evidence
Data assessment

Definitions

Red Flag or Warning Sign: An observation that indicates the potential for, or leads the assessor to suspect, system vulnerabilities or improper practices

Examples:
  Weaknesses in internal assessments/corrective action
  High staff turnover in key areas
  Unclear roles and responsibilities
  Too few QC records to support data output

Definitions

Deficiency: An unauthorized deviation from acceptable procedures or practices; a defect in an item; non-conformance with a specification

Examples:
  The laboratory has no arrangements for annual internal audits (ISO 17025 4.3.1, NELAC 5.xx 2002)
  Lack of written SOPs (GLP standards, 40 CFR Part 160, Section 160.81)

Definitions

Improper Practice: A scientifically unsound or technically unjustified omission, manipulation, or alteration of procedures or data that bypasses the required QC parameters, making the results appear acceptable

Any alteration of data such that the data are unauthentic or untrue representations of the experiment or test performed

Definitions

Improper Practice - Examples:
  Reporting failed QC results as acceptable
  Reporting post-digested spikes as pre-digested
  Changing instrument clock settings or altering date/time information
  Manipulating peak area without approval/documentation
  Selectively dropping calibration points

Definitions

Fraud: A deliberate deception practiced so as to secure unfair or unlawful gain (Webster’s II)

Reminder: An assessor cannot determine whether an improper practice is fraud. A determination of fraud is the conclusion of a legal process that must evaluate the elements of both “intent” and “unlawful gain”. Until that point, the practice in question is an allegation of misconduct.

Definitions

System Vulnerability: Any aspect of a Quality System that allows improper practices to occur and to go undetected

Examples:
  Inadequate training
  Ineffective internal assessments
  Lack of independent QA reviews
  Lack of management controls

Definitions

Objective Evidence: Data supporting the existence or truth of something

Data Assessment: An in-depth review and reconstruction of data from raw/source data through final reporting

Any Questions?

Module 2

Pre-Assessment Activities

Topics

• Requesting Documents and Records
• Pre-Assessment Review
• Pre-Assessment Red Flags
• Developing the Assessment Plan

Requesting Documents and Records

• Qualifications statement
• Organizational chart
• Quality Systems manual
• Resumes for staff performing key functions
• Previous assessment reports
• Standard Operating Procedures (SOPs)

Requesting Documents and Records

• Proficiency Testing (PT) sample results
• Data Packages
  Request specific data packages
  Designate the type of data to be sent
  Include a cross-section of procedures

Pre-Assessment Review Questions

During the pre-assessment review you should answer the following questions:

Pre-Assessment Review

1) Do initial signs point to a well-established internal assessment and corrective action program?

2) Does the pre-assessment submittal document proficiency in requested services?

Pre-Assessment Review

3) Does the technical depth appear to be adequate?

4) Have there been recent, significant organizational changes?

Pre-Assessment Review

5) Was the laboratory’s initial response prompt, complete, and organized?

6) Is management’s commitment to data integrity clearly spelled out?

Pre-Assessment Red Flags

• Weaknesses in internal assessments/corrective action

• Inadequate demonstration of proficiency
• Questionable technical depth
• Recent, significant organizational changes
• Lack of responsiveness
• Lack of management commitment to data integrity

Developing the Assessment Plan

An assessment plan is a written document that identifies the scope and objectives of an assessment and includes:
  Laboratory information, including primary contacts
  Names of assessment team members
  Areas to be assessed
  Documents reviewed during pre-assessment
  Documents to review on-site
  List of laboratory staff to interview
  Schedule of assessment team activities while on-site

Developing the Assessment Plan

1) If you find weaknesses in internal assessments/corrective action, then:
  Examine corrective action reports
  Interview QA staff
  Interview analysts

Developing the Assessment Plan

2) If you question the laboratory’s proficiency, then:
  Interview analysts
  Request process demonstrations
  Review training records
  Conduct data assessment

Developing the Assessment Plan

3) If you have concerns about technical depth, then:
  Review additional resumes
  Check training records
  Interview key staff
  Interview selected analysts

Developing the Assessment Plan

4) If you find recent organizational or ownership changes, then:
  Verify that the quality systems manual has been updated
  Include a cross-section of staff for interviews
  Confirm that SOPs are current
  Confirm that training records are up to date
  Confirm that internal assessment files and corrective action records are up to date

Developing the Assessment Plan

5) If the laboratory has been unresponsive, then:
  Reissue your request
  Report concerns during the opening meeting
  Conduct on-site interviews of technical management, QA staff, and customer service staff

Developing the Assessment Plan

6) If a strong management commitment to data integrity is not apparent, then:
  Interview staff at all levels
  Review the ethics or data integrity policy
  Review internal assessment records and corrective action reports
  Review training records
  Perform data assessment

Summary

• Note areas of concern during pre-assessment for follow-up

• Identify additional resources for technical questions

• Determine what information you will need to obtain on-site

Any Questions?

Module 3

Interviewing

Topics

• Effective Interviewing Skills
• Setting the Stage
• Using Core Questions
• Useful Interview Phrases
• Know How to Respond to Questions
• Interviewing Do’s and Don’ts

Effective Interviewing Skills

• Organizational skills
• Maintaining a non-threatening demeanor
• Ability to listen
• Ability to control situations and manage time
• Ability to read body language

Effective Interviewing Skills

• Ability to redirect
• Ability to follow a lead
• Accurately documenting interviews
• Ability to recognize when to refer to other experts
• Knowing the right questions to ask

Setting the Stage

• Start slowly
• Put interviewees at ease
• Tell them:
  who you are
  why you are there
  what you want to talk about
  what you are going to do with the information

Using Core Questions

• Asking several employees the same set of core questions can provide a sense of how the laboratory functions. Begin with:
  What are your responsibilities?
  How do you like working here?
  How long have you been working here?
  How do you handle overtime?

Using Core Questions (cont’d)

• Move into job function/responsibilities:
  What do you do in a typical day?
  Who is your backup?
  Can you explain the data review process?
  What is your repeat rate? Do you get a lot of rush jobs?
  Tell me about recent training you have received
  Does the laboratory have a bonus program? How does it work?

Using Core Questions (cont’d)

• Draw out more specific information:
  How are you assigned work?
  How do you document which work you did?
  How do you interact with your quality assurance staff?
  What do you do, and where do you go, when you have a problem?

Using Core Questions (cont’d)

• Draw out more specific information (cont’d):
  What would you do if you came across an improper practice?
  Describe the laboratory’s process for corrective action
  Has anyone (such as a project manager) ever asked you to change data?
  Tell me about the laboratory’s data integrity program

Useful Interview Phrases

• Let’s move on to another topic…
• Why do you do that…?
• Tell me about…
• Where do you document…?
• Can you show me…?

Useful Interview Phrases

• I’m confused; can you help me understand…?

• This is interesting…
• I came across this in our data review…
• [Someone] mentioned this to me; how do you handle that here?
• You seem extremely busy...

Be Prepared to Answer...

• Am I going to get into trouble for talking with you?

• Should my supervisor be here?
• Can I talk to you about something off the record?
• I saw something recently that concerns me. What should I do?

Be Alert to...

• Managers translating or clarifying staff’s responses to questions

• Emphasis on production

Interview Do’s

• Be prepared
• Be objective and impartial
• Adapt your interview style to the individual
• Ask who else to talk to
• Consider having a witness (scribe) with you
• Know when to move on or stop

Interview Don’ts

• Don’t overpower the conversation
• Don’t respond for the interviewee; silence is OK
• Don’t take statements at face value -- check your information
• Don’t overtax your schedule; conduct fewer, more thorough interviews

Summary

• Interviews are a critical part of the assessment process

• Plan for effective interviews
• Practice effective interviewing skills

Any Questions?

Module 4

Overview of Improper Practices

Topics

• Types of Improper Practices
• Causes/Factors
• Quality System Vulnerabilities

Improper Practice

Improper Practice: A scientifically unsound or technically unjustified omission, manipulation, or alteration of procedures or data that bypasses the required QC parameters, making the results appear acceptable

Any alteration of data such that the data are unauthentic or untrue representations of the experiment or test performed

Quality System Vulnerabilities

• Definition: Any aspect of a Quality System that allows improper practices to occur and to go undetected

• Weaknesses in the following areas:
  Management commitment to data integrity
  Resources
  Qualifications and training
  Supervision and oversight
  Preventative and corrective actions
  Document control

Types of Improper Practices

1) Fabrication of data or other information
2) Misrepresentation of QC sample results
3) Improper date/time setting or recording
4) Improper peak integration
5) Improper GC/MS tuning
6) Improper calibration and verification

Types of Improper Practices (cont’d)

7) Data file substitution or modification
8) Unwarranted sample dilution
9) Deletion of non-compliant data
10) Improper alteration of analytical conditions
11) Unwarranted manipulation of computer software
12) Concealment of a known problem

1) Fabrication

• Creating information that is not true
• Creating data for an analysis that was not performed
• Creating information for a sample that was not collected
• Claiming ownership of work performed by external analysts, equipment, or facilities
• Cutting and pasting reports and supporting data

Examples:
  Subcontracting PT samples
  Creating CoCs without sample possession
  Filling in logbooks for audits
  “Reappearing” QC results
  Recording autoclave conditions before starting the process

2) Misrepresentation of QC Results

Improperly processing or reporting QC samples

Examples:
  Representing QC samples as digested or extracted when they were not
  Adding surrogates after sample extraction
  Adding more than the prescribed amount of spikes
  Reporting post-digested spikes as pre-digested
  Failing to prepare blanks, spikes, PT samples, or standards in the same manner as samples

3) Improper Date/Time Setting

Altering the recorded times that samples were collected, extracted, or analyzed

Examples:
  Resetting instrument clocks to make it appear that a sample was analyzed within holding time (a simple holding-time cross-check is sketched after this list)
  Altering dates/times on printouts and/or screen prints to make analyses appear to meet 12-hour windows
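
For illustration only, a minimal Python sketch of the kind of holding-time cross-check an assessor might script during data assessment; the sample records and the 14-day holding time are hypothetical, and the governing limit always comes from the method:

from datetime import datetime, timedelta

# Hypothetical sample records with collection and reported analysis times.
records = [
    {"id": "MW-01", "collected": "2016-03-01 09:15", "analyzed": "2016-03-14 10:40"},
    {"id": "MW-02", "collected": "2016-03-01 09:30", "analyzed": "2016-03-16 08:05"},
]
HOLD_DAYS = 14  # assumed holding time for this illustration

fmt = "%Y-%m-%d %H:%M"
for rec in records:
    collected = datetime.strptime(rec["collected"], fmt)
    analyzed = datetime.strptime(rec["analyzed"], fmt)
    elapsed = analyzed - collected
    if elapsed > timedelta(days=HOLD_DAYS):
        # A reported analysis outside holding time is a prompt to compare the run
        # time against instrument clock settings and audit trail records.
        print(f"{rec['id']}: analyzed {elapsed} after collection, "
              f"exceeding the {HOLD_DAYS}-day holding time")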

4) Improper Peak Integration

Altering the area of a chromatographic peak to avoid QC failures

Examples:
  Adding or subtracting peak area to make QC results appear to meet criteria
  Artificially reducing the height of peak responses
  Failing to manually subtract an interfering peak because doing so would result in a QC failure

5) Improper GC/MS Tuning

Manipulating ion abundances in the MS tune verification so that the abundances appear to meet criteria

Examples:
  Choosing non-representative scan(s) for evaluation
  Performing incorrect background subtraction
  Injecting incorrect amounts (BFB surrogate in CCV)
  Copying and renaming files and screen printing
  Adding spectra from two different files

6) Improper Calibration/Verification

Any technically unsound deviation from proper calibration techniques

Examples:
  Recording results for pH meter calibrations that were not performed
  Performing multiple calibration runs or QC analyses
  Representing previous initial calibration data as current
  Inserting calibration levels or CCVs run well after the initial calibration
  Discarding analyte responses from the center of the calibration curve without technical justification (illustrated in the sketch below)
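
To see why dropped points matter, here is a small Python sketch using made-up calibration data and an assumed r >= 0.995 acceptance criterion (the real criterion comes from the method); quietly discarding an out-of-trend mid-curve standard turns a failing curve into one that appears to pass:

import numpy as np

# Hypothetical 5-point initial calibration (concentration vs. response);
# the mid-level standard is out of trend.
conc     = np.array([1.0, 5.0, 10.0, 20.0, 50.0])
response = np.array([980.0, 5100.0, 5500.0, 20500.0, 49800.0])

r_all = np.corrcoef(conc, response)[0, 1]
r_dropped = np.corrcoef(np.delete(conc, 2), np.delete(response, 2))[0, 1]

print(f"r with all points:              {r_all:.4f}")      # ~0.995, fails the assumed criterion
print(f"r with mid-level point dropped: {r_dropped:.4f}")  # ~0.9999, appears to pass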

7) Data File Substitution/Modification

Substituting previously generated analyses for non-compliant calibration, QC or sample runs

Examples:
  Reusing historical initial calibration data and representing it as current
  Changing sample IDs in the data files

8) Unwarranted Sample Dilution

Diluting samples or blanks without explanation, often to the point of eliminating target analyte responses

Example:
  Diluting a sample to reduce laboratory contamination below the method detection limit (MDL); see the sketch below
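
A short, hypothetical back-of-the-envelope sketch in Python (assumed MDL and contamination values) showing how the effective reporting limit scales with the dilution factor, which is why an unexplained dilution can make real contamination vanish into a non-detect:

# Hypothetical values only: the effective reporting limit rises with dilution.
mdl = 0.5            # assumed method detection limit, ug/L
contamination = 2.0  # assumed blank contamination, ug/L, in undiluted terms

for dilution in (1, 5, 10):
    effective_limit = mdl * dilution
    status = "reported" if contamination > effective_limit else "hidden below the effective limit"
    print(f"{dilution}x dilution: effective limit {effective_limit:.1f} ug/L, contamination {status}")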

9) Deletion of Non-Compliant Data

• Deleting non-compliant analytical results for calibrations, QC samples or blanks

• Failing to record non-compliant data for these samples

Examples:
  Deleting common laboratory contaminant results from method blanks
  Recording in the laboratory notebook only those assays that work

10) Improper Alteration of Analytical Conditions

Changing analytical or instrument conditions for standards or QC samples from those specified in the method or SOP

Examples:
  Adjusting EM voltage
  Increasing gain
  Failing to run samples and standards under the same conditions

11) Unwarranted Software Manipulation

• Removing operational codes to eliminate or hide manipulations

• Performing inappropriate background subtractions
• Adjusting baselines

Example:
  Removing data qualifying flags (e.g., removing the “m” flag to hide the fact that a manual integration was performed)

12) Concealment of a Known Problem

• Concealing a known sample problem or QC failure

• Concealing an unethical or improper practice

Examples:
  Failure to discuss surrogate or CCV failures in the case narrative
  Failure to report and resolve equipment malfunction issues

Improper Practices: Causes and Factors

• Production pressure
• Conflicts of interest
• Lack of awareness
• Lack of communication
• Misinterpretation of method requirements
• Personality and attitude
• Financial stability

Improper Practices: Quality System Vulnerabilities

• Inadequate training
• Poor workload management
• Inadequate documentation and document control
• Unclear roles/responsibilities
• Inadequate procedures for addressing ethical dilemmas
• Inadequate oversight

Improper Practices: Detection Tools

• Records reviews
• Process demonstrations
• Interviews
• Data assessment

Any Questions?

Module 5

On-Site Assessment Part 1

Topics

• Opening Meeting and Lab Tour Red Flags
• Assessing the Quality System
• Quality System Red Flags
• Follow-Up on Quality System Red Flags
• Technical Area Assessment
• Technical Area Red Flags
• Case Studies

Flow Chart

On-Site Assessment: Opening Meeting & Tour → Quality System → Technical Areas → Data Assessment

Opening Meeting and Laboratory Tour

• Make introductions
• Establish ground rules
• Set up separate space and times for team conferences
• Conduct brief laboratory tour

Opening Meeting and Laboratory Tour Red Flags

• Time wasters
  Lengthy health and safety overview
  Non-relevant briefings (history, sales)
• Selected staff not available for interviews
• Housekeeping practices
• Unexplained restricted access

Flow Chart

On-Site Assessment: Opening Meeting & Tour → Quality System → Technical Areas → Data Assessment

Importance of the Quality System

• An effective Quality System is critical to a laboratory’s timely detection, correction, and deterrence of improper practices

• An ineffective Quality System can allow shortcuts and improper practices to occur and to go unnoticed

Assessing the Quality System

• Documents and Records
  Quality Manual
  Resumes
  Training records
  Assessment and corrective action reports
• Interviews
  General Manager, Technical Director(s), QA Manager, QA staff, laboratory staff

Management Commitment to Data Integrity

Red Flags

• Lack of a data integrity policy or statement
• Lack of a no-fault mechanism
• Emphasis on production over quality

Resources Red Flags

• Inadequate backup (personnel and equipment)

• Poor coordination for accepting work
• Excessive overtime
• “Bottleneck” departments

Qualifications and Training Red Flags

• Lack of technical depth
• High staff turnover
• Gaps in training records

Supervision and Oversight Red Flags

• QA staff lacks direct access to senior management

• Unclear roles and responsibilities
• QA staff performing competing responsibilities
• Inadequate internal assessments
• Lack of data assessments

Preventative and Corrective Action Red Flags

• Incomplete assessment files
• Repeat assessment findings
• Inadequate procedures for handling complaints
• Corrective action fails to address root cause
• Lack of documented follow-up to corrective action

Document Control Red Flags

• Lack of record verification in archiving procedures

• Inadequate archiving procedures to ensure data retrieval

Follow-Up on Quality System Red Flags

• As appropriate, verify:
  Adequate coverage for key roles
  Clear lines of communication and reporting
  Authority and accountability for data integrity
  QA manager independence
  Adequate procedures to avoid conflicts of interest

Flow Chart

On-Site Assessment: Opening Meeting & Tour → Quality System → Technical Areas → Data Assessment

Technical Assessment Tools

• Document/Record Reviews
• Process Demonstrations
• Interviews
• Data Assessment

Technical Assessment Tools: Document/Record Reviews

• Records subject to on-site review:
  PT sample results
  Method Detection Limit (MDL) studies
  Control charts
  Sample receipt and handling records
  Standard materials preparation and use records
  Standard Operating Procedures (SOPs)
  Instrument run logs, maintenance logs
  Initial and continuing demonstrations of performance

Document/Record Review Red Flags

• Records not readily accessible
• Discrepancies between method and SOP
• Referenced methods out of date
• Presence of pencils or whiteout
• Extremely clean, neat logbooks
• Uncontrolled records
• Logbook entries all aligned

Technical Assessment Tools: Process Demonstration

• Select a process to observe and schedule with the laboratory prior to going on site

• Review the SOP
• Ask the analyst to describe the steps in the process as they are conducted

Process Demonstration Red Flags

• Discrepancies between SOP and practice
• Expired standards
• Disabled audit trails
• Shared log-on and password access
• Unlabeled containers or illegible labels
• Sample volume discrepancies
• Materials inventory doesn’t match throughput
• Sample throughput exceeds time required to process samples

Technical Assessment Tools: Interviews

• Set the stage
• Use core questions
• Be prepared

Interview Red Flags

• Managers interfering with analysts’ responses
• Emphasis on production
• Inadequate sample handling procedures for evening/weekend deliveries
• No mechanism for reporting problems
• Analysts unable to describe data review and/or oversight

Any Questions?

Case Study # 1

The Strange Case of the Missing Ocular

CS #1: The Strange Case of the Missing Ocular

• What are the “morals” of the story?

• Describe the objective evidence

More Case Studies

CS # 2: Easy Does It

CS # 3: Assessor Clousseau

CS # 2: Easy Does It

• What types of improper practices have occurred?

• What was the specific red flag?
• What other red flags might have been apparent?
• What quality system elements are key to early detection/correction?
• What other steps could an assessor take?

CS # 3: Assessor Clousseau

• How did you uncover this debacle?
• Where did you start?
• Trace the trail

Module 6

On-Site Assessment Part 2: Data Assessment

Flow Chart

On-Site Assessment: Opening Meeting & Tour → Quality System → Technical Areas → Data Assessment

Topics

• Application of Data Assessment
• Data Assessment Process
• Data Assessment Red Flags

Application of Data Assessment

• Data assessment may be performed:
  During the pre-assessment process
  As a part of the on-site process
  Following the on-site process
• It is especially important to conduct data assessment when:
  High visibility projects are involved
  Data integrity red flags have surfaced during other phases of the laboratory assessment

Triggers for Data Assessment

• Incomplete or illegible records
• Improper changes, corrections, or method deviations
• Inadequate technical proficiency
• Data review and reduction deficiencies
• Computer security deficiencies
• Inadequate internal oversight or surveillance

Data Assessment Process

• Step 1 - Select the data packages
  Request initial data packages during the pre-assessment process
  Request additional packages based on your on-site observations, for example:
  • Red flags related to technical proficiency
  • High visibility projects
  • High volume business, or “bottleneck” departments

Data Assessment Process (cont’d)

• Step 2 - Conduct a Paper Review
  Purpose
  • Links analyst to data
  • Tracks samples to client report
  • Answers key questions regarding systems and operations
  • Identifies red flags for further probing
  Process
  • Review PT data packages
  • Look at data from complaints or visible projects
  • Identify red flags and inconsistencies

Data Assessment Red Flags: Paper Review

• Incomplete case narratives
• Unexpected sample results
• Too-perfect QC results (see the sketch after this list)
• Reports missing review signatures or dates
• Discrepancies between CoC and reports
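
One way to screen for too-perfect QC during a paper review is to look at the spread of reported recoveries. A minimal Python sketch with hypothetical recoveries and an assumed 0.5% screening threshold; a tight spread is only a prompt to pull the raw data, not a finding by itself:

import statistics

# Hypothetical matrix spike recoveries (%) compiled from a set of data packages.
recoveries = [99.8, 100.1, 100.0, 99.9, 100.0, 100.1, 99.9, 100.0]

mean_rec = statistics.mean(recoveries)
sd_rec = statistics.stdev(recoveries)
print(f"mean recovery {mean_rec:.1f}%, spread (1 s.d.) {sd_rec:.2f}%")

# Recoveries normally scatter by several percent across batches; the 0.5% cut-off
# here is an assumed screening value for illustration.
if sd_rec < 0.5:
    print("recoveries are implausibly tight -- a 'too-perfect QC' red flag")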

Data Assessment Process (cont’d)

• Step 3 - Evaluate electronic/source data
  Purpose
  • Allows comparisons between source data and the final report
  • Allows comparisons between analysts
  • Can detect system vulnerabilities and improper practices
  • Requires understanding of instrument operations and analytical procedures
  Process
  • Pay attention to self-written programs
  • Look at audit trails
  • Compare raw data to submitted data packages (a comparison of this kind is sketched below)
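
Where electronic exports are available, part of this comparison can be scripted. A minimal Python sketch assuming two hypothetical CSV files (instrument_export.csv and final_report.csv) with sample_id and result columns; the file names, column headings, and 1% tolerance are illustrative only:

import csv

# Load hypothetical exports: one from the instrument data system, one built
# from the final report.
def load(path):
    with open(path, newline="") as f:
        return {row["sample_id"]: float(row["result"]) for row in csv.DictReader(f)}

raw = load("instrument_export.csv")
reported = load("final_report.csv")

for sample_id, raw_value in raw.items():
    rep_value = reported.get(sample_id)
    if rep_value is None:
        print(f"{sample_id}: present in the raw data but missing from the report")
    elif abs(rep_value - raw_value) > 0.01 * abs(raw_value):
        print(f"{sample_id}: raw {raw_value} vs reported {rep_value}; ask for the calculation trail")

# Results reported with no matching raw record are also worth a question.
for sample_id in reported.keys() - raw.keys():
    print(f"{sample_id}: reported with no matching raw record")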

Data Assessment Red Flags: Source Data

• Unexplained gaps/changes in records
• Discrepancies in QC performance between analysts
• Calibration records missing review signatures/dates
• “Reappearing” QC results
• Non-standard report formats

Data Assessment Process (cont’d)

• Step 4 - Reconstruct the data
  Purpose
  • Verifies data traceability from source data to final report
  • Validates laboratory operations
  • Requires the most time, effort, and access to expertise
  Process
  • Trace and verify dates, times, sample custody, analysts, instruments
  • Look at logbooks, preparation logs, analytical sequences
  • Request supporting data or additional packages if needed

Data Assessment Red Flags: Data Reconstruction

• Routine dead time on auto-injection run logs
• Samples or blanks diluted without apparent justification
• Too few QC records to support data output (see the sketch below)
• Chronological chromatograms that do not display prevalent background
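
A rough way to test whether the QC records can support the reported output is a simple count. A hypothetical Python sketch assuming one method blank and one LCS per 20-sample batch; the actual QC frequencies come from the method or QAPP, and the counts here are made up:

import math

# Hypothetical counts pulled from the review period.
samples_reported = 412
method_blanks_on_file = 8
lcs_on_file = 9

batches_needed = math.ceil(samples_reported / 20)  # assumed 20-sample batch size
print(f"at least {batches_needed} blanks and {batches_needed} LCS records expected")
if method_blanks_on_file < batches_needed or lcs_on_file < batches_needed:
    print("too few QC records to support the reported data output -- probe further")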

Finding Data Assessment Red Flags

• Sample Receipt Records: Transcription errors, date/time discrepancies, inappropriate containers, improper preservation or storage
• Wet Chemistry Records: Discrepancies between original data and reports, inappropriate sample sequencing, “reappearing” QC results

Finding Data Assessment Red Flags (cont’d)

• Sample Preparation/Extraction Records: Failure to meet holding times, inappropriate sample volumes or dilutions, improper sample processing cycles
• Calibration Records: Improper calibration procedures (e.g., dropping points, re-use of historical calibration data)

Finding Data Assessment Red Flags (cont’d)

• Logbooks, sequences, data sheets, run logs: Improper changes or corrections, data fabrication
• Chromatograms: Improper electronic peak integration, improper manual integration, improper background subtraction

Important Things to Remember

• Be flexible
• You don’t have to be an expert
• Take more time and get assistance when you need it

Any Questions?

Case Study # 4

Dip and Swirl

CS # 4 - Dip and Swirl

• What other red flags might be revealed through data assessment?

• What system vulnerabilities are indicated?
• What should the assessor do next?
• What should laboratory management do?

Module 7

Documentation and Follow-up

Topics

• Recording Findings
• Assessment Team Meetings
• Exit Briefing
• Follow-up and Closeout
• Adapting the Flowchart

Recording Assessment Findings

• Record all relevant information legibly in ink
• Record and compile information as you go
  Assessment checklists
  Assessor notebooks
  Copies of relevant laboratory records

Recording Assessment Findings (cont’d)

• When conducting interviews and process demonstrations, record the following:
  Names and titles of all persons present
  Date, time, and place of interview/demonstration
  Type of process observed or reviewed
  Instrumentation, method, and SOP used
  Topics discussed
  Observations

Recording Assessment Findings (cont’d)

• When reviewing documents and records, record the following information:
  Type of record reviewed
  Date, time, place and location of record
  Observations
• As appropriate, make a copy of the record supporting deficiencies/improper practices
  Sign, date, and number the copy

Assessment Team Meetings

• Conduct regular assessment team working sessions
  Compare notes for completeness
  • Observations, deficiencies, improper practices
  Identify and collect any additional objective evidence needed
  • Laboratory records, interviews

Exit Briefing

• Follow your assessment program’s protocols!
  Determine who should be present
  Summarize assessment findings objectively
  Provide the laboratory with a schedule for completing follow-up activities

Exit Briefing (cont’d)

Exception! If you suspect or have discovered improper practices:
• Describe the finding in general terms (as an observation or deficiency)
• Complete the evaluation off-site if necessary

If you suspect misconduct:
• Gather your records
• Refer to your assessment program’s procedures

Follow-up and Closeout

• Prepare assessment report; include required corrective action

• Review laboratory response with Corrective Action Plan

• Prepare response to Corrective Action Plan
• Conduct follow-up assessment (if needed)
• Issue or deny accreditation or approval
• Ensure assessment file is complete

Contents of the Assessment File

• Purpose of the assessment
• Laboratory application or nomination
• List of pre-assessment records reviewed
• Letter announcing on-site assessment, including agenda
• Opening meeting notes
• Checklists
• Interview records

Contents of the Assessment File (cont’d)

• Exit briefing notes
• Assessment report
• Laboratory response with Corrective Action Plan
• Assessor response to Corrective Action Plan
• Final decision

Any Questions?

Case Study # 5

The Oil Slick

CS # 5 - The Oil Slick

• What are possible technical area red flags?• What are possible data assessment red

flags?• What are possible quality system red flags?• What should the QA Manager do first?• What should be done to determine impacts?• Is retraining an effective correction action?

Module 8

Wrap-up

Case Study # 6 - Name that Tune

• What should you do next?
• If the audit trail confirms the tune file has been overwritten, what should you do?
• What would you expect to see in the laboratory’s Corrective Action Plan?

Case Study # 7 - Time Warp

• What system vulnerabilities are indicated?
• What red flags might have been apparent during a project compliance assessment?
• What assessment tools would have been most effective in uncovering this problem?

Any final questions?

Thank you!