Technology Integration and Improved Technology Maturity Assessments


Transcript of Technology Integration and Improved Technology Maturity Assessments

Page 1: Technology Integration and Improved Technology Maturity Assessments


MIT Lincoln Laboratory

Technology Integration and Improved Technology Maturity Assessments

Kyle Y. Yang, MIT Lincoln Laboratory
James Bilbro, JBCI Inc.

AIAA Southern California Aerospace Systems and Technology Conference

2 May 2009

Presenter
Presentation Notes
James Bilbro is the former Chief Technologist at NASA Marshall and former president of the SPIE.
Page 2: Technology Integration and Improved Technology Maturity Assessments


The Context of the Study

• Continuing DoD acquisition budget overruns and schedule slips

• Bright spot: Programs with “mature” technologies and knowledge-based practices fare better

– Programs with “immature” technologies undergo 44% more cost growth

• In 2006, the Air Force launched process reengineering activities to find money to allow continued fleet modernization

                                                        2000 Portfolio    2007 Portfolio
Portfolio size
  Number of programs                                          75                95
  Total planned commitments                              $790 Billion     $1,600 Billion
Portfolio performance
  Change to total RDT&E costs from first estimate             27%               40%
  Change in total acquisition cost from first estimate         6%               26%
  Estimated total acquisition cost growth                 $42 Billion       $295 Billion
  Average schedule delay to IOC                            16 months         21 months

Source data: GAO-08-467SP Assessments of Selected Weapon Systems

Presenter
Presentation Notes
It is well known that DoD programs tend to suffer overruns and schedule slips. In the most recent GAO report on the matter, programs had increased in cost by 40% in RDT&E dollars and had slipped by 21 months. The bright spot is that programs with mature technologies fare much better – the ones with immature technologies have 44% more cost growth. It was, in part, as a result of this type of data that the Air Force launched the AFSO21 initiative, part of which was to improve the success of acquisition programs. The work discussed today was done for the Develop and Sustain Warfighting Systems process of the AFSO21 initiative.
Page 3: Technology Integration and Improved Technology Maturity Assessments


Study Description

• Goal is to reduce schedule slip and cost growth due to immature technology by

– Reducing the likelihood that immature technology is accepted into acquisition programs

or

– Better revealing upfront the risks associated with accepting immature technology

Presenter
Presentation Notes
Within Develop and Sustain Warfighting Systems, the technology development subprocess started this initiative. I led the initiative for the majority of its time.
Page 4: Technology Integration and Improved Technology Maturity Assessments


Outline

• Background

• Existing Methodology (TRL Scale)

• Methods to Augment the TRL Scale

– Manufacturing

– Integration & “Ilities”

• Summary

Page 5: Technology Integration and Improved Technology Maturity Assessments

[Figure: DoD acquisition life cycle chart (Dec 2008, ver. 5.1) showing the phases (Materiel Solution Analysis, Technology Development, Engineering and Manufacturing Development, Production & Deployment, and Operations & Support), the Materiel Development Decision, Milestones A, B (Program Initiation), and C, the Post-CDR Assessment, the FRP Decision Review, IOC, and FOC, plus the points at which the systems engineering technical reviews and Technology Readiness Assessments (TRAs) occur.]

Systems Engineering Technical Reviews:

• Initial Technical Review (ITR)
• Alternative Systems Review (ASR)
• Systems Requirements Review (SRR)
• System Functional Review (SFR)
• Preliminary Design Review (PDR)
• Critical Design Review (CDR)
• Post-PDR Assessment (Post-PDRA)
• Post-CDR Assessment (PCDRA)
• Test Readiness Review (TRR)
• System Verification Review (SVR)
• Functional Configuration Audit (FCA)
• Production Readiness Review (PRR)
• Operational Test Readiness Review (OTRR)
• Physical Configuration Audit (PCA)
• Technology Readiness Assessment (TRA)
• In-Service Review (ISR)

Presenter
Presentation Notes
Effective Technical Reviews are a critical part of the Technical Assessment process. For programs that start at Milestone B, and other programs that may experience significant design changes during the Technology Development phase, another series of technical reviews (SRR, SVR and PDR) may be required after Milestone B. In that case, the MDA will conduct a Post-PDR Assessment review and issue an ADM indicating the program is on track to meet EMD exit criteria and APB thresholds. IBRs, OTRR and AOTR: Not shown here are the Integrated Baseline Review (IBR) (essentially a business review), the Operational Test Readiness Review (OTRR) (a review conducted for the SAE to ensure readiness to proceed to operational testing), and the Assessment of Operational Test Readiness (AOTR) (a review conducted for designated ACAT ID and special interest programs by OUSD(AT&L)/Systems and Software Engineering). These reviews also consider technical issues. The IBRs and OTRR are highlighted in draft Chapter 4 of the DAG. The AOTR requirement is in 5000.02.
Page 6: Technology Integration and Improved Technology Maturity Assessments


Technology Readiness Assessment (TRA)

• Prompted by continuing acquisition failures, the law now requires that “the technology in the program has been demonstrated in a relevant environment” at milestone B

– Previously, many waivers were granted
  No longer being granted except for true emergencies (e.g., MRAP)

• DoD conducts technology readiness assessment (TRA) at milestones B, C

– Regulations have been formed around the Technology Readiness Level (TRL) scale

Minor generalizations from NASA scale
• TRA process & “deskbook*” formalized to address concerns about TRL assessment
– Repeatability
– Definition of a “relevant environment”
– Objectivity forced by independent team for Major programs

*See: https://acc.dau.mil/CommunityBrowser.aspx?id=18545

Presenter
Presentation Notes
New version of the TRA deskbook with better “relevant environment” definitions should be forthcoming within weeks. Let’s give them the benefit of the doubt and say that the deskbook cleans up the issues listed here. Are there still remaining issues? Yes, indeed; see the next slides.
Page 7: Technology Integration and Improved Technology Maturity Assessments


Measuring Technology Readiness (DoD TRA Deskbook, May 2005)

9. Actual system proven through successful mission operations (sw mission-proven operational capabilities)

8. Actual system completed and qualified (sw mission qualified) through test and demonstration (sw in an operational environment)

7. System prototype demonstration in an operational (sw high-fidelity) environment

6. System/subsystem model or prototype demonstration in a relevant environment (sw module and/or subsystem validation in a relevant end-to-end environment)

5. Component and/or breadboard (sw module and/or subsystem) validation in relevant environment

4. Component and/or breadboard validation in laboratory environment

3. Analytical and experimental critical function and/or characteristic proof-of-concept

2. Technology concept and/or application formulated

1. Basic principles observed and reported

[Figure: Technology Readiness Levels (TRLs) “thermometer” grouping TRL 1-9 into phases: System Test, Launch & Operations; System/Subsystem Development; Technology Demonstration; Technology Development; Research to Prove Feasibility; and Basic Technology Research.]
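For readers who want to fold the TRL scale into their own assessment tooling, here is a minimal Python sketch of one way to encode it; this illustrative encoding is not part of the TRA deskbook, and the Milestone B check simply restates the statutory TRL 6 threshold noted earlier.

```python
from enum import IntEnum

class TRL(IntEnum):
    """Technology Readiness Levels, abbreviated from the hardware wording above."""
    BASIC_PRINCIPLES = 1         # Basic principles observed and reported
    CONCEPT_FORMULATED = 2       # Technology concept and/or application formulated
    PROOF_OF_CONCEPT = 3         # Analytical/experimental proof-of-concept
    LAB_VALIDATION = 4           # Component/breadboard validation in a laboratory environment
    RELEVANT_ENV_VALIDATION = 5  # Component/breadboard validation in a relevant environment
    RELEVANT_ENV_DEMO = 6        # System/subsystem prototype demonstration in a relevant environment
    OPERATIONAL_ENV_DEMO = 7     # System prototype demonstration in an operational environment
    QUALIFIED = 8                # Actual system completed and qualified through test and demonstration
    MISSION_PROVEN = 9           # Actual system proven through successful mission operations

def meets_milestone_b_threshold(level: TRL) -> bool:
    """By statute, technology must be demonstrated in a relevant environment (TRL 6) by Milestone B."""
    return level >= TRL.RELEVANT_ENV_DEMO

print(meets_milestone_b_threshold(TRL.LAB_VALIDATION))        # False
print(meets_milestone_b_threshold(TRL.OPERATIONAL_ENV_DEMO))  # True
```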

Presenter
Presentation Notes
Key notes:
- By law, must achieve TRL 6 by Milestone B.
- On ACATs, we conduct TRL assessments only on critical technology elements, or CTEs. The process, called the TRA process, is governed by OSD DDR&E.
- How TRL assessments are conducted on ACAT II and III programs is not yet well standardized.

Technology Readiness Levels:
1. Basic principles observed and reported. Lowest level of technology readiness. Scientific research begins to be translated into applied research and development. Examples might include paper studies of a technology’s basic properties.
2. Technology concept and/or application formulated. Invention begins. Once basic principles are observed, practical applications can be invented. Applications are speculative and there may be no proof or detailed analysis to support the assumptions. Examples are limited to analytic studies.
3. Analytical and experimental critical function and/or characteristic proof of concept. Active research and development is initiated. This includes analytical studies and laboratory studies to physically validate analytical predictions of separate elements of the technology. Examples include components that are not yet integrated or representative.
4. Component and/or breadboard validation in laboratory environment. Basic technological components are integrated to establish that they will work together. This is relatively “low fidelity” compared to the eventual system. Examples include integration of “ad hoc” hardware in the laboratory.
5. Component and/or breadboard validation in relevant environment. Fidelity of breadboard technology increases significantly. The basic technological components are integrated with reasonably realistic supporting elements so the technology can be tested in a simulated environment. Examples include “high fidelity” laboratory integration of components.
6. System/subsystem model or prototype demonstration in a relevant environment. Representative model or prototype system, which is well beyond that of TRL 5, is tested in a relevant environment. Represents a major step up in a technology’s demonstrated readiness. Examples include testing a prototype in a high-fidelity laboratory environment or in a simulated operational environment.
7. System prototype demonstration in an operational environment. Prototype near, or at, planned operational system. Represents a major step up from TRL 6, requiring demonstration of an actual system prototype in an operational environment such as an aircraft, vehicle, or space. Examples include testing the prototype in a test bed aircraft.
8. Actual system completed and qualified through test and demonstration. Technology has been proven to work in its final form and under expected conditions. In almost all cases, this TRL represents the end of true system development. Examples include developmental test and evaluation of the system in its intended weapon system to determine if it meets design specifications.
9. Actual system proven through successful mission operations. Actual application of the technology in its final form and under mission conditions, such as those encountered in operational test and evaluation. Examples include using the system under operational mission conditions.
Page 8: Technology Integration and Improved Technology Maturity Assessments


Remaining TRL Issues

• The TRL scale has utility for decision makers at higher levels

– “Rear-view mirror”: where have we been?
– Helps to plan and explain development
– TRL is being used by federal agencies (DoD, DoE, NASA, FAA, etc.) and allies (NATO, GBR, Canada, …)

• It also has significant shortcomings
– DoD TRA process only looks at Critical Technology Elements (CTEs)
  TRL 6 definition is used for subsystems, not systems
– Nonlinearity
  Huge leap from 6 to 7
  Almost unused after TRL 6
– Does not account for integration or manufacturing
– Does not indicate the difficulty (risk) of moving forward up the scale

Presenter
Presentation Notes
Self explanatory slide.
Page 9: Technology Integration and Improved Technology Maturity Assessments


Outline

• Background

• Existing Methodology (TRL Scale)

• Methods to Augment the TRL Scale

– Manufacturing

– Integration & “Ilities”

• Summary

Presenter
Presentation Notes
Self explanatory slide.
Page 10: Technology Integration and Improved Technology Maturity Assessments


Manufacturing Readiness Levels (MRLs)

Tri-service working group (led by AFRL) has developed MRLs

MRL scale
  Early steps: planning for future production (e.g., supplier base)
  Later steps: full process control with lean manufacturing

Manufacturing Readiness Assessments (MRAs) fill the vital role of predicting whether the product can be produced in the desired timeframe, at the desired rate, and with the desired quality
  Identifies risks for a program office to work on

Policy currently in development

MRL 1: Mfg feasibility assessed
MRL 2: Mfg concepts defined
MRL 3: Mfg concepts developed
MRL 4: Capability to produce the technology in a laboratory environment
MRL 5: Capability to produce prototype components in a production relevant environment
MRL 6: Capability to produce a prototype system or subsystem in a production relevant environment
MRL 7: Capability to produce systems, subsystems or components in a production representative environment
MRL 8: Pilot line capability demonstrated. Ready to begin low rate production
MRL 9: Low rate production demonstrated. Capability in place to begin full rate production
MRL 10: Full rate production demonstrated and lean production practices in place

(Milestones A, B, and C are marked along the MRL scale in the original chart.)

Presenter
Presentation Notes
Why necessary? Eglin steel example. MRL scale is defined via a Joint Working Group. AF position is to mandate usage only in support of Milestone C. MRLs codified in MRA deskbook*
* See https://acc.dau.mil/CommunityBrowser.aspx?id=18231
Also, GAO report 02-701, “BEST PRACTICES: Capturing Design and Manufacturing Knowledge Early Improves Acquisition Outcomes,” July 2002.
Given a TRA and an MRA, what issues remain?
Page 11: Technology Integration and Improved Technology Maturity Assessments


Outline

• Background

• Existing Methodology (TRL Scale)

• Methods to Augment the TRL Scale

– Manufacturing

– Integration & “Ilities”

• Summary

Page 12: Technology Integration and Improved Technology Maturity Assessments


Study Focus

• Initial study team had already concluded that integration must be accounted for

• What other issues should be included?
• How should these issues be handled?

– Should additional scales be developed for each?

• Gathered team from Air Force product centers, logistics centers, test (AFOTEC), AFRL, cost analysis, NASA, Aerospace Corp, Carnegie Mellon SEI

“Our primary technical problem with the C-17 was integration. We grabbed too much off the shelf and tried to put it together.” (Panton, 1994)

Presenter
Presentation Notes
Self explanatory slide.
Page 13: Technology Integration and Improved Technology Maturity Assessments


Surveyed Globe for Good Ideas

Efforts surveyed across DoD, other agencies, international partners, universities, and the corporate world
  NASA-originated AD2 methodology
  Independent Program Assessment process
  British Ministry of Defence (MoD) has iterated 3 times on a TRL-like process
    British System Readiness Levels (SRLs) are used in conjunction with TRLs
    Also in conjunction with a full-blown risk analysis assessment

Presenter
Presentation Notes
Jim Bilbro, former NASA Marshall Chief Technologist, and originator of AD2, became part of the study team. AD2 stands for Advancement Degree of Difficulty. British displays viewed with favor throughout community, but their methodology is more complex than we prefer. Our scale will look similar, but has a different meaning. We do not have a new SRL scale. Brits used to have an IRL scale, but abandoned it because it was too much effort for too little gain.
Page 14: Technology Integration and Improved Technology Maturity Assessments


Examined Case Studies and Formed Opinions

• Conducted case histories on 5 current and historical programs at Air Force product centers and 1 at NASA

– Mix of air and space projects (no cyber-only)
– Program literature (e.g., quarterly DAES reports)
– Live interviews

• Combined case histories with team members’ knowledge to form lessons learned and identify best practices

• Final judgment: what TRL assessments lack is not a measure of where you are but a view of the issues lying ahead

– No new scales required (no Integration Readiness Level, etc.)
– Identification of risks is the key (as is done in MRAs)
– Utilize existing risk processes

• Decided to develop new methodology: Risk Identification: Integration and Ilities (RI3)

Presenter
Presentation Notes
Self explanatory slide.
Page 15: Technology Integration and Improved Technology Maturity Assessments


RI3 Use By an XR or PMO For Risk Management

RI3 used to support existing Risk Identification process

Questions in nine ‘ilities areas:
  Design Maturity and Stability
  Scalability & Complexity
  Integrability
  Testability
  Software
  Reliability
  Maintainability
  Human Factors
  People, Organization, & Skills

Questions contained in a guidebook and interim tool
Questions are based on repeated problems in the past
Helps ensure completeness of technical risks
Deconflicted from TRA, MRA, SEAM, LHA

Risk Mgmt Guide for DoD Acquisition, August 2006, V 1.0.

Similar process in D&SWS LCRM.

Presenter
Presentation Notes
The questions come in 9 areas. Areas chosen from the INCOSE systems engineering handbook as well as the expertise of the panel members. What areas have caused the most problems?
TRA = Technology readiness assessment
MRA = Manufacturing readiness assessment
SEAM = System engineering assessment model
LHA = Logistics Health Assessment
Page 16: Technology Integration and Improved Technology Maturity Assessments


Some Sample Questions:

Integrability
  Are there interactions/integration issues that could be affected by proprietary or trust issues between/among suppliers?
  Have key sub-systems, at whatever level of readiness (breadboard, brassboard, prototype), been tested together in an integrated test environment, and have they met test objectives?

Software
  Are personnel with development-level knowledge of the existing, reused software part of the new software development team?

Maintainability
  Is modeling and simulation used to simulate and validate maintenance procedures for the unit under test and higher levels of integration?

Explanatory discussion with potential best practices for each question is included in the RI3 guidebook and Excel-like worksheet/tool
Questions are technical and shy away from programmatics
Approximately 90 questions under development (~10 per ’ility)
Approximately 2 hours required to answer questions
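To make the worksheet idea concrete, here is a minimal Python sketch of how such a question set might be organized; the record fields, the grouping logic, and the sample entry are hypothetical illustrations, not the actual RI3 guidebook or tool format.

```python
from dataclasses import dataclass

@dataclass
class RI3Question:
    area: str          # one of the nine 'ilities areas, e.g. "Integrability"
    text: str          # the question put to the engineering team
    discussion: str    # explanatory discussion / potential best practices
    answer: str = ""   # assessor's answer, captured during the ~2-hour session

# Hypothetical worksheet excerpt; the real guidebook has roughly ten questions per 'ility.
worksheet = [
    RI3Question(
        area="Integrability",
        text=("Have key sub-systems, at whatever level of readiness, been tested "
              "together in an integrated test environment and met test objectives?"),
        discussion="Potential best practice: early integrated testing of breadboard/brassboard units.",
    ),
    # ... remaining Integrability, Software, Maintainability, etc. questions
]

# Group questions by 'ility area so each area can be reviewed (and later plotted) separately.
by_area: dict[str, list[RI3Question]] = {}
for q in worksheet:
    by_area.setdefault(q.area, []).append(q)
```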

Presenter
Presentation Notes
This is just a sample set. The full question set became available to the Air Force for review in January 09.
Page 17: Technology Integration and Improved Technology Maturity Assessments


What to Evaluate with the RI3 Methodology

To assess integration and ‘ilities, evaluate Critical Technology Elements (CTEs) plus units that interface with CTEs, even if they are not CTEs themselves

Rules of thumb (a selection sketch follows the figure below):
  If a unit is important enough to have an engineer or a billet assigned to it, then it is important enough to assess risks for it
  In early phases of a program, it may only be possible to assess RI3 at a top or system level
  For practical reasons, it is typically easier to ask the RI3 questions for lower-level units before doing higher levels of integration
  If starting at the top level, run RI3 separately from the unit engineers’ own evaluations; otherwise unit engineers tend to merely parrot back risks apparent at the top level

[Figure: Notional product hierarchy for Project XYZ: Systems A, B, and C; Subsystems a, b, and c; Components α, β, and γ; with each unit marked as either CTE or non-CTE.]
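As noted in the rules of thumb above, the basic selection rule is to evaluate the CTEs plus every unit that interfaces with a CTE. A minimal Python sketch of that rule follows; the unit names and interface list are hypothetical and only loosely mirror the notional Project XYZ hierarchy in the figure.

```python
# Hypothetical CTE designations and interface pairs for a few units of Project XYZ.
is_cte = {
    "Component alpha": True,
    "Component beta": False,
    "Component gamma": False,
    "Subsystem a": False,
}
interfaces = [               # undirected interface pairs (hypothetical)
    ("Component alpha", "Component beta"),
    ("Component beta", "Component gamma"),
    ("Component alpha", "Subsystem a"),
]

def units_to_evaluate(is_cte, interfaces):
    """Select CTEs plus any unit sharing an interface with a CTE, even if not a CTE itself."""
    selected = {unit for unit, cte in is_cte.items() if cte}
    for a, b in interfaces:
        if is_cte.get(a):
            selected.add(b)
        if is_cte.get(b):
            selected.add(a)
    return selected

print(sorted(units_to_evaluate(is_cte, interfaces)))
# ['Component alpha', 'Component beta', 'Subsystem a']; gamma has no CTE interface, so it is excluded
```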

Presenter
Presentation Notes
In the case when not all components are CTEs, one still needs to evaluate the non-CTEs that interact with the CTEs for integration issues.
Page 18: Technology Integration and Improved Technology Maturity Assessments


Assess Likelihood and Consequence for Each Risk

Utilize “standard” DoD/AF definitions for “Likelihood” and “Consequence”

  L ∈ [1,5], C ∈ [1,5]
  2-dimensional plot has defined red/yellow/green (R, Y, G) regions

For each question, can plot the results of the risks that are spawned
  Each ‘ility area has a different spread on its own scatter plot
  Produces 9 scatter plots for a UUE

Utility
  Within a thread, concentrates the program manager on the area (question) that needs work
  L, C outputs should be used as inputs to a risk assessment process

[Figure: Example results: Integrability scatter plot for a UUE, with Likelihood (1-5) vs. Consequence (1-5) and risks 1a, 1b, 1c, 1e, 1f, and 1g plotted on the risk matrix.]
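As one illustration of how spawned risks end up on such a matrix, here is a minimal Python sketch; the red/yellow/green thresholds and the (L, C) values are placeholders, not the standard DoD/AF matrix definitions or real program data.

```python
def rating(likelihood: int, consequence: int) -> str:
    """Map an (L, C) pair, each in 1..5, to an illustrative red/yellow/green rating.
    These thresholds are placeholders, not the official DoD/AF risk matrix."""
    score = likelihood + consequence
    if score >= 8:
        return "RED"
    if score >= 5:
        return "YELLOW"
    return "GREEN"

# One spawned risk per Integrability question 1a..1g for a unit under evaluation (UUE),
# echoing the example scatter plot above (the values themselves are made up).
integrability_risks = {"1a": (2, 4), "1b": (3, 3), "1c": (4, 5),
                       "1e": (1, 2), "1f": (2, 5), "1g": (1, 1)}

for risk_id, (L, C) in sorted(integrability_risks.items()):
    print(f"{risk_id}: L={L}, C={C} -> {rating(L, C)}")
```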

Presenter
Presentation Notes
UUE = unit under evaluation. This plot shows how each ‘ility area results in a scatter plot on a risk matrix. The number of risks may be greater or fewer than the number of questions. For purposes of example, we show one risk spawning from each of questions 1a through 1g.
Page 19: Technology Integration and Improved Technology Maturity Assessments


Why Summarize Each ‘ility Area?

Manager of the Unit Under Evaluation (UUE) is left with 9 separate risk scatter plots

Summarization of the details would improve
  Understanding of overall status
  Reporting upwards

[Figure: Separate risk scatter plots for each ‘ility area: Integrability (risks 1a-1g), Testability (risks 2a-2f), Reliability (risks 3a-3f), etc.]

Presenter
Presentation Notes
Self Explanatory Slide.
Page 20: Technology Integration and Improved Technology Maturity Assessments


Summary Display for Unit Under Evaluation

Summary display for decision makers
  Uses unique 2D -> 1D mapping of (L, C) to ratings
  For each ‘ility, display the worst-case rating of any risk

Highlights most pressing issues
Complements underlying risk-methodology data
Invites reader to investigate further

[Figure: Example summary display: one RI3 rating bar (0-5, ranging from least pressing to most pressing) for each ‘ility area: Design Maturity & Stability; Scalability & Complexity; Integrability; Testability; Software Development; Reliability; Maintainability; Human Factors; People, Organization, & Skills.]
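Building on the per-‘ility scatter plots, here is a minimal Python sketch of the worst-case roll-up behind such a summary display; the specific 2D-to-1D mapping and the sample data are illustrative only, since the briefing describes the actual mapping simply as a unique mapping of (L, C) to ratings.

```python
def to_summary_rating(likelihood: int, consequence: int) -> int:
    """Illustrative 2D -> 1D mapping of (L, C) onto a 0-5 rating scale.
    The actual RI3 mapping is not spelled out in the briefing; this is a placeholder."""
    return max(likelihood, consequence)

def summarize(uue_risks: dict[str, list[tuple[int, int]]]) -> dict[str, int]:
    """For each 'ility area, report the worst-case rating across all spawned risks."""
    return {area: max((to_summary_rating(L, C) for L, C in risks), default=0)
            for area, risks in uue_risks.items()}

# Hypothetical (L, C) pairs per 'ility area for one unit under evaluation.
uue_risks = {
    "Integrability": [(2, 4), (4, 5), (1, 2)],
    "Testability":   [(3, 3), (2, 2)],
    "Reliability":   [(1, 1)],
}
print(summarize(uue_risks))  # {'Integrability': 5, 'Testability': 3, 'Reliability': 1}
```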

Presenter
Presentation Notes
Self Explanatory Slide.
Page 21: Technology Integration and Improved Technology Maturity Assessments


Usage of RI3 to Feed AF Risk Management Processes

[Figure: Flow showing how RI3 feeds AF risk management: the RI3 guidebook and tool pose questions on integration and the ‘ilities, which produce risks with likelihood/consequence ratings and summary displays; these feed Step 2 (Risk Identification) of the risk management process, an Active Risk Manager (ARM) compatible file, and the PoPS tool, with additional summary displays available. Similar output for cost estimation is being investigated.]
Presenter
Presentation Notes
Plan to build a tool to provide automated output to risk programs, such as ARM. PoPS is “Probability of Program Success,” a standard tool in the AF.
Page 22: Technology Integration and Improved Technology Maturity Assessments


Description of RI3 Historical “Test”

[Figure: Flow of the RI3 historical “test”: full historical documentation of an actual program goes to an “omniscient” team, while only the documentation up to PDR is extracted for a “stayback” team, which uses the RI3 guidebook and tool to predict risks; the full team then interviews the program office and compares the predictions to the truth results, yielding test metrics and RI3 revisions.]

Presenter
Presentation Notes
Late in 2008, we ran a “test” of the RI3 methodology using actual historical program data. The team was divided into two groups. One group knew the whole history of the program, and the other group (the Stayback team) knew only the data up to and including PDR. At PDR there were only 6 identified risks to the program at the system level. After the Stayback team ran the RI3 methodology on the data they had, they met with the rest of the team and also with the former program office personnel to compare the predictions to what eventually transpired in the effort.
Page 23: Technology Integration and Improved Technology Maturity Assessments


Results: RI3 Historical “Test” Completed Nov 21, 2008

                            Number    Correctly Predicted    Could be Predicted by    Escaped Prediction
                                      by Team                Program Office           (Type 1 Error)
Realized Risks / Issues       22             13                       6                       3

RI3 v1.0 Tool could have predicted 86% of Issues

Modified tool to be more perceptive

Correctly Predicted by Team
  Team members predicted a risk, which in fact became an issue

Could be Predicted by Program Office
  If the program personnel had had the RI3 tool available, the issue that arose would likely have been predicted as a risk by RI3
  The team did not predict the risk in the exercise due to lack of information

Escaped Prediction
  Questions did not yet capture an issue that arose
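Reading the table: of the 22 realized risks/issues, 13 were correctly predicted by the team and 6 more could have been predicted had the program office used the tool, so 13 + 6 = 19 of 22, or roughly 86%, which is the figure quoted above; the 3 issues that escaped prediction drove the subsequent tool modifications.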

Presenter
Presentation Notes
Although at PDR the program had identified 6 risks, 22 issues eventually arose that, together, resulted in killing the program. The RI3 methodology performed extremely well. Based on PDR data, it correctly identified 86% of the issues that arose. 3 issues escaped prediction and the tool was modified to accommodate them.
Page 24: Technology Integration and Improved Technology Maturity Assessments


RI3 Deliverables

• RI3 Guidebook issued
– Methodology description
– 101 questions in 9 “ilities”

• RI3 Interim Tool
– Currently in a spreadsheet
– AFRL to make a “web” version

• RI3 training
– In development at AFIT

• Status
– RI3 Guidebook to be included as an appendix to a new AF acquisition book
– AF Systems Engineering policy to be changed to reference it
– RI3 under test right now in 3 of 4 AF product centers plus the top 50% of the AFRL portfolio
  Early feedback from one product center indicates that RI3 has identified 10% more risk items than previously identified

Available for public release as of April 2009

Presenter
Presentation Notes
Mostly self explanatory slide. If all continues to go well, then expect RI3 to be pushed to the contractor community.
Page 25: Technology Integration and Improved Technology Maturity Assessments


Summary

• TRL tells you where you are, but is not an indicator of future success

– Data shows that programs reaching MS B with TRL 5 or 6 fare no better (7 does fare better)

• MRLs add analysis to determine manufacturability and identify related risks

• RI3 provides a complementary methodology
– To avoid common pitfalls in integration and the “ilities”
– To report upwards
– That is in test right now in the Air Force and should be publicly available soon

Presenter
Presentation Notes
Self explanatory slide.
Page 26: Technology Integration and Improved Technology Maturity Assessments


Author Contact Information

• Kyle Yang
MIT Lincoln Laboratory
2401 E El Segundo Blvd
El Segundo, CA 90245
Tel: 310-536-0798 x
[email protected]

• James W. Bilbro
JB Consulting International
4017 Panorama Drive SE
Huntsville, AL 35801
Tel: 256-655-6273, Fax:
[email protected]

Presenter
Presentation Notes
Self explanatory slide