NASA Prognostics[1]


Transcript of NASA Prognostics[1]

Page 1: NASA Prognostics[1]


Prognostics

Prognostics Center of Excellence

NASA Ames Research Center, Moffett Field CA 94035

Kai Goebel

April 27, 2010

http://prognostics.nasa.gov

TEAM MEMBERS

Kai Goebel, Ph.D. (Center Lead)

Vadim Smelianskyi, Ph.D. (Center Co-Director)

Scott Poll (ADAPT)

Edward Balaban

Jose Celaya, Ph.D.

Matt Daigle, Ph.D.

Bhaskar Saha, Ph.D.

Sankalita Saha, Ph.D.

Abhinav Saxena, Ph.D.

Phil Wysocki

Page 2: NASA Prognostics[1]


Prognostics CoE

“The Prognostics Center of Excellence (PCoE) at Ames Research Center provides an umbrella for prognostic technology development, specifically addressing technology gaps within the application areas of aeronautics and space exploration.”

• En route to becoming a national asset

• Expertise in prediction technology and uncertainty management for systems health monitoring

Page 3: NASA Prognostics[1]


Prognostics

[Figure: measured battery voltage E and its particle-filter estimate (V) vs. time (s), with prediction points, the end-of-discharge threshold E_EOD, the predicted t_EOD, and the EOD pdfs marked.]

Definition: Predict the damage progression of a fault, based on current and future operational and environmental conditions, to estimate the time at which a component no longer fulfils its intended function within desired bounds (“Remaining Useful Life”).
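For a prediction made at time t_P, this amounts to the one-line statement (notation assumed here, consistent with the later slides):

    \mathrm{RUL}(t_P) = t_{\mathrm{EOL}} - t_P

where t_EOL is the first time at which the component no longer meets its performance requirements (end of discharge, t_EOD, in the battery example above).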

Page 4: NASA Prognostics[1]


Key Ingredients for Prognostics

• Run-to-failure data

– Measurement data

– Ground truth data

– Operational conditions

• Load profiles

• Environmental conditions

– Failure threshold

• Physics of Failure models

– For each fault in the fault catalogue

• Uncertainty information

– Sources of uncertainty

– Uncertainty characterization

Page 5: NASA Prognostics[1]


Prognostics Algorithms

• Data-Driven Algorithms (a brief fitting sketch follows this list)

– Gaussian Process Regression

– Relevance Vector Machine

– Neural Networks

– Polynomial Regression

• Model Based Algorithms

• Hybrid Algorithms

– Particle Filters

• Classical PF

• Rao-Blackwellized PF

• Risk Sensitive PF

– Kalman Filters

• Classical KF

• Extended KF
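As one concrete illustration of the data-driven route, a regression model can be fit to degradation data and extrapolated to a failure threshold. The sketch below uses scikit-learn Gaussian Process Regression on synthetic capacity-fade data; it is not PCoE code, and the kernel, threshold, and data are assumptions made for illustration only.

    # Minimal data-driven prognostics sketch: fit a Gaussian Process to
    # battery capacity vs. cycle number and extrapolate to a failure threshold.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import DotProduct, WhiteKernel

    rng = np.random.default_rng(0)
    cycles = np.arange(60, dtype=float)[:, None]                       # observed cycles
    capacity = 1.0 - 0.004 * cycles.ravel() + 0.005 * rng.standard_normal(60)  # synthetic fade

    # Linear (DotProduct) kernel so the trend extrapolates; WhiteKernel models sensor noise
    kernel = DotProduct() + WhiteKernel(noise_level=1e-4)
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(cycles, capacity)

    future = np.arange(60, 200, dtype=float)[:, None]
    mean, std = gpr.predict(future, return_std=True)

    threshold = 0.7                                  # e.g., 30% capacity fade
    crossed = mean < threshold
    if crossed.any():
        i = np.argmax(crossed)                       # first predicted threshold crossing
        print(f"Predicted EOL cycle: {future[i, 0]:.0f} (capacity std {std[i]:.3f})")

The predictive standard deviation returned by the GP is one simple handle on the uncertainty that the later slides manage more formally.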

Page 6: NASA Prognostics[1]


Metrics Example

• Establishing a common ground to compare different prognostic efforts

• Metrics must not only measure accuracy and precision but also the convergence of both properties

• The data/time requirement of algorithms (prognostic horizon) before they produce consistent predictions is also important

[Figure: RUL-vs-time metric plot (parameters 0.2 and 0.5), in weeks, comparing the actual RUL with RVM-PF, GPR, and baseline-statistics RUL predictions against an 80% accuracy zone up to End of Life (EOL).]

[Figure: particle-filter prediction of C/1 capacity (mAh) vs. time (weeks), showing real data, the RUL threshold, and RUL pdfs (biased by 0.7); the legend lists prediction spreads of 4.23, 3.25, and 5.34 weeks and prediction horizons of 32, 16, and 32 weeks.]

Page 7: NASA Prognostics[1]


Uncertainty Management

Sources of uncertainty:

• Model (system complexity, insufficient knowledge)

• Usage (load profile, temperature)

• Noise (internal, external; electrical, mechanical, thermal)

• Sensors

Uncertainty management approaches:

• Training-data-based extrapolation

• Probabilistic state space model

• Online model adaptation

• Noise modeling

• Probabilistic regression

• Hyperparameters to prevent overfitting

Page 8: NASA Prognostics[1]


Ames Research Center

Application Example: Valves

Modeling, Algorithms, Metrics

Page 9: NASA Prognostics[1]


Valve Prognostics

• Apply model-based prognostics to pneumatic valves

• Develop a high-fidelity simulation model

– Progressive damage models include seal wear (internal and external leaks), spring degradation, and increase in friction

• Investigate performance under different circumstances, using prognostic performance metrics for comparison

– Impact of using different filters

– Effects of increased sensor noise

– Effects of increased process noise and model uncertainty

– Feasibility of different sensor sets (e.g., continuous position sensor vs. discrete open/closed indicators)

Computational Architecture

Source: M. Daigle and K. Goebel, “Model-based Prognostics with Fixed-lag Particle Filters,” accepted for publication at PHM09

Page 10: NASA Prognostics[1]


Problem Formulation

• Prognostics goal

– Compute EOL = the time point at which the component no longer meets specified performance criteria

– Compute RUL = the time remaining until EOL

– Compute p(EOLk | y0:k) and/or p(RULk | y0:k)

• System model, with state, parameters, input, process noise, output, and sensor noise

• Define a condition that determines whether EOL has been reached

• EOL and RUL defined as below
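The equations on this slide did not survive extraction. A sketch consistent with the term labels above and with the general model-based prognostics formulation published by this group (the symbols and functional forms are assumptions, not copied from the slide):

    \dot{\mathbf{x}}(t) = \mathbf{f}\big(t, \mathbf{x}(t), \boldsymbol{\theta}(t), \mathbf{u}(t), \mathbf{v}(t)\big), \qquad
    \mathbf{y}(t) = \mathbf{h}\big(t, \mathbf{x}(t), \boldsymbol{\theta}(t), \mathbf{u}(t), \mathbf{n}(t)\big)

    C_{\mathrm{EOL}}\big(\mathbf{x}(t), \boldsymbol{\theta}(t)\big) \in \{\text{true}, \text{false}\}

    \mathrm{EOL}_k \triangleq \min\{\, t \ge k : C_{\mathrm{EOL}}(\mathbf{x}(t), \boldsymbol{\theta}(t)) = \text{true} \,\}, \qquad
    \mathrm{RUL}_k \triangleq \mathrm{EOL}_k - k

where x is the state, θ the parameters, u the input, v the process noise, y the output, and n the sensor noise.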

Page 11: NASA Prognostics[1]


Prognostics Architecture

1. System: receives inputs uk and produces outputs yk

2. Fault Detection, Isolation & Identification: identify the active damage mechanisms (fault set F) from yk

3. Damage Estimation: estimate the current state and parameter values, p(xk, θk | y0:k)

4. Prediction: predict EOL and RUL as probability distributions, p(EOLk | y0:k) and p(RULk | y0:k)

Page 12: NASA Prognostics[1]


[Figure: pneumatic valve schematic showing the return spring, piston, plug, top pneumatic port, bottom pneumatic port, and fluid flow path.]

Case Study

• Apply the framework to a pneumatic valve

– Complex mechanical devices used in many domains, including aerospace

– Failures of critical valves can have significant effects on system function

• Pneumatic valve operation

– Valve opened by opening the bottom port to supply pressure and the top port to atmosphere

– Valve closed by opening the bottom port to atmosphere and the top port to supply pressure

– Return spring ensures the valve will close upon loss of supply pressure

Page 13: NASA Prognostics[1]


Case Study

• Faults

– External leaks at the ports and internal leaks across the piston

– Friction buildup due to lubrication breakdown, sliding wear, and buildup of particulate matter

– Spring degradation

• Defining EOL

– Limits defined for the open and close times of the valve (e.g., the main fill valve opens in 20 seconds against a 26-second requirement and closes in 15 seconds against a 20-second requirement)

– Limits placed on valve leakage rates (pneumatic gas)

– Valve must be able to fully close upon fail-safe

– The valve is at EOL when any of the above conditions is violated (this defines C_EOL), which is a function of the amount of damage, parameterized in the model

[Figure: pneumatic valve schematic, repeated from the previous slide.]

Page 14: NASA Prognostics[1]


Physics-based Modeling

• Valve state defined by: valve position, valve velocity, gas mass above the piston, gas mass below the piston

• State derivatives given by: velocity, acceleration, gas flow above the piston, gas flow below the piston

• Inputs given by: fluid pressure (left), fluid pressure (right), input pressure at the top port, input pressure at the bottom port
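Collected into vectors, this reads as follows (a notation sketch only; the symbols are assumptions, not taken from the slide):

    \mathbf{x} = [\, x,\ \dot{x},\ m_t,\ m_b \,]^T, \qquad
    \dot{\mathbf{x}} = [\, \dot{x},\ \ddot{x},\ \dot{m}_t,\ \dot{m}_b \,]^T, \qquad
    \mathbf{u} = [\, p_L,\ p_R,\ u_t,\ u_b \,]^T

where x is the valve position, m_t and m_b are the gas masses above and below the piston, p_L and p_R are the fluid pressures on either side of the plug, and u_t, u_b are the pneumatic input pressures at the top and bottom ports.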

Page 15: NASA Prognostics[1]


Physics-based Modeling: Forces

• Piston movement governed by the sum of forces, including

– Pneumatic gas

– Process fluid

– Weight

– Spring

– Friction

– Contact forces

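The individual force expressions were images on the slide; a force-balance sketch consistent with the listed terms (the symbols and sign conventions are assumptions):

    m\,\ddot{x} \;=\; \underbrace{(p_b - p_t)\,A_p}_{\text{pneumatic gas}}
    \;+\; \underbrace{(p_L - p_R)\,A_v}_{\text{process fluid}}
    \;-\; \underbrace{m g}_{\text{weight}}
    \;-\; \underbrace{k\,(x + x_0)}_{\text{spring}}
    \;-\; \underbrace{r\,\dot{x}}_{\text{friction}}
    \;+\; \underbrace{F_c(x)}_{\text{contact}}

where A_p is the piston area, A_v the area exposed to the process fluid, k the spring constant with preload offset x_0, r the friction coefficient, and F_c(x) the contact force at the ends of the valve stroke.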

Page 16: NASA Prognostics[1]


Physics-based Modeling: Flows

• Gas flows determined by choked/non-choked orifice flow equations:

• Fluid flow determined by orifice flow equation:

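The equations themselves were lost in extraction; the standard orifice-flow forms of the kind referenced here are (notation mine, not from the slide):

Non-choked gas flow, when p_d / p_u > (2/(γ+1))^{γ/(γ−1)}:

    \dot{m} = C_d A\, p_u \sqrt{ \frac{2\gamma}{R T (\gamma - 1)}
        \left[ \left(\frac{p_d}{p_u}\right)^{2/\gamma} - \left(\frac{p_d}{p_u}\right)^{(\gamma+1)/\gamma} \right] }

Choked gas flow, when p_d / p_u ≤ (2/(γ+1))^{γ/(γ−1)}:

    \dot{m} = C_d A\, p_u \sqrt{ \frac{\gamma}{R T} }
        \left( \frac{2}{\gamma+1} \right)^{(\gamma+1)/(2(\gamma-1))}

Incompressible fluid flow:

    \dot{m} = C_d A \sqrt{ 2 \rho\, \Delta p }

where C_d is a discharge coefficient, A the orifice area, p_u and p_d the upstream and downstream pressures, γ the specific-heat ratio, R the specific gas constant, T the gas temperature, ρ the fluid density, and Δp the pressure drop.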

Page 17: NASA Prognostics[1]


Pneumatic Valve Modeling

• Possible sensors include: valve position, gas pressure (top), gas pressure (bottom), fluid flow, open indicator, and closed indicator

Page 18: NASA Prognostics[1]


Modeling Damage

Increase in friction

• Based on sliding wear equation

• Describes how friction coefficient changes as function of friction force, piston velocity, and wear coefficient

Degradation of spring

• Assume form similar to sliding wear equation

• Describes how spring constant changes as function of spring force, piston velocity, and wear coefficient

Growth of internal leak

• Based on sliding wear equation

• Describes how leak size changes as function of friction force, piston velocity, and wear coefficient

Growth of external leak

• Based on environmental factors such as corrosion

• Assume a linear change in absence of known model

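A sketch of damage-progression equations consistent with the descriptions above (sliding-wear form; the symbols are assumptions, not taken from the slide):

    \dot{r} = w_r\,|F_f\,v| \ \ (\text{friction}), \qquad
    \dot{k} = -w_k\,|F_s\,v| \ \ (\text{spring}), \qquad
    \dot{A}_i = w_i\,|F_f\,v| \ \ (\text{internal leak}), \qquad
    \dot{A}_e = w_e \ \ (\text{external leak, assumed linear})

where r is the friction coefficient, k the spring constant, A_i and A_e the internal and external leak areas, F_f the friction force, F_s the spring force, v the piston velocity, and the w terms are the wear coefficients.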

Page 19: NASA Prognostics[1]


Damage Progression

[Figure: simulated damage progression over 100 cycles for the four damage modes: friction coefficient (Ns/m), spring constant (N/m), internal leak area (m²), and top external leak area (m²), with zoomed insets around cycles 40-41.]

Page 20: NASA Prognostics[1]


Damage Estimation

• Wear parameters are unknown and must be estimated along with the system state

• Augmented state: position, velocity, gas mass above the piston, gas mass below the piston, friction coefficient, spring rate, internal leak area, external leak area (top), external leak area (bottom)

• Unknown wear parameters: friction wear, spring wear, internal leak wear, external leak wear (top), external leak wear (bottom)

• Augment the system state with the unknown parameters and use a state observer

Page 21: NASA Prognostics[1]


Particle Filters

• Employ particle filters for joint state-parameter estimation

– Represent probability distributions using a set of weighted samples

– Help manage uncertainty (e.g., sensor noise, process noise, etc.)

– Similar approaches have been applied successfully to actuators, batteries, and other prognostics applications

[Figure: weighted particle set (weight w vs. state x) representing the state as a discrete probability distribution that evolves in time t.]

Page 22: NASA Prognostics[1]


Damage Estimation with PF

• Particle filters (PFs) are state observers that can be applied to general nonlinear processes with non-Gaussian noise

– Approximate the state distribution by a set of discrete weighted samples

– Suboptimal, but approaches optimality as N → ∞

• Parameter evolution is described by a random walk

– Selection of the variance of the random-walk noise is important

– The variance must be large enough to ensure convergence, but small enough to ensure precise tracking

• The PF approximates the posterior as shown below

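The expressions themselves were images on the slide; the standard forms being referenced are (notation mine):

    \boldsymbol{\theta}_k = \boldsymbol{\theta}_{k-1} + \boldsymbol{\xi}_{k-1}, \qquad
    \boldsymbol{\xi}_{k-1} \sim \mathcal{N}(0, \Sigma_\xi)

    p(\mathbf{x}_k, \boldsymbol{\theta}_k \mid \mathbf{y}_{0:k}) \;\approx\;
    \sum_{i=1}^{N} w_k^i \,\delta\!\big((\mathbf{x}_k, \boldsymbol{\theta}_k) - (\mathbf{x}_k^i, \boldsymbol{\theta}_k^i)\big)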

Page 23: NASA Prognostics[1]


Sampling Importance Resampling

• Begin with an initial particle population

• Predict the evolution of the particles one step ahead

• Compute particle weights based on the likelihood of the given observations

• Resample to avoid degeneracy issues

– Degeneracy is when a small number of particles have high weight and the rest have very low weight

– Avoid wasting computation on particles that do not contribute to the approximation

(A minimal sketch of these steps follows the figure below.)

[Figure: four panels of particle weight w vs. state x illustrating the SIR steps: initial particle population, predict evolution, compute weights, and resample.]
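To make the steps concrete, here is a minimal, generic SIR sketch in Python. It is not the PCoE implementation; the 1-D process and measurement models, noise levels, and resampling threshold are made-up placeholders.

    # Minimal sampling-importance-resampling (SIR) particle filter sketch.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 1000                                   # number of particles

    def f(x):                                  # assumed process model (constant drift)
        return x + 0.1

    def h(x):                                  # assumed measurement model
        return x

    def sir_step(particles, weights, y, q_std=0.05, r_std=0.2):
        # 1. Predict: propagate each particle through the process model + process noise
        particles = f(particles) + q_std * rng.standard_normal(N)
        # 2. Update: weight particles by the likelihood of the observation y
        weights = weights * np.exp(-0.5 * ((y - h(particles)) / r_std) ** 2)
        weights /= weights.sum()
        # 3. Resample when the effective sample size gets low (degeneracy)
        if 1.0 / np.sum(weights ** 2) < N / 2:
            idx = rng.choice(N, size=N, p=weights)
            particles, weights = particles[idx], np.full(N, 1.0 / N)
        return particles, weights

    # Usage: track a ramp signal observed in noise
    particles = rng.normal(0.0, 1.0, N)        # initial particle population
    weights = np.full(N, 1.0 / N)
    for k in range(50):
        y_k = 0.1 * (k + 1) + 0.2 * rng.standard_normal()
        particles, weights = sir_step(particles, weights, y_k)
    print("state estimate:", np.sum(weights * particles))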

Page 24: NASA Prognostics[1]


Prediction

• The particle filter computes p(xk, θk | y0:k)

• The prediction n steps ahead, and similarly the EOL prediction, are approximated from the weighted particle set

• General idea (see the sketch below)

– Propagate each particle forward until EOL is reached (the EOL condition evaluates to true)

– Use the particle weights as the EOL weights

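A minimal sketch of this idea in Python (generic placeholders for the damage model, EOL condition, and hypothesized future inputs; not the PCoE code):

    # Sketch of the EOL prediction step: each particle is simulated forward under
    # hypothesized future inputs until the EOL condition evaluates to true, and
    # the particle weights carry over to the EOL/RUL distribution.
    import numpy as np

    def predict_eol(particles, weights, step_damage, eol_condition, u_future, k_now, max_steps=10000):
        """Return weighted EOL samples approximating p(EOL_k | y_0:k)."""
        eol_samples = np.empty(len(particles))
        for i, x in enumerate(particles):
            k = k_now
            while not eol_condition(x) and k - k_now < max_steps:
                x = step_damage(x, u_future(k))   # propagate the damage state one step
                k += 1
            eol_samples[i] = k                    # first time the EOL condition holds
        return eol_samples, weights               # RUL samples are eol_samples - k_now

    # Usage with a toy linear damage model and a fixed failure threshold
    rng = np.random.default_rng(1)
    particles = rng.normal(0.5, 0.05, 500)        # current damage estimates
    weights = np.full(500, 1.0 / 500)
    eol, w = predict_eol(
        particles, weights,
        step_damage=lambda x, u: x + 0.01 * u,    # assumed damage growth per cycle
        eol_condition=lambda x: x >= 1.0,         # C_EOL: damage exceeds threshold
        u_future=lambda k: 1.0,                   # hypothesized future usage
        k_now=200,
    )
    print("mean EOL:", np.sum(w * eol), "mean RUL:", np.sum(w * (eol - 200)))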

Page 25: NASA Prognostics[1]


Prediction

[Figure: friction damage progression under hypothesized future inputs and the resulting EOL prediction.]

Page 26: NASA Prognostics[1]


Validation of Methodology

EOL predictions all contain the true EOL, and they get more accurate and precise as EOL is approached.

The estimate of the wear parameter converges after a few cycles; after that, the leak area can be tracked well.

[Figure: internal leak EOL predictions.]

Page 27: NASA Prognostics[1]


α-λ Performance

• The plot summarizes the performance of the internal leak prognosis

• Over 50% of the probability mass is concentrated within the bounds at all prediction points except at 20 and 30 cycles

– The mean RULs are within the bounds at these points

• For α = 0.122, the metric is satisfied at all points (a check-function sketch follows below)

[Figure: α-λ plot for the internal leak prognosis, with α = 0.1, β = 0.5.]
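As a concrete reading of this metric (after the Saxena et al. reference below), the α-λ check at a single prediction point can be written as the sketch below. This is a generic illustration, not PCoE code; the toy RUL distribution is made up.

    # Alpha-lambda metric check: at a given prediction point, at least beta of the
    # predicted RUL probability mass must fall within +/- alpha of the true RUL.
    import numpy as np

    def alpha_lambda_pass(rul_samples, weights, true_rul, alpha=0.1, beta=0.5):
        """True if >= beta of the weighted RUL mass lies in [(1-a)RUL*, (1+a)RUL*]."""
        lo, hi = (1 - alpha) * true_rul, (1 + alpha) * true_rul
        mass = np.sum(weights[(rul_samples >= lo) & (rul_samples <= hi)])
        return mass >= beta

    # Usage with a toy RUL distribution at one prediction point
    rng = np.random.default_rng(2)
    rul = rng.normal(40.0, 3.0, 1000)          # predicted RUL samples (cycles)
    w = np.full(1000, 1.0 / 1000)
    print(alpha_lambda_pass(rul, w, true_rul=42.0, alpha=0.1, beta=0.5))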

Page 28: NASA Prognostics[1]


Prediction Performance


[Figure: α-λ metric for spring damage prediction, with α = 0.1, β = 0.4, M = {x, f, pt, pb}.]

[Figure: α-λ metric for spring damage prediction, with α = 0.1, β = 0.4, M = {open, closed}.]

Both cases have similar accuracy, but the case with continuous measurements has much better precision, as the metric evaluates to true for all but one λ point.

Page 29: NASA Prognostics[1]


Discussion

• Different sensor sets have comparable estimation and prediction accuracy, but some differences are observed

• Wide differences observed in precision of estimation and prediction

• Results reveal that some sensors are more useful for certain faults than others

– Flow measurement can be dropped with little effect

– For friction and spring faults, sensor sets with position measurement perform best

– For leak faults, sensor sets with pressure measurements perform best

– Helps decide importance of sensors based on which faults are most important

• Overall performance still reasonable with higher levels of noise

– Sensor sets with continuous measurements impacted most


Page 30: NASA Prognostics[1]


Ames Research Center

Application Example: Electronics

Page 31: NASA Prognostics[1]


Prognostics for Electronics

Component: Power Transistor

Line Replaceable Unit: Power Controller

Page 32: NASA Prognostics[1]


Motivation

• Electronic components have an increasingly critical role in on-board, autonomous functions for

– Vehicle controls, communications, navigation, radar systems

• Future aircraft systems will rely more on electric and electronic components

– More-electric aircraft

– Next Generation Air Traffic System (NGATS)

• Move toward lead-free electronics and microelectromechanical devices (MEMS)

• Assumption of new functionality increases the number of electronics faults, with perhaps unanticipated fault modes

• Needed: an understanding of the behavior of deteriorated components, in order to develop the capability to anticipate failures and predict remaining useful life (RUL)

Page 33: NASA Prognostics[1]


Current Research Efforts

• Thermal overstress aging of MOSFETs and IGBTs

• Electrical overstress aging testbed (isothermal)

• Modeling of MOSFETs

• Identification of precursors of failure for different IGBT technologies*

• Prognostics for the output capacitor in power supplies+

• Effects of lightning events on MOSFETs

• Effects of ESD events on MOSFETs and IGBTs

• Effects of radiation on MOSFETs and IGBTs

* In collaboration with the University of Maryland. + In collaboration with Vanderbilt University.

Page 34: NASA Prognostics[1]


Accelerated aging system

• A platform for aging, characterization, and scenario simulation of gate controlled power transistors.

• The platform supports:

– Thermal cycling

– Simulation of operation conditions

– Isothermal aging

• In situ state monitoring is supported at varying gate and drain voltage levels.

Page 35: NASA Prognostics[1]


Experiment on IGBT

• Collector-emitter voltage turn-OFF transient as a potential degradation indicator

[Figure: gate (Vg) and collector-emitter (Vce) voltage waveforms during turn-OFF, with a zoom on the collector-emitter transient peak after 30, 65, 110, 150, and 180 minutes of thermal overstress aging; conditions T ≈ 330 °C, Vd = 4 V, Vg = 10 V.]

[Figure: turn-OFF switching transient peak voltage (V) vs. temperature (°C), colored by degradation time from 1000 to 10000 s.]

• Turn-OFF collector-emitter voltage transient decreased significantly with both increases in temperature and thermal overstress aging time

Page 36: NASA Prognostics[1]


Electronics Aging

[Figures: off-state collector-emitter current (ICE) measurements with curve fits at several aging times; IGBT ICE and VGE waveforms; trends of the polynomial-fit coefficients (P1·t³, P2·t², P3·t, P4) over aging time (minutes); and particle-filter parameter estimation of P1 with a prediction made at 51.875 minutes.]

Page 37: NASA Prognostics[1]


Ames Research Center

Application Example: EMA

Page 38: NASA Prognostics[1]


Electro-Mechanical Actuators

Mission Statement

Electro-mechanical actuator (EMA) with 5 metric ton load capacity: physical modeling, model verification, and data collection in the laboratory… and under flight conditions.

[Figure: model verification, temperature plot for baseline run 2: motor thermocouple and modeled surface temperature Tsurf (K) over about 4000 s, with the model error (K) shown below.]

Page 39: NASA Prognostics[1]


Flyable Electro-mechanical Actuator (FLEA) Testbed

• One load actuator and two test actuators (nominal and faulty), switchable in flight

• Sensor suite includes accelerometers, current sensors, position sensors, temperature sensors, and a load cell

• Real-time flight-surface load simulation and data recording

• Data collection flights performed on a C-17 (DFRC) and planned on a UH-60 (ARC)

Page 40: NASA Prognostics[1]


Ames Research Center

Application Example: Energy Storage Devices

Demonstration

Page 41: NASA Prognostics[1]


Prognostics HIL Testbed

• Demonstrate prognostic algorithm performance

– Fast

– Inexpensive

– Control of several run-to-failure parameters

– Interesting dynamics

• Evaluate different prediction algorithms and uncertainty management schemes

Page 42: NASA Prognostics[1]


Modeling Batteries

[Figures: time-domain polarization curve (voltage E vs. current I) showing the IR drop, activation polarization, and concentration polarization relative to the open-circuit potential E0; frequency-domain impedance (Nyquist) plots of Im(Z) vs. Re(Z) parameterized by the electrolyte resistance and the charge-transfer resistance, where increasing ohmic resistance indicates electrolyte weakening and increasing charge-transfer resistance indicates plate sulphation.]

Page 43: NASA Prognostics[1]


• Objective: Predict when Li-ion battery voltage will dip below 2.7V indicating end-of-discharge (EOD)

• Approach

– Model non-linear electro-chemical phenomena that explain the discharge process

– Learn model parameters from training data

– Let the PF framework fine tune the model during the tracking phase

– Use the tuned model to predict EOD

[Figure: battery voltage vs. time during discharge, showing successive drops below the open-circuit potential E0.]

E = E0 − Esd − Erd − Emt, where sd: self-discharge, rd: reactant depletion, mt: mass transfer.

Modeling SOC

Page 44: NASA Prognostics[1]


• Objective: Predict when Li-ion battery capacity will fade by 30%, indicating end of life (EOL)

• Approach

– Model self-recharge and Coulombic efficiency that explain the aging process

– Learn model parameters from training data

– Let the PF framework fine tune the model during a few initial cycles

– Use the tuned model to predict EOL

[Figure: battery voltage vs. time over discharge and self-recharge, from measurements and from the model; after R. Huggins, Advanced Batteries: Materials Science Aspects, 1st ed., Springer, 2008.]

Modeling SOL

Page 45: NASA Prognostics[1]


Battery Testbed

• Cells are cycled through charge and discharge under different load and environmental conditions, set by the electronic load and the environmental chamber respectively

• Periodically, EIS measurements are taken to monitor the internal condition of the battery

• The DAQ system collects externally observable parameters from the sensors

• Switching circuitry enables cells to be in the charge, discharge, or EIS health-monitoring state, as dictated by the aging regime

EIS: electro-chemical impedance spectroscopy; BHM: battery health monitoring

Page 46: NASA Prognostics[1]


Prognostics in Action

Page 47: NASA Prognostics[1]


References

• A. Saxena, J. Celaya, B. Saha, S. Saha, K. Goebel, “Metrics for Offline Evaluation of Prognostic Performance”, International Journal of Prognostics and Health Management, 001, 2010

• M. Daigle and K. Goebel, “Model-based Prognostics under Limited Sensing”, Proceedings of the IEEE Aerospace Conference, 2010

• A. Saxena, K. Goebel, “Requirements Specification for Prognostics Performance - An Overview”, Proceedings of AIAA Infotech@Aerospace 2010, 2010

• B. Saha, K. Goebel, and J. Christophersen, “Comparison of Prognostic Algorithms for Estimating Remaining Useful Life of Batteries”, Transactions of the Institute of Measurement & Control, special issue on Intelligent Fault Diagnosis & Prognosis for Engineering Systems, pp. 293-308, 2009

• B. Saha, K. Goebel, S. Poll, and J. Christophersen, “Prognostics Methods for Battery Health Monitoring Using a Bayesian Framework”, IEEE Transactions on Instrumentation and Measurement, Vol. 58, No. 2, pp. 291-296, 2009

• J. Celaya, N. Patil, S. Saha, P. Wysocki, and K. Goebel, “Towards Accelerated Aging Methodologies and Health Management of Power MOSFETs”, Proceedings of the Annual Conference of the PHM Society 2009, 2009

• B. Saha, K. Goebel, “Modeling Li-ion Battery Capacity Depletion in a Particle Filtering Framework”, Proceedings of the Annual Conference of the PHM Society 2009, 2009

• K. Goebel, B. Saha, A. Saxena, J. Celaya, and J. Christophersen, “Prognostics in Battery Health Management”, IEEE Instrumentation & Measurement Magazine, Vol. 11, No. 4, pp. 33-40, 2008

• K. Goebel, B. Saha, and A. Saxena, “A Comparison of Three Data-driven Techniques for Prognostics”, Proceedings of MFPT 2008, 2008