Bayesian networks and their application in
circuit reliability estimation
Erin Taylor
What is a Bayesian Network?
• An example
We want to describe the causal relationship between the following events:
1) The season
2) Whether it is raining outside
3) The sprinkler is on
4) The sidewalk is wet
5) The sidewalk is slippery
• We can construct a graph to represent the causal link between these 5 events.
What is a Bayesian Network?
[Diagram: Season → Sprinkler, Rain; Sprinkler, Rain → Wet; Wet → Slippery]
• Each node represents a random variable, in this case the probability of a particular event.
Assumptions:
• "Sprinkler on" and "Rain" are determined by "Season"
• "Sidewalk wet" is determined by "Sprinkler on" and "Rain"
• "Sidewalk slippery" is determined by "Sidewalk wet"
Properties of Bayesian Networks
• A Bayesian network is
  – a Directed Acyclic Graph (DAG)
  – a model of probabilistic events
• In a Bayesian network
  – nodes represent random variables of interest
  – links represent causal dependencies among variables
• Bayesian networks are direct representations of the world: arrows indicate real causal connections, not the flow of information as in neural networks.
Properties of Bayesian Networks
• Links are not absolute
  – If the sprinkler is on, this does not always mean that the sidewalk is wet
  – For example, the sprinkler may be aimed away from the sidewalk
Properties of Bayesian Networks
• Given that the sidewalk is wet, we can calculate the probability that the sprinkler is on: P(sprinkler on | sidewalk wet)
• Bayesian networks allow us to calculate such values from a small set of specified probabilities through a process called reasoning, or Bayesian inference
Reasoning in Bayesian Networks
• Reasoning in Bayesian networks operates by propagating information in any direction
1) If the sprinkler is on, the sidewalk is probably wet (prediction)
2) If the sidewalk is wet, it is more likely that the sprinkler is on or it is raining (abduction)
3) If the sidewalk is wet and the sprinkler is on, the likelihood that it is raining is reduced (explaining away)
• Explaining away is a special type of reasoning that is especially difficult to model in other network models
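The explaining-away effect can be checked numerically. Below is a minimal Python sketch of a pared-down sprinkler/rain/wet model; the season node is dropped, and all probability values here are illustrative assumptions, not numbers from this presentation. It compares P(rain | wet) with P(rain | wet, sprinkler on):

```python
from itertools import product

# Assumed priors and CPT -- illustrative values only.
P_S, P_R = 0.3, 0.2                               # P(sprinkler on), P(raining)
P_W = {(True, True): 0.99, (True, False): 0.90,   # P(wet | sprinkler, rain)
       (False, True): 0.80, (False, False): 0.05}

def joint(s, r, w):
    """P(s, r, w) = P(s) P(r) P(w | s, r)."""
    p = (P_S if s else 1 - P_S) * (P_R if r else 1 - P_R)
    p_w = P_W[(s, r)]
    return p * (p_w if w else 1 - p_w)

# Seeing the walk wet raises belief in rain...
p_rain_wet = (sum(joint(s, True, True) for s in (True, False))
              / sum(joint(s, r, True) for s, r in product((True, False), repeat=2)))
# ...but also learning the sprinkler is on "explains away" the wetness.
p_rain_wet_spr = (joint(True, True, True)
                  / sum(joint(True, r, True) for r in (True, False)))
print(p_rain_wet, p_rain_wet_spr)  # the second value is lower
```

With these assumed numbers, knowing the sidewalk is wet roughly doubles the belief in rain, and then learning the sprinkler is on pulls that belief most of the way back toward the prior.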
Specifying a Bayesian Network
• A new example:
[Diagram: Family out → Light on, Dog out; Dog dirty → Dog out; Dog out → Hear bark]
1) When a family leaves their house, they often turn the front light on and let the dog out
2) If the dog is dirty, the family often puts him outside
3) If the dog is out, you can sometimes hear him bark
Specifying a Bayesian Network
• To specify the probability distribution of a Bayesian network we need
  – the prior probability of all root nodes
  – the conditional probabilities of all nonroot nodes given all possible combinations of their direct predecessors
P(fo) = 0.15    P(dd) = 0.01
P(lo | fo) = 0.6    P(lo | ~fo) = 0.05
P(do | fo dd) = 0.99    P(do | fo ~dd) = 0.90
P(do | ~fo dd) = 0.97    P(do | ~fo ~dd) = 0.3
P(hb | do) = 0.7    P(hb | ~do) = 0.01
Total specified values: 10
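These ten numbers fully determine the network. As a sanity check, they can be written down directly; a minimal Python sketch using the values from this slide:

```python
# The 10 values that fully specify the family/dog network (from the slide).
# True means the event occurs; '~' on the slide corresponds to False here.
P_fo = 0.15                                    # prior: family out
P_dd = 0.01                                    # prior: dog dirty
P_lo = {True: 0.60, False: 0.05}               # P(light on | fo)
P_do = {(True, True): 0.99, (True, False): 0.90,
        (False, True): 0.97, (False, False): 0.30}  # P(dog out | fo, dd)
P_hb = {True: 0.70, False: 0.01}               # P(hear bark | do)

# 2 priors + 2 + 4 + 2 conditional entries = 10 specified values.
n_specified = 2 + len(P_lo) + len(P_do) + len(P_hb)
print(n_specified)  # 10
```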
Bayesian Networks and Probability Theory
• In traditional probability theory, specifying the previous example would require the joint distribution of all 5 variables:
P(fo,dd,lo,do,hb)
• The joint distribution of 5 variables requires 2^5 - 1 = 31 values
Bayesian Networks and Probability Theory
• To see where 2^5 - 1 comes from, consider a pair of Boolean variables (a, b)
• To specify their joint probability distribution we need the following values: P(a b), P(~a b), P(a ~b), P(~a ~b)
• In the general case, this yields a total of 2^n values for a system of n variables
• Since the probabilities of all possible outcomes must sum to 1, we can reduce the number of values to 2^n - 1
Bayesian Networks and Joint Probabilities
• Using Bayesian networks for this example, we can reduce the number of values that need to be specified from 31 to 10.
Bayesian network: total specified values = 10
Full joint distribution P(fo,dd,lo,do,hb): total specified values = 31
How is this possible?
Simplifying Joint Distributions
• Bayesian networks reduce the complexity of joint distributions by introducing several independence assumptions
• Conditional independence:
  – If we know whether the dog is out, then the probability of hearing him bark is completely independent of all other events
  – Other events only serve to indicate the probability that the dog is out
  – P(hb | fo dd lo do) = P(hb | do)
  – Also P(dd | fo) = P(dd), since dd is a root node independent of fo
Simplifying Joint Distributions
• From probability theory (the chain rule):
  P(x1,…,xn) = P(x1) P(x2|x1) … P(xn|x1,…,xn-1)
• In our example:
  P(fo,dd,lo,do,hb) = P(fo) P(dd|fo) P(lo|fo dd) P(do|fo dd lo) P(hb|fo dd lo do)
• Simplify using the independence assumptions:
  P(dd|fo) = P(dd)
  P(lo|fo dd) = P(lo|fo)
  P(do|fo dd lo) = P(do|fo dd)
  P(hb|fo dd lo do) = P(hb|do)
Simplifying Joint Distributions
• Only the five terms on the right side are needed to specify the joint distribution of our example:

  P(fo,dd,lo,do,hb) = P(fo) P(dd) P(lo|fo) P(do|fo dd) P(hb|do)

  i.e., the prior probabilities of the root nodes and the conditional probabilities of the nonroot nodes given all combinations of their predecessors
• The number of values that must be specified in a Bayesian network grows linearly with the number of variables, whereas the full joint distribution grows exponentially
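One way to confirm that these five factors really define a probability distribution is to multiply them out over all 32 assignments and check that the total is 1. A small Python sketch using the slide's numbers:

```python
from itertools import product

def bern(p, v):
    """P(X = v) for a Boolean variable X with P(X = True) = p."""
    return p if v else 1.0 - p

def joint(fo, dd, lo, do, hb):
    """P(fo,dd,lo,do,hb) = P(fo) P(dd) P(lo|fo) P(do|fo dd) P(hb|do)."""
    p_lo = 0.60 if fo else 0.05
    p_do = {(True, True): 0.99, (True, False): 0.90,
            (False, True): 0.97, (False, False): 0.30}[(fo, dd)]
    p_hb = 0.70 if do else 0.01
    return (bern(0.15, fo) * bern(0.01, dd) * bern(p_lo, lo)
            * bern(p_do, do) * bern(p_hb, hb))

# Sum over all 2^5 = 32 joint assignments.
total = sum(joint(*v) for v in product([True, False], repeat=5))
print(total)  # 1.0 up to floating-point rounding
```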
Evaluating Probabilities Using BN’s
• The basic computation on Bayesian networks is the computation of every node's belief (conditional probability) given the observed evidence
• For example
  – Evidence: the dog is heard barking
  – Compute: the probability that the family is out
  – Compute: the probability that the light is on
Evaluating Probabilities Using BN’s
• Solving Bayesian networks involves Bayesian inference
• Exact Solution
  – Involves enumerating all possible probability combinations
  – Generally NP-hard
• Simple query: P(fo=t | hb=t)

  P(fo=t | hb=t) = P(fo=t, hb=t) / P(hb=t)
                 = [ Σ_{dd,lo,do} P(fo=t, dd, lo, do, hb=t) ] / [ Σ_{fo,dd,lo,do} P(fo, dd, lo, do, hb=t) ]
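For a network this small, enumerating every combination is cheap, so the query can be evaluated exactly. A sketch reusing the slide's factorization:

```python
from itertools import product

def joint(fo, dd, lo, do, hb):
    """P(fo,dd,lo,do,hb) = P(fo) P(dd) P(lo|fo) P(do|fo dd) P(hb|do)."""
    f = lambda p, v: p if v else 1.0 - p
    p_do = {(True, True): 0.99, (True, False): 0.90,
            (False, True): 0.97, (False, False): 0.30}[(fo, dd)]
    return (f(0.15, fo) * f(0.01, dd) * f(0.60 if fo else 0.05, lo)
            * f(p_do, do) * f(0.70 if do else 0.01, hb))

# Numerator: sum out the hidden variables dd, lo, do with fo=t, hb=t.
num = sum(joint(True, dd, lo, do, True)
          for dd, lo, do in product([True, False], repeat=3))
# Denominator: additionally sum over fo.
den = sum(joint(fo, dd, lo, do, True)
          for fo, dd, lo, do in product([True, False], repeat=4))
print(num / den)  # ~0.335: hearing a bark roughly doubles P(family out)
```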
Evaluating Probabilities Using BN’s
• Approximate solutions
  – Logic sampling
  – Markov chain Monte Carlo algorithms
  – Likelihood weighting method
• General approach to approximate solutions
  – Select values for a subset of nodes
  – Use this 'evidence' to pick values for the remaining nodes
  – Keep statistics on all the nodes' values
Logic Sampling
• Logic Sampling Algorithm
  1) Guess values for all root nodes according to their prior probabilities
     P(fo) = 0.15 → fo = true 15% of the time
  2) Work down the network, guessing a value for each lower node based on its parents' values
     Given previous values fo = true and dd = false: P(do | fo ~dd) = 0.90 → do = true 90% of the time
  3) Repeat many times for the entire network and keep track of how often each node is assigned each value
  4) To determine a conditional probability such as P(fo=true | hb=true), consider the cases where hb = true and count the fraction of them in which fo = true
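The four steps above can be sketched directly in Python using the slide's numbers (the light-on node is skipped here, since the query never mentions it and it has no children):

```python
import random

# Logic sampling estimate of P(fo=true | hb=true) for the family/dog network.
random.seed(0)
P_do = {(True, True): 0.99, (True, False): 0.90,
        (False, True): 0.97, (False, False): 0.30}

bark_count = fo_and_bark = 0
for _ in range(200_000):
    fo = random.random() < 0.15                    # step 1: sample roots from priors
    dd = random.random() < 0.01
    do = random.random() < P_do[(fo, dd)]          # step 2: children given parents
    hb = random.random() < (0.70 if do else 0.01)
    if hb:                                         # step 4: keep cases with hb=true
        bark_count += 1
        fo_and_bark += fo
print(fo_and_bark / bark_count)  # ~0.33; exact enumeration gives ~0.335
```

Note that samples with hb = false are simply discarded, which is why logic sampling becomes inefficient when the evidence is rare; that is the motivation for likelihood weighting.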
Applications
• Bayesian networks were popularized by the Artificial Intelligence community, which used them as a learning algorithm
• Used to model trust in a P2P network: "Bayesian network-based trust model in P2P networks" by Wang and Vassileva
• Used to evaluate circuit reliability: "Scalable probabilistic computing models using Bayesian networks" by Rejimon and Bhanja
[Slide image: "If I see a red object, what is the probability that I should stop?"]
BN’s for Circuit Reliability
• Circuit Example
  – Inputs: Z1, Z2, Z3
  – Internal signals: X1, X2
  – Outputs: Y1, Y2
[Circuit diagram with inputs Z1, Z2, Z3, internal signals X1, X2, and outputs Y1, Y2]
BN’s for Circuit Reliability
• Goal: Analyze circuit reliability in the face of dynamic errors
• Procedure:
  – Construct an error-prone version of the circuit in which each gate has a probability of failure p
  – Analyze this circuit in relation to the fault-free circuit
[Diagram of the error-prone circuit: inputs Z1, Z2, Z3; faulty internal signals Xe1, Xe2; faulty outputs Ye1, Ye2; each gate fails with probability p]
BN’s for Circuit Reliability
• The error at the ith output is the XOR of the faulty and fault-free outputs:

  Ei = Yei ⊕ Yi
  P(Ei = 1) = P(Yei ⊕ Yi = 1)
• Equivalent circuit representation
[Diagram: the fault-free circuit (signals X1, X2, Y1, Y2) and the error-prone circuit (Xe1, Xe2, Ye1, Ye2, gate failure probability p) share the inputs Z1, Z2, Z3; XOR gates compare corresponding outputs to produce E1 and E2]
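As a quick sanity check on this error model, simulating a single error-prone gate should give P(E = 1) ≈ p, since for one gate the XOR of faulty and fault-free outputs is 1 exactly when the gate fails. An illustrative sketch (the failure probability p = 0.05 is an assumption, matching one of the values used later):

```python
import random

random.seed(1)
p = 0.05  # assumed gate failure probability

def nand(a, b):
    """Fault-free NAND of two bits."""
    return 1 - (a & b)

trials, errors = 100_000, 0
for _ in range(trials):
    a, b = random.randint(0, 1), random.randint(0, 1)
    y = nand(a, b)                      # fault-free output Y
    ye = y ^ (random.random() < p)      # error-prone output Ye: flipped w.p. p
    errors += ye ^ y                    # E = Ye XOR Y
rate = errors / trials
print(rate)  # ~0.05, i.e. close to p
```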
BN’s for Circuit Reliability
• In a circuit, each gate output has a causal relationship with its inputs, so circuits can be represented as Bayesian networks
• In the Bayesian network representation of a circuit
  – Inputs are root nodes
  – Outputs are leaf nodes
  – Internal signals are internal nodes
  – Each node's conditional probability is determined by the gate type and the probability of error, p
BN’s for Circuit Reliability
[Diagram: the circuit (left) and its Bayesian network representation (right), with nodes Z1, Z2, Z3, X1, X2, Xe1, Xe2, Y1, Y2, Ye1, Ye2, E1, and E2]
BN’s for Circuit Reliability
• Specifying the Bayesian network
  – Prior probabilities for all root nodes (the circuit inputs)
  – Conditional probabilities for all nonroot nodes given all possible combinations of parents
• Example: Ye1 is the output of a NAND gate with inputs Z1 and Xe1

  P(Ye1 = 1 | Z1 = 0, Xe1 = 0) = 1 - p  (the probability of no gate error)
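Filling in such a node's conditional probability table follows directly from the gate type: the error-prone gate produces the correct output with probability 1-p and the complement with probability p. A minimal sketch (p = 0.05 is an illustrative choice):

```python
def noisy_nand_cpt(p):
    """P(output = 1 | a, b) for an error-prone NAND gate that fails
    (flips its output) with probability p."""
    cpt = {}
    for a in (0, 1):
        for b in (0, 1):
            correct = 1 - (a & b)                    # fault-free NAND value
            cpt[(a, b)] = (1 - p) if correct else p  # correct w.p. 1-p
    return cpt

p = 0.05  # illustrative gate failure probability
cpt = noisy_nand_cpt(p)
print(cpt[(0, 0)])  # 0.95, matching P(Ye1=1 | Z1=0, Xe1=0) = 1-p
```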
BN’s for Circuit Reliability
• Solving for the error probabilities E1 and E2
  – The gate error probability is fixed (p = 0.005, 0.05, 0.1)
  – The logic sampling algorithm is used
  – This determines the probability of output error given each input combination
• Results
  – Circuits with 2,000-3,000 gates took on average 18 s to simulate
  – Average accuracy: 0.7%; worst case: 3.34%
  – Compare this to an accurate method that takes ~1,000 s to simulate simple circuits with only tens of gates
Advanced Subjects in BN’s
• Dynamic Bayesian networks
  – Model variables whose values change over time
  – Capture this process by representing each state variable as multiple copies, one per time step
• Learning in Bayesian networks
  – The conditional probabilities of each node can be updated continuously
  – Similar to the way weights are adjusted in neural networks
Conclusions
• Bayesian networks are a powerful, general tool for modeling probabilistic systems
• Applications are diverse
  – The medical field
  – Image processing
  – Speech recognition
  – Computer networking
• Used for efficient evaluation of nanoscale circuit reliability