
Agents almost never have access to the whole truth about their environment, which is a constraint for logical agents.

Some sentences can be ascertained directly from the agent's percepts; others can be inferred from current and previous percepts together with knowledge about the properties of the environment.

Agents therefore have to act under uncertainty: there is incompleteness and incorrectness in the agent's understanding of the environment. In real-life situations we cannot be certain of the evidence.

Uncertainty

Let action A_t = leave for airport t minutes before the flight.
Will A_t get me there on time?

Problems:
1. Partial observability (road state, other drivers' plans, etc.)
2. Uncertainty in action outcomes (flat tire, etc.)
3. Immense complexity of modeling and predicting traffic

Hence a purely logical approach either
1. risks falsehood: "A25 will get me there on time", or
2. leads to conclusions that are too weak for decision making: "A90 will get me there on time if there's no accident on the bridge, and it doesn't rain, and my tires remain intact, etc."

A120 will increase the agent's belief that it gets there on time, but will also increase the likelihood of a long wait. (A1440 might reasonably be said to get me there on time, but I'd have to stay overnight in the airport.)

Handling uncertain knowledge: Example 1

The doorbell rang at midnight. Was someone at the door? Did Mohun wake up?

Proposition 1: AtDoor(x) ⇒ Doorbell
Proposition 2: Doorbell ⇒ Wake(Mohun)

But the doorbell may also be due to a short circuit, an animal, or the wind, so Proposition 1 must be modified:

AtDoor(x) ∨ ShortCircuit ∨ Wind ⇒ Doorbell

Handling uncertain knowledge: Example 2

∀p Symptom(p, Toothache) ⇒ Disease(p, Cavity)

Not all patients with toothaches have cavities; some may have gum disease, wisdom teeth, or other causes. We could try:

∀p Symptom(p, Toothache) ⇒ Disease(p, Cavity) ∨ Disease(p, GumDisease) ∨ Disease(p, WisdomTeeth) ∨ ...

This can only be made true by adding an almost unlimited list of possible causes. Turning the rule around:

∀p Disease(p, Cavity) ⇒ Symptom(p, Toothache)

is also not true, because not all cavities cause toothache.

Methods for handling uncertainty

Default or nonmonotonic logic:
  Assume my car does not have a flat tire.
  Assume A25 works unless contradicted by evidence.
  Issues: Which assumptions are reasonable? How do we handle contradictions?

Rules with fudge factors:
  A25 |-> (0.3) get there on time
  Sprinkler |-> (0.99) WetGrass
  WetGrass |-> (0.7) Rain
  Issues: Problems with combination, e.g., does Sprinkler cause Rain??

Probability:
  Model the agent's degree of belief.
  E.g., "Given the available evidence, A25 will get me there on time with probability 0.04."

Probability

A purely first-order logic approach fails due to:
  Laziness: failure to enumerate all the exceptions, qualifications, etc.
  Theoretical ignorance: no complete theory exists for the domain.
  Practical ignorance: lack of relevant facts, initial conditions, etc.

The agent's knowledge can at best provide only a degree of belief in a sentence. Probability theory is one main tool for dealing with degrees of belief. E.g., if 80% of the toothache patients seen so far have had cavities, then Probability = 0.8.

Making decisions under uncertainty

Suppose I believe the following:
  P(A25 gets me there on time) = 0.04
  P(A90 gets me there on time) = 0.70
  P(A120 gets me there on time) = 0.95
  P(A1440 gets me there on time) = 0.9999

Which action should I choose? It depends on my preferences for missing the flight vs. time spent waiting, etc. Utility theory (utility: the quality of being useful) is used to represent and infer preferences.

Decision theory = probability theory + utility theory
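The decision rule implied above is to pick the action with the highest expected utility. A minimal sketch, using the probabilities from this slide; the utility of catching the flight and the cost per minute of waiting are illustrative assumptions, not values from the lecture:

```python
# Choose the action that maximizes expected utility.
P_ON_TIME = {"A25": 0.04, "A90": 0.70, "A120": 0.95, "A1440": 0.9999}
WAIT_MINUTES = {"A25": 25, "A90": 90, "A120": 120, "A1440": 1440}

U_CATCH_FLIGHT = 1000.0  # assumed utility of making the flight
U_MISS_FLIGHT = 0.0      # assumed utility of missing it
COST_PER_MINUTE = 0.5    # assumed disutility per minute spent waiting

def expected_utility(action):
    # EU = P(on time) * U(catch) + P(late) * U(miss) - waiting cost
    p = P_ON_TIME[action]
    reward = p * U_CATCH_FLIGHT + (1 - p) * U_MISS_FLIGHT
    return reward - COST_PER_MINUTE * WAIT_MINUTES[action]

best = max(P_ON_TIME, key=expected_utility)
print(best)  # A120: A1440 almost guarantees arrival but the wait is too costly
```

With these assumed utilities, A120 wins: A1440 has the highest probability of success but the overnight wait dominates its utility.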

Prior probability

P(A) denotes the unconditional or prior probability of A. P(Cavity) = 0.1 means that, in the absence of any other information, the agent assigns a probability of 0.1 (a 10% chance) to the event.

Propositions can also include equalities involving so-called random variables:
  P(Weather = Sunny) = 0.7
  P(Weather = Rainy) = 0.2
  P(Weather = Cloudy) = 0.08

Logical connectives may also be used, e.g., P(Cavity ∧ ¬Insured) = 0.06.

P(Cavity) is viewed as P(Cavity = true) and P(¬Cavity) as P(Cavity = false). Complex propositions are formed from elementary propositions and the standard logical connectives, e.g., Weather = Sunny ∧ Cavity = false.
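A prior distribution over a random variable can be represented as a map from values to probabilities that sums to 1. A minimal sketch using the Weather values above; the Snow probability of 0.02 is an assumption added so the distribution sums to 1:

```python
# Prior distribution over the random variable Weather.
# Sunny/Rainy/Cloudy come from the slide; Snow = 0.02 is assumed
# so that the probabilities sum to 1.
weather_prior = {"Sunny": 0.7, "Rainy": 0.2, "Cloudy": 0.08, "Snow": 0.02}

# A valid distribution assigns a probability in [0, 1] to each value
# and the probabilities sum to 1.
assert all(0.0 <= p <= 1.0 for p in weather_prior.values())
assert abs(sum(weather_prior.values()) - 1.0) < 1e-9
print(weather_prior["Sunny"])  # 0.7
```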

Conditional probability or posterior probability

P(A|B) = P(A ∧ B) / P(B)

This is the probability of A given that all we know is B. E.g., P(cavity | toothache) = 0.8, i.e., given that a toothache is all I know.

Product rule: P(A ∧ B) = P(A|B) P(B)
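The definition above can be computed directly. A short sketch; the joint and marginal values here are illustrative assumptions chosen so the answer matches the slide's P(cavity | toothache) = 0.8:

```python
# Conditional probability from the definition P(A|B) = P(A and B) / P(B).
# Both input numbers are assumed for illustration.
p_cavity_and_toothache = 0.12  # assumed P(cavity and toothache)
p_toothache = 0.15             # assumed P(toothache)

p_cavity_given_toothache = p_cavity_and_toothache / p_toothache
print(round(p_cavity_given_toothache, 2))  # 0.8
```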

Axioms of probability

For any propositions A, B:
  0 ≤ P(A) ≤ 1
  P(true) = 1 and P(false) = 0
  P(A ∨ B) = P(A) + P(B) − P(A ∧ B)
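The third axiom (inclusion-exclusion for two propositions) is easy to check numerically. The probabilities below are illustrative assumptions:

```python
# P(A or B) = P(A) + P(B) - P(A and B); all three inputs are assumed.
p_a, p_b, p_a_and_b = 0.4, 0.3, 0.1

p_a_or_b = p_a + p_b - p_a_and_b
print(round(p_a_or_b, 1))  # 0.6

# The result is itself a valid probability.
assert 0.0 <= p_a_or_b <= 1.0
```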

Joint probability distribution

The joint probability distribution for a set of random variables gives the probability of every atomic event on those random variables. P(Weather, Cavity) is a 4 × 2 matrix of values:

                 Weather = sunny   rainy   cloudy   snow
Cavity = true              0.144    0.02    0.016    0.02
Cavity = false             0.576    0.08    0.064    0.08
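From this joint distribution, marginal and conditional probabilities can be computed by summation, as a sketch:

```python
# The joint distribution P(Weather, Cavity) from the table above,
# keyed by (weather, cavity) pairs.
joint = {
    ("sunny", True): 0.144, ("rainy", True): 0.02,
    ("cloudy", True): 0.016, ("snow", True): 0.02,
    ("sunny", False): 0.576, ("rainy", False): 0.08,
    ("cloudy", False): 0.064, ("snow", False): 0.08,
}

# Marginalize out Weather: P(Cavity = true) = sum over all weather values.
p_cavity = sum(p for (w, c), p in joint.items() if c)

# Condition: P(Cavity = true | Weather = sunny)
# = P(sunny and cavity) / P(sunny).
p_sunny = sum(p for (w, c), p in joint.items() if w == "sunny")
p_cavity_given_sunny = joint[("sunny", True)] / p_sunny

print(round(p_cavity, 2))              # 0.2
print(round(p_cavity_given_sunny, 2))  # 0.2
```

In this particular table, conditioning on sunny weather does not change the belief in a cavity: the two variables happen to be independent.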

Bayes' Rule

Product rule: P(a ∧ b) = P(a|b) P(b) = P(b|a) P(a)

Bayes' rule: P(a|b) = P(b|a) P(a) / P(b)

Useful for assessing a diagnostic probability from a causal probability:

P(Cause|Effect) = P(Effect|Cause) P(Cause) / P(Effect)

E.g., let M be meningitis and S be stiff neck:

P(m|s) = P(s|m) P(m) / P(s) = 0.8 × 0.0001 / 0.1 = 0.0008

Note: the posterior probability of meningitis is still very small!
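The meningitis calculation can be reproduced directly from Bayes' rule, using the three numbers given above:

```python
# Bayes' rule for the meningitis example:
# P(m|s) = P(s|m) * P(m) / P(s).
p_s_given_m = 0.8  # P(stiff neck | meningitis), causal probability
p_m = 0.0001       # prior P(meningitis)
p_s = 0.1          # P(stiff neck)

p_m_given_s = p_s_given_m * p_m / p_s
print(round(p_m_given_s, 6))  # 0.0008: still very small despite the stiff neck
```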

Normalisation

P(M|S) = P(S|M) P(M) / [ P(S|M) P(M) + P(S|¬M) P(¬M) ]

Here the denominator is just P(S), obtained by summing over both values of M.
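With normalisation, P(S) need not be known directly; it is computed by summing over both values of M. A sketch continuing the meningitis example; P(S|¬M) is an assumed value, not given in the lecture:

```python
# Bayes' rule with normalisation:
# P(M|S) = P(S|M)P(M) / (P(S|M)P(M) + P(S|~M)P(~M)).
p_s_given_m = 0.8      # P(stiff neck | meningitis), from the slide
p_m = 0.0001           # prior P(meningitis), from the slide
p_s_given_not_m = 0.1  # assumed: P(stiff neck | no meningitis)

numerator = p_s_given_m * p_m
denominator = numerator + p_s_given_not_m * (1 - p_m)  # this sum is P(S)
p_m_given_s = numerator / denominator
print(round(p_m_given_s, 6))
```

The denominator plays the role of a normalising constant: the two terms are the unnormalised posteriors for M and ¬M, and dividing by their sum makes them add up to 1.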
