Reasoning and Uncertainty in AI

1. Propositional Logic (PL)

Propositional Logic (PL) is the simplest form of logic, in which all knowledge is represented as propositions.

  • A proposition is a declarative statement that is either True (1) or False (0).
  • PL is also called Boolean logic since it works on binary values (0 and 1).
  • Propositions are denoted by symbols like P, Q, R.
Examples:

  1. “It is Sunday” → P (True if today is Sunday).
  2. “The Sun rises in the West” → Q (False).
  3. “3 + 3 = 7” → False.
  4. “5 is a prime number” → True.

NOTE: Throughout these notes, disjunction (∨) is written as OR and conjunction (∧) as AND.


| Connective | Symbol | Formula | Example | Truth Condition |
|---|---|---|---|---|
| Negation | ¬ | ¬P (“not P”) | “Not raining” | True if P = 0 |
| Conjunction | AND | P AND Q (“P and Q”) | “Rohan is intelligent and hardworking” | True only if both P and Q are True |
| Disjunction | OR | P OR Q (“P or Q”) | “Ritika is a doctor or an engineer” | True if at least one is True |
| Implication | → | P → Q (“if P then Q”) | “If it rains, the street is wet” | False only if P = 1 and Q = 0 |
| Biconditional | ↔ | P ↔ Q (“P iff Q”) | “I breathe if and only if I am alive” | True if P and Q have the same truth value |

| P | Q | P → Q |
|---|---|---|
| 1 | 1 | 1 |
| 1 | 0 | 0 |
| 0 | 1 | 1 |
| 0 | 0 | 1 |
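
Since P → Q is logically equivalent to ¬P OR Q, the table above can be reproduced with a minimal Python sketch (illustrative only):

```python
from itertools import product

# P -> Q is logically equivalent to (not P) or Q.
def implies(p: bool, q: bool) -> bool:
    return (not p) or q

# Reproduce the truth table, writing True/False as 1/0.
print("P Q P->Q")
for p, q in product([True, False], repeat=2):
    print(int(p), int(q), int(implies(p, q)))
```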

  • De Morgan’s Laws: ¬(P AND Q) = ¬P OR ¬Q, and ¬(P OR Q) = ¬P AND ¬Q

  • Double Negation: ¬(¬P) = P

  • Distributive Law: P AND (Q OR R) = (P AND Q) OR (P AND R)
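
Each of these equivalences can be verified by brute force over all truth assignments; a minimal sketch:

```python
from itertools import product

# Exhaustively check each law over every truth assignment.
for p, q, r in product([False, True], repeat=3):
    # De Morgan's Laws
    assert (not (p and q)) == ((not p) or (not q))
    assert (not (p or q)) == ((not p) and (not q))
    # Double negation
    assert (not (not p)) == p
    # Distributive law
    assert (p and (q or r)) == ((p and q) or (p and r))

print("All equivalences hold for every assignment.")
```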


2. First-Order Logic (FOL)

First-Order Logic (FOL) extends PL by adding objects, predicates, functions, quantifiers, and relations.

  • Constants: specific entities (e.g., Socrates).

  • Variables: placeholders (x, y).

  • Predicates: properties/relations (Human(x), Loves(x,y)).

  • Functions: mappings (Father(John)).

  • Quantifiers:

    • Universal (∀x): “for all x”
    • Existential (∃x): “there exists an x”
Examples:

  • “All humans are mortal” → ∀x (Human(x) → Mortal(x))
  • “There exists a student who studies AI” → ∃x (Student(x) AND Studies(x, AI))
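
Over a finite domain, ∀ and ∃ reduce to Python’s all() and any(). The sketch below checks both example formulas against a tiny domain; the individuals and their attributes are invented purely for illustration:

```python
# A tiny invented domain; names and attributes are made up for illustration.
domain = [
    {"name": "Socrates", "human": True, "mortal": True, "student": False, "studies": None},
    {"name": "Asha", "human": True, "mortal": True, "student": True, "studies": "AI"},
]

# ∀x (Human(x) → Mortal(x)): implication encoded as (not P) or Q
all_humans_mortal = all((not x["human"]) or x["mortal"] for x in domain)

# ∃x (Student(x) AND Studies(x, AI))
some_student_studies_ai = any(x["student"] and x["studies"] == "AI" for x in domain)

print(all_humans_mortal, some_student_studies_ai)  # True True
```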

Applications: NLP, planning, expert systems, robotics.


3. Rule-Based Systems

A rule-based system reasons using IF–THEN rules. Its main components are:

  1. Rules: IF (condition) THEN (action).

  2. Knowledge Base: stores rules and facts.

  3. Inference Engine:

     • Forward Chaining (data-driven).
     • Backward Chaining (goal-driven).

  4. Facts: known information.

  5. Explanation Facility: explains the reasoning.

  6. Knowledge Acquisition: adding/updating rules.

Example Rule: IF fever AND cough THEN flu
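
As a sketch of the data-driven (forward chaining) style, the toy engine below keeps firing rules whose conditions are all satisfied. The rule set, including the follow-on prescribe_rest rule, is invented for illustration:

```python
# Rules map a set of required facts (the IF part) to a derived fact (THEN).
rules = [
    ({"fever", "cough"}, "flu"),
    ({"flu"}, "prescribe_rest"),   # invented follow-on rule
]

def forward_chain(facts: set[str]) -> set[str]:
    """Repeatedly fire any rule whose conditions are all known."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(forward_chain({"fever", "cough"}))
# {'fever', 'cough', 'flu', 'prescribe_rest'}
```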

Applications: medical diagnosis, credit risk, troubleshooting.


4. Semantic Networks

Semantic networks represent knowledge as graphs.

  • Nodes = concepts.
  • Edges = relations.

Example Network

  • Tom → isa → Dog
  • Dog → isa → Mammal
  • Mammal → isa → Animal
  • Dog → likes → Bone
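
A network of this shape can be stored as a simple edge map; the following sketch answers a transitive is-a query such as “is Tom an animal?” (code illustrative, network as above):

```python
# The example network as an edge map keyed by (node, relation).
edges = {
    ("Tom", "isa"): "Dog",
    ("Dog", "isa"): "Mammal",
    ("Mammal", "isa"): "Animal",
    ("Dog", "likes"): "Bone",
}

def isa(node: str, target: str) -> bool:
    """Follow 'isa' links upward until target is found or the chain ends."""
    while (node, "isa") in edges:
        node = edges[(node, "isa")]
        if node == target:
            return True
    return False

print(isa("Tom", "Animal"))  # True: Tom -> Dog -> Mammal -> Animal
```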

Types of semantic networks:

  • Definitional (is-a hierarchy)
  • Assertional (facts)
  • Implicational (rules)
  • Executable (dynamic)
  • Learning (expanding with examples)
  • Hybrid (combinations)

5. Conceptual Graphs

Conceptual graphs extend semantic networks with a direct mapping to FOL.

  • Rectangles = concepts.
  • Ellipses = relations.

Example Sentence: “A cat is on a mat”

Graph: [Cat] – (On) – [Mat]

FOL Representation: ∃x ∃y (Cat(x) AND Mat(y) AND On(x,y))
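
Because the mapping to FOL is direct, a conceptual-graph edge can be translated mechanically. A minimal sketch (the triple encoding is an illustrative choice):

```python
# A conceptual-graph edge as a (concept, relation, concept) triple.
triple = ("Cat", "On", "Mat")

def to_fol(c1: str, rel: str, c2: str) -> str:
    """Emit the existential FOL form used in the notes."""
    return f"∃x ∃y ({c1}(x) AND {c2}(y) AND {rel}(x,y))"

print(to_fol(*triple))
# ∃x ∃y (Cat(x) AND Mat(y) AND On(x,y))
```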


6. Inference Rules

Inference is the process of deriving new conclusions from known facts.

| Rule | Formula | Example |
|---|---|---|
| Modus Ponens | P, P → Q ⇒ Q | If sleepy → bed; sleepy ⇒ bed |
| Modus Tollens | P → Q, ¬Q ⇒ ¬P | If sleepy → bed; not bed ⇒ not sleepy |
| Hypothetical Syllogism | P → Q, Q → R ⇒ P → R | key → unlock, unlock → money ⇒ key → money |
| Disjunctive Syllogism | P OR Q, ¬P ⇒ Q | Sun OR Mon; not Sun ⇒ Mon |
| Addition | P ⇒ P OR Q | I have vanilla ⇒ vanilla OR chocolate |
| Simplification | P AND Q ⇒ P | Study AND Work ⇒ Study |
| Resolution | (P OR Q), (¬P OR R) ⇒ (Q OR R) | Cancels complementary literals (P and ¬P) |
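
Modus ponens is easy to mechanise: keep applying P, P → Q ⇒ Q until nothing new appears. A toy sketch with invented symbols:

```python
# Facts and implications P -> Q; symbols are illustrative.
facts = {"sleepy"}
implications = [("sleepy", "go_to_bed")]

changed = True
while changed:
    changed = False
    for p, q in implications:
        if p in facts and q not in facts:   # Modus Ponens: P, P→Q ⇒ Q
            facts.add(q)
            changed = True

print(facts)  # {'sleepy', 'go_to_bed'}
```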

7. Resolution Refutation & Answer Extraction


Resolution = proof by contradiction.

  1. Convert all statements to CNF (Conjunctive Normal Form).
  2. Negate the statement to prove.
  3. Apply resolution repeatedly.
  4. If a contradiction (the empty clause) is derived → proof complete.

Example

  1. Premise 1: Pleasant → StrawberryPicking
  2. Premise 2: StrawberryPicking → Happy
  3. Goal: prove Pleasant → Happy
  4. Negated goal: Pleasant AND ¬Happy
  5. Apply resolution repeatedly → contradiction (empty clause) → proof complete.
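
The same derivation can be run mechanically. Below is a minimal propositional resolution sketch, encoding CNF clauses as sets of literals with ~ for negation; this is an illustrative toy, not an efficient prover:

```python
from itertools import combinations

# KB in CNF: Pleasant→StrawberryPicking, StrawberryPicking→Happy,
# plus the negated goal Pleasant AND ¬Happy.
clauses = {
    frozenset({"~Pleasant", "StrawberryPicking"}),
    frozenset({"~StrawberryPicking", "Happy"}),
    frozenset({"Pleasant"}),   # negated goal, part 1
    frozenset({"~Happy"}),     # negated goal, part 2
}

def negate(lit: str) -> str:
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolution_refutation(clauses: set) -> bool:
    """Return True if the empty clause (a contradiction) is derivable."""
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for lit in c1:
                if negate(lit) in c2:
                    resolvent = (c1 - {lit}) | (c2 - {negate(lit)})
                    if not resolvent:        # empty clause: contradiction
                        return True
                    new.add(resolvent)
        if new <= clauses:                   # nothing new: no proof
            return False
        clauses = clauses | new

print(resolution_refutation(clauses))  # True, so Pleasant → Happy holds
```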

8. Reasoning Under Uncertainty

In real-world AI, data is often incomplete, noisy, or unreliable. Common sources of uncertainty include:

  • Sensor failures
  • Environmental variations
  • Incomplete knowledge
  • Human error

  • Probability axioms: 0 ≤ P(A) ≤ 1; P(True) = 1, P(False) = 0

  • Conditional probability: P(A|B) = P(A AND B) / P(B)

  • Bayes’ theorem: P(A|B) = [P(B|A) * P(A)] / P(B)

Example: 70% of students like English and 40% like both English and Math. Then P(Math | English) = P(Math AND English) / P(English) = 0.4 / 0.7 ≈ 0.57.
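
The arithmetic is a one-liner; a sketch verifying the worked example (the Bayes’ theorem numbers at the end are invented for illustration):

```python
# Conditional probability for the worked example above.
p_english = 0.70          # P(English)
p_both = 0.40             # P(Math AND English)
print(round(p_both / p_english, 2))   # 0.57

# Bayes' theorem with invented numbers: P(A|B) = P(B|A) * P(A) / P(B)
p_a, p_b_given_a, p_b = 0.3, 0.5, 0.4
print(p_b_given_a * p_a / p_b)        # 0.375
```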


9. Bayesian Belief Networks (BBN)

A Bayesian Belief Network (BBN) is a Directed Acyclic Graph (DAG) showing dependencies among random variables.

  • Nodes = random variables
  • Edges = causal dependencies
  • CPT (Conditional Probability Table) defines probabilities

Example network structure:

  • Cloudy → Sprinkler
  • Cloudy → Rain
  • Sprinkler + Rain → WetGrass

Conditional Probability Table (CPT)

| Variable | Parents | CPT Example |
|---|---|---|
| Cloudy | — | P(C=T) = 0.5 |
| Sprinkler | Cloudy | P(S=T \| C=T) = 0.1, P(S=T \| C=F) = 0.5 |
| Rain | Cloudy | P(R=T \| C=T) = 0.8, P(R=T \| C=F) = 0.2 |
| WetGrass | Sprinkler, Rain | P(W=T \| S=T, R=T) = 0.99, P(W=T \| S=F, R=F) = 0.0 |
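
With the CPTs in hand, any marginal can be computed by enumerating the joint distribution. A sketch follows; note that the two WetGrass entries missing from the table, P(W=T | S=T, R=F) and P(W=T | S=F, R=T), are assumed here to be 0.90, following the common textbook version of this sprinkler network:

```python
from itertools import product

# CPTs from the table above. The two WetGrass rows not given in the
# notes (0.90 each) are ASSUMED, per the usual textbook network.
P_C = 0.5
P_S = {True: 0.1, False: 0.5}                      # P(S=T | C)
P_R = {True: 0.8, False: 0.2}                      # P(R=T | C)
P_W = {(True, True): 0.99, (True, False): 0.90,    # P(W=T | S, R)
       (False, True): 0.90, (False, False): 0.0}

def joint(c: bool, s: bool, r: bool, w: bool) -> float:
    """P(C, S, R, W) factorised along the DAG."""
    p = P_C if c else 1 - P_C
    p *= P_S[c] if s else 1 - P_S[c]
    p *= P_R[c] if r else 1 - P_R[c]
    p *= P_W[(s, r)] if w else 1 - P_W[(s, r)]
    return p

# Marginal P(WetGrass = True) by summing out C, S, R.
p_wet = sum(joint(c, s, r, True) for c, s, r in product([True, False], repeat=3))
print(round(p_wet, 4))  # ~0.5999 under the assumed CPT entries
```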

Applications: medical diagnosis, anomaly detection, decision-making.


10. Summary of Methods

| Method | Description | Example Use |
|---|---|---|
| Propositional Logic | Binary logic with truth values | Knowledge bases |
| First-Order Logic | Extends PL with quantifiers/objects | NLP, planning |
| Rule-Based Systems | IF–THEN reasoning | Medical diagnosis |
| Semantic Nets | Graph-based relations | Knowledge graphs |
| Conceptual Graphs | Graph + FOL mapping | Ontologies |
| Inference Rules | Deductive reasoning techniques | Automated proofs |
| Resolution Refutation | Proof by contradiction | Theorem proving |
| Probabilistic Reasoning | Probability-based reasoning | Weather prediction |
| Bayesian Networks | DAG with CPTs for uncertainty | Diagnosis, forecasting |