Reasoning and Uncertainty in AI
1. Propositional Logic (PL)
Propositional Logic is the simplest form of logic, in which all knowledge is represented as propositions.
- A proposition is a declarative statement that is either True (1) or False (0).
- PL is also called Boolean logic since it works on binary values (0 and 1).
- Propositions are denoted by symbols like P, Q, R.
Examples of Propositions
- “It is Sunday” → P (True if today is Sunday).
- “The Sun rises from the West” → Q (False).
- “3 + 3 = 7” → False.
- “5 is a prime number” → True.
NOTE: In the tables below, disjunction (∨) is written as OR and conjunction (∧) as AND.
Logical Connectives
| Connective | Symbol | Meaning | Example | Truth Condition |
|---|---|---|---|---|
| Negation | ¬P | Not P | “Not raining” | True if P = 0 |
| Conjunction | P AND Q | P and Q | “Rohan is intelligent and hardworking” | True only if both P and Q are True |
| Disjunction | P OR Q | P or Q | “Ritika is a doctor or an engineer” | True if at least one is True |
| Implication | P → Q | If P, then Q | “If it rains, the street is wet” | False only if P = 1 and Q = 0 |
| Biconditional | P ↔ Q | P if and only if Q | “I breathe if and only if I am alive” | True if P and Q have the same value |
Truth Table (for Implication P → Q)
| P | Q | P → Q |
|---|---|---|
| 1 | 1 | 1 |
| 1 | 0 | 0 |
| 0 | 1 | 1 |
| 0 | 0 | 1 |
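The implication table can be reproduced by treating P → Q as (not P) or Q; a minimal Python check:

```python
from itertools import product

def implies(p, q):
    """Material implication: P -> Q is false only when P = 1 and Q = 0."""
    return int((not p) or q)

# Rows in the same order as the table above.
table = [(p, q, implies(p, q)) for p, q in product([1, 0], repeat=2)]
for row in table:
    print(row)
```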
Laws in Propositional Logic
- De Morgan’s Laws: ¬(P AND Q) = ¬P OR ¬Q; ¬(P OR Q) = ¬P AND ¬Q
- Double Negation: ¬(¬P) = P
- Distributive Law: P AND (Q OR R) = (P AND Q) OR (P AND R)
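Because propositional formulas have finitely many assignments, these laws can be verified by brute-force enumeration; a quick Python sketch:

```python
from itertools import product

# Brute-force check: each identity holds for every assignment of P, Q, R.
for p, q, r in product([True, False], repeat=3):
    assert (not (p and q)) == ((not p) or (not q))       # De Morgan 1
    assert (not (p or q)) == ((not p) and (not q))       # De Morgan 2
    assert (not (not p)) == p                            # double negation
    assert (p and (q or r)) == ((p and q) or (p and r))  # distributive
print("all laws hold")
```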
2. First-Order Logic (FOL)
FOL extends PL by including objects, predicates, quantifiers, and relations.
Components
- Constants: specific entities (e.g., Socrates).
- Variables: placeholders (x, y).
- Predicates: properties/relations (Human(x), Loves(x,y)).
- Functions: mappings (Father(John)).
- Quantifiers:
  - Universal (∀x): “for all x”
  - Existential (∃x): “there exists an x”
Examples
- “All humans are mortal” → ∀x (Human(x) → Mortal(x))
- “There exists a student who studies AI” → ∃x (Student(x) AND Studies(x, AI))
Applications: NLP, planning, expert systems, robotics.
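Over a finite domain, quantified formulas like the ones above can be evaluated directly; a sketch in which the individuals and facts are made up for illustration:

```python
# A toy finite domain; all individuals and facts below are hypothetical.
domain = {"Socrates", "Plato", "Rex"}
human = {"Socrates", "Plato"}            # Human(x)
mortal = {"Socrates", "Plato", "Rex"}    # Mortal(x)
student = {"Plato"}                      # Student(x)
studies = {("Plato", "AI")}              # Studies(x, y)

# ∀x (Human(x) → Mortal(x)): every human must also be mortal.
all_humans_mortal = all(x in mortal for x in domain if x in human)

# ∃x (Student(x) AND Studies(x, AI)): at least one witness suffices.
exists_ai_student = any(x in student and (x, "AI") in studies for x in domain)

print(all_humans_mortal, exists_ai_student)  # -> True True
```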
3. Rule-Based Systems
A rule-based system works on IF–THEN rules.
Components
- Rules: IF (condition) THEN (action).
- Knowledge Base: stores rules and facts.
- Inference Engine:
  - Forward Chaining (data-driven).
  - Backward Chaining (goal-driven).
- Facts: known information.
- Explanation Facility: explains reasoning.
- Knowledge Acquisition: adding/updating rules.
Example Rule: IF fever AND cough THEN flu
Applications: medical diagnosis, credit risk, troubleshooting.
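The flu rule above can drive a minimal forward-chaining loop; a sketch in which the set-based rule encoding and the second rule are illustrative assumptions:

```python
# Rules as (set of conditions, conclusion); the second rule is hypothetical.
rules = [({"fever", "cough"}, "flu"),
         ({"flu"}, "rest")]
facts = {"fever", "cough"}

# Forward chaining: fire any rule whose conditions are all known facts,
# and repeat until no new fact can be derived.
changed = True
while changed:
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(facts)  # contains 'flu' and 'rest'
```

Backward chaining would instead start from the goal (`flu`) and work backwards, checking whether each rule's conditions can be satisfied.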
4. Semantic Networks
Semantic networks represent knowledge using graphs.
- Nodes = concepts.
- Edges = relations.
Example Network
- Tom → isa → Dog
- Dog → isa → Mammal
- Mammal → isa → Animal
- Dog → likes → Bone
Types of Semantic Networks
- Definitional (is-a hierarchy)
- Assertional (facts)
- Implicational (rules)
- Executable (dynamic)
- Learning (expanding with examples)
- Hybrid (combinations)
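The example network can be stored as labelled edges and queried for inherited class membership; a sketch that assumes a single `isa` parent per node for simplicity:

```python
# The example network as labelled edges (single isa parent per node assumed).
edges = {("Tom", "isa"): "Dog",
         ("Dog", "isa"): "Mammal",
         ("Mammal", "isa"): "Animal",
         ("Dog", "likes"): "Bone"}

def is_a(node, target):
    """Follow isa links upward; inheritance makes Tom an Animal."""
    while (node, "isa") in edges:
        node = edges[(node, "isa")]
        if node == target:
            return True
    return False

print(is_a("Tom", "Animal"))  # -> True
```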
5. Conceptual Graphs
Conceptual Graphs extend semantic networks with a direct mapping to FOL.
- Rectangles = concepts.
- Ellipses = relations.
Example Sentence: “A cat is on a mat”
Graph: [Cat] – (On) – [Mat]
FOL Representation: ∃x ∃y (Cat(x) AND Mat(y) AND On(x,y))
6. Inference and Deduction
Inference = process of deriving conclusions from known facts.
Inference Rules
| Rule | Formula | Example |
|---|---|---|
| Modus Ponens | P, P → Q ⇒ Q | If sleepy → bed; sleepy ⇒ bed |
| Modus Tollens | P → Q, ¬Q ⇒ ¬P | If sleepy → bed; not bed ⇒ not sleepy |
| Hypothetical Syllogism | P → Q, Q → R ⇒ P → R | key→unlock, unlock→money ⇒ key→money |
| Disjunctive Syllogism | P OR Q, ¬P ⇒ Q | Sun OR Mon; not Sun ⇒ Mon |
| Addition | P ⇒ P OR Q | I have vanilla ⇒ Vanilla OR Chocolate |
| Simplification | P AND Q ⇒ P | Study AND Work ⇒ Study |
| Resolution | (P OR Q), (¬P OR R) ⇒ (Q OR R) | Cancels the complementary literals P and ¬P |
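Each rule in the table is valid because its conclusion is true in every truth assignment that satisfies all its premises; a brute-force Python check of two rows:

```python
from itertools import product

def implies(p, q):
    return (not p) or q

def valid(premises, conclusion):
    """A rule is valid if the conclusion holds whenever all premises hold."""
    return all(conclusion(p, q, r)
               for p, q, r in product([True, False], repeat=3)
               if all(prem(p, q, r) for prem in premises))

# Modus ponens: P, P -> Q  =>  Q
assert valid([lambda p, q, r: p, lambda p, q, r: implies(p, q)],
             lambda p, q, r: q)
# Hypothetical syllogism: P -> Q, Q -> R  =>  P -> R
assert valid([lambda p, q, r: implies(p, q), lambda p, q, r: implies(q, r)],
             lambda p, q, r: implies(p, r))
print("rules verified")
```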
7. Resolution Refutation & Answer Extraction
Resolution = proof by contradiction.
- Convert all statements to CNF (Conjunctive Normal Form).
- Negate the statement to prove.
- Apply resolution repeatedly.
- If contradiction occurs → proof complete.
Example
- Pleasant → StrawberryPicking
- StrawberryPicking → Happy
- Prove: Pleasant → Happy
- Negation: Pleasant AND ¬Happy
- Apply resolution → contradiction → proof done.
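The refutation above can be sketched as a small propositional resolution loop; the string-based literal encoding (with `-` marking negation) is an assumption for illustration:

```python
# Resolution refutation of the example; clauses are frozensets of literals.
def resolve(c1, c2):
    """Return every resolvent obtained by cancelling a complementary pair."""
    out = []
    for lit in c1:
        neg = lit[1:] if lit.startswith("-") else "-" + lit
        if neg in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {neg})))
    return out

clauses = {frozenset({"-Pleasant", "StrawberryPicking"}),  # Pleasant -> Picking
           frozenset({"-StrawberryPicking", "Happy"}),     # Picking -> Happy
           frozenset({"Pleasant"}),                        # negated goal (CNF)
           frozenset({"-Happy"})}                          # negated goal (CNF)

# Saturate: keep resolving until the empty clause (a contradiction) appears.
proved = False
while not proved:
    new = {r for c1 in clauses for c2 in clauses for r in resolve(c1, c2)}
    proved = frozenset() in new
    if not proved and new <= clauses:
        break  # nothing new and no contradiction: goal not provable
    clauses |= new

print("contradiction found:", proved)
```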
8. Reasoning Under Uncertainty
In real-world AI, data is often incomplete, noisy, or unreliable.
Causes of Uncertainty
- Sensor failures
- Environmental variations
- Incomplete knowledge
- Human error
Probabilistic Reasoning
- Probability Axioms: 0 ≤ P(A) ≤ 1; P(True) = 1, P(False) = 0
- Conditional Probability: P(A|B) = P(A AND B) / P(B)
- Bayes’ Theorem: P(A|B) = [P(B|A) * P(A)] / P(B)
Example: 70% of students like English and 40% like both English and Math.
P(Math | English) = P(Math AND English) / P(English) = 0.4 / 0.7 ≈ 0.57
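The worked example in code, showing that conditioning rescales the joint probability by the probability of the evidence:

```python
# P(Math | English) = P(Math AND English) / P(English)
p_english = 0.70   # P(English)
p_both = 0.40      # P(Math AND English)

p_math_given_english = p_both / p_english
print(round(p_math_given_english, 2))  # -> 0.57
```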
9. Bayesian Belief Networks (BBNs)
A Bayesian Belief Network is a Directed Acyclic Graph (DAG) showing dependencies among variables.
- Nodes = random variables
- Edges = causal dependencies
- CPT (Conditional Probability Table) defines probabilities
Example
- Cloudy → Sprinkler
- Cloudy → Rain
- Sprinkler + Rain → WetGrass
Conditional Probability Table (CPT)
| Variable | Parents | CPT Example |
|---|---|---|
| Cloudy | — | P(C=T) = 0.5 |
| Sprinkler | Cloudy | P(S=T \| C=T) = 0.1, P(S=T \| C=F) = 0.5 |
| Rain | Cloudy | P(R=T \| C=T) = 0.8, P(R=T \| C=F) = 0.2 |
| WetGrass | Sprinkler, Rain | P(W=T \| S=T, R=T) = 0.99, P(W=T \| S=F, R=F) = 0.0 |
Applications: medical diagnosis, anomaly detection, decision-making.
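With the CPTs above, any joint probability factorises along the DAG as P(C) P(S|C) P(R|C) P(W|S,R), and marginals follow by summing out variables. A sketch; the two WetGrass rows not shown in the table (mixed S, R) are filled with 0.9 as an assumption:

```python
from itertools import product

# CPTs from the table; the two missing WetGrass rows are assumed to be 0.9.
p_c = {True: 0.5, False: 0.5}          # P(C)
p_s = {True: 0.1, False: 0.5}          # P(S=T | C)
p_r = {True: 0.8, False: 0.2}          # P(R=T | C)
p_w = {(True, True): 0.99, (True, False): 0.9,
       (False, True): 0.9, (False, False): 0.0}  # P(W=T | S, R)

def joint(c, s, r, w):
    """P(C,S,R,W) = P(C) * P(S|C) * P(R|C) * P(W|S,R)."""
    pc = p_c[c]
    ps = p_s[c] if s else 1 - p_s[c]
    pr = p_r[c] if r else 1 - p_r[c]
    pw = p_w[(s, r)] if w else 1 - p_w[(s, r)]
    return pc * ps * pr * pw

# Marginal P(WetGrass = T): sum the joint over the other three variables.
p_wet = sum(joint(c, s, r, True) for c, s, r in product([True, False], repeat=3))
print(round(p_wet, 4))  # -> 0.6471
```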
Summary Table – Reasoning Methods
| Method | Description | Example Use |
|---|---|---|
| Propositional Logic | Binary logic with truth values | Knowledge bases |
| First-Order Logic | Extends PL with quantifiers/objects | NLP, planning |
| Rule-Based Systems | IF–THEN reasoning | Medical diagnosis |
| Semantic Nets | Graph-based relations | Knowledge graphs |
| Conceptual Graphs | Graph + FOL mapping | Ontologies |
| Inference Rules | Deductive reasoning techniques | Automated proofs |
| Resolution Refutation | Proof by contradiction | Theorem proving |
| Probabilistic Reasoning | Probability-based reasoning | Weather prediction |
| Bayesian Networks | DAG with CPTs for uncertainty | Diagnosis, forecasting |