CHAPTER 4
USING PREDICATE LOGIC
Logic
• A scientific study of the process of reasoning
• A system of rules and procedures that support the reasoning process
• Logic can be used to represent simple facts
• The facts are claims about the world that are True or False
• Logic defines ways of putting symbols together so that a user can construct legal sentences in
the language that represent true facts.
Types of Logic
1. Propositional Logic
2. Predicate Logic
Propositional Logic
• Simplest form
• Sentences that are either TRUE or FALSE, but not both, are called propositions
Predicate logic
Quantifiers
Universal quantification
• (∀x)P(x) means that P holds for all values of x in the domain associated with that variable
• E.g., (∀x) dolphin(x) → mammal(x)
Existential quantification
• (∃x)P(x) means that P holds for some value of x in the domain associated with that variable
• E.g., (∃x) mammal(x) ∧ lays-eggs(x)
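As a concrete illustration, quantified statements can be checked over a finite domain: ∀ behaves like Python's all() and ∃ like any(). A minimal sketch, assuming a made-up three-element domain and made-up dolphin/mammal/lays_eggs sets (these are illustrative, not from the slides):

# Hypothetical finite domain, used only to illustrate quantifier semantics.
domain = ["flipper", "platypus", "sparrow"]

dolphin = {"flipper"}
mammal = {"flipper", "platypus"}
lays_eggs = {"platypus", "sparrow"}

# (∀x) dolphin(x) → mammal(x): every dolphin in the domain is a mammal.
forall_holds = all((x not in dolphin) or (x in mammal) for x in domain)

# (∃x) mammal(x) ∧ lays_eggs(x): some mammal in the domain lays eggs.
exists_holds = any((x in mammal) and (x in lays_eggs) for x in domain)

print(forall_holds, exists_holds)   # True True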
Consider the following example that shows the use of predicate logic as a way of
representing knowledge.
1. Marcus was a man.
2. Marcus was a Pompeian.
3. All Pompeians were Romans.
4. Caesar was a ruler.
5. All Romans were either loyal to Caesar or hated him.
6. Everyone is loyal to someone.
7. People only try to assassinate rulers they are not loyal to.
8. Marcus tried to assassinate Caesar.
The facts described by these sentences can be represented as a set of well-formed formulas (wffs) as follows:
1. Marcus was a man
man(Marcus)
2. Marcus was a Pompeian.
Pompeian(Marcus)
3. All Pompeians were Romans.
∀x: Pompeian(x) → Roman(x)
4. Caesar was a ruler.
ruler(Caesar)
5. All Romans were either loyal to Caesar or hated him.
• Inclusive-or
∀x: Roman(x) → loyalto(x, Caesar) ∨ hate(x, Caesar)
• exclusive-or
∀x: Roman(x) → [(loyalto(x, Caesar) ∧ ¬hate(x, Caesar)) ∨ (¬loyalto(x, Caesar) ∧ hate(x, Caesar))]
6. Everyone is loyal to someone.
∀x: ∃y: loyalto(x, y)
7. People only try to assassinate rulers they are not loyal to.
∀x: ∀y: person(x) ∧ ruler(y) ∧ tryassassinate(x, y)→¬loyalto(x, y)
8. Marcus tried to assassinate Caesar.
tryassassinate(Marcus, Caesar)
• Now suppose we want to use these statements to answer the question: Was Marcus loyal to Caesar?
• Let us try to produce a formal proof, reasoning backward from the desired goal:
¬loyalto(Marcus, Caesar)
• In order to prove the goal, we need to use the rules of inference to transform it into another goal (or
possibly a set of goals) that can, in turn, be transformed, and so on, until there are no unsatisfied goals
remaining.
Figure: An attempt to prove ¬loyalto(Marcus, Caesar)
• The problem is that, although we know that Marcus was a man, we do not have any way to conclude from that that
Marcus was a person. We need to add the representation of another fact to our system, namely:
∀x: man(x) → person(x)
• Now we can satisfy the last goal and produce a proof that Marcus was not loyal to Caesar.
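The backward reasoning sketched here can be illustrated with a small Python backward chainer. The encoding below (facts as tuples, variables as strings beginning with "?", not_loyalto standing for ¬loyalto) is an illustrative assumption, not the textbook's notation, and it only handles rules whose conclusion binds all of the variables appearing in the premises:

# Ground facts from statements 1, 2, 4 and 8.
FACTS = {
    ("man", "Marcus"),
    ("Pompeian", "Marcus"),
    ("ruler", "Caesar"),
    ("tryassassinate", "Marcus", "Caesar"),
}

# rule = (consequent pattern, list of antecedent patterns)
RULES = [
    (("person", "?x"), [("man", "?x")]),                      # the added axiom
    (("Roman", "?x"), [("Pompeian", "?x")]),                  # statement 3
    (("not_loyalto", "?x", "?y"),                             # statement 7
     [("person", "?x"), ("ruler", "?y"), ("tryassassinate", "?x", "?y")]),
]

def substitute(term, bindings):
    return tuple(bindings.get(t, t) for t in term)

def match(pattern, fact, bindings):
    """Bind the variables of pattern so that it equals fact, if possible."""
    if len(pattern) != len(fact):
        return None
    bindings = dict(bindings)
    for p, f in zip(pattern, fact):
        if p.startswith("?"):
            if p in bindings and bindings[p] != f:
                return None
            bindings[p] = f
        elif p != f:
            return None
    return bindings

def prove(goal):
    """Prove a ground goal by reasoning backward from it to known facts."""
    if goal in FACTS:
        return True
    for consequent, antecedents in RULES:
        bindings = match(consequent, goal, {})
        if bindings is not None and all(
                prove(substitute(a, bindings)) for a in antecedents):
            return True
    return False

print(prove(("not_loyalto", "Marcus", "Caesar")))   # True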
• From this simple example, we see that three important issues must be addressed in the process of converting
English sentences into logical statements and then using those statements to deduce new ones:
1. Many English sentences are ambiguous. Choosing the correct interpretation may be difficult.
2. There is often a choice of how to represent the knowledge. Simple representations are desirable, but they may
exclude certain kinds of reasoning.
3. Even in very simple situations, a set of sentences is unlikely to contain all the information necessary to reason
about the topic at hand. In order to use a set of statements effectively, it is usually necessary to have
access to another set of statements that represent facts that people consider too obvious to mention.
Representing Instance and ISA Relationships
• Specific attributes instance and isa play an important role particularly in a useful form of reasoning
called property inheritance.
• The predicates instance and isa explicitly capture the relationships they are used to express, namely class
membership and class inclusion.
• Figure shows the first five sentences of the last section represented in logic in three different ways.
• The first part of the figure contains the representations we have already discussed. In these
representations, class membership is represented with unary predicates (such as Roman), each of which
corresponds to a class.
• Asserting that P(x) is true is equivalent to asserting that x is an instance (or element) of P.
• The second part of the figure contains representations that use the instance predicate explicitly.
man(Marcus): the class man is represented using a unary predicate.
instance(Marcus, man): the instance predicate is binary; its first argument is an instance (element) of the
second argument, which is usually a class.
isa(Pompeian, Roman): relates the subclass (Pompeian) to the superclass (Roman).
Isa relationships simplify the representation of wffs and allow statements to be combined so that knowledge
can be expressed in a shorter form.
• The predicate instance is a binary one, whose first argument is an object and whose second argument is a
class to which the object belongs.
• But these representations do not use an explicit isa predicate.
• Instead, subclass relationships, such as that between Pompeians and Romans, are described as shown in
sentence 3.
• The implication rule states that if an object is an instance of the subclass Pompeian then it is an instance of
the superclass Roman.
• Note that this rule is equivalent to the standard set-theoretic definition of the subclass superclass
relationship.
• The third part contains representations that use both the instance and isa predicates explicitly.
• The use of the isa predicate simplifies the representation of sentence 3, but it requires that one additional
axiom (shown here as number 6) be provided.
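A minimal sketch of the instance/isa representation and the property-inheritance reasoning it supports; the Python encoding (binary relations as sets of pairs) is an illustrative assumption:

# instance(object, class) and isa(subclass, superclass) as binary relations.
instance_of = {("Marcus", "man"), ("Marcus", "Pompeian"), ("Caesar", "ruler")}
isa = {("Pompeian", "Roman"), ("man", "person")}

def is_instance(obj, cls):
    """True if obj belongs to cls directly, or via isa links (the extra
    axiom: instance(x, sub) and isa(sub, super) imply instance(x, super))."""
    if (obj, cls) in instance_of:
        return True
    return any(is_instance(obj, sub) for (sub, sup) in isa if sup == cls)

print(is_instance("Marcus", "Roman"))    # True, via Pompeian isa Roman
print(is_instance("Marcus", "person"))   # True, via man isa person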
Computable Functions and Predicates
• To express simple facts, such as the following greater-than and less-than relationships:
• gt(1, 0) lt(0, 1)
• gt(2, 1) lt(1, 2)
• gt(3, 2) lt(2, 3)
• It is often also useful to have computable functions as well as computable predicates. Thus we might want to
be able to evaluate the truth of gt(2 + 3,1)
• To do so requires that we first compute the value of the plus function given the arguments 2 and 3, and then
send the arguments 5 and 1 to gt.
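A minimal sketch of the idea: gt, lt and plus are computed on demand rather than stored as an endless list of facts (the Python function names mirror the predicates above and are illustrative):

def gt(a, b):
    return a > b

def lt(a, b):
    return a < b

def plus(a, b):
    return a + b

# Evaluating the truth of gt(2 + 3, 1): first compute plus(2, 3) = 5,
# then send 5 and 1 to gt.
print(gt(plus(2, 3), 1))   # True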
Consider the following set of facts, again involving Marcus:
1) Marcus was a man.
man(Marcus)
2) Marcus was a Pompeian.
Pompeian(Marcus)
3) Marcus was born in 40 A.D.
born(Marcus, 40)
4) All men are mortal.
∀x: man(x) → mortal(x)
5) All Pompeians died when the volcano erupted in 79 A.D.
erupted(volcano, 79) ∧ ∀ x : [Pompeian(x) → died(x, 79)]
6) No mortal lives longer than 150 years.
∀x: ∀t1: ∀t2: mortal(x) ∧ born(x, t1) ∧ gt(t2 − t1, 150) → dead(x, t2)
7) It is now 1991.
now = 1991
• The example shows how these ideas of computable functions and predicates can be useful. It also makes
use of the notion of equality and allows equal objects to be substituted for each other whenever it appears
helpful to do so during a proof.
• Now suppose we want to answer the question “Is Marcus alive?”
• From the statements suggested here, there may be two ways of deducing an answer.
• Either we can show that Marcus is dead because he was killed by the volcano or we can show that he must
be dead because he would otherwise be more than 150 years old, which we know is not possible.
• As soon as we attempt to follow either of those paths rigorously, however, we discover, just as we did
in the last example, that we need some additional knowledge. For example, our statements talk about
dying, but they say nothing that relates to being alive, which is what the question is asking.
So we add the following facts:
8) Alive means not dead.
∀x: ∀t: [alive(x, t) → ¬dead(x, t)] ∧ [¬dead(x, t) → alive(x, t)]
9) If someone dies, then he is dead at all later times.
∀x: ∀t1: ∀t2: died(x, t1) ∧ gt(t2, t1) → dead(x, t2)
Now let us attempt to answer the question “Is Marcus alive?” by proving: ¬alive(Marcus, now)
• The term nil at the end of each proof indicates that the list of conditions remaining to be
proved is empty and so the proof has succeeded.
• Notice in those proofs that whenever a statement of the form:
a∧b → c
was used, a and b were set up as independent subgoals.
• In one sense they are, but in another sense they are not if they share the same bound
variables since, in that case, consistent substitutions must be made in each of them.
born(Marcus, t1)
using statement 3 by binding t1 to 40, but then we must also bind t1 to 40 in
gt(now - t1, 150)
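A minimal sketch of why the binding must be shared: once t1 is bound to 40 by matching born(Marcus, t1) against statement 3, the same binding has to be applied inside gt(now − t1, 150). The dictionary encoding of the substitution is an illustrative assumption:

now = 1991

# One substitution serves both subgoals, because they share the variable t1.
binding = {"t1": 40}   # from matching born(Marcus, t1) against born(Marcus, 40)

t1 = binding["t1"]
born_satisfied = (t1 == 40)          # born(Marcus, 40) is a known fact
gt_satisfied = (now - t1) > 150      # gt(now - t1, 150) under the same binding

print(born_satisfied and gt_satisfied)   # True: Marcus would be over 150 years old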
From looking at the proofs we have just shown, two things should be clear:
• Even very simple conclusions can require many steps to prove.
• A variety of processes, such as matching, substitution, and application of modus
ponens are involved in the production of a proof.
RESOLUTION
• Resolution is a proof procedure that carries out, in a single operation, the variety of processes involved in
reasoning with statements in predicate logic.
• Resolution gains its efficiency from the fact that it operates on statements that have been converted to a very
convenient standard form.
• Resolution produces proofs by refutation. In other words, to prove a statement (i.e., show that it is valid),
resolution attempts to show that the negation of the statement produces a contradiction with the known
statements (i.e., that it is unsatisfiable).
• Refutation is the action of proving a statement or theory to be false.
Resolution uses the following two steps:
a) Conversion of axioms to canonical (clause) form
b) Refutation: showing that the negation of the statement to be proved leads to a contradiction with the known
statements
• An atomic formula and its negation are called literals
• A disjunction (OR) of literals is called a clause
Conversion to Clause form:
This is achieved by converting each wff to conjunctive normal form
“Suppose we know that all Romans who know Marcus either hate Caesar or think that anyone who hates anyone is crazy.”
Wff form:-
∀x: [Roman(x) ∧ know(x, Marcus)] → [hate(x, Caesar) V (∀y: ∃z: hate (y, z) → thinkcrazy(x, y))]
• To use this formula in a proof requires a complex matching process. Then, having matched one piece of it, such as
thinkcrazy(x, y), it is necessary to do the right thing with the rest of the formula including the pieces in which the
matched part is embedded and those in which it is not.
• If the formula were in a simpler form, this process would be much easier. The formula would be easier to work with if
-It was flatter, i.e., there was less embedding of components.
- The quantifiers were separated from the rest of the formula so that they did not need to be considered.
• The formula given above for the feelings of Romans who know Marcus would be represented in conjunctive normal form
as
¬Roman(x) ∨ ¬know(x, Marcus) ∨
hate(x, Caesar) ∨ ¬hate(y, z) ∨ thinkcrazy(x, y)
• We need to reduce a set of wffs to a set of clauses, where a clause is defined to be a wff in conjunctive normal form but with
no instances of the connective ∧.
• This is done by first converting each wff into conjunctive normal form and then breaking apart each such expression into clauses, one for
each conjunct.
• All the conjuncts will be considered to be conjoined together as the proof procedure operates.
Algorithm: Convert to Clause Form
1. Eliminate →, using the fact that a → b is equivalent to ¬ a V b. Performing this transformation on the wff given above
yields.
∀x: ¬[Roman(x) ∧ know(x, Marcus)] ∨ [hate(x, Caesar) ∨ (∀y: ¬(∃z: hate(y, z)) ∨ thinkcrazy(x, y))]
2. Reduce the scope of each ¬ to a single term, using the fact that ¬ (¬ p) = p, deMorgan's laws [which say that ¬ (a ∧b) = ¬
a V ¬ b and ¬ (a V b) = ¬ a ∧ ¬ b ], and the standard correspondences between quantifiers [¬ ∀x: P(x) = ∃x: ¬ P(x) and ¬∃x:
P(x) = ∀x: ¬P(x)]. Performing this transformation on the wff from step 1 yields
∀x: [¬Roman(x) ∨ ¬know(x, Marcus)] ∨
[hate(x, Caesar) ∨ (∀y: ∀z: ¬hate(y, z) ∨ thinkcrazy(x, y))]
3. Standardize variables so that each quantifier binds a unique variable. Since variables are just dummy names, this
process cannot affect the truth value of the wff.
∀x: P(x) V∀x: Q(x)
Would be converted to
∀x:P(x) V ∀y: Q(y)
4. Move all quantifiers to the left of the formula without changing their relative order. This is possible since there is no
conflict among variable names.
∀x: ∀y: ∀z: [¬Roman(x) ∨ ¬know(x, Marcus)] ∨
[hate(x, Caesar) ∨ (¬hate(y, z) ∨ thinkcrazy(x, y))]. The formula at this point is said to be in prenex normal form.
5. Eliminate existential quantifiers (Skolemization). A formula that contains an existentially quantified variable asserts that
there is a value that can be substituted for the variable that makes the formula true. Eliminate the quantifier by substituting
for the variable a reference to a function that produces the desired value.
∃y: President(y)
can be transformed into the formula
President(S1)
where S1 is a function with no arguments that somehow produces a value that satisfies President; S1 is called a Skolem
constant. If the existential quantifier occurs within the scope of universal quantifiers, the Skolem function must take those
universally quantified variables as arguments, so that a possibly different value can be produced for each combination of
their values.
6. Drop the prefix. At this point, all remaining variables are universally quantified, so the prefix can just be dropped and any
proof procedure we use can simply assume that any variable it sees is universally quantified.
[¬ Roman(x) V¬ know(x, Marcus)] V
[hate(x, Caesar) V (¬ hate(y, z) V thinkcrazy(x, y))]
7. Convert the matrix into a conjunction of disjuncts. In the case of our example, since there are no and’s, it is only necessary
to exploit the fact that or is associative and simply remove the parentheses. (In general, or is distributed over and using
(a ∧ b) ∨ c = (a ∨ c) ∧ (b ∨ c).) This gives
¬Roman(x) ∨ ¬know(x, Marcus) ∨
hate(x, Caesar) ∨ ¬hate(y, z) ∨ thinkcrazy(x, y)
8. Create a separate clause corresponding to each conjunct. In order for a wff to be true, all the clauses that are generated
from it must be true. If we are going to be working with several wff’s, all the clauses generated by each of them can now be
combined to represent the same set of facts as were represented by the original wff’s.
9. Standardize apart the variables in the set of clauses generated in step 8. By this we mean rename the variables so that no
two clauses make reference to the same variable. In making this transformation, we rely on the fact that
(∀x: P(x) ∧ Q(x)) = ∀x: P(x) ∧∀x: Q(x)
After applying the entire procedure to a set of wffs, we have a set of clauses, each of which is a disjunction of literals.
Thus since each clause is a separate conjunct and since all the variables are universally quantified, there need be no
relationship between the variables of two clauses, even if they were generated from the same wff.
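A minimal sketch of the propositional core of this procedure (steps 1, 2, 7 and 8 only; quantifier standardization, prenexing and Skolemization are omitted). Formulas are encoded as nested tuples such as ("->", p, q); this encoding is an assumption made for illustration:

# Formulas: ("->", a, b), ("and", a, b), ("or", a, b), ("not", a), or an atom (string).

def eliminate_implications(f):
    if isinstance(f, str):
        return f
    if f[0] == "->":                     # step 1: a -> b becomes ¬a ∨ b
        return ("or", ("not", eliminate_implications(f[1])),
                eliminate_implications(f[2]))
    return (f[0],) + tuple(eliminate_implications(x) for x in f[1:])

def push_negations(f):
    if isinstance(f, str):
        return f
    if f[0] == "not":
        g = f[1]
        if isinstance(g, str):
            return f
        if g[0] == "not":                # step 2: ¬¬p = p
            return push_negations(g[1])
        if g[0] == "and":                # De Morgan: ¬(a ∧ b) = ¬a ∨ ¬b
            return ("or", push_negations(("not", g[1])), push_negations(("not", g[2])))
        if g[0] == "or":                 # De Morgan: ¬(a ∨ b) = ¬a ∧ ¬b
            return ("and", push_negations(("not", g[1])), push_negations(("not", g[2])))
    return (f[0],) + tuple(push_negations(x) for x in f[1:])

def distribute(f):                       # step 7: distribute ∨ over ∧
    if isinstance(f, str) or f[0] == "not":
        return f
    a, b = distribute(f[1]), distribute(f[2])
    if f[0] == "or":
        if not isinstance(a, str) and a[0] == "and":
            return ("and", distribute(("or", a[1], b)), distribute(("or", a[2], b)))
        if not isinstance(b, str) and b[0] == "and":
            return ("and", distribute(("or", a, b[1])), distribute(("or", a, b[2])))
    return (f[0], a, b)

def clauses(f):                          # step 8: one clause per conjunct
    if not isinstance(f, str) and f[0] == "and":
        return clauses(f[1]) + clauses(f[2])
    return [f]

wff = ("->", ("and", "P", "Q"), "R")                 # (P ∧ Q) → R
cnf = distribute(push_negations(eliminate_implications(wff)))
print(clauses(cnf))   # [('or', ('or', ('not', 'P'), ('not', 'Q')), 'R')], i.e. ¬P ∨ ¬Q ∨ R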
The Basis of Resolution
• The resolution procedure is a simple iterative process: at each step, two clauses, called the parent clauses, are compared
(resolved), yielding a new clause that has been inferred from them.
• The new clause represents ways that the two parent clauses interact with each other.
• Suppose that there are two clauses in the system:
winter ∨ summer
¬winter ∨ cold
Observe that precisely one of winter and ¬winter will be true at any point. If winter is true, then cold must be true to
guarantee the truth of the second clause. If ¬winter is true, then summer must be true to guarantee the truth of the first
clause. From these two clauses we can deduce
summer ∨ cold
• This is the deduction that the resolution procedure will make.
• Resolution operates by taking two clauses that each contains the same literal, in this example, winter.
• The literal must occur in positive form in one clause and in negative form in the other.
• The resolvent is obtained by combining all of the literals of the two parent clauses except the ones that cancel.
• If the clause that is produced is the empty clause, then a contradiction has been found.
• The two clauses
winter
¬ winter
will produce the empty clause. If a contradiction exists, then eventually it will be found. Of course, if no contradiction exists,
it is possible that the procedure will never terminate, although as we will see, there are often ways of detecting that no
contradiction exists.
• In predicate logic, all possible ways of substituting values for the variables are considered.
• The theoretical basis of the resolution procedure in predicate logic is Herbrand’s theorem
-To show that a set of clauses S is unsatisfiable, it is necessary to consider only interpretations over a particular set, called the
Herbrand universe of S.
-A set of clauses S is unsatisfiable if and only if a finite subset of ground instances(in which all bound variables have had a
value substituted for them) of S is unsatisfiable.
Resolution in Propositional Logic
• In propositional logic, the procedure for producing a proof by resolution of proposition P with respect to a set of
axioms F is the following.
Algorithm: Propositional Resolution
1. Convert all the propositions of F to clause form.
2. Negate P and convert the result to clause form. Add it to the set of clauses obtained in step 1.
3. Repeat until either a contradiction is found or no progress can be made:
a) Select two clauses. Call these the parent clauses.
b) Resolve them together. The resulting clause, called the resolvent, will be the disjunction of all of the literals of both
of the parent clauses with the following exception: If there are any pairs of literals L and ¬ L such that one of the parent
clauses contains L and the other contains ¬ L, then select one such pair and eliminate both L and ¬ L from the resolvent.
c) If the resolvent is the empty clause, then a contradiction has been found. If it is not, then add it to the set of clauses
available to the procedure.
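A minimal sketch of this procedure in Python. Clauses are frozensets of string literals, with negation written as a leading "~"; this encoding, and the sample axioms P, ¬P∨¬Q∨R, ¬T∨Q and T used below, are illustrative assumptions consistent with the worked example discussed next:

from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of two parent clauses (step 3b)."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return resolvents

def refute(clauses):
    """True if the clause set is unsatisfiable, i.e. the empty clause is derivable."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:                # empty clause: contradiction found
                    return True
                new.add(r)
        if new <= clauses:               # no progress can be made
            return False
        clauses |= new

axioms = [frozenset({"P"}),
          frozenset({"~P", "~Q", "R"}),
          frozenset({"~T", "Q"}),
          frozenset({"T"})]
negated_goal = frozenset({"~R"})         # step 2: negate the proposition to prove
print(refute(axioms + [negated_goal]))   # True, so R follows from the axioms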
Simple example:
Q. We want to prove R.
A. We negate R, producing ¬R, which is already in clause form.
One way of viewing the resolution process is that it takes a set of
clauses that are all assumed to be true and, based on information
provided by the others, it generates new clauses that represent
restrictions on the way each of those original clauses can be made
true. A contradiction occurs when a clause becomes so restricted that
there is no way it can be true. This is indicated by the generation of
the empty clause
• In order for proposition 2 to be true, one of three things must be true: ¬ P, ¬ Q. or R. But we are assuming that ¬ R
is true.
• Given that, the only way for proposition 2 to be true is for one of two things to be true: ¬ P or ¬ Q.
• That is what the first resolvent clause says. But proposition 1 says that P is true, which means that ¬ P cannot be
true, which leaves only one way for proposition 2 to be true, namely for ¬ Q to be true (as shown in the second
resolvent clause). Proposition 4 can be true if either ¬ T or Q is true. But since we now know that ¬ Q must be true,
the only way for proposition 4 to be true is for ¬ T to be true (the third resolvent).
• But proposition 5 says that T is true. Thus there is no way for all of these clauses to be true in a single
interpretation. This is indicated by the empty clause.
The Unification Algorithm
• In propositional logic, it is easy to determine that two literals cannot both be true at the same time: simply look for L and ¬L. In
predicate logic, this matching process is more complicated, since the arguments of the predicates must be considered.
• Example: man(John) and ¬man(John) is a contradiction, while man(John) and ¬man(Spot) is not.
• In order to determine contradictions, we need a matching procedure that compares two literals and discovers whether there exists a
set of substitutions that makes them identical.
• The substitution of variables to make resolution possible is called unification and proof in predicate logic can be based on this joint
method of unification and Resolution.
• Unification is a process of making two different logical atomic expressions identical by finding a substitution.
• The basic idea of unification is very simple. To attempt to unify two literals, first check if their initial predicate symbols are the same.
If so, can proceed. Otherwise, there is no way they can be unified, regardless of their arguments.
tryassassinate (Marcus, Caesar)
hate(Marcus, Caesar)
cannot be unified.
• If the predicate symbols match, then we must check the arguments, one pair at a time. If the first pair matches, we can
continue with the second, and so on. To test each argument pair, we can simply call the unification procedure
recursively. The matching rules are simple. Different constants or predicates cannot match; identical ones can. A
variable can match another variable, any constant, or a predicate expression, with the restriction that the predicate
expression must not contain any instances of the variable being matched.
Example 1: To unify the expressions
P(x, x)
P(y, z)
The two instances of P match. Next we compare x and y, and decide that if we substitute y for x, they could
match. We write that substitution as y/x. (Pairing the arguments independently would give y/x and z/x, which is not
a consistent substitution, since x cannot be bound to both y and z.)
After finding the first substitution, we apply it throughout both literals:
P(y, y)
P(y, z)
Now we can succeed with a substitution of z/y.
Example 2: hate(x, y)
hate(Marcus, z)
Could be unified with any of the following substitutions:
(Marcus/x, z/y)
(Marcus/x, y/z)
(Marcus/x, Caesar/y, Caesar/z)
Algorithm: Unify (L1, L2)
1. If L1 or L2 are both variables or constants, then:
(a) If L1 and L2 are identical, then return NIL.
(b) Else if L1 is a variable, then if L1 occurs in L2 then return {FAIL}, else return (L2/L1).
(c) Else if L2 is a variable, then if L2 occurs in L1 then return {FAIL}, else return (L1/L2).
(d) Else return {FAIL}.
2. If the initial predicate symbols in L1 and L2 are not identical, then return {FAIL}.
3. If L1 and L2 have a different number of arguments, then return {FAIL}.
4. Set SUBST to NIL. (At the end of this procedure, SUBST will contain all the substitutions used to unify L1 and L2.)
5. For i ← 1 to number of arguments in L1 :
(a) Call Unify with the ith argument of L1 and the ith argument of L2, putting result in S.
(b) If S contains FAIL then return {FAIL}.
(c) If S is not equal to NIL then:
(i) Apply S to the remainder of both L1 and L2.
(ii) SUBST: = APPEND(S, SUBST).
6. Return SUBST.
L2/L1: Substitute L2 for L1
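A minimal sketch of this algorithm in Python. Literals are nested lists such as ["hate", "x", "y"]; by an assumed convention, lowercase argument strings are variables and capitalized strings are constants, and a substitution L2/L1 is encoded as the pair (L2, L1):

def is_variable(t):
    # Assumed convention: lowercase strings are variables, capitalized are constants.
    return isinstance(t, str) and t[0].islower()

def occurs_in(var, term):
    if var == term:
        return True
    return isinstance(term, list) and any(occurs_in(var, t) for t in term)

def apply_subst(subst, term):
    """Apply a list of (value, variable) substitutions to a term."""
    if isinstance(term, str):
        for value, var in subst:
            if term == var:
                return value
        return term
    return [apply_subst(subst, t) for t in term]

def unify(l1, l2):
    """Return a list of (value, variable) substitutions, or "FAIL"."""
    if isinstance(l1, str) or isinstance(l2, str):             # step 1
        if l1 == l2:
            return []                                          # NIL
        if is_variable(l1):
            return "FAIL" if occurs_in(l1, l2) else [(l2, l1)]
        if is_variable(l2):
            return "FAIL" if occurs_in(l2, l1) else [(l1, l2)]
        return "FAIL"
    if l1[0] != l2[0] or len(l1) != len(l2):                   # steps 2 and 3
        return "FAIL"
    subst = []                                                 # step 4
    for a1, a2 in zip(l1[1:], l2[1:]):                         # step 5
        s = unify(apply_subst(subst, a1), apply_subst(subst, a2))
        if s == "FAIL":
            return "FAIL"
        subst = subst + s
    return subst                                               # step 6

print(unify(["P", "x", "x"], ["P", "y", "z"]))             # [('y', 'x'), ('z', 'y')]
print(unify(["hate", "x", "y"], ["hate", "Marcus", "z"]))  # [('Marcus', 'x'), ('z', 'y')]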
Resolution in Predicate Logic
• Two literals are contradictory if one of them can be unified with the negation of the other.
So, for example,
man(x) and ¬man(Spot) are contradictory, since man(x) and man(Spot) can be unified.
• To use resolution for expressions in the predicate logic, use the unification algorithm to locate pairs of literals that
cancel out. Need to use the unifier produced by the unification algorithm to generate the resolvent clause. For
example. suppose to resolve two clauses:
1. man(Marcus)
2. ¬ man(x1) V mortal(x1)
• The literal man(Marcus) can be unified with the literal man(x1) with the substitution Marcus/x1, telling us that for
x1 = Marcus, ¬ man(Marcus) is false.
But we cannot simply cancel out the two man literals as we did in propositional logic and generate the
resolvent mortal(x1). Clause 2 says that, for a given x1, either ¬man(x1) or mortal(x1) is true. So, having bound x1 to
Marcus, we can conclude only that mortal(Marcus) must be true.
Algorithm: Resolution
1. Convert all the statements of F to clause form.
2. Negate P and convert the result to clause form. Add it to the set of clauses obtained in 1.
3. Repeat until either a contradiction is found, no progress can be made, or a predetermined amount of effort has been
expended.
a) Select two clauses. Call these the parent clauses.
b) Resolve them together. The resolvent will be the disjunction of all the literals of both parent clauses with appropriate
substitutions performed and with the following exception: If there is one pair of literals T1 and ¬T2 such that one of the
parent clauses contains T1 and the other contains ¬T2, and if T1 and T2 are unifiable, then neither T1 nor ¬T2 should appear
in the resolvent. T1 and T2 are called complementary literals. Use the substitution produced by the unification to create the
resolvent. If there is more than one pair of complementary literals, only one pair should be omitted from the resolvent.
c) If the resolvent is the empty clause, then a contradiction has been found. If it is not, then add it to the set of clauses
available to the procedure
If the choice of clauses to resolve together at each step is made in certain systematic ways, then the resolution
procedure will find a contradiction if one exists.
There exist strategies for making the choice that can speed up the process considerably:
• Only resolve pairs of clauses that contain complementary literals, since only such resolutions produce new clauses that
are harder to satisfy than their parents.
• Eliminate certain clauses as soon as they are generated so that they cannot participate in later resolutions. Two kinds of
clauses should be eliminated: tautologies (which can never be unsatisfied) and clauses that are subsumed by other
clauses.
• Whenever possible, resolve either with one of the clauses that is part of the statement we are trying to refute or with a
clause generated by a resolution with such a clause.
• Whenever possible, resolve with clauses that have a single literal. Such resolutions generate new clauses with fewer
literals than the larger of their parent clauses and thus are probably closer to the goal of a resolvent with zero terms.
This method is called the unit-preference strategy.
Suppose our knowledge base contained the two additional statements
9) persecute(x, y)->hate(y, x)
10) hate(x, y)->persecute(y, x)
Converting to clause form, we get
9) ¬persecute(x5, y2) V hate(y2, x5)
10) ¬hate(x6, y3)V persecute (y3, x6)
Variable y occurs in both clause 1 and clause 2
The substitution at the second resolution step
produces a clause that is too restricted and so does
not lead to contradiction that is present in the
database.
If the clause
¬Father(Chris, Y) had been produced, the
contradiction with clause 4 would have emerged.
This would have happened if clause 2 had been
rewritten as
¬Mother(a, b) ∨ woman(a)
The need to try several substitutions
Resolution provides a good way of finding a refutation
proof without actually trying all the substitutions that
Herbrand’s theorem suggests.
But it does not always eliminate the need to try
more than one substitution.
E.g., to prove that Marcus hates some ruler, we may
have to try several substitutions before one leads to a contradiction.
Question Answering
Question: When did Marcus die?
Answering: find a known statement that
matches the terms given in the
question.
What happened in 79 A.D.?
∃x: event(x, 79)
erupted(volcano)
can be represented as
event(erupted(volcano), 79)
Natural Deduction
• Natural deduction is a way of doing machine theorem proving that corresponds more closely to
processes used in human theorem proving.
• Resolution is an easily implementable proof procedure that relies for its simplicity on
a uniform representation of the statements it uses.
• In converting everything to clause form, however, we often lose valuable heuristic information
that is contained in the original representation of the facts.
For example, suppose we believe that all judges who are not crooked are well-educated, which can be represented as
∀x: judge(x) ∧ ¬crooked(x) → educated(x)
If converted to clause form,
¬judge(x) ∨ crooked(x) ∨ educated(x)
It appears also to be a way of deducing that someone is not a judge by showing that he is not crooked and not
educated. The heuristic information contained in the original statement has been lost in the transformation.
6. Representing Knowledge using Rules
Procedural versus Declarative Knowledge
A declarative representation is one in which the knowledge is specified, but the use to which that knowledge is to
be put is not given.
Declarative knowledge answers the question 'What do you know?'
It is your understanding of things, ideas, or concepts.
In other words, declarative knowledge can be thought of as the who, what, when, and where of information.
Declarative knowledge is normally discussed using nouns, like the names of people, places, or things or dates that
events occurred.
A procedural representation is one in which the control information necessary to use the knowledge is considered
to be embedded in the knowledge itself.
Procedural knowledge answers the question 'What can you do?'
While declarative knowledge is demonstrated using nouns,
Procedural knowledge relies on action words, or verbs.
It is a person's ability to carry out actions to complete a task.
The real difference between the declarative and procedural views of knowledge lies in where the control information resides.
Example:
Statements 1, 2 and 3 are procedural knowledge, and statement 4 is declarative knowledge.
Forward & Backward Reasoning
The object of a search procedure is to discover a path through a problem space from an initial configuration to a goal state.
There are actually two directions in which such a search could proceed:
Forward Reasoning:
from the start states
the LHS of a rule must match the initial (current) state
Eg: from A → B and B → C, we chain forward to conclude A → C
Backward Reasoning:
from the goal states
the RHS of a rule must match the goal state
Eg: the 8-Puzzle problem
In both cases, the control strategy must cause motion and must be systematic. The production system model of the search
process provides an easy way of viewing forward and backward reasoning as symmetric processes.
Consider the problem of solving a particular instance of the 8-puzzle problem. The rules to be used for solving the puzzle
can be written as:
Reasoning Forward from Initial State:
Begin building a tree of move sequences that might be solutions, with the initial configuration at the root of the tree.
Generate the next level of the tree by finding all the rules whose left sides match the root node and using their right
sides to create the new configurations.
Generate the next level by taking each node generated at the previous level and applying to it all of the rules whose left
sides match it.
Continue until a configuration that matches the goal state is generated.
Reasoning Backward from Goal State:
Begin building a tree of move sequences that might be solutions, with the goal configuration at the root of the tree.
Generate the next level of the tree by finding all the rules whose right sides match the root node. These are all the rules that,
if only we could apply them, would generate the state we want. Use the left sides of the rules to generate the nodes at this
second level of the tree.
Generate the next level of the tree by taking each node at the previous level and finding all the rules whose right sides match
it. Then use the corresponding left sides to generate the new nodes.
Continue until a node that matches the initial state is generated.
This method of reasoning backward from the desired final state is often called goal- directed reasoning.
To reason forward, the left sides (preconditions) are matched against the current state and the right sides (results) are used to
generate new nodes until the goal is reached. To reason backward, the right sides are matched against the current node and
the left sides are used to generate new nodes representing new goal states to be achieved.
The following 4 factors influence whether it is better to reason Forward or Backward:
1. Are there more possible start states or goal states? We would like to move from the smaller set of states to the larger (and
thus easier to find) set of states.
2. In which direction branching factor (i.e, average number of nodes that can be reached directly from a single node) is
greater? We would like to proceed in the direction with lower branching factor.
3. Will the program be used to justify its reasoning process to a user? If so, it is important to proceed in the direction that
corresponds more closely with the way the user will think.
4. What kind of event is going to trigger a problem-solving episode? If it is arrival of a new fact, forward reasoning makes
sense. If it is a query to which a response is desired, backward reasoning is more natural.
Backward-Chaining Rule Systems
Backward-chaining rule systems are good for goal-directed problem solving.
For example, a query system would probably use backward chaining to reason about and answer user questions.
Unification tries to find a set of bindings for variables to equate a (sub) goal with the head of some rule.
Medical expert system, diagnostic problems
Forward-Chaining Rule Systems
Instead of being directed by goals, we sometimes want to be directed by incoming data.
For example, suppose you sense searing heat near your hand. You are likely to jerk your hand away.
Rules that match dump their right-hand side assertions into the state and the process repeats.
Matching is typically more complex for forward-chaining systems than backward ones.
Synthesis systems – Design/Configuration
Example of Typical Forward Chaining
Rules
1) If hot and smoky then ADD fire
2) If alarm_beeps then ADD smoky
3) If fire then ADD switch_on_sprinklers
Facts
F1) alarm_beeps (given)
F2) hot (given)
.........
F3) smoky (from F1 by R2)
F4) fire (from F2, F3 by R1)
F5) switch_on_sprinklers (from F4 by R3)
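A minimal Python sketch of this forward-chaining loop, using exactly the three rules and two given facts above; the encoding of a rule as a (preconditions, conclusion) pair is an assumption made for illustration:

# Each rule is (set of preconditions, fact to ADD).
rules = [
    ({"hot", "smoky"}, "fire"),              # R1
    ({"alarm_beeps"}, "smoky"),              # R2
    ({"fire"}, "switch_on_sprinklers"),      # R3
]

facts = {"alarm_beeps", "hot"}               # F1, F2 (given)

# Fire every rule whose left-hand side matches the state, dump its right-hand
# side assertion into the state, and repeat until nothing new is added.
changed = True
while changed:
    changed = False
    for preconditions, conclusion in rules:
        if preconditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print("switch_on_sprinklers" in facts)       # True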
Example of Typical Backward Chaining
Goal: Should I switch on sprinklers?
Combining Forward and Backward Reasoning
• Sometimes certain aspects of a problem are best handled via forward chaining and other aspects by backward chaining.
Consider a forward-chaining medical diagnosis program. It might accept twenty or so facts about a patient’s condition then
forward chain on those concepts to try to deduce the nature and/or cause of the disease.
• Now suppose that at some point, the left side of a rule was nearly satisfied – nine out of ten of its preconditions were met.
It might be efficient to apply backward reasoning to satisfy the tenth precondition in a directed manner, rather than wait for
forward chaining to supply the fact by accident.
• Whether it is possible to use the same rules for both forward and backward reasoning also depends on the form of the
rules themselves. If both left sides and right sides contain pure assertions, then forward chaining can match assertions on the
left side of a rule and add to the state description the assertions on the right side. But if arbitrary procedures are allowed as the
right sides of rules then the rules will not be reversible.
Logic Programming
Logic Programming is a programming language paradigm in which logical assertions are viewed as programs.
There are several logic programming systems in use today, the most popular of which is PROLOG.
A PROLOG program is described as a series of logical assertions, each of which is a Horn clause.
A Horn clause is a clause that has at most one positive literal. Thus p, ¬p ∨ q, and p → q are all Horn clauses.
Programs written in pure PROLOG are composed only of Horn Clauses.
Syntactic Difference between the logic and the PROLOG representations, including:
In logic, variables are explicitly quantified. In PROLOG, quantification is provided implicitly by the way the variables are
interpreted.
o The distinction between variables and constants is made in PROLOG by having all variables begin with uppercase letters
and all constants begin with lowercase letters.
In logic, there are explicit symbols for and (ꓥ) and or (V). In PROLOG, there is an explicit symbol for and (,), but there is
none for or.
In logic, implications of the form “p implies q” are written as p → q. In PROLOG, the same implication is written
“backward” as q :- p.
The first two of these differences arise naturally from the fact that PROLOG
programs are actually sets of Horn Clauses that have been transformed as
follows:
1. If the Horn clause contains no negative literals (i.e., it contains a single
literal, which is positive), then leave it as it is.
2. Otherwise, rewrite the Horn clause as an implication, combining all of the
negative literals into the antecedent of the implication and leaving the
single positive literal (if there is one) as the consequent.
This procedure causes a clause, which originally consisted of a disjunction
of literals (all but one of which were negative), to be transformed into a single
implication whose antecedent is a conjunction of (what are now positive)
literals.
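A minimal Python sketch of this transformation. A clause is a list of string literals with negation written as a leading "~", and the result is an (antecedents, consequent) pair in the spirit of PROLOG's head :- body; the encoding is an illustrative assumption:

def horn_to_implication(clause):
    """Rewrite a Horn clause (at most one positive literal) as an implication."""
    positives = [lit for lit in clause if not lit.startswith("~")]
    negatives = [lit[1:] for lit in clause if lit.startswith("~")]
    if not negatives:                     # step 1: a single positive literal
        return ([], positives[0])
    consequent = positives[0] if positives else None
    return (negatives, consequent)        # step 2: negatives form the antecedent

# ¬p ∨ q becomes the implication p → q, written in PROLOG as q :- p.
print(horn_to_implication(["~p", "q"]))   # (['p'], 'q')
print(horn_to_implication(["q"]))         # ([], 'q')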
Matching
• We described the process of using search to solve problems as the application of appropriate rules to individual problem
states to generate new states to which the rules can then be applied and so forth until a solution is found.
• How we extract from the entire collection of rules those that can be applied at a given point? To do so requires some kind
of matching between the current state and the preconditions of the rules. How should this be done? The answer to this
question can be critical to the success of a rule based system.
• A more complex matching is required when the preconditions of a rule specify required properties that are not stated
explicitly in the description of the current state. In this case, a separate set of rules must be used to describe how some
properties can be inferred from others.
• An even more complex matching process is required if rules should be applied when their preconditions only approximately
match the current situation. This is often the case in situations involving physical descriptions of the world.
Indexing
One way to select applicable rules is to do a simple search through all the rules, comparing each one’s preconditions to
the current state and extracting all the ones that match. There are two problems with this simple solution:
i. A large number of rules will be necessary, and scanning through all of them at every step would be inefficient.
ii. It is not always obvious whether a rule’s preconditions are satisfied by a particular state.
Solution: Instead of searching through the rules, use the current state as an index into the rules and select the matching
ones immediately.
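A minimal Python sketch of this indexing idea: store the rules in a dictionary keyed by the state feature their precondition mentions, so that the current state selects candidate rules by lookup instead of a scan; the rule encoding is an illustrative assumption:

from collections import defaultdict

# rule = (precondition feature, action), reusing the alarm/fire example.
rules = [
    ("alarm_beeps", "ADD smoky"),
    ("hot", "check for fire"),
    ("fire", "ADD switch_on_sprinklers"),
]

# Build the index once: feature -> actions of rules whose precondition mentions it.
index = defaultdict(list)
for feature, action in rules:
    index[feature].append(action)

current_state = {"alarm_beeps", "hot"}

# Select applicable rules by direct lookup rather than scanning every rule.
applicable = [action for feature in current_state for action in index[feature]]
print(sorted(applicable))   # ['ADD smoky', 'check for fire']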
Matching with Variables
• The problem of selecting applicable rules is made more difficult when preconditions are not stated as exact descriptions
of particular situations but rather describe properties that the situations must have. It often turns out that discovering
whether there is a match between a particular situation and the preconditions of a given rule must itself involve a
significant search process.
• Backward-chaining systems usually use depth-first backtracking to select individual rules, but forward-chaining systems
generally employ sophisticated conflict resolution strategies to choose among the applicable rules.
• While it is possible to apply unification repeatedly over the cross product of preconditions and state description
elements, it is more efficient to consider the many-many match problem, in which many rules are matched against many
elements in the state description simultaneously.
One efficient many-many match algorithm is RETE.
RETE Matching Algorithm
The matching consists of 3 parts
1. Rules & Productions
2. Working Memory
3. Inference Engine
The inference engine runs the production-system cycle: match, select, execute.
This cycle is repeated until no rules are put in the conflict set or until a stopping condition is reached. Verifying many
conditions on every cycle is time consuming. RETE is an efficient matching algorithm that eliminates the need to perform
thousands of matches per cycle.
The algorithm consists of two steps:
1. Examine changes to working memory.
2. Group rules which share the same condition and link them to their common terms.
The RETE algorithm is a many-many match algorithm (in which many rules are matched against many elements). RETE is used in
forward-chaining systems, which generally employ sophisticated conflict resolution strategies to choose among applicable rules.
RETE gains efficiency from 3 major sources:
1. RETE maintains a network of rule conditions and uses changes in the state description to determine which new rules might apply. Full matching is
only pursued for candidates that could be affected by incoming/outgoing data.
2. Structural similarity in rules: RETE stores the rules so that they share structures in memory; sets of conditions that appear in several rules are
matched once per cycle.
3. Persistence of variable binding consistency: while all the individual preconditions of a rule might be met, there may be variable binding conflicts
that prevent the rule from firing. Such conflicts can be minimized because RETE remembers its previous calculations and is able to merge new binding
information efficiently.
Approximate Matching:
Rules should be applied if their preconditions approximately match the current situation.
Eg: A speech-understanding program
Rules: map a description of a physical waveform to phones (speech sounds)
Physical signal: varies because of differences in the way individuals speak and because of background noise
Conflict Resolution:
When several rules match at once, we have a conflict; deciding which matched rule to fire is called conflict resolution. There are 3
approaches to the problem of conflict resolution in a production system.
1. Preference based on rule match:
a. Physical order of rules in which they are presented to the system
b. Priority is given to rules in the order in which they appear
2. Preference based on the objects match:
a. Considers the importance of the objects that are matched
b. Considers the position of the matchable objects in terms of Long Term Memory (LTM) & Short Term Memory (STM)
LTM: Stores a set of rules
STM (Working Memory): Serves as storage area for the facts deduced by rules in
long term memory
3. Preference based on the Action:
a. One way is to fire all the matched rules temporarily and examine the results of each.
Using a heuristic function that can evaluate each of the resulting states, compare the merits of the results and then select the
preferred one.
Search Control Knowledge:
It is knowledge about which paths are most likely to lead quickly to a goal state
Search Control Knowledge requires Meta Knowledge.
It can take many forms, e.g., knowledge about
o which states are preferable to others
o which rule to apply in a given situation
o the order in which to pursue subgoals
o useful sequences of rules to apply
 
NO1 Best Powerful Vashikaran Specialist Baba Vashikaran Specialist For Love V...
NO1 Best Powerful Vashikaran Specialist Baba Vashikaran Specialist For Love V...NO1 Best Powerful Vashikaran Specialist Baba Vashikaran Specialist For Love V...
NO1 Best Powerful Vashikaran Specialist Baba Vashikaran Specialist For Love V...
 
electrical installation and maintenance.
electrical installation and maintenance.electrical installation and maintenance.
electrical installation and maintenance.
 
Raashid final report on Embedded Systems
Raashid final report on Embedded SystemsRaashid final report on Embedded Systems
Raashid final report on Embedded Systems
 

AIML 7th semester VTU

  • 16. • The problem is that, although we know that Marcus was a man, we do not have any way to conclude from that that Marcus was a person. We need to add the representation of another fact to our system, namely: ∀x: man(x) → person(x) • Now we can satisfy the last goal and produce a proof that Marcus was not loyal to Caesar. • From this simple example, we see that three important issues must be addressed in the process of converting English sentences into logical statements and then using those statements to deduce new ones: 1. Many English sentences are ambiguous. Choosing the correct interpretation may be difficult. 2. There is often a choice of how to represent the knowledge. Simple representations are desirable, but they may exclude certain kinds of reasoning. 3. Even in very simple situations, a set of sentences is unlikely to contain all the information necessary to reason about the topic at hand. In order to use a set of statements effectively, it is usually necessary to have access to another set of statements that represent facts that people consider too obvious to mention.
  • 17. Representing Instance and ISA Relationships • Two specific attributes, instance and isa, play an important role, particularly in a useful form of reasoning called property inheritance. • The predicates instance and isa explicitly capture the relationships they are used to express, namely class membership and class inclusion. • The figure shows the first five sentences of the last section represented in logic in three different ways. • The first part of the figure contains the representations we have already discussed. In these representations, class membership is represented with unary predicates (such as Roman), each of which corresponds to a class. • Asserting that P(x) is true is equivalent to asserting that x is an instance (or element) of P. • The second part of the figure contains representations that use the instance predicate explicitly.
  • 18. In man(Marcus), the class man is represented with a unary predicate. In instance(Marcus, man), the predicate is binary: the first argument is an instance (element) of the second argument, which is usually a class. In isa(Pompeian, Roman), the subclass (Pompeian) and the superclass (Roman) are related directly. Isa relationships simplify the representation of wffs and let statements be combined so that knowledge can be expressed more compactly.
  • 19. • The predicate instance is a binary one, whose first argument is an object and whose second argument is a class to which the object belongs. • But these representations do not use an explicit isa predicate. • Instead, subclass relationships, such as that between Pompeians and Romans, are described as shown in sentence 3. • The implication rule states that if an object is an instance of the subclass Pompeian then it is an instance of the superclass Roman. • Note that this rule is equivalent to the standard set-theoretic definition of the subclass-superclass relationship. • The third part contains representations that use both the instance and isa predicates explicitly. • The use of the isa predicate simplifies the representation of sentence 3, but it requires that one additional axiom (shown here as number 6) be provided.
  • 20. Computable Functions and Predicates • To express simple facts, such as the following greater-than and less-than relationships: • gt(1, 0) lt(0, 1) • gt(2, 1) lt(1, 2) • gt(3, 2) lt(2, 3) • It is often also useful to have computable functions as well as computable predicates. Thus we might want to be able to evaluate the truth of gt(2 + 3, 1). • To do so requires that we first compute the value of the plus function given the arguments 2 and 3, and then send the arguments 5 and 1 to gt.
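The idea can be sketched in a few lines of Python. Nothing below comes from the chapter itself: the names COMPUTABLE_PREDICATES, COMPUTABLE_FUNCTIONS, eval_term, and eval_predicate are illustrative, and the sketch simply shows gt and plus being evaluated by code rather than looked up as stored facts.

```python
# A minimal sketch of computable functions and predicates: instead of storing
# every ground fact such as gt(1, 0), gt(2, 1), ..., the predicate is evaluated
# by running code on its (already evaluated) arguments.

COMPUTABLE_PREDICATES = {
    "gt": lambda a, b: a > b,
    "lt": lambda a, b: a < b,
}

COMPUTABLE_FUNCTIONS = {
    "plus": lambda a, b: a + b,
}

def eval_term(term):
    """Evaluate a term such as ("plus", 2, 3) to a value; constants pass through."""
    if isinstance(term, tuple):
        fn, *args = term
        return COMPUTABLE_FUNCTIONS[fn](*(eval_term(a) for a in args))
    return term

def eval_predicate(name, *args):
    """Evaluate a computable predicate on fully evaluated arguments."""
    return COMPUTABLE_PREDICATES[name](*(eval_term(a) for a in args))

# gt(2 + 3, 1): first compute plus(2, 3) = 5, then send 5 and 1 to gt.
print(eval_predicate("gt", ("plus", 2, 3), 1))   # True
```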
  • 21. Consider the following set of facts, again involving Marcus: 1) Marcus was a man. man(Marcus) 2) Marcus was a Pompeian. Pompeian(Marcus) 3) Marcus was born in 40 A.D. born(Marcus, 40) 4) All men are mortal. ∀x: man(x) → mortal(x) 5) All Pompeians died when the volcano erupted in 79 A.D. erupted(volcano, 79) ∧ ∀x: [Pompeian(x) → died(x, 79)] 6) No mortal lives longer than 150 years. ∀x: ∀t1: ∀t2: mortal(x) ∧ born(x, t1) ∧ gt(t2 − t1, 150) → dead(x, t2) 7) It is now 1991. now = 1991
  • 22. • The example shows how these ideas of computable functions and predicates can be useful. It also makes use of the notion of equality and allows equal objects to be substituted for each other whenever it appears helpful to do so during a proof. • Now suppose we want to answer the question “Is Marcus alive?” • From the statements given here, there are two ways of deducing an answer. • Either we can show that Marcus is dead because he was killed by the volcano, or we can show that he must be dead because he would otherwise be more than 150 years old, which we know is not possible. • As soon as we attempt to follow either of those paths rigorously, however, we discover, just as we did in the last example, that we need some additional knowledge. For example, our statements talk about dying, but they say nothing that relates to being alive, which is what the question is asking.
  • 23. So we add the following facts: 8) Alive means not dead. ∀x: ∀t: [alive(x, t) → ¬dead(x, t)] ∧ [¬dead(x, t) → alive(x, t)] 9) If someone dies, then he is dead at all later times. ∀x: ∀t1: ∀t2: died(x, t1) ∧ gt(t2, t1) → dead(x, t2)
  • 24. Now let’s attempt to answer the question “Is Marcus alive?” by proving: ¬ alive(Marcus, now)
  • 25. • The term nil at the end of each proof indicates that the list of conditions remaining to be proved is empty and so the proof has succeeded. • Notice in those proofs that whenever a statement of the form a ∧ b → c was used, a and b were set up as independent subgoals. • In one sense they are, but in another sense they are not if they share the same bound variables, since in that case consistent substitutions must be made in each of them. For example, born(Marcus, t1) can be satisfied using statement 3 by binding t1 to 40, but then we must also bind t1 to 40 in gt(now - t1, 150), because the two subgoals share t1.
  • 26. From looking at the proofs we have just shown, two things should be clear: • Even very simple conclusions can require many steps to prove. • A variety of processes, such as matching, substitution, and application of modus ponens are involved in the production of a proof.
  • 27. RESOLUTION • We would like a proof procedure that carries out in a single operation the variety of processes involved in reasoning with statements in predicate logic. • Resolution is such a procedure, which gains its efficiency from the fact that it operates on statements that have been converted to a very convenient standard form. Resolution produces proofs by refutation. • In other words, to prove a statement (i.e., show that it is valid), resolution attempts to show that the negation of the statement produces a contradiction with the known statements (i.e., that it is unsatisfiable). • Refutation is the action of proving a statement or theory to be false.
  • 28. Resolution uses the following two steps: a) Conversion of the axioms to canonical (clause) form b) Refutation, which means showing that the negation of the statement to be proved produces a contradiction with the known statements • An atomic formula and its negation are both called literals • A disjunction (OR) of literals is called a clause
  • 30. Conversion to Clause Form: This can be achieved by using conjunctive normal form. “Suppose we know that all Romans who know Marcus either hate Caesar or think that anyone who hates anyone is crazy.” Wff form: ∀x: [Roman(x) ∧ know(x, Marcus)] → [hate(x, Caesar) V (∀y: ∃z: hate(y, z) → thinkcrazy(x, y))] • To use this formula in a proof requires a complex matching process. Then, having matched one piece of it, such as thinkcrazy(x, y), it is necessary to do the right thing with the rest of the formula, including the pieces in which the matched part is embedded and those in which it is not. • If the formula were in a simpler form, this process would be much easier. The formula would be easier to work with if: - it was flatter, i.e., there was less embedding of components; - the quantifiers were separated from the rest of the formula so that they did not need to be considered. • The formula given above for the feelings of Romans who know Marcus would be represented in conjunctive normal form as ¬Roman(x) V ¬know(x, Marcus) V hate(x, Caesar) V ¬hate(y, z) V thinkcrazy(x, y)
  • 31. • We need to reduce a set of wffs to a set of clauses, where a clause is defined to be a wff in conjunctive normal form but with no instances of the connector ∧. • This is done by first converting each wff into conjunctive normal form and then breaking apart each such expression into clauses, one for each conjunct. • All the conjuncts will be considered to be conjoined together as the proof procedure operates.
  • 32. Algorithm: Convert to Clause Form
   1. Eliminate →, using the fact that a → b is equivalent to ¬a V b. Performing this transformation on the wff given above yields:
      ∀x: ¬[Roman(x) ∧ know(x, Marcus)] V [hate(x, Caesar) V (∀y: ¬(∃z: hate(y, z)) V thinkcrazy(x, y))]
   2. Reduce the scope of each ¬ to a single term, using the fact that ¬(¬p) = p, deMorgan's laws [which say that ¬(a ∧ b) = ¬a V ¬b and ¬(a V b) = ¬a ∧ ¬b], and the standard correspondences between quantifiers [¬∀x: P(x) = ∃x: ¬P(x) and ¬∃x: P(x) = ∀x: ¬P(x)]. Performing this transformation on the wff from step 1 yields:
      ∀x: [¬Roman(x) V ¬know(x, Marcus)] V [hate(x, Caesar) V (∀y: ∀z: ¬hate(y, z) V thinkcrazy(x, y))]
   3. Standardize variables so that each quantifier binds a unique variable. Since variables are just dummy names, this process cannot affect the truth value of the wff. ∀x: P(x) V ∀x: Q(x) would be converted to ∀x: P(x) V ∀y: Q(y)
  • 33. 4. Move all quantifiers to the left of the formula without changing their relative order. This is possible since there is no conflict among variable names.
      ∀x: ∀y: ∀z: [¬Roman(x) V ¬know(x, Marcus)] V [hate(x, Caesar) V (¬hate(y, z) V thinkcrazy(x, y))]
      The formula at this point is known as prenex normal form.
   5. Eliminate existential quantifiers (Skolemization). A formula that contains an existentially quantified variable asserts that there is a value that can be substituted for the variable that makes the formula true. Eliminate the quantifier by substituting for the variable a reference to a function that produces the desired value. For example, ∃y: President(y) can be transformed into the formula President(S1), where S1 is a function with no arguments that somehow produces a value that satisfies President; S1 is called a Skolem constant.
   6. Drop the prefix. At this point, all remaining variables are universally quantified, so the prefix can simply be dropped, and any proof procedure we use can assume that any variable it sees is universally quantified.
      [¬Roman(x) V ¬know(x, Marcus)] V [hate(x, Caesar) V (¬hate(y, z) V thinkcrazy(x, y))]
  • 34. 7. Convert the matrix into a conjunction of disjuncts. In the case of our example, since there are no and's, it is only necessary to exploit the associative property of or [i.e., a V (b V c) = (a V b) V c] and simply remove the parentheses, giving
      ¬Roman(x) V ¬know(x, Marcus) V hate(x, Caesar) V ¬hate(y, z) V thinkcrazy(x, y)
   8. Create a separate clause corresponding to each conjunct. In order for a wff to be true, all the clauses that are generated from it must be true. If we are going to be working with several wffs, all the clauses generated by each of them can now be combined to represent the same set of facts as were represented by the original wffs.
   9. Standardize apart the variables in the set of clauses generated in step 8. By this we mean rename the variables so that no two clauses make reference to the same variable. In making this transformation, we rely on the fact that (∀x: P(x) ∧ Q(x)) = ∀x: P(x) ∧ ∀x: Q(x)
   After applying the entire procedure to a set of wffs, we have a set of clauses, each of which is a disjunction of literals. Since each clause is a separate conjunct and since all the variables are universally quantified, there need be no relationship between the variables of two clauses, even if they were generated from the same wff.
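The propositional core of this procedure (steps 1, 2, 7, and 8) can be sketched in Python; the quantifier-handling steps (3-6 and 9) are omitted. The nested-tuple encoding of formulas and the function names below are assumptions made only for this sketch, not part of the chapter.

```python
# A minimal sketch of the propositional part of the clause-form algorithm:
# step 1 (eliminate ->), step 2 (push negations inward), step 7 (distribute
# "or" over "and"), and step 8 (split the conjunction into clauses).
# Formulas are nested binary tuples such as ("->", "p", ("and", "q", "r")).

def eliminate_implications(f):
    if isinstance(f, str):
        return f
    op, *args = f
    args = [eliminate_implications(a) for a in args]
    if op == "->":                          # a -> b  ==  ~a v b
        return ("or", ("not", args[0]), args[1])
    return (op, *args)

def push_negations(f):
    if isinstance(f, str):
        return f
    op, *args = f
    if op == "not":
        a = args[0]
        if isinstance(a, str):
            return f
        inner_op, *inner = a
        if inner_op == "not":               # ~~p == p
            return push_negations(inner[0])
        if inner_op == "and":               # deMorgan: ~(a & b) == ~a v ~b
            return ("or", *[push_negations(("not", x)) for x in inner])
        if inner_op == "or":                # deMorgan: ~(a v b) == ~a & ~b
            return ("and", *[push_negations(("not", x)) for x in inner])
    return (op, *[push_negations(a) for a in args])

def distribute(f):
    if isinstance(f, str) or f[0] == "not":
        return f
    op, a, b = f[0], distribute(f[1]), distribute(f[2])
    if op == "or":
        if not isinstance(a, str) and a[0] == "and":    # (x & y) v b
            return ("and", distribute(("or", a[1], b)), distribute(("or", a[2], b)))
        if not isinstance(b, str) and b[0] == "and":    # a v (x & y)
            return ("and", distribute(("or", a, b[1])), distribute(("or", a, b[2])))
    return (op, a, b)

def clauses(f):
    """Step 8: collect the conjuncts; each conjunct is one clause."""
    if not isinstance(f, str) and f[0] == "and":
        return clauses(f[1]) + clauses(f[2])
    return [f]

cnf = distribute(push_negations(eliminate_implications(("->", "p", ("and", "q", "r")))))
print(clauses(cnf))   # [('or', ('not', 'p'), 'q'), ('or', ('not', 'p'), 'r')]
```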
  • 35. The Basis of Resolution • The resolution procedure is a simple iterative process: at each step, two clauses, called the parent clauses, are compared (resolved), yielding a new clause that has been inferred from them. • The new clause represents ways that the two parent clauses interact with each other. • Suppose that there are two clauses in the system: winter V summer and ¬winter V cold. Observe that precisely one of winter and ¬winter will be true at any point. If winter is true, then cold must be true to guarantee the truth of the second clause. If ¬winter is true, then summer must be true to guarantee the truth of the first clause. From these two clauses we can deduce summer V cold. • This is the deduction that the resolution procedure will make. • Resolution operates by taking two clauses that each contain the same literal, in this example, winter. • The literal must occur in positive form in one clause and in negative form in the other. • The resolvent is obtained by combining all of the literals of the two parent clauses except the ones that cancel. • If the clause that is produced is the empty clause, then a contradiction has been found.
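A single resolution step of this kind is easy to sketch. In the snippet below, clauses are represented as frozensets of string literals with "~" marking negation; that convention, and the names negate and resolve, are chosen only for the illustration.

```python
# One propositional resolution step on the winter/summer example: the
# complementary pair winter / ~winter cancels, and the remaining literals
# of the two parents are combined into the resolvent.

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """Return all resolvents obtainable from the two parent clauses."""
    out = set()
    for lit in c1:
        if negate(lit) in c2:
            out.add(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out

print(resolve(frozenset({"winter", "summer"}), frozenset({"~winter", "cold"})))
# {frozenset({'summer', 'cold'})}  -- i.e. the clause summer V cold
```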
  • 36. • The two clauses winter and ¬winter will produce the empty clause. If a contradiction exists, then eventually it will be found. Of course, if no contradiction exists, it is possible that the procedure will never terminate, although as we will see, there are often ways of detecting that no contradiction exists. • In predicate logic, all possible ways of substituting values for the variables are considered. • The theoretical basis of the resolution procedure in predicate logic is Herbrand’s theorem: - To show that a set of clauses S is unsatisfiable, it is necessary to consider only interpretations over a particular set, called the Herbrand universe of S. - A set of clauses S is unsatisfiable if and only if a finite subset of ground instances (in which all bound variables have had a value substituted for them) of S is unsatisfiable.
  • 37. Resolution in Propositional Logic • In propositional logic, the procedure for producing a proof by resolution of proposition P with respect to a set of axioms F is the following.
   Algorithm: Propositional Resolution
   1. Convert all the propositions of F to clause form.
   2. Negate P and convert the result to clause form. Add it to the set of clauses obtained in step 1.
   3. Repeat until either a contradiction is found or no progress can be made:
      a) Select two clauses. Call these the parent clauses.
      b) Resolve them together. The resulting clause, called the resolvent, will be the disjunction of all of the literals of both of the parent clauses with the following exception: If there are any pairs of literals L and ¬L such that one of the parent clauses contains L and the other contains ¬L, then select one such pair and eliminate both L and ¬L from the resolvent.
      c) If the resolvent is the empty clause, then a contradiction has been found. If it is not, then add it to the set of clauses available to the procedure.
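A hedged Python sketch of this loop is given below. It repeats the small helpers from the previous sketch so that it stands alone; refute, negate, and resolvents are illustrative names, and no clause-selection strategy is attempted, so this is a brute-force version of step 3.

```python
# A sketch of the propositional resolution-refutation loop. Clauses are
# frozensets of string literals ("p", "~p"); the empty frozenset is the
# empty clause, i.e. a contradiction.

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolvents(c1, c2):
    out = set()
    for lit in c1:
        if negate(lit) in c2:
            out.add(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out

def refute(axiom_clauses, negated_goal_clauses):
    """Return True if the empty clause (a contradiction) can be derived."""
    clauses = set(axiom_clauses) | set(negated_goal_clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in resolvents(a, b):
                    if not r:              # empty clause: contradiction found
                        return True
                    new.add(r)
        if new <= clauses:                 # no progress can be made
            return False
        clauses |= new
```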
  • 38. Simple example: suppose we want to prove R. We negate R, producing ¬R, which is already in clause form. One way of viewing the resolution process is that it takes a set of clauses that are all assumed to be true and, based on information provided by the others, it generates new clauses that represent restrictions on the way each of those original clauses can be made true. A contradiction occurs when a clause becomes so restricted that there is no way it can be true. This is indicated by the generation of the empty clause.
  • 40. • In order for proposition 2 to be true, one of three things must be true: ¬P, ¬Q, or R. But we are assuming that ¬R is true. • Given that, the only way for proposition 2 to be true is for one of two things to be true: ¬P or ¬Q. • That is what the first resolvent clause says. But proposition 1 says that P is true, which means that ¬P cannot be true, which leaves only one way for proposition 2 to be true, namely for ¬Q to be true (as shown in the second resolvent clause). Proposition 4 can be true if either ¬T or Q is true. But since we now know that ¬Q must be true, the only way for proposition 4 to be true is for ¬T to be true (the third resolvent). • But proposition 5 says that T is true. Thus there is no way for all of these clauses to be true in a single interpretation. This is indicated by the empty clause.
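As a usage note, and assuming the refute() sketch above is in scope, the clause set implied by this walkthrough (P; ¬P V ¬Q V R; ¬T V Q; T) together with the negated goal ¬R does yield the empty clause:

```python
# Assumes negate(), resolvents() and refute() from the sketch above.
axioms = [frozenset({"p"}),               # proposition 1
          frozenset({"~p", "~q", "r"}),   # proposition 2
          frozenset({"~t", "q"}),         # proposition 4
          frozenset({"t"})]               # proposition 5
print(refute(axioms, [frozenset({"~r"})]))   # True: a contradiction is found
```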
  • 41. The Unification Algorithm • In propositional logic it is easy to determine that two literals cannot both be true at the same time: simply look for L and ¬L. In predicate logic, this matching process is more complicated, since the arguments of the predicates must be considered. • Example: man(John) and ¬man(John) is a contradiction, while man(John) and ¬man(Spot) is not. • In order to determine contradictions, we need a matching procedure that compares two literals and discovers whether there exists a set of substitutions that makes them identical. • The substitution of variables to make resolution possible is called unification, and proof in predicate logic can be based on this joint method of unification and resolution. • Unification is a process of making two different logical atomic expressions identical by finding a substitution. • The basic idea of unification is very simple. To attempt to unify two literals, first check if their initial predicate symbols are the same. If so, we can proceed. Otherwise, there is no way they can be unified, regardless of their arguments. For example, tryassassinate(Marcus, Caesar) and hate(Marcus, Caesar) cannot be unified.
  • 42. • If the predicate symbols match, then we must check the arguments, one pair at a time. If the first pair matches, we can continue with the second, and so on. To test each argument pair, we can simply call the unification procedure recursively. The matching rules are simple. Different constants or predicates cannot match; identical ones can. A variable can match another variable, any constant, or a predicate expression, with the restriction that the predicate expression must not contain any instances of the variable being matched.
  • 43. Example 1: To unify the expressions P(x, x) and P(y, z): the two instances of P match fine. Next we compare x and y, and decide that if we substitute y for x, they could match. We will write that substitution as y/x. For the second pair of arguments we could produce z/x, but (y/x, z/x) is not a consistent substitution. So after finding the first substitution we apply it throughout the literals, giving P(y, y) and P(y, z). Now we can succeed with a substitution of z/y. Example 2: hate(x, y) and hate(Marcus, z) could be unified with any of the following substitutions: (Marcus/x, z/y), (Marcus/x, y/z), (Marcus/x, Caesar/y, Caesar/z).
  • 44. Algorithm: Unify(L1, L2)
   1. If L1 or L2 are both variables or constants, then:
      (a) If L1 and L2 are identical, then return NIL.
      (b) Else if L1 is a variable, then if L1 occurs in L2 then return {FAIL}, else return (L2/L1).
      (c) Else if L2 is a variable, then if L2 occurs in L1 then return {FAIL}, else return (L1/L2).
      (d) Else return {FAIL}.
   2. If the initial predicate symbols in L1 and L2 are not identical, then return {FAIL}.
   3. If L1 and L2 have a different number of arguments, then return {FAIL}.
   4. Set SUBST to NIL. (At the end of this procedure, SUBST will contain all the substitutions used to unify L1 and L2.)
   5. For i ← 1 to number of arguments in L1:
      (a) Call Unify with the ith argument of L1 and the ith argument of L2, putting the result in S.
      (b) If S contains FAIL then return {FAIL}.
      (c) If S is not equal to NIL then:
         (i) Apply S to the remainder of both L1 and L2.
         (ii) SUBST := APPEND(S, SUBST).
   6. Return SUBST.
   (L2/L1 means: substitute L2 for L1.)
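The algorithm can be sketched in Python as follows. Literals are nested tuples such as ("hate", "x", "y"); following the chapter's notation, lowercase strings are treated as variables and capitalised strings as constants, and the substitution is kept in a dict rather than the (L2/L1) list notation. These representation choices, and the helper names, are assumptions made only for this sketch.

```python
# A sketch of unification with an occurs check. A substitution is a dict
# mapping variable names to terms.

def is_variable(t):
    return isinstance(t, str) and t[0].islower()

def substitute(term, subst):
    """Apply a substitution to a term, following chains of bindings."""
    if is_variable(term):
        return substitute(subst[term], subst) if term in subst else term
    if isinstance(term, tuple):
        return tuple(substitute(a, subst) for a in term)
    return term

def occurs_in(var, term, subst):
    term = substitute(term, subst)
    if term == var:
        return True
    return isinstance(term, tuple) and any(occurs_in(var, a, subst) for a in term)

def unify(l1, l2, subst=None):
    """Return a substitution unifying l1 and l2, or None on failure."""
    subst = {} if subst is None else subst
    l1, l2 = substitute(l1, subst), substitute(l2, subst)
    if l1 == l2:
        return subst
    if is_variable(l1):
        return None if occurs_in(l1, l2, subst) else {**subst, l1: l2}
    if is_variable(l2):
        return None if occurs_in(l2, l1, subst) else {**subst, l2: l1}
    if isinstance(l1, tuple) and isinstance(l2, tuple) and \
            l1[0] == l2[0] and len(l1) == len(l2):   # same predicate, same arity
        for a, b in zip(l1[1:], l2[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

print(unify(("P", "x", "x"), ("P", "y", "z")))             # {'x': 'y', 'y': 'z'}
print(unify(("hate", "x", "y"), ("hate", "Marcus", "z")))  # {'x': 'Marcus', 'y': 'z'}
print(unify(("tryassassinate", "Marcus", "Caesar"),
            ("hate", "Marcus", "Caesar")))                 # None (predicates differ)
```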
  • 45. Resolution in Predicate Logic • To determine that two literals are contradictory, check whether one of them can be unified with the negation of the other. So, for example, man(x) and ¬man(Spot) are contradictory, since man(x) and man(Spot) can be unified. • To use resolution for expressions in predicate logic, use the unification algorithm to locate pairs of literals that cancel out, and use the unifier produced by the unification algorithm to generate the resolvent clause. For example, suppose we want to resolve two clauses: 1. man(Marcus) 2. ¬man(x1) V mortal(x1) • The literal man(Marcus) can be unified with the literal man(x1) with the substitution Marcus/x1, telling us that for x1 = Marcus, ¬man(Marcus) is false. But we cannot simply cancel out the two man literals as we did in propositional logic and generate the resolvent mortal(x1). Clause 2 says that for a given x1, either ¬man(x1) or mortal(x1) is true. Since we know that man(Marcus) is true, we can conclude only that mortal(Marcus) must be true.
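Assuming the unify() and substitute() helpers from the sketch above are in scope, one first-order resolution step over these two clauses might look like the following; resolve_fo and the literal encoding (("not", atom) for a negated literal) are illustrative only.

```python
# A clause is a list of literals; a literal is an atom tuple or ("not", atom).
# The unifier found for the complementary pair is applied to the remaining
# literals, so resolving man(Marcus) with ~man(x1) V mortal(x1) yields
# mortal(Marcus), not mortal(x1).

def resolve_fo(c1, c2):
    """Yield resolvents of two first-order clauses (one complementary pair)."""
    for lit1 in c1:
        for lit2 in c2:
            pos, neg = (lit1, lit2) if lit2[0] == "not" else (lit2, lit1)
            if pos[0] == "not" or neg[0] != "not":
                continue                 # need one positive and one negative literal
            subst = unify(pos, neg[1])   # unify the atom with the negated atom
            if subst is not None:
                rest = [l for l in c1 if l is not lit1] + [l for l in c2 if l is not lit2]
                yield [substitute(l, subst) for l in rest]

clause1 = [("man", "Marcus")]
clause2 = [("not", ("man", "x1")), ("mortal", "x1")]
print(list(resolve_fo(clause1, clause2)))    # [[('mortal', 'Marcus')]]
```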
  • 46. Algorithm: Resolution
   1. Convert all the statements of F to clause form.
   2. Negate P and convert the result to clause form. Add it to the set of clauses obtained in 1.
   3. Repeat until either a contradiction is found, no progress can be made, or a predetermined amount of effort has been expended:
      a) Select two clauses. Call these the parent clauses.
      b) Resolve them together. The resolvent will be the disjunction of all the literals of both parent clauses with appropriate substitutions performed and with the following exception: If there is one pair of literals T1 and ¬T2 such that one of the parent clauses contains T1 and the other contains ¬T2, and if T1 and T2 are unifiable, then neither T1 nor ¬T2 should appear in the resolvent. Call T1 and T2 complementary literals. Use the substitution produced by the unification to create the resolvent. If there is more than one pair of complementary literals, only one pair should be omitted from the resolvent.
      c) If the resolvent is the empty clause, then a contradiction has been found. If it is not, then add it to the set of clauses available to the procedure.
   If the choice of clauses to resolve together at each step is made in certain systematic ways, then the resolution procedure will find a contradiction if one exists.
  • 47. There exist strategies for making the choice that can speed up the process considerably: • Only resolve pairs of clauses that contain complementary literals, since only such resolutions produce new clauses that are harder to satisfy than their parents. • Eliminate certain clauses as soon as they are generated so that they cannot participate in later resolutions. Two kinds of clauses should be eliminated: tautologies (which can never be unsatisfied) and clauses that are subsumed by other clauses. • Whenever possible, resolve either with one of the clauses that is part of the statement we are trying to refute, or with a clause generated by a resolution with such a clause. • Whenever possible, resolve with clauses that have a single literal. Such resolutions generate new clauses with fewer literals than the larger of their parent clauses and thus are probably closer to the goal of a resolvent with zero terms. This method is called the unit-preference strategy.
  • 50. Suppose our knowledge base contained the two additional statements: 9) persecute(x, y) → hate(y, x) 10) hate(x, y) → persecute(y, x) Converting to clause form, we get: 9) ¬persecute(x5, y2) V hate(y2, x5) 10) ¬hate(x6, y3) V persecute(y3, x6)
  • 51. The variable y occurs in both clause 1 and clause 2. The substitution at the second resolution step produces a clause that is too restricted and so does not lead to the contradiction that is present in the database. If the clause ¬Father(Chris, Y) had been produced, the contradiction with clause 4 would have emerged. This would have happened if clause 2 had been rewritten as ¬Mother(a, b) V woman(a).
  • 52. The need to try several substitutions: Resolution provides a good way of finding a refutation proof without actually trying all the substitutions that Herbrand’s theorem suggests. But it does not always eliminate the need to try more than one substitution. E.g., to prove that Marcus hates some ruler, we can try each substitution in turn, looking for one that leads to a contradiction.
  • 53. Question Answering • Question: When did Marcus die? • Answer: find a known statement that matches the terms given in the question.
  • 55. What happened in 79 A.D.? ∃x: event(x, 79) The fact erupted(volcano, 79) can be represented as event(erupted(volcano), 79).
  • 56. Natural Deduction • It is a way of doing machine theorem proving that corresponds more closely to the processes used in human theorem proving. • Resolution is an easily implementable proof procedure that relies for its simplicity on a uniform representation of the statements it uses. • In converting everything to clause form, however, we often lose valuable heuristic information that is contained in the original representation of the facts.
  • 57. For example, suppose we believe that all judges who are not crooked are well-educated, which can be represented as ∀x: judge(x) ∧ ¬crooked(x) → educated(x) If converted to clause form, this becomes ¬judge(x) V crooked(x) V educated(x) This clause appears also to be a way of deducing that someone is not a judge by showing that he is not crooked and not educated. The heuristic information contained in the original statement has been lost in the transformation.
  • 58. 6. Representing Knowledge Using Rules: Procedural versus Declarative Knowledge
  • 59. The declarative representation is one in which the knowledge is specified, but the use to which that knowledge is to be put is not given. Declarative knowledge answers the question 'What do you know?' It is your understanding of things, ideas, or concepts. In other words, declarative knowledge can be thought of as the who, what, when, and where of information. Declarative knowledge is normally discussed using nouns, like the names of people, places, or things, or dates that events occurred. The procedural representation is one in which the control information necessary to use the knowledge is considered to be embedded in the knowledge itself. Procedural knowledge answers the question 'What can you do?' While declarative knowledge is demonstrated using nouns, procedural knowledge relies on action words, or verbs. It is a person's ability to carry out actions to complete a task.
  • 60. The real difference between the declarative and procedural views of knowledge lies in where the control information resides. Example: statements 1, 2 and 3 are procedural knowledge, and 4 is declarative knowledge.
  • 61. Forward & Backward Reasoning The object of a search procedure is to discover a path through a problem space from an initial configuration to a goal state. There are actually two directions in which such a search could proceed: • Forward reasoning, from the start states: the LHS of a rule must match the current state. E.g., A → B, B → C => A → C. • Backward reasoning, from the goal states: the RHS of a rule must match the goal state. E.g., the 8-puzzle problem. In both cases, the control strategy must cause motion and be systematic. The production system model of the search process provides an easy way of viewing forward and backward reasoning as symmetric processes.
  • 62. Consider the problem of solving a particular instance of the 8-puzzle problem. The rules to be used for solving the puzzle can be written as:
  • 63. Reasoning Forward from the Initial State: Begin building a tree of move sequences that might be solutions, with the initial configuration at the root of the tree. Generate the next level of the tree by finding all the rules whose left sides match the root node and using their right sides to create the new configurations. Generate the next level by taking each node generated at the previous level and applying to it all of the rules whose left sides match it. Continue until a configuration that matches the goal state is generated.
  • 64. Reasoning Backward from the Goal State: Begin building a tree of move sequences that might be solutions, with the goal configuration at the root of the tree. Generate the next level of the tree by finding all the rules whose right sides match the root node. These are all the rules that, if only we could apply them, would generate the state we want. Use the left sides of the rules to generate the nodes at this second level of the tree. Generate the next level of the tree by taking each node at the previous level and finding all the rules whose right sides match it. Then use the corresponding left sides to generate the new nodes. Continue until a node that matches the initial state is generated. This method of reasoning backward from the desired final state is often called goal-directed reasoning.
  • 65. To reason forward, the left sides (preconditions) are matched against the current state and the right sides (results) are used to generate new nodes until the goal is reached. To reason backward, the right sides are matched against the current node and the left sides are used to generate new nodes representing new goal states to be achieved. The following 4 factors influence whether it is better to reason forward or backward: 1. Are there more possible start states or goal states? We would like to move from the smaller set of states to the larger (and thus easier to find) set of states. 2. In which direction is the branching factor (i.e., the average number of nodes that can be reached directly from a single node) greater? We would like to proceed in the direction with the lower branching factor. 3. Will the program be used to justify its reasoning process to a user? If so, it is important to proceed in the direction that corresponds more closely with the way the user will think. 4. What kind of event is going to trigger a problem-solving episode? If it is the arrival of a new fact, forward reasoning makes sense. If it is a query to which a response is desired, backward reasoning is more natural.
  • 66. Backward-Chaining Rule Systems • Backward-chaining rule systems are good for goal-directed problem solving. For example, a query system would probably use backward chaining to reason about and answer user questions. • Unification tries to find a set of bindings for variables to equate a (sub)goal with the head of some rule. • Typical applications: medical expert systems, diagnostic problems. Forward-Chaining Rule Systems • Instead of being directed by goals, we sometimes want to be directed by incoming data. For example, suppose you sense searing heat near your hand; you are likely to jerk your hand away. • Rules that match dump their right-hand-side assertions into the state, and the process repeats. • Matching is typically more complex for forward-chaining systems than for backward ones. • Typical applications: synthesis systems such as design and configuration.
  • 67. Example of Typical Forward Chaining • Rules: 1) If hot and smoky then ADD fire 2) If alarm_beeps then ADD smoky 3) If fire then ADD switch_on_sprinklers • Facts: F1) alarm_beeps (given) F2) hot (given) F3) smoky (from F1 by R2) F4) fire (from F2, F3 by R1) F5) switch_on_sprinklers (from F4 by R3) Example of Typical Backward Chaining • Goal: Should I switch on the sprinklers?
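A minimal Python sketch of both styles on these rules may help. The names forward_chain and backward_chain, and the encoding of each rule as a (set-of-conditions, conclusion) pair, are assumptions made purely for this example.

```python
# Rules from the example: all conditions in working memory => ADD the conclusion.
rules = [
    ({"hot", "smoky"}, "fire"),
    ({"alarm_beeps"}, "smoky"),
    ({"fire"}, "switch_on_sprinklers"),
]

def forward_chain(facts, rules):
    """Data-directed: keep firing rules whose conditions hold until nothing new is added."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)     # fire the rule: ADD its conclusion
                changed = True
    return facts

def backward_chain(goal, facts, rules):
    """Goal-directed: a goal holds if it is a fact or some rule concluding it has provable conditions.
    (Assumes the rule set is not cyclic.)"""
    if goal in facts:
        return True
    return any(all(backward_chain(c, facts, rules) for c in conditions)
               for conditions, conclusion in rules if conclusion == goal)

print(forward_chain({"alarm_beeps", "hot"}, rules))
# {'alarm_beeps', 'hot', 'smoky', 'fire', 'switch_on_sprinklers'}  (set order may vary)
print(backward_chain("switch_on_sprinklers", {"alarm_beeps", "hot"}, rules))  # True
```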
  • 68. Combining Forward and Backward Reasoning • Sometimes certain aspects of a problem are best handled via forward chaining and other aspects by backward chaining. Consider a forward-chaining medical diagnosis program. It might accept twenty or so facts about a patient’s condition then forward chain on those concepts to try to deduce the nature and/or cause of the disease. • Now suppose that at some point, the left side of a rule was nearly satisfied – nine out of ten of its preconditions were met. It might be efficient to apply backward reasoning to satisfy the tenth precondition in a directed manner, rather than wait for forward chaining to supply the fact by accident. • Whether it is possible to use the same rules for both forward and backward reasoning also depends on the form of the rules themselves. If both left sides and right sides contain pure assertions, then forward chaining can match assertions on the left side of a rule and add to the state description the assertions on the right side. But if arbitrary procedures are allowed as the right sides of rules then the rules will not be reversible.
  • 69. Logic Programming Logic programming is a programming language paradigm in which logical assertions are viewed as programs. There are several logic programming systems in use today, the most popular of which is PROLOG. A PROLOG program is described as a series of logical assertions, each of which is a Horn clause. A Horn clause is a clause that has at most one positive literal. Thus p, ¬p V q, and p → q are all Horn clauses. Programs written in pure PROLOG are composed only of Horn clauses.
  • 70. Syntactic differences between the logic and the PROLOG representations include: • In logic, variables are explicitly quantified. In PROLOG, quantification is provided implicitly by the way the variables are interpreted. The distinction between variables and constants is made in PROLOG by having all variables begin with uppercase letters and all constants begin with lowercase letters. • In logic, there are explicit symbols for and (∧) and or (V). In PROLOG, there is an explicit symbol for and (,), but there is none for or. • In logic, implications of the form “p implies q” are written as p → q. In PROLOG, the same implication is written “backward”, as q :- p.
  • 71. The first two of these differences arise naturally from the fact that PROLOG programs are actually sets of Horn clauses that have been transformed as follows: 1. If the Horn clause contains no negative literals (i.e., it contains a single literal, which is positive), then leave it as it is. 2. Otherwise, rewrite the Horn clause as an implication, combining all of the negative literals into the antecedent of the implication and leaving the single positive literal (if there is one) as the consequent. This procedure causes a clause, which originally consisted of a disjunction of literals (all but one of which were negative), to be transformed into a single implication whose antecedent is a conjunction of (what are now positive) literals.
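The transformation described in step 2 is easy to sketch. The helper horn_to_implication below is illustrative only: it takes a Horn clause as a list of string literals ("~" marking negation) and prints the PROLOG-style backward implication.

```python
# A sketch of the Horn-clause transformation: negative literals become the
# antecedent (body), the single positive literal (if any) becomes the
# consequent (head), written PROLOG-style as "head :- body".

def horn_to_implication(clause):
    positives = [lit for lit in clause if not lit.startswith("~")]
    negatives = [lit[1:] for lit in clause if lit.startswith("~")]
    if not negatives:                     # a single positive literal: a fact
        return positives[0]
    # A clause with no positive literal is a goal/denial; shown with a
    # "false" head here purely for simplicity.
    head = positives[0] if positives else "false"
    return f"{head} :- {', '.join(negatives)}"

print(horn_to_implication(["p"]))                 # p
print(horn_to_implication(["~p", "q"]))           # q :- p      (i.e. p -> q)
print(horn_to_implication(["~a", "~b", "c"]))     # c :- a, b
```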
  • 73. Matching • We described the process of using search to solve problems as the application of appropriate rules to individual problem states to generate new states, to which the rules can then be applied, and so forth, until a solution is found. • How do we extract from the entire collection of rules those that can be applied at a given point? Doing so requires some kind of matching between the current state and the preconditions of the rules. How should this be done? The answer to this question can be critical to the success of a rule-based system. • A more complex matching is required when the preconditions of a rule specify required properties that are not stated explicitly in the description of the current state. In this case, a separate set of rules must be used to describe how some properties can be inferred from others. • An even more complex matching process is required if rules should be applied when their preconditions only approximately match the current situation. This is often the case in situations involving physical descriptions of the world.
  • 74. Indexing • One way to select applicable rules is to do a simple search through all the rules, comparing each one’s preconditions to the current state and extracting all the ones that match. There are two problems with this simple solution: i. A large number of rules will usually be necessary, and scanning through all of them at every step would be inefficient. ii. It’s not always obvious whether a rule’s preconditions are satisfied by a particular state. • Solution: instead of searching through the rules, use the current state as an index into the rules and select the matching ones immediately.
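A very small sketch of this idea: rules are grouped in a dictionary keyed by the predicate of their precondition, so a fact from the current state retrieves only the candidate rules instead of triggering a scan of the whole rule base. The rule encoding below is an assumption made for the illustration.

```python
# Index rules by the predicate symbol of their precondition, then use a fact
# from the current state as the lookup key.
from collections import defaultdict

rules = [
    (("man", "x"), ("mortal", "x")),          # man(x) -> mortal(x)
    (("Pompeian", "x"), ("Roman", "x")),      # Pompeian(x) -> Roman(x)
]

index = defaultdict(list)
for precondition, conclusion in rules:
    index[precondition[0]].append((precondition, conclusion))

state_fact = ("man", "Marcus")
candidates = index[state_fact[0]]             # only rules that could possibly match
print(candidates)                             # [(('man', 'x'), ('mortal', 'x'))]
```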
  • 75. Matching with Variables • The problem of selecting applicable rules is made more difficult when preconditions are not stated as exact descriptions of particular situations but rather describe properties that the situations must have. It often turns out that discovering whether there is a match between a particular situation and the preconditions of a given rule must itself involve a significant search process. • Backward-chaining systems usually use depth-first backtracking to select individual rules, but forward-chaining systems generally employ sophisticated conflict resolution strategies to choose among the applicable rules. • While it is possible to apply unification repeatedly over the cross product of preconditions and state description elements, it is more efficient to consider the many-many match problem, in which many rules are matched against many elements in the state description simultaneously. One efficient many-many match algorithm is RETE.
  • 76. RETE Matching Algorithm • The matching system consists of 3 parts: 1. Rules (productions) 2. Working memory 3. Inference engine • The inference engine runs the production-system cycle: match, select, execute.
  • 77. The above cycle is repeated until no rules are put in the conflict set or until a stopping condition is reached. Verifying many conditions on every cycle is a time-consuming process; to eliminate the need to perform thousands of matches per cycle, an efficient matching algorithm called RETE is used. The algorithm rests on two ideas: 1. Only changes to working memory need to be examined. 2. Rules which share the same condition are grouped and linked to their common terms. RETE is a many-many match algorithm (in which many rules are matched against many elements). RETE is used in forward-chaining systems, which generally employ sophisticated conflict resolution strategies to choose among applicable rules. RETE gains efficiency from 3 major sources: 1. RETE maintains a network of rule conditions and uses changes in the state description to determine which new rules might apply. Full matching is only pursued for candidates that could be affected by incoming/outgoing data. 2. Structural similarity in rules: RETE stores the rules so that they share structures in memory; sets of conditions that appear in several rules are matched once per cycle. 3. Persistence of variable-binding consistency: while all the individual preconditions of a rule might be met, there may be variable-binding conflicts that prevent the rule from firing. Recomputing bindings can be minimized because RETE remembers its previous calculations and is able to merge new binding information efficiently.
  • 78. Approximate Matching: rules should be applied if their preconditions approximately match the current situation. E.g., a speech-understanding program: the rules map a description of a physical waveform to phones, and the physical signal varies because of differences in the way individuals speak and because of background noise.
   Conflict Resolution: when several rules match at once, the result is a conflict set; deciding which rule to fire is called conflict resolution. There are 3 approaches to the problem of conflict resolution in a production system:
   1. Preference based on the rule that matched: a. the physical order in which the rules are presented to the system; b. priority given to rules in the order in which they appear.
   2. Preference based on the objects that matched: a. considers the importance of the objects that are matched; b. considers the position of the matchable objects in terms of long-term memory (LTM) and short-term memory (STM). LTM stores the set of rules; STM (working memory) serves as a storage area for the facts deduced by the rules in long-term memory.
   3. Preference based on the action: one way is to fire all the matched rules temporarily and examine the results of each, using a heuristic function to evaluate each of the resulting states, compare the merits of the results, and then select the preferred one.
  • 79. Search Control Knowledge: knowledge about which paths are most likely to lead quickly to a goal state. Search control knowledge is a form of meta-knowledge. It can take many forms: knowledge about which states are more preferable to others, which rule to apply in a given situation, the order in which to pursue subgoals, and useful sequences of rules to apply.