Abstract KR: Using WordNet and VerbNet for Question Answering

Valeria de Paiva
speaking for the PARC Aquaint team

Dick Crouch, Tracy King, Danny Bobrow,
Cleo Condoravdi, Ron Kaplan,
Annie Zaenen, Lauri Karttunen
Outline

• Motivation: why use deep linguistic analysis for question answering
• Basic architecture and example structures
• Abstract Knowledge Representation
• The experiment: WordNet/VerbNet
• Discussion
Motivation

• Knowledge-based question answering
  – Deep representations allow high precision and recall, but
  – Typically on restricted domains
  – Hard for users to pose KR questions and interpret KR answers
  – Very hard for the system to build up knowledge

• Shallow, open-domain question answering
  – Broad coverage
  – Lower precision and recall
  – Easy to pose questions, but sensitive to question form

• Question answering at PARC
  – Layered mapping from language to deeper semantic representations
  – Broad coverage: matching for answers as light reasoning
  – Expresses KRR answers in real English -- eventually
2-way bridge between language & KR

[Architecture diagram: Text → XLE/LFG parsing → F-structure → Semantics → (KR mapping) → Abstract KR → Target KRR. Source texts become Assertions; a Question is matched against them (Match → M → M) and becomes a Query for Inference. On the way back, Composed F-Structure Templates drive XLE/LFG Generation, returning Answers, Explanations and Subqueries as text to the user.]
2-way bridge between language & KR

[Same architecture diagram as the previous slide, annotated with the supporting layers:
  Theories: Lexical Functional Grammar; ambiguity management.
  Infrastructure: XLE; term rewriting; MaxEnt models.
  Resources: English grammar; KR mapping rules; Unified Lexicon.]
Key Process: Canonicalize representations

• F-structures are semantic representations to some – e.g. logic forms (Rus/Moldovan '01)
• Transformed into (flat and contexted) semantic structures, inspired by Glue, born out of the need to 'pack' semantics (talk to Dick about it...)
• Transformed into (flat and contexted) AKR structures
• Want to discuss semantics to AKR,
  but first: what does canonicalization buy you?
Canonicalization helps matching

• Argument structure:
  – Mary bought an apple. / An apple was bought by Mary.
• Synonyms and hypernyms:
  – Mary bought/purchased/acquired an apple.
• Factivity and contexts:
  – Mary bought an apple. / Mary did not buy an apple.
  – Mary managed/failed to buy an apple.
  – Ed prevented Mary from buying an apple.
  – We know/said/believe that Mary bought an apple.
  – Mary didn't wait to buy an apple. (talk to Lauri...)
Canonicalization helps matching

• Basic argument structure
    Did Mary buy an apple?
    "Mary bought an apple."           Yes
    "An apple was bought by Mary."    Yes

  Semantics gives us:

    in_context(t, cardinality('Mary':9,sg)),
    in_context(t, cardinality(apple:1,sg)),
    in_context(t, specifier(apple:1,a)),
    in_context(t, tense(buy:3,past)),
    in_context(t, buy(buy:3,'Mary':9,apple:1)),
    in_context(t, proper_name('Mary':9,name,'Mary'))
Canonicalization helps matching

• Basic synonyms/hypernyms
    Did Mary buy an apple?
    "Mary bought an apple."       Yes
    "Mary purchased an apple."    Yes

  Using entailment detection:
    "Mary bought an apple." IMPLIES "Mary purchased an apple." ?

  System answer is YES;
  it aligns skolems in the appropriate way:
    [purchase##1-buy##1, Mary##0-Mary##0, apple##3-apple##3, t-t]
Canonicalization helps matching

• Basic context structure
  Negation is dealt with by context.
  Does "Mary did not buy an apple." imply "Mary bought an apple."? No

  System answer is NO:
  skolems are aligned, respecting contexts,
    [buy##1-buy##4, Mary##0-Mary##0, apple##3-apple##6, t-t]
  and we get a conflict:
    conflict(uninstantiable(passage,buy##4,t),
             instantiable(question,buy##4,t))
Canonicalization helps matching

• For negation you may think contexts are overkill, but consider implicative context structure:
  Does "Mary failed to buy an apple." imply "Mary bought an apple."?

  System answer is NO,
  under the right skolem alignment:
    [buy##1-buy##7, Mary##0-Mary##0, apple##3-apple##9, t-t]

  That is, entailment detection knows about contexts.
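To make "entailment detection knows about contexts" concrete, here is a minimal Python sketch of the check (not the PARC matcher; the fact encoding and the toy hypernym table are illustrative assumptions):

  # Minimal sketch, assuming AKR facts reduce to (claim, skolem, context) triples.
  HYPONYMS = {"buy": {"buy", "purchase", "acquire"}}  # toy concept-inclusion table

  def aligns(q_skolem, p_skolem):
      """A question skolem aligns with a passage skolem when the concepts
      match (same stem, or a listed hyponym of the question term)."""
      q, p = q_skolem.split("##")[0], p_skolem.split("##")[0]
      return q == p or p in HYPONYMS.get(q, set())

  def entails(passage, question):
      """Check each question fact against the passage, respecting contexts."""
      for kind, q_sk, ctx in question:
          support = conflict = False
          for p_kind, p_sk, p_ctx in passage:
              if aligns(q_sk, p_sk) and p_ctx == ctx:
                  if p_kind == kind:
                      support = True
                  else:                 # instantiable vs. uninstantiable clash
                      conflict = True
          if conflict:
              return "NO"
          if not support:
              return "UNKNOWN"
      return "YES"

  # "Mary failed to buy an apple." => buying uninstantiable in the top context t
  passage  = {("uninstantiable", "buy##7", "t"), ("instantiable", "Mary##0", "t")}
  question = {("instantiable", "buy##1", "t"), ("instantiable", "Mary##0", "t")}
  print(entails(passage, question))   # NO, as on this slide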
Overcoming language/KR misalignments: a principled approach

• Language
  – Generalizations come from the structure of the language
  – Representations compositionally derived from sentence structure
• Knowledge representation and reasoning
  – Generalizations come from the structure of the world
  – Representations designed to support reasoning
• A layered architecture helps with the different constraints
• But the boundaries are not fixed (like beads on a string?)
This talk: Semantics to AKR only…

[Same two-way bridge diagram and Theories/Infrastructure/Resources table as before, with the Semantics → Abstract KR step as the focus of this talk.]
Ambiguity-enabled Text Analysis Pipeline (the official version)

[Pipeline diagram:
  Text
    → XLE/LFG parsing (LFG grammar, morphology, lexicon): C-structure, F-structure
    → Linguistic semantics (semantic lexicon)
    → Term rewriting (transfer), with mapping rules: Abstract KR
    → Target KR (e.g. CycL, KM…)
 Stochastic selection applies across the layers; the Unified Lexicon draws on VerbNet, WordNet and the Cyc mappings.]
Ambiguity-enabled Text Analysis Pipeline (early 2005)

[Same pipeline diagram, but with stochastic selection applied right after XLE/LFG parsing; the Unified Lexicon (VerbNet, WordNet, Cyc mappings) feeds the term-rewriting (transfer) step that produces the Abstract KR and then the Target KR (e.g. CycL, KM…).]
Example structures
String: Mary did not laugh.
Syntax: Functional/dependency structure
Example structures cont.

Semantics
  in_context(t,not(ctx(laugh:2))),
  in_context(ctx(laugh:2),cardinality('Mary':0,sg)),
  in_context(ctx(laugh:2),laugh(laugh:2,'Mary':0)),
  in_context(ctx(laugh:2),tense(laugh:2,past)),
  in_context(ctx(laugh:2),proper_name('Mary':0,name,'Mary'))

Abstract Knowledge Representation
  context(t),
  context('cx_laugh##4'),
  instantiable('Mary##0','cx_laugh##4'),
  instantiable('Mary##0',t),
  instantiable('laugh_ev##4','cx_laugh##4'),
  uninstantiable('laugh_ev##4',t),
  role(cardinality_restriction,'Mary##0',sg),
  role(bodilyDoer,'laugh_ev##4','Mary##0'),
  subconcept('Mary##0','Person'),
  subconcept('laugh_ev##4','Laughing'),
  temporalRel(startsAfterEndingOf,'Now','laugh_ev##4')
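As a sketch of the negation pattern just shown (the tuple encoding is an assumption for illustration, not the real transfer machinery): negation leaves the event instantiable in its own context but uninstantiable in the top context t.

  def negation_to_akr(event_skolem, top="t"):
      """AKR context facts for a negated event such as 'laugh_ev##4'."""
      stem, num = event_skolem.split("##")           # 'laugh_ev', '4'
      cx = f"cx_{stem.removesuffix('_ev')}##{num}"   # 'cx_laugh##4'
      return [
          ("context", top),
          ("context", cx),
          ("instantiable", event_skolem, cx),    # the laughing exists in its context...
          ("uninstantiable", event_skolem, top), # ...but not in the author's context t
      ]

  for fact in negation_to_akr("laugh_ev##4"):
      print(fact)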
Resources in a Unified Lexicon

• Mapping to abstract KR requires
  – Coverage of most words and constructions
  – Detailed lexical entailments
• Results of merging XLE, VerbNet, Cyc, WordNet
  – 9,835 different verb stems
  – 46,000 verb entries (indexed by subcat/sense)
  – 24,000 with VerbNet information
  – 2,800 with Cyc information
• Triangulation to extend mappings to Cyc concepts
  – 21,000 verbs with Cyc data
  – 50,000 noun entries, including N-N compounds
  – 2,000 adjectives & adverbs
    (quality of data not evaluated)
Abstract KR: "Ed fired the boy."

F-structure:
  PRED  fire<Ed, boy>
  TENSE past
  SUBJ  [ PRED Ed ]
  OBJ   [ PRED boy, DEF + ]

Conceptual:
  (subconcept Ed3 Person)
  (subconcept boy2 MaleChild)
  (subconcept fire_ev1 DischargeWithPrejudice)
  (role fire_ev1 performedBy Ed3)
  (role fire_ev1 objectActedOn boy2)

Contextual:
  (context t)
  (instantiable Ed3 t)
  (instantiable boy2 t)
  (instantiable fire_ev1 t)

Temporal:
  (temporalRel startsAfterEndingOf Now fire_ev1)
Abstract KR: "Ed fired the gun."

F-structure:
  PRED  fire<Ed, gun>
  TENSE past
  SUBJ  [ PRED Ed ]
  OBJ   [ PRED gun, DEF + ]

Conceptual:
  (subconcept Ed0 Person)
  (subconcept gun3 Gun)
  (subconcept fire_ev1 ShootingAGun)
  (role fire_ev1 performedBy Ed0)
  (role fire_ev1 deviceUsed gun3)

Contextual:
  (context t)
  (instantiable Ed0 t)
  (instantiable gun3 t)
  (instantiable fire_ev1 t)

Temporal:
  (temporalRel startsAfterEndingOf Now fire_ev1)
Abstract Knowledge Representation

• Encode different aspects of meaning
  – Asserted content
    » concepts and arguments, relations among objects
  – Contexts
    » author commitment, belief, report, denial, prevent, …
  – Temporal relations
    » qualitative relations among time intervals, events
• Translate to various target KRs
    e.g. CycL, Knowledge Machine, Description Logic, AnsProlog
• Capture meaning ambiguity
  – Mapping to KR can introduce and reduce ambiguity
  – Need to handle ambiguity efficiently
• A Basic Logic for Textual Inference (Bobrow et al., July '05)
Cyc-oriented version of AKR

• Cyc concepts: Person, Laughing, DischargeWithPrejudice, etc.
• Cyc roles: bodilyDoer, performedBy, infoTransferred, etc.
• WordNet/VerbNet as fallback mechanisms
• Sortal restrictions from Cyc disambiguate,
  e.g., Ed fired the boy / Ed fired the gun.
• Limitation: raggedness of Cyc

How do we know? Can we do better?
Raggedness?

• It's clear that Cyc misses many senses of words, particularly more abstract ones.
• "Ed admitted the mistake" gives only:
    (A1, subconcept('admit_ev##1','AdmitToMembership')),
    (A2, subconcept('admit_ev##1','PermittingEntrance'))
• Nothing about "Ed admitted that [Mary had arrived]".
• Also nothing about "hesitate", "surprise", "please".
• Nothing about "forget" (compare: "Ed forgot that Mary left" → Mary left, but "Ed forgot to leave" → Ed did not leave).
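For comparison, one can inspect the WordNet sense inventory that a fallback would draw on; a sketch with NLTK (an assumed toolkit for illustration, not necessarily what the experiment used):

  # Assumes NLTK with the WordNet corpus installed: nltk.download('wordnet')
  from nltk.corpus import wordnet as wn

  for s in wn.synsets("admit", pos=wn.VERB):
      print(s.offset(), s.name(), "-", s.definition())

  # WordNet lists several verb senses of 'admit', including the
  # 'declare or acknowledge to be true' sense needed for
  # "Ed admitted that Mary had arrived", which the two Cyc mappings miss.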
Cyc KR: ambiguating too much?

"The school bought the store" (3 schools, 2 stores, 3 buys in the UL):

  choice([A1, A2], 1),
  choice([B1, B2], A1),
  choice([C1, C2], A2),
  choice([D1], or(B2,C2)),
  choice([E1, E2], B1),
  choice([F1, F2], C1),
  choice([G1], or(E1,F2)),
  choice([H1], or(E2,F1)),
  choice([I1], or(G1,B2,C2))
  …
  cf(or(B2,or(C2,E1,F2)), role(buyer,'buy_ev##2','school##1')),
  cf(or(B2,or(C2,E1,F2)), role(objectPaidFor,'buy_ev##2','store##4')),
  …
  cf(or(B2,or(C2,E1,F2)), subconcept('buy_ev##2','Buying')),
  cf(or(E2,F1), subconcept('buy_ev##2',buy)),
  cf(D1, subconcept('school##1','SchoolInstitution')),
  cf(G1, subconcept('school##1','SchoolInstitution-KThrough12')),
  cf(H1, subconcept('school##1','GroupFn'('Fish'))),
  cf(A2, subconcept('store##4','RetailStore')),
  cf(A1, subconcept('store##4','RetailStoreSpace'))
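Read each cf(Condition, Fact) as a fact guarded by a boolean condition over the choice variables. A minimal sketch of that reading (the Python encoding is an illustrative assumption; consistency with the choice([...]) groups is assumed, not checked):

  def holds(cond, chosen):
      """cond is '1' (always true), a choice label, or ('or', c1, c2, ...)."""
      if cond == "1":
          return True
      if isinstance(cond, tuple) and cond[0] == "or":
          return any(holds(c, chosen) for c in cond[1:])
      return cond in chosen

  packed = [
      (("or", "B2", "C2", "E1", "F2"), "subconcept(buy_ev##2, Buying)"),
      (("or", "E2", "F1"),             "subconcept(buy_ev##2, buy)"),
      ("D1",                           "subconcept(school##1, SchoolInstitution)"),
      ("H1",                           "subconcept(school##1, GroupFn(Fish))"),
  ]

  # One consistent assignment: A1, then B2 under A1, then D1 under or(B2,C2).
  reading = {"A1", "B2", "D1"}
  print([fact for cond, fact in packed if holds(cond, reading)])
  # ['subconcept(buy_ev##2, Buying)', 'subconcept(school##1, SchoolInstitution)']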
The Experiment:

[Pipeline diagram as before: Text → XLE/LFG parsing (LFG grammar, morphology, lexicon) → C-structure, F-structure → transfer semantics → linguistic semantics (semantic lexicon) → term rewriting → Abstract KR (mapping rules) → Target KR (e.g. CycL). The Unified Lexicon now draws on VerbNet and WordNet as primary resources, with the Cyc mappings set aside.]
Experiment: UL-oriented version of AKR

• Cyc only one of the lexical resources
• VerbNet/WordNet "concepts" are synsets
• Roles: very coarse ones from VerbNet
• Goal roles: less refined than Cyc (800+),
  more refined than VerbNet (<20)
• Have the infrastructure to experiment:

Can we do knowledge representation with VerbNet/WordNet?
Example "Mary didn't laugh" AGAIN

Semantics
  in_context(t,not(ctx(laugh:2))),
  in_context(ctx(laugh:2),cardinality('Mary':0,sg)),
  in_context(ctx(laugh:2),laugh(laugh:2,'Mary':0)),
  in_context(ctx(laugh:2),tense(laugh:2,past)),
  in_context(ctx(laugh:2),proper_name('Mary':0,name,'Mary'))

Abstract Knowledge Representation
  context(t),
  context('cx_laugh##4'),
  instantiable('Mary##0','cx_laugh##4'),
  instantiable('Mary##0',t),
  instantiable('laugh_ev##4','cx_laugh##4'),
  uninstantiable('laugh_ev##4',t),
  role(cardinality_restriction,'Mary##0',sg),
  role('Agent','laugh##2','Mary##0'),
  subconcept('Mary##0',[[7626,4576,4359,3122,7127,1930,1740]]),
  subconcept('laugh##2',[[31311]]),
  temporalRel(startsAfterEndingOf,'Now','laugh_ev##4')
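The synset-number lists above read as hypernym chains of WordNet offsets (1740 is the offset of the root synset 'entity'). A sketch of computing such chains with NLTK, assuming a comparable WordNet version and encoding:

  from nltk.corpus import wordnet as wn   # assumes nltk.download('wordnet')

  def offset_chains(lemma, pos):
      """One offset list per hypernym path, most specific synset first."""
      chains = []
      for synset in wn.synsets(lemma, pos=pos):
          for path in synset.hypernym_paths():        # ordered root ... synset
              chains.append([s.offset() for s in reversed(path)])
      return chains

  print(offset_chains("laugh", wn.VERB))   # candidate 'subconcept' values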
Example: The school bought the store

KR:
  choice([A1, A2], 1)

Packed clauses:
  cf(1, context(t)),
  cf(1, instantiable('buy##2',t)),
  cf(1, instantiable('school##1',t)),
  cf(1, instantiable('store##4',t)),
  cf(A1, role('Agent','buy##2','school##1')),
  cf(A2, role('Asset','buy##2','school##1')),
  cf(1, role('Theme','buy##2','store##4')),
  cf(1, subconcept('buy##2',[[2186766]])),
  cf(1, subconcept('school##1',
      [[8162936,8162558,7943952,7899136,7842951,29714,2236,2119,1740],
      [4099321,2884748,4290445,20846,3991,3122,1930,1740],
      [5686799,5682845,5681825,5631]])),
  cf(1, subconcept('store##4',
      [[4153847,3706735,3908195,3262760,4290445,20846,3991,3122,1930,1740],
      [13194813,13194436,13087849,13084632,13084472,13084292,131597020]])),
  cf(1, temporalRel(startsAfterEndingOf,'Now','buy##2'))
First thoughts

• Surprisingly easy to change the base ontology
• The modular architecture does help
• Do we have as good results as before?
• Well, no…
Improvements can be made…

• Some WordNet senses of buying should only go with A1, some with A2
• Sortal restrictions/preferences should come in
• But ambiguity is managed without losing meanings
Discussion

• Experiment ongoing
• Not a "serious" report yet; gut feelings
• UL marking of factives/implicatives/etc. very useful; (much) more needs to be done
• Matching can do more inference than first thought
• Does ambiguity-enabling technology on concepts mean there is no need to cluster WN senses?
• Are sortal restrictions/preferences a must?
Thanks!
Mary fired the boy.

  cf(1, context(t)),
  cf(1, instantiable('Mary##0',t)),
  cf(1, instantiable('boy##3',t)),
  cf(1, instantiable('fire##1',t)),
  cf(1, role('Agent','fire##1','Mary##0')),
  cf(1, role('Theme','fire##1','boy##3')),
  cf(1, subconcept('Mary##0',[[7626,4576,4359,3122,7127,1930,1740]])),
  cf(1, subconcept('boy##3',
      [[10131706,9487097,7626,4576,4359,3122,7127,1930,1740],
      [9725282,10133569,9487097,7626,4576,4359,31]])),
  cf(A1, subconcept('fire##1',[[1124984],[1123061],[1123474]])),
  cf(A2, subconcept('fire##1',[[2379472]])),
  cf(1, temporalRel(startsAfterEndingOf,'Now','fire##1'))
Layered mapping

[Pipeline diagram: text → XLE/LFG parsing → C-structure → F-structure → linguistic semantics → term rewriting → Abstract KR → Target KRR.]

Linguistic semantics (here for "A boy arrived."):
  in_context(t,arrive(arrive:2,boy:1)),
  in_context(t,cardinality(boy:1,sg)),
  in_context(t,specifier(boy:1,a)),
  in_context(t,tense(arrive:2,past))

The Abstract KR step introduces contexts, cardinality restrictions, etc.
Layered mapping

Abstract KR for "Ed knows that Mary arrived.":

  cf(1, context('cx_arrive##6')),
  cf(1, context(t)),
  cf(1, context_head('cx_arrive##6','arrive##6')),
  cf(1, context_head(t,'know##1')),
  cf(1, context_lifting_relation(veridical,t,'cx_arrive##6')),
  cf(1, instantiable('Ed##0',t)),
  cf(1, instantiable('Mary##4','cx_arrive##6')),
  cf(1, instantiable('Mary##4',t)),
  cf(1, instantiable('arrive##6','cx_arrive##6')),
  cf(1, instantiable('arrive##6',t)),
  cf(1, instantiable('know##1',t)),
  cf(1, role('Agent','know##1','Ed##0')),
  cf(1, role('Theme','arrive##6','Mary##4')),
  cf(1, role('Theme','know##1','cx_arrive##6')),
  cf(1, subconcept('Ed##0',[[7626,4576,4359,3122,7127,1930,1740]])),
  cf(1, subconcept('Mary##4',[[7626,4576,4359,3122,7127,1930,1740]])),
  cf(1, subconcept('arrive##6',[[1987643]])),
  cf(1, subconcept('know##1',[[585692],[587146],[587430],[588050]])),
  cf(1, temporalRel(startsAfterEndingOf,'Now','arrive##6')),
  cf(1, temporalRel(temporallyContains,'know##1','Now'))
Ambiguity is rampant in language

Alternatives multiply within and across layers…

[Diagram: alternatives fan out from C-structure through F-structure and linguistic semantics to Abstract KR and KRR.]
What not to do

Use heuristics to prune as soon as possible.
Oops: strong constraints may reject the so-far-best (= only) option.

[Diagram: statistics prune alternatives (marked X) at each layer, from C-structure through F-structure and linguistic semantics to Abstract KR and KRR.]

Fast computation, wrong result.
Manage ambiguity instead

The sheep liked the fish.    How many sheep? How many fish?

Options multiplied out:
  The sheep-sg liked the fish-sg.
  The sheep-pl liked the fish-sg.
  The sheep-sg liked the fish-pl.
  The sheep-pl liked the fish-pl.

Options packed:
  The sheep-{sg|pl} liked the fish-{sg|pl}.

Packed representation:
  – Encodes all dependencies without loss of information
  – Common items represented, computed once
  – Key to practical efficiency with broad-coverage grammars
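A toy sketch of the packing idea in Python: store each independent ambiguity once and enumerate crossed-out readings only on demand (itertools.product here stands in for the real choice-space machinery):

  from itertools import product

  packed = {"sheep": ["sg", "pl"], "fish": ["sg", "pl"]}   # 2 + 2 facts stored

  def readings(packed):
      """Multiply alternatives out lazily, only when a consumer asks."""
      keys = list(packed)
      for combo in product(*(packed[k] for k in keys)):
          yield dict(zip(keys, combo))

  for r in readings(packed):
      print(f"The sheep-{r['sheep']} liked the fish-{r['fish']}.")   # 2 * 2 lines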
Embedded contexts:
             “The man said that Ed fired a boy.”
(subconcept say_ev19 Inform-CommunicationAct)
(role say_ev19 senderOfInfo man24)
(role say_ev19 infoTransferred comp21)

(subconcept fire_ev20 DischargeWithPrejudice)
(role fire_ev20 objectActedOn boy22)
(role fire_ev20 performedBy Ed23)

(context t)
(context comp21)
(context_lifting_rules averidical t comp21)

(instantiable man24 t)
(instantiable say_ev19 t)
(instantiable Ed23 t)

(instantiable boy22 comp21)
(instantiable fire_ev20 comp21)
Contexts for negation

"No congressman has gone to Iraq since the war."

• Contextual relations:
    (context t)
    (context not58)                                  <- context triggered by negation
    (context_lifting_rules antiveridical t not58)    <- interpretation of negation
• Instantiability claims:
    (instantiable go_ev57 not58)
    (uninstantiable go_ev57 t)                       <- entailment of negation
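A sketch of how the lifting relation licenses that entailment (an illustrative tuple encoding, not the actual rule syntax): veridical contexts copy instantiability up into the outer context; antiveridical contexts flip it.

  def lift(facts):
      """Derive outer-context (un)instantiability from lifting relations."""
      derived = set(facts)
      for f in facts:
          if f[0] == "context_lifting_rules":
              _, rel, outer, inner = f
              for g in facts:
                  if g[0] == "instantiable" and g[2] == inner:
                      if rel == "veridical":        # e.g. 'know that'
                          derived.add(("instantiable", g[1], outer))
                      elif rel == "antiveridical":  # e.g. negation
                          derived.add(("uninstantiable", g[1], outer))
      return derived

  facts = {("context_lifting_rules", "antiveridical", "t", "not58"),
           ("instantiable", "go_ev57", "not58")}
  print(("uninstantiable", "go_ev57", "t") in lift(facts))   # True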
Local Textual Inference

• Broadening and narrowing
    Ed went to Boston by bus → Ed went to Boston
    Ed didn't go to Boston   → Ed didn't go to Boston by bus
• Positive implicative
    Ed managed to go         → Ed went
    Ed didn't manage to go   → Ed didn't go
• Negative implicative
    Ed forgot to go          → Ed didn't go
    Ed didn't forget to go   → Ed went
• Factive
    Ed forgot that Bill went        → Bill went
    Ed didn't forget that Bill went → Bill went
Local Textual Inference

Matching of Abstract KR: inference by inclusion.
Ed went to Boston by bus. → Ed went to Boston.
Every clause of the hypothesis (right) appears in the passage (left); the passage just adds the bus:

  (context t)                              (context t)
  (instantiable Boston8 t)                 (instantiable Boston8 t)
  (instantiable Ed9 t)                     (instantiable Ed9 t)
  (instantiable bus7 t)
  (instantiable go_ev6 t)                  (instantiable go_ev6 t)
  (role go_ev6 objectMoving Ed9)           (role go_ev6 objectMoving Ed9)
  (role go_ev6 toLocation Boston8)         (role go_ev6 toLocation Boston8)
  (role go_ev6 vehicle bus7)
  (subconcept Boston8 CityOfBostonMass)    (subconcept Boston8 CityOfBostonMass)
  (subconcept Ed9 Person)                  (subconcept Ed9 Person)
  (subconcept bus7 Bus-RoadVehicle)
  (subconcept go_ev6                       (subconcept go_ev6
       Movement-TranslationEvent)               Movement-TranslationEvent)
  (temporalRel startsAfterEndingOf         (temporalRel startsAfterEndingOf
       Now go_ev6)                              Now go_ev6)
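Once both sides are canonicalized (and skolems aligned, taken as already done here), the inclusion check itself is just set containment; a minimal sketch:

  passage = {
      ("instantiable", "go_ev6", "t"),
      ("role", "go_ev6", "objectMoving", "Ed9"),
      ("role", "go_ev6", "toLocation", "Boston8"),
      ("role", "go_ev6", "vehicle", "bus7"),        # extra restriction in passage
      ("subconcept", "go_ev6", "Movement-TranslationEvent"),
  }
  hypothesis = {
      ("instantiable", "go_ev6", "t"),
      ("role", "go_ev6", "objectMoving", "Ed9"),
      ("role", "go_ev6", "toLocation", "Boston8"),
      ("subconcept", "go_ev6", "Movement-TranslationEvent"),
  }
  # The less restricted (broader) claim follows from the more restricted one.
  print(hypothesis <= passage)   # True: every hypothesis fact is in the passage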
NYT 2000: top of the chart

  be (un)able to          > 12,600
  fail to                 > 5,000
  know that               > 4,800
  acknowledge that        > 2,800
  be too ADJ to           > 2,400
  happen to               > 2,300
  manage to               > 2,300
  admit/concede that      > 2,000
  be ADJ enough to        > 1,500
  have a/the chance to    > 1,500
  go on/proceed to        > 1,000
  have time/money to      > 1,000
Verbs differ as to speaker commitments

"Bush realized that the US Army had to be transformed to meet new threats"
    commits the speaker to:
"The US Army had to be transformed to meet new threats"

"Bush said that Khan sold centrifuges to North Korea"
    does not commit the speaker to:
"Khan sold centrifuges to North Korea"
Implicative verbs carry commitments

• Commitment depends on the verb's context:
    positive (+) or negative (-)
• Speaker commits to a truth value for the complement:
    true (+) or false (-)
++/-- Implicative

positive context: positive commitment
negative context: negative commitment

"Ed managed to leave"          → "Ed left"
"Ed didn't manage to leave"    → "Ed didn't leave"
+-/-+ Implicative

positive context: negative commitment
negative context: positive commitment

"Ed forgot to leave"           → "Ed didn't leave"
"Ed didn't forget to leave"    → "Ed left"
++ Implicative

positive context only: positive commitment

"Ann forced Ed to leave"          → "Ed left"
"Ann didn't force Ed to leave"    → "Ed left / didn't leave"  (no commitment)
+- Implicative

positive context only: negative commitment

"Ed refused to leave"          → "Ed didn't leave"
"Ed didn't refuse to leave"    → "Ed left / didn't leave"  (no commitment)
-+ Implicative

negative context only: positive commitment

"Ed hesitated to leave"          → "Ed left / didn't leave"  (no commitment)
"Ed didn't hesitate to leave"    → "Ed left"
-- Implicative

negative context only: negative commitment

"Ed attempted to leave"          → "Ed left / didn't leave"  (no commitment)
"Ed didn't attempt to leave"     → "Ed didn't leave"
+Factive

positive commitment no matter what the context:

"Ann realized that Ed left"
"Ann didn't realize that Ed left"
                                 → "Ed left"

Like a ++/-+ implicative, but with additional commitments in questions and conditionals:

    Did Ann realize that Ed left?
        cf. Did Ed manage to leave?
-Factive

negative commitment no matter what the context:

"Ann pretended that Ed left"
"Ann didn't pretend that Ed left"
                                 → "Ed didn't leave"

Like a +-/-- implicative, but with additional commitments.
Coding of implicative verbs

• Implicative properties of complement-taking verbs
  – Not available in any external lexical resource
  – No obvious machine-learning strategy
• 1,250 embedded-clause verbs in the PARC lexicon
• 400 examined on a first pass
  – Considered in BNC frequency order
  – Google search when intuitions were unclear
• 1/3 are implicative, classified in the Unified Lexicon
Matching for implicative entailments

Term rewriting promotes commitments in the Abstract KR.

"Ed managed to leave.":
  context(cx_leave7)
  context(t)
  context_lifting_rules(veridical t cx_leave7)
  instantiable(Ed0 t)
  instantiable(leave_ev7 t)
  instantiable(leave_ev7 cx_leave7)
  instantiable(manage_ev3 t)
  role(objectMoving leave_ev7 Ed0)
  role(performedBy manage_ev3 Ed0)
  role(whatsSuccessful manage_ev3 cx_leave7)

"Ed left.":
  context(t)
  instantiable(Ed0 t)
  instantiable(leave_ev1 t)
  role(objectMoving leave_ev1 Ed0)
Embedded examples in real text

From Google:

  "Song, Seoul's point man, did not forget to persuade the North Koreans to make a 'strategic choice' of returning to the bargaining table..."

      → Song persuaded the North Koreans…
Promotion for a (simple) embedding

"Ed did not forget to force Dave to leave"

      → Dave left.
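One way to see the promotion at work is to compose the implicative signatures down the embedding chain; a sketch (the table follows the preceding slides, but the encoding is an assumption for illustration):

  # Signature: context polarity -> committed polarity of complement (None = none).
  SIGNATURES = {
      "manage": {"+": "+", "-": "-"},   # ++ / --
      "forget": {"+": "-", "-": "+"},   # +- / -+  ('forget to')
      "force":  {"+": "+", "-": None},  # ++ only
      "refuse": {"+": "-", "-": None},  # +- only
  }

  def commitment(chain, negated=False):
      """Compose polarities from the outermost verb inwards."""
      polarity = "-" if negated else "+"
      for verb in chain:
          polarity = SIGNATURES[verb][polarity]
          if polarity is None:
              return "no commitment"
      return polarity

  # "Ed did not forget to force Dave to leave"  =>  'Dave left' is committed
  print(commitment(["forget", "force"], negated=True))   # '+'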
Conclusions

• Broad-coverage, efficient language-to-KR system
• Need extensive resources (ontologies, argument mappings)
  – Much work to incorporate
• Use a layered architecture
  – Matching as well as reasoning
• KR mapping both increases and decreases ambiguity
Term-Rewriting for KR Mapping

• Rule form
    <Input terms> ==> <Output terms>    (obligatory)
    <Input terms> ?=> <Output terms>    (optional)
    (optional rules introduce new choices)

• Input patterns allow
  – Consume term if matched:             Term
  – Test on term without consumption:    +Term
  – Test that term is missing:           -Term
  – Procedural attachment:               {ProcedureCall}

• Ordered rule application
  – Rule1 applied in all possible ways to Input to produce Output1
  – Rule2 applied in all possible ways to Output1 to produce Output2
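A minimal sketch of one such rule in Python terms (the real transfer system works on packed representations and has richer pattern syntax): a bare Term is consumed, +Term is only tested, and the rule's output is added. This mirrors the first depassivisation rule on the next slide.

  def depassivise(facts):
      """SUBJ of a passive verb is rewritten to OBJ; PASSIVE is only tested."""
      out = set(facts)
      for fact in list(out):
          if fact[0] == "SUBJ" and ("PASSIVE", fact[1], "+") in out:
              out.discard(fact)                    # consume SUBJ(%V, %LogicalOBJ)
              out.add(("OBJ", fact[1], fact[2]))   # produce OBJ(%V, %LogicalOBJ)
      return out                                   # the +PASSIVE term is left intact

  facts = {("VTYPE", "v1", "main"), ("PASSIVE", "v1", "+"), ("SUBJ", "v1", "apple1")}
  print(sorted(depassivise(facts)))
  # [('OBJ', 'v1', 'apple1'), ('PASSIVE', 'v1', '+'), ('VTYPE', 'v1', 'main')]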
Example rules: depassivisation

Rule 1: the surface SUBJ of a passive verb is the logical OBJ.

  +VTYPE(%V, %%), +PASSIVE(%V,+),
  SUBJ(%V, %LogicalOBJ)
  ==>
  OBJ(%V, %LogicalOBJ).

Rule 2: the by-phrase (OBL-AG) supplies the logical SUBJ.

  +VTYPE(%V, %%), +PASSIVE(%V,+),
  OBL-AG(%V,%LogicalSUBJ), PFORM(%LogicalSUBJ, %%)
  ==>
  SUBJ(%V, %LogicalSUBJ).

Rule 3: with no by-phrase, introduce a null agent pronoun as SUBJ.

  +VTYPE(%V, %%), +PASSIVE(%V,+),
  -SUBJ(%V,%%)
  ==>
  SUBJ(%V, %AgtPro),
  PRED(%AgtPro, agent_pro), PRON-TYPE(%AgtPro, null_agent).
Term-rewriting can introduce ambiguity

• Alternative lexical mappings (permanent background facts):
    |- cyc_concept_map(bank, FinancialInstitution).
    |- cyc_concept_map(bank, SideOfRiver).

• Mapping rule from predicate to Cyc concept:
    ist(%%, %P(%Arg)),
    cyc_concept_map(%P, %Concept)
    ==> sub_concept(%Arg, %Concept).

• Input term:
    ist(ctx0, bank(bank##1))

• Alternative rule applications produce different outputs;
  the rewrite system represents this ambiguity by a new choice.

• Output:
    C1: sub_concept(bank##1, FinancialInstitution)
    C2: sub_concept(bank##1, SideOfRiver)
    (C1 xor C2) ↔ 1
Rules can prune ill-formed mappings

• The bank hired Ed. Rule for mapping hire:
    hire(%E), subj(%E, %Subj), obj(%E, %Obj),
    +sub_concept(%Subj, %CSubj), +sub_concept(%Obj, %CObj),
    {genls(%CSubj, Organization), genls(%CObj, Person)}
    ==> sub_concept(%E, EmployingEvent),
        performedBy(%E, %Subj), personEmployed(%E, %Obj).

• From Cyc:
    genls(FinancialInstitution, Organization)    true
    genls(SideOfRiver, Organization)             false

• If bank is mapped to SideOfRiver, the rule will not fire.
  This leads to a failure to consume the subject, and
    subj(%%,%%) ==> stop.
  prunes this analysis from the choice space.

• In general, later rewrites prune analyses that don't consume grammatical roles.
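A sketch of that pruning pattern (GENLS is a toy stand-in for the Cyc taxonomy query; the control flow is illustrative, not the rewrite system's):

  GENLS = {("FinancialInstitution", "Organization"): True,
           ("SideOfRiver", "Organization"): False}

  def map_hire(facts, subj_concept):
      """Fire the hire rule only if the genls test succeeds; then apply the
      catch-all  subj(%%,%%) ==> stop.  that kills leftover analyses."""
      if GENLS.get((subj_concept, "Organization"), False):
          facts = (facts - {("subj", "e1", "bank1")}) | \
                  {("sub_concept", "e1", "EmployingEvent")}
      if any(f[0] == "subj" for f in facts):   # unconsumed grammatical role
          return None                          # analysis pruned from choice space
      return facts

  base = frozenset({("hire", "e1"), ("subj", "e1", "bank1")})
  print(map_hire(base, "FinancialInstitution"))   # mapped analysis survives
  print(map_hire(base, "SideOfRiver"))            # None: pruned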
The NLP-KR Gap

• There have been parallel efforts on text-based and knowledge-based question answering
  – The benefits of knowledge-based approaches are inaccessible without some bridge from text to KR.

• Prime goal
  – Robust, broad-coverage mapping from texts and questions to KR
  – Allow KRR systems with lots of world knowledge to answer questions

• Second-thoughts goal
  – Robust, broad-coverage mapping from texts and questions to KR
  – Allow a system with basic knowledge of English to answer questions
Canonical conceptual semantics

"Cool the device." = "Lower the temperature of the device."

[Diagram: the two predicate trees, cool(addressee, device) and lower(addressee, temperature of device), map to the same canonical form:]

  decreaseCausally(lower-event, device, temperature)
Contenu connexe

Tendances

Introduction to Distributional Semantics
Introduction to Distributional SemanticsIntroduction to Distributional Semantics
Introduction to Distributional Semantics
Andre Freitas
 
A Constructive Mathematics approach for NL formal grammars
A Constructive Mathematics approach for NL formal grammarsA Constructive Mathematics approach for NL formal grammars
A Constructive Mathematics approach for NL formal grammars
Federico Gobbo
 
Utilising wordsmith and atlas to explore, analyse and report qualitative data
Utilising wordsmith and atlas to explore, analyse and report qualitative dataUtilising wordsmith and atlas to explore, analyse and report qualitative data
Utilising wordsmith and atlas to explore, analyse and report qualitative data
Merlien Institute
 
Solutions advanced sb
Solutions advanced sbSolutions advanced sb
Solutions advanced sb
maicanhtinh
 
Like Alice in Wonderland: Unraveling Reasoning and Cognition Using Analogies ...
Like Alice in Wonderland: Unraveling Reasoning and Cognition Using Analogies ...Like Alice in Wonderland: Unraveling Reasoning and Cognition Using Analogies ...
Like Alice in Wonderland: Unraveling Reasoning and Cognition Using Analogies ...
Facultad de Informática UCM
 
Text smilarity02 corpus_based
Text smilarity02 corpus_basedText smilarity02 corpus_based
Text smilarity02 corpus_based
cyan1d3
 
Bondec - A Sentence Boundary Detector
Bondec - A Sentence Boundary DetectorBondec - A Sentence Boundary Detector
Bondec - A Sentence Boundary Detector
butest
 
What is the linguistics POR VANNESA ROGEL
What is the linguistics POR VANNESA ROGELWhat is the linguistics POR VANNESA ROGEL
What is the linguistics POR VANNESA ROGEL
vanne2020
 
Towards a theory of semantic communication
Towards a theory of semantic communicationTowards a theory of semantic communication
Towards a theory of semantic communication
Jie Bao
 

Tendances (20)

Introduction to Distributional Semantics
Introduction to Distributional SemanticsIntroduction to Distributional Semantics
Introduction to Distributional Semantics
 
Lecture 2: From Semantics To Semantic-Oriented Applications
Lecture 2: From Semantics To Semantic-Oriented ApplicationsLecture 2: From Semantics To Semantic-Oriented Applications
Lecture 2: From Semantics To Semantic-Oriented Applications
 
Exempler approach
Exempler approachExempler approach
Exempler approach
 
A Constructive Mathematics approach for NL formal grammars
A Constructive Mathematics approach for NL formal grammarsA Constructive Mathematics approach for NL formal grammars
A Constructive Mathematics approach for NL formal grammars
 
A Distributional Semantics Approach for Selective Reasoning on Commonsense Gr...
A Distributional Semantics Approach for Selective Reasoning on Commonsense Gr...A Distributional Semantics Approach for Selective Reasoning on Commonsense Gr...
A Distributional Semantics Approach for Selective Reasoning on Commonsense Gr...
 
Distributional semantics
Distributional semanticsDistributional semantics
Distributional semantics
 
Utilising wordsmith and atlas to explore, analyse and report qualitative data
Utilising wordsmith and atlas to explore, analyse and report qualitative dataUtilising wordsmith and atlas to explore, analyse and report qualitative data
Utilising wordsmith and atlas to explore, analyse and report qualitative data
 
Ontology Dev
Ontology DevOntology Dev
Ontology Dev
 
RuleML2015 The Herbrand Manifesto - Thinking Inside the Box
RuleML2015 The Herbrand Manifesto - Thinking Inside the Box RuleML2015 The Herbrand Manifesto - Thinking Inside the Box
RuleML2015 The Herbrand Manifesto - Thinking Inside the Box
 
MORPHOLOGICAL SEGMENTATION WITH LSTM NEURAL NETWORKS FOR TIGRINYA
MORPHOLOGICAL SEGMENTATION WITH LSTM NEURAL NETWORKS FOR TIGRINYAMORPHOLOGICAL SEGMENTATION WITH LSTM NEURAL NETWORKS FOR TIGRINYA
MORPHOLOGICAL SEGMENTATION WITH LSTM NEURAL NETWORKS FOR TIGRINYA
 
Solutions advanced sb
Solutions advanced sbSolutions advanced sb
Solutions advanced sb
 
Like Alice in Wonderland: Unraveling Reasoning and Cognition Using Analogies ...
Like Alice in Wonderland: Unraveling Reasoning and Cognition Using Analogies ...Like Alice in Wonderland: Unraveling Reasoning and Cognition Using Analogies ...
Like Alice in Wonderland: Unraveling Reasoning and Cognition Using Analogies ...
 
3rd grade checklist #2
3rd grade checklist #23rd grade checklist #2
3rd grade checklist #2
 
Text smilarity02 corpus_based
Text smilarity02 corpus_basedText smilarity02 corpus_based
Text smilarity02 corpus_based
 
Bondec - A Sentence Boundary Detector
Bondec - A Sentence Boundary DetectorBondec - A Sentence Boundary Detector
Bondec - A Sentence Boundary Detector
 
What is the linguistics POR VANNESA ROGEL
What is the linguistics POR VANNESA ROGELWhat is the linguistics POR VANNESA ROGEL
What is the linguistics POR VANNESA ROGEL
 
Substitutability
SubstitutabilitySubstitutability
Substitutability
 
Which Rationality For Pragmatics6
Which Rationality For Pragmatics6Which Rationality For Pragmatics6
Which Rationality For Pragmatics6
 
Towards a theory of semantic communication
Towards a theory of semantic communicationTowards a theory of semantic communication
Towards a theory of semantic communication
 
Fourth Grade Checklist
Fourth Grade ChecklistFourth Grade Checklist
Fourth Grade Checklist
 

Similaire à AbstractKR on Pargram 2006

5 Lessons Learned from Designing Neural Models for Information Retrieval
5 Lessons Learned from Designing Neural Models for Information Retrieval5 Lessons Learned from Designing Neural Models for Information Retrieval
5 Lessons Learned from Designing Neural Models for Information Retrieval
Bhaskar Mitra
 

Similaire à AbstractKR on Pargram 2006 (20)

Knowledge Extraction
Knowledge ExtractionKnowledge Extraction
Knowledge Extraction
 
5a use of annotated corpus
5a use of annotated corpus5a use of annotated corpus
5a use of annotated corpus
 
Incrementality
IncrementalityIncrementality
Incrementality
 
Lfg and gpsg
Lfg and gpsgLfg and gpsg
Lfg and gpsg
 
LFG and GPSG.pptx
LFG and GPSG.pptxLFG and GPSG.pptx
LFG and GPSG.pptx
 
Collin F. Baker - 2017 - Graph Methods for Multilingual FrameNets
Collin F. Baker - 2017 - Graph Methods for Multilingual FrameNetsCollin F. Baker - 2017 - Graph Methods for Multilingual FrameNets
Collin F. Baker - 2017 - Graph Methods for Multilingual FrameNets
 
Using topic modelling frameworks for NLP and semantic search
Using topic modelling frameworks for NLP and semantic searchUsing topic modelling frameworks for NLP and semantic search
Using topic modelling frameworks for NLP and semantic search
 
Navejar english 09_curriculum_map_semester_1
Navejar english 09_curriculum_map_semester_1Navejar english 09_curriculum_map_semester_1
Navejar english 09_curriculum_map_semester_1
 
Wide Coverage Semantic Representations from a CCG Parser
Wide Coverage Semantic Representations from a CCG ParserWide Coverage Semantic Representations from a CCG Parser
Wide Coverage Semantic Representations from a CCG Parser
 
The Semantic Web #7 - RDF Semantics
The Semantic Web #7 - RDF SemanticsThe Semantic Web #7 - RDF Semantics
The Semantic Web #7 - RDF Semantics
 
5 Lessons Learned from Designing Neural Models for Information Retrieval
5 Lessons Learned from Designing Neural Models for Information Retrieval5 Lessons Learned from Designing Neural Models for Information Retrieval
5 Lessons Learned from Designing Neural Models for Information Retrieval
 
nlp (1).pptx
nlp (1).pptxnlp (1).pptx
nlp (1).pptx
 
P-6
P-6P-6
P-6
 
P-6
P-6P-6
P-6
 
Performance Grammar
Performance GrammarPerformance Grammar
Performance Grammar
 
Challenges in transfer learning in nlp
Challenges in transfer learning in nlpChallenges in transfer learning in nlp
Challenges in transfer learning in nlp
 
OpenWN-PT: a Brazilian Wordnet for all
OpenWN-PT: a Brazilian Wordnet for allOpenWN-PT: a Brazilian Wordnet for all
OpenWN-PT: a Brazilian Wordnet for all
 
Lean Logic for Lean Times: Entailment and Contradiction Revisited
Lean Logic for Lean Times: Entailment and Contradiction RevisitedLean Logic for Lean Times: Entailment and Contradiction Revisited
Lean Logic for Lean Times: Entailment and Contradiction Revisited
 
I5 geetha4 suraiya
I5 geetha4 suraiyaI5 geetha4 suraiya
I5 geetha4 suraiya
 
Lean Logic for Lean Times: Varieties of Natural Logic
Lean Logic for Lean Times: Varieties of Natural LogicLean Logic for Lean Times: Varieties of Natural Logic
Lean Logic for Lean Times: Varieties of Natural Logic
 

Plus de Valeria de Paiva

Plus de Valeria de Paiva (20)

Dialectica Comonoids
Dialectica ComonoidsDialectica Comonoids
Dialectica Comonoids
 
Dialectica Categorical Constructions
Dialectica Categorical ConstructionsDialectica Categorical Constructions
Dialectica Categorical Constructions
 
Logic & Representation 2021
Logic & Representation 2021Logic & Representation 2021
Logic & Representation 2021
 
Constructive Modal and Linear Logics
Constructive Modal and Linear LogicsConstructive Modal and Linear Logics
Constructive Modal and Linear Logics
 
Dialectica Categories Revisited
Dialectica Categories RevisitedDialectica Categories Revisited
Dialectica Categories Revisited
 
PLN para Tod@s
PLN para Tod@sPLN para Tod@s
PLN para Tod@s
 
Networked Mathematics: NLP tools for Better Science
Networked Mathematics: NLP tools for Better ScienceNetworked Mathematics: NLP tools for Better Science
Networked Mathematics: NLP tools for Better Science
 
Going Without: a modality and its role
Going Without: a modality and its roleGoing Without: a modality and its role
Going Without: a modality and its role
 
Problemas de Kolmogorov-Veloso
Problemas de Kolmogorov-VelosoProblemas de Kolmogorov-Veloso
Problemas de Kolmogorov-Veloso
 
Natural Language Inference: for Humans and Machines
Natural Language Inference: for Humans and MachinesNatural Language Inference: for Humans and Machines
Natural Language Inference: for Humans and Machines
 
Dialectica Petri Nets
Dialectica Petri NetsDialectica Petri Nets
Dialectica Petri Nets
 
The importance of Being Erneast: Open datasets in Portuguese
The importance of Being Erneast: Open datasets in PortugueseThe importance of Being Erneast: Open datasets in Portuguese
The importance of Being Erneast: Open datasets in Portuguese
 
Negation in the Ecumenical System
Negation in the Ecumenical SystemNegation in the Ecumenical System
Negation in the Ecumenical System
 
Constructive Modal and Linear Logics
Constructive Modal and Linear LogicsConstructive Modal and Linear Logics
Constructive Modal and Linear Logics
 
Semantics and Reasoning for NLP, AI and ACT
Semantics and Reasoning for NLP, AI and ACTSemantics and Reasoning for NLP, AI and ACT
Semantics and Reasoning for NLP, AI and ACT
 
NLCS 2013 opening slides
NLCS 2013 opening slidesNLCS 2013 opening slides
NLCS 2013 opening slides
 
Dialectica Comonads
Dialectica ComonadsDialectica Comonads
Dialectica Comonads
 
Categorical Explicit Substitutions
Categorical Explicit SubstitutionsCategorical Explicit Substitutions
Categorical Explicit Substitutions
 
Logic and Probabilistic Methods for Dialog
Logic and Probabilistic Methods for DialogLogic and Probabilistic Methods for Dialog
Logic and Probabilistic Methods for Dialog
 
Intuitive Semantics for Full Intuitionistic Linear Logic (2014)
Intuitive Semantics for Full Intuitionistic Linear Logic (2014)Intuitive Semantics for Full Intuitionistic Linear Logic (2014)
Intuitive Semantics for Full Intuitionistic Linear Logic (2014)
 

Dernier

IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI Solutions
Enterprise Knowledge
 
CNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of ServiceCNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of Service
giselly40
 

Dernier (20)

TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day Presentation
 
A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?
 
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
 
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
 
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI Solutions
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?
 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
CNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of ServiceCNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of Service
 
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf
 
Automating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps ScriptAutomating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps Script
 
🐬 The future of MySQL is Postgres 🐘
🐬  The future of MySQL is Postgres   🐘🐬  The future of MySQL is Postgres   🐘
🐬 The future of MySQL is Postgres 🐘
 

AbstractKR on Pargram 2006

  • 1. Abstract KR: Using WordNet and VerbNet for Question Answering Valeria de Paiva speaking for the PARC Aquaint team Dick Crouch, Tracy King, Danny Bobrow, Cleo Condoravdi, Ron Kaplan, Annie Zaenen, Lauri Karttunen
  • 2. Outline  Motivation: why use deep linguistic analysis for question answering  Basic architecture and example structures  Abstract Knowledge Representation  The experiment: WordNet/VerbNet  Discussion
  • 3. Motivation  Knowledge-based question answering – Deep representations allow high precision and recall, but – Typically on restricted domains – Hard for users to pose KR questions and interpret KR answers – very hard for system to build up knowledge  Shallow, open-domain question answering – Broad-coverage – Lower precision and recall – Easy to pose questions but sensitive to question form  Question answering at PARC – Layered mapping from language to deeper semantic representations – Broad-coverage: Matching for answers as light reasoning – Expresses KRR answers in real English -- eventually
  • 4. 2-way bridge between language & KR XLE/LFG Pa Target Text r sing KR Mapping KRR F-structure Semantics Abstract KR Sources Assertions Match M M Inference Question Query Composed Answers Text XLE/LFG Text Generation F-Structure Explanations to user Templates Subqueries
  • 5. 2-way bridge between language & KR XLE/LFG Pa Target Text r sing KR Mapping KRR F-structure Semantics Abstract KR Sources Assertions Match M M Inference Question Query Infrastructure Composed Resources Theories XLE/LFG XLE Text Functional Grammar F-Structure English Grammar Lexical Generation Term rewriting Templates KR mapping rules Ambiguity management MaxEnt models Unified Lexicon
  • 6. Key Process: Canonicalize representations  F-structures are semantic representations to some – e.g. logic forms (Rus/Moldovan’01)  Transformed into (flat and contexted) semantic structures, inspired by Glue, born out of the need to ‘pack’ semantics ( talk to Dick about it..)  Transformed into (flat and contexted) AKR structures  Want to discuss semantics to AKR  but, what does canonicalization buy you?
  • 7. Canonicalization helps matching  Argument structure: – Mary bought an apple./An apple was bought by Mary.  Synonyms and hypernyms: – Mary bought/purchased/acquired an apple.  Factivity and contexts: – Mary bought an apple./Mary did not buy an apple. – Mary managed/failed to buy an apple. – Ed prevented Mary from buying an apple. – We know/said/believe that Mary bought an apple. – Mary didn’t wait to buy an apple. (talk to Lauri..)
  • 8. Canonicalization helps matching  basic argument structure Did Mary buy an apple? “Mary bought an apple.” Yes “An apple was bought by Mary.” Yes Semantics gives us: in_context(t, cardinality('Mary':9,sg)), in_context(t, cardinality(apple:1,sg)), in_context(t, specifier(apple:1,a)), in_context(t, tense(buy:3,past)), in_context(t, buy(buy:3,'Mary':9,apple:1)), in_context(t, proper_name('Mary':9,name,'Mary'))
  • 9. Canonicalization helps matching  basic synonyms/hypernyms Did Mary buy an apple? “Mary bought an apple.” Yes “Mary purchased an apple.” Yes Using Entailment detection: “Mary bought an apple.” IMPLIES “Mary purchased an apple.” ? System Answer is YES it aligns skolems in the appropriate way: [purchase##1-buy##1,Mary##0-Mary##0,apple##3-apple##3,t- t]
  • 10. Canonicalization helps matching  basic context structure Negation dealt with by context. Does “Mary did not buy an apple.” imply “Mary bought an apple” ? No System Answer is NO: Skolems are aligned, respecting contexts [buy##1-buy##4,Ed##0-Ed##0,apple##3-apple##6,t-t] we get a conflict: conflict(uninstantiable(passage,buy##4,t), instantiable(question,buy##4,t)))
  • 11. Canonicalization helps matching  For negation you may think contexts are an overkill, but for implicative context structure: Does “Mary failed to buy an apple.” imply “Mary bought an apple” ? System Answer is NO: under the right skolem alignment [buy##1-buy##7,Mary##0-Mary##0,apple##3-apple##9,t-t] That is entailment detection knows about contexts
  • 12. Overcoming language/KR misalignments: A principled approach  Language – Generalizations come from the structure of the language – Representations compositionally derived from sentence structure  Knowledge representation and reasoning – Generalizations come from the structure of the world – Representations to support reasoning  Layered architecture helps with different constraints  But boundaries are not fixed (like beads in a string?)
  • 13. This talk: Semantics To AKR only… XLE/LFG Pa Target Text r sing KR Mapping KRR F-structure Semantics Abstract KR Sources Assertions Match M M Inference Question Query Infrastructure Composed Resources Theories XLE/LFG XLE Text Functional Grammar F-Structure English Grammar Lexical Generation Term rewriting Templates KR mapping rules Ambiguity management MaxEnt models Unified Lexicon
  • 14. Ambiguity-enabled Text Analysis Pipeline The official version Text LFG Grammar C-structure Morphology XLE/LFG Lexicon parsing Selection F-structure Linguistic VerbNet Semantic semantics lexicon Stochastic Unified WordNet Lexicon Term Rewriting Cyc (transfer) Abstract Mapping Mappings KR Rules Target KR (e.g. CycL, KM…)
  • 15. Ambiguity-enabled Text Analysis pipeline Text Early 2005 Stochastic Selection LFG Grammar C-structure Morphology XLE/LFG Lexicon parsing F-structure Linguistic VerbNet Semantic semantics lexicon Unified WordNet Lexicon Term Rewriting (transfer) Abstract Mapping Cyc KR Rules Mappings Target KR (e.g. CycL, KM…)
  • 16. Example structures String: Mary did not laugh. Syntax: Functional/dependency structure
  • 17. Example structures cont. Semantics in_context(t,not(ctx(laugh:2))), in_context(ctx(laugh:2),cardinality('Mary':0,sg)), in_context(ctx(laugh:2),laugh(laugh:2,'Mary':0)), in_context(ctx(laugh:2),tense(laugh:2,past)), in_context(ctx(laugh:2),proper_name('Mary':0,name,'Mary')) Abstract Knowledge Representation context(t), context('cx_laugh##4'), instantiable('Mary##0','cx_laugh##4'), instantiable('Mary##0',t), instantiable('laugh_ev##4','cx_laugh##4'), uninstantiable('laugh_ev##4',t) role(cardinality_restriction,'Mary##0',sg), role(bodilyDoer,'laugh_ev##4','Mary##0'), subconcept('Mary##0','Person'), subconcept('laugh_ev##4','Laughing'), temporalRel(startsAfterEndingOf,'Now','laugh_ev##4'),
  • 18. Resources in a Unified Lexicon  Mapping to abstract KR requires – Coverage of most words and constructions – Detailed lexical entailments  Results of merging XLE, VerbNet, Cyc, WordNet – 9,835 different verb stems – 46,000 verb entries (indexed by subcat/sense) – 24,000 with VerbNet information – 2,800 with Cyc information  Triangulation to extend mappings to Cyc concepts – 21,000 Verbs with Cyc data – 50,000 noun entries, including N-N compounds – 2,000 adjectives & adverbs (Quality of data not evaluated)
• 19. Abstract KR: “Ed fired the boy.”
F-structure: PRED fire<Ed, boy>, TENSE past, SUBJ [PRED Ed], OBJ [PRED boy, DEF +]
Conceptual:
(subconcept Ed3 Person)
(subconcept boy2 MaleChild)
(subconcept fire_ev1 DischargeWithPrejudice)
(role fire_ev1 performedBy Ed3)
(role fire_ev1 objectActedOn boy2)
Contextual:
(context t)
(instantiable Ed3 t)
(instantiable boy2 t)
(instantiable fire_ev1 t)
Temporal:
(temporalRel startsAfterEndingOf Now fire_ev1)
• 20. Abstract KR: “Ed fired the gun.”
F-structure: PRED fire<Ed, gun>, TENSE past, SUBJ [PRED Ed], OBJ [PRED gun, DEF +]
Conceptual:
(subconcept Ed0 Person)
(subconcept gun3 Gun)
(subconcept fire_ev1 ShootingAGun)
(role fire_ev1 performedBy Ed0)
(role fire_ev1 deviceUsed gun3)
Contextual:
(context t)
(instantiable Ed0 t)
(instantiable gun3 t)
(instantiable fire_ev1 t)
Temporal:
(temporalRel startsAfterEndingOf Now fire_ev1)
• 21. Abstract Knowledge Representation
 Encodes different aspects of meaning
– Asserted content
» concepts and arguments, relations among objects
– Contexts
» author commitment, belief, report, denial, prevent, …
– Temporal relations
» qualitative relations among time intervals and events
 Translates to various target KRs, e.g. CycL, Knowledge Machine, Description Logic, AnsProlog
 Captures meaning ambiguity
– Mapping to KR can both introduce and reduce ambiguity
– Need to handle ambiguity efficiently
 A Basic Logic for Textual Inference (Bobrow et al., July 05)
• 22. Cyc-oriented version of AKR
 Cyc concepts: Person, Laughing, DischargeWithPrejudice, etc.
 Cyc roles: bodilyDoer, performedBy, infoTransferred, etc.
 WordNet/VerbNet as fallback mechanisms
 Sortal restrictions from Cyc disambiguate, e.g., “Ed fired the boy” / “Ed fired the gun”
 Limitation: the raggedness of Cyc. How do we know? Can we do better?
• 23. Raggedness?
 It’s clear that Cyc misses many senses of words, particularly more abstract ones.
 “Ed admitted the mistake” gives only:
(A1, subconcept('admit_ev##1','AdmitToMembership')),
(A2, subconcept('admit_ev##1','PermittingEntrance'))
 Nothing about “Ed admitted that [Mary had arrived]”.
 Also nothing about “hesitate”, “surprise”, “please”.
 Nothing about “forget” (compare: Ed forgot that Mary left → Mary left, but Ed forgot to leave → Ed did not leave)
• 24. Cyc KR: ambiguating too much?
“The school bought the store” (3 senses of school, 2 of store, 3 of buy in the UL)
choice([A1, A2], 1),
choice([B1, B2], A1),
choice([C1, C2], A2),
choice([D1], or(B2,C2)),
choice([E1, E2], B1),
choice([F1, F2], C1),
choice([G1], or(E1,F2)),
choice([H1], or(E2,F1)),
choice([I1], or(G1,B2,C2)), …
cf(or(B2,or(C2,E1,F2)), role(buyer,'buy_ev##2','school##1')),
cf(or(B2,or(C2,E1,F2)), role(objectPaidFor,'buy_ev##2','store##4')), …
cf(or(B2,or(C2,E1,F2)), subconcept('buy_ev##2','Buying')),
cf(or(E2,F1), subconcept('buy_ev##2',buy)),
cf(D1, subconcept('school##1','SchoolInstitution')),
cf(G1, subconcept('school##1','SchoolInstitution-KThrough12')),
cf(H1, subconcept('school##1','GroupFn'('Fish'))),
cf(A2, subconcept('store##4','RetailStore')),
cf(A1, subconcept('store##4','RetailStoreSpace'))
• 25. The Experiment
[Pipeline diagram as before: text → XLE/LFG parsing → linguistic semantics → term rewriting → Abstract KR → Target KR (e.g. CycL), with the Unified Lexicon (VerbNet/WordNet) feeding the KR mapping rules and the Cyc mappings de-emphasized]
• 26. Experiment: UL-oriented version of AKR
 Cyc is only one of the lexical resources
 VerbNet/WordNet “concepts” are synsets
 Roles: the very coarse ones from VerbNet
 Goal: roles less refined than Cyc’s (800+) but more refined than VerbNet’s (<20)
 We have the infrastructure to experiment: can we do knowledge representation with VerbNet/WordNet?
• 27. Example: “Mary didn’t laugh” again
Semantics:
in_context(t,not(ctx(laugh:2))),
in_context(ctx(laugh:2),cardinality('Mary':0,sg)),
in_context(ctx(laugh:2),laugh(laugh:2,'Mary':0)),
in_context(ctx(laugh:2),tense(laugh:2,past)),
in_context(ctx(laugh:2),proper_name('Mary':0,name,'Mary'))
Abstract Knowledge Representation:
context(t),
context('cx_laugh##4'),
instantiable('Mary##0','cx_laugh##4'),
instantiable('Mary##0',t),
instantiable('laugh_ev##4','cx_laugh##4'),
uninstantiable('laugh_ev##4',t),
role(cardinality_restriction,'Mary##0',sg),
role('Agent','laugh##2','Mary##0'),
subconcept('Mary##0',[[7626,4576,4359,3122,7127,1930,1740]]),
subconcept('laugh##2',[[31311]]),
temporalRel(startsAfterEndingOf,'Now','laugh_ev##4')
• 28. Example: “The school bought the store”, UL KR
choice([A1, A2], 1)
Packed clauses:
cf(1, context(t)),
cf(1, instantiable('buy##2',t)),
cf(1, instantiable('school##1',t)),
cf(1, instantiable('store##4',t)),
cf(A1, role('Agent','buy##2','school##1')),
cf(A2, role('Asset','buy##2','school##1')),
cf(1, role('Theme','buy##2','store##4')),
cf(1, subconcept('buy##2',[[2186766]])),
cf(1, subconcept('school##1',
  [[8162936,8162558,7943952,7899136,7842951,29714,2236,2119,1740],
   [4099321,2884748,4290445,20846,3991,3122,1930,1740],
   [5686799,5682845,5681825,5631]])),
cf(1, subconcept('store##4',
  [[4153847,3706735,3908195,3262760,4290445,20846,3991,3122,1930,1740],
   [13194813,13194436,13087849,13084632,13084472,13084292,131597020]])),
cf(1, temporalRel(startsAfterEndingOf,'Now','buy##2'))
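Assuming the bracketed number lists are WordNet synset offsets along hypernym chains (which is what the earlier "concepts are synsets" slide suggests), chains of this shape can be reproduced with NLTK's WordNet interface. Offsets differ across WordNet versions, so exact agreement with the slide is not expected.

# requires: pip install nltk; nltk.download('wordnet')
from nltk.corpus import wordnet as wn

def subconcept_chains(lemma, pos):
    """One offset chain (sense -> ... -> root hypernym) per sense/path,
    mirroring the slide's subconcept(..., [[...],[...]]) lists."""
    chains = []
    for synset in wn.synsets(lemma, pos=pos):
        for path in synset.hypernym_paths():        # root -> ... -> sense
            chains.append([s.offset() for s in reversed(path)])
    return chains

print(subconcept_chains('buy', wn.VERB)[:1])
print(subconcept_chains('store', wn.NOUN)[:2])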
• 29. First thoughts
 Surprisingly easy to change the base ontology
 The modular architecture does help
 Do we have results as good as before? Well, no…
• 30. Improvements can be made…
 Some WordNet senses of buying should only go with A1, some with A2
 Sortal restrictions/preferences should come in
 But ambiguity is managed without losing meanings
• 31. Discussion
 Experiment ongoing; not a “serious” report yet, just gut feelings
 UL marking of factives/implicatives/etc. is very useful; (much) more needs to be done
 Matching can do more inference than first thought
 Ambiguity-enabling technology on concepts means no need to cluster WN senses?
 Sortal restrictions/preferences a must?
• 33. Mary fired the boy.
cf(1, context(t)),
cf(1, instantiable('Mary##0',t)),
cf(1, instantiable('boy##3',t)),
cf(1, instantiable('fire##1',t)),
cf(1, role('Agent','fire##1','Mary##0')),
cf(1, role('Theme','fire##1','boy##3')),
cf(1, subconcept('Mary##0',[[7626,4576,4359,3122,7127,1930,1740]])),
cf(1, subconcept('boy##3',
  [[10131706,9487097,7626,4576,4359,3122,7127,1930,1740],
   [9725282,10133569,9487097,7626,4576,4359,31]])),
cf(A1, subconcept('fire##1',[[1124984],[1123061],[1123474]])),
cf(A2, subconcept('fire##1',[[2379472]])),
cf(1, temporalRel(startsAfterEndingOf,'Now','fire##1'))
• 34. Layered mapping
[Pipeline: text → XLE/LFG parsing (C-structure, F-structure) → linguistic semantics → term rewriting → Abstract KR → Target KRR]
Linguistic semantics, e.g.:
in_context(t,arrive(arrive:2,boy:1)),
in_context(t,cardinality(boy:1,sg)),
in_context(t,specifier(boy:1,a)),
in_context(t,tense(arrive:2,past))
Introduces contexts, cardinality restrictions, etc.
• 35. Layered mapping: “Ed knows that Mary arrived.”
cf(1, context('cx_arrive##6')),
cf(1, context(t)),
cf(1, context_head('cx_arrive##6','arrive##6')),
cf(1, context_head(t,'know##1')),
cf(1, context_lifting_relation(veridical,t,'cx_arrive##6')),
cf(1, instantiable('Ed##0',t)),
cf(1, instantiable('Mary##4','cx_arrive##6')),
cf(1, instantiable('Mary##4',t)),
cf(1, instantiable('arrive##6','cx_arrive##6')),
cf(1, instantiable('arrive##6',t)),
cf(1, instantiable('know##1',t)),
cf(1, role('Agent','know##1','Ed##0')),
cf(1, role('Theme','arrive##6','Mary##4')),
cf(1, role('Theme','know##1','cx_arrive##6')),
cf(1, subconcept('Ed##0',[[7626,4576,4359,3122,7127,1930,1740]])),
cf(1, subconcept('Mary##4',[[7626,4576,4359,3122,7127,1930,1740]])),
cf(1, subconcept('arrive##6',[[1987643]])),
cf(1, subconcept('know##1',[[585692],[587146],[587430],[588050]])),
cf(1, temporalRel(startsAfterEndingOf,'Now','arrive##6')),
cf(1, temporalRel(temporallyContains,'know##1','Now'))
• 36. Ambiguity is rampant in language
Alternatives multiply within and across layers…
[Diagram: C-structure → F-structure → linguistic semantics → Abstract KR → KRR, with alternatives fanning out at each layer]
• 37. What not to do
 Use heuristics to prune as soon as possible
 Oops: strong constraints may reject the so-far-best (= only) option
[Diagram: pruning at each layer (statistics, C-structure, F-structure, linguistic semantics, Abstract KR, KRR), with the surviving option eliminated downstream]
Fast computation, wrong result
• 38. Manage ambiguity instead
“The sheep liked the fish.” How many sheep? How many fish?
Options multiplied out:
The sheep-sg liked the fish-sg.
The sheep-pl liked the fish-sg.
The sheep-sg liked the fish-pl.
The sheep-pl liked the fish-pl.
Options packed:
The sheep-{sg|pl} liked the fish-{sg|pl}
Packed representation:
– Encodes all dependencies without loss of information
– Common items represented, computed once
– Key to practical efficiency with broad-coverage grammars
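A toy sketch of the idea, with a hypothetical data layout rather than the XLE chart structures: the packed analysis stores each local choice once, and readings are multiplied out only on demand.

from itertools import product

# Packed analysis of “The sheep liked the fish”: the shared structure
# is represented once; only number is ambiguous, and independently so.
packed = {
    'pred': 'like',
    'subj': {'pred': 'sheep', 'num': ('sg', 'pl')},   # local 2-way choice
    'obj':  {'pred': 'fish',  'num': ('sg', 'pl')},   # independent choice
}

def unpack(analysis):
    """Multiply out the packed choices: 2 x 2 = 4 readings here, while
    the packed form stays linear in the number of choices."""
    for subj_num, obj_num in product(analysis['subj']['num'],
                                     analysis['obj']['num']):
        yield (f"The {analysis['subj']['pred']}-{subj_num} "
               f"{analysis['pred']}d the {analysis['obj']['pred']}-{obj_num}")

for reading in unpack(packed):
    print(reading)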
• 39. Embedded contexts: “The man said that Ed fired a boy.”
(subconcept say_ev19 Inform-CommunicationAct)
(role say_ev19 senderOfInfo man24)
(role say_ev19 infoTransferred comp21)
(subconcept fire_ev20 DischargeWithPrejudice)
(role fire_ev20 objectActedOn boy22)
(role fire_ev20 performedBy Ed23)
(context t)
(context comp21)
(context_lifting_rules averidical t comp21)
(instantiable man24 t)
(instantiable say_ev19 t)
(instantiable Ed23 t)
(instantiable boy22 comp21)
(instantiable fire_ev20 comp21)
• 40. Contexts for negation: “No congressman has gone to Iraq since the war.”
 Contextual relations:
(context t)
(context not58)                               ← context triggered by negation
(context_lifting_rules antiveridical t not58) ← interpretation of negation
 Instantiability claims:
(instantiable go_ev57 not58)
(uninstantiable go_ev57 t)                    ← entailment of negation
• 41. Local Textual Inference
 Broadening and narrowing
Ed went to Boston by bus → Ed went to Boston
Ed didn’t go to Boston → Ed didn’t go to Boston by bus
 Positive implicative
Ed managed to go → Ed went
Ed didn’t manage to go → Ed didn’t go
 Negative implicative
Ed forgot to go → Ed didn’t go
Ed didn’t forget to go → Ed went
 Factive
Ed forgot that Bill went → Bill went
Ed didn’t forget that Bill went → Bill went
• 42. Local Textual Inference: matching of Abstract KR, inference by inclusion
Ed went to Boston by bus. → Ed went to Boston.
Premise AKR:
(context t)
(instantiable Boston8 t)
(instantiable Ed9 t)
(instantiable bus7 t)
(instantiable go_ev6 t)
(role go_ev6 objectMoving Ed9)
(role go_ev6 toLocation Boston8)
(role go_ev6 vehicle bus7)
(subconcept Boston8 CityOfBostonMass)
(subconcept Ed9 Person)
(subconcept bus7 Bus-RoadVehicle)
(subconcept go_ev6 Movement-TranslationEvent)
(temporalRel startsAfterEndingOf Now go_ev6)
Conclusion AKR: the same facts minus those mentioning bus7; every conclusion fact is included in the premise, so the entailment goes through.
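The inclusion check itself is simple once AKR facts are flat terms. A minimal sketch, assuming facts are represented as tuples and skolem alignment has already been done:

def entailed_by_inclusion(premise_facts, conclusion_facts):
    """Entailment by inclusion: every fact the conclusion asserts
    must already be among the premise's facts."""
    missing = set(conclusion_facts) - set(premise_facts)
    return not missing, missing

premise = {('role', 'go_ev6', 'objectMoving', 'Ed9'),
           ('role', 'go_ev6', 'toLocation', 'Boston8'),
           ('role', 'go_ev6', 'vehicle', 'bus7'),
           ('instantiable', 'go_ev6', 't')}
conclusion = premise - {('role', 'go_ev6', 'vehicle', 'bus7')}

print(entailed_by_inclusion(premise, conclusion))  # (True, set()): broadening
print(entailed_by_inclusion(conclusion, premise))  # (False, {vehicle fact})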
• 43. NYT 2000: top of the chart
be (un)able to        > 12600
fail to               > 5000
know that             > 4800
acknowledge that      > 2800
be too ADJ to         > 2400
happen to             > 2300
manage to             > 2300
admit/concede that    > 2000
be ADJ enough to      > 1500
have a/the chance to  > 1500
go on/proceed to      > 1000
have time/money to    > 1000
• 44. Verbs differ as to speaker commitments
“Bush realized that the US Army had to be transformed to meet new threats”
→ speaker committed to: “The US Army had to be transformed to meet new threats”
“Bush said that Khan sold centrifuges to North Korea”
→ no speaker commitment to: “Khan sold centrifuges to North Korea”
• 45. Implicative verbs carry commitments
 Commitment depends on the verb’s context: positive (+) or negative (-)
 Speaker commits to a truth-value for the complement: true (+) or false (-)
• 46. ++/-- implicative
Positive context: positive commitment; negative context: negative commitment
“Ed managed to leave” → “Ed left”
“Ed didn’t manage to leave” → “Ed didn’t leave”
• 47. +-/-+ implicative
Positive context: negative commitment; negative context: positive commitment
“Ed forgot to leave” → “Ed didn’t leave”
“Ed didn’t forget to leave” → “Ed left”
• 48. ++ implicative
Positive context only: positive commitment
“Ann forced Ed to leave” → “Ed left”
“Ann didn’t force Ed to leave” → “Ed left / didn’t leave” (no commitment)
• 49. +- implicative
Positive context only: negative commitment
“Ed refused to leave” → “Ed didn’t leave”
“Ed didn’t refuse to leave” → “Ed left / didn’t leave” (no commitment)
• 50. -+ implicative
Negative context only: positive commitment
“Ed hesitated to leave” → “Ed left / didn’t leave” (no commitment)
“Ed didn’t hesitate to leave” → “Ed left”
• 51. -- implicative
Negative context only: negative commitment
“Ed attempted to leave” → “Ed left / didn’t leave” (no commitment)
“Ed didn’t attempt to leave” → “Ed didn’t leave”
• 52. +Factive
Positive commitment no matter what the context:
“Ann realized that Ed left” → “Ed left”
“Ann didn’t realize that Ed left” → “Ed left”
Like a ++/-+ implicative, but with additional commitments in questions and conditionals:
Did Ann realize that Ed left? (cf. Did Ed manage to leave?)
• 53. -Factive
Negative commitment no matter what the context:
“Ann pretended that Ed left” → “Ed didn’t leave”
“Ann didn’t pretend that Ed left” → “Ed didn’t leave”
Like a +-/-- implicative, but with additional commitments
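The classes on the preceding slides amount to a small lookup table from context polarity to commitment, composed down a chain of embeddings. A sketch: the table entries are transcribed from the slides; the composition rule (thread the polarity through each embedding verb in turn) is the standard one, shown here as an illustration rather than the PARC rule set.

# signature: context polarity ('+' or '-') -> speaker commitment
# '+' = complement true, '-' = complement false, None = no commitment
SIGNATURES = {
    'manage':    {'+': '+', '-': '-'},     # ++/-- implicative
    'forget_to': {'+': '-', '-': '+'},     # +-/-+ implicative
    'force':     {'+': '+', '-': None},    # ++ implicative
    'refuse':    {'+': '-', '-': None},    # +- implicative
    'hesitate':  {'+': None, '-': '+'},    # -+ implicative
    'attempt':   {'+': None, '-': '-'},    # -- implicative
    'realize':   {'+': '+', '-': '+'},     # +factive
    'pretend':   {'+': '-', '-': '-'},     # -factive
}

def commitment(verbs, negated=False):
    """Propagate the speaker's commitment through a chain of
    embedding verbs, e.g. ['forget_to', 'force'] under negation for
    “Ed did not forget to force Dave to leave”."""
    polarity = '-' if negated else '+'
    for verb in verbs:
        polarity = SIGNATURES[verb][polarity]
        if polarity is None:
            return None                    # commitment lost at this layer
    return polarity

print(commitment(['manage']))                             # '+': Ed left
print(commitment(['forget_to'], negated=True))            # '+': Ed left
print(commitment(['forget_to', 'force'], negated=True))   # '+': Dave left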
• 54. Coding of implicative verbs
 Implicative properties of complement-taking verbs
• Not available in any external lexical resource
• No obvious machine-learning strategy
 1250 embedded-clause verbs in the PARC lexicon
 400 examined on a first pass
• Considered in BNC frequency order
• Google search when intuitions were unclear
 1/3 are implicative, classified in the Unified Lexicon
• 55. Matching for implicative entailments
Term rewriting promotes commitments in the Abstract KR.
“Ed managed to leave.”:
context(t)
context(cx_leave7)
context_lifting_rules(veridical t cx_leave7)
instantiable(Ed0 t)
instantiable(leave_ev7 cx_leave7)
instantiable(leave_ev7 t)      ← promoted commitment
instantiable(manage_ev3 t)
role(objectMoving leave_ev7 Ed0)
role(performedBy manage_ev3 Ed0)
role(whatsSuccessful manage_ev3 cx_leave7)
“Ed left.”:
context(t)
instantiable(Ed0 t)
instantiable(leave_ev1 t)
role(objectMoving leave_ev1 Ed0)
• 56. Embedded examples in real text
From Google: “Song, Seoul’s point man, did not forget to persuade the North Koreans to make a ‘strategic choice’ of returning to the bargaining table...”
→ Song persuaded the North Koreans…
• 57. Promotion for a (simple) embedding
“Ed did not forget to force Dave to leave” → Dave left.
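Tracing this on the signature table sketched after the factive slides: the negation starts the polarity at '-'; forget-to (a +-/-+ implicative) flips it to '+'; force (a ++ implicative) preserves '+'; so the speaker is committed to the truth of the innermost complement, "Dave left".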
• 58. Conclusions
 Broad-coverage, efficient language-to-KR system
 Needs extensive resources (ontologies, argument mappings)
– Much work to incorporate them
 Uses a layered architecture
– Matching as well as reasoning
 KR mapping both increases and decreases ambiguity
• 59. Term-Rewriting for KR Mapping
 Rule form
<Input terms> ==> <Output terms>   (obligatory)
<Input terms> ?=> <Output terms>   (optional; optional rules introduce new choices)
 Input patterns allow
– Consume a term if matched: Term
– Test on a term without consumption: +Term
– Test that a term is missing: -Term
– Procedural attachment: {ProcedureCall}
 Ordered rule application
– Rule1 is applied in all possible ways to the input to produce Output1
– Rule2 is applied in all possible ways to Output1 to produce Output2
• 60. Example rules: depassivisation
+VTYPE(%V, %%), +PASSIVE(%V,+),
SUBJ(%V, %LogicalOBJ)
==> OBJ(%V, %LogicalOBJ).

+VTYPE(%V, %%), +PASSIVE(%V,+),
OBL-AG(%V, %LogicalSUBJ), PFORM(%LogicalSUBJ, %%)
==> SUBJ(%V, %LogicalSUBJ).

+VTYPE(%V, %%), +PASSIVE(%V,+), -SUBJ(%V, %%)
==> SUBJ(%V, %AgtPro),
PRED(%AgtPro, agent_pro),
PRON-TYPE(%AgtPro, null_agent).
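A toy Python rendering of what these obligatory rules do to a flat attribute set. The real transfer system matches patterns with variables like %V and supports tests and procedural attachments; this sketch hard-codes the depassivization logic on a dictionary, purely as an illustration.

def depassivize(facts):
    """Re-map grammatical functions for a passive verb, mirroring the
    rules above: surface SUBJ becomes the logical OBJ; the OBL-AG
    by-phrase (if any) becomes the logical SUBJ; otherwise an
    unexpressed agent pronoun is introduced. The +PASSIVE test does
    not consume the term, so PASSIVE is left in place."""
    facts = dict(facts)
    if facts.get('PASSIVE') == '+':
        facts['OBJ'] = facts.pop('SUBJ')            # rule 1: logical object
        if 'OBL-AG' in facts:
            facts['SUBJ'] = facts.pop('OBL-AG')     # rule 2: logical subject
        else:
            facts['SUBJ'] = 'agent_pro'             # rule 3: null agent
    return facts

# “The boy was fired by Ed.”
print(depassivize({'PRED': 'fire', 'PASSIVE': '+',
                   'SUBJ': 'boy', 'OBL-AG': 'Ed'}))
# -> {'PRED': 'fire', 'PASSIVE': '+', 'OBJ': 'boy', 'SUBJ': 'Ed'}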
• 61. Term-rewriting can introduce ambiguity
 Alternative lexical mappings (permanent background facts):
/- cyc_concept_map(bank, FinancialInstitution).
/- cyc_concept_map(bank, SideOfRiver).
 Mapping from predicate to Cyc concept:
ist(%%, %P(%Arg)), cyc_concept_map(%P, %Concept)
==> sub_concept(%Arg, %Concept).
 Input term: ist(ctx0, bank(bank##1))
 Alternative rule applications produce different outputs; the rewrite system represents this ambiguity by a new choice
 Output:
– C1: sub_concept(bank##1, FinancialInstitution)
– C2: sub_concept(bank##1, SideOfRiver)
– (C1 xor C2) ↔ 1
• 62. Rules can prune ill-formed mappings
 “The bank hired Ed”
 Rule for mapping hire:
hire(%E), subj(%E, %Subj), obj(%E, %Obj),
+sub_concept(%Subj, %CSubj), +sub_concept(%Obj, %CObj),
{genls(%CSubj, Organization), genls(%CObj, Person)}
==> sub_concept(%E, EmployingEvent),
performedBy(%E, %Subj),
personEmployed(%E, %Obj).
 From Cyc:
genls(FinancialInstitution, Organization): true
genls(SideOfRiver, Organization): false
 If bank is mapped to SideOfRiver, the rule will not fire, leading to a failure to consume the subject; a later rule, subj(%%,%%) ==> stop., prunes this analysis from the choice space.
 In general, later rewrites prune analyses that don’t consume grammatical roles.
• 63. The NLP-KR Gap
 There have been parallel efforts on text-based and knowledge-based question answering
– The benefits of knowledge-based approaches are inaccessible without some bridge from text to KR
 Prime goal
– Robust, broad-coverage mapping from texts and questions to KR
– Allow KRR systems with lots of world knowledge to answer questions
 Second-thoughts goal
– Robust, broad-coverage mapping from texts and questions to KR
– Allow a system with basic knowledge of English to answer questions
• 64. Canonical conceptual semantics
“Cool the device.” / “Lower the temperature of the device.”
[Two dependency trees: cool(addressee, device) and lower(addressee, temperature-of-device), both mapping to the same canonical form]
decreaseCausally(lower-event, device, temperature)