Introduction
         Methodology
              Results




Wide-Coverage CCG Parsing
  with Quantifier Scope

         Dimitrios Kartsaklis

  MSc Thesis, University of Edinburgh

  Supervisor: Professor Mark Steedman


              July 18, 2011




   Dimitrios Kartsaklis   Wide-Coverage CCG Parsing with Quantifier Scope   1/ 28
Introduction


      A Natural Language Processing project
      Dealing with semantics, and specifically with quantifier scope
      ambiguities
      Purpose: The creation of a wide-coverage semantic parser
      capable of handling quantifier scope ambiguities using
      Generalized Skolem Terms
      Grammar formalism: Combinatory Categorial Grammar
      (CCG)
      Logical form: First-order logic using λ-calculus as “glue”
      language



Quantification



      All known human languages make use of quantification. In
      English:
        • Universal quantifiers (∀): every, each, all, ...
        • Existential quantifiers (∃): a, some, ...
        • Generalized quantifiers: most, at least, few, ...
      Traditional representations using first-order logic and
      λ-calculus:
        • Universal: λp.λq.∀x[p(x) → q(x)]
        • Existential: λp.λq.∃x[p(x) ∧ q(x)]
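As a minimal model-theoretic sketch (not part of the thesis — the domain and predicates are invented for illustration), these two λ-terms can be mirrored by higher-order Python functions over a small finite domain:

```python
# Quantifier meanings as higher-order functions, mirroring
# λp.λq.∀x[p(x) → q(x)] and λp.λq.∃x[p(x) ∧ q(x)].
# DOMAIN and the predicates below are hypothetical toy data.

DOMAIN = {"ann", "bob", "carl"}

def universal(p):
    # λq.∀x[p(x) → q(x)]: every p-individual satisfies q
    return lambda q: all(q(x) for x in DOMAIN if p(x))

def existential(p):
    # λq.∃x[p(x) ∧ q(x)]: some individual satisfies both p and q
    return lambda q: any(p(x) and q(x) for x in DOMAIN)

boy = lambda x: x in {"bob", "carl"}
smokes = lambda x: x == "bob"

every_boy = universal(boy)
some_boy = existential(boy)

print(every_boy(smokes))  # False: carl is a boy who does not smoke
print(some_boy(smokes))   # True: bob is a boy and smokes
```

Applying a quantifier to a noun denotation yields a generalized quantifier waiting for its verb-phrase argument, exactly as in the λ-calculus forms above.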




Compositionality


         Frege’s principle:
                            “The meaning of the whole is a function
                                 of the meaning of its parts”
         Example: “Every boy likes some girl”
      (lex)  Every :- NP/N : λp.λq.∀y[p(y) → q(y)]
      (lex)  boy :- N : λy.boy(y)
      (lex)  likes :- (S\NP)/NP : λx.λy.likes(y,x)
      (lex)  some :- NP/N : λp.λq.∃x[p(x) ∧ q(x)]
      (lex)  girl :- N : λx.girl(x)
      (>)    Every boy :- NP : λq.∀y[boy(y) → q(y)]
      (>)    some girl :- NP : λq.∃x[girl(x) ∧ q(x)]
      (>)    likes some girl :- S\NP : λy.∃x[girl(x) ∧ likes(y,x)]
      (<)    Every boy likes some girl :- S : ∀y[boy(y) → ∃x[girl(x) ∧ likes(y,x)]]




Quantifier scope ambiguities


      Example: “Every boy likes some girl”
        • ∀x[boy (x) → ∃y [girl(y ) ∧ likes(x, y )]]
          (every boy likes a possibly different girl)
      However, this is not the only meaning:
        • ∃y [girl(y ) ∧ ∀x[boy (x) → likes(x, y )]]
          (there is a specific girl who is liked by every boy )
      But our semantics is surface-compositional, so only the first
      reading is allowed by syntax
      We need a quantification method to deliver both readings in a
      single syntactic derivation



Underspecification


      A solution to the problem: provide underspecified
      representations of the quantified expressions without explicitly
      specifying their scope:
        •   loves(x1 , x2 ),
              (λq.∀x[boy (x) → q(x)], 1),
              (λq.∃y [girl(y ) ∧ q(y )], 2)
      Specification is performed in a separate step, after the end of
      the syntactic derivation, by combining the available quantified
      expressions in every possible way
      The best-known underspecification technique is Cooper
      storage
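The specification step can be sketched in a few lines of Python (a toy illustration of Cooper-style retrieval, not the thesis implementation — the string templates and names are ours): each stored quantifier wraps the core predication, and every retrieval order yields a candidate reading.

```python
from itertools import permutations

# A Cooper-style store: a core predication plus stored quantifiers,
# each a template with a hole for the formula it takes scope over.
core = "likes(x1,x2)"
store = [("all x1[boy(x1) -> {}]", 1),
         ("exists x2[girl(x2) & {}]", 2)]

def readings(core, store):
    out = []
    for order in permutations(store):
        formula = core
        for quant, _ in order:      # earlier retrievals get narrower scope
            formula = quant.format(formula)
        out.append(formula)
    return out

for r in readings(core, store):
    print(r)
```

With two quantifiers this produces both scopings; with n quantifiers it blindly produces n! candidates, which is exactly the over-generation problem discussed on the next slide.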



Underspecification problems

      Decoupling the semantic derivation from the syntactic
      combinatorics can lead to problems:
        • Possibly equivalent logical forms: “Some boy likes some girl”
              a. ∃x[boy (x) ∧ ∃y [girl(y ) ∧ likes(x, y )]]
              b. ∃y [girl(y ) ∧ ∃x[boy (x) ∧ likes(x, y )]]
        •   Scope asymmetries: “Every boy likes, and every girl
            detests, some saxophonist”:
              ∀x[boy (x) → ∃y [sax(y ) ∧ likes(x, y )]]∧
                 ∃v [sax(v ) ∧ ∀z[girl(z) → detests(z, v )]]
        •   Intermediate readings: “Some teacher showed every pupil
            every movie”:
              ∀x[movie(x) → ∃y [teacher (y )∧
                 ∀z[pupil(z) → showed(y , z, x)]]]


Skolemization

      If existentials cause such problems, why not remove them
      altogether?
      Skolemization: The process of replacing an existential
      quantifier with a function of all universally quantified variables
      in whose scope the existential falls.
      Example 1: ∀x∃y ∀z.P(x, y , z) =⇒ ∀x∀z.P(x, sk(x), z)
        • The existential of y is replaced by sk(x), since x was the only
          preceding universal.
      Example 2: ∃y ∀x∀z.P(x, y , z) =⇒ ∀x∀z.P(x, sk(), z)
        • Now sk() is a function without arguments – a constant.
      Example 3: “Every boy likes some girl”:
        • Normal form: ∀x[boy (x) → ∃y [girl(y ) ∧ likes(x, y )]]
        • Skolemized form: ∀x[boy (x) → likes(x, sk_girl^{x})]
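The three examples above can be sketched as a small recursive procedure (a toy representation of ours — quantifiers as nested tuples, not the thesis code): an existential variable is replaced by a term sk_n applied to the universals collected on the way down.

```python
import itertools

def skolemize(f, universals=(), subst=None, counter=None):
    """Replace each existential variable with sk_n(<preceding universals>)."""
    subst = {} if subst is None else subst
    counter = itertools.count() if counter is None else counter
    if isinstance(f, tuple) and f[0] == "all":
        _, v, body = f
        return ("all", v, skolemize(body, universals + (v,), subst, counter))
    if isinstance(f, tuple) and f[0] == "exists":
        _, v, body = f
        subst[v] = "sk%d(%s)" % (next(counter), ",".join(universals))
        return skolemize(body, universals, subst, counter)
    # an atom such as ("P", "x", "y", "z"): substitute skolemized variables
    return (f[0],) + tuple(subst.get(a, a) for a in f[1:])

# Example 1: ∀x∃y∀z.P(x,y,z)  ⇒  ∀x∀z.P(x, sk0(x), z)
print(skolemize(("all", "x", ("exists", "y", ("all", "z", ("P", "x", "y", "z"))))))
# Example 2: ∃y∀x∀z.P(x,y,z)  ⇒  ∀x∀z.P(x, sk0(), z) — a constant
print(skolemize(("exists", "y", ("all", "x", ("all", "z", ("P", "x", "y", "z"))))))
```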

One step further: Generalized Skolem Terms

        The only true quantifiers in English are the universals every,
        each, and their relatives
        Every other non-universal quantifier should be associated with
        a (yet unspecified) Skolem term
        Skolem terms can be specified according to their environment
        at any step of the derivation process into a generalized form
                                                  sk^{E}_{n:p;c}
        where E is the environment (preceding universals) and
            • n the number of the originating noun phrase
            • p a nominal property (e.g. “girl”)
            • c a cardinality condition (e.g. λs.|s| > 1)

   (Mark Steedman, Natural Semantics of Scope, in press, MIT Press)

Two available readings
         Specification takes place at the beginning of the derivation:

         (lex)  Every :- NP/N : λp.λq.∀y[p(y) → q(y)]
         (lex)  boy :- N : λy.boy(y)
         (lex)  likes :- (S\NP)/NP : λx.λy.likes(y,x)
         (lex)  some :- NP/N : λp.λq.q(skolem(p))
         (lex)  girl :- N : λx.girl(x)
         (>)    Every boy :- NP : λq.∀y[boy(y) → q(y)]
         (>)    some girl :- NP : λq.q(skolem(λx.girl(x)))
         (spec) some girl :- NP : λq.q(sk_girl^{})
         (>)    likes some girl :- S\NP : λy.likes(y, sk_girl^{})
         (<)    S : ∀y[boy(y) → likes(y, sk_girl^{})]


         Specification takes place at the end of the derivation:

         (lex)  (same lexical entries as above)
         (>)    Every boy :- NP : λq.∀y[boy(y) → q(y)]
         (>)    some girl :- NP : λq.q(skolem(λx.girl(x)))
         (>)    likes some girl :- S\NP : λy.likes(y, skolem(λx.girl(x)))
         (<)    S : ∀y[boy(y) → likes(y, skolem(λx.girl(x)))]
         (spec) S : ∀y[boy(y) → likes(y, sk_girl^{y})]

Advantages


      Provides a global solution to the most important
      quantification problems
      Skolem terms are part of the semantic theory, not ad-hoc
      mechanisms
      Easy integration with CCG parsers – no significant increase in
      computational complexity
      Semantic derivation is performed “on-line”, based on the
      combinatory rules of CCG
        • This limits the freedom with which the available readings are
          derived, so non-attested or redundant readings are excluded



A proof of concept


      Purpose of the project: The creation of a wide-coverage
      semantic parser that applies the previously described
      quantification approach.
      Main tasks:
        1.   Create a wide-coverage probabilistic syntactic parser
        2.   Create a λ-calculus framework for the logical forms
        3.   Integrate the semantic combinatorics into the parser
        4.   Provide appropriate logical forms for the CCG lexicon
      Eventually: Provide a proof of concept for the theory by
      testing the results in specific quantification cases



Syntactic parsing


       The parser is based on the OpenCCG framework
         • Well-tested API for parsing, supports every aspect of CCG
       Two additions:
         • A supertagger for assigning initial categories to the words
           (Clark & Curran)
         • A probabilistic model incorporating head-word dependencies
           (Hockenmaier)
       Standard interpolation techniques for dealing with sparse data
       problems
       Beam search for pruning the search space



Probabilistic model

       A generative model at the level of local trees (trees of depth 1)
       (Hockenmaier)
       Baseline version: The probability of a local tree with root N
       and children H and S is the product of:
         •   An expansion probability P(expansion|N)
         •   A head probability P(H|N, expansion)
         •   A non-head probability P(S|N, expansion, H)
         •   A lexical probability P(w |N, expansion = leaf )
       The overall probability of a derivation is the product of the
       probabilities of all local trees
       Head-word dependencies version: also takes into account the
       relationships between the heads of local trees
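The scoring scheme above can be sketched as follows (the probability values are hypothetical toy numbers, not trained model parameters): each local tree contributes the product of its factor probabilities, and the derivation score is accumulated in log space to avoid underflow.

```python
import math

def local_tree_prob(p_expansion, p_head, p_nonhead=1.0):
    # P(expansion|N) · P(H|N, expansion) · P(S|N, expansion, H);
    # a leaf has no non-head child, hence the default of 1.0.
    return p_expansion * p_head * p_nonhead

def derivation_logprob(local_trees):
    # The derivation probability is the product over all local trees.
    return sum(math.log(local_tree_prob(*t)) for t in local_trees)

# Two hypothetical local trees of one derivation:
trees = [(0.6, 0.5, 0.8), (0.9, 0.7)]
score = derivation_logprob(trees)
print(score)
```

During beam search, only the highest-scoring analyses per chart cell survive, which is where such scores are compared.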


Semantic forms

      An object-oriented approach
      Formulas are represented as a set of nested objects
        • Example: “Every man walks”
             Infix notation: ∀x[man(x) → walks(x)]
             Prefix notation: all(x,imp(man(x),walks(x)))

         all(x, ·)
           └─ imp(·, ·)
                ├─ man(x)
                └─ walks(x)
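The object-oriented idea can be sketched with three tiny Python classes (class names and structure are ours for illustration, not the thesis code base): each operator is an object holding its sub-expressions, and printing the outermost object yields the prefix notation.

```python
# Formulas as nested objects; repr() produces the prefix notation.

class All:
    def __init__(self, var, body):
        self.var, self.body = var, body
    def __repr__(self):
        return f"all({self.var},{self.body!r})"

class Imp:
    def __init__(self, a, b):
        self.a, self.b = a, b
    def __repr__(self):
        return f"imp({self.a!r},{self.b!r})"

class Pred:
    def __init__(self, name, *args):
        self.name, self.args = name, args
    def __repr__(self):
        return f"{self.name}({','.join(self.args)})"

# "Every man walks"
f = All("x", Imp(Pred("man", "x"), Pred("walks", "x")))
print(f)  # all(x,imp(man(x),walks(x)))
```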




Object hierarchy

    Expression
    ├─ LambdaAbstraction
    ├─ FunctionalApplication
    ├─ FirstOrderExpression
    │    ├─ GenericPredicate
    │    │    └─ Constant
    │    ├─ Quantification
    │    ├─ Conjunction
    │    └─ ...
    ├─ Variable
    │    ├─ BindingTerm
    │    │    ├─ Lambda
    │    │    └─ Quantified
    │    └─ PlainVariable
    └─ SkolemTerm
         └─ GeneralizedST




Skolem Term representations

      A singly-linked, packed structure: a SkolemTerm node holds object
      references (pointers) to its specifications:

         SkolemTerm
           ├─ GeneralizedSkolemTerm (Specification 1)
           ├─ GeneralizedSkolemTerm (Specification 2)
           └─ GeneralizedSkolemTerm (Specification 3)

      Example: “A boy ate a pizza”
         ate(skolem(boy), skolem(pizza)), where each skolem node points
         to its sk specifications


β-conversion


      β-conversion: The process of substituting a bound variable in
      the body of a λ-abstraction by the argument passed to the
      function
      A stack-based method (Blackburn & Bos)
        1. When the expression is an application, push its argument to
           the stack and discard the outermost application object.
        2. If the expression is a λ-abstraction, throw away the λ-term,
           pop the item at the top of the stack, and substitute it for
           every occurrence of the correlated variable.
        3. If the expression is neither an application nor a λ-abstraction,
           β-convert its sub-expressions.
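The three steps above can be sketched directly over tuple terms (the representation — ("app", f, a), ("lam", v, b) and predicate tuples — is ours, not Blackburn & Bos's Prolog code):

```python
def substitute(expr, var, val):
    """Replace every occurrence of var in expr by val."""
    if expr == var:
        return val
    if isinstance(expr, tuple):
        return tuple(substitute(e, var, val) for e in expr)
    return expr

def beta_convert(expr, stack=None):
    stack = [] if stack is None else stack
    if isinstance(expr, tuple) and expr[0] == "app":
        # step 1: push the argument, discard the application node
        stack.append(expr[2])
        return beta_convert(expr[1], stack)
    if isinstance(expr, tuple) and expr[0] == "lam":
        # step 2: pop the top of the stack, substitute for the bound variable
        return beta_convert(substitute(expr[2], expr[1], stack.pop()), stack)
    if isinstance(expr, tuple):
        # step 3: neither app nor lam — β-convert the sub-expressions
        return tuple(beta_convert(e, []) for e in expr)
    return expr

# "John loves Mary": app(app(lam(x,lam(y,loves(y,x))),mary),john)
term = ("app", ("app", ("lam", "x", ("lam", "y", ("loves", "y", "x"))), "mary"), "john")
print(beta_convert(term))  # ('loves', 'john', 'mary')
```

The trace on the next slide corresponds exactly to the push/pop sequence this function performs.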



β-conversion

      Example: “John loves Mary”

           (lex) John :- NP : john
           (lex) loves :- (S\NP)/NP : λx.λy.loves(y,x)
           (lex) Mary :- NP : mary
           (>)   loves Mary :- S\NP : λy.loves(y, mary)
           (<)   John loves Mary :- S : loves(john, mary)


          Expression                                                    Stack
      1   app(app(lam(x,lam(y,loves(y,x))),mary),john)                  []
      2   app(lam(x,lam(y,loves(y,x))),mary)                            [john]
      3   lam(x,lam(y,loves(y,x)))                                      [mary,john]
      4   lam(y,loves(y,mary))                                          [john]
      5   loves(john,mary)                                              []



OpenCCG and CKY

     OpenCCG uses the CKY algorithm: each chart cell holds the categories
     spanning the corresponding substring (figure: CKY chart for “Mary
     married John”).

     (lex) Mary :- NP
     (lex) married :- (S\NP)/NP
     (lex) John :- NP

     Standard derivation:
     (>)   married John :- S\NP
     (<)   Mary married John :- S

     With type-raising and composition:
     (>T)  Mary :- S/(S\NP)
     (>B)  Mary married :- S/NP
     (>)   Mary married John :- S

CKY modifications


      An additional step is introduced in the inner loop of the CKY
      algorithm, called skolem term specification:
        1. For each skolem term ST in the logical form ΛA , collect the
           new environment (preceding universals) of ST .
        2. If the new environment differs from the old environment,
           specify a new Generalized Skolem Term and add it to the
           specifications list of ST .
       where ΛA is the logical form of a result category A that has
       been produced by the application of some CCG rule
      Environment is always readily available thanks to the nested
      internal structure
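The specification step can be sketched as follows (the data shapes — a specs list of environment sets — are our simplification of the packed SkolemTerm structure described earlier):

```python
class SkolemTerm:
    """A skolem term carrying its list of specifications."""
    def __init__(self, prop):
        self.prop = prop
        self.specs = []   # environments, one Generalized Skolem Term each

def specify(st, environment):
    """Add a new Generalized Skolem Term if the environment is new."""
    env = frozenset(environment)
    if env not in st.specs:
        st.specs.append(env)

st = SkolemTerm("girl")
specify(st, [])        # no preceding universals yet: sk_girl^{}
specify(st, ["y"])     # after "every boy" takes scope: sk_girl^{y}
specify(st, ["y"])     # unchanged environment — no new specification
print(st.specs)        # [frozenset(), frozenset({'y'})]
```

Because skolem terms are shared via object references, one `specify` call updates the term wherever it appears in the packed chart.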



Syntax-to-semantics mapping


      The syntactic rules are mapped to semantic transformations
      based on the following table:
               Rule                     λ-abstraction
               fapp(ΛB , ΛC )           ΛA = app(ΛB , ΛC )
               bapp(ΛB , ΛC )           ΛA = app(ΛC , ΛB )
               fcomp(ΛB , ΛC )          ΛA = λx̄.app(ΛB , app(ΛC , x̄))
               bcomp(ΛB , ΛC )          ΛA = λx̄.app(ΛC , app(ΛB , x̄))

       λx̄ is a vector containing the outer λ-terms of the predicate
       that remain to be filled after the composition
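The table above can be sketched with Python closures standing in for λ-terms (a toy illustration of ours, with x̄ reduced to a single remaining argument; `double` and `inc` are invented stand-ins for semantic functions):

```python
def fapp(B, C):
    return B(C)                 # Λ_A = app(Λ_B, Λ_C)

def bapp(B, C):
    return C(B)                 # Λ_A = app(Λ_C, Λ_B)

def fcomp(B, C):
    return lambda x: B(C(x))    # Λ_A = λx̄.app(Λ_B, app(Λ_C, x̄))

def bcomp(B, C):
    return lambda x: C(B(x))    # Λ_A = λx̄.app(Λ_C, app(Λ_B, x̄))

double = lambda n: 2 * n
inc = lambda n: n + 1

print(fapp(double, 3))          # 6
print(fcomp(double, inc)(3))    # double(inc(3)) = 8
print(bcomp(double, inc)(3))    # inc(double(3)) = 7
```

Composition leaves the outer argument slot unfilled, which is exactly what the λx̄ vector captures in the table.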




The semantic lexicon


       A simple format that allows various degrees of grouping between
       categories and words
       Each entry comprises a descriptive title, a list of CCG
       categories, a list of surface forms, and a logical expression in
       prefix notation
      Example: The entry for universal quantifiers
          [universal]
          categories: (S/(S\NP))/N|NP/N
          words: every|each|all
          LF: lam(p,lam(q,all(x,impl(app(p,x),app(q,x)))))
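Reading one such entry is straightforward (the parsing code below is our sketch, not the thesis implementation; only the entry format itself comes from the slide — `|` separates alternatives in the category and word lists):

```python
entry = """[universal]
categories: (S/(S\\NP))/N|NP/N
words: every|each|all
LF: lam(p,lam(q,all(x,impl(app(p,x),app(q,x)))))"""

def parse_entry(text):
    """Parse a single lexicon entry into its four fields."""
    lines = text.splitlines()
    title = lines[0].strip("[]")
    fields = dict(line.split(": ", 1) for line in lines[1:])
    return {
        "title": title,
        "categories": fields["categories"].split("|"),
        "words": fields["words"].split("|"),
        "lf": fields["LF"],
    }

e = parse_entry(entry)
print(e["words"])  # ['every', 'each', 'all']
```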



Syntactic parsing results



       Probabilistic model trained on Sections 02-21 of CCGbank
       Evaluation has been performed on Section 23 of CCGbank

            Parser                      Cov.    LexCat       P, H, S
            Clark et al. (2002)         95.0     90.3         81.8        90.0
            Hockenmaier (2003)          99.8     92.2         85.1        91.4
            Clark & Curran (2004)       99.6     93.6         86.4        92.3
            This work                   96.6     92.4         71.8        78.8




Evaluation of semantic component

      The semantic aspect of the parser was tested on 50 sentences
      presenting a wide range of linguistic challenges
      More specifically, the following cases were tested:
        •   Scope inversion
        •   “Donkey” sentences
        •   Scope asymmetries
        •   Intermediate scope
        •   Spurious readings
        •   Coordination cases
        •   Generalized quantifiers
      In almost every case the results conformed to the predictions
      of the theory


Form of the derivations

        Sample derivation: “Some logician proved every theorem”
          • ∀x[theorem(x) → proved(sk_logician^{{},{x}}, x)]

   (lex)        Some :- NP/N : lam:p.lam:q.q(skolem(p))
   (lex)        logician :- N : lam:x.logician(x)
   (>)          Some logician :- NP : lam:q.q(sk{lam:x.logician(x)}_{})
   (lex)        proved :- (S\NP)/NP : lam:x.lam:y.proved(y,x)
   (lex)        every :- NP/N : lam:p.lam:q.all:x[p(x)->q(x)]
   (lex)        theorem :- N : lam:x.theorem(x)
   (>)          every theorem :- NP : lam:q.all:x[theorem(x)->q(x)]
   (>)          proved every theorem :- S\NP : lam:y.all:x[theorem(x)->proved(y,x)]
   (gram)       type-changing3: S\NP => NP\NP
   (tchange3)   proved every theorem :- NP\NP : lam:y.all:x[theorem(x)->proved(y,x)]
   (<)          Some logician proved every theorem :- NP :
                    all:x[theorem(x)->proved(sk{lam:x.logician(x)}_{}_{x},x)]




However...

      Probabilistic model too weak to properly guide the semantic
      derivation in many cases
      Wide-coverage parsers stretch the flexibility of the grammar in
      order to provide some sort of analysis, even a wrong one
      Example: “Every man walks and talks”
                  Every man      walks        and    talks
                      NP       (S\NP)/NP     conj      N
                                             -------------- <conj>
                                                    N
                                             -------------- N ⇒ NP
                                                   NP
                               ----------------------------- >
                                          S\NP
                  ------------------------------------------ <
                                      S

      In such cases, proper semantic derivation is blocked –
      semantics simply cannot follow syntax

Future work



      Fine-tuning of the probabilistic model
      Extending the semantic lexicon for truly wide-coverage
      semantic parsing
      Adding semantic aspects such as negation and polarity
      Improving the coverage of generalized quantifiers
      Presenting the results in a less cryptic form, by properly
      unpacking and enumerating all the available readings




Integrating Machine Translation with Translation Memory: A Practical Approach
 
Designing Teams for Emerging Challenges
Designing Teams for Emerging ChallengesDesigning Teams for Emerging Challenges
Designing Teams for Emerging Challenges
 
UX, ethnography and possibilities: for Libraries, Museums and Archives
UX, ethnography and possibilities: for Libraries, Museums and ArchivesUX, ethnography and possibilities: for Libraries, Museums and Archives
UX, ethnography and possibilities: for Libraries, Museums and Archives
 
Visual Design with Data
Visual Design with DataVisual Design with Data
Visual Design with Data
 
3 Things Every Sales Team Needs to Be Thinking About in 2017
3 Things Every Sales Team Needs to Be Thinking About in 20173 Things Every Sales Team Needs to Be Thinking About in 2017
3 Things Every Sales Team Needs to Be Thinking About in 2017
 
How to Become a Thought Leader in Your Niche
How to Become a Thought Leader in Your NicheHow to Become a Thought Leader in Your Niche
How to Become a Thought Leader in Your Niche
 

Wide-Coverage CCG Parsing with Quantifier Scope

  • 5. Quantifier scope ambiguities

    Example: "Every boy likes some girl"
      • ∀x[boy(x) → ∃y[girl(y) ∧ likes(x, y)]]
        (every boy likes a possibly different girl)
    However, this is not the only meaning:
      • ∃y[girl(y) ∧ ∀x[boy(x) → likes(x, y)]]
        (there is a specific girl who is liked by every boy)
    But our semantics is surface-compositional, so only the first
    reading is allowed by the syntax
    We need a quantification method that delivers both readings in a
    single syntactic derivation
  • 6. Underspecification

    A solution to the problem: provide underspecified representations
    of the quantified expressions without explicitly specifying their
    scope:
      • loves(x1, x2), (λq.∀x[boy(x) → q(x)], 1), (λq.∃y[girl(y) ∧ q(y)], 2)
    Specification is performed in a separate step, after the end of the
    syntactic derivation, by combining the available quantified
    expressions in every possible way
    The best-known underspecification technique is Cooper storage
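The storage-and-retrieval idea above can be sketched in a few lines of Python. This is only a toy illustration of Cooper storage, not the thesis's machinery: the core formula is kept quantifier-free, each stored quantifier is a function that wraps a formula in one more scope layer, and retrieval tries every permutation.

```python
from itertools import permutations

# Toy Cooper storage for "Every boy likes some girl":
# the core stays quantifier-free; stored quantifiers add scope layers.
core = "likes(x1,x2)"
store = [
    lambda body: f"all x1.(boy(x1) -> {body})",     # every boy
    lambda body: f"exists x2.(girl(x2) & {body})",  # some girl
]

def retrieve_all(core, store):
    """Discharge the stored quantifiers in every possible order;
    the quantifier applied last takes widest scope."""
    readings = set()
    for order in permutations(store):
        body = core
        for quantifier in order:
            body = quantifier(body)
        readings.add(body)
    return readings

readings = retrieve_all(core, store)
# Two readings: surface scope (every > some) and inverse scope (some > every)
```

Retrieval is completely decoupled from the derivation, which is exactly what causes the problems listed on the next slide.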
  • 7. Underspecification problems

    Decoupling the semantic derivation from the syntactic combinatorics
    can lead to problems:
      • Possibly equivalent logical forms: "Some boy likes some girl"
          a. ∃x[boy(x) ∧ ∃y[girl(y) ∧ likes(x, y)]]
          b. ∃y[girl(y) ∧ ∃x[boy(x) ∧ likes(x, y)]]
      • Scope asymmetries: "Every boy likes, and every girl detests,
        some saxophonist":
          ∀x[boy(x) → ∃y[sax(y) ∧ likes(x, y)]] ∧
          ∃v[sax(v) ∧ ∀z[girl(z) → detests(z, v)]]
      • Intermediate readings: "Some teacher showed every pupil every
        movie":
          ∀x[movie(x) → ∃y[teacher(y) ∧ ∀z[pupil(z) → showed(y, z, x)]]]
  • 8. Skolemization

    If existentials cause such problems, why not remove them
    altogether?
    Skolemization: the process of replacing an existential quantifier
    with a function of all universally quantified variables in whose
    scope the existential falls.
    Example 1: ∀x∃y∀z.P(x, y, z) =⇒ ∀x∀z.P(x, sk(x), z)
      • The existential y is replaced by sk(x), since x was the only
        preceding universal.
    Example 2: ∃y∀x∀z.P(x, y, z) =⇒ ∀x∀z.P(x, sk(), z)
      • Now sk() is a function without arguments – a constant.
    Example 3: "Every boy likes some girl":
      • Normal form: ∀x[boy(x) → ∃y[girl(y) ∧ likes(x, y)]]
      • Skolemized form: ∀x[boy(x) → likes(x, sk^{x}_girl)]
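The replacement step can be sketched over a tiny tuple-shaped formula representation. This is a minimal illustration under simplifying assumptions (every Skolem function is named "sk", so only one existential per formula is handled correctly, and all names are invented for the example):

```python
def substitute(formula, var, term):
    """Replace every occurrence of var in formula by term."""
    if isinstance(formula, str):
        return term if formula == var else formula
    return tuple(substitute(part, var, term) for part in formula)

def skolemize(formula, universals=()):
    """Replace each existential by a function of the preceding universals.
    Formulas are ("all", v, body), ("exists", v, body),
    ("imp"/"and", l, r), or atoms such as ("P", "x", "y")."""
    if isinstance(formula, str):
        return formula
    op = formula[0]
    if op == "all":
        return ("all", formula[1],
                skolemize(formula[2], universals + (formula[1],)))
    if op == "exists":
        term = "sk(" + ",".join(universals) + ")"  # sk() if none precede
        return skolemize(substitute(formula[2], formula[1], term), universals)
    if op in ("imp", "and"):
        return (op, skolemize(formula[1], universals),
                skolemize(formula[2], universals))
    return formula  # an atom

# Example 1: ∀x∃y∀z.P(x,y,z)  =>  ∀x∀z.P(x,sk(x),z)
ex1 = skolemize(("all", "x", ("exists", "y", ("all", "z", ("P", "x", "y", "z")))))
# Example 2: ∃y∀x∀z.P(x,y,z)  =>  ∀x∀z.P(x,sk(),z) – a constant
ex2 = skolemize(("exists", "y", ("all", "x", ("all", "z", ("P", "x", "y", "z")))))
```

The two results reproduce Examples 1 and 2 from the slide: the Skolem function's argument list is exactly the tuple of universals collected on the way down.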
  • 9. One step further: Generalized Skolem Terms

    The only true quantifiers in English are the universals every,
    each, and their relatives
    Every other non-universal quantifier should be associated with a
    (yet unspecified) Skolem term
    Skolem terms can be specified according to their environment at any
    step of the derivation process into a generalized form sk^E_{n:p;c}
    where E is the environment (preceding universals) and
      • n the number of the originating noun phrase
      • p a nominal property (e.g. "girl")
      • c a cardinality condition (e.g. λs.|s| > 1)
    (Mark Steedman, Natural Semantics of Scope, currently in
    publication by MIT Press)
  • 10. Two available readings

    Specification at the beginning of the derivation: "some girl"
    contributes λp.λq.q(skolem(p)), and the skolem term is specified
    before combining with the universal, with no universal yet in its
    environment:
      S : ∀y[boy(y) → likes(y, sk^{}_girl)]
      (a single specific girl – the wide-scope existential reading)
    Specification at the end of the derivation: skolem(λx.girl(x))
    remains unspecified until the whole sentence is derived, so it is
    specified inside the scope of ∀y:
      S : ∀y[boy(y) → likes(y, sk^{y}_girl)]
      (the girl may vary with the boy – the narrow-scope reading)
  • 11. Advantages

    Provides a global solution to the most important quantification
    problems
    Skolem terms are part of the semantic theory, not ad-hoc mechanisms
    Easy integration with CCG parsers – no significant increase in
    computational complexity
    Semantic derivation is performed "on-line", based on the
    combinatory rules of CCG
      • This limits the degrees of freedom with which the available
        readings are derived, so non-attested or redundant readings
        are excluded
  • 12. A proof of concept

    Purpose of the project: the creation of a wide-coverage semantic
    parser that applies the quantification approach described above.
    Main tasks:
      1. Create a wide-coverage probabilistic syntactic parser
      2. Create a λ-calculus framework for the logical forms
      3. Integrate the semantic combinatorics into the parser
      4. Provide appropriate logical forms for the CCG lexicon
    Eventually: provide a proof of concept for the theory by testing
    the results on specific quantification cases
  • 13. Syntactic parsing

    The parser is based on the OpenCCG framework
      • A well-tested API for parsing that supports every aspect of CCG
    Two additions:
      • A supertagger for assigning initial categories to the words
        (Clark & Curran)
      • A probabilistic model incorporating head-word dependencies
        (Hockenmaier)
    Standard interpolation techniques for dealing with sparse-data
    problems
    Beam search for pruning the search space
  • 14. Probabilistic model

    A generative model at the level of local trees (trees of depth 1)
    (Hockenmaier)
    Baseline version: the probability of a local tree with root N and
    children H and S is the product of:
      • An expansion probability P(expansion|N)
      • A head probability P(H|N, expansion)
      • A non-head probability P(S|N, expansion, H)
      • A lexical probability P(w|N, expansion = leaf)
    The overall probability of a derivation is the product of the
    probabilities of all its local trees
    Head-word dependencies version: also takes into account the
    relationships between the heads of local trees
  • 15. Semantic forms

    An object-oriented approach: formulas are represented as a set of
    nested objects
      • Example: "Every man walks"
          Infix notation: ∀x[man(x) → walks(x)]
          Prefix notation: all(x,imp(man(x),walks(x)))
    [Slide figure: the expression tree of all(x,imp(man(x),walks(x))),
    with all(x, expr) at the root, imp(expr1, expr2) below it, and the
    predicates man(x) and walks(x) at the leaves]
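The nested-object idea can be sketched in Python. The actual system is built on OpenCCG in Java, so these class names are purely illustrative; the point is that each node of the formula is an object that knows how to render itself in prefix notation.

```python
# Illustrative analogue of the nested-object formula representation.
class Expression:
    def prefix(self):
        raise NotImplementedError

class Var(Expression):
    def __init__(self, name):
        self.name = name
    def prefix(self):
        return self.name

class Pred(Expression):
    def __init__(self, name, *args):
        self.name, self.args = name, args
    def prefix(self):
        return self.name + "(" + ",".join(a.prefix() for a in self.args) + ")"

class Imp(Expression):
    def __init__(self, left, right):
        self.left, self.right = left, right
    def prefix(self):
        return "imp(" + self.left.prefix() + "," + self.right.prefix() + ")"

class All(Expression):
    def __init__(self, var, body):
        self.var, self.body = var, body
    def prefix(self):
        return "all(" + self.var.prefix() + "," + self.body.prefix() + ")"

# "Every man walks" as a tree of nested objects
x = Var("x")
every_man_walks = All(x, Imp(Pred("man", x), Pred("walks", x)))
# every_man_walks.prefix() reproduces the slide's prefix notation
```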
  • 16. Object hierarchy

    [Slide figure: the class hierarchy of the semantic framework,
    rooted at Expression, with classes including LambdaAbstraction,
    FunctionalApplication, FirstOrderExpression, Variable, SkolemTerm,
    GenericPredicate, Quantification, Conjunction, GeneralizedST,
    Constant, BindingTerm, PlainVariable, Lambda and Quantified]
  • 17. Skolem Term representations

    A single linked packed structure: each SkolemTerm object holds
    object references (pointers) to its GeneralizedSkolemTerm
    specifications (Specification1, Specification2, Specification3, ...)
    Example: "A boy ate a pizza"
      ate(skolem(boy), skolem(pizza)), with each skolem term linked to
      its sk specification
  • 18. β-conversion

    β-conversion: the process of substituting a bound variable in the
    body of a λ-abstraction by the argument passed to the function
    A stack-based method (Blackburn & Bos):
      1. When the expression is an application, push its argument onto
         the stack and discard the outermost application object.
      2. If the expression is a λ-abstraction, throw away the λ-term,
         pop the item at the top of the stack, and substitute it for
         every occurrence of the correlated variable.
      3. If the expression is neither an application nor a
         λ-abstraction, β-convert its sub-expressions.
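The three steps above can be sketched directly over a tuple encoding of λ-terms: ("app", f, a) for application, ("lam", v, body) for abstraction, plain strings for variables and constants, and predicate tuples such as ("loves", "y", "x"). This is an illustrative sketch, not the thesis's Java implementation, and capture-avoiding renaming is omitted for brevity.

```python
def substitute(expr, var, value):
    """Replace free occurrences of var in expr by value."""
    if isinstance(expr, str):
        return value if expr == var else expr
    if expr[0] == "lam" and expr[1] == var:
        return expr  # var is rebound below this λ; stop here
    return (expr[0],) + tuple(substitute(e, var, value) for e in expr[1:])

def beta_convert(expr, stack=None):
    stack = [] if stack is None else stack
    if isinstance(expr, str):
        return expr
    if expr[0] == "app":
        # Step 1: push the argument, discard the application node
        stack.append(beta_convert(expr[2]))
        return beta_convert(expr[1], stack)
    if expr[0] == "lam" and stack:
        # Step 2: pop the top of the stack, substitute for the variable
        return beta_convert(substitute(expr[2], expr[1], stack.pop()), stack)
    # Step 3: neither case applies – β-convert the sub-expressions
    return (expr[0],) + tuple(beta_convert(e) for e in expr[1:])

# "John loves Mary": app(app(lam(x,lam(y,loves(y,x))),mary),john)
result = beta_convert(
    ("app", ("app", ("lam", "x", ("lam", "y", ("loves", "y", "x"))), "mary"),
     "john"))
```

Running this reproduces the five-step trace on the next slide, ending in loves(john,mary).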
  • 19. β-conversion (continued)

    Example: "John loves Mary"

      John :- NP : john
      loves :- (S\NP)/NP : λx.λy.loves(y, x)
      Mary :- NP : mary
      loves Mary :- S\NP : λy.loves(y, mary)   (>)
      John loves Mary :- S : loves(john, mary)  (<)

        Expression                                      Stack
      1 app(app(lam(x,lam(y,loves(y,x))),mary),john)    []
      2 app(lam(x,lam(y,loves(y,x))),mary)              [john]
      3 lam(x,lam(y,loves(y,x)))                        [mary,john]
      4 lam(y,loves(y,mary))                            [john]
      5 loves(john,mary)                                []
  • 20. OpenCCG and CKY

    OpenCCG uses the CKY algorithm
    [Slide figure: a CKY chart for "Mary married John", with cells NP,
    S/(S\NP), S/NP, (S\NP)/NP, S\NP and S, shown next to the
    corresponding CCG derivations using type-raising (>T), forward
    composition (>B) and forward/backward application (>, <)]
  • 21. CKY modifications

    An additional step, called skolem term specification, is
    introduced in the inner loop of the CKY algorithm:
      1. For each skolem term ST in the logical form ΛA, collect the
         new environment (preceding universals) of ST.
      2. If the new environment differs from the old environment,
         specify a new Generalized Skolem Term and add it to the
         specifications list of ST.
    where ΛA is the logical form of a result category A that has been
    produced by the application of some CCG rule
    The environment is always readily available thanks to the nested
    internal structure
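The specification step can be sketched as a walk over the result's logical form. This is only an illustrative stand-in for the packed representation: a skolem term is modeled as the mutable pair ("skolem", specs_list), universals as ("all", var, body), and every name here is invented for the example.

```python
def specify(expr, env=()):
    """After a CCG rule fires, walk the result's logical form and, for
    each skolem term, record the universals that now outscope it –
    but only if that environment is new."""
    if isinstance(expr, str):
        return
    if expr[0] == "all":
        specify(expr[2], env + (expr[1],))
    elif expr[0] == "skolem":
        if env not in expr[1]:
            expr[1].append(env)  # a new Generalized Skolem Term
    else:
        for sub in expr[1:]:
            specify(sub, env)

# "Every boy likes some girl" after the final backward application:
# the term was already specified once (pre-derivation) with an empty
# environment; the walk now adds the specification under ∀y.
some_girl = ("skolem", [()])
lf = ("all", "y", ("imp", ("boy", "y"), ("likes", "y", some_girl)))
specify(lf)
```

After the walk, some_girl carries both specifications – () for the wide-scope reading and ("y",) for the narrow-scope one – mirroring the packed structure of slide 17.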
  • 22. Syntax-to-semantics mapping

    The syntactic rules are mapped to semantic transformations based on
    the following table:

      Rule               λ-abstraction
      fapp(ΛB, ΛC)       ΛA = app(ΛB, ΛC)
      bapp(ΛB, ΛC)       ΛA = app(ΛC, ΛB)
      fcomp(ΛB, ΛC)      ΛA = λx̄.app(ΛB, app(ΛC, x̄))
      bcomp(ΛB, ΛC)      ΛA = λx̄.app(ΛC, app(ΛB, x̄))

    λx̄ is a vector containing the outer λ-terms of the predicate that
    remain to be filled after the composition
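The four mappings can be sketched with Python callables standing in for logical forms. This covers only the single-argument case (the vector λx̄ generalizes it to several missing arguments), and the lexical entries are illustrative, not the system's actual categories.

```python
def fapp(f, a):
    return f(a)                 # X/Y  Y    =>  X

def bapp(a, f):
    return f(a)                 # Y    X\Y  =>  X

def fcomp(f, g):
    return lambda x: f(g(x))    # X/Y  Y/Z  =>  X/Z

def bcomp(g, f):
    return lambda x: f(g(x))    # Y\Z  X\Y  =>  X\Z

# "Mary married John" via type-raising and forward composition,
# mirroring the CKY example two slides earlier:
married = lambda obj: (lambda subj: f"married({subj},{obj})")  # (S\NP)/NP
mary_raised = lambda p: p("mary")                              # S/(S\NP), >T
s_over_np = fcomp(mary_raised, married)                        # S/NP, >B
sentence = fapp(s_over_np, "john")                             # S
```

The composed category S/NP is itself a function still waiting for the object, exactly as λx̄ describes: the outer λ-term that remains to be filled after the composition.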
  • 23. The semantic lexicon

    A simple format that allows various degrees of grouping between
    categories and words
    Each entry is comprised of a descriptive title, a list of CCG
    categories, a list of surface forms, and a logical expression in
    prefix notation
    Example: the entry for universal quantifiers

      [universal]
      categories: (S/(S\NP))/N|NP/N
      words: every|each|all
      LF: lam(p,lam(q,all(x,impl(app(p,x),app(q,x)))))
• 24. Syntactic parsing results

The probabilistic model was trained on Sections 02-21 of CCGbank; evaluation was performed on Section 23 of CCGbank.

    Parser                  Cov.    LexCat    Dependency accuracy
    Clark et al. (2002)     95.0    90.3      81.8    90.0
    Hockenmaier (2003)      99.8    92.2      85.1    91.4
    Clark & Curran (2004)   99.6    93.6      86.4    92.3
    This parser             96.6    92.4      71.8    78.8
• 25. Evaluation of semantic component

The semantic aspect of the parser was tested on 50 sentences presenting a wide range of linguistic challenges. More specifically, the following cases were tested:
  • Scope inversion
  • "Donkey" sentences
  • Scope asymmetries
  • Intermediate scope
  • Spurious readings
  • Coordination cases
  • Generalized quantifiers

In almost every case the results conformed to the predictions of the theory.
• 26. Form of the derivations

Sample derivation for "Some logician proved every theorem", yielding the reading ∀x[theorem(x) → proved(sk_logician, x)], with the skolem term specified first in the empty environment {} and then in {x}:

    (lex)      Some :- NP/N : lam:p.lam:q.q(skolem(p))
    (lex)      logician :- N : lam:x.logician(x)
    (>)        Some logician :- NP : lam:q.q(sk{lam:x.logician(x)}_{})
    (lex)      proved :- (S\NP)/NP : lam:x.lam:y.proved(y,x)
    (lex)      every :- NP/N : lam:p.lam:q.all:x[p(x)->q(x)]
    (lex)      theorem :- N : lam:x.theorem(x)
    (>)        every theorem :- NP : lam:q.all:x[theorem(x)->q(x)]
    (>)        proved every theorem :- S\NP : lam:y.all:x[theorem(x)->proved(y,x)]
    (gram)     type-changing3: S\NP => NP\NP
    (tchange3) proved every theorem :- NP\NP : lam:y.all:x[theorem(x)->proved(y,x)]
    (<)        Some logician proved every theorem :- NP :
                   all:x[theorem(x)->proved(sk{lam:x.logician(x)}_{}_{x},x)]
• 27. However...

  • The probabilistic model is too weak to properly guide the semantic derivation in many cases.
  • Wide-coverage parsers stretch the flexibility of the grammar in order to provide some sort of analysis, even a wrong one.

Example: "Every man walks and talks". The parser assigns walks the transitive category (S\NP)/NP and talks the noun category ∗N; "and talks" is reduced to N, type-changed to NP (N ⇒ NP), and taken as the object of walks (>), after which backward application (<) yields S.

In such cases, proper semantic derivation is blocked: semantics simply cannot follow syntax.
• 28. Future work

  • Fine-tuning the probabilistic model
  • Extending the semantic lexicon for truly wide-coverage semantic parsing
  • Adding semantic aspects such as negation and polarities
  • Improving the coverage of generalized quantifiers
  • Presenting the results in a less cryptic form, by properly unpacking and enumerating all the available readings