This document describes a dependency parsing-based question answering system built on RDF and SPARQL. It parses natural language facts and questions into typed dependencies, translates them into RDF and SPARQL, answers questions by querying the populated RDF data, and incorporates background knowledge from WordNet and DBpedia. The system handles negation, tenses, and passive voice, and provides examples of its question answering capabilities.
6. QA in General
• Finding answers to natural language
questions based on documents/facts
• Returns direct answers instead of documents
• Factoid questions:
– Who can dance Tango?
– What did I eat this morning?
– When was Mahatma Gandhi born?
9. WordNet
• Large lexical database of English words
• Words are grouped into synsets
• Relations among synsets: Synonymy,
antonymy, hyponymy, meronymy, troponymy
12. System Architecture (diagram)
NL text (facts) → Dependency Parser → RDFizer → RDF
NL text (questions) → Dependency Parser → SPARQLizer → SPARQL
OWL Ontology → inference over the RDF data
13. Facts Population
1. We parse the natural language facts using the
Stanford dependency parser. The result will be
typed dependencies.
2. The typed dependencies are then translated into
RDF using the RDFizer (sketched below). The
RDFizer is built in Java with Apache Jena as the
Semantic Web library.
3. The resulting RDF is then combined with an OWL
ontology containing WordNet and DBpedia axioms
to infer new facts.
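As an illustration of step 2, here is a minimal RDFizer sketch in Java with Apache Jena. It is an assumption of how such a component could look, not the actual implementation: it maps typed-dependency strings such as dobj(bought-2, car-4) to triples like :bought :dobj :car in the example namespace used throughout these slides.

import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;

// Hypothetical RDFizer sketch: typed-dependency strings to RDF triples.
public class RDFizerSketch {
    static final String NS = "http://example.org/sentence/";

    public static Model rdfize(String[] typedDependencies) {
        Model model = ModelFactory.createDefaultModel();
        model.setNsPrefix("", NS);
        for (String dep : typedDependencies) {
            // Split "rel(gov-i, dep-j)" into relation, governor and dependent.
            String rel = dep.substring(0, dep.indexOf('('));
            String[] args = dep.substring(dep.indexOf('(') + 1, dep.indexOf(')')).split(",\\s*");
            String gov = args[0].replaceAll("-\\d+$", "").toLowerCase();
            String dpt = args[1].replaceAll("-\\d+$", "").toLowerCase();
            model.add(model.createResource(NS + gov),
                      model.createProperty(NS + rel),
                      model.createResource(NS + dpt));
        }
        return model;
    }

    public static void main(String[] args) {
        String[] deps = {"root(ROOT-0, bought-2)", "nsubj(bought-2, Aliana-1)", "dobj(bought-2, car-4)"};
        rdfize(deps).write(System.out, "TURTLE");
    }
}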
14. Query Execution
1. We parse the natural language questions
using the Stanford dependency parser. The
result will be typed dependencies.
2. We then translate the typed dependencies
into SPARQL query format.
3. The SPARQL query is then executed over the
RDF data populated in the Facts Population step
(see the sketch below).
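Step 3 might look as follows with Jena's ARQ engine. This is a sketch, assuming a populated Model from Facts Population and a SPARQL string produced by the SPARQLizer; it is not the system's actual code.

import org.apache.jena.query.Query;
import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QueryFactory;
import org.apache.jena.query.QuerySolution;
import org.apache.jena.query.ResultSet;
import org.apache.jena.rdf.model.Model;

// Sketch: execute a generated SELECT query over the populated model.
public class QueryRunner {
    public static void run(Model model, String sparql) {
        Query query = QueryFactory.create(sparql);
        try (QueryExecution qexec = QueryExecutionFactory.create(query, model)) {
            ResultSet results = qexec.execSelect();
            while (results.hasNext()) {
                QuerySolution solution = results.next();
                System.out.println(solution.get("x")); // each binding of ?x is an answer
            }
        }
    }
}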
24. A Detailed Example (4)
The knowledge base already contains the following
triple:
:bought owl:sameAs :purchased .
and also the following rules:
PREFIX : <http://example.org/sentence/>
CONSTRUCT {:vehicle ?y ?z} WHERE {:car ?y ?z}
PREFIX : <http://example.org/sentence/>
CONSTRUCT {?x ?y :vehicle} WHERE {?x ?y :car}
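One way such CONSTRUCT rules could be applied is with Jena's execConstruct, merging the inferred triples back into the model. This is a sketch of one possible setup, not necessarily the slides' actual inference machinery.

import org.apache.jena.query.Query;
import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QueryFactory;
import org.apache.jena.rdf.model.Model;

// Sketch: run a CONSTRUCT rule and merge the inferred triples into the KB.
public class RuleApplier {
    public static void applyRule(Model kb, String constructRule) {
        Query rule = QueryFactory.create(constructRule);
        try (QueryExecution qexec = QueryExecutionFactory.create(rule, kb)) {
            Model inferred = qexec.execConstruct();
            kb.add(inferred); // e.g. adds :vehicle triples mirroring :car triples
        }
    }
}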
25. A Detailed Example (5)
Inferred facts:
<http://example.org/sentence/purchased>
<http://example.org/sentence/dobj>
<http://example.org/sentence/vehicle> ;
<http://example.org/sentence/nsubj>
<http://example.org/sentence/aliana> .
26. A Detailed Example (6)
Typed dependencies of question:
[nsubj(purchased-2, Who-1), root(ROOT-0, purchased-2),
det(vehicle-4, a-3), dobj(purchased-2, vehicle-4)]
27. A Detailed Example (7)
SPARQL form of question:
SELECT ?x WHERE {
:vehicle :det :a .
:purchased :nsubj ?x .
:purchased :dobj :vehicle .
:root :root :purchased }
Answer: “aliana”
28. DBpedia Integration
• By adding some background knowledge from
DBpedia, one can ask more questions.
• Example of Italy data:
:italy owl:sameAs dbpedia:Italy .
dbpedia:Italy dbpprop:capital "Rome" .
dbpedia:Enzo_Ferrari dbpedia-owl:nationality dbpedia:Italy ;
  dbpprop:deathPlace dbpedia:Maranello .
dbpedia:Enzo_Ferrari dbpedia-owl:child dbpedia:Piero_Ferrari ,
  dbpedia:Alfredo_Ferrari .
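How this background data gets into the knowledge base is not shown on the slide. A minimal sketch, assuming the triples above are saved locally as Turtle under a hypothetical file name:

import org.apache.jena.rdf.model.Model;

// Sketch: merge locally stored DBpedia background triples into the KB model.
// "dbpedia-italy.ttl" is a hypothetical file name for the Italy data above.
public class BackgroundLoader {
    public static void loadBackground(Model kb) {
        kb.read("dbpedia-italy.ttl", "TURTLE");
    }
}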
29. Example Case
• Fact = “Fariz loves Italy.”
• Question = “Does Fariz love a country, whose
capital is Rome, which was the nationality of a
person who passed away in Maranello and
whose sons are Piero Ferrari and Alfredo
Ferrari?”
• Thus the answer will be YES, even though the
only stated fact is "Fariz loves Italy."
30. Example Case (cont.)
• Note that in the previous example the fact is
translated automatically by the system, but the
question was translated manually into the
following SPARQL query:
ASK WHERE {
  :love :nsubj :fariz .
  :root :root :love .
  :love :dobj ?x .
  ?x dbpprop:capital "Rome" .
  ?y dbpedia-owl:nationality ?x ;
     dbpprop:deathPlace dbpedia:Maranello .
  ?y dbpedia-owl:child dbpedia:Piero_Ferrari ,
     dbpedia:Alfredo_Ferrari }
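Yes/no questions compile to ASK queries, which evaluate to a boolean. A minimal sketch, assuming Jena, of turning that boolean into the YES answer above:

import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QueryFactory;
import org.apache.jena.rdf.model.Model;

// Sketch: evaluate an ASK query and map the boolean result to YES/NO.
public class YesNoAnswerer {
    public static String answer(Model kb, String askQuery) {
        try (QueryExecution qexec =
                 QueryExecutionFactory.create(QueryFactory.create(askQuery), kb)) {
            return qexec.execAsk() ? "YES" : "NO";
        }
    }
}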
31. How to handle negation? (1)
• Fact: I did not buy it.
• RDF:
<http://example.org/sentence/root>
<http://example.org/sentence/root>
<http://example.org/sentence/bought> .
<http://example.org/sentence/bought>
<http://example.org/sentence/dobj>
<http://example.org/sentence/it> ;
<http://example.org/sentence/neg>
<http://example.org/sentence/not> ;
<http://example.org/sentence/nsubj>
<http://example.org/sentence/i> .
32. How to handle negation? (2)
• Question: Who bought it?
• SPARQL:
SELECT ?x WHERE {
  :bought :nsubj ?x .
  :bought :dobj :it .
  :root :root :bought .
  FILTER NOT EXISTS { [] :neg ?z . } }
33. How to handle negation? (3)
• Question: Who did not buy it? Answer: I.
• SPARQL:
SELECT ?x WHERE {
  :bought :dobj :it .
  :bought :neg :not .
  :bought :nsubj ?x .
  :root :root :bought }
34. How to handle tenses? (1)
• Fact (I will buy it):
<http://example.org/sentence/buy>
<http://example.org/sentence/aux>
<http://example.org/sentence/will> ;
<http://example.org/sentence/dobj>
<http://example.org/sentence/it> ;
<http://example.org/sentence/nsubj>
<http://example.org/sentence/i> .
35. How to handle tenses? (2)
• Who buys it?
• SPARQL:
SELECT ?x WHERE {
  :root :root :buys .
  :buys :nsubj ?x .
  :buys :dobj :it .
  FILTER NOT EXISTS { [] :aux :will . } }
36. How to handle passive sentences?
Fact: Juliet was killed by Romeo.
<http://example.org/sentence/root>
<http://example.org/sentence/root>
<http://example.org/sentence/killed> .
<http://example.org/sentence/killed>
<http://example.org/sentence/agent>
<http://example.org/sentence/romeo> ;
<http://example.org/sentence/nsubjpass>
<http://example.org/sentence/juliet> .
37. How to handle passive sentences?
• Ontology:
:nsubjpass owl:equivalentProperty :dobj .
:agent owl:equivalentProperty :nsubj .
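For these equivalences to affect query answering, the facts can be wrapped in an inference model. The slides do not name the reasoner actually used; this is a sketch with Jena's built-in OWL reasoner.

import org.apache.jena.rdf.model.InfModel;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.reasoner.ReasonerRegistry;

// Sketch: wrap the facts in an OWL inference model so the
// owl:equivalentProperty axioms make :agent triples visible as :nsubj
// and :nsubjpass triples visible as :dobj during querying.
public class PassiveReasoning {
    public static InfModel withOntology(Model ontology, Model facts) {
        return ModelFactory.createInfModel(
            ReasonerRegistry.getOWLReasoner(), ontology, facts);
    }
}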
38. How to handle passive sentences?
• Who killed Juliet?
SELECT ?x WHERE {
  :killed :nsubj ?x .
  :killed :dobj :juliet .
  :root :root :killed .
  FILTER NOT EXISTS { [] :neg ?z . } }
39. DEMO - A Story about Antonio
Antonio is a famous and cool doctor. Antonio
has been working for 10 years. Antonio is in
Italy. Antonio can dance Salsa well. Antonio
loves Maria and Karina. Antonio is also loved
by Karina. Antonio never cooks. But Maria
always cooks. Antonio just bought a car.
Antonio must fly to Indonesia tomorrow.
40. Conclusions
• Dependency parsing-based QA system with
RDF and SPARQL
• The system is also aware of negations, tenses
and passive sentences
• Future improvements: a more advanced parsing
method, a more efficient inference system, and
richer background knowledge
50. Working Examples
• "John, that is the founder of Microsoft and the
initiator of Greenpeace movement, is genius,
honest and cool."
• "Who is honest?"
51. Working Examples
• "Farid wants to go to Rome.“
• "Who wants to go to Rome?"
• "Who wants to go?"
• "Who wants?"
52. Working Examples
• "Jorge ate 10 delicious apples."
• "Who ate 10 delicious apples?"
• "Who ate 10 apples?"
• "Who ate apples?"
Speaker notes (Siri image: apple.com; Watson image: gizmodo.com):
Example questions (querying contacts): What's Michael's address? What is Susan Park's phone number? When is my wife's birthday? Show Jennifer's home email address.
A question answering system returns a direct answer, e.g. "4 July" for "When is the celebration of the independence day of the USA?"
The Stanford dependencies provide a representation of grammatical relations between words in a sentence. They have been designed to be easily understood and effectively used by people who want to extract textual relations. Stanford dependencies (SD) are triplets: name of the relation, governor, and dependent. The dependencies are produced using hand-written tregex patterns over phrase-structure trees, as described in: Marie-Catherine de Marneffe, Bill MacCartney and Christopher D. Manning. 2006. Generating Typed Dependency Parses from Phrase Structure Parses. In LREC 2006.
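A minimal sketch of obtaining such typed dependencies with the Stanford CoreNLP API (the exact invocation used in the system is an assumption; the example sentence is hypothetical):

import java.util.Properties;
import edu.stanford.nlp.ling.CoreAnnotations;
import edu.stanford.nlp.pipeline.Annotation;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;
import edu.stanford.nlp.semgraph.SemanticGraph;
import edu.stanford.nlp.semgraph.SemanticGraphCoreAnnotations;
import edu.stanford.nlp.util.CoreMap;

// Sketch: parse a sentence and print its typed dependencies.
public class DependencyDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("annotators", "tokenize,ssplit,pos,lemma,parse");
        StanfordCoreNLP pipeline = new StanfordCoreNLP(props);

        Annotation doc = new Annotation("Aliana purchased a car.");
        pipeline.annotate(doc);

        for (CoreMap sentence : doc.get(CoreAnnotations.SentencesAnnotation.class)) {
            SemanticGraph deps = sentence.get(
                SemanticGraphCoreAnnotations.BasicDependenciesAnnotation.class);
            System.out.println(deps.toList()); // e.g. nsubj(purchased-2, Aliana-1) ...
        }
    }
}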
SW (Semantic Web): an extension of the Web with machine-interpretable content.
Synsets: sets of cognitive synonyms, each expressing a distinct concept.
Examples of the relations: synonymy (buy = purchase), antonymy (bad vs. good), hyponymy (hyponymy is a kind of relation), meronymy (a tire is part of a car), troponymy (to run is a manner of moving).