Shakti Sinha and Daniel Tunkelang discuss how LinkedIn's search functionality works. They explain that LinkedIn search is personalized based on a user's profile and network. Query understanding involves tagging queries to determine entity types like people, companies, or skills. Ranking is also personalized, using machine learning models trained on search logs to determine relevance for a specific user's query. The system aims to provide both globally and personally relevant results, as about two-thirds of clicks come from outside a user's network.
How LinkedIn's Search Works: Query Understanding and Personalized Ranking
1. Recruiting Solutions
Information Retrieval at LinkedIn
Shakti Sinha, Head, Search Relevance
Daniel Tunkelang, Head, Query Understanding
Find and be Found
15. Query tagging: key to query understanding.
§ Using human judgments to evaluate tag precision.
– Extremely accurate (> 99%) for identifying person names.
– Harder to distinguish company vs. title vs. skill (e.g., oracle dba).
§ Comparing CTR for tag matches vs. non-matches.
– Difference can be large enough to suggest filtering vs. ranking.
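The tagging idea above can be sketched with a toy dictionary-based tagger. The entity lexicons and tag names below are invented for illustration, not LinkedIn's; the point is that an ambiguous term like "oracle" matches multiple entity types, while person names match cleanly.

```python
# Hypothetical dictionary-based query tagger: a minimal sketch of query
# tagging, assuming toy entity dictionaries (not LinkedIn's actual data).

PEOPLE = {"daniel tunkelang", "shakti sinha"}
COMPANIES = {"oracle", "linkedin"}
TITLES = {"dba", "software engineer", "lawyer"}
SKILLS = {"oracle", "information retrieval"}  # "oracle" is ambiguous on purpose

def tag_query(query):
    """Return every (entity_type, phrase) whose phrase appears in the query."""
    q = query.lower()
    tags = []
    for etype, lexicon in [("PERSON", PEOPLE), ("COMPANY", COMPANIES),
                           ("TITLE", TITLES), ("SKILL", SKILLS)]:
        for phrase in lexicon:
            if phrase in q:
                tags.append((etype, phrase))
    return sorted(tags)

# "oracle" matches both COMPANY and SKILL, illustrating why company vs.
# title vs. skill is harder than identifying person names.
print(tag_query("oracle dba"))
print(tag_query("daniel tunkelang"))
```

A real tagger would use token boundaries and statistical disambiguation rather than substring lookups, but the ambiguity it must resolve is the same.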
16. Detecting navigational vs. exploratory queries.
§ Pre-retrieval: sequence of query tags.
§ Post-retrieval: distribution of scores / features.
§ Click behavior: title searches are >50x more likely to get 2+ clicks than name searches.
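A pre-retrieval signal like the tag sequence can drive a simple intent heuristic. The rule below is an illustrative assumption, not the deck's actual classifier: a query tagged entirely as a person name looks navigational, while title/skill tags suggest exploration.

```python
# Rough pre-retrieval sketch: classify a query as navigational vs. exploratory
# from its tag sequence alone. Tag names and rules are illustrative assumptions.

def classify(tag_sequence):
    """All-PERSON tags suggest a navigational lookup of a specific profile;
    TITLE/SKILL tags suggest an exploratory search over many results."""
    if tag_sequence and all(t == "PERSON" for t in tag_sequence):
        return "navigational"
    if any(t in ("TITLE", "SKILL") for t in tag_sequence):
        return "exploratory"
    return "unknown"

print(classify(["PERSON"]))          # e.g., a name search
print(classify(["TITLE", "SKILL"]))  # e.g., software patent lawyer
```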
17. Query expansion for exploratory queries.
§ Example: software patent lawyer.
§ Query expansions derived from reformulations, e.g., lawyer -> attorney.
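Deriving expansions from reformulations can be sketched by mining single-term rewrites from consecutive queries in a session. The session log below is fabricated for illustration; real systems would aggregate over large logs and threshold by frequency.

```python
# Sketch of deriving query expansions from session reformulations: when users
# frequently rewrite term A to term B in consecutive queries, treat B as a
# candidate expansion of A. The session data below is fabricated.

from collections import Counter

sessions = [
    ["software patent lawyer", "software patent attorney"],
    ["patent lawyer", "patent attorney"],
    ["ip lawyer", "ip attorney"],
    ["java developer", "java engineer"],
]

def mine_rewrites(sessions):
    """Count single-term substitutions between consecutive queries."""
    rewrites = Counter()
    for session in sessions:
        for prev, curr in zip(session, session[1:]):
            a, b = prev.split(), curr.split()
            if len(a) == len(b):
                diffs = [(x, y) for x, y in zip(a, b) if x != y]
                if len(diffs) == 1:  # exactly one term changed
                    rewrites[diffs[0]] += 1
    return rewrites

rewrites = mine_rewrites(sessions)
print(rewrites.most_common(1))  # ('lawyer', 'attorney') dominates
```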
18. Understanding misspelled queries.
§ daniel tankalong infomation retrieval
– Did you mean daniel tunkelang? Did you mean information retrieval?
§ marisa meyer ingenero eletrico
– Did you mean marissa mayer? Did you mean ingeniero electrico?
§ jonathan podemsky desenista industrail
– Did you mean johnathan podemsky? Did you mean desenhista industrial?
19. Spelling out the details.
Signals combined in the spelling index:
§ Entity data: people, companies.
§ Successful queries: tunkelang => …
§ Reformulations: marisa => marissa.
§ N-grams: dublin => du ub bl li in.
§ Metaphones: mark/marc => MRK.
§ Word pairs: johnathan podemsky.
Index example: the query {marisa meyer yoohoo} maps to the terms marissa, marisa, meyer, mayer, yahoo, yoohoo.
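The n-gram signal above can be sketched end to end: index known-good names by character bigrams for recall, then rank candidates by string similarity for precision. The name list is illustrative, and `difflib.SequenceMatcher` stands in for whatever ranking model a production system would use.

```python
# Minimal "Did you mean" sketch: retrieve candidates via character-bigram
# overlap, rank by string similarity (difflib). The name list is illustrative.

import difflib
from collections import defaultdict

KNOWN = ["daniel tunkelang", "marissa mayer", "information retrieval"]

def bigrams(s):
    return {s[i:i + 2] for i in range(len(s) - 1)}

# Inverted index: bigram -> names containing it.
index = defaultdict(set)
for name in KNOWN:
    for bg in bigrams(name):
        index[bg].add(name)

def did_you_mean(query):
    """Retrieve candidates sharing bigrams, rank by similarity to the query."""
    q = query.lower()
    candidates = set()
    for bg in bigrams(q):
        candidates |= index.get(bg, set())
    if not candidates:
        return None
    best = max(candidates,
               key=lambda c: difflib.SequenceMatcher(None, q, c).ratio())
    return best if best != q else None

print(did_you_mean("daniel tankalong"))  # -> daniel tunkelang
print(did_you_mean("marisa meyer"))      # -> marissa mayer
```

Production spelling correction would combine several of the listed signals (metaphones, word pairs, successful queries) rather than bigrams alone.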
23. Relevant results can be in or out of network.
§ Searcher's network matters for relevance.
– Within-network results have higher CTR.
§ But the network is not enough.
– About two thirds of search clicks come from out-of-network results.
24. Personalized machine-learned ranking.
§ Each data point is a triple (searcher, query, document).
– Searcher features are important!
§ Labels: is this document relevant to the query and the user?
– Depends on the user's network, location, etc.
– Too much to ask a random person to judge.
§ Training data has to be collected from search logs.
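The (searcher, query, document) triple can be sketched as a feature vector with click-derived labels feeding a scorer. The features, data, and plain-Python logistic regression below are invented for illustration; the point is that searcher features sit alongside query-document features in one model.

```python
# Toy sketch of (searcher, query, document) training instances and a
# logistic-regression scorer trained on click-derived labels. Feature
# names and data are invented; real training uses large search logs.

import math

# Each row: [in_network, same_location, tag_match], label = clicked or not.
data = [
    ([1.0, 1.0, 1.0], 1),
    ([1.0, 0.0, 1.0], 1),
    ([0.0, 1.0, 1.0], 1),
    ([0.0, 0.0, 0.0], 0),
    ([0.0, 1.0, 0.0], 0),
    ([1.0, 0.0, 0.0], 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=500, lr=0.5):
    """Stochastic gradient descent on log-loss."""
    w, b = [0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            g = p - y  # gradient of log-loss w.r.t. the logit
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

w, b = train(data)
score = sigmoid(sum(wi * xi for wi, xi in zip(w, [1.0, 1.0, 1.0])) + b)
print(round(score, 3))  # high score: in-network, co-located, tag-matching
```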
25. Search log data has biases.
§ Presentation bias
– Results shown higher tend to get clicked more often.
– Use FairPairs [Radlinski and Joachims, AAAI '06].
[Figure: adjacent result pairs, some flipped and some not; clicks within flipped pairs feed the training data.]
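The FairPairs presentation step can be sketched as follows: randomly flip adjacent result pairs before display, and record which pairs were flipped so that clicks within flipped pairs can be interpreted as preferences. This is a simplified sketch of the idea from Radlinski and Joachims, not their full algorithm.

```python
# Sketch of the FairPairs presentation step: randomly flip adjacent result
# pairs; a click on the originally lower-ranked member of a flipped pair is
# a preference signal that presentation bias alone cannot explain.

import random

def fair_pairs(results, rng):
    """Return the presented list plus (pair, flipped) bookkeeping."""
    presented, pairs = [], []
    for i in range(0, len(results) - 1, 2):
        a, b = results[i], results[i + 1]
        flipped = rng.random() < 0.5
        presented += [b, a] if flipped else [a, b]
        pairs.append(((a, b), flipped))
    if len(results) % 2:  # odd tail element is shown as-is
        presented.append(results[-1])
    return presented, pairs

rng = random.Random(42)  # seeded for reproducibility
presented, pairs = fair_pairs(["d1", "d2", "d3", "d4"], rng)
print(presented)
# Downstream, a click on the lower member of a flipped pair becomes a
# "preferred over" training example.
```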
26. Search log data has biases.
§ Sample bias
– Users click or skip only what is shown.
– What about low-scoring results from the existing model?
– Add low-scoring results as 'easy negatives' so the model learns bad results not presented to the user.
[Figure: low-scoring results from deep result pages (page 1 … page n) added to the training data with label 0.]
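Augmenting the log-derived labels with easy negatives can be sketched in a few lines. The function and document names below are invented; the idea is simply to append unshown, low-scoring documents with label 0 alongside the clicked/skipped labels.

```python
# Sketch of adding 'easy negatives': documents the current model scores low
# are never shown, so they never get clicks; appending them with label 0
# teaches the next model what bad results look like. Data is illustrative.

def augment_with_easy_negatives(clicked, skipped, low_scoring, k=2):
    """Build (doc, label) training rows from shown and unshown results."""
    rows = [(d, 1) for d in clicked]           # shown and clicked
    rows += [(d, 0) for d in skipped]          # shown but skipped
    rows += [(d, 0) for d in low_scoring[:k]]  # never shown: easy negatives
    return rows

rows = augment_with_easy_negatives(
    clicked=["doc_a"],
    skipped=["doc_b"],
    low_scoring=["doc_x", "doc_y", "doc_z"],
)
print(rows)
# [('doc_a', 1), ('doc_b', 0), ('doc_x', 0), ('doc_y', 0)]
```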
28. How to train your model.
§ Train simple models to resemble complex ones.
– Build an Additive Groves model [Sorokina et al, ECML '07], which is good at detecting interactions.
§ Build a tree with logistic regression leaves.
§ By restricting the tree to user and query features, only one regression model is evaluated for each document.
[Figure: a shallow decision tree splitting on user/query features (e.g., x2 = ?, x10 < 0.1234?), with a logistic regression model at each leaf, e.g., α0 + α1·P(x1) + … + αn·Q(xn).]
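The tree-with-regression-leaves structure can be sketched as a router plus per-leaf weight vectors. The splits, leaf names, and weights below are made up; the structural point from the slide is that routing uses only user/query features, so each document is scored by exactly one regression model.

```python
# Sketch of a tree with logistic regression leaves: user/query features route
# each request to one leaf; only that leaf's regression model is evaluated
# per document. Splits and weights below are invented for illustration.

import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Leaf models: [bias, w_doc_feature_1, w_doc_feature_2].
LEAVES = {
    "recruiter":  [0.1, 2.0, 0.5],
    "job_seeker": [-0.2, 0.5, 2.0],
    "other":      [0.0, 1.0, 1.0],
}

def route(user_features):
    """Tree splits use only user/query features, mirroring the slide."""
    if user_features["is_recruiter"]:
        return "recruiter"
    if user_features["recent_job_searches"] > 3:
        return "job_seeker"
    return "other"

def score(user_features, doc_features):
    """Evaluate the single leaf regression chosen for this user/query."""
    w = LEAVES[route(user_features)]
    z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], doc_features))
    return sigmoid(z)

user = {"is_recruiter": True, "recent_job_searches": 0}
s = score(user, [0.8, 0.1])
print(round(s, 3))
```

The efficiency win is that routing happens once per query, so scoring N documents costs N regression evaluations rather than N full-tree evaluations.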
29. Take-Aways
§ LinkedIn's search problem is unique because of the deep role of personalization – users are an integral part of the corpus.
§ Query understanding allows us to optimize for entity-oriented search against semi-structured content.
§ Ranking requires us to contextually apply global and personalized user, query, and document features.