2017/07/21@STAIR Lab AI seminar
Improving Nearest Neighbor Methods
from the Perspective of Hubness Phenomenon
Yutaro Shigeto

STAIR Lab, Chiba Institute of Technology
A complete reference list is available at
https://yutaro-s.github.io/download/ref-20170721.html
Nearest neighbor methods are
•a fundamental technique

•used in various fields: NLP, CV, ML, DM

(Figure: a query is matched to its nearest neighbor among examples such as "cat", "dog", "gorilla")
Hubness Phenomenon
[Radovanović+, 2010]
The nearest neighbors of many queries are the same
objects ("hubs")

(Figure: many different queries all share the same nearest neighbor, "cat", which becomes a hub)
Why do hubs emerge?
[Radovanović+, 2010]
X: normal distribution (zero mean)
Fix objects y_1, y_2 with ‖y_1‖ < ‖y_2‖.
Then it can be shown that
E_X[‖x − y_2‖] − E_X[‖x − y_1‖] > 0
i.e., y_1 is more likely to be closer to x, and hence more likely to be a hub.

Because this holds for any pair y_1 and y_2,
objects closest to the origin tend to be hubs.

This bias is called "spatial centrality."
Variants
•Squared Euclidean distance [Shigeto+, 2015]:
E_X[‖x − y_2‖²] − E_X[‖x − y_1‖²] > 0

•Inner product [Suzuki+, 2013]:
(1/|D|) Σ_{x∈D} ⟨x, y_2⟩ − (1/|D|) Σ_{x∈D} ⟨x, y_1⟩ < 0
Problem:
The emergence of hubs degrades the performance of
nearest neighbor methods

Research Objective:
Improve the performance of nearest neighbor
methods by reducing the emergence of hubs
Normalization of Distances

Centering: Reducing spatial centrality
[Suzuki+, 2013]
Spatial centrality implies that the object most similar to
the centroid tends to be a hub.
After centering, every object has the same (zero)
similarity to the centroid, i.e., centroid = origin.
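The centering idea can be seen in a few lines of numpy (a toy illustration on synthetic data): after subtracting the centroid, every object's inner-product similarity to the new centroid is identically zero, so no object keeps a "close to the centroid" advantage.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data with a non-zero mean, so some points sit near the centroid.
X = rng.normal(loc=3.0, scale=1.0, size=(1000, 50))

centroid = X.mean(axis=0)
X_centered = X - centroid

# Before centering, inner-product similarity to the centroid varies a lot;
# after centering, the new centroid is the origin, so this similarity is
# identically zero and the "close to centroid" advantage disappears.
sims_before = X @ centroid
sims_after = X_centered @ X_centered.mean(axis=0)

print(sims_before.std())          # large spread
print(np.abs(sims_after).max())   # ~0 (numerical noise)
```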
Mutual proximity: Breaking asymmetric relations
[Schnitzer+, 2012]
Although a hub becomes the nearest neighbor of many
objects, those objects cannot all be the nearest
neighbor of the hub.
Mutual proximity makes neighbor relations symmetric.
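A minimal sketch of Gaussian mutual proximity in the spirit of [Schnitzer+, 2012] (the per-object normal model of distances and the 1 − MP conversion are common conventions; the data here are synthetic):

```python
import numpy as np
from scipy.stats import norm
from scipy.spatial.distance import cdist

def mutual_proximity_gauss(D):
    """Gaussian mutual proximity (a sketch of [Schnitzer+, 2012]).

    D: (n, n) symmetric distance matrix. For each object i, the distances
    from i to all others are modeled as a normal distribution; the mutual
    proximity of (i, j) is P(X_i > d_ij) * P(X_j > d_ij), i.e. both objects
    must consider the other unusually close. Returned as a distance
    (1 - MP), so smaller still means closer.
    """
    mu = D.mean(axis=1)
    sd = D.std(axis=1)
    sf_i = norm.sf(D, loc=mu[:, None], scale=sd[:, None])  # P(X_i > d_ij)
    sf_j = norm.sf(D, loc=mu[None, :], scale=sd[None, :])  # P(X_j > d_ij)
    out = 1.0 - sf_i * sf_j
    np.fill_diagonal(out, 0.0)
    return out

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
D_mp = mutual_proximity_gauss(cdist(X, X))
print(np.allclose(D_mp, D_mp.T))  # True: neighbor relations are symmetric
```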
Zero-Shot Learning
[Shigeto+, 2015]
Zero-shot learning
Active research topic in NLP, CV, ML

Many applications:
•Image labeling

•Bilingual lexicon extraction

+ Many other cross-domain matching tasks
!26
[Larochelle+, 2008]
ZSL is a type of multi-class classification…

…but the classifier has to predict labels
that do not appear in the training set
(Figure: standard classification task vs. ZSL task)

Pre-processing: Label embedding
Labels are embedded in a metric space:
objects and labels are both vectors.
(Figure: example space, and label space containing "lion", "tiger", "chimpanzee")

Training: find a projection function
Find a matrix M that projects examples into the label space.
(Figure: M maps the example space into the label space containing "lion", "tiger", "chimpanzee", "leopard", "gorilla")
Prediction: Nearest neighbor search
Given a test object and test labels,
to predict the label of the test object:
1. project the example into the label space, using matrix M
2. find the nearest label
(Figure: M projects the test example into the label space containing "lion", "tiger", "chimpanzee", "leopard", "gorilla")
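The training-and-prediction pipeline above can be sketched with ridge regression as the projection learner (ridge is one common choice in regression-based ZSL; the data, dimensions, and λ here are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all names and sizes are illustrative): examples live in a
# 100-d example space, labels are embedded as 50-d vectors.
n_train, d_x, d_y, n_labels = 500, 100, 50, 20
label_vecs = rng.normal(size=(n_labels, d_y))      # label embeddings
A = rng.normal(size=(d_y, d_x))                    # hidden generative map
y_idx = rng.integers(0, n_labels, size=n_train)
Y = label_vecs[y_idx]                              # target label vectors
X = Y @ A + 0.1 * rng.normal(size=(n_train, d_x))  # observed examples

# Training: ridge regression for the projection matrix M (M x ~ y).
lam = 1.0
M = Y.T @ X @ np.linalg.inv(X.T @ X + lam * np.eye(d_x))  # (d_y, d_x)

# Prediction: project the example into the label space, then find the
# nearest label vector.
def predict(x, candidate_labels):
    dists = np.linalg.norm(candidate_labels - M @ x, axis=1)
    return int(np.argmin(dists))

acc = np.mean([predict(X[i], label_vecs) == y_idx[i] for i in range(100)])
print("accuracy on 100 training examples:", acc)
```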
Hubness: Problem in ZSL
[Dinu and Baroni, 2015; see also Radovanović+, 2010]
The classifier frequently predicts the same labels ("hubs")
(Figure: in the label space, a hub label attracts many projected examples among labels such as "sheep", "zebra", "hippo", "rat")
Problem with the current regression approach:
The learned classifier frequently predicts the same labels

(Emergence of "hub" labels)

Research objective:
Investigate why hubs emerge in regression-based ZSL,
and how to reduce the emergence of hubs
Proposed approach

Current approach:
learn M that maps examples into the label space.

Proposed approach:
learn M that maps labels into the example space
(the reverse direction).
(Figure: the mapping arrow is reversed, from the label space — "lion", "tiger", "chimpanzee", "leopard", "gorilla" — into the example space)
Synthetic data result

                        Current   Proposed
Hubness (N1 skewness)     24.2       0.5
Accuracy [%]              13.8      87.6

The proposed approach reduces hubness
and improves accuracy.
Why the proposed approach reduces hubness
The argument for our proposal relies on two concepts:
•Shrinkage in regression
•Spatial centrality of data distributions

"Shrinkage" in ridge/least squares regression
[See also Lazaridou+, 2015]
If we optimize the (ridge) regression objective, the variance
of the projected objects becomes smaller than that of the
targets: the predictions shrink toward the mean.
For simplicity, the projected objects are assumed to also follow a normal distribution.
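The shrinkage effect is easy to verify empirically (synthetic data; plain least squares stands in here for ridge, which shrinks at least as much): the fitted values have smaller variance than the noisy targets, because regression cannot reproduce the noise part of the targets.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d_x, d_y = 1000, 30, 30

# Noisy linear relation between examples X and targets Y.
X = rng.normal(size=(n, d_x))
W_true = rng.normal(size=(d_x, d_y))
Y = X @ W_true + rng.normal(scale=5.0, size=(n, d_y))

# Least squares fit.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
Y_hat = X @ W

# Shrinkage: the fitted values have strictly smaller total variance than
# the targets, because the noise in Y cannot be explained by X.
var_y = Y.var(axis=0).sum()
var_y_hat = Y_hat.var(axis=0).sum()
print(var_y_hat < var_y)  # True
```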
"Spatial centrality"
[See also Radovanović+, 2010]
X: query distribution (zero mean)
Fix objects y_1, y_2 with ‖y_1‖ < ‖y_2‖.
Then it can be shown that
E_X[‖x − y_2‖²] − E_X[‖x − y_1‖²] > 0
i.e., y_1 is more likely to be closer to x, and hence more likely to be a hub.

Because this holds for any pair y_1 and y_2,
objects closest to the origin tend to be hubs.

This bias is called "spatial centrality."
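The phenomenon can be reproduced in simulation (illustrative parameters): in high-dimensional i.i.d. Gaussian data the distribution of N_k — how often each point appears among others' k nearest neighbors — is strongly right-skewed, and the hubs are the points closest to the data mean.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.stats import skew

rng = np.random.default_rng(0)
n, dim, k = 2000, 100, 10  # high dimensionality triggers hubness

X = rng.normal(size=(n, dim))
D = cdist(X, X)
np.fill_diagonal(D, np.inf)  # a point is not its own neighbor

# N_k(i): number of points that have point i among their k nearest neighbors.
knn = np.argsort(D, axis=1)[:, :k]
N_k = np.bincount(knn.ravel(), minlength=n)

print("N_k skewness:", skew(N_k))  # clearly positive: a few points are hubs

# Spatial centrality: hubs lie closer to the data mean (the origin here).
norms = np.linalg.norm(X, axis=1)
hubs = N_k >= np.quantile(N_k, 0.95)
print(norms[hubs].mean() < norms.mean())
```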
Degree of spatial centrality
Further assume distributions for the queries and for the data
objects; we then obtain a formula that quantifies the degree
of spatial centrality.

The smaller the variance of the label distribution, the
smaller the spatial centrality (= the bias causing hubness).

Spatial centrality depends on the variance of the
label distribution.
Why the proposed approach reduces hubness
The argument relies on two concepts:
•Shrinkage in regression ✔
•Spatial centrality of data distributions ✔

Current approach: map X into Y (forward direction)
Proposed approach: map Y into X (reverse direction)

Q. Which configuration is better for reducing hubs?

Spatial centrality:
For a fixed query distribution, a data distribution with
smaller variance is preferable for reducing hubs.

Since the query distribution is not fixed across the two
configurations, directly comparing the label distributions
is not meaningful; we therefore rescale one configuration.
Scaling does not change the nearest neighbor relation.

A. The reverse direction is preferable:
for a fixed query distribution, the variance of the data
distribution is smaller in the proposed configuration
(because of shrinkage).
Summary of our proposal

Spatial centrality: a label distribution with smaller
variance is desirable for reducing hubness.

Shrinkage: regression shrinks the variance of the
projected objects.

Proposal: project labels into the example space
➥ this reduces the variance of the labels,
hence suppresses hubness.
(Figure: labels such as "chimpanzee", "gorilla" projected from the label space into the example space)
Experiments

Experimental objective
We evaluate the proposed approach on real tasks:
•Does it suppress hubs?

•Does it improve the prediction accuracy?

Zero-shot tasks
•Image labeling (image → label, e.g., an image of a gorilla → "gorilla")

•Bilingual lexicon extraction (source language → target language, e.g., "gorilla" → "gorille")

Compared methods
Current, Proposed, and CCA.
We used the Euclidean distance as the distance measure
for finding the nearest label.
Image labeling

                      current  reverse  CCA
Hubness (skewness)      2.00     0.08   2.61   (smaller is better)
Accuracy [%]             9.2     41.3   22.6   (larger is better)

Bilingual lexicon extraction: Ja–En

                         current         reverse         CCA
                      Ja→En  En→Ja    Ja→En  En→Ja    Ja→En  En→Ja
Hubness (skewness)     10.0   37.7      3.8    5.2     65.1   62.1
Accuracy [%]           21.6   20.2     34.4   31.9      0.4    0.2
Summary
• Analyzed why hubs emerge in the current ZSL approach

- the variance of the labels is greater than that of the examples

• Proposed a simple method for reducing hubness

- reverse the mapping direction

• The proposed method reduced hubness and
outperformed the current approach and CCA on image
labeling and bilingual lexicon extraction tasks
k-Nearest Neighbor Classification
[Shigeto+, 2016]

k-nearest neighbor classification
Given a dataset D = {(x_i, y_i)}_{i=1}^n,
the label of a query x is decided by its k nearest neighbors:
ŷ = arg min_{y_i : (x_i, y_i) ∈ D} f(x, x_i)

Distance metric learning learns a matrix L:
f(x, x_i) = ‖Lx − Lx_i‖
Training is computationally expensive.
Proposal: Dissimilarity
We instead use the dissimilarity
f(x, x_i) = ‖x − Wx_i‖²

Spatial centrality:
For a fixed query distribution, a data distribution with
smaller variance is preferable for reducing hubs.

The function f needs to be computed only
between labeled objects and the unlabeled object

➡ labeled objects are always the targets of retrieval,

and the unlabeled object is always the query.

This method is not metric learning
•The goal of classification is to classify the query
correctly

- i.e., to find a suitable decision boundary (not a metric)
Proposal: Training
Find a matrix W which minimizes the distance:
min_W  Σ_{i=1}^n Σ_{z ∈ T_i} ‖x_i − Wz‖² + ‖W‖²_F

This objective has the closed-form solution:
W = XJXᵀ(XXᵀ + I)⁻¹
Proposal: Test
Given a query object x,
ŷ = arg min_{y_i : (x_i, y_i) ∈ D} ‖x − Wx_i‖²
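A sketch of the whole training/test procedure (the choice of the target set T_i — here, the nearest same-class points — as well as the sizes, regularization weight, and data are illustrative assumptions; the closed form is written in its summed-pairs form rather than the XJXᵀ matrix form):

```python
import numpy as np

def fit_move_labeled(X, y, n_targets=3, lam=1.0):
    """Sketch of the 'move labeled objects' idea from the slides.

    For each training point x_i we pick target neighbors T_i (here: the
    n_targets nearest same-class points -- an illustrative choice) and
    learn W minimizing sum_i sum_{z in T_i} ||x_i - W z||^2 + lam ||W||_F^2.
    """
    n, d = X.shape
    S = np.zeros((d, d))   # sum of x_i z^T over all pairs
    G = np.zeros((d, d))   # sum of z z^T over all pairs
    for i in range(n):
        same = np.where(y == y[i])[0]
        same = same[same != i]
        dists = np.linalg.norm(X[same] - X[i], axis=1)
        for j in same[np.argsort(dists)[:n_targets]]:
            S += np.outer(X[i], X[j])
            G += np.outer(X[j], X[j])
    return S @ np.linalg.inv(G + lam * np.eye(d))

def predict(W, X_train, y_train, x, k=5):
    # Dissimilarity f(x, x_i) = ||x - W x_i||^2: only labeled objects move.
    d = np.sum((x - X_train @ W.T) ** 2, axis=1)
    nn = np.argsort(d)[:k]
    vals, counts = np.unique(y_train[nn], return_counts=True)
    return int(vals[np.argmax(counts)])

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(size=(100, 10)) + 2.0,
               rng.normal(size=(100, 10)) - 2.0])
y = np.array([0] * 100 + [1] * 100)
W = fit_move_labeled(X, y)
print(predict(W, X, y, np.full(10, 2.0)))   # a query near class 0
```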
Move labeled objects vs. move query
•Move labeled objects (proposal): f(x, x_i) = ‖x − Wx_i‖²
This reduces the variance
= reducing the emergence of hubs.

•Move query: f(x, x_i) = ‖Mx − x_i‖²
This increases the variance
= promoting the emergence of hubs.
Experiments

Experimental objective
We evaluate the proposed method on various datasets.

Our main questions are:
-Does it suppress hubs?

-Does it improve the classification accuracy?

-Is it faster to train than distance metric learning?
Results: Skewness (degree of hubness)
The proposed method

-reduces the emergence of hubs

Document datasets (smaller is better)
method RCV News Reuters TDT
original metric 13.35 21.93 7.61 4.89
LMNN 3.86 14.74 7.63 4.01
ITML 4.27 19.65 7.30 2.39
DML-eig 1.71 1.45 3.05 1.34
Move-labeled (proposed) 1.14 2.88 4.53 1.44
Move-query 21.57 33.36 17.49 6.71

Image datasets (smaller is better)
method AwA CUB SUN aPY
original metric 2.49 2.38 2.52 2.80
LMNN 3.10 2.96 2.80 3.94
ITML 2.42 2.27 2.37 2.69
DML-eig 1.90 1.77 2.39 2.17
Move-labeled (proposed) 1.24 0.97 1.02 1.23
Move-query 7.81 7.83 7.48 11.65
Results: Classification accuracy [%]
The proposed method

-reduces the emergence of hubs

-is better than metric learning methods on most datasets

Document datasets
method RCV News Reuters TDT
original metric 92.1 76.9 89.5 96.1
LMNN 94.7 79.9 91.5 96.6
ITML 93.2 77.0 90.8 96.5
DML-eig 94.5 73.3 85.9 95.7
Move-labeled (proposed) 94.4 81.6 91.6 96.7
Move-query 89.1 70.0 85.9 95.4

Image datasets
method AwA CUB SUN aPY
original metric 83.2 51.6 26.2 82.2
LMNN 83.0 54.7 24.4 81.8
ITML 83.1 51.3 26.0 82.4
DML-eig 82.0 53.5 22.4 81.6
Move-labeled (proposed) 84.1 52.4 28.3 83.4
Move-query 79.2 43.3 14.6 78.7
Results: Training time [s]
The proposed method

-reduces the emergence of hubs

-is better than metric learning methods on most datasets

-is faster to train than the metric learning methods on all datasets

Image datasets
method AwA CUB SUN aPY
LMNN 1525.5 1098.2 15704.3 317.3
ITML 1536.3 577.6 1126.4 9211.2
DML-eig 2048.0 2084.7 2006.1 1787.1
proposed 9.5 1.5 4.1 6.4
Results: UCI datasets
The proposed method

-reduces the emergence of hubs

-is better than metric learning methods on most datasets

-is faster to train than the metric learning methods on all datasets

-does not work well on UCI datasets

Table: Classification accuracy [%] (bold figures in the original slide indicate the best performer for each dataset)

UCI datasets
method ionosphere balance-scale iris wine glass
original metric 86.8 89.5 97.2 98.1 68.1
LMNN 90.3 90.0 96.7 98.1 67.7
ITML 87.7 89.5 97.8 99.1 65.0
DML-eig 87.7 91.2 96.7 98.6 66.5
Move-labeled (proposed) 89.6 89.5 97.2 98.6 70.8
Move-query 79.7 89.4 97.2 96.3 62.3
Summary
Prediction:
ŷ = arg min_{y_i : (x_i, y_i) ∈ D} ‖x − Wx_i‖²

The proposed method

-reduces the emergence of hubs

-is better than metric learning methods on most datasets

-is faster to train than the metric learning methods on all datasets

-does not work well on UCI datasets
Misc.
Other topics
• Normalization of distances

- Local scaling [Schnitzer+, 2012], Laplacian-based kernel [Suzuki+, 2012],
Localized centering [Hara+, 2015]

• Classifiers

-hw-kNN [Radovanović+, 2009], h-FNN [Tomašev+, 2013], NHBNN
[Tomašev+, 2011]
See the comprehensive surveys [Tomašev+, 2015; Suzuki, 2014; Radovanović, 2017]
Tools
• Hub miner: Hubness-aware machine learning

• Hub toolbox

• PyHubs

• Our code
!96
Conclusions
• Explained why hubs emerge

-spatial centrality

• Showed hub-reduction methods that improved the
performance of nearest neighbor methods

 
Deep Learning for Information Retrieval
Deep Learning for Information RetrievalDeep Learning for Information Retrieval
Deep Learning for Information Retrieval
 
Machine Learning Foundations for Professional Managers
Machine Learning Foundations for Professional ManagersMachine Learning Foundations for Professional Managers
Machine Learning Foundations for Professional Managers
 
Similarity and Contrast on Conceptual Spaces for Pertinent Description Genera...
Similarity and Contrast on Conceptual Spaces for Pertinent Description Genera...Similarity and Contrast on Conceptual Spaces for Pertinent Description Genera...
Similarity and Contrast on Conceptual Spaces for Pertinent Description Genera...
 
The Search for a New Visual Search Beyond Language - StampedeCon AI Summit 2017
The Search for a New Visual Search Beyond Language - StampedeCon AI Summit 2017The Search for a New Visual Search Beyond Language - StampedeCon AI Summit 2017
The Search for a New Visual Search Beyond Language - StampedeCon AI Summit 2017
 
Clustering Methods with R
Clustering Methods with RClustering Methods with R
Clustering Methods with R
 
Combinations of Local Search and Constraint Programming
Combinations of Local Search and Constraint ProgrammingCombinations of Local Search and Constraint Programming
Combinations of Local Search and Constraint Programming
 
Lecture 6-computer vision features descriptors matching
Lecture 6-computer vision features descriptors matchingLecture 6-computer vision features descriptors matching
Lecture 6-computer vision features descriptors matching
 
Deep Learning for NLP: An Introduction to Neural Word Embeddings
Deep Learning for NLP: An Introduction to Neural Word EmbeddingsDeep Learning for NLP: An Introduction to Neural Word Embeddings
Deep Learning for NLP: An Introduction to Neural Word Embeddings
 
riken-RBlur-slides.pptx
riken-RBlur-slides.pptxriken-RBlur-slides.pptx
riken-RBlur-slides.pptx
 
Clustering Methods with R
Clustering Methods with RClustering Methods with R
Clustering Methods with R
 
Clustering - ACM 2013 02-25
Clustering - ACM 2013 02-25Clustering - ACM 2013 02-25
Clustering - ACM 2013 02-25
 
Yoav Goldberg: Word Embeddings What, How and Whither
Yoav Goldberg: Word Embeddings What, How and WhitherYoav Goldberg: Word Embeddings What, How and Whither
Yoav Goldberg: Word Embeddings What, How and Whither
 
16 17 bag_words
16 17 bag_words16 17 bag_words
16 17 bag_words
 
ObjRecog2-17 (1).pptx
ObjRecog2-17 (1).pptxObjRecog2-17 (1).pptx
ObjRecog2-17 (1).pptx
 
Fast Single-pass K-means Clusterting at Oxford
Fast Single-pass K-means Clusterting at Oxford Fast Single-pass K-means Clusterting at Oxford
Fast Single-pass K-means Clusterting at Oxford
 

Plus de STAIR Lab, Chiba Institute of Technology

リアクティブプログラミングにおける時変値永続化の試み (第2回ステアラボソフトウェア技術セミナー)
リアクティブプログラミングにおける時変値永続化の試み (第2回ステアラボソフトウェア技術セミナー)リアクティブプログラミングにおける時変値永続化の試み (第2回ステアラボソフトウェア技術セミナー)
リアクティブプログラミングにおける時変値永続化の試み (第2回ステアラボソフトウェア技術セミナー)STAIR Lab, Chiba Institute of Technology
 
制約解消によるプログラム検証・合成 (第1回ステアラボソフトウェア技術セミナー)
制約解消によるプログラム検証・合成 (第1回ステアラボソフトウェア技術セミナー)制約解消によるプログラム検証・合成 (第1回ステアラボソフトウェア技術セミナー)
制約解消によるプログラム検証・合成 (第1回ステアラボソフトウェア技術セミナー)STAIR Lab, Chiba Institute of Technology
 
グラフ構造データに対する深層学習〜創薬・材料科学への応用とその問題点〜 (第26回ステアラボ人工知能セミナー)
グラフ構造データに対する深層学習〜創薬・材料科学への応用とその問題点〜 (第26回ステアラボ人工知能セミナー)グラフ構造データに対する深層学習〜創薬・材料科学への応用とその問題点〜 (第26回ステアラボ人工知能セミナー)
グラフ構造データに対する深層学習〜創薬・材料科学への応用とその問題点〜 (第26回ステアラボ人工知能セミナー)STAIR Lab, Chiba Institute of Technology
 
企業化する大学と、公益化する企業。そして、人工知能の社会実装に向けて。(ステアラボ人工知能シンポジウム)
企業化する大学と、公益化する企業。そして、人工知能の社会実装に向けて。(ステアラボ人工知能シンポジウム)企業化する大学と、公益化する企業。そして、人工知能の社会実装に向けて。(ステアラボ人工知能シンポジウム)
企業化する大学と、公益化する企業。そして、人工知能の社会実装に向けて。(ステアラボ人工知能シンポジウム)STAIR Lab, Chiba Institute of Technology
 
画像キャプションと動作認識の最前線 〜データセットに注目して〜(第17回ステアラボ人工知能セミナー)
画像キャプションと動作認識の最前線 〜データセットに注目して〜(第17回ステアラボ人工知能セミナー)画像キャプションと動作認識の最前線 〜データセットに注目して〜(第17回ステアラボ人工知能セミナー)
画像キャプションと動作認識の最前線 〜データセットに注目して〜(第17回ステアラボ人工知能セミナー)STAIR Lab, Chiba Institute of Technology
 
文法および流暢性を考慮した頑健なテキスト誤り訂正 (第15回ステアラボ人工知能セミナー)
文法および流暢性を考慮した頑健なテキスト誤り訂正 (第15回ステアラボ人工知能セミナー)文法および流暢性を考慮した頑健なテキスト誤り訂正 (第15回ステアラボ人工知能セミナー)
文法および流暢性を考慮した頑健なテキスト誤り訂正 (第15回ステアラボ人工知能セミナー)STAIR Lab, Chiba Institute of Technology
 

Plus de STAIR Lab, Chiba Institute of Technology (7)

リアクティブプログラミングにおける時変値永続化の試み (第2回ステアラボソフトウェア技術セミナー)
リアクティブプログラミングにおける時変値永続化の試み (第2回ステアラボソフトウェア技術セミナー)リアクティブプログラミングにおける時変値永続化の試み (第2回ステアラボソフトウェア技術セミナー)
リアクティブプログラミングにおける時変値永続化の試み (第2回ステアラボソフトウェア技術セミナー)
 
制約解消によるプログラム検証・合成 (第1回ステアラボソフトウェア技術セミナー)
制約解消によるプログラム検証・合成 (第1回ステアラボソフトウェア技術セミナー)制約解消によるプログラム検証・合成 (第1回ステアラボソフトウェア技術セミナー)
制約解消によるプログラム検証・合成 (第1回ステアラボソフトウェア技術セミナー)
 
グラフ構造データに対する深層学習〜創薬・材料科学への応用とその問題点〜 (第26回ステアラボ人工知能セミナー)
グラフ構造データに対する深層学習〜創薬・材料科学への応用とその問題点〜 (第26回ステアラボ人工知能セミナー)グラフ構造データに対する深層学習〜創薬・材料科学への応用とその問題点〜 (第26回ステアラボ人工知能セミナー)
グラフ構造データに対する深層学習〜創薬・材料科学への応用とその問題点〜 (第26回ステアラボ人工知能セミナー)
 
企業化する大学と、公益化する企業。そして、人工知能の社会実装に向けて。(ステアラボ人工知能シンポジウム)
企業化する大学と、公益化する企業。そして、人工知能の社会実装に向けて。(ステアラボ人工知能シンポジウム)企業化する大学と、公益化する企業。そして、人工知能の社会実装に向けて。(ステアラボ人工知能シンポジウム)
企業化する大学と、公益化する企業。そして、人工知能の社会実装に向けて。(ステアラボ人工知能シンポジウム)
 
メテオサーチチャレンジ報告 (2位解法)
メテオサーチチャレンジ報告 (2位解法)メテオサーチチャレンジ報告 (2位解法)
メテオサーチチャレンジ報告 (2位解法)
 
画像キャプションと動作認識の最前線 〜データセットに注目して〜(第17回ステアラボ人工知能セミナー)
画像キャプションと動作認識の最前線 〜データセットに注目して〜(第17回ステアラボ人工知能セミナー)画像キャプションと動作認識の最前線 〜データセットに注目して〜(第17回ステアラボ人工知能セミナー)
画像キャプションと動作認識の最前線 〜データセットに注目して〜(第17回ステアラボ人工知能セミナー)
 
文法および流暢性を考慮した頑健なテキスト誤り訂正 (第15回ステアラボ人工知能セミナー)
文法および流暢性を考慮した頑健なテキスト誤り訂正 (第15回ステアラボ人工知能セミナー)文法および流暢性を考慮した頑健なテキスト誤り訂正 (第15回ステアラボ人工知能セミナー)
文法および流暢性を考慮した頑健なテキスト誤り訂正 (第15回ステアラボ人工知能セミナー)
 

Dernier

Call Girls Bannerghatta Road Just Call 👗 7737669865 👗 Top Class Call Girl Ser...
Call Girls Bannerghatta Road Just Call 👗 7737669865 👗 Top Class Call Girl Ser...Call Girls Bannerghatta Road Just Call 👗 7737669865 👗 Top Class Call Girl Ser...
Call Girls Bannerghatta Road Just Call 👗 7737669865 👗 Top Class Call Girl Ser...amitlee9823
 
➥🔝 7737669865 🔝▻ Sambalpur Call-girls in Women Seeking Men 🔝Sambalpur🔝 Esc...
➥🔝 7737669865 🔝▻ Sambalpur Call-girls in Women Seeking Men  🔝Sambalpur🔝   Esc...➥🔝 7737669865 🔝▻ Sambalpur Call-girls in Women Seeking Men  🔝Sambalpur🔝   Esc...
➥🔝 7737669865 🔝▻ Sambalpur Call-girls in Women Seeking Men 🔝Sambalpur🔝 Esc...amitlee9823
 
👉 Amritsar Call Girl 👉📞 6367187148 👉📞 Just📲 Call Ruhi Call Girl Phone No Amri...
👉 Amritsar Call Girl 👉📞 6367187148 👉📞 Just📲 Call Ruhi Call Girl Phone No Amri...👉 Amritsar Call Girl 👉📞 6367187148 👉📞 Just📲 Call Ruhi Call Girl Phone No Amri...
👉 Amritsar Call Girl 👉📞 6367187148 👉📞 Just📲 Call Ruhi Call Girl Phone No Amri...karishmasinghjnh
 
Call Girls Indiranagar Just Call 👗 9155563397 👗 Top Class Call Girl Service B...
Call Girls Indiranagar Just Call 👗 9155563397 👗 Top Class Call Girl Service B...Call Girls Indiranagar Just Call 👗 9155563397 👗 Top Class Call Girl Service B...
Call Girls Indiranagar Just Call 👗 9155563397 👗 Top Class Call Girl Service B...only4webmaster01
 
Call Girls Hsr Layout Just Call 👗 7737669865 👗 Top Class Call Girl Service Ba...
Call Girls Hsr Layout Just Call 👗 7737669865 👗 Top Class Call Girl Service Ba...Call Girls Hsr Layout Just Call 👗 7737669865 👗 Top Class Call Girl Service Ba...
Call Girls Hsr Layout Just Call 👗 7737669865 👗 Top Class Call Girl Service Ba...amitlee9823
 
Detecting Credit Card Fraud: A Machine Learning Approach
Detecting Credit Card Fraud: A Machine Learning ApproachDetecting Credit Card Fraud: A Machine Learning Approach
Detecting Credit Card Fraud: A Machine Learning ApproachBoston Institute of Analytics
 
➥🔝 7737669865 🔝▻ Bangalore Call-girls in Women Seeking Men 🔝Bangalore🔝 Esc...
➥🔝 7737669865 🔝▻ Bangalore Call-girls in Women Seeking Men  🔝Bangalore🔝   Esc...➥🔝 7737669865 🔝▻ Bangalore Call-girls in Women Seeking Men  🔝Bangalore🔝   Esc...
➥🔝 7737669865 🔝▻ Bangalore Call-girls in Women Seeking Men 🔝Bangalore🔝 Esc...amitlee9823
 
Call Girls In Nandini Layout ☎ 7737669865 🥵 Book Your One night Stand
Call Girls In Nandini Layout ☎ 7737669865 🥵 Book Your One night StandCall Girls In Nandini Layout ☎ 7737669865 🥵 Book Your One night Stand
Call Girls In Nandini Layout ☎ 7737669865 🥵 Book Your One night Standamitlee9823
 
Just Call Vip call girls Palakkad Escorts ☎️9352988975 Two shot with one girl...
Just Call Vip call girls Palakkad Escorts ☎️9352988975 Two shot with one girl...Just Call Vip call girls Palakkad Escorts ☎️9352988975 Two shot with one girl...
Just Call Vip call girls Palakkad Escorts ☎️9352988975 Two shot with one girl...gajnagarg
 
Just Call Vip call girls Erode Escorts ☎️9352988975 Two shot with one girl (E...
Just Call Vip call girls Erode Escorts ☎️9352988975 Two shot with one girl (E...Just Call Vip call girls Erode Escorts ☎️9352988975 Two shot with one girl (E...
Just Call Vip call girls Erode Escorts ☎️9352988975 Two shot with one girl (E...gajnagarg
 
VIP Model Call Girls Hinjewadi ( Pune ) Call ON 8005736733 Starting From 5K t...
VIP Model Call Girls Hinjewadi ( Pune ) Call ON 8005736733 Starting From 5K t...VIP Model Call Girls Hinjewadi ( Pune ) Call ON 8005736733 Starting From 5K t...
VIP Model Call Girls Hinjewadi ( Pune ) Call ON 8005736733 Starting From 5K t...SUHANI PANDEY
 
Call Girls Jalahalli Just Call 👗 7737669865 👗 Top Class Call Girl Service Ban...
Call Girls Jalahalli Just Call 👗 7737669865 👗 Top Class Call Girl Service Ban...Call Girls Jalahalli Just Call 👗 7737669865 👗 Top Class Call Girl Service Ban...
Call Girls Jalahalli Just Call 👗 7737669865 👗 Top Class Call Girl Service Ban...amitlee9823
 
Digital Advertising Lecture for Advanced Digital & Social Media Strategy at U...
Digital Advertising Lecture for Advanced Digital & Social Media Strategy at U...Digital Advertising Lecture for Advanced Digital & Social Media Strategy at U...
Digital Advertising Lecture for Advanced Digital & Social Media Strategy at U...Valters Lauzums
 
Just Call Vip call girls Mysore Escorts ☎️9352988975 Two shot with one girl (...
Just Call Vip call girls Mysore Escorts ☎️9352988975 Two shot with one girl (...Just Call Vip call girls Mysore Escorts ☎️9352988975 Two shot with one girl (...
Just Call Vip call girls Mysore Escorts ☎️9352988975 Two shot with one girl (...gajnagarg
 
Call Girls In Attibele ☎ 7737669865 🥵 Book Your One night Stand
Call Girls In Attibele ☎ 7737669865 🥵 Book Your One night StandCall Girls In Attibele ☎ 7737669865 🥵 Book Your One night Stand
Call Girls In Attibele ☎ 7737669865 🥵 Book Your One night Standamitlee9823
 
Aspirational Block Program Block Syaldey District - Almora
Aspirational Block Program Block Syaldey District - AlmoraAspirational Block Program Block Syaldey District - Almora
Aspirational Block Program Block Syaldey District - AlmoraGovindSinghDasila
 
➥🔝 7737669865 🔝▻ malwa Call-girls in Women Seeking Men 🔝malwa🔝 Escorts Ser...
➥🔝 7737669865 🔝▻ malwa Call-girls in Women Seeking Men  🔝malwa🔝   Escorts Ser...➥🔝 7737669865 🔝▻ malwa Call-girls in Women Seeking Men  🔝malwa🔝   Escorts Ser...
➥🔝 7737669865 🔝▻ malwa Call-girls in Women Seeking Men 🔝malwa🔝 Escorts Ser...amitlee9823
 
DATA SUMMIT 24 Building Real-Time Pipelines With FLaNK
DATA SUMMIT 24  Building Real-Time Pipelines With FLaNKDATA SUMMIT 24  Building Real-Time Pipelines With FLaNK
DATA SUMMIT 24 Building Real-Time Pipelines With FLaNKTimothy Spann
 

Dernier (20)

Abortion pills in Doha Qatar (+966572737505 ! Get Cytotec
Abortion pills in Doha Qatar (+966572737505 ! Get CytotecAbortion pills in Doha Qatar (+966572737505 ! Get Cytotec
Abortion pills in Doha Qatar (+966572737505 ! Get Cytotec
 
Call Girls Bannerghatta Road Just Call 👗 7737669865 👗 Top Class Call Girl Ser...
Call Girls Bannerghatta Road Just Call 👗 7737669865 👗 Top Class Call Girl Ser...Call Girls Bannerghatta Road Just Call 👗 7737669865 👗 Top Class Call Girl Ser...
Call Girls Bannerghatta Road Just Call 👗 7737669865 👗 Top Class Call Girl Ser...
 
➥🔝 7737669865 🔝▻ Sambalpur Call-girls in Women Seeking Men 🔝Sambalpur🔝 Esc...
➥🔝 7737669865 🔝▻ Sambalpur Call-girls in Women Seeking Men  🔝Sambalpur🔝   Esc...➥🔝 7737669865 🔝▻ Sambalpur Call-girls in Women Seeking Men  🔝Sambalpur🔝   Esc...
➥🔝 7737669865 🔝▻ Sambalpur Call-girls in Women Seeking Men 🔝Sambalpur🔝 Esc...
 
👉 Amritsar Call Girl 👉📞 6367187148 👉📞 Just📲 Call Ruhi Call Girl Phone No Amri...
👉 Amritsar Call Girl 👉📞 6367187148 👉📞 Just📲 Call Ruhi Call Girl Phone No Amri...👉 Amritsar Call Girl 👉📞 6367187148 👉📞 Just📲 Call Ruhi Call Girl Phone No Amri...
👉 Amritsar Call Girl 👉📞 6367187148 👉📞 Just📲 Call Ruhi Call Girl Phone No Amri...
 
Call Girls Indiranagar Just Call 👗 9155563397 👗 Top Class Call Girl Service B...
Call Girls Indiranagar Just Call 👗 9155563397 👗 Top Class Call Girl Service B...Call Girls Indiranagar Just Call 👗 9155563397 👗 Top Class Call Girl Service B...
Call Girls Indiranagar Just Call 👗 9155563397 👗 Top Class Call Girl Service B...
 
Call Girls Hsr Layout Just Call 👗 7737669865 👗 Top Class Call Girl Service Ba...
Call Girls Hsr Layout Just Call 👗 7737669865 👗 Top Class Call Girl Service Ba...Call Girls Hsr Layout Just Call 👗 7737669865 👗 Top Class Call Girl Service Ba...
Call Girls Hsr Layout Just Call 👗 7737669865 👗 Top Class Call Girl Service Ba...
 
Detecting Credit Card Fraud: A Machine Learning Approach
Detecting Credit Card Fraud: A Machine Learning ApproachDetecting Credit Card Fraud: A Machine Learning Approach
Detecting Credit Card Fraud: A Machine Learning Approach
 
➥🔝 7737669865 🔝▻ Bangalore Call-girls in Women Seeking Men 🔝Bangalore🔝 Esc...
➥🔝 7737669865 🔝▻ Bangalore Call-girls in Women Seeking Men  🔝Bangalore🔝   Esc...➥🔝 7737669865 🔝▻ Bangalore Call-girls in Women Seeking Men  🔝Bangalore🔝   Esc...
➥🔝 7737669865 🔝▻ Bangalore Call-girls in Women Seeking Men 🔝Bangalore🔝 Esc...
 
Call Girls In Nandini Layout ☎ 7737669865 🥵 Book Your One night Stand
Call Girls In Nandini Layout ☎ 7737669865 🥵 Book Your One night StandCall Girls In Nandini Layout ☎ 7737669865 🥵 Book Your One night Stand
Call Girls In Nandini Layout ☎ 7737669865 🥵 Book Your One night Stand
 
Just Call Vip call girls Palakkad Escorts ☎️9352988975 Two shot with one girl...
Just Call Vip call girls Palakkad Escorts ☎️9352988975 Two shot with one girl...Just Call Vip call girls Palakkad Escorts ☎️9352988975 Two shot with one girl...
Just Call Vip call girls Palakkad Escorts ☎️9352988975 Two shot with one girl...
 
Just Call Vip call girls Erode Escorts ☎️9352988975 Two shot with one girl (E...
Just Call Vip call girls Erode Escorts ☎️9352988975 Two shot with one girl (E...Just Call Vip call girls Erode Escorts ☎️9352988975 Two shot with one girl (E...
Just Call Vip call girls Erode Escorts ☎️9352988975 Two shot with one girl (E...
 
VIP Model Call Girls Hinjewadi ( Pune ) Call ON 8005736733 Starting From 5K t...
VIP Model Call Girls Hinjewadi ( Pune ) Call ON 8005736733 Starting From 5K t...VIP Model Call Girls Hinjewadi ( Pune ) Call ON 8005736733 Starting From 5K t...
VIP Model Call Girls Hinjewadi ( Pune ) Call ON 8005736733 Starting From 5K t...
 
(NEHA) Call Girls Katra Call Now 8617697112 Katra Escorts 24x7
(NEHA) Call Girls Katra Call Now 8617697112 Katra Escorts 24x7(NEHA) Call Girls Katra Call Now 8617697112 Katra Escorts 24x7
(NEHA) Call Girls Katra Call Now 8617697112 Katra Escorts 24x7
 
Call Girls Jalahalli Just Call 👗 7737669865 👗 Top Class Call Girl Service Ban...
Call Girls Jalahalli Just Call 👗 7737669865 👗 Top Class Call Girl Service Ban...Call Girls Jalahalli Just Call 👗 7737669865 👗 Top Class Call Girl Service Ban...
Call Girls Jalahalli Just Call 👗 7737669865 👗 Top Class Call Girl Service Ban...
 
Digital Advertising Lecture for Advanced Digital & Social Media Strategy at U...
Digital Advertising Lecture for Advanced Digital & Social Media Strategy at U...Digital Advertising Lecture for Advanced Digital & Social Media Strategy at U...
Digital Advertising Lecture for Advanced Digital & Social Media Strategy at U...
 
Just Call Vip call girls Mysore Escorts ☎️9352988975 Two shot with one girl (...
Just Call Vip call girls Mysore Escorts ☎️9352988975 Two shot with one girl (...Just Call Vip call girls Mysore Escorts ☎️9352988975 Two shot with one girl (...
Just Call Vip call girls Mysore Escorts ☎️9352988975 Two shot with one girl (...
 
Call Girls In Attibele ☎ 7737669865 🥵 Book Your One night Stand
Call Girls In Attibele ☎ 7737669865 🥵 Book Your One night StandCall Girls In Attibele ☎ 7737669865 🥵 Book Your One night Stand
Call Girls In Attibele ☎ 7737669865 🥵 Book Your One night Stand
 
Aspirational Block Program Block Syaldey District - Almora
Aspirational Block Program Block Syaldey District - AlmoraAspirational Block Program Block Syaldey District - Almora
Aspirational Block Program Block Syaldey District - Almora
 
➥🔝 7737669865 🔝▻ malwa Call-girls in Women Seeking Men 🔝malwa🔝 Escorts Ser...
➥🔝 7737669865 🔝▻ malwa Call-girls in Women Seeking Men  🔝malwa🔝   Escorts Ser...➥🔝 7737669865 🔝▻ malwa Call-girls in Women Seeking Men  🔝malwa🔝   Escorts Ser...
➥🔝 7737669865 🔝▻ malwa Call-girls in Women Seeking Men 🔝malwa🔝 Escorts Ser...
 
DATA SUMMIT 24 Building Real-Time Pipelines With FLaNK
DATA SUMMIT 24  Building Real-Time Pipelines With FLaNKDATA SUMMIT 24  Building Real-Time Pipelines With FLaNK
DATA SUMMIT 24 Building Real-Time Pipelines With FLaNK
 

高次元空間におけるハブの出現 (Emergence of Hubs in High-Dimensional Space), 11th STAIR Lab AI Seminar

  • 1. 2017/07/21@STAIR Lab AI seminar Improving Nearest Neighbor Methods from the Perspective of Hubness Phenomenon Yutaro Shigeto STAIR Lab, Chiba Institute of Technology
  • 2. A complete reference list is available at https://yutaro-s.github.io/download/ref-20170721.html
  • 3. !3 Nearest neighbor methods are •a fundamental technique •used in various fields: NLP, CV, ML, DM
  • 4. !4 Nearest neighbor methods are •a fundamental technique •used in various fields: NLP, CV, ML, DM
  • 5. !5 Nearest neighbor methods are •a fundamental technique •used in various fields: NLP, CV, ML, DM
  • 6. •a fundamental technique •used in various fields: NLP, CV, ML, DM Nearest neighbor methods are !6 cat dog gorilla
  • 7. •a fundamental technique •used in various fields: NLP, CV, ML, DM Nearest neighbor methods are !7 cat dog gorilla
  • 8. •a fundamental technique •used in various fields: NLP, CV, ML, DM Nearest neighbor methods are !8 cat dog gorilla
  • 9. •a fundamental technique •used in various fields: NLP, CV, ML, DM Nearest neighbor methods are !9cat cat dog gorilla
  • 11. !11 The nearest neighbors of many queries are the same objects (“hubs”) Hubness Phenomenon cat [Radovanović+, 2010]
  • 12. !12 The nearest neighbors of many queries are the same objects (“hubs”) Hubness Phenomenon [Radovanović+, 2010] cat
  • 13. !13 The nearest neighbors of many queries are the same objects (“hubs”) Hubness Phenomenon [Radovanović+, 2010] cat
  • 14. !14 The nearest neighbors of many queries are the same objects (“hubs”) Hubness Phenomenon hub [Radovanović+, 2010] cat
  • 15. Why hubs emerge? Let X be a normal distribution (zero mean), and fix objects y1, y2 with ||y1|| < ||y2||. Then it can be shown that E_X[||x − y2||] − E_X[||x − y1||] > 0, i.e., y1 is more likely to be closer to x, and hence more likely to be a hub. [Radovanović+, 2010]
  • 16. Why hubs emerge? Let X be a normal distribution (zero mean), and fix objects y1, y2 with ||y1|| < ||y2||. Then it can be shown that E_X[||x − y2||] − E_X[||x − y1||] > 0, i.e., y1 is more likely to be closer to x, and hence more likely to be a hub. Because this holds for any pair y1 and y2, objects closest to the origin tend to be hubs. This bias is called "spatial centrality." [Radovanović+, 2010]
  • 17. Variants of the same bias for other measures. Squared Euclidean distance [Shigeto+, 2015]: E_X[||x − y2||²] − E_X[||x − y1||²] > 0. Inner product [Suzuki+, 2013]: (1/|D|) Σ_{x∈D} ⟨x, y2⟩ − (1/|D|) Σ_{x∈D} ⟨x, y1⟩ < 0.
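Hubness is commonly quantified by the skewness of the k-occurrence distribution N_k, where N_k(x) counts how many queries have x among their k nearest neighbors. A minimal sketch (the random Gaussian data here are purely illustrative):

```python
import numpy as np

def nk_skewness(data, queries, k=10):
    """Skewness of the k-occurrence distribution N_k.
    Large positive skewness indicates the presence of hubs."""
    # squared Euclidean distances, computed without a huge broadcast
    d2 = ((queries ** 2).sum(1)[:, None] + (data ** 2).sum(1)[None, :]
          - 2 * queries @ data.T)
    knn = np.argsort(d2, axis=1)[:, :k]                  # k nearest data points per query
    counts = np.bincount(knn.ravel(), minlength=len(data))
    return ((counts - counts.mean()) ** 3).mean() / counts.std() ** 3

rng = np.random.default_rng(0)
data = rng.standard_normal((500, 100))       # high-dimensional i.i.d. Gaussian data
queries = rng.standard_normal((1000, 100))
print(nk_skewness(data, queries))            # clearly positive: hubs emerge
```

In low dimensions the same measurement gives skewness near zero, which is how the deck's later experiments (N_1 skewness) report hubness.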
  • 18. !18 Research Objective: Improve the performance of nearest neighbor methods via reducing the emergence of hubs Problem: The emergence of hubs diminishes nearest neighbor methods
  • 20. Centering: Reducing spatial centrality [Suzuki+, 2013] Spatial centrality implies that the object most similar to the centroid tends to become a hub.
  • 21. Centering: Reducing spatial centrality [Suzuki+, 2013] After centering, the centroid coincides with the origin, so every object's similarity to the centroid is identical: i.e., zero.
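With inner-product similarity the effect of centering is easy to verify numerically; a small sketch (random shifted data, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20)) + 3.0   # data with a clearly non-zero centroid
centroid = X.mean(axis=0)

# Before centering: similarity to the centroid varies across objects, so
# objects close to the centroid are biased toward becoming hubs.
sims = X @ centroid

# After centering: the centroid is the origin, and every object's similarity
# to the (new) centroid is identically zero.
Xc = X - centroid
sims_c = Xc @ Xc.mean(axis=0)

print(sims.std() > 0, np.allclose(sims_c, 0.0))
```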
  • 22. Mutual proximity: Breaking asymmetric relations [Schnitzer+, 2012] Although a hub becomes the nearest neighbor of many objects, those objects cannot become the nearest neighbor of the hub. Mutual proximity makes neighbor relations symmetric.
  • 23. Mutual proximity: Breaking asymmetric relations [Schnitzer+, 2012] Although a hub becomes the nearest neighbor of many objects, those objects cannot become the nearest neighbor of the hub. Mutual proximity makes neighbor relations symmetric.
  • 24. Mutual proximity: Breaking asymmetric relations [Schnitzer+, 2012] Although a hub becomes the nearest neighbor of many objects, those objects cannot become the nearest neighbor of the hub. Mutual proximity makes neighbor relations symmetric.
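An empirical mutual-proximity transform can be sketched from a distance matrix as follows. This is the independence-based empirical variant, a sketch rather than the exact formulation of [Schnitzer+, 2012]:

```python
import numpy as np

def mutual_proximity(D):
    """Empirical mutual-proximity sketch: MP(i, j) estimates the probability
    that a random object is farther from i than j is AND farther from j than
    i is (independence assumed). The result is symmetric by construction."""
    n = D.shape[0]
    MP = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            MP[i, j] = (D[i] > D[i, j]).mean() * (D[j] > D[j, i]).mean()
    return MP

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 5))
D = np.sqrt(((X[:, None] - X[None, :]) ** 2).sum(-1))   # pairwise distances
MP = mutual_proximity(D)
print(np.allclose(MP, MP.T))   # neighbor relations are now symmetric
```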
  • 26. Zero-shot learning Active research topic in NLP, CV, ML Many applications: •Image labeling •Bilingual lexicon extraction + Many other cross-domain matching tasks !26 [Larochelle+, 2008]
  • 27. ZSL is a type of multi-class classification, but the classifier has to predict labels that do not appear in the training set (ZSL task vs. standard classification task).
  • 28. Pre-processing: Label embedding. Labels are embedded in a metric space, so objects and labels are both vectors (example space; label space with labels such as lion, tiger, chimpanzee).
  • 29. Training: find a projection function. Find a matrix M that projects examples into the label space.
  • 30. Prediction: Nearest neighbor search. Given a test object and test labels (e.g., lion, tiger, chimpanzee, leopard, gorilla), to predict the label of the test object:
  • 31. Prediction: Nearest neighbor search. Given a test object and test labels, to predict the label of the test object: 1. project the example into the label space using matrix M; 2. find the nearest label.
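These two steps can be sketched in a few lines. The projection matrix M and the label vectors below are illustrative placeholders, not trained values:

```python
import numpy as np

def zsl_predict(M, x, label_vecs, label_names):
    """ZSL prediction sketch: project the test example into label space
    with M, then return the nearest label embedding."""
    z = M @ x
    d2 = ((label_vecs - z) ** 2).sum(axis=1)
    return label_names[int(np.argmin(d2))]

M = np.eye(3)                                   # stand-in for a trained projection
labels = np.array([[1.0, 0.0, 0.0],             # "lion"
                   [0.0, 1.0, 0.0],             # "tiger"
                   [0.0, 0.0, 1.0]])            # "chimpanzee"
names = ["lion", "tiger", "chimpanzee"]
print(zsl_predict(M, np.array([0.9, 0.1, 0.0]), labels, names))  # -> lion
```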
  • 32. Hubness: Problem in ZSL !32 sheep zebra hippo rat label spaceexample space Classifier frequently predicts the same labels (“hubs”) [Dinu and Baroni, 2015; see also Radovanović+, 2010]
  • 33. !33 sheep zebra hippo rat label spaceexample space Classifier frequently predicts the same labels (“hubs”) Hubness: Problem in ZSL [Dinu and Baroni, 2015; see also Radovanović+, 2010]
  • 34. !34 Classifier frequently predicts the same labels (“hubs”) sheep zebra hippo rat label spaceexample space Hubness: Problem in ZSL [Dinu and Baroni, 2015; see also Radovanović+, 2010]
  • 35. !35 sheep zebra hippo rat label spaceexample space Classifier frequently predicts the same labels (“hubs”) Hubness: Problem in ZSL [Dinu and Baroni, 2015; see also Radovanović+, 2010]
  • 36. !36 sheep zebra hippo rat label spaceexample space Classifier frequently predicts the same labels (“hubs”) Hubness: Problem in ZSL [Dinu and Baroni, 2015; see also Radovanović+, 2010]
  • 37. Problem with the current regression approach: the learned classifier frequently predicts the same labels (emergence of "hub" labels). Research objective: investigate why hubs emerge in regression-based ZSL, and how to reduce their emergence.
  • 39. Current approach: !39 Proposed approach: chimpanzee lion tigerlabel spaceexample space M example space chimpanzee lion tigerlabel space M
  • 40. Current approach: Proposed approach: chimpanzee lion tigerlabel spaceexample space M example space chimpanzee lion tigerlabel space label spaceexample space label spaceexample space !40 leopard gorilla M leopard gorillaM
  • 41. Synthetic data result. Hubness (N_1 skewness): Current 24.2 vs. Proposed 0.5; Accuracy: Current 13.8 vs. Proposed 87.6. The proposed approach reduces hubness and improves accuracy.
  • 42. Why proposed approach reduces hubness Shrinkage in regression !42 Argument for our proposal relies on two concepts Spatial centrality of data distributions
  • 43. "Shrinkage" in ridge/least squares regression [See also Lazaridou+, 2015]: if we optimize min_M Σ_i ||y_i − Mx_i||² + λ||M||²_F, then the projected objects Mx have smaller variance than the targets y.
  • 44. "Shrinkage" in ridge/least squares regression [See also Lazaridou+, 2015]: if we optimize min_M Σ_i ||y_i − Mx_i||² + λ||M||²_F, then the projected objects Mx have smaller variance than the targets y. For simplicity, the projected objects are assumed to also follow a normal distribution.
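The shrinkage effect is easy to observe numerically; a sketch with synthetic data (the sizes, noise level, and λ below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))                                          # examples
Y = X @ rng.standard_normal((10, 5)) + 0.5 * rng.standard_normal((200, 5))  # targets

lam = 1.0
# ridge solution of min_M sum_i ||y_i - M x_i||^2 + lam ||M||_F^2
M = np.linalg.solve(X.T @ X + lam * np.eye(10), X.T @ Y)
proj = X @ M

# shrinkage: the projected objects have smaller variance than the targets
print(proj.var() < Y.var())
```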
  • 45. Why proposed approach reduces hubness Shrinkage in regression !45 Argument for our proposal relies on two concepts Spatial centrality of data distributions ✔
  • 46. "Spatial centrality" [See also Radovanović+, 2010]. X: query distribution (zero mean). Fix objects y1, y2 with ||y1|| < ||y2||.
  • 47. "Spatial centrality" [See also Radovanović+, 2010]. X: query distribution (zero mean). Fix objects y1, y2 with ||y1|| < ||y2||. Then it can be shown that E_X[||x − y2||²] − E_X[||x − y1||²] > 0, i.e., y1 is more likely to be closer to x, and hence more likely to be a hub.
  • 48. "Spatial centrality" [See also Radovanović+, 2010]. X: query distribution (zero mean). Fix objects y1, y2 with ||y1|| < ||y2||. Then E_X[||x − y2||²] − E_X[||x − y1||²] > 0, i.e., y1 is more likely to be closer to x, and hence more likely to be a hub. Because this holds for any pair y1 and y2, objects closest to the origin tend to be hubs. This bias is called "spatial centrality."
  • 49. Degree of spatial centrality. Further assume the distributions of x and y.
  • 50. Degree of spatial centrality. Further assume the distributions of x and y; then the expected difference E_X[||x − y2||²] − E_X[||x − y1||²] can be written in closed form, and this formula quantifies the degree of spatial centrality.
  • 51. The smaller the variance of label distribution, the smaller the spatial centrality (= bias causing hubness) Spatial centrality depends on variance of label distributions !51
  • 52. Why proposed approach reduces hubness Shrinkage in regression !52 Argument for our proposal relies on two concepts Spatial centrality of data distributions ✔ ✔
  • 53. !53 Current approach: map X into Y Proposed approach: map Y into X
  • 54. !54 Current approach: map X into Y Proposed approach: map Y into X Shrinkage
  • 55. !55 Current approach: map X into Y Proposed approach: map Y into X
  • 56. !56 Current approach: map X into Y Proposed approach: map Y into X
  • 57. Q. Which configuration is better for reducing hubs? !57 Proposed Current
  • 58. Q. Which configuration is better for reducing hubs? Proposed (reverse direction) vs. Current (forward direction). Spatial centrality: for a fixed query distribution, a data distribution with smaller variance is preferable to reduce hubs.
  • 59. Q. Which configuration is better for reducing hubs? !59 Proposed Current
  • 60. Q. Which configuration is better for reducing hubs? Proposed vs. Current. Since the query distribution is not fixed, directly comparing the label distributions is not meaningful.
  • 61. !61 Proposed (scaled) Current Q. Which configuration is better for reducing hubs? Scaling does not change the nearest neighbor relation
  • 62. Q. Which configuration is better for reducing hubs? Proposed (scaled) vs. Current. A. The reverse direction is preferable: for a fixed query distribution, the variance of the target distribution in the proposed configuration is smaller.
  • 63. Summary of our proposal !63 Project labels into example space ➥ reduces variance of labels, hence suppresses hubness chimpanzee gorilla example space label space Label distribution with smaller variance is desirable to reduce hubness Spatial centrality Regression shrinks variance of projected objects Shrinkage Proposal
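The argument can be sketched numerically: fit ridge regressions in both directions and compare N_1 skewness of the resulting nearest-neighbor searches. All sizes, noise levels, and the one-label-per-example pairing below are illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np

def n1_skewness(queries, targets):
    """Skewness of the N_1 counts: how unevenly the targets are
    chosen as nearest neighbors of the queries."""
    d2 = ((queries ** 2).sum(1)[:, None] + (targets ** 2).sum(1)[None, :]
          - 2 * queries @ targets.T)
    counts = np.bincount(d2.argmin(axis=1), minlength=len(targets))
    s = counts.std()
    return 0.0 if s == 0 else ((counts - counts.mean()) ** 3).mean() / s ** 3

rng = np.random.default_rng(0)
Y = rng.standard_normal((200, 30))                                           # label vectors
X = Y @ rng.standard_normal((30, 30)) + 0.5 * rng.standard_normal((200, 30)) # examples

ridge = lambda A, B: np.linalg.solve(A.T @ A + np.eye(30), A.T @ B)
M = ridge(X, Y)   # current: project examples into label space
W = ridge(Y, X)   # proposed: project labels into example space

# compare hubness of the two search configurations
print(n1_skewness(X @ M, Y), n1_skewness(X, Y @ W))
```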
  • 65. Experimental objective !65 We evaluate proposed approach in real tasks •Does it suppress hubs? •Does it improve the prediction accuracy?
  • 66. Zero-shot tasks: • Bilingual lexicon extraction (source language → target language; e.g., gorilla → gorille). • Image labeling (image → label; e.g., gorilla, leopard).
  • 67. Compared methods !67 Current Proposed CCA We used Euclidean distance as a distance measure for finding the nearest label
  • 69. Bilingual lexicon extraction: Ja–En. [Bar charts comparing hubness (skewness; smaller is better) and accuracy [%] (larger is better) of the current, reverse (proposed), and CCA methods in both directions, Ja → En and En → Ja; the reverse mapping attains lower skewness and higher accuracy than the current approach and CCA.]
  • 70. Summary • Analyzed why hubs emerge in current ZSL approach - Variance of labels greater than examples • Proposed a simple method for reducing hubness - Reverse the mapping direction • Proposed method reduced hubness and outperformed current approach and CCA in image labeling and bilingual lexicon extraction tasks !70
  • 72. k-nearest neighbor classification. Given a dataset D = {(x_i, y_i)}_{i=1}^n, the label of x is decided by its k nearest neighbors: ŷ = arg min_{y_i : (x_i, y_i) ∈ D} f(x, x_i).
  • 73. k-nearest neighbor classification. Given a dataset D = {(x_i, y_i)}_{i=1}^n, the label of x is decided by its k nearest neighbors: ŷ = arg min_{y_i : (x_i, y_i) ∈ D} f(x, x_i).
  • 74. k-nearest neighbor classification. Given a dataset D = {(x_i, y_i)}_{i=1}^n, the label of x is decided by its k nearest neighbors: ŷ = arg min_{y_i : (x_i, y_i) ∈ D} f(x, x_i).
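A plain k-NN classifier with majority voting can be sketched in a few lines (the tiny 1-D dataset is purely illustrative):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Plain k-NN: the label of x is decided by majority vote
    among its k nearest training objects (Euclidean distance)."""
    d2 = ((X_train - x) ** 2).sum(axis=1)
    idx = np.argsort(d2)[:k]
    return Counter(y_train[i] for i in idx).most_common(1)[0][0]

X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1]])
y = np.array([0, 0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.05])), knn_predict(X, y, np.array([5.05])))
```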
  • 75. k-nearest neighbor classification. Given a dataset D = {(x_i, y_i)}_{i=1}^n, the label of x is decided by its k nearest neighbors: ŷ = arg min_{y_i : (x_i, y_i) ∈ D} f(x, x_i). Distance metric learning learns a matrix L: f(x, x_i) = ||Lx − Lx_i||². Training is computationally expensive.
  • 76. Proposal: Dissimilarity. Given a dataset D = {(x_i, y_i)}_{i=1}^n, the label of x is decided by its k nearest neighbors: ŷ = arg min_{y_i : (x_i, y_i) ∈ D} f(x, x_i).
  • 77. Proposal: Dissimilarity. Given a dataset D = {(x_i, y_i)}_{i=1}^n, the label of x is decided by its k nearest neighbors: ŷ = arg min_{y_i : (x_i, y_i) ∈ D} f(x, x_i). Spatial centrality: for a fixed query distribution, a data distribution with smaller variance is preferable to reduce hubs.
  • 78. Proposal: Dissimilarity. f(x, x_i) = ||x − Wx_i||². The function f needs to be computed only between labeled objects and an unlabeled object ➡ labeled objects are always the target of retrieval, and the unlabeled object is always the query.
  • 79. This method is not metric learning !79 •The goal of classification is to classify the query correctly -finding a suitable decision boundary (not metric)
  • 80. !80 min W n i=1 z Ti xi Wz 2 + W 2 F Find a matrix which minimizes the distance: Proposal: Training W
• 83. Proposal: Training. Find a matrix W which minimizes the distance: min_W Σ_{i=1}^n Σ_{z ∈ T_i} ||x_i - W z||^2 + λ ||W||_F^2. This function has the closed-form solution: W = X J X^T (X X^T + λI)^{-1}
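The closed form can be checked numerically. A sketch under stated assumptions: X holds labeled objects as columns, J is the target-neighbor indicator matrix, and lam is the ridge weight λ; these readings of the slide's symbols are assumptions, and `fit_W` is a hypothetical name:

```python
import numpy as np

def fit_W(X, J, lam):
    """Closed-form solution W = X J X^T (X X^T + lam I)^(-1).

    X: (d, n) labeled objects as columns; J: (n, n) target-neighbor
    indicator matrix; lam: regularization weight (assumed notation).
    """
    d = X.shape[0]
    return X @ J @ X.T @ np.linalg.inv(X @ X.T + lam * np.eye(d))

# sanity check: with J = I and negligible regularization, the minimizer
# of sum_i ||x_i - W x_i||^2 is (numerically) the identity map
rng = np.random.default_rng(1)
X = rng.standard_normal((3, 10))
W = fit_W(X, np.eye(10), 1e-8)
```

In practice one would also avoid the explicit inverse (e.g. `np.linalg.solve` on the transposed system), but the d×d inverse keeps the formula recognizable.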
• 84. Proposal: Test. Given a query object x, ŷ = arg min_{y_i : (x_i, y_i) ∈ D} ||x - W x_i||^2
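The test rule in code, assuming a learned W and labeled objects stored as rows (a sketch, not the authors' released implementation):

```python
import numpy as np

def predict(x, W, X, y):
    """1-NN prediction: move each labeled object x_i to W x_i, keep the query fixed."""
    mapped = X @ W.T                                   # row i is (W x_i)^T
    return y[np.argmin(np.linalg.norm(mapped - x, axis=1))]

# with W = I this degenerates to plain nearest-neighbor classification
X_train = np.array([[0.0, 0.0], [5.0, 5.0]])
y_train = np.array([0, 1])
```

Note that only the labeled objects pass through W; the query x is left untouched, which is the whole point of the next slide.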
• 86. Move labeled objects vs. move query • Move labeled objects (proposal): f(x, x_i) = ||x - W x_i||^2. This reduces the variance = reducing the emergence of hubs • Move query: f(x, x_i) = ||M x - x_i||^2. This increases the variance = promoting the emergence of hubs
• 88. Experimental objective. We evaluate the proposed method on various datasets. Our main focuses are: - Does it suppress hubs? - Does it improve the classification accuracy? - Is it faster than distance metric learning?
• 89. Results: Skewness (degree of hubness; smaller is better). The proposed method - reduces the emergence of hubs

Document datasets:
method                   | RCV   | News  | Reuters | TDT
original metric          | 13.35 | 21.93 | 7.61    | 4.89
LMNN                     | 3.86  | 14.74 | 7.63    | 4.01
ITML                     | 4.27  | 19.65 | 7.30    | 2.39
DML-eig                  | 1.71  | 1.45  | 3.05    | 1.34
Move-labeled (proposed)  | 1.14  | 2.88  | 4.53    | 1.44
Move-query               | 21.57 | 33.36 | 17.49   | 6.71

Image datasets:
method                   | AwA   | CUB   | SUN     | aPY
original metric          | 2.49  | 2.38  | 2.52    | 2.80
LMNN                     | 3.10  | 2.96  | 2.80    | 3.94
ITML                     | 2.42  | 2.27  | 2.37    | 2.69
DML-eig                  | 1.90  | 1.77  | 2.39    | 2.17
Move-labeled (proposed)  | 1.24  | 0.97  | 1.02    | 1.23
Move-query               | 7.81  | 7.83  | 7.48    | 11.65
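The skewness reported in these tables is the third standardized moment of the k-occurrence distribution N_k. A hedged sketch of how such a number is computed, on toy Gaussian data rather than the paper's datasets (`k_occurrence_skewness` is a hypothetical name):

```python
import numpy as np

def k_occurrence_skewness(Q, Z, k=10):
    """Skewness of N_k: how often each target in Z appears among the
    k nearest neighbors of the queries in Q. Large positive values
    mean a few hub targets dominate the neighbor lists."""
    d = np.linalg.norm(Q[:, None, :] - Z[None, :, :], axis=2)
    nk = np.bincount(np.argsort(d, axis=1)[:, :k].ravel(),
                     minlength=Z.shape[0]).astype(float)
    return ((nk - nk.mean()) ** 3).mean() / nk.std() ** 3

# hubness grows with dimensionality: same sample sizes, higher dimension
rng = np.random.default_rng(2)
skew_low = k_occurrence_skewness(rng.standard_normal((500, 3)),
                                 rng.standard_normal((200, 3)))
skew_high = k_occurrence_skewness(rng.standard_normal((500, 100)),
                                  rng.standard_normal((200, 100)))
```

A symmetric N_k distribution gives skewness near 0; values like the "original metric" rows above indicate strongly hub-dominated neighbor lists.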
• 90. Results: Classification accuracy [%]. The proposed method - reduces the emergence of hubs - is better than metric learning methods on most datasets

Document datasets:
method                   | RCV   | News  | Reuters | TDT
original metric          | 92.1  | 76.9  | 89.5    | 96.1
LMNN                     | 94.7  | 79.9  | 91.5    | 96.6
ITML                     | 93.2  | 77.0  | 90.8    | 96.5
DML-eig                  | 94.5  | 73.3  | 85.9    | 95.7
Move-labeled (proposed)  | 94.4  | 81.6  | 91.6    | 96.7
Move-query               | 89.1  | 70.0  | 85.9    | 95.4

Image datasets:
method                   | AwA   | CUB   | SUN     | aPY
original metric          | 83.2  | 51.6  | 26.2    | 82.2
LMNN                     | 83.0  | 54.7  | 24.4    | 81.8
ITML                     | 83.1  | 51.3  | 26.0    | 82.4
DML-eig                  | 82.0  | 53.5  | 22.4    | 81.6
Move-labeled (proposed)  | 84.1  | 52.4  | 28.3    | 83.4
Move-query               | 79.2  | 43.3  | 14.6    | 78.7
• 91. Results: Training time [s]. The proposed method - reduces the emergence of hubs - is better than metric learning methods on most datasets - is faster than … on all datasets

Image datasets:
method   | AwA    | CUB    | SUN     | aPY
LMNN     | 1525.5 | 1098.2 | 15704.3 | 317.3
ITML     | 1536.3 | 577.6  | 1126.4  | 9211.2
DML-eig  | 2048.0 | 2084.7 | 2006.1  | 1787.1
proposed | 9.5    | 1.5    | 4.1     | 6.4
• 92. Results: UCI datasets. The proposed method - reduces the emergence of hubs - is better than metric learning methods on most datasets - is faster than … on all datasets - does not work well on UCI datasets

Classification accuracy [%] on UCI datasets:
method                   | ionosphere | balance-scale | iris | wine | glass
original metric          | 86.8       | 89.5          | 97.2 | 98.1 | 68.1
LMNN                     | 90.3       | 90.0          | 96.7 | 98.1 | 67.7
ITML                     | 87.7       | 89.5          | 97.8 | 99.1 | 65.0
DML-eig                  | 87.7       | 91.2          | 96.7 | 98.6 | 66.5
Move-labeled (proposed)  | 89.6       | 89.5          | 97.2 | 98.6 | 70.8
Move-query               | 79.7       | 89.4          | 97.2 | 96.3 | 62.3
• 93. Summary. Prediction: ŷ = arg min_{y_i : (x_i, y_i) ∈ D} ||x - W x_i||^2. The proposed method - reduces the emergence of hubs - is better than metric learning methods on most datasets - is faster than … on all datasets - does not work well on UCI datasets
  • 94. Misc.
• 95. Other topics • Normalization of distances - Local scaling [Schnitzer+, 2012], Laplacian-based kernel [Suzuki+, 2012], Localized centering [Hara+, 2015] • Classifiers - hw-kNN [Radovanović+, 2009], h-FNN [Tomašev+, 2013], NHBNN [Tomašev+, 2011]. See the comprehensive surveys [Tomašev+, 2015; Suzuki, 2014; Radovanović, 2017]
• 96. Tools • Hub miner: Hubness-aware machine learning • Hub toolbox • PyHubs • Our code
• 97. Conclusions • Explained why hubs emerge - Spatial centrality • Showed hub reduction methods which improved the performance of nearest neighbor methods