
Deep Learning for Recommender Systems - Budapest RecSys Meetup

An overview of some deep learning methods for recommender systems, along with an intro to the relevant deep learning techniques such as convolutional neural networks (CNNs), recurrent neural networks (RNNs), autoencoders, restricted Boltzmann machines (RBMs), and more.



  1. Deep Learning for Recommender Systems Alexandros Karatzoglou Senior Research Scientist @ Telefonica Research first.lastname@gmail.com @alexk_z
  2. Telefonica Research Machine Learning HCI Network & Systems Mobile Computing http://www.tid.es
  3. Why Deep? ImageNet challenge error rates (red line = human performance)
  4. Why Deep?
  5. Inspiration for Neural Learning Early aviation attempts aimed at imitating birds, bats
  6. Neural Model
  7. Neuron a.k.a. Unit
  8. Feedforward Multilayered Network
  9. Learning
  10. Stochastic Gradient Descent Generalization of (Stochastic) Gradient Descent
  11. Stochastic Gradient Descent
  12. Stochastic Gradient Descent
  13. Stochastic Gradient Descent
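The SGD slides above can be boiled down to a few lines. This is a minimal illustrative sketch (not from the deck): it fits a single weight w in the model y = w·x to toy data generated from y = 3x, taking one gradient step per randomly ordered example.

```python
import random

def sgd(data, lr=0.05, epochs=100, seed=0):
    """Stochastic gradient descent on a one-parameter least-squares fit."""
    random.seed(seed)
    data = list(data)
    w = 0.0
    for _ in range(epochs):
        random.shuffle(data)               # visit examples in random order
        for x, y in data:                  # one example per update: the "stochastic" part
            grad = 2 * (w * x - y) * x     # d/dw of the squared error (w*x - y)^2
            w -= lr * grad                 # step against the gradient
    return w

# toy data from y = 3x; SGD should recover w close to 3
w = sgd([(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]])
```

In a deep network the scalar gradient above is replaced by the gradient vector that backpropagation computes, but the update rule is the same.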
  14. Feedforward Multilayered Network
  15. Backpropagation
  16. Backpropagation Does not work well in a plain "normal" multilayer deep network: vanishing gradients, slow learning. SVMs were easier to train, leading to the 2nd Neural Winter
  17. Modern Deep Networks Ingredients: Rectified Linear activation function, a.k.a. ReLU
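A quick illustration of why ReLU helps with the vanishing-gradient problem mentioned above (a sketch, not from the deck): its gradient is exactly 1 for positive inputs, so repeated multiplication through many layers does not shrink it the way saturating sigmoids do.

```python
def relu(x):
    # ReLU passes positive inputs through unchanged and zeroes out negatives
    return max(0.0, x)

def relu_grad(x):
    # gradient is exactly 1 on the active side, so stacked layers
    # do not multiply the error signal by values < 1 (no vanishing)
    return 1.0 if x > 0 else 0.0
```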
  18. Modern Deep Networks Ingredients: Dropout
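Dropout can be sketched in a few lines. This is an illustrative "inverted dropout" variant (an assumption on my part; the deck does not specify the variant): units are zeroed at random during training and the survivors are rescaled so the expected activation is unchanged, and at test time the layer is a no-op.

```python
import random

def dropout(activations, p=0.5, training=True, seed=None):
    """Inverted dropout: zero each unit with probability p during training
    and rescale survivors by 1/(1-p); do nothing at test time."""
    if not training:
        return list(activations)
    rng = random.Random(seed)
    keep = 1.0 - p
    return [a / keep if rng.random() < keep else 0.0 for a in activations]
```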
  19. Modern Deep Networks Ingredients: Mini-batches: stochastic gradient descent computes the gradient over many (50-100) data points (a minibatch) and then updates.
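The minibatch idea above is just a slicing pattern; a minimal sketch (not from the deck):

```python
def minibatches(data, batch_size):
    # yield successive fixed-size slices of the (pre-shuffled) data set;
    # the gradient is averaged over each slice before one update
    for i in range(0, len(data), batch_size):
        yield data[i:i + batch_size]
```

Averaging over 50-100 examples gives a less noisy gradient than a single example while still being far cheaper than the full data set.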
  20. Modern Deep Networks Ingredients: Softmax output
  21. Modern Deep Networks Ingredients: Categorical cross-entropy loss
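These two ingredients go together: softmax turns the network's output scores into a probability distribution over classes, and categorical cross-entropy penalises the negative log-probability of the correct class. A minimal pure-Python sketch (not from the deck), with the standard max-subtraction trick for numerical stability:

```python
import math

def softmax(logits):
    # subtract the max before exponentiating for numerical stability
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def cross_entropy(probs, target):
    # negative log-likelihood of the correct class
    return -math.log(probs[target])
```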
  22. Modern Feedforward Networks Ingredients: Batch Normalization
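The core of batch normalization is standardising each activation over the current minibatch, then letting learned parameters gamma and beta rescale and shift. A minimal sketch for one unit (not from the deck; training-time statistics only, without the running averages used at test time):

```python
def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Standardise the values of one unit across a minibatch x,
    then apply the learned scale (gamma) and shift (beta)."""
    mean = sum(x) / len(x)
    var = sum((xi - mean) ** 2 for xi in x) / len(x)
    return [gamma * (xi - mean) / ((var + eps) ** 0.5) + beta for xi in x]
```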
  23. Modern Feedforward Networks Ingredients: Adagrad, a.k.a. adaptive learning rates
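Adagrad gives every weight its own effective learning rate by dividing the global rate by the root of that weight's accumulated squared gradients. A minimal single-step sketch (not from the deck):

```python
def adagrad_step(w, g, cache, lr=0.1, eps=1e-8):
    """One Adagrad update: each weight's step is shrunk by the square
    root of the squared gradients it has accumulated so far."""
    new_cache = [c + gi * gi for c, gi in zip(cache, g)]
    new_w = [wi - lr * gi / (ci ** 0.5 + eps)
             for wi, gi, ci in zip(w, g, new_cache)]
    return new_w, new_cache
```

Weights that receive large or frequent gradients are slowed down, while rarely updated weights keep a larger step size, which is handy for the sparse inputs common in recommender data.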
  24. Restricted Boltzmann Machines
  25. Restricted Boltzmann Machines
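The defining property of an RBM, which the slides illustrate, is that with no visible-visible or hidden-hidden connections the hidden units are conditionally independent given the visible layer, so sampling them is one matrix multiply and a sigmoid. A minimal sketch of that sampling step (not from the deck):

```python
import numpy as np

def sample_hidden(v, W, b, rng):
    """Given visible units v, each hidden unit of an RBM turns on
    independently with probability sigmoid(W @ v + b)."""
    p = 1.0 / (1.0 + np.exp(-(W @ v + b)))   # activation probabilities
    h = (rng.random(p.shape) < p).astype(float)  # Bernoulli sample
    return h, p
```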
  26. Convolutional Networks
  27. Convolutional Networks [Krizhevsky 2012]
  28. Convolutional Networks [Faster R-CNN: Ren, He, Girshick, Sun 2015] [Farabet et al., 2012]
  29. Convolutional Networks [Faster R-CNN: Ren, He, Girshick, Sun 2015] [Farabet et al., 2012]
  30. Convolutional Networks Self-Driving Cars (convolutional example slides from Fei-Fei Li, Andrej Karpathy & Justin Johnson, Lecture 6)
  31. Convolutional Networks Stanford CS231n: Convolutional Neural Networks for Visual Recognition
  32. Convolutional Networks
  33. Convolutional Networks
  34. Convolutional Networks
  35. Convolutional Networks
  36. Convolutional Networks
  37. Convolutional Networks
  38. Convolutional Networks AlexNet [Krizhevsky et al. 2014]
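The core operation behind all the CNN slides above is sliding a small kernel of shared weights over the input. A minimal 2-D sketch (not from the deck; like most deep learning libraries it actually computes cross-correlation, i.e. the kernel is not flipped):

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D convolution: take the dot product of the kernel with
    every kernel-sized patch of the image; the same weights are reused
    at every position (weight sharing)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out
```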
  39. D-tour → Matrix Factorization (slide shows a user-item rating matrix, ratings 1-5 with missing entries, factored into d-dimensional user and item factors)
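The matrix-factorization detour can be sketched with SGD directly, which also connects it back to the earlier optimization slides. This is a toy illustration (not from the deck; the rating matrix and all hyperparameters here are made up): observed entries of R are approximated by dot products of learned user and item factors, with zeros treated as missing.

```python
import numpy as np

def factorize(R, k=2, lr=0.01, reg=0.01, epochs=1000, seed=0):
    """Factor rating matrix R into user factors P and item factors Q so
    that observed entries satisfy R[u, i] ~ P[u] . Q[i] (SGD with L2 reg)."""
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    P = rng.normal(scale=0.1, size=(n_users, k))
    Q = rng.normal(scale=0.1, size=(n_items, k))
    users, items = np.nonzero(R)          # treat zero entries as missing
    for _ in range(epochs):
        for u, i in zip(users, items):
            err = R[u, i] - P[u] @ Q[i]   # prediction error on one rating
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * P[u] - reg * Q[i])
    return P, Q
```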
  40. Convolutional Networks for enhancing Collaborative Filtering VBPR: Visual Bayesian Personalized Ranking from Implicit Feedback, He et al., AAAI 2016
  41. Convolutional Networks for Music Feature Extraction Deep learning can be used to learn item profiles, e.g. for music: map audio to a lower-dimensional space where it can be used directly for recommendation. Useful for recommending music from the long tail (not popular); a solution to the cold-start problem
  42. Convolutional Networks for Music Feature Extraction A. van den Oord, S. Dieleman, B. Schrauwen, Deep content-based music recommendation, NIPS 2013
  43. Convolutional Networks deepart.io
  44. Recurrent Neural Networks
  45. Recurrent Neural Networks Long Short-Term Memory
  46. Recurrent Neural Networks
  47. Recurrent Neural Networks PANDARUS: Alas, I think he shall be come approached and the day When little srain would be attain'd into being never fed, And who is but a chain and subjects of his death, I should not sleep. Second Senator: They are away this miseries, produced upon my soul, Breaking and strongly should be buried, when I perish The earth and thoughts of many states. DUKE VINCENTIO: Well, your wit is in the care of side and that. Second Lord: They would be ruled after this chamber, and my fair nues begun out of the fact, to be conveyed, Whose noble souls I'll have the heart of the wars. Clown: Come, sir, I will make did behold your worship. VIOLA: I'll drink it.
  48. Recurrent Neural Networks
  49. Recurrent Neural Networks
  50. Recurrent Neural Networks
  51. Recurrent Neural Networks
  52. Session-based Recommendation with Recurrent Neural Networks RNN (GRU) with ranking loss function, ICLR 2016 [B. Hidasi et al.] Treat each user session as a sequence of clicks
  53. Session-based Recommendation with Recurrent Neural Networks RNN (GRU) with ranking loss function, ICLR 2016 [B. Hidasi et al.] Treat each user session as a sequence of clicks
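The GRU at the heart of the session-based model above can be sketched as a single recurrence step. This is a minimal illustration (not Hidasi et al.'s implementation; biases omitted for brevity): the update gate z decides how much of the previous session state to keep, and the reset gate r decides how much of it feeds into the candidate state computed from the current click.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One GRU step: blend the previous hidden state h with a candidate
    state computed from the current input x, as chosen by the gates."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h)              # update gate
    r = sigmoid(Wr @ x + Ur @ h)              # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))  # candidate state
    return (1 - z) * h + z * h_tilde
```

In the session-based setting, x is the representation of the current click and the sequence of hidden states summarises the session so far; a ranking loss over the output scores then trains the network to score the next clicked item highly.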
  54. Autoencoders
  55. Autoencoders
  56. Autoencoders
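The autoencoder slides reduce to one idea: squeeze the input through a bottleneck code and train the network to reconstruct the input from it. A minimal forward-pass sketch (not from the deck; biases omitted, and the denoising variant used in the next slide would simply corrupt x before encoding):

```python
import numpy as np

def autoencoder_forward(x, W_enc, W_dec):
    """Encode x into a lower-dimensional code, then decode it back.
    Training would minimise the reconstruction error ||x - x_hat||^2."""
    code = np.tanh(W_enc @ x)   # bottleneck: fewer units than inputs
    x_hat = W_dec @ code        # reconstruction of the input
    return code, x_hat

def reconstruction_loss(x, x_hat):
    return float(np.sum((x - x_hat) ** 2))
```

For recommendation, x is typically a user's (sparse) interaction vector and the code becomes a learned user representation.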
  57. Personalized Autoencoders Collaborative Denoising Auto-Encoders for Top-N Recommender Systems, Wu et al., WSDM 2016
  58. (Some) Deep Learning Software Theano: Python library; TensorFlow: Python library; Keras: high-level Python library (Theano & TF backends); MXNet: R, Python, Julia
  59. Thanks ● Some slides or parts of slides are taken from other excellent talks and papers on Deep Learning (e.g. by Yann LeCun, Andrej Karpathy and other great deep learning researchers)