
Matrix Factorization Techniques for Recommender Systems
Catalogue
1. Paper Background
2. Introduction
3. Recommender System Strategies
4. Matrix Factorization Methods
5. A Basic Matrix Factorization Model
6. Learning Algorithms
7. Adding Biases
8. Adding Input Sources
9. Temporal Dynamics
10. Inputs with Varying Confidence Levels
11. Netflix Prize Competition
12. Conclusion
3、Recommender System Strategies
Two strategies: content filtering and collaborative filtering.
Collaborative filtering comprises:
1. Neighborhood methods (user-oriented or item-oriented)
2. Latent factor models
Dot product captures the user’s estimated interest in the item:

$\hat{r}_{ui} = q_i^T p_u$    (1)
Here, the elements of $q_i$ measure the extent to which the item possesses those factors.
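As a minimal numeric sketch of equation (1) (the factor values below are made up for illustration, not taken from the paper):

```python
import numpy as np

# Illustrative latent factor vectors with f = 3 factors (made-up values).
q_i = np.array([0.9, 0.2, -0.4])   # item factors: how strongly the item exhibits each factor
p_u = np.array([0.7, -0.1, 0.3])   # user factors: how much the user cares about each factor

# Equation (1): the predicted rating is the dot product q_i^T p_u.
r_hat_ui = q_i @ p_u
print(r_hat_ui)   # 0.49
```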
Implicit feedback includes purchase history, browsing history, search patterns, mouse movements, and so on.
5、A Basic Matrix Factorization Model
6.2、Alternating Least Squares
ALS techniques rotate between fixing the $q_i$'s and fixing the $p_u$'s.
ALS is favorable in at least two cases: when the system can use massive parallelization, and when the system is centered on implicit data.
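A rough sketch of the alternating scheme under simple assumptions (a small dense toy rating matrix with a 0/1 mask of observed entries; all data, names, and hyperparameters are illustrative, not the paper's implementation):

```python
import numpy as np

def als_step(R, M, fixed, lam):
    """Solve the regularized least-squares problem for one factor matrix
    while the other factor matrix (`fixed`) is held constant.
    R: ratings (rows are the side being solved for), M: 0/1 mask of observed entries."""
    f = fixed.shape[1]
    solved = np.zeros((R.shape[0], f))
    for idx in range(R.shape[0]):
        obs = M[idx] > 0                              # observed ratings in this row
        A = fixed[obs].T @ fixed[obs] + lam * np.eye(f)
        b = fixed[obs].T @ R[idx, obs]
        solved[idx] = np.linalg.solve(A, b)
    return solved

# Toy data: 4 users x 5 items; 0 marks a missing rating.
R = np.array([[5, 3, 0, 1, 4],
              [4, 0, 0, 1, 0],
              [1, 1, 0, 5, 4],
              [0, 1, 5, 4, 0]], dtype=float)
M = (R > 0).astype(float)

rng = np.random.default_rng(0)
n_factors, lam = 2, 0.1
P = rng.normal(scale=0.1, size=(4, n_factors))        # user factors p_u
Q = rng.normal(scale=0.1, size=(5, n_factors))        # item factors q_i

for _ in range(20):
    P = als_step(R, M, Q, lam)                        # fix the q_i's, solve for the p_u's
    Q = als_step(R.T, M.T, P, lam)                    # fix the p_u's, solve for the q_i's

print(P @ Q.T)                                        # predicted rating matrix
```

Because each row's least-squares subproblem is independent, the per-user and per-item solves can be distributed across cores or machines, which is the parallelization advantage the slide mentions.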
2、Introduction
Modern consumers are inundated with choices. More retailers have become interested in recommender systems, which analyze patterns of user interest in products to provide personalized recommendations that suit a user's taste. Companies such as Netflix have made recommender systems a salient part of their websites. They are particularly useful for entertainment products such as movies, music, and TV shows.
6.1、Stochastic Gradient Descent
The algorithm loops through all ratings in the training set. For each given training case, the system predicts $r_{ui}$ and computes the associated prediction error.
Two kinds of information: item attributes and user attributes.
The elements of $p_u$ measure the extent of interest the user has in items that are high on the corresponding factors.
Challenge: how to compute a mapping of items and users to factor vectors?
7、Adding Biases
A first-order approximation of the bias involved in rating $r_{ui}$ is as follows:

$b_{ui} = \mu + b_u + b_i$    (3)

Here, $\mu$ denotes the overall average rating.
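For example, if the overall average $\mu$ is 3.7 stars, an item tends to be rated 0.5 stars above the average ($b_i = 0.5$), and a user tends to rate 0.3 stars below the average ($b_u = -0.3$), then the first-order estimate of that user's rating of that item is $3.7 + 0.5 - 0.3 = 3.9$ stars.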
Neighborhood methods (user-oriented or item-oriented)
Latent factor models
3.2.1、Neighborhood methods
Centered on computing the relationships between items or users.
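As a hedged illustration of the item-oriented flavor (the toy ratings and the cosine-similarity choice are assumptions made for this sketch, not prescribed by the slides):

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two item rating columns (0 where unrated)."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return (a @ b) / denom if denom else 0.0

# Toy user-item rating matrix (rows: users, columns: items); 0 = unrated.
R = np.array([[5, 4, 0],
              [4, 5, 1],
              [1, 2, 5]], dtype=float)

user, item = 0, 2                                   # predict user 0's rating of item 2
sims, weighted = [], []
for j in range(R.shape[1]):
    if j != item and R[user, j] > 0:                # only items the user already rated
        s = cosine(R[:, j], R[:, item])
        sims.append(s)
        weighted.append(s * R[user, j])

# Similarity-weighted average of the user's known ratings on neighboring items.
prediction = sum(weighted) / sum(sims) if sims else 0.0
print(prediction)
```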
3.2、Collaborative Filtering
Analyzes relationships between users and interdependencies among products to identify new user-item associations. Disadvantage: the cold start problem. Two primary areas: neighborhood methods and latent factor models.
Latent factor models characterize both items and users by vectors of factors inferred from item rating patterns.
Recommender systems rely on different types of input data. A strength of matrix factorization is that it allows the incorporation of additional information, including implicit feedback.
8、Adding Input Sources
Problem: cold start
Solution:incorporate additional sources of information about the users.
The parameters $b_u$ and $b_i$ indicate the observed deviations of user $u$ and item $i$, respectively, from the average.
Including bias parameters in the prediction:

$\hat{r}_{ui} = \mu + b_i + b_u + q_i^T p_u$    (4)

Optimize the regularized squared error:

$\min_{p_*,\, q_*,\, b_*} \sum_{(u,i) \in \kappa} \left( r_{ui} - \mu - b_u - b_i - q_i^T p_u \right)^2 + \lambda \left( \| p_u \|^2 + \| q_i \|^2 + b_u^2 + b_i^2 \right)$    (5)
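A small numeric sketch of equations (4) and (5), with made-up values for $\mu$, the biases, the factor vectors, and $\lambda$:

```python
import numpy as np

mu = 3.7                                   # overall average rating (illustrative)
b_u, b_i = -0.3, 0.5                       # user and item bias deviations (illustrative)
p_u = np.array([0.7, -0.1, 0.3])           # user factor vector
q_i = np.array([0.9, 0.2, -0.4])           # item factor vector
lam = 0.02                                 # regularization constant lambda

# Equation (4): prediction with biases.
r_hat = mu + b_i + b_u + q_i @ p_u         # 3.7 + 0.5 - 0.3 + 0.49 = 4.39

# One term of the regularized squared error of equation (5), for an observed rating r_ui.
r_ui = 5.0
loss_term = (r_ui - r_hat) ** 2 + lam * (p_u @ p_u + q_i @ q_i + b_u ** 2 + b_i ** 2)
print(r_hat, loss_term)
```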
Imputation increases the amount of data and can be expensive. Modeling directly only the observed ratings instead requires an approach that can simply ignore missing values.
5.1、Singular Value Decomposition
Approaches: Singular Value Decomposition (SVD)
Applying SVD requires factoring the user-item rating matrix. However, conventional SVD is undefined when the matrix is incomplete. One option is imputation to fill in the missing values.
The alternative: learn a regularized model on the observed ratings only:

$\min_{q_*,\, p_*} \sum_{(u,i) \in \kappa} \left( r_{ui} - q_i^T p_u \right)^2 + \lambda \left( \| q_i \|^2 + \| p_u \|^2 \right)$    (2)
Here, $\kappa$ is the set of the $(u,i)$ pairs for which $r_{ui}$ is known (the training set); the constant $\lambda$ controls the extent of regularization.
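A minimal sketch of the objective in equation (2), assuming ratings are stored in a dense matrix with a boolean mask marking the training set $\kappa$ (the function and variable names are illustrative):

```python
import numpy as np

def regularized_error(R, M, P, Q, lam):
    """Equation (2): squared prediction error summed over the observed (u, i) pairs,
    plus lambda times the squared norms of the corresponding factor vectors."""
    total = 0.0
    users, items = np.nonzero(M)                     # the training set kappa
    for u, i in zip(users, items):
        err = R[u, i] - Q[i] @ P[u]
        total += err ** 2 + lam * (Q[i] @ Q[i] + P[u] @ P[u])
    return total

# Toy usage with random factor matrices (illustrative only).
rng = np.random.default_rng(0)
R = np.array([[5, 3, 0],
              [4, 0, 1]], dtype=float)               # 0 = unknown rating
M = R > 0
P = rng.normal(size=(2, 2))                          # user factor vectors p_u
Q = rng.normal(size=(3, 2))                          # item factor vectors q_i
print(regularized_error(R, M, P, Q, lam=0.05))
```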
Assumption: ratings can be inferred from a model put together from a smaller number of parameters.
4、Matrix Factorization Methods
The associated prediction error:

$e_{ui} \stackrel{\mathrm{def}}{=} r_{ui} - q_i^T p_u$

The system then modifies the parameters by a magnitude proportional to $\gamma$ in the opposite direction of the gradient:

$q_i \leftarrow q_i + \gamma \left( e_{ui} \cdot p_u - \lambda \cdot q_i \right)$
$p_u \leftarrow p_u + \gamma \left( e_{ui} \cdot q_i - \lambda \cdot p_u \right)$
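A compact sketch of one SGD pass implementing these updates (the rating triples, factor sizes, and hyperparameters are illustrative assumptions):

```python
import numpy as np

def sgd_epoch(ratings, P, Q, gamma=0.01, lam=0.05):
    """One stochastic-gradient-descent pass over the observed ratings.
    ratings: list of (u, i, r_ui) triples; P, Q: user and item factor matrices."""
    for u, i, r_ui in ratings:
        e_ui = r_ui - Q[i] @ P[u]                    # prediction error for this case
        q_old = Q[i].copy()                          # keep the old q_i for the p_u update
        Q[i] += gamma * (e_ui * P[u] - lam * Q[i])   # q_i <- q_i + gamma(e_ui * p_u - lambda * q_i)
        P[u] += gamma * (e_ui * q_old - lam * P[u])  # p_u <- p_u + gamma(e_ui * q_i - lambda * p_u)
    return P, Q

# Toy usage: 2 users, 3 items, 3 latent factors.
rng = np.random.default_rng(0)
ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 2, 1.0)]
P = rng.normal(scale=0.1, size=(2, 3))
Q = rng.normal(scale=0.1, size=(3, 3))
for _ in range(100):
    P, Q = sgd_epoch(ratings, P, Q)
print(P @ Q.T)                                       # predicted rating matrix
```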