Partial derivative in gradient descent for social recommendations

In the paper "Recommendations in Signed Social Networks", Jiliang Tang et al. propose the following model for capturing local and global information from signed social networks:



$$\min \sum_{i=1}^{n} \sum_{j=1}^{m} g(W_{ij}, w_i)\, \| R_{ij} - U_i^T V_j \|_2^2 + \alpha \left( \|U\|_F^2 + \|V\|_F^2 \right) \\ + \beta \sum_{i=1}^{n} \max \left( 0,\; \| U_i - \bar{U}_i^p \|_2^2 - \| U_i - \bar{U}_i^n \|_2^2 \right) \qquad (1)$$



where:



$$\bar{U}_i^p = \frac{\sum_{u_j \in P_i} S_{ij} U_j}{\sum_{u_j \in P_i} S_{ij}}$$



$$\bar{U}_i^n = \frac{\sum_{u_j \in N_i} S_{ij} U_j}{\sum_{u_j \in N_i} S_{ij}}$$
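To make Eq. (1) concrete, here is a minimal numpy sketch that evaluates the objective on toy data. All dimensions are made up, the weights $g(W_{ij}, w_i)$ are collapsed to 1, and the positive/negative averages $\bar{U}_i^p$, $\bar{U}_i^n$ are random placeholders rather than the $S$-weighted neighbor averages from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, d = 4, 5, 3            # users, items, latent dimension (toy sizes)

R = rng.random((n, m))        # rating matrix
W = np.ones((n, m))           # stands in for g(W_ij, w_i)
U = rng.random((d, n))        # user factors; U_i is column i
V = rng.random((d, m))        # item factors; V_j is column j
alpha, beta = 0.1, 0.05

# Placeholder positive/negative neighbor averages (random, not the
# S-weighted means over P_i and N_i defined above).
U_bar_p = rng.random((d, n))
U_bar_n = rng.random((d, n))

def objective(U, V):
    fit = np.sum(W * (R - U.T @ V) ** 2)                 # weighted squared error
    reg = alpha * (np.sum(U ** 2) + np.sum(V ** 2))      # Frobenius regularizer
    gap = (np.sum((U - U_bar_p) ** 2, axis=0)
           - np.sum((U - U_bar_n) ** 2, axis=0))         # hinge argument per user
    social = beta * np.sum(np.maximum(0.0, gap))         # max(0, .) term of Eq. (1)
    return fit + reg + social

J = objective(U, V)
```

Since each of the three terms is non-negative, the objective is always $\ge 0$; this kind of sketch is only a sanity check of what the formula sums over, not the paper's implementation.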



Because of the max function, Eq. (1) has no nice closed-form solution, so the author introduces an indicator $M_i^k$ at the $k$-th iteration for $U_i$, defined as follows:



$$M_i^k = \begin{cases} 1 & \text{if } \| U_i - \bar{U}_i^p \|_2^2 - \| U_i - \bar{U}_i^n \|_2^2 > 0 \\ 0 & \text{otherwise} \end{cases}$$
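The indicator just freezes, per user, whether the hinge in Eq. (1) is active at the current iterate. A small numpy sketch (with random placeholder averages, not the paper's $S$-weighted ones):

```python
import numpy as np

rng = np.random.default_rng(2)
d, n = 3, 5
U = rng.random((d, n))         # current user factors, one column per user
U_bar_p = rng.random((d, n))   # toy stand-ins for the positive-neighbor averages
U_bar_n = rng.random((d, n))   # toy stand-ins for the negative-neighbor averages

# Per-user hinge argument: ||U_i - Ubar_i^p||^2 - ||U_i - Ubar_i^n||^2
gap = (np.sum((U - U_bar_p) ** 2, axis=0)
       - np.sum((U - U_bar_n) ** 2, axis=0))
M = (gap > 0).astype(float)    # M_i^k = 1 iff the hinge is active for user i
```

With $M$ fixed for the iteration, the max disappears from the objective and ordinary calculus applies, which is exactly why Eq. (2) below is differentiable.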



He then uses $J$ to denote the objective function of Eq. (1) at the $k$-th iteration:



$$J = \sum_{i=1}^{n} \sum_{j=1}^{m} g(W_{ij}, w_i)\, \| R_{ij} - U_i^T V_j \|_2^2 + \alpha \left( \|U\|_F^2 + \|V\|_F^2 \right) \\ + \beta \sum_{i=1}^{n} M_i^k \left( \left\| U_i - \frac{\sum_{u_j \in P_i} S_{ij} U_j}{\sum_{u_j \in P_i} S_{ij}} \right\|_2^2 - \left\| U_i - \frac{\sum_{u_j \in N_i} S_{ij} U_j}{\sum_{u_j \in N_i} S_{ij}} \right\|_2^2 \right) \qquad (2)$$



He computes the derivatives of $J$ with respect to $U_i$ and $V_j$ as follows:



$$\frac{\partial J}{\partial U_i} = -2 \sum_{j} g(W_{ij}, w_i)\, (R_{ij} - U_i^T V_j)\, V_j + 2 \alpha U_i \\ + 2 \beta M_i^k \left( U_i - \bar{U}_i^p \right) - 2 \beta M_i^k \left( U_i - \bar{U}_i^n \right) \\ - 2 \beta \sum_{u_j \in P_i} M_j^k \left( U_j - \bar{U}_j^p \right) \frac{1}{\sum_{u_j \in P_i} S_{ji}} \\ + 2 \beta \sum_{u_j \in N_i} M_j^k \left( U_j - \bar{U}_j^n \right) \frac{1}{\sum_{u_j \in N_i} S_{ji}}$$



$$\frac{\partial J}{\partial V_j} = -2 \sum_{i} g(W_{ij}, w_i)\, (R_{ij} - U_i^T V_j)\, U_i + 2 \alpha V_j$$
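One way to convince yourself a derivative like this is right is a finite-difference check. The sketch below verifies the first two terms of $\partial J / \partial U_i$ (the fit term and the $2\alpha U_i$ regularizer) on toy data, leaving out the social terms; `G` is a made-up stand-in for $g(W_{ij}, w_i)$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, d = 3, 4, 2
R = rng.random((n, m))
G = rng.random((n, m))        # placeholder for the weights g(W_ij, w_i)
U = rng.random((d, n))        # U_i is column i
V = rng.random((d, m))        # V_j is column j
alpha = 0.1

def J(U):
    # Fit term + Frobenius regularizer on U (social terms omitted here)
    return np.sum(G * (R - U.T @ V) ** 2) + alpha * np.sum(U ** 2)

def grad_Ui(U, i):
    # Analytic: -2 * sum_j g_ij (R_ij - U_i^T V_j) V_j + 2 alpha U_i
    resid = R[i] - U[:, i] @ V                      # residual over items, shape (m,)
    return -2.0 * (V * (G[i] * resid)).sum(axis=1) + 2.0 * alpha * U[:, i]

# Central finite differences on user i = 0
i, eps = 0, 1e-6
num = np.zeros(d)
for k in range(d):
    Up, Um = U.copy(), U.copy()
    Up[k, i] += eps
    Um[k, i] -= eps
    num[k] = (J(Up) - J(Um)) / (2 * eps)

assert np.allclose(num, grad_Ui(U, i), atol=1e-5)
```

The same check extends to the social terms: with $M^k$ frozen, each hinge summand is a plain quadratic, and the cross terms in $\partial J / \partial U_i$ come from $U_i$ appearing inside other users' averages $\bar{U}_j^p$ and $\bar{U}_j^n$.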



I do not understand how the partial derivative with respect to $U_i$ was obtained. Could someone show how it is taken step by step, or link to a resource I could use to learn more? I apologize if I haven't used the correct terminology in my question.







  • Welcome to MSE. It is in your best interest that you type your questions (using MathJax) instead of posting links to pictures.
    – José Carlos Santos
    2 days ago










  • ok, thanks for advice
    – diab hr
    2 days ago






















edited 2 days ago
asked 2 days ago by diab hr

