Jacobian of linear map, with variable matrix
Let $\beta \in \mathbb{R}^k$, $x \in \mathbb{R}^q$, and $A: \mathbb{R}^k \to \mathbb{R}^{k \times q}$ be a matrix-valued function of $\beta$. Define
$$
h(\beta) = A(\beta)\, x,
$$
so that $h:\mathbb{R}^k \to \mathbb{R}^k$. I would like the Jacobian of $h$ with respect to $\beta$, but it seems like that would require some notion of a tensor derivative that I am unfamiliar with. I am having a hard time figuring out where to even begin looking, since any search result involving "linear map" assumes $A$ is constant.
If it helps, $A$ has some structure. Specifically, $A(\beta) = \mathbb{E}[g(\beta, w)z']\, B$, where $z \in \mathbb{R}^q$ and $w$ are random variables and $B$ is a constant $q \times q$ matrix. The function $g : \mathbb{R}^k \times \mathcal{W} \to \mathbb{R}^k$, where $\mathcal{W}$ is the space in which the random variable $w$ lies. It is reasonable to assume that the derivative can "pass through" the expectation.
If it helps further, we can assume that $g$ is affine/linear in $\beta$.
calculus
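One way to see that no tensor machinery is needed, sketched under the question's own assumption that the derivative may pass through the expectation, and writing $J_h$ and $J_g$ for the Jacobians with respect to $\beta$ (notation introduced here, not in the question): since $z'Bx$ is a scalar, the structured form collapses to an expectation of $g$ scaled by that scalar,
$$
h(\beta) = \mathbb{E}[g(\beta, w)\, z']\, B\, x = \mathbb{E}\big[(z'Bx)\, g(\beta, w)\big],
\qquad
J_h(\beta) = \mathbb{E}\big[(z'Bx)\, J_g(\beta, w)\big] \in \mathbb{R}^{k \times k},
$$
where $J_g(\beta, w)$ is the $k \times k$ Jacobian of $g$ in its first argument. If $g$ is affine in $\beta$, then $J_g$ does not depend on $\beta$, so $J_h$ is a constant matrix.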
edited Jul 25 at 15:50
asked Jul 25 at 15:40
Matt (1566)
It might help to write $A(\beta)$ as a matrix with rows of the form $a_i(\beta)$ and then perform the derivation step by step, starting from $h_i(\beta) = a_i(\beta)\, x \in \mathbb{R}$.
– WalterJ
Jul 25 at 16:09
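A sketch of the component-wise computation suggested in the comment above, with $a_i(\beta)$ the $i$-th row of $A(\beta)$ and $A_{il}$ its entries (notation introduced here for illustration):
$$
h_i(\beta) = a_i(\beta)\, x = \sum_{l=1}^{q} A_{il}(\beta)\, x_l
\qquad\Longrightarrow\qquad
\frac{\partial h_i}{\partial \beta_j}(\beta) = \sum_{l=1}^{q} \frac{\partial A_{il}}{\partial \beta_j}(\beta)\, x_l .
$$
Equivalently, the $j$-th column of the $k \times k$ Jacobian of $h$ is $\frac{\partial A}{\partial \beta_j}(\beta)\, x$; because $x$ is fixed, the three-index array of partials $\partial A_{il}/\partial \beta_j$ is only ever contracted against $x$ and never has to be handled as a tensor.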