Posts

Showing posts from August 11, 2018

Checking $\sum_{\beta \in \Phi^+} \langle \beta, \alpha^\vee \rangle = 2$ for a simple root $\alpha$

Let $\mathfrak{g}$ be a semisimple complex Lie algebra with root system $\Phi$, and let $\alpha \in \Phi$ be a simple root. I want to know why
$$ \sum_{\beta \in \Phi^+} \langle \beta, \alpha^\vee \rangle = 2. $$
I checked it by hand for $\mathfrak{sl}_n$ with $n = 2, 3, 4$. I don't know how to generalize this, and it is stated without proof in several places for an arbitrary semisimple Lie algebra, so I'm sure I'm missing something. Any hints are appreciated. lie-algebras root-systems — asked Jul 26 at 11:29 by student
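For context, the identity follows from the standard facts that the sum of the positive roots equals $2\rho$ and that $\langle \rho, \alpha^\vee \rangle = 1$ for every simple root $\alpha$. The hand checks for $\mathfrak{sl}_n$ can also be automated; below is a minimal Python sketch (the helper name `pairing_sum` is my own) that realizes the positive roots of $A_{n-1}$, i.e. of $\mathfrak{sl}_n$, as $e_i - e_j$ with $i < j$, and evaluates the sum for the $k$-th simple root $\alpha_k = e_k - e_{k+1}$:

```python
import itertools

def pairing_sum(n, k):
    """Sum of <beta, alpha_k^vee> over the positive roots of A_{n-1} (sl_n)."""
    def root(i, j):
        # e_i - e_j as a coordinate vector in R^n
        v = [0] * n
        v[i], v[j] = 1, -1
        return v

    pos_roots = [root(i, j) for i, j in itertools.combinations(range(n), 2)]
    alpha = root(k, k + 1)  # simple root alpha_k = e_k - e_{k+1}
    dot = lambda u, w: sum(a * b for a, b in zip(u, w))
    # <beta, alpha^vee> = 2 (beta, alpha) / (alpha, alpha); each term is an integer
    return sum(2 * dot(beta, alpha) // dot(alpha, alpha) for beta in pos_roots)

print([pairing_sum(n, 0) for n in (2, 3, 4, 5)])  # → [2, 2, 2, 2]
```

The check passes for every $n$ and every choice of $k$, consistent with the claimed identity.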

Different eigenvalues of the same linear transformation according to different bases

I have a question about linear transformations and eigenvalues. Given a linear transformation $T : \mathbb{R}^3 \to \mathbb{R}^3$, let $E$ be the standard basis of $\mathbb{R}^3$ and $B$ another basis of $\mathbb{R}^3$, and write $A = [T]_B^B$. Suppose that after the Gaussian elimination process on $A$ we get a matrix $M$ with one row of $0$'s, and we now compute the eigenvalues of $M$. Are the eigenvalues of $M$ also the eigenvalues of the transformation $T$? I think yes, because eigenvalues don't change when you change basis, but the correct answer is no; can someone explain why? By the way, is it correct to say that we must always work with the standard basis of $\mathbb{R}^3$ to find the eigenvalues of $T$ (i.e., the eigenvalues of $T$ are the roots of the characteristic polynomial $P_A = \det(A - \lambda I)$, where $A = [T]_E^E$ and $E$ is the standard basis)? Thanks for the help! linear-algebra eigenv...
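A quick numerical illustration of why the answer is no (the matrix below is an arbitrary example of mine, not from the question): row operations are not similarity transformations, so Gaussian elimination changes the spectrum in general. It does preserve rank, which is why both matrices below still share the eigenvalue $0$:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])    # rank 2, so 0 is an eigenvalue of A

# Gaussian elimination on A: R2 -= 4*R1, R3 -= 7*R1, R3 -= 2*R2
M = np.array([[1.,  2.,  3.],
              [0., -3., -6.],
              [0.,  0.,  0.]])  # one zero row, same rank as A

eig_A = np.sort(np.linalg.eigvals(A).real)
eig_M = np.sort(np.linalg.eigvals(M).real)  # triangular, so read off the diagonal

print(eig_M)  # → [-3.  0.  1.]
print(eig_A)  # roughly [-1.117, 0., 16.117] — different from eig_M except for 0
```

In contrast, a change of basis replaces $A$ by $P^{-1} A P$, which does preserve the characteristic polynomial; that is why eigenvalues can be computed in any basis, just not after row reduction.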

Integration by parts with empirical measure

I'm currently reading through the paper "Asymptotic normality of nearest neighbor regression function estimates" and am struggling to understand the asymptotic equalities shown in the proof of Lemma 4 (p. 923), where the last one was proven by using integration by parts:
\begin{align*}
&\quad \frac{m(x_0)}{\sqrt{a_n^3}} \int \left[\alpha_n(x_0) - \alpha_n(x)\right] K'\!\left(\frac{F(x_0)-F(x)}{a_n}\right) F(dx)\\
&= -\frac{m(x_0)}{\sqrt{a_n^3}} \int \alpha_n(x)\, K'\!\left(\frac{F(x_0)-F(x)}{a_n}\right) F(dx)\\
&= -\frac{m(x_0)}{\sqrt{a_n}} \int K\!\left(\frac{F(x_0)-F(x)}{a_n}\right) \alpha_n(dx).
\end{align*}
Here $m(x) = E[Y \mid X = x]$; $\alpha_n(x) = \sqrt{n}\,[F_n(x) - F(x)]$ denotes the empirical process of the random variable $X$ (which has continuous distribution function $F$); $K$ is a twice continuously differentiable probability kernel with bounded support; and $n a_n^3 \to \infty$, $a_n \to 0$ as $n \to \infty$. I know that I could expand the second to last term l...
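For the last equality, here is my own reconstruction of the integration-by-parts step, under the assumption that the boundary terms vanish because $K$ has bounded support. The key observation is that $d\,K\!\left(\frac{F(x_0)-F(x)}{a_n}\right) = -\frac{1}{a_n}\, K'\!\left(\frac{F(x_0)-F(x)}{a_n}\right) F(dx)$, so

```latex
% Reconstruction sketch; the boundary term [alpha_n K] vanishes
% since K has bounded support.
\begin{align*}
-\frac{m(x_0)}{\sqrt{a_n^3}} \int \alpha_n(x)\, K'\!\left(\frac{F(x_0)-F(x)}{a_n}\right) F(dx)
&= \frac{m(x_0)\, a_n}{\sqrt{a_n^3}} \int \alpha_n(x)\, d\,K\!\left(\frac{F(x_0)-F(x)}{a_n}\right)\\
&= -\frac{m(x_0)}{\sqrt{a_n}} \int K\!\left(\frac{F(x_0)-F(x)}{a_n}\right) \alpha_n(dx),
\end{align*}
```

where the last line integrates by parts, moving the differential from $K$ onto $\alpha_n$ and picking up a sign.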