{"id":51,"date":"2011-08-01T11:51:08","date_gmt":"2011-08-01T11:51:08","guid":{"rendered":"http:\/\/florian-roemer.de\/blog\/?p=51"},"modified":"2011-08-02T08:39:57","modified_gmt":"2011-08-02T08:39:57","slug":"more-algebra-fun","status":"publish","type":"post","link":"https:\/\/florian-roemer.de\/blog\/more-algebra-fun\/","title":{"rendered":"More algebra fun"},"content":{"rendered":"<p>I <a href=\"http:\/\/florian-roemer.de\/blog\/?p=41\" title=\"Fun with algebra\">recently posted<\/a> on algebraic rules for rewriting linear and quadratic forms and showed how Kronecker products come into play when expanding products of matrices.<\/p>\n<p>Likewise, we might find ourselves in a situation where the parameters of interest are inside a Kronecker product and we want to &#8220;pull them out&#8221;. As the operation is still linear, this should be an easy step. Strangely, at the time I needed it I could not find the algebraic rules for it anywhere. I found one for a special case by trial and error, and my colleague Martin Weis came up with the much more general case. Here it is for your convenience:<\/p>\n<p>Let $\\ma{A} \\in \\compl^{M \\times N}$ and $\\ma{B} \\in \\compl^{P \\times Q}$. Then, $\\ma{A} \\otimes \\ma{B}$ is a matrix that depends linearly on the elements of $\\ma{A}$ as well as on the elements of $\\ma{B}$. Consequently, ${\\rm vec}(\\ma{A} \\otimes \\ma{B})$ must be linear in ${\\rm vec}(\\ma{A})$ and ${\\rm vec}(\\ma{B})$. 
Indeed it is, and the coefficient matrices are found as follows:<\/p>\n<p>$$ {\rm vec}(\ma{A} \otimes \ma{B}) =<br \/>\n\left( \ma{I}_N \otimes \left[ \begin{array}{c}<br \/>\n\ma{I}_M \otimes \ma{b}_1 \\<br \/>\n\ma{I}_M \otimes \ma{b}_2 \\<br \/>\n\vdots \\<br \/>\n\ma{I}_M \otimes \ma{b}_Q<br \/>\n\end{array} \right] \right) \cdot {\rm vec}(\ma{A})<br \/>\n=<br \/>\n\left[ \begin{array}{c}<br \/>\n\ma{I}_Q \otimes \ma{a}_1 \otimes \ma{I}_P \\<br \/>\n\ma{I}_Q \otimes \ma{a}_2 \otimes \ma{I}_P \\<br \/>\n\vdots \\<br \/>\n\ma{I}_Q \otimes \ma{a}_N \otimes \ma{I}_P<br \/>\n\end{array} \right] \cdot {\rm vec}(\ma{B}).<br \/>\n$$<\/p>\n<p>Here, $\ma{a}_n$ and $\ma{b}_q$ denote the $n$-th column of $\ma{A}$ and the $q$-th column of $\ma{B}$, respectively. Special cases where any of $M, N, P, Q$ is equal to one (i.e., one or both matrices are row or column vectors) are easily found from this, e.g.,<br \/>\n${\rm vec}(\ma{a} \otimes \ma{B}) = \left[\ma{I}_Q \otimes \ma{a} \otimes \ma{I}_P\right] \cdot {\rm vec}(\ma{B})$ or ${\rm vec}(\ma{A} \otimes \ma{b}) = \left[\ma{I}_{MN} \otimes \ma{b} \right] \cdot {\rm vec}(\ma{A})$.<\/p>\n<p>A related important observation is that the matrix $\ma{A} \otimes \ma{B}$ and the matrix ${\rm vec}(\ma{B}) \cdot {\rm vec}(\ma{A})^{\rm T}$ contain exactly the same elements. Therefore, you can permute the entries of a given Kronecker product so that it becomes a rank-one matrix, where each column is a scaled version of ${\rm vec}(\ma{B})$ and each row is a scaled version of ${\rm vec}(\ma{A})^{\rm T}$. This is particularly relevant when we have an estimate of a matrix that should be a Kronecker product and want to find a &#8220;Least-Squares Kronecker factorization&#8221;. 
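As an aside, both coefficient-matrix forms of the identity above are easy to check numerically. Here is a minimal NumPy sketch (my own code, not from the post; note that ${\rm vec}$ is the column-stacking operator, so column-major order is needed):

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, P, Q = 3, 4, 2, 5
A = rng.standard_normal((M, N))
B = rng.standard_normal((P, Q))

def vec(X):
    # column-stacking vectorization, as used in the post (column-major order)
    return X.reshape(-1, order="F")

lhs = vec(np.kron(A, B))

# First form: coefficient matrix acting on vec(A).
# G stacks I_M (x) b_q over the columns b_q of B, giving an (MPQ x M) block.
G = np.vstack([np.kron(np.eye(M), B[:, [q]]) for q in range(Q)])
assert np.allclose(lhs, np.kron(np.eye(N), G) @ vec(A))

# Second form: coefficient matrix acting on vec(B).
# H stacks I_Q (x) a_n (x) I_P over the columns a_n of A, giving (NQMP x QP).
H = np.vstack([np.kron(np.kron(np.eye(Q), A[:, [n]]), np.eye(P)) for n in range(N)])
assert np.allclose(lhs, H @ vec(B))

print("both identities hold")
```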
By performing the above-mentioned rearrangement, this problem is transformed into finding a Least-Squares rank-one approximation, which is readily solved by a truncated Singular Value Decomposition (SVD) via the <a href=\"http:\/\/en.wikipedia.org\/wiki\/Singular_value_decomposition#Low-rank_matrix_approximation\" title=\"Eckart-Young theorem\">Eckart-Young theorem<\/a>.<\/p>\n<p>Starting from this, I wondered what happens if we consider a rank higher than one. This allows us to decompose a given matrix into a sum of Kronecker products and to find exact as well as approximate low-rank decompositions with a Least-Squares optimality guarantee. Unfortunately, I found out later that others had this idea before: a decomposition like that is known as the Kronecker Product SVD. <a href=\"http:\/\/www.cs.cornell.edu\/cv\/ResearchPDF\/KSVD.pdf\" title=\"Kronecker Product SVD\">Here is a slide set by C. Van Loan on it<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>I recently posted on algebraic rules for rewriting linear and quadratic forms and showed how Kronecker products come into play when expanding products of matrices. Likewise, we might come into a situation where parameters of interest are inside a Kronecker product which we want to &#8220;pull out&#8221;. 
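The rearrangement and the SVD-based Least-Squares Kronecker factorization described in the post can be sketched in NumPy as follows (a sketch under my own naming; the `rearrange` helper and the variable names are assumptions, not taken from Van Loan's slides):

```python
import numpy as np

rng = np.random.default_rng(1)
M, N, P, Q = 3, 4, 2, 5
A = rng.standard_normal((M, N))
B = rng.standard_normal((P, Q))

def vec(X):
    # column-stacking vectorization (column-major order)
    return X.reshape(-1, order="F")

def rearrange(X, M, N, P, Q):
    """Permute the entries of the (MP x NQ) matrix X into a (PQ x MN) matrix
    whose column j*M + i is the vectorized (i, j) block of X. Applied to
    A (x) B, that block is a_ij * B, so the result is vec(B) * vec(A)^T."""
    cols = [vec(X[i*P:(i+1)*P, j*Q:(j+1)*Q]) for j in range(N) for i in range(M)]
    return np.column_stack(cols)

X = np.kron(A, B)
R = rearrange(X, M, N, P, Q)
assert np.allclose(R, np.outer(vec(B), vec(A)))  # exactly rank one

# Least-Squares Kronecker factorization of a noisy estimate of X:
# rank-one truncated SVD of the rearranged matrix (Eckart-Young).
Xn = X + 0.01 * rng.standard_normal(X.shape)
U, s, Vt = np.linalg.svd(rearrange(Xn, M, N, P, Q))
B_hat = (np.sqrt(s[0]) * U[:, 0]).reshape(P, Q, order="F")
A_hat = (np.sqrt(s[0]) * Vt[0]).reshape(M, N, order="F")
print(np.linalg.norm(np.kron(A_hat, B_hat) - X))  # on the order of the noise level
```

Keeping the first $r$ singular triples instead of only the dominant one yields the rank-$r$ sum-of-Kronecker-products approximation, i.e., the Kronecker Product SVD mentioned above.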
As the operation is still linear this should [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[4,5,6],"_links":{"self":[{"href":"https:\/\/florian-roemer.de\/blog\/wp-json\/wp\/v2\/posts\/51"}],"collection":[{"href":"https:\/\/florian-roemer.de\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/florian-roemer.de\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/florian-roemer.de\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/florian-roemer.de\/blog\/wp-json\/wp\/v2\/comments?post=51"}],"version-history":[{"count":9,"href":"https:\/\/florian-roemer.de\/blog\/wp-json\/wp\/v2\/posts\/51\/revisions"}],"predecessor-version":[{"id":60,"href":"https:\/\/florian-roemer.de\/blog\/wp-json\/wp\/v2\/posts\/51\/revisions\/60"}],"wp:attachment":[{"href":"https:\/\/florian-roemer.de\/blog\/wp-json\/wp\/v2\/media?parent=51"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/florian-roemer.de\/blog\/wp-json\/wp\/v2\/categories?post=51"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/florian-roemer.de\/blog\/wp-json\/wp\/v2\/tags?post=51"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}