From: moba on
Hi,

I wonder whether there is a way of proving that the multiplication of
two given matrix polynomials is commutative. Let's say, for the sake
of argument, we have two polynomials evaluated at a given matrix A:

p1(A) = -A^(-1) + I + 2*A + 3*A^2
p2(A) = A^(-1)

Obviously
p1(A)*p2(A) = p2(A)*p1(A)

Is there a proof of, or a way to check for, commutativity of matrix
polynomial multiplication, especially if we have multivariate
polynomials, e.g. p1(A,B)*p2(A)?

Best Regards
From: Chip Eastham on
On Aug 13, 9:23 am, moba <wish.i.had.an.i...(a)gmail.com> wrote:
> Hi,
>
> I wonder whether there is a way of proving that the
> multiplication of two given matrix polynomials is
> commutative. Let's say, for the sake of argument,
> we have two polynomials evaluated at a given
> matrix A:
>
> p1(A) = -A^(-1) + I + 2*A + 3*A^2
> p2(A) = A^(-1)
>
> Obviously
> p1(A)*p2(A) = p2(A)*p1(A)
>
> Is there a proof of, or a way to check for,
> commutativity of matrix polynomial multiplication,
> especially if we have multivariate polynomials,
> e.g. p1(A,B)*p2(A)?
>
> Best Regards

Since your definition of p1(A) and p2(A) involves
the inverse of A, it is not customary to refer to
these as polynomials in A. However, they are
(special cases of) rational functions in A.

Two polynomials in a matrix A (or two rational
functions in A, where defined) always commute
under matrix multiplication. You can prove the
purely polynomial case by induction on the
degrees of the multiplicands; the key fact is
that powers of A commute with one another, since
A^m * A^n = A^(m+n) = A^n * A^m, and a polynomial
in A is just a linear combination of such powers.
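
If you just want a quick numerical sanity check
to go with that, here is a minimal sketch in
Python with numpy, using the p1 and p2 from the
post (the random test matrix and its size are my
own choice, purely for illustration):

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # generic matrix, almost surely invertible
A_inv = np.linalg.inv(A)
I = np.eye(4)

# p1(A) = -A^(-1) + I + 2*A + 3*A^2 and p2(A) = A^(-1), as in the post
p1 = -A_inv + I + 2*A + 3*(A @ A)
p2 = A_inv

# the two products agree up to floating-point roundoff
print(np.allclose(p1 @ p2, p2 @ p1))  # True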

If you introduce a polynomial in two matrices,
as with p1(A,B), then commutativity no longer
holds in general unless you add a condition
such as that A and B commute. If p1(A,B) = B
and p2(A) = A, then commutativity of those two
is exactly the statement that A and B commute,
so this condition is needed for the most
general conclusion.
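
A concrete instance of the failure, sketched the
same way (these particular 2x2 matrices are my
own illustration):

import numpy as np

# two 2x2 matrices that do not commute
A = np.array([[0., 1.],
              [0., 0.]])
B = np.array([[0., 0.],
              [1., 0.]])

# with p1(A,B) = B and p2(A) = A, the two products differ:
print(A @ B)  # [[1. 0.] [0. 0.]]
print(B @ A)  # [[0. 0.] [0. 1.]]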

hope this helps, chip
From: Chip Eastham on
On Aug 13, 10:29 am, Chip Eastham <hardm...(a)gmail.com> wrote:
> [snip]

Let me add that once you've proven polynomials
in matrix A commute, it's easy to extend this
to rational functions of matrix A. For example,
suppose we want to show

p(A) * (q(A))^-1 = (q(A))^-1 * p(A)

given polynomials p, q such that q(A) is
invertible.

Since we know q(A)*p(A) = p(A)*q(A), multiplying
both sides on the left by (q(A))^-1 gives

p(A) = (q(A))^-1 * p(A) * q(A),

and then multiplying both sides on the right by
the same inverse of q(A) gives the desired
result. Combining this with the well-worn
identity that the inverse of a product is the
product of the inverses in opposite order gives
a recipe to put every rational function of
matrix A into a "single fraction" p(A)*(q(A))^-1.
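
Purely as a numerical illustration of that
identity (the particular p and q below are
arbitrary polynomials I picked; any q with q(A)
invertible would do):

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
I = np.eye(4)

# example polynomials: p(A) = A^2 + 2*A, q(A) = A + 3*I
p = A @ A + 2*A
q = A + 3*I  # invertible unless -3 is an eigenvalue of A
q_inv = np.linalg.inv(q)

# p(A)*(q(A))^-1 equals (q(A))^-1*p(A), up to roundoff
print(np.allclose(p @ q_inv, q_inv @ p))  # True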

regards, chip