(05/06/2015, 02:42 PM)marraco Wrote:
(05/05/2015, 07:40 AM)Gottfried Wrote: P*A = A*Bb
I think that we are speaking of different things.
Obviously, there should be a way to demonstrate the equivalence of the two, since both attack the same problem and look for the same solution.
But as I understand it, the Carleman matrix A only contains powers of the a_i coefficients, yet if you look at the red side, it cannot be written as a matrix product A*Bb, because it needs products of different a_i coefficients (like \( a_1^3 \cdot a_3^2 \cdot a_5^8 \cdots \)). Maybe it is a power of A*Bb, or something like A^Bb?
No, no ... In your convolution formula, the inner part of the double sum contains powers of the power series (the red-colored formula \( a^{ \;^x a} \) in your first posting) built from the a()-function as a whole (not from its single coefficients), and if I decode this correctly, then it matches perfectly the composition
V(x)*A * Bb = (V(x)*A) * Bb = [1, a(x), a(x)^2, a(x)^3, ...] * Bb = V(a(x))*Bb
The only difference is that, after removing the left V(x)-vector, we do things in a different order:
V(x)*A * Bb = V(x)*(A * Bb )
and I discuss the remaining matrix in the parentheses on the rhs. That V(x) can be removed on the rhs and on the lhs of the matrix equation must be justified; if divergent series occur anywhere, this becomes difficult, but as long as all dot products have nonzero intervals of convergence, this exploitation of associativity can be done / should be possible (as far as I can see). (The goal of all this is, of course, to improve the computability of A, for instance by diagonalization of P or Bb and algebraic manipulation of the occurring matrix factors.)
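The associativity argument above can be checked numerically with truncated Carleman matrices. The sketch below uses the row-vector convention V(x) = [1, x, x^2, ...] with V(x)*A = V(a(x)), as in the post; the example series a(x) = x + x^2/2 and b(x) = 2x + x^3/4 and the helper `carleman` are my own illustrative choices, not anything from the thread. Truncation introduces small errors, so comparisons are approximate.

```python
import numpy as np

def carleman(coeffs, n):
    """Truncated n x n Carleman matrix A of the power series
    a(x) = sum_k coeffs[k] * x^k, under the convention
    V(x) * A = V(a(x)) for the row vector V(x) = [1, x, x^2, ...].
    Column j holds the first n coefficients of a(x)^j."""
    A = np.zeros((n, n))
    a = np.zeros(n)
    a[:min(n, len(coeffs))] = coeffs[:n]
    p = np.zeros(n)
    p[0] = 1.0                      # a(x)^0 = 1
    for j in range(n):
        A[:, j] = p
        p = np.convolve(p, a)[:n]   # Cauchy product: p(x) * a(x), truncated
    return A

def V(x, n):
    """Row vector of powers [1, x, x^2, ..., x^(n-1)]."""
    return x ** np.arange(n)

n = 16
A  = carleman([0.0, 1.0, 0.5], n)        # a(x) = x + x^2/2   (example)
Bb = carleman([0.0, 2.0, 0.0, 0.25], n)  # b(x) = 2x + x^3/4  (example)

x   = 0.1
ax  = x + 0.5 * x**2                     # a(x)
bax = 2 * ax + 0.25 * ax**3              # b(a(x))

# V(x)*A reproduces V(a(x)) up to truncation error:
assert np.allclose(V(x, n) @ A, V(ax, n))

# Associativity: (V(x)*A)*Bb == V(x)*(A*Bb), and both encode b(a(x)):
lhs = (V(x, n) @ A) @ Bb
rhs = V(x, n) @ (A @ Bb)
assert np.allclose(lhs, rhs)
assert np.isclose(lhs[1], bax)
```

So A*Bb is itself (a truncation of) the Carleman matrix of the composition b(a(x)), which is exactly what makes manipulating the matrix factors of A*Bb, e.g. via diagonalization, attractive.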
Anyway - I hope I didn't actually misread you (which is always possible, given the many coefficients... )
Gottfried
Gottfried Helms, Kassel

