Hi Bo, I see you are quite fluent with formal power series, and I'd like to know if you have any immediate insight into the possibilities/problems of a two-variable power series satisfying a "formal" Ackermann/Goodstein equation.

Basically, formal power series over a ring, \(R[[X]]\), are formally the same as sequences \(R^{\mathbb N}\), but with a much richer algebraic structure. Formal power series in two variables are something like infinite matrices: \(R[[X,Y]]\simeq R^{\mathbb N\times \mathbb N}\).

In some cases it may be possible to nest-compose two-variable formal power series. What about a power series \(A\in R[[X,Y]]\) s.t.

\[A(S_0(X),S_1(Y))=A(X,A(S_0(X),Y))\]

where \(S_0(X)=1+X\) and \(S_1(Y)=1+Y\)?

What conditions should be imposed on the coefficient matrix of \(A\) to ensure the existence of that composition "as a formal power series", and what would that equation imply about the coefficient matrix itself?
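As a heuristic illustration of the existence issue (a sketch, assuming Python with sympy; the truncation degree `N` and the coefficient names are mine, not part of the question): after substituting \(X\mapsto 1+X\), the coefficient of any fixed monomial \(X^iY^m\) collects contributions from *all* rows \(n\ge i\) of the coefficient matrix, so some summability condition on each column is needed.

```python
# Hypothetical sketch: why A(S_0(X), Y) need not exist as a formal power
# series. We truncate A and watch the constant term of A(1+X, Y) absorb
# the whole first column of the coefficient matrix.
import sympy as sp

X, Y = sp.symbols('X Y')
N = 6  # illustrative truncation: rows 0..N-1, columns 0..1
a = [[sp.Symbol(f'a_{n}_{m}') for m in range(2)] for n in range(N)]
A = sum(a[n][m] * X**n * Y**m for n in range(N) for m in range(2))

# Truncated stand-in for A(S_0(X), Y):
composed = sp.expand(A.subs(X, 1 + X))
constant_term = composed.coeff(X, 0).coeff(Y, 0)
# constant_term = a_0_0 + a_1_0 + ... + a_{N-1}_0: it keeps growing with N,
# so for the untruncated series the sum of the whole column (a_{n,0})_n
# must make sense in R (e.g. almost all zero, or R carrying a topology).
print(constant_term)
```

This is only a finite shadow of the problem, but it makes visible which infinite sums the composition would require.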

Inspired by the discussion in the Wikipedia article on formal group laws.

Let \(A\in R[[X,Y]] \) and \(A(X,Y)=\sum_{n,m}a_{n,m}X^nY^m\). If I'm not mistaken, the condition \(A(0,Y)=Y+1\) implies that \(a_{0,0}=1\), \(a_{0,1}=1\) and if \(1\lt m\) we have \(a_{0,m}=0\).
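This can be checked mechanically on a truncation (a sketch, assuming Python with sympy; the truncation degree `N` is illustrative): imposing \(A(0,Y)=1+Y\) and solving for the first row of the matrix recovers exactly \(a_{0,0}=1\), \(a_{0,1}=1\), \(a_{0,m}=0\) for \(m>1\).

```python
# Hypothetical sketch: impose A(0, Y) = 1 + Y on a truncated generic
# two-variable series and read off the forced first row of the matrix.
import sympy as sp

X, Y = sp.symbols('X Y')
N = 4  # illustrative truncation degree
a = [[sp.Symbol(f'a_{n}_{m}') for m in range(N)] for n in range(N)]
A = sum(a[n][m] * X**n * Y**m for n in range(N) for m in range(N))

diff = sp.expand(A.subs(X, 0) - (1 + Y))
conditions = [diff.coeff(Y, m) for m in range(N)]
sol = sp.solve(conditions, [a[0][m] for m in range(N)], dict=True)
print(sol)  # first row forced to 1, 1, 0, 0, ...
```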

\[
A=
\begin{bmatrix}
1 & 1 & 0 & 0 & \cdots \\
a_{1,0} & a_{1,1} & a_{1,2} & a_{1,3} & \cdots \\
a_{2,0} & a_{2,1} & a_{2,2} & a_{2,3} & \cdots \\
\vdots & \vdots & \vdots & \vdots & \ddots
\end{bmatrix}
\]

So the initial condition, which I call "trivial zeration" or the Goodstein condition, implies that

\[A(X,Y)=1+Y+\sum_{n\ge 1,\; m\ge 0}a_{n,m}X^nY^m\]
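One can also plug a low-degree truncation of this Goodstein form into the Ackermann equation and compare coefficients (a sketch, assuming Python with sympy; the degree-2 cutoff and the symbols `a10`, `a11`, `a20` are mine). The truncation is only heuristic, since \(X\mapsto 1+X\) mixes all degrees, which is exactly the existence problem the question asks about, but it hints at what the equation forces at low order.

```python
# Hypothetical sketch: take A = 1 + Y + a_{1,0} X + a_{1,1} X Y + a_{2,0} X^2
# (Goodstein form, truncated at total degree 2), substitute it into
# A(S_0(X), S_1(Y)) = A(X, A(S_0(X), Y)), and inspect the lowest monomials.
import sympy as sp

X, Y = sp.symbols('X Y')
a10, a11, a20 = sp.symbols('a10 a11 a20')
A = lambda x, y: 1 + y + a10 * x + a11 * x * y + a20 * x**2

lhs = sp.expand(A(1 + X, 1 + Y))        # A(S_0(X), S_1(Y))
rhs = sp.expand(A(X, A(1 + X, Y)))      # A(X, A(S_0(X), Y))
diff = sp.expand(lhs - rhs)

# Conditions read off the lowest monomials:
print(diff.coeff(X, 0).coeff(Y, 0))  # constant term: a11, so a_{1,1} = 0 here
print(diff.coeff(Y, 0).coeff(X, 1))  # coefficient of X (Y-free part)
```

Again, conclusions drawn this way only constrain the truncation, not the full series.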


MSE MphLee

Mother Law \((\sigma+1)0=\sigma (\sigma+1)\)

S Law \(\bigcirc_f^{\lambda}\square_f^{\lambda^+}(g)=\square_g^{\lambda}\bigcirc_g^{\lambda^+}(f)\)