07/16/2022, 01:41 PM
I am copying a bit from my (rather notepad-style) article https://www.researchgate.net/profile/Dmi...ration.pdf to give some preliminaries on formal power series calculation, going from addition to multiplication to composition:
\begin{align}
\left(\sum_{n=0}^\infty f_{n} x^n\right) + \left(\sum_{n=0}^\infty
g_n x^n\right)&=\sum_{n=0}^\infty (f_{n}+g_{n})x^n & (f+g)_n &=
f_{n} + g_{n}\\
\left(\sum_{n_1=0}^\infty f_{n_1} x^{n_1}\right)
\left(\sum_{n_2=0}^\infty g_{n_2} x^{n_2}\right)&=
\sum_{n_1,n_2=0}^\infty f_{n_1} g_{n_2} x^{n_1+n_2} &
(fg)_n &= \sum_{n_1+n_2=n} f_{n_1} g_{n_2}
\end{align}
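These coefficient rules translate directly into code. Here is a minimal Python sketch (the representation and the function names are my own choice, not from the article): a formal power series is stored as the list of its first \(N\) coefficients \([f_0, \dots, f_{N-1}]\).

```python
def ps_add(f, g):
    """Coefficient-wise sum: (f+g)_n = f_n + g_n."""
    return [a + b for a, b in zip(f, g)]

def ps_mul(f, g):
    """Cauchy product: (fg)_n = sum over n1+n2=n of f_{n1} g_{n2}."""
    n = min(len(f), len(g))
    return [sum(f[k] * g[m - k] for k in range(m + 1)) for m in range(n)]
```

For example, \((1+x+x^2)^2 = 1 + 2x + 3x^2 + \dotsb\), so `ps_mul([1, 1, 1], [1, 1, 1])` returns `[1, 2, 3]`.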
When taking powers, note that \({f^m}_{n}\) means the \(n\)-th coefficient of the \(m\)-th power of \(f\), while \({f_{n}}^m\) means the \(m\)-th power of the \(n\)-th coefficient of \(f\). Deriving from
the multiplication identity we get
\begin{align}
{f^m}_{n} = \sum_{n_1+\dots+n_m=n} f_{n_1}\dotsm f_{n_m} =
\sum_{\substack{m_1+2m_2+\dots+nm_n = n\\m_0+\dots+m_n=m}}
\frac{m!}{m_0!\dotsm m_n!}{f_{0}}^{m_0}\dotsm {f_{n}}^{m_n}
\end{align}
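Instead of the multinomial sum, the power \({f^m}_n\) can also be computed by repeated Cauchy products; a small sketch in the same truncated-list representation (names my own):

```python
def ps_mul(f, g):
    """Cauchy product: (fg)_n = sum over n1+n2=n of f_{n1} g_{n2}."""
    n = min(len(f), len(g))
    return [sum(f[k] * g[m - k] for k in range(m + 1)) for m in range(n)]

def ps_pow(f, m):
    """First len(f) coefficients of f^m, by m repeated Cauchy products."""
    result = [1] + [0] * (len(f) - 1)  # the constant series 1 = f^0
    for _ in range(m):
        result = ps_mul(result, f)
    return result
```

For \(f = x + x^2\), `ps_pow([0, 1, 1], 2)` returns `[0, 0, 1]`, matching \((x+x^2)^2 = x^2 + 2x^3 + x^4\) truncated to three coefficients.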
This enables a formula for composition
\begin{align*}
f(g(x))&=\sum_{m=0}^\infty f_{m} g(x)^m &
(f\circ g)_{n} &= \sum_{m=0}^\infty f_{m} {g^m}_n
\end{align*}
which may however be problematic, as we don't know whether each coefficient converges. A minor constraint on \(g\), however, makes the coefficients finite expressions.
If \(g_{0}=0\) then \({g^m}_{n} = 0\) for \(n<m\). Hence
\begin{align}
(f\circ g)_{n} &= \sum_{m=0}^n f_{m} {g^m}_{n}, \quad g_0=0.
\end{align}
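With \(g_0 = 0\) the composition is a finite sum in each coefficient, so it can be sketched directly in the truncated-list representation from above (names my own):

```python
def ps_mul(f, g):
    """Cauchy product: (fg)_n = sum over n1+n2=n of f_{n1} g_{n2}."""
    n = min(len(f), len(g))
    return [sum(f[k] * g[m - k] for k in range(m + 1)) for m in range(n)]

def ps_compose(f, g):
    """(f o g)_n = sum_{m=0}^{n} f_m {g^m}_n; needs g_0 = 0 so that
    {g^m}_n = 0 for m > n and each coefficient is a finite sum."""
    assert g[0] == 0, "composition of formal power series needs g_0 = 0"
    N = min(len(f), len(g))
    result = [0] * N
    gm = [1] + [0] * (N - 1)  # g^0
    for m in range(N):
        for k in range(N):
            result[k] += f[m] * gm[k]
        gm = ps_mul(gm, g)
    return result
```

For example, composing \(f = x + x^2\) with itself: `ps_compose([0, 1, 1, 0], [0, 1, 1, 0])` returns `[0, 1, 2, 2]`, matching \((x+x^2) + (x+x^2)^2 = x + 2x^2 + 2x^3 + \dotsb\).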
So much for the preliminaries about formal power series. Now to our real problem:
We want to define an iteration \(f^{\mathbb{R} t}\) (the \(\mathbb{R}\) is not a variable, but refers to the regular iteration) that satisfies:
\[f^{\mathbb{R} s}\circ f^{\mathbb{R} t} = f^{\mathbb{R} s+t}\]
For such an iteration the following must also hold:
\[f^{\mathbb{R}t}\circ f = f\circ f^{\mathbb{R} t}\]
This condition already determines the power series coefficients up to the choice of \({f^{\mathbb{R} t}}_1\): Let \(g = f^{\mathbb{R} t}\); then we have the equation system (from the power series composition above and \(g_0 = f_0 = 0\))
\begin{align}
\sum_{m=1}^n f_{m} {g^m}_{n} &= \sum_{m=1}^n g_{m} {f^m}_{n}
\end{align}
and we pull out the parts that contain \(g_n\) and turn this into a recursive formula:
\begin{align}
f_1 g_n + f_n {g^n}_n + \sum_{m=2}^{n-1} f_{m} {g^m}_{n} &= g_1 f_n + g_n {f^n}_n+\sum_{m=2}^{n-1} g_{m} {f^m}_{n}\\
f_1 g_n + f_n {g_1}^n + \sum_{m=2}^{n-1} f_{m} {g^m}_{n} &= g_1 f_n + g_n {f_1}^n+\sum_{m=2}^{n-1} g_{m} {f^m}_{n}\\
g_n &= \frac{1}{{f_1}^n - f_1} \left(
f_n {g_1}^n - g_1 f_n + \sum_{m=2}^{n-1} \left(f_m {g^m}_n - g_m
{f^m}_n\right)\right)
\end{align}
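The recursion can be sketched in Python with exact rational arithmetic (the representation and names are my own choice; it assumes \(f_0 = 0\) and that \(f_1\) is neither \(0\) nor a root of unity, so the divisor \({f_1}^n - f_1\) never vanishes):

```python
from fractions import Fraction

def ps_mul(f, g):
    """Cauchy product of truncated coefficient lists."""
    n = min(len(f), len(g))
    return [sum(f[k] * g[m - k] for k in range(m + 1)) for m in range(n)]

def ps_pow_coeffs(h, m, N):
    """First N coefficients of h^m by repeated Cauchy products."""
    r = [Fraction(1)] + [Fraction(0)] * (N - 1)
    for _ in range(m):
        r = ps_mul(r, h[:N])
    return r

def regular_iteration(f, g1, N):
    """Coefficients g_0..g_N of g = f^{Rt} with g_0 = 0 and g_1 given,
    by the recursion above; assumes f_0 = 0 and f_1 neither 0 nor a
    root of unity, so f_1^n - f_1 != 0 for n >= 2."""
    f = [Fraction(c) for c in f]
    f += [Fraction(0)] * max(0, N + 1 - len(f))
    g = [Fraction(0), Fraction(g1)] + [Fraction(0)] * (N - 1)
    for n in range(2, N + 1):
        s = f[n] * g[1] ** n - g[1] * f[n]
        for m in range(2, n):
            s += f[m] * ps_pow_coeffs(g, m, n + 1)[n]
            s -= g[m] * ps_pow_coeffs(f, m, n + 1)[n]
        g[n] = s / (f[1] ** n - f[1])
    return g
```

Note that \({g^m}_n\) for \(2 \le m \le n-1\) only involves \(g_1, \dots, g_{n-1}\) (since \(g_0 = 0\), each of the \(m \ge 2\) factors has index at least 1), so the recursion is well-defined. As a sanity check, `regular_iteration(f, f[1], N)` reproduces \(f\) itself, the case \(t = 1\).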
This formula indeed computes the same coefficients that Mathematica calculates. It requires \({f_1}^n \neq f_1\) for all \(n \ge 2\), i.e. \(f_1\) neither \(0\) nor a root of unity. (But always note that these are purely formal power series; nothing is said here about their convergence. They do in fact converge, but that is proven elsewhere.)
So up to now we have seen that the formal power series \(g\) is determined by the law \(g\circ f = f\circ g\) and its first coefficient \(g_1\).
So we can say that \(f^{\mathbb{R} s+t}\) is determined by this. But so is \(f^{\mathbb{R} s}\circ f^{\mathbb{R} t}\):
\[
(f^{\mathbb{R}s} \circ f^{\mathbb{R}t}) \circ f = f^{\mathbb{R}s} \circ f \circ f^{\mathbb{R} t} = f \circ (f^{\mathbb{R}s} \circ f^{\mathbb{R}t})
\]
As we set \({f^{\mathbb{R}t}}_1 = f_1^t\) and know that \((f\circ g)_1 = f_1 g_1\), the first coefficients are equal:
\[{f^{\mathbb{R}s+t}}_1= f_1^{s+t} = f_1^s f_1^t = (f^{\mathbb{R}s}\circ f^{\mathbb{R}t})_1\] and hence the other coefficients are equal too, and we have our desired identity
\[f^{\mathbb{R} s+t } = f^{\mathbb{R} s}\circ f^{\mathbb{R}t}\]
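One can check this identity numerically on a truncation. For the concrete example \(f(x) = 2x/(1-x)\) (my own example, not from the article), the regular iterates are known in closed form from the corresponding Möbius transformation: \(f^{\mathbb{R}t}(x) = 2^t x/(1-(2^t-1)x)\), i.e. the \(n\)-th coefficient is \(2^t(2^t-1)^{n-1}\) for \(n \ge 1\).

```python
def ps_mul(f, g):
    """Cauchy product of truncated coefficient lists."""
    n = min(len(f), len(g))
    return [sum(f[k] * g[m - k] for k in range(m + 1)) for m in range(n)]

def ps_compose(f, g):
    """(f o g)_n for truncated lists; needs g_0 = 0."""
    N = min(len(f), len(g))
    result = [0] * N
    gm = [1] + [0] * (N - 1)  # g^0
    for m in range(N):
        for k in range(N):
            result[k] += f[m] * gm[k]
        gm = ps_mul(gm, g)
    return result

def iterate_coeffs(t, N):
    """First N coefficients of f^{Rt} for f(x) = 2x/(1-x)."""
    a = 2 ** t
    return [0] + [a * (a - 1) ** (n - 1) for n in range(1, N)]

# f^{R1} o f^{R2} agrees with f^{R3} on the first 6 coefficients:
lhs = ps_compose(iterate_coeffs(1, 6), iterate_coeffs(2, 6))
rhs = iterate_coeffs(3, 6)
```

Truncation is harmless here: since \(g_0 = 0\), the \(n\)-th coefficient of the composition only depends on the first \(n\) coefficients of each factor.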