(03/31/2023, 07:19 PM)tommy1729 Wrote: (03/29/2023, 07:35 AM)JmsNxn Wrote: I thought I'd add here a relationship between "iterated matrices" and "iterated derivatives". This is a secondary thought to most of my work on fractional calculus; but it's masterful as a bridge between these operations. I'm going to frame this the way Ramanujan, and "circle method" mathematicians, would look at it.
Let \(A\) be a non-singular matrix, so that \(A^{-1}\) exists. To keep things simple, assume \(A : \mathbb{C}^n \to \mathbb{C}^n\). And let's write:
\[
e^{Ax} = \sum_{k=0}^\infty A^k \frac{x^k}{k!}\\
\]
We can safely assume that \(e^{Ax} : \mathbb{C}^n \to \mathbb{C}^n\), where \(x\) induces a semigroup structure: \(e^{A(x+y)} = e^{Ax}e^{Ay}\). From here, we can write:
\[
\frac{d^s}{dx^s} e^{Ax} = A^s e^{Ax}\\
\]
Where this is a linear operator \(\mathbb{C}^n \to \mathbb{C}^n\). We can set \(x=0\); and then I'm just rewriting Ramanujan's master theorem as he wrote it:
\[
\Gamma(s) A^{-s} = \int_0^\infty e^{-Ax}x^{s-1}\,dx\\
\]
And we've fractionally iterated the matrix \(A: \mathbb{C}^n \to \mathbb{C}^n\). I avoided a lot of "singular moments" here. But if the matrix \(A\) is well-behaved enough, this discussion is entirely rigorous. It relates Daniel's work; Sheldon's work; Bo's work; Tommy's work; and all the matrix machinery from quantum physics / Hilbert space theory.
Fractional calculus is just:
\[
\frac{d^s}{dA^s} : \mathbb{C}_{\Re(s) > 0} \times \mathcal{H} \to \mathcal{H}\\
\]
Where
\[
\mathcal{H} = \{ A : \mathbb{C}^n \to \mathbb{C}^n\,|\, A^{-1} :\mathbb{C}^n \to \mathbb{C}^n\}\\
\]
Then:
\[
\frac{d^s}{dA^s} e^{Ax} = x^s e^{Ax}
\]
And we can differentiate with respect to \(A\) or \(x\), and the same rules apply.
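For what it's worth, both rules can be spot-checked with finite differences (the 2×2 matrix and step size are arbitrary choices of mine); for the \(A\)-derivative I differentiate along the scalar direction \(I\), which commutes with \(A\):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.5, 0.2],
              [0.1, 0.8]])
x, h = 1.2, 1e-6
I = np.eye(2)

# d/dx e^{Ax} = A e^{Ax}, checked by a finite difference in x
ddx = (expm(A * (x + h)) - expm(A * x)) / h
assert np.allclose(ddx, A @ expm(A * x), atol=1e-4)

# d/dA e^{Ax} = x e^{Ax}, checked along the direction I:
# e^{(A + hI)x} = e^{hx} e^{Ax}, so the derivative at h = 0 is x e^{Ax}
ddA = (expm((A + h * I) * x) - expm(A * x)) / h
assert np.allclose(ddA, x * expm(A * x), atol=1e-4)

print("both differentiation rules check out")
```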
This is the language I think in; and it's just a translation of much of the standard "tetration forum" language; and current literature.
In theory yeah.
But in practice ...
Differentiating a noninteger number of times with respect to a nontrivial infinite square matrix that might not be diagonalizable?!
That gives a non-unique infinite tensor with divergent norm?!
Or did you mean differentiating with respect to a vector?
And that is just the last line of your answer.
regards
tommy1729
Oh yes! Tommy! I apologize; by \(A\) I meant a non-singular matrix that is also diagonalizable. I can write the math for you if you'd like. But every FINITE diagonalizable matrix can be differentiated as:
\[
A^s e^{Ax} = \frac{d^s}{dx^s} e^{Ax}\\
\]
But we must choose a path for the differintegral so that it converges. For example, let:
\[
A v_j = \lambda_j v_j\\
\]
Where \(v_j\) is an eigenvector and \(\lambda_j\) the corresponding eigenvalue. Find a path \(\gamma\) with \(\gamma(0) = 0\) and \(\gamma(\infty) = \infty\), and assume that:
\[
\int_\gamma \left|e^{-\lambda_j x}\right|\,dx < \infty\,\,\text{for all}\,\, 1 \le j \le n\\
\]
Then the differintegral always converges. Finding \(\gamma\) can be tricky. But if, for the sake of argument, we assume there is some \(0 < \kappa < \pi/2\) with \(-\kappa < \arg(\lambda_j) < \kappa\) for all \(j\), then choosing \(\gamma = [0,\infty)\) works fine. It gets much trickier in general; but the idea still holds.
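To illustrate the sector condition (the sample eigenvalues here are mine): along \(\gamma = [0,\infty)\) we have \(|e^{-\lambda_j x}| = e^{-\Re(\lambda_j)x}\), which is integrable exactly when \(\Re(\lambda_j) > 0\), i.e. when \(\lambda_j\) lies inside the sector:

```python
import numpy as np
from scipy.integrate import quad

# Sample eigenvalues inside the sector |arg(lambda)| < pi/2; for each,
# the absolute integrand along gamma = [0, oo) integrates to 1/Re(lambda).
for lam in [2.0, 1.0 + 1.0j, 0.5 - 0.3j]:
    assert abs(np.angle(lam)) < np.pi / 2        # inside the sector
    val, _ = quad(lambda x: abs(np.exp(-lam * x)), 0, np.inf)
    assert np.isclose(val, 1.0 / lam.real)       # integral of e^{-Re(lam) x}

print("gamma = [0, oo) works for all sample eigenvalues")
```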
Differentiation with respect to \(A\) is a formal operation; but it is entirely rigorous, though not common notation. If I take a function:
\[
f(A) = \sum_{k=0}^\infty f_k A^k\\
\]
Then:
\[
\frac{d}{dA} f(A) = \sum_{k=1}^\infty kf_k A^{k-1}\\
\]
Since \(A\) is an \(n \times n\) matrix, the Cayley-Hamilton theorem reduces this to a finite polynomial in \(A\); so we're just differentiating a polynomial.
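As a sketch of this termwise differentiation (the example matrix and truncation depth are mine): for \(f(A) = e^{Ax}\), i.e. \(f_k = x^k/k!\), the series \(\sum_{k\ge 1} k f_k A^{k-1}\) should sum to \(x\,e^{Ax}\):

```python
import numpy as np
from scipy.linalg import expm

# Termwise derivative in A of f(A) = e^{Ax} = sum_k (x^k / k!) A^k:
# sum_k k (x^k / k!) A^{k-1} = x e^{Ax}
A = np.array([[0.3, 0.1],
              [0.2, 0.4]])
x = 1.5

dfdA = np.zeros_like(A)
Apow = np.eye(2)                  # holds A^{k-1}, starting at k = 1
fact = 1.0                        # holds k!
for k in range(1, 40):            # 40 terms is plenty for this small A
    fact *= k
    dfdA += k * (x**k / fact) * Apow
    Apow = Apow @ A

print(np.allclose(dfdA, x * expm(A * x)))  # → True
```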
The idea is that \(\frac{d^s}{dA^s}\) exists in the dual space, whereas \(\frac{d^s}{dx^s}\) exists in the normal space. I could never be bothered to work out too many of the details; but much of it holds weight in numerical calculations.
You'll probably see this more often in functional analysis, but the operation:
\[
x^{-s} e^{Ax} = \frac{1}{\Gamma(s)} \int_0^\infty e^{(A-t)x}\,t^{s-1}\,dt\\
\]
Is a perfectly valid (formal) operation.
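Read as a Weyl-type differintegral in \(A\), i.e. \(x^{-s} e^{Ax} = \frac{1}{\Gamma(s)}\int_0^\infty e^{(A-t)x}\,t^{s-1}\,dt\) (my reading of the operation above; since \(t\) is a scalar, \(e^{Ax}\) factors out of the \(t\)-integral), this can be sanity-checked numerically with a matrix and parameters of my choosing:

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import quad
from scipy.special import gamma

# Check x^{-s} e^{Ax} = (1/Gamma(s)) int_0^oo e^{(A - t)x} t^{s-1} dt:
# e^{(A - t)x} = e^{-tx} e^{Ax}, so the t-integral is the scalar Gamma integral.
A = np.array([[1.0, 0.5],
              [0.0, 2.0]])
s, x = 0.7, 1.3

lhs = x**(-s) * expm(A * x)
scalar, _ = quad(lambda t: np.exp(-t * x) * t**(s - 1), 0, np.inf)
rhs = (scalar / gamma(s)) * expm(A * x)
print(np.allclose(lhs, rhs))  # → True
```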
Also, I apologize; I was mostly just spitballing a lot. So I may have screwed up some details; but the idea roughly looks like this.