Functional Square Root - Printable Version

+- Tetration Forum (https://tetrationforum.org)
+-- Forum: Tetration and Related Topics (https://tetrationforum.org/forumdisplay.php?fid=1)
+--- Forum: Mathematical and General Discussion (https://tetrationforum.org/forumdisplay.php?fid=3)
+--- Thread: Functional Square Root (/showthread.php?tid=1402)
RE: Functional Square Root - tommy1729 - 06/25/2022

(06/25/2022, 08:51 AM)Catullus Wrote: How about \(\frac{d}{dx} f(x) = f(f(x))\)?

First, assuming \(f\) is not constant and is analytic almost everywhere, try the ansatz \(f(x) = a x^b\):

\[f'(x) = a b\, x^{b-1}, \qquad f(f(x)) = a\,(a x^b)^b = a^{b+1} x^{b^2}.\]

Setting \(f'(x) = f(f(x))\) and matching exponents and coefficients gives

\[b^2 = b - 1 \quad\Longrightarrow\quad b = \frac{1 \pm \sqrt{-3}}{2},\]

\[a^{b+1} = a b \quad\Longrightarrow\quad a^b = b,\]

so

\[a = \exp\!\left(\pm\tfrac{1}{3}\,(-1)^{1/6}\,\pi\right) \quad\text{or}\quad a = \exp\!\left(\pm\tfrac{5}{3}\,(-1)^{5/6}\,\pi\right).\]

Second, an analytic \(f'(x)\) usually grows at a rate similar to \(f(x)\), because:

1) \(\log f(x) = \int \frac{f'(x)}{f(x)}\,dx\).

2) For \(x > 1\), with \(f(x) > 0\) (for \(x > 0\)) and \(f'(x) > 0\), we have \(\int_0^x f'(t)\,dt < x\, f(x)\).

3) Superfunctions are not well-defined around multiple fixpoints, but rather on strictly increasing regions containing no fixpoints.

I think there are no interesting analytic solutions, or perhaps no analytic ones at all. Even the superfunction of a polynomial grows much faster than the polynomial, yet the derivative of a polynomial is a polynomial. Since Taylor's theorem requires polynomial approximations, as do most fixpoint methods, I seriously doubt there are nice solutions.

As for functions that do not grow fast but stay in a Fatou set: those boundaries are usually complicated and fractal, while the function itself is usually less complicated and less fractal-like.

regards

tommy1729


RE: Functional Square Root - MphLee - 06/25/2022

This thread seems like a wastebin, full of unrelated ideas. But I'll take it as a place to discuss interesting functional equations.

I'd like to say something about the equation \[Df = f\circ f\] It seems interesting. Idk if this is fruitful. Sure, there is Tommy's derivation from the hypothesis \(f(x) = a x^b\)... but I'd like to see if we can be more general: define the operator \(C_f g := g\circ f\). Now I'd like to look at the equation \[Dg = C_f g\]

So assume that \(g\in A\) for \(A\) an algebra over the ring \(R\). We can look for abstract derivations \(d: A\to A\), i.e. \(R\)-linear maps satisfying the Leibniz rule. Now, assume that \(A\) is an \(R\)-algebra of functions and that \(f\in A\) can be precomposed with functions in \(A\).

Fact: precomposition is always linear. Let \(f, g, h\in A\) and \(\lambda\in R\); then \(C_f(g+h) = (g+h)\circ f = g\circ f + h\circ f = C_f(g) + C_f(h)\) and \(C_f(\lambda g) = (\lambda g)\circ f = \lambda\,(g\circ f) = \lambda\, C_f(g)\).

So both the precomposition \(C_f: A\to A\) and the derivation \(d: A\to A\) are \(R\)-linear operators. They form a vector space: we can add, subtract, and scale them.

Given a derivation \(d\) and a precomposition \(C\) over \(A\), we now look for \(g\in A\) that satisfies \(dg = Cg\), i.e. solutions of the linear equation \(dg - Cg = 0\)... in other words we study the kernel of the operator:

\[(d - C)g = 0\]

The question can be reformulated as follows: find all \(f\in A\) s.t.

\[f\in \ker(d - C_f)\]

Question for experts in linear algebra... what can we deduce by studying the kernels of operators that are the difference of a derivation and a precomposition?
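The kernel question can at least be probed in finite dimensions: represent \(D\) and \(C_f\) as matrices on the space of polynomials of degree \(< N\) and examine the (near-)kernel of \(D - C_f\) numerically. A minimal sketch, where the test map \(f(x) = x/2\) and the truncation degree \(N = 8\) are illustrative assumptions:

```python
import numpy as np

N = 8  # truncation: work with polynomials of degree < N (coefficients c_0..c_{N-1})

def deriv_matrix(N):
    """Matrix of D on coefficient vectors: c_k x^k -> k c_k x^(k-1)."""
    D = np.zeros((N, N))
    for k in range(1, N):
        D[k - 1, k] = k
    return D

def precomp_matrix(f_coeffs, N):
    """Matrix of C_f : g -> g o f, truncated to degree < N.
    Column k holds the (truncated) coefficients of f(x)**k."""
    C = np.zeros((N, N))
    p = np.zeros(N)
    p[0] = 1.0  # f**0 = 1
    for k in range(N):
        C[:, k] = p
        p = np.polynomial.polynomial.polymul(p, f_coeffs)[:N]
        p = np.pad(p, (0, N - len(p)))  # re-pad: polymul trims trailing zeros
    return C

f = np.array([0.0, 0.5])  # hypothetical test map f(x) = x/2
K = deriv_matrix(N) - precomp_matrix(f, N)

# A singular value near zero signals an approximate kernel element, i.e. the
# truncation of a power-series solution of g'(x) = g(x/2).
print(np.linalg.svd(K, compute_uv=False))
```

For \(f(x) = x/2\), matching coefficients in \(g'(x) = g(x/2)\) gives the recurrence \((k+1)\,c_{k+1} = c_k/2^k\), which defines an entire solution; accordingly the truncated operator has a singular value very close to zero even though its strict kernel is trivial.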
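The power-law solution derived at the top of the thread can also be verified numerically. A minimal Python sketch, assuming the principal branch for the complex powers (matching the branch choice \(a^b = b\)):

```python
import cmath

# Candidate solution of f'(x) = f(f(x)):  f(x) = a*x**b
b = (1 + cmath.sqrt(-3)) / 2        # root of b^2 = b - 1
a = cmath.exp(cmath.log(b) / b)     # principal solution of a^b = b

def f(x):
    return a * x**b

x = 1.5 + 0j
h = 1e-6
deriv = (f(x + h) - f(x - h)) / (2 * h)  # central finite difference
print(abs(deriv - f(f(x))))              # ~1e-9: the two sides agree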
RE: Functional Square Root - tommy1729 - 06/25/2022

(06/25/2022, 09:49 PM)MphLee Wrote: This thread seems like a wastebin, full of unrelated ideas.

Hmm.

Basically we could look at it as a differential equation:

\[F'(x) = G(F(x))\]

This has the solution \(F(x) = H^{-1}(x + C)\), where \(H(y) = \int \frac{dy}{G(y)}\) and \(C\) is an appropriate constant of integration.

Now \(G = F\) itself. I think this ultimately forces \(F\) to be slower than \(\exp\) and finally reduces to my solution. Using \(f(x) = a x^b + h(x)\) seems to imply \(h(x) = 0\). Maybe I made a mistake. But I even think I have done this exercise before…

regards

tommy1729


RE: Functional Square Root - MphLee - 07/01/2022

Idk, Tommy. In my attempt I tried to set it up as \(Dg = C_f g\), i.e.

\[g'(x) = g(f(x))\]

where we solve for \(g\). I did that only because precomposition by \(f\) is linear. Only for that reason. In fact \(g'(x) - g(f(x)) = 0\) can be expressed in linear operators as \((D - C_f)[g(x)] = 0\)...

Anyways, I remember I read something about \(g^{-1}(g'(x))\)... somewhere... maybe something about the Legendre transform? Too bad I'm ignorant about that.

Just a note: define \(\mathcal F[g] = g^{-1}\circ Dg\); then what we are looking for are fixed points

\[\mathcal F[f] = f\]

So maybe we could unlock some iterative method: consider

\[f_{n+1}(x) = f_n^{-1}(f_n'(x))\]

and take the limit \(\lim f_n\).


RE: Functional Square Root - tommy1729 - 07/01/2022

(07/01/2022, 12:10 AM)MphLee Wrote: Idk, Tommy. In my attempt I tried to set it up as \(Dg = C_f g\), i.e. \[g'(x) = g(f(x))\] where we solve for \(g\).

I think it all solves to, or converges to, my solution.

Consider that taking many iterations of inverses of analytic functions usually produces singularities that keep shrinking the radius of convergence. So if the radius goes to 0, or the derivative is no longer defined, the concept \(f'(x)\) becomes dubious. But not in my example.

regards

tommy1729
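The separable recipe \(F(x) = H^{-1}(x + C)\) used above can be checked symbolically. A minimal sketch, with the hypothetical choice \(G(y) = y^2\) standing in for a general \(G\):

```python
import sympy as sp

x, C, y, F = sp.symbols('x C y F')

# Recipe for F'(x) = G(F(x)):  H(F(x)) = x + C  with  H' = 1/G.
G = y**2                                            # hypothetical test choice of G
H = sp.integrate(1 / G, y)                          # H(y) = -1/y
Fsol = sp.solve(sp.Eq(H.subs(y, F), x + C), F)[0]   # invert: F = H^{-1}(x + C)
print(sp.simplify(sp.diff(Fsol, x) - G.subs(y, Fsol)))  # prints 0: F solves the ODE
```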
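Restricting \(\mathcal F[g] = g^{-1}\circ Dg\) to the family \(g(x) = a x^b\) connects the iteration proposal to the power-law solution from the start of the thread: on that family, \(\mathcal F\) acts as \((a, b)\mapsto (b^{1/b},\, 1 - 1/b)\), and the pair \((a, b)\) derived earlier is exactly a fixed point. A minimal sketch (the restriction to this two-parameter family is an assumption, not a convergence proof):

```python
import cmath

def F(a, b):
    # g(x) = a*x**b  =>  g'(x) = a*b*x**(b-1),  g^{-1}(y) = (y/a)**(1/b),
    # so g^{-1}(g'(x)) = b**(1/b) * x**((b-1)/b):  (a, b) -> (b**(1/b), 1 - 1/b)
    return cmath.exp(cmath.log(b) / b), 1 - 1 / b

b = (1 + cmath.sqrt(-3)) / 2        # b^2 = b - 1, hence 1 - 1/b = b
a = cmath.exp(cmath.log(b) / b)     # a^b = b
a2, b2 = F(a, b)
print(abs(a2 - a), abs(b2 - b))     # both ~0: (a, b) is a fixed point
```

Note this says nothing about convergence: on the \(b\)-coordinate the map \(m(b) = 1 - 1/b\) is an elliptic Möbius transformation with \(m^3 = \mathrm{id}\), so plain iteration cycles with period 3 around the fixed points rather than converging to them; some damping would be needed to make the iterative method practical.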