10/03/2021, 10:12 PM
using sinh(x) ?
07/16/2022, 08:25 AM
see posts 8 and 9 here:
https://math.eretrandre.org/tetrationfor...1#pid10731 ( copied )

f^[s+t](z) = f^[s](f^[t](z)) = f^[t](f^[s](z))

Actually Paul Lévy [1] showed how to obtain an iteration of e^x if we have an iteration of e^x - 1. Say \( \beta \) is an Abel function of e^x - 1; then \( \alpha(x)-\alpha(x_0)=\lim_{n\to\infty} \beta(\exp^{[n]}(x)) - \beta(\exp^{[n]}(x_0)) \) is an Abel function of \( e^x \). This should also work with \( \beta \) being the Abel function of \( 2\sinh(x) \). This approach is actually equivalent to the "change of base" approach we considered here on the forum; Walker [2] also used a similar method. (But I am at the moment too lazy to detail how exactly they imply each other.) It is still open whether it is analytic, but it is proven to be infinitely differentiable in [2].

[1] Lévy, P. (1927). Sur l'itération de la fonction exponentielle. C. R., 184, 500–502.
[2] Walker, P. L. (1991). Infinitely differentiable generalized logarithmic and exponential functions. Math. Comput., 57(196), 723–733.

Ok, first let us verify that it is indeed an iteration of exp, i.e. that \( f^z(x)=\text{TommySexp}_e(z,x) \) indeed satisfies \( f^{v+w}(x)=f^{v}(f^{w}(x)) \) and \( f^1(x)=\exp(x) \). Neglecting some rules of properly evaluating limits, we get

\( f^v(f^w(x))=\lim_{n\to\infty} \ln^{[n]} (2\sinh^{[v]}(\exp^{[n]}(\ln^{[n]}(2\sinh^{[w]}(\exp^{[n]}(x))))))=\lim_{n\to\infty} \ln^{[n]}(2\sinh^{[v+w]}(\exp^{[n]}(x)))=f^{v+w}(x) \)

and

\( f^1(x)=\lim_{n\to\infty} \ln^{[n]}(2\sinh(\exp^{[n]}(x)))=\exp(x) \)

because towards infinity \( 2\sinh \) gets arbitrarily close to \( \exp \). Basically that's the iteration equivalent of the Abel function Lévy proposes: \( \beta(x) = \lim_{n\to\infty} \alpha(\exp^{[n]}(x)) - \alpha(\exp^{[n]}(x_0)) \), where \( \alpha \) is the Abel function of \( 2\sinh \) (or, in Lévy's case, of \( \exp(x)-1 \)).
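As a numeric sanity check of the verification above, here is a sketch in Python. The construction of \( 2\sinh^{[t]} \) via the Koenigs function at the repelling fixpoint 0 (multiplier 2) matches the method described in this thread, but all function names and the truncation choices (n = 40 levels for the Koenigs coordinate, m = 3 limit steps) are my own scaffolding, not forum code.

```python
import math

def sinh2(x):
    """2*sinh(x), the base function of the method."""
    return 2.0 * math.sinh(x)

def asinh2(x):
    """Inverse of 2*sinh(x): asinh(x/2)."""
    return math.asinh(x / 2.0)

def koenigs(x, n=40):
    """Koenigs coordinate at the repelling fixpoint 0 of 2sinh.

    The multiplier is (2sinh)'(0) = 2, so psi(x) = lim 2^n * g^[n](x)
    with g the inverse of 2sinh.
    """
    for _ in range(n):
        x = asinh2(x)
    return 2.0**n * x

def koenigs_inv(y, n=40):
    """Inverse Koenigs coordinate: psi^{-1}(y) = lim (2sinh)^[n](y / 2^n)."""
    x = y / 2.0**n
    for _ in range(n):
        x = sinh2(x)
    return x

def sinh2_t(t, x):
    """Fractional iterate 2sinh^[t](x) = psi^{-1}(2^t * psi(x))."""
    return koenigs_inv(2.0**t * koenigs(x))

def tommy_sexp(t, x, m=3):
    """ln^[m]( 2sinh^[t]( exp^[m](x) ) ), a few limit steps only."""
    for _ in range(m):
        x = math.exp(x)
    x = sinh2_t(t, x)
    for _ in range(m):
        x = math.log(x)
    return x

# semigroup law for the 2sinh iterate: f^[0.3](f^[0.7](x)) = f^[1](x) = 2sinh(x)
print(sinh2_t(0.3, sinh2_t(0.7, 1.0)), sinh2(1.0))
# f^1 reproduces exp already very closely for m = 3
print(tommy_sexp(1.0, 0.0), math.exp(0.0))
```

Even at m = 3 the outer logarithms kill the difference between 2sinh and exp (the error term is of order exp(-2 exp^[m](x))), which is why the limit converges so fast in practice.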
The superfunction \( \sigma \) is then (the inverse of \( \beta \)): from \( \lim_{n\to\infty} \alpha(\exp^{[n]}(x)) - \alpha(\exp^{[n]}(x_0))=y \) we get \( \sigma(y)=x=\lim_{n\to\infty} \log^{[n]}(\alpha^{-1}(y+\alpha(\exp^{[n]}(x_0)))) \), i.e. \( \sigma(y)=\lim_{n\to\infty} \log^{[n]}(2\sinh^{[y]}(\exp^{[n]}(x_0))) \), which is the same as Tommy's superfunction.

***

tommy1729

JmsNxn

So, this is actually a defining property of the standard Schröder iteration. But it's a little difficult to fully flesh out why. To begin, I'll construct an arbitrary iteration which has a constant noodle, and show there are many of them. If you iterate locally about a fixed point, and your solution satisfies \(f^t(p) = p\), then the iteration is expressible via Schröder iteration (provided that \(|f'(p)| \neq 0,1\)). For convenience, assume that \(|f'(p)| < 1\). You can actually prove this pretty fast. Assume that \(f^t(x)\) is a superfunction in \(t\) about a fixed point \(p\), that \(x\) is in a neighborhood of \(p\), and that \(f^t(p) = p\). Well then:

\[ \Psi(f^{t+1}(x)) = \lambda \Psi(f^t(x))\\ \]

So that:

\[ \theta(t) = \frac{\Psi(f^t(x))}{\lambda^t \Psi(x)}\\ \]

must be some 1-periodic function \(\theta(t)\), such that:

\[ f^t(x) = \Psi^{-1}\left(\lambda^{t}\theta(t) \Psi(x)\right)\\ \]

In fact, any 1-periodic function will work fine here: it has a constant noodle and is a superfunction, but it is not a Schröder iteration. Now let's add one more constraint: \(f^{t}(f^{s}(x)) = f^{t+s}(x)\). Well then, \(\theta\) must be constant, by which we are guaranteed that it's a Schröder iteration... So to clarify: any fractional iteration with a constant noodle is a Schröder iteration. But there are plenty of superfunctions of \(f\) which have a constant noodle and, in turn, aren't fractional iterations (they don't satisfy the semigroup law).

This is not acceptable for me. Not formal, detailed and general enough.
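The point about non-constant noodles can be shown with a tiny numeric illustration. This is a sketch with an assumed toy map f(x) = x/2 (so \( \Psi \) is the identity and \( \lambda = 1/2 \)) and an arbitrary 1-periodic \( \theta \) of my choosing: every such \( \theta \) yields a superfunction of f, but only a constant \( \theta \) satisfies the semigroup law.

```python
import math

LAM = 0.5                 # multiplier of the toy map f(x) = x/2
f = lambda x: LAM * x     # Psi is the identity for this linear map

def theta(t):
    """A non-constant 1-periodic 'noodle' (arbitrary choice)."""
    return 1.0 + 0.1 * math.sin(2.0 * math.pi * t)

def g(t, x):
    """Candidate iterate: Psi^{-1}( lam^t * theta(t) * Psi(x) )."""
    return LAM**t * theta(t) * x

# superfunction property g(t+1, x) = f(g(t, x)) holds for every t,
# because theta(t+1) = theta(t)
print(g(1.3, 2.0), f(g(0.3, 2.0)))
# ...but the semigroup law fails, because theta is not additive
print(g(0.3, g(0.4, 1.0)), g(0.7, 1.0))
```

The first pair of numbers agrees to machine precision; the second pair differs by roughly 0.16, exhibiting a superfunction with a constant noodle that is not a fractional iteration.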
It is actually quite simple, even without fixpoints. We want \(f^{t}(f^{s}(x)) = f^{t+s}(x)\). Let \( \theta(v) \) be a 1-periodic function. Do we then also get \(f^{t+\theta(t)}(f^{s+\theta(s)}(x)) = f^{t+s+\theta(s+t)}(x)\)? Well, that would require \( \theta(v+1) = \theta(v) \) and \( \theta(t + s) = \theta(t) + \theta(s) \), so \( \theta \) must be constant.

Keywords: Cauchy functional equation, axiom of choice, linear function, non-constructive, non-periodic.

So we get a (local?) uniqueness criterion. This has many consequences. (For instance, if you use iterations of another function to get to iterations of yours, it is probably required that the other function has the semigroup property.) So we have uniqueness up to convergence speed, of course. This should be on page 1 of any dynamics book!

regards

tommy1729

So we want f^[s+t](z) = f^[s](f^[t](z)) = f^[t](f^[s](z)). Now 2sinh^[s+t](x) has this property for real x, or 2sinh^[s+t](z) has it for complex z around the real axis; that is, if we use the Koenigs function around the fixpoint 0 of 2sinh for the construction. It is easy to show that ln( 2sinh^[s+t](exp(x)) ) has the same property, and by induction ln^[n]( 2sinh^[s+t](exp^[n](x)) ) for any positive integer n also has the property. Notice ln^[n]( 2sinh^[s+t](exp^[n](x)) ) is also analytic on the real line. Letting n go to +oo gives the solution on the real line:

exp^[s+t](x) = lim_n ln^[n]( 2sinh^[s+t](exp^[n](x)) )

and this limit has the same property!

See also the Lévy / Walker construction quoted above: the same approach works with \( \beta \) the Abel function of \( 2\sinh(x) \).
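The step "\( \theta \) periodic and additive implies \( \theta \) constant" can be spelled out; assuming continuity, the two conditions in fact force \( \theta \equiv 0 \) (so the constant is zero):

```latex
\theta(0) = \theta(0+0) = \theta(0) + \theta(0) \implies \theta(0) = 0.
% Continuity + additivity is the Cauchy functional equation, whose only
% continuous solutions are linear (non-continuous ones need the axiom of choice):
\theta(v) = c\,v .
% Periodicity then kills the slope:
\theta(v+1) = \theta(v) \implies c\,(v+1) = c\,v \implies c = 0,
\qquad \text{hence } \theta \equiv 0 \text{ and } f^{\,t+\theta(t)} = f^{\,t}.
```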
So my 2sinh solution has this uniqueness criterion, making it quite special. Other methods I proposed that are similar * real entire functions with a unique real fixpoint at 0, real derivative larger than 1 there, fast asymptotic to exp(x), and some minor details * also carry this property... SO they are unique!! This also explains that using functions other than 2sinh(x) must give the same function. THEREFORE the nonreal fixpoints of 2sinh are not really an issue.
Also, 2sinh(x) is easy to handle since its Maclaurin series has all nonnegative coefficients.

***

Even more interesting is that the 2sinh method can be extended to bases > exp(1/2), and probably by analytic continuation to all bases larger than eta ( e^(1/e) ). But for bases =< exp(1/2) we get issues with multiple fixpoints or with the derivatives at the fixpoints. THAT is also the reason why I proposed alternative similar methods. Since those alternatives agree on bases larger than exp(1/2), by the uniqueness criterion they must be the 2sinh method extended to lower bases!! I hope that is clear to everyone.

Regards

tommy1729

ps: wiki will not accept the 2sinh method in the tetration section, even though it does mention non-C^oo solutions. meh!
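The fixpoint issue for small bases can be seen directly. Assuming the base-b analogue of 2sinh is b^x - b^(-x) (an assumed form, chosen so that base e recovers 2sinh(x) itself): it fixes 0 with multiplier 2 ln(b), which exceeds 1, i.e. gives a repelling fixpoint suitable for the Koenigs construction, exactly when b > exp(1/2).

```python
import math

def mult_at_zero(b):
    """Derivative of x -> b**x - b**(-x) at its fixed point 0."""
    # d/dx (b^x - b^-x) = ln(b) * (b^x + b^-x), which is 2*ln(b) at x = 0
    return 2.0 * math.log(b)

for b in (1.2, math.exp(0.5), 2.0, math.e):
    kind = "repelling" if mult_at_zero(b) > 1.0 else "neutral/attracting"
    print(f"base {b:.4f}: multiplier {mult_at_zero(b):.4f} ({kind})")
```

At b = exp(1/2) the multiplier is exactly 1 (a neutral fixpoint), which is where the derivative trouble mentioned above begins; for base e the multiplier is 2, the value used in the 2sinh construction.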
12/07/2022, 01:14 PM
see also the related topic ( and an alternative limit ! )
https://math.eretrandre.org/tetrationfor...p?tid=1678 regards tommy1729
02/06/2023, 10:42 PM
It is perhaps interesting to describe how fast the 2sinh method works.
We consider "speed" here by considering changing bases. The gaussian method changes the bases as exp(t(s)) = exp( (1+erf(s))/2 ), with F(s+1) = exp( t(s) F(s) ). So we get exp(t(v) exp( t(v-1) exp( t(v-2) exp(... This is fast, but much slower than the 2sinh method, because there the bases change as a function of iterations of 2sinh(x).

***

Well, ln ln ln ... 2sinh^[n] exp exp ... gives most attention to the 2sinh iteration part; we could also consider the exp iteration part... all is relative after all:

lim ln ln ... 2sinh^[x] exp^[y]( exp exp ...

is essentially equivalent to

lim ln ln ... 2sinh^[x+y]( exp exp ...

Keeping x between zero and 1 gives the most practical way, of course. But to keep things simple we consider 2sinh here. As we will see later, the details of 2sinh vs exp iterations are not so important after all.

***

SO suppose 2sinh(x) has base A(x) [ setting 2sinh(x) = A^x is equivalent ]; then the method has "speed" A(2sinh^[n](x_0)), in other words it converges faster than superexponential! Indeed the new base depends on the previous iteration of 2sinh(x), so we get somewhat superexponential growth. But a little more precision would be nice. Well, a closed form simpler than the superfunction of 2sinh or tetration is probably not possible, but we can describe the limiting behaviour very well. It comes down to understanding A(x) at +infinity. Strictly, 2sinh(x) = A^x gives ln A(x) = ln( 2sinh(x) ) / x; in what follows write A(x) for this ratio ln( 2sinh(x) ) / x. Now consider the illuminating limit:

lim x to +oo ( ln( 2sinh(x) ) / x )^(C + x exp(2x)) = exp(-1)

or equivalently ( A(x) )^(C + x exp(2x)) = exp(-1) in the limit, for any constant C. This suggests that A(x) in the limit behaves a lot like

A(x) = 1 - 2/W(2 (x + D))

where D is a constant and W is the Lambert W function. So we get some idea how fast this method works.

***

I hesitated to post this earlier because the asymptotic is not very precise. Perturbation theory and Taylor series might be a better way. Maybe there are better ways. All roads lead to Rome, but some require more effort than others.
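The illuminating limit can be checked numerically. A sketch (with C = 0, which does not affect the limit; the helper name A_pow is mine): since ln( 2sinh(x) )/x = 1 + log1p(-exp(-2x))/x, using log1p keeps full precision even when the ratio is within 1e-10 of 1.

```python
import math

def A_pow(x):
    """( ln(2*sinh(x)) / x )^(x * exp(2x)), computed stably via log1p."""
    # ln(2*sinh(x)) = x + ln(1 - exp(-2x)), so the ratio is 1 + log1p(-e^{-2x})/x
    log_ratio = math.log1p(math.log1p(-math.exp(-2.0 * x)) / x)
    return math.exp(x * math.exp(2.0 * x) * log_ratio)

for x in (4.0, 6.0, 8.0, 10.0):
    print(x, A_pow(x), "->", math.exp(-1.0))
```

The printed values approach exp(-1) = 0.36788... rapidly; the error shrinks like exp(-2x), consistent with the claimed limit.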
But I like limits, as you know, so I go a step further:

lim x to +oo [ ( ln( 2sinh(x) ) / x )^(x exp(2x)) - exp(-1) ] exp(2x) = - 1/(2e)

At this point I think using an integral transform is probably smart, Laplace probably. The reason is that Taylor series might not be so useful because of a finite radius of convergence; a Taylor series after an integral transform might be useful though. A step further we get

lim x to +oo [ [ ( ln( 2sinh(x) ) / x )^(x exp(2x)) - exp(-1) ] exp(2x) + 1/(2e) ] x = - 1/(2e)

These methods are probably not optimal, but I'm dropping them here anyway for those who care. Also, going further by hand becomes quite dangerous and a lot of work; it would require more paper.

***

Notice that the earlier comment on 2sinh iterations vs exp iterations becomes more important when studying these more precise limits. This is logical, although I admit the details are not completely understood by myself. Anyway, we focus on A(x) here, which is much more clearly defined.

***

Upper and lower bounds on A(x) can also be computed and might be more insightful than the situation at infinity. On the other hand, convergence to infinity is fast and good, so maybe not that much more interesting; well, maybe as an exercise.

***

regards

tommy1729
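Both refined limits can also be probed numerically, though the cancellations are severe: the bracket is a difference of two numbers agreeing to ~8 digits, so a sketch needs expm1/log1p throughout (function names are mine). Expanding ln(2sinh x)/x = 1 - e^{-2x}/2x + ... shows the first expression equals -(1/2 + 1/(2x))/e up to exponentially small terms, which both explains the limit -1/(2e) and shows the O(1/x) convergence rate visible in the output.

```python
import math

def inner(x):
    """Exponent of (ln(2 sinh x)/x)^(x e^{2x}), stable via nested log1p."""
    return x * math.exp(2.0 * x) * math.log1p(math.log1p(-math.exp(-2.0 * x)) / x)

def first_refined(x):
    """[ (ln(2 sinh x)/x)^(x e^{2x}) - e^{-1} ] * e^{2x}"""
    # expm1 avoids the cancellation when subtracting e^{-1}
    return math.exp(-1.0) * math.expm1(inner(x) + 1.0) * math.exp(2.0 * x)

def second_refined(x):
    """[ first_refined(x) + 1/(2e) ] * x"""
    return (first_refined(x) + 1.0 / (2.0 * math.e)) * x

for x in (6.0, 9.0, 12.0):
    print(x, first_refined(x), second_refined(x), "->", -1.0 / (2.0 * math.e))
```

The first column creeps toward -1/(2e) = -0.18394 only at rate 1/x, while the second (already corrected by the 1/(2e) term) sits essentially on the limit, matching the claim that the next-order limit is again -1/(2e).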