06/07/2022, 12:04 PM
I'm reconsidering an old idea of mine.
For clarity: this is not "the 2sinh method".
But it also uses 2sinh.
f(n,x) = 2sinh(x + n)*exp(-n) = exp(x) - exp(-x - 2n).
Notice how this function f(n,x) approximates exp(x) very well for Re(x) > -n/2: the relative error is exp(-2x - 2n), which is at most exp(-n) there.
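As a quick numerical sanity check (just a sketch in Python; the function name f and the sample points are my own choices, not part of the method):

```python
import math

def f(n, x):
    # f(n, x) = 2*sinh(x + n) * exp(-n) = exp(x) - exp(-x - 2n)
    return 2.0 * math.sinh(x + n) * math.exp(-n)

n = 10
for x in (0.0, -3.0, -5.0):            # all satisfy Re(x) > -n/2
    rel_err = abs(f(n, x) - math.exp(x)) / math.exp(x)
    # relative error equals exp(-2x - 2n), at most exp(-n) on this range
    print(f"x = {x:5.1f}  relative error = {rel_err:.2e}")
```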
It also has exactly one real fixpoint, and the derivative there is about equal to minus the fixpoint.
For instance 2sinh(x + 10)*exp(-10) has a fixpoint at about -23.1416, and the derivative there is again about 23.1416: by the fixpoint equation it equals minus the fixpoint up to a term 2exp(x*) of order 10^-10.
This approximates exp(x) very well for Re(x) > -5.
In fact the derivative converges (conveniently?) to minus the fixpoint as n goes to +oo.
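One way to reproduce the n = 10 numbers is a Newton iteration on f(n,x) - x = 0; the sketch below is mine, and the starting guess x0 = -(2n + log(2n)) is just my heuristic, coming from the observation that for large -x the fixpoint equation is roughly exp(-x - 2n) = -x:

```python
import math

def f(n, x):
    return math.exp(x) - math.exp(-x - 2 * n)   # = 2*sinh(x + n)*exp(-n)

def fprime(n, x):
    return math.exp(x) + math.exp(-x - 2 * n)

def real_fixpoint(n, tol=1e-12):
    # Newton iteration on g(x) = f(n, x) - x, started near the
    # approximate solution of exp(-x - 2n) = -x
    x = -(2 * n + math.log(2 * n))
    for _ in range(100):
        step = (f(n, x) - x) / (fprime(n, x) - 1.0)
        x -= step
        if abs(step) < tol:
            break
    return x

x_star = real_fixpoint(10)
print(x_star)               # about -23.1416
print(fprime(10, x_star))   # close to -x_star, as claimed above
```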
The idea is now of course that

lim_{n -> oo} f(n,x)^[s] = exp^[s](x)
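For integer s this limit is easy to check numerically (a sketch; the test point x = 0.1 and the iteration count 3 are arbitrary choices of mine, and of course fractional s is the real content of the idea):

```python
import math

def f(n, x):
    return math.exp(x) - math.exp(-x - 2 * n)

def iterate(g, k, x):
    # k-fold composition g^[k](x)
    for _ in range(k):
        x = g(x)
    return x

x = 0.1
exact = iterate(math.exp, 3, x)        # exp^[3](0.1)
for n in (5, 10, 15):
    approx = iterate(lambda t: f(n, t), 3, x)
    print(n, abs(approx - exact))      # error shrinks rapidly as n grows
```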
I strongly recommend using fast-converging methods for the fixpoint iteration, and by that I *assume* higher-order derivatives are necessary.
I think it will be important both numerically and theoretically.
For Re(x) > -n + 1 we get about the same fixpoints and cycles as for exp(x), since f(n,x) converges to it so fast.
But for Re(x) << -n + 1 the situation is not yet clear to me, although for large negative x we get close to a (scaled) sinh again.
This matters for the radius of convergence and analytic continuation, of course.
I'm not sure whether this differs from other methods, but the way of computation seems not equivalent.
I want to point out that a (locally uniformly) convergent sequence of functions analytic near the real line necessarily has an analytic limit; this is a theorem!
So if it converges and is C^oo, it MUST be analytic for Re(x) > -n/2, and by the limit, for all x near the real line.
I assume this old idea was missed by most, since I did not give it much attention and investigated other approaches instead.
But I feel we are ready for this.
regards
tommy1729
Tom Marcel Raes