(02/16/2023, 11:29 PM)tommy1729 Wrote: We know that in the limit x to +oo , f(x,n,m) = exp^[n]( (ln^[n](x))^m ) for all n,m > 1 grows slower than exp(x).
In fact, even any fixed number of iterations of f still grows slower than exp(x), at least in the limit x to +oo.
This implies that for all sufficiently large x, f(x,n,m) < exp^[v](x) for any v > 0.
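A quick numerical sanity check of that claim (a Python sketch; the helper names iter_ln and log_f are my own, not from the post — comparing ln f against x, rather than f against exp(x), sidesteps floating-point overflow):

```python
import math

def iter_ln(x, n):
    # ln^[n](x): apply ln n times
    for _ in range(n):
        x = math.log(x)
    return x

def log_f(x, n, m):
    # Natural log of f(x,n,m) = exp^[n]( (ln^[n](x))^m ).
    # Working with ln(f) = exp^[n-1]( (ln^[n](x))^m ) avoids overflow.
    y = iter_ln(x, n) ** m
    for _ in range(n - 1):
        y = math.exp(y)
    return y

# ln(f) stays far below x = ln(exp(x)), so f(x,2,2) << exp(x):
for x in (1e2, 1e4, 1e6):
    print(x, log_f(x, 2, 2))
```

Already at x = 10^6 the log of f(x,2,2) is only in the hundreds, while ln(exp(x)) = 10^6.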
Now suppose we want a function that grows ( in the limit ) faster than f(x,n,m) but slower than exp^[v] for some 0 < v < 1.
Also suppose we want to avoid defining a sexp or slog, and to avoid functions that are clearly asymptotic to exp^[v] for an easy-to-see value of v with 0 < v < 1 ( so no half-iterate of sinh(x) or exp(x)-1 , no fake function theory and so on ).
Also no Taylor, Fourier, or Padé methods, and no typical integral transforms.
We want an infinite sum.
So I define the following function for x > 0 :
T_n(x) = ( 1/(2n)! ) * sinh^[n]( (arcsinh^[n](x))^2 ) / sinh^[n]( (arcsinh^[n](n))^2 )
T(x) = T_1(x) + T_2(x) + T_3(x) + ...
T(x) = sum_n T_n(x)
where the sum is over the strict positive integers n.
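For what it's worth, the partial sums can be evaluated numerically (a sketch; T_term, T_partial and the truncation at N = 6 are my own choices, and naive floats will overflow for very large x or deep n, so this is only for modest inputs):

```python
import math

def iter_sinh(x, n):
    # sinh^[n](x): apply sinh n times
    for _ in range(n):
        x = math.sinh(x)
    return x

def iter_arcsinh(x, n):
    # arcsinh^[n](x): apply arcsinh n times
    for _ in range(n):
        x = math.asinh(x)
    return x

def T_term(x, n):
    # T_n(x) = (1/(2n)!) * sinh^[n]((arcsinh^[n](x))^2) / sinh^[n]((arcsinh^[n](n))^2)
    num = iter_sinh(iter_arcsinh(x, n) ** 2, n)
    den = iter_sinh(iter_arcsinh(n, n) ** 2, n)
    return num / den / math.factorial(2 * n)

def T_partial(x, N=6):
    # truncation of T(x) = sum_{n >= 1} T_n(x) at n = N
    return sum(T_term(x, n) for n in range(1, N + 1))

print(T_partial(10), T_partial(20))
```

Note that the normalization by sinh^[n]( (arcsinh^[n](n))^2 ) makes T_n(n) = 1/(2n)! exactly, which together with the factorial keeps the tail of the sum small for fixed x.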
Now this function T(x) eventually grows faster than exp^[n]( (ln^[n](x))^2 ), or even exp^[n]( (ln^[n](x))^m ), for all large x and all integers n, m > 1.
But how fast does T(x) grow ?
What is the smallest value of v such that 0 < v < 1 and
LIM x to +oo
T(x) < exp^[v](x)
??
Of course, if we understood the speed of sinh^[x]( (arcsinh^[x](x))^2 ) very well, that would already be a great help and might even resolve all the problems.
The problem is that sinh^[n]( (arcsinh^[n](x))^2 ) grows very fast and reaches relatively high values for small x, before it becomes relatively slow compared to exp(x).
So understanding a relative max ( and maybe some kind of average ) of sinh^[x]( (arcsinh^[x](x))^2 ) would be very enlightening.
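That behavior is already visible for the integer iterate n = 2 (a sketch; log_g is my own name, and the last sinh is taken in log space to dodge overflow): ln of sinh^[2]( (arcsinh^[2](x))^2 ) exceeds x at x = 10 and x = 100, but falls below x by x = 1000.

```python
import math

def iter_arcsinh(x, n):
    # arcsinh^[n](x)
    for _ in range(n):
        x = math.asinh(x)
    return x

def log_g(x, n):
    # ln of g_n(x) = sinh^[n]( (arcsinh^[n](x))^2 ), with the final sinh
    # taken in log space (ln sinh y ~ y - ln 2 for large y) to avoid overflow
    y = iter_arcsinh(x, n) ** 2
    for _ in range(n - 1):
        y = math.sinh(y)
    return y - math.log(2) if y > 700 else math.log(math.sinh(y))

# sign of ln g_2(x) - x flips: g_2 beats exp(x) for small x, then loses
for x in (10.0, 100.0, 1000.0):
    print(x, log_g(x, 2) - x)
```

So the crossover against exp(x) for n = 2 happens somewhere between x = 100 and x = 1000; tracking how that crossover point moves with n seems like the discrete shadow of the relative max asked about above.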
regards
tommy1729

