02/16/2023, 11:29 PM
(This post was last modified: 02/17/2023, 09:55 PM by tommy1729. Edit Reason: sleepy)
We know that, in the limit x to +oo, f(x,n,m) = exp^[n]( (ln^[n](x))^m ) grows slower than exp(x) for all n, m > 1.
In fact, any fixed number of compositions of such functions still grows slower than exp(x), at least in the limit x to +oo.
This implies that, for all sufficiently large x, f(x,n,m) < exp^[v](x) for any v > 0.
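As a quick numerical illustration (my own sketch, not part of the argument), one can check the n = 2 case in the log domain to avoid overflow: ln f(x, 2, m) = exp( (ln ln x)^m ), while ln exp(x) = x, and the former is far smaller once x is large enough.

```python
import math

def iter_ln(x, n):
    # ln^[n](x): apply the natural log n times
    for _ in range(n):
        x = math.log(x)
    return x

# Compare in the log domain so nothing overflows:
# for n = 2, ln f(x, 2, m) = exp( (ln ln x)^m ), while ln exp(x) = x.
x = math.exp(math.exp(5.0))          # x = e^(e^5), about 2.8e64
for m in (2, 3):
    ln_f = math.exp(iter_ln(x, 2) ** m)
    print(f"m={m}: ln f = {ln_f:.3e} < x = {x:.3e} : {ln_f < x}")
```

Note that x must be large enough: at x = e^(e^3), the m = 3 comparison still fails, which is why the claim only holds in the limit.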
Now suppose we want a function that grows (in the limit) faster than f(x,n,m) but slower than exp^[v] for some 0 < v < 1.
Also suppose we want to avoid defining a sexp or slog, and to avoid functions that are visibly asymptotic to exp^[v] for an easy-to-see value of v with 0 < v < 1 (so no half-iterate of sinh(x) or exp(x) - 1, no fake function theory, and so on).
Also no Taylor, Fourier, or Padé methods, and no typical integral transforms.
We want an infinite sum.
So I define the following function for x > 0 :
T_n(x) = ( 1/ (2n)! ) * sinh^[n]( (arcsinh^[n](x))^2 ) / sinh^[n]( (arcsinh^[n](n))^2 )
T(x) = T_1(x) + T_2(x) + T_3(x) + ...
T(x) = sum_n T_n(x)
where the sum is over the strictly positive integers n.
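Since the post avoids TeX, a small numerical sketch may help make the definition concrete. The following Python snippet (my own illustration, not from the original) computes partial sums of T(x); the (2n)! factor makes the tail terms negligible for moderate x, so a handful of terms suffices.

```python
import math

def sinh_iter(x, n):
    # sinh^[n](x): apply sinh n times
    for _ in range(n):
        x = math.sinh(x)
    return x

def arcsinh_iter(x, n):
    # arcsinh^[n](x): apply arcsinh n times
    for _ in range(n):
        x = math.asinh(x)
    return x

def T_term(x, n):
    # T_n(x) = (1/(2n)!) * sinh^[n]((arcsinh^[n](x))^2) / sinh^[n]((arcsinh^[n](n))^2)
    num = sinh_iter(arcsinh_iter(x, n) ** 2, n)
    den = sinh_iter(arcsinh_iter(n, n) ** 2, n)
    return num / (math.factorial(2 * n) * den)

def T(x, terms=8):
    # partial sum of the series; (2n)! damps the tail quickly
    return sum(T_term(x, n) for n in range(1, terms + 1))

for x in (2.0, 5.0, 10.0):
    print(f"T({x}) ~ {T(x):.6g}")
```

One sanity check built into the definition: T_n(n) = 1/(2n)! exactly, since numerator and denominator then coincide.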
Now this function T(x) eventually grows faster than exp^[n]( (ln^[n](x))^2 ), or even exp^[n]( (ln^[n](x))^m ), for all sufficiently large x and all integers n, m > 1.
But how fast does T(x) grow ?
What is the smallest value of v with 0 < v < 1 such that, in the limit x to +oo,
T(x) < exp^[v](x) ?
***
Btw, this series might be a good first guess in iterative or recursive methods for finding the semi-exponential semi-exp(x) ...
***
Now T(x) is of course just an instance of a more general idea.
T_g_n(x) = ( 1/ (2n)! ) * sinh^[n]( (arcsinh^[n](x))^2 ) / sinh^[n]( (arcsinh^[n](g(n)))^2 )
T_g(x) = T_g_1(x) + T_g_2(x) + T_g_3(x) + ...
T_g(x) = sum_n T_g_n(x)
where the sum is over the strictly positive integers n, and g(n) is some strictly increasing integer sequence.
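A hedged sketch of the generalized sum, with g passed in as a parameter; the choice g(n) = 2^n below is purely illustrative (not suggested in the post). Since every iterated map involved is increasing, a larger g(n) enlarges each denominator and so shrinks every term.

```python
import math

def sinh_iter(x, n):
    # sinh^[n](x): apply sinh n times
    for _ in range(n):
        x = math.sinh(x)
    return x

def arcsinh_iter(x, n):
    # arcsinh^[n](x): apply arcsinh n times
    for _ in range(n):
        x = math.asinh(x)
    return x

def T_g(x, g, terms=8):
    # T_g(x) = sum_n (1/(2n)!) * sinh^[n]((arcsinh^[n](x))^2)
    #                          / sinh^[n]((arcsinh^[n](g(n)))^2)
    total = 0.0
    for n in range(1, terms + 1):
        num = sinh_iter(arcsinh_iter(x, n) ** 2, n)
        den = sinh_iter(arcsinh_iter(g(n), n) ** 2, n)
        total += num / (math.factorial(2 * n) * den)
    return total

# g(n) = n recovers the original T(x); g(n) = 2^n is an illustrative
# alternative that yields a pointwise smaller sum.
print(T_g(10.0, lambda n: n))
print(T_g(10.0, lambda n: 2 ** n))
```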
And of course we could also consider coefficients other than 1/(2n)!.
But I wanted to start with a small idea and specific case.
Hence T(x).
Sorry for not using TeX; I'm tired and need sleep.
regards
tommy1729