Tetration Forum
How fast are these type of sums ? - Printable Version

+- Tetration Forum (https://tetrationforum.org)
+-- Forum: Tetration and Related Topics (https://tetrationforum.org/forumdisplay.php?fid=1)
+--- Forum: Mathematical and General Discussion (https://tetrationforum.org/forumdisplay.php?fid=3)
+--- Thread: How fast are these type of sums ? (/showthread.php?tid=1707)



How fast are these type of sums ? - tommy1729 - 02/16/2023

We know that, in the limit x to +oo, f(x,n,m) = exp^[n]( (ln^[n](x))^m ) for all integers n,m > 1 is smaller than exp(x).

In fact, any fixed number of iterations of such a function still grows more slowly than exp(x), at least in the limit x to +oo.

This implies that for every v > 0 we have f(x,n,m) < exp^[v](x) for all sufficiently large x.
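As a quick numerical illustration of the first claim, here is a minimal sketch (the helper names `iterate` and `f` are mine, not from the post):

```python
# Sketch: f(x, n, m) = exp^[n]( (ln^[n](x))^m ), compared against exp(x).
# Helper names (iterate, f) are my own, not from the post.
import math

def iterate(g, n, x):
    """Apply g to x exactly n times."""
    for _ in range(n):
        x = g(x)
    return x

def f(x, n, m):
    """exp^[n]( (ln^[n](x))^m ); needs x large enough that ln^[n](x) > 0."""
    return iterate(math.exp, n, iterate(math.log, n, x) ** m)

# For n = m = 2 and x = 100, f is already tiny next to exp(x).
print(f(100.0, 2, 2) < math.exp(100.0))  # prints True
```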

Now suppose we want a function that grows (in the limit) faster than f(x,n,m) but slower than exp^[v] for some 0 < v < 1.

Also suppose we want to avoid defining a sexp or slog, and we want to avoid using functions that are clearly asymptotic to exp^[v] for an easy-to-see value of v with 0 < v < 1 (so no half-iterate of sinh(x) or of exp(x) - 1, no fake function theory, and so on).

Also no Taylor, Fourier, or Padé expansions, and no typical integral transforms.

We want an infinite sum.

So I define the following function for x > 0:

T_n(x) = ( 1/(2n)! ) * sinh^[n]( (arcsinh^[n](x))^2 ) / sinh^[n]( (arcsinh^[n](n))^2 )

T(x) = T_1(x) + T_2(x) + T_3(x) + ...

T(x) = sum_n T_n(x)

where the sum is over the strictly positive integers n.
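To make the definition concrete, here is a minimal numerical sketch (the helper names `iterate`, `S`, `T_term`, and `T` are mine, and truncating the infinite sum at N terms is an assumption justified informally by the 1/(2n)! factor):

```python
# Sketch of T(x) = sum_{n>=1} T_n(x), with
#   T_n(x) = (1/(2n)!) * S_n(x) / S_n(n),
#   S_n(x) = sinh^[n]( (arcsinh^[n](x))^2 ).
# All helper names are mine; truncating at N terms assumes the 1/(2n)!
# factor makes the tail negligible for moderate x.
import math

def iterate(g, n, x):
    """Apply g to x exactly n times."""
    for _ in range(n):
        x = g(x)
    return x

def S(n, x):
    """sinh^[n]( (arcsinh^[n](x))^2 ), for x > 0."""
    return iterate(math.sinh, n, iterate(math.asinh, n, x) ** 2)

def T_term(n, x):
    return S(n, x) / (math.factorial(2 * n) * S(n, n))

def T(x, N=6):
    """Partial sum T_1(x) + ... + T_N(x) of the infinite series."""
    return sum(T_term(n, x) for n in range(1, N + 1))
```

For large x the n = 1 term already dominates the truncated sum, which is consistent with the factorial damping of the coefficients.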

Now this function T(x) eventually grows faster than exp^[n]( (ln^[n](x))^2 ), or even exp^[n]( (ln^[n](x))^m ), for all large x and all integers n, m > 1.

 
But how fast does T(x) grow ?

What is the smallest value of v with 0 < v < 1 such that, as x to +oo,

T(x) < exp^[v](x) ?

***
By the way, this series might be a good first guess in iterative or recursive methods to find the semi-exp(x) ...
***

Now T(x) is of course just an instance of a more general idea.

T_g_n(x) = ( 1/(2n)! ) * sinh^[n]( (arcsinh^[n](x))^2 ) / sinh^[n]( (arcsinh^[n](g(n)))^2 )

T_g(x) = T_g_1(x) + T_g_2(x) + T_g_3(x) + ...

T_g(x) = sum_n T_g_n(x)

where the sum is over the strictly positive integers n, for some strictly increasing integer sequence g(n).

And of course we could also consider coefficients other than 1/(2n)!.

But I wanted to start with a small idea and specific case.
Hence T(x).


Sorry for not using TeX; I'm tired and need sleep.

regards

tommy1729


RE: How fast are these type of sums ? - tommy1729 - 02/17/2023

(02/16/2023, 11:29 PM)tommy1729 Wrote: [...]

Of course, if we understood the speed of sinh^[x]( (arcsinh^[x](x))^2 ) very well, that would already be a great help and might even resolve all the problems.

The problem is that sinh^[n]( (arcsinh^[n](x))^2 ) grows very fast and reaches relatively high values for small x, before it starts to be relatively slow compared to exp(x).

So understanding a relative max (and maybe some kind of average) of sinh^[x]( (arcsinh^[x](x))^2 ) would be very enlightening.



regards

tommy1729


RE: How fast are these type of sums ? - tommy1729 - 02/17/2023

(02/17/2023, 10:19 PM)tommy1729 Wrote: [...]

OK, here is an idea:

sinh^[x]( (arcsinh^[x](x))^2 )

=

sinh^[x]( arcsinh^[x](x) * arcsinh^[x](x) )

Keep that in mind, and remember how sinh^[x]( 2 arcsinh^[x](x) ) was similar to sinh^[x+1]( arcsinh^[x](x) ); since sinh^[x+1]( arcsinh^[x](x) ) = sinh( sinh^[x]( arcsinh^[x](x) ) ) = sinh(x), it follows that

sinh^[x]( 2 arcsinh^[x](x) ) is close to sinh(x).

So, returning to the previous equation:

sinh^[x]( arcsinh^[x](x) * arcsinh^[x](x) )

is close to

sinh^[ x + ln(arcsinh^[x](x))/ln(2) ]( arcsinh^[x](x) )

which in turn is close to

sinh^[ arcsinh^[x+1](x) ](x)

so in the limit as x goes to +oo we still have growth rate 0.


But this gives us a tool to work with.

sinh^[1/2](x)

=

sinh^[x + 1/2]( arcsinh^[x](x) )

which is close to

sinh^[x]( sqrt(2) * arcsinh^[x](x) )

which makes a lot of sense!

This is the classic limit formula, as n to +oo,

sinh^[n]( sqrt(2) * arcsinh^[n](x) ),

where we replace oo with x.

Since that limit formula converges very fast, this is a reasonable approximation for large x.
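The sqrt(2) factor is the lambda^(1/2) scaling of a Koenigs-style limit for a map with multiplier lambda = 2 at its fixed point. As a sketch of how such a limit behaves numerically, here it is evaluated for 2sinh(x) instead of sinh(x) (a substitution on my part, not the post's): 2sinh has multiplier exactly 2 at 0, so the Koenigs construction is known to converge there.

```python
# Koenigs-style half-iterate approximation for 2sinh (multiplier 2 at 0):
#   (2sinh)^[1/2](x)  ~  lim_{n -> oo}  (2sinh)^[n]( sqrt(2) * (2sinh)^[-n](x) )
# Using 2sinh in place of the post's sinh is my own substitution.
import math

def two_sinh(x):
    return 2.0 * math.sinh(x)

def inv_two_sinh(x):
    # exact inverse: 2*sinh(y) = x  <=>  y = asinh(x/2)
    return math.asinh(x / 2.0)

def half_iterate_approx(x, n):
    """(2sinh)^[n]( sqrt(2) * (2sinh)^[-n](x) )."""
    y = x
    for _ in range(n):
        y = inv_two_sinh(y)
    y *= math.sqrt(2.0)
    for _ in range(n):
        y = two_sinh(y)
    return y

# Successive approximations at x = 5 settle down roughly geometrically.
approx = [half_iterate_approx(5.0, n) for n in range(1, 9)]
```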


We should not jump to conclusions yet.

But it gives a way to attack the problem !



Conjecture for x > 3:

sinh^[1/2](x) - sinh^[x]( sqrt(2) * arcsinh^[x](x) ) = O( x^(1/n) - x^(1/(n-1)) )

where O is big-O notation and n is the next integer after x.

But I think we already know

sinh^[1/2](x) - sinh^[x]( sqrt(2) * arcsinh^[x](x) ) = O( 1/sqrt(2)^x )

because the convergence is of exponential type.

Then again, that last estimate sinh^[1/2](x) - sinh^[x]( sqrt(2) * arcsinh^[x](x) ) = O( 1/sqrt(2)^x )

might only hold near the origin, because that is where the iterations have exp-type growth.

And of course we actually want to understand exp^[1/2], but I like to use sinh^[1/2] as a tool.

 

Yes yes, this is still all very informal.



regards

tommy1729


RE: How fast are these type of sums ? - tommy1729 - 02/17/2023

I'm excited about this!

Assuming my ideas are correct, combining them with some basic calculus and fake function theory really gives deep insight into the asymptotics of series expansions!



In fact, if I push these ideas, I can even make a series expansion asymptotic to the half-iterate of tetration.

Yes.

f(x+1) = exp(f(x))

g(g(x)) = f(x)


without using a fixpoint !



regards

tommy1729


RE: How fast are these type of sums ? - tommy1729 - 02/17/2023

It is conceivable that using these approximation functions as upper or lower bounds gives uniqueness criteria for fractional iterations.


regards

tommy1729


RE: How fast are these type of sums ? - tommy1729 - 02/17/2023

I'm working with x > 0.

By using functional equations we can get anywhere.

But I would like a method that is a good approximation for all real x ...

hmm

that is tricky.


regards

tommy1729