How fast are these types of sums?
#1
We know that, in the limit x to +oo, f(x,n,m) = exp^[n]( (ln^[n](x))^m ) is smaller than exp(x) for all n, m > 1.

In fact, any fixed number of iterations of f still grows slower than exp(x), at least in the limit x to +oo.

This implies f(x,n,m) < exp^[v](x) for any v > 0, for all sufficiently large x.
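This comparison is easy to check numerically. Here is a minimal Python sketch (the helper names exp_iter, ln_iter, and log_f are my own, not from the post; comparing log f(x,n,m) against x avoids computing the huge value exp(x)):

```python
import math

def exp_iter(t, n):
    # apply exp to t, n times
    for _ in range(n):
        t = math.exp(t)
    return t

def ln_iter(t, n):
    # apply ln to t, n times (intermediate values must stay > 0)
    for _ in range(n):
        t = math.log(t)
    return t

def log_f(x, n, m):
    # log of f(x,n,m) = exp^[n]( (ln^[n](x))^m ), i.e. exp^[n-1]( (ln^[n](x))^m )
    return exp_iter(ln_iter(x, n) ** m, n - 1)

# f(x,n,m) < exp(x) is equivalent to log f(x,n,m) < x
for x in (50.0, 100.0, 1000.0):
    print(x, log_f(x, 2, 2) < x)
```

For moderate x the inequality already holds comfortably, in line with the limit claim above.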

Now suppose we want a function that grows (in the limit) faster than f(x,n,m) but slower than exp^[v] for some 0 < v < 1.

Suppose also that we want to avoid defining a sexp or slog, and that we want to avoid functions that are clearly asymptotic to exp^[v] for an easy-to-see value of v with 0 < v < 1 (so no half-iterate of sinh(x) or of exp(x) - 1, no fake function theory, and so on).

Also no Taylor, Fourier, or Padé expansions, and no typical integral transforms.

We want an infinite sum.

So I define the following function for x > 0 :

T_n(x) = ( 1/(2n)! ) * sinh^[n]( (arcsinh^[n](x))^2 ) / sinh^[n]( (arcsinh^[n](n))^2 )

T(x) = T_1(x) + T_2(x) + T_3(x) + ...

T(x) = sum_n T_n(x)

where the sum is over the strictly positive integers n.
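To get a first numeric feel for T(x), here is a minimal Python sketch (the helper names iter_fn and T_term are mine, and the cutoff N = 8 is an arbitrary truncation of the infinite sum, not part of the definition):

```python
import math

def iter_fn(f, t, n):
    # apply f to t, n times
    for _ in range(n):
        t = f(t)
    return t

def T_term(x, n):
    # T_n(x) = (1/(2n)!) * sinh^[n]((arcsinh^[n](x))^2) / sinh^[n]((arcsinh^[n](n))^2)
    num = iter_fn(math.sinh, iter_fn(math.asinh, x, n) ** 2, n)
    den = iter_fn(math.sinh, iter_fn(math.asinh, float(n), n) ** 2, n)
    return num / den / math.factorial(2 * n)

def T(x, N=8):
    # partial sum T_1(x) + ... + T_N(x); a sketch only, N=8 is an arbitrary cutoff
    return sum(T_term(x, n) for n in range(1, N + 1))

print(T(5.0), T(10.0), T(20.0))
```

Note that for large x the inner sinh iterations overflow double precision quickly, so this sketch is only usable for modest x.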

Now this function T(x) eventually grows faster than exp^[n]( (ln^[n](x))^2 ), or even exp^[n]( (ln^[n](x))^m ), for all sufficiently large x and all integers n, m > 1.

 
But how fast does T(x) grow ?

What is the smallest value of v with 0 < v < 1 such that, in the limit x to +oo,

T(x) < exp^[v](x)

?

***
By the way, this series might be a good first guess in iterative or recursive methods for finding semi-exp(x) ...
***

Now T(x) is of course just an instance of a more general idea.

T_g_n(x) = ( 1/(2n)! ) * sinh^[n]( (arcsinh^[n](x))^2 ) / sinh^[n]( (arcsinh^[n](g(n)))^2 )

T_g(x) = T_g_1(x) + T_g_2(x) + T_g_3(x) + ...

T_g(x) = sum_n T_g_n(x)

where the sum is over the strictly positive integers n, and g(n) is some strictly increasing integer sequence.

And of course we could also consider coefficients other than 1/(2n)!.

But I wanted to start with a small idea and specific case.
Hence T(x).
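As a self-contained numeric sketch of the general case (the names iter_fn, T_g_term, and T_g, the cutoff N = 8, and the example choice g(n) = 2^n are all my own illustrations, not from the post):

```python
import math

def iter_fn(f, t, n):
    # apply f to t, n times
    for _ in range(n):
        t = f(t)
    return t

def T_g_term(x, n, g, coef):
    # coef(n) * sinh^[n]((arcsinh^[n](x))^2) / sinh^[n]((arcsinh^[n](g(n)))^2)
    num = iter_fn(math.sinh, iter_fn(math.asinh, x, n) ** 2, n)
    den = iter_fn(math.sinh, iter_fn(math.asinh, float(g(n)), n) ** 2, n)
    return coef(n) * num / den

def T_g(x, g, coef, N=8):
    # truncated version of the infinite sum; N=8 is an arbitrary cutoff
    return sum(T_g_term(x, n, g, coef) for n in range(1, N + 1))

# example: g(n) = 2^n with the original coefficients 1/(2n)!
val = T_g(5.0, lambda n: 2 ** n, lambda n: 1.0 / math.factorial(2 * n))
print(val)
```

Changing g or the coefficients only reweights the terms; the numerator, which drives the growth in x, is the same as in T(x).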


Sorry for not using TeX; I'm tired and need sleep.

regards

tommy1729
#2
(02/16/2023, 11:29 PM)tommy1729 Wrote: [quoting post #1 in full]

Of course, if we understood the speed of sinh^[x]( (arcsinh^[x](x))^2 ) very well, that would already be a great help and might even resolve all the problems.

The problem is that sinh^[n]( (arcsinh^[n](x))^2 ) grows very fast and reaches relatively high values for small x, before it starts to be relatively slow compared to exp(x).

So understanding a relative max (and maybe some kind of average) of sinh^[x]( (arcsinh^[x](x))^2 ) would be very enlightening.
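For fixed x, that relative maximum over the iteration count can at least be located numerically. A rough Python sketch (iter_fn, h, and scan_max are my own helper names, and the scan range n <= 8 is an arbitrary choice):

```python
import math

def iter_fn(f, t, n):
    # apply f to t, n times
    for _ in range(n):
        t = f(t)
    return t

def h(x, n):
    # sinh^[n]( (arcsinh^[n](x))^2 )
    return iter_fn(math.sinh, iter_fn(math.asinh, x, n) ** 2, n)

def scan_max(x, N=8):
    # return (n*, h(x, n*)) maximizing h(x, n) over n = 1..N
    vals = [(h(x, n), n) for n in range(1, N + 1)]
    best, n_best = max(vals)
    return n_best, best

for x in (5.0, 20.0, 50.0):
    print(x, scan_max(x))
```

Already for modest x the maximizing n sits at a small interior value rather than at n = 1, which matches the observation that the terms reach relatively high values before the growth slows down compared to exp(x).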



regards

tommy1729
#3
(02/17/2023, 10:19 PM)tommy1729 Wrote: [quoting post #2, which quotes post #1, in full]

OK, here is an idea.

sinh^[x]( (arcsinh^[x](x))^2 )

=

sinh^[x]( arcsinh^[x](x) * arcsinh^[x](x) )

Keep that in mind;

remember how sinh^[x]( 2 * arcsinh^[x](x) ) was similar to sinh^[x+1]( arcsinh^[x](x) ), and therefore

sinh^[x]( 2 * arcsinh^[x](x) ) is close to sinh(x).

so returning to the previous equation :

sinh^[x]( arcsinh^[x](x) * arcsinh^[x](x) )

is close to 

sinh^[x + ln(arcsinh^[x](x))/ln(2)] (arcsinh^[x](x) ) 

= close to

sinh^[ arcsinh^[x+1](x) ](x)

So in the limit as x goes to +oo we still have growth rate 0.
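Restating the key heuristic step above in LaTeX (this assumes the informal "doubling rule" used above — that multiplying the inner argument by 2 costs about one extra iteration — which is not proved here):

```latex
% Squaring t = arcsinh^[x](x) multiplies it by t = 2^{log_2 t},
% so the doubling rule converts this into log_2 t extra iterations:
\[
\sinh^{[x]}\!\left( \left( \operatorname{arcsinh}^{[x]}(x) \right)^{2} \right)
\;\approx\;
\sinh^{\left[ x + \log_2 \operatorname{arcsinh}^{[x]}(x) \right]}\!\left( \operatorname{arcsinh}^{[x]}(x) \right)
\]
```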


But this gives us a tool to work with.

sinh^[1/2](x)

= sinh^[x + 1/2]( arcsinh^[x](x) )

which is close to

sinh^[x]( sqrt(2) * arcsinh^[x](x) )

which makes a lot of sense!

This is the classic limit formula as n to +oo:

sinh^[n]( sqrt(2) * arcsinh^[n](x) )

where we replace oo with x.

Since that limit formula converges very fast, this is a reasonable approximation for large x.


We should not jump to conclusions yet.

But it gives a way to attack the problem !



Conjecture for x > 3: 

sinh^[1/2](x) - sinh^[x]( sqrt(2) * arcsinh^[x](x) ) = O( x^(1/n) - x^(1/(n-1)) )

where O is big O notation and n is the next integer after x.

But I think we already know

sinh^[1/2](x) - sinh^[x]( sqrt(2) * arcsinh^[x](x) ) = O( 1/sqrt(2)^x )

because the convergence is of exponential type.

Then again, that last estimate might only hold near the origin, because that is where the iterations have exponential growth.

And of course we actually want to understand exp^[1/2](x), but I like to use sinh^[1/2](x) as a tool.

 

Yes, yes, this is still all very informal.



regards

tommy1729
#4
I'm excited about this!

Assuming my ideas are correct, combining them with some basic calculus and fake function theory really gives deep insight into the asymptotics of series expansions!



In fact, if I push these ideas, I can even make a series expansion asymptotic to the half-iterate of tetration.

Yes.

f(x+1) = exp(f(x))

g(g(x)) = f(x)


without using a fixpoint!



regards

tommy1729
#5
It is conceivable that using these approximation functions as upper or lower bounds gives uniqueness criteria for fractional iterations.


regards

tommy1729
#6
I'm working with x > 0.

By using functional equations we can get anywhere.

But I would like a method that is a good approximation for all real x ...

hmm

that is tricky.


regards

tommy1729

