02/11/2023, 12:13 AM
Tetration has log singularities.
And log log singularities, log log log singularities, etc.
The idea is that the best tetration has each of these iterated-log singularities exactly once (on the real line).
That is in fact a uniqueness criterion when the function is analytic in the upper half-plane.
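For concreteness, here is the standard bookkeeping behind that (assuming the usual normalization tet(0) = 1, the functional equation tet(x+1) = exp(tet(x)), and tet analytic near x = -1 with tet'(-1) nonzero):

tet(-1) = ln(tet(0)) = ln(1) = 0
near x = -2 : tet(x) = ln(tet(x+1)) = ln( tet'(-1)*(x+2) + O((x+2)^2) ) = ln(x+2) + ln(tet'(-1)) + O(x+2)
near x = -3 : tet(x) = ln(tet(x+1)) = ln( ln(x+3) + ln(tet'(-1)) + O(x+3) ), which behaves like ln(ln(x+3))
and so on : a ln singularity at x = -2, a ln ln singularity at x = -3, a ln ln ln singularity at x = -4, etc.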
So the greedy method idea is that we simply take away those iterated-log singularities one by one and finally end up with ...
nothing.
Or we end up with an entire function.
The problem is we cannot simply say
tet(x) - ( ln(x+2) + ln(ln(x+3)) + ln(ln(ln(x+4))) + ... ) = entire(x)
because we end up with issues:
1) convergence of ln(x+2) + ln(ln(x+3)) + ln(ln(ln(x+4))) + ...
2) ln(ln(x)) is not analytic at x = 0 AND at x = 1 ! (see the small numerical check below)
So taking away a singularity in this naive way actually creates at least one other singularity!
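Here is a minimal numerical sketch of issue 2 in Python (just my illustration, not part of the method): ln(ln(x)) misbehaves both where the inner log blows up (x = 0) and where the inner log vanishes (x = 1).

import cmath

# ln(ln(x)) near the two bad points:
# near x = 0 the inner log tends to -infinity, near x = 1 the inner log tends to 0,
# so the outer log diverges at both points
for x in (1e-6, 1e-12, 1 + 1e-6, 1 + 1e-12):
    print(x, cmath.log(cmath.log(x)))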
The solution is that we need "ghost functions", if they exist:
functions that take away the ln ln ln ... singularities without adding new problems (non-analytic points).
So we need a function that is locally like ln(ln(x)) around its unique singularity but is analytic everywhere else.
Same for ln(ln(ln(x))): we need another function that is locally like ln(ln(ln(x))) around its unique singularity but is analytic everywhere else.
So we end up with "ghost functions" f_n(x).
I'm not sure if these ghost functions exist, are easy to pin down, and make things work out.
It is a very greedy way of thinking, hence a greedy method with special functions: "ghost functions".
So the end goal is
tet(x) - ( ln(x+2) + f_1(x+3) + f_2(x+4) + ... ) = entire(x)
by first defining appropriate "ghost functions".
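To make the shape of that end goal concrete, here is a rough Python sketch of the greedy subtraction, truncated to finitely many terms; tet and the ghost functions f_n are purely hypothetical callables here, nothing below constructs them.

import cmath

def greedy_remainder(x, tet, ghost_fns):
    # Hypothetical sketch: subtract ln(x+2) and the ghost terms
    # f_1(x+3), f_2(x+4), ... from tet(x); the hope is that the
    # remainder (with the full series) extends to an entire function.
    total = cmath.log(x + 2)
    for n, f in enumerate(ghost_fns, start=1):
        total += f(x + n + 2)  # f_n evaluated at x + n + 2
    return tet(x) - total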
The second step is understanding the entire(x).
This will probably go deep into complex analysis if it can be done.
Or it will quickly be shown to be impossible ...
I was thinking about inverting exp(2sinh(2sinh(x))), but that has issues too ...
The inverse of 2sinh can handle the value 0, but it has log branches in the complex plane too.
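For reference, the inverse of 2sinh(x) = e^x - e^(-x) is arcsinh(y/2) = ln( y/2 + sqrt( (y/2)^2 + 1 ) ): it is fine at y = 0 (it gives 0), but it has branch points at y = +/- 2i (where (y/2)^2 + 1 = 0), so branch issues off the real line remain when inverting exp(2sinh(2sinh(x))).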
regards
tommy1729