Arguments for the beta method not being Kneser's method
#45
(10/09/2021, 12:27 PM)tommy1729 Wrote:
(10/07/2021, 04:12 PM)sheldonison Wrote:
(10/07/2021, 05:20 AM)JmsNxn Wrote: All of the numbers you've posted evaluate small, but non-zero; no matter the depth of iteration I invoke. There also seem to be no branch cuts after your singularities...

AHHH I see much more clearly. I think you are running into a fallacy of the infinite though.

(To begin, that should be \( \tau(z) \approx -\log(1+\exp(-z)) \); I'm sure it's just a typo on your part.)

... My diagnosis of the singularities is loss of accuracy in the sample points of beta... and, beyond that, straight-up artifacts.

James,
I'm trying to understand your concerns.  It is true that I was focused exclusively on the zeros of \( f(z)=\ln(\beta(z+1,1))=\beta(z,1)-\ln(1+\exp(-z)) \).  At each of the points I listed, beta(z) and f(z) are both well defined, analytic, and relatively easy to compute with pari.gp, and at each of these points f(z)=0, which leads to a singularity in \( \ln(f(z)) \) and seems to be a problem...  Unlike tet(z), which has no zeros in the complex plane for \( \Im(z)\neq 0 \), f(z) does have zeros; in fact, it has infinitely many.

Am I correct that one of your suggestions would be to instead look at the following function in the neighborhood of the zeros?
\( \lim_{m\to \infty}f^m(z)=\ln^{\circ m}(f(z+m)) \)
Then it would seem the value of z shifts a little.  The limit would be at the nearby point where \( f(z+4)=e\uparrow\uparrow 3 \), at which point no further numeric convergence is possible.
Code:
z0 is the value for m=0:           z0=5.31361674343693018580658 + 0.803861889686272103890852*I; f(z0)=0; beta(z0+1)=1
z4 is the limiting value for m=4:  z4=5.32119139366544998965263 + 0.816482374289017956146532*I; f(z4+4)=e^^3
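For readers following along, the two values in the table can be checked outside pari.gp as well. Below is a minimal double-precision Python sketch; the forward recursion \( \beta(z+1)=e^{\beta(z)}/(1+e^{-z}) \) (which follows from the formula for f above), the seed \( \beta(z-n)\approx 0 \) using the decay of beta as \( \Re(z)\to -\infty \), and the depth n=100 are my assumptions about the intended normalization, not code from the original computation.

```python
import cmath
import math

def beta(z, n=100):
    # Forward recursion beta(w+1) = exp(beta(w)) / (1 + exp(-w)),
    # seeded n steps to the left with beta(z - n) ~ 0.
    b = 0.0 + 0.0j
    w = z - n
    for _ in range(n):
        b = cmath.exp(b) / (1 + cmath.exp(-w))
        w += 1
    return b

def f(z, n=100):
    # f(z) = ln(beta(z+1)) = beta(z) - ln(1 + exp(-z)); the second form
    # avoids overflow when beta(z+1) is astronomically large.
    return beta(z, n) - cmath.log(1 + cmath.exp(-z))

z0 = 5.31361674343693018580658 + 0.803861889686272103890852j
z4 = 5.32119139366544998965263 + 0.816482374289017956146532j

print(abs(f(z0)))                                   # should be tiny: z0 is a zero of f
print(f(z4 + 4).real, math.exp(math.exp(math.e)))   # f(z4+4) compared with e^^3
```

If this normalization matches the one used in the thread, both printed values agree with the table: f(z0) comes out near 0 and f(z4+4) near \( e\uparrow\uparrow 3 \approx 3.81\cdot 10^{6} \).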

Let's consider \( tet(s+c) = \lim_{n\to +\infty}f^n(s)=\ln^{\circ n}(f(s+n)) \)

Here t(s+n) is close to 1 for small n and tends to 1 as n grows.

then

f(s+n+2)= exp(t(s+n+1) * f(s+n+1))

f is never zero so log f is never log(0).

So let's investigate ln ln f.

If ln ln f = log(0), then ln f = 0, so f must be 1 exactly.

f(s+n+2)= exp(t(s+n+1) * f(s+n+1)) = 1

ln f(s+n+2) = t(s+n+1) * f(s+n+1)

Since t is close to 1 and f(s+n+1) is never zero, t(s+n+1) * f(s+n+1) is never close to 0! (Rather, for the exponential to equal 1 it would have to equal 2 pi i k for some integer k with |k| at least 1.)

So ln ln f(s+n+2) is never log(0).

By induction f never equals 1, e, e^e, ... at any level, so no iterate of the log gives rise to log(0).

So log singularities are not " expected ".
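The key step above, that exp(w) = 1 forces w = 2 pi i k, can be illustrated numerically. The helper next_f and the sample values t = 0.999, f_prev = 0.01 + 0.01i below are hypothetical placeholders of mine, chosen only to mirror "t close to 1, f small but nonzero":

```python
import cmath
import math

def next_f(t, f_prev):
    # the recursion from the argument above: f(s+n+2) = exp(t(s+n+1) * f(s+n+1))
    return cmath.exp(t * f_prev)

# the only solutions of exp(w) = 1 are w = 2*pi*i*k, and the nonzero ones
# have modulus at least 2*pi, so they are far from 0:
for k in (1, -1, 2):
    w = 2j * math.pi * k
    assert abs(cmath.exp(w) - 1) < 1e-12
    assert abs(w) >= 2 * math.pi

# with t near 1 and f_prev small but nonzero (hypothetical values),
# the next value is close to 1 but never exactly 1:
t, f_prev = 0.999, 0.01 + 0.01j
assert abs(next_f(t, f_prev) - 1) > 1e-5
```

So for ln f(s+n+2) to be log(0), the product t(s+n+1) * f(s+n+1) would have to land exactly on a nonzero multiple of 2 pi i, which cannot happen while it stays close to 0.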

Essential singularities are also not really expected.

So it seems that close to the positive real line we get an analytic function (t(s) is close to 1 there).

regards

tommy1729

remark :


The idea that for all points s in a set A there are probably nearby points s* such that s* equals "whatever" is a flawed objection.

Why ? 

Well, because we prove it for a point s in the set A; therefore it holds for any arbitrary point s in the set A.

Or in other words, it holds for ALL s in the set A.

Since nearby points s* also belong to this ALL s in the set, proving it for an arbitrary point s in the set A is sufficient.

OK, at the boundary of the set A we can consider points s* that are outside of the set A, of course.

But the conjecture applies only to the set A, so that is not an issue.

It does imply, though, that even at infinitesimal closeness the boundary of the set A is problematic, and hence the argument works very well for OPEN sets A.

In other words, it works perfectly for an open set A bounded by a Jordan curve; restated, for a simply connected domain A.

The set A is of course where t(s) is very close to 1.



regards

tommy1729
RE: Arguments for the beta method not being Kneser's method - by tommy1729 - 10/09/2021, 08:02 PM
