11/09/2021, 01:12 PM
let h(s) = f^(-1)( exp( f(s) ) ).
whenever t(s) is close to 1, h(s) is supposed to be very close to s + 1.
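As a sanity check of that near-identity (a sketch only, and independent of any particular t), here is Python code using the piecewise-linear approximation of tetration, which satisfies f(s+1) = exp(f(s)) exactly by construction; inverting f by bisection, h(s) = f^(-1)( exp( f(s) ) ) then comes out equal to s + 1 up to inversion error:

```python
import math

def f(s):
    # piecewise-linear approximation of tetration:
    # f(s) = s + 1 on [-1, 0], extended by f(s) = exp(f(s - 1))
    if s > 0:
        return math.exp(f(s - 1))
    if s < -1:
        return math.log(f(s + 1))
    return s + 1.0

def f_inv(y, lo=-1.0, hi=2.5):
    # f is strictly increasing on the real line, so bisection works
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def h(s):
    return f_inv(math.exp(f(s)))

# since this f satisfies f(s+1) = exp(f(s)) exactly, h(s) = s + 1
for s in (0.25, 0.5, 1.0):
    print(s, h(s))
```

For an f that only approximately satisfies f(s+1) = exp(f(s)), h(s) would deviate from s + 1 accordingly.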
To study h(s) it makes sense to consider its derivative.
The problem with derivatives is that no matter what kind of calculus you use, h-derivative or q-derivative or other types of derivatives, things basically look the same.
By that I mean that, for instance, the chain rule remains the same independent of the use of the h-derivative or q-derivative etc.
This makes it hard to estimate things.
Many have had the idea of replacing the derivatives with another concept to understand " change ", but not very successfully, I think.
Special cases might however show success.
And therefore it is hard to exclude the idea completely.
Ideas are welcome.
Maybe James's compositional calculus might help ??
I still do not understand that.
Anyway, the derivative of h(s):
h ' (s) = exp( f(s) ) * f ' (s) / f ' ( h(s) )
( notice that if h(s) were indeed s + 1 EXACTLY, then f would have to be tetration EXACTLY )
This identity is complicated.
It is hard to show h ' (s) must be close to 1 when t(s) is close to 1.
( Based on nothing but imagination, the idea of finding an f, h, t such that h'(s) = t(s) near the real line is fascinating, but I have no clue )
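The identity itself is easy to verify numerically. A sketch with the toy choice f(s) = exp(s) — not tetration, just an invertible test function for which h(s) = log(exp(exp(s))) = exp(s) in closed form — comparing the formula against a central finite difference:

```python
import math

def f(s):  return math.exp(s)   # toy invertible f (NOT tetration)
def fp(s): return math.exp(s)   # f'
def f_inv(y): return math.log(y)

def h(s):
    # h(s) = f^(-1)( exp( f(s) ) ); with f = exp this equals exp(s)
    return f_inv(math.exp(f(s)))

def h_prime(s):
    # h'(s) = exp( f(s) ) * f'(s) / f'( h(s) )
    return math.exp(f(s)) * fp(s) / fp(h(s))

s = 0.3
fd = (h(s + 1e-6) - h(s - 1e-6)) / (2e-6)   # central finite difference
print(h_prime(s), fd)
```

Both values agree with exp(0.3), as the closed form predicts.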
But what stands out is the division by f ' ( h(s) ).
let h(s) = S.
Then we want to understand 1/ f ' (S).
In particular when is f ' (S) zero or close to zero ??
More generally - or less - forget that S = h(s) and take s instead:
when is f ' (s) zero or close to 0 ?
We know that f(s) =/= 0.
We have Jensen's theorem to relate log f ' (0) to the zeros of f ' (s) within a radius.
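For the record, Jensen's formula applied to f ' (assuming f ' is analytic on |s| <= R with f ' (0) =/= 0, and a_1, ..., a_n are the zeros of f ' in |s| < R counted with multiplicity) reads:

```latex
\log \left| f'(0) \right|
  = \sum_{k=1}^{n} \log \frac{|a_k|}{R}
  + \frac{1}{2\pi} \int_{0}^{2\pi} \log \left| f'\!\left( R e^{i\theta} \right) \right| \, d\theta
```

so a bound on |f ' | on the circle |s| = R, together with |f ' (0)|, limits how many zeros f ' can have inside the disk.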
Clearly when f ' (S) = 0 and t(s) is close to 1, we have a singularity for h(s).
( notice I assumed that f ' (s) = f ' (h(s)) = 0 with t(s) close to 1 is not possible )
Also notice the zeros of f ' (s) affect when h ' (s) = 0.
So this deserves attention.
regards
tommy1729
Tom Marcel Raes