Sequence Interpolation
#1
Question 
EDIT: I believe this ratio interpolation yields k raised to the power of the Newton interpolation of the base-k logarithms of the points, for any k other than zero or one.
To do Newton interpolation on a sequence, you take the differences of the terms. One could instead take ratios of the terms, replacing subtraction with division.
One could even find the power to which one term must be raised to obtain the next.
One could even use higher hyper-operations.
How does one interpolate like this when not all of the terms lie on a straight line? That is, is there an interpolation that works when the terms are not all on a straight line, but that agrees with the ratio interpolation when they are? (A numerical sketch of the ratio interpolation is below.)
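Here is a minimal numerical sketch (Python; ratio_interp, newton_interp and leading_ratios are just names for this sketch, not standard) of the ratio interpolation and of the claim in the EDIT, with k = e:

Code:
from math import exp, log

def leading_ratios(terms):
    # Row 0 is the sequence itself; each next row holds ratios of the
    # previous one. Keep the leading entry of each row (the multiplicative
    # analogue of the leading forward differences in Newton's formula).
    row, leads = list(terms), []
    while row:
        leads.append(row[0])
        row = [b / a for a, b in zip(row, row[1:])]
    return leads

def ratio_interp(terms, x):
    # Multiplicative Newton series: product over k of
    # (k-th leading ratio) ** C(x, k), with generalized binomials.
    # Terms are assumed positive.
    result, c = 1.0, 1.0
    for k, r in enumerate(leading_ratios(terms)):
        result *= r ** c
        c *= (x - k) / (k + 1)   # C(x, k+1) from C(x, k)
    return result

def newton_interp(values, x):
    # Ordinary Newton forward-difference interpolation at real x.
    row, result, c, k = list(values), 0.0, 1.0, 0
    while row:
        result += row[0] * c
        c *= (x - k) / (k + 1)
        row = [b - a for a, b in zip(row, row[1:])]
        k += 1
    return result

terms = [2.0, 3.0, 5.0, 8.0]   # not a geometric progression
x = 1.5
print(ratio_interp(terms, x))                          # ratio interpolation
print(exp(newton_interp([log(t) for t in terms], x)))  # e ** Newton(log of terms)

Both lines print the same value (about 3.857), since taking logarithms turns ratios into differences and products into sums, which is exactly the claim in the EDIT.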
Please remember to stay hydrated.
ฅ(ミ⚈ ﻌ ⚈ミ)ฅ Sincerely: Catullus /ᐠ_ ꞈ _ᐟ\
#2
I see... this is really interesting.

Definition. Let \(H_n(b,x)=y\) be a real-arguments extension of a hyperoperation family \(H_n:\mathbb R\times\mathbb R\to\mathbb R\). Define \(L_n(b,y)\) as the solution \(x\) of the equation \[H_n(b,x)=y\]
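For the low ranks, which are all the later examples need, here is a quick sketch in Python of the Goodstein \(H_n\) and its \(L_n\) (the real extension above rank 3 is the hard part and is not attempted here):

Code:
from math import log

def H(n, b, x):
    # Goodstein family on real arguments, ranks up to 3.
    if n <= 0: return x + 1        # successor (ignores b)
    if n == 1: return b + x        # addition
    if n == 2: return b * x        # multiplication
    if n == 3: return b ** x       # exponentiation
    raise NotImplementedError("rank > 3 needs a real tetration")

def L(n, b, y):
    # L_n(b, y): the x solving H_n(b, x) = y.
    if n <= 0: return y - 1        # inverse successor
    if n == 1: return y - b        # subtraction
    if n == 2: return y / b        # division
    if n == 3: return log(y, b)    # logarithm base b
    raise NotImplementedError("rank > 3 needs slog / superroot")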

a) The difference operator. We want to solve, and then generalize, the functional equation that makes \(F\) the discrete difference of \(f\):
\[f(x)+F(x)=f(x+1)\]
This is the same as solving \(H_1(f(x),F(x))=f(H_0(x))\), assuming \(H\) is the Goodstein family. This gives \(F=\Delta f\).

b) The forward difference operator. Consider now the forward difference operator with delay \(h\in\mathbb R\). It is the solution of \[f(x)+F(x)=f(h+x)\] i.e. \(H_1(f(x),F(x))=f(H_0^h(x))=f(H_1(x,h))\), and \(F\) measures the additive \(H_1\) perturbation of the output of \(f\) given an iterated \(H_0\) successor perturbation of the input of \(f\).

c) How to measure the perturbation. The perturbation in the output can be measured in different ways. I believe, based on some heuristics about differentiation, that the perturbation in the output must be measured as the number of times we have to iterate \(H_{0}(f(x),-)\), i.e. we measure it "hyper-logarithmically":
\[H_{0}^{F(x)}(f(x),H_1(f(x),0))=f(H_{0}^h(x,H_1(x,0)))\]
but by definition of the Goodstein family \(H_n^h(b,H_{n+1}(b,0))=H_{n+1}(b,h)\), so we can consider instead the functional equation
\[H_1(f(x),F(x))=f(H_1(x,h))\]
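As a quick sanity check of that Goodstein identity at rank \(n=1\), with integer \(h\), using the H and L sketch above (iter_H is just a label):

Code:
def iter_H(n, b, h, x):
    # h-fold iteration of H_n(b, -) starting from x, for integer h >= 0.
    for _ in range(h):
        x = H(n, b, x)
    return x

b, h = 3.0, 4
print(iter_H(1, b, h, H(2, b, 0.0)))   # H_1 iterated h times on H_2(b, 0)
print(H(2, b, float(h)))               # H_2(b, h); both print 12.0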

d) Generalizing to higher ranks. We follow this schema by looking at the solutions of the following functional equation
\[H_n(f(x),F(x))=f(H_n(x,h)) \]

Definition (hyperdifference). We define the hyperdifference of \(f\) at \(x\) as the \(H_n\) perturbation of the output of \(f\) given an \(H_n\) perturbation of the input of \(f\) at \(x\). Let \(f:\mathbb R\to\mathbb R\). Define the partial function \(\Delta^H_{n,h}[f]:\mathbb R\to\mathbb R\) as
\[\Delta^H_{n,h}[f] (x):=L_n(f(x),f(H_n(x,h)))\]
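In code, continuing the sketch above (hyperdifference is just a label for this post):

Code:
def hyperdifference(n, h, f):
    # Delta^H_{n,h}[f](x) = L_n( f(x), f(H_n(x, h)) )
    return lambda x: L(n, f(x), f(H(n, x, h)))

# rank 1 reproduces the ordinary forward difference:
f = lambda x: x * x
print(hyperdifference(1, 1.0, f)(3.0))   # f(4) - f(3) = 7.0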



e) The differentiation schema. Consider now the functional equation \(H_{n+1}(\Delta^H_{n,h}[{\rm id}] (x),F(x))=\Delta^H_{n,h}[f] (x)\), to be solved for \(F\); we call the solution \(D^H_{n,h}[f]\).

Definition. Let \(f:\mathbb R\to\mathbb R\). Define the partial function \(D^H_{n,h}[f]:\mathbb R\to\mathbb R\) as

\[D^H_{n,h}[f] (x):=L_{n+1}(\Delta^H_{n,h}[{\rm id}] (x),\Delta^H_{n,h}[f] (x))\]
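Continuing the sketch above (hyperquotient is just a label for this post):

Code:
def hyperquotient(n, h, f):
    # D^H_{n,h}[f](x) = L_{n+1}( Delta^H_{n,h}[id](x), Delta^H_{n,h}[f](x) );
    # note Delta^H_{n,h}[id](x) = L_n(x, H_n(x, h)) = h by definition of L_n.
    d_id = hyperdifference(n, h, lambda x: x)
    d_f = hyperdifference(n, h, f)
    return lambda x: L(n + 1, d_id(x), d_f(x))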

Let's now take the limit and define the hyperderivative of rank \(n\); like the previous operators, it may not exist globally.

Definition (hyperderivative). Let \(f:\mathbb R\to\mathbb R\). Define the partial function \(D^H_{n}[f]:\mathbb R\to\mathbb R\) as

\[D^H_{n}[f] (x):=\lim_{h\to\varepsilon_n}D^H_{n,h}[f] (x)\]
where \(\varepsilon_n\) is the right unit element, i.e. \(H_n(b,\varepsilon_n)=b\).
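Numerically, sending \(h\) to the unit (\(\varepsilon_1=0\) for addition, \(\varepsilon_2=1\) for multiplication), with the sketch above:

Code:
from math import exp

f, x = exp, 2.0
for h in (0.1, 0.01, 0.001):
    print(hyperquotient(1, h, f)(x))   # -> f'(2) = e^2 ≈ 7.389 as h -> 0
for h in (1.1, 1.01, 1.001):
    print(hyperquotient(2, h, f)(x))   # -> x f'(x)/f(x) = 2 as h -> 1

So the rank-1 hyperderivative recovers the ordinary derivative, while rank 2 converges to the elasticity \(x f'(x)/f(x)\).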

Question. Is it possible to extend the Newton and Taylor formalisms, and calculus in general, to this? Is it useful?



Example. Unwinding the definition, observe that \(D^H_{n,h}[f] (x)=L_{n+1}(L_n(x,H_n(x,h)),L_n(f(x),f(H_n(x,h))))=L_{n+1}(h,L_n( f(x),f(H_n(x,h)) )  )\), since \(L_n(x,H_n(x,h))=h\) by definition of \(L_n\).

Assume \(H\) is the standard Goodstein family (a numerical cross-check of the first ranks follows the list).
  • \(D^H_{-1,h}[f] (x)=L_{0}(h,L_0( f(x),f(H_{0}(x,h)) ) ) = f(h+1)-2 \) (reading \(H_{-1}\) and \(L_{-1}\) as the successor and its inverse, as every rank below \(1\) is the successor here);
  • \(D^H_{0,h}[f] (x)=L_{1}(h,L_0( f(x),f(H_0(x,h)) )  ) = f(h+1)-(h+1) \), but \(\varepsilon_0\) doesn't exist;
  • \(D^H_{1,h}[f] (x)=L_{2}(h,L_1( f(x),f(H_1(x,h)) )  ) =\frac{f(x+h)-f(x)}{h}  \);
  • \(D^H_{2,h}[f] (x)=L_{3}(h,L_2( f(x),f(H_2(x,h)) )  ) =\frac{\ln f(hx)-\ln f(x)}{\ln h}\);
  • \(D^H_{3,h}[f] (x)=L_{4}(h,L_3( f(x),f(H_3(x,h)) )  )  ={\rm slog}_h (\frac{\ln f(x^h)}{\ln f(x)})\).
  • ...
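A numerical cross-check of the rank-1 and rank-2 closed forms, with the sketch above:

Code:
from math import log

f = lambda x: x ** 3
x, h = 2.0, 1.5
print(hyperquotient(1, h, f)(x), (f(x + h) - f(x)) / h)                   # 23.25, 23.25
print(hyperquotient(2, h, f)(x), (log(f(h * x)) - log(f(x))) / log(h))   # 3.0, 3.0

Note that the power function \(x^3\) gets the constant rank-2 hyperderivative \(3\): power functions are to rank 2 what functions of constant slope are to rank 1, which connects back to the "straight line" question in the OP.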

Assume \(B\) is the Bennet family; a sketch of \(B_n\) follows the list. (What follows was already considered by Rubtsov 20-30 years ago.)
  • \(D^B_{-1,h}[f] (x)=L_{0}(h,L_{-1}( f(x),f(B_{-1}(x,h)) )  )= \ln(e^{f(\ln (e^x+e^h))}-e^{f(x)})-h =\ln(\frac{e^{f(\ln (e^x+e^h))}-e^{f(x)}}{e^h})\) obviously \(D^B_{-1}f(x)=\lim_{h\to -\infty}\ln(\frac{e^{f(\ln (e^x+e^h))}-e^{f(x)}}{e^h})\);
  • \(D^B_{0,h}[f] (x)=L_{1}(h,L_0( f(x),f(B_0(x,h)) )  )  =\frac{f(x+h)-f(x)}{h}\)
  • \(D^B_{1,h}[f] (x)=L_{2}(h,L_1( f(x),f(B_1(x,h)) )  )  =\frac{f(hx)}{f(x)}\ominus_2 h=\exp\left( \frac{\ln\left(\frac{f(hx)}{f(x)}\right)}{\ln h}\right)=\sqrt[\ln h]{\frac{f(hx)}{f(x)}}\); note also that we can express it as \(\exp\left( \frac{\ln f(hx)-\ln f(x)}{\ln h}\right)\). The Bennet derivative of rank \(1\) is \(D^B_1 f(x)=\lim_{h\to 1}e^{D^H_{2,h} f(x)}=e^{D^H_{2} f(x)}\).
  • ...
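The whole family is easy to sketch, since \(B_n(a,b)=\exp^{\circ n}(\ln^{\circ n}a+\ln^{\circ n}b)\) for every integer \(n\) (conjugating addition by iterated exp/log):

Code:
from math import exp, log

def B(n, a, b):
    # Bennet family: addition conjugated by |n|-fold exp/log.
    def it(g, k, x):
        for _ in range(k):
            x = g(x)
        return x
    if n >= 0:
        return it(exp, n, it(log, n, a) + it(log, n, b))
    return it(log, -n, it(exp, -n, a) + it(exp, -n, b))

print(B(-1, 1.0, 2.0))   # ln(e + e^2) ≈ 2.3133
print(B(1, 3.0, 4.0))    # multiplication: 12.0
print(B(2, 3.0, 4.0))    # exp(ln 3 * ln 4) ≈ 4.5875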

MSE MphLee
Mother Law \((\sigma+1)0=\sigma (\sigma+1)\)
S Law \(\bigcirc_f^{\lambda}\square_f^{\lambda^+}(g)=\square_g^{\lambda}\bigcirc_g^{\lambda^+}(f)\)

