12/10/2011, 04:34 PM
Further thoughts
I'm sorry if the writing that follows is sometimes clumsy, but maybe you'll get something out of it. It's a kind of philosophy of natural numbers in the context of hyperoperators and large numbers.
Philosophical considerations
The following is the standard definition of N = Natural Numbers.
“Peano's successor function S(n) = n + 1, iterated starting from n = 0, uniquely covers all numbers 1, 2, 3, … and thereby defines the set of natural numbers.”
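As a minimal sketch of the iterated-successor picture (the function names here are my own, not from the definition above), the naturals up to a cutoff can be generated by nothing more than repeated application of S:

```python
def S(n):
    """Peano's successor function."""
    return n + 1

def naturals(limit):
    """Generate 0, 1, 2, ... up to limit by iterating S from 0."""
    n = 0
    out = []
    while n <= limit:
        out.append(n)
        n = S(n)
    return out

print(naturals(5))  # [0, 1, 2, 3, 4, 5]
```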
To understand N better and more accurately, consider:
N = {N[T, SPN]} U {N[UDC]}
T = Tally; SPN = Standard Positional Notation
UDC = Unbounded Descriptive Capability
This admittedly mysterious formula acknowledges three viewpoints on N:
N[T] -> As a tally without bound
(A self-referential concept about the action of counting and the representation of size.)
N[T, SPN] -> As a tally without bound, OR as an SPN digit sequence without bound
(This is how people think of N in normal, practical situations.)
N[UDC] -> Counting numbers with Unbounded Descriptive Capability
(For example, scientific notation, power towers, etc. are an extended conception of N.)
Counting numbers, at their most basic and fundamental, serve the purpose of indexing and sequencing items and entities, and provide one possible way of establishing a lexicographic ordering.
The tally system supports the fundamental process of pointing to an item, incrementing a counter (recording the presence of the item), and either changing the item's status from unread to read or removing it from the collection.
In this way items in the collection can be counted.
The Thoroughness Property is evident here: we believe that incrementing is the unambiguous, systematic method that counts things one by one, forever.
The reality is that “forever” should be relativised to mean “towards a horizon” —
a horizon that is not well-defined, but that accurately represents an intuition about self-reference regarding “quantity” (the transitions between initial tallying, SPN, and the digits-in-sequence tally), and an intuition concerning “large enough”.
N[T, SPN] gives the inestimable advantage of allowing a sensible method of Information Condensation while retaining the Thoroughness Property.
With SPN, “large enough” can be made small by the use of a “base”, and in so doing frees up “large enough” to be controlled by other aspects of the description.
And so “large enough” becomes a question of the number of digits in the SPN sequential presentation.
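A minimal illustration of how the base trades magnitude for digit count (the function name is my own):

```python
def spn_digits(n, base=10):
    """Number of digits needed to write n in standard positional
    notation with the given base, for n >= 1."""
    digits = 1
    while n >= base:
        n //= base
        digits += 1
    return digits

print(spn_digits(1_000_000))        # 7 digits in base 10
print(spn_digits(1_000_000, 2))     # 20 digits in base 2
print(spn_digits(1_000_000, 1000))  # 3 "digits" in base 1000
```

The same magnitude shrinks or grows in written length purely according to the base, which is the sense in which “large enough” is freed up and handed over to the digit count.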
With N[UDC] we have an Emerging Trade-Off between Thoroughness Property and Unbounded Descriptive Capability.
When considering the various big numbers, more information is directed towards magnitude and less towards fine detail. This is a trade-off between Descriptive Capability and the Thoroughness Property. It is an emerging trade-off because there are phase transitions within it.
For example: a googol is both SPN-describable (a digit sequence of 1 followed by 100 zeros) and UDC-describable (10^100). A googolplex is not practically SPN-describable, but is UDC-describable (10^(10^100)). For numbers between a googol and a googolplex it is hard to maintain the Thoroughness Property. Introducing treelike structures such as HBN (Hereditary Base N) is an attempt to recapture the Thoroughness Property, but at the expense of greater structural pattern complexity.
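A sketch of the treelike HBN idea, in the convention used for Goodstein sequences: write n as a sum of powers of the base, then rewrite every exponent in the same hereditary fashion. The exact string format here is my own.

```python
def hereditary_base(n, b):
    """Render n in hereditary base b: express n as a sum of terms
    d * b**k, then rewrite each exponent k hereditarily as well."""
    if n < b:
        return str(n)
    terms = []
    k = 0
    while n:
        d = n % b
        if d:
            if k == 0:
                terms.append(str(d))
            else:
                coeff = f"{d}*" if d > 1 else ""
                terms.append(f"{coeff}{b}^({hereditary_base(k, b)})")
        n //= b
        k += 1
    return " + ".join(reversed(terms))

# 266 = 2^8 + 2^3 + 2^1, with the exponents 8, 3, 1 themselves rewritten:
print(hereditary_base(266, 2))  # 2^(2^(2^(1) + 1)) + 2^(2^(1) + 1) + 2^(1)
```

The nested exponents are exactly the “treelike structure” that recaptures thoroughness, at the cost of the extra structural complexity noted above.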
Similar phenomena can be observed in the discussions concerning the infinite ordinals.
Considering large numbers and fast-growing functions reveals a dual reality:
A) A tangible magnitude-into-pattern transformation
B) Traditional perspective of ever-increasing patterns of magnitude
FIFF paradigm
Fuzzy Infinite Fuzzy Finite paradigm
The finite/infinite dichotomy, that is: {1,…,n} versus {1,…}
The dichotomy appears clear and unambiguous,
but this viewpoint is biased by the dominant SPN perspective,
and the evidence from considering NOPT structures suggests it is a false dichotomy.
Some of the transitions
SPN presents exponentiation incrementally, one digit added at a time. Knuth arrow notation presents the Ackermann hierarchy incrementally, one arrow added to the tally at a time.
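The one-arrow-at-a-time picture can be sketched directly from the standard recursion for Knuth's up-arrow notation; this is only feasible for tiny arguments, since the values explode immediately.

```python
def up(a, n, b):
    """Knuth's a ↑^n b: one arrow (n = 1) is exponentiation, and each
    extra arrow iterates the level below, via
    a ↑^n b = a ↑^(n-1) (a ↑^n (b - 1)), with a ↑^n 1 = a."""
    if n == 1:
        return a ** b
    if b == 1:
        return a
    return up(a, n - 1, up(a, n, b - 1))

print(up(3, 1, 3))  # 3^3 = 27
print(up(3, 2, 2))  # 3 ↑↑ 2 = 3^3 = 27
print(up(2, 2, 4))  # 2 ↑↑ 4 = 2^(2^(2^2)) = 65536
print(up(2, 3, 2))  # 2 ↑↑↑ 2 = 2 ↑↑ 2 = 4
```

Incrementing n (the arrow tally) is the single symbolic step that jumps an entire level of the hierarchy, which is the contrast with SPN's one-digit step.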
NOPT structures show that Ackermann numbers increase exponentially with respect to minimal symbolic notational requirements on a level playing field (the benchmark, or yardstick, being multi-level nested layers with a fixed operation, and power towers).
NOPT structures use a sensible minimal symbolic representation.
The next stage is “zooming in on” a HEFTY NOPT structure with a microscope!
We can then introduce another level of chunkiness by storing High Resolution HEFTY NOPTs into little boxes… and start the process again…
And so on, into the ethereal realms of incomprehensible vastness…
The Inevitable chunkiness of large numbers
In the consideration of fast-growing functions there is an inevitable chunkiness that comes about due to the natural limits of descriptive capability.
You can start out slow, with 1, 2, 3 and the successor function, or fast, with Graham's number and the g-subscript power towers, but the contemplation of pushing out further into the endless unboundedness of infinity calls upon chunkiness.
NOPT structures are dimensionless until an operation is specified, but even so, their structure can be identified and codified.
As we reach out further, more and more information hiding becomes natural and unavoidable.
What about the huge wealth of numbers between g1 and g2 in the gi-sequence leading to Graham's number? We could traverse the intermediary space by applying the standard integer functions to the number of Knuth arrows. To do this, all the structure leading up to “3 hexated to 3” could be replicated, but this time applied to the number of Knuth arrows, padding out the hyperoperator hierarchy to dizzying realms.
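The gi-sequence itself cannot be evaluated, but its structure — each level's arrow count given by the whole value of the previous level — can be sketched symbolically. The string format here is my own invention for illustration.

```python
def g_symbolic(i):
    """Symbolic form of Graham's g-sequence:
    g_1 = 3 ↑↑↑↑ 3 (four arrows, i.e. "3 hexated to 3"),
    and g_(i+1) = 3 ↑^(g_i) 3, i.e. g_i arrows between the 3s."""
    if i == 1:
        return "3 " + "↑" * 4 + " 3"
    return f"3 ↑...({g_symbolic(i - 1)} arrows)... 3"

print(g_symbolic(1))  # 3 ↑↑↑↑ 3
print(g_symbolic(2))  # 3 ↑...(3 ↑↑↑↑ 3 arrows)... 3
```

Already at g2 the arrow tally is itself an incomprehensible number, which is why the intermediary space between g1 and g2 is so vast.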
The beauty of SPN (standard positional notation) numbers is that they preserve the initial successor function, increment naturally through successive orders of magnitude, and retain the condensation property for as long as a string of digits can go.
A number of visualisations from the Wolfram Demonstrations Project show cellular automata applied to binary or other-base numbers; we see the condensation property, that is, the systematic reuse and exhaustion of previous orders of magnitude.
An understanding of hyperoperators coded into NOPT structures also involves systematic reuse of orders of magnitude, but the condensation, or thoroughness, property of the initial successor function is necessarily relaxed. By using exponentiation, and power towers thereof, inside a NOPT structure we obtain the NEPT structures, and the notion of “successor” is transformed, or transmuted, into “adjacent power tower”.
The normal successor function we are so familiar with — that is, counting distinct symbols — is retained and distinctly present in the NEPT structure, but now we are counting power towers symbolically adjacent to one another.
(From some perspectives, the traditional finite/infinite separation in maths is flawed; we should think instead in terms of required chunkiness and layers of nestation.)
A heptation NOPT structure also contains NOPT structures of all previous orders; that is to say, hexation, pentation, tetration, and exponentiation are also present, part and parcel of the heptation structure. Similarly, a number such as 53,672 is a 10^4-order number that also contains information about the previous orders of magnitude. A nonation number requires octation, heptation, hexation, pentation, tetration, and exponentiation.
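The containment of each order within the next can be seen directly in the standard hyperoperation recursion, where every level is defined by iterating the level below it (again, only computable for tiny inputs):

```python
def hyper(n, a, b):
    """Hyperoperation level n: 0 = successor, 1 = addition,
    2 = multiplication, 3 = exponentiation, 4 = tetration, ...
    Each level from 3 up is built by iterating the level below."""
    if n == 0:
        return b + 1
    if n == 1:
        return a + b
    if n == 2:
        return a * b
    if b == 0:
        return 1  # applying the operation zero times (levels >= 3)
    return hyper(n - 1, a, hyper(n, a, b - 1))

print(hyper(3, 2, 10))  # 2^10 = 1024
print(hyper(4, 2, 3))   # 2 ↑↑ 3 = 16
print(hyper(5, 2, 3))   # 2 ↑↑↑ 3 = 2 ↑↑ 4 = 65536
```

Unwinding hyper(5, …) necessarily passes through tetration, exponentiation, and multiplication calls, which is the computational face of “heptation contains hexation, pentation, …”.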
Once we get to the gi-sequence, it is like a hyperdrive of magnitude that shows the transition between magnitude and pattern.