Inventory Loops (i.e. Counting Sequences) have Pre-period
Abstract. An Inventory Sequence is the iteration of the map defined roughly by taking an integer to its numericized description (e.g. since “” has two ’s, one , and one ). Our work analyzes the iteration under the infinite base. Any starting value of positive digits is known to be ultimately periodic [1] (e.g. reaches the 1-cycle ). Parametrizations of all possible cycles are also known [2,3]. We answer Bronstein and Fraenkel’s 26-year-old open question by showing the pre-period of any such starting value is no more than where . And, oddly, the period of the cycle can be determined after only iterations.
1 Games and Grammar
Mathematicians (as we all know by now) play games and often do so by turning sentences into numbers. For example, what happens when we describe a number in English, and numericize the description? The number 1381, for example, has
two 1’s, one 3, and one 8.
We might call “211318” the child of 1381. So what’s the grandchild? The number 211318 has
three 1’s, one 2, one 3, and one 8.
Going on like this we generate the “Family Lineage” of 1381,
![[Uncaptioned image]](lineage1381.png)
Figure 1.1
and find 3122331418, which has three 1’s, two 2’s, three 3’s, one 4, and one 8. We have found a number which describes itself (the mathematical equivalent of an autogram!). Some call these self-inventoried numbers since they “take inventory”, so to speak, of their own digits. In our terms, we would say 3122331418 is its own Parent. So are there numbers strictly their own Grandparents? Great-grandparents? Yes and yes.
![[Uncaptioned image]](cycles_unbolded.png)
Figure 1.2
The former examples terminate the Family Lineages of and respectively. These cycles will come up repeatedly in the article. We call them Inventory Loops since each entry takes inventory, so to speak, of its predecessor.
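For readers who want to play along, here is a minimal sketch of the Parent-to-Child iteration on decimal digit strings. The helper names `child` and `lineage` are hypothetical (this is not the code of reference [11]), and the digit-string representation coincides with the infinite-base game only while every digit count stays below ten – which holds throughout 1381’s Lineage.

```python
from collections import Counter

def child(n: str) -> str:
    """Inventory the digits of n, counted smallest-to-largest."""
    counts = Counter(n)
    return "".join(f"{counts[d]}{d}" for d in sorted(counts))

def lineage(n: str, steps: int = 10) -> list:
    """The Family Lineage: iterate the Parent-to-Child map."""
    out = [n]
    for _ in range(steps):
        out.append(child(out[-1]))
    return out

print(lineage("1381"))
# ...settles on the self-inventoried number 3122331418 after a few steps
```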
As with Monopoly, various rule choices come up as one continues playing.
1. When multi-digit numbers appear do we leave them clumped or split? E.g. does “” become one “” and one “” or a single “”? (Thanks to math.stackexchange user Alexander Sibelius for improving this strange question’s wording.)
2. Can we pick a starting value with infinite digits?
3. What order do we count the digits in? (Notice that so far digits have been counted smallest-to-largest.)
4. Do we count all the digits? Or just consecutive runs? Or even the digits of all preceding ancestors?
5. Can we skip nouns?
Each answer choice branches into a new variation and – wonderfully! – all the aforementioned variations have been studied. The first question is really asking whether our base is infinite (see [1,2,3]) or finite (see [5,6]). There are also perfectly reasonable infinite starting values [1]. The “number” (since “” acts as a single “digit”, it is more technically correct to call this a sequence rather than a number) has
two ’s, three ’s, two ’s, one , five ’s, five ’s, five ’s, five ’s, six ’s, six ’s, …
Digits can be counted smallest-to-largest (see [1,2,3,8,9]), by appearance (see [5,6,7]), or largest-to-smallest (which isn’t much different from smallest-to-largest). And we have been counting all digits at once (see [1,2,3,6,8,9]) but the most famous variation – Conway’s look-and-say sequence [4,7] – counts only consecutive runs. Counting the digits of all ancestors has even been played (see OEIS:A060857). Lastly, one can skip nouns (see [8,9]) obtaining 6210001000, which has
six 0’s, two 1’s, one 2, zero 3’s, zero 4’s, zero 5’s, one 6, zero 7’s, zero 8’s, and zero 9’s.
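A one-line check of this nounless, positional base-10 convention of [8,9] – a sketch only, with a hypothetical helper name:

```python
def is_self_descriptive(n: str) -> bool:
    """Digit i of n must equal how many times the digit i appears in n."""
    return all(int(d) == n.count(str(i)) for i, d in enumerate(n))

print(is_self_descriptive("6210001000"))  # True: six 0's, two 1's, one 2, ..., one 6
```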
These and other variations can be found on the Online Encyclopedia of Integer Sequences [10] (see A098155, A083671, A005151, A047842, A007890, A005150, A006711 for a sampling).
Researching any of these variations is confusing at first as many of them receive multiple names and some names are used for multiple variations. That is, the name-to-variation map is ill-defined in both directions. Our variation of present concern is studied in [1,2,3]:
1. infinite base (=clumping multi-digits),
2. finite amount of digits,
3. counting smallest-to-largest,
4. all digits of the parent only,
5. and keeping nouns.
Although, as we’ll see later, the order of counting doesn’t have much effect.
Bronstein and Fraenkel showed in 1994 [1] that all starting values of positive digits eventually reach an Inventory Loop. They asked a few follow-up questions. One about a different variation was answered by Sauerberg and Shu in 1997 [3]. The only question about our present variation was: how long will a given starting value take to reach an Inventory Loop? (In their terms: what are “meaningful pre-period bounds”?) The Math Book by Clifford Pickover mentions that British mathematician Roger Hargrave also did work on our variation, but Mr. Pickover unfortunately no longer has the source (correspondence with the author). That is all to say, the pre-period (or “time-to-loop”) has remained unbounded for 26 years.
Here, we replicate the enumerations of Inventory Loops in [1,2,3] by alternate methods leading to a pre-period bound of where is either the largest entry of our starting value or no greater than its length.
2 Clarifying Names
Our variation alone has appeared under the names “Counting Sequences” [1,2,3], “Reverse-likeness Sequences” [4], the “Pea-pattern Sequence” [5], and “Inventory Sequences” [6]. Technically, the first of these names is the only one whose sources rigorously define the infinite-base variation (the base is either finite or undefined in the latter three sources). It is therefore tempting to leave our particular variation under the name “Counting Sequences” but the author hesitates to do so for two reasons.
1. Self-descriptive numbers [8,9] are sometimes called “Self-counting” numbers and thus we would have ambiguity between the nounless [8,9] and nounful [1,2,3,4,5,6,7] variations.
2. The adjective “Counting” is in general hideously ambiguous in mathematics.
Out of the aforementioned names, “Inventory” is intuitive and has relatively little existing usage in the literature. It is therefore proposed to call both the finite-base and infinite-base variations of our iteration “Inventory Sequences” and the corresponding cycles “Inventory Loops”. For example we would say: “Kowacs [5] enumerates Inventory Loops in bases through ” – and – “We analyze Inventory Loops in the infinite base” – etc.
3 Multisets
Our analysis will be done in the language of multisets – that is (as one may guess) sets with multiple copies of elements. This guts much redundancy from the start since maps integers/sequences to the same value regardless of their digits/elements’ ordering (e.g. ). To build up in the multiset ecosystem some symbols must first be defined. Let
• denote positive integers,
• denote multisets of positive integers (e.g. ),
• “” denote the set of elements in (e.g. ),
• “” denote “the multiplicity of in ” (e.g. ) – note a multiset is also a set if – and
• “” denote “the multiset of multiplicities in ”. I.e.
So for example .
• For multisets , the sum “” will mean concatenation (essentially adding multiplicities). I.e.
So for example . This generalizes set union.
• And “” will mean “the elements of also in ”. Thus if we have and then . This generalizes set intersection.
• Similarly “” will mean “the elements of not in ”. Thus and . This – as may be expected – generalizes set difference.
With the new vocabulary our iteration of interest becomes
For example, etc.
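As a concrete sketch of this recurrence (assuming Python’s Counter as the multiset type; the helper name `inventory` is hypothetical and not from [11]), one step sends a multiset to its multiset of multiplicities plus its set of distinct elements:

```python
from collections import Counter

def inventory(S: Counter) -> Counter:
    """One step of the Inventory map: Child = multiplicities of S + distinct elements of S."""
    child = Counter(S.values())   # the multiset of multiplicities
    child.update(set(S))          # one copy of each distinct element
    return child

S = Counter([1, 1, 3, 8])         # the multiset {1,1,3,8}, i.e. the digits of 1381
print(sorted(inventory(S).elements()))   # [1, 1, 1, 2, 3, 8]
```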
In addition to these symbols, we also use metaphor throughout to guide intuition. The metaphorical terms will be capitalized from here on to avoid ambiguity and are as follows (the authors wonder if all mathematicians shouldn’t enumerate their metaphors as they do their definitions):
• is thought of as the Child of and thus as the Parent-Child map ( is thought of as the Grandchild and so on).
• Accordingly is thought of as the Family Lineage of ,
• Any such that for some is thought of as an Ancestor of , and all such Ancestors are thought of as the Family Tree or Ancestry of .
• is thought of as the First Generation, as the Second Generation, and so on.
• The largest element is thought of as the Height of . Thus if we say has Grown Taller than its Parent.
• The set of multiplicities is thought of as the Adjectives of and the set as the Nouns of .
• Later on, some particular Adjectives which are neither the smallest nor the largest of a multiset will be called the Core Adjectives.
• Also later on, some multisets are obtained from others by removing or decreasing elements. This is thought of as Deterioration and the resulting multiset as a Deteriorate.
Lastly, for easier reading we sometimes represent multisets as integers (e.g. “” stands for “”). Parentheses are placed where ambiguity requires (e.g. “” stands for “”). This will be specified when used as “Integer Notation”.
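For readers following along in code, here is a rough sketch of a parser for this Integer Notation. The helper name `from_integer_notation` and the string in the print call are hypothetical examples (the text’s own examples are not reproduced here).

```python
import re
from collections import Counter

def from_integer_notation(s: str) -> Counter:
    """Each digit is an element of the multiset; a parenthesized run like (12) is one multi-digit element."""
    tokens = re.findall(r"\((\d+)\)|(\d)", s)
    return Counter(int(a or b) for a, b in tokens)

print(sorted(from_integer_notation("31(12)").elements()))  # [1, 3, 12]
```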
4 Establishing Cycles
In this section we 1) specify multisets Taller than their Parents, 2) deduce therefrom that all Inventory Sequences reach a maximum Height, and 3) conclude a Loop results in all cases. Presume our Inventory Sequence (so in other words ). We take the first part first.
Lemma 4.1.
If and then where . In other words if a multiset past the first Generation is Taller than its Parent, we can nicely parametrize its Grandparent in two variables.
Proof.
The plan is to nail down with one bound, nail down with a second, and then put them together giving a form for itself.
Since , exists and . Let , the amount of distinct elements in . It follows by pigeonholing that some element of is at least (just as, if you had ten people in a room all of different ages, some person must be at least ten years old). And by the inclusion we can know
with equality throughout when .
Next notice must be an Adjective of (otherwise ). So for some . Because by definition contains only distinct values we know . Thus
with equality when for some .
Taking our inequalities together,
implies equality in both cases. Consequently and which together force . The constraint “” becomes “”. ∎
Corollary 4.1.1.
The Height of a multiset is either equal to, or the increment of, its Parent’s Height (excepting only when the Parent is the starting value).
Lemma 4.2.
If then . In other words, a Child is always larger (in order) than its Parent except possibly only when the Parent is the starting value.
Proof.
The inclusion tells us . The lemma follows from and ∎
Lemma 4.3.
If and then either or .
Proof.
Lemma 4.1 applies so for some . Consequently . The bound implies is not the starting value – and therefore Lemma 4.2 gives us
implying . ∎
Lemma 4.4.
If and then is one of
In other words, if a multiset past the fourth Generation is Taller than its Parent, its Grandparent must be one of the former multisets.
Proof.
Lemma 4.3 narrows down to two possibilities.
In the case the number must be even since . We will bound by the fact
Since and since contains elements the former tells us (the latter equality is the triangular number formula with ). Using Lemma 4.2 again,
implies or equivalently . Thus in this first case
Alternatively if we create a similar bound using
yielding . But we can do better. The multisets and (corresponding to and ) have no Grandparents because their Parents,
are all odd in order. But must have Grandparents by implying . This leaves and ∎
We are ready for part 2. The Family Trees of the four possible ’s of the previous lemma were worked out in full. And only three Family Trees are actually needed since appears in ’s Ancestry.
![[Uncaptioned image]](tree112233.png)
Figure 4.1
![[Uncaptioned image]](tree123456.png)
Figure 4.2
![[Uncaptioned image]](tree1234.png)
Figure 4.3
Integer Notation is being used (e.g. “” stands for “”). Orange nodes are odd in order (and therefore have no Parents). Blue nodes have so many elements that Nouns cannot be selected distinctly (again meaning no Parents).
The figures let us enumerate all starting values with Tallest Descendants appearing past the fourth Generation.
Theorem 4.5.
For all the maximum Height appears by and, for all but finitely many , by (I.e. for ). The exceptions are given exhaustively by the entries (and the Descendants) of
max at | |
---|---|
The entries are written as integers for easier reading (e.g. “” stands for ).
* The entry “” in particular stands for a Family of multisets: the Grandparents of “” which themselves have no Parents. The Family is also equivalent to .
** The entry “” in particular stands for a Family of multisets: the Grandparents of “”.
Proof.
Lemma 4.4 ensures all such exceptions must pass through one of the four given values. The Family Trees given above are complete since a multiset has no Parent if
1. its order is odd, or
2. there are too many elements to assign Nouns distinctly (i.e. if ).
∎
Thus ends part 2 of this section. Part 3 is the shortest.
Corollary 4.5.1.
The Inventory Game ends in a Loop for all .
Proof.
Since for , the proceeding multisets are bounded in length (in particular ). Thus the Descendants can take on only . ∎
Corollary 4.5.2.
This gives us our first (and worst) pre-period bound: where
Proof.
Since Height at most increments we know . Past , all multisets are one of values. ∎
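To make the “finitely many values” count concrete, here is a small sketch counting multisets of bounded Height and order. The helper name `state_count` and the example numbers in the print call are made up for illustration; the actual bounds above are not reproduced.

```python
from math import comb

def state_count(height: int, length: int) -> int:
    """Number of multisets with elements in {1, ..., height} and order at most length."""
    return sum(comb(height + k - 1, k) for k in range(length + 1))

print(state_count(5, 5))  # 252 possible multisets with Height <= 5 and order <= 5
```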
5 Enumerating Cycles
It turns out easier to create another game having cycles corresponding to those of the Inventory Game and then to find all the cycles in the new game. This new game is played on the Adjectives of the Inventory Game.
Firstly, we define the new Parent-Child relationship with two functions:
1.
2.
Lemma 5.1.
Every cycle under corresponds to a cycle under some . In particular if
then
with
Proof.
The rule gives us an inclusion chain
forcing . So let and we get a new rule
For easier further reading, we use a change of variables: (with for Adjectives!). The new rule then becomes . Taking this together with the fact that we deduce
where . Letting and substituting we have which by definition means . ∎
To further investigate we should place it in its natural habitat. Let be the set of all order multisets whose totals are twice their order. In other words
Lemma 5.2.
Our function sends any multiset of elements into . That is, if then .
Proof.
Suppose . Noting firstly , the order
is correct. Next observe
From this it follows the total,
is correct as well. ∎
Corollary 5.2.1.
Every cycle of multisets with order corresponds to a cycle in under .
Accordingly here are graphs displaying the action of on :
![[Uncaptioned image]](Adjective_cycles_0_through_7.png)
Figure 5.1
The cycles are marked with thicker cell borders and once again we use integers to represent multisets (e.g. “” stands for “”).
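For readers who want to reproduce such graphs, the underlying sets are easy to enumerate. Here is a sketch (the helper name `A` is hypothetical) listing, for small n, the order-n multisets of positive integers whose total is 2n, per the definition above:

```python
def A(n: int):
    """Yield the multisets (as sorted tuples) of n positive integers summing to 2n."""
    def parts(total, count, smallest):
        if count == 0:
            if total == 0:
                yield ()
            return
        for first in range(smallest, total - (count - 1) + 1):
            for rest in parts(total - first, count - 1, first):
                yield (first,) + rest
    yield from parts(2 * n, n, 1)

for n in range(1, 5):
    print(n, list(A(n)))
# 1 [(2,)]
# 2 [(1, 3), (2, 2)]
# 3 [(1, 1, 4), (1, 2, 3), (2, 2, 2)]
# 4 [(1, 1, 1, 5), (1, 1, 2, 4), (1, 1, 3, 3), (1, 2, 2, 3), (2, 2, 2, 2)]
```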
Theorem 5.3.
The Inventory Loops of elements or less are given exhaustively by
No. elements | Loop(s) |
---|---|
where and can be any distinct whole numbers that aren’t already in the Loop.
Proof.
By the Corollary, any such Inventory loop must correspond to one of the nine cycles shown in Figure 5.1. The seven Inventory Loops above correspond to the cycles under and (in the manner laid out in Lemma 5.1). We claim the cycles under and do not correspond to any Inventory Loop.
The cycle of oscillates between and . But this means any corresponding Inventory Loop must contain and and thus be at least order . But the correspondence would enforce ’s Loop order . Similarly, the cycle under suffers the same illness. Any corresponding loop would contain and – and would therefore be on multisets at least order . ∎
We could use this same strategy on and so on… But we have infinitely many to go! We need an approach to knock out and will create one in the next lemma (which should really be three or four lemmas but every attempt to split it felt unnatural and added confusion – so it was left as a single super-lemma).
Lemma 5.4.
Any cycle in under with contains one of
Proof.
Suppose our cycle
The total proof consists of ten subproofs each of which respectively proves
1. ,
2. ,
3. if then ,
4. if then ,
5. for all ,
6. for some ,
7. ,
8. if then ,
9. , and
10. is one of the multisets listed above.
To summarize in English, we track the amount of distinct elements in each (written “”), find that it drops to or less, and use the fact to whittle down a finite list of required cycle elements. We will take the first point first.
1.) The amount of distinct elements in is at most one more than that of . Using the definition of and the fact for any we have
We use the “” as a multiset scalar to simply indicate the exact order is not needed but can be assumed greater than zero.
2.) The amount of distinct elements in is at most two more than that of . We start similarly with
This next part is a bit ugly. We must figure out what exactly looks like. But again, since for any ,
Putting the two together,
3.) If any has more distinct elements than , then has at most distinct elements. From subproof (1.), if then . In other words, – and therefore also – contains all distinct values. It follows and therefore from subproof (2.): .
4.) If any has equal amount of distinct elements as , then has at most distinct elements. Similarly, if , then . In other words, contains exactly one duplicate entry and distinct elements otherwise implying
It follows .
5.) All have or less distinct elements. From the two former subproofs, either or . Thus eventually the distinct element counts drops to . It then either remains at indefinitely (as with the self-inventoried number ), drops to or less (in which case our claim remains true), or increments to . But if increments, then again, and the distinct element count can never again climb up above . Thus at some point all have or less distinct elements. But we are in a cycle ()! So at some point really means always.
6.) Since what goes up must come down we may as well assume also: for some .
We now switch tracks. Instead of narrowing down the distinct element count further, we use what we’ve got to narrow down required cycle elements.
7.) The amount of non- elements in is the amount of distinct elements of . Note
8.) The largest element of is one greater than the count of ’s in (if ). From the previous two subproofs we know
In other words, at most of the elements in are not ’s. Equivalently, at least of the elements are ’s. Thus ’s are the majority if
And lastly, if ’s are majority then
9.) The distinct element count of can be calculated from . Firstly,
Next we rework
which together with the former gives
The previous subproof is now needed. It tells us . Thus by rearranging and substitution
10.) One of the following appears in any cycle under for . The value of is that specified in subproof (6.). We use the formulas from subproofs (7.) and (9.) to calculate and from the elements of .
The first multiset, , however cannot appear as it enforces . In other words, it has no Grandparents. Its Parent is which itself has a Parent only when – or equivalently, when (look back to Figure 5.1 and observe only include in their cycles). ∎
Accordingly, here is a graph displaying the action of on the multisets (and one more) from the lemma:
![[Uncaptioned image]](Adjective_cycles_general.png)
Figure 5.2
Once again, we use integers to represent multisets (e.g. “” stands for “”). The integer by each arrow is the smallest value of at and past which the map holds true.
Theorem 5.5.
All Inventory Loops of length fall into one of the two following parametric families:
1.
2.
where the ’s can be any whole numbers not already present in the Loop.
Proof.
Lemma 5.4 gives six multisets which must appear in any Inventory Loop and Figure 5.2 shows which Loops those six multisets must fall into. The Inventory Loops above (written in Integer Notation) follow the correspondence of Lemma 5.1. ∎
Corollary 5.5.1.
No Inventory Loop is longer than three numbers.
Proof.
Theorems 5.3 and 5.5 together give all Inventory Loops. The longest is length appearing only at . ∎
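As a quick empirical sanity check of this corollary – a sketch over a tiny slice of small starting values only, with hypothetical helper names and a guessed iteration cap – one can iterate the map and measure the length of whatever cycle is eventually entered:

```python
from collections import Counter
from itertools import combinations_with_replacement

def inventory(S: Counter) -> Counter:
    child = Counter(S.values())   # the Adjectives
    child.update(set(S))          # the Nouns
    return child

def loop_length(start, max_iter=500):
    """Iterate until a multiset repeats; return the length of the cycle entered."""
    seen, S = {}, Counter(start)
    for t in range(max_iter):
        key = frozenset(S.items())
        if key in seen:
            return t - seen[key]
        seen[key] = t
        S = inventory(S)
    return None

lengths = {loop_length(m) for k in range(1, 7)
           for m in combinations_with_replacement(range(1, 8), k)}
print(lengths)  # should be a subset of {1, 2, 3}
```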
Corollary 5.5.2.
This gives us a better (but not best) pre-period bound: where .
Proof.
From Lemma 5.4 the value either decrements or falls below as increases on the assumption all elements have appeared. Thus after iterations, either a Loop has been entered or a new element appears. Since, from the previous section, we may conclude at most elements appear. Since also we may presume the pre-period is quadratic in .
This is not rigorous and the case has not been included. We leave a full proof undone as the bound is improved (with rigor) in the following section. ∎
6 Tracking Distinct Adjectives
This section and the next generalize the methods of the previous to Inventory Sequences in general (as opposed to just Loops). The main result is the amount of distinct Adjectives of an Inventory Sequence (i.e. ) drops to or less in time. Deducing this will require some husky machinery. To make matters worse, this is one of the few sections without pictures.
Lemma 6.1.
For any if is the difference between ’s order and the count of distinct elements (in other words, if ) then
Proof.
Let . Then
And since , we may proceed
Rearranging yields from which the lemma follows by the quadratic formula. ∎
The previous section was easy on us since we could assume no new elements were appearing (i.e. that ). Here the assumption doesn’t hold. Whereas before we had a simple recurrence of Adjectives, , here the corresponding recurrence – the Ugly Recurrence – becomes much messier:
Consequentially, some work is needed to patch the holes.
Lemma 6.2.
For any multisets
and if – or in other words if is of the form – then
Proof.
By our generalizations of set intersection and difference we may say with . This tells us where the “” simply marks an unspecified scalar. Paired with the fact we may deduce
Lastly, since and are essentially a disjoint partition of we know . This implies which in turn implies
For part (b) simply observe
Taking part (c), if we have then
and if instead then
where In either case at most one element is appended to with another possibly removed. ∎
Our next lemma generalizes subproofs (1.) through (4.) of Lemma 5.4.
Lemma 6.3.
Letting , in any Inventory Sequence
and if then
Proof.
To make reading easier we define thus partially hiding the ugliness of the Ugly Recurrence. We have then . Since for any multisets and we may deduce
Applying Lemma 6.2b with tells us further .
And for (b) we start noting
We can apply Lemma 6.2c (choosing ) and then 6.2a (choosing ) obtaining
Now we saw earlier implying . Taking this together with and tells us
Thus we apply Lemma 6.1 with obtaining
since . Substituting into our bound on yields the lemma statement. ∎
Going forward, this bound on ’s is our fuel. We need one last lemma to convert it into a bound of the pre-period.
Lemma 6.4.
Suppose obeys
If and in particular then for all
Proof.
Firstly the sequence is increasing. This is seen inductively since if then and the recursive bound tells us . Secondly we claim when is defined – or equivalently when . We may substitute the recursive bound into itself obtaining
The case reduces to showing – or equivalently which is given by assumption.
The lemma statement may now be shown through strong induction assuming more weakly since the base case forces equality in the bound. The base case is given by assumption. The inductive case is
∎
Theorem 6.5.
Letting there exists
such that .
Proof.
We should first establish that some such exists. Lemma 6.3b tells us if (i.e. ) then and if (i.e. ) then . It follows there eventually must be for some .
More generally the same lemma gives us a bound on from above,
which after proper rearrangement gives us a bound on from below,
This is of course the bound from the previous lemma and by no coincidence. We may now climb back up from to . But we need footing to get started.
Let’s presume our is the smallest such . Therefore if they are defined (since otherwise and will do) and (since otherwise ). From here all prior can be bounded recursively. For example,
Since each is an integer we may presume . Going on in this way we may backtrack from taking the smallest integer value at each turn:
For instance, this sequence tells us if then – or conversely, if the distinct Adjective count starts at less than it will be down to or less before the 17th Generation. We now see just how quickly this metric drops. It’s time to apply the previous lemma.
Let’s define so that and . We have
so Lemma 6.4 applies and
Picking in particular tells us
Thus a first logarithm gives us and a second gives
One last point must be made. In defining we have presumed and further the bound gives sensible results only if . Both hiccups are resolved exchanging “” for “” in the bound. ∎
7 Tracking Core Adjectives
Once the amount of distinct Adjectives has fallen below , we can nearly determine the precise identity of the Adjectives themselves. But to see why we can only nearly determine their identity we first should define two new terms. Let and be the unique multiset such that
1. , and
2.
We call the Core Adjectives of (whereas is just the Adjectives). By the end of this section, we will have used Theorem 6.5 to narrow down the Core Adjectives to just eight possibilities past a certain point. As before, some machinery is needed before we can say so.
This next lemma generalizes subproofs (6.) through (9.) from Lemma 5.4.
Lemma 7.1.
If then and
In other words, if the distinct Adjective counts of two consecutive Generations are or less, then there are at most Core Adjectives in the third Generation and their count subtracted from their sum is at most .
Proof.
We must bring back the Ugly Recurrence from the previous section:
From our definition of it then follows
That left-hand side looks bad, but Lemma 6.2b gives us . Deducing from here amounts to noticing is a singleton.
The second part is trickier. It begins with the equation
And here the author is not sure how to proceed with clarity. It seems the Ugly Recurrence must be used once more. Note that if we replaced one “” in particular with a “” the Ugly Recurrence becomes much less intolerable:
In total, the change increments a few elements of . Thus we may say
Another application of the Ugly Recurrence and Lemma 6.2b give us
Linking these four messes together neatly gives us
Finishing off the lemma now amounts to showing . Note tells us . And since we defined we may conclude
∎
Corollary 7.1.1.
If then – expressed in Integer Notation – is one of
or is the empty set .
Proof.
Simply exhaust the possibilities specified by the previous lemma with . ∎
To proceed we must study how these possibilities of the Core Adjectives map to each other under . Unfortunately, does not determine their mapping uniquely. In other words, knowing is not enough information to determine . The relationship is determined in Loops but in Inventory Sequences in general knowing only narrows the identity of down to some candidate values. This indeterminacy is caused again by the same non-inclusion which made the Ugly Recurrence ugly.
So what assumptions, then, did we make when working with Loops? Two actually:
1. All new elements have appeared.
2. is sufficiently large.
And indeed when we assume these, forces a unique map on Core Adjectives since
1. if all new elements have appeared then and
2. if is large enough then most Adjectives must be ’s (presuming the Core Adjectives are known and fixed) and thus .
With such assumptions fulfilled the equation
tells us . Simple enough. So in accordance with our naming convention from Section 5 (where was the map on Adjectives) and the former two naive assumptions we define a map on Core Adjectives:
Here then is ’s action on the possibilities from the previous corollary:
![[Uncaptioned image]](gNaive1.png)
Figure 7.1
![[Uncaptioned image]](gNaive2.png)
Figure 7.2
The Core Adjectives appear in Integer Notation and are arranged vertically by as indicated on the left/right.
One can see a single self-cyclic node in Figure 7.1 and a loop of two nodes in Figure 7.2. As one might expect, these correspond to the parametric -cycle and -cycle given in Theorem 5.5. And if covered the general case we would be done. Unfortunately, it is which determines .
At this point we can specify how exactly narrows down the identity of . We must first define Deterioration. A multiset is called a Deteriorate of another if can be obtained from using one or both of the following operations:
1. Replace any subset of the elements by their decrements discarding ’s (e.g. ).
2. Replace the largest element by any lesser value or discard it entirely (e.g. or ).
Lemma 7.2.
The multiset is either equal to or a Deteriorate of and is equal only if and .
Proof.
We claim the two operations of Deterioration correspond respectively to the two assumptions made in defining . To see this we should start by writing out the recurrence defining in the full messy general case:
Thus consider some particular . If then might contribute a “” to via (and fails to do so only if – i.e. only if the contribution is the largest Adjective). But if then might contribute only a “” (and fails to do so as in the former case). And this nearly covers the first operation of Deterioration. Note lastly in the latter case if then the contribution is absorbed into .
The second operation of Deterioration covers the possibility might not be the largest Adjective. If not, some element of or of will take the place of “” and will fall into – or will be discarded entirely if . Thus the second operation covers all such possible exchanges (in fact, it covers more – but it’s tight enough for the use we will make of it).
From this the “if” of the lemma’s “only if” follows immediately. Conversely, if then we have
with all elements . Equality of order dictates simplifying the equation to
And since is at least as big as anything in we may presume either or . But in the latter case either we have also the former or the elements of the right-hand-side sum to strictly less than the sum of those on the left-hand-side. We may then presume and are left with
Considering the sum of elements again tells us implying implying finally . ∎
These lemmas taken together are enough to wrangle ’s list of candidate values from the infinite into the finite.
Theorem 7.3.
Letting , then for
either
1. is a fixed point under (i.e. and therefore an Inventory Loop), or
2. the Core Adjectives in Integer Notation are one of or are the empty set .
Proof.
Theorem 6.5 and Corollary 7.1.1 taken together imply there exists
such that is one of the 64 multisets there listed. But more strongly, we make the claim not simply of but for all thereafter (i.e. that is one of the 64 multisets for ). Those multisets are clearly closed under (as Figures 7.1 and 7.2 show). But they are closed also under Deterioration since is strictly decreased. Lemma 6.2 tells us then if is one of our 64 values, then so is . In other words for
the multiset is one of the 64 values.
Further, after adding Deteriorate edges to Figures 7.1 and 7.2 the resulting graph has exactly two self-cyclic strongly-connected-components (SCCs). This can of course be verified by computer but can also be seen more easily from the Figures. Since Deterioration strictly decreases we are assured all but 3 Deteriorate edges originating beneath the dashed green line point upwards (with exceptions and which each point horizontally). Hence the graphs were arranged vertically as they were.
After additional iterations the sequence must therefore enter one of the two SCCs:
1. or
2. .
If the first is entered, the theorem holds true. If instead the 2nd SCC is entered, then after another additional iteration (so after in total) we must have for some . It turns out this is enough information to know a fixed point of (and on this we spend the proof’s remainder).
To make things easier we presume (and address the case last). Lemma 7.2 tells us . But since in addition to and the elements of the multiset contains only ’s, the former really means implying – or in English, that no new elements have appeared. The same can be said also of so in total
The definition/equation also tells us
for Putting the former with the latter we then obtain
Here again the assumption is relevant. By it, the definition tells us
for . In particular, we therefore have which by Lemma 7.2 again tells us . Now if we simply put together the deductions A) , B) , and C) it follows that . And paired with this of course tells us .
Lastly if then must contain only ’s since if not, we would have
But Lemma 4.2 tells us a Parent can only be larger in order than its Child in the first Generation – a contradiction since . But then if (i.e. if the Adjectives are all ’s) then
which would imply – a contradiction as well. Thus we may (as done above) presume . ∎
Corollary 7.3.1.
The period of any Inventory Sequence can be determined after
iterations from the initial value .
Proof.
This is odd since (as we will see later) the Loop itself may not begin for many iterations after . The former Theorem tells us after iterations either is a fixed point (an Inventory Loop with period ) or the Core Adjective multiset is in Integer Notation one of or is the empty multiset . But those eight possibilities are closed under and Deterioration and thus the Core Adjectives will be one of those eight possible multisets.
Out of the two parametric Loops of Theorem 5.5 only the -cycle uses the eight Core Adjectives (the -cycle has ).
And the Loops listed in Theorem 5.3 all start within the first iterations. This was checked with computers by generating family trees. The longest pre-period was which started its Loop at with period . ∎
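In the same spirit as that computer check, here is a rough sketch (not the cited code of [11]; the helper names are hypothetical) of how one can measure the pre-period and period of a single starting value by brute force:

```python
from collections import Counter
from itertools import count

def inventory(S: Counter) -> Counter:
    child = Counter(S.values())   # the Adjectives
    child.update(set(S))          # the Nouns
    return child

def pre_period_and_period(start):
    """Iterate until a multiset repeats; return (pre-period, period)."""
    seen, S = {}, Counter(start)
    for t in count():
        key = frozenset(S.items())
        if key in seen:
            return seen[key], t - seen[key]
        seen[key] = t
        S = inventory(S)

print(pre_period_and_period([1, 3, 8, 1]))  # (6, 1) for the 1381 example, as multisets
```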
8 The Main Stuff
“I am to give my readers not the best absolutely but the best I have.”
- C. S. Lewis, The Problem of Pain
The author found it difficult to write this section clearly even though, containing the main result, it is the most important. Any reader therefore intending to build higher should inspect this foundation in case the author fell short of proper rigor. They ask your patience.
We should begin by summarizing our position. Theorem 7.3 told us some Inventory Sequences reach a Loop in time. And further, the Core Adjectives of the Sequences which don’t are one of
past a certain point (also reached in time). Our remaining task then is to analyze how exactly the former eight multisets of Core Adjectives jump from one to each other. In other words, what values can take?
The rules of Deterioration laid out in the previous section are enough to specify which possible values might take (that is what Lemma 7.2 was saying). But if our Sequence is order or larger – equivalently, if – and we know also which elements (if any) of have appeared for the first time in the Sequence, then is sufficient information to determine . In other words, if there are at least eight Adjectives then the Core Adjectives and new appearances of one Generation are enough to work out the Core Adjectives of the next Generation (we will see it is also enough to work out ).
What follows is said (and understood) more easily if we define an Inventory Sequence’s Maturity. We say such a Sequence has Matured at iteration if and the Core Adjectives are one of the former eight possibilities. If either of these assumptions is broken we say the Sequence is Immature. (So for example, the parametric -cycle from Theorem 5.5 might well have over elements per term. However the Core Adjectives () are not one of the former eight possibilities. Thus we say such a Sequence has Looped Immaturely.) Theorem 7.3 essentially says any Inventory Sequence either Loops or reaches Maturity in time.
Let’s assume then our Sequence has Matured and redraw Figure 7.2 with just the Core Adjectives we care about. This time Deteriorate edges corresponding to new element appearances are shown (and are labeled by the elements making new appearance).
![[Uncaptioned image]](Deteriorates.png)
Figure 8.1
The letter “” is shown when the value of is making a new appearance. The “&” denotes multiple new elements appearing together (e.g. “&&” means and have each and all together appeared for the first time in the entire Inventory Sequence). Blue arrows are for no new appearances (and therefore match up with the black arrows of Figure 7.2). Orange arrows are for new appearance edges which can be taken any amount of times. Green arrows are for new appearance edges which can be taken only once (or, for the dashed arrow, at most twice). The dashed nodes containing and are only placeholders to reduce clutter.
We won’t compute the correctness of all arrows by hand. Any individual arrow can be checked if needed. However the claim some edges can be taken at most once (or twice) does require longer justification.
Lemma 8.1.
If an Inventory Sequence is past Maturity then particular consecutive Core Adjective pairs can further appear at most a fixed number of times – given explicitly by the following table:
Edge(s) | Max Occurrences |
---|---|
Proof.
Most edges occur at most once or twice because they require the new appearance of and/or which by definition can happen at most once. Nothing can appear for the first time twice. The two edges which could plausibly appear repeatedly are and requiring the new appearance of only.
To take care of these exceptional edges, we will use Backtracking. The process is tedious to define so we begin by just showing the Backtracking Tree for and define afterwards.
![[Uncaptioned image]](BTT222_4.png)
Figure 8.2
The idea is to start with and work backwards to possible values of … and so on. The trick is that by tracking the appearance of new elements carefully, we can terminate the Backtracking after a finite number of steps. This is done by forcing the Sequence into Immaturity (at which point we don’t push farther since we have a good bound on the development of Maturity – as some of us wish we had generally). The tedious bit is determining what value each must take in terms of . Notice Maturity implies . Our work from Section 5 will help us. Let’s take an example.
If we claim either or the Sequence has just Matured at . If the Sequence was also Mature at then there were no new appearances in – or equivalently, . This is because the multiset receives only blue arrows in Figure 8.1. We therefore know and . Putting these together we can derive
implying .
Similarly if new elements appear we can amend the derivation starting instead with and . The corresponding result is . Accordingly, let’s make a table of values (assuming Maturity in the prior Generation):
– | ||||
– | ||||
– | – | |||
– | – | |||
– | – | |||
– | – | |||
– | – | – | ||
– | – | – |
The em dashes “–” mark impossible iterations past Maturity. For example, and are marked because past Maturity appears only when no new elements have appeared ( receives only blue arrows in Figure 8.1). Similarly, only is unmarked in the column because only receives arrows requiring new elements.
There is another picky point to be made about Backtracking. Once we have fixed for some the values of for may not be precisely what our table specifies. For example consider ’s Backtracking Tree letting . Both “” and “” appear when “” appears nowhere in their rows of the table. Why? Because when new elements appear is incremented (or decremented if Backtracking) accordingly. Thus is one more than the table dictates because it appears after the new appearance of . Similarly, in “” is one less than expected because it appears before the new appearance of .
Lastly, at each turn of Backtracking we steer inside the boundaries of Maturity and terminate when we are forced to say some element appears before its new appearance – a contradiction by definition. The Inventory Sequence may well continue backwards further by another route but any such route can be taken only before reaching Maturity.
The Backtracking Tree for tells us the edge appears (if at all) within iterations after Maturing and, at most, once in total.
The Backtracking tree for is much larger but the process of creation is identical using the table and decrementing specified above. Whereas the tree contained nodes, the tree contains nodes (though our computer-generated tree has some redundancy so the minimal nodes required for the computation might be a hundred or so smaller). The full tree and the code to produce it are available via links in the references. The tree has a height of and every path down contains at most two occurrences of . We visualize here a single path from our output file:
![[Uncaptioned image]](Path24_22.png)
Figure 8.3
Thus with the new appearance of or , the edge can occur at most once (and with the new appearance of , at most twice). And there are a total of occurrences at most. ∎
We are nearly ready for the main theorem. Two lemmas are still needed. The first will tell us when and how exactly a repetition of Core Adjectives gives rise to an Inventory Loop.
Lemma 8.2.
Suppose and for some
where “” represents mapping by . Then is a member of an Inventory Loop.
Proof.
The proof goes similarly to the second half of Theorem 7.3. In fact, we could have proven this first and used it as justification therein. However, it is often easier for readers to digest theorems built up by increasing generalization (rather than first proving the most general and going on to apply it in particular cases). And the understanding of the reader is more important than logical brevity.
Note also that because and , we are assuming something stronger than the single reappearance of some .
We may assume since if not all elements in are . But if so, must contain only ’s since otherwise would be larger in order than – a contradiction since by Lemma 4.2 we would have (but we assumed ). And if , we would necessarily also have – contradicting the lemma assumption.
Given then , Lemma 7.2 ensures
Noting in general the former implies
The first of these implies also since in general if then . And again, Lemma 7.2 tells us . Now since A) , B) , and C) , we may conclude which, paired with , implies . ∎
In Figure 8.1 and Lemma 8.1 we assumed our Sequence had Matured. This includes the assumption . But some Loops occur with less than elements and our bounds at present don’t apply to them. This final lemma amends their case.
Lemma 8.3.
Suppose and . Then for any such that is not a member of any Inventory Loop and we may presume where counts appearances of new elements.
Proof.
Let . Since we know . Our goal then is relating to Figure 5.1 which gives the action of on – the set of multisets of order whose elements sum to .
There are two possibilities. If no new elements appeared in then since in this case and we may therefore say
But if any new elements did appear then will (by definition) still have elements but summing instead to strictly less than .
In total, this means the sequence swims around until either some set of Adjectives reappears (implying an Inventory Loop has been entered) or some new element appears and the sequence eventually jumps up into (or into …). Lemma 5.2 assures us as well the sequence spends at most one iteration outside its source and destination at every subsequent arrival of new elements. Thus we can bound the maximum iterations occurring before either is at least (and therefore ) or a Loop begins. To do this let’s redraw Figure 5.1 marking out the longest path available in each :
![[Uncaptioned image]](AdjCyclePaths.png)
Figure 8.4
The longest paths are left in color. Remaining nodes are colorless.
And we see the path lengths for are of length respectively. Incrementing each (for the border crossing between ’s) and taking the sum we get iterations at most. But instead of leaving a grand total of , we choose a slightly different representation, attributing iterations to each new element (hence “” in the lemma statement) and leaving as constant. This modified representation plays more nicely with the main theorem. ∎
Theorem 8.4.
The pre-period of any Inventory Sequence with initial value is at most
where .
Proof.
Here we total up the iterations from the relevant theorems and lemmas:
1. Theorem 7.3 says after
iterations from the initial value either is an Inventory Loop or the Core Adjectives, , are in Integer Notation one of or are the empty set .
2. Lemma 8.3 says after iterations from the initial value either is part of an Inventory Loop or (where counts appearances of new elements – we will bound it further down).
3. The former two taken together imply after
iterations from the initial value either is part of an Inventory Loop or the sequence has Matured.
4. Figure 8.1 and Lemma 8.2 taken together imply an Inventory Sequence will enter a Loop at most iterations after Maturity if no new elements appear.
5. One can conclude from Figure 8.1 that each new element delays the Loop by at most two Generations unless it appears with or .
6. Lemma 8.1 tells us the former exceptional appearances occur at most a fixed amount. In fact, we can tabulate exactly how many iterations each might add:
Edge | No. new elements | Expected delay | Worst delay | Max. no. occurrences | Worst overture |
---|---|---|---|---|---|
Thus attributing a delay of iterations per new element gives a bound off by no more than iterations. More careful analysis could reduce this number but, as there are a lot of possible improvements to our bound’s constant term, we will enumerate them all together in the next section.
7. Adding the constant terms of items (3.), (4.), and (6.) gives . Thus after
iterations from the initial value, our Sequence has entered a Loop.
Our last task is to get rid of the “”. This is possible because the elements of all Inventory Sequences are bounded from above. In particular, Theorem 4.5 tells us the largest element appears by (except for a handful of exceptions). In other words, for . Further, Corollary 4.1.1 states the Height of an Inventory Sequence at most increments. Thus and in most cases for all . The worst of the exceptions is which gives and for . So in all cases .
Now counts new appearances from onward. So because the elements of the Sequence are bounded by we may conclude
Substituting into our former bound we obtain a pre-period bound of
∎
Corollary 8.4.1.
The pre-period of any Inventory Sequence with is at most
Proof.
The recurrence tells us . This means the larger we make , the larger becomes too. In particular, an increment to increases the term by at most
but reduces the term by . In other words, the bound is largest when is smallest.
Picking then gives us a more symbolically compact bound:
∎
9 On Improvements
It isn’t clear the pre-period bound is best formed as . And ours ( and ) isn’t necessarily the tightest. However, Bronstein and Fraenkel’s original request was for meaningful – not exact – pre-period bounds and we feel the request is answered. Before closing, the authors would like to speculate about improvements to our bound.
Firstly, is easily improvable and the authors took no pains to do so when the mathematical ecosystem offered brevity in exchange. The following shortcuts each admit improvements:
1. Lemma 6.2 might be adapted to our particular subspecies of multisets yielding better bounds – or more likely, a set of bounds for different cases (e.g. when new elements have appeared and when not). The result would carry into Lemmas 6.3 and 6.4 and become a lower constant term and/or lower logarithmic bases in Theorem 6.5. A lowered constant would also reduce the multisets listed in Corollary 7.1.1 and would then become a lower constant in Theorem 7.3.
2. Lemma 8.1 was completed as if all edges could occur together at their maximum occurrence. However, most of them are exclusive. For example, and each require the new appearance of and thus only one can appear. Similarly if , this could probably be shown to occur at most or times (instead of ). Thus the constant “” from part (6.) of Theorem 8.4’s proof can probably be shrunk to or so.
3. Lemma 8.2 requires the repetition of a consecutive Core Adjective pair. However, if adapted for our case, it is possible or (i.e. the repetition of a single Core Adjective multiset) would be enough to force a Loop. This would reduce the delay constant from to as well as reduce the overture of some edges in part (6.) of Theorem 8.4’s proof.
4. Lemma 8.3 was completed as if the longest path of all seven ’s might occur together. However, the length of the path through has consequences on the path taken through . For example, if the longest path (length ) is taken through then the path through will be at most length . The constant “” can probably be reduced to .
But these improvements still leave us with something in the end (around probably). One might then wonder: can be decreased?
No. Not in the general case at least. The orange edges in Figure 8.1 can in fact be taken indefinitely. For example, takes repeatedly ( times exactly):
![[Uncaptioned image]](SharpExample.png)
Figure 9.1
The starting value is represented as a multiset but for readability the remaining nodes are written in Integer Notation.
The trick works because the family bounces around the green area repeatedly – times in fact. The authors worked out similar starting values for three other orange edges. Two of them ( and ) give rise to infinite families of starting values for which the is sharp. Here are the edges marked out in the Core Adjectives graph:
![[Uncaptioned image]](OrangeFamilies.png)
Figure 9.2
And here are their families given explicitly:
Edge | Loop at | |||
---|---|---|---|---|
It is left to the reader to work out families for and (if they exist!).
10 Replacing with
So far we have assumed our multisets are of strictly positive integers. We’d like to know if our results hold in other regions – say or . It turns out by simply allowing and (or really any two values outside the image of ) we lose even “Ultimate Cyclicity”. That is to say, one can find which never enters a Loop. For example:
![[Uncaptioned image]](NonCycleExample.png)
Figure 10.1
Thus is something of a boundary case. As far as the authors can see, the results of Sections 5 through 9 run isomorphically substituting “” for “”. Section 4 is probably affected only in that the list of exceptions for Theorem 4.5 becomes longer. This is because the proof of Lemma 4.1 relied on the fact that for . But in we might have say . The difficulty can be amended with some extra work.
But the authors have not pursued this variation or any others as the project has already grown larger than the authors should have allowed. Further analysis is therefore left undone and the paper will close with speculation about an interesting (but yet unstudied) reformulation and the variations it offers.
11 Multisets as Functions
Every multiset “” corresponds to a multiplicity function “”. Some fun variations are easier to find (and rigorously define) by treating the Inventory Game as an iteration of functions,
instead of an iteration of multisets
It will be a bit of work to redefine things, but the authors found the results worth the trouble.
How then do we translate our recurrence “” into the language of functions? The task can be broken down to redefining 1) the multiset-of-multiplicities function “”, 2) the multiset-to-set function “”, and 3) multiset addition “”.
Taking the first point first, what is for a particular ? Well – appears in once for however many elements have a multiplicity of in . In other words, counts how many elements the function sends to . This value will come up a lot so we should make some notation for it (it’s cluttered to express it as a summation).
For any function let “” mean “the set of elements sent to by ”. In other words,
As an equation we then have
But this definition isn’t quite right still. There is an exception. Zero. There are infinitely many elements in any finite multiset occurring zero times. That is, there are infinitely many elements sends to zero. But we exclude this. We do not, for instance, say has zero ’s, two ’s, zero ’s, one , zero ’s, zero ’s, and so on… The zeros are left out. Our equation therefore requires amendment:
The function is easier to work through. The value of is if appears in and is otherwise. It’s therefore something of an indicator function. As an equation,
Lastly, addition of functions is defined simply by adding values element-wise since for any two multisets
All together we have
And has been defined totally in terms of . To make reading easier from here on (and to clear multiset conceptions out of our minds) the function “” will simply be called “”. Thus the Inventory Game really means the sequence for some where
Our previous results then tell us that if sends only a finite portion of to non-zero values then the sequence enters a cycle of length or in time where .
Before going on to exotic variations, it may be good to work an example. Let’s take the classic case with the corresponding function
Because case equations are large, difficult to read (and to code), and are usually ugly, we will use an alternative representation. The statement “” will mean “, the function sending to and everything else to ”. Thus our iteration may be visualized in three ways – as multisets, as function descriptions, and as function compositions:
![[Uncaptioned image]](ThreeVisualizations.png)
Figure 11.1
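In code, the same function-iteration view might look as follows – a sketch only, with a plain dict standing in for the multiplicity function, the helper name `step` invented here, and the infinitely many zero values simply left unstored:

```python
def step(f: dict) -> dict:
    """One Inventory step on a multiplicity function f (element -> multiplicity > 0):
    the new value at x is |f^{-1}(x)| plus 1 if x itself had a positive multiplicity."""
    g = {}
    for e, m in f.items():
        g[m] = g.get(m, 0) + 1        # how many elements f sends to m
    for e in f:
        g[e] = g.get(e, 0) + 1        # the indicator: e appears in the multiset
    return g

f = {1: 2, 3: 1, 8: 1}                # the digits of 1381 as a multiplicity function
for _ in range(8):
    f = step(f)
print(dict(sorted(f.items())))        # {1: 3, 2: 2, 3: 3, 4: 1, 8: 1}
```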
Of course, these redefinitions and revisualizations are only like setting the table. Now we are ready to eat.
It was said in the first section some mathematicians have played the Inventory Game with infinitely long starting values [1,2,3]. In our current language, we would say they chose sending infinitely many values to a non-zero image. But to keep everything well-defined, they had to make sure infinitely many values were not sent to any particular since then would be undefined. But there are perfectly reasonable ways to define when the set is infinitely large. And doing so gives us transfinite variations of the Inventory Game.
In the simplest case, we take instead of and say whenever sends infinitely many elements to (to be completely rigorous we should also specify that and ). For example, if we start with (or equivalently with ) then we get the sequence on the left:
![[Uncaptioned image]](TransfiniteVariations.png)
Figure 11.2
And a Loop occurs at .
There are other more interesting Transfinite variations (e.g. the one on the right). So to keep from confusing ourselves, let’s call the version on the left STIG for Simplest Transfinite Inventory Game. The version we’re about to define (the one on the right) will be called TOIG for Transfinite Ordinals Inventory Game since it is played with surreal numbers (hence using “” instead of “”).
In STIG, we said “”. TOIG is more or less the result of saying instead “”. Correspondingly in STIG, we said if sent any infinite portion of to . In TOIG however, we say if sends all of to . And if sends all but a finite portion of to , say , then we say . In the round of TOIG in Figure 11.2 for example, sends to . Accordingly, and . And again, if in addition to , sends some finite set of transfinites, , to then we say
But this variation has a couple of holes the authors have not seriously attempted to patch. Firstly, what happens when infinitely many transfinites are sent to ? And secondly, what happens when an infinite portion of with infinitely many exceptions is sent to ? For an instance of the latter, what if we start with ? (i.e. – the non-negative even numbers).
The only hopeful solution occurring to the authors was to play over Conway’s surreal numbers (as opposed to a simpler surordinal set). The first difficulty might then be resolved setting to or (or perhaps something more nuanced). The second difficulty, , might also be resolved setting to . And in general if has density in then we set . If the density is or then an interesting work-around might be found with (or the variation might play out consistently without any patch-work). The soil seems fertile here but the authors haven’t made time to plant anything.
The multiset-as-function formulation has also some nice alterations worth mentioning. But it will need some surgery first – it’s not flexible enough currently. There are three procedures to be done:
1. First we treated as a map from to , then as a map from to , and thirdly as a map from the surreal numbers to themselves. In general, sends some set “into” itself. Really the only requirement of is an additive structure with identity – in other words, to clearly define , we need some and such that for any .
2. When working out we had to make an exception for zero since we don’t mention zero amounts of things in description (e.g. saying has “one , zero ’s, one , zero ’s, zero ’s, …”). In spoken languages, zero is (usually) an assumed or insignificant Adjective. But what if we want to mention zero things explicitly? – if we want to give zero some significance? Or alternatively, what if we want to treat other Adjectives besides zero insignificantly? Both of these can be done defining a set of Insignificant Adjectives where so far we have used .
3. We said in the first section some people play the Inventory Game without Nouns. Their gameplay can be accommodated by adding a parameter counting Noun-mentions. We played with . Choosing gives the Nounless Variation and corresponds to the map in the multiset formulation. More will be said about further on.
Putting all these together, the recurrence becomes
The order operator can really be any map sending subsets of to elements of (i.e. ).
A lot of variations can now be labeled by a -tuple . Our main specimen of analysis has been . STIG is where sends infinite sets to and TOIG is where is Conway’s surreal numbers and we never defined exhaustively. Additionally, the decimal Nounless Variation is where sends sets of order to . This variation has two Loops. There is a -cycle corresponding to the self-descriptive number and a -cycle corresponding to the mutually descriptive pair and . The authors here resist the temptation to speculate about Nounless Transfinite Variations.
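To make the parameters concrete, here is a loose sketch of one possible reading of such a step on finite multisets – an Insignificant-Adjective set Z and a Noun-mention count k. The helper name `generalized_step` is hypothetical, and the code is an assumption about the intent of the recurrence above rather than a restatement of it; with Z = {0} and k = 1 it reproduces the map we analyzed.

```python
from collections import Counter

def generalized_step(S: Counter, Z=frozenset({0}), k=1) -> Counter:
    """One possible (Z, k)-parameterized Inventory step (an assumption, not the paper's
    exact recurrence): for each distinct element e whose multiplicity m is not in Z,
    mention the Adjective m once and the Noun e k times."""
    child = Counter()
    for e, m in S.items():
        if m not in Z:
            child[m] += 1        # the Adjective
            child[e] += k        # k Noun-mentions
    return child

S = Counter([1, 1, 3, 8])
print(sorted(generalized_step(S).elements()))        # [1, 1, 1, 2, 3, 8]  (our usual game)
print(sorted(generalized_step(S, k=2).elements()))   # [1, 1, 1, 1, 2, 3, 3, 8, 8]  (an OEIG-style step)
```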
One wonders how variations behave when is neither nor . Consider then . It’s a lot like the version we analyzed except a Noun is mentioned only if there are or of it. For example, we would say has “two ’s, one , and three ’s”. The ’s are not mentioned. This variation produces Loops longer than elements:
The last point of any substance the authors have to make is about . Let’s play – i.e. the variation we analyzed with . This means Nouns are repeated times each in description saying, for example, has
“two ’s, one , and one .”
One inevitably thinks of speaking with a stutter. But we will call these OEIGs for Over-Emphasized Inventory Games (spoofing the OEIS [10]). Thus here we say and a round might go like:
The gameplay should feel oddly similar to the variation we studied. Look – the Loop’s Adjectives are (a bit similar to the fixed point of : ). It isn’t coincidence. Alternatively, if one had started with (or again, equivalently, with ) one would find a -cycle:
with Adjectives again very similar to :
Loop Adjectives | -cycle |
---|---|
from var. | from |
To find out what’s going on, we should generalize . So let be the set of multisets
1. containing elements,
2. whose elements sum to times their order,
3. and whose elements are or greater.
Formally
Our old definition then lines up as . One could similarly generalize and and go on to show A) the Adjectives of a Loop in are members of , B) the multisets of any and are in bijective correspondence (in particular if then any is obtained by adding to each element of some ), and C) therefore every Loop of is mapped under a bijective correspondence to a cycle in under . Note also is in bijective correspondence with partitions of .
The authors wonder if in general the Loops of can always be reduced by isomorphism to the Loops of for some particular (in the former case, ). They wonder also if and when such isomorphisms of Loops offer a translation of pre-period bounds across variations.
The speculations have been completed. The authors have navigated only one path and a couple landmarks of the Inventory forest (and perhaps not the most efficient path at that). They are not sure if these speculations are so simple (or so general) as to be useless. That is, they aren’t sure if they have led the reader to really new and nice scenery – or only in a circle (or perhaps to a cliff). Thus the length of these speculations may only be evidence they don’t have the nose for telling dead from fertile mathematical soil. But they believe it is always better to ask more questions than one answers and that to live in a universe with no open questions (or more likely, to live without interest for the open questions around oneself) is the closest experience to Hell we might have while alive. So, as one might imagine the totality of known mathematics as a house and the unknown as the wild outdoors, the authors have in this last section thrown open the doors and windows nearest them. They hoped to give the reader new exploration (or at least a fresh breeze). They ask the reader’s forgiveness if – in ignorance or haste – they have only thrown open a closet, the oven, or the door to another room.
References
[1] Victor Bronstein and Aviezri S. Fraenkel. “On a Curious Property of Counting Sequences.” The American Mathematical Monthly, vol. 101, no. 6, 1994, pp. 560–563. JSTOR, http://www.jstor.org/stable/2975323
[2] Victor Bronstein and Aviezri S. Fraenkel. “More on Counting Sequences.” (1995) https://pdfs.semanticscholar.org/3328/255c7198aa6c92a8a555a7b60c647659ce9c.pdf
[3] Jim Sauerberg and Linghsueh Shu. “The Long and Short of Counting Sequences.” The American Mathematical Monthly, vol. 104, no. 4, 1997, pp. 306–317. JSTOR, https://www.jstor.org/stable/2974579
[4] Clifford Pickover. The Math Book, Sterling Publishing, 2009. p. 486, Audioactive Sequence.
[5] André Pedroso Kowacs. “Studies on the Pea Pattern Sequence”, arXiv:1708.06452v5 [math.HO], https://arxiv.org/abs/1708.06452v5
[6] Carlos Rivera. The Inventory Sequences and the Self-inventoried Numbers, The Prime Puzzles & Problems Connection #207, https://www.primepuzzles.net/puzzles/puzz_207.htm
[7] J. H. Conway. “The Weird and Wonderful Chemistry of Audioactive Decay”, Eureka 46, pp. 5–18, 1986.
[8] Wikipedia, Self-descriptive Number, https://en.wikipedia.org/wiki/Self-descriptive_number
[9] James Grime. Maths Puzzle: The self descriptive number solution, YouTube, https://www.youtube.com/watch?v=1GKfEDvhWdY
[10] OEIS Foundation Inc. (2019), The On-Line Encyclopedia of Integer Sequences, https://oeis.org
[11] Inventory Numbers Python code. Live: https://repl.it/@onnomc/InventoryNumbers GitHub: https://github.com/onnomc/InventoryNumbers