All fractions are one fraction

I call my weblog "Rough Drafts" even though some of the entries are fairly well polished, published only after weeks or even months of careful writing and editing. But in other entries, I write wild stream-of-consciousness gagompoofery like this. This one is so rough it's right at the edge of being thrown in the wastebin, but it won't be, because there are some really interesting gems buried in it. I was writing an epiphany as it unfolded, and I expect most readers will bail out of the wild ride before it's finished. But I'm leaving it intact because it's a great example of the fairly chaotic, naive, grammatically-impossible, poorly-informed-but-inquiring-anyway formative process behind some of the more polished things I write. Note, not all epiphanies come through this awkwardly; some are well written right out of the gate, but those are rare.

All fractions are the fraction 1/2, iterated with different, shall we say, flavors -- in the sense of Linux distro flavors, which are all different, yet the same. This insight arises from a long-held belief that had no words until moments ago [approx 5:30am], when I read, in an article about Wittgenstein's philosophy of mathematics, that "Wittgenstein discusses the claim that fractions cannot be ordered by magnitude." I paused briefly, having never considered the idea in that form before, and within a minute had realized that the point is true.

(Integers may have the same underlying issue, but for now let's work with the idea that integers can be ordered, and fractions cannot*.) Integers can be ordered, in positive form at least, because they refer to a reality that is growing in size with each enumeration. The referent is ordered, so obviously they are. They can also be ordered abstractly -- with no referent -- based upon their root in reality, where they are specifically enumerating things; but this more abstract form is a different kind of ordering, one that is clearly more abstract (and therefore less meaningful? I'll revisit that thought in a moment), even though we may not always be aware of the transition between the two layers of abstraction (i.e. counting specific things vs. just counting, without specific individual referents).

[Update: this gets modified a few minutes later, so hold your horses. By "cannot" I was riffing on the usage in the stanford.edu article, and apparently meant "ought not be ordered in the simple manner previously understood."]

But fractions are not adding anything, ever. They are _always_ referring to a **division** of what exists, never to a referent which is _adding_ more reality to whatever was being enumerated before their act of division happened. Fractions are never talking about _more_ reality in the way that integers are -- they are always talking about a subdivision of something or other. [Update: Therefore fractions can remain at zero on the number line, having no referent that is larger than zero.]

In this way, they are all the same. 3/4 is the same as 1/2 because, on the most basic level, it represents a reiteration of 1/2. It is 1/2 plus 1/2 of the second half -- two iterations of 1/2 -- not actually representing something _different_, but more like the same thing _again_. It's not different the way a tree is different from a car, but different the way a young tree is different from an older version of itself: maybe more growth rings, but still the same underlying thing.
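Here is a minimal arithmetic sketch of that "two iterations of 1/2" claim, just to check it mechanically. The use of Python and its `fractions` module is my own choice of notation, not anything from the original entry:

```python
from fractions import Fraction

# Take the 3/4 example literally: keep 1/2 of the whole, then take 1/2 of the
# remaining half and add it on -- the same halving move, applied twice.
half = Fraction(1, 2)
first_cut = half            # 1/2 of the whole
second_cut = half * half    # 1/2 of the second half
print(first_cut + second_cut)                    # 3/4
print(first_cut + second_cut == Fraction(3, 4))  # True
```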

Okay, so that's the basic thought. From this perspective, fractions are "all the same," as I wrote above, because they are **always** abstracted: one layer "above" whole, undivided integers, and thus at least two layers of abstraction away from reality, which itself remains undivided. We're only dividing within the language/mathematical sphere.

This understanding holds the key to the great weakness of set theory: the illusion that it usefully describes a Platonic reality existing independently of set theory. Set theory is clearly not describing something that pre-existed it and is being "discovered," as a Platonic ideal would be.

This goes with Wittgenstein's anti-Platonism, with which I heartily agree in spirit, although I am, like Gödel, a Platonist, and thus see Wittgenstein's point as directed not against Platonism per se, but against poor implementations of it -- for example, the prevailing implementation of set theory, and many other "theories" like it, which are fabricated upon foundations that are, as Wittgenstein would say, "nonsense." By this he meant something outside the bounds of the very logic they presume to exemplify.

All fractions are abstracted one layer above integers. 1/2 is, within our binary mindset, the primordial fraction, but even as I write that, I have a distinct sense that a truly Platonic understanding of fractions would not begin the series with 1/2 but with 1/3, thereby acknowledging that 1/2 arises from a deeper form of division -- one which creates the concept of the "excluded middle" and uses that 1/3 to divide the other two thirds from each other, creating the second abstraction which we know as 1/2.

There, I just said what I've been trying to frame dozens of times over a couple of decades, about as concisely as I've ever said it. Now it is clear that, indeed, C. S. Peirce was correct in realizing that logic cannot have fewer than three fundamental elements. One of these three elements is used to create all further divisions (until we have abstracted far enough away that we lose awareness of its origin and think the act of division has only two elements, forgetting that the mind with which we divide is the knife doing the dividing, and is itself a third thing).

Again, I've said this one pretty concisely too -- a thought I've approached many times over the years from many directions, but not usually so concisely, and I think never while writing.

Integers are created out of division

At the same time as I'm saying that all fractions are one fraction, agreeing that they cannot be ordered beyond that, I can also see that they _can_ be ordered, just not merely in the simple way we might normally imagine. They are ordered vertically, as abstractions upon abstractions, not just horizontally, identifying points along the real number line. And a real problem arises when the map no longer represents the territory accurately, specifically because we've lost track of a referent, or of multiple referents.

([Update] Upon reading this after writing the rest, I realize I really need to come up with a name for this vertical ordering. I've thought about it enough that I can see it clearly now -- can see that it's a vertical dimension of the same "substratum" we know as the unnumbered horizontal number line -- so it's time for a name. But wow, it's entangled with the confusion around complex numbers, yet my next point is about simplification. Hm, it will take some pondering to come up with a name.)

([Update2_11:16am] Okay, while editing, I see what I meant while writing, but it should be reworded: fractions are first ordered in a vertical direction, which, viewed from the horizontal direction, looks all the same, since they are all being ordered along the "Y" axis while having a horizontal ordering of 0. Then they are ordered along the X axis. I suppose we could do this arbitrarily, X,Y or Y,X, but I think if we're talking about ordering things, let's order things. Along the Y axis they're ordered strictly by size, and thus grow smaller and smaller vertically until they reach "the youngest infinitesimal," which is a moving standard, referring to the latest infinitesimal, either in the local series or globally.

{1/2 is "the oldest infinitesimal" in that nomenclature, although we haven't yet figured out what to do with the primordial 1/3 division, which explicitly preserves the infinitesimal historically known as "the excluded middle." This infinitesimal is a special one which is used in all subsequent divisions as the divider. That is, this "infinitesimal" is the boundary of the set, which is not enumerated in the set because it is the boundary, forged out of the adamantium of excluded middleness. Hmmm.... Looks like this plays a dual role, being the divider between elements in the set as well as the boundary of a set, dividing one set from another. What a wow-it's-everywhere moment, realizing that the excluded middle is not nowhere, but everywhere! Hmmm...}

By this I mean there is always ONE single infinitesimal which has a size, being the tiniest infinitesimal which exists by reference, either in someone's mind or in a book, somewhere. That is the tiniest infinitesimal, and as soon as no one is thinking of it, it disappears into where it came from -- nowhere -- and the next largest infinitesimal becomes the globally smallest infinitesimal. And locally, when we're actively counting and using a series of fractions resulting in an infinitesimal, there's always the smallest one of the local series. On occasion, the local and the global are the same. Using this definition of infinitesimal, we remain bounded by finitary and sensible math, and do not escape into nonsense, talking about things which have no possible referent as if they do.

So, back to the ordering of fractions. Fractions are ordered vertically by size: one half is the largest, then one third is the next largest, then one fourth, etc. You might think 63/64 should come before 1/3 in this series, but the correct place for 63/64 is vertically higher, once we've gotten to 64ths. We do not 'cut' something larger -- a mistake easily made when working with a 2D number line. We're working with a 3D number sphere, and instead of cutting the original line, we add a layer of abstraction. We move upward one more layer, and then divide within each layer. It might be hard to imagine this series, so I'll work on a drawing to illustrate it. Roughly:

"smallest local infinitesimal" ------->
.-------->
.-------->
.-------->
1/65----->
1/64----->
.-------->
.-------->
.-------->
1/5------>
1/4------>
1/3------>
1/2------> largest possible division within the concept of division
1/1=1

Now this brings to mind some other patterns -- like, 1/4 is a 1/2 division of 1/2, and 2/64 is 1/32 {or is it?}, etc. -- so maybe there are alternate ways of ordering, depending on purpose. Need to play around with this idea some more. The point is, the ordering grows smaller in an incremental, simple manner, tending toward the local infinitesimal as a way of bringing elegant closure to the concept of division being iterated. One more quick sketch of that layered ordering in code below, and then back to your previously-scheduled broadcast:)
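A rough sketch of one way to read the layered ordering in the diagram above: sort by layer (the denominator) first, and only then by magnitude within each layer. The two-key reading, the `layered_order` name, and Python's `fractions` module are all my own choices here; the entry doesn't pin down a mechanism.

```python
from fractions import Fraction

def layered_order(max_denominator):
    """One reading of the 'vertical' ordering: group fractions by layer
    (the denominator), then order by magnitude within each layer."""
    ordered = []
    for d in range(2, max_denominator + 1):   # halves, thirds, fourths, ...
        layer = [Fraction(n, d) for n in range(1, d)]
        ordered.extend(sorted(layer))
    return ordered

# In this scheme 63/64 does not appear until the 64ths layer, far "above" 1/3,
# even though it is larger in magnitude.
for f in layered_order(5):
    print(f)   # 1/2, then 1/3, 2/3, then 1/4, 1/2, 3/4, then the fifths
```

Note that `Fraction(2, 64)` normalizes to 1/32 on construction, which is the "2/64 is 1/32 {or is it?}" question above in mechanical form; whether such duplicates get folded away or kept as distinct iterations is exactly the kind of alternate ordering hinted at there.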

Abstraction is simplification, meaning that we're losing granularity in our representation of whatever we're abstracting (our perceptual reality, originally, but we can get so lost in abstraction upon abstraction that we lose touch with... the original something). What happens when we abstract away from a reality a few layers and then forget an important relationship between two things that are, in the reality layer, united very strongly, but in the abstraction, not very strongly, because we've lost granularity?

Then we begin to create nonsense in the way that Wittgenstein was talking about.

And if we go one more step, we _are_ speaking nonsense, an "elephant in a tree" kind of incoherence, whether we realize it or not. We're talking about something that has no valid reference in reality*. Now, it can be entertaining, educational, funny, wise, pithy, hilarious, sarcastic, punning, it can be all kinds of things that we enjoy, but it is not reality. And when we begin to think that it is reality, we commit a form of logical fallacy. In fact, I believe it is safe to say this is where all logical fallacies have their root: they are all nonsensical abstractions rooted in a reality they no longer accurately represent. I'll come back and investigate how true that statement is later, but for now it fits the point I'm making.

* perhaps this is what Wittgenstein meant by true/false, in which case it's not binary as he imagined -- along with everyone in his era -- but a gradient.

The number 3 -- hm, no, let's go a little further up the integer line than three, because three is so meaningful and I'm intending something not so meaningful. The number 5 is very nearly identical with the number 1 in all of its attributes except its symbolic shape and its referent. Its essential role as a number is identical, for example. What is its referent, and how different is it from the referent for 1? Well, we see that even set theory acknowledges the structural idea that the number 5 is a re-iteration of the number 1. So 5, in a way, is an abstraction of 1. In a very real way it is, and thus what we discover about 1 will be true of 5 in almost every respect, because 5 is just a reiteration of what exists within 1. In this way, it's true: "All numbers are one number."

So to understand integers, we only need to understand 1 and the iterator concept, and we've got all integers figured out. (I believe Wildberger would add that there are some crazy things going on with very, very large numbers, but I think his points are encompassed within the "nonsense" layer I'm talking about, meaning that his points about the absurdity of extremely large numbers are the same as Wittgenstein's points about nonsense -- at some point in that layer, we can observe objectively that we've gotten ourselves into a reductio ad absurdum out of which we can bail whenever we want, without consequence. Hm, is nonsense that which can be dropped without affecting reality? Is nonsense irrelevant, even when we temporarily think it is relevant?)
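As one concrete illustration of "1 plus an iterator," here is the standard von Neumann construction from set theory, sketched in Python. Choosing this particular construction (and the helper names `successor` and `build`) is my doing; the entry only gestures at set theory in general.

```python
def successor(n):
    """Von Neumann successor: each number is the set of all numbers built so far,
    so 'adding one' is just performing the same generating move again."""
    return frozenset(n | {n})

def build(times):
    """Apply the successor move `times` times, starting from the empty set (zero)."""
    n = frozenset()
    for _ in range(times):
        n = successor(n)
    return n

five = build(5)
print(len(five))          # 5 -- the "size" of the number is how many iterations built it
print(build(1) in five)   # True -- 1 literally sits inside 5
```

In this picture, 5 really is nothing but the same generating move repeated, which is the sense in which "all numbers are one number."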

In that vein, what is happening with 1? And what is happening with 1/2? It seems that the integer 1 was created, at least historically, before 1/2 ever came into being. But in fact, 1/2 was there at the beginning of numbers -- there could be no concept of 1 without the preceding concept of division, which is represented later in time by the nomenclature 1/2 but which existed in a pre-nomenclature condition, causing the 1 to be carved out of what existed before, and applied as a referent to things.

Evolution provided us the ability to do rudimentary counting, up to about three or four -- the same skill that exists in pre-numerate infants, as well as in most animals. Crows apparently have a rather sophisticated ability to count in comparison to other animals, but almost all animals have a basic ability to count, and that was all we had too, until one of our ancestors, maybe 100,000 years ago, developed the ability to go beyond it, leading to "concrete" abstractions like tally sticks, which in turn led to an increasingly sophisticated understanding of numbers.

But we don't need to go into that history here, because we're looking at something common to all animals: the ability to enumerate and understand what "1" means as opposed to "2" or "more" or however we quantify infinity. (Infinity used to be small enough to hold in your hand; it was "that which lies beyond what we can count," back when that was four or five. It's always growing, infinity is.)

So the concept of 1, which rests upon the concept of division, existed in our awareness perhaps before we had the ability to understand that it was coming out of division... and we've had no reason to think of it that way ever since. Which brings me back to the original point regarding fractions. We're using fractions without fully thinking about what they are, because we're using techniques that go so far back in our enumerating skills that we've lost track of what's really happening underneath our counting. As we peel back the layers, we come down to Peirce's three elements, and then division no longer happens. We have loose, disconnected fragments... no wait, that's what we see when we're looking at it from within the concept of division... from within the concept of whatever existed before division came into being, we have a singularity.

And that singularity still exists, in undivided form, and we can place our awareness into that singularity and look at the divided forms, seeing the countable from within the uncountable, and seeing it for what it truly is, not what it appears to be from within itself, but for what it is.

And here we find Wittgenstein's advice, commending us to silence, for we are speaking of the unspeakable.

Wittgenstein moved our attention into its proper place, away from the pseudo-awareness that came from getting lost in abstractions, where we thought we were "discovering" Platonic mathematics rather than "inventing" this clumsy artificial tool by which we divide, divide, divide the world. He placed us inside the sphere of language we had created -- not in a Platonic realm we were discovering, but in a Babylonian (half sense, half nonsense) realm we have invented. He was right -- and Gödel was right, paradoxically: there _is_ a Platonic mathematics, but we haven't "discovered" it yet, because we've been stumbling around in our own invention thinking it was Platonic discovery. Sure, there are elements of mathematics which we've gotten right, but look: it's founded upon division. What if there is a whole other form of mathematics founded upon... something other than division? What would it look like? Can we even conceive of such a thing? Wittgenstein came along and showed us that we were surrounded by a cloud of sense -- language -- which was in turn surrounded by a cloud of nonsense. Reality is not in either cloud; is it the outer envelope? Not sure how to organize them (nested? parallel? interwoven? I need a map), but reality is the unspeakable part.
