I got a tiny glimpse into the beautiful, mystical quality of number theory after reading about Richard Guy’s Strong Law of Small Numbers. Simply put, there aren’t enough small numbers (or distinct signifiers) to meet the demands placed upon them. This is not merely a mathematical observation; it has applications elsewhere.
Consider, for example, how few small numbers there are: only 100 numbers have just one or two digits, all positioned at the beginning of the number line. That’s not very many numbers bearing useful, distinct meanings compared with, say, words. The limitation often poses a problem because numbers are reused in so many different contexts that their distinct meanings are lost, making things that aren’t truly equivalent look as though they are. For instance, one-fourth of the first hundred whole numbers are prime, and one-tenth are perfect squares. Distinctions like these become increasingly rare farther up the number line, and large numbers themselves lose utility as bearers of meaning separate and distinct from each other.
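The counting claims above are easy to check. Here is a small Python sketch (not from the original essay; the helper names are my own) that tallies primes and perfect squares among the first hundred whole numbers, and shows how those distinctions thin out farther up the number line:

```python
import math

def is_prime(n):
    """Trial-division primality test; fine for small n."""
    if n < 2:
        return False
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return False
    return True

def is_square(n):
    """True when n is a perfect square."""
    return math.isqrt(n) ** 2 == n

def density(predicate, limit):
    """Fraction of the numbers 1..limit satisfying predicate."""
    return sum(1 for n in range(1, limit + 1) if predicate(n)) / limit

# Among the first hundred whole numbers:
print(density(is_prime, 100))    # 0.25  -- 25 primes, one-fourth
print(density(is_square, 100))   # 0.1   -- 10 perfect squares, one-tenth

# The same distinctions grow rarer farther up the number line:
print(density(is_prime, 10_000))   # 0.1229 -- 1,229 primes
print(density(is_square, 10_000))  # 0.01   -- 100 perfect squares
```

The thinning of the primes follows the prime number theorem: their density near n falls off roughly like 1/ln(n), while perfect squares thin out even faster, like 1/√n.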
This problem occurs all the time: the disappearance of new toll-free 800 numbers, the fragmentation of the ZIP code system, the necessity of new area codes to meet the demand of new users (and the now-ubiquitous ten-digit dialing). Outside of numbers, all the short, memorable domain names are spoken for. Pharmaceutical companies can’t coin new drug names without exhaustive searches and marketing studies. Logo designers have trouble creating simple, distinctive logos that haven’t already been used. Trademarks and trade names increasingly run afoul of each other. Acronyms now assume dozens of meanings depending on context. And an increasing likelihood exists that someone sharing another’s name will become famous (or infamous).
Perhaps these examples are neither here nor there to the average reader. That’s probably true in some respects. But the usefulness of small, memorable bits of information to designate something is linked to cognition. A famous study by George Miller, “The Magical Number Seven, Plus or Minus Two,” observes that humans are typically limited in their ability to remember and/or perceive (these are related tasks) more than seven distinct bits of information. Memorizing a string of more than seven digits (the length of a phone number without the area code) is difficult. Similarly, we struggle to distinguish more than a few colors (the color wheel is divided into six primary and secondary regions; adding tertiary regions increases the number to twelve) or more than seven musical pitches (the diatonic scale has seven distinct pitches). Remembering more than seven people met at a party or meeting poses a problem as well.
As I say, it’s a tiny glimpse into number theory, with a curious applicability to, even an isomorphism with, cognition. Make of it what you will.