The Final POTW Has Been Posted

The title says it all. Go have a look and let me know what you think. Problem of the eek will make a triumphant return in January. See you then!

More like this

"Problem of the eek" indeed, because many of your problems make one utter, "Eek!"

By James Brown (not verified) on 01 Dec 2015 #permalink

My initial thought: the flaw is that you're switching the definition of "unambiguously describe" mid-problem. In the first sentence you mean something like "name the number N in a way where the words correspond to the digits." But in the quote, you're instead referencing "N" (the symbol) as if it itself has this property, when it doesn't.
It's certainly possible to use symbols to reference arbitrarily long irrational numbers (pi being the obvious example), and then to work with those symbols in math to solve problems. But the symbol "pi" is not the same as a description of the ordering of every digit. You can label either "pi" or the set of words that correspond to the infinitely long string of digits an "unambiguous description," but those will be two different definitions of the term "unambiguous description."

I'm assuming you heard about problem #10 from a Spanish barber.

By john harshman (not verified) on 01 Dec 2015 #permalink

This is a variation of the Russell paradox.

Re #3:
There is an interesting take on the Spanish barber that I ran across many years ago.

We can note that shaving oneself is fundamentally different from shaving another person; one requires only one person, the other requires two people. Thus we can define two predicates: autoshave(x), in which a person shaves themself, and heteroshave(x,y), in which person x shaves another person y. Note that heteroshave(a,a) is false; a person cannot heteroshave themself; one cannot sit in the barber's chair and be shaved and simultaneously stand beside the chair and shave the occupant of the chair.

The statement of the problem becomes the barber heteroshaves everyone who does not autoshave, or
∃a ∀x (heteroshave(a,x) ↔ ¬autoshave(x))

Consider the case where a = x. The condition becomes heteroshave(a,a) ↔ ¬autoshave(a). But heteroshave(a,a) is necessarily false, so ¬autoshave(a) is also false, and autoshave(a) is true. The barber shaves themself.
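The derivation in comment #4 can be checked mechanically. Here is a sketch in Lean 4, using the predicate names from the comment (the setup is an assumption for illustration, not part of the original problem statement):

```lean
-- If the barber heteroshaves exactly those who do not autoshave,
-- and no one can heteroshave themself, then the barber autoshaves.
example (Person : Type) (autoshave : Person → Prop)
    (heteroshave : Person → Person → Prop) (a : Person)
    (h1 : ∀ x, heteroshave a x ↔ ¬ autoshave x)
    (h2 : ¬ heteroshave a a) : autoshave a :=
  -- Suppose ¬ autoshave a; then h1 forces heteroshave a a,
  -- contradicting h2. Classical contradiction closes the goal.
  Classical.byContradiction (fun h => h2 ((h1 a).mpr h))
```

Note that no contradiction arises here: unlike the classical barber paradox, the two-predicate formulation simply forces the conclusion that the barber autoshaves.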

In fact, it does not seem possible to even state the original problem if one limits one's definitions to autoshave and heteroshave.

There's another tack here, which is to say that if you can reference other numbers in the description, as in the "ten cubed" example, then all the naturals can in fact be unambiguously defined in fewer than 14 words, and the proof by self-referential paradox is simply unnecessary; things went wrong when we started down that road.

By Another Matt (not verified) on 01 Dec 2015 #permalink

@Another Matt: Depending on how you define "word", you can still run into problems with that. Is "twenty-two" one word, or two? I think most people would call it one in English because of the hyphen, though some would say two. "Two hundred twenty-two" is then either three or four words. But in German, with its propensity for compound words, both are unambiguously one ("zweiundzwanzig" and "zweihundertzweiundzwanzig"). The rules don't specify that the fourteen words have to be English, so if you are generous about allowing German compound words, it follows that all natural numbers can be described with a single word. But if you insist on an English language description, you will eventually run into a problem if the number has enough nonzero digits.

By Eric Lund (not verified) on 02 Dec 2015 #permalink

Yep, I thought of that last night. I confess I was thinking of it in a word-processor way rather than the dictionary way: "The number between 1206540 and 1206542." counts as six words in a word count.

By Another Matt (not verified) on 02 Dec 2015 #permalink

But if you insist on an English language description, you will eventually run into a problem if the number has enough nonzero digits.

Not the way Jason has phrased the second part of his problem. If we can designate any arbitrary number with the word "n", then it becomes trivial to describe numbers slightly larger or smaller than n in fourteen words or fewer. The problem is that the word "n" is not an enumeration or description* of the digits of n, which is what the first sentence promises will be delivered. So I'm sticking to my original thought, which is that the proof is flawed because what is meant by "unambiguously described" changes between the first sentence and the later example.

*I don't think enumeration is really the right word here, but I'm struggling to think of a better one.

Sorry, that first sentence was meant to be a quote of Eric Lund @6.

@the other eric: No, that only buys you some time. You can use a symbol to denote an arbitrary collection of digits without enumerating them (π is perhaps the most familiar example). But the number of available symbols is finite: each of the lexical symbols in all of the languages of the world can be mapped to a unique code point (this is what Unicode does). Even in everyday mathematical and physical practice, the full set of Latin and Greek letters is not enough; the Hebrew letter aleph is used to denote transfinite numbers. You can try to strategically associate certain numbers with certain symbol combinations, but you will eventually run out of Hebrew letters, Cyrillic letters, kana, etc.; and depending on how you define a word, you will eventually exhaust all of the possible combinations thereof.

That's why the caveat that you are not restricted to English is important. Some languages, of which German is the most widely known, let you form compound words of arbitrary length, so the maximum number of words restriction never comes into play. Or, as Another Matt points out, you can use the loophole that a word processing program considers a string of digits to be a single word, which means that any finite number can be described by a single word.
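The counting step behind this argument can be sketched numerically. The alphabet size and length cap below are illustrative assumptions, not values from the original problem:

```python
# Sketch of the pigeonhole argument in the comments above: over a
# finite alphabet, there are only finitely many descriptions of
# bounded length, so they cannot cover all natural numbers.

def max_describable(alphabet_size: int, max_length: int) -> int:
    """Count distinct strings of length 1 through max_length."""
    return sum(alphabet_size ** i for i in range(1, max_length + 1))

# Even a generous bound yields a finite total, so almost all natural
# numbers are left without a short description.
count = max_describable(26, 5)
print(count)  # → 12356630
```

However large you make the alphabet or the length bound, the total stays finite, which is why compound words of *arbitrary* length are the only real escape hatch.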

By Eric Lund (not verified) on 03 Dec 2015 #permalink

You have not actually 'unambiguously described' the number.

'n' is a symbol that stands for that number, but it is not itself that number and is, in point of fact, purposely ambiguous.