Via StumbleUpon, I came across this short text page, which lists three mathematical ‘proofs’ that seem to violate common sense; I reproduce them below. The first is:
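A classic version of this first argument, consistent with the discussion that follows, starts from the assumption that $a = b$:

```latex
\begin{align*}
a &= b \\
a^2 &= ab \\
a^2 - b^2 &= ab - b^2 \\
(a+b)(a-b) &= b(a-b) \\
a + b &= b \\
2b &= b \\
2 &= 1
\end{align*}
```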
The second one is:
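A classic fallacy that fits the discussion below, and the Indiana reference in particular, purports to prove that $\pi = 3$:

```latex
\begin{align*}
x &= \frac{\pi + 3}{2} \\
2x &= \pi + 3 \\
2x(\pi - 3) &= (\pi + 3)(\pi - 3) \\
2\pi x - 6x &= \pi^2 - 9 \\
9 - 6x + x^2 &= \pi^2 - 2\pi x + x^2 \\
(3 - x)^2 &= (\pi - x)^2 \\
3 - x &= \pi - x \\
3 &= \pi
\end{align*}
```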
The third one is:
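The standard version of this complex-number fallacy purports to prove that $1 = -1$ via the rule $\sqrt{ab} = \sqrt{a}\sqrt{b}$:

```latex
-1 = i \cdot i = \sqrt{-1} \cdot \sqrt{-1} = \sqrt{(-1)(-1)} = \sqrt{1} = 1
```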
Each of these proofs is (intentionally) wrong! They highlight classic fallacies in mathematical thinking. See if you can figure out where, in each of them, the proof goes wrong, and then look for the answers below the fold…
(Note: the third proof involves the ‘imaginary number’ $i = \sqrt{-1}$. If you’re not familiar with it, you can safely skip that problem, as it is closely related to one of the others.)
The mistake in the first proof is highlighted in red below:
The mistake lies in the assumption that we can divide by $a - b$. Why is this a problem? Because, according to the equation we started with, $a = b$, which means $a - b = 0$! We are not allowed to divide by zero, and therefore the result highlighted in red is invalid.
The algebra ‘muddies the waters’ significantly, and it can be helpful to see the same proof with $a = b = 1$. The first two steps suggest that $1 = 1$. The third step suggests that $0 = 0$. The fourth step suggests that $2 \cdot 0 = 1 \cdot 0$. We run into problems in the next step, in trying to divide both sides of this equation by zero!
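The substitution can be checked mechanically. Here is a sketch in Python, assuming the classic form of the proof (starting from $a = b$ and dividing by $a - b$ at the final step):

```python
# Each step of the 'proof' checked with a = b = 1 (assumed values).
a = b = 1
assert a == b                        # step 1: 1 = 1
assert a**2 == a * b                 # step 2: 1 = 1
assert a**2 - b**2 == a*b - b**2     # step 3: 0 = 0
assert (a + b)*(a - b) == b*(a - b)  # step 4: 2*0 = 1*0

# The next step divides both sides by (a - b), i.e., by zero:
try:
    (a + b) * (a - b) / (a - b)
except ZeroDivisionError:
    print("invalid step: division by zero")  # the fallacy is exposed here
```

Every step holds as an identity right up until the division, which Python refuses to perform.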
The next ‘proof’ is a little more subtle. Again, we highlight the problem step in red:
The problem here is that, in order to get to the step in red, an implicit square root is taken of both sides of the equation. However, any square has potentially two roots; e.g. $x^2 = 4$ has the possible solutions $x = 2$ or $x = -2$. The ‘proof’ above assumes that the positive root is the correct one, which leads to the erroneous final answer. If one takes the negative root instead, one finds an equation that leads right back to the starting equation. This proof, incidentally, might have been helpful to legislators in Indiana back in the late 1800s.
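In the classic $\pi = 3$ version of this fallacy (assumed here), the square-root step and its two possible roots read:

```latex
\begin{align*}
(3 - x)^2 &= (\pi - x)^2 \\
3 - x &= \pm(\pi - x)
\end{align*}
```

The positive root gives $3 = \pi$; the negative root gives $3 - x = x - \pi$, i.e. $2x = \pi + 3$, which is just the starting equation.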
The third ‘proof’ suffers from essentially the same problem as the second, except that complex numbers are involved:
The step highlighted in red assumes that we can break up the square root as $\sqrt{ab} = \sqrt{a}\sqrt{b}$, and then take individually the positive root of each term. It is wrong, however, to assume that the positive roots are always the correct ones when dealing with the square root of an equation. If we have an equation of the form

$x^2 = y^2,$

we can at best say that $x = y$ or $x = -y$, and not necessarily both are true. Nothing changes if we take the square root of an equation of the form

$x^2 = ab,$

in which case we can at best conclude that

$x = \sqrt{a}\sqrt{b}$ or $x = -\sqrt{a}\sqrt{b}$.

For our ‘proof’ above, we find that the statement in red should be replaced with “$\sqrt{(-1)(-1)} = \sqrt{-1}\sqrt{-1}$ or $\sqrt{(-1)(-1)} = -\sqrt{-1}\sqrt{-1}$, but not necessarily both.”
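Applied to the classic $1 = -1$ version of this fallacy, the negative root turns out to be the valid one:

```latex
\sqrt{(-1)(-1)} = \sqrt{1} = 1 = -(i \cdot i) = -\sqrt{-1}\,\sqrt{-1}
```

Choosing the positive root instead is exactly what produces the contradiction.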
There’s a nice classic book that covers such mathematical errors, along with other, more significant mathematical paradoxes: Bryan Bunch, Mathematical Fallacies and Paradoxes (Dover, New York, 1997).
If you really want to be certain that you understand such mistakes, try writing your own ‘proof’ of an equally absurd result! Feel free to post yours in the comments. I’ll add my own ‘proof’ at a later date!