When I first made the transition from studying as a full-time pursuit to a part-time guilty pleasure, there were a lot of things I missed – being able to study in daylight hours, for one. Focusing on the positives, however: something that was, and still is, very refreshing is the amount of freedom. When you stop studying because you need a qualification to get on in life, and start studying because you want to, you can relax, change direction a thousand times and take as long as you like to look at things. A curriculum is an experience designed to take you from a to b in a specified length of time, without giving you the luxury of taking tangents.
One such tangent is developing a rigorous and robust understanding of why dividing by zero does not work. I am sure many people know this – but a surprising number will not. A lot of people will tell you it does not exist – true, but why? Others will say it is just infinity – fine, but can you define it using a limit and infinity? No, of course you can't.
Division is first explained to us as sharing some quantity of objects among some number of people. So 15 shared among 3 gives five each. The question of how many items each person receives when fifteen items are shared among no people has no real meaning. Who receives the items? Everyone? No one? There is no answer in this elementary description of division – so we may conclude that the operation is simply undefined. Loosely, this is right, but it does not really give a complete description of division.
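The sharing picture can be made concrete. Here is a minimal sketch (the function name and step cap are my own, purely illustrative) of division as repeated subtraction: b items are dealt out per step until nothing is left, and the count of steps is the quotient. With b = 0 nothing is ever removed, so the process can never finish:

```python
def divide_by_subtraction(a, b, max_steps=1_000):
    """Count how many times b can be removed from a.

    This is the 'sharing' model of division, for non-negative
    integers where b divides a exactly.
    """
    count = 0
    while a > 0:
        if count >= max_steps:
            return None  # b = 0 removes nothing per step: we would loop forever
        a -= b
        count += 1
    return count

print(divide_by_subtraction(15, 3))  # 5
print(divide_by_subtraction(15, 0))  # None: no finite number of steps works
```

The cap on steps is only there so the b = 0 case terminates at all; mathematically, the process simply has no answer.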
So why can't we just attack the problem with limits? The argument would set up something like this, where a is a positive real number and b approaches zero from the right:

\[ \lim_{b \to 0^{+}} \frac{a}{b} = +\infty \]

When we examine the limit from the left instead, we get the following expression:

\[ \lim_{b \to 0^{-}} \frac{a}{b} = -\infty \]

So what happens when we combine these two to calculate the limit as b tends to zero? The one-sided limits disagree, so

\[ \lim_{b \to 0} \frac{a}{b} \text{ does not exist.} \]
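This failure is easy to see numerically. A quick sketch (taking a = 1 as an illustrative choice): shrinking b through positive values sends a/b up without bound, while shrinking it through negative values sends a/b down without bound, so no single value can serve as the limit:

```python
# Probe a/b as b approaches 0 from the right (positive b) and from the
# left (negative b), using a = 1 for illustration.
for b in [0.1, 0.001, 0.00001]:
    print(f"1/{b} = {1 / b:>10.0f}    1/{-b} = {1 / -b:>10.0f}")
```

The two columns race off towards +∞ and −∞ respectively, which is exactly why the two-sided limit cannot exist.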
So it is wrong to define a/0 as infinity, as "approaching infinity", or anything of the sort: the limit does not exist. You are only correct if you say that division by zero is undefined, which we have just shown using calculus. You can also show this using algebraic rings, or via the multiplicative inverse. But the most basic, and most amusing, demonstration of why this monstrosity cannot exist is the fallacy we would create.
Doing some really basic maths:
0 x 20 = 0
0 x 5 = 0
In this case, the following must also be true:
0 x 20 = 0 x 5.
In a world where we accept division by zero, we can divide both sides by zero and cancel, treating 0/0 as 1:

0/0 x 20 = 0/0 x 5,

20 = 5.
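Programming languages bake this refusal in. A small sketch of the same fallacy in Python: the moment we try to "cancel" the zeros by dividing both sides by zero, the interpreter raises an error rather than letting us conclude that 20 = 5:

```python
lhs, rhs = 0 * 20, 0 * 5
print(lhs == rhs)  # True: 0 = 0, nothing wrong yet

try:
    print(lhs / 0 == rhs / 0)  # the step that would "prove" 20 = 5
except ZeroDivisionError as err:
    print("Refused:", err)  # Refused: division by zero
```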
This is no world I want to live in.
Let me know if you want to look into other ways we can demonstrate that a/0, and indeed 0/0, is undefined – there are many of them, and this fact is fundamental to mathematics as we know it. There are areas of mathematics (such as matrices) where analogous operations are defined or pseudo-defined, but these are special cases which in no way violate the discussion above.
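For a flavour of what "pseudo-defined" can look like in practice (an example of my own choosing, not discussed above): IEEE 754 floating point, which most languages use, has special values inf and nan, and arithmetic with them shows exactly why infinity cannot be treated as an ordinary number:

```python
import math

inf = math.inf
print(inf - inf)  # nan: if a/0 were "infinity", subtraction would break
print(0 * inf)    # nan: "zero times infinity" has no consistent value
print(math.isnan(inf - inf))  # True
```

Floating point gets away with this only because nan is an explicit marker for "no meaningful answer" – which is, in effect, the standard agreeing that the operation is undefined.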