Division by Zero
In mathematics, division by zero is not allowed, because defining it would contradict the other rules of arithmetic. But what exactly goes wrong when we try to divide by zero?
Division of the form \(\frac{a}{0}\), \(a\neq 0\)
Division by zero is any division whose divisor (denominator) is zero, that is, an expression of the form \(\frac{a}{0}\). Suppose the result of such a division were some number \(x\), and assume \(a\neq 0\).
\[x=\frac{a}{0}\]
Since division is the inverse of multiplication,
\[x\times 0=a\]
We know from the rules of multiplication that any number multiplied by zero is zero.
\[x\times 0=0=a\]
This contradicts our assumption that \(a\neq 0\). The contradiction tells us that no value of \(x\) can satisfy the equation, so division by zero is said to be undefined.
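The same contradiction is visible in programming languages, which refuse to evaluate the expression rather than return a value. A minimal Python sketch:

```python
# Dividing by zero raises an error rather than returning a value,
# mirroring the fact that no x can satisfy x * 0 = a.
a = 5

try:
    x = a / 0
except ZeroDivisionError as error:
    print(f"a / 0 is undefined: {error}")  # prints "division by zero"
```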
To understand this more intuitively, let's look at the concept of division in elementary arithmetic, where division means splitting a set of objects into equal parts.
For example, if we have \(15\) apples and want to distribute them evenly among \(3\) people, then by the definition of division each person receives \(\frac{15}{3}=5\) apples.
By the same logic, \(\frac{a}{0}\) means distributing \(a\) apples equally among \(0\) people. That is meaningless: there is no logical way to distribute a set of objects among \(0\) people, so the operation is undefined.
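The same breakdown can be seen computationally if we treat division as repeated subtraction: \(\frac{a}{b}\) asks how many times \(b\) can be subtracted from \(a\) before reaching zero. A sketch under that assumption (the function name and the step cap are just illustrative, the cap exists only so the non-terminating case can be observed):

```python
def divide_by_repeated_subtraction(a, b, max_steps=1_000_000):
    """Count how many times b can be subtracted from a.

    For b = 0 the remainder never shrinks, so the loop would run
    forever; the step cap exists only to demonstrate that.
    """
    count = 0
    remainder = a
    while remainder >= b and remainder > 0:
        if count >= max_steps:
            raise RuntimeError("no progress: subtracting 0 never terminates")
        remainder -= b
        count += 1
    return count

print(divide_by_repeated_subtraction(15, 3))  # -> 5
# divide_by_repeated_subtraction(15, 0)       # -> RuntimeError
```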
Another way to understand division by zero is to let the denominator get closer and closer to zero, instead of dividing by zero outright, and extrapolate from the results. For example, let \(a\) equal \(1\) and see how the value changes as the denominator approaches zero.
\[\frac{1}{0.1}=10\]
\[\frac{1}{0.01}=100\]
\[\frac{1}{0.001}=1000\]
\[\frac{1}{0.0001}=10000\]
\[\vdots\]
We can see that as the denominator gets closer and closer to zero, the value grows without bound, so we might be tempted to say that \(\frac{1}{0}=+\infty\). But we have only considered denominators approaching zero from the positive side; it is equally valid to approach from the negative side.
\[\frac{1}{-0.1}=-10\]
\[\frac{1}{-0.01}=-100\]
\[\frac{1}{-0.001}=-1000\]
\[\frac{1}{-0.0001}=-10000\]
\[\vdots\]
As the denominator approaches zero from the negative side, the value tends to \(-\infty\). Besides the fact that infinity is not a number, the quotient approaches two different limits depending on the direction of approach. Because the two one-sided limits disagree, \(\lim_{x\to 0}\frac{1}{x}\) does not exist, and no single value can consistently be assigned to \(\frac{1}{0}\).
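This two-sided divergence is easy to reproduce numerically. A small Python sketch that shrinks the denominator from both sides:

```python
# Let the denominator shrink toward zero from both sides;
# 1/x diverges to +infinity on one side and -infinity on the other.
for exponent in range(1, 6):
    x = 10.0 ** -exponent  # 0.1, 0.01, 0.001, ...
    print(f"1/{x} = {1 / x:.0f}    1/{-x} = {1 / -x:.0f}")
```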
Division of the form \(\frac{0}{0}\)
Suppose the value of \(\frac{0}{0}\) is some number \(x\).
\[x=\frac{0}{0}\]
Again using the fact that division is the inverse of multiplication,
\[x\times 0 = 0\]
The left side of the equation equals zero for any value of \(x\), so the equation is a true statement no matter what \(x\) is. Since there is no way to determine a unique value for \(x\), the expression \(\frac{0}{0}\) is said to be indeterminate.
In elementary arithmetic, \(\frac{0}{0}\) means distributing \(0\) apples equally among \(0\) people. Such a distribution is possible, but only vacuously: with no people to receive apples, we can claim that each (nonexistent) person receives any number of apples, so no single answer is forced.
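Floating-point arithmetic draws the same distinction: IEEE 754 reserves the special value NaN ("not a number") for indeterminate forms like \(0/0\), while signed infinities stand in for the one-sided limits of \(a/0\). Plain Python raises an error in both cases, but NumPy follows IEEE 754, as this sketch shows (assuming NumPy is installed):

```python
import numpy as np

# Suppress the warnings NumPy emits for these operations.
with np.errstate(divide="ignore", invalid="ignore"):
    print(np.float64(1.0) / np.float64(0.0))   # inf  (a/0 with a > 0)
    print(np.float64(-1.0) / np.float64(0.0))  # -inf (a/0 with a < 0)
    print(np.float64(0.0) / np.float64(0.0))   # nan  (0/0 is indeterminate)
```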
Does \(2=1\)?
Consider the following "proof" that \(2=1\):
Assume \(X=Y\):
\[X=Y\]
Multiply both sides by \(X\):
\[X^{2}=XY\]
Subtract \(Y^{2}\) from both sides:
\[X^{2}-Y^{2}=XY-Y^{2}\]
Factor both sides:
\[(X+Y)(X-Y)=Y(X-Y)\]
Divide both sides by \((X-Y)\):
\[X+Y=Y\]
Since \(X=Y\):
\[Y+Y=Y\]
\[2Y=Y\]
\[2=1\]
The false statement appears when we divide both sides by \(X-Y\). Since \(X=Y\), we have \(X-Y=0\), so that step divides both sides by zero. This is exactly where the "proof" breaks down, confirming that division by zero is not a valid operation.
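We can watch the proof fail with concrete numbers. A sketch that checks each step for \(X = Y = 3\):

```python
X = Y = 3

# Every step up to the factored form is a true statement...
assert X == Y
assert X**2 == X * Y
assert X**2 - Y**2 == X * Y - Y**2
assert (X + Y) * (X - Y) == Y * (X - Y)  # both sides equal 0

# ...but cancelling (X - Y) divides by zero, and the conclusion is false:
print(X - Y)        # 0
print(X + Y == Y)   # False: 6 != 3
```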