Calculators also say that dividing by 0 is an error, but logic says the answer is infinite. (If I recall correctly, it's more properly 'undefined', but I'm years out of math classes now.)
That is, as you divide a number by a smaller and smaller number, the quotient increases: 1/.1 = 10, 1/.01 = 100, 1/.001 = 1000, etc. As the denominator approaches 0, the quotient approaches infinity. But infinity isn't a number you can pin down, which is why you get an undefined error rather than an actual value.
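In limit notation, here's a rough sketch of that argument, looking only at denominators approaching 0 from the positive side:

```latex
% Approaching 0 from the positive side: the quotient grows without bound.
\[
  \frac{1}{0.1} = 10, \qquad
  \frac{1}{0.01} = 100, \qquad
  \frac{1}{0.001} = 1000, \qquad \ldots \qquad
  \lim_{x \to 0^{+}} \frac{1}{x} = +\infty
\]
```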
If an actual mathematician wants to explain this correctly, I'm all ears.
It approaches positive or negative infinity, depending on which side the denominator approaches zero from. The result can't be two different values at once, so dividing by zero can't be defined.
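A sketch of that two-sided argument in limit notation:

```latex
% The two one-sided limits disagree, so the two-sided limit
% (and hence 1/0) cannot be assigned a single value.
\[
  \lim_{x \to 0^{+}} \frac{1}{x} = +\infty,
  \qquad
  \lim_{x \to 0^{-}} \frac{1}{x} = -\infty,
  \qquad
  \text{so } \lim_{x \to 0} \frac{1}{x} \text{ does not exist.}
\]
```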
There are other reasons, too, but I've forgotten what they are.