I wonder where dividing by zero becomes useful.
It doesn't. I was completely wasting my time ._.
That reminds me: can someone who knows this stuff tell me why, given that dividing by zero is undefined, we don't do the same thing as with x*x = 2 and x*x = -1, and create a new kind of number for it? Or are there such numbers already defined and I just haven't heard of them?
This asshole thought he did. The reason why you don't is because you completely break all of mathematics, and anything you get from dividing by zero is effectively useless. Essentially, all numbers become indistinguishable from zero, which completely breaks anything that uses numbers/identity elements in the first place. That's Algebra and Analysis completely dead, number theory, anything that uses metrics (functions that compute distance/length)... i.e., pretty much everything but a few slivers of topology and geometry.
Mathematics using numbers can come close to, but not define, true math. The system is flawed by the constraints of the language, so you get things like dividing by zero, since a numerical representation of what's going on isn't the full picture.
It's a theory of mine: representations such as numbers and the base 9 (or is it base 10?) system cannot fully describe everything going on in mathematics.
So, the truth is that there's lots of math going on where we use no numbers. Another truth is that the base 10 system is just a tool we use to make writing numbers easier, because numbers are actually defined like this:
empty set (set with no elements) = ∅
0 = ∅
1 = {∅} (the set containing zero)
2 = {∅, {∅}} (the set containing zero and one)
3 = {∅, {∅}, {∅, {∅}}} (the set containing zero, one, and two)
and so on.
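The construction above (the von Neumann definition, where each number is the set of all smaller numbers) can be sketched directly in code. This is just an illustration using Python's frozenset as a stand-in for mathematical sets; the names here are my own.

```python
# Von Neumann naturals: each number is the set of all smaller numbers.

def succ(n):
    """Successor: n + 1 is defined as n ∪ {n}."""
    return n | frozenset({n})

zero = frozenset()    # 0 = ∅
one = succ(zero)      # 1 = {∅}
two = succ(one)       # 2 = {∅, {∅}}
three = succ(two)     # 3 = {∅, {∅}, {∅, {∅}}}

# Because n contains exactly the numbers 0..n-1, its size equals its value,
# and "m < n" is literally "m is an element of n":
assert len(three) == 3
assert two in three and zero in three
```

A nice side effect of this definition is that comparison comes for free: m < n is just set membership, and m ≤ n is set inclusion.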
We don't exactly get things like dividing by zero because of constraints of language--it's fairly well-defined, because we can look at limits approaching zero and know what kind of behavior will result (explosions towards infinity >_>). It's true, though, that there's probably scads of things that are going on that we don't understand because of our reliance on notation.
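The "explosion towards infinity" above is easy to see numerically. A quick sketch (using powers of 2 so the floating-point values are exact):

```python
# As x shrinks toward 0, 1/x grows without bound.
for k in [1, 4, 10, 20]:
    x = 2.0 ** -k              # exactly representable in binary floating point
    print(f"1/{x} = {1 / x}")  # 2.0, 16.0, 1024.0, 1048576.0

# The direction of the approach matters: from the left, the quotient
# explodes toward minus infinity instead, which is one reason no single
# value can sensibly be assigned to 1/0.
print(1 / -(2.0 ** -20))       # -1048576.0
```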
Oh. Yeah, the "new kinds of numbers" are irrational numbers and imaginary numbers, and they're usually taught in basic algebra...?
Those are the common expansions, yeah, but generally anything that isn't a natural number is considered a "new kind of number." Integers -> Rationals -> Reals -> Complex, and so on. I think they're first introduced in basic algebra, but every college math class I've had re-introduced them.
If you want more interesting numbers, quaternions and transcendentals are really, really cool.
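Since quaternions came up: they work like the ordered-pair trick but with four components, w + x*i + y*j + z*k, where i² = j² = k² = ijk = -1. A sketch (names and layout are my own) showing the famous catch, that multiplication stops being commutative:

```python
# Quaternion multiplication on 4-tuples (w, x, y, z) = w + x*i + y*j + z*k.
def qmul(p, q):
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)

print(qmul(i, j))   # (0, 0, 0, 1)  = k
print(qmul(j, i))   # (0, 0, 0, -1) = -k, so i*j != j*i
```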
Imaginary numbers came from another axiom being added to mathematics that defined i. As far as I know, calculus deals with all applications in which division by zero would be useful, without actually needing to divide by zero, meaning that, as Footkerchief said, there is no real reason to define it.
Imaginary numbers actually just come from an application of ordered pairs and an unusual multiplication system applied to them. It's a clever definition, no more, no less.
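To make the ordered-pair definition concrete: a complex number is just a pair (a, b) of reals, with multiplication defined as (a, b)·(c, d) = (ac - bd, ad + bc). A minimal sketch (the helper name is my own):

```python
# Complex multiplication on ordered pairs: (a, b)*(c, d) = (ac - bd, ad + bc).
def cmul(p, q):
    a, b = p
    c, d = q
    return (a * c - b * d, a * d + b * c)

i = (0.0, 1.0)        # the pair that plays the role of i
print(cmul(i, i))     # (-1.0, 0.0), i.e. i*i = -1 falls out of the definition
```

Nothing mystical happens: "i squared is minus one" is just what this multiplication rule does to the pair (0, 1).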
To expand on your thoughts a little, Analysis is a branch of mathematics that deals extensively with the use of limits, and so on. Some of those limits will involve a sequence which tends to zero, which may lead to what looks like the quotient 0/0 (but isn't... quite). Since calculus is one of the many applications of Analysis, you're partially correct, but there's a lot more out there.
For a popularization of Gödel's incompleteness theorem, you should read
Gödel, Escher, Bach by Douglas Hofstadter. It's a really, really good book, and I recommend it thoroughly to anyone.