Okay, here's an actual explanation of why 0.999... = 1 (at least in the real numbers):
Real numbers can be formally defined as the completion of the rational numbers, which can, for example, be constructed by quotienting the ring of rational Cauchy sequences by its maximal ideal of null sequences. A bit technical, but I'll try to explain it in common words.
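In symbols, this construction is often summarized like this (a sketch in LaTeX; the names C and N are just labels I'm introducing for the two sets from the sentence above):

```latex
% C: ring of rational Cauchy sequences under termwise + and *,
% N: its ideal of null sequences (rational sequences converging to 0).
% N is a maximal ideal, so the quotient ring is a field:
\[
\mathbb{R} \;:=\; C / N
\]
```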
A rational Cauchy sequence is basically any sequence of rational numbers which you can make arbitrarily narrow just by cutting off enough of the start of the sequence. To define the real numbers, we declare that every rational Cauchy sequence represents (i.e. converges to) a real number, and that two Cauchy sequences converge to the same real number if and only if their termwise difference converges to 0.
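For reference, here are the same two definitions in standard textbook notation (nothing here is specific to my wording above):

```latex
% (a_n) is Cauchy iff its tail can be made arbitrarily narrow:
\[
(a_n)\ \text{is Cauchy} \iff \forall \varepsilon > 0\ \exists N\ \forall m, n \ge N:\ |a_m - a_n| < \varepsilon
\]
% Two Cauchy sequences represent the same real number iff their
% termwise difference is a null sequence:
\[
(a_n) \sim (b_n) \iff \lim_{n \to \infty} (a_n - b_n) = 0
\]
```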
For example, the rational sequence (1, 1, 1, 1, ...) is Cauchy, since it already has no extent and is arbitrarily narrow even without cutting anything off. The rational sequence (0, 0.9, 0.99, 0.999, 0.9999, ...) is also Cauchy, since removing the first n elements makes it 10^(-n) wide, which becomes arbitrarily small for sufficiently large n. Now the termwise difference (1, 0.1, 0.01, 0.001, 0.0001, ...) of these sequences becomes arbitrarily close to 0, so the two sequences converge to the same real number.
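If you want to watch the difference shrink numerically, here's a minimal Python sketch using exact rational arithmetic (the helper names ones and nines are made up for this illustration):

```python
from fractions import Fraction

def ones(n):
    # n-th element of the constant sequence (1, 1, 1, ...)
    return Fraction(1)

def nines(n):
    # n-th element of (0, 0.9, 0.99, 0.999, ...), i.e. 1 - 10^(-n)
    return 1 - Fraction(1, 10**n)

# The termwise difference is exactly 10^(-n), a null sequence,
# so both sequences represent the same real number.
for n in range(6):
    print(n, nines(n), ones(n) - nines(n))
```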
Finally, the real number corresponding to a decimal representation is defined as the limit of the rational Cauchy sequence obtained by cutting off the representation's tail later and later; e.g. 3.14159... is defined as the limit of the rational Cauchy sequence (3, 3.1, 3.14, 3.141, ...). So to answer whether 0.99999... = 1, compare the defining sequences: 0.999... gives (0, 0.9, 0.99, 0.999, ...) and 1 gives (1, 1, 1, ...), which is exactly the example above, so they represent the same real number.
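Written out as a short closed-form computation (just the example above again, nothing new):

```latex
\[
1 - \underbrace{0.99\ldots9}_{n\ \text{nines}} \;=\; 10^{-n} \;\longrightarrow\; 0 \quad (n \to \infty),
\qquad \text{hence} \qquad 0.999\ldots = 1.
\]
```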