The best explanation of this I've heard is: "can you think of a number strictly between 0.999… and 1?"
Any candidate would have to exceed 0.999… by an amount written with an infinite number of zeroes before any significant digit, and such a number is equal to 0; in particular, 1 - 0.999… itself is 0.
Basically it boils down to the identity principle:
[eqn]A - B = 0 \Longrightarrow A = B[/eqn]
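To make the "infinite zeroes" step precise, here's a sketch of the limit argument (0.\overline{9} is my shorthand for the repeating decimal): cutting off after n nines leaves a gap of exactly 10^{-n}, which vanishes as n grows, and the identity principle above does the rest.
[eqn]1 - \underbrace{0.99\ldots9}_{n\ \text{nines}} = 10^{-n} \quad\Longrightarrow\quad 1 - 0.\overline{9} = \lim_{n \to \infty} 10^{-n} = 0 \quad\Longrightarrow\quad 0.\overline{9} = 1[/eqn]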
There are fancier proofs too, but I think this one is the simplest.