A nice "proof" that 1 + 1 = 0 (from one of Martin Gardner's books) goes like this: we begin with -1 = -1, we rewrite that as -1/1 = 1/-1, and then we square-root both sides so that, since (a/b) squared equals a squared over b squared, we obtain i/1 = 1/i (where i is the square-root of -1). But then multiplying both sides by i would yield i squared = 1, or -1 = 1, whence 1 + 1 = 0.
......That "proof" is fallacious (and hence it is no reason to outlaw square-roots, for example) because every non-zero number has 2 square-roots (e.g. +1 and -1 both square to 1, while +i and -i both square to -1), so that, in particular, i/1 = 1/(-i). Nonetheless it is fairly compelling, because when square-rooting -1/1 = 1/-1 we could easily assume that both instances of the square-root of 1, and both instances of the square-root of -1, would take the same sign.
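The failing step can be seen directly in Python's `cmath` module (a sketch: `cmath.sqrt` returns only the principal square-root, the one with non-negative real part), where the assumed identity "the square-root of a/b equals the square-root of a over the square-root of b" breaks down for negative numbers:

```python
import cmath

# The fallacy assumes sqrt(a/b) == sqrt(a)/sqrt(b).  With the principal
# branch of the complex square-root, that identity can fail once
# negative numbers are involved.
lhs = cmath.sqrt(-1 / 1)              # sqrt(-1) = i
rhs = cmath.sqrt(1) / cmath.sqrt(-1)  # 1/i = -i

print(lhs)         # 1j
print(rhs == -1j)  # True: the two sides landed on different square-roots
print(lhs == rhs)  # False
```

Each side of the equation has silently picked a different one of the two available square-roots, which is exactly the sign-choice the "proof" glosses over.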
......Furthermore, although when we solve quadratics, for example, we give 2 solutions as a matter of course (arising from the square-root sign in the familiar formula, it being noteworthy when they are equal), we may nonetheless lose the habit of thinking of, for example, -2 when square-rooting 4. Maybe we lose that habit because we usually use (the very useful) functions, which are one-to-one (e.g. taking the non-negative square-root), rather than multifunctions, which are one-to-many (e.g. taking all the roots).
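That distinction between a function and a multifunction can be sketched in Python (the helper `square_roots` is hypothetical, introduced here only for illustration): `cmath.sqrt` is a one-to-one function returning the single principal root, while a multifunction would hand back the full set of roots.

```python
import cmath

def square_roots(z):
    """Return BOTH square-roots of z, as a multifunction would."""
    r = cmath.sqrt(complex(z))  # the principal root only
    return (r, -r)              # the other root is its negation

print(square_roots(4))   # both 2 and -2, not just the principal root 2
print(square_roots(-1))  # both i and -i
```

The convenience of the one-to-one version is that it composes like any other function; the cost is precisely the forgotten -2.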
......So note that the use of functions is only a matter of convenience: it is not that 4 really does have only the one square-root. That fact is worth noting because, although some will rightly say that 1/0 is (usually) undefined and that 0/0 is an indeterminate form (many numbers yielding 0 when multiplied by 0), others will say, less accurately, that division by 0 is impossible, and even that 0/0 is nonsense.
......The usual "proof" that division by 0 is impossible goes something like this: 0 equals 0, so 0 times 1 (which is just 0) equals 0 times -1 (which is also 0); but if we could divide by 0 we could cancel out those zeroes and so obtain 1 = -1 (whence 1 + 1 = 0). Note, however, that we would only obtain that contradiction if dividing zero by zero gave us, not an indeterminate form (such as all the finite numbers, since zero times any of those is zero), but 1; and why should 0/0 equal 1?
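A small numerical sketch of why 0/0 is called an indeterminate form: as x shrinks toward 0, ratios whose numerator and denominator both tend to 0 can tend to any value at all, so no single number is forced on us.

```python
# Three ratios whose top and bottom both tend to 0 as x -> 0,
# yet whose limits are 1, 5, and 0 respectively.
x = 1e-8
print(x / x)        # 1.0          (x/x -> 1)
print((5 * x) / x)  # about 5      (5x/x -> 5)
print(x**2 / x)     # about 1e-08  (x^2/x -> 0)
```

Replacing 5 with any finite number gives a route to "0 divided by 0" with that number as its limit, which is exactly what "indeterminate" means here.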
......I can think of only 2 remotely plausible answers, neither of which is very compelling. Firstly, we might extrapolate, to the case of a = 0, from a/a = 1 for all non-zero numbers. That is not very compelling because such extrapolations are notoriously unreliable: think of a to the power of 0, which equals 1 for all positive a, and of 0 to the power of a, which equals 0 for all positive a; those two rules pull in opposite directions at a = 0.
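That clash between the two extrapolations can be seen directly (a sketch; the final line shows the convention that Python, like many systems, happens to adopt for the contested case):

```python
# Two equally natural extrapolations that collide at 0**0:
# a**0 == 1 for all positive a, while 0**a == 0 for all positive a.
a = 1e-9
print(a ** 0)  # 1.0  (suggests 0**0 "should" be 1)
print(0 ** a)  # 0.0  (suggests 0**0 "should" be 0)

# Python settles the clash by convention, not by derivation:
print(0 ** 0)  # 1
```

Neither extrapolation is wrong on its own domain; they simply disagree at the boundary, which is why extrapolating a/a = 1 down to a = 0 carries so little force.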
......Secondly, since 'division by x' means 'multiplication by the multiplicative inverse of x' within number fields, and since the multiplicative inverse of x is whatever yields 1 when multiplied by x, 0/0 should, if allowed, equal 1. But that would only follow if division by 0 were allowed within number fields; and it is certainly not allowed within fields!
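The standard field-theoretic reason why 0 can have no multiplicative inverse (so that 0/0 is simply undefined there, rather than equal to 1) is a two-line consequence of the distributive law, sketched here:

```latex
\[
0 \cdot x \;=\; (0 + 0) \cdot x \;=\; 0 \cdot x + 0 \cdot x
\quad\Longrightarrow\quad 0 \cdot x = 0
\quad \text{for every } x .
\]
\[
\text{So if some } x \text{ satisfied } 0 \cdot x = 1 \text{, then } 1 = 0,
\text{ contradicting the field axiom } 1 \neq 0 .
\]
```

In other words, within a field the inverse of 0 would have to square the circle of being both 0 and 1 at once.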
......Nonetheless, division by 0 is allowed within number pitches, which contain number fields in an algebraically strong, and maybe even a physically applicable, way (and which were defined in my 2005 paper).