User blog comment:BlauesWasser/Why Zero Shouldn't be considered a cardinal/@comment-30754445-20180501091716/@comment-30754445-20180508180322

I understand the philosophy and actually agree with it.

What I don't understand is why you think it renders the definition of 1={0}, 2={0,1},... to be less meaningful. I actually think that the reverse is true (and will hereby explain why).

First of all, here's a crucial point: any method used to define "equality" between isomorphic structures cannot be based on equality of ordinary sets. There's no way to encode the concept of "all structures isomorphic to X" as a set: that collection is a proper class, and treating it as an ordinary set leads to Russell-style paradoxes. So something else is needed. A new conceptual framework.

And it seems to me that this new conceptual framework (type theory?) just formalizes and enforces the intuitive meaning of statements like 1={0}.

As long as we are limited to classical set theory alone, n+1={0,...,n} is technically nothing more than a simple equality of sets. It's just one possible model of the natural numbers. We could use any other isomorphic structure of sets, and it would be equally valid.
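To make the definition concrete, here is a small sketch (my own illustration, not from the discussion above) of the von Neumann encoding 0=∅, n+1={0,...,n}, with Python frozensets standing in for sets:

```python
# Von Neumann naturals as hereditarily finite sets (illustration only).

def zero():
    return frozenset()           # 0 = ∅

def succ(n):
    return n | {n}               # n+1 = n ∪ {n} = {0, ..., n}

# Build the numerals 0, 1, 2, 3.
nums = [zero()]
for _ in range(3):
    nums.append(succ(nums[-1]))

assert nums[2] == frozenset({nums[0], nums[1]})   # 2 = {0, 1}
assert all(len(nums[k]) == k for k in range(4))   # the numeral n has n elements
```

The last check is the nice side effect of this particular model: each numeral n literally has n elements, so it doubles as its own cardinal.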

As you said yourself, this technically gets us into trouble, because we are treating different things as equal. The only reason it sort of works is that people have a strong intuitive notion that "isomorphic structures represent the same thing". So we don't really care which model we use to describe the natural numbers (or the rationals, or the reals), as long as the description is true to form. This is why such definitions, while technically lacking, are quite helpful and useful.
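The "different things treated as equal" point can be shown with two standard set-theoretic models of the naturals (my own illustration; both encodings are classical, but the code is just a sketch):

```python
# Two models of the naturals that differ as sets, yet are isomorphic as structures.

def von_neumann(n):
    # 0 = ∅, n+1 = n ∪ {n}, so n = {0, ..., n-1}
    s = frozenset()
    for _ in range(n):
        s = s | {s}
    return s

def zermelo(n):
    # 0 = ∅, n+1 = {n}, so n is n nested singleton braces around ∅
    s = frozenset()
    for _ in range(n):
        s = frozenset([s])
    return s

assert von_neumann(1) == zermelo(1)   # the encodings agree at 0 and 1...
assert von_neumann(2) != zermelo(2)   # ...but diverge as sets from 2 onward
```

Inside plain set theory, "2" is literally a different set in each model, and only our intuition about isomorphism tells us both are equally good.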

But now, imagine that we have a logical framework that actually formalizes this intuitive concept: one that can say, in technical terms, that isomorphic structures are equal in some way. That, right there, solves the problem. Now we can safely define any concept by a single model, and it doesn't matter which model we pick! After all, the models are all "equal" in a provable, logical way! So when I write 0=∅ and n+1={0,...,n}, it can actually serve as a 100% formal definition of the concept of the natural numbers.
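What "equal in a provable way" amounts to can be sketched concretely: an explicit isomorphism between two models that commutes with the structure (here, the successor map). The names vn_*/z_* are my own labels for the von Neumann and Zermelo encodings, not notation from the discussion above:

```python
# An explicit structure-preserving isomorphism between two models of ℕ.

def vn_succ(n):                  # von Neumann successor: n+1 = n ∪ {n}
    return n | {n}

def z_succ(n):                   # Zermelo successor: n+1 = {n}
    return frozenset([n])

def vn_to_z(n):
    # The isomorphism: a von Neumann numeral has exactly n elements,
    # so translate it into n nested Zermelo braces.
    z = frozenset()
    for _ in range(len(n)):
        z = z_succ(z)
    return z

# Structure is preserved: translating then taking a successor equals
# taking a successor then translating (a commuting square).
v = frozenset()                  # 0 in both models
for _ in range(5):
    assert vn_to_z(vn_succ(v)) == z_succ(vn_to_z(v))
    v = vn_succ(v)
```

A framework like homotopy type theory, via the univalence axiom, upgrades exactly this kind of isomorphism into a genuine equality, so any statement proved about one model transfers to the other automatically.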