Tuesday, October 10, 2006

Wholly Natural

My daughter's math text lists the natural numbers as {1, 2, …} and the whole numbers as {0, 1, 2, …}. I remembered them the other way around; after all, zero seems natural but is a whole lot of nothing. But for her homework she has to use the textbook's terminology lest she get marked wrong.

Afterwards, in my class, I used the phrase "For every natural number n," then stopped myself and asked the class what "natural" and "whole" mean to them. Opinions were diverse, and someone also mentioned something called counting numbers. Various web sources seem to agree for the most part that the whole numbers start with 0, but are less clear on the natural and counting numbers.

For that class, and in most of what I do, the distinction is irrelevant, since we usually only care about "sufficiently large" n. But if you are writing in a context where zero matters, please use the terms "nonnegative integers" and "positive integers" so as not to confuse those of us who can't keep our wholes and naturals straight.

14 comments:

  1. It's a dialect issue.

    If you're a logician, the naturals are the finite cardinals, and if you don't throw in 0 you can't count the number of elements in the empty set.

    If you're a number theorist, the naturals are the positive integers, because otherwise you have to keep writing "except 0" in any statement that depends on unique factorization.

    This is similar to how different branches of mathematics handle 0^0 (obviously equal to 1 for combinatorialists and obviously undefined for analysts). It may be a good thing for students to see ambiguous terminology occasionally so they can learn that definitions are chosen because they are useful and not because they are right. But I imagine that the grade-school curriculum is not really designed to get this point across.
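
    As an aside, programming languages face the same choice and mostly side with the combinatorialists. A quick illustration in Python (the outputs are the language's actual behavior; the snippet itself is just my sketch):

        import math

        # Python's integer exponentiation uses the combinatorial convention.
        print(0 ** 0)              # 1

        # Floating-point pow (following C/IEEE practice) agrees.
        print(math.pow(0.0, 0.0))  # 1.0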

  2. I find it helpful to use the explicit notation \mathbb{N}_0 and \mathbb{N}_+.
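
    For instance (a minimal LaTeX sketch; \mathbb needs the amssymb package):

        % with \usepackage{amssymb} in the preamble
        \[
          \mathbb{N}_0 = \{0, 1, 2, \dots\}, \qquad
          \mathbb{N}_+ = \{1, 2, 3, \dots\}
        \]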

  3. As I was trained in logic and not in number theory, I have always reckoned 0 among the natural numbers, for the reason Jim Aspnes gives. I started teaching theory courses in 1975, and when Papadimitriou and I published our undergraduate text in 1978, that was the convention we followed.

    Every year there is at least one student who complains that 0 is not a natural number, citing some authority, and I now tend to reply that it's my classroom, so I am the only authority that counts for the time being :)

    I have now abandoned our book in favor of Sipser's, which has 0 *not* being a natural number. But in my lectures I still insist that it is, because it seems so much more, well, natural to do so, and because it is helpful to keep reminding students to get in the habit of checking that the n=0 case of almost any statement makes sense. And I am glad now to have the additional rationale that learning to deal with ambiguous notation is educational in itself!

  4. A related problem: some students learn that "0 is neither even nor odd" in grade school, and they still believe it when they come to the university.

    I'm not sure why they're taught this. Maybe because it is confusing to say that 0 objects can be divided into two groups of equal size?
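
    Whatever the reason, the usual definition settles it in one line; a tiny Python check (my own illustration, not anything from a textbook):

        # An integer n is even iff it leaves remainder 0 when divided by 2;
        # 0 = 2 * 0, i.e., zero splits into two equal groups of size zero.
        print(0 % 2 == 0)   # True: 0 is even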

  5. One of my professors said that zero is a natural number for computer scientists, since it is natural to get a zero on a computer science exam.

  6. I'm neither a theoretical computer scientist nor a logician (nor very logical for that matter), so my view is a bit different.

    I think that the only number naturally attributable to physical objects is one (1). The number zero (0) indicates nothingness, and there exists no "no physical object": Physical objects can only naturally exhibit their existence (“this is a brick”). They cannot exhibit their non-existence (“this is a no-brick”).

    The number two (2) involves an abstract coupling of two physical objects (one object and one object), but abstractions are ontologically subjective, that is, abstractions don't exist without intelligent beings (yet the number two is pretty much an epistemologically objective thing).

  7. "abstractions are ontologically subjective, that is, abstractions don't exist without intelligent beings"

    The first part, "abstractions are ontologically subjective," is only true if you are a nominalist. I think it is safe to say that only neo-Platonists read this blog. That would make nominalists themselves ontologically subjective; or, at best, by your definition, one exists!

    The second part is not a restatement of the first, but rather a contradiction. If God is necessary (which He is) and intelligent (which He is), then there exists no world without intelligent beings. Thus abstractions are not ontologically subjective.

    (I am a computer scientist and a logician).

  8. Dear Anonymous logician, I'm afraid I can't fully understand you (which may be due to the fact that I am not a logician).

    In your first argument, you respond to a nominalist who commented on this blog, arguing that it is "safe to say only neo-Platonists read this blog". Now, that seems to me like a contradiction.

    Basing your second argument on religious beliefs is also a bit unconventional for a modern logician, isn't it?

    (There's a good article about terms such as epistemologically objective and ontologically subjective, by John R. Searle)

  9. A 'whole number' is, at least etymologically, the same thing as an 'integer', isn't it? Both mean a number in its entirety, i.e., a non-fractional number. 0 seems to be a nice whole number when viewed in this way.

    On the other hand, the notion of zero was discovered comparatively late in human history, which may be evidence that 0 is somewhat 'unnatural' compared to the positive integers, at least to our ancestors who didn't know it.

    But of course, the two terms will forever be ambiguous, and in technical writing 'positive integers' and 'nonnegative integers' would be safer.

  10. Is it even clear that "positive integer" always means "> 0"? More precisely: is it true in all English-speaking countries and in the older literature? In French, "positif" means ">= 0". In Spanish it tends to mean "> 0", but it is not as clear-cut as in English. What about in other languages?

  11. Historically, it is hard to say that including 0 was 'natural'.

    In the late 1960s, when New Math finally arrived in my 5th-grade class, we were given a bunch of names for abstract properties that we had little reason to understand or care about other than to memorize (closure properties; the associative, commutative, and distributive laws; etc.). As part of this we were told that the 'natural or counting numbers' started at 1 and the 'whole numbers' included 0. The 'whole numbers' seemed pretty silly to me.
    (More precisely, why would one care to make such a big deal about the difference between such similar concepts?)

    In some of my first college courses a distinction was made between 'blackboard N' and 'blackboard N_0', but the term 'whole numbers' totally disappeared. Was it just made up to put in grade-school texts?

    The thing that convinced me that the natural #'s should include 0 was how the Peano axioms were a bit cleaner in that case:
    FORALL x.(x+0=x) is nicer than
    FORALL x.(x+1=s(x)) and
    FORALL x.(x*0=0) is just as good as
    FORALL x.(x*1=x).
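
    The same point shows up if you turn those axioms into definitions; a minimal sketch in Lean 4 (the names PNat, add, and mul are mine, not the standard library's):

        -- Peano naturals with 0 included.
        inductive PNat where
          | zero : PNat
          | succ : PNat → PNat

        -- With 0 as the base, addition gets the clean base case x + 0 = x.
        def add : PNat → PNat → PNat
          | x, .zero   => x                 -- FORALL x.(x+0=x)
          | x, .succ y => .succ (add x y)   -- FORALL x,y.(x+s(y)=s(x+y))

        -- And multiplication bottoms out just as simply: x * 0 = 0.
        def mul : PNat → PNat → PNat
          | _, .zero   => .zero             -- FORALL x.(x*0=0)
          | x, .succ y => add (mul x y) x   -- FORALL x,y.(x*s(y)=x*y+x)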

  12. "What about in other languages?"

    In Italian, it has the same meaning as in English (i.e., "positive" means "> 0").

  13. Alex,
    my convention is that a positive integer is not zero, but that a positive real number may be.

    I've never seen the French use "positive" to allow zero, but (1) I wasn't looking for it, and (2) I may only notice it in fields that use the term so often that the term has to trump the language (e.g., nonpositive curvature, which I'd like to call negative curvature).
