I gave a talk on computing’s development the other day, and ended up in a debate with a colleague about the importance of the binary numeral system in computing. Among other conceptual developments that underlie modern computing, I mentioned the binary numeral system and noted that today’s popular narrative about computers is that “it’s all about ones and zeros”. My colleague disagreed about its importance, arguing that the binary numeral system is not necessary but contingent: you can build computers and reason about computing using any reasonable numeral system. And he is indeed right, from both theoretical and engineering perspectives: ENIAC was a decimal computer, and the Soviet Union built ternary computers at one point. Knuth even considered the ternary numeral system “the prettiest” number system of all.

But the binary numeral system still holds a special place in digital computing for a simple reason: you need *at least* two different symbols, but you *don’t need more* than two. From an engineering perspective, binary arithmetic is a good choice because, as Burks, one of the early computer pioneers, wrote, so many things are “naturally” binary. Binary arithmetic simplifies computer architecture.
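The contingency argument is easy to demonstrate: the same integer can be written down in any base of two or more symbols, and the positional arithmetic works identically. Here is a minimal sketch in Python (the helper `to_base` is my own illustration, not anything from the talk) showing one value rendered in binary, ternary, and decimal:

```python
def to_base(n, base):
    """Represent a non-negative integer as a digit string in the given base (2-10)."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % base))  # least-significant digit first
        n //= base
    return "".join(reversed(digits))

# The same value in three numeral systems -- the choice of base is contingent:
print(to_base(42, 2))   # binary:  101010
print(to_base(42, 3))   # ternary: 1120
print(to_base(42, 10))  # decimal: 42
```

Any base works; what changes is only how many distinct symbols the hardware must reliably distinguish, which is exactly where binary’s engineering appeal comes from.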
