Entries from October 2015

Computational thinking revolution

Over the past few years I’ve heard plenty about the importance of teaching computational thinking in schools. After 55 years of computational thinking advocacy by computer scientists, educators have finally woken up to the great epistemic changes in science and in the practice of science.

See, as early as 1960 Alan Perlis started to talk about “algorithmizing,” and in his famous 1968 article “What to Do Till the Computer Scientist Comes” George Forsythe campaigned for seeing computing as a general-purpose thinking tool. In the mid-1970s Statz and Miller campaigned for “algorithmic thinking,” and in the 1980s Knuth, too, discussed its importance. In the mid-1990s Seymour Papert recast these ideas as “computational thinking,” and the latest strong champion of computational thinking is Jeannette Wing.

But what is important to understand about this history is that the computational thinking revolution won’t be started by computer scientists. Whatever we do won’t change how people in other fields see their own disciplines. That is not to say that there won’t be a revolution; there will be. But the computational thinking revolution in schools will start from the sciences: biology, physics, mathematics. It will start when those fields have a generation of young teachers who are used to the new ways of doing science and to seeing their own fields through the computational lens.

Why Binary?

I gave a talk on computing’s development the other day, and ended up in a debate with a colleague about the importance of the binary numeral system in computing. Among other conceptual developments that underlie modern computing, I mentioned the binary numeral system and noted that today’s popular narrative about computers is that “it’s all about ones and zeros”. My colleague disagreed about its importance, and argued that the binary numeral system is not necessary but contingent: you can build computers and reason about computing using any reasonable numeral system. And indeed he is right, from both theoretical and engineering perspectives: ENIAC was a decimal computer, and the Russians built ternary computers at some point (the Setun, for one). Knuth, for his part, considered the ternary numeral system “the prettiest” number system of all.
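
To make that contingency concrete, here is a minimal sketch in Python (the function names are my own, purely illustrative): positional arithmetic is exactly the same routine whether the base is 2, 3, or 10.

    # Toy sketch: positional notation works the same in any base.
    # Function names are illustrative, not from any particular library.

    def to_digits(n, base):
        """Represent a non-negative integer as a digit list in the given base."""
        if n == 0:
            return [0]
        digits = []
        while n:
            n, d = divmod(n, base)
            digits.append(d)
        return digits[::-1]  # most significant digit first

    def from_digits(digits, base):
        """Evaluate a digit list back to the integer it denotes."""
        value = 0
        for d in digits:
            value = value * base + d
        return value

    # The same number three ways; the value does not care about the base.
    for base in (2, 3, 10):
        digits = to_digits(42, base)
        assert from_digits(digits, base) == 42
        print(base, digits)  # 2 [1, 0, 1, 0, 1, 0]; 3 [1, 1, 2, 0]; 10 [4, 2]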

But the binary numeral system still holds a special place in digital computing, for a simple reason: you need at least two different symbols to represent distinct states, but you never need more than two, since anything expressible with a larger alphabet can be encoded with just two symbols. From an engineering perspective, binary arithmetic is a good choice because, as Arthur Burks, one of the early computer pioneers, wrote, so many things are “naturally” binary: a switch is on or off, a statement is true or false. Binary arithmetic also simplifies computer architecture.
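
As a minimal sketch of that “two symbols are enough” claim, again in Python and again with my own naming: each base-k digit fits in ceil(log2 k) bits, so a string of zeros and ones can carry whatever a larger alphabet carries.

    import math

    # Sketch: pack each base-k digit into a fixed-width group of bits.
    # Two symbols suffice to encode what k symbols encode.

    def encode_digit(d, base):
        """Encode one base-`base` digit as a fixed-width bit string."""
        width = math.ceil(math.log2(base))
        return format(d, f"0{width}b")

    def decode_digit(bits):
        """Decode a bit string back to the digit it encodes."""
        return int(bits, 2)

    ternary = [1, 1, 2, 0]                          # 42 in base 3
    encoded = [encode_digit(d, 3) for d in ternary]
    print(encoded)                                  # ['01', '01', '10', '00']
    assert [decode_digit(b) for b in encoded] == ternary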