Math CountS Dec. 2005 issue of SIGCSE Inroads
kirby.urner at computer.org
Wed Feb 8 12:49:20 EST 2006
> I consider this the crux of the problem. Programming educators and
> computer science departments who have put them into place have failed
> for the past few decades to evolve this notion into what it really
> should be:
> thinking in a logical and mathematical manner.
Of course this happens at many levels. When the focus is registers and
addressed memory, per MMX, the algorithm will be implemented in terms of
jumps and compares.
Conditionals, looping and indexing (stepping through values by incrementing
an address) remain relevant as we ascend to higher level languages.
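For instance, here's that lineage showing through in Python (a toy
sketch; any language with a while loop would do) -- the explicit index
below is the old incremented address wearing new clothes:

    data = [3, 1, 4, 1, 5]
    total = 0
    i = 0
    while i < len(data):    # the compare (and implicit jump)
        total += data[i]    # the indexed fetch
        i += 1              # "increment the address"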
However, the way we mold our thinking depends a lot on the target language
under study. Trying to use a pure Turing Machine (as stripped down as
possible) can be a real pain.
This is true in pure mathematics as well. There's not just one unified
notation or set of concepts out there. Notations come and go, just like
computer languages, though some have more staying power.
What we mean by "thinking logically" is not completely divorced from the
language we think with (if philosophy has taught us anything in the last 50
years, it has taught us that).
> We keep excusing people who don't do it this way. We keep saying that
> yes they can program. The problem is that it is not a matter of degree.
> If you don't learn to design a program, you will never be able to
> program. All you can do is
> write grammatically correct phrases in an artificial language.
There's been an historical trajectory here. The languages have given us new
ways to conceptualize a problem. Computer programming is still a relatively
new discipline (Ada Lovelace died in 1852).
The structured programming paradigm stepped in (1960s) to counter all
that "goto spaghetti" that made debugging impossible (but ensured job
security for an inner circle -- of one, sometimes). Peter mentions
Dijkstra's famous article ("Go To Statement Considered Harmful") in his
MathCounts7.
The OO paradigm (Smalltalk et al, commercially more 1980s even if the ideas
came earlier) taught us to conceptualize in terms of objects, which was a
breakthrough on many levels, since that's closer to how we think anyway.
CS0 students learn to play the game of "I am", e.g. "I am a hydroelectric
plant" or "I am a light switch" -- so now what are my properties and
methods?
"I am" modeling is a segue to "empathy with materials" which many engineers
claim is sorely lacking in today's students. And they're right. Computer
models should be supplemented with hands on encounters (actually visit that
hydroelectric plant, and take apart that light switch).
> This is "programming" in the same sense as "speaking English" for a
> German native who has never encountered anyone from an English-speaking
> country but has learned all the grammar rules and the 100 most
> frequently used words. (There is a little blue book in Germany that
> teaches you just that, and I bet there are books like this here for
> other foreign languages.)
> Stand up for this notion and be counted. Throw everyone else out of the
> introductory CS curriculum. -- Matthias
In my view, the mathematicians have been too slow to acknowledge dot
notation as a legitimate contribution to mathematics proper. The math
books won't concede that A.b(z) is a good way to organize thinking: A
being an object of some type, perhaps inheriting from parent types, and
b being a method accepting argument(s) z. v1.add(v2) is as smart a way
to signify vector addition as v1 + v2, and it makes clear where the
methods belong: in the class definitions.
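To make that concrete, here's a minimal Python sketch (names
illustrative) in which the dot-notation method and the overloaded
operator are literally the same code:

    class Vector:
        def __init__(self, *coords):
            self.coords = coords
        def add(self, other):
            # v1.add(v2) -- dot notation spelling
            return Vector(*[a + b for a, b in zip(self.coords, other.coords)])
        __add__ = add    # v1 + v2 -- operator spelling, same method
        def __repr__(self):
            return "Vector%s" % (self.coords,)

Either way you spell it, v1.add(v2) or v1 + v2, the behavior lives in
the class definition, which is exactly the point.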
When math students are asked to actually number crunch with these
mathematical objects (quaternions say), dot notation is what they encounter,
more often than not. And yet there's this artificial line that gets drawn:
"that's just the computer's way of expressing things, not really a math
notation" is the party line (I'm not in that party).
What we need to throw out or de-emphasize are all these calculators that
have taken over in K-12. Pre-college kids need serious computer language
experience in connection with math learning, and the calculators aren't
giving it to them.
True, these calculators are programmable, but the languages are insipid, not
well suited to full-fledged alphanumeric processing, as opposed to simple
numeric processing. Business is not happy with all this dumbing down.
Our students deserve bigger, more colorful screens. It's a living standards
issue. Math is simply too austere for its own good at that level. Thank
heaven we've at least got those robots (Mindstorms etc.) or our Silicon
Forest would be even *further* behind in recruiting local/native talent.
The new Logo -> Squeak -> Python curriculum will *demand* that we move
beyond calculators (and beyond flatland) for a change. That's my
strategy: keep pumping out curriculum for which calculators are by
definition inadequate.
A great way to learn OO concepts *and* review basic arithmetic at the
same time is to program a Rational Number object, preferably in a
language that supports:
(a) immediate evaluation in a shell and
(b) operator overloading (so + - * / may be used directly).
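Here's roughly what I have in mind, in Python (a sketch, not a complete
implementation -- no subtraction, division or comparisons yet):

    import math

    class Rational:
        def __init__(self, numer, denom=1):
            if denom == 0:
                raise ZeroDivisionError("zero denominator")
            g = math.gcd(numer, denom)    # reduce to lowest terms
            sign = -1 if denom < 0 else 1
            self.numer = sign * numer // g
            self.denom = sign * denom // g
        def __add__(self, other):
            # a/b + c/d = (a*d + c*b)/(b*d): the arithmetic under review
            return Rational(self.numer * other.denom + other.numer * self.denom,
                            self.denom * other.denom)
        def __mul__(self, other):
            return Rational(self.numer * other.numer, self.denom * other.denom)
        def __repr__(self):
            return "(%d/%d)" % (self.numer, self.denom)

In the shell, Rational(1, 2) + Rational(1, 3) comes back as (5/6) -- the
basic arithmetic review happens right inside the OO lesson.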
We then go on to design Polynomial and Vector objects (e.g. see my
'Algebra with Python' http://www.4dsolutions.net/ocn/moodles/ ),
currently undergoing revision.
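The Polynomial skeleton might look like this (coefficients stored in
ascending powers of x; again just a sketch):

    class Polynomial:
        def __init__(self, coeffs):
            self.coeffs = list(coeffs)    # coeffs[i] goes with x**i
        def __add__(self, other):
            n = max(len(self.coeffs), len(other.coeffs))
            a = self.coeffs + [0] * (n - len(self.coeffs))
            b = other.coeffs + [0] * (n - len(other.coeffs))
            return Polynomial([x + y for x, y in zip(a, b)])
        def __mul__(self, other):
            out = [0] * (len(self.coeffs) + len(other.coeffs) - 1)
            for i, a in enumerate(self.coeffs):
                for j, b in enumerate(other.coeffs):
                    out[i + j] += a * b    # collect like terms
            return Polynomial(out)
        def __repr__(self):
            return " + ".join("%s*x**%d" % (c, i)
                              for i, c in enumerate(self.coeffs))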
The reasons for the calculators' hegemony are more economic and
political than pedagogical. I'm hoping our new classroom furniture
concepts help move us along, as time is of the essence.