Sigma Notation and Loop constructs
kirby.urner at gmail.com
Thu Feb 18 15:10:29 EST 2010
On Thu, Feb 18, 2010 at 11:20 AM, Shriram Krishnamurthi <sk at cs.brown.edu> wrote:
> This is a common source of confusion, and hence worth clarification:
I appreciate your taking the time...
>> Phrases like "mutable variables are outside those boundaries" strike
>> me as odd, because to me "mutable variable" sounds redundant.
>> Like *of course* variables are mutable, because that's what "variable"
>> means. If they don't vary, they're not variables, they're constants.
> There are TWO meanings for the word "variable", and you've
> conveniently conflated them.
Plus I went on to use the word 'name' in place of variable, talking about
binding as a verb for naming. That's shop talk, although I don't think
the word "variable" will go away any time soon.
In early programming, PL/I era, variables were little boxes you put things
inside of. Pigeon-holes. Cubby-holes. Much later, in 2010, we look at
names more as handles to objects, strings to balloons. Multiple strings
to the same balloon are allowed, if not always a good idea (usually not a
good idea, but in Ben Franklin's case, two strings to that kite saved
his life, at least by many accounts).
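The "strings to balloons" picture can be sketched in a few lines of Python
(the names here are my own, chosen to match the metaphor): two names bound
to one mutable object, so a change made through either name is visible
through both.

```python
# Two names ("strings") bound to the same list ("balloon").
balloon = ["helium"]
other_string = balloon           # no copy: both names reference one object

other_string.append("hot air")   # mutate through the second name

print(balloon)                   # ['helium', 'hot air'] -- same object
print(balloon is other_string)   # True: one balloon, two strings
```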
In this paradigm, the thing that a name names has state, and the state
changes. I know that's problematic as side-effects make programs very
hard to debug. So best to insist that all changes to state happen clearly
through the name and its recognized API, not by some back door. This
is a programmer's responsibility. Some languages may be very relaxed
about poor style, which doesn't mean they can't be used in production,
you just need skilled coders. Like they say: "you can do anything in C".
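One way to sketch that discipline in Python (the class and method names
below are my own invention, not anything from this thread): keep the state
private by convention and route every change through the object's
recognized API.

```python
class Counter:
    """State changes only through the recognized API."""
    def __init__(self):
        self._count = 0      # leading underscore: "no back doors, please"

    def increment(self):     # the sanctioned way to change state
        self._count += 1

    @property
    def count(self):         # read-only view of the state
        return self._count

c = Counter()
c.increment()
c.increment()
print(c.count)               # 2: every change went through the API
```

Python won't stop a determined coder from writing `c._count = 99`, which
is exactly the "programmer's responsibility" point: the language is
relaxed, so the convention carries the weight.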
> f(x) = x + 1
>
> I can ask for the value of f(1), or also of f(2). Thus x is called a
> "variable" because its value varies across different uses of f. (In
> contrast, the value of 1 stays constant across different uses of f.)
> This is the meaning of a "variable" in mathematics.
> This is very different from the use of "variable" in computer
> science. Consider:
> def g(x):
>     y = x
>     y = y+1
> where x varies like it does in the algebraic example above, but y
> varies WITHIN each use of g. You can ask what "the" value of g's x is
> for any given invocation, just as you could of f's x. But you can't
> ask for "the" value of y even in a given invocation: you must say at
> what time inside the running of g you want the value. In contrast,
> algebraic functions don't have a notion of "time during running".
> This is not just a philosophical question. It has notable impact on
> not only program construction and reasoning but also on tools. A
> debugger for the latter language MUST make time explicit, if it is to
> make any sense at all. A debugger for the former need not (indeed, it
> would be pointless).
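The point about time can be made concrete by instrumenting g: inside a
single invocation, the answer to "what is y?" depends on when you ask.

```python
def g(x):
    y = x
    print("y before increment:", y)   # one answer at this point in time
    y = y + 1
    print("y after increment:", y)    # a different answer, same invocation
    return y

result = g(10)   # prints "y before increment: 10", then "y after increment: 11"
```

A debugger stepping through g must say *when* it sampled y; a table of
input/output pairs for an algebraic f needs no such timestamp.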
Given my bias, I might not say "just a philosophical question" as
sometimes those are the ones that really matter.
I understand that programming has this thing called "flow" which
connotes an arrow of time. Mathematicians are sometimes uncomfortable
with any arrow of time and want it out. Computer science has a timer
right in the chip, cycles going, so it is time-friendly. Mathematics and
computer science meet at this idea of "incrementation" (a small change,
in any dimension).
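The pairing in the subject line can be shown directly: the same sum,
sigma-style, written three ways in Python. The explicit loop makes the
arrow of time visible as repeated incrementation; the other two leave
time implicit.

```python
# Sigma notation: the sum of i**2 for i = 1..N, with N = 10.
N = 10

# 1. Explicit loop: state changes at each tick -- an arrow of time.
total = 0
for i in range(1, N + 1):
    total = total + i * i

# 2. Built-in sum over a generator: the iteration is hidden.
total2 = sum(i * i for i in range(1, N + 1))

# 3. Closed form: no iteration at all, pure algebra.
total3 = N * (N + 1) * (2 * N + 1) // 6

print(total, total2, total3)   # 385 385 385
```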
> Furthermore, in the latter language, one might also have
> z = 1
> def h(x):
>     z = z+x
> Now things get even worse: you have to understand the entire global
> behavior of the program to know what h is doing at any point.
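Python actually makes this concrete: written exactly as quoted, h would
raise UnboundLocalError, because assigning to z makes it local. You must
declare the global explicitly, which at least flags the non-local
dependence at the top of the function.

```python
z = 1

def h(x):
    global z     # without this line, "z = z + x" raises UnboundLocalError
    z = z + x

h(5)
print(z)         # 6: understanding h requires knowing the global z
h(5)
print(z)         # 11: the same call has a different effect -- time again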
Yes, easy to write bad code.
What programming and mathematics have in common is they're
collective endeavors that need to share names, yet not get bogged
down in confusion over the different meanings of these names.
Everyone wants to use the variable name 'x' at some point. We
have a limited number of names, and we don't want to selfishly
prevent some new meaning of "vector" either, so long as we have
a clearly demarcated context.
A rough idea of namespaces (context, knowledge domain) develops
in any discipline; it's what we call "scope". This happens *within*
mathematics and other disciplines too, i.e. we're not just talking
about the cell walls between departments.
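Python's own namespaces demonstrate this with two real standard-library
modules: the same name lives in both without clashing, because every use
is demarcated by its context.

```python
import math    # real-number mathematics
import cmath   # complex-number mathematics

# One name, two namespaces: sqrt means something different in each.
print(math.sqrt(4))     # 2.0 -- real-valued square root
print(cmath.sqrt(-4))   # 2j  -- complex-valued square root, same name
```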
Just saying: how to manage complexity is truly a challenge and
I'd welcome more help from FP-style thinking. Perhaps jQuery
hails from there? Just pointing out the failings of the competition
is less helpful. What's the web framework du jour in the hottest
FP language, and is it open source? If that's a way to get work,
far be it from me to get in anyone's way.
> I would suggest thoroughly ingesting this distinction to understand
> what Felleisen and Page and others have been saying for a while (like,
> uh, decades).
I recognize your school of thought as significant, reputable, consistent
with its message. I just sometimes need a refresher, as I wander to
other groups and hear their stories as well (not necessarily conflicting,
just concerned with different matters sometimes).
One needs more than one paradigm to know what "paradigm" means,
so I would at least advocate for a minimum of two paradigms in any
subject claiming to teach about paradigms. Whether K-12 should teach
them is an open question; however, Kuhn's Structure of Scientific
Revolutions has been around long enough to have made these concepts
rather universal. So I'm all for OO and FP as a minimal combo, though
others will favor different pairs. I like the J language, and wonder
if that's embraced as functional programming by anyone. It inherits
from APL.
More information about the Math-thinking-l mailing list