Sigma Notation and Loop Constructs

kirby urner kirby.urner at gmail.com
Thu Feb 18 16:39:33 EST 2010


On Thu, Feb 18, 2010 at 12:35 PM, Shriram Krishnamurthi <sk at cs.brown.edu> wrote:
>> In early programming, PL/1 era, variables were little boxes you put
>> things inside of.  Pigeon-holes.  Cubby-holes.  Much later, in 2010,
>> we look at names more as handles to objects, strings to balloons.
>
> Now you're confused again.  You've confused variables with references
> to values ("objects", as you like to say).  They have two different
> forms of mutation.  Feel free to read the chapters under "Church and
> State" in my free PL text, www.plai.org, to understand the
> distinction, since I don't feel like re-typing it here.  But do
> understand the distinction.
>

I think I was just hoping to sketch some of the changes over the years.

When only assembly language was available, thinking in terms of registers,
with fetch and store to RAM (via the bus), made the most sense, and these
were containers.  Variables were addresses of these "bricks in memory".
The idea of "reference" is always close by (an address references a
word, a double word, etc.).

Many programming books showed variables as little boxes, with numbers
and letters going into them, like papers into desk drawers.  Nowadays,
you'll find books giving that picture, then backpedaling and saying "oops,
that's not quite right, it's more like this", and then giving the analogy of
strings or pointers, more referential, as you say.

Whereas one might say I'm confused, one might also say I'm simply
aware of the diachronic aspect of these concepts.  The idea of a
"variable" is itself mutable.  It's literally an exercise in archeology to
haul out the various textbooks and web pages in archive.org and start
pointing out these evolutionary changes.

Analogy:  the concept of "polyhedron" has changed over the years.
They used to be "rock solid" but over time have become more gossamer,
more wireframe, more like networks.  Peter Cromwell mentions this trend
in his excellent book 'Polyhedra'.  What used to be "faces" are now
merely "windows".  Hard to put one's finger on these shifts, but they're
real nonetheless.  They seem to happen especially quickly around computers.

I've seen an animated web cartoon where first they explain variables in
terms of cubby holes, then in terms of strings, then ask the viewer to
vote on which is clearer.  Maybe someone here knows what I'm talking
about and wants to give the URL.  If I come across it, I'll pass it along.
The page was about Python, so I should know it, but I don't offhand.

The video below uses the older terminology, talking about storing the
value 5 "into" the variable.  One still thinks of variables as containers,
even though Python really encourages a different mental model.

http://showmedo.com/videotutorials/video?name=6950010&fromSeriesID=695
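
For what it's worth, here's a minimal interactive sketch (mine, not
from the video) of the "strings to balloons" model: two names tied to
the same object, rather than two boxes each holding a copy:

>>> a = [1, 2, 3]   # the name a is a string tied to a list object
>>> b = a           # b is a second string to the *same* balloon
>>> b.append(4)     # mutate the object through b...
>>> a               # ...and a sees it too, since a was never a box
[1, 2, 3, 4]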

>> So best to insist that all changes to state happen clearly through
>> the name and its recognized API, not by some back door.  This is a
>> programmer's responsibility.  Some languages may be very relaxed
>> about poor style, which doesn't mean they can't be used in
>> production, you just need skilled coders.
>
> The billions of dollars -- and even lives -- lost annually on software
> errors, security errors, privacy leaks, etc., are evidence that these
> perfect programmers you seek are not easily found.  If you have a
> secret cache of these mythical people, please let them out -- the
> world desperately needs them.
>

We're in complete agreement then.

Lots of empty chairs on the front lines, awaiting those mythical
programmers.  The real ones are often far from perfect, much as
they strive to better themselves daily.

I liked that novel 'The Bug' by Ellen Ullman.

> Anyway, this is a software engineering issue that is irrelevant to the
> discussion at hand.  Let's not distract from fixing basic confusion.
>
>> I understand that programming has this thing called "flow" which
>> connotes an arrow of time.  Mathematicians are sometimes
>> uncomfortable with any arrow of time and want it out.
>
> Ever since Newton/Leibniz (at least), we've had great tools to reason
> about change over time (and other, uh, variables).  Given that
> calculus is the first math course in most college math curricula, for
> majors and non-majors, saying mathematicians are uncomfortable with
> time sounds rather silly.  (Whether time indeed has an "arrow" is
> anyway a matter for physics.)

OK, I cop to being Neo-Platonic, tipping my hat to those branches
of mathematics concerned with eternal forms.  How could a tetrahedron
ever "decay" they might ask.  There's this tradition of absenting the
time dimension because one has no use for its energetic parameters.
Geometry is especially well known for this bias.

However, if those polyhedra spin, like clocks, around an axis, then
you have that theta angle, d(theta)/dt, and all the calculus of smooth
motion physics, just as you say.  "Spinning without any time sense"
might be something a philosopher could define, if really thinking
about it, but most mathematicians wouldn't bother.  They *want* time
in the picture.  They're more like computer scientists in that regard.
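
A quick sketch of what I mean (my own toy example, nothing more): a
vertex spinning about the z-axis at constant angular velocity omega
has theta = omega * t, so time enters the geometry directly:

>>> from math import sin, cos, pi
>>> def spin(vertex, omega, t):
        """Rotate (x, y, z) about the z-axis by theta = omega * t."""
        x, y, z = vertex
        theta = omega * t   # the angle grows linearly with time
        return (x * cos(theta) - y * sin(theta),
                x * sin(theta) + y * cos(theta),
                z)

>>> spin((1, 0, 0), pi/2, 1)   # a quarter turn after one second
(6.123233995736766e-17, 1.0, 0)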

>
>> Computer science has a timer right in the chip, cycles going, so is
>> time-friendly.
>
> Now you're confusing implementations with interfaces -- a rather
> embarrassing mistake for such a proponent of OOP!
>

I sort of expected to get in hot water on that one.  I'm such a conflator!

> The computer also has capacitors and transistors inside, but nobody
> says "computer science is capacitor/transistor-friendly".  It has
> buses; one does not think of computer science as "bus-friendly".
> Underneath all these are atoms, but nobody thinks of computer science
> as "quark-friendly".  Etc.
>

And yet we're all drilled in the von Neumann machine, the program
status word ticking up, the number of cycles each opcode might take.

http://en.wikipedia.org/wiki/Von_Neumann_architecture

Time and flow are so evident in JMP instructions, in branching.

Removing the time dimension from computer science would be
really difficult.
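
To tie this back to the subject line, here's a small sketch (my own,
just to illustrate): sigma notation rendered first as a loop, where
state evolves step by step in time, then as a single expression with
no visible clock:

>>> total = 0               # the imperative version: state plus time
>>> for k in range(1, 11):
        total = total + k   # each pass is a tick of the clock

>>> total
55
>>> sum(range(1, 11))       # the "timeless" version: one expression
55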

I tried to conflate all that by blaming the chip architecture, but I
didn't get away with it.

I still think computer science is time-friendly though, despite my
vacuous premise.

If p then q, and not p, doesn't give us not q (denying the antecedent).

>>> Furthermore, in the latter language, one might also have
>>>
>>>  z = 1
>>>  def h(x):
>>>    z = z+x
>>>
>>> Now things get even worse: you have to understand the entire global
>>> behavior of the program to know what h is doing at any point.
>>
>> Yes, easy to write bad code.
>
> This isn't bad code.  It can't be without some reference point (eg, a
> specification that it does or does not meet).
>

True enough.  Still easy to write bad code though.

> Not only is it not bad code, it represents one of your beloved
> objects.  Wrapping this in some class mumbo-jumbo doesn't affect the
> underlying computational model, whereby calling h with the same
> argument twice doesn't yield the same results.  That's what happens
> when you have variables with non-local effects, aka, methods.
>

I admit to not understanding your pseudo-code completely, as I don't
know the language.  In Python, assigning to z inside the function makes
z local, so the inner scope raises an exception: z is unbound when you
evaluate z + x.  You have to explicitly label (stigmatize?) z as a
global, thusly:

>>> z = 1
>>> def h(x):
        z = z + x

>>> z
1
>>> h(10)
Traceback (most recent call last):
  File "<pyshell#5>", line 1, in <module>
    h(10)
  File "<pyshell#3>", line 2, in h
    z = z + x
UnboundLocalError: local variable 'z' referenced before assignment
>>> def h(x):
        global z
        z = z + x

>>> h(10)
>>> z
11
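
And here, as a sketch of my own (not Shriram's code), is the "class
mumbo-jumbo" version: the state rides along on an object, and calling
the method twice with the same argument still yields different results,
which I take to be his point:

>>> class Accumulator:
        def __init__(self, z=1):
            self.z = z      # the state lives on the instance
        def h(self, x):
            self.z = self.z + x   # same non-local effect, made explicit
            return self.z

>>> a = Accumulator()
>>> a.h(10)
11
>>> a.h(10)   # same argument, different result
21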

>> Just pointing out the failings of the competition
>> is less helpful.  What's the web framework du jour in the hottest
>> FP language, and is it open source?  If that's a way to get work,
>> far be it from me to get in anyone's way.
>
> Sorry, there are too many buzzwords in that for me to be able to
> answer it.
>

FP = functional programming

>> One needs more than one paradigm to know what "paradigm" means,
>
> The word paradigm as applied to computer programming languages is also
> nonsense.  Here's a short document that explains why:
>

Sigh.  I know less and less every day.

>  http://www.cs.brown.edu/~sk/Publications/Papers/Published/sk-teach-pl-post-linnaean/
>
>  Teaching Programming Languages in a Post-Linnaean Age
>  SIGPLAN Workshop on Undergraduate Programming Language Curricula, 2008
>
> Shriram
>

More homework.

Honored that you would take the time, sir.

I do enjoy these lively debates, and am reading all the threads, not just
the ones with me in them.  edu-sig in the meantime is being very quiet.
I think there's a hush before PyCon / Atlanta, or has that started already?

Kirby

PS:  You have inadvertently given me a new idea for the new kind of computer
class:  Mumbo Jumbo.  That would need to be an affectionate nickname, not
the formal, rigorous name.  I could see gaining some traction with that,
as there'd indeed be lots of buzzwords, lots of shop talk (terminology,
nomenclature).
http://www.youtube.com/watch?v=N9qYF9DZPdw (just for grins)


