[REBOL] Real programmers? or myopic egotists
From: kenneth::nwinet::com at: 26-May-2001 14:23
Hi all, been to any good jousts lately?
(this is a long post and it rambles, so for those of you who just want to
hit the delete key, I promise not to be offended ;-))
At 16 I could have been the best at something. At 42 I've determined that I
have too many interests to become an expert at any one of them. So I
ramble. Ok, you can call me shallow. ;-) OTOH, I've noticed, as have
others, the close ties between genius and insanity. Which brings us to
Chuck Moore. Not to take anything away from him, but he reminds me of a
Minneapolis programmer (not a young guy) I once knew who thought
hexadecimal was much more natural, and so regardless of what language he
wrote in he always used that representation (even when decimal was the
natural way to express the number)... and this was a guy who worked with
computers that used 9- and 36-bit CPUs. Am I alone in thinking this funny?
> I think of a router as a very simple thing that takes
> data in and sends it out. Perhaps it's more complicated
> than that. -- C.M.
There's a paradox here. On one hand it shows a person somewhat out of
touch, but on the other hand it shows someone who, better than most,
understands the power inherent in fundamental abstractions. So he invents a
language where all programmers are an island, and in the process breaks the
hearts of those who see the potential inherent in the fundamental design.
It makes me wish I were a better programmer or better visionary (or perhaps
just a programmer with more time) so I could take elements of his design and
build on them to come up with something better. I'm attracted to Rebol
because I see some of those elements existing here.
> >If it succeeds, these people will prefer to look at scripts
> >instead of black boxes...
> If I understand you correctly, you are opposed to libraries
> as they dumb down users who could potentially be learning to
> be better programmers by examining scripts and building from
> examples rather than using a black box.
> If so, then that view is shared by some eminent programmers
> including Charles Moore, the inventor of Forth.
I want to talk more about C.M.'s thoughts on real programmers, but first...
It amazes me when someone disparages the 'black box'. I think Joel pointed
out that even machine language represents a black box because, as has been
mentioned, it sits on top of microcode. The abstraction of the black box is
always there, and is where power (there's that word again) comes from. (Does
it stop at quanta? I don't know, but I suspect not.)
For me the issue is which black boxes are foundational and which are
programs themselves rather than tools for programmers (I'm not quite happy
with how I expressed this thought, but I'm not sure how to express it better.)
I have to be careful here because I'm not arguing that a language should not
include these 'programs themselves' but that they should be part of a
library (which then paradoxically makes them a programming tool.)
My thoughts are a little muddy on this issue. For example, I would not
classify most visual components like textboxes and such as foundational to a
language, although I would want to be able to easily create new widgets
with language elements that are foundational. Others might reasonably see
this differently, so perhaps it's just a matter of where you put the fence.
Another example: printf doesn't seem foundational to me (C programmers
probably think this sounds weird, especially since it can't be written in C
itself because printf does not have a fixed number of parameters; C++
could, however). I would say it's not foundational because it includes both
formatting and output. To me, printf is something a programmer (or library
builder) would create for their own use, but not include as a fundamental
part of the language. Ok, so it's part of the standard library then; hey
guys, cut me some slack. I see printf as a programming kludge, not well
thought out (but I have no problem with format specifications and escape
sequences in general, as opposed to using special symbols, as on APL
keyboards and their ilk.)
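To make the formatting-vs-output complaint concrete, here's a minimal Python sketch (Python just so it's runnable; the function names are my own invention) where the two jobs stay separate:

```python
# printf bundles two jobs: building a string and sending it somewhere.
# Keeping them separate means the same formatted text can go to any sink.
def format_msg(name, count):
    # Formatting: a pure function, no I/O, easy to test.
    return "%s has %d items" % (name, count)

def emit(text, sink):
    # Output: knows nothing about how the text was built.
    sink.append(text)

log = []
emit(format_msg("queue", 3), log)
emit(format_msg("cache", 0), log)
# log now holds ["queue has 3 items", "cache has 0 items"]
```

The sink here is just a list, but it could as easily be a file, a socket, or a screen; the formatting half never has to know.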
To me, Rebol's use of refinements is very appealing, although using
different words may be more concise (print, write, out, send, etc.). My
purist heart likes the idea of a single word (copy, perhaps) that takes a
pointer to a buffer and redirects it to another buffer with some
preprocessing dependent on a refinement.
Then I realize that taken to the extreme it becomes ridiculous (say copy is
the only language element) and the refinements become the language!
Copy from the keyboard into a buffer. Copy from a blob to a screen
location. Copy this range over that range using Xor. Etc. Shades of a
Turing machine!
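Just to make the thought experiment concrete, here's a hedged Python sketch of a single copy word whose keyword arguments play the role of refinements (the xor/upper options are invented for illustration, not REBOL's actual refinements):

```python
# One word, many refinements: the options select the preprocessing,
# in the spirit of REBOL's copy/part. Taken far enough, the options
# *are* the language.
def copy(src, dest, xor_key=None, upper=False):
    data = src
    if xor_key is not None:
        # "Copy this range over that range using Xor."
        data = bytes(b ^ xor_key for b in data)
    if upper:
        # Some other bit of preprocessing, chosen by refinement.
        data = data.upper()
    dest.extend(data)
    return dest

buf = bytearray()
copy(b"abc", buf, xor_key=0x20)  # XOR with 0x20 flips case in ASCII
# buf is now bytearray(b"ABC")
```

Every new behavior becomes another keyword argument, which is exactly the slippery slope the paragraph above describes.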
OTOH, datatypes are another way of supplying a refinement that seems closer
to doing things the way they should be done. It allows the computer to do
the housekeeping that computers are really good at and frees the programmer
to think about the fundamentals of the algorithm they are writing.
Doesn't it strike some with awe that the exact same bits, depending on how
they are interpreted, can be both data and code, can represent a letter of
the alphabit (another Freudian slip? Alphabet), or a color on a screen, or the
address of that color on the screen? There's something fundamental here
that is almost a religious experience (hey, and I don't even do drugs! It's
a natural high. Ok, aspirin on occasion. Feel free to take a few
yourself. *** Insert Medical Legal Disclaimer Here *** What a world we live
in! I spilled coffee on myself for months at McDonalds and they never gave
me a million dollars. Incredible, huh?)
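The "same bits, many meanings" point is easy to demonstrate; here's a small Python sketch (Python rather than REBOL just so it runs anywhere) reading the same four bytes four different ways:

```python
import struct

# The exact same four bytes, interpreted four ways.
raw = b"\x41\x42\x43\x44"

as_text  = raw.decode("ascii")          # letters: "ABCD"
as_rgba  = tuple(raw)                   # a color: (65, 66, 67, 68)
as_int   = struct.unpack("<I", raw)[0]  # an address-sized integer
as_float = struct.unpack("<f", raw)[0]  # even a floating-point number
```

Nothing in the bytes themselves says which reading is "right" - the datatype supplies the interpretation, which is the housekeeping the computer is good at.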
Ok, now on to my fundamental objection to CM's elitist attitude: Real
Programmers Kludge! They patch! They connect things together that no one
had considered putting together. Real programmers maintain the code that
elitist programmers didn't have time to get right or have to adjust because
the specification changed. I believe in elegant code. I love the power of
simplicity. I'm constantly telling the guys I work with that code that
doesn't exist doesn't produce an error (Ok, I know that's not exactly true
and wish I could find a better way of phrasing the concept.) I'm often
reducing 500 lines of code into 100 that does the job better and with more
flexibility and that other programmers can understand without having me
around to explain it to them. I also know there are other guys that do that
better than I do. I don't do it just to be rewriting other's code (which I
think is insulting to the original programmer and often unproductive or
counter-productive) or because I somehow know better. But in my book,
maintenance programmers are the unsung heroes that keep things working after
the Real Programmers are long gone. Yeah, I'd much rather be doing
something original and less like housework, but the piles would be getting
pretty high if we didn't have these legions of dumb programmers out there
doing their job. Occasionally, making things harder for those that follow,
but so what? That's just the way it works and no amount of moaning or
elitism is going to change it any time soon.
I find myself both agreeing and disagreeing with both the conservative and
radical views expressed on page xviii of R:TOG. Both start by saying that
programs become too large to be completely understood. My view is that it
takes a combination of both discipline and mind expansion. The key is to
carefully choose the abstractions that are to be considered fundamental. I
see this as the craft of a language designer. The use of the word craft as
in craftsman is deliberate.
When a language turns into some obscenity of characters ' -#@$ %" I think
something is fundamentally wrong. I think that when indirect memory
references become a series of pointers to pointers, then something is
fundamentally wrong. I think that when you have to introduce stack
manipulation words like pick, rot, dup, drop and swap (my favorite), it may
not be wrong, but something isn't quite right either. Actually, why doesn't
somebody use Pluck? What? Prejudiced?
I hear the arguments regarding stacks vs. registers and think... Why not
have the first 16 or 256 elements of the stack map to registers?
************* you can safely ignore this section (or everything for that
matter ;-)) ************************** comments on elitism continue below...
This leads to what I admit is an ugly notation but suppose... (and here a
better choice of letters or replacing them with actual words sounds fine to
me. A shorthand version might also exist in that case.) BTW, I got this
wrong which I'm sure you'll all be able to spot.
R5 would directly address the fifth element on the stack (leaving it on the
stack.)
A5 or @5 might indirectly address memory pointed to by the fifth element.
U9 to use it like A9 but also removes it from the stack.
P10 to pick the tenth element and remove it from the stack, sliding
everything else up (or perhaps everything down to that point, since
presumably the value picked goes to the top of the stack.)
I99 to insert the top of the stack into the 99th position.
Inc25 to increment stack location 25 and... (Inc might be an alias for Inc0)
Dec25 to decrement it (Yule get used to it.)
S200 to swap the top of the stack with the contents of R200. S1 would be
the equivalent of forth's swap word.
T42 fill a Target register with an indirect address from register 42.
C11 to copy the count in R0 from A11 to Target which was the last Txx
Z11 to copy from A11 until a zero value.
F255 fill the target with the value at R255 using the count on the top of
the stack.
Then you've got logical operations like...
AND, XOR, OR, ROR, ROL, SHR, SHL, NOT that could include register references.
For example: @XOR4 to exclusive or the value pointed to by A4 with the top
of the stack (otherwise known as the accumulator!)
I would assume that hardware could be produced that handled all this
shifting of data in an efficient manner.
The reason for saying R5 rather than the more flexible R 5 is that I
envision R5 being a literal register in hardware. For portability this
could just as easily be a virtual register in software, in which case R 5
would then be preferable as a notation (because a virtual machine can easily
adapt itself to a non-fixed number of registers.) Charles Moore even went
on to create a Forth chip in hardware, so I don't see why this couldn't be done.
The key to all this (and why there even is a stack vs. register debate) is
that registers are usually better at being registers than they are at being
stack elements. To change the stack you can just change the stack pointer,
not so easy with registers. But that doesn't mean that registers can't be
incorporated into the top of the stack in an efficient manner or that stack
manipulation couldn't be made to work efficiently. Under the covers R0 -
R255 could be some sort of mapped array (pointing to the real R0 - R255.)
Anyone care to make an emulator? I'll start...
Registers: [R0 R1 R2 R3 ... ]
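Since I asked, I'll take my own bait a little further: here's a toy Python emulator for a few of the words above (R, S, Inc, P). The original start was REBOL block notation; Python is used here just to have something runnable, and I've filled in my own guesses where the notation above was left loose.

```python
class StackMachine:
    """Toy emulator where the top of the stack doubles as the register
    file: index 0 is the top (R0, the 'accumulator'), index n is Rn."""

    def __init__(self):
        self.stack = []

    def push(self, v):
        self.stack.insert(0, v)  # index 0 == top of stack == R0

    def R(self, n):
        # Rn: directly address the nth element, leaving it on the stack.
        return self.stack[n]

    def S(self, n):
        # Sn: swap the top of the stack with Rn. S1 is Forth's swap.
        self.stack[0], self.stack[n] = self.stack[n], self.stack[0]

    def Inc(self, n=0):
        # Incn: increment stack location n (Inc is an alias for Inc0).
        self.stack[n] += 1

    def P(self, n):
        # Pn: pick the nth element, remove it, and put it on top.
        self.push(self.stack.pop(n))

m = StackMachine()
m.push(10); m.push(20); m.push(30)  # stack (top first): [30, 20, 10]
m.S(1)                              # swap          ->   [20, 30, 10]
m.Inc()                             # increment R0  ->   [21, 30, 10]
m.P(2)                              # pick R2       ->   [10, 21, 30]
```

The indirect words (A/@, U, T, C, Z, F) would need a memory array alongside the stack, but the register-on-stack idea is already visible here.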
***************** back to charles****************************
C.M. thinks real programmers don't need floating point. Hello? Sure,
floating point has its problems (like comparing an exact dollar value
to the same value in a SQL database and having it fail!) And scaled
integers are certainly faster, but real life uses a lot of real numbers.
Conversion is the computer's job, not mine. If that means I'm not a real
programmer, then you can bite my real...
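The dollar-comparison trap is easy to demonstrate, and Python's standard decimal module is one way of letting the computer do the conversion work (just an illustration; any scaled-decimal type makes the same point):

```python
from decimal import Decimal

# Binary floating point can't represent most decimal fractions exactly,
# which is how a dollar amount can fail to equal "itself" out of SQL.
float_equal = (0.10 + 0.20 == 0.30)    # False: the sum is 0.30000000000000004

# Scaled/decimal arithmetic keeps dollar math exact.
dec_equal = (Decimal("0.10") + Decimal("0.20") == Decimal("0.30"))  # True
```

The programmer writes the dollar amounts; the housekeeping of representation is the machine's problem, exactly as it should be.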
oops! Please forgive me. Got carried away there (yeah, and other places...)
Me, I like the reluctant messiahs like Linus Torvalds. He just wants to
keep working and everyone is always asking his opinion. The guy seems to
have some real humility (My humility has been humiliated out of me...)
Life is too short. Enjoy what you find interesting. Be responsible to
those less fortunate. Leave the rest to eternity.
Shoot the messenger! Enjoy the weekend. Go to the beach! (For all those
purists that don't like gotos!)