My experience with computers began back in 1970, when I travelled
from Southern California to Tucson to attend the University of
Arizona. I came to study Geological Engineering, not Computer
Science (back in those days the computer folks studied Systems
Engineering, or Math, it seems). At any rate, while registering
for courses I pondered a possibility to round out my schedule:
something called SIE 78, Fortran. I asked someone what in the
world Fortran was, and soon I met my first computer.
The Control Data Cyber 6400
The 6400 in those days was the only computer on campus apart
from some kind of Univac in the Math Department. The 6400
was a big mainframe in its own special room. It ran the
SCOPE operating system ("Supervisory Control of Program Execution").
It was an interesting machine, even by today's standards.
Here are some highlights: it was word-addressed, with 60-bit
words. Instructions were 15, 30, or 60 bits and got packed
as efficiently as possible into the words. Characters were
6-bit display code, and you could cram 10 of them into a word
by shifting and masking.
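
To make that shifting and masking concrete, here is a minimal
sketch in C of how the packing might look (uint64_t stands in for
a true 60-bit machine word, and the function names are my own):

    /* Pack up to 10 six-bit display-code characters into one 60-bit
     * word, leftmost character in the high bits, the way the Cyber
     * stored them.  A sketch only: uint64_t stands in for a 60-bit
     * machine word. */
    #include <stdint.h>

    uint64_t pack_display(const unsigned char chars[10])
    {
        uint64_t word = 0;
        for (int i = 0; i < 10; i++)
            word = (word << 6) | (chars[i] & 077);  /* mask to 6 bits */
        return word;
    }

    unsigned char unpack_display(uint64_t word, int i)
    {
        /* Character i (0 = leftmost) sits at bit offset 54 - 6*i. */
        return (word >> (54 - 6 * i)) & 077;
    }
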
I/O was done by a group of 10 peripheral processors that monitored
dedicated locations in shared memory, watching for requests to be
posted. The assembly language for the beast was known as
COMPASS, and looks amazingly like some of today's load/store RISC
instruction sets. I would lay odds this thing was designed by
Seymour Cray before CDC ran him off and then ran the company into
the ground.
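
The request-posting business was essentially a mailbox protocol:
each peripheral processor watched a fixed word of shared memory and
went to work when the central processor stored something nonzero
there. Here is a rough sketch of the idea in C; the names and
layout are my own invention, not CDC's:

    /* Mailbox-style I/O, roughly in the spirit of the Cyber's
     * peripheral processors.  All names and details invented. */
    #include <stdint.h>

    volatile uint64_t pp_mailbox[10];       /* one request word per PP */

    /* Central processor side: post a request, spin until it's done. */
    void post_request(int pp, uint64_t request)
    {
        pp_mailbox[pp] = request;           /* nonzero = work to do */
        while (pp_mailbox[pp] != 0)
            ;                               /* PP zeroes it when finished */
    }

    /* Peripheral processor side: poll forever for posted requests. */
    void pp_main_loop(int pp)
    {
        for (;;) {
            uint64_t req = pp_mailbox[pp];
            if (req != 0) {
                /* ... perform the I/O the request describes ... */
                pp_mailbox[pp] = 0;         /* signal completion */
            }
        }
    }
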
The Digital DEC-10
Around 1972, the University acquired its second computer,
the amazing DEC-10. This was an introduction to a whole
new way of life: editing sessions with the SOS editor (TECO
was also available for the rugged of heart and mind) replaced
keypunch sessions. It ran the TOPS-10 operating system and
was really a fun machine to work with (which really couldn't be
said of the 6400).
IBM's big iron
After I graduated, I made a short sojourn to the University
of Chicago, where at the time computing was done on IBM 370
mainframes. After TOPS-10, IBM JCL was more than any reasonable
soul could stomach, and I pretty much went into computer hibernation
and kept my mind on geology. When I did need to use these monsters
(you had your choice of Fortran compilers: the H compiler, as well
as others known by their letter designations), I borrowed prepackaged
JCL decks from other veterans of battles with Big Blue's big iron.
The Digital DEC-20
Events brought me back to Tucson and into the employ of "Computing
Associates", developing software to serve the then-prospering mining
industry. We ran most of our jobs on CDC Cyber 6600 and 7600 machines
in Houston, managing a public-access terminal with a high-speed link
to the machines in Texas. After a time CAI purchased a DEC-20 machine,
which was frankly one of the nicest machines I have ever worked on.
It ran TOPS-20 (of course!), which had features that are only today
being rediscovered. It had virtual memory. The DDT debugger was
set up to load at a dedicated virtual address, which meant it
could be invoked and attached to an already running task in a clean
and straightforward way. The DEC-20 was again word-addressed, now
with 36-bit words holding five 7-bit characters (leaving an "extra"
bit, the observant reader will note -- the editor used this to tag
line numbers in files). The DEC-20 did have instructions that could
handle strings and magically incremented right across that nasty extra
bit, so you didn't really mind it after all. But we still weren't
talking bytes yet, and octal dumps were the order of the day, as with
the Cyber machines.
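
For the curious, the 36-bit packing works out something like the
following C sketch (uint64_t again standing in for a machine word,
and the names my own):

    /* Pack five 7-bit characters into a 36-bit word, leftmost
     * character in the high bits.  The characters occupy bits 35..1,
     * leaving bit 0 spare -- the "extra" bit the editor used to tag
     * line-number words.  uint64_t stands in for a 36-bit word. */
    #include <stdint.h>

    uint64_t pack36(const unsigned char chars[5], int line_number_flag)
    {
        uint64_t word = 0;
        for (int i = 0; i < 5; i++)
            word = (word << 7) | (chars[i] & 0177);   /* mask to 7 bits */
        return (word << 1) | (line_number_flag & 1);  /* spare low bit  */
    }
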
The Interdata 8/32
Back at the University of Arizona things weren't standing still.
The folks at the computer center had managed to get a second DEC-10
and got the two DEC-10's talking to each other in some way that
slowed both of them down to the point where it would have been
better to just have one or the other. They also got one or both of
them talking to the 6400, or whatever Cyber mainframe they had by then.
This was a good thing, since jobs could be prepared on the DEC-10
with an interactive editor and submitted over whatever link they
had between the two machines. (This was back when men were men and
networks hadn't been invented yet).
I found myself over at the Lunar and Planetary Lab working on an
unusual machine known as the Interdata 8/32. This was what you
would call a minicomputer. It didn't cost multiple millions of
dollars, so a small department could afford it. It was sold by
Perkin-Elmer and ran a nightmare of an operating system known as
OS-32. I would ask questions like: Why didn't you just buy a VAX?
And later: Why aren't we running UNIX on this thing? But some folks
felt hurt by questions like these, so eventually I learned not to ask.
We could have been running VMS or TOPS-10, but instead we had
OS-32, which didn't even allow typeahead. Of course it was coded in
assembly language for speed. :->
The Interdata could crank about 1 MIPS and sported a full complement
of 1 megabyte of memory! They told me it was core memory, but I didn't
believe it then and still don't. The Interdata was byte-addressed, and
did have a segmented MMU. We had edition 7 unix on a tape, and the
grand moment arrived when the previous system manager moved on to another
job and we replaced the pair of 80-megabyte "removable pack" disks
(each was a washing-machine-sized monster) with a pair of 220-megabyte
Fujitsu Eagles. It turned out that OS-32 couldn't support the Eagles,
so it was a fine chance to dust off the unix tape and make the switch.
Personal computers
While we were busy with unix on the Interdata, other folks were
busy with Z80 computers, the S-100 bus, and other forms of 8-bit
desktop computing. This whole period pretty much just passed me
by. But then my boss purchased one of the original IBM PCs and I
had a shot at programming it. Once again an IBM product severely
abused me, but I really cannot blame IBM; rather, Intel and Microsoft
conspired to repulse me. I honestly approached the whole business
with an open mind, but between the segment register business from
Intel and the bondage-and-discipline assembler MASM from Microsoft,
I was soon thoroughly disgusted.
Experiences with Data General
About this time, I jumped ship and went across the street to work
at Steward Observatory. At the time their computing was primarily
served by a Data General MV/10000 running AOS. True, this thing
did run 2 or 3 MIPS, but it had the most disgusting internal architecture
I have ever laid eyes on, bar none. Just for starters, it could
address memory with byte addresses or word addresses, depending on
just which instruction was being executed. This was the first machine
I worked on that I did not attempt to program in assembly language.
I browsed the manual, got a bird's-eye view of the architecture, and
that was plenty. I even managed to mostly ignore the operating system
as well by running the unix emulator, which had a C shell interface.
This stood me in good stead and spared many healthy brain cells a
terrible fate.
Water under the bridge
It is perhaps worth pausing to ponder a valid and wholesome
approach to all of this: some stairs can be climbed two steps at a
time. The computer industry moves fast enough that it is reasonable
to just let some developments pass by. I pretty much let FORTH pass
me by (though it has tried to grab me on several occasions, and I even
put together a threaded interpretive language that ran on the 6502).
Then there are DEC VAXen and the whole world of VMS that I managed to
miss. Then there is the whole world of Z80 machines and CP/M; this
too tried to grab me, but I soon arrived at a rule of life: anything
that doesn't have a hard drive isn't really a computer. And of course
there is the world of Data General machines and AOS that I largely
managed to ignore.
Sun workstations
As the MV/10000 passed into its golden years, the observatory began
to work with these fine computers. Here is unix at its best.
Here are wonderful machines with beautiful interactive graphics.
And when we added our first sun4 to our sun3 collection, the poor
Data General was absolutely outclassed. I compiled my first little
C program and couldn't believe the compile had really happened,
the prompt was back so fast. I was used to starting a compile with
a script that would beep when it was done and passing the time
reading a few pages in a book.
Intel revisited
I have been forced to reexamine the whole x86 architecture, and the
discovery is that an amazing piece of work has been done. The 8088
is a reasonable chip if you view it as an 8-bit processor on steroids.
It is like a Z80 with a divide instruction and built-in bank-switching
features to get beyond the 64k barrier. The 8088 was perhaps the
finest 8-bit processor ever made, but I have had more fun programming
the Z80. The 80286 was brain damage from the word go; what else needs
to be said? But the 80386 and up show an incredible transformation.
The giant wart of the segment registers is transformed into a transparent
segmentation mechanism, and paging and virtual memory capabilities have
been added. These things can run a real operating system, so I do:
I am running NetBSD on an x86 (where x86 is 486, 5x86, or P5 at different
times), and it does a beautiful job.
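
As a footnote on that segment register business: in real mode the
8088 forms a 20-bit physical address by shifting the 16-bit segment
value left four bits and adding the 16-bit offset, which is exactly
the bank-switching flavor I mean. In C:

    /* 8086/8088 real-mode address arithmetic: segment shifted left
     * 4 bits plus offset gives a 20-bit physical address (hence the
     * 1 megabyte limit).  Maximum value is 0xFFFF0 + 0xFFFF = 0x10FFEF. */
    #include <stdint.h>

    uint32_t real_mode_addr(uint16_t segment, uint16_t offset)
    {
        return ((uint32_t)segment << 4) + offset;
    }
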
The future
What lies ahead? Can I ignore C++ and leapfrog to the next language
technology? Will Java be the way of the future? Will I one day be
running Linux instead of NetBSD? Stay tuned for answers to these
questions and more!
Have any comments? Questions?
Drop me a line!
Adventures in Computing / [email protected]