Raskin's second article


This essay is from: 

JefRaskin@aol.com

In the interest of eliciting more historical facts, here are a few additional
recollections. A lot of people asked to see my thesis, but that's rather large
and much is irrelevant to this discussion, so I've excerpted the pertinent
parts of it here.

Bruce Horn pointed out that Xerox had the pull-down menus that I
attributed to Atkinson (a case of independent reinvention of which I was
unaware) though click-and-drag for selection and moving were (as far as
he and I know) invented at Apple. Bruce and others have expressed the
idea that the Canon Cat was much as I would have wanted the Mac to be.
However, the Canon Cat as marketed by Canon was only a dim echo of
what my colleagues and I (at Information Appliance Inc.) worked on. For
example, most people only know it as a closed-architecture (which it
wasn't) secretarial workstation.

This is because Canon did not want to reveal that it was actually a
68000-based bit-mapped product with a nice set of graphics tools in
ROM--which tools they never used. This was partly because they decided
to bundle it with a daisy-wheel printer (!) that could not do graphics and
partly because it was brought out by their electric typewriter division and
not their computer division. Most observers at the time thought the
marketing had been botched, and I am not going to disagree. Though the Cat
was designed to allow easy integration of third-party software, Canon never
pursued this essential path, and the few third-party vendors we had begun to
sign up never had time to complete their work before Canon bailed out. The Cat
had a connector and software hooks for a pointing device, but Canon
never provided a mouse or mouse-equivalent for it.

In many ways it was, for 1987, far ahead. It automatically resumed from
where you had left off working, even if you turned off the machine in the
interim; it had a screen-saver; it was instantly on with any keystroke (and
you didn't even lose the keystroke); it would have qualified for a "green"
sticker had they been invented then; it had true document-centric operation
with a level of integration beyond any of today's suites, OpenDoc, or
OLE; an ease of use that has yet to be duplicated; a boot time of about
seven seconds that seemed instantaneous due to a psychological trick;
full-text retrieval of anything, no matter in what application; built-in
communications including an auto-answer, auto-dial modem, available and
integrated with all the application areas; and so on.

Even the normally sage Esther Dyson didn't understand the product's
openness; at one point she wrote that it was unacceptable since it didn't do
footnotes. But neither did the Mac, until appropriate third-party software
came along. Yet everybody liked the Cat interface.

With regard to my thesis, its formal title was "A Hardware-Independent
Computer Drawing System Using List-Structured Modeling: The
Quick-Draw Graphics System" (Pennsylvania State University, 1967). All
the material in quotes in the next few paragraphs is from the thesis.

Some things I probably should put into their chronological context;
otherwise they may seem strange after thirty years of bit-loss. First, I did
not have access to an interactive graphics terminal, which I thought would
"excite images of a new era in man-machine communications as the more
visionary proponents of the interactive console rightly put forth...".
Surprisingly, the utility of interactivity was not apparent to all computer
scientists at the time. A section of my thesis (6.23) on Interactive Graphics
was snuck in nonetheless. For example, when the system had to ask the
user something I proposed that small menus could appear right on the
display and that the user could "detect on" (now we would say, "click on"
as we don't use light pens) the appropriate element in the menu. Now we
would call this a dialog box.

I was "providing a common programming system for diverse output
media" based on their shared basic abilities, at that time the ability to create
a vector. Bit-mapped systems were not available at Penn State. At a time
when the existing graphics packages at Penn State drew only "charts,
graphs, and tables," I spoke in terms of creating a system for "Architects,
electronics engineers, musicians, computer scientists, artists,
meteorologists, linguists, chemists, and indeed the entire academic and
professional community."

The real need, I wrote, was to "have the ability to define arbitrary symbols
and manipulate them into complex pictures. Such symbols could be
representations of furniture and fixtures in floor plans, resistors,
transistors, and the like in schematic diagrams, notes and clef signs in
music, the individual shapes in flow charts, symbols for atoms and
molecular structures, sentence structure diagrams, and so on without
limit." I saw using images hierarchically, "The fixtures are arranged into
rooms, the rooms, now treated as units are arranged into buildings, and
the buildings, as units, become developments, urban centers, and cities."
The Quick Draw Graphics System (QDGS) provided all affine
transformations of objects, and a few others (such as perspective) as well.
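The list-structured, hierarchical modeling the thesis describes — symbols composed into rooms, rooms into buildings — might be sketched today along the following lines. This is my own modern illustration, not the original FORTRAN; every name in it is an assumption.

```python
import math

# A "symbol" is a list of line segments in its own coordinates; a "group"
# nests (transform, child) pairs -- the list-structured, hierarchical model.

def affine(a, b, c, d, tx, ty):
    """2-D affine transform as a 2x3 matrix (rotate/scale/shear/translate)."""
    return (a, b, c, d, tx, ty)

def apply(t, p):
    a, b, c, d, tx, ty = t
    x, y = p
    return (a * x + b * y + tx, c * x + d * y + ty)

def compose(t2, t1):
    """Transform equivalent to applying t1 first, then t2."""
    a2, b2, c2, d2, x2, y2 = t2
    a1, b1, c1, d1, x1, y1 = t1
    return (a2 * a1 + b2 * c1, a2 * b1 + b2 * d1,
            c2 * a1 + d2 * c1, c2 * b1 + d2 * d1,
            a2 * x1 + b2 * y1 + x2, c2 * x1 + d2 * y1 + y2)

IDENTITY = affine(1, 0, 0, 1, 0, 0)

def flatten(node, t=IDENTITY, out=None):
    """Walk the hierarchy, emitting device-level vectors (p0, p1)."""
    if out is None:
        out = []
    kind, body = node
    if kind == "symbol":
        for p0, p1 in body:
            out.append((apply(t, p0), apply(t, p1)))
    else:  # "group": a list of (transform, child) pairs
        for child_t, child in body:
            flatten(child, compose(t, child_t), out)
    return out

# A unit-square "fixture", placed twice in a "room", the room placed
# (translated) into a "building" -- fixtures -> rooms -> buildings.
square = ("symbol", [((0, 0), (1, 0)), ((1, 0), (1, 1)),
                     ((1, 1), (0, 1)), ((0, 1), (0, 0))])
room = ("group", [(IDENTITY, square),
                  (affine(1, 0, 0, 1, 3, 0), square)])
building = ("group", [(affine(1, 0, 0, 1, 10, 10), room)])

vectors = flatten(building)   # eight segments, all in building coordinates
```

Since everything bottoms out in vectors, the same flattening works for any device that can draw a line, which is the hardware-independence point of the thesis.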

For portability it was written in a higher-level language. In this case
FORTRAN was chosen, since it was the one scientific language almost
universally available at any computer center in the U.S. at the time. True
to what I would later do in the industry, I provided a plain-English
"QDGS Primer" to get people started, and included it as an appendix to
my thesis. Another section of the thesis discussed three-dimensional
representation, as the QDGS could do perspective drawings; we made a
few short films such as one showing a cube with writing on its faces
rotating in space. This is trivial now, of course, but few people were
doing 3-D computer-generated animation prior to 1970.

A lot of the thesis was standard computer science / math stuff, with matrix
calculations and formal grammars. E.g. "As a grammar... it is
context-free, and since it is self-embedding, it is clearly not regular." But I
will skip all that formal stuff.

More important from today's perspective is the observation that "with
character generators one has a limited choice of lettering sizes... There is
one orientation, horizontal, and no ability to introduce new characters."
Over the objection that I was sacrificing efficiency I decided that "Within
the QDGS no provision is made for the use of character generators,
although special programs could be devised. If such provision were made
what would be lost, aside from hardware independence, is the ability to
have annotations in various sizes, styles, and proportions, at any angle
and position. What also would be lost is the ability to treat characters with
a full range of transformations available to other geometric shapes, to
create arbitrary characters as the need arises, to make annotations part of a
figure and thus moved about with the picture it annotates, and the ability to
squeeze, justify, fit, and creatively use characters as picture elements in
every way."

I went through this same argument again a decade later with Woz when he
was designing the Apple II. I argued that he should eliminate the character
generator and do all character generation graphically, but Woz didn't think
that would work. Jobs didn't understand what was so important about
making computers graphics-based. I finally got the hardware architecture I
wanted by making it a fundamental principle behind the Mac.

Back to my thesis: to ensure hardware independence I considered the effect
of different raster sizes and how to compensate in a graphics system for
different resolutions "such that if the entire picture is scaled its appearance
is the same no matter which device it is plotted or drawn on..." This all
seems pretty modern, but other things were primitive: "A small
innovation... but a nicety, is that labels can be automatically centered."
It is hard to believe that such an idea could ever have been an "innovation."
Many people were surprised to find the graphic arts and typographical
term "fonts" in a computer science thesis. Computer scientists were not
supposed to be interested in such things, except as an eccentric hobby.
Most of my colleagues had never heard of a font. Now, everybody who
uses computers is aware of fonts; there's the very word on the menu
above this note I am writing.
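The resolution-compensation idea can be sketched as a simple mapping from normalized picture coordinates to a device's own raster. Again this is my illustration of the principle, not the thesis code, and the function and parameter names are assumed.

```python
def to_device(x, y, width_px, height_px):
    """Map a point in normalized picture coordinates [0, 1] x [0, 1]
    onto a device raster of the given size.

    Scaling by the device's own resolution means the picture keeps the
    same proportions whether it lands on a coarse plotter or a
    high-resolution display.
    """
    scale = min(width_px, height_px)     # uniform scale: no distortion
    ox = (width_px - scale) / 2          # center the picture on the device
    oy = (height_px - scale) / 2
    return (round(ox + x * scale), round(oy + y * scale))

# The same picture point lands proportionally on very different devices:
p_plotter = to_device(0.5, 0.5, 1000, 1000)   # center of a square plotter
p_screen  = to_device(0.5, 0.5, 640, 480)     # center of a 640x480 screen
```

The uniform scale is the judgment call here: scaling x and y independently would fill the device but distort the picture, which is exactly what the quoted passage rules out.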

The most heretical statement I made (my advisor thought it questionable)
was that my work was based on a "design and implementation philosophy
which demanded generality and human usability over execution speed and
efficiency." This at a time when the main aim of computer science courses
was to teach you to make programs run fast and use as little memory as
possible. Come to think of it, maybe we should bring back some of those
courses: nowadays major companies can't seem to write a word processor
in fewer than 8 megabytes.

When I put human usability as a major goal, I was off on my own, and
did not find like-minded computer scientists until I ran into them at PARC
about six years later. In fact, the very term "human usability" didn't enter
the computer science lexicon with any regularity until later. The IBM
Usability Lab opened in the early 1970's and at first was mainly
concerned with ergonomics. Only a handful of people, such as
Sutherland, Weinberg, Gilb, and Engelbart, seemed interested in the topic,
and I didn't learn of the work of the last three until after I had done my
thesis. (Weinberg's ground-breaking "The Psychology of Computer
Programming" was published in 1971. The quantitative work of Card,
Moran, and Newell became widely known with the publication of their
important book, "The Psychology of Human-Computer Interaction", which
appeared in 1983.)

My thesis led up to a final illustration: the opening measures from
Beethoven's "Variations on God Save the King" for piano. I had created
the music font (including an elegant slur generator) and the software for
using it as a side project. I also designed and built a digitizer for putting
existing scores into the computer: when you pressed a button to indicate
that you were pointing where you wanted, it noisily punched out Hollerith
cards on keypunch machines I had modified. I am not sure that anybody
had ever before used general-purpose graphics devices to notate music.
The notation was not crude; it looked enough like published music that
most people could not detect that it was computer-generated. The story of
this work and a photo of the digitizer I built have been published.

As I said in my history of the Mac Project (the one currently being
serialized in CHAC), the Mac was by no means the work of one person,
but the combined efforts of thousands in hundreds of companies large and
small. It was not, as many accounts anachronistically relate, stolen from
PARC by Steve Jobs after he saw the Alto running Smalltalk on a visit.
For one thing the usual account (as in Levy's book, "Insanely Great" and
others) denigrates the original and creative work done by all the Apple
employees that put their hearts into the Mac. Most of the histories of the
Mac were written without their authors interviewing the original team
(Brian Howard, who contributed so much, is invariably overlooked), and the
history of the Mac that Apple's own PR department dispensed was based
on Jobs's version. Many didn't speak with me: without knowing that I
had worked out many of the key usability ideas when Jobs was still in
grade school and before there was a Xerox PARC to learn from, it is
perhaps understandable that people would find it necessary to invent a
history that derives the Mac's genesis from the nearest similar work. The
honest intellectual debt the Mac owes to the work at PARC was not a case
of highway robbery. 

------- 

I very much appreciate the many kind emails I've received, and since they
run into the many dozens, I cannot answer them all individually. David
Craig, a computer history buff, asks if I have the memo on the design of
the one-button mouse. I don't know, someday I may have time to go
through my papers and find out. Arild Eugen Johansen asks for a list of
names of the early Mac team. This, too, will have to wait for the same
reason; I don't want to do it from memory and leave out someone by
accident. Owen Linzmeyer, author of "The Mac Bathroom Reader", which
contains one of the best accounts of the history of the Mac yet published,
wrote to ask if I was still using the "Millionth Mac" I was given in 1987.
Yes, its main job is to run our Lego toys through a neat computer
interface Lego sells. I wrote it up in WIRED.

Glen Cole asks if my attempt to avoid trademark conflict with McIntosh
audio equipment by spelling the computer "Macintosh" was successful.
He correctly remembers that it was not. Apple even had to pay the raincoat
manufacturer for use of the name; I have no idea why. Glen also asks
how I feel about the nickname "Mac" for the product. Fine, we used it
from the very beginning. Andrew Warren, referring to something I wrote
for Upside magazine asked, "Still believe that there'll be exactly 2,156
mainframes left by the end of the millennium?" Giving so specific a number
was my way of saying "that's a silly question." First, we'd have to define
"mainframe". And then how would I know anyway? Kent Borg asks
where the innovation is happening. It looks like Apple will take the route
of being compatible with all the nonsense in the universe. They'll survive,
and I'm glad I'll still be able to use Macs, but I am not excited. What I
want to create is software that is as easy to use as the Cat was but with
the power of today's broad range of applications. I know how to do it, I
just haven't found a company where I can build it. 


mdreier
Last modified: Tue Mar 4 20:36:07 MET 1997