from Hacker News

The Canon Cat

by maguay on 10/21/22, 8:02 AM with 160 comments

  • by Roark66 on 10/21/22, 11:08 AM

    The "don't use icons, use words" rule resonates with me. I really dislike android drawing(and other similar) apps that present a cluttered interface with a dozen of cryptic icons.I much prefer words. Also having each widget take a lot more space forces the designer to really think about the value of adding one vs hiding it.

    Multilevel menus are IMO definitely better. If the app is a real productivity app, people will spend ages in it, so adding the ability to create "shortcuts" to the most often used items will give power users a way to reach them quickly.

  • by emptyparadise on 10/21/22, 9:35 AM

    Jef Raskin's The Humane Interface is a very good read. I didn't find myself agreeing with many of its proposed solutions to HCI problems, but it does an incredibly good job of identifying the issues that we still have to this day with user interfaces.
  • by Gordonjcp on 10/21/22, 2:57 PM

    Every time the Canon Cat comes up, I'm surprised no-one mentions the Amstrad PCW family and Locoscript. It was launched two years before the Canon Cat, and not only made it to market but had roughly the same market penetration in the UK as lightbulbs.

    Given that a lot of Amstrad stuff wasn't exactly known for being the highest quality, the PCW8256 and 8512 were surprisingly good for the £400 they cost (about a grand in today's money, roughly half the price of the Canon Cat). You got a fairly chunky computer about the size of a 14" TV, with a green-screen monitor that had a not-ridiculous persistence, so no flickering and a fairly "gentle" colour. The 3" disks were a bit weird. The keyboard was a nicely clicky "spring over membrane" design that felt nearly as good as a proper mechanical one (not a patch on a Model M, but better than most!) and had a bunch of buttons for commonly-used functions. If you pressed any of the modifier keys, the menu at the top of the screen would change to show you what you could do. It even came with a dot-matrix printer that could do graphics, after a fashion.

    You could buy a posher version with a white screen and a daisywheel printer, too, but they were more expensive and the printer was extremely slow and noisy.

    It was only slightly harder to get started with than a biro and a notepad.

    I wish Locoscript had won instead of MS Word.

  • by hcarvalhoalves on 10/21/22, 1:54 PM

    It's funny that many of the principles, like "Auto-save changes", "Restart where you left off", "Make commands accessible everywhere", and "Use words instead of icons", are qualities I enjoy in Emacs, and qualities that also derive from the Lisp machines.

    The only conflicting point is "Never allow customization", which I guess is where typical users' needs diverge from expert users'. Everything else seems to be universal to good UIs.

  • by msla on 10/21/22, 3:35 PM

    I remember reading about Raskin's work on Ward's wiki, back before Ward and crew made it unusable. I wrote some pretty heated diatribes against it, which I'm not going to do now, but I will push back on one of his dogmas:

    > Never allow customization: Consistency, though, led Raskin’s perhaps most controversial idea, prompted by the trouble he saw customers have with documentation. “Customizations are software design changes that are not reflected in the documentation,” and as a documentarian, this could not stand. The designer knows best—something that comes through strongest in Apple’s products—and “allowing the user to change the interface design often results in choices that are not optimal, because the user will, usually, not be a knowledgeable interface designer,” said Raskin. “Time spent in learning and operating the personalization features is time mostly wasted from the task at hand.” Better a consistent, well-designed interface than one you could fiddle with forever.

    This is a meta-dogma, a dogma about being dogmatic about your design. Never allow the user to change your Holy Vision, because Your Beneficent Self, The Designer (Peace Be Upon You), has decreed it shall be such, such it shall always be, yea, unto the ends of the system's profitability, never allowing the user to grow in their knowledge of how to do their tasks, never allowing the user to bring their own domain knowledge to their tasks. There shalt always be an unbridgeable gulf between Designer (insert holy trump here) and user, and the user shall never trammel the Designer's Roarkian Vision. So mote it be, amen.

    It's High Modernism in software. It's the exaltation of One True Vision above the people who do the work and might, therefore, know something about how the work is done. It is, in other words, utterly shocking Jobs rejected the Canon Cat and its immense hubris. Probably because it wasn't Jobs' immense hubris.

  • by kkfx on 10/21/22, 9:31 AM

    IMVHO all "big of IT" have done their best to castrate users ability to use a computer instead of being used by it like a piece of a machine.

    Computing is power; if users get such power they improve, becoming less easy to milk and steer. That's why we see a war against the desktop concept from Xerox, waged initially by IBM and then by all the others.

  • by omar_alt on 10/21/22, 9:50 AM

    A simple low-energy computer with an ink display that does one task, like word processing or acting as a terminal client, would be useful for focusing on one task without distraction. I personally would love to have a low-energy terminal device that I could work on for hours; however, my pessimism tells me there is not enough demand to scale this.
  • by jessegrosjean on 10/21/22, 11:00 AM

    If you are interested in the Canon Cat, these are two good sites:

    - Documents: http://www.canoncat.net

    - Web based emulator: https://archive.org/details/canoncat

  • by guenthert on 10/21/22, 5:27 PM

    An article about the Canon Cat which doesn't mention that it was programmed in FORTH? Perhaps not essential to the UX, but given the claim that "A predictable, documentable system must be entirely under Apple's control," FORTH, renowned for its extensibility, in fact for completely blurring the lines between language, OS and application, seems a non-obvious choice.
  • by dang on 10/21/22, 4:46 PM

    Related:

    Leap Technology (1987) [video] - https://news.ycombinator.com/item?id=33137433 - Oct 2022 (40 comments)

    The Canon Cat: The Writing Information Appliance (2004) - https://news.ycombinator.com/item?id=30836958 - March 2022 (42 comments)

    Demo of the Canon Cat computer released in 1987 with 'leap' feature [video] - https://news.ycombinator.com/item?id=29423545 - Dec 2021 (1 comment)

    Canon Cat - https://news.ycombinator.com/item?id=26213934 - Feb 2021 (31 comments)

    Leap Technology (keyboard vs. mouse on a Canon Cat machine, ca 1987) - https://news.ycombinator.com/item?id=22042900 - Jan 2020 (1 comment)

    Canon Cat Emulation - https://news.ycombinator.com/item?id=18032916 - Sept 2018 (2 comments)

    Canon Cat Resources – Jef Raskin's Forth-Powered Word Processing Appliance - https://news.ycombinator.com/item?id=14650365 - June 2017 (23 comments)

    The Canon Cat - https://news.ycombinator.com/item?id=6978587 - Dec 2013 (30 comments)

    Canon Cat Documents Archive - https://news.ycombinator.com/item?id=3394546 - Dec 2011 (8 comments)

    Canon Cat - https://news.ycombinator.com/item?id=595744 - May 2009 (15 comments)

  • by lisper on 10/21/22, 5:20 PM

    > design the computer to fit the human's needs

    The problem: different humans need different things. But Apple nowadays operates as if everyone needs the same things, at least in terms of UI/UX. If your actual needs (or desires) differ from Apple's preconceived notions, you are simply out of luck.

  • by forinti on 10/21/22, 12:15 PM

    It's interesting that the Cat came out in 1987, the same year as the Cambridge Z88.

    The Z88 cost 250 pounds (about US$400 at the time), which would make it a lot cheaper than the Cat.

  • by Razengan on 10/21/22, 10:22 AM

    Most of the ideas and rules there, like "auto-save changes, restart where you left off, and make commands accessible everywhere", have been a core part of the Mac experience for over a decade, and a big reason why I've loved Macs since jumping ship from Windows.
  • by rob74 on 10/21/22, 1:13 PM

    I can totally understand why the Canon Cat was a flop. At the time it came onto the market, mouse-driven GUIs were largely seen as the thing of the future, and those didn't have to be explained: moving the mouse and clicking was like pointing your finger at the screen and tapping it directly. And then this weird thing with no GUI and no mouse comes along. Its UI may have been easy to use too, but it probably needed a lot of explaining and getting used to before you could internalize it, so only a few power users really made the effort to familiarize themselves with it.
  • by TacticalCoder on 10/21/22, 10:09 AM

    The keyboard is intriguing: the two "LEAP" keys in front of the spacebar, obviously meant to be used with the thumbs, are not dissimilar to what some ergonomic keyboards are now using. Apparently they're both labelled "LEAP" on the top of the key, while the side reads "LEAP AGAIN" (that's what I see from googling a few images). Wikipedia says these keys are for "incremental string search".
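
    For the curious, here is a toy sketch of what that kind of leap-style incremental search does (the code and names are purely illustrative; the Cat's actual implementation was in Forth): with each keystroke the pattern grows and the cursor jumps to the nearest match in the chosen direction, and "leap again" repeats the last pattern.

      # Toy model of leap-style incremental search; illustrative only.
      def leap_from(text, anchor, pattern, backward=False):
          """Nearest occurrence of `pattern` after (or before) `anchor`,
          or None if there is no match."""
          hit = text.rfind(pattern, 0, anchor) if backward else text.find(pattern, anchor + 1)
          return None if hit == -1 else hit

      text = "the cat sat on the mat while the cat slept"
      anchor = 0                    # cursor position when LEAP was pressed
      pattern = ""
      for ch in "cat":              # user holds LEAP-forward and types c, a, t
          pattern += ch             # the pattern grows with each keystroke...
          cursor = leap_from(text, anchor, pattern)   # ...and the search repeats from the anchor
          print(repr(pattern), "->", cursor)          # 'c' -> 4, 'ca' -> 4, 'cat' -> 4
      # "LEAP AGAIN" repeats the last pattern from the current match to reach the next hit
      print("again ->", leap_from(text, cursor, pattern))  # again -> 33
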
  • by kaveh808 on 10/21/22, 10:07 PM

    I am currently working on a new 3D system, and have been thinking a lot about GUI design.

    My impression is that as applications have grown in complexity, there has not been a corresponding change in our approach to GUI design. I'm talking about desktop apps for 3D content creation (Maya, Houdini, Blender, etc.).

    The menubar and hierarchical menus worked elegantly in the simpler days of the Xerox Star and Apple Macintosh, but the continued reliance on them makes me chuckle at times.

    The other day I decided to count the number of GUI elements visible in one such app. There were over 200. I can't prove this, but my feeling is that this visual/usage clutter can create confusion and anxiety in users. Or maybe users just learn to ignore the 90% of widgets they never need to use.

    I find it increasingly awkward to have to move the mouse to click on a 16x16-pixel widget on a 4K screen. Most GUI actions (clicking a button, a menu, an icon) are not inherently graphical.

    One app has such a large contextual menu (with many submenus) that the menu includes its own search field, and users click to make the menu appear, then type in a few characters to locate the menu item they want, then click on the item. I can't help but shake my head and chuckle.

    My own attempts at coming up with something different have resulted in a series of (short) popup menus that can be invoked via keyboard. My hope is that users will develop muscle memory to go to the selection they want quickly. For example, hitting "C,C,C" (the "c" key three times) invokes, in order, the Create, Curves, Circle menus (each menu replaces the previous one) and creates a circle in the 3D scene.
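
    As a rough illustration of that kind of keyboard-driven cascade, a minimal sketch (the menu tree and key bindings below are invented for the example, not the actual design described above):

      # Toy sketch of keyboard-driven cascading popup menus: each keystroke
      # either opens the next submenu or fires the action at a leaf.
      MENUS = {
          "c": ("Create", {
              "c": ("Curves", {
                  "c": ("Circle", lambda: print("created a circle")),
                  "l": ("Line",   lambda: print("created a line")),
              }),
              "p": ("Primitives", {
                  "s": ("Sphere", lambda: print("created a sphere")),
              }),
          }),
      }

      def handle_keys(keys, menus=MENUS):
          """Walk the menu tree one keystroke at a time; each submenu
          'replaces' the previous one, and a leaf runs its command."""
          node = menus
          for key in keys:
              label, child = node[key]
              print("menu:", label)
              if callable(child):   # reached a leaf: run the command
                  child()
                  return
              node = child          # descend into the submenu

      handle_keys("ccc")            # Create -> Curves -> Circle, creates a circle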

    Been planning to make a video demo...

  • by predictsoft on 10/21/22, 11:07 AM

    The other day was a story about the CueCat. I always got confused between the CueCat and the Canon Cat!
  • by rubenv on 10/21/22, 9:25 AM

    I'm sure this may be interesting, but with such low contrast it's simply impossible to read.
  • by jFriedensreich on 10/21/22, 10:40 AM

    I think it boils down to icons + mouse vs text + keyboard. Imagine a Mac without Susan Kare's icons, which played such a big role in making it feel truly human and in developing an emotional attachment. Among so many great UI guidelines, the dislike for icons is the one outlier, and for a minor and solvable (hover + tooltips) reason.
  • by al_be_back on 10/21/22, 11:13 AM

    More like Canon Emacs!
  • by dchest on 10/21/22, 9:15 PM

    Software written in Forth. Here's a video showing its interpreter mode and Forth words https://www.youtube.com/watch?v=XZomNp9TyPY
  • by retrocryptid on 10/21/22, 2:17 PM

    I take issue with the "forgotten" moniker. I use my cat every other day.
  • by lproven on 10/23/22, 11:22 AM

    Great article, great machine. Probably the single most important computer of the late 1980s.

    I wonder if it would be possible to write an implementation of the Cat UI on top of Emacs?

  • by EricE on 10/22/22, 1:18 AM

    There is no doubt Raskin had a lot of great ideas - I still wouldn't want to own or use a computer designed by him.
  • by elzbardico on 10/21/22, 2:46 PM

    It just shows the genius of Steve Jobs. Jef's Macintosh would have been an utterly boring, dumb, and limited machine.
  • by Ebree on 10/25/22, 11:29 AM

    "Parents" buy PC for children's better grades (misunderstood as education process), read: to throw their responsibility for teaching kids away; and SPhones/Tablets to make kids happy read: avoid them & avoid raising them completely. (instead of creating the best educated friends you never had)

    In era of Telly being religious altar & oracle, buying a boring tool without feature of entertainment and non-marketable as colorising life / uplifting status seems incomprehendable.

    If, only boring tools were available children having other options would still not use them without pressure / good faithed convincing / bribery, so only these without other alternatives would use it.

    Extreme majority is not teached to teach & convince/evangelise. What they know is forced through obligatory schooling or coerced by popular media reeducation disguised as entertainment & socialisation.

    It conditioned minds and culture to prefer a sweet poisoned fruit or fool's gold over juicy beef steak earned through hard & uncomfortable. And it's even marketed as free willed choice & freedom..

    Families teaching & training kids can fit in a statistical margin error. Not counting ideological training though.. which vary between better or even worse, usually just mirroring the present & future telling magic window.

    Digressing

    As a loner born under communist occupation, a kid dumped with an old German-language Windows-like "PC" & printer, with only some black-and-white architecture apps, I can say from experience:

    God damn all foreign languages (the Tower of Babel curse..); bless the universal language of self-explanatory images & idiot-proof systems with a one-choice-one-action principle, learning by exploring & building your own features from simple blocks on the fly.

    Words, icons and images are descriptors & ideograms. If we count numbers as pictures, then words are made of pictures too, just with a predefined abstract meaning behind each composition.

    We can map, plot and precisely group common words and their meanings & relationships in machine-learning multidimensional spaces. So you could also use these algorithms, nowadays called AI, to find each word's & meaning's approximate image/icon counterpart. It would be weird if nobody had yet tried it and generalised it across the most common world languages. It would create the only truly all-human lingua franca. ∆∆∆
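
    In that spirit, a toy sketch of matching a word to the nearest icon by comparing vectors; the tiny hand-made "embeddings" below are stand-ins for a real multilingual embedding model, and the word and icon names are made up for the example:

      # Toy sketch: pick the icon whose vector is closest (by cosine
      # similarity) to a word's vector. Hand-made 3-d vectors stand in
      # for a real embedding model; purely illustrative.
      import math

      EMBEDDINGS = {
          "delete":      (0.9, 0.1, 0.0),
          "trash-can":   (0.8, 0.2, 0.1),
          "save":        (0.1, 0.9, 0.0),
          "floppy-disk": (0.2, 0.8, 0.1),
          "search":      (0.0, 0.1, 0.9),
          "magnifier":   (0.1, 0.0, 0.8),
      }
      ICONS = ["trash-can", "floppy-disk", "magnifier"]

      def cosine(a, b):
          dot = sum(x * y for x, y in zip(a, b))
          return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

      def nearest_icon(word):
          """Return the icon whose vector is most similar to the word's."""
          v = EMBEDDINGS[word]
          return max(ICONS, key=lambda icon: cosine(v, EMBEDDINGS[icon]))

      for word in ("delete", "save", "search"):
          print(word, "->", nearest_icon(word))   # delete -> trash-can, etc.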

    The image wins by representing direct, visual, intuitive, least-abstract meaning. Like icons of human gestures. Like emotions.

    It's a direct representation of a real, distinct thing you keep in visual memory, along with all other instances of that idea/class & its common relations to other things.

    But if you don't offer a fully separate choice, you will never know which symbolic system is better in a fair fight.

    It's just that the border between the two is blurred & has no codifier.

    In time, universal self-explanatory meanings will mutate, adding new popular abstractions, & the two will merge.

    Like me eventually learning the Mordor-language words on the magic boxes I clicked.