

The Image Behind the Glass:
Touch Screens and Light Pens

by Dennis Báthory Kitsz

1986


This talk was presented at ID Expo in Boston, 1986
Copyright ©1986 by Dennis Báthory-Kitsz



The Importance of Touch

There’s always too much for these two hands. Sometimes it’s mere labor -- pushing a rototiller through soil or splitting firewood. Gross motions. But it’s refined when I drive a car: steering wheel, stick shift, turn signals, lights, wipers, radio. My feet work the gas pedal, brake, clutch. Further refinement now. As I compose, I push a pencil before me in tiny strokes, thousands per page, with fine, delicate motions defined hundredths of an inch apart.

But each of these tools -- tiller, maul, steering wheel, pencil -- has arisen through use, through slow, historical development.

Not so the computer. The computer is a beast that arose from a need of the mind, not the body, and it has been an uncomfortable fit from the start. We stare from the outside in, through a glass window; we’ve even become pretty good at it, which is why we deceive ourselves into believing such terms as "user friendly".

Nevertheless, we still have a deeply human need to reach out and touch. "Reach out and touch someone" is the AT&T slogan; why? (And when did touching stop being the norm?) Touching is part of our human condition, and it is perhaps a reflection of our computer age that we have lost some of that ability to touch. Consider: When was the last time you reached out to greet a friend, hands on shoulders, kissing both cheeks? An acquaintance? A business partner?

I learned my typing on a manual machine. It was a skill of measured strokes, carriage returns, paper insertion and removal, and many erasures. But that keyboard is alien now, merely what we call an interface, not a primary tool, no longer with a physical sense of being part of ourselves or even our work. We want to wave this useless appendage away and get behind that screen. Or at least our hands want real work to do.

This is true, also, for the lay person. The keyboard is an unhappy, intimidating reminder of computers past -- those alien monsters that spit out errant bank statements.



A Crash Course in Solutions

There are many solutions to the problem of humanizing the computer interface, especially through touching. Eliminating the traditional keyboard can be done through optical scanning -- where typed or printed text is machine read -- or speech input -- direct computer recognition of a (presently limited) vocabulary. But hands-on touching predominates with special clusters of keys, joysticks, mice, trackballs, touch pads, touch screens, and light pens.

Consider first the keys. Think "A". Touch "A". An "A" appears on the screen. So? What did thinking "A" have to do with how it got on the screen? Now consider special key clusters, especially arrows. They’re good ... they give a sense of direction, up, down, left, right. You think up, feel up, you touch up, something on the screen goes up.

Now turn those arrows into high-quality joysticks, not just up-down-left-right video game joysticks, but ones sensitive to direction, position, extension, and velocity of motion. With proper software, these devices pull and push something on the screen just the way we would do it if it were a pencil or a pea.

But good joysticks are very expensive, and require a certain degree of "muscular learning". (Take note that the generation raised with video games will find joysticks a comfortable addition to any screen-based system -- more on that later.) A less expensive solution to high-quality joysticks is the trackball, which sits before you and can be fingered and spun and positioned quite accurately. As you roll the ball, a corresponding motion occurs on the screen; the disadvantage here is that (until you get very used to the device) you have to "home in" on what you want to mark.

But certainly the most popular answer right now seems to be the mouse, that ubiquitous slide-and-click addition to Apple-style computers. And for good reason. It has the motion sense of the trackball, but you move the whole mouse. You move it, something moves on the screen. And at the heart of it is that the mouse is something to hold -- a just-right palm-sized touchie-feelie with some finger buttons. Roll it, push it, and click its buttons. Nice.

It seems like the ideal answer to the need to touch. But not yet and, in fact, far from it. What you move is on the screen, and the mouse is not right on the screen. It’s still a kind of muscular analog to those physical needs. Like seeing your spouse on the other side of a wire jailhouse screen.

Another route is the touch pad. This gives us a familiar tool -- a stylus (think pencil), and a pad-shaped tablet. This is good. We sketch in front of us. But still, unless we are drawing and doodling, we still search the screen to reveal the results of our scratching.

So we get to the touch screens and light pens, different executions of the same idea: reach out for the screen. Recall that the mouse, trackball, touch pad and joystick all share the keyboard’s original problem: action here happens there -- not action here happens here. It is disembodied.

Consider that the computer was heralded as offering a paperless society. As we drown in increasingly fathomless oceans of paper, we discover that computers have heightened our awareness of the print medium. They have caused more, not less, paper because they have provided paper with more versatility -- together with an understanding that paper is no longer the only tool of communication. So there is more versatility, if not (and hence?) more value. Sales of pens and pencils continue without predictions of doom and demise. Ideas continue to be scratched/born on paper napkins, phone calls memorialized in newspaper margins. There is a first lesson here already, and it is approachability.

Here’s more. Speech input systems are effective because they free our hands. Light pens and touch screens can make us slaves almost as much as a keyboard can, as much as the joystick, mouse and trackball can. All relate to the speed and convenience of accomplishing certain tasks -- foremost in the minds of many computer designers. Humans have for too long been considered the weak link in the computer system, the incompetent biological blob moving in virtual infinities of computer heartbeats, heartbeats that pulse as many times in a single second as a human heart beats in more than four months. The second lesson is a question: what is efficiency?

Recently, computer design has begun to incorporate human needs: "ergonomics" is the catchword for hardware, "user friendliness" for software. Movable monitors comforted stiffened necks, movable keyboards relieved carpal tunnel syndrome in the wrists, gently colored monitor screens abated eyestrain. Pull-down and pop-up menus of information appeared, familiar graphical symbols replaced complex typed sequences of computer commands, colors and highlights caught the eye. The third lesson is comfort.

Aside from my technical background, I am also a composer. When I speak of music, I often refer to its counterpoints as trying to listen to several conversations at once. Indeed, your own mind conducts several levels of conversation simultaneously. You discover yourself humming a tune, digging out change for a bus or subway or parking meter, mentally checking the time, preparing for a meeting, planning the evening’s schedule, considering dinner, etc. And all are informed and influenced by the residue of the morning’s news, a marital argument, a powerful film, the quality of your breakfast, and the previous night’s party. Typing along -- even speaking along -- doesn’t serve a thinking person. It is linear, demanding that thinking tasks, sub-thinking tasks and physical tasks all be similarly directed. With computers, we are often enslaved to features that disallow that normal, human, complex, interwoven, parallel and slightly schizophrenic thinking activity. The last lesson is madness.

So here are conveniences, speed, ergonomics, user friendliness. And still it’s not right. All the colors and ease of use won’t address our humanity, our madness.

While those ideas settle, it’s time to turn to how these touch screen and light pen devices work, what they’re good for, and where they appear.



Needs, Worries and Failings

Touch screens and light pens are both similar manifestations of the same goal -- to get to the image behind the glass. The idea is simple: touch a screen with a finger or a pen, and something happens. The rest is software.

Perhaps you’ve seen touch screens in use at shopping malls. They are currently replacing the old "you are here"-style wall directories, and expanding the application into a combination yellow pages and street map. Tap the red square for a list of stores, the yellow square for a list of products or services, the blue square for a map of exits.

A touch screen is conceptually quite simple. You indicate a choice or action of some kind by touching a spot on the screen. There are a number of methods used to achieve this, including fine crosswires, infrared sensors, ultrasonic sensors, and transparent membrane contacts. Reliability, trigger or threshold points, accuracy and resolution are the important elements in all these systems.

The first problem to be solved was how to make these methods reliable; presently, reliability is not an issue beyond normal maintenance.

However, trigger thresholds, accuracy and resolution present difficulties. The finger is a fat thing, sometimes indecisive and often approximate. Especially in a public setting with untrained users, such as a public shopping or services directory, the combination of hardware and software must evaluate the user’s intent. Tactile methods (crosswires and membranes) provide definitive resolution, as long as the finger presses a particular crossing point or group of crossings. The threshold here is how hard the user touches the screen -- and how often. The accuracy is how correctly the system can determine the user’s intent. The resolution is the highest number of on-screen positions possible if accuracy and threshold are ideal. Beyond this, the system must react quickly to satisfy the user, but not so fast that a single touch flips through a sequence of video "pages". (Don’t fear the hardware, really. In the computer hardware world, a sotto voce motto is "fix it in software".)
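The "fix it in software" part of this -- mapping a fat, approximate finger to a target, and ignoring nervous double-jabs -- can be sketched in a few lines. This is a modern illustration in Python, not anything a 1986 system would have run; the target coordinates and the half-second repeat window are hypothetical:

```python
# Illustrative touch-screen intent evaluation. All constants are hypothetical.
DEBOUNCE_SECONDS = 0.5   # a second touch inside this window is a double-jab
TARGETS = {              # on-screen squares, as (x, y, width, height)
    "stores":   (10, 50, 30, 20),
    "products": (10, 80, 30, 20),
    "exits":    (10, 110, 30, 20),
}

def hit_target(x, y):
    """Map a raw touch point to the target square containing it, if any."""
    for name, (tx, ty, w, h) in TARGETS.items():
        if tx <= x < tx + w and ty <= y < ty + h:
            return name
    return None             # the poorly aimed finger missed every square

def debounce(events):
    """Accept touches, ignoring rapid repeats.

    `events` is a list of (timestamp, x, y) tuples in time order;
    returns the sequence of accepted target names.
    """
    accepted = []
    last_time = None
    for t, x, y in events:
        target = hit_target(x, y)
        if target is None:
            continue        # missed touch: do nothing rather than guess
        if last_time is not None and t - last_time < DEBOUNCE_SECONDS:
            continue        # too soon after the last accepted touch: ignore
        accepted.append(target)
        last_time = t
    return accepted
```

The debounce rule is exactly the "not so fast that a single touch flips through a sequence of pages" requirement: the second of two quick jabs at the "products" square is silently dropped.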

I’m not going to identify sources and systems and companies and products. The rest of the day is for that; instead, I want you to think critically about what screen interface technology can and can’t do for you and why.

Scene: I approach a computer screen. Now what? A conflict! Do I Reach Out and Touch It? It depends on who I am. If I am me, then I am still the older generation, and for this 39-year-old, I am either curious or come with raised eyebrow. But if I am 16, I walk right up and start, as naturally as scraping a stick across a picket fence while walking. (I don’t exaggerate. Did you teach your child or did your child teach you how to run your VCR?)

Consider a better example. A group of teens bops up to your touch screen mall directory, looking for a place to eat and hang out. Remember that you want to be fair to all your eateries, and your system must give them a chance to be identified.

The screen reads:

Welcome to the Jovian MegaMall. 
Touch here <#> for stores by name. 
Touch here <#> for stores by products. 
Touch here <#> for a map of exits.

One of the group jabs once or twice at the "products" square. The screen dissolves and a new list appears. A gabble of conversation continues as one strikes out at "food". The eateries appear; someone taps "Sam’s Pizza". Sam’s logo pops up, and a colored map flashes to lead the way to Sam’s. The group disappears into the crowd.

What could have gone wrong? The screen could have flipped by too quickly as they jabbed more than once. Or the system could have missed the poorly aimed finger entirely. Or even worse, the software might have been ambiguous; that is, instead of "Touch here <#>" and showing a symbol to touch, it could have read "Touch your selection". Where to touch? Is there a button? Such simple errors are commonplace in the computer design world; a typical story is told over and over in different forms. You’ve seen the software which says, "Hit any key to continue." How many children, looking for the brightest color to press, have struck the reset key, crashing the software into oblivion? If any of these problems had come up, Sam’s would have lost out to the nearest eating place ... a worse situation than the old cardboard wall maps.

So by itself the touch screen is neither problem nor solution. On a factory or sales floor where the touch screen is accessible only to trained personnel, the hardware and software tradeoffs remain similar. In one restaurant application, for example, trained waiters and waitresses combine touch screens, identification cards and keyboards in their daily work. A magnetic ID card is dropped into a terminal, an order is built as menu items are tapped on the touch screen, and special orders or particular cooking requests are typed into the keyboard. The software forwards the order to kitchen and bar, tracks the bill, keeps a record of all orders, provides inventory information, totals for the day, and a breakdown of average supplies and raw materials needed over the long term. This type of system is available today.

Similarly, on the factory floor, all sorts of repetitive tasks presently using screen/keyboard combinations can be achieved using touch screens alone. Examples are quality control and approval areas, materials handling, inventory and orders, picking rooms, and so forth.

Remember, though, that these applications involve relatively few selections, simple responses, uncomplicated activities and unsubtle relationships. For more complex work, the light pen is the tool of choice.

The light pen is, of course, just another manifestation of the touch screen concept, but with a different set of advantages and disadvantages. It has significantly higher resolution, but it is less forgiving. It has greater accuracy, but it requires user training. It permits sketching and pen-like activities, but it is still attached by a cord. Convenience suffers.

The light pen does not put out light; it is a light sensor. It senses the light generated by a video screen. The illumination on any conventional television screen or video monitor is created by an electron beam sweeping the back surface of the glass, causing a phosphor coating to glow. The incredible speed of the beam’s movement creates an apparently solid picture to our eyes. The beam is actually streaking across the average 13-inch screen at over two and one-half miles per second.

Yet if you can identify what phosphor is glowing, you know where the electron beam is at that instant. Conversely, if you know where the electron beam is, you know where you are on the screen. A light pen triggers as the electron beam passes by its sensor; a signal is returned to the computer, which calculates exactly where the electron beam was at that instant. This is translated into screen position information for the software’s use.

Now why is this different from a touch screen, and why is it special? Of course it is more exact; it is just like a high-accuracy touch screen. But more significant is the tactile world it opens up.

Let me give you a personal example, which is my desire, even longing, for a light pen system for composing music. I, like many musicians, compose what we can hear but not what we can necessarily play. The notion of the composer at the keyboard is regressive to me, even in a high-tech studio with all sorts of digital multi-track keyboard equipment. I want to compose music that no one has yet played, perhaps what no one can yet play. Furthermore, even the ordinary musical "alphabet" consists of nearly a thousand symbols which are manipulated in parallel ways distinctly unlike the written word and its hundred or so numbers, letters and symbols.

The light pen system would allow me to fill my pen with an ink consisting of, for example, entirely quarter notes, and drop them down in a continuous stream. Tap a vocabulary at screen-side, and my pen fills with eighth notes, or rests, or whatever piece of vocabulary I need. Or, if a less common symbol is needed, I can draw it and the computer can retrieve it from its library. Or if I must invent a new symbol entirely -- quite a normal occurrence in musical composition -- I can design it, draw it with my light pen just once, and drop it into place any time I need it. The system would memorize my new symbol for the future. Yes, the technology for this application is all here right now -- only a lucrative market is missing.

In an actual graphics application at one university, the light pen is used for making selections, sketching and drawing, pulling straight lines across a screen, expanding and contracting circles, creating foci for ellipses, defining foregrounds and backgrounds, rotating on axes, filling in and shading, bordering and color selection, and so forth. In fact, with a combination of on-screen menus and other tools, virtually all graphics functions are accomplished with the light pen alone. Other than the need to lift an arm to the screen, it functions similarly to a mouse, but with greater speed, higher accuracy and certainly much more "touch" value. What we do is what takes place -- action here happens here.

Yet a light pen is a dangly thing. Wireless light pen technology has appeal, but accurate, high-speed, wireless light pens are not yet available. The problems include their size; they must contain batteries, electronics and some sort of transmitter. Look at the bulky wireless microphones used by rock musicians for a comparable example.

Furthermore, the pens must transmit a serial stream of information, whether at radio frequencies or using infrared light, meaning the receiving system has to backtrack to figure out where the light pen was by the time the information is received. It isn’t impossible, but let me give you an idea that it isn’t easy. On an average high-resolution video screen (about 400 by 600 dots), a single electron beam "bip" occurs in about one ten millionth of a second. Even if the accuracy of the human hand limits this to a 50 by 50 matrix (very crude, indeed), this still requires a pen response in one 750,000th of a second. With 10 or more chances to do this, we can pare it down to under one 100,000th of a second to grab and transmit the location information to the computer. So we will wait for a little while longer for a highly accurate wireless light pen.

Yet even the present-day attached light pen has touch. So the light pen provides a link to the image behind the glass. Many of you have seen that on-screen sportscasting now uses pen-like drawings to show football plays, reminiscent of the coach’s blackboard. There is a familiarity to such scratching on the screen ... computerized or not.

Perhaps it is because the pen is so important to us. For centuries, the pen has provided our MARK -- our signature. No contract is complete without a real, witnessed signature in ink. It is an extension of: (1) our hands (2) our identity. It defines a physical manifestation of our humanity, does it not? Monkeys can be trained to use keyboard symbols and touch screens and levers and what-have-you, but the pen! Ah! Humanity!



Asking the Right Questions

In practical terms, touch screens are the solution of choice in public situations. Voice input is not yet sophisticated enough to separate one voice from the plethora of sounds in a noisy environment, and most systems must be trained to hear a specific person’s voice if more than a half dozen words are to be recognized. Beyond that, the user cannot be too shy to speak. Joysticks, despite their familiarity to users under 25, are ambiguous; the abstraction of here-vs.-there arises. Durability and maintenance are also problematic with joysticks. Trackballs present the same problems. Mice and light pens require attachment cables, and even the phone company hasn’t made their receivers damage-proof. Ditto for keyboards or buttons. So that leaves us to consider touch screens for public use.

But before joining the touch screen generation, ask some questions about the use of touch screens. Many of these questions you would ask of any piece of equipment, but the questions must be colored by the fact that this is indeed a computer system.

When you consider employing touch screens or light pens, take into your thoughts these two apparently unrelated stories. Do you recall the problem of the non-responsive keyboard? Those were the first computer keyboards which, compared with typewriters, were incredibly silent. Suddenly there seemed to be the need for both "tactile feel" and "keyclick" -- feedback through familiar systems. The redundant phrase "tactile feel" provoked a demand for so-called full-travel keyboards, where there was clearly a top and bottom to the keystroke. You knew when your finger started to press a key down, and when the key was done being pressed.

And then came the keyclick issue, a demand for a surrogate sound for the old mechanical typewriter key striking the platen. IBM added a keyclicker to many of their office machines -- it was nothing more than a relay that provided that satisfying "clack" for every keypress. Marketers did swift but brief business in add-on clickers and beepers and quackers of all sorts.

But how that has changed! When was the last time you heard that complaint? The "keyclickers" have learned the new technology ... and now that portable computers have arrived in public places, the hunt for quieter keyboards is on. So the keyclick was really an issue of familiarity (conservatism) vs. some sort of biological need. But do not let this learning (familiarity) disguise the continuing physical need issue.

The questions you need to ask can be summed up simply: Will it work? Who will use it? When must I replace it?



Considering the Future

I’m a fast typist -- better than 100 words per minute -- but I wrote the bulk of this talk with a pen and paper. I’ve written upwards of 300 articles and books, every one since 1978 using word processors. But this time I wanted to see whether the pen still felt "natural" or "vital" -- and to discover whether the ideas could still flow. They did. But when it came to putting it all together... Incredible! What a chore! Everything got typed as soon as possible into my computer with its disembodied amber screen.

But I do like paper. Flip, flip. I draw arrows to reconnect ideas. The computer is a good editor and finalizer and a great typewriter, but it just isn’t integrated with the disorganized and specially organized human mind -- especially mine. Remember that parallel thinking, that human madness I mentioned earlier.

But there is hope. As technology advances, screens will change, and this is an important -- if not a foremost -- opportunity. Now the curved, glowing cathode ray tube is primary. But what of flat, paper-like screens? Paper-thin batteries are here, as are flexible circuits and credit cards with massive computer memories. How far are we, then, from electronic notepads -- real notepads, a far cry from our shrunken computers a/k/a electric typewriters called "portables"? How about thin (think calculator or credit card), flexible, single-sheet paper "screens" with memory? A note pad with one page, but with indexing, referencing, and with a keyboard on a screen. Black-and-white or color.

The idea isn’t new. Many of you have read or heard that wonderfully funny story, The Hitchhiker’s Guide to the Galaxy. An everyday man finds that Earth has been destroyed to make way for a hyperspace bypass, and his whole perspective on reality has been maddeningly altered. Weaving in and out of the tale was the Hitchhiker’s Guide itself, which was an electronic book.

Here is the description of the Guide from Douglas Adams’ 1979 novel:

He also had a device that looked rather like a largish electronic calculator. This had about a hundred tiny flat press buttons and a screen about four inches square on which any one of a million ’pages’ could be summoned at a moment’s notice. It looked insanely complicated, and this was one of the reasons why the snug plastic cover it fitted into had the words DON’T PANIC printed on it in large friendly letters. The other reason was that this device was in fact that most remarkable of books ever to come out of the great publishing corporations of Ursa Minor -- The Hitchhiker’s Guide to the Galaxy. The reason why it was published in the form of a micro sub meson electronic component is that if it were printed in normal book form, an interstellar hitchhiker would require several inconveniently large buildings to carry it around in.

The Guide spoke its entries aloud. (You might like to know that the Earth was described by the single-word entry, "harmless". The description in the Guide’s new edition was expanded to read, "mostly harmless".)

So the Hitchhiker’s Guide is both ahead of us and behind us in fantasy, what with its old-fashioned electronic calculator pushbutton style but its astounding connection with information millions of light years distant. Still, the notebook or scratchpad or portable electronic whatchamacallit can be real. All the technology is here, except the thin screen itself.

In my mind, the notebook will be like this: It looks like a pad of paper -- thick drawing paper, probably -- with a spiral binding. It comes in all sizes, just like stationery-store notebooks ... one for the pocket, one for taking notes, a jumbo binder size.

It probably has a cardboard cover to keep it clean. Open it up, and there is a nice sheet of matte-surface "paper" ready to write on. But surprise! It has a "flip-open" edge (side, top or bottom) which "flips open" by squeezing a corner. Nothing flips, really, just a margin of the pad darkens with a group of choices. Why, it is a pad, tablet, typewriter, dictionary, wireless fax machine and modem, and television! The TV comes with image capture such as many VCRs have right now.

Imagine, if you will, this notebook of the future. It comes standard with handwriting recognition (scanners, after all, are pretty far along already). You write, it turns it into print immediately. You can even abbreviate. Perhaps you can speak to correct, if you wish. It can have voice recognition, but you don’t need to think aloud; everything can be private. Either way. Or data. Or typing. Or. Or. Or.

Yes, typing. You like to type, do you? Then touch the typewriter symbol and the keyboard appears on the screen -- that is, on the pad in front of you. The tactile problem does resurface here, since a flat, clickless keyboard with no key travel is not quite tactile in the terms I’ve already described, I’ll admit, but there it is. This ultimate notepad docks with physically bigger or smaller pads (all are computers) for the organization and sharing of ideas. It is also a photocopier in digital form; it self-replicates.

You can doodle, stretch ideas. Circle and pluck a piece of doodle and put it elsewhere -- a reorganizable doodle pad! Then jog your pages of doodles and notes and sketches and ideas together. Erase with the tip of your finger. Enlarge. Reduce. Pull pages out and press them together at the edges to make a bigger sheet. It will have perhaps a dozen physical "pages", but each page will store a hundred pages of writing and drawings and ideas, recallable at the press of a corner. Flip, flip.

My notebook of the future indexes, files and retrieves; slips in the time and date. Beyond this, it has touch points for magnify/reduce; flip through; pen eraser/finger eraser; write/type; voice recorder on/off; colors; clean page; move/copy; dictionary; television on/off; telephone; and more.

Write with the light pen -- it’s not attached in this future -- or pull things together with your finger. Erase. Tear off. Crumple. Toss. Not really lose your ramblings if you don’t want to, because you can recover and unfold your crumpled papers. Unless you tear off and shred. (The political scandals of the future may require computer sleuthing of the first order.)

How about power? Paper-thin batteries are here, announced at the end of 1987. A round "target" on the back of every sheet is where you stick a new peel-off battery after a few months.

And, naturally, the whole works will be cheap enough to keep several notebooks and binding dockers around the home, office, school.

This notebook will be real, possibly as I’ve described, probably better. My measure of the success of a light pen or touch screen system is how close it comes to this notebook ideal. This is both a prediction and my challenge to today’s other speakers. Keep these notions in mind as you listen to them and ask them questions.



Conclusions

The light pen and touch screens, both manifestations of the same need and the same process, are a truly useful interface between human beings and their servant computers. They are the only devices with the comfortable touch of hand tools. (Just as I cannot write and edit effectively without a word processor anymore, I long to think, sketch, and doodle with such new tools.) By acknowledging these human needs, examining all the solutions, acknowledging the advantages and failings of these tools and by asking the right questions, you can decide how touch screens and light pens can work for you.

Let me summarize the main points. Primarily, touch is good, the more touch the better. Touch is friendly. But it must be used in situations where it can be effective and efficient -- your goals -- as well as comfortable for your user. Except where it is employed as a gimmick, touch must also consider human reluctance, resistance and madness.

More specifically, several methods of touch screen technology are available, all invisible to the user. But these must have the accuracy, resolution, thresholds and long-term reliability you will require in your application. The software that supports the system must be ready to achieve your goals; do not expect technological innovation and glitz to do that by itself. Consider your own employees’ or customers’ reactions, foibles, failings, enthusiasms and levels of concentration and impatience.

If you consider light pens, do not allow their more specialized uses to obscure the requirement for comfort, efficiency and multi-leveled thinking. The hand-held writing tool -- pen or pencil -- is still used because it works. It meets needs. Your light pen system must meet these same needs.

Finally, consider the future. Put on your polarized glasses to cut down the glare and glitter of new technology. Will these systems still do the job for you when everyone has one? Don’t fear simply that the devices will be dated; no one turns down a gold Cross pen because it may not be the latest erasable ink roller micro point technology. Instead, ask yourself if you will have something that meets a basic need when the newness has dulled.

Touch, approachability, efficiency, comfort and madness. Light pens and touch screens presently offer a remarkable opportunity to address these human needs, to reach that image behind the glass.


Excerpt from The Hitchhiker’s Guide to the Galaxy: Copyright 1979 by Douglas Adams. Published by Harmony Books, New York.

