Thursday afternoon at OLA 2010 I attended a session called Touch Me! Multi-touch Screens and More!
My interest in human-computer interface technology spans my twenty-five years studying in university and working in libraries. When I started university my data input device was a manual typewriter, although I wrote out some of my assignments in ink. Seeing personal computers starting to appear on campus, I made the first big-ticket purchase of my life (this was years before I bought a car or condo): an Atari 8-bit computer, hooked up to a cheap black and white TV and the cheapest printer I could find.
This Atari 600XL got me through my undergraduate degree at the University of Western Ontario. The computer is underneath the keyboard. Many applications came on cartridges. I did have an external floppy disk drive to store data. Many essays I wrote had to be spanned over multiple floppy disks and spliced together with my word processing software.
As low-powered as this computer was, the ability to edit essays on the fly without retyping was a huge advantage over my typewriter. However, as I have characterized it in more recent years, as much as computers are time-saving devices, learning how to use them is an enormous “time sink.” I had to memorize many obscure command-line codes to operate the software on this computer.
When graphical user interfaces were popularized by the Apple Macintosh, I knew I had to move to this type of technology as soon as it was economical. I opted for the poor-man’s Mac—an Atari ST. The Atari ST had a mouse, and replaced many obscure codes with icons that could be manipulated by the mouse. This was a profound change, since it sharply reduced the learning curve. Representing abstract ideas of electronic files and directories with pictures of documents and folders was not just a gimmick. The graphical user interface meant people could much more easily infer what actions could be taken when working with computer data. Instead of blindly navigating directories of files and folders with command prompts, people could rapidly get a big-picture view of the data in front of them.
The graphical mouse-driven interface of the Atari ST.
The Atari ST was a poor-man’s Mac in the late 1980s, as it offered a graphical user interface at a lower price. One big reason I preferred the Atari ST was that its floppy disk drive could read disks produced on IBM PCs, unlike the Mac’s proprietary floppy disk drive. Applications on the Atari ST did crash a lot, and eventually software development dried up as new Windows-based PCs stormed the market in the mid-1990s, leaving Ataris to dwindling niches such as the gaming and music markets.
An Atari 520ST.
I had a monochrome screen with my Atari ST, which had a fantastic resolution—ideally suited for word processing. Over the years, I souped up my Atari with more memory and operating system upgrades that required replacing ROM chips. Interestingly, as I learned in my OLA session on touch technology, the keyboard is the oldest standard in this picture. The QWERTY arrangement on the keyboard was invented in 1875.
When I saw Windows 3.1, with its TrueType scalable fonts, I saw the light, sold my Atari ST and bought a clone PC running Windows. It was with this PC that I first tested web browsers on behalf of a just-launched Internet Service Provider in early 1993: my ultimate early-adopter experience, since very few web sites existed at that time (and the browsers were horrible, crashing all the time). Even so, I was highly motivated to learn as much as possible, since I believed that extending the graphical user interface to the vast potential of the Internet represented a colossal change in how people would work with information in the future.
The graphical user interface works on the premise of icon metaphors. The idea behind touch technology is that the metaphors are gone: a natural user interface means the input device is the human body itself, using gestures and natural motions. Touch technology can appear in small devices, such as the Apple iPhone, but there are also larger devices, like the Microsoft Surface, that people can gather around.
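To make the gesture idea a little more concrete, here is a minimal sketch (my own illustration, not something shown in the session) of the arithmetic behind one common gesture, pinch-to-zoom: the software tracks two touch points and maps the change in the distance between the fingers to a zoom factor.

```python
import math

def distance(p1, p2):
    """Straight-line distance between two touch points given as (x, y)."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def pinch_zoom_factor(start_touches, current_touches):
    """Zoom factor implied by a two-finger pinch gesture.

    start_touches and current_touches are each a pair of (x, y) points:
    the two fingers' positions when the gesture began and where they are now.
    A factor above 1 means the fingers spread apart (zoom in); below 1
    means they pinched together (zoom out).
    """
    start = distance(*start_touches)
    current = distance(*current_touches)
    if start == 0:
        return 1.0  # degenerate gesture: both fingers started at one point
    return current / start

# Fingers start 100 pixels apart and spread to 200 apart: a 2x zoom in.
factor = pinch_zoom_factor([(0, 0), (100, 0)], [(0, 0), (200, 0)])
```

Real multi-touch frameworks layer a great deal on top of this (touch tracking, thresholds, momentum), but the core of the gesture really is this simple, which is part of why it feels so natural.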
The OLA session on touch technology started with a YouTube video (no longer available) called Perceptive Pixel by Jeff Han. It showed a user manipulating data on a large touch screen.
While the new gesture-based interface seems a little odd at first, several benefits were cited in this session. Multi-touch tabletops like the Microsoft Surface can have benefits in education: they encourage participation, and they may work especially well for special-needs children.
There might be benefits in the speed and effectiveness of search, sense-making and reference, especially if people can collaborate on a large touch screen.
One benefit that made me reflect a bit as a librarian is that touch technology could make search a pleasurable experience. There might be such a thing as “information aesthetics,” and touch technology could very well lead to a new sensibility as to the best way to manipulate information. The multitouch interface of the Apple iPhone is based on this sensibility, with its requirement of gestures to manipulate data (the gesture standards have actually been around a while—they were invented in 1981).
As I learned after this OLA session, Microsoft seems to have recognized the game-changing nature of touch technology. Microsoft has sharply changed course on their smartphone technology. Within a year new Microsoft-based smartphones, heavily dependent on touch technology, will be released. The new Windows Phone 7 Series phone will not be backward-compatible with the previous Microsoft smartphone technology, Windows Mobile. This means that new software will need to be developed—software that will integrate touch technology at its core.
More on Microsoft Surface…
Recently I found this Library Hat blog posting, http://www.bohyunkim.net/blog/archives/254, which includes several good video clips demonstrating Microsoft Surface in educational and cultural institutions.
One of the last points in the OLA 2010 session on Touch Technology was that the field is wide open for application development in libraries. One can follow development in the field by monitoring the Natural User Interface Group at its web site: http://nuigroup.com/.