During yesterday’s iPad event, which largely played out just as the rumors foretold, Apple did do something unexpected: they unveiled a version of the word processing, spreadsheet and presentation suite iWork redesigned for the iPad’s 9.7-inch touchscreen. It’s easy to write off iWork’s inclusion as a minor perk for business types only, but don’t. The suite’s fully-redesigned touch interfaces actually reveal more about Apple’s vision of the future of computing than any other element of their new tablet. Here’s why.
I used each iWork app yesterday, and while I couldn’t spend enough time with them to come to a definitive conclusion, they definitely surprised me. Text-input issues aside (we’ll get to those in a minute), each appeared more than capable of offering an experience similar to, if not much improved over, its desktop counterpart. And for that, all credit is due to multitouch.
In Pages, one of word processing’s most arduous tasks–formatting text cleanly and easily around graphical elements–has been made orders of magnitude easier with touch. Once tapped, pictures and charts can be moved, resized, rotated and masked with finger swipes, pinches and twists, as the text instantly and naturally wraps around them. Tapping a graphical element again summons a contextual box with options unique to that element, such as its layering position, size, and the like. Again, my time with the app was brief, but the potential unlocked once clicks and drags are replaced by our natural inclination to touch and interact with our fingers was immediately apparent.
Keynote provides a similar interface for composing presentation layouts, which are more graphically intensive and thus even better served by touch. Added to the mix is an intuitive way to rearrange slides individually or in batches with taps and swipes. And while spreadsheets may be the runt of the litter in terms of excitement, one thing touch certainly improves is navigating to or selecting multiple cells in a document: tap, and you’re there.
The apps, especially Keynote and Pages, function almost as light versions of far more advanced software like Adobe’s InDesign. PopSci’s art director probably won’t be ditching InDesign for an iPad any time soon, but having a large tablet of the future flat on a desktop could merge the benefits of working digitally with an interface that feels more like working with a pencil and paper.
This is significant. It’s the underlying concept behind all touchscreen interfaces–removing the mouse and pointer’s layer of abstraction to get us back to working with our hands. Most previous attempts at a more natural and expansive touch interface have been hampered by too small a screen or inelegant design. The iPad has neither.
And in choosing productivity apps as the first test case for these new interfaces, Apple is providing a familiar stepping stone into the world of interacting with nothing-but-touch in software we’ve been using for decades. The first personal computers were largely about getting work done, and word processing and spreadsheets were for several years the only real software options. Apple’s not plunging us into some wild, augmented reality desktop interface navigated by touch. They’re weaning us off the keyboard and mouse in baby steps.
But there’s still a ways to go. After about 30 minutes, my impression of typing on the iPad is that it’s doable, but awkward. Apple is usually content to let users sort out such limitations for themselves, but with the iPad, they’ve uncharacteristically provided the option to attach a physical keyboard. Paired via Bluetooth or connected to the dock, a keyboard solves the problem of awkward text entry–and ties you to a desk–creating a hybrid machine that’s 90 percent touch, 10 percent traditional desktop PC or laptop.
The iPad, then, is a transition to a future in which, in Apple’s mind, multitouch is so good that we no longer need anything but a screen. Whether that strikes you as an appealing place or a dreadful one, Apple obviously has a vision of the future for which they’re smartly and methodically laying the groundwork. And once the text input problem is solved (hyper-accurate handwriting or speech recognition, perhaps?), you can bet that’s the future we’ll have.