Unfortunately, I had to miss the live Apple event last week. So I took a little time this weekend to wade through it and came across this rather magical little multi-touch moment from the iPad Pro demonstration. To the untrained eye, simultaneously using a virtual ruler while drawing a line with a stylus may seem like standard fare. As a practitioner who has created multi-touch user experiences involving both stylus and touch inputs, I realize that this moment Apple depicts is not to be taken for granted. Here’s a breakdown of what’s likely happening to make this moment possible:
Virtual Ruler Behavior
The demo suggests that, much like a real-world object, the user can freely manipulate the virtual ruler with any combination of fingers and movements. If accurate, this is quite liberating compared to most environments, which require specific combinations of fingers, or "gesture chords," to inform the software of the user's intent. Further, to the human eye at least, the interplay between the ruler's edge boundaries and stylus input seems quite fluid. One could easily imagine a whole suite of virtual tools becoming as ubiquitous as vector-line scaling and drawing tasks are today with stylus tablets, mouse, and keyboard.
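One way to support "any combination of fingers," rather than matching specific gesture chords, is to estimate a single rigid transform (translation plus rotation) from whatever touch points happen to be down between two frames. The sketch below is purely illustrative and reflects my own assumptions about how such a ruler could be driven, not Apple's implementation:

```python
import math

def rigid_transform(old_pts, new_pts):
    """Estimate the translation and rotation implied by a set of touch
    points moving from old_pts to new_pts (lists of (x, y) tuples).
    Works with one finger (translation only) or many (adds rotation)."""
    n = len(old_pts)
    # Centroids of the old and new touch-point sets.
    ocx = sum(p[0] for p in old_pts) / n
    ocy = sum(p[1] for p in old_pts) / n
    ncx = sum(p[0] for p in new_pts) / n
    ncy = sum(p[1] for p in new_pts) / n
    # Accumulate cross and dot products of centered point pairs to
    # recover the best-fit rotation angle.
    s_cross = s_dot = 0.0
    for (ox, oy), (nx, ny) in zip(old_pts, new_pts):
        ax, ay = ox - ocx, oy - ocy
        bx, by = nx - ncx, ny - ncy
        s_cross += ax * by - ay * bx
        s_dot += ax * bx + ay * by
    theta = math.atan2(s_cross, s_dot) if n > 1 else 0.0
    return (ncx - ocx, ncy - ocy), theta
```

With this approach the ruler simply follows whatever the hand does: one finger drags it, two or more fingers drag and rotate it, and no particular "chord" has to be memorized.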
Simultaneous Touch and Stylus
While stylus technology has improved dramatically over the last several years, providing solid palm rejection can still be a challenging problem that yields unpredictable results for end users. That is, the system should consistently distinguish intentional touch from a resting hand or lingering finger when stylus and touch are used together. For UX designers of multi-touch systems, this has meant that performing "fat finger" touch at the same time as precision tasks (such as stylus input) is often a no-no. If this indeed works as demonstrated, the underlying approach ranges from a clever sequence of UI moves to perhaps something quite remarkable: accepting intentional touch while simultaneously rejecting the palm. Is Wacom scared? Unless they have an ace up their sleeve, they should be. Removing the artificial input mechanisms users have had to cope with for decades, such as a mouse or a separate stylus tablet, makes this environment a far more appealing and natural way to work.
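To make the challenge concrete, here is a toy heuristic for separating intentional touches from palm contact. The contact-size and proximity thresholds are invented for illustration and bear no relation to whatever Apple actually ships; real systems combine many more signals (capacitance profiles, timing, motion history):

```python
def classify_touch(radius_mm, dist_to_stylus_mm, stylus_active):
    """Toy classifier for a single touch contact.
    radius_mm: approximate radius of the contact patch.
    dist_to_stylus_mm: distance from this contact to the stylus tip.
    stylus_active: whether the stylus is currently touching the screen.
    Thresholds below are illustrative assumptions, not real values."""
    if radius_mm > 12.0:
        return "palm"    # large contact patch -> likely a resting palm
    if stylus_active and dist_to_stylus_mm < 60.0:
        return "palm"    # small contact hovering near an active pen tip
    return "finger"      # otherwise treat it as intentional touch
```

Even this crude version shows why simultaneous touch and stylus is hard: a knuckle near the pen and a deliberate finger tap on a virtual ruler can look nearly identical to the digitizer.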
The Stylus
There are a lot of good styluses out there. FiftyThree, Adonit, and others have gone to great lengths agonizing over industrial design, pressure detection, and other important attributes. The iPad Pro demo suggests the Apple "Pencil" has one thing the others don't: zero latency. This would make stylus inputs more realistic and, well, precise. Apple has a legitimate leg up thanks to its ability to control the entire ecosystem around the stylus experience. They may indeed have created a better stylus experience, which is a bonus on top of the other multi-touch aspects mentioned above.

Although this isn't a revolutionary idea (IBM and others have held patents since the '80s claiming the ability to provide simultaneous touch and stylus inputs), as with most things, ideas come easy. Execution is hard. So as a designer of systems employing multi-touch and stylus, this little moment got me excited. I'm sure this more natural way of working will become the new normal in no time, like my kids who can't imagine not being able to watch their favorite television show on demand. Anyone else remember waiting all week for Saturday morning cartoons?
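On the latency point: one common way systems mask stylus pipeline delay is to extrapolate the pen's recent trajectory a few milliseconds ahead and draw the predicted point. A minimal sketch of that generic technique follows, assuming samples arrive as (x, y) points at a known interval; this is not a description of the Pencil's internals:

```python
def predict_point(p_prev, p_curr, dt, lookahead):
    """Linearly extrapolate the stylus position 'lookahead' seconds
    past the most recent sample, given the previous sample p_prev,
    the current sample p_curr, and the sampling interval dt."""
    vx = (p_curr[0] - p_prev[0]) / dt
    vy = (p_curr[1] - p_prev[1]) / dt
    return (p_curr[0] + vx * lookahead, p_curr[1] + vy * lookahead)
```

Prediction trades a little accuracy on sharp direction changes for ink that appears to stay glued to the pen tip, which is part of why perceived latency matters more than raw sensor latency.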