Aside from keeping our iPhones in our pocket more, I think the Apple Watch is compelling for another reason: communication. The ways in which Apple is allowing people to communicate via Apple Watch – taps, doodles, and, yes, even heartbeats – is a clever, discreet new paradigm that epitomizes the company’s mantra that the Watch is the most intimate and personal device they’ve ever created. I, for one, am very much looking forward to trying these features.
What’s even more compelling, though, in my view, is the engine powering the delivery of said communication: the Taptic Engine. Beyond its use for notifications and communication on the Watch, Apple has implemented the Taptic Engine in one other form: trackpads, in the new MacBook and the refreshed 13-inch Retina MacBook Pro. I had an opportunity to play with the Force Touch trackpad for about 30 minutes at my favorite Apple Store here in San Francisco, and came away very, very impressed.
I find Apple’s embrace of haptic feedback fascinating and exciting, because the use of haptic technology has some very real benefits in terms of accessibility.
Haptic Feedback and Accessibility, in Broad Terms
Before delving into the specifics of Force Touch on accessibility, it’s worth discussing the influence haptic feedback in general has on accessibility.
In short, haptic feedback is great for people with disabilities because of augmentation.
What I mean by that is haptic feedback has the potential to augment a person’s senses, insofar as it can help “make up” for an impaired sense. For example: on my iPhone, I not only hear an audible ring when getting a phone call, but my phone also vibrates. I have it set that way on purpose, because I have a congenital hearing loss, inherited from my deaf parents. Although the vibration feature isn’t technically haptic – the shaking is done by a motor – the concept is similar. Because I can’t hear as well and may miss the ringer’s audible signal, the fact that my phone also vibrates compensates for my hearing loss by alerting me that my phone is ringing. The accessibility win here is that I have a secondary “sense” that boosts my awareness that my phone is doing something that requires my attention.1
By the same token, when I worked in special education preschool classrooms, we often used toys that offered a variety of sensory output (e.g., lights, sounds) in order to augment our students’ sensory development. The effect of these toys on our students is similar to what haptic feedback/vibration does for accessibility: it augments the experience. Instead of relying on one sense that may not work as well (e.g., my eyesight), haptic tech and/or vibration compensates in a way that enables an enriched experience relative to a person’s needs and limitations.
Make no mistake, however: I’m in no way implying that haptic feedback alone can fully make up for a disability; that’s impossible. My point is simply that this multi-sensory approach has the potential to greatly improve interaction between, say, a low-vision user and his or her Apple Watch (or other smartwatch).
The Accessibility Merit of Force Touch
As I wrote at the outset, I spent about a half-hour testing the Force Touch trackpad on the 13-inch Retina MacBook Pro at the Apple Store. A full review is beyond the scope of this piece, but I do have a few observations to note.
First, the feeling of “force pressing” is subtle but noticeable. I tried doing the things shown on stage at the event – force-clicking a word to bring up OS X’s dictionary, for instance – and definitely noticed the haptic feedback working. It was somewhat unsettling because the Force Touch trackpad doesn’t physically move as the prior trackpad did; rather, it’s stationary, and the only thing “moving” is the feeling under your finger. Again, a tad disconcerting at first, but I quickly adjusted to it.
The part that struck me about using Force Touch was how useful it was in alerting me that I clicked something. Clicking a word did two things: (1) it showed me the definition; but (2) more importantly, I felt the click at the same time. Feeling my action was key because it let me know that I was clicking without having to rely solely on my vision. And that’s the accessible part – the Force Touch trackpad gives me yet another cue (beyond the popover animation and the sound of the click) that something happened.
It seems small, but those extra sensory cues make a world of difference. As I noted earlier, Apple’s use of haptic feedback in this way is important because it gives me (and others under similar circumstances) another clue that something is going on. This is huge not only for the visually impaired, but for those with motor delays as well.
In my brief testing, I used the trackpad with its default settings. That said, it’s worth mentioning that the options Apple provides for adjusting Force Touch’s sensitivity go a long way in affecting the experience. This granularity is important, as users can adjust how much pressure is needed based on their own needs and preferences.
As VanHemert writes:

Sophisticated haptic feedback could add a new dimension to smartphone interactions, which so far have been trapped behind glass screens. Imagine an on-screen keyboard where you could orient yourself by feeling the grooves between the letters, or a version of Angry Birds where you could sense the tension in the slingshot as you drew it further back. Or just think about feeling a pleasant bit of texture under your fingertips as you flicked through your Twitter or Instagram feeds.
VanHemert also references the latest iMovie update, where Apple leverages Force Touch to let video editors “feel” the end of a clip. That’s a very accessible (and thoughtful) feature, one that shows Force Touch’s accessibility benefits extend to everyone, regardless of ability. More to the point, it illustrates Force Touch’s as-yet-untapped potential as an accessibility tool.
Imagine, for example, iOS 10 or 11. Apple will almost assuredly bring Force Touch to the iPhone and iPad, and they could utilize the technology in a slew of ways. They could effectively solve the problem with buttons in iOS 7 and 8 by using haptic feedback to denote a “button press” everywhere in the system, so visually impaired users like me wouldn’t have to struggle so much to figure out what’s a button versus a text label. Likewise, Force Touch could save those with motor challenges the work of extra taps by allowing a force press to bring up contextually specific controls. There are lots of possibilities here.
Force Touch and the Future
It should be apparent by now that I’m very bullish on Apple’s Taptic Engine and its Force Touch technology.
I’ll be able to report on this more concretely once I’m able to spend some quality time with Apple Watch, but suffice it to say, this is an exciting addition. It’s great that Apple created Force Touch APIs for developers, and I wonder if and how Apple might use them to improve the discrete Accessibility features on the Mac and iOS.2
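To give a concrete sense of what those developer APIs look like on the Mac, here’s a minimal sketch of how an app can respond to Force Touch pressure events in AppKit; the class name and the reaction to a deep click are my own illustrative choices, not anything Apple ships:

```swift
import AppKit

// A custom view that reacts to Force Touch pressure events from the
// trackpad (Macs with a Force Touch trackpad, OS X 10.10.3 or later).
class ForceTouchView: NSView {
    override func pressureChange(with event: NSEvent) {
        // event.stage is 1 during a normal click and advances to 2
        // when the user presses deeper (a "force click").
        if event.stage == 2 {
            // Hypothetical response: surface a contextual control here,
            // giving the user a tactile cue in addition to a visual one.
            print("Force click detected (pressure: \(event.pressure))")
        }
    }
}
```

The appeal from an accessibility standpoint is that the system pairs this event with haptic feedback automatically, so the extra sensory cue comes for free once an app opts in.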
With the advent of Force Touch and the Apple Watch’s launch, these are exciting times for me and everyone else in the accessibility community. It’ll be interesting to see how it all plays out.
As an aside, the iPhone 6’s vibration motor is the strongest of any iPhone I’ve ever used, which makes missing a phone call less likely than ever. ↩