This week two companies showed off two very different approaches to bringing touch to traditional personal computers. Microsoft has long advocated that all input, whether mouse, pen, or touch, can coexist in the same Windows UX. Apple, on the other hand, strongly believes that simply replacing the display of a Mac with a touchscreen would result in terrible ergonomics and UX. Instead, Apple has opted to bring only touch input to Macs, leaving touchscreens to devices running iOS.
The recent announcements by both companies are extensions of these long-held positions. Among other updates, Microsoft announced a new all-in-one desktop featuring a giant 28″ touchscreen mounted in a way that lets it easily lower into a drafting-table position. Apple announced new laptops that also have touchscreens, but rather than replace the displays, Apple put a small touchscreen strip just above the keyboard in place of the function keys. While both companies tightly integrate hardware and software to provide pricey solutions geared toward professionals, the similarities stop there.
The Surface Studio
I have long been skeptical of touchscreen PCs for two reasons: hardware and software.
Adding a touchscreen to a laptop form factor is particularly challenging simply because of the many conflicting design priorities. Laptops need to be light, while touchscreens need to be firm. Touchscreens are most ergonomic when horizontal, but the reason you get a laptop instead of a tablet is the keyboard. Windows PC makers, including Microsoft, have tried to solve this by offering laptops that can in one way or another transform into a tablet. The problem I see with this approach is that each transformed state is effectively a different mode that is better for some tasks but worse for others. Say you need to respond to an email while using the device as a tablet to edit a photo. Your best option is to transform the device back into a laptop to write and send the email, then transform it back into a tablet to resume your edits. Microsoft touts this transformation as a feature, but I see it as maximum disruption.
The Surface Studio avoids much of this mess simply by being a desktop. Desktop computers are allowed to be heavy enough to offer a firm surface for touch input. They also have detached keyboards that can coexist with a horizontal display, which brings me to what I see as the Surface Studio’s most appealing feature as a touchscreen device. The way the Studio easily lowers into a drafting-table position is exactly how I pictured a large touchscreen working ergonomically. I am honestly a little surprised it took PC makers this long. That same person editing a photo on the touchscreen in drafting mode can easily respond to an email because the keyboard is still accessible, and even raising the Studio back to vertical seems far less disruptive, if not downright elegant.
While I am still skeptical of touchscreen laptops, including Microsoft’s Surface Book, I am not skeptical of the Surface Studio… at least hardware-wise.
Software-wise, I remain unconvinced. As good as the Surface Studio hardware looks, it still runs the same Windows as every other PC, and for as long as Microsoft has been promoting pen and touch input, it has been promoting the mouse even longer, since Windows’ inception. Even in this week’s impressive demos, I couldn’t help but notice all of the tiny UI controls (buttons, scrollbars, etc.) that remain optimally sized for a cursor. This alone will make for a disjointed user experience. Imagine again that you are editing a photo with touch but need to access some feature in a toolbar or menu. You will either have to carefully hit the tiny target or reach for a more precise input device. This leads me to believe that touch input in Windows is not just a second-class citizen, but third behind mouse and pen. This doesn’t mean I think folks won’t use touch; they will for convenient gestures such as pinch-to-zoom and swiping, but they will always have a mouse or pen nearby.
I suspect this won’t be a showstopper for many professionals who already use custom inputs and are perfectly happy to get major new capabilities that enhance their specific apps and workflows, even if that comes at the cost of a disjointed experience across the rest of the OS. Microsoft just delivered what many creative professionals have been looking for, and I see Apple losing customers as a result.
The 2016 MacBook Pro
While Microsoft probably delivered the best touchscreen on a traditional computer, one that will be great for certain creative professionals, Apple may have delivered the best touch input on a laptop for everyone else.
The already great multitouch trackpad is up to 2X larger than previous models, which looks to bring it closer to my favorite input device of all time, the Apple Magic Trackpad 2. I suspect a lot of people, professionals or otherwise, underestimate the usefulness of Apple’s trackpads. Basic gestures such as pinch-to-zoom, scrolling, and swiping are all incredibly intuitive across apps. Advanced gestures such as switching spaces, revealing the desktop, and exposing windows are equally rock solid. Apple’s multitouch trackpad already satisfies many of the benefits of a touchscreen.
Apple also introduced the Touch Bar, a touchscreen strip that sits above the keyboard where the function keys used to live. Like many power users, I first gasped at the idea of losing my precious function keys. But when I thought about it, here is how I actually use those keys (in rough order of frequency):
- Screen brightness
- Key-mapped script
- Windows stuff in VMWare by holding the fn key
The first item is already available in the control strip section of the Touch Bar, I am guessing I can figure out how to add my nerdy key-mapped script via customization, and VMWare will still be supported by holding the fn key. Keys that I rarely or never use, such as Launchpad, Mission Control, and keyboard brightness, no longer take up any space. In their place, applications will be able to insert their own custom functions and controls. While that may sound like a gimmick, and though I am sure some developers will treat it that way, I think the usability potential of the Touch Bar is huge, especially given that these controls can be multitouch and are reportedly very responsive. Going back to the photo-editing example, one of my biggest annoyances when editing is losing both my focus and my cursor position in order to access some tool. The Touch Bar will provide access to at least some of those tools without my losing the cursor’s place.
This brings me to the biggest difference in Apple’s solution: the implicit understanding that macOS is fundamentally designed for the mouse, which makes the cursor perhaps the most important user interface element on the entire platform. Bolting a touchscreen onto the display inherently affects, and sometimes contradicts, the cursor. Tap the close button on a window, and the system must either ignore the cursor or move it to mirror that input. Anyone who has played with a pen- or touch-enabled Windows PC has seen this behavior. With the multitouch trackpad and now the Touch Bar, Apple doesn’t try to replace any of the cursor’s core functionality. Instead, they’ve simply added complementary touch input wherever possible. For example, if I two-finger swipe to scroll, the cursor is unaffected. Adding touch while leaving the cursor alone also means apps don’t have to be updated to get the benefit of touch gestures or redesigned to account for larger touch targets.
Apple’s solution is less obvious and certainly won’t lure back anyone getting a Surface just for the power of a traditional computer with a touchscreen, but I think it will ultimately be less disruptive while offering a more holistic, integrated experience that feels natural.
Who is Using What
As I mentioned earlier and on Twitter, Microsoft’s solution seems to be a better fit for desktops, while Apple continues to focus on laptops. I think Microsoft is definitely getting the attention of Apple’s pro desktop users, who were once again left out in the cold this week. That said, I think Microsoft might have a problem with portables. The Surface Book (and other touchscreen Windows laptops) still seems very awkward, and Windows’ touch input is not good enough for tablets. Apple is dominating with touchscreen tablets and has a touch input solution that makes more sense to me for laptops. I think Microsoft is going to do well with the Surface Studio, but more people already use laptops and tablets, and those numbers are only going to go up.