Kenneth Friedman
March 27, 2015

Imagine using multiple devices to accomplish a single task. Instead of multitasking with your computer and phone, what if they seamlessly talked to each other to make individual tasks easier? There would be two benefits to any application that used this concept: new features and simpler interfaces.

New features could be added because each additional device would bring more screen real estate and new capabilities of its own.

Simpler user interfaces would be possible because you wouldn't be limited to a single screen or a single input method. Options and controls would not be buried behind menus or layers. Each device would be optimized for what that device is best at accomplishing.

I created a proof of concept last year called Open Canvas. It's a drawing app where the canvas is on one screen (an iPad), and the controls (color, brush size, eraser) are on another (an iPhone). It makes it way easier to quickly switch options without covering up the image. In most other drawing apps, you have to find the controls buried in a menu or drawer. Once you find the controls, they appear on top of the canvas, interfering with the main task of drawing. Open Canvas fixes this problem. By "spreading out" the application onto two devices, it simplifies the user interface.

Another great example of this is Alfred, and Alfred Remote. Alfred is a typing-based productivity application for OS X. Alfred Remote is an iPhone/iPad app that augments the desktop app: tapping large icons on the remote device triggers actions in Alfred on the desktop. It works really well, and it makes it much faster to access frequently used Alfred actions.

In two weeks, the Apple Watch will be available. Regardless of whether or not it is a success (don't bet against it), one great use case fits this general idea: the Watch as a remote control for apps. Procreate has already announced that you will be able to use the Apple Watch to access the controls of your drawing on the iPhone. I hope the Apple Watch will spark a new generation of apps that act as companions to their larger-screen counterparts.

Open Canvas, Alfred Remote, and (soon-to-be) Procreate Pocket are previews of what's possible with current devices. The underlying technology is already here: Bluetooth LE and the high-level APIs to communicate between devices. The problem isn't the technology, it's the concept. The key is real-time communication between devices working toward a single task. Handoff, a new Apple technology, might seem like the answer at first. But it's not: it's wrong at the conceptual level. It syncs the state of a task so you can resume it on another device, but it doesn't let two devices collaborate in real time. An easy-to-use framework would really open up this category of application.
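To make the idea concrete, here's a minimal sketch of what the "controller" side of such an app might send. The `ControlMessage` type and its fields are hypothetical, invented for illustration; the point is that the payload is tiny and can be delivered live. In a real iOS app, the encoded bytes would be handed to Apple's MultipeerConnectivity framework (e.g. `MCSession`'s send method), which handles peer discovery and delivery over Wi-Fi or Bluetooth.

```swift
import Foundation

// Hypothetical control message that a controller device (phone or watch)
// sends to the canvas device (tablet) whenever the user changes a setting.
struct ControlMessage: Codable, Equatable {
    var brushSizePoints: Double
    var colorHex: String      // e.g. "#FF8800"
    var isEraser: Bool
}

// Encode a message to bytes for transmission. In a real app these bytes
// would go to MCSession.send(_:toPeers:with:) from MultipeerConnectivity.
func encode(_ message: ControlMessage) throws -> Data {
    try JSONEncoder().encode(message)
}

// Decode bytes received on the canvas side, where the app applies the
// new brush settings immediately — no menus, no sync delay.
func decode(_ data: Data) throws -> ControlMessage {
    try JSONDecoder().decode(ControlMessage.self, from: data)
}
```

Because the message is small and self-describing, the canvas app can apply it the instant it arrives, which is exactly the real-time behavior that data-syncing technologies like Handoff don't provide.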

This entire concept may sound trivial, but I don't think it is. I think it's a step towards the "Internet of Things" promise of connected devices. Before we worry about getting our refrigerator to talk to our toaster, shouldn't we make sure the already-internet-enabled devices can talk to each other?