Detecting which room I’m in? That’s cute. Detecting where I’m pointing? That’s critical.
Your phone and wearable are loaded with sensors and receivers to detect where you’re standing, how fast you’re moving, and which direction you’re facing. The average phone’s sensors have 17 degrees of freedom* – but none of them can detect where you’re pointing. If we’re serious about ubiquitous computing, this is a critical flaw.
Let’s say you pimped out your rent-controlled bachelor pad with expensive Philips Hue lightbulbs and you want to dim one light. How can your phone figure out your intention? An iBeacon in the bulb can’t help – it might know you’re in the room, but that’s it. With all those 17 degrees of cutting-edge micromachined sensors, here’s the best we can do:
Are you serious?! Which one is “Hue Downlight 1” and why did you need to manually launch the goddamn app in the first place?!
Let’s compare that with a slightly older form of wireless transmission:
You don’t need to launch a remote-control app and select “FiOS Set-Top Box 4” and “Sony LCD TV 2”. You just point the remote, press the damn button, and that’s it. That’s the way it should work. You want to do this action on this object, right now. No pairing, no selecting, no launching, no bullshit.
I believe this is the primary reason that smart devices haven’t taken off. It’s nice that your scale tracks your weight and that you can recolor your lightbulbs. It just isn’t worth unlocking a phone, launching an app, selecting a dropdown, and pairing up to do it. When you can’t point and command, these devices are a usability nightmare and the ‘smart’ features are neglected. The Nest might have an awesome API and fantastic apps, but something like 99.95% of interactions involve physically handling the device.
Infrared is excellent for this, by the way. It’s unidirectional, uses trivial hardware, has a long range, and matches 1-to-1 with the user’s line of sight. If IR blipping was precise enough to trade your Charizard or your vCard (depending on your age in the ’90s), it’s perfect for this. A developer could easily encode an entire bidirectional interaction layer in IR pulses, but just transmitting a UUID would be more than enough.
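To show how little is needed, here’s a sketch of beaming a 128-bit UUID as IR pulses. The timings are borrowed loosely from the NEC remote-control protocol (a short mark followed by a short or long space encodes a 0 or 1); the function names and the choice of MSB-first ordering are my own assumptions, and real hardware would also modulate a ~38 kHz carrier, which this skips.

```python
import uuid

# Assumed pulse timings, loosely NEC-style (microseconds):
# logical 0 = 562us mark + 562us space, logical 1 = 562us mark + 1687us space.
MARK_US = 562
SPACE_0_US = 562
SPACE_1_US = 1687

def uuid_to_pulses(device_id: uuid.UUID) -> list[tuple[int, int]]:
    """Turn a 128-bit UUID into (mark_us, space_us) pairs, MSB first."""
    bits = f"{device_id.int:0128b}"
    return [(MARK_US, SPACE_1_US if b == "1" else SPACE_0_US) for b in bits]

def pulses_to_uuid(pulses: list[tuple[int, int]]) -> uuid.UUID:
    """Inverse: recover the UUID from a received pulse schedule."""
    n = 0
    for _mark, space in pulses:
        n = (n << 1) | (1 if space == SPACE_1_US else 0)
    return uuid.UUID(int=n)
```

At 128 bits times roughly 1.7 ms per bit worst case, the whole blip takes well under a quarter of a second – fast enough that pointing feels instant.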
Imagine this: You point your phone at that Hue bulb, unlock it, and you’re immediately in the color picker. It doesn’t need to be complex. We solved this problem 20 years ago – let’s get it back.
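A minimal sketch of that flow, assuming the IR layer hands the app a UUID string for whatever the phone is aimed at. The registry entries and the control-surface names are invented for illustration – the point is that the pointed-at device’s UUID is the launcher, with no pairing or menu step in between.

```python
from typing import Callable

def make_dispatcher(registry: dict[str, Callable[[], str]]) -> Callable[[str], str]:
    """Build a point-and-command handler: the received UUID directly
    selects which control surface to open. No pairing, no app picker."""
    def on_point(device_uuid: str) -> str:
        action = registry.get(device_uuid)
        return action() if action else "no beacon recognized"
    return on_point

# Hypothetical device registry; UUIDs and UI names are made up.
registry = {
    "hue-bulb-01": lambda: "color picker",
    "nest-living-room": lambda: "temperature dial",
}
dispatch = make_dispatcher(registry)
```

Point at the bulb and `dispatch("hue-bulb-01")` drops you straight into the color picker; point at nothing smart and the blip is silently ignored.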
*Touchscreen X, touchscreen Y, pitch, roll, yaw, angular velocity X, angular velocity Y, angular velocity Z, acceleration X, acceleration Y, acceleration Z, GPS latitude, GPS longitude, barometer, temperature, proximity, ambient light
3 thoughts on “Screw Your iBeacons, We Need Point Beacons”
Very much yes! Why is CIR on smartphones not more widespread? I used to have it on my Pocket PCs a decade ago. There are a couple of smartphones now that have it, but maybe if someone made useful apps (like the Hue control you mentioned) there could be increased demand for this. Currently, I doubt it ever crosses most developers’ minds to brainstorm the possibilities of directional control.
P.S. I want your jacket from the AT&T hackathon in Vegas 😛
I think we still frame apps as programs that run in isolation. In the Internet of Things, the world and the user should be the launcher.
In Germany we like to say: “you speak straight from my soul” (“Du sprichst mir aus der Seele”). Do you know what I mean? Yes, you are so right!