Though interactive desks have been envisioned since at least the early ’90s, they have never really taken off. One reason is that such systems have not been designed to accommodate the variety of tasks and objects found on a typical work surface.
To address this, Carnegie Mellon researchers Robert Xiao, Scott Hudson, and Chris Harrison have developed a prototype system, dubbed “Desktopography,” that can launch and manipulate computer apps such as a calculator or a map with hand gestures, and react to physical objects on the desktop as needed.
Desktopography uses a depth camera for tracking, and generates augmented images in the same space as the physical objects via an overhead projector. The small projector, depth sensor, and Android computer are all self-contained and screw into a standard lightbulb socket, making installation as easy as replacing a bulb.
For more details on the project, check out the video below and read their paper, available on Xiao’s website. [h/t: Digital Trends]
Desktopography Turns Your Desk Into a Responsive Touchscreen was originally published in Hackster’s Blog on Medium.