- Use spatial perception and memory to keep track of dataflow code.
- Collaborate across office walls (= unlimited virtual resolution), together or remotely. Bluescape explores the concept with huge screens.
- Smart home programming: a baby monitor flashes a lamp in the living room. MIT’s Fluid Interfaces group has been exploring AR controls for devices in the home.
- Stretch a canvas across the wall to make generative wallpaper. View and edit source in situ.
- Set up 100 virtual speakers around an exhibition space, then feed them with a/v synths. The speakers can skitter around the walls too, or flock (if they evolve wings 😉 ).
To do this we’ll port the graph editor to Unity, a game engine that will let us target various hardware. Our current graph editor is browser-based (Polymer custom elements + SVG) and uses zoomable UI concepts. Zoom out to see the big-picture shape of the graph; zoom in to see details like port names. People already know this interaction pattern from maps. It works with mouse (wheel/scroll gesture), touch (pinch), and AR/VR (get closer).
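The zoom-dependent detail could be sketched as a "semantic zoom" function: the renderer asks what to draw at the current scale instead of scaling everything uniformly. The level names and thresholds here are illustrative, not taken from the actual editor:

```typescript
// Semantic zoom sketch. Thresholds and names are hypothetical.
type DetailLevel = "overview" | "normal" | "detail";

// Map a zoom scale (1.0 = 100%) to what the renderer should draw.
function detailLevel(scale: number): DetailLevel {
  if (scale < 0.5) return "overview"; // just node shapes: the big picture
  if (scale < 1.5) return "normal";   // node labels and wires
  return "detail";                    // port names, editable values
}

// A renderer branches on the level rather than shrinking labels into mush:
function shouldDrawPortNames(scale: number): boolean {
  return detailLevel(scale) === "detail";
}
```

The same function would serve mouse wheel, pinch, and AR "get closer" input, since all three ultimately produce a scale value.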
I’m a backer of Structure Sensor, and I’m applying for Google’s Project Tango. These portable 3D cameras with screens will map the real space and provide a “magic window” into the augmented space. Combining them with VR/AR glasses would free the hands to explore interactions beyond poking glass or waving in the air.
Tegu blocks are sustainably made wooden blocks with magnets inside. They are lovely to hold and addictive to play with. I have been imagining using them as a tangible interface for dataflow programming. Snap two together to connect them, then pull them apart and see the virtual wire. Arrange them on a wall (or drafting table). Turn them like knobs to tweak values.
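As a thought experiment, the snap/pull interaction might map onto a tiny graph model like this, where snapping creates a connection that survives when the blocks are pulled apart. Every name here is hypothetical:

```typescript
// Hypothetical model of snap-to-connect tangible blocks.
interface Block {
  id: string;
  value: number; // set by turning the block like a knob
}

type Wire = { from: string; to: string };

class BlockGraph {
  wires: Wire[] = [];

  // Snapping two blocks together connects them...
  snap(a: Block, b: Block): void {
    this.wires.push({ from: a.id, to: b.id });
  }

  // ...and the connection persists after they are pulled apart,
  // rendered as a virtual wire in the augmented view.
  isConnected(a: Block, b: Block): boolean {
    return this.wires.some((w) => w.from === a.id && w.to === b.id);
  }
}
```

The physical gesture (snap) and the dataflow semantics (add an edge) line up one-to-one, which is what makes the blocks appealing as an interface.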
I hope to design augmented reality interactions that free programming (and computing) from the desktop metaphor.