Keyboards and mice will not continue to dominate computer user interfaces. Keyboard input will be replaced in large measure by systems that know what we want and require less explicit communication. Sensors are gaining the fidelity and ubiquity needed to record presence and actions: they will notice when we enter a space, sit down, lie down, pump iron, and so on.
This talk will present examples in which our intentions can be
understood and acted on by computers. The work reaches across domains
to demonstrate that human intentions can be recognized and respected
in many complex natural scenarios. Examples of context-aware improvements
in the office, the car, the kitchen, and even the bed will be
presented.