Project Soli, from Google's ATAP lab, explores how hands and fingers can control everyday devices without a touchscreen. At its core is a new miniature radar sensor that redefines what a piece of wearable technology can be.
The chip is small enough to be embedded into an object in place of a button: imagine pinching your fingers together to make a menu selection, or rubbing your fingertips together to scroll. Radar is the same technology used to track cars, planes, satellites, and other large objects; here it is applied at the opposite scale, picking up the fine micro-movements of the hand.
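To make that interaction model concrete, here is a minimal sketch of how recognized gestures might be routed to UI actions. Everything in it is hypothetical: the article does not describe Soli's actual API, the Gesture labels and GestureDispatcher class are invented for illustration, and a real pipeline would first have to classify raw radar frames into gesture events.

```python
from enum import Enum, auto
from typing import Callable, Dict

class Gesture(Enum):
    """Hypothetical gesture labels a radar-based classifier might emit."""
    PINCH = auto()  # fingertips pressed together, e.g. a virtual button press
    RUB = auto()    # thumb rubbed across a finger, e.g. a virtual scroll/dial

class GestureDispatcher:
    """Routes recognized gesture events to registered UI callbacks."""

    def __init__(self) -> None:
        self._handlers: Dict[Gesture, Callable[[], None]] = {}

    def on(self, gesture: Gesture, handler: Callable[[], None]) -> None:
        """Register a callback for a gesture."""
        self._handlers[gesture] = handler

    def dispatch(self, gesture: Gesture) -> None:
        """Invoke the callback for a recognized gesture, if one exists."""
        handler = self._handlers.get(gesture)
        if handler is not None:
            handler()

# Wire the two gestures described above to UI actions.
dispatcher = GestureDispatcher()
dispatcher.on(Gesture.PINCH, lambda: print("menu item selected"))
dispatcher.on(Gesture.RUB, lambda: print("scrolling"))

# Simulate two recognized events; in practice these would come
# from a classifier running over the radar sensor's signal.
dispatcher.dispatch(Gesture.PINCH)
dispatcher.dispatch(Gesture.RUB)
```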
Google's Project Soli asks a question that will shape the future of UI design: how can the movements of the human hand be put to work in the virtual world?
Google's Project Soli Imagines the Future of UI for Objects
Trend Themes
1. Gesture Recognition - Controlling devices through hand movements alone, with no touchscreen required, presents opportunities for disruptive innovation in user interface design.
2. Micro Radar Sensing - A radar sensor small enough to be embedded in place of a physical button opens up new possibilities for wearable technology.
3. Virtual World Interaction - Mapping the motion of the physical hand onto virtual controls creates potential for new UI paradigms for everyday objects.
Industry Implications
1. Wearable Technology - An embeddable micro radar sensor could let wearables shed physical controls entirely, a disruptive opportunity for the industry.
2. Consumer Electronics - Gesture recognition has the potential to revolutionize the way people interact with their devices.
3. UI/UX Design - Treating the bare hand as an input device opens up a new design space for interaction designers.