Sensors, Vision

Embedded vision is changing how people and machines interact

Embedded vision makes interaction with products more natural and immersive by augmenting the user's existing capabilities.
By AIA December 1, 2018
Courtesy: CFE Media

Embedded vision technology changes machines and the ways people interact with them. From smartphones to automobiles, embedded vision transforms the function and capabilities of the products people use every day, making interaction more natural and augmenting users' existing abilities.

Embedded vision adds immersion and more natural interaction

Vision systems that recognize player movement in video games can add functionality to the game. They also allow for more seamless interaction between the player and the game. A controller isn’t needed to translate the player’s desired actions; the system can interpret them through the player’s movements.

In this scenario, the interaction with the video game and the console is far more natural than using a controller. There's also an added level of immersion when players use their own movements to control their in-game character.
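The controller-free interaction described above can be sketched in a few lines: compare consecutive camera frames, find where pixels changed, and map the centroid of that motion to a game command. This is a minimal illustrative sketch, not a production gesture-recognition system; frames are modeled as 2-D lists of grayscale values, and the thresholds and command names are assumptions.

```python
# Sketch of controller-free input: classify a player's lateral movement
# by tracking the centroid of pixels that changed between two frames.
# A real system would read frames from a camera and use a vision library;
# here frames are plain 2-D lists of grayscale intensities (0-255).

def motion_centroid(prev_frame, curr_frame, threshold=30):
    """Return the (row, col) centroid of noticeably changed pixels."""
    rows, cols, count = 0, 0, 0
    for r, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            if abs(p - q) > threshold:
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None  # no motion detected between the frames
    return (rows / count, cols / count)

def interpret_motion(prev_frame, curr_frame, frame_width):
    """Map the motion centroid's horizontal position to a game command."""
    centroid = motion_centroid(prev_frame, curr_frame)
    if centroid is None:
        return "idle"
    _, col = centroid
    if col < frame_width / 3:
        return "move_left"
    if col > 2 * frame_width / 3:
        return "move_right"
    return "move_forward"
```

With motion concentrated in the left third of the frame, `interpret_motion` returns `"move_left"`; with identical frames it returns `"idle"`. Real consoles use far richer models (depth sensing, skeletal tracking), but the pipeline is the same: the player's body, not a controller, is the input device.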

Embedded vision augments existing capabilities


Embedded vision allows products to extend our existing capabilities. For example, augmented reality (AR) could give a worker on a factory floor assembly directions through a pair of smart glasses, with the directions aligned to the part being assembled in real time.

In this way, embedded vision technology adds to the worker's capabilities. The worker already has the inherent ability to assemble a part; the AR system reminds them of the proper steps. Rather than automating tasks outright, embedded vision systems often aid people and become a critical part of how those tasks are completed.
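The AR guidance loop described above can be sketched as a simple state machine: the vision system reports which assembly step it observes as completed, and the glasses overlay the instruction for the next one. This is a hypothetical sketch; the step list, class, and method names are illustrative assumptions, not a real AR API.

```python
# Hypothetical AR assembly-guidance loop: the camera confirms each
# completed step, and the smart-glasses overlay advances to the next
# instruction. Step text and event names are illustrative.

ASSEMBLY_STEPS = [
    "Align bracket with mounting holes",
    "Insert and hand-tighten both screws",
    "Torque screws to specification",
]

class AssemblyGuide:
    def __init__(self, steps):
        self.steps = steps
        self.current = 0  # index of the step the worker is on

    def instruction(self):
        """Text the smart glasses would overlay for the current step."""
        if self.current >= len(self.steps):
            return "Assembly complete"
        return self.steps[self.current]

    def on_step_detected(self, step_index):
        """Advance only when the camera confirms the current step is done.

        Out-of-order detections are ignored so the worker cannot be
        guided past a step the vision system has not yet verified.
        """
        if step_index == self.current:
            self.current += 1
```

For example, `AssemblyGuide(ASSEMBLY_STEPS)` starts by overlaying the first step, and each confirmed detection advances the instruction; the worker still does the assembling, while the system supplies the steps.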

Embedded vision has significant disruptive potential and can transform the products it's integrated into. As the technology advances and more products adopt embedded vision capabilities, the way we interact with those products will continue to change.

This article originally appeared on the AIA website. The AIA is a part of the Association for Advancing Automation (A3), a CFE Media content partner. Edited by Chris Vavra, production editor, CFE Media, cvavra@cfemedia.com.
