Embedded vision is being added to imaging technology used in drones, collaborative robots and other devices on the manufacturing floor to improve efficiency.
The industry is moving toward collaborative robots (cobots) with force-sensing technology that can work safely alongside people.
The Southwest Research Institute (SwRI) has developed a machine vision system that predicts pedestrian movements with a convolutional neural network (CNN).
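At the heart of any CNN-based system like SwRI's is the 2-D convolution each layer applies to an image. As a minimal illustration (not SwRI's implementation, whose architecture is not described here), the sketch below applies a single hand-chosen edge-detection kernel to a toy frame:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D convolution: the core operation a CNN layer applies.

    Slides the kernel over the image and computes an elementwise
    product-and-sum at each position, producing a feature map.
    """
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

# Toy 5x5 "frame" with a uniform horizontal gradient
frame = np.arange(25, dtype=float).reshape(5, 5)

# A Sobel-style kernel that responds to horizontal edges; in a trained
# CNN, kernels like this are learned from data rather than hand-set
edge = np.array([[-1., 0., 1.],
                 [-2., 0., 2.],
                 [-1., 0., 1.]])

feature_map = conv2d(frame, edge)
print(feature_map.shape)  # (3, 3)
```

In a real network, many such kernels run in parallel and are stacked in layers, with nonlinearities between them; frameworks such as PyTorch or TensorFlow provide optimized versions of this operation.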
Vision sensors and software have become more sophisticated and their applications more diverse, but this evolution has also increased demand for cables that can carry more data over longer distances, presenting new challenges for manufacturers.
Users choosing smart cameras for embedded vision applications should consider the camera's processor, its vision software and how it integrates with the automation system.
Georgia Tech researchers have demonstrated an all-optical technique for creating second-order nonlinear effects in materials that don't normally support them, which could improve optical computers and high-speed data processors.
Researchers have developed a method to identify objects using microwaves that improves accuracy while reducing the associated computing time and power requirements.
Using sensors in the age of smart devices and systems requires more knowledge about sensor system design to ensure reliability and accuracy in critical automation applications. See three key trends in sensor engineering and design.
Researchers at the University of Illinois at Urbana-Champaign successfully applied kirigami architectures to graphene to create sensors suitable for wearable devices.
3-D machine vision can be used for a wide range of automotive manufacturing applications where fast and accurate object detection is needed.