Sensors, Vision
An inverted Universal Robots 10 kg payload robot with a two-camera configuration makes a test sweep over an automobile in progress, using machine vision to improve manufacturing quality. Walter LaPlante, advanced manufacturing systems engineer at Ford Motor Co.'s Redford, Mich., facility, explained some Ford machine vision applications at Automate 2019. Courtesy: Mark T. Hoske, Control Engineering, CFE Media
Sensors, Vision June 14, 2019

Envisioning more at Ford

Plans are underway at Ford to use machine vision with robotics to increase quality and throughput, as explained at Automate 2019. See video.

By Mark T. Hoske
Image courtesy: Chris Vavra, CFE Media
Sensors, Vision June 11, 2019

Researchers develop scanning eye for robots

Fraunhofer Institute for Photonic Microsystems (IPMS) researchers have been developing and manufacturing micro-scanner mirrors, which give robots vision capabilities similar to human sight.

By Fraunhofer IPMS
Courtesy: CFE Media
Sensors, Vision June 9, 2019

Differences between machine vision and embedded vision

Embedded vision and machine vision are both effective, but each has different priorities and interface requirements that need to be considered.

By AIA
The first printer made by Inkbit, a startup out of MIT, uses machine vision and artificial intelligence to expand the range of materials it can print with, bringing all the benefits of 3-D printing to a slew of products that have never been printed before. Courtesy: Inkbit/Massachusetts Institute of Technology (MIT)
Sensors, Vision June 7, 2019

Company uses machine vision and artificial intelligence to enhance 3-D printing

Inkbit, a startup out of MIT, is pairing its multimaterial inkjet 3-D printer with machine vision and machine learning systems to bring the benefits of 3-D printing to a slew of products that have never been printed before.

By Zach Winn
Image courtesy: Bob Vavra, CFE Media
Sensors, Vision June 6, 2019

Overcoming predictive maintenance obstacles

Several key barriers stand in the way of implementing predictive maintenance technologies, but they can be overcome.

By Jos Martin
Courtesy: CFE Media
Sensors, Vision June 5, 2019

System integrators use specialized skills for complex vision applications

System integrators across the industry are fielding more requests to develop complete, detailed systems, which requires them to stay ahead of a fast-moving curve.

By Winn Hardin
Machine tool programmers and engineers who use Renishaw’s GoProbe macros could save significant time and labor with Mitsubishi Electric Automation Inc.’s Interactive Cycle Insertion screens on its M8 Series of CNC controls. The new software screens can integrate Renishaw GoProbe macros with CNC controls. Courtesy: Mitsubishi Electric
Sensors, Vision May 31, 2019

Easier CNC programming, probe system integration

Cover Story: Computer numerical control (CNC) trends for machine control include easier integration of systems, such as measuring probes, to save up to 90% of setup time.

By CFE Media
MIT researchers have developed a low-cost, sensor-packed glove that captures pressure signals as humans interact with objects. The glove can be used to create high-resolution tactile datasets that robots can leverage to better identify, weigh, and manipulate objects. Courtesy: Massachusetts Institute of Technology (MIT)
Sensors, Vision May 30, 2019

Sensor glove developed to learn human grasp signatures for robotics, medical applications

MIT researchers have developed a scalable tactile glove (STAG) designed to help a neural network identify objects by touch, which could aid robotics and prosthetics design.

By Rob Matheson
Image courtesy: Chris Vavra, CFE Media
Sensors, Vision May 30, 2019

Four ways embedded vision systems are used to enhance robotics

Embedded vision systems can help robots accomplish tasks they weren't able to do before, including automated assembly and robotic inspection.

By AIA
To bring more human-like reasoning to autonomous vehicle navigation, MIT researchers have created a system that enables driverless cars to check a simple map and use visual data to follow routes in new, complex environments. Courtesy: Chelsea Turner, MIT
Sensors, Vision May 27, 2019

Bringing human-like reasoning to autonomous vehicles

MIT researchers have developed an autonomous control system that “learns” the steering patterns of human drivers as they navigate roads in a small area, using only data from video camera feeds and a simple GPS-like map. See video.

By Rob Matheson