How sensors enable smart factories and more efficient manufacturing

Modern sensor technologies can help manufacturers achieve digital transformation by enabling safe human-robot interaction and autonomous mobility.

By Yulin Wang June 4, 2023
Image courtesy: Brett Sayles

Amid high inflation, rising labor costs, labor shortages, an energy crisis and hybrid working, the manufacturing industry experienced significant turmoil in 2022. Although some of these pressures have begun to ease, 2023 is still expected to be a challenging year for the manufacturing industry, with many transitions ahead.

One of the most interesting transitions in manufacturing is the trend toward digital transformation. Digital transformation, along with Industry 4.0, has been a buzzword for many years. Despite the fancy name, however, digital transformation is often a vague concept for manufacturers. It can be deconstructed into two main themes: safe human-robot interaction (HRI) to achieve higher productivity and an increased level of autonomous mobility for material and goods transportation. Both themes are ultimately enabled by modern sensor technologies.

Safe HRI to achieve higher productivity

Safety has always been the overarching priority when it comes to using robots and machines in the manufacturing industry. Robots can pose a variety of hazards to workers. For example, while industrial robots are designed to operate at a safe distance from people, these devices traditionally lack the sensory capabilities required to identify nearby humans. More recently, with the fast adoption of collaborative robots (cobots), human operators are directly exposed to the workspace of robots, which increases the risk of collisions. To mitigate these safety concerns, sensors such as force and torque sensors, lidar and tactile sensors are being installed on robots to give them better environmental perception and collision avoidance capabilities.

One of the critical applications of sensors in robots is proximity detection and collision detection. Proximity detection can be achieved using photoelectric sensors (photoelectric fences), lidar and capacitive proximity sensors. Photoelectric sensors/light curtains can be an ideal solution for industrial robots. A safety light curtain is made up of a transmitter and a receiver. The transmitter emits modulated infrared light, which is received by the receiver to create an array of light beams (also known as a light curtain). When a human operator enters the protected zone and blocks one or more beams, the receiver circuit responds through the internal control circuit, which outputs a signal to the machine, causing it to slow down or stop its operation and thereby preventing a potential collision.
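To make the light curtain's control logic concrete, here is a minimal Python sketch of how a safety controller might react when a beam is interrupted. The read_beam_states() function and Machine interface are hypothetical placeholders for the actual hardware I/O, not any vendor's API.

```python
import time

NUM_BEAMS = 32          # infrared beams forming the curtain
POLL_INTERVAL_S = 0.01  # how often the controller checks the beams


def read_beam_states():
    """Placeholder for hardware I/O: one boolean per beam,
    True if the beam reaches the receiver unblocked."""
    return [True] * NUM_BEAMS


class Machine:
    """Placeholder machine interface exposing stop/resume commands."""
    def stop(self):
        print("Machine stopped: light curtain interrupted")

    def resume(self):
        print("Machine resumed: curtain clear")


def monitor_light_curtain(machine):
    curtain_clear = True
    while True:
        beams = read_beam_states()
        if not all(beams) and curtain_clear:
            # One or more beams blocked: an operator has entered the zone
            machine.stop()
            curtain_clear = False
        elif all(beams) and not curtain_clear:
            machine.resume()
            curtain_clear = True
        time.sleep(POLL_INTERVAL_S)
```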

By contrast, force and torque sensors are commonly used for collision detection on cobots. Unlike industrial robots, cobots share a workspace with humans, meaning a physical light curtain or fence would not suffice. The majority of commercialized cobots are equipped with at least one force/torque (F/T) sensor at their joints. F/T sensors have two main functions: force measurement and collision detection. They are typically installed near the robot's end-effector to measure the applied force. Depending on the task, a range of acceptable forces is preset; when a collision happens, the force or torque detected by the sensor exceeds the predetermined range, signaling the robot to stop its operation. With the increasing safety requirements of HRI, more F/T sensors are expected to be installed. Most cobots currently have one F/T sensor, typically around the end-effector, but a few cobot original equipment manufacturers (OEMs) are starting to incorporate torque sensors on all joints to enable better force control and collision detection.
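The threshold-based collision check itself is conceptually simple. The following sketch assumes a hypothetical read_ft_sensor() call and task-specific limits; real cobot controllers implement this in firmware with far more sophistication.

```python
FORCE_LIMIT_N = 50.0    # preset per task: maximum expected contact force
TORQUE_LIMIT_NM = 10.0  # preset per task: maximum expected torque


def read_ft_sensor():
    """Placeholder for the F/T sensor near the end-effector:
    returns (force in newtons, torque in newton-meters)."""
    return 12.3, 1.8


def check_for_collision(stop_robot):
    """Call stop_robot() if measured force or torque leaves the preset range."""
    force, torque = read_ft_sensor()
    if force > FORCE_LIMIT_N or torque > TORQUE_LIMIT_NM:
        stop_robot()
        return True
    return False


# Example: a reading outside the limits would trigger the stop callback
check_for_collision(stop_robot=lambda: print("Cobot stopped: collision detected"))
```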

Increased level of autonomous mobility

Autonomous mobility is one of the most important parts of a robot's autonomy. It requires the robot to be able to navigate, localize itself and avoid obstacles. In the manufacturing industry, mobile robots, especially automated guided vehicles (AGVs) and autonomous mobile robots (AMRs), will be used for material transportation. The autonomous mobility of these robots is enabled by sensors such as lidar, cameras and ultrasonic sensors. Each sensor type has benefits and drawbacks, and in practice multiple sensors are usually used in combination to achieve the best overall performance.

For instance, lidar is relatively easy to use and largely immune to poor weather, but it often comes at a high cost. By contrast, cameras or imaging sensors are the only sensors that can be used for object classification and recognition, but they perform poorly in adverse weather or limited visibility. In the manufacturing industry, cameras will be increasingly adopted because these robots tend to work in well-controlled indoor environments with stable illumination. At this stage, many indoor AGVs in the manufacturing industry can operate at Level 3 autonomy, meaning the onboard systems can handle most autonomous driving tasks and multiple AGVs can be monitored simultaneously by one operator. With the trend toward Level 4 and higher levels of autonomy, more robust sensors will likely be incorporated.
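As a rough illustration of how such sensor combinations can be fused, the sketch below merges a lidar range reading with a camera-based person detection into a simple motion decision. The thresholds and function names are illustrative assumptions, not taken from any particular AGV platform.

```python
STOP_DISTANCE_M = 0.5   # stop if anything is closer than this
SLOW_DISTANCE_M = 2.0   # slow down inside this range


def decide_motion(lidar_min_range_m, camera_detects_person):
    """Return a motion command fused from lidar and camera inputs."""
    if lidar_min_range_m < STOP_DISTANCE_M:
        return "stop"
    if camera_detects_person or lidar_min_range_m < SLOW_DISTANCE_M:
        # Be conservative around people or nearby obstacles
        return "slow"
    return "cruise"


print(decide_motion(lidar_min_range_m=1.2, camera_detects_person=True))  # "slow"
```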

– Edited by Chris Vavra, web content manager, Control Engineering, CFE Media and Technology, cvavra@cfemedia.com.


Author Bio: Yulin Wang, technology analyst at IDTechEx.
