Civilian drones are an emerging market in machine vision
While defense still drives cutting-edge innovation in the unmanned aircraft system (UAS) market for machine vision, civilian applications are multiplying. With them come a growing number of opportunities for vision components, from improving drone navigation and safety to capturing image data from heretofore challenging perspectives.
In late 2015, the Federal Aviation Administration (FAA) began to register all civilian drone owners who possessed at least one craft weighing between 0.55 and 55 pounds. By December of last year, that registry listed over 900,000 owners flying an estimated 1.25 million drones. Hobbyists dominated the ranks early on. But the number of commercial applications is now beginning to soar. The FAA forecasted in 2017 that 229,400 commercial drones would be registered by the end of 2019. But before 2018 had even closed, registrations had outstripped that forecast by 48,000. The FAA now projects commercial drones will number more than 835,000 by 2023.
During this period, media headlines have described a growing number of applications in which drone-mounted cameras assess a field of crops, map a construction site, or inspect a wind turbine. All these uses are worthy of a closer look. But what these stories often miss is just how fundamental machine vision has become to basic drone operation.
Consider DJI, the world’s largest manufacturer of drones for civilian use. The company has embedded ground-facing complementary metal-oxide-semiconductor (CMOS) sensors in all its products since inception. These chips are not the cameras mounted onboard to capture real estate listings or the local football game from 200 feet above. Instead, they provide a constant visual point of reference (i.e., the ground immediately below) that, when coupled with onboard memory and artificial intelligence, enables DJI drones to hover precisely in place if a gust of wind arises while the pilot is tying his shoes. Should a low battery signal automatically recall a DJI drone to its liftoff coordinates, GPS sensors will return it to within a few meters, but the embedded CMOS chip will navigate the final few centimeters for a precision touchdown.
“On our latest industrial drones, there are vision sensors on the front, back, and sides and infrared (IR) time-of-flight sensors on the top and bottom,” said Michael Oldenburg, senior communications manager, North America, for DJI. “So, you have omnidirectional sensors for obstacle detection, which feeds an advanced pilot assistance system. They’re literally flying robots.”
As with any vision-guided robot, a drone's technical sophistication can be measured by how much additional precision or functionality the control system wrings from the vision data. DJI provides limited details about the components underlying its drones' flight control. But higher-end models, such as its Mavic 2 or Matrice 200 quadcopters, incorporate a guidance sensor suite that collects data from five CMOS imagers and ultrasound sensors, processes it using FPGA-based hardware acceleration modules and two ARM Cortex-A9 cores, and feeds it into the flight-control system via a CAN bus cable or to other intelligent systems via a USB or UART cable.
As a result, the drone can automatically adjust for adverse environmental conditions or pilot error, even when flying at high speeds. “If you fly it toward a telephone pole, the drone will alert you,” Oldenburg said. “Previously, if you continued to press straight forward on the sticks, the drone would just stop and hover in front of the pole. But now it uses vision to plot a path around that obstacle without your input and continue on its course.”
Drones at work
Whether used for a hobby or a commercial venture, virtually all drones on the market today mount additional cameras beyond those used for navigation and collision avoidance. Most casual users and many commercial users can get by with cell phone-comparable imaging: 12 MP stills and 4K video at 60 frames per second (fps). This is suitable for capturing unique vacation pictures or, in Switzerland-based Flyability's case, for inspecting infrastructure.
The company built its business around using "collision-tolerant" drones for visual inspection of normally inaccessible spaces such as mines, bridge trusses, and even the interiors of tanks at nuclear power plants. While Flyability's Elios 2 quadcopter carries a thermal camera, its workhorse is a low-light CMOS imager with an effective 12.3 MP pixel count. At a 30 cm working distance (about what an inspector would see at arm's length), it delivers a resolution of 0.18 mm/pixel, which is enough to spot submillimeter cracks in infrastructure.
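Under the pinhole camera model, that resolution figure scales linearly with working distance, so the 0.18 mm/pixel spec at 30 cm fixes the camera's pitch-to-focal-length ratio and lets an operator estimate detail at any other standoff. A minimal sketch of that arithmetic (the only inputs are the two figures quoted above; the helper function name is ours, not Flyability's):

```python
# Back out the dimensionless pitch-to-focal-length ratio from the
# quoted spec: 0.18 mm/pixel at a 300 mm working distance.
PITCH_OVER_FOCAL = 0.18 / 300.0  # = 6e-4

def gsd_mm_per_px(distance_mm: float,
                  pitch_over_focal: float = PITCH_OVER_FOCAL) -> float:
    """Ground sampling distance: the object-space size of one pixel
    grows linearly with distance under the pinhole model."""
    return distance_mm * pitch_over_focal

# At the spec'd 30 cm standoff:
print(gsd_mm_per_px(300.0))   # recovers 0.18 mm/pixel
# Back off to 1 m and each pixel covers about 0.6 mm,
# so submillimeter cracks start to fall below one pixel.
print(gsd_mm_per_px(1000.0))
```

The same one-line model also explains Oldenburg's point below about sensor size: more pixels let a drone fly higher for the same object-space detail.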
While camera resolutions in the 12 MP range find frequent use in commercial applications such as construction and agriculture, resolutions can run as high as 20 MP.
“That sensor size is important for either 2-D mapping or 3-D mapping applications because it allows a drone to fly higher up and also capture a good amount of detail,” said Oldenburg. “By acquiring more information, that drone can capture that task faster.”
A more typical benchmark resolution for drone-mounted thermal cameras is QVGA (336 x 256 pixels) or VGA (640 x 512), according to Randall Warnas, global sUAS segment leader at Flir Systems.
“Thermographers recommend 10 by 10 pixels on a given target object to reliably detect and measure the temperature,” he said. “Given the typical standoff distance desired and a 640 by 512 resolution thermal camera, a 19-millimeter lens is very popular. When mapping, commercial drone operators are using approximately 80% overlap in the flight direction and 80% sidelap.”
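Warnas' rules of thumb translate directly into flight-planning arithmetic: the 10-by-10-pixel guideline caps how far the drone can stand off from a target, and the 80% overlap figure sets the spacing between exposures. A rough sketch under stated assumptions (the 17 µm pixel pitch is our assumption, typical of uncooled microbolometer cores; the article specifies only the 640 x 512 resolution and 19 mm lens, and the function names are ours):

```python
# Assumed sensor parameters for a 640 x 512 thermal core with a 19 mm lens.
PITCH_M = 17e-6   # 17 um pixel pitch -- an assumption, not from the article
FOCAL_M = 0.019   # 19 mm lens, per the article

def max_standoff_m(target_size_m: float, min_pixels: int = 10) -> float:
    """Farthest standoff at which a target still spans min_pixels.
    Pinhole model: pixels_on_target = size * focal / (distance * pitch)."""
    return target_size_m * FOCAL_M / (min_pixels * PITCH_M)

def photo_spacing_m(altitude_m: float, overlap: float = 0.80,
                    pixels_along_track: int = 512) -> float:
    """Distance between exposures for a given forward overlap:
    ground footprint along track, scaled by (1 - overlap)."""
    footprint = altitude_m * pixels_along_track * PITCH_M / FOCAL_M
    return footprint * (1.0 - overlap)

# A 1 m target needs the drone within roughly 110 m to cover 10 pixels.
print(max_standoff_m(1.0))
# Mapping at 100 m altitude with 80% forward overlap means an
# exposure roughly every 9 m along the flight line.
print(photo_spacing_m(100.0))
```

The same spacing formula applies across track: substituting the 640-pixel width gives the flight-line separation for the 80% sidelap Warnas cites.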
Thermal drone use is roughly divided between public safety applications in firefighting, law enforcement, and search and rescue on one side and commercial inspection for utilities, solar fields, roofs, and buildings on the other, according to Warnas. But mounting visible and thermal cameras aboard drones is also reaping benefits in agriculture, where drones let farmers inspect expansive tracts of land and gather a wealth of data about crops and livestock. Add a thermal camera to a drone, for example, and it can help find a cattle herd under heavy cover or in rough terrain. Or, as Texas A&M researchers have done, mount a Zenmuse XT 19 mm uncooled microbolometer camera to a Matrice 100 drone — both made by DJI — to produce insights into how heat signatures from the herd might speed detection of stress or disease.
Pilots not for hire
Though drone delivery of your pizza or e-commerce order is still in the speculative phase, industry analysts agree that retail applications could one day outstrip all others. Coherent Market Insights, for one, forecasts that the global market for delivery drones will see compound annual growth of 22.3% through 2026, owing to increasing adoption of drones in the e-commerce transportation and logistics industry. Meanwhile, businesses such as Amazon, Google, Walmart, UPS, and even 7-Eleven are actively exploring the technology.
But before drones can begin making deliveries, the industry will need to negotiate several emerging regulatory hurdles. For example, the FAA is formulating a requirement for remotely identifying drones in flight and is supporting the development of a universal traffic management system (UTMS). Currently, regulators in the U.S. and Europe also require all airborne platforms to be able to “see” other aircraft. In practice, this means unmanned drones must operate within the visual line of sight (VLOS) of a pilot or an observer in direct communication with the pilot.
Onboard machine vision cannot provide solutions for all these regulatory demands. Even the VLOS mandate may be more readily addressed by a UTMS. Plenty of obstacles, however — flying and otherwise — could interrupt a delivery drone’s path. So, it is likely that machine vision will be instrumental in avoiding collisions and improving flight precision.
“Beyond VLOS will be the next gateway the drone industry will pass through where we can expect a spike in demand and usefulness,” Warnas said. “We are within sight of fully autonomous missions, but it will likely be several years before it becomes widespread.” The key, he added, is to focus first on appropriate locations (e.g., remote versus urban), missions without viable alternatives to VLOS (e.g., first responder), and areas where the return on investment is very clear in terms of improved safety and financial outcomes.
In other words, the road — or flight path — to autonomous drones will advance incrementally. As human pilots and VLOS are slowly phased out of drone applications, the opportunities for machine vision will multiply.
Dan McCarthy is contributing editor for AIA. This article originally appeared in Vision Online. AIA is a part of the Association for Advancing Automation (A3), a CFE Media content partner. Edited by Chris Vavra, production editor, CFE Media, email@example.com.