
Illustration of how controlled rotation of boron nitride (BN) layers above and below a graphene layer introduces coexisting moiré superlattices, which change in size, symmetry, and complexity as a function of angle. In this system, the Columbia researchers achieve unprecedented control over monolayer graphene's band structure within a single device by mechanically rotating boron nitride atop graphene aligned to a bottom BN slab. Courtesy: Columbia University
Sensors, Vision December 15, 2019

Restoring graphene’s symmetry with a twistable electronics device

Columbia University researchers have developed a method to restore graphene's symmetry by adjusting the twist angle between the boron nitride layers above and below it. The technique could enable nanoelectromechanical sensors with applications in astronomy, medicine, search and rescue, and more.

By Holly Evarts
Sensors, Vision December 9, 2019

Machine vision sales drop in Q3 2019

The AIA reported a 4.6% decline in North American machine vision sales in the third quarter of 2019.

By Association for Advancing Automation (A3)
A machine vision project can succeed by defining expectations, basing selection on needs, and conducting a proof-of-concept. Courtesy: MartinCSI
Sensors, Vision November 29, 2019

3 steps to a successful machine vision project

Machine vision can add to a project’s quality and throughput. Follow these three steps to help a vision project succeed.

By Ian Visintine
Sensors, Vision November 24, 2019

Machine vision can improve random bin picking

Random 3-D bin picking is a developing robotic skill that requires robots to see and act more like humans, a complex task. Machine vision can help.

Sensors, Vision November 17, 2019

Machine vision and AI enhance 3-D printing

Adding machine vision and artificial intelligence (AI) to 3-D printing allows industrial printers to produce products that have never been printed before.

A new shoebox-sized laser produces terahertz waves (green squiggles) by using a special infrared laser (red) to rotate molecules of nitrous oxide, or laughing gas, packed in a pen-sized cavity (grey). Courtesy: Chad Scales, US Army Futures Command/MIT
Sensors, Vision November 15, 2019

Researchers generate terahertz laser with laughing gas

Researchers from MIT, Harvard University, and the U.S. Army have built a compact device that uses nitrous oxide to produce a terahertz laser whose frequency can be tuned over a wide range, an advance that could improve wireless communication.

By Jennifer Chu
Through energy harvesting, the IIoT can deliver maintenance-free wireless sensors powered by sources such as vibration, RF, and light energy. Courtesy: Chris Vavra, CFE Media and Technology
Wireless November 13, 2019

Using energy harvesting, radio technologies to power wireless sensors

Gathering data from wireless sensors is critical in the Industrial Internet of Things (IIoT) era. Energy harvesting and radio frequency identification (RFID) can give operators peace of mind by eliminating the need to worry about batteries.

By Chris Vavra
Focused laser light generates an optical “tractor beam,” which can manipulate and orient semiconductor nanorods (red) with metal tips (blue) in an organic solvent solution. The energy from the laser superheats the metallic tip of the trapped nanorod, allowing the aligned nanorods to be welded together end-to-end in a solution-based “nanosoldering” process. Courtesy: University of Washington
Discrete Manufacturing November 12, 2019

Method developed to make nanoscale manufacturing possible

Researchers at the University of Washington have developed a method that could make reproducible manufacturing at the nanoscale possible, enabling new potential applications.

By James Urton
Sensors, Vision October 29, 2019

Improve the supply chain with drone-based image recognition

Warehouses are turning to drone-based image recognition to improve supply chain efficiencies.

A new type of lightweight, inexpensive hyperspectral camera could enable precision agriculture. This graphic shows how different pixels can be tuned to specific frequencies of light that indicate the various needs of a crop field. Courtesy: Maiken Mikkelsen and Jon Stewart, Duke University
Sensors, Vision October 27, 2019

Hyperspectral cameras designed to improve agricultural practices

A Duke University researcher, supported by a recently awarded fellowship, is developing a small, inexpensive hyperspectral camera to enable precision agriculture worldwide.

By Ken Kingery