Edge Computing, Embedded Systems
How unsupervised machine learning benefits industrial automation
Unsupervised ML acts on unlabeled data, using algorithms that excel at identifying patterns and pinpointing anomalies in applications ranging from condition monitoring and performance testing to cybersecurity and asset management.
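To make that concrete, below is a minimal sketch of unsupervised anomaly detection on unlabeled sensor readings, using scikit-learn's IsolationForest; the vibration and temperature features are hypothetical, as the article does not name a specific algorithm.

```python
# A minimal sketch of unsupervised anomaly detection on unlabeled machine data.
# The sensor features (vibration, temperature) are hypothetical stand-ins.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=[0.5, 40.0], scale=[0.05, 1.0], size=(500, 2))  # healthy machine
faulty = rng.normal(loc=[1.5, 55.0], scale=[0.30, 3.0], size=(10, 2))   # drifting machine
readings = np.vstack([normal, faulty])  # columns: vibration (mm/s), temperature (deg C)

model = IsolationForest(contamination=0.02, random_state=0).fit(readings)
labels = model.predict(readings)        # -1 = anomaly, 1 = normal
print(f"{(labels == -1).sum()} readings flagged for inspection")
```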
Improving motion capture with neural networks
A neural network approach captures the characteristics of a physical system’s dynamic motion from video, regardless of rendering configuration or image differences.
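As a rough illustration of the idea (not the paper's actual architecture, which the blurb does not describe), the PyTorch sketch below trains a frame encoder so that its latent states obey a learned dynamics model, pushing the representation toward the motion itself rather than rendering details; all shapes and the synthetic clips are hypothetical.

```python
# A minimal sketch: learn a latent dynamic state from video frames by enforcing
# consistency with a learned dynamics model. Sizes and data are hypothetical.
import torch
import torch.nn as nn

class FrameEncoder(nn.Module):
    """Map an RGB frame to a low-dimensional dynamic state."""
    def __init__(self, state_dim=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, state_dim),
        )
    def forward(self, x):
        return self.net(x)

encoder = FrameEncoder()
dynamics = nn.Linear(8, 8)  # predicts the next latent state from the current one
opt = torch.optim.Adam(list(encoder.parameters()) + list(dynamics.parameters()), lr=1e-3)
frames = torch.rand(16, 2, 3, 64, 64)  # hypothetical (batch, time, C, H, W) clips

for step in range(100):
    z_t = encoder(frames[:, 0])                     # state at time t
    z_next = encoder(frames[:, 1])                  # state at time t+1
    loss = ((dynamics(z_t) - z_next) ** 2).mean()   # latent dynamics consistency
    opt.zero_grad(); loss.backward(); opt.step()
```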
Creating energy-efficient electronics using machine learning
USC Viterbi researchers have developed a neural network that uses machine learning (ML) techniques to model a high-performing new material.
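The general recipe is a neural-network surrogate: fit the network to map material or process parameters to a measured property, then query it cheaply in place of a full simulation or experiment. The sketch below assumes hypothetical composition inputs and a stand-in quadratic property; the USC model's actual inputs and outputs are not given in the blurb.

```python
# A minimal sketch of a neural-network surrogate for a material property.
# The parameters and the quadratic "ground truth" are hypothetical stand-ins.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(400, 3))                # e.g., composition fractions
y = 1.0 - ((X - 0.5) ** 2).sum(axis=1) + rng.normal(0, 0.01, 400)  # stand-in property

surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=1)
surrogate.fit(X, y)

candidate = np.array([[0.5, 0.5, 0.5]])             # query the cheap model
print("predicted property:", surrogate.predict(candidate)[0])
```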
Self-driving lab used to improve nanocrystal understanding
Researchers have developed and demonstrated a ‘self-driving lab’ that uses AI and fluidic systems to advance understanding of metal halide perovskite (MHP) nanocrystals, which could be used to improve many different materials.
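A "self-driving lab" runs a closed loop: a model proposes the next synthesis condition, an experiment measures the result, and the model is refit before the next proposal. The sketch below simulates that decision loop with a Gaussian process and an upper-confidence-bound pick over a hypothetical one-dimensional objective; the real system drives fluidic hardware, which is not modeled here.

```python
# A minimal simulation of a self-driving-lab loop: propose, measure, refit.
# The reaction-temperature objective is a hypothetical stand-in for an MHP
# nanocrystal quality metric.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def run_experiment(temp):
    """Simulated measurement: quality peaks near 150 deg C, with noise."""
    return np.exp(-((temp - 150.0) / 30.0) ** 2) + np.random.normal(0, 0.01)

candidates = np.linspace(50, 250, 201).reshape(-1, 1)  # reaction temperatures (deg C)
X, y = [[60.0]], [run_experiment(60.0)]                # seed experiment

gp = GaussianProcessRegressor()
for _ in range(15):
    gp.fit(np.array(X), np.array(y))
    mean, std = gp.predict(candidates, return_std=True)
    nxt = candidates[np.argmax(mean + std)][0]         # upper-confidence-bound pick
    X.append([nxt]); y.append(run_experiment(nxt))

print(f"best condition found: {X[int(np.argmax(y))][0]:.0f} deg C")
```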
Collaboration on research projects for edge computing, internet and more
Georgia Tech is pursuing 30 quick-turn research projects that touch on key priority areas such as the internet for the future, capabilities at the edge, and optimized application experiences.
AIoT improves transparency, quality for manufacturers
The rapid adoption of combined artificial intelligence (AI) and Internet of Things (IoT) technologies could deliver many benefits to the industrial sector.
Product advice: Building IIoT with edge I/O systems
Secure, granular data processing, connectivity, and communication in the field create a foundation for large-scale IIoT, according to a company with a Control Engineering Engineers’ Choice Award product.
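As a rough, standard-library illustration of that pattern, the sketch below samples a field signal, aggregates it and pre-computes an alarm locally at the edge, and emits a compact JSON record upstream (in practice over MQTT, OPC UA, or similar); the point names and thresholds are hypothetical.

```python
# A minimal sketch of granular edge processing at a field I/O node.
# Sensor names, thresholds, and the print-based "publish" are hypothetical.
import json
import random
import statistics
import time

def read_analog_input():
    """Stand-in for a field I/O read, e.g., a 4-20 mA pressure signal."""
    return random.gauss(12.0, 0.4)

samples = [read_analog_input() for _ in range(10)]     # sample a burst locally
record = {
    "point": "pressure_ai_01",
    "mean_mA": round(statistics.mean(samples), 3),     # aggregate at the edge...
    "max_mA": round(max(samples), 3),
    "alarm": max(samples) > 18.0,                      # ...and pre-compute alarms
    "ts": time.time(),
}
print(json.dumps(record))  # in a real system, publish this to a broker or SCADA host
```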
Zero-touch, secure and highly automated edge computing platform
When considering industrial edge computing, look for a zero-touch, secure and automated platform, designed and built for edge environments. Seek self-protecting and self-monitoring features and built-in application virtualization with fault tolerance, according to a company with a Control Engineering Engineers’ Choice Award product.
Technique makes federated learning easier for AI in wireless devices
A technique developed by North Carolina State University researchers uses compression to drastically reduce the size of data transmissions, creating additional opportunities for artificial intelligence (AI) training on wireless technologies.
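The blurb does not specify the researchers' compression scheme, but top-k sparsification, in which each device transmits only the k largest-magnitude entries of its model update, is a representative stand-in and is sketched below.

```python
# A minimal sketch of shrinking federated-learning transmissions via top-k
# sparsification. This is a generic stand-in, not the NC State method.
import numpy as np

def compress(update, k):
    """Keep the k largest-magnitude entries; send (indices, values) only."""
    idx = np.argsort(np.abs(update))[-k:]
    return idx, update[idx]

def decompress(idx, vals, size):
    """Server side: rebuild a dense update with zeros elsewhere."""
    dense = np.zeros(size)
    dense[idx] = vals
    return dense

update = np.random.default_rng(2).normal(size=10_000)  # a device's local model delta
idx, vals = compress(update, k=100)                    # ~100x fewer values over the air
restored = decompress(idx, vals, update.size)
print(f"transmitted {vals.size + idx.size} numbers instead of {update.size}")
```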
Method developed to let AI learn, retain information
Purdue University researchers have developed a method that could let computer chips rewire themselves to take in new data the way the brain does, helping artificial intelligence (AI) keep learning over time.
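In software terms, the capability resembles incremental learning: updating a model on new batches without retraining from scratch. The sketch below illustrates that behavior with scikit-learn's partial_fit; it says nothing about Purdue's actual mechanism, which is hardware reconfiguration.

```python
# A minimal software analogue of learning over time: incremental updates on
# arriving data batches. The drifting synthetic data is hypothetical.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(3)
model = SGDClassifier(loss="log_loss")
classes = np.array([0, 1])

for batch in range(5):                                  # data arrives over time
    X = rng.normal(size=(200, 4)) + batch * 0.1         # slowly drifting inputs
    y = (X[:, 0] + X[:, 1] > batch * 0.2).astype(int)
    model.partial_fit(X, y, classes=classes)            # update, don't retrain
    print(f"batch {batch}: train acc = {model.score(X, y):.2f}")
```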