Edge Computing, Embedded Systems
Unsupervised machine learning (ML) acts on unlabeled data, using algorithms that excel at identifying patterns and pinpointing anomalies in applications ranging from condition monitoring and performance testing to cybersecurity and asset management.
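As a minimal sketch of the idea, the snippet below flags anomalies in unlabeled sensor readings using a robust z-score (median and median absolute deviation), one simple unsupervised rule; the function name and threshold are illustrative, not from any of the systems described here.

```python
import statistics

def detect_anomalies(readings, threshold=3.0):
    """Flag readings far from the bulk of the data (no labels needed).

    Uses a robust z-score based on the median and the median absolute
    deviation (MAD), a basic unsupervised anomaly-detection rule.
    """
    med = statistics.median(readings)
    mad = statistics.median(abs(x - med) for x in readings)
    if mad == 0:
        return []  # no spread: nothing stands out
    # 0.6745 scales MAD so the score is comparable to a standard z-score
    return [x for x in readings if abs(0.6745 * (x - med) / mad) > threshold]

# Example: vibration readings from a condition-monitoring sensor
readings = [1.01, 0.98, 1.02, 0.99, 1.00, 4.75, 1.03]
print(detect_anomalies(readings))  # the 4.75 spike is flagged
```

Real deployments would use richer models (clustering, autoencoders, isolation forests), but the principle is the same: the algorithm learns what "normal" looks like from the data itself.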
A neural network approach captures the characteristics of a physical system’s dynamic motion from video, regardless of rendering configuration or image differences.
USC Viterbi researchers have used machine learning (ML) techniques to develop a neural network that can model a high-performing new material.
Researchers have developed and demonstrated a ‘self-driving lab’ that uses AI and fluidic systems to advance the understanding of metal halide perovskite (MHP) nanocrystals, insights that could be used to improve many different materials.
Georgia Tech is pursuing 30 quick-turn research projects that touch on key priority areas such as the internet for the future, capabilities at the edge, and optimized application experiences.
The speed of adoption and the potential of combined artificial intelligence (AI) and Internet of Things (IoT) technologies could yield many benefits for the industrial sector.
Secure, granular data processing, connectivity, and communication in the field create a foundation for large-scale Industrial Internet of Things (IIoT) deployments, according to a company with a Control Engineering Engineers’ Choice Award product.
When considering industrial edge computing, look for a zero-touch, secure, and automated platform designed and built for edge environments. Seek self-protecting and self-monitoring features, as well as built-in application virtualization with fault tolerance, according to a company with a Control Engineering Engineers’ Choice Award product.
A technique developed by North Carolina State University researchers uses compression to drastically reduce the size of data transmissions, creating additional opportunities for artificial intelligence (AI) training over wireless networks.
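To illustrate the general idea (not the researchers' specific method), the sketch below quantizes a list of floating-point values (e.g., model gradients assumed to lie in [-1, 1]) down to single bytes and then deflate-compresses them before "transmission"; the receiver reverses both steps at a small loss of precision. Function names and the scale factor are assumptions for the example.

```python
import struct
import zlib

def compress_update(values, scale=127):
    """Quantize floats in [-1, 1] to int8, then deflate-compress the bytes."""
    clipped = [max(-1.0, min(1.0, v)) for v in values]
    quantized = bytes(round(v * scale) & 0xFF for v in clipped)  # two's-complement int8
    return zlib.compress(quantized)

def decompress_update(payload, scale=127):
    """Reverse the pipeline: inflate, then map int8 back to floats."""
    raw = zlib.decompress(payload)
    return [struct.unpack("b", bytes([b]))[0] / scale for b in raw]

# Example: a repetitive gradient vector compresses far below its
# 8-bytes-per-float uncompressed size.
grads = [0.5, -0.25, 0.5, 0.5, 0.0] * 20
payload = compress_update(grads)
print(len(payload), "bytes vs", len(grads) * 8, "uncompressed")
```

Quantization trades a little accuracy for a 8x size reduction before entropy coding even starts, which is why variants of this pipeline appear in federated and distributed training over constrained links.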
Purdue University researchers have developed a method that could let computer chips rewire themselves to take in new data the way the brain does, helping artificial intelligence (AI) keep learning over time.