Local News
Researchers explore how brain-inspired analog hardware could improve drone energy efficiency and performance

Rochester, New York – In a quiet but significant shift in how machines might “see” the world, engineers at the University of Rochester are working on a new type of computing system—one that could help drones and other autonomous technologies become far more efficient than they are today.
The big idea? Instead of relying on power-hungry digital computers to drive artificial intelligence (AI) systems, these researchers are turning to analog hardware that mimics the human brain’s own visual and prediction processes. If successful, the technology could lead to lightweight, energy-saving onboard systems for drones, self-driving cars, and other intelligent machines.
AI today is mostly powered by neural networks, which were inspired by how the brain works, yet ironically they run on digital computers that were never designed for tasks like real-time image recognition. “The artificial intelligence systems that guide drones and self-driving cars rely on neural networks—trainable computing systems inspired by the human brain,” the research team notes. “But the digital computers they run on were initially designed for general-purpose computing tasks ranging from word processing to scientific calculations and have ultra-high reliability at the expense of high-power consumption.”
Professor Michael Huang, who teaches electrical and computer engineering, computer science, and data science at the University of Rochester, says this digital-dominated approach has limits. It’s effective, but not sustainable—especially for mobile, autonomous systems that need to do a lot of thinking without access to large power sources.
So, Huang and his colleagues are experimenting with predictive coding networks, a model inspired by neuroscience. Instead of mimicking the back-propagation algorithms used in most neural networks today—which Huang calls “biologically implausible”—predictive coding mirrors how scientists think the human brain constantly refines its understanding of the world through feedback.
“Research by neuroscientists has shown that the workhorse of developing neural networks—this mechanism called back propagation—is biologically implausible and our brains’ perception systems don’t work that way,” Huang explains. “To solve the problem, we asked how our brains do it. The prevailing theory is predictive coding, which involves a hierarchical process of prediction and correction—think paraphrasing what you heard, telling it to the speaker, and using their feedback to refine your understanding.”
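In software terms, the predict-and-correct cycle Huang describes can be sketched as a simple feedback loop: a higher layer holds a belief about the world, generates a top-down prediction of the input, measures the prediction error, and uses that error to refine the belief. The sketch below is purely illustrative; the variable names (`W`, `r`), the update rule, and the learning rate are assumptions for exposition, not details of the Rochester team's design.

```python
# Minimal two-layer predictive coding loop (illustrative sketch only).
# A belief r predicts the input x through fixed generative weights W;
# the bottom-up prediction error feeds back to correct the belief.

def predict(W, r):
    """Top-down prediction of the input from the current belief r."""
    return [sum(W[i][j] * r[j] for j in range(len(r))) for i in range(len(W))]

def refine_belief(W, x, r, eta=0.1, steps=50):
    """Iteratively correct r so its prediction matches the observed input x."""
    for _ in range(steps):
        # Bottom-up error: how far off was the prediction?
        error = [x[i] - p for i, p in enumerate(predict(W, r))]
        # Feedback: nudge the belief in the direction that reduces the error.
        r = [r[j] + eta * sum(W[i][j] * error[i] for i in range(len(W)))
             for j in range(len(r))]
    return r

# Toy example: two observed values explained by one latent belief.
W = [[1.0], [2.0]]   # fixed generative weights
x = [1.0, 2.0]       # observed input
r = refine_belief(W, x, [0.0])
```

After a few dozen iterations the belief settles so that its prediction closely matches the input, which is the essence of the "paraphrase, get feedback, refine" analogy: inference happens through repeated local corrections rather than a single back-propagation pass.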
This kind of feedback loop is what the Rochester team hopes to replicate in hardware—not in digital code, but in analog circuits. These circuits would use far less power than traditional chips while still being able to process complex visual data. And rather than relying on experimental new parts, the system will be built using existing manufacturing processes such as CMOS, a widely used semiconductor technology.
The project is ambitious, but it’s not being done in isolation. Huang is joined by fellow professors Hui Wu and Tong Geng, along with students and research partners from Rice University and UCLA. Backing the research is DARPA, the Defense Advanced Research Projects Agency, which has committed up to $7.2 million over the next four and a half years to support the team’s work.
The first major step will be developing a prototype that can classify static images using the new analog system. If the prototype proves that analog predictive coding networks can match—or come close to—digital neural network performance, the team sees a clear path toward scaling the system for far more demanding applications.
While today’s drones must often rely on remote data processing or heavy batteries to run their AI systems, this new approach could enable onboard intelligence that’s leaner, faster, and more sustainable. That means drones that stay in the air longer, navigate more accurately, and react to changes in the environment with something that looks a lot more like human intuition.
The University of Rochester has long been a hub for research in computer vision, and this latest initiative builds on that legacy. In fact, Huang credits former professor Dana Ballard—one of the original minds behind predictive coding theory—as a foundational influence on the project.
Whether it’s used for mapping, surveillance, disaster response, or even package delivery, tomorrow’s drones may owe their agility not to faster code, but to smarter, brain-like hardware that listens, learns, and adjusts—just like we do.
