
AI-driven training system developed at URMC could reshape how surgeons learn critical procedures


Rochester, New York – A team of researchers at the University of Rochester Medical Center (URMC) has developed a groundbreaking, instructor-free training system for surgeons that could radically change how surgical skills are taught. By merging 3D-printed organs, artificial intelligence (AI), and augmented reality (AR), the new platform creates a fully autonomous and highly realistic learning environment — and early results suggest it may work even better than traditional methods.

The new system, called Educational System for Instructorless Surgical Training (ESIST), is designed to tackle a long-standing issue in surgical education: the need for constant access to skilled mentors. For decades, the typical model for learning surgery has followed the “see one, do one, teach one” philosophy — watch an expert, then slowly take over. But in reality, that model relies heavily on having experienced instructors available, which isn’t always the case in busy, high-pressure hospitals.

Now, thanks to ESIST, a trainee can learn and practice critical surgical maneuvers without an expert physically present in the room — and still receive precise, real-time feedback.

“This proof of principle demonstrates that deep learning, paired with extended reality, can autonomously teach and assess a critical surgical maneuver with near-perfect accuracy and high user satisfaction,” said Jonathan Stone, MD, director of Surgical Innovation at URMC.

Realistic Simulation Meets Cutting-Edge AI

ESIST uses a combination of technologies that simulate the feel and complexity of real surgery. The process begins with lifelike organs, created using medical imaging, advanced 3D printing, and tunable hydrogels that mimic the texture and properties of real human tissue. These synthetic organs give trainees something tangible to operate on.

Wearing an augmented reality headset, the trainee sees step-by-step instructions projected into their field of vision as they practice a specific surgical step — in this case, placing a clamp on a kidney artery, which is part of a partial nephrectomy procedure. The headset’s feed is connected to a convolutional neural network, a type of AI specialized in analyzing visual data. This AI watches the surgery unfold via a laparoscopic camera and instantly assesses the trainee’s work.


If the clamp is misplaced — for example, on a vein or the ureter — the system stops the session and explains the error. If the clamp is placed correctly on the artery, it gives the green light to proceed.

All of this happens in real time, without human intervention.
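The published details of ESIST's software are not available, but the decision logic described above can be illustrated with a simplified sketch. Here a stub function stands in for the convolutional neural network, and the frame labels, function names, and messages are all hypothetical, not taken from the actual system:

```python
# Illustrative sketch of ESIST's real-time feedback loop as described in the
# article. classify_frame() is a stand-in for the trained CNN, which in the
# real system would run inference on each laparoscopic video frame.

ARTERY, VEIN, URETER, NO_CLAMP = "artery", "vein", "ureter", "no_clamp"

def classify_frame(frame):
    """Stub for the CNN: returns where the clamp currently sits.
    A mock frame here already carries its label."""
    return frame["clamp_on"]

def feedback_loop(frames):
    """Watch the procedure frame by frame: halt with an explanation if the
    clamp lands on a vein or the ureter, and give the green light once it
    is correctly placed on the renal artery."""
    for frame in frames:
        placement = classify_frame(frame)
        if placement in (VEIN, URETER):
            return f"STOP: clamp placed on the {placement}, not the renal artery"
        if placement == ARTERY:
            return "PROCEED: clamp correctly placed on the renal artery"
    return "CONTINUE: no clamp detected yet"

# Simulated session: the trainee hesitates, then clamps the artery.
session = [{"clamp_on": NO_CLAMP}, {"clamp_on": NO_CLAMP}, {"clamp_on": ARTERY}]
print(feedback_loop(session))
# prints "PROCEED: clamp correctly placed on the renal artery"
```

In the real system this loop would run continuously on live video rather than on a finished list of frames, which is what allows the session to be stopped the moment an error appears.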

“The goal is to move the very early portion of the learning curve outside of the operating room,” Stone explained. “This system doesn’t replace the mentor; it prepares the student better before they work with that mentor on patients.”

Results Show Strong Promise

To test the system, the team invited 17 participants to complete the simulation. The AI correctly identified the clamp placement 99.9 percent of the time, demonstrating both its precision and its potential for autonomous assessment.

Feedback from the users was equally impressive. Most trainees appreciated the instant feedback and realistic experience. According to survey results, 84 percent of responses rated the system positively for teaching that critical surgical task.

In addition to assessing accuracy, ESIST also logs every move made by the user, measuring how long each step takes and identifying performance trends over time. That kind of objective data is a major asset, especially in a field where performance is traditionally judged subjectively.
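The article does not describe how that data is stored, but the kind of objective, per-step record it refers to might look something like the following sketch. All class and field names here are illustrative assumptions, not details of the actual ESIST software:

```python
# Hypothetical per-trainee performance log: records how long each surgical
# step took so trends can be tracked objectively across sessions.
from statistics import mean

class SessionLog:
    def __init__(self):
        self.steps = []  # list of (step_name, seconds) entries

    def record(self, step_name, seconds):
        """Log one completed step and its duration."""
        self.steps.append((step_name, seconds))

    def mean_time(self, step_name):
        """Average duration for a given step, or None if never attempted."""
        times = [s for name, s in self.steps if name == step_name]
        return mean(times) if times else None

log = SessionLog()
log.record("identify artery", 42.0)
log.record("place clamp", 18.5)
log.record("place clamp", 12.5)   # faster on the second attempt
print(log.mean_time("place clamp"))  # prints 15.5
```

Aggregating simple timings like these across many sessions is what turns subjective impressions of a trainee's progress into measurable performance trends.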

Future Potential: From Practice to Real Operations

While this initial version of ESIST focuses on training outside the operating room, the research team sees a future where AI systems like this could assist surgeons in real time during live procedures.

“This early step was about training, but in the future, it will be about augmenting performance,” said Stone. “The long-term vision is that AI models will be running in the background, processing surgical data, and providing real-time predictive analytics that improve efficiency and reduce complications.”

As both hardware and software continue to evolve, these autonomous systems may one day become a standard fixture in operating rooms — providing reminders about anatomy, warning about potential errors, or even suggesting more efficient techniques based on live data.

That level of integration could help standardize training and elevate performance across the board, especially in smaller hospitals or low-resource settings where experienced instructors may not always be available.

A Collaborative Effort Backed by Major Institutions

The research behind ESIST was supported by the National Institute of Biomedical Imaging and Bioengineering and the National Science Foundation, highlighting its significance in the field of medical innovation.


Contributors to the project include Nelson Stone of the Icahn School of Medicine at Mount Sinai, and researchers Steven Griffith, Kyle Zeller, and Michael Wilson from Viomerse, Inc. Jonathan Stone is also the founder and principal equity holder of Viomerse, while Griffith and Wilson hold equity in the company.

As for what comes next, the URMC team is focused on expanding the system’s capabilities — training it to cover entire procedures and exploring how the AI platform can enhance real-world surgeries, not just simulations.

In a time when healthcare systems are increasingly stretched, technologies like ESIST offer a hopeful glimpse into the future — one where precision, access, and education converge to create safer surgeries and more confident surgeons.

 
