Handing the surgeon’s scalpel to a robot

After decades of merely assisting doctors, are sophisticated machines ready to take charge?

Surgical robots have long played a role in operating theaters, but they are now achieving greater autonomy. Here, an autonomous robot called STAR sutures together the cut edges of the intestine of an anesthetized pig, a challenging task because the tissue is so flimsy and mobile.

CREDIT: COURTESY OF AXEL KRIEGER AND JIN KANG

In 2004, the United States’ Defense Advanced Research Projects Agency (DARPA) dangled a $1 million prize for any group that could design an autonomous car capable of driving itself through 142 miles of rough terrain from Barstow, California, to Primm, Nevada. Thirteen years later, the Department of Defense announced another award, this time not for a robot car but for autonomous, robotic doctors.

Robots have had a place in the operating suite since the 1980s, at first for tasks like holding a patient’s limbs in place, and later for laparoscopic surgery, in which surgeons use remote-controlled robot arms to operate on the human body through tiny holes instead of huge cuts. But for the most part these robots have been, in essence, just very fancy versions of the scalpels and forceps surgeons have been using for centuries: exceptionally sophisticated, granted, and capable of operating with remarkable precision, but still tools in the surgeon’s hands.

Despite many challenges, that is changing. Today, five years after that award announcement, engineers are taking steps toward building independent machines that not only can cut or suture, but can also plan those cuts, improvise and adapt. Researchers are improving the machines’ ability to navigate the complexities of the human body and to coordinate with human doctors. But the truly autonomous robotic surgeon that the military may envision, much like the truly driverless car, may still be a long way off. And the biggest challenge may not be technological at all: It may be convincing people that it’s OK to use these machines.

Navigating unpredictability

Like drivers, surgeons must learn to navigate their specific environments, something that sounds easy in principle but is endlessly complicated in the real world. Real-life roads have traffic, construction equipment, pedestrians — all things that don’t necessarily show up on Google Maps and which the car must learn to avoid.

Similarly, while one human body is generally like another, children’s movies are right: We’re all special on the inside. The precise size and shape of organs, the presence of scar tissue, and the placement of nerves or blood vessels often differ from person to person.

“There’s so much variation in the individual patients,” says Barbara Goff, a gynecologic oncologist and surgeon-in-chief at the University of Washington Medical Center in Seattle. “I think that that could be challenging.” She’s been using laparoscopic surgical robots — the kind that don’t move on their own but do translate the surgeon’s movements — for more than a decade.

The fact that bodies move poses a further complexity. A few robots already display some amount of autonomy, with one of the classic examples being a device with the (maybe-a-bit-on-the-nose) name ROBODOC, which can be used in hip surgery to shave down bone around the hip socket. But bone’s relatively easy to work with and, once locked into place, doesn’t move around much. “Bones don’t bend,” says Aleks Attanasio, a research specialist now at Konica Minolta who wrote about robots in surgery for the 2021 Annual Review of Control, Robotics, and Autonomous Systems. “And if they do, there’s a bigger problem.”

The da Vinci surgical robot, shown here on a US Navy hospital ship, is one of the most widely used devices to assist doctors in laparoscopic surgery. The procedure — in which tools are inserted through tiny holes in the abdomen instead of cutting a long incision — allows patients to recover more quickly.

CREDIT: KELSEY L. ADAMS, US NAVY / FLICKR

Unfortunately, the rest of the body isn’t as easy to lock in place. Muscles twitch, stomachs gurgle, brains jiggle, and lungs expand and contract, even before a surgeon gets in there and starts moving things around. And while a human surgeon can see and feel what they’re doing, how could a robot know whether its scalpel is in the right place or whether the tissue has shifted?

One of the most promising options for such dynamic situations couples cameras with sophisticated tracking software. In early 2022, for example, researchers at Johns Hopkins University used a device called the Smart Tissue Autonomous Robot (STAR for short) to sew two ends of severed intestine back together in an anesthetized pig, a potentially very jiggly task, guided by just such a visual system.

A human operator tags the ends of the intestine with drops of fluorescent glue, creating markers the robot can track (a bit like an actor wearing a motion-capture suit in a Hollywood movie). At the same time, a camera system creates a 3-D model of the tissue using a grid of light points projected onto the area. Together, these technologies allow the robot to see what is in front of it.

“What’s really special about our vision system is that it allows us to not only reconstruct what that tissue looks like, but it also does so fast enough that you can do it in real time,” says STAR system codesigner Justin Opfermann, an engineering PhD student at Hopkins. “If something does move during the surgery, you can detect and follow it.”
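In rough terms, the approach amounts to a track-and-replan loop: follow the markers from frame to frame, rebuild the 3-D surface, and refresh the stitch plan whenever the tissue drifts. The minimal Python sketch below illustrates that idea; the function names, the 2-millimeter drift threshold and the evenly spaced stitch plan are illustrative assumptions, not the actual STAR software.

    import numpy as np

    def plan_sutures(marker_a, marker_b, n_stitches=8):
        """Place evenly spaced suture targets along the cut edge between two markers."""
        t = np.linspace(0.0, 1.0, n_stitches)[:, None]
        return (1.0 - t) * marker_a + t * marker_b

    def tissue_moved(old_markers, new_markers, tolerance_mm=2.0):
        """Report whether any tracked marker has drifted farther than the tolerance."""
        return np.linalg.norm(new_markers - old_markers, axis=1).max() > tolerance_mm

    # Toy run: two fluorescent markers 30 mm apart; the tissue shifts 3 mm between frames.
    markers = np.array([[0.0, 0.0, 0.0], [30.0, 0.0, 0.0]])   # marker positions in mm
    plan = plan_sutures(markers[0], markers[1])

    new_markers = markers + np.array([0.0, 3.0, 0.0])          # the tissue jiggles
    if tissue_moved(markers, new_markers):
        plan = plan_sutures(new_markers[0], new_markers[1])    # replan in real time

    print(plan.round(1))   # updated stitch targets that follow the moved tissue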

The robot can then use this visual information to predict the best course of action, presenting the human operator with different plans to choose from or checking in with them in between sutures. In tests, STAR worked well on its own — though not perfectly. In total, 83 percent of the sutures could be done autonomously, but the human still had to step in the other 17 percent of the time to correct things.

“The 83 percent can definitely be overcome,” says Opfermann. Most of the problem was that the robot had a little trouble finding the right angle at certain corners and needed a human to nudge it into the right spot, he says. Newer, yet-to-be-published trials now have success rates in the high 90s. In the future, the human may only need to approve the plan, then watch it go, no intervention needed.
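That division of labor, with the robot executing most stitches and a human supervisor stepping in for the occasional correction, can be summed up in a few lines of Python. This is a hypothetical sketch of the supervisory loop, not code from the STAR trials; the 83 percent figure stands in only for how often the robot succeeds unaided.

    import random

    def robot_places_stitch(target):
        """Stand-in for one autonomous suture attempt; True means no help was needed."""
        return random.random() < 0.83   # roughly the autonomy rate reported for STAR

    def supervised_suturing(targets):
        autonomous, corrected = 0, 0
        for target in targets:
            if robot_places_stitch(target):
                autonomous += 1
            else:
                # A human nudges the tool into the right spot (for example, at a
                # tricky corner), then hands control back to the robot.
                corrected += 1
        return autonomous, corrected

    done_alone, needed_help = supervised_suturing(range(100))
    print(f"{done_alone} stitches fully autonomous, {needed_help} needed a human correction")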

A timeline of progress in surgical robotics

Since the early days of NASA designs in the 1970s, surgical robots have gradually become more and more capable. Eventually, they may be able to make and carry out decisions on their own, without intervention or supervision by human surgeons.

Passing the safety test

For now, though, there still needs to be someone in the driver’s seat, so to speak. And it may stay that way for a while, for many kinds of autonomous robots: We could theoretically hand over complete decision-making to the machine, but doing so raises a question that has also plagued driverless cars.

“What happens if some of these activities go wrong?” says Attanasio. “What if the car has an accident?”

The general view, for now, is that keeping the humans ultimately in control is best — at least in a supervisory role, reviewing and signing off on procedures and standing by in case of emergency.

Even so, proving to hospitals and regulators that autonomous robots are both safe and effective may be the single biggest roadblock to truly human-free robots entering the surgical suite. Experts have a few takes on how to get around this.

For instance, designers will likely need to be able to explain to regulators exactly how the robots think and decide what to do next, says Attanasio, especially if they progress to the point where they’re not just assisting a human surgeon but arguably practicing medicine themselves. That explanation may be easier said than done, though, since current artificial intelligence systems may leave observers few hints of how they make decisions. As a result, engineers may want to design with “explainability” in mind from the beginning.

Pietro Valdastri, a biomedical engineer at the University of Leeds in England and one of Attanasio’s coauthors, thinks it’s possible that no manufacturer will be able to easily solve the regulatory question, though he does have a work-around. “The solution here is to make a system that even if it’s autonomous, it’s inherently safe.” This means the next generation of surgical robots may not resemble roadsters so much as bumper cars.

This soft robot, steerable by externally controlled magnets, is designed to snake deep into a patient’s lungs to view the tissue there. The robot navigates the narrow passages on its own, eliminating the need for X-rays to help guide a human operator.

CREDIT: UNIVERSITY OF LEEDS

Valdastri is working on what are known as soft robots, particularly for colonoscopies. Traditionally, a colonoscopy requires snaking a flexible tube with a camera — an endoscope — through the intestine to look for early signs of colon cancer. The procedure is recommended for anyone over the age of 45 — but it can take a long time and a lot of training for an operator to become proficient with the endoscope. With few properly trained operators to go around, waitlists have ballooned.

But using a smart robot that can steer itself would make the job much easier — like driving a car in a video game, Valdastri says. The doctor could then focus on the matter at hand: spotting early signs of cancer. And in this case, the robot, created from soft materials, would be inherently safer than more rigid devices. It may even reduce the need for anesthesia or sedation, says Valdastri, since it could more easily avoid pushing against the intestinal walls. And with no way for the robot to cut or zap anything on its own, it may be easier for regulators to accept.

As the technology develops, Opfermann suggests, autonomous robots may start out getting approval only for simpler tasks, such as holding a camera. As more and more of these basic jobs get approved, the tasks may build up into an autonomous system. In cars, we first got cruise control, he says, but now there’s brake assist, lane assist, even assisted parking — all of which build towards something driverless.

“I think this will be kind of similar,” says Opfermann, “where we see small, autonomous tasks that eventually get chained together into a full system.”
