
Novel 3D printing method embeds sensing capabilities within robotic actuators

Soft robots that can sense touch, pressure, movement and temperature

By Leah Burrows, SEAS Communications

(CAMBRIDGE, Mass.) — Researchers at Harvard University have built soft robots inspired by nature that can crawl, swim, grasp delicate objects and even assist a beating heart, but none of these devices has been able to sense and respond to the world around them.

That’s about to change.

Inspired by our bodies’ sensory capabilities, researchers at the Wyss Institute for Biologically Inspired Engineering and the Harvard John A. Paulson School of Engineering and Applied Sciences have developed a platform for creating soft robots with embedded sensors that can sense movement, pressure, touch, and even temperature.

The research is published in Advanced Materials.

Researchers from the Wyss Institute and Harvard SEAS have developed a platform for 3D printed, soft robots with embedded sensors that can feel touch, pressure, motion and temperature. This technology could be used for integrated sensing across a range of soft robotic applications. Credit: Harvard SEAS

“Our research represents a foundational advance in soft robotics,” said Ryan Truby, first author of the paper and recent Ph.D. graduate at SEAS. “Our manufacturing platform enables complex sensing motifs to be easily integrated into soft robotic systems.”

Integrating sensors within soft robots has been difficult in part because most sensors, such as those used in traditional electronics, are rigid. To address this challenge, the researchers developed an organic ionic liquid-based conductive ink that can be 3D printed within the soft elastomer matrices that make up most soft robots.

“To date, most integrated sensor/actuator systems used in soft robotics have been quite rudimentary,” said Michael Wehner, former Postdoctoral Fellow at SEAS and co-author of the paper. “By directly printing ionic liquid sensors within these soft systems, we open new avenues to device design and fabrication that will ultimately allow true closed loop control of soft robots.”

Wehner is now an Assistant Professor at the University of California, Santa Cruz.

To fabricate the device, the researchers relied on an established 3D printing technique developed in the lab of Jennifer Lewis, Sc.D., Core Faculty Member of the Wyss Institute and the Hansjörg Wyss Professor of Biologically Inspired Engineering at SEAS. The technique — known as embedded 3D printing — seamlessly and quickly integrates multiple features and materials within a single soft body.

This soft robotic gripper is the result of a platform technology developed by Harvard researchers to create soft robots with embedded sensors that can sense inputs as diverse as movement, pressure, touch, and temperature. Credit: Ryan L. Truby/Harvard University

“This work represents the latest example of the enabling capabilities afforded by embedded 3D printing – a technique pioneered by our lab,” said Lewis.

“The function and design flexibility of this method is unparalleled,” said Truby. “This new ink combined with our embedded 3D printing process allows us to combine both soft sensing and actuation in one integrated soft robotic system.”

To test the sensors, the team printed a soft robotic gripper composed of three soft fingers, or actuators. The researchers tested the gripper’s ability to sense inflation pressure, curvature, contact, and temperature. They embedded multiple contact sensors so the gripper could distinguish light touches from deep ones.
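As a rough illustration only (not taken from the paper), the sketch below shows one simple way readings from two embedded contact sensors, one near the surface and one deeper in the elastomer, might be thresholded to tell a light touch from a deep press. The sensor names, units, and threshold values are hypothetical.

```python
# Illustrative sketch only: hypothetical thresholds for distinguishing
# light vs. deep touch from two embedded ionic-liquid contact sensors.
# Readings are assumed to be normalized resistance changes (dR/R0).

LIGHT_TOUCH_THRESHOLD = 0.02   # hypothetical value, not from the paper
DEEP_TOUCH_THRESHOLD = 0.10    # hypothetical value, not from the paper

def classify_touch(surface_reading: float, deep_reading: float) -> str:
    """Classify contact using a shallow (near-surface) and a deeper embedded sensor."""
    if deep_reading >= DEEP_TOUCH_THRESHOLD:
        return "deep touch"
    if surface_reading >= LIGHT_TOUCH_THRESHOLD:
        return "light touch"
    return "no contact"

# Example: a small signal on the surface sensor only -> light touch
print(classify_touch(surface_reading=0.03, deep_reading=0.01))
```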

“Soft robotics are typically limited by conventional molding techniques that constrain geometry choices, or, in the case of commercial 3D printing, material selection that hampers design choices,” said Robert Wood, Ph.D., Core Faculty Member of the Wyss Institute and the Charles River Professor of Engineering and Applied Sciences at SEAS, and co-author of the paper.  “The techniques developed in the Lewis Lab have the opportunity to revolutionize how robots are created — moving away from sequential processes and creating complex and monolithic robots with embedded sensors and actuators.”

Next, the researchers hope to harness the power of machine learning to train these devices to grasp objects of varying size, shape, surface texture, and temperature.
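As a purely hypothetical sketch of what such training could look like, one might fit a simple classifier that maps embedded-sensor readings (pressure, curvature, contact, temperature) to object categories. The data, labels, and model below are synthetic placeholders, not the authors' method.

```python
# Hypothetical sketch: mapping embedded-sensor readings to object categories.
# Feature order assumed: [inflation pressure, curvature, contact signal, temperature]
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.random((200, 4))                    # synthetic sensor readings
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # synthetic object label (e.g., soft vs. stiff)

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X, y)

new_reading = np.array([[0.7, 0.6, 0.2, 0.4]])  # sensor readings from one new grasp
print(model.predict(new_reading))
```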

The research was co-authored by Abigail Grosskopf, Daniel Vogt, and Sebastien Uzel. It was supported in part by the Harvard MRSEC and Harvard’s Wyss Institute for Biologically Inspired Engineering.
