Shuran Song
She’s making robots more helpful and adaptive in everyday life.
Deep learning techniques have transformed how robots master new tasks. As head of the Robotics and Embodied AI Lab at Stanford University, Shuran Song is at the forefront of that shift, finding creative ways to make robots more useful.
Just recently, Song, 33, and her team designed a low-cost way of giving robots a new sense: hearing. Most robots navigate primarily with sight through cameras—a problem in low-visibility environments. Song’s lab built a system to capture audio, which made robots better at tasks like erasing a whiteboard or emptying a cup.
The new system builds on one of Song’s most significant contributions to the field: a handheld gripper equipped with microphones that anyone can use for everything from washing dishes to repairing a bike. As you complete a task with the gripper, the device continuously tracks your movements and records audio and video. That data can then be used to train robots, much the way large language models are trained on text.
Song is making the training data she collects open source. She’s working on a number of collaborative datasets, including DROID, which can be used by academic researchers who have far less access to training data than startups backed by venture capital firms.
Safe, useful robots that help with our daily at-home tasks are still some time away, but we’re getting closer, thanks in part to Song’s work.
Deepak Pathak
He’s teaching robots to learn on the fly.
Generative AI models are good at producing text, images, and video in part because they have so many examples of human-generated content on which to train. But since navigating the physical world is more complicated, robots aren’t so lucky.
For years, roboticists have either programmed robots to perform specific actions as they encounter familiar obstacles, or trained them to tackle new tasks using hyperrealistic simulations. But most robots still struggle to adapt to new environments or changing conditions.
Deepak Pathak, 31, is helping robots learn on the fly. His work on adaptive robot learning has made it possible for robots to solve novel challenges as they operate in the real world.
Pathak took an unconventional approach: Instead of training his robots on realistic simulations, he deliberately made his simulations unrealistic—full of cartoonish angles and bizarre terrain—and prone to random changes. Robots in his simulations learn to adapt above all else as the world constantly shifts around them.
Pathak has also shown that robots can learn by watching (through a camera) YouTube videos of people performing specific tasks. The robot then practices the skill on its own until it gets it right, through what’s known as self-supervised learning.
With this method, Pathak has shown that robots can learn more than 20 tasks in just a few hours, including cleaning a whiteboard and removing a plug from a socket. However, tasks that require a precise amount of force or pressure remain a challenge. It’s hard for a robot to tell from videos alone, for example, how hard to grip a jar to open it.
Though Pathak has helped many robots learn lots of simple tasks, he has bigger plans. He wants to create a general-purpose robot that could perform helpful household tasks or even take on dangerous or tedious work, like harvesting crops or stocking warehouse shelves, that humans must perform today.
To that end, Pathak launched a company called Skild AI in July after raising $300 million. The company aims to build the first foundation model for robotics, which could someday be used to create that general-purpose robot.
Fangyu Zhang
He created tiny, drug-carrying biorobots to treat pneumonia and possibly cancer.
Fangyu Zhang, 34, engineered single-cell, self-propelled microalgae into a swarm of swimming biorobots that can be used to treat bacterial pneumonia and other infections before being dissolved naturally by the body. Now he’s working on similar treatments for cancer.
This new approach has advantages over intravenous antibiotics and chemotherapy. In the case of lung infections, which Zhang and his team studied in a mouse model, the antibiotic dose needed to treat pneumonia was reduced by over 99.9%. And targeting cancer cells with chemotherapy-laden biorobots could produce fewer side effects than intravenous treatments while boosting the odds of successful treatment.
Zhang’s breakthrough depends on a few steps. First, he and his team created drug-infused nanocoatings that could survive being passed through the stomach or exposed to the lungs. Next, they attached the drug-delivery coatings to the surface of a microscopic green alga, Chlamydomonas reinhardtii, that moves around by flapping its two tail-like structures, or flagella. The rest is up to nature: The armor-wearing, drug-coated biorobots are applied in the trachea or esophagus, where they can swim toward a precise target.
“In the lung area,” says Zhang, “it then takes about 72 hours for the microalgae to totally dissolve.”
Zhang’s approach isn’t really a single treatment; he’s created a biorobot platform. “The algae surface has a lot of functional groups,” he says, “so we can target specific cells.” Some of those functional groups—think of modules that can be swapped out to change what the biorobot sticks to, or avoids—could help the algae bond with very narrow targets, like tumor cells, while avoiding a patient’s healthy cells.