Speaking last week at the Bay Area Robotics Symposium, held at the University of California, Berkeley, Ashutosh Saxena, who led the development of TellMeDave and RoboBrain, said that robots will increasingly share information in the future. “We are trying to make robots learn and share knowledge,” he said. “Different robots can push and pull knowledge from the [RoboBrain] database.”

The key challenge in transferring learning between the robots at Cornell and Brown was that the two machines are physically completely different, so low-level commands, such as those specifying the position each joint must assume to reach for a mug, cannot simply be copied from one robot to the other. Tellex’s group had to devise a scheme that would let commands transfer between the two platforms.
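The article doesn’t spell out how that scheme works, but the sketch below illustrates the underlying problem and one common workaround, purely as an assumption and not as a description of the group’s actual method: share knowledge as a task-space goal (where the gripper should end up relative to the mug) and let each robot compute its own joint angles from its own geometry. The toy planar arms, link lengths, and solve_ik routine are invented for this illustration.

```python
import numpy as np

def forward_kinematics(joint_angles, link_lengths):
    """Planar forward kinematics: end-effector (x, y) of a serial arm."""
    x = y = angle = 0.0
    for theta, length in zip(joint_angles, link_lengths):
        angle += theta
        x += length * np.cos(angle)
        y += length * np.sin(angle)
    return np.array([x, y])

def solve_ik(target_xy, link_lengths, iters=500, step=0.1):
    """Toy Jacobian-transpose inverse kinematics for the planar arm above."""
    q = np.full(len(link_lengths), 0.1)  # start slightly bent to avoid a singular pose
    for _ in range(iters):
        error = target_xy - forward_kinematics(q, link_lengths)
        # Finite-difference Jacobian of the end-effector position w.r.t. joint angles.
        eps = 1e-5
        J = np.zeros((2, len(q)))
        for i in range(len(q)):
            bumped = q.copy()
            bumped[i] += eps
            J[:, i] = (forward_kinematics(bumped, link_lengths)
                       - forward_kinematics(q, link_lengths)) / eps
        q = q + step * J.T @ error  # gradient step that reduces the position error
    return q

# The shared, platform-independent piece of knowledge: a task-space goal,
# e.g. "put the gripper 0.6 m forward and 0.4 m up, next to the mug."
shared_goal = np.array([0.6, 0.4])

# Two hypothetical robots with different bodies: different link lengths and
# even a different number of joints.
robot_a_links = [0.5, 0.5]        # two-link arm
robot_b_links = [0.3, 0.3, 0.4]   # three-link arm

print("Robot A joint angles:", solve_ik(shared_goal, robot_a_links))
print("Robot B joint angles:", solve_ik(shared_goal, robot_b_links))
```

Run as-is, the script prints a different joint-angle vector for each arm even though both received the identical task-space goal, which is exactly why raw joint-level commands can’t simply be copied between robots with different bodies.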

Ultimately, she says, it would be ideal for a robot to figure out how to translate information for itself, based on how its physical body compares with that of another robot. “This is what we’d all like to do, and this is really a baby step toward that vision,” Tellex says. “There are a lot of remaining technical challenges.”