Robots that learn

Industrial robots programmed for repetitive movements, to build cars for example, have been around for decades, but they cannot learn. Newer warehouse robots, like those used by retail giant Amazon to pick up, store and deliver stock, can operate in environments with people and other obstacles.

Unlike their industrial predecessors, these robots are mobile and can deal with some uncertainty about what they’ll encounter while performing tasks, although all their actions are still programmed, not learned.

Scientists are now attempting to build robots that can learn by example. These robots typically have sensors for vision and touch, as well as manipulators such as simple hands or grippers. The idea is to teach the robot by guiding its manipulators through the tasks you need performed, like picking up an item from a conveyor belt and placing it in a box, or collecting litter.

Basic manipulation tasks, however, are challenging for a robot to copy. Something that seems obvious to us, like the difference between an empty chip packet, a large leaf or a small purse, can be difficult for a computer to comprehend, due to the infinite possibilities of shapes, colours and sizes.

Having a body can help. We humans can pick up an object, turn it over, look inside and even smell it. We use our senses and bodies in flexible ways that allow us to accumulate evidence and make clear decisions when situations are ambiguous. 

How can you teach a robot this? Our intelligence (and common sense) is the culmination of years of experiences, each of which adds a little more to our knowledge and ability to interpret the world. This is why scientists believe that to build truly intelligent machines, we’ll need to equip them with bodies like ours and let them experience the world and mentally ‘grow up’ as children do.

