While scientists can't go back in time to study the environmental pressures that shaped the evolution of the diverse vision systems found in nature, a new computational framework developed by MIT researchers lets them explore that evolution in artificial intelligence agents.

The framework, in which embodied AI agents evolve eyes and learn to see over many generations, acts like a 'scientific sandbox' that allows researchers to recreate different evolutionary trees. Users do this by changing the structure of the simulated world and the tasks the agents must complete, such as finding food or telling objects apart.

The researchers' experiments with this framework show how tasks drove eye evolution in the agents. For instance, they found that navigation tasks often led to the evolution of compound eyes with many individual units, like the eyes of insects and crustaceans. This framework could enable scientists to probe 'what-if' questions about vision systems that are...
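The article doesn't include implementation details, but the setup it describes suggests, in broad strokes, an evolutionary loop: candidate eye designs are varied from generation to generation, each agent learns to use its eyes on a task, and the best-performing designs seed the next generation. Below is a minimal sketch of that idea only; every name (EyeGenome, evaluate_on_task, evolve) and the toy fitness function are hypothetical placeholders, not the researchers' actual code.

```python
# Hypothetical sketch of an "evolve eyes, then learn to see" loop.
# This is NOT the MIT framework; names and fitness are illustrative only.
import random
from dataclasses import dataclass


@dataclass
class EyeGenome:
    num_units: int        # 1 ~ camera-like eye, many ~ compound eye
    field_of_view: float  # degrees of coverage

    def mutate(self) -> "EyeGenome":
        # Small random changes model variation between generations.
        return EyeGenome(
            num_units=max(1, self.num_units + random.choice([-1, 0, 1])),
            field_of_view=min(360.0, max(10.0, self.field_of_view + random.uniform(-10.0, 10.0))),
        )


def evaluate_on_task(genome: EyeGenome, task: str) -> float:
    # Placeholder fitness: in a real framework, an embodied agent with this
    # eye design would be trained ("learn to see") and scored on the task.
    if task == "navigation":
        # Pretend navigation rewards wide coverage from many units.
        return genome.num_units * genome.field_of_view / 360.0
    # e.g. "object_discrimination": pretend it rewards fewer, sharper units.
    return genome.field_of_view / genome.num_units


def evolve(task: str, generations: int = 50, pop_size: int = 20) -> EyeGenome:
    population = [EyeGenome(num_units=1, field_of_view=60.0) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the top quarter of designs for this task.
        scored = sorted(population, key=lambda g: evaluate_on_task(g, task), reverse=True)
        parents = scored[: pop_size // 4]
        # Variation: the next generation is built from mutated parents.
        population = [random.choice(parents).mutate() for _ in range(pop_size)]
    return max(population, key=lambda g: evaluate_on_task(g, task))


if __name__ == "__main__":
    best = evolve("navigation")
    print(f"Best eye design for navigation: {best}")
```

Under these toy assumptions, running the loop on the navigation task drifts toward many-unit, wide-coverage designs, loosely mirroring the article's observation that navigation pressure favored compound-style eyes.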