AI Simulates Vision Evolution in Artificial Animals
Researchers create digital creatures that develop functioning eyes without instructions
Published on Feb. 3, 2026
A team of researchers has created artificial animals that, over successive generations, develop functioning vision, progressing from simple light sensitivity to the ability to discern objects, without being given any specific instructions. The results demonstrate how AI can be used to understand the inner workings of evolutionary processes.
Why it matters
This research provides a new way to study the fundamental mechanisms of evolution, allowing scientists to explore potential evolutionary paths and solutions before they emerge in the natural world. The insights gained could also inform the development of more robust and adaptable technological systems.
The details
The researchers created virtual animals and released them into a synthetic world, tasking them with navigating, avoiding obstacles and finding food. Over successive generations, the artificial creatures developed functioning vision, with some evolving dispersed photoreceptors, camera-type eyes or compound eyes, all without any direct programming. The researchers were surprised to see the digital eyes develop in the same way as those of real organisms, even in the simplified environment. (A rough sketch of this kind of evolutionary loop follows the points below.)
- The research was conducted by a team from Lund University in Sweden and the Massachusetts Institute of Technology (MIT).
- The findings were published on February 4, 2026.
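For readers curious what a simulation of this kind looks like in code, the sketch below shows the general technique: a population of genomes is scored on a task, the best performers are kept, and mutated copies fill the next generation. Everything in it (the toy light-detection task, the parameter values, the genome of photoreceptor sensitivities) is an invented illustration, not the researchers' actual model.

```python
# Minimal sketch of an evolutionary loop with task-based fitness.
# All names, parameters and the toy "vision" task are illustrative
# assumptions, not the study's actual simulation.
import random

POPULATION = 50
GENERATIONS = 200
TARGET = 0.75  # hypothetical light level the creature should detect

def random_genome():
    # A genome here is just a handful of photoreceptor sensitivities.
    return [random.random() for _ in range(4)]

def fitness(genome):
    # Toy task: how closely does the averaged photoreceptor response
    # match the target light level? Higher is better.
    response = sum(genome) / len(genome)
    return 1.0 - abs(response - TARGET)

def mutate(genome, rate=0.1):
    # Each gene drifts slightly with probability `rate`.
    return [g + random.gauss(0, 0.05) if random.random() < rate else g
            for g in genome]

def evolve():
    population = [random_genome() for _ in range(POPULATION)]
    for _ in range(GENERATIONS):
        # Rank by task performance and keep the top half as parents.
        population.sort(key=fitness, reverse=True)
        parents = population[: POPULATION // 2]
        # Refill the population with mutated offspring of random parents.
        population = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print(f"best fitness after {GENERATIONS} generations: {fitness(best):.3f}")
```

Run over many generations, even this toy loop shows the core dynamic the study relies on: structure that performs better on the task accumulates on its own, without anyone specifying what the end result should look like.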
The players
Dan-Eric Nilsson
Professor of sensory research and evolutionary biology at Lund University, who led the research team.
Lund University
A public research university in Lund, Sweden, where the research was conducted.
Massachusetts Institute of Technology (MIT)
A private research university in Cambridge, Massachusetts, which collaborated on the project.
What they’re saying
“We have succeeded in creating artificial evolution that produces the same results as in real life. It's the first time AI has been used to follow how a complete vision system can arise without instructing the computer how it should come to be.”
— Dan-Eric Nilsson, Professor of sensory research and evolutionary biology (Mirage News)
“The most surprising aspect was that the computer's eyes developed in the same way as those of real organisms, even though the environment we created was very simplified. In nature there are various solutions for achieving vision: dispersed photoreceptors, camera-type eyes and compound eyes. All three types were seen in the computer simulations. It was as if evolution found it familiar and followed its usual paths, even in our digital world.”
— Dan-Eric Nilsson, Professor of sensory research and evolutionary biology (Mirage News)
What’s next
The researchers plan to continue exploring the potential of using AI to simulate and understand evolutionary processes, with the goal of gaining insights that could inform the development of more robust and adaptable technological systems.
The takeaway
This research demonstrates the power of AI to unlock the inner workings of evolution, providing a new tool for scientists to study how complex biological systems emerge and evolve. The findings could have far-reaching implications, from advancing our understanding of the natural world to inspiring the design of more efficient and adaptable technologies.