Monday, 1 July 2024

Babies use 'helpless' infant period to learn powerful foundation models, just like ChatGPT

Babies' brains are not as immature as previously thought; rather, they are using the period of postnatal 'helplessness' to learn powerful foundation models similar to those underpinning generative Artificial Intelligence, according to a new study.

The study, led by a Trinity College Dublin neuroscientist and just published in the journal Trends in Cognitive Sciences, finds for the first time that the classic explanation for infant helplessness is not supported by modern brain data.

Compared with many animals, humans are helpless for a long time after birth: horses and chickens, for example, can walk on the day they are born. This protracted period of helplessness puts human infants at risk and places a heavy burden on parents, yet it has surprisingly survived evolutionary pressure.

"Since the 1960s scientists have thought that the helplessness exhibited by human babies is due to the constraints of birth. The belief was that with big heads human babies have to be born early, resulting in immature brains and a helpless period that extends up to one year of age. We wanted to find out why human babies were helpless for such a long period," explains Professor Rhodri Cusack, Professor of Cognitive Neuroscience, and lead author of the paper.

The research team comprised Prof. Cusack, who measures development of the infant brain and mind using neuroimaging; Prof. Christine Charvet, Auburn University, USA, who compares brain development across species; and Dr. Marc'Aurelio Ranzato, a senior AI researcher at DeepMind.

"Our study compared brain development across animal species. It drew from a long-standing project, Translating Time, that equates corresponding ages across species to establish that human brains are more mature than many other species at birth," says Prof. Charvet.

The researchers used brain imaging and found that many systems in the human infant's brain are already functioning and processing the rich streams of information from the senses. This contradicts the long-held belief that many infant brain systems are too immature to function.

The team then compared learning in humans with the latest machine learning models, where deep neural networks benefit from a 'helpless' period of pre-training.

In the past, AI models were trained directly on the tasks for which they were needed; for example, a self-driving car was trained to recognise what it sees on the road. Now, models are first pre-trained to find patterns within vast quantities of data, without performing any task of importance. The resulting foundation model is then used to learn specific tasks. This has been found to lead to quicker learning of new tasks and better performance.
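To make that contrast concrete, here is a minimal illustrative sketch in PyTorch, not drawn from the study itself: a tiny encoder is first pre-trained on unlabelled data with a simple denoising pretext task, and only then is a small head fine-tuned on a specific labelled task. The network sizes, the denoising objective and the toy data are all assumptions chosen for brevity; they stand in for the web-scale data and large networks behind real foundation models.

    # Illustrative sketch only: pre-train a backbone without task labels,
    # then reuse it to learn a specific task with a small fine-tuned head.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Small encoder standing in for a foundation-model backbone (assumed sizes).
    encoder = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 64))

    # Stage 1: self-supervised pre-training on unlabelled data.
    # The pretext task is reconstructing inputs from noisy copies, so no labels are needed.
    decoder = nn.Linear(64, 32)
    pretrain_opt = torch.optim.Adam(
        list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
    unlabelled = torch.randn(1024, 32)          # stand-in for vast unlabelled data

    for _ in range(200):
        noisy = unlabelled + 0.1 * torch.randn_like(unlabelled)
        loss = nn.functional.mse_loss(decoder(encoder(noisy)), unlabelled)
        pretrain_opt.zero_grad()
        loss.backward()
        pretrain_opt.step()

    # Stage 2: fine-tune a lightweight head on a specific labelled task,
    # keeping the pre-trained backbone frozen.
    head = nn.Linear(64, 2)
    finetune_opt = torch.optim.Adam(head.parameters(), lr=1e-3)
    task_x = torch.randn(128, 32)               # small labelled dataset (toy)
    task_y = (task_x.sum(dim=1) > 0).long()     # toy binary labels

    for _ in range(100):
        with torch.no_grad():                   # backbone features, no backbone updates
            features = encoder(task_x)
        loss = nn.functional.cross_entropy(head(features), task_y)
        finetune_opt.zero_grad()
        loss.backward()
        finetune_opt.step()

    accuracy = (head(encoder(task_x)).argmax(dim=1) == task_y).float().mean()
    print("fine-tuned task accuracy:", accuracy.item())

The point of the recipe is that most of the learning happens in stage 1, before any task of importance is attempted, so each new task in stage 2 needs only a small amount of task-specific training.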

Source: ScienceDaily
