For most of us, creating mental images based on speech or memory is very easy.
If I say “cube”, you are probably already picturing one in your mind (although people with aphantasia have little or no mental imagery).
You may not realise it, but you’re probably also very good at translating physical sensations into mental images. Imagine being in total darkness and holding a cube-shaped object. There’s a good chance you could turn this tactile information into a mental image.
For centuries, scientists and philosophers have debated whether this is something we learn to do or whether our nervous systems are hardwired for it. My team's recent research on newborn chicks offers new insights into this question.
The debate reaches back to at least 1688, when the Irish philosopher William Molyneux wrote a letter to fellow philosopher John Locke. Molyneux pondered whether a person born blind, who had learned to differentiate between a cube and a sphere through touch, would recognise these objects on first gaining sight. The question was also personal for Molyneux: his wife had lost her sight shortly after their marriage.
Empiricists like Locke, focused on the role of experience in forming ideas and beliefs, argued that sensory experience is necessary to learn or understand the correspondence between tactile and visual information.
Yet, neuroscientists and philosophers alike have started to challenge this view. Some argue that sensory information can be processed in such a general and abstract way that, at some level, the details become irrelevant.
Others point to phenomena such as synaesthesia, where stimulation of one sense (such as hearing) produces a corresponding experience in a different sense (such as vision). This happens, for example, in "coloured hearing", where hearing sounds also produces the experience of colours.
Scientists once thought only humans could associate sensory features across different modalities, but research is showing other animals do this too.
For instance, dogs match small images to high-pitched sounds and big images to low-pitched sounds. My team's 2023 study found that tortoises do this too.
But scientists still aren't sure whether animals learn with experience to match information between senses or whether this ability is a built-in feature of the nervous system that doesn't require experience.
Scientists have been fascinated by the possibility that tactile sensations might spontaneously evoke visual representations, as a property of the nervous system. However, investigating these questions isn’t easy, particularly when it comes to human subjects.
In some patients born blind, vision can be restored by removing a cataract, but it takes a few days for sight to recover fully. In a 2011 study, patients tested within 48 hours of surgery could not visually distinguish shapes that had originally been presented to them through touch, yet they mastered the task in as little as five days.
Some researchers have studied infants to try to answer the question. In a 1979 study by University of Washington researchers, newborns were offered pacifiers with different shapes; after sucking on one, the babies tended to look longer at the shape they had felt in their mouths.
In another study, from 2004, newborns were given differently shaped objects, such as cylinders and prisms, to touch. When the babies were then shown new shapes mixed in with the ones they had touched, they tended to look longer at the new ones, suggesting they were surprised by the unfamiliar shapes.
Yet when different teams tried to replicate these studies, the results were mixed. It's also impossible to say for certain what sensory experience babies have already had, as it would be highly unethical to raise them in sensory deprivation before a study.
Plus, in humans, vision develops over time.
What we found
My team opted for an alternative approach using domestic chicks. These birds have well-developed motor and sensory systems at hatching. In our experiment, we hatched chicks in complete darkness and kept them in darkness during their exposure to tactile stimuli.
Each chick was housed in an individual compartment containing either bumpy or smooth cubes. Over the course of 24 hours, the chicks familiarised themselves with their bumpy or smooth environments. This setup mimics natural conditions, as chicks typically spend their early days nestled in darkness under the warmth of a hen.
Because chicks tend to develop attachment responses (imprinting) to the first stimuli they encounter, we expected they would form a kind of attachment to the tactile objects.
After the tactile exposure, we gave the chicks their first visual experience: we placed them in an illuminated area with two objects, a smooth cube and a bumpy one.
Chicks exposed to the smooth stimuli approached the smooth cube much more than chicks exposed to the bumpy stimuli did.
This result would have surprised Molyneux and Locke (and might surprise modern empiricists), because it shows the brain is wired to make sense of the complexities of the world before we have direct experience of it.
It also aligns with research showing that animals are born with expectations about stimuli. For instance, my team's 2023 paper showed that chicks have expectations about upward and downward movement, suggesting an understanding of gravity in the absence of visual experience.
We are also interested in researching the neural mechanisms behind associations between senses during early development, and in other predispositions that help us cope with our environment in infancy.
This could help us understand the connection between sensory representations, imagination and the perception of reality.
Elisabetta Versace receives funding from The Royal Society and The Leverhulme Trust.
This article was originally published on The Conversation. Read the original article.