Attention is fundamentally different from learning.

There are some similarities between attention and learning: both increase a neuron's ability to respond selectively to stimuli. However, despite these similarities, they are not caused by the same mechanism. On one level, this isn't surprising, since learning requires a lasting change in synaptic weights, while attention is more transient. Nonetheless, one might expect the correlations between neurons to change in the same way whether the process is learning or attention.

This is exactly the question that Dr. Jasper Poort, Adil Khan and their team set out to answer: are the changes in neuronal firing rates produced by learning and by attention actually the same? It turns out that while both attention and learning broadly increase pyramidal neuron selectivity, the activity of interneurons shows different correlation patterns in the two cases.

To test this, they created a task where rats learn to identify different visual patterns to get a food reward. The animals then either have to attend to the visual task or ignore it, depending on the presence of a particular smell. The researchers then measured the selectivity of V1 neurons to the visual stimuli. They found that both pyramidal cells and inhibitory PV interneurons generally increase their selectivity with learning and with attention. However, the correlations involving VIP and SOM interneurons are completely different in the two cases. In fact, the best way to reproduce the results is to add multiplicative attention modulation to SOM cells and pyramidal cells. Recall that multiplicative modulation is where a neuron's inputs (excitatory and inhibitory) are scaled up by a common gain factor, rather than shifted by an additive baseline.
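To make the multiplicative-versus-additive distinction concrete, here is a minimal numerical sketch (not the authors' model): assuming a neuron with two mean stimulus responses, Gaussian trial-to-trial noise of fixed standard deviation, and a d'-style selectivity measure, a multiplicative gain increases selectivity while an equal additive baseline shift leaves it unchanged. All numbers below are illustrative.

```python
import numpy as np

# Hypothetical mean responses (arbitrary units) to preferred vs non-preferred stimuli.
r_pref, r_nonpref = 10.0, 6.0
sigma = 2.0  # trial-to-trial noise, assumed unchanged by modulation

def selectivity(a, b, sigma):
    """d'-style selectivity: mean response difference scaled by response noise."""
    return (a - b) / sigma

baseline = selectivity(r_pref, r_nonpref, sigma)

# Multiplicative modulation: both responses scaled by a common gain factor.
gain = 1.5
mult = selectivity(gain * r_pref, gain * r_nonpref, sigma)

# Additive modulation: both responses shifted up by the same offset.
offset = 3.0
add = selectivity(r_pref + offset, r_nonpref + offset, sigma)

print(f"selectivity, baseline:            {baseline:.2f}")
print(f"selectivity, multiplicative gain: {mult:.2f}")  # grows with gain
print(f"selectivity, additive shift:      {add:.2f}")   # unchanged
```

The point of the sketch is only that scaling responses by a gain amplifies the difference between preferred and non-preferred stimuli, whereas a baseline shift moves both responses up without making them easier to tell apart.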
 
What they find, while not surprising, settles the debate. It also points to a circuit location for the mechanism of attention. This means that most attention mechanisms require these interneurons to work, and that many top-down control signals feed into SOM interneurons as a way to control the selectivity of pyramidal neurons.


Author: Alex
