Thursday, 8 July 2010

Motor learning changes where you think you are

I’ve covered both sensory and motor learning topics on this blog so far, and here’s one that very much mashes the two together. In earlier posts I have written about how we form a percept of the world around us, and about our sense of ownership of our limbs. In today’s paper the authors investigate the effect of learning a motor task on sensory perception itself.

They performed a couple of experiments, in slightly different ways, which essentially showed the same result – so I’ll just talk about the first one here. Participants made point-to-point reaches while holding a robotic device, in three phases (null, force field and aftereffect) separated by perceptual tests designed to assess where they felt their arm to be. The figure below (Figure 1A in the paper) shows the protocol and the reaching error results:

Motor learning across trials

In the null phase, as usual, participants reached without being exposed to a perturbation. In the force field phase, the robot pushed their arm to the right or to the left (blue or red dots respectively), and you can see from the graph that they made highly curved movements to begin with and then learnt to correct them. In the aftereffect phase, the force was removed, but you can still see the motor aftereffects from the graph. So motor learning definitely took place.
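
For readers who haven’t met this paradigm before, the “force field” in studies like this is usually a velocity-dependent push from the robot handle, deflecting the hand sideways in proportion to how fast it is moving. Here’s a minimal Python sketch of that kind of curl field; the field strength, direction convention and function name are my own illustrative assumptions, not the exact parameters Ostry and colleagues used.

```python
import numpy as np

# Minimal sketch (illustrative parameters, not the paper's) of a
# velocity-dependent "curl" force field as typically applied by a
# robotic manipulandum in force field adaptation experiments.
def curl_field_force(hand_velocity, b=15.0, direction=+1):
    """Force (N) applied by the robot for a given hand velocity (m/s).

    direction = +1 deflects a forward reach to the right,
    direction = -1 deflects it to the left; b scales force with speed.
    """
    vx, vy = hand_velocity
    # The force is perpendicular to the velocity, so a straight-ahead
    # reach gets pushed sideways, which produces the curved early movements.
    return np.array([direction * b * vy, -direction * b * vx])

# A purely forward reach at 0.3 m/s picks up a 4.5 N sideways push:
print(curl_field_force((0.0, 0.3)))
```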

But what about the perceptual tests? It turns out that participants’ estimates of where their arm was located changed after learning the motor task. In the figure below (Figure 2B and 2C in the paper) you can see in the left graph that after the force field (FF) trials, hand perception shifted in the opposite direction to the force direction. [EDIT: actually it's in the same direction; see the comments section!] This effect persisted even after the aftereffects (AE) block.


Perceptual shifts as learning occurs

What I think is even more interesting is the graph on the right. It shows not only the right and left (blue and red) hand perceptions, but also the hand perception after 24 hours (yellow) – and, crucially, the hand perception when participants didn’t make the movements themselves but let the robot move them passively (grey). As you can see, in that passive condition there’s no perceptual shift. The shift only appears when participants make active movements through the force field, which suggests that the change in sensory perception is closely linked to learning the motor task.

In some ways this isn’t too surprising, to me at least. In some of my work with Adrian Haith (happily cited by the authors!), we developed and tested a model of motor learning that requires changes to both sensory and motor systems, and showed that force field learning causes perceptual shifts in locating both visual and proprioceptive targets; you can read it free online here. The work in this paper seems to shore up our thesis that the motor system takes into account both motor and sensory errors during learning.

Some of the work I’m dabbling with at the moment involves neuronal network models of motor learning and optimization. This kind of paper, showing that sensory perception changes during motor learning, throws a bit of a cog into the wheels of some of that. As they stand, the models tend to assume that sensory input is static and merely change motor output as learning progresses. Perhaps we need to think a bit more carefully about that.
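
To make that concrete, here’s a toy trial-by-trial sketch (my own, not any published model, with made-up learning rates and variable names) contrasting the usual assumption, where only the motor command adapts, with a version in which the perceived hand position is allowed to drift a little as well, in the spirit of the Ostry et al. result.

```python
import numpy as np

# Toy trial-by-trial adaptation sketch (illustrative, not a published model).
# A lateral perturbation is compensated by updating a motor command and,
# optionally, a perceptual bias on the sensed hand position.
def simulate(n_trials=100, perturbation=1.0,
             eta_motor=0.2, eta_percept=0.02, update_perception=True):
    motor_command = 0.0    # lateral compensation the controller has learnt
    perceptual_bias = 0.0  # shift in where the hand is felt to be
    errors = []
    for _ in range(n_trials):
        # Error the participant senses at the end of the reach: the residual
        # perturbation, seen through a (possibly biased) estimate of hand position.
        sensed_error = (perturbation - motor_command) - perceptual_bias
        errors.append(sensed_error)
        # Standard models stop here: perception is static, only motor output adapts.
        motor_command += eta_motor * sensed_error
        if update_perception:
            # The extra ingredient suggested by the paper: perception itself shifts.
            perceptual_bias += eta_percept * sensed_error
    return np.array(errors), motor_command, perceptual_bias

errors, cmd, bias = simulate()
print(f"final motor compensation: {cmd:.2f}, perceptual shift: {bias:.2f}")
```

With update_perception=False you recover the usual static-perception behaviour; with it switched on, a small part of the compensation ends up “in” perception rather than in the motor command, which is roughly the distinction at stake.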

---

Ostry DJ, Darainy M, Mattar AA, Wong J, & Gribble PL (2010). Somatosensory plasticity and motor learning. The Journal of Neuroscience, 30(15), 5384-5393. PMID: 20392960

Images copyright © 2010 Ostry, Darainy, Mattar, Wong & Gribble

5 comments:

  1. Nice paper. It maps on nicely to work by Bingham, Mon-Williams and others on (re)calibration of prehension.

    It's certainly not a surprise, though; the only reason you tend not to see this sort of recalibration outside the lab is that in general the force and information fields are pretty stable. But jump on a moving walkway or into some water and you can get all these kinds of effects.

  2. Yes, I particularly like stepping on to a stationary walkway (that is supposed to move), or a broken escalator.

    Actually one of my best experiences was when I went to Ripley's Believe-It-Or-Not in Florida. They had a walkway inside what looked like a rotating room. When you walked across the walkway you fell over, because the sense of rotation was so strong. Eventually I got used to it. Then I tried it backwards. Good times!

  3. I think that would make me throw up :)

  4. Hi Carl,
    Delighted to have found my way onto your blog...

    I should point out that Ostry et al.'s result is actually a shift in perceived hand position in the SAME direction as the force field. (A leftward shift in the perceptual boundary corresponds to a rightward shift in perceived hand position). So their result doesn't in fact agree with our prediction and results.
    Their effect size is tiny - just 2 mm! So probably not such a huge cog in your computational modelling wheels.

    As for where that leaves our NIPS paper... it's something I've been working on... Watch this space!

  5. Hi mate! Hope you're well.

    I must have misread the results. You're right of course that a leftward shift in the boundary is a rightward shift in perceived hand position; I'll edit the post. Now that I look at it more carefully, I note that they don't talk explicitly about which way the shift is. Discussing it purely in terms of the perceptual boundary is really confusing!

    I think there still is a case for my comments on modelling though - assuming that our results are correct, there will be some change in perception, which few current models really deal with.

    Oh well, my fault for not reading carefully enough. This, of course, is why the blog has a comments section. :)
