Replies: 1 comment
-
I think this is a very good question, and I can't think of any other simple way to do this. To be honest, I might argue that the most valuable assets in primates are all of the other session types (3 out of 4), so it would be reasonable to focus on purely visual tasks. Mouse sensorimotor coupling is deeply innate and ingrained, whereas the joystick is a much more naive task, so I am not sure the results would port over directly. It would certainly be a lot of work. I am curious what other people think.
-
If we implement the sensory-motor paradigm in monkeys with a joystick coupled to an avatar, will this be a sufficient comparison to the mouse data?
Which specific oddball types should be included? Candidates: (1) visual flow decoupled from joystick/running speed; (2) visual flow continuing while running/joystick movement stops; and (3) the inverse, visual flow stopping while running/joystick movement continues.
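For concreteness, the three oddball types above could be enumerated as conditions in a stimulus controller. This is only a hypothetical sketch; the condition names, the `visual_flow_speed` helper, and the 1:1 closed-loop gain are all assumptions for illustration, not part of any existing rig code:

```python
from enum import Enum, auto

class OddballType(Enum):
    """Candidate sensorimotor oddball conditions (hypothetical labels)."""
    FLOW_DECOUPLED = auto()  # visual flow uncoupled from joystick/running speed
    FLOW_CONTINUES = auto()  # movement stops but visual flow keeps going
    FLOW_HALTS = auto()      # movement continues but visual flow stops

def visual_flow_speed(movement_speed, oddball=None, playback_speed=1.0):
    """Return the visual-flow speed to present on this frame.

    In the closed-loop baseline (oddball=None), flow tracks
    movement 1:1; each oddball breaks that coupling differently.
    """
    if oddball is OddballType.FLOW_DECOUPLED:
        return playback_speed  # fixed playback, ignoring current movement
    if oddball is OddballType.FLOW_CONTINUES:
        return playback_speed  # flow persists even when movement_speed == 0
    if oddball is OddballType.FLOW_HALTS:
        return 0.0             # flow frozen despite ongoing movement
    return movement_speed      # closed loop: flow matches movement
```

Framing the conditions this way makes explicit that oddballs (2) and (3) are mirror violations of the same closed-loop expectation, which may matter when matching them to the mouse data.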