I will try to include the eye movement in the mouse model.
Regarding the iCub model, this one is able to move both the head and the eyes. I'm performing the VOR experiment with this model: I move the robot head along a sinusoidal trajectory (using position commands), and the cerebellum must generate the eye VELOCITY commands that compensate for the head movement. The problem with this iCub model is that the head and eyes are too tightly coupled. When the cerebellum generates an eye velocity trajectory, the eye movement induces a considerable deviation in the head's sinusoidal trajectory. Thus, the cerebellar output (eye movement) modifies the cerebellar input (head movement), making the learning process almost impossible.
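For reference, the head command and the ideal compensatory eye velocity look roughly like this (the amplitude, frequency, and sampling step below are just illustrative placeholders, not the actual values I use; sending the commands to the robot is a separate step):

```python
import math

def head_trajectory(amplitude_deg=10.0, freq_hz=0.25, duration_s=8.0, dt=0.01):
    """Sinusoidal head position profile in degrees (illustrative values)."""
    n = int(duration_s / dt)
    return [amplitude_deg * math.sin(2.0 * math.pi * freq_hz * i * dt)
            for i in range(n)]

def ideal_eye_velocity(amplitude_deg=10.0, freq_hz=0.25, duration_s=8.0, dt=0.01):
    """Perfect VOR compensation: the negative of the head velocity,
    i.e. the derivative of the sinusoid above with its sign flipped."""
    n = int(duration_s / dt)
    return [-amplitude_deg * 2.0 * math.pi * freq_hz
            * math.cos(2.0 * math.pi * freq_hz * i * dt)
            for i in range(n)]
```

The cerebellum is supposed to learn the second signal from the first; the coupling problem arises because executing the learned eye velocities perturbs the head trajectory it is learning from.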
I have tried to reduce this coupling by decreasing the eye inertia coefficients (from 0.01 to 0.001) in the model.sdf file. With this change, when I move the eyes using POSITION commands, the head and eyes are almost decoupled. However, if I move the eyes using VELOCITY commands, the eyes become unstable, oscillating between the maximum positive and maximum negative velocities.
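The change I made in model.sdf looks like the following (the link name and mass are illustrative; the 0.01 → 0.001 change on the diagonal inertia terms is the actual edit):

```xml
<link name="eye_l">  <!-- link name illustrative -->
  <inertial>
    <mass>0.01</mass>  <!-- unchanged, value illustrative -->
    <inertia>
      <ixx>0.001</ixx>  <!-- was 0.01 -->
      <iyy>0.001</iyy>  <!-- was 0.01 -->
      <izz>0.001</izz>  <!-- was 0.01 -->
      <ixy>0</ixy>
      <ixz>0</ixz>
      <iyz>0</iyz>
    </inertia>
  </inertial>
</link>
```

My guess is that such a small inertia makes the velocity controller unstable at the simulation timestep, which would explain the oscillation between velocity limits.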
I would like to know how I can properly decouple the iCub head and eye movements (or at least reduce this coupling effect).