In our lab, we study how people use and integrate sensory information to control real-world behaviors.
Using virtual reality, we can present visual information that is either consistent or inconsistent with motor signals and with other movement-related sensory cues, and examine how this influences learning.
Virtual reality technology gives us precise control over experimental variables and allows us to selectively manipulate stimuli while maintaining a high degree of ecological validity in naturalistic settings. It also lets us simulate situations that would be difficult to create in the real world.
Our interactive, immersive virtual reality environments are displayed with stereoscopic head-mounted display (HMD) systems (Vive Pro Eye, Oculus Rift S, HP Reverb G2). An attachable video see-through display is used to generate augmented reality scenarios.
An infrared marker/camera and inertial sensor system provides real-time position and orientation tracking throughout a 20 x 15 ft lab space, which adjoins a control room housing the tracking and rendering computer. This setup synchronizes participants’ movements in the lab with their movements in the virtual worlds.
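In essence, on every frame the tracked physical head pose is mapped into virtual-world coordinates before the scene is rendered, so that walking through the lab moves the participant through the virtual environment. The sketch below illustrates that mapping under simplified assumptions (a hypothetical `Pose` type with yaw-only rotation, meters for position); it is an illustration of the idea, not our actual tracking or rendering code.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    """Simplified head pose: position in meters, yaw rotation in radians."""
    x: float
    y: float
    z: float
    yaw: float

def physical_to_virtual(tracked: Pose, scene_origin: Pose) -> Pose:
    """Map a tracked lab-space pose into virtual-scene coordinates.

    The virtual scene may be translated and rotated relative to the lab,
    so the tracked position is rotated by the scene origin's yaw and then
    translated; yaw angles simply add.
    """
    cos_y, sin_y = math.cos(scene_origin.yaw), math.sin(scene_origin.yaw)
    vx = scene_origin.x + cos_y * tracked.x - sin_y * tracked.z
    vz = scene_origin.z + sin_y * tracked.x + cos_y * tracked.z
    return Pose(vx, scene_origin.y + tracked.y, vz,
                scene_origin.yaw + tracked.yaw)

# Per-frame loop (schematic): read the tracker, map the pose, render.
def render_frame(read_tracker, render, scene_origin: Pose) -> None:
    head = read_tracker()                       # pose from the tracking system
    camera = physical_to_virtual(head, scene_origin)
    render(camera)                              # draw the scene from this pose
```

When the scene origin is the identity pose, physical and virtual coordinates coincide; offsetting or rotating the origin is what allows, for example, visual information to be made inconsistent with the participant's actual movement.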