Research at Brown University’s Virtual Environment Navigation Lab (VENLab), one of the largest virtual reality labs in the country, indicates that optic flow, the perceived motion of visual objects as we move, is critical for judging distance and avoiding obstacles. The research focuses on perception in action: how what we see affects how we move.
The same researchers had previously shown that optic flow is essential for steering toward a target while walking. This time they asked what happens when you are standing still: without walking, and without the constant stream of visual information that optic flow provides, how do you begin moving toward a target?
The researchers used virtual reality goggles to test how optic flow helps subjects steer toward a specific target. When subjects had optic flow, they took a straight path to the doorway in an average of just three tries. When optic flow was eliminated and subjects had only a lone target to aim for, it took an average of 20 tries before they walked straight to it.
The conclusion: “With a continuous flow of visual information, your brain allows you to rapidly and accurately adapt your direction of walking. So we’re constantly recalibrating our movements and our actions based on information such as optic flow.”
More profoundly, the results indicate that the world shapes our brain, rather than the brain simply dictating how we perceive and navigate the world. These findings have implications for robotics and machine guidance.
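The recalibration the researchers describe resembles closed-loop control in robotics. As a minimal sketch (not the lab’s model; the function, gains, and scenario are illustrative assumptions), compare an agent that re-aims at its target every step using continuous feedback, loosely analogous to optic flow, with one that commits to a slightly miscalibrated initial heading:

```python
import math

def walk(start, target, steps=50, speed=0.1, feedback=True, heading_error=0.3):
    """Walk toward a target. With feedback, the agent re-aims each step
    (continuous visual information); without it, the agent keeps its
    initial, slightly miscalibrated heading. Returns final distance to target."""
    x, y = start
    tx, ty = target
    # Initial heading estimate carries a small calibration error (radians).
    heading = math.atan2(ty - y, tx - x) + heading_error
    for _ in range(steps):
        if feedback:
            # Continuous feedback: re-aim toward the target before each step.
            heading = math.atan2(ty - y, tx - x)
        x += speed * math.cos(heading)
        y += speed * math.sin(heading)
    return math.hypot(tx - x, ty - y)

closed = walk((0, 0), (0, 5), feedback=True)   # ends essentially at the target
open_loop = walk((0, 0), (0, 5), feedback=False)  # drifts well off course
```

The closed-loop walker corrects its path continuously and arrives on target, while the open-loop walker's small initial error compounds over the whole trajectory, echoing the difference between subjects with and without optic flow.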