Department

Mathematical Sciences

Author Type

Undergraduate Student

Submission Type

Event

Start Date

27-7-2017 1:15 PM

End Date

27-7-2017 2:15 PM

Description

This project studies human perception of size within a controlled virtual reality environment. When we open our eyes and glance at our surroundings, we see stable scenes that appear absolute. However, many argue that this stability is an illusion, as nothing in our visual pathway is stable. Thus, the perceived size of an object may not accurately reflect the extent of its projection onto our retina. We experience such a misperception when we view celestial objects near the horizon, as in the well-known moon illusion. But are such misjudgments ubiquitous in daily life? With this in mind, we created a virtual reality scene with a manipulated environment designed to give the viewer a false sense of depth. We want to test whether this setting can cause the viewer to misjudge both the size and the distance of familiar objects. More specifically, we want to test whether objects in one scene can look both closer and larger than the same objects in another scene. If we can create such a false perception, we can conclude that misjudgment of angular size is ubiquitous. To implement the setup, the scene is developed in the Unity game engine, an HTC Vive serves as the virtual reality headset, and the host is a Windows 10 computer with an Intel Xeon CPU, 64 GB of RAM, and a GeForce GPU. Participants will be adults aged 18 or older recruited around Susquehanna University, and they will be asked to judge both the size and the distance of objects in the virtual scene. The results will be presented at the conference.
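The geometric ambiguity the experiment exploits is that angular size confounds physical size and distance: an object twice as large at twice the distance projects the same image onto the retina. A minimal sketch of this relationship (an illustration only, not part of the project's software):

```python
import math

def visual_angle_deg(size_m, distance_m):
    """Visual angle in degrees subtended by an object of a given
    physical size (meters) at a given viewing distance (meters)."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

# A 1 m object at 5 m and a 2 m object at 10 m subtend identical
# visual angles, so the retinal projection alone cannot distinguish
# them; the visual system must rely on depth cues, which the
# manipulated scene is designed to distort.
near = visual_angle_deg(1.0, 5.0)
far = visual_angle_deg(2.0, 10.0)
print(round(near, 3), round(far, 3))  # both ≈ 11.421 degrees
```

Because the two objects are indistinguishable by angular size alone, altering the scene's depth cues should shift which combination of size and distance the viewer perceives.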


Size Constancy in Virtual Reality
