
Getting a Grip on (Virtual) Reality: My Flights from (Conventional) Reality

A woman wearing a helmet with a screen in front of her eyes, and gloves with sensor cables.
Figure 1. Inside a Virtual Reality world, wearing a classic head mounted display (HMD) (Wikimedia Commons and NASA).

It all started one day in Manhattan. I started to experience all kinds of strange things.

  • I found myself in a dark cemetery surrounded by ghosts. Slowly, the tall shadow of Abraham Lincoln approached. A ghost behind me called out to alert us of his presence and I turned to see her. Lincoln entered a crypt opposite me to commune with his recently deceased son Willie.
  • I became a small Kapok tree growing in a tropical forest. I saw myself spring from seed to mature rapidly into a commanding presence. I moved my arms, and great tree limbs swayed as though they were buffeted by powerful winds. Finally, I smelled the forest fire that would fell me as a wrecking crew clear-cut the forest.
  • I dove deep beneath the ocean, almost weightless, exploring mighty rock formations and steering with just the slightest shift of my body. I chased small whales, emulating the grace of their movements.
  • I soared upward weightlessly, as celebrity actors gradually appeared and glided past in the cloudy sky. I first recognized Benicio Del Toro, then Charlize Theron, and then others as each disappeared only to be replaced by someone more notable.
  • Finally, I experienced a rocket launch at Cape Kennedy. I heard the deep-throated rumble of a Saturn 5-class rocket and felt its impact on my back and chest. I was enveloped in a tidal wave of flames that streamed from its engines through huge concrete channels that carried them away to safely dissipate.

No, I wasn’t doing mushrooms. I was doing Virtual Reality (VR). It is a journey that started this past January and hasn’t stopped. I wondered how Virtual Reality worked and whether it would grow into a daily occurrence.

Virtual Reality Fools Your Brain

First of all, “normal” experienced reality is really fragile and your brain fools you constantly into thinking it is stable. Virtual reality only fools you a little bit more.

Your brain and mind strive toward constancy and normalcy in how they interpret sense data. Your eyes, brain, and mind collude to make it seem that you are experiencing an unchanging world. However, your “normal” reality is largely illusory. For instance, the vision you experience shows you a mostly stable world, but that is not what your eyes really see. Take a short run with me and you will understand a bit more.

Imagine that you are jogging on hard pavement, returning from a run you just made in the woods. Your eyes and head move violently with each footfall. However, your “eyes” perceive a seamless, solid view of the world, not the shakes or shudders that any movie camera would record.

Your jogging also takes you past many sources of light, all with their different color temperatures.

  • First, blue-ish light prevails as you move between a row of dense trees (north light – 10,000 degrees K)
  • Then white-ish light is dominant as you move into direct sun (5,000 degrees K).
  • Finally, yellow-ish, buttery light hits you in your living room (2,700 degrees K), unless you have green-ish compact fluorescent (4,100 degrees K).

Despite these changes in the color of light, your T-shirt continues to look white to you the whole time. It may look white to your eyes/brain/mind, but it doesn’t to a film camera, or to a digital camera without Automatic White Balance. Like a good stage manager, your brain is sending commands from behind the scenes to “turn up the lights up there” or “change the colors down there” in order to keep your perceived world stable. Your eyes and brain work overtime to preserve the illusion of “constancy” and “normality” here.

Photograph showing two colors of white light on a styrofoam head
Figure 2. The same white styrofoam head under different colored light. Slightly bluish (open shade – 7,000 degrees K) on the left, and greenish tint (indoor compact fluorescent – 4,100 degrees K) on the right.
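For the curious, what Automatic White Balance does can be sketched in a few lines of Python. The RGB values assigned to each light source below are rough, illustrative guesses, not measured data:

```python
# A minimal sketch of Automatic White Balance. The illuminant RGB
# values are rough, illustrative guesses on a 0-1 scale.
ILLUMINANT_RGB = {
    "north_light_10000K": (0.75, 0.85, 1.00),   # blue-ish
    "direct_sun_5000K":   (1.00, 0.98, 0.95),   # near white
    "tungsten_2700K":     (1.00, 0.80, 0.55),   # yellow-ish
}

def recorded_color(surface_rgb, illuminant):
    """What a camera with no AWB records: surface color times light color."""
    return tuple(s * i for s, i in zip(surface_rgb, illuminant))

def white_balanced(recorded_rgb, illuminant):
    """AWB (von Kries style): divide out the estimated light color."""
    return tuple(r / i for r, i in zip(recorded_rgb, illuminant))

white_shirt = (1.0, 1.0, 1.0)
for name, light in ILLUMINANT_RGB.items():
    raw = recorded_color(white_shirt, light)     # tinted by the light
    fixed = white_balanced(raw, light)           # back to (1.0, 1.0, 1.0)
    print(name, raw, fixed)
```

Without the correction, the “white” shirt is recorded with the tint of whatever light falls on it; dividing out the light color is the camera’s version of your brain’s backstage stage management.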

Sound also has deceptive properties. Home theater systems exploit them artfully. Imagine that you watch an action movie on DVD after that run in the woods. You hear the staccato report of machine gun fire behind you as the deep rumble of a warplane envelops the room. A bomb explodes and the sound seems to come from everywhere. However, you really know that all those low frequency sounds come from just one source, that sub-woofer box a few feet from your TV. In contrast, high frequency sounds are easy to locate, so the person who created the sound mix made the gunfire come from the left rear speaker.
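The bass management that routes those low frequencies to a single sub-woofer can be sketched as a crossover: a low-pass filter peels off the rumble for the sub, and the remainder goes to the satellite speakers. This is a toy one-pole filter, not any real receiver’s DSP:

```python
def one_pole_lowpass(samples, alpha=0.05):
    """Crude one-pole low-pass: keeps the rumble, smooths out the crackle."""
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

def bass_manage(samples, alpha=0.05):
    """Split one channel into a sub feed (lows) and a satellite feed
    (highs). The two feeds sum back to the original signal."""
    lows = one_pole_lowpass(samples, alpha)
    highs = [x - lo for x, lo in zip(samples, lows)]
    return lows, highs
```

Because high frequencies are easy to localize and lows are not, the satellite feeds carry the placement cues (that gunfire in the left rear) while the hard-to-locate lows can all come from one box.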

Sometimes the senses must work together to preserve stable perception, and that union can be easily disrupted. This collaboration is especially important for motion perception, which is one reason it can be so easily fooled. Normally your eyes, your vestibular system (inner ear canals), and your proprioception (feedback from muscles, tendons, and joints) listen to each other and agree about your motion and position in space. However, when something disrupts or misleads this triumvirate, you can be fooled quite badly.

One of the tricks that VR presentations use is inducing motion effects. VR motion chairs often have very high seat platforms so your legs dangle and cannot send proprioceptive signals to compare with other senses. Visual images can easily induce apparent motion, and without proprioceptive signals, your vestibular system doesn’t know what to think. You can be tricked very easily into thinking you are moving forward, backward, up, or down.

VR achieves its effects by artfully reproducing cues from nature and exploiting aspects of our sensory systems.

VR Effects

VR platforms play with many sensory and symbolic factors to produce their calculated effects. Table 1 revisits the uncanny experiences I had in Manhattan and identifies the sensory systems that helped produce them.

Head Mounted Displays (HMD) were part of all the platforms and are the cornerstone of current VR configurations. They provide auditory and visual input and can accommodate speech communication between fellow participants.

  • Two of the VR platforms include special pivoting platforms that strongly affect the vestibular system. The VR Motion Chair allowed a seated viewer to rotate sideways around its vertical axis (Yaw), and the Multi-Axis Prone Platform allowed a viewer to lie face downward and spindle like a drill bit around a longitudinal axis (Roll) and incline up and down on a horizontal axis (Pitch).
Table 1. A sample of VR Environments and the sensory systems they engage.

| Sensory and Symbolic Systems | In the Graveyard (HMD) | Kapok Tree in Forest (HMD) | Ocean Diving (Multi-Axis Prone Platform) | Floating in the Sky (VR Motion Chair) | Rocket Launch (VR Backpack) | Racing Simulator (VR Driving Simulator) |
|---|---|---|---|---|---|---|
| Auditory | X | X | X | X | X | X |
| Visual | X | X | X | X | X | X |
| Vestibular | | | X | X | | |
| Tactile | | | X | X | X | X |
| Olfactory | | X | | | | |
| Interpersonal Communication | | | | | | X |
| Shared Virtual Space | | | | | | X |
Table 1 also includes a VR experience that offered me multi-user interactions—an arcade game where competitors could climb in separate VR driving simulators and compete in a race down California’s Pacific Coast Highway. Unlike the other experiences, it created a shared virtual space and allowed communication among players. In some ways this platform appeared kind of retro—just a new wrinkle on a conventional amusement. The others provided personal experiences, and some were emotionally moving. However, none of them involved a communal experience shared in real-time.

This table is just one way to classify VR presentations and is only a sampling of what I experienced. However, there are other important ways of looking at these presentations. The table is organized by the type of hardware used to produce experiences and by the sensory or symbolic systems they engaged. Other useful approaches would be how much immersion or presence the settings induce, how much interactivity they allow, and whether they are fixed in location or permit limited or unconstrained movement (mobility and scale).

Photography Helps Create the VR Experience

My entrance into the world of VR came through planning and executing photography that is used in VR. Images in VR differ from conventional photographs or cinematography by presenting a panoramic field of view. VR image makers often allude to the basic rules of this new medium, though no canon of laws yet exists. However, here are some guidelines for creating content for VR headsets that is watchable and less likely to induce discomfort or nausea in your viewers:

Keep the horizon flat, level, and stable, and never let it appear to curve concavely upward. A flat, stable horizon avoids apparent motion that can induce nausea in onlookers. A stable camera pitch also prevents the elastic flexing that occurs when the vertical pitch of the camera varies and the horizon line changes shape.

Our eyes, mind, and brain accept the notion that the horizon slopes just slightly downward because we walk on a giant sphere. On the other hand, a horizon that slopes upward is unnerving; it means we are walking on the inside of that sphere. We just don’t see things that way. Even the genius cinematography of three-time Oscar winner Emmanuel Lubezki was criticized for several seconds of “taffy pulling” as the horizon moved from a comforting convex shape to an eerie concave one in “Knight of Cups”.

three photographs showing the effect of camera position on horizon curvature
Figure 3. The visual effect of different horizon points
(a) Flat Level Horizon – Lens axis pointed at the true Horizon
(b) Convex Horizon – Lens axis pointed above the true Horizon
(c) Concave Horizon – Lens axis pointed below the true Horizon
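The effect of camera pitch on the horizon line can be sketched with a little spherical geometry. This is a simplified model of a panoramic view, not any specific camera’s projection: it computes the apparent elevation of the true horizon at each azimuth across the field of view.

```python
import math

def horizon_elevation(pitch_deg, azimuth_deg):
    """Apparent elevation (degrees) of the true horizon at a given
    azimuth, as seen by a panoramic camera pitched up or down by
    pitch_deg. A simplified geometric sketch."""
    th = math.radians(pitch_deg)
    az = math.radians(azimuth_deg)
    return math.degrees(math.asin(-math.sin(th) * math.cos(az)))

# With zero pitch, the horizon stays flat across the whole field of view:
print([round(horizon_elevation(0, az), 3) for az in (-90, -45, 0, 45, 90)])
# With the lens pitched 10 degrees up, the horizon line changes shape,
# sitting 10 degrees off straight ahead but level again at the sides:
print([round(horizon_elevation(10, az), 3) for az in (-90, -45, 0, 45, 90)])
```

Only a zero-pitch camera keeps every point of the horizon at the same height, which is why a wandering vertical pitch produces the “taffy pulling” flex.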

When photographing people, keep the horizon view of the lens at the level of subjects’ eyes.

In most instances, keeping the camera at chest level is about right.

This guideline is an application of the illustrators’ maxim to “hang figures on the horizon.” Centering the horizon line on subjects’ eyes directs the viewer’s attention to this key area and keeps head size and distance in simple linear perspective. Image makers also know that people pay a great deal of attention to the eyes, so they spend an inordinate amount of time focusing, lighting, and sharpening eyes to create striking portraits.

Photograph of three people with a diagram showing how their eyes are aligned laterally on the horizon
Figure 4. Composition with subjects placed so their eyes align on the image horizon.
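The geometry behind this guideline can be sketched with a simple pinhole projection. The heights and distances below are illustrative assumptions: any point at the same height as the camera projects exactly onto the image horizon, no matter how far away it is.

```python
def project_y(point_height, camera_height, distance, focal=1.0):
    """Pinhole projection of a point's height onto the image plane.
    y = 0 is the image horizon; the camera looks level at the horizon."""
    return focal * (point_height - camera_height) / distance

EYE_LEVEL = 1.6  # camera held at the subjects' eye height (metres, assumed)

# Subjects' eyes land on the horizon line however far away they stand:
for d in (2, 5, 20):
    print(project_y(1.6, EYE_LEVEL, d))   # always 0.0

# Their feet (height 0) climb toward the horizon as they recede:
for d in (2, 5, 20):
    print(project_y(0.0, EYE_LEVEL, d))
```

That is why “hanging figures on the horizon” keeps head size and distance in simple linear perspective: eye height is pinned, and only the rest of the body scales with distance.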

Do not photograph objects too close to the lens, especially when using dual lens or stereoscopic cameras.

Wide fields of view require wide angle optics, and sometimes images from separate lenses must be “stitched” together for panoramic scenes. If your camera is too close to your target object, one of the lenses may get only a part of your target object’s detail—perhaps not enough to create a satisfying stitched or stereoscopic image.

Being too close to your target can also produce the unpleasant-looking distortions we associate with wide angle lenses. (Actually, the distortion arises when the distance from the object to our lens—the Working Distance—differs from the distance from the reproduced image to our eyes—the Viewing Distance. Let’s discuss this later when I understand it better. 🙂)
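As a rough sketch of why close objects cause trouble for dual-lens rigs, we can compute the angular disparity between the two lens views. The 6.5 cm baseline (roughly human interpupillary distance) and the 1-degree comfort threshold are assumptions, not the specs of any particular camera:

```python
import math

def disparity_deg(baseline_m, distance_m):
    """Angular disparity between two lenses separated by baseline_m,
    looking at an object distance_m away (simple pinhole geometry)."""
    return math.degrees(2 * math.atan((baseline_m / 2) / distance_m))

def too_close(baseline_m, distance_m, max_disparity_deg=1.0):
    """Flag objects whose left/right views differ too much to stitch
    or fuse comfortably. The 1-degree threshold is an assumption."""
    return disparity_deg(baseline_m, distance_m) > max_disparity_deg

# A 6.5 cm lens separation, objects at 0.3 m, 1 m, and 4 m:
for d in (0.3, 1.0, 4.0):
    print(d, round(disparity_deg(0.065, d), 2), too_close(0.065, d))
```

Disparity grows rapidly as the object approaches the rig, so the two lenses see increasingly different versions of a nearby object and the stitcher (or your eyes) can no longer reconcile them.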

Avoid rapid movement in the visual field.

Make most motion run in a straight line toward the camera lens (aimed perpendicular to the plane of the flat image sensor) not cross-wise in front of the lens (in a line parallel to the plane of the image sensor). Motion appears slowest and most graceful when it moves forward or backward near the optical axis of the lens (perpendicular to the image sensor).

Think of your lens as a narrow spotlight that projects onto your subject. Have your subject run forward or backward within that tunnel of light.

Two diagrams showing motion of a person along and across the lens axis
Figure 5. Motion of a person (a) along the lens axis and (b) across the lens axis. Motion that runs across the lens axis (perpendicular to that tunnel of light) appears fastest and potentially most jarring. Quick, unexpected actions can appear jarring in Head Mounted Displays and can also induce nausea.
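A quick calculation shows why crossing motion reads so much faster than motion along the lens axis. The speeds and distances here are illustrative assumptions, not measurements:

```python
import math

def angular_speed_across(speed, distance):
    """Angular rate (deg/s) of a subject crossing in front of the lens
    at the given speed (m/s) and distance (m)."""
    return math.degrees(speed / distance)

def angular_speed_along(speed, distance, offset=0.5):
    """Angular rate (deg/s) of a subject moving straight toward the
    lens while standing `offset` metres off the optical axis (exactly
    on-axis it would be zero): d/dt atan(offset/d)."""
    return math.degrees(offset * speed / (distance**2 + offset**2))

# A runner at 3 m/s, 5 m from the camera:
print(angular_speed_across(3, 5))   # sweeps across the frame quickly
print(angular_speed_along(3, 5))    # drifts slowly near the lens axis
```

For the same runner, the crossing path sweeps roughly an order of magnitude more of the visual field per second than the on-axis path, which is exactly the jarring effect the guideline tries to avoid.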

Avoid fast cuts or transitions in the final presentation.

Finally, the way you edit the final product can affect the tempo and apparent motion of your images. Use graceful transitions, and even interstitial titles, content, or effects to prevent abrupt changes that could be disorienting in a headset.
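As a toy illustration of one graceful transition, here is a crossfade sketched on per-frame brightness values. Real editing software blends whole frames, of course; the shape of the alpha ramp is the point:

```python
def crossfade(clip_a, clip_b, overlap):
    """Blend the tail of clip_a into the head of clip_b over `overlap`
    frames, instead of a hard cut. Clips are lists of per-frame values."""
    head = clip_a[:-overlap]
    tail = clip_b[overlap:]
    blended = []
    for i in range(overlap):
        t = (i + 1) / (overlap + 1)           # ramp from near 0 to near 1
        blended.append((1 - t) * clip_a[len(clip_a) - overlap + i]
                       + t * clip_b[i])
    return head + blended + tail
```

Instead of the viewer’s whole visual field snapping to a new scene in one frame, the change is spread over several frames, which is far less disorienting inside a headset.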

Finally, all rules are made to be broken.

Breaking rules has consequences, and occasionally the outcomes are good. Artists and craftsmen consciously break rules to test their validity and provoke responses that conventional treatments will not. The VR experiences we are speaking of are comparatively new, and there is ample room to discover the limits of the “basic rules” outlined above.

Collaborating to Create a New Promethean World

The arts, literature, and film have long anticipated the world of artificial realities. These visions were not always uplifting. Aldous Huxley introduced us to “feelies”—diverting VR movies in 1932’s Brave New World. More recently, we were captivated by the ominous, deceptive reality of the Wachowskis’ Matrix movies. Now we have the tools to build such experiences, understand them, and use them.

One way to master and help develop aesthetic conventions is to experiment with creating content in ultra-wide angle, stereoscopic, and 3-dimensional projected formats.

An inexpensive place to start is with “action cameras” with exceptionally wide fields of view. My first such camera offered a 155-degree lateral field of view and cost well under $100. If you wish to experiment with 360-degree rendering in rectilinear displays, reliable dual-lens cameras are available for $200-$300.

UXers possess the attitudes, skills, and knowledge to help advance these areas and construct new realities to serve worthy and practical goals.
