At VirZOOM we make exercise games that move you around in VR from your pedaling and leaning on a bike. Solving VR sickness has been our top goal, so anyone in a gym can step up and feel exhilarated by VR rather than the opposite. It’s why our tagline is “Virtual Reality That Moves You!”

Virtual reality is experienced through a headset that draws two images of the game, one from the precise location of each of your eyes, in the direction your head is facing, to create a stereo effect. VR has been around for a long time, but the release of the Oculus DK1 marked the first time it was performant enough for a mass audience.
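
As a rough sketch of how that works (purely illustrative; a real SDK such as OpenXR hands you the per-eye transforms directly), the renderer offsets the tracked head position by half the distance between your eyes to get each eye’s camera position:

```python
# Minimal sketch of stereo eye positions from a tracked head pose.
# Illustrative only; real VR runtimes supply per-eye transforms directly.
# Vectors are (x, y, z); "head_right" is the head's local +x axis in world space.

IPD = 0.063  # assumed average interpupillary distance in meters

def eye_positions(head_pos, head_right):
    """Return (left_eye_pos, right_eye_pos), offset along the head's right axis."""
    half = IPD / 2.0
    left  = tuple(p - half * r for p, r in zip(head_pos, head_right))
    right = tuple(p + half * r for p, r in zip(head_pos, head_right))
    return left, right

# Each frame the game renders the scene twice, once from each returned position,
# both looking in the head's facing direction, to produce the stereo pair.
```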

VR before Oculus could cause sickness because framerates were too low to draw two wide-field images quickly and sharply enough, and because of the latency in measuring your head’s direction and position. Since Oculus was released these problems have been solved by more powerful computers, VR-optimized graphics drivers and rendering techniques, low-persistence displays, and better, faster sensors.

This still takes up to 10x the horsepower of regular videogames, which draw a single undistorted image, usually at 30 fps, with a smaller field of view. This is why most VR games look a generation or two old. Fortunately GPU and engine makers have been hard at work optimizing drivers and techniques that leverage the commonality between eyes and the fact that you perceive the most resolution in the center of your field of view, to bring that multiplier down to 2-3x.

Even with the best hardware and rendering, VR can still cause sickness if a game moves your virtual head very differently from your real head. The difference between the acceleration your inner ears feel and the acceleration your eyes see is the cause of VR sickness. It turns out that pretty much every 3D game requires your virtual head to move around, which is why existing games have been astoundingly difficult to bring to VR.

From all our playtesting and feedback, we believe people have different sensitivity thresholds that trigger their simulation sickness, and they only feel it about 10 minutes after a game has crossed that threshold. Most people incorrectly attribute the feeling to whatever they are doing at that moment rather than to what happened 10 minutes earlier. That delay is also why it’s difficult for someone to “discover their limit” and “auto-tune” a game to it. It’s true that repeated VR experience can acclimate users, but the amount and degree are again unpredictable, and a mass-market product can’t rely solely on that.

So games have to be redesigned with VR motion in mind. The most successful but also most limiting way to do this is “room-scale”, in which your virtual head moves exactly the same way as your real head. In these games you can only play from a single spot or within a small area, as far as the VR position tracking and your furniture allow.

One common way to let you move through virtual space is to put you in a “cockpit” where you can only see out of windows. This approach evolved from the observation that you don’t generally get sick playing 3D games on your home TV, because your brain perceives your whole room, which is not moving, and accepts that the TV portion is just an image. But that is also what makes this approach less immersive in VR. Because a VR headset’s field of view isn’t as wide as your real eyes’ (roughly 100 vs 180 degrees), the cockpit has to be drawn right in front of the user, and seeing the virtual world through a window can make it feel like an image rather than something real.

A more evolved approach involves reducing “vection”, the sense of self-motion created by motion detail in your peripheral vision. Your brain relies on vection, not just what you are looking at, to determine how your head is moving. Games can reduce vection by having low-detail backgrounds (e.g. space in EVE: Valkyrie), big blocking walls when you turn (e.g. Thumper), blindering you like a horse (e.g. Eagle Flight), or turning you in steps rather than continuously (e.g. Skyrim VR). These require obvious environment design or viewing limitations.

The VirZOOM approach is a third way, one that works well both when sitting on a bike for control and with a gamepad. It is based on reducing the difference between the acceleration felt by your inner ears and the acceleration seen by your eyes, which is the heart of the VR motion problem. Through hundreds of early playtests we determined which accelerations people are most sensitive to; from least to most sensitive they are “forward”, “up”, “sideways”, “down”, “backward”, and “rotation”. It’s interesting to think people evolved more sensitivity to “down” and “backward” accelerations because those are the two most dangerous directions to go if you can’t see.

As a first step, we smoothed motion changes to keep acceleration under those limits. But those limits were too constraining to let people rotate in our games, and we discovered that by turning your head at the same time as you turn in the game, you could play for hours instead of minutes. This was also a solution found by Eagle Flight, and head-turning was introduced later in the Oculus Tuscany demo.
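
Here is a minimal sketch of that kind of per-direction acceleration limiting; the limit numbers and function names are placeholders I’ve made up for illustration, not VirZOOM’s actual values.

```python
# Sketch of per-direction acceleration limiting, run once per frame.
# Each direction gets its own cap, larger for directions players are
# less sensitive to (forward > up > sideways > down > backward).
# Rotation handling is omitted here.

# Max comfortable acceleration in m/s^2 per direction (illustrative numbers).
ACCEL_LIMITS = {
    "forward":  5.0,
    "up":       3.0,
    "sideways": 2.0,
    "down":     1.5,
    "backward": 1.0,
}

def limit_axis(current_vel, target_vel, max_accel_pos, max_accel_neg, dt):
    """Move one velocity component toward its target without exceeding
    the acceleration limit for the direction of change."""
    change = target_vel - current_vel
    limit = (max_accel_pos if change > 0 else max_accel_neg) * dt
    change = max(-limit, min(limit, change))
    return current_vel + change

def smooth_velocity(vel, target, dt):
    """vel and target are (forward, sideways, vertical) velocity components."""
    fwd  = limit_axis(vel[0], target[0], ACCEL_LIMITS["forward"],  ACCEL_LIMITS["backward"], dt)
    side = limit_axis(vel[1], target[1], ACCEL_LIMITS["sideways"], ACCEL_LIMITS["sideways"], dt)
    vert = limit_axis(vel[2], target[2], ACCEL_LIMITS["up"],       ACCEL_LIMITS["down"],     dt)
    return (fwd, side, vert)
```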

But that solution alone makes you turn uncontrollably while looking around and moving. Most people want to look around in VR without unintentionally crashing into things or falling off cliffs. So we devised a control scheme that blends your steering with your head direction: you only turn in the virtual world if you both steer and turn your real head. If you steer without turning your head, you offset your movement direction but don’t actually turn (think changing lanes vs going around a curve).
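
A minimal sketch of one way such a blend could work; the gating rule and constants are my assumptions, not VirZOOM’s exact scheme.

```python
import math

# Sketch of blending steering input with head yaw, per frame.
# The world only rotates when you both steer AND turn your real head;
# steering alone just offsets your movement direction (a lane change).
# Constants are illustrative.

MAX_TURN_RATE = math.radians(45)       # world turn speed at full input (assumed)
LANE_OFFSET_ANGLE = math.radians(20)   # movement offset when steering without a head turn

def update_heading(world_yaw, steer, head_yaw_local, dt):
    """steer in [-1, 1]; head_yaw_local is head yaw relative to the vehicle, in radians.
    Returns (new_world_yaw, movement_direction)."""
    # Only turn the world when steering and head turn point the same way.
    turning = steer * head_yaw_local > 0
    if turning:
        world_yaw += steer * MAX_TURN_RATE * dt
        move_dir = world_yaw
    else:
        # Steering without a head turn: keep heading, offset the movement
        # direction instead, like changing lanes rather than rounding a curve.
        move_dir = world_yaw + steer * LANE_OFFSET_ANGLE
    return world_yaw, move_dir
```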

While a gamepad can provide steering, what feels great on a bike is to steer by leaning. We saw in tons of playtests that people intuitively lean to turn, as on a bicycle or motorcycle, even if you tell them to steer only through the handlebars. Something in their mind knows that just turning the handlebars on a bike without leaning would in fact throw you to the ground. Not having moving handlebars also made users feel safer (with their eyes covered) and more willing to lean. Once we switched from handlebar turning to leaning we saw a collective “whew!” in the body language of our players.
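
For completeness, a tiny sketch of turning a measured lean angle into the steer value used above, with an assumed deadzone so small unintentional leans are ignored (the constants are illustrative, not VirZOOM’s tuning):

```python
import math

# Sketch of mapping a measured lean angle to a steer input in [-1, 1].

LEAN_DEADZONE = math.radians(2)    # ignore tiny, unintentional leans (assumed)
FULL_LEAN     = math.radians(15)   # lean angle that maps to full steer (assumed)

def lean_to_steer(lean_angle):
    """lean_angle in radians, positive to the right."""
    sign = 1.0 if lean_angle >= 0 else -1.0
    magnitude = max(0.0, abs(lean_angle) - LEAN_DEADZONE)
    return sign * min(1.0, magnitude / (FULL_LEAN - LEAN_DEADZONE))
```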

The other big problem was landing from a flight. Takeoff and flying in VR, especially under your own power, is amazing. But landing is a sharp change in the “down” acceleration that people are highly sensitive to, and it can’t simply be smoothed or you would pass through the ground. If the person comes in for a smooth landing themselves, reducing their speed the closer they get to the ground, landing feels fine. If they just ram into the ground, it feels like a punch in the gut, even though their real body hasn’t moved.

To solve that problem we limit your falling speed along the direction normal to the ground slope, in proportion to your height above the ground in that direction. This can feel like an airplane autopilot if the user isn’t managing their own descent smoothly, but it’s much preferable to hard landings.
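
A minimal sketch of that rule, with assumed constants and helper names (not VirZOOM’s actual tuning): descent along the ground normal is capped, and the cap shrinks as the ground approaches.

```python
# Sketch of the soft-landing rule: cap descent speed along the ground normal
# in proportion to height above the ground measured in that direction.
# Constants are illustrative.

MIN_DESCENT = 0.5         # m/s allowed even when nearly touching down (assumed)
DESCENT_PER_METER = 1.5   # extra m/s of allowed descent per meter of height (assumed)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def limit_landing(velocity, ground_normal, height_above_ground):
    """Clamp the velocity component along -ground_normal so descent slows
    smoothly as height_above_ground shrinks. ground_normal is unit length,
    pointing up out of the ground."""
    max_descent = MIN_DESCENT + DESCENT_PER_METER * height_above_ground
    v_normal = dot(velocity, ground_normal)   # negative when falling toward the ground
    if v_normal < -max_descent:
        correction = -max_descent - v_normal  # positive amount of descent to remove
        velocity = tuple(v + correction * n for v, n in zip(velocity, ground_normal))
    return velocity
```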

Something that a lot of VR games get wrong, and could easily fix, is keeping your virtual “neck” upright rather than rolling or pitching with your vehicle or the ground. Roll and pitch should match your real head exactly, because only then do your inner ears feel gravity correctly. Another way to put this is that the horizon of your game should always remain level and centered.
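
A small sketch of the idea, assuming a y-up coordinate system: the camera’s base orientation takes only the vehicle’s heading, and the headset’s tracked rotation is applied on top of that upright base.

```python
import math

# Sketch of keeping the virtual "neck" upright: take only the yaw (heading)
# from the vehicle's orientation, discard its roll and pitch, and let the
# real headset supply the head's full rotation on top. Conventions are
# illustrative.

def vehicle_camera_base_yaw(vehicle_forward):
    """Return the yaw of the vehicle's forward vector projected onto the
    horizontal plane, so the camera base never rolls or pitches and the
    horizon stays level and centered."""
    fx, _, fz = vehicle_forward     # (x, y, z), y is up
    return math.atan2(fx, fz)       # heading only; roll and pitch dropped

# The headset's tracked rotation (your real head's roll, pitch, and yaw) is
# then applied relative to this upright base, so your inner ears and eyes
# agree about gravity.
```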

Besides the way you move, another thing we’ve learned about making good VR games is to keep the action mostly in front of you. Just because you can turn your head 180 degrees doesn’t mean the game should continuously require you to, and one of the benefits of VR is simply a wider field of view than you get on a TV. I’m excited to see games like FIFA in VR because I could see more of the field while also appreciating the character movements.

Also think about where to put your HUD (heads-up display). Game HUDs show your score and other things you need to know to play that don’t make thematic sense on your “vehicle”. We put ours 60 degrees up from the horizon, oriented to your VR reference frame, not including your head rotation or the movement-direction offset from steering. It’s not great to stick a VR HUD directly to your head where you can’t look away from it.
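
A small sketch of that placement, with an assumed HUD distance; reference_yaw here is the upright reference frame’s heading, not your head’s.

```python
import math

# Sketch of placing a HUD 60 degrees above the horizon in the player's VR
# reference frame (the upright, forward-facing frame that ignores head
# rotation and the steering offset). Distance is an assumed value.

HUD_ELEVATION = math.radians(60)
HUD_DISTANCE  = 2.0   # meters from the head, illustrative

def hud_position(head_pos, reference_yaw):
    """Return a world-space point 60 degrees up from the horizon, straight
    ahead along the reference frame's facing direction. y is up."""
    horizontal = HUD_DISTANCE * math.cos(HUD_ELEVATION)
    x = head_pos[0] + horizontal * math.sin(reference_yaw)
    y = head_pos[1] + HUD_DISTANCE * math.sin(HUD_ELEVATION)
    z = head_pos[2] + horizontal * math.cos(reference_yaw)
    return (x, y, z)
```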

We’ve learned that people can be fearful of moving in VR when, in their minds, the potential result of moving is falling or crashing. This is not simulation sickness, because it doesn’t result from motion; it is vertigo. It’s the reason we put a fence along the edges of our horse-riding canyon road, and also the reason we moved the walls around our racecar track back from the track, leaving a big patch of dirt that players could see they’d slide into before hitting the wall.

It’s crucial for players to feel in total control of their VR motion, so we have tutorials that teach you layer by layer how to pedal to move, lean to steer, look to turn, and finally how to fly. We also created immediate and accurate pedal speed sensors, because the cadence sensors and speedometers available at retail have latencies of around a second, and the instant you start or stop pedaling you need to start or stop moving.
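
A minimal sketch of how a low-latency speed estimate could be derived from raw sensor pulses; the constants and class here are illustrative, not VirZOOM’s hardware or code.

```python
import time

# Sketch of a low-latency pedal speed estimate from crank sensor pulses
# (e.g. several magnet pulses per crank revolution). Speed updates the
# moment a pulse arrives and decays to zero quickly when pulses stop,
# instead of being smoothed over ~1 second. Constants are illustrative.

METERS_PER_PULSE = 0.5   # virtual distance per sensor pulse (assumed)
STOP_TIMEOUT     = 0.3   # seconds without a pulse before treating pedaling as stopped (assumed)

class PedalSpeed:
    def __init__(self):
        self.last_pulse = None
        self.speed = 0.0   # m/s

    def on_pulse(self, now=None):
        now = time.monotonic() if now is None else now
        if self.last_pulse is not None:
            interval = now - self.last_pulse
            if interval > 0:
                self.speed = METERS_PER_PULSE / interval
        self.last_pulse = now

    def current_speed(self, now=None):
        now = time.monotonic() if now is None else now
        if self.last_pulse is None or now - self.last_pulse > STOP_TIMEOUT:
            self.speed = 0.0
        return self.speed
```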

With these VR comfort techniques we’ve found that 90% more people can play our games than without them. And not just for a longer period: if you stay below a person’s acceleration sensitivity threshold they can play until their legs give out. The payoff is an intensely immersive experience that really feels like you can race, fly, ride, drive, and paddle in beautiful virtual worlds.