Aritra Nath & Shayan Javed

Virtual Environments - Spring 2009 Project 3

Project: a Virtual Reality experience in which you, as a soldier, practice shooting. A study involving 10 people was run to measure presence in this Virtual Environment (VE).
Download link to source code: Project 3 [Requires OgreSDK and OpenAL]
Download link to the report of the study: Study Report

Goal:

Training soldiers to shoot and/or to go through different military scenarios can be expensive. Our experience aims to help people practice shooting at distant objects from a high position (such as a mountain).

Scenario:

The scenario casts you as a soldier recruited to serve at a new military base high up in the mountains. To train you, multiple targets (some of them moving) have been set up around the area, which you have to shoot down with your rifle. You have to shoot the targets closest to you first, to show that you can take care of the nearest threats first.

Features:

1. A large scene composed of a military base and surrounding mountain ranges.
2. Head position and orientation tracking.
3. Object tracking (a rifle). A Wii Zapper substitutes for the rifle in the real world; ideally a replica of the rifle would be used. Hands on the rifle give a partial representation of the avatar.
4. Haptics: the Wiimote rumbles when you press the trigger (ideally, force feedback would simulate the recoil). Passive haptics: a barrier in front of the user and a wall behind. (See the rumble sketch after this list.)
5. Sound: Shooting the gun, exploding targets/balls, mountain ambience.
6. The user is given orders and instructions by a general; the user is assumed to be wearing a headset.
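
As a rough illustration of the trigger haptics, the sketch below pairs the Zapper's trigger (which presses the Wiimote's B button) with a rumble pulse. It uses the open-source wiiuse library purely as an example of a Wiimote API; the library choice, the polling loop, and the fireShot()/running names are illustrative assumptions, not a verbatim excerpt from our code.

#include <wiiuse.h>

// Connect to a single Wiimote (the one inside the Wii Zapper).
wiimote** wiimotes = wiiuse_init(1);
wiiuse_find(wiimotes, 1, 5);          // search for up to 5 seconds
wiiuse_connect(wiimotes, 1);

// Inside the main loop: rumble while the trigger (B button) is held.
while (running) {                                // 'running' is the app's main-loop flag (assumed)
    if (wiiuse_poll(wiimotes, 1) && wiimotes[0]->event == WIIUSE_EVENT) {
        if (IS_PRESSED(wiimotes[0], WIIMOTE_BUTTON_B)) {
            wiiuse_rumble(wiimotes[0], 1);       // rumble on while the trigger is held
            fireShot();                          // hypothetical: resolve the shot, play the sound
        } else {
            wiiuse_rumble(wiimotes[0], 0);       // rumble off when released
        }
    }
}
wiiuse_cleanup(wiimotes, 1);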

How It Worked

-The user starts off at the base, between the two barriers, with the gun in hand. The "general" greets the user, tells them to get used to the environment and the handling of the gun, and asks them to say when they are ready to begin practice.
-When the user is ready, the general gives further orders and instructions. Six targets pop up, which the user has to shoot in order of increasing distance (closest first); a rough hit-detection sketch follows this list.
-Next, the general instructs the user to shoot moving targets (balls) in the same order.
-The experience ends once all the moving targets have been shot.
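
How a shot is resolved isn't spelled out above, so here is a sketch of how the "closest target first" rule can be combined with hit detection using Ogre's ray scene queries. The function name, the remainingTargets list, and the assumption that each target is its own scene node are illustrative, not a verbatim excerpt from our code.

#include <Ogre.h>
#include <vector>
#include <limits>

// Returns true if the shot fired from the rifle hits the target the user is
// supposed to hit next, i.e. the closest one still standing.
bool checkShot(Ogre::SceneManager* sceneMgr,
               const Ogre::Vector3& muzzlePos,
               const Ogre::Vector3& aimDir,
               const std::vector<Ogre::SceneNode*>& remainingTargets)
{
    // Which remaining target is currently the closest to the shooter?
    Ogre::SceneNode* nextTarget = 0;
    Ogre::Real bestDist = std::numeric_limits<Ogre::Real>::max();
    for (size_t i = 0; i < remainingTargets.size(); ++i) {
        Ogre::Real d = remainingTargets[i]->_getDerivedPosition().distance(muzzlePos);
        if (d < bestDist) { bestDist = d; nextTarget = remainingTargets[i]; }
    }

    // Cast a ray from the rifle muzzle along its aiming direction.
    Ogre::Ray shotRay(muzzlePos, aimDir.normalisedCopy());
    Ogre::RaySceneQuery* query = sceneMgr->createRayQuery(shotRay);
    query->setSortByDistance(true);
    Ogre::RaySceneQueryResult& hits = query->execute();

    bool hitCorrectTarget = false;
    for (Ogre::RaySceneQueryResult::iterator it = hits.begin(); it != hits.end(); ++it) {
        if (it->movable && it->movable->getParentSceneNode() == nextTarget) {
            hitCorrectTarget = true;    // correct target hit: explode it, play the sound, etc.
            break;
        }
    }
    sceneMgr->destroyQuery(query);
    return hitCorrectTarget;
}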

Video:


How it was done:

Tracking:

Head tracking and object tracking were both done using the two cameras situated in the lab.
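
The tracked poses are applied to the scene every frame: the head pose drives the camera and the rifle pose drives the gun/hands model. The snippet below only sketches that per-frame update; TrackedPose and getHeadPose() stand in for whatever the lab's tracking API actually returns and are not real calls.

// Hypothetical pose struct standing in for the tracker output.
struct TrackedPose {
    float x, y, z;            // position
    float qw, qx, qy, qz;     // orientation as a quaternion
};
TrackedPose getHeadPose();    // placeholder for the real tracker query

// Called once per frame; headNode has the Ogre camera(s) attached to it.
void updateHeadTracking(Ogre::SceneNode* headNode)
{
    TrackedPose p = getHeadPose();
    headNode->setPosition(Ogre::Vector3(p.x, p.y, p.z));
    headNode->setOrientation(Ogre::Quaternion(p.qw, p.qx, p.qy, p.qz));
}

The rifle node is updated the same way from the Zapper's tracked pose.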

Displays:

The V8 HMD was used for the stereo effect. It has two 1.3" 640 x 480 screens. Ogre's StereoManager was used to render the two per-eye views.
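
StereoManager takes care of the per-eye rendering, but conceptually the dual-output setup boils down to two cameras separated by roughly the interocular distance, each with its own viewport. The sketch below shows that idea with plain Ogre calls; the eye-separation value and the single split render window are illustrative, not the exact StereoManager configuration we used.

// Two cameras, one per eye, attached to the head node and offset horizontally.
Ogre::Camera* leftCam  = sceneMgr->createCamera("LeftEye");
Ogre::Camera* rightCam = sceneMgr->createCamera("RightEye");
Ogre::SceneNode* headNode = sceneMgr->getRootSceneNode()->createChildSceneNode("Head");
headNode->attachObject(leftCam);
headNode->attachObject(rightCam);

const Ogre::Real eyeSep = 0.06f;          // ~6 cm interocular distance (illustrative)
leftCam->setPosition(-eyeSep / 2, 0, 0);
rightCam->setPosition(eyeSep / 2, 0, 0);

// One viewport per eye (here: left and right halves of one render window).
Ogre::Viewport* leftVp  = window->addViewport(leftCam,  0, 0.0f, 0.0f, 0.5f, 1.0f);
Ogre::Viewport* rightVp = window->addViewport(rightCam, 1, 0.5f, 0.0f, 0.5f, 1.0f);
leftCam->setAspectRatio(Ogre::Real(leftVp->getActualWidth()) / leftVp->getActualHeight());
rightCam->setAspectRatio(Ogre::Real(rightVp->getActualWidth()) / rightVp->getActualHeight());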

Scene setup:

-The scene was created using models found on Google 3D Warehouse.
-The Ogre rendering engine was used to render and manage the whole scene.
-The OpenAL library was used for audio. Sounds were found on the internet, and some voice recording was done (see the playback sketch after this list).
-The virtual barrier was set up in (almost) exactly the same place as the real one.
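
For reference, playing one of the one-shot sounds (e.g. the gunshot) with OpenAL follows the usual device/buffer/source pattern shown below. The loadWav() helper and its WavData struct are placeholders for whatever WAV loading is done; they are not part of OpenAL.

#include <AL/al.h>
#include <AL/alc.h>

struct WavData { ALenum format; void* data; ALsizei size; ALsizei freq; };
WavData loadWav(const char* path);   // hypothetical WAV loader, not part of OpenAL

// One-time setup: open the default device and create a context.
ALCdevice*  device  = alcOpenDevice(NULL);
ALCcontext* context = alcCreateContext(device, NULL);
alcMakeContextCurrent(context);

// Load the gunshot into a buffer and attach it to a source.
ALuint buffer, source;
alGenBuffers(1, &buffer);
alGenSources(1, &source);
WavData wav = loadWav("gunshot.wav");
alBufferData(buffer, wav.format, wav.data, wav.size, wav.freq);
alSourcei(source, AL_BUFFER, buffer);

// Fired whenever the trigger is pulled.
alSourcePlay(source);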

Future Work

-Add shadows.
-Add dynamic sound.
-Add a full avatar.
-Find a bigger barrier.
-Add more scenarios.
-Provide better instructions through the general.