The motion capture-based Wireless VR project's goal is to provide untethered access to virtual environments for use in design and animation prototyping. The wearable components of the system are inexpensive, have low power requirements, and are lightweight to maximize flexibility and comfort.
Mike Altman, Department of Design
Tim Daoust, Department of Computer Information Science
Hussain Frosh, Department of Electrical and Computer Engineering
Selim Gencoglu, Department of Design
Rob Gordon, Department of Electrical and Computer Engineering
Brent Haley, Department of Computer Information Science
Wenxiang Huang, Department of Electrical Engineering
Min Lee, Department of Art
Matthew Lewis, Project Coordinator, ACCAD
To use the system, a participant dons a pair of video glasses, a wireless video receiver, and a battery pack. The wiring, receiver, and battery pack are integrated into wearable gear that is easy to put on and remove. In its simplest mode of operation, a baseball cap with six optical motion capture markers is also worn to track the user's head motion.
The system's initial virtual environments include the motion capture lab containing the system, the Sistine Chapel, the Vietnam Memorial, Times Square, an office lobby, and a small apartment. All of the system's environments may be modified at run-time by using a wireless joystick to interactively place additional objects, lights, projectors, and animated characters. The user is free to walk around the environments and objects at any time, limited only by the 25'x25'x8' confines of the motion capture volume. Internet data may also be sent to and from the system to provide for collaborative virtual environments.
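Since the user's movement is bounded by the 25'x25'x8' capture volume, tracked positions driving the virtual camera need only be kept inside that box. The sketch below is hypothetical (the function and coordinate conventions are assumptions, not the project's actual code), assuming a volume centered on the floor origin in x/y with the floor at z = 0:

```python
# Hypothetical sketch, not the project's implementation:
# clamp a tracked head position (in feet) to the
# 25' x 25' x 8' motion capture volume described above.

def clamp_to_volume(x, y, z, width=25.0, depth=25.0, height=8.0):
    """Return (x, y, z) clamped inside the capture volume.

    Assumes the volume is centered on the origin in x/y
    and that z = 0 is the floor.
    """
    half_w, half_d = width / 2.0, depth / 2.0
    cx = min(max(x, -half_w), half_w)
    cy = min(max(y, -half_d), half_d)
    cz = min(max(z, 0.0), height)
    return cx, cy, cz
```

A position reported just outside the volume, e.g. `clamp_to_volume(30.0, 0.0, 9.0)`, would be pulled back to the boundary, `(12.5, 0.0, 8.0)`.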
Current development of the system focuses on replacing the need for external command input devices with the movement of the user’s body.
For more information, please contact Matthew Lewis.
Completed 2002-2010.