Virtual Batting Cage

In the virtual batting cage, the player used a real baseball bat to swing at virtual pitches seen through a head-mounted display. By tracking the bat with real-time stereo vision, the system could judge each swing a hit or a miss, report the details, and show an instant replay. The pitch data was "real": it could have come from a major league pitcher, but in this case it came from me, an amateur pitcher. Most non-players couldn't hit even my pitches anyway. The 3-D pitch data was extracted from video of me pitching on a real field, using a sophisticated 3-D monocular vision scheme.
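The hit-or-miss decision can be sketched geometrically: model the tracked bat as a thick line segment and test whether the virtual ball overlaps it as it crosses the plate. This is a hypothetical illustration, not the original TRIAX code; the bat and ball radii are my own rough numbers.

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from point p to segment ab; all are 3-D tuples (meters)."""
    ab = [b[i] - a[i] for i in range(3)]
    ap = [p[i] - a[i] for i in range(3)]
    denom = sum(c * c for c in ab)
    # Clamp the projection parameter so the closest point stays on the segment.
    t = 0.0 if denom == 0 else max(0.0, min(1.0, sum(ap[i] * ab[i] for i in range(3)) / denom))
    closest = [a[i] + t * ab[i] for i in range(3)]
    return math.dist(p, closest)

def swing_result(ball_pos, bat_knob, bat_tip, bat_radius=0.035, ball_radius=0.037):
    """Report 'hit' if the ball's sphere overlaps the bat's cylinder at plate crossing."""
    gap = point_segment_distance(ball_pos, bat_knob, bat_tip)
    return "hit" if gap <= bat_radius + ball_radius else "miss"
```

In practice the comparison would be made at the tracked bat pose closest in time to the ball's plate-crossing instant, with both trajectories interpolated between 60 Hz samples.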

The real baseball value of this system is that it can train a player's discrimination of balls from strikes, and the ability to cope with game-quality velocity and location. That training is hard to get from batting practice and normally comes only from game experience.

The whole thing --- stereo vision and real-time graphics --- ran on my TRIAX vision system (also see the TRIAX article). To make that work, the system popped from graphics to vision just as the virtual ball crossed the plate; the player saw a flash, but by then it was too late to affect the swing. The system ran at 60 Hz, carefully accounting for every source of latency, to provide the temporal accuracy such a dynamic task demands.
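The latency accounting can be illustrated with a simple budget: every delay between the bat moving and the display updating is tallied, and the virtual ball is extrapolated forward by that total so it appears where it should at viewing time. The individual numbers below are invented for illustration, not taken from the paper; only the 60 Hz field rate comes from the text.

```python
FRAME_MS = 1000.0 / 60.0  # one field time at 60 Hz

# Hypothetical latency sources, each one assumed to cost about a field time.
latency_sources_ms = {
    "camera exposure/transfer": FRAME_MS,
    "stereo vision processing": FRAME_MS,
    "graphics generation":      FRAME_MS,
    "display scan-out":         FRAME_MS / 2.0,  # mid-screen average
}

total_latency_ms = sum(latency_sources_ms.values())

def extrapolate(position, velocity, latency_s):
    """Predict where the ball will be when the drawn frame is actually seen."""
    return tuple(p + v * latency_s for p, v in zip(position, velocity))
```

With a budget like this the renderer draws the ball not where it is "now", but where it will be roughly three and a half fields later, which is what makes a 60 Hz system feel temporally accurate.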

Screen captures (~120K) are available if you've got the time. The real-time graphics were line drawings, due to hardware speed limits at the time, though some later replay-only systems produce nice graphics. You can learn more from the paper.