The images are for example purposes only. The key to a system like this is that the pilot could not wander from a pre-marked circle on the field, and the judges would have to remain in a marked judges' box. The software would then be calibrated, so to speak, to a starting radial of the circle; knowing the line length gives the other calibration point. At that point it is all HW: of course you need goggles or glasses with motion sensors so the pilot's head movements can scroll the 3D mesh, and yes, this part of the HW is still expensive. However, new LCD projection technology is coming to market that projects a hi-res, accurate image onto conventional glass optics rather than putting the actual display device in the wearer's field of view.

This was more of an examination of what could be a jumping-off point for judge training, thinking outside of a solution that requires accurately tracking the airplane in 3D space. Perhaps it gives someone an idea that is simpler to implement, yet good enough for practical application. My idea is along those lines: it eliminates the need to track the airplane at all. You could build a quick-and-dirty judging aid by creating the hemisphere like I did, except you use a video camera to tape the flight and, using the 3D software, composite the 3D hemisphere with the predefined maneuver shapes over the video footage. As each maneuver is performed, the judge or judge's assistant brings up the next predefined shape on the screen; since there are two level laps between maneuvers, there is plenty of time for this. As long as the pilot does not wander too much or put the maneuver too far away from the judges, this tool would work.

No need for special HW or software beyond the 3D software I used to create the 3D sphere you saw, a video camera, and a notebook computer. And if you wish, the whole composited session can be burned to tape as the flight progresses.
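The hemisphere itself falls straight out of the geometry: the line length is the radius, so every point the plane can reach is just a direction (azimuth plus elevation) at that fixed distance. A minimal sketch of that idea in Python (the 21 m line length and the function names are my own illustrative choices, not anything from the actual setup):

```python
import math

def hemisphere_point(line_length, azimuth_deg, elevation_deg):
    """Return the (x, y, z) point on the flight hemisphere for a given
    azimuth and elevation, with the pilot's circle center at the origin.
    The line length fixes the radius of the hemisphere."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = line_length * math.cos(el) * math.cos(az)
    y = line_length * math.cos(el) * math.sin(az)
    z = line_length * math.sin(el)
    return (x, y, z)

# A predefined shape is just a list of such points. For example,
# a 45-degree level lap sampled every 10 degrees of azimuth:
level_lap = [hemisphere_point(21.0, az, 45.0) for az in range(0, 360, 10)]
```

A full maneuver library would simply be more of these point lists, traced once and reused for every flight, which is why no per-flight tracking of the airplane is needed.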
The cost: one video camera, one video tape recorder, one notebook computer with TV out, and the software ($600). The software allows you to composite live video into a 3D environment in real time, so there is no need to manually map the path of the plane afterwards. As long as the virtual camera's position within the software application matches the live video camera's, everything in the 3D-generated environment registers with the actual video footage; at that point the only thing moving through the frame is the airplane.

It still works that way after the fact as well, only the pilot does not get the benefit until the flight is over, and there is still the disconnect between what the pilot sees and what the judge sees. Of course, you could mount a camera on the pilot's head and tape that session to be composited into the 3D environment and viewed later. For it to work in the pilot's case, though, he would have to keep his head oriented at the airplane the whole time. Tough when doing overheads, as is judging them.
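The registration requirement above can be made concrete with a toy pinhole-camera projection: if the virtual camera sits at the same position as the real one, a hemisphere point projects to the same pixel the real airplane would occupy. This is only a sketch under simplifying assumptions I'm adding myself (camera axes aligned with the world, looking along +x toward circle center; made-up focal length and 1280x720 frame), not the actual 3D package's math:

```python
def project(world_pt, cam_pos, focal_px=1000.0, cx=640.0, cy=360.0):
    """Project a 3D world point into pixel coordinates for an idealized
    pinhole camera at cam_pos, looking along the +x axis.
    Returns None for points behind the camera."""
    # Translate the point into the camera's frame (translation only,
    # since we assume the camera axes are aligned with the world axes).
    x = world_pt[0] - cam_pos[0]
    y = world_pt[1] - cam_pos[1]
    z = world_pt[2] - cam_pos[2]
    if x <= 0:
        return None  # behind the camera plane
    u = cx + focal_px * (y / x)        # horizontal pixel
    v = cy - focal_px * (z / x)        # vertical pixel (image y grows down)
    return (u, v)
```

Projecting every point of a predefined maneuver shape this way yields the 2D overlay; any mismatch between `cam_pos` here and the real camera's position shows up directly as the overlay sliding off the footage, which is why both the pilot's circle and the judges' box have to be marked and fixed.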
You can never really judge overhead maneuvers, as to do so the judge or camera would have to be at circle center.