However, because of the way the HVX was set up, the color information it recorded differed quite a bit from, say, a Nikon still I would have gotten by shooting it the proper way. So I shot the globe with the HVX instead and pulled a still frame from the footage to get the data (which also saved a ton of time). Naturally, the frame came into my application quite dark since it wasn't HDR, but by raising the levels on it I managed to get a very good result that matched both the lighting and the reflections shot in the scene.
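To show what I mean by "raising the levels", here's a minimal Python sketch of the idea: take the single 8-bit still, linearize it, and push the exposure up. The filename and gain value are made up, and I'm assuming the frame carries a roughly sRGB transfer curve; a true HDR probe would bracket several exposures instead of stretching one.

```python
import numpy as np
from PIL import Image

# Load the 8-bit still frame grabbed from the HVX (hypothetical filename).
frame = np.asarray(Image.open("chrome_ball_still.png"), dtype=np.float32) / 255.0

# Undo the (assumed) sRGB transfer curve so we work in linear light.
linear = np.where(frame <= 0.04045,
                  frame / 12.92,
                  ((frame + 0.055) / 1.055) ** 2.4)

# "Raising the levels": a linear exposure gain. Highlights were clipped
# at capture, so this only stretches what survived -- a real HDR probe
# would merge bracketed exposures to recover that range.
gain = 4.0  # assumed value; tune until the probe matches the plate
boosted = linear * gain

# Clamp and re-encode to 8-bit just for preview; a float format like EXR
# would preserve the boosted values properly.
preview = np.clip(boosted, 0.0, 1.0) ** (1.0 / 2.2)
Image.fromarray((preview * 255).astype(np.uint8)).save("probe_boosted.png")
```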
Now I realize I'm doing it the wrong way, but it seems to work. What I'd like to know is: why don't most people just shoot the chrome balls in camera? Is this something done only on film, or is it avoided altogether?