I promised myself that the Visualist would be a far broader forum than a traditional “corporate blog,” but this one I have to share. We recently released a new version (2.1) of our flagship display auto-calibration software, Sol, and it contains a feature worth mentioning: wide field of view lens correction.

To automatically align a cluster of projectors into a single display, you need a sensor of some kind. In the case of our software, we use a camera. Point the camera at the display, start our software, and after a few minutes any overlapping set of projected images is converted into a single seamless display. It turns out that one of the underlying scientific challenges in doing this is that the camera itself may have unmodeled distortions that, if not taken into account, end up distorting the images on the resulting beyond-HD display. This is only made worse when the camera lens is a fisheye.
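For readers who want a concrete picture, this kind of “radial” distortion is conventionally modeled as a polynomial in the distance from a distortion center. The sketch below is purely illustrative; the function name, the coefficients k1 and k2, and the two-term polynomial are my own stand-ins, not Mersive's implementation:

```python
import numpy as np

def apply_radial_distortion(points, k1, k2, center):
    """Forward model: r_d = r_u * (1 + k1*r_u^2 + k2*r_u^4).

    points -- (N, 2) array of undistorted pixel coordinates
    k1, k2 -- radial coefficients; strongly negative values mimic
              the barrel (fisheye) look described above
    center -- (cx, cy) distortion center, usually near the image center
    """
    p = np.asarray(points, dtype=float) - center
    r2 = np.sum(p ** 2, axis=1, keepdims=True)
    return center + p * (1.0 + k1 * r2 + k2 * r2 ** 2)
```

Undistorting an image means inverting this map, which has no closed form and is typically done by fixed-point iteration or a precomputed lookup table.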

I’d been thinking about this problem for quite some time, and over the last nine months I developed a technique to remove the distortion (typically known as “radial” or “fisheye” distortion) from very wide field of view lenses. With it, the Mersive software can align displays when the camera is placed very close to the display, or even align an entire planetarium using a single very-wide-field-of-view lens.

The approach is based on some interesting mathematics from the computer vision community called “structure from motion.” Structure from motion uses a moving camera to recover a three-dimensional surface description (the structure) of a scene. Although our camera doesn’t move, it does take multiple shots of the same scene as the projectors illuminate it differently. It turns out that a changing scene is nearly equivalent to motion, and by borrowing mathematical techniques from recent work in the structure-from-motion community, we were able to estimate and remove the distortion from any lens automatically.
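Mersive hasn’t published the exact structure-from-motion formulation, so here is a classical stand-in for flavor: a “plumb-line” style fit, where features that should be straight after undistortion (for example, the edges of a projected grid) are used to solve for the coefficients by nonlinear least squares. Everything below, from the assumed distortion center to the hypothetical `lines` input, is an illustrative sketch, not Sol’s algorithm:

```python
import numpy as np
from scipy.optimize import least_squares

CENTER = np.array([320.0, 240.0])  # assumed distortion center (VGA camera)
SCALE = 400.0                      # normalize radii to O(1) so k1, k2 stay well-scaled

def undistort(pts, k1, k2):
    """Invert r_d = r_u*(1 + k1*r_u^2 + k2*r_u^4) by fixed-point iteration."""
    d = (pts - CENTER) / SCALE
    u = d.copy()
    for _ in range(20):  # converges quickly for moderate distortion
        r2 = np.sum(u ** 2, axis=1, keepdims=True)
        u = d / (1.0 + k1 * r2 + k2 * r2 ** 2)
    return CENTER + u * SCALE

def straightness(pts):
    """Perpendicular distances of points from their own best-fit line."""
    c = pts - pts.mean(axis=0)
    normal = np.linalg.svd(c)[2][-1]  # direction of least variance = line normal
    return c @ normal

def cost(params, lines):
    k1, k2 = params
    return np.concatenate([straightness(undistort(l, k1, k2)) for l in lines])

# `lines` is a hypothetical list of (N, 2) point arrays detected along
# features known to be straight in the projected pattern:
# fit = least_squares(cost, x0=[0.0, 0.0], args=(lines,))
# k1, k2 = fit.x
```

In a setup like Sol’s, the projectors’ own patterns could supply such features across multiple shots, which is one way a system can self-calibrate from the same imagery it already uses for alignment.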

All in all, the work will support broader use of alignment technology and should be exciting to users who are building displays in confined spaces.

[Figure: raw image shows lens distortion]
[Figure: the same image after radial distortion correction]
About Christopher Jaynes

Jaynes received his doctoral degree from the University of Massachusetts, Amherst, where he worked on camera calibration and aerial image interpretation technologies now in use by the federal government. He received his BS degree with honors from the School of Computer Science at the University of Utah. In 2004, he founded Mersive, where he serves today as Chief Technology Officer. Before Mersive, Jaynes founded the Metaverse Lab at the University of Kentucky, recognized as one of the leading laboratories for computer vision and interactive media, with research spanning video surveillance, human-computer interaction, and display technologies.
