Stereo omniview camera system and Time-Of-Flight camera for Mars rover

H.-R. Graf, C. Gimkiewicz, C. Urban

Compact catadioptric cameras and Time-Of-Flight (TOF) cameras show high potential for autonomous vehicle navigation. For the European project “PRoVisG”, CSEM has developed a miniature omniview camera with a horizontal field of view of 360° and a vertical field of view of 70°. With two such cameras stacked vertically, the overlapping vertical field of view can be used to calculate distance maps from stereo vision. The stereo omniview camera system and an additional 3D-TOF camera were successfully integrated on a rover for first field trials on Earth.

Highly autonomous terrain rovers are currently being designed for future Mars exploration missions. Because of the limited communication possibilities between Mars and Earth, such vehicles have to navigate and travel autonomously. Stereo vision cameras for close-up collision avoidance, for example, are required to ensure safety. Both the stereo omniview camera system and the 3D-TOF camera are such close-up collision avoidance systems.

Usually, omniview cameras are built with conventional objectives and external mirrors. Such set-ups are bulky, heavy, and sensitive to vibrations. CSEM has already developed a miniaturized omniview system consisting of a mirror lens and an imaging lens in one compact lens holder [1].

For PRoVisG, CSEM has developed a new catadioptric lens system specifically for stereo vision on a future Mars rover:

In omniview cameras with a catadioptric system, horizontal object structures in the panoramic scene appear on the sensor as circles, and vertical structures as radial stripes.
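To illustrate this mapping, the following minimal sketch shows how such a ring-shaped raw image could be unrolled into a rectangular panorama with OpenCV. It is not the PRoVisG processing code; the image centre and the inner and outer radii of the useful ring are assumed calibration parameters, not values from the actual system.

```cpp
// unroll_panorama.cpp -- minimal sketch, not the PRoVisG implementation.
// Assumes the circular omniview image has already been calibrated so that
// the image centre and the inner/outer radii of the useful ring are known.
#include <opencv2/opencv.hpp>
#include <cmath>

cv::Mat unrollPanorama(const cv::Mat& raw,
                       cv::Point2f center,   // centre of the mirror image
                       float rInner,         // radius of the blind inner zone
                       float rOuter,         // outer radius of the imaged ring
                       int panoWidth,        // samples along the 360 deg azimuth
                       int panoHeight)       // samples along the vertical field of view
{
    cv::Mat mapX(panoHeight, panoWidth, CV_32FC1);
    cv::Mat mapY(panoHeight, panoWidth, CV_32FC1);

    for (int v = 0; v < panoHeight; ++v) {
        // Row 0 of the panorama is taken here at the outer radius; invert the
        // interpolation if the mirror design maps the scene the other way round.
        float r = rOuter - (rOuter - rInner) * (static_cast<float>(v) / (panoHeight - 1));
        for (int u = 0; u < panoWidth; ++u) {
            float theta = 2.0f * static_cast<float>(CV_PI) * u / panoWidth; // azimuth angle
            mapX.at<float>(v, u) = center.x + r * std::cos(theta);
            mapY.at<float>(v, u) = center.y + r * std::sin(theta);
        }
    }

    cv::Mat pano;
    cv::remap(raw, pano, mapX, mapY, cv::INTER_LINEAR);
    return pano;
}
```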

The field of view of the cameras is 360° in the horizontal plane and, by design, -60° to +10° in the vertical plane. The target sensors are two high-dynamic-range 1-megapixel cameras, commercially available from PhotonFocus AG, a CSEM spin-off company.

For stereo vision, two omniview cameras are stacked vertically, and the overlapping images are used to calculate the distance to identified objects. The assembly foresees a camera separation of 400 mm. The theoretical resolution given by the number of pixels is 10 mm for objects at a distance of 1000 mm. The core component of a compact and robust catadioptric system is the mirror lens. It consists of an input surface shaped as a torus, a reflecting cone-like mirror, and an output lens. To improve the image quality, additional imaging lenses are placed around the field stop.
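Returning to the stereo configuration, the range calculation reduces to simple triangulation once both panoramas are expressed in elevation angles. The sketch below is only illustrative: the elevation angles are assumed to come from the calibrated catadioptric mapping, and only the 400 mm baseline is taken from the text.

```cpp
// Range from vertically stacked omniview cameras: a minimal sketch.
// A scene point seen at the same azimuth in both unrolled panoramas appears
// at elevation angles elevUpper and elevLower (radians, measured from the
// horizontal plane of each camera). With a vertical baseline b between the
// two viewpoints, the horizontal distance rho follows from triangulation.
#include <cmath>
#include <cstdio>

double horizontalRange(double baseline, double elevUpper, double elevLower)
{
    // The point appears higher as seen from the lower camera, so
    // tan(elevLower) - tan(elevUpper) is the (positive) angular disparity.
    double disparity = std::tan(elevLower) - std::tan(elevUpper);
    return baseline / disparity;
}

int main()
{
    const double b = 0.4;  // 400 mm baseline, as given in the text
    // Hypothetical example angles, chosen inside the -60 deg to +10 deg
    // vertical field of view of the cameras.
    double rho = horizontalRange(b, -0.25, 0.10);
    std::printf("horizontal range: %.2f m\n", rho);  // roughly 1.1 m here
    return 0;
}
```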

Schematic view of the compact catadioptric system

Special care has been taken to reduce the distortion at wide viewing angles. The distortion is at most 2% over the total image radius.

Distortion of configuration “bright” and configuration “sharp” at a temperature of 20°C

The optical system is assembled in a metallic tube, with each lens fixed and adjusted by spacer rings. The tube has a C-mount thread so that it can be screwed into the lens holder, which allows the focal length to be adjusted. The prototype lenses were fabricated in plastic to reduce costs.

Another innovative camera technology developed at CSEM is based on the Time-Of-Flight (TOF) principle [2]. A dedicated image sensor with smart “lock-in” pixels performs synchronous detection of the phase of a modulated infrared light field. These phase offsets correspond to different flight times and therefore to different distances, acquired for every pixel individually and in parallel.

For PRoVisG, the commercially available SR4000 camera from MESA Imaging AG, a CSEM spin-off company, is used. The camera has 176 x 144 pixels (QCIF resolution) and uses 24 power LEDs at 850 nm wavelength. The distance resolution is 4 mm (typical) within the calibrated range of 0.8 to 5 m.
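The distance measurement behind such a lock-in pixel can be summarized in a few lines. The sketch below uses the common four-tap demodulation and an assumed modulation frequency of 30 MHz, chosen only because it yields an unambiguous range of about 5 m, consistent with the calibrated range quoted above; it does not describe the SR4000 processing chain.

```cpp
// Phase-based Time-Of-Flight: minimal sketch of four-tap lock-in demodulation.
#include <cmath>
#include <cstdio>

struct TofSample { double a0, a1, a2, a3; }; // four integration taps, 90 deg apart

// Distance for one pixel. fMod is the modulation frequency in Hz; the value
// used in main() is an assumed example, not a camera specification.
double tofDistance(const TofSample& s, double fMod)
{
    const double c   = 299792458.0;            // speed of light [m/s]
    const double kPi = 3.14159265358979323846;
    double phase = std::atan2(s.a3 - s.a1, s.a0 - s.a2); // demodulated phase
    if (phase < 0.0) phase += 2.0 * kPi;                  // map into [0, 2*pi)
    // One full phase cycle corresponds to the light travelling to the target
    // and back over the unambiguous range c / (2 * fMod).
    return (c / (2.0 * fMod)) * (phase / (2.0 * kPi));
}

int main()
{
    TofSample s {0.9, 0.4, 0.1, 0.6};  // hypothetical tap values for one pixel
    double fMod = 30.0e6;              // assumed 30 MHz modulation frequency
    std::printf("unambiguous range: %.2f m\n", 299792458.0 / (2.0 * fMod));
    std::printf("distance:          %.3f m\n", tofDistance(s, fMod));
    return 0;
}
```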

3D Time-Of-Flight (TOF) camera from MESA Imaging

To enable mobile operation, both camera systems connect to a laptop PC. The PhotonFocus cameras need a frame grabber (available from Imperx Inc.) with two independent CameraLink Base interfaces, while the MESA Imaging 3D-TOF camera has a USB 2.0 interface. The generic data acquisition framework PLabDaq has been extended to support these devices. It enables camera control and data acquisition from within MATLAB or from standalone applications.

Standalone graphical real-time applications are written in C++, using OpenCV, SSE2, OpenGL, and IUP (for the GUI). They allow 16-bit PNG image recording, processing, and visualization. For the processing core of PRoVisG, additional PRoVIP functions are implemented as command-line tools. They convert the raw sensor data to a cloud of 3D points in Cartesian space, which is exported in the PLY polygon file format.
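As a rough illustration of what such a conversion tool does, the sketch below back-projects a TOF range image into Cartesian points through a pinhole model and writes them as an ASCII PLY file. It is not the PRoVIP tool itself; the intrinsic parameters and the handling of the range values are hypothetical placeholders.

```cpp
// rangeToPly.cpp -- minimal sketch of a range-image-to-PLY converter,
// not the PRoVIP command-line tool. Pinhole intrinsics (fx, fy, cx, cy)
// are hypothetical placeholders for a 176 x 144 TOF image.
#include <cstdio>
#include <vector>

struct Point3 { float x, y, z; };

std::vector<Point3> rangeImageToPoints(const std::vector<float>& range,
                                       int width, int height,
                                       float fx, float fy, float cx, float cy)
{
    std::vector<Point3> pts;
    pts.reserve(range.size());
    for (int v = 0; v < height; ++v) {
        for (int u = 0; u < width; ++u) {
            float z = range[v * width + u];   // per-pixel distance [m]
            if (z <= 0.0f) continue;          // skip invalid pixels
            // Back-project through a pinhole model (treating z as depth).
            pts.push_back({ (u - cx) * z / fx, (v - cy) * z / fy, z });
        }
    }
    return pts;
}

void writeAsciiPly(const char* path, const std::vector<Point3>& pts)
{
    std::FILE* f = std::fopen(path, "w");
    if (!f) return;
    // Minimal ASCII PLY header: vertex count plus x, y, z properties.
    std::fprintf(f, "ply\nformat ascii 1.0\n");
    std::fprintf(f, "element vertex %zu\n", pts.size());
    std::fprintf(f, "property float x\nproperty float y\nproperty float z\n");
    std::fprintf(f, "end_header\n");
    for (const Point3& p : pts)
        std::fprintf(f, "%f %f %f\n", p.x, p.y, p.z);
    std::fclose(f);
}
```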

Raw sensor data recorded with the omniview camera system
left: image from the upper camera; right: image from the lower camera

Unrolled panorama images created from raw data

During a PRoVisG field test in Aberystwyth, UK, both camera systems were mounted on the Bridget rover from EADS Astrium.

Image: ‘Bridget, Mars Rover’ on field trials in Tenerife, September 2011 © Astrium

The gathered data includes recording sessions at Clarach Bay as well as images of checkerboard targets for camera calibration purposes.

Visualization of data recorded with the 3D-TOF camera

Future work includes improved camera calibration and depth-from-stereo-vision calculation for the omniview camera system.

This work has been co-funded by the European Commission under the 7th Framework Program Space, GA 218814.


[1]   C. Gimkiewicz, C. Urban, E. Innerhofer, P. Ferrat, S. Neukom, G. Vanstraelen, P. Seitz, “Ultra-Miniature Catadioptrical System for an Omnidirectional Camera”, SPIE Photonics Europe Conference, Paper no. 6992-18 (2008).

[2]   T. Oggier, F. Lustenberger, and N. Blanc, “Miniature 3D TOF Camera for Real-Time Imaging”, Proceedings Volume 4021, ISBN 3-540-34743-7, pp. 212–21 (2006).