Abstract
We propose a real-time mapping procedure for data matching in hybrid time-of-flight (ToF) multi-camera rig data fusion. Our approach exploits the depth information provided by the ToF camera to calculate the distance-dependent disparity between the two cameras that constitute the system. As a consequence, the non-co-centric binocular system behaves as a co-centric system with collinear optical axes between its sensors. The association between mapped and non-mapped image coordinates can be described by a set of look-up tables, which reduces the whole matching process to a simple indexing step and thus runs in real time. The experimental results show that, in addition to being straightforward and easy to compute, the proposed data matching approach is highly accurate, which facilitates further fusion operations.
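A minimal sketch of the idea described in the abstract, not the paper's actual implementation: the ToF depth Z at each pixel yields a distance-dependent disparity d = f·b/Z, and these per-pixel shifts can be precomputed into a look-up table so that the mapping between the two cameras reduces to an indexing step. The focal length, baseline, and helper names below are hypothetical placeholders, not the calibration values used in the paper.

```python
import numpy as np

# Hypothetical calibration values -- in practice these would come from
# calibrating the ToF / 2-D camera pair (focal length in pixels, baseline in metres).
F_PX = 580.0        # focal length of the 2-D camera, in pixels (assumed)
BASELINE_M = 0.05   # distance between the two optical centres (assumed)

def build_disparity_lut(depth_map, f_px=F_PX, baseline_m=BASELINE_M):
    """Precompute the distance-dependent disparity (in pixels) for every
    ToF pixel, d = f * b / Z; invalid (zero) depths map to zero shift."""
    with np.errstate(divide="ignore", invalid="ignore"):
        disparity = np.where(depth_map > 0.0, f_px * baseline_m / depth_map, 0.0)
    return np.rint(disparity).astype(np.int32)

def remap_with_lut(image, disparity_lut):
    """Shift each pixel horizontally by its precomputed disparity so the
    non-co-centric pair behaves like a co-centric one.  The per-frame work
    is pure array indexing; no geometry is recomputed."""
    h, w = disparity_lut.shape
    cols = np.arange(w)[None, :]                      # source column indices
    target = np.clip(cols + disparity_lut, 0, w - 1)  # shifted column indices
    rows = np.arange(h)[:, None]
    mapped = np.zeros_like(image)
    mapped[rows, target] = image[rows, cols]
    return mapped
```

In this sketch the LUT is rebuilt whenever the depth map changes; the key point, as in the paper, is that the association between mapped and non-mapped coordinates is stored once and then applied by simple indexing at run time.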
Original language | English |
---|---|
Article number | 6231641 |
Pages (from-to) | 425-436 |
Number of pages | 12 |
Journal | IEEE Journal on Selected Topics in Signal Processing |
Volume | 6 |
Issue number | 5 |
DOIs | |
Publication status | Published - 2012 |
Externally published | Yes |
Keywords
- 3D data fusion
- data matching
- mapping
- multi-sensor systems
- multimodal sensors
- sensor fusion
- time of flight (ToF)