Feature points evaluation on omnidirectional vision with a photorealistic fisheye sequence -- A report on experiments done in 2014

📅 2026-02-05
🤖 AI Summary
This work addresses the challenge of self-calibration in fisheye imagery, which often suffers from unreliable feature extraction due to the absence of an accurate projection model—a classic chicken-and-egg problem. To tackle this, we introduce PFSeq, the first realistic image sequence dataset specifically designed for zenith-pointing vehicular fisheye cameras in urban environments. Without relying on omnidirectional-specific algorithms, we systematically evaluate the performance of general-purpose feature detectors and descriptors, including SIFT and SURF, under uncalibrated conditions. Our experiments identify the most robust feature extraction strategy in the absence of precise calibration, offering practical guidance for initializing fisheye visual systems and enabling downstream applications such as visual odometry and stereo vision.
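The summary above describes comparing general-purpose detectors such as SIFT and SURF on uncalibrated fisheye frames. A common yardstick for such comparisons is a repeatability-style score: the fraction of keypoints from a reference image that are re-detected nearby in a second image. As a minimal sketch (the function name, tolerance, and plain `(x, y)` keypoint representation are illustrative assumptions, not the paper's actual evaluation code):

```python
import math

def repeatability(kps_ref, kps_test, tol=3.0):
    """Fraction of reference keypoints re-detected within `tol` pixels.

    `kps_ref` and `kps_test` are lists of (x, y) keypoint coordinates,
    e.g. already mapped into a common frame. This is a simplified
    repeatability-style metric, not the report's exact protocol.
    """
    if not kps_ref:
        return 0.0
    matched = 0
    for (xr, yr) in kps_ref:
        # Count a reference keypoint as repeated if any test keypoint
        # lies within the pixel tolerance.
        if any(math.hypot(xr - xt, yr - yt) <= tol for (xt, yt) in kps_test):
            matched += 1
    return matched / len(kps_ref)
```

In practice the keypoints would come from a detector (e.g. OpenCV's SIFT) run on consecutive fisheye frames, and the detector with the higher score under distortion would be preferred.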

📝 Abstract
What this report is: a scientific report contributing a detailed bibliography, comprehensive experiments, and a dataset, which we will call PFSeq for "Photorealistic Fisheye Sequence" and make available at https://doi.org/10.57745/DYIVVU. This work should be considered a draft; it was done in 2014 during my PhD thesis "Construction of 3D models from fisheye video data - Application to the localisation in urban area" [Mor16]. These results have never been published. The aim was to find the best feature detector and descriptor for fisheye images, in the context of self-calibration, with cameras mounted on the roof of a car and aiming at the zenith (in order to then perform fisheye visual odometry and stereovision in urban scenes). We face a chicken-and-egg problem: we cannot take advantage of an accurate projection model for optimal feature detection and description, yet it is precisely good features that we need to perform the calibration (i.e. to compute the accurate projection model of the camera). What this report is not: it does not contribute a new feature algorithm. It does not compare standard feature algorithms to algorithms designed for omnidirectional images (unfortunately). It has not been peer-reviewed. The discussions have been translated and enhanced, but the experiments have not been run again and the report has not been updated to follow the evolution of the state of the art (read this as a 2014 report).
Problem

Research questions and friction points this paper is trying to address.

fisheye images
feature detection
camera calibration
omnidirectional vision
self-calibration
Innovation

Methods, ideas, or system contributions that make the work stand out.

fisheye vision
feature evaluation
photorealistic dataset
camera self-calibration
omnidirectional imaging
Julien Moreau
PhD, Laboratoire Charles Fabry, Biophotonics group, Institut d'Optique Graduate School
Optical instrumentation, plasmonics, biophotonics
S. Ambellouis
Université Gustave Eiffel, COSYS/LEOST, Villeneuve d’Ascq, France
Yassine Ruichek
Université de technologie de Belfort-Montbéliard, CIAD, Belfort, France