This chapter describes the detection of keypoints and the definition of descriptors for them; a keypoint together with a descriptor defines a feature. The examples given are SIFT, SURF, and ORB, and we introduce BRIEF and FAST as the building blocks of ORB. We discuss the invariance of features in general, and of the given examples in particular. The chapter also covers three ways of tracking features: the KLT tracker, the particle filter, and the Kalman filter.
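To make the keypoint/descriptor distinction concrete, here is a minimal NumPy sketch of the BRIEF idea named above: a descriptor is a string of bits, each bit recording one intensity comparison between a random pair of points in a patch around the keypoint, and descriptors are matched by Hamming distance. The function names, patch size, and bit count are illustrative choices, not the parameters used in the book.

```python
import numpy as np

def brief_descriptor(image, keypoint, n_bits=128, patch=15, seed=42):
    """Toy BRIEF-style descriptor: binary intensity comparisons between
    random point pairs in a (patch x patch) window around the keypoint.
    A fixed seed makes all keypoints share one sampling pattern."""
    rng = np.random.default_rng(seed)
    half = patch // 2
    y, x = keypoint
    # random offset pairs (p_i, q_i) inside the patch
    p = rng.integers(-half, half + 1, size=(n_bits, 2))
    q = rng.integers(-half, half + 1, size=(n_bits, 2))
    # bit i is 1 iff the intensity at p_i is less than the intensity at q_i
    bits = (image[y + p[:, 0], x + p[:, 1]] <
            image[y + q[:, 0], x + q[:, 1]]).astype(np.uint8)
    return bits

def hamming(d1, d2):
    """Matching cost between two binary descriptors."""
    return int(np.count_nonzero(d1 != d2))
```

Matching then amounts to finding, for each descriptor in one image, the descriptor in the other image with the smallest Hamming distance; identical patches yield distance 0.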
The described generation of 3D flow vectors has been published in [J.A. Sanchez, R. Klette, and E. Destefanis. Estimating 3D flow for driver assistance applications. In Proc. Pacific-Rim Symposium Image Video Technology, LNCS 5414, pp. 237–248, 2009].
See [Z. Song and R. Klette. Robustness of point feature detection. In Proc. Computer Analysis Images Patterns, LNCS 8048, pp. 91–99, 2013].
See [Y. Zeng and R. Klette. Multi-run 3D streetside reconstruction from a vehicle. In Proc. Computer Analysis Images Patterns, LNCS 8047, pp. 580–588, 2013].
The presentation follows the Lucas–Kanade tracker introduction by T. Svoboda on cmp.felk.cvut.cz/cmp/courses/Y33ROV/Y33ROV_ZS20082009/Lectures/Motion/klt.pdf.
We use a (practically acceptable) approximation of the Hessian: instead of the second-order derivatives, we apply products of the first-order derivatives.
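This approximation can be sketched in a few lines of NumPy: the 2×2 Hessian over a patch is replaced by summed products of the first-order derivatives, so no second-order derivatives are computed. The function name is ours, and `np.gradient` stands in for whatever derivative filter the tracker actually uses.

```python
import numpy as np

def approx_hessian(patch):
    """Approximate the 2x2 Hessian over an image patch by summed
    products of first-order derivatives (no second-order derivatives).
    np.gradient returns the derivative along rows (y) first, then columns (x)."""
    iy, ix = np.gradient(patch.astype(float))
    return np.array([[np.sum(ix * ix), np.sum(ix * iy)],
                     [np.sum(ix * iy), np.sum(iy * iy)]])
```

A convenient side effect is that the resulting matrix is symmetric and positive semidefinite by construction, which keeps the tracker's least-squares updates well behaved.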
A particle filter for lane detection was suggested in [S. Sehestedt, S. Kodagoda, A. Alempijevic, and G. Dissanayake. Efficient lane detection and tracking in urban environments. In Proc. European Conf. Mobile Robots, pp. 126–131, 2007].
This is the retinal point where lines parallel to the translatory motion meet, assuming a corresponding direction of gaze.
- Feature Detection and Tracking
- Springer London