| Envisioned goal: the original envisaged aim was to assist the work of a future robotics programmer/engineer.

| The unified tool developed for performance comparison serves as a prototype and a platform. It also automates every feasible task required to conduct a study in this direction.

| A prototype, as it implements a procedure for validating the matches obtained from the Mono SLAM framework.

| Secondly, a platform for future work, as there is a plethora of ways in which monocular SLAM systems have been formulated. Using the same blueprint, similar evaluation utilities can be designed, or for that matter integrated into the present solution.

| Therefore, the requirement of the problem posed here, and also its specification, is to:

'Devise a versatile procedure for performance comparison of feature detectors when typically used in conjunction with visual SLAM (monocular)'

| All Visual SLAM systems employ features for use as visual landmarks in the map (after they have been extracted and tracked in successive images).

| Thus, the feature detectors of the SLAM system must extract points which are highly robust, because the accuracy of the map in turn depends on them.

| Using an inappropriate feature detector for the SLAM system under a given situation can not only yield bad maps but also cause poor localization within them. This is because both the map and the localization are interdependent and are computed simultaneously (the chicken-or-egg corollary).

| For a robot working in the real world the results can be catastrophic!

| I. Initial Phase: directly after the feature tracking is done, i.e. after features are extracted and then matched in successive frames by predicting the motion of the camera and using it to find corresponding matches. (gives a general solution)

| II. Final Phase: after each complete SLAM cycle finishes, i.e. after the filter update to the prediction of the robot pose and location is also made. (gives the specific solution used in my investigation)
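The two phases can be pictured as verification hooks inside a single SLAM cycle. The sketch below is only an illustrative outline under stated assumptions: every name in it (slam_cycle, verify, the stubbed tracking and filter update) is a hypothetical stand-in, not the actual Mono SLAM code.

```python
# Sketch of where the two verification phases sit in one SLAM cycle.
# All helpers here are trivial stubs, not the real tracking/EKF code.

def verify(matches, phase, log):
    # Record at which phase the matches were checked.
    log.append((phase, len(matches)))
    return matches

def slam_cycle(frame, state, log):
    prediction = state["pose"]               # predicted camera motion (stub)
    matches = [(f, prediction) for f in frame]  # feature tracking (stub)

    # I. Initial Phase: directly after feature tracking is done.
    matches = verify(matches, "initial", log)

    state["pose"] += 1                       # filter update of pose/map (stub)

    # II. Final Phase: after the complete SLAM cycle finishes.
    matches = verify(matches, "final", log)
    return state
```

The point of the sketch is only the placement of the two hooks: the Initial Phase sees matches before the filter has corrected the pose prediction, while the Final Phase sees them after the update.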
SOLUTION

| RoboRealm: a powerful machine and robot vision software [3] for capturing images and corners. Available for download at this link:
http://www.roborealm.com/registration/index.php

| An offline monocular SLAM system based on Davison's framework and utilizing the inverse depth formulation of [22]. The Mono SLAM system is available for download at this link:
http://www.robots.ox.ac.uk/~SSS06/Website/Practicals/SSS06.Prac2.MonocularSLAM.tar.gz

| In addition to this, a module for obtaining just Lowe's SIFT features was also unified in the present solution.

| The Match Verification component and mechanism, which is self-designed (explained in the next subsection).

| Lastly, all the modules were integrated by a single application coded in VC++. The same has been christened as TestBed.

SOLUTION: MATCH VERIFICATION FOR PERFORMANCE COMPARISON

| The matches are verified by using a distance threshold from the epipolar line.

| The Epipolar Geometry here is that applied to the case of Monocular Stereo, i.e. as depicted in the figure, a single camera moves from left to right (or from the initial position (Oi) to the new one (On)) while observing the same scene (points X, X1, X2, X3).

| From Epipolar Geometry we know that correspondences between non-planar images must lie on the epipolar lines (ideally).

| Thus, in the current implementation, a match is judged as correct if its distance from the epipolar line is less than 2 pixels.
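This verification rule can be sketched as follows. It is a minimal illustration, not the TestBed implementation: the fundamental matrix F and the sample points are invented values, and in the real system the epipolar geometry comes from the estimated camera motion.

```python
import math

def epipolar_distance(F, x1, x2):
    """Pixel distance of point x2 from the epipolar line l' = F * x1.

    F  : 3x3 fundamental matrix (nested lists)
    x1 : feature in the first image, homogeneous (x, y, 1)
    x2 : candidate match in the second image, homogeneous (x, y, 1)
    """
    # Epipolar line in the second image: l' = F x1 = (a, b, c)
    a, b, c = (sum(F[i][j] * x1[j] for j in range(3)) for i in range(3))
    # Point-to-line distance: |a x + b y + c| / sqrt(a^2 + b^2)
    return abs(a * x2[0] + b * x2[1] + c) / math.sqrt(a * a + b * b)

def is_match_correct(F, x1, x2, threshold=2.0):
    # A match is judged correct if it lies within `threshold` pixels
    # of the epipolar line (2 pixels in the current implementation).
    return epipolar_distance(F, x1, x2) < threshold
```

For example, with the fundamental matrix of a pure sideways translation, the epipolar line through a feature is the horizontal line at its row, so a match one pixel off that row passes the 2-pixel test while a match five pixels off fails it.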
Epipolar line: [figure showing the epipolar geometry of the moving camera]
SOLUTION: CALCULATING DISTANCE OF A MATCH FOR MATCH VERIFICATION

REFERENCES AND OVERALL BACKGROUND RESEARCH MATERIAL

1. Davison, A. J., Reid, I. D., Molton, N. D., Stasse, O.: MonoSLAM: Real-Time Single Camera SLAM. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(6), 2007, pp. 1052-1067.
2. Hähnel, D., Burgard, W., Fox, D., Thrun, S.: An Efficient FastSLAM Algorithm for Generating Maps of Large-Scale Cyclic Environments from Raw Laser Range Measurements. In: IEEE/RSJ Int. Conf. on Intelligent Robots & Systems, Las Vegas, NV, USA, 2003.
3. Leonard, J. J., Durrant-Whyte, H. F.: Mobile Robot Localization by Tracking Geometric Beacons. IEEE Transactions on Robotics and Automation, 7, 1991.
4. Triebel, R., Burgard, W.: Improving Simultaneous Mapping and Localization in 3D Using Global Constraints. In: National Conference on Artificial Intelligence, 2005.
5. Biber, P., Andreasson, H., Duckett, T., Schilling, A.: 3D Modelling of Indoor Environments by a Mobile Robot with a Laser Scanner and Panoramic Camera. In: IEEE/RSJ Int. Conf. on Intelligent Robots & Systems, 2004.
6. Little, J., Se, S., Lowe, D.: Vision-Based Mobile Robot Localization and Mapping Using Scale-Invariant Features. In: IEEE Int. Conf. on Robotics & Automation, 2001.
7. Sim, R., Elinas, P., Griffin, M., Little, J. J.: Vision-Based SLAM Using the Rao-Blackwellised Particle Filter. In: IJCAI Workshop on Reasoning with Uncertainty in Robotics, Edinburgh, Scotland, 2005.
8. Lowe, D. G.: Distinctive Image Features from Scale-Invariant Keypoints. Int. Journal of Computer Vision, 60(2), 2004.