Date : 21-01-12 09:39
   PS-10-김성백.pdf (718.6K)
Vision Aided Inertial Navigation System using Omnidirectional Images for Space Exploration
Seong-Baek Kim, Jin-il Park, Se-Hwan Chung, Kye-Seok Lee, Jean-Charles Bazin, Kyung-Ho Choi*


The increasing demand for advanced navigation aids in space exploration has led to a growing number of applications. However, satellite-based navigation is generally unavailable in outer-space exploration; for this reason, space navigation systems are often integrated with inertial sensors and odometer data, which offer complementary features. Additional aiding sensors such as vision sensors are now being developed, and their use in integrated navigation systems for difficult environments is gaining popularity, which also makes them attractive for outer-space exploration missions. Among vision sensors, omnidirectional vision has been adopted because it captures a large viewing angle of up to 360 degrees. On the other hand, many problems remain, such as complex mathematical models or applicability only to low-dynamic environments. Accurate and reliable space-vehicle navigation is still considered a challenging problem, and significant improvements are therefore needed. This paper addresses space navigation accuracy where satellite navigation signals are not available. Our approach stems from the basic idea that vanishing points can be used to compute the attitude of a vehicle, while translational motion can be computed from the epipolar geometry of stereo-vision data. More specifically, translational motion data from the vision odometer are fed back into a Kalman filter to estimate and compensate for IMU errors and improve performance. Experimental results are presented to show the robustness of the proposed method, which can be used to reduce the position and velocity errors caused by an IMU.
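The feedback loop described above (vision-derived translational motion used to estimate and compensate IMU error through a Kalman filter) can be sketched in miniature. The snippet below is an illustrative assumption, not the paper's actual model: it uses a one-dimensional velocity error state and hand-picked noise parameters `q` and `r`, with the vision odometer treated as an unbiased measurement.

```python
# Minimal sketch of loosely coupled IMU/vision fusion: the difference
# between IMU velocity and vision-odometer velocity is treated as a
# measurement of the IMU error, which a 1-D Kalman filter tracks and
# feeds back to correct the IMU output. State dimension and noise
# values (q, r) are illustrative assumptions only.

def fuse_imu_vision(imu_vel, vision_vel, q=1e-3, r=1e-2):
    """Kalman filter on the scalar error state e = imu_vel - true_vel."""
    e_hat, p = 0.0, 1.0               # error estimate and its covariance
    corrected = []
    for v_imu, v_vis in zip(imu_vel, vision_vel):
        p += q                        # predict: model the error as a random walk
        z = v_imu - v_vis             # vision measurement of the IMU error
        k = p / (p + r)               # Kalman gain
        e_hat += k * (z - e_hat)      # update the error estimate
        p *= (1.0 - k)
        corrected.append(v_imu - e_hat)  # feed the error back to compensate
    return corrected

# Example: an IMU whose velocity carries a constant 0.5 m/s bias.
true_vel = [1.0] * 50
imu_vel = [v + 0.5 for v in true_vel]   # biased IMU readings
out = fuse_imu_vision(imu_vel, true_vel)
```

After a few updates the estimated error converges to the 0.5 m/s bias, so the corrected velocity approaches the true 1.0 m/s; a full system would use a multi-dimensional error state covering position, velocity, and attitude.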

Keywords: space exploration, inertial sensor, omnidirectional vision, vanishing points, Kalman filter