Space Navigation with an Omni-Directional Vision Sensor

Wednesday 13/03/2019
  • Omri Kaufman
  • Work towards MSc degree under the supervision of Prof. Pini Gurfil
  • Classroom 165, ground floor, Library, Aerospace Eng.
  • TASP - Autonomous Systems and Robotics
  • Technion – Israel Institute of Technology
  • The talk will be given in English

With the onset of autonomous spacecraft formation flying missions, the ability of satellites to autonomously navigate relative to other space objects has become essential. To implement relative navigation, relative measurements must be taken and fused using relative state estimation. An efficient way to generate such information is to use vision-based measurements. Cameras are passive, low-energy, information-rich sensors that do not actively interact with other space objects. However, pointing cameras with a conventional field of view at other space objects requires substantial a priori initialization data; in particular, dedicated attitude maneuvers are needed, which may interfere with the satellite's main mission. One way to overcome these difficulties is to use an omnidirectional vision sensor, which has a 360-degree horizontal field of view.

In this presentation, we will discuss the development of an omnidirectional vision sensor for satellites, which can be used for relative navigation, formation flying, and space situational awareness. The study includes the development of the measurement equations, dynamical models, and state estimation algorithms, as well as an experimental investigation conducted at the Distributed Space Systems Lab.
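As a rough illustration of the ideas above (not material from the talk itself), the sketch below shows how azimuth/elevation bearings taken by an omnidirectional sensor could be fused into a relative-position estimate with a single extended Kalman filter update. All function names, geometry, and noise values are assumptions made for this example and do not reflect the speaker's measurement equations or estimation algorithms.

# Illustrative sketch only: bearing measurements from a 360-degree-horizontal-FOV
# sensor, fused into a relative-position estimate with one EKF-style update.
# Every symbol and numeric value here is an assumption for the example.
import numpy as np

def bearing_measurement(rel_pos):
    """Azimuth and elevation of a relative position vector in the sensor frame."""
    x, y, z = rel_pos
    az = np.arctan2(y, x)                       # horizontal angle, full 360 degrees
    el = np.arctan2(z, np.hypot(x, y))          # elevation above the sensor plane
    return np.array([az, el])

def bearing_jacobian(rel_pos):
    """Jacobian of the azimuth/elevation measurement w.r.t. relative position."""
    x, y, z = rel_pos
    rho2 = x**2 + y**2
    rho = np.sqrt(rho2)
    r2 = rho2 + z**2
    return np.array([
        [-y / rho2,           x / rho2,            0.0],
        [-x * z / (rho * r2), -y * z / (rho * r2), rho / r2],
    ])

def ekf_update(x_est, P, z_meas, R):
    """Single EKF measurement update of the relative-position estimate."""
    H = bearing_jacobian(x_est)
    innov = z_meas - bearing_measurement(x_est)
    innov[0] = (innov[0] + np.pi) % (2 * np.pi) - np.pi   # wrap azimuth residual
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x_est + K @ innov, (np.eye(3) - K @ H) @ P

# Example: refine a coarse relative-position guess with one noisy bearing pair.
truth = np.array([120.0, -40.0, 15.0])                    # metres, sensor frame (assumed)
x0, P0 = np.array([100.0, -30.0, 10.0]), np.eye(3) * 400.0
R = np.diag([np.deg2rad(0.2)**2] * 2)                     # assumed bearing noise
z = bearing_measurement(truth) + np.random.normal(0.0, np.deg2rad(0.2), 2)
x1, P1 = ekf_update(x0, P0, z, R)
print(x1)

A single bearing pair cannot resolve range on its own; in this sketch the prior covariance P0 supplies that information, whereas a real relative-navigation filter would accumulate observability over a sequence of measurements and a dynamical model.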

Light refreshments will be served before the lecture