With the onset of autonomous spacecraft formation flying missions, the ability of satellites to autonomously navigate relative to other space objects has become essential. Relative navigation requires relative measurements, which must be acquired and then fused through relative state estimation. An efficient way to generate such measurements is with vision-based sensors. Cameras are passive, low-energy, and information-rich sensors that do not actively interact with other space objects. However, pointing a camera with a conventional field of view at another space object requires substantial a priori initialization data; in particular, dedicated attitude maneuvers are needed, which may interfere with the satellite's main mission. One way to overcome these difficulties is to use an omnidirectional vision sensor, which provides a 360-degree horizontal field of view.
In this presentation, we will discuss the development of an omnidirectional vision sensor for satellites, which can be used for relative navigation, formation flying, and space situational awareness. The study covers the derivation of the measurement equations, the dynamical models, and the state estimation algorithms, as well as an experimental investigation conducted at the Distributed Space Systems Lab.
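As a minimal illustration of the kind of measurement equation involved (a generic bearing model assumed here for exposition, not necessarily the specific formulation of this study), an omnidirectional sensor typically provides azimuth and elevation angles to a target, computed from the relative position vector expressed in the sensor frame:

\[
\mathbf{h}(\boldsymbol{\rho}) =
\begin{bmatrix} \alpha \\ \varepsilon \end{bmatrix}
=
\begin{bmatrix}
\operatorname{atan2}(\rho_y,\,\rho_x) \\
\arcsin\!\left(\rho_z / \lVert \boldsymbol{\rho} \rVert\right)
\end{bmatrix}
+ \boldsymbol{\nu},
\qquad
\boldsymbol{\rho} = \begin{bmatrix} \rho_x & \rho_y & \rho_z \end{bmatrix}^{\mathsf{T}},
\]

where \(\alpha\) is the azimuth (unrestricted over the full 360 degrees), \(\varepsilon\) the elevation, \(\boldsymbol{\rho}\) the relative position of the observed object in the sensor frame, and \(\boldsymbol{\nu}\) the measurement noise. Such bearing measurements are then fused with the relative dynamical model by a state estimation filter.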