Estimating the relative pose and motion of a cooperative satellite using on-board sensors is a challenging problem. When the target satellite is non-cooperative, the problem becomes considerably harder, as little or no a priori information about its motion or structure may be available. In this work we develop robust algorithms for solving this problem under the assumption that only visual sensory information is available. Using two cameras mounted on a chaser satellite, the relative state of a target satellite, comprising its position, attitude, and rotational and translational velocities, is estimated. Our approach employs a stereoscopic vision system to track a set of feature points on the target spacecraft. The perspective projections of these points onto the two cameras constitute the observation model of an extended Kalman filter (EKF)-based filtering scheme. In the final part of this work, the relative motion filtering algorithm is made robust to uncertainties in the inertia tensor. This is accomplished by endowing the plain EKF with a maximum a posteriori identification scheme that determines the most probable inertia tensor among several available hypotheses.
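As a concrete illustration of the kind of observation model described above (a sketch only, not the paper's implementation), the stereoscopic measurement of a single feature point can be modeled as a pinhole perspective projection onto two laterally displaced cameras. The function name, focal length, and baseline below are illustrative assumptions; an EKF would linearize this nonlinear measurement function about the current state estimate.

```python
import numpy as np

def project_stereo(p_feature, f=0.02, baseline=0.5):
    """Pinhole projection of one feature point onto a left/right camera
    pair separated by `baseline` [m] along the x-axis (hypothetical
    parameters). `p_feature` is the point's position in the left-camera
    frame, [m]. Returns the (u, v) image coordinates on both cameras,
    i.e. one nonlinear measurement h(x) that an EKF linearizes."""
    x, y, z = p_feature
    u_left = f * x / z
    v_left = f * y / z
    # Right camera sees the same point shifted by the stereo baseline.
    u_right = f * (x - baseline) / z
    v_right = f * y / z
    return np.array([u_left, v_left, u_right, v_right])

# A feature point 10 m ahead of the chaser, offset 1 m and 0.5 m:
z_meas = project_stereo(np.array([1.0, 0.5, 10.0]))
```

The disparity `u_left - u_right` grows as the target gets closer, which is what lets a stereo rig recover range, and hence full relative position, from image-plane measurements alone.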