Simultaneous Identification-Tracking System (SITS)

The goal of this project is to investigate and adapt transformational multi-modal CVIP imaging methods from bioscience to ballistic-missile discrimination, where feasible. An imaging sensor (e.g., an optical or IR camera) first captures an image of the object. The SIT system then recognizes the target's shape and material. Given a positive identification based on this information, the system selects one of two operational tracking scenarios: the intercept-trajectory scenario or the rendezvous-trajectory scenario. The project framework is shown in the following figure.
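The decision flow described above (capture, identify, then choose a tracking scenario) can be sketched as follows. This is an illustrative sketch only; the function names, the `classifier`/`choose_intercept` callables, and the 0.5 confidence threshold are assumptions, not the project's actual API.

```python
# Hypothetical sketch of the SITS decision flow: identify the target's
# shape and material from a captured frame, then pick a tracking scenario.
# All names and the 0.5 threshold are illustrative assumptions.

def sits_pipeline(frame, classifier, choose_intercept):
    """Run one identification-then-tracking decision on a captured frame.

    classifier(frame) -> (shape, material, confidence)
    choose_intercept(shape, material) -> True for the intercept-trajectory
    scenario, False for the rendezvous-trajectory scenario.
    """
    shape, material, confidence = classifier(frame)
    if confidence < 0.5:  # assumed threshold: no positive identification
        return ("no-identification", None)
    scenario = "intercept" if choose_intercept(shape, material) else "rendezvous"
    return (scenario, (shape, material))
```

For example, a classifier returning `("cone", "metal", 0.9)` with an intercept policy would yield the intercept-trajectory scenario.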

Image-guided Robotics Platform for Object Identification and Tracking

To test the interplay between object characteristics, trajectory, and the tracking mechanism, we established a vision-guided robotic system built around a robot arm whose control is vision-guided. The system is shown in Fig. 1. Objects were suspended from the ceiling by a fine wire, enabling free motion with controlled speed and pattern. The system is quite flexible in terms of range of motion and targets; speed is the main limitation, due to workspace constraints and the on-board robotics interface. Nevertheless, the system provides a controlled mechanism to test and validate the Kalman and particle filtering approaches to object tracking, based on system identification of the objects from LWIR images. Below we show some off-line experiments that illustrate the overall process of identification and tracking using vision-guided robotics.

Fig. 1: Vision-guided robotics platform for simultaneous object identification and tracking. A vision-guided robot arm tracks objects suspended from the ceiling at different speeds. The objects have different shapes, appearances, and temperatures, and move on random trajectories.

Experiment 1: Visual tracking using a robotic arm:

Linear motion is motion along a straight line, and can therefore be described using a single spatial dimension. It can be uniform, that is, with constant velocity (zero acceleration), or non-uniform, that is, with variable velocity (non-zero acceleration). For a particle (a point-like object) whose image position is (x(t), y(t)) at time t, a straight-line trajectory satisfies

y(t) = a·x(t) + b
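The line parameters a and b can be recovered from tracked image points by ordinary least squares. The sketch below is illustrative; the project may use a different estimator.

```python
# Fit the straight-line image trajectory y = a*x + b to tracked points
# by ordinary least squares (illustrative sketch, not the project's code).

def fit_line(xs, ys):
    """Return (a, b) minimizing sum((y - a*x - b)^2) over the points."""
    n = len(xs)
    mx = sum(xs) / n                      # mean of x coordinates
    my = sum(ys) / n                      # mean of y coordinates
    # Slope: covariance(x, y) / variance(x)
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx                       # intercept through the means
    return a, b
```

For points lying exactly on y = 2x + 1, `fit_line([0, 1, 2, 3], [1, 3, 5, 7])` recovers a = 2 and b = 1.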

Kalman filter tracking based on the above deterministic model is quite straightforward. Fig. 2 shows real data and Kalman filtering results.
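A constant-velocity Kalman filter over this linear model can be sketched as follows. This is a minimal one-dimensional illustration; the noise variances q and r are assumed values, not the project's tuning.

```python
# Minimal 1-D constant-velocity Kalman filter (illustrative sketch).
# State is [position, velocity]; only position is measured.
# The process noise q and measurement noise r are assumed values.

def kalman_track(measurements, dt=1.0, q=1e-3, r=0.25):
    """Return filtered position estimates for noisy position readings."""
    x = [measurements[0], 0.0]            # initial state from first reading
    P = [[r, 0.0], [0.0, 1.0]]            # initial covariance (velocity unknown)
    estimates = []
    for z in measurements:
        # Predict: x' = F x with F = [[1, dt], [0, 1]]
        xp = [x[0] + dt * x[1], x[1]]
        # P' = F P F^T + Q, with Q = q * I assumed
        Pp = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q,
               P[0][1] + dt * P[1][1]],
              [P[1][0] + dt * P[1][1],
               P[1][1] + q]]
        # Update with H = [1, 0]: innovation, gain, corrected state
        y = z - xp[0]                     # innovation
        S = Pp[0][0] + r                  # innovation variance
        K = [Pp[0][0] / S, Pp[1][0] / S]  # Kalman gain
        x = [xp[0] + K[0] * y, xp[1] + K[1] * y]
        P = [[(1 - K[0]) * Pp[0][0], (1 - K[0]) * Pp[0][1]],
             [Pp[1][0] - K[1] * Pp[0][0], Pp[1][1] - K[1] * Pp[0][1]]]
        estimates.append(x[0])
    return estimates
```

On a noiseless linear trajectory the filter quickly locks onto the true position and velocity; with noisy measurements it smooths the track according to the q/r trade-off.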

Fig. 2: Results of Kalman filter tracking of linear motion.

Fig. 3: Snapshots used to construct the linear trajectory in Fig. 2 (t is the snapshot number).

Fig. 3 shows snapshots of video tracking using the Kalman filter approach. The original trajectory is constructed from these snapshots, and an off-line Kalman filter is used to perform the tracking. At present, we are establishing the system that will allow real-time implementation of the vision-based system, as shown in Fig. 4.


Fig. 4: Platform of the system used to construct the trajectory in Fig. 3.

Research Team

- Aly A. Farag, CVIP Director, farag@cvip.louisville.edu
- Ahmed Shalaby, Research Assistant, amshal01@louisville.edu
- Ali Mahmoud, Research Assistant, ahmahm01@louisville.edu

Methods

1. Kalman filters
2. Particle filters
3. Mean shift approach

Publications

1. Ahmed Shalaby, Asem Ali, and Aly A. Farag, "Simultaneous Identification and Tracking of Moving Targets," Proc. 8th IEEE Workshop on Object Tracking and Classification Beyond the Visible Spectrum (OTCBVS), 2011, accepted to appear.
