Current Lab: Temple Robotics and Artificial Intelligence Lab (TRAIL)

  • Semantic Localization and Mapping Using PHD Filter

We seek to detect multiple classes of objects, with semantic labels, in an open area using a Probability Hypothesis Density (PHD) filter. We account for the sensor's false alarms as well as the confusion between object classes, which we estimate from training data. The resulting estimates can be used directly for robot semantic localization, mapping, and related tasks.
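As an illustration, a minimal particle-based sketch of one PHD measurement update with a clutter (false-alarm) term and a class confusion matrix is shown below. The function name, argument names, and particle representation are ours for illustration, not the lab's implementation:

```python
import numpy as np

def phd_update(weights, classes, spatial_lik, obs_class, p_d, confusion, clutter):
    """One PHD update for a single measurement carrying a semantic label.

    weights:     (N,) particle weights of the PHD intensity
    classes:     (N,) integer class hypothesis of each particle
    spatial_lik: (N,) spatial likelihood g(z | x_n) of the measurement
    obs_class:   observed (possibly misclassified) class label of z
    p_d:         probability of detection
    confusion:   (K, K) matrix, confusion[i, j] = P(detect class j | true class i)
    clutter:     false-alarm intensity kappa(z)
    """
    # Joint likelihood: spatial likelihood times the chance that a particle's
    # hypothesized class is reported as obs_class.
    lik = spatial_lik * confusion[classes, obs_class]
    # Detected term, normalized by clutter plus the expected detection mass.
    detected = p_d * lik * weights
    denom = clutter + detected.sum()
    # Missed-detection term keeps mass for targets the sensor did not see.
    return detected / denom + (1.0 - p_d) * weights
```

With `p_d = 0` the update reduces to the missed-detection term and the weights are unchanged, which is a quick sanity check on the clutter normalization.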


  • Distributed Multi-Target Search and Tracking Using a Coordinated Team of Ground and Aerial Robots

We enable a heterogeneous team of ground and aerial robots to perform search and tracking tasks in a coordinated manner in an open, convex area containing an unknown, time-varying number of targets, either static or dynamic. Both types of robots are equipped with sensors that have a finite field of view and are subject to both false positive and false negative detections.

Simulated experiments indicate that coordinated teams outperform ground-only teams, and that the ratio of ground to aerial robots affects both the final tracking accuracy and the time needed to achieve it. We are also working on validating the method on physical aerial and ground robots.
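To make the sensor model concrete, the sketch below simulates one scan of a range-limited sensor with missed detections (false negatives) and Poisson-distributed false alarms (false positives) uniform over the field of view. All names and the circular-FOV assumption are ours for illustration, not the project's actual simulator:

```python
import numpy as np

def simulate_detections(targets, pose, fov_radius, p_d, lam_fa, noise_std, rng):
    """Simulate one scan from a sensor at `pose` with a circular field of view.

    targets:    (M, 2) true target positions
    p_d:        probability of detecting a target inside the field of view
    lam_fa:     expected number of false alarms per scan (Poisson rate)
    noise_std:  standard deviation of detection position noise
    """
    dets = []
    for t in targets:
        in_fov = np.linalg.norm(t - pose) <= fov_radius
        if in_fov and rng.random() < p_d:
            # True detection, corrupted by measurement noise.
            dets.append(t + rng.normal(0.0, noise_std, size=2))
    # False alarms: drawn uniformly over the circular field of view.
    for _ in range(rng.poisson(lam_fa)):
        r = fov_radius * np.sqrt(rng.random())
        th = 2.0 * np.pi * rng.random()
        dets.append(pose + r * np.array([np.cos(th), np.sin(th)]))
    return np.array(dets).reshape(-1, 2)
```

Targets outside the field of view never generate detections, so a search strategy must move the sensors to cover the area rather than rely on a single vantage point.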


  • Robot-Assisted Pedestrian Flow Regulation and Human-Robot Interaction

Well-designed robotic systems that assist people in crowded public environments such as shopping malls, museums, and campus buildings offer clear economic benefits. More importantly, in life-threatening emergencies, robot-assisted evacuation could save lives by reducing congestion and preventing crowd stampedes. We develop real-time, reconfigurable crowd control using interacting robots, which replaces costly infrastructure modification for local crowd regulation. Machine-learning-based methods are used to learn human-robot interaction models for effective robot navigation and pedestrian guidance in human environments.