Hydra: A Framework and Algorithms for Mixed-Initiative UAV-Assisted Search and Rescue
Ephrat Bitton1, Siamak Faridani2 and Ken Goldberg
National Science Foundation and Air Force Office of Scientific Research
We are developing Hydra, a testbed and algorithms for collaborative human and automated decision making in the context of outdoor search and rescue. Hydra is a networked simulation tool that allows human and automated agents operating under different assumptions to share control over a set of camera-equipped UAVs, with the goal of locating a hidden subject as quickly as possible. The agents are modeled on a pre-defined hierarchy of authority, and the search space is characterized by varying degrees of obstruction. Search iterates a cycle of four steps: (1) each agent generates image requests based on its individual probability density function (pdf); (2) Hydra collects the requests and computes an optimal assignment of images to the UAVs; (3) Hydra processes the resulting image data and reports whether the subject was detected; and (4) all agents update their pdfs.

In simulations of a scenario with three agents and one UAV, our method performs 57.7% better than a theoretical upper bound on the performance of a single agent with a single UAV. Future work will include experiments using a robotic camera to find and photograph birds in a natural environment in conjunction with the CONE Welder project.

Professors Claire Tomlin, Shankar Sastry, and Pravin Varaiya provided feedback on this project.
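The four-step cycle above can be sketched in simplified form on a discretized search grid. This is only an illustrative sketch, not the authors' implementation: the grid size, detection probability, random priors, and the first-come-first-served stand-in for Hydra's optimal request assignment are all assumptions introduced here.

```python
import numpy as np

GRID = 10        # search area discretized into GRID x GRID cells (assumption)
P_DETECT = 0.9   # assumed probability a requested image reveals the subject

def make_prior(rng):
    """Each agent maintains its own pdf over the grid (random prior here)."""
    p = rng.random((GRID, GRID))
    return p / p.sum()

def request_frame(pdf):
    """Step 1: an agent requests an image of its most probable cell."""
    return np.unravel_index(np.argmax(pdf), pdf.shape)

def assign_requests(requests, n_uavs):
    """Step 2: assign requested cells to UAVs. Hydra computes an optimal
    assignment; this stand-in simply serves distinct requests in order."""
    distinct = list(dict.fromkeys(requests))
    return distinct[:n_uavs]

def observe(cell, subject_cell, rng):
    """Step 3: simulated image processing reports detected / not detected."""
    return cell == subject_cell and rng.random() < P_DETECT

def bayes_update_negative(pdf, cell):
    """Step 4: after a miss at `cell`, downweight that cell and renormalize."""
    pdf = pdf.copy()
    pdf[cell] *= (1.0 - P_DETECT)
    return pdf / pdf.sum()

def search(n_agents=3, n_uavs=1, max_cycles=500, seed=0):
    """Run search cycles until the subject is found or the budget runs out."""
    rng = np.random.default_rng(seed)
    subject = tuple(rng.integers(0, GRID, size=2))
    pdfs = [make_prior(rng) for _ in range(n_agents)]
    for cycle in range(1, max_cycles + 1):
        requests = [request_frame(p) for p in pdfs]
        for cell in assign_requests(requests, n_uavs):
            if observe(cell, subject, rng):
                return cycle  # number of cycles until detection
            pdfs = [bayes_update_negative(p, cell) for p in pdfs]
    return None  # subject not found within the cycle budget
```

In this sketch a negative observation shifts each agent's probability mass away from the imaged cell, so successive requests naturally spread over the grid; a faithful implementation would replace `assign_requests` with the optimal assignment Hydra computes and incorporate obstruction-dependent detection probabilities.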
Figure 1: Three UAVs with mounted cameras are controlled by a sequence of frame requests from distributed human and automated agents.
Figure 2: Summary of Hydra’s information architecture
- E. Bitton and K. Goldberg, "Hydra: A Framework and Algorithms for Mixed-Initiative UAV-Assisted Search and Rescue," Proceedings of the 4th IEEE Conference on Automation Science and Engineering, 2008.
1 IEOR PhD Student
2 IEOR PhD Student