Daniel S. Brown
I am a first-year computer science PhD student at UT Austin.
I work in the Personal Autonomous Robotics Lab (PeARL), under the advisement of Professor
Scott Niekum. My research focuses on learning from demonstration, in particular on obtaining confidence bounds for policies learned through inverse reinforcement learning.
Prior to coming to UT, I worked as a research scientist at the Air Force Research Lab's Information Directorate. I earned my master's degree in Computer Science from Brigham Young University under the advisement of Professor
Mike Goodrich. I also earned my bachelor's degree in Mathematics at Brigham Young University, where I completed an honors thesis under the advisement of Professor Sean Warnick.
Conference and Workshop Papers
M. Berger, L. Seversky, and D. S. Brown.
Classifying Swarm Behaviors via Compressive Subspace Learning.
Proceedings of the IEEE International Conference on Robotics and Automation (ICRA),
2016.

D. S. Brown, S.-Y. Jung, and M. A. Goodrich.
Balancing Human and Inter-Agent Influences for Shared Control of Bio-Inspired Collectives.
Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics,
October 2014, San Diego, California, USA.

D. S. Brown, S. Kerman, and M. A. Goodrich.
Limited Bandwidth Recognition of Collective Behaviors in Bio-Inspired Swarms.
Proceedings of the International Conference on Autonomous Agents and Multiagent Systems (AAMAS),
May 2014, Paris, France.

D. S. Brown, S. Kerman, and M. A. Goodrich.
Human-Swarm Interactions Based on Managing Attractors.
Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI),
March 2014, Bielefeld, Germany.

S.-Y. Jung, D. S. Brown, and M. A. Goodrich.
Shaping Couzin-like Torus Swarms through Coordinated Mediation.
Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics,
October 2013, Manchester, United Kingdom.

S. Kerman, D. S. Brown, and M. A. Goodrich.
Supporting Human Interaction with Robust Robot Swarms.
Proceedings of the International Symposium on Resilient Control Systems,
August 2012, Salt Lake City, Utah, USA.