I am a Ph.D. candidate working with Anca Dragan and Claire Tomlin. I am broadly interested in robotic motion planning, controls, and physical human-robot interaction. At Berkeley, I get to collaborate with some amazing people in the InterACT Lab and the Hybrid Systems Lab. I am gratefully supported by the NSF Graduate Research Fellowship.
J.F. Fisac*, A. Bajcsy*, S. Herbert, D. Fridovich-Keil, S. Wang, C.J. Tomlin, A.D. Dragan. Probabilistically Safe Robot Planning with Confidence-Based Human Predictions. Robotics: Science and Systems (RSS), 2018.
A. Bajcsy, D.P. Losey, M.K. O'Malley, and A.D. Dragan. Learning from Physical Human Corrections, One Feature at a Time. International Conference on Human-Robot Interaction (HRI), 2018.
A. Bajcsy*, D.P. Losey*, M.K. O'Malley, and A.D. Dragan. Learning Robot Objectives from Physical Human-Robot Interaction. Conference on Robot Learning (CoRL), 2017.
A. Bateman, O. Zhao, A. Bajcsy, M. Jennings, B. Toth, A. Cohen, E. Horton, A. Khattar, R. Kuo, F. Lee, M.K. Lim, L. Migasiuk, R. Renganathan, A. Zhang, and M.A. Oliveira. A User-Centered Design and Analysis of an Electrostatic Haptic Touchscreen System for Students with Visual Impairments. International Journal of Human-Computer Studies, 2017.
E.L. Horton, R. Renganathan, B.N. Toth, A.J. Cohen, A.V. Bajcsy, A. Bateman, M.C. Jennings, A. Khattar, R.S. Kuo, F.A. Lee, M.K. Lim, L.W. Migasiuk, A. Zhang, O.K. Zhao, and M.A. Oliveira. A review of principles in design and usability testing of tactile technology for individuals with visual impairments. Assistive Technology, 2016.
A. Bajcsy, Y.S. Li-Baboud, and M. Brady. Systematic measurement of marginal mark types on voting ballots. NIST IR 8069, 2015.
A. Bajcsy, Y.S. Li-Baboud, and M. Brady. Depicting Web images for the blind and visually impaired. SPIE Newsroom, 2013.