Sadegh


I am a Ph.D. candidate in the Department of Computer Science at the University of Texas at Austin. I work under the supervision of Prof. Joydeep Biswas at the Autonomous Mobile Robotics Laboratory (AMRL).

I am interested in safe navigation for mobile robots through accurate motion models and competency-aware perception algorithms.

I received my M.S. in Computer Science from the University of Massachusetts Amherst in spring 2018 and my B.S. in Electrical Engineering with a major in Control Systems from the University of Tehran in spring 2015.

[CV] [LinkedIn]








IVOA: Introspective Vision for Obstacle Avoidance

Vision, as an inexpensive yet information-rich sensor, is commonly used for perception on autonomous mobile robots. However, vision systems are prone to errors from sources such as image saturation, blur, and texture-less scenes. In this project, we develop an approach for self-supervised learning of a model that predicts failures of stereo vision-based obstacle avoidance systems. The learned model predicts the probability of different failure types (false positives and false negatives) and pinpoints the location of the error in the input image.
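As a rough illustration of the idea (a minimal sketch under my own assumptions, not the published IVOA implementation), a small network can take image patches around candidate obstacle detections and output a probability for each perception outcome, with training labels generated automatically by comparing the stereo system's output against a more reliable depth reference:

    import torch
    import torch.nn as nn

    # Hypothetical patch classifier: predicts the probability of each
    # perception outcome for a crop around a candidate obstacle.
    FAILURE_CLASSES = ["true_pos", "true_neg", "false_pos", "false_neg"]

    class FailurePredictor(nn.Module):
        def __init__(self, num_classes=len(FAILURE_CLASSES)):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(32, num_classes)

        def forward(self, patch):
            # patch: (B, 3, H, W) image crop around a candidate obstacle
            x = self.features(patch).flatten(1)
            return torch.softmax(self.head(x), dim=1)

    # Self-supervised labeling (assumption): the stereo detector's verdict
    # for each patch is compared against reference depth to assign one of
    # TP/TN/FP/FN as the training label.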





Friction-Based Kinematic Model for Skid-Steer Wheeled Mobile Robots

Skid-steer drive systems are widely used in mobile robot platforms. By their nature, such systems are subject to significant slippage and skidding during normal operation. The ability to predict and compensate for such slippage in the forward kinematics of these robots is of great importance and provides the means for accurate control and safe navigation. In this work, we propose a new kinematic model for skid-steer wheeled mobile robots (SSWMRs) that predicts slip by leveraging the wheel-ground contact model.
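For intuition only (a simplified sketch under my own assumptions, not the friction-based model from the paper), the ideal skid-steer kinematics can be written with per-side slip terms, which a wheel-ground contact model could supply:

    def body_velocity(omega_l, omega_r, r, b, slip_l=0.0, slip_r=0.0):
        """Map left/right wheel speeds (rad/s) to a body twist (v, omega).

        r      : wheel radius (m)
        b      : half of the track width (m)
        slip_* : longitudinal slip ratios in [0, 1); 0 means no slip.
        """
        vl = r * omega_l * (1.0 - slip_l)   # effective left-side track speed
        vr = r * omega_r * (1.0 - slip_r)   # effective right-side track speed
        v = (vr + vl) / 2.0                 # forward velocity
        omega = (vr - vl) / (2.0 * b)       # yaw rate
        return v, omega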



Inverse Kinematics Based Human Mimicking System using Skeletal Tracking Technology

Mimicking is a fast and user-friendly way to teach humanoid robots human-like motions. This project presents a general and efficient inverse kinematics based human mimicking system that maps human upper-limb motions to the robot's joints safely and smoothly. A Microsoft Kinect sensor is used for natural perception of human motions.
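As a toy illustration (an assumed formulation, not the paper's exact solver), a damped least-squares inverse kinematics step can move the robot's arm joints toward a tracked human wrist position while keeping each update small and well-behaved near singularities:

    import numpy as np

    def dls_ik_step(q, jacobian, fk, target, damping=0.05, step=0.5):
        """One damped least-squares IK update (illustrative assumption).

        q        : current joint angles, shape (n,)
        jacobian : function q -> (3, n) position Jacobian of the end effector
        fk       : function q -> (3,) end-effector position (forward kinematics)
        target   : (3,) desired position, e.g. a wrist tracked by the Kinect
        """
        J = jacobian(q)
        err = target - fk(q)
        # Damped pseudo-inverse: J^T (J J^T + lambda^2 I)^{-1} @ error
        dq = J.T @ np.linalg.solve(J @ J.T + (damping ** 2) * np.eye(3), err)
        return q + step * dq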


IVOA: Introspective Vision for Obstacle Avoidance
S. Rabiee and J. Biswas
Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '19).

A Friction-Based Kinematic Model for Skid-Steer Wheeled Mobile Robots
S. Rabiee and J. Biswas
Proceedings of the 2019 IEEE International Conference on Robotics and Automation (ICRA '19).

Inverse Kinematics Based Human Mimicking System using Skeletal Tracking Technology
M. Alibeigi, S. Rabiee and M.N. Ahmadabadi
Springer Journal of Intelligent & Robotic Systems, 2017.

UMass MinuteBots 2018 Team Description Paper
K. Vedder, E. Schneeweiss, S. Rabiee, S. Nashed, S. Lane, J. Holtz, J. Biswas, D. Balaban
RoboCup 2018.

UMass MinuteBots 2017 Team Description Paper
K. Vedder, E. Schneeweiss, S. Rabiee, S. Nashed, S. Lane, J. Holtz, J. Biswas, D. Balaban
RoboCup 2017.




Campus Jackal

The Jackal is used for research on campus-scale long-term autonomy at UMass. It is equipped with a stereo vision system, inertial sensors, and an Intel NUC PC for onboard computation. I have been in charge of developing this robot on both the software and hardware sides, and I use it extensively in my research projects. I have written various algorithms, ranging from low-level kinematics to obstacle avoidance and path planning, to get this robot to navigate the campus autonomously.


F1/10 Autonomous Car

The F1/10 Race Car is a fully autonomous, low-power, and portable wheeled mobile robot with an Ackermann drive system. At the Autonomous Mobile Robotics Laboratory (AMRL), we use this robot as a platform for multi-robot planning research as well as for teaching robotics. It is equipped with a Jetson TX2, a YDLidar, and an Intel RealSense D435. I have been in charge of developing and maintaining both the software and hardware for this robot.



UMass MinuteBots

At AMRL, we have built a team of soccer robots for the RoboCup Small Size League (SSL). We use this platform to implement and stress test our research on actual robots in a competitive environment. As a member of the UMass MinuteBots team, I have developed software for motion model learning and state estimation of the robots, and have been in charge of hardware development.



Nao

Nao is a programmable humanoid robot that is widely used in many areas of robotics research. I have used this robot as a platform for implementing my research on learning from demonstration (LfD). I developed algorithms for solving the inverse kinematics problem for a humanoid robot, enabling it to mimic a human instructor's motion as closely as possible while taking into account the constraints imposed by the robot's configuration.


University of Texas at Austin [expected graduation: May 2021]
Ph.D. in Computer Science - Advisor: Prof. Joydeep Biswas

University of Massachusetts Amherst (2015 - 2018)
M.S. in Computer Science - Advisor: Prof. Joydeep Biswas

University of Tehran (2011 - 2015)
B.S. in Electrical Engineering - Advisor: Prof. Majid Nili Ahmadabadi


Email: srabiee[at]cs.utexas.edu