Visual Servoing Project
This was the final project for ECE4560, Intro to Automation and Robotics. My group used computer vision and the manipulator Jacobian to make a robot arm track an ArUco tag. The robot uses OpenCV and a camera mounted on the end effector to detect the tag and estimate its pose in the end-effector frame. From that pose we construct a twist that moves the arm to keep the tag centered in the camera's view while maintaining a constant height and distance to the tag. We do not track the full pose of the tag because the arm is kinematically deficient (only 5 DOF). We achieved successful tracking at slow speeds.
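One iteration of the control loop described above can be sketched as follows. This is a minimal NumPy sketch, not our exact implementation: the tag position `t_tag` is assumed to come from OpenCV's ArUco pose estimation, the reference distance `d_ref` and gain `k` are made-up parameters, and the Jacobian `J` would come from the arm's kinematics.

```python
import numpy as np

def tag_twist(t_tag, d_ref=0.30, k=1.0):
    """Proportional twist (v, w) in the end-effector frame that centers
    the tag laterally/vertically and holds a reference distance d_ref
    along the camera's optical (z) axis.
    t_tag: 3-vector, tag position in the end-effector frame (meters)."""
    x, y, z = t_tag
    # Linear velocity: cancel the lateral/vertical offset, regulate distance.
    v = k * np.array([x, y, z - d_ref])
    # No orientation tracking here (a 5-DOF arm cannot realize full pose).
    w = np.zeros(3)
    return np.hstack([v, w])

def joint_velocities(J, xi, damping=1e-2):
    """Map a desired end-effector twist xi to joint rates using a damped
    least-squares pseudoinverse, which stays bounded near singularities."""
    JT = J.T
    return JT @ np.linalg.solve(J @ JT + damping**2 * np.eye(J.shape[0]), xi)
```

With a 6x5 manipulator Jacobian, `joint_velocities(J, tag_twist(t_tag))` gives the joint rates to command; when the tag sits centered at the reference distance, the twist (and hence the commanded motion) is zero.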
Possible improvements
- The arm's movement caused motion blur, making it difficult for the camera to detect the ArUco tag accurately. To address this we could use a better camera, more robust vision algorithms, and splines to smooth the arm's trajectory.
- A higher-DOF arm could track the full pose of the tag. For example, the lack of wrist yaw restricts us to tracking only the azimuth of the tag's position, not its yaw angle.
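The spline-smoothing idea from the first bullet could look like the sketch below: densifying sparse joint-space waypoints with a Catmull-Rom spline so the commanded trajectory (and hence the camera) moves smoothly between detections. This is a hypothetical illustration, not code from the project; the waypoints and sample count are arbitrary.

```python
import numpy as np

def catmull_rom(waypoints, samples_per_seg=20):
    """Interpolate a sequence of joint-space waypoints with a
    Catmull-Rom spline, returning a dense, smooth trajectory that
    passes through every waypoint.
    waypoints: (N, dof) array, N >= 2."""
    p = np.asarray(waypoints, dtype=float)
    # Duplicate the endpoints so the spline reaches the first/last waypoint.
    p = np.vstack([p[0], p, p[-1]])
    t = np.linspace(0.0, 1.0, samples_per_seg, endpoint=False)[:, None]
    out = []
    for i in range(len(p) - 3):
        p0, p1, p2, p3 = p[i], p[i + 1], p[i + 2], p[i + 3]
        # Standard Catmull-Rom basis: segment runs from p1 (t=0) to p2 (t=1).
        out.append(0.5 * ((2 * p1)
                          + (-p0 + p2) * t
                          + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t**2
                          + (-p0 + 3 * p1 - 3 * p2 + p3) * t**3))
    out.append(p[-2][None, :])  # append the final waypoint itself
    return np.vstack(out)
```

Feeding the densified trajectory to the arm at a fixed rate limits joint accelerations between waypoints, which should reduce the motion blur that hurt tag detection.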
All Around View
This visualization shows the system's tracking capabilities across all forms of movement.
Up-Down Movement
Demonstration of vertical motion control.
Distance Control
Demonstration of the robot maintaining a constant distance from the tag.
Azimuth Control
Demonstration of rotational control around the vertical axis.
Video Demonstration
Watch the full video demonstration:
Project Presentation
Slides presented in class: