
Manipulator Control with Computer Vision

Description

Dependencies

To run the code, ROS must be set up. This project uses ROS Noetic on Ubuntu 20.04. The necessary installation steps are described here: https://emanual.robotis.com/docs/en/platform/openmanipulator_x/quick_start_guide/#setup.

Once ROS is installed and a catkin workspace (catkin_ws) has been set up, the following git repositories must be cloned into the src directory in order to simulate the OpenManipulator-X:

git clone -b noetic-devel https://github.com/ROBOTIS-GIT/open_manipulator.git
git clone -b noetic-devel https://github.com/ROBOTIS-GIT/open_manipulator_msgs.git
git clone -b noetic-devel https://github.com/ROBOTIS-GIT/open_manipulator_simulations.git
git clone https://github.com/ROBOTIS-GIT/open_manipulator_dependencies.git

Getting started

To run the project, execute the following commands (once the pipeline is complete, all roslaunch files will be started from a single launch file):

cd catkin_ws
catkin_make
source devel/setup.bash
roscore
# In a new terminal (source devel/setup.bash first):
roslaunch open_manipulator_gazebo open_manipulator_gazebo.launch
# In another new terminal:
roslaunch open_manipulator_controller open_manipulator_controller.launch use_platform:=false

It is important to run both roslaunch commands: the former starts the Gazebo simulation with the robotic arm, while the latter enables control of the arm. IMPORTANT: Before running the scripts below, press the play button at the bottom of the Gazebo window, otherwise the arm will not move.

There are currently three runnable scripts; their functionality will eventually be combined. The first (1) captures keypoints from video in real time and moves the robotic arm in Gazebo [it is currently being debugged while new functionality is added]. The second (2) captures keypoints and saves them into a pickle file. The third (3) loads the saved keypoints (so 2 must be run before 3), visualizes them in RVIZ, and calculates the manipulator keypoints. Scripts 1 and 3 currently use different approaches; their accuracy is being tested.
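The exact pickle format used by the capture script is not documented here. As a hedged sketch, assuming the keypoints are stored as a list of (x, y, z) tuples per frame (the file name and structure below are assumptions, not the project's confirmed format), saving and reloading would look like this:

```python
import pickle

# Hypothetical structure: one list of (x, y, z) keypoints per captured frame.
frames = [
    [(0.10, 0.25, 0.40), (0.12, 0.22, 0.38)],  # frame 0
    [(0.11, 0.24, 0.41), (0.13, 0.21, 0.39)],  # frame 1
]

# Save the keypoints (as the capture step presumably does).
with open("keypoints.pkl", "wb") as f:
    pickle.dump(frames, f)

# Load them back (as the visualization step presumably does).
with open("keypoints.pkl", "rb") as f:
    loaded = pickle.load(f)

print(len(loaded))  # number of captured frames
```

Because pickle round-trips arbitrary Python objects, the loading script only works if it is run after a capture has written the file — which is why script 2 must be run before script 3.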

  1. Arm Keypoint Capture + Pose Estimation Control
roslaunch pose_estimation pose_estimation.launch
  2. Capture Keypoints
roscd pose_estimation
python3 capture_keypoints.py
  3. Visualize Keypoints
rviz
roscd pose_estimation
python3 visualize_keypoints.py
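The README does not specify how the manipulator keypoints are calculated from the captured ones. As a rough illustration of one possible approach (the function name and the simplified two-segment planar model are assumptions, not the project's actual method), joint angles can be derived from consecutive keypoints with atan2:

```python
import math

def planar_joint_angles(shoulder, elbow, wrist):
    """Sketch: two planar joint angles (radians) from three 2D keypoints.

    The first angle is the absolute orientation of the upper-arm segment;
    the second is the relative bend at the elbow.
    """
    a1 = math.atan2(elbow[1] - shoulder[1], elbow[0] - shoulder[0])
    a2 = math.atan2(wrist[1] - elbow[1], wrist[0] - elbow[0]) - a1
    return a1, a2

# Example: a fully extended horizontal arm gives zero for both angles.
angles = planar_joint_angles((0.0, 0.0), (1.0, 0.0), (2.0, 0.0))
print(angles)  # (0.0, 0.0)
```

Angles obtained this way could then be sent to the simulated arm through the open_manipulator_controller; the actual mapping used by scripts 1 and 3 may differ, which is why their accuracy is being compared.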