
My Experience on Freelancing Project on a Pick-and-Place System with Mycobot 280 Jetson and ROS MoveIt

  • Writer: Aman Kumar Singh
  • Aug 28, 2024
  • 2 min read

Updated: Feb 27

I’d like to share some work I did last semester, in the hope that it might motivate others to explore robotics. I worked on an online freelancing project for a client, developing a pick-and-place system using the Mycobot 280 Jetson serial manipulator integrated with the ROS MoveIt framework.


MoveIt is an excellent tool for handling the intricate task of motion planning for robotic arms and manipulators, and it provides a wealth of tools to ease development:

  • Motion Planning: Algorithms and tools for computing collision-free paths to achieve desired positions and orientations.

  • Collision Checking: Ensures the robot's motions avoid collisions with obstacles.

  • Kinematics: Provides solvers to compute joint configurations for achieving specific end-effector poses.

  • Robot Modeling: Allows for the detailed definition and modeling of the robot's geometry, kinematics, and dynamics, crucial for effective motion planning.

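The planning and collision-checking pieces above come together neatly in MoveIt's Python interface (`moveit_commander`). Below is a minimal sketch, assuming a planning group named `arm_group` and placeholder poses and sizes (not the actual values from this project); it needs a running ROS + MoveIt setup, so treat it as an untested outline rather than the project's real script.

```python
import rospy
import moveit_commander
from geometry_msgs.msg import PoseStamped

rospy.init_node("pick_place_demo")
moveit_commander.roscpp_initialize([])

arm = moveit_commander.MoveGroupCommander("arm_group")  # placeholder group name
scene = moveit_commander.PlanningSceneInterface()

# Register the table as a collision object so planned paths avoid it.
table = PoseStamped()
table.header.frame_id = arm.get_planning_frame()
table.pose.position.z = -0.02
scene.add_box("table", table, size=(0.6, 0.6, 0.02))  # placeholder size

# Plan and execute a collision-free motion to a target end-effector position.
arm.set_position_target([0.15, 0.0, 0.20])  # x, y, z in metres (placeholder)
arm.go(wait=True)
arm.stop()
arm.clear_pose_targets()
```

MoveIt's planner handles the collision checking and inverse kinematics behind `go()`, which is exactly what makes it so convenient for a short project like this.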

System Overview: The system integrates stereo vision, YOLOv8 object detection, K-means clustering for object localization, and human activity imitation to enable collaborative assistance. It can autonomously pick up various objects and either place them into designated bins or hand them to a user.


Key Components:

  • Mycobot 280 Jetson Manipulator

  • Stereo Vision Setup with Two Cameras

  • ROS with MoveIt Framework

  • YOLOv8 for Real-time Object Detection

  • K-means Clustering for Object Localization

  • ROS Scripts for Manipulation, Tracking, and Human Activity Imitation

  • Video Processing Module for Analyzing Human Activities


Highlights:

  • Stereo Vision and Object Detection: Utilized two RGB cameras to capture the scene and generate point clouds using a stereo vision approach. YOLOv8 handled real-time object detection within these point clouds.

  • Object Localization: Detected objects are localized in 3D space using DBSCAN clustering, providing coordinates (x, y, z) relative to the robot's workspace.

  • Pick and Place Control: The ROS MoveIt framework handles motion planning and control. Custom ROS scripts allow the robot to navigate, pick, and place objects based on user commands.

  • Object Tracking: Real-time feedback during manipulation enables the robot to adjust its movements dynamically.

  • Human Activity Imitation: A Python script processes video input of human activities. The system mimics human actions based on the video, either placing objects into bins or handing them to users.
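For a concrete sense of the localization math, the disparity-to-depth step in stereo vision can be sketched with the standard pinhole equations: depth Z = f·B/d, then X and Y recovered from the pixel offset. All calibration values below (focal length, baseline, principal point) are hypothetical, not this project's actual camera parameters.

```python
# Back-project a detected pixel to a 3D point using stereo disparity.
# All calibration numbers here are made-up placeholders.

def pixel_to_3d(u, v, disparity, fx, fy, cx, cy, baseline):
    """Return (X, Y, Z) in metres for pixel (u, v) with a given disparity."""
    if disparity <= 0:
        raise ValueError("disparity must be positive")
    z = fx * baseline / disparity   # depth along the optical axis: Z = f*B/d
    x = (u - cx) * z / fx           # horizontal offset from the optical centre
    y = (v - cy) * z / fy           # vertical offset from the optical centre
    return (x, y, z)

# Example: the centre of a YOLO bounding box lands at pixel (400, 260).
point = pixel_to_3d(u=400, v=260, disparity=50.0,
                    fx=700.0, fy=700.0, cx=320.0, cy=240.0, baseline=0.06)
```

The resulting (x, y, z) is in the camera frame; in a real pipeline it would still be transformed into the robot's base frame before planning a grasp.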

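The DBSCAN-based localization step can be illustrated with a tiny pure-Python implementation: cluster the object's 3D points, discard noise, and take the cluster centroid as the grasp target. The points, eps, and min_pts values below are made up for illustration, and a real pipeline would use a library implementation (e.g. scikit-learn) rather than this sketch.

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one cluster label per point (-1 = noise)."""
    labels = [None] * len(points)
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbours = [j for j in range(len(points))
                      if math.dist(points[i], points[j]) <= eps]
        if len(neighbours) < min_pts:
            labels[i] = -1                   # noise (may become a border point)
            continue
        labels[i] = cluster
        queue = list(neighbours)
        while queue:                         # grow the cluster outwards
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster          # noise point reached by a core point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_neigh = [k for k in range(len(points))
                       if math.dist(points[j], points[k]) <= eps]
            if len(j_neigh) >= min_pts:      # j is a core point too
                queue.extend(j_neigh)
        cluster += 1
    return labels

# Hypothetical data: four samples on one object plus a stray outlier.
points = [(0.20, 0.00, 0.30), (0.21, 0.01, 0.30),
          (0.19, 0.00, 0.31), (0.20, 0.01, 0.29),
          (0.90, 0.90, 0.90)]
labels = dbscan(points, eps=0.05, min_pts=3)
obj = [p for p, l in zip(points, labels) if l == 0]
grasp_target = tuple(sum(c) / len(obj) for c in zip(*obj))
```

Averaging the cluster instead of trusting a single point makes the (x, y, z) grasp target robust to individual depth outliers.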

This project was completed within a one-month timeframe during my last semester. Although time constraints forced me to stick to a classical approach, without integrating learning-based methods or intent recognition, the experience was incredibly valuable and taught me a lot about handling the real-world challenges of serial manipulators in a remote, freelance setting.



I hope this post inspires you to take on robotics projects and explore some new possibilities!



Feel free to have a look at the presentation and the working video!



Picking Up Banana



Picking Up Orange


