Software IGUS robotic arm

The goal of this project is to deliver software for the Igus robot arm. The arm is connected to a motion controller which, among other things, controls the motors. Using the 'Robot Operating System' (ROS) framework, a system has been developed which can move the robot arm to a designated position. To operate as desired, a Kinect sensor is mounted above the robot arm.

The robot arm has three joints with five degrees of freedom. These connect to five motors which can be controlled separately. The new motion controller was taken into account and a Hardware Abstraction Layer was used. Because of this, the developed software can be reused with the new motion controller once the low-level code, the part that communicates with the motion controller, has been replaced. Orientation:
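The Hardware Abstraction Layer idea described above can be sketched as an abstract interface that the high-level code talks to, so only the low-level implementation changes when the motion controller is swapped. The class and method names below are illustrative assumptions, not the project's actual code:

```python
from abc import ABC, abstractmethod

class MotionControllerHAL(ABC):
    """Hypothetical HAL interface: high-level nodes depend only on this."""

    @abstractmethod
    def send_joint_positions(self, positions):
        """Send one set of joint positions to the controller."""

    @abstractmethod
    def read_encoders(self):
        """Return the current encoder values, one per motor."""

class MockMotionController(MotionControllerHAL):
    """Stand-in low-level implementation, used here only for illustration.
    A real implementation would talk to the serial port instead."""

    def __init__(self, num_motors=5):
        self._positions = [0.0] * num_motors

    def send_joint_positions(self, positions):
        self._positions = list(positions)

    def read_encoders(self):
        return list(self._positions)
```

Swapping in a driver for a different motion controller then only requires a new subclass; the rest of the software keeps calling the same two methods.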




 * **ROS** (Robot Operating System) provides libraries and tools to help software developers create robot applications. It provides hardware abstraction, device drivers, libraries, visualizers, message-passing, package management, and more. ROS is licensed under an open-source BSD license.

The **motion controller** controls the motors which are connected to the joints. There is a serial connection for transferring data between the PC and the motion controller. The trajectory from the move_arm package is sent to the buffer of the motion controller via this serial connection. The motion controller ensures that every position in the trajectory is executed.
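Sending trajectory points over a serial link requires agreeing on a binary frame layout. The format below is a minimal sketch of that idea, not the project's actual wire protocol: one header byte, one checksum byte, then five little-endian floats for the five motors.

```python
import struct

FRAME_HEADER = 0xA5  # arbitrary marker byte, an assumption for this sketch

def pack_point(positions):
    """Pack one trajectory point (five joint positions) into a frame."""
    assert len(positions) == 5
    payload = struct.pack("<5f", *positions)      # 5 x float32, little-endian
    checksum = sum(payload) % 256                 # simple additive checksum
    return struct.pack("<BB", FRAME_HEADER, checksum) + payload

def unpack_point(frame):
    """Validate a frame and recover the five joint positions."""
    header, checksum = struct.unpack_from("<BB", frame)
    payload = frame[2:]
    assert header == FRAME_HEADER and checksum == sum(payload) % 256
    return list(struct.unpack("<5f", payload))
```

On the controller side the same layout would be decoded in C; the fixed 22-byte frame size makes it easy to fill the controller's buffer point by point.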

**ros_kinect** contains a driver which communicates with the Kinect sensor. The driver publishes many topics (RGB, depth, etc.). This information is needed to determine the position of an object. When the program has detected an object, its coordinates are published on a topic so that another node can calculate the inverse kinematics.
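Combining an RGB pixel with the depth topic gives a 3D point via the standard pinhole camera model. The intrinsics below are typical published Kinect v1 values, used here as assumptions; a real setup would use the calibration the driver provides:

```python
# Assumed Kinect v1-style intrinsics (focal lengths and optical centre in pixels)
FX, FY = 525.0, 525.0
CX, CY = 319.5, 239.5

def pixel_to_point(u, v, depth_m):
    """Back-project pixel (u, v) with depth in metres to camera-frame XYZ."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return (x, y, depth_m)
```

A pixel at the optical centre maps straight down the camera axis, which matches the intuition for a sensor mounted directly above the workspace.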



The user can select a position where the gripper should go to pick up an object. This application uses the opencv_bridge, which is written for ROS to communicate with OpenCV applications (http://opencv.org). The coordinates of the object are calculated with the Kinect sensor, which is mounted above the robot arm. OpenCV uses the Kinect data to calculate the coordinates, which are then published on a ROS topic.
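The localisation step can be illustrated without OpenCV: once the image has been segmented into a binary mask of "object" pixels (however that segmentation is done), the centroid of the mask gives the pixel coordinate to publish. This is a dependency-free sketch of that idea, not the project's detection code:

```python
def mask_centroid(mask):
    """mask: 2D list of 0/1 values. Returns the (u, v) centroid of all
    nonzero pixels, or None if the mask is empty."""
    total = su = sv = 0
    for v, row in enumerate(mask):
        for u, val in enumerate(row):
            if val:
                total += 1
                su += u
                sv += v
    if total == 0:
        return None
    return (su / total, sv / total)
```

In the real pipeline the equivalent of this centroid, combined with the depth value at that pixel, is what ends up on the coordinates topic.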

The inverse kinematics node subscribes to this topic and calculates the inverse kinematics for three positions (see pictures above). Once the inverse kinematics is calculated, the motion plan for the joints is computed with the ROS arm_navigation stack and then published on a ROS topic.
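The inverse kinematics computation can be illustrated with the standard textbook case of a planar two-link arm, which has a closed-form solution. The real arm has five motors, so this is a sketch of the kind of calculation involved, not the project's solver:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form IK for a planar 2-link arm with link lengths l1, l2.
    Returns (theta1, theta2) for the elbow-down solution."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

A quick sanity check is to run the result back through forward kinematics and confirm the end effector lands on the requested target.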

The motion controller has a serial connection with a ROS node which subscribes to this topic. The motion plan is sent to the buffer of the motion controller. The motion controller reads all elements of the buffer and drives the motors according to the motion plan.
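The buffered execution scheme amounts to a FIFO queue: the ROS node appends motion-plan points, and the controller drains them strictly in arrival order so no position is skipped. A minimal sketch of that behaviour (the class name is an assumption for illustration):

```python
from collections import deque

class MotionBuffer:
    """FIFO buffer of motion-plan points, mirroring the controller's buffer."""

    def __init__(self):
        self._points = deque()

    def push(self, point):
        """Append one motion-plan point (called by the serial ROS node)."""
        self._points.append(point)

    def drain(self, execute):
        """Run `execute` on every buffered point, in arrival order,
        emptying the buffer (the controller's execution loop)."""
        while self._points:
            execute(self._points.popleft())
```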

A thread of the motion controller node reads the encoder values three times per second and publishes these values on a ROS topic. The 3D visualization tool RViz subscribes to this topic. With this data, the URDF model of the robot arm stays synchronized with the real Igus robot arm.
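The polling thread can be sketched as a loop that samples the encoders at roughly 3 Hz and hands each sample to a publish callback. In the project the callback would publish on a ROS topic; here it is a plain Python callable, and the function name is an assumption:

```python
import threading
import time

def poll_encoders(read_encoders, publish, period=1.0 / 3.0, stop=None):
    """Sample `read_encoders()` every `period` seconds and pass the result
    to `publish` until the optional `stop` event is set."""
    while stop is None or not stop.is_set():
        publish(read_encoders())
        time.sleep(period)
```

Running this in a `threading.Thread` keeps the feedback publishing independent of the node's main control loop.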

An expansion of the project would be to create a feedback loop for the OpenCV application. The gripper could be fitted with reflective stickers so that its position can be compared with the position of the object. With the current code, the position of the gripper is not available.
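Once the gripper position is available from the camera, the feedback step reduces to comparing the two measured positions and checking whether the remaining error is within tolerance. A minimal sketch, assuming both positions come in as camera-frame coordinate tuples (the tracking of the reflective stickers is exactly the part the text notes is still missing):

```python
def position_error(gripper, obj):
    """Vector from the measured gripper position to the object position."""
    return tuple(o - g for g, o in zip(gripper, obj))

def within_tolerance(gripper, obj, tol=0.005):
    """True when every axis of the error is within `tol` (metres assumed)."""
    return all(abs(e) <= tol for e in position_error(gripper, obj))
```

The error vector would feed back into the inverse kinematics node as a correction target until `within_tolerance` holds.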
