Task-Level Control of Soft- and Rigid-Body Robot Behaviors

Camera Integration

The ASUS Xtion Pro camera was used for tracking the soft robot and the target object to be retrieved. It provides both RGB and depth data. As pictured below, the camera was mounted on the front of the youBot. The camera could also have been mounted on the wrist of the arm, but since the object and soft robot always remain at ground level, the front mount was chosen.

The ROS OpenNI package was used to launch the camera driver and access camera data. The ROS package Cmvision was also used for color blob detection, in order to locate the red object and the yellow tag on the soft robot in the camera frame. The following procedure was used to detect the blobs.

1. Launch the OpenNI camera driver: roslaunch openni_launch openni.launch

2. Run the Cmvision blob detector color tool: rosrun cmvision colorgui image:=<image topic>

3. Click on the blob you would like to detect until the RGB and YUV color values are given by the GUI.

4. Edit the colors.txt file to include the YUV color of the blob to detect.

5. Launch the Cmvision blob detector: roslaunch cmvision cmvision.launch. A GUI that looks like the image below should appear, outlining the blobs detected with a box. The image below depicts the red object being detected.
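For reference, a colors.txt file for Cmvision pairs a display color and blob name with a YUV threshold range. The values below are placeholders, not our calibrated thresholds; the actual ranges should come from the colorgui tool in step 3.

```
[colors]
(255,0,0) 0.000000 10 Red
(255,255,0) 0.000000 10 Yellow

[thresholds]
( 25:164, 80:120, 150:240 )
( 40:130, 60:110, 130:200 )
```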

More information on the OpenNI and Cmvision packages can be found by viewing their ROS wiki pages:

http://wiki.ros.org/cmvision

http://wiki.ros.org/openni

Vision-Based Soft Robot Navigation

Once the youBot deployed the soft robot, it communicated with the Arduino over a serial connection. A state machine on the Arduino processed incoming commands and executed the corresponding actuation sequences for the soft robot. The camera data was used to generate a reference signal based on the position of the target and an error signal based on the position of the soft robot, which together determined which command to send to the soft robot next. The simple control logic is given below. Here "position" and "reference" are, respectively, the horizontal image positions of the soft robot and target object in camera coordinates, and "threshold" is a dead band of 20 pixels in the image.

 if position < reference - threshold:
     execute "move right" gait for 1 sec
 else if position > reference + threshold:
     execute "move left" gait for 1 sec
 else:
     execute "move forward" gait for 1 sec
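The control logic above can be sketched as a small Python function. The gait names follow the pseudocode; how each command string maps to bytes on the Arduino's serial line is an assumption, so that part is left as a comment.

```python
def choose_gait(position, reference, threshold=20):
    """Pick a gait from the horizontal pixel positions of the
    soft robot (position) and the target object (reference)."""
    if position < reference - threshold:
        return "move right"
    elif position > reference + threshold:
        return "move left"
    else:
        return "move forward"

# In the actual system, the chosen command would be written to the
# Arduino over serial (e.g. with pyserial) and the gait run for 1 sec.
```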

The code for controlling each of the gaits is discussed in the testing section of our wiki.

Task-Level State Machine

To execute the object retrieval task and coordinate the motion of the youBot and soft robot, a state machine was designed. This was done using the ROS Smach (state machine) package. A diagram summarizing the states as well as a brief description of all states is provided below.

Description of states:

Search: The Search state is the initial state of the machine. In this state, the youBot turns clockwise, looking for the object. Once the "/blobs" topic from the Cmvision package reports that it has found a red blob, the youBot rotates to face the blob. It then accesses the depth data of the blob and calculates the blob's coordinates in the global coordinate system. Once this has occurred, the Search state exits on "found" and transfers the coordinates of the blob to the next state.
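The pixel-plus-depth to global-coordinate step can be sketched with a standard pinhole back-projection followed by a planar rotation by the youBot's heading. This is a minimal sketch, not our exact code; the intrinsics shown are typical Xtion defaults, not measured calibration values.

```python
import math

def blob_to_global(u, depth, yaw, fx=570.3, cx=319.5):
    """Back-project a blob's horizontal pixel u and depth (meters)
    into the camera frame, then rotate by the robot's yaw into the
    global ground plane. Returns (x, y) relative to the robot."""
    x_cam = (u - cx) * depth / fx   # lateral offset from optical axis
    z_cam = depth                   # distance along optical axis
    # Planar rotation by the robot's heading (target is at ground level)
    x_g = math.cos(yaw) * z_cam - math.sin(yaw) * x_cam
    y_g = math.sin(yaw) * z_cam + math.cos(yaw) * x_cam
    return x_g, y_g
```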

Drive: The transition "found" goes to the Drive state. In this state, the youBot uses the coordinates of the blob from the Search state and drives to those coordinates. The youBot actually drives to 20 cm away from the coordinates in order to give the arm and soft robot room to reach the object. Once the youBot is at the proper coordinates, the state machine returns "here", which brings it to the next state.
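The 20 cm standoff can be computed by pulling the goal back along the line from the youBot to the blob. A minimal sketch (the function name and interface are illustrative, not from our code):

```python
import math

def standoff_goal(robot_xy, target_xy, standoff=0.20):
    """Return a goal point `standoff` meters short of the target,
    on the line from the robot's position to the target."""
    dx = target_xy[0] - robot_xy[0]
    dy = target_xy[1] - robot_xy[1]
    d = math.hypot(dx, dy)
    scale = (d - standoff) / d      # fraction of the way to drive
    return (robot_xy[0] + dx * scale, robot_xy[1] + dy * scale)
```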

Dropoff: The next state is Dropoff. In this state, the arm picks up the soft robot from the back of the youBot and drops it off slightly to the right in front of the youBot. Once the soft robot is dropped off, the state returns "deployed" and transitions to the next state.

Wait: The next state is Wait. This is where the youBot stays still, but sends commands to the Arduino for the soft robot. Additionally, the blob detection is run in this state, and the youBot tells the soft robot to move right or left depending on its relative position to the blob (comparing the yellow and red blobs). Once the soft robot gets to the object, the state returns "reached".

Return: "Reached" brings the state machine to the final state: Return. This brings the arm back to its original position and, once the arm is there, outputs "done". "Done" brings the state machine to its exit state, also named "Done". This is the end of the state machine and the task is complete.
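The state flow described above can be captured in a few lines. The actual implementation uses the ROS Smach package, where each state is a smach.State subclass registered with its outcome-to-state transitions; the stand-alone sketch below mirrors that structure with plain Python callables so the transition table is easy to see.

```python
# Outcome-driven transition table, mirroring the Smach design:
# (current state, outcome) -> next state.
TRANSITIONS = {
    ("Search", "found"): "Drive",
    ("Drive", "here"): "Dropoff",
    ("Dropoff", "deployed"): "Wait",
    ("Wait", "reached"): "Return",
    ("Return", "done"): "Done",
}

def run(states, start="Search"):
    """Run each state's body (a callable returning an outcome string)
    until the terminal 'Done' state; return the visited states."""
    visited = [start]
    state = start
    while state != "Done":
        outcome = states[state]()          # execute the state body
        state = TRANSITIONS[(state, outcome)]
        visited.append(state)
    return visited
```

In the real machine each callable wraps the ROS behavior for that state (rotating and reading "/blobs" in Search, driving in Drive, and so on).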