Configuration-Space Flipper Planning on 3D Terrain

09/17/2019 · Yijun Yuan, et al.

Autonomous operation is a long-standing goal in the field of rescue robotics, and the use of flippers strongly improves the mobility and safety of such robots. In this work, we simplify the rescue robot to a skeleton moving over an inflated terrain, so that its morphology can be represented by a configuration of a few parameters. Extending our previous paper, we control the four flippers individually. The proposed flipper planning computes the motion of a mobile robot over 3D terrain represented by a 2.5D map. The experiments show that our method handles various terrains well and manipulates the flippers efficiently.


I Introduction

In the future, rescue robots are expected to play an important role in search and rescue after disasters [16] or in the military [1]. Some rescue robots are designed like animals, such as snakes [7, 5] and dogs [3], to increase mobility, but these designs are mechanically complex. An easier alternative is to use tracked robots for propulsion. However, tracked robots struggle with large obstacles that are higher than the robot base. To increase the mobility of tracked robots, sub-tracks or flippers are added to the robot base, as shown in Fig. 3. These flippers are even more important for small rescue robots than for big ones, since small robots rely heavily on their flippers when traversing rough terrain.

Even though tracked robots with flippers have good mobility, they place high demands on operators due to their many degrees of freedom, as mentioned in [8, 10]. Therefore, autonomous flipper planning is very helpful for operators, and autonomous path planning is important for rescue robots; it also leads the way towards full autonomy in rescue robotics [11]. Some researchers designed flipper behaviors based on the experience of expert operators. The motion strategy used in [10] is based on the operation of skilled operators, and Sheh et al. proposed behavioral cloning from human experts for robots traversing rugged terrain [15].

In addition to exploiting operator experience, some works proposed autonomous or semi-autonomous control of rescue robots with flippers based on their kinematic and physical models. To design a closed-loop control system, the rescue robot should "know" the environment and its own state. To build such a system, Ohno et al. added encoder, current and gravity sensors to measure the state of the robot, so that the robot can adjust its pose based on the data from these sensors [9]. In their follow-up work [14], they also added a laser scanner for environment sensing and automatically adjusted the flippers according to the environment. Moreover, Pecka et al. even utilized a robot arm to gather data that cannot be collected directly by the robot base [13].

No matter whether the environmental data is gathered by the robot's sensors in the real environment or given by the system settings of a simulation platform, the execution performance of rescue robots traversing rugged terrain remains one of the biggest challenges, due to gravity and disturbances. To make the robot follow a given path, Martens et al. designed a feedback system that compensates the asymmetric wheels and the effect of gravity for a mobile robot climbing stairs [6]. Inspired by this work, Steplight et al. extended the remote-control strategy to autonomous stair climbing with additional sensors to detect environment features [18]. Besides, pre-defined morphologies are often used during climbing. In [2], four different driving modes of the robot base divide climbing into four periods, and the flippers are adjusted to meet the orientation requirement of each period. Zimmermann et al. defined five flipper modes which are applied in different phases of traversing complex terrain [22]. In addition, to make sure that the robot stays stable on rugged terrain, most control strategies take the center of mass and gravity into consideration. For example, Vu et al. analyzed the moment balance and stability of a customized robot in different poses on stairs [19]. In [4], stability is interpreted as each flipper and the robot base touching the ground.

Besides, there are data-driven methods that learn a mapping from state to action [22, 12, 17], which should be able to learn to use the flippers. Their weakness, however, is the limited coverage of the training terrain, which might cause failures on unseen terrain due to overfitted parameters.

In the studies above, autonomous flipper planning is based on 2D planning, where the two front flippers share the same motion and the two rear flippers share the same motion, as in [14, 22, 13, 21]. Only when a flipper is adjusted slightly to touch the ground may the left and right flippers move to different angles [10, 4]. In this work, we propose an active control method for flipper planning, which calculates a safe morphology and a feasible path for the flippered robot in 3D space. In contrast to our previous work [21], which simplified the flippers in 2D space, we control the four flippers individually on 3D terrain, which greatly increases the feasibility of the robot. In addition, we keep the analysis in a continuous configuration space when calculating the morphology of the robot.

Fig. 1: System Overview

The contributions of this work are summarized as follows:

  • Continuous-space flipper planning on 3D terrain.

  • A real robot implementation that follows the planned configuration path.

  • Analysis of the efficiency on various obstacle cases compared with tele-operation.

The rest of the paper is structured as follows: The overview of the system is introduced in Section II and then the important modules of the approach are analyzed in Section III. In Section IV we introduce the experiment settings and discuss the experimental results. Finally, we draw conclusions in Section V.

II Overview

The overview of this system is shown in Fig. 1. Given a 2.5D map of the environment and the robot parameters, the workflow of this approach is as follows:

  • Simplify the robot to the model shown in Fig. 3,

  • Equivalently morph the robot model and the 2.5D map into the robot skeleton and inflated map representation, as in Section III-A,

  • Path search, as in Section III-D: iteratively, given the current configuration, search for the next best configuration, see Section III-C.

  • Path following for a real robot implementation, shown in Section III-E.

In Section III, we introduce these modules in detail.

III Methodology

Fig. 2: Side-view and top-view sketch of the robot in Fig. 3. The skeleton is drawn with black and blue lines connected by white joints. The robot's coordinate system has its origin fixed on the robot, with the back-front direction as the x axis, the right-left direction as the y axis, and the bottom-up direction as the z axis.

To make it easier to analyze the robot morphology on the terrain, similar to our previous work [21], we transform the robot and the map into a skeleton and an inflated map. We generate this skeleton-inflated-ground representation in 3D space and then compute the configurations. After that, a customized path search is used to find a sequence of morphologies for the robot.

III-A Robot Simplification

(a) Robot
(b) Model
Fig. 3: Rescue Robot and its simplified model.

However, in the 3D case it is not easy to utilize the equivalent inflation to get the skeleton representation as in [21]. Thus a simplification of the real robot is necessary.

Fig. 2 shows the top- and side-view sketches of the robot in Fig. 3(a). The sketch describes the robot with points and the lines connecting those points; the robot's dimensions enter as fixed parameters of this sketch.

Around those blue and black lines in Fig. 2, we generate a surface such that the closest distance from the blue lines to each surface point equals the wheel radius. The resulting simplified model is shown in Fig. 3(b).

III-B Equivalent Inflation

To compute the morphology of the robot, simplification brings convenience. In this paper, our simplification consists of two parts: representing the robot as a skeleton and inflating the ground.

We work with several simplifying assumptions:

  • the center of gravity is always in the center of the robot base;

  • no slipping and no floating;

  • driving forward at a steady speed.

III-B1 Skeleton Representation

Fig. 4: The flipper angles.

Fig. 2 shows the top-view and side-view sketches of the robot in Fig. 3.

The model simplification is built on top of the tracked rescue robot with its four sub-tracks, drawn as the black and blue lines in Fig. 2. The white points denote the joints of the skeleton; half of them lie on the left side of the robot and the other half on the right side.

Initially, as in Fig. 2, the robot faces the positive x axis, the y axis points to the robot's left, and the z axis points from the bottom to the top of the robot. The roll, pitch and yaw angles are rotations around the x, y and z axes, respectively, following the right-hand rule.

In our implementation the order of the three Euler angles is yaw, then pitch, then roll. In this paper, the robot is set to always move forward, thus the yaw is always set to 0.
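As a minimal illustration of this convention (our own sketch, not the authors' code), the base orientation can be composed as R = Rz(yaw)·Ry(pitch)·Rx(roll); with the yaw fixed to zero, only pitch and roll remain:

```python
import numpy as np

def rotation_from_euler(yaw, pitch, roll):
    """Compose R = Rz(yaw) @ Ry(pitch) @ Rx(roll) with x forward, y left, z up."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

# With yaw fixed to 0, the orientation is fully described by pitch and roll.
R = rotation_from_euler(0.0, np.deg2rad(10.0), np.deg2rad(5.0))
```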

Also following [21], we denote the angles of the front and back flippers as in Fig. 4, with a subscript specifying the left or right side.

III-B2 Inflated Map

Since the robot is now represented as a skeleton, the map should be inflated accordingly.

Since our rescue robot moves on the ground, we consider a 2.5D elevation map adequate to represent the ground scene. We can therefore perform the inflation on that map with the proposed method.

Given a 2.5D map, each ground point of the grid map carries a value, namely its height.

Correspondingly, since we simplified the robot to a skeleton, the inflated ground map should guarantee that the closest distance from the skeleton to the original map is no less than the wheel radius.

Following [20] we build a distance map on the original ground with a special kernel. Given the distance of some point to the kernel center and the height at the center, the value at this position is computed as:

(1)

We represent a ground pixel at a given position as a delta function. The distance map at each location is then generated as a convolution of these delta functions with the kernel:

(2)

The function generating the inflated map can then be represented as:

(3)

One example result is shown in Fig. 5, where a possible obstacle can be identified in the inflated map.
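As a hedged sketch of the inflation idea (our own illustration, assuming the inflation acts as a max-dilation of the elevation map with a spherical cap of the wheel radius), the inflated height at each cell can be computed as follows:

```python
import numpy as np

def inflate_elevation_map(height, radius, cell_size):
    """Sketch of equivalent inflation: raise the 2.5D map so that a skeleton kept
    on the inflated surface stays at least `radius` away from the original ground
    (spherical-cap kernel, maximum over the neighborhood; edge wrap-around of
    np.roll is ignored for brevity)."""
    r_cells = int(np.ceil(radius / cell_size))
    inflated = np.full_like(height, -np.inf)
    for di in range(-r_cells, r_cells + 1):
        for dj in range(-r_cells, r_cells + 1):
            d = np.hypot(di, dj) * cell_size
            if d > radius:
                continue
            lift = np.sqrt(radius ** 2 - d ** 2)   # spherical cap height at offset d
            shifted = np.roll(np.roll(height, di, axis=0), dj, axis=1) + lift
            inflated = np.maximum(inflated, shifted)
    return inflated

# Example: a flat floor with a 0.2 m step, 1 cm grid cells, 8 cm wheel radius.
h = np.zeros((60, 60)); h[:, 30:] = 0.2
inflated = inflate_elevation_map(h, radius=0.08, cell_size=0.01)
```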

(a) Height Map
(b) Inflated Map
Fig. 5: Height map and its inflated map.

III-C Configuration Generation

In this part, each configuration is computed in the skeleton-inflated-ground representation and can be transformed back to the corresponding robot-ground scene.

The configuration consists of three parts: (a) the Euclidean position, (b) the orientation of the pose and (c) the flipper angles.

Note that this subsection works as a function for the next subsection, the path search. Given the location of the pivot, this function (1) generates a range of possible orientations for the robot base; for each pair of location and orientation, it (2) uniquely determines the four flipper angles via collision checking. These two steps are described in Algorithms 1 and 2, respectively.
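As an illustration of how one such configuration point could be organized in code (our own sketch; the field names are assumptions, not notation from the paper):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Configuration:
    """One configuration point on the planned path (illustrative field names)."""
    position: np.ndarray   # (x, y, z) of the pivot joint in the map frame
    pitch: float           # rotation around the y axis
    roll: float            # rotation around the x axis (yaw is fixed to 0)
    flippers: np.ndarray   # four angles: front-left, front-right, back-left, back-right
```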

The orientation can be described by the roll, pitch and yaw. It is not convenient to use the center of the robot as the pivot when computing the valid rotation angles. Thus, following [21], a base joint of the skeleton is used as the pivot. Because there is such a joint on both the left and the right side, both sides are used to collect candidates during path search. The flipper angles can then be determined deterministically given the position and orientation.

To ensure the safety of the robot, we constrain that on both sides of the base, at least one point is touching the ground. The implementation searches over the pitch and roll to find the possible candidates.

1:  Input: left pivot xyz location; inflated map D.
2:  Compute the pitch candidates for the base line anchored at the pivot.
3:  for all pitch candidates do
4:     Get the roll candidate (fix the pitched base line as the rotation axis, rotate until the base plane touches the map surface D).
5:     Add the resulting (pitch, roll) pair to the pose parameter set.
6:  end for
7:  Output: the pose parameter set.
Algorithm 1 Get orientation parameters. (Assume left pivot.)

Assume we use the left joint as the pivot and know its 3D location. The yaw is fixed at the very beginning as 0. We can then rotate the base around the y axis with the pivot as the rotation center.

If the opposite end of the base line is not on the ground, we find the smallest pitch that makes it touch the ground. If it is already on the ground, we can correspondingly find the smallest pitch that makes the base line touch the ground, and the pitch candidates are chosen accordingly.

Then, for each solved pitch, we can compute the roll by finding the smallest angle that makes the rectangular base plane touch the ground.

Now the position and orientation are solved; the only things left are the flipper angles needed to support such a pose. In this way we collect a batch of pose candidates as in Algorithm 1.
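A rough sketch of how such a touching angle can be searched numerically is given below (our own illustration, assuming a regular grid map and a sampled base line; the paper's candidate generation in Algorithm 1 may differ): sweep the pitch from zero and stop at the first angle where any sampled point of the base line meets the inflated surface.

```python
import numpy as np

def smallest_touching_pitch(pivot, base_length, inflated, cell_size,
                            max_pitch=np.pi / 3, step=np.deg2rad(0.5)):
    """Rotate a line of length `base_length` downwards around the pivot (about the
    y axis) and return the first pitch at which it touches the inflated map
    (illustrative sketch only)."""
    s = np.linspace(0.0, base_length, 20)            # samples along the base line
    j = int(np.clip(round(pivot[1] / cell_size), 0, inflated.shape[1] - 1))
    for pitch in np.arange(0.0, max_pitch, step):
        x = pivot[0] + s * np.cos(pitch)             # forward coordinate of each sample
        z = pivot[2] - s * np.sin(pitch)             # height of each sample
        i = np.clip((x / cell_size).astype(int), 0, inflated.shape[0] - 1)
        if np.any(z <= inflated[i, j]):              # some sample touches the surface
            return pitch
    return None                                      # no touching pitch in the range
```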

1:  Input: base position and orientation; inflated map D.
2:  Get the locations of the four flipper joints.
3:  For the first flipper, get its angle by rotating the flipper segment around its joint until the segment collides with the map surface D.
4:  Similarly compute the angles of the remaining three flippers.
5:  Output: the four flipper angles.
Algorithm 2 Get flipper parameters. (Assume left pivot.)

Given the fixed joint on each flipper (obtained from the fixed location and orientation), we can uniquely determine the flipper setting by finding the angle at which the flipper touches the ground surface without puncturing it. The computation is analogous when the right-side joint is chosen as the pivot.
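The same kind of one-dimensional angle search can be reused per flipper. A minimal sketch of the per-flipper loop follows; the joint locations, the flipper length and the `touch_angle` helper (e.g. a search like the one sketched above, applied to a flipper segment) are assumptions of this illustration, not the paper's implementation.

```python
import numpy as np

def flipper_angles(flipper_joints, flipper_length, inflated, cell_size, touch_angle):
    """Compute one angle per flipper (front-left, front-right, back-left, back-right)
    by rotating each flipper segment around its joint until it touches, but does not
    puncture, the inflated surface. `touch_angle` is a hypothetical helper."""
    return np.array([touch_angle(joint, flipper_length, inflated, cell_size)
                     for joint in flipper_joints])
```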

III-D Path Search

Since we fix the yaw at 0, the x coordinate of the pivot will always increase. The goal of this part is to make the robot move forward and cross a certain terrain.

1:  Input: initial configuration; inflated map D; forward step size; pivot height samples.
2:  Initialize the path with the initial configuration.
3:  while the target is not reached do
4:     Advance the left pivot by one step along the x axis.
5:     Initialize an empty candidate set and an empty costs set.
6:     for all sampled pivot heights do
7:        Set the pivot location to the sampled height.
8:        Collect the pose candidates for this pivot location. {Algorithm 1}
9:     end for
10:     for all poses in the candidate set do
11:        Add cost(pose) into the costs set.
12:     end for
13:     Similarly collect candidates and costs with the right-side pivot and expand both sets.
14:     Select the pose with the smallest cost over both pivot sides.
15:     Compute the flipper angles for the selected pose. {Algorithm 2}
16:     Add the resulting configuration into the path.
17:  end while
18:  Output: the configuration path.
Algorithm 3 Path Search.

Given the morphology at the current frame, we use both the left and the right pivot to compute the pose at the next step and collect a batch of possible candidates.

For each pivot, we let the pivot move one step forward along the x axis, sample a sequence of pivot heights, and compute the orientation and flipper angles as in Algorithm 3. The cost function is the sum of squared differences between the inflated-map height and the heights of the points along the middle line of the robot base.
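A minimal sketch of such a cost term (our own illustration, assuming the middle line of the base is sampled as a set of 3D points in the map frame; sampling density and indexing are not taken from the paper):

```python
import numpy as np

def pose_cost(base_midline_xyz, inflated, cell_size):
    """Sum of squared gaps between the inflated-map height and the heights of
    points sampled along the middle line of the robot base (illustrative).
    `base_midline_xyz` is an (N, 3) array of sampled points in the map frame."""
    i = np.clip((base_midline_xyz[:, 0] / cell_size).astype(int), 0, inflated.shape[0] - 1)
    j = np.clip((base_midline_xyz[:, 1] / cell_size).astype(int), 0, inflated.shape[1] - 1)
    gaps = base_midline_xyz[:, 2] - inflated[i, j]
    return float(np.sum(gaps ** 2))
```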

We show one possible path in Fig. 6 to demonstrate the morphology at each configuration point.

Fig. 6: A possible path to get over a rotated iramp in the experiments. Each subfigure is a configuration point on the path.

III-E Path Following

Path following can be divided into a sequence of small problems in which the robot moves from one configuration point to the next.

Given the current and the next configuration, we need to compute the track movement that makes the robot follow the path.

Different from [21], where the track travel distance can be computed accurately, in this 3D scenario the problem becomes much more complicated. Thus real-time localization of the robot in the map is required to let the robot follow the planned path.

To realize this, while the robot is driving, we make sure that it keeps its yaw unchanged. We check whether the robot has reached the target by tracking a reference point, the midpoint between the corresponding left- and right-side joints.
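A hedged sketch of this loop follows (our own illustration, assuming a differential track drive and that the external tracker reports the pose as (x, y, z, yaw); function names and gains are placeholders, not the paper's implementation):

```python
import numpy as np

def follow_path(targets_xy, get_pose, send_track_speeds, v=0.05, k_yaw=1.0, tol=0.02):
    """Drive through the planned reference points one by one, keeping the yaw at
    zero while moving (illustrative only; flipper commands are omitted here).
    `get_pose()` returns (x, y, z, yaw); `send_track_speeds(left, right)` drives."""
    for target in targets_xy:
        while True:
            x, y, _, yaw = get_pose()
            if np.hypot(x - target[0], y - target[1]) < tol:
                break                       # this configuration point is reached
            correction = -k_yaw * yaw       # steer to keep the yaw unchanged (zero)
            send_track_speeds(v - correction, v + correction)
    send_track_speeds(0.0, 0.0)             # stop at the end of the path
```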

(a) step
(b) ramp
(c) iramp
Fig. 7: Position bias for the three terrains. The red curve is the path to follow; the blue points are the real-time locations.

IV Experiment

IV-A Setting

We apply our proposed method to our MARS Lab (Mobile Autonomous Robotic Systems Lab) rescue robot shown in Fig. 3 and test whether it can get over various terrains. The robot's wheel radius, track width and robot width are fixed parameters of the model. We use two DYNAMIXEL XM430-W210-T motors as wheel drives and four XM430-W350-T motors for the flippers. An OptiTrack system is used to provide the robot position. We do not use a mapping algorithm but simply provide a user-generated elevation map.

The algorithm is implemented in Python. ROS (https://www.ros.org) is utilized for communication. Pose data from the OptiTrack is received via its VRPN server and the ROS client package (http://wiki.ros.org/vrpn_client_ros).

To standardize the experiments, we fix the robot's initial pose and set the origin of the coordinate system to the robot's initial location, with the robot's front direction as the x axis and its bottom-up direction as the z axis; the y axis follows from the right-hand rule.

The test consists of three groups of test cases: step, ramp and inverse ramp, each rotated by various angles around the rotation axis at its front center, which the robot confronts head-on, as shown in Fig. 8. We call them (a) step, (b) ramp and (c) iramp, respectively. The distance from the rotation axis to the robot's starting position is fixed, and the obstacle is rotated over a range of angles at a fixed interval.

(a) step
(b) ramp
(c) iramp
Fig. 8: The obstacles used in the experiments. The robot is oriented towards the rotation axis of the obstacle at the same distance in all cases.

It should be noted that the robot is running autonomously in the experiments.

To obtain the robot location in the map, we provide a perfect 2.5D map of the obstacle in advance and use the tracking system (OptiTrack) to provide the robot pose in real time. We also use the OptiTrack pose to evaluate the offsets in the experiments.

In the following we first explore using our model on a real robot to cross real obstacles, and then compute the error between the localized pose and the expected target pose.

IV-B Real Robot Experiments with Various Obstacles

step   1 1 1 1 1 1 1 0
ramp   1 1 1 1 1 1 0 -
iramp  1 1 1 1 1 0 0 0
TABLE I: Success per run for increasing obstacle rotation angles (left to right). 1 is success, 0 is failure, - is not tested.

The attached video illustrates how the rescue robot gets over these three terrains with a range of rotation angles.

The planner in [21] was also tested on a rotated step, but it was not aware of the rotation. Here, in contrast, we plan with the elevation map of the complicated terrain. Table I shows that our robot can successfully move across the rotated cases.

For the step cases, the robot finished all but the largest rotation angle. When the rotation angle gets that large, it fails the test: the video shows that the right front flipper gets stuck, and Section IV-C reveals the corresponding details in the data.

For the ramp cases, when there is a rotation, the robot confronts the lower side of the ramp first. Similar to the failed step case, the right front flipper may get stuck.

The iramp cases are much harder than the ramp cases, because the robot confronts the higher side of the ramp first when the ramp is rotated. The video record shows that a slip towards the lower side commonly happens; although the robot climbs onto the ramp successfully, we consider the run failed because its final configuration deviates too much from the target. The stuck problem does not occur in these cases.

IV-C Configuration Error between Real Robot and Target

Fig. 9: Orientation error for the step terrain (selected rotation cases, including the first failure case).
Fig. 10: Orientation error for the ramp terrain (selected rotation cases, including the first failure case).
Fig. 11: Orientation error for the iramp terrain (selected rotation cases, including the first failure case).

In this section, we omit the record of the flipper angles, because our robot provides accurate flipper positions. For demonstration, only selected orientation errors for each obstacle (two rotation angles and the first failure case) are shown in Figs. 9, 10 and 11.

From Fig. 7, we can see that the trajectories of the successful cases follow the desired planned path well.

For the step with a smaller rotation angle, Fig. 7(a) and Fig. 9 demonstrate that the morphology (location, orientation and flipper angles) is traced well. However, when the rotation gets large, the video shows that the robot gets stuck with a flipper, blocking the movement.

For the ramp with a smaller rotation angle, Fig. 7(b) and Fig. 10 show that the configuration path is traced well. In these cases, when there is a rotation, the robot first confronts the lower side of the ramp. Similarly, when the angle gets large, the video shows a flipper getting stuck and blocking the movement.

For the iramp, the robot faces the higher side first when the obstacle is rotated. From Fig. 7(c) and Fig. 11, when the rotation is large, the robot slips a little towards the lower side. For moderate rotations the robot can still follow the configuration path well. However, for larger rotations the robot slips a lot and ends up in a tremendously different configuration, even though it climbs onto the obstacle, as can be seen in the video.

IV-D Discussion

From the recorded positions and orientations, we find that the robot can follow the path well, which demonstrates the applicability of our robot simplification and of the flipper planning on the transformed representation.

However, the simplification and skeleton representation introduce some inaccuracies.

First, modeling the collision surface with the skeleton induces inaccuracy, because a collision between the skeleton and the inflated map is only equivalent to the simplified model touching the map, which is not always exactly a collision between the real robot and the map.

Moreover, when the robot pitches, the real body does not simply rotate around a skeleton axis. For example, without moving the tracks, if the robot raises its front body on flat ground to change the pitch, the tracked reference point moves backwards. This is not a problem along the x axis, since we obtain the pose from the OptiTrack in real time, so the tracks can make up the difference. However, when there is a roll, a similar error appears along the y axis, which cannot be compensated by driving forward. Thus the model is less accurate for tracking along the y axis than along the x axis. We account for this by carefully setting the cost on roll, and as seen in Section IV-C, the resulting effect is very small.
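As a rough geometric illustration of this effect (our own sketch, assuming the rear contact point stays fixed and the base line has length l): pitching the base by an angle θ moves the point above the front of the base from x = l to x = l·cos θ, i.e. backwards by

Δx = l · (1 − cos θ),

and under roll the analogous offset appears along the y axis.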

V Conclusions

We presented an autonomous flipper planning method based on an elevation map and a robot model. We create a skeleton of the robot model and correspondingly inflate the 2.5D elevation map to keep the collision checks correct. The planning algorithm manipulates the four flippers individually to traverse 3D terrain. We implemented the algorithm on a real robot and performed experiments, showing that our rescue robot can get over various terrains with the proposed method.

For future work we will improve the method by solving problems such as getting stuck. To allow general applicability of the method, Simultaneous Localization and Mapping (SLAM) should be introduced to provide the elevation map and the localization during path following. Finally, we also plan to integrate the flipper planner into a global path planner.

References

  • [1] B. Choi, W. Lee, G. Park, Y. Lee, J. Min, and S. Hong (2019) Development and control of a military rescue robot for casualty extraction task. Journal of Field Robotics 36 (4), pp. 656–676. Cited by: §I.
  • [2] F. Colas, S. Mahesh, F. Pomerleau, M. Liu, and R. Siegwart (2013) 3D path planning and execution for search and rescue ground robots. In 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 722–727. Cited by: §I.
  • [3] A. Ferworn, C. Wright, J. Tran, C. Li, and H. Choset (2012) Dog and snake marsupial cooperation for urban search and rescue deployment. In 2012 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), pp. 1–5. Cited by: §I.
  • [4] M. Gianni, F. Ferri, M. Menna, and F. Pirri (2016) Adaptive robust three-dimensional trajectory tracking for actively articulated tracked vehicles. Journal of Field Robotics 33 (7), pp. 901–930. Cited by: §I, §I.
  • [5] M. Konyo, K. Isaki, K. Hatazaki, S. Tadokoro, and F. Takemura (2008) Ciliary vibration drive mechanism for active scope cameras. Journal of Robotics and Mechatronics 20 (3), pp. 490–499. Cited by: §I.
  • [6] J. D. Martens and W. S. Newman (1994) Stabilization of a mobile robot climbing stairs. In Proceedings of the 1994 IEEE International Conference on Robotics and Automation, pp. 2501–2507. Cited by: §I.
  • [7] G. Miller (2002) Snake robots for search and rescue. Neurotechnology for Biomimetic Robots, MIT Press, Cambridge, MA, pp. 271. Cited by: §I.
  • [8] K. Nagatani, A. Yamasaki, K. Yoshida, T. Yoshida, and E. Koyanagi (2008) Semi-autonomous traversal on uneven terrain for a tracked vehicle using autonomous control of active flippers. In 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2667–2672. Cited by: §I.
  • [9] K. Ohno, S. Morimura, S. Tadokoro, E. Koyanagi, and T. Yoshida (2007) Semi-autonomous control system of rescue crawler robot having flippers for getting over unknown-steps. In 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3012–3018. Cited by: §I.
  • [10] Y. Okada, K. Nagatani, K. Yoshida, S. Tadokoro, T. Yoshida, and E. Koyanagi (2011) Shared autonomy system for tracked vehicles on rough terrain based on continuous three-dimensional terrain scanning. Journal of Field Robotics 28 (6), pp. 875–893. Cited by: §I, §I.
  • [11] K. Pathak, A. Birk, S. Schwertfeger, I. Delchef, and S. Markov (2007) Fully autonomous operations of a jacobs rugbot in the robocup rescue robot league 2006. In 2007 IEEE International Workshop on Safety, Security and Rescue Robotics, pp. 1–6. Cited by: §I.
  • [12] M. Pecka, V. Šalanskỳ, K. Zimmermann, and T. Svoboda (2016) Autonomous flipper control with safety constraints. In 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2889–2894. Cited by: §I.
  • [13] M. Pecka, K. Zimmermann, M. Reinstein, and T. Svoboda (2016) Controlling robot morphology from incomplete measurements. IEEE Transactions on Industrial Electronics 64 (2), pp. 1773–1782. Cited by: §I, §I.
  • [14] E. Rohmer, K. Ohno, T. Yoshida, K. Nagatani, E. Konayagi, and S. Tadokoro (2010) Integration of a sub-crawlers’ autonomous control in quince highly mobile rescue robot. In 2010 IEEE/SICE International Symposium on System Integration, pp. 78–83. Cited by: §I, §I.
  • [15] R. Sheh, B. Hengst, and C. Sammut (2011) Behavioural cloning for driving robots over rough terrain. In 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 732–737. Cited by: §I.
  • [16] R. Sheh, S. Schwertfeger, and A. Visser (2016) 16 years of robocup rescue. KI-Künstliche Intelligenz 30 (3-4), pp. 267–277. Cited by: §I.
  • [17] M. Sokolov, I. Afanasyev, A. Klimchik, and N. Mavridis (2017) HyperNEAT-based flipper control for a crawler robot motion in 3d simulation environment. In 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 2652–2656. Cited by: §I.
  • [18] S. Steplight, G. Egnal, S. Jung, D. B. Walker, C. J. Taylor, and J. P. Ostrowski (2000) A mode-based sensor fusion approach to robotic stair-climbing. In Proceedings. 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2000)(Cat. No. 00CH37113), Vol. 2, pp. 1113–1118. Cited by: §I.
  • [19] Q. Vu, B. Kim, and J. Song (2008) Autonomous stair climbing algorithm for a small four-tracked robot. In 2008 International Conference on Control, Automation and Systems, pp. 2356–2360. Cited by: §I.
  • [20] Y. Yuan and S. Schwertfeger (2019) Incrementally building topology graphs via distance maps. In 2019 IEEE International Conference on Real-time Computing and Robotics (RCAR), pp. . Cited by: §III-B2.
  • [21] Y. Yuan, L. Wang, and S. Schwertfeger (2019) Configuration-space flipper planning for rescue robots. In 2019 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), pp. . Cited by: §I, §III-A, §III-B1, §III-C, §III-E, §III, §IV-B.
  • [22] K. Zimmermann, P. Zuzanek, M. Reinstein, and V. Hlavac (2014) Adaptive traversability of unknown complex terrain with obstacles for mobile robots. In 2014 IEEE international conference on robotics and automation (ICRA), pp. 5177–5182. Cited by: §I, §I, §I.