SlingDrone: Mixed Reality System for Pointing and Interaction Using a Single Drone

11/12/2019 ∙ Evgeny Tsykunov et al. ∙ Brigham Young University and Skoltech

We propose SlingDrone, a novel Mixed Reality interaction paradigm that uses a micro-quadrotor as both a pointing controller and an interactive robot with a slingshot motion type. The drone attempts to hover at a given position while the human pulls it in the desired direction using a hand grip and a leash. A virtual trajectory is defined based on this displacement. To allow for intuitive and simple control, we use virtual reality (VR) technology to trace the path of the drone based on the displacement input. The user receives force feedback propagated through the leash. Force feedback from SlingDrone, coupled with the trajectory visualized in VR, creates an intuitive and user-friendly pointing device. When the drone is released, it follows the trajectory that was shown in VR. An onboard payload (e.g., a magnetic gripper) enables various scenarios of real interaction with the surroundings, e.g., manipulation or sensing. Unlike the HTC Vive controller, SlingDrone does not require handheld devices, so it can be used as a standalone pointing technology in VR.


1. Introduction

Three-dimensional human-computer interaction (HCI) is a rich field which is under active development in the robotics and VR/AR communities.

Pointing during HCI often happens via controllers such as joysticks or various wearable or handheld devices (Billinghurst and Starner, 1999; Howard and Howard, ). For instance, the authors of (Speicher et al., 2018) used HTC Vive controllers to select keys on a virtual keyboard. In (Yuan et al., 2019), a human-assisted quadcopter navigation system was proposed in which the user guides the robot through eye-tracker glasses. Wearable devices are widely used to control remote (Tsykunov et al., 2019) or virtual vehicles (Labazanova et al., 2019). Although such devices provide rich functionality, the user has to hold or wear them during the whole interaction.

An alternative approach is gesture-based interaction, which is widely used in VR/AR. (Valentini and Pezzuti, 2016) showed that the Leap Motion can achieve suitable hand-tracking accuracy in some operational zones. The drawback is that finger pointing lets the user define a direction, but not a magnitude. Nielsen et al. (Nielsen et al., 2004) stated that performing certain gestures, or gesturing for a long time, can be stressful. In addition, these approaches usually require a camera setup and provide no haptic or tactile feedback.

Mixed Reality (MR) plays an important role in creating new interaction methods that combine VR and robotics. Hönig et al. (Hönig et al., 2015) used MR to simulate larger quadrotors with small ones that followed a virtual human. In this way, the authors tested human-robot interaction algorithms safely while preserving real flight dynamics. The authors of (Freund and Rossmann, 1999) used MR for industrial robot teleoperation: a human wore a tracked glove, and motions from VR were translated to real manipulators.

In contrast with the discussed works, we introduce a novel MR concept that uses a single drone for three-dimensional virtual pointing and interaction. The drone acts as a slingshot and a projectile at the same time. SlingDrone is a selection device that is first used to point at an object and then performs the defined maneuvers to interact with the environment.

The drone has a hand grip hanging from its bottom. By default, the drone hovers at a fixed position, and the human pulls it in some direction using the hand grip, see Fig. 1(a). The physical, three-dimensional displacement of the drone is mapped to the virtual pointing and visualized in VR (an example of pointing is shown in Fig. 1(b)). A force feedback system with a leash (Fig. 1(a)) gives the user a better understanding of the interaction. After the pointing phase, the drone performs the defined maneuvers to interact with the surroundings.

In contrast to the GridDrones project (Braley et al., 2018), where the operator grabs one drone to control the motion of the other agents, the interaction with SlingDrone is inherently smooth, (i) due to the elastic hardware element (the leash) in the interface and (ii) because the leash is connected to the center of the drone, inducing zero angular momentum.

The main novelty of SlingDrone is that we propose to design and perform maneuvers using a single drone, without any additional handheld or wearable control device. For the proof of concept, we tracked the user's hand (to show it in VR and facilitate grabbing the holder) with hand-mounted infrared reflective markers, but these could be replaced with a Leap Motion attached to the HMD. Alternatively, the user could rely purely on the haptic sensation from the drone's airflow to locate the holder. Since SlingDrone is a flying robot, one of its main advantages is that it can fly to a location that is convenient for the human to perform pointing.

2. SlingDrone Technology

A holder allows a human to grab and pull the SlingDrone in any horizontal direction and downwards (but not upwards, which limits the operational zone). The drone tries to maintain its desired spatial position $\mathbf{p}_{des}$ while the operator pulls the hand grip, which causes a slight change in position. The state of the SlingDrone, including the current position $\mathbf{p}$, is estimated with onboard sensors and a motion capture system. The real displacement vector is defined as $\mathbf{D} = \mathbf{p} - \mathbf{p}_{des}$ and is connected to the virtual pointing.

The relation between the real displacement vector and the virtual pointing can have multiple implementations, incorporating different three-dimensional interaction techniques. For instance, SlingDrone allows the user to select an object that the hand cannot reach. We propose to generate a virtual trajectory that starts from the drone position and is defined using the displacement vector $\mathbf{D}$. Since $\mathbf{D}$ is three-dimensional, it can be mapped to a point in three-dimensional space.

A force $\mathbf{F}$ propagates through the leash from the drone to the human hand. Based on the slingshot analogy, we assume that this force feedback helps to improve the interaction and enables more accurate pointing.

After the pointing phase, the drone switches to the projectile mode and can fly to perform different kinds of 3D interaction with the real or virtual environment.

When the drone experiences extreme values of its state parameters (e.g., pitch or roll) during pointing, the motors shut down. This functionality is introduced as an emergency stop to ensure the safety of the operator.

2.1. Pointing During the Slingshot Mode

Pointing, or object selection, can be defined in multiple ways. One of the most straightforward solutions is to scale the displacement vector $\mathbf{D}$ by a coefficient $k$ (usually $k < 0$), so that the vector $k\mathbf{D}$ defines a point in mid-air (the negative sign can also be omitted, but here it helps to replicate a slingshot: we pull in one direction and the projectile flies in the opposite one). However, since the SlingDrone technology replicates the operation of a slingshot, we propose to design a ballistic trajectory instead of a straight line. The ballistic trajectory better replicates the flight of a projectile after the slingshot is released. The intersection of the ballistic trajectory with a surface, line, or object can then define a point in three-dimensional space.
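As a concrete illustration of the straight-line variant, the following minimal Python sketch scales the pull vector and intersects the resulting ray with a horizontal floor plane; the coefficient value, the function name, and the floor-intersection choice are illustrative assumptions, not from the paper.

```python
import numpy as np

def straight_line_target(p_des, D, k=-8.0, floor_z=0.0):
    """Scale the pull vector D by a negative coefficient (slingshot reversal)
    and intersect the resulting ray from the hover point with the floor."""
    p_des = np.asarray(p_des, dtype=float)
    direction = k * np.asarray(D, dtype=float)
    if direction[2] >= 0.0:
        # Ray points upwards or horizontally: no floor intersection,
        # so fall back to the mid-air point k*D away from the hover position.
        return p_des + direction
    t = (floor_z - p_des[2]) / direction[2]
    return p_des + t * direction
```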

The proposed ballistic trajectory is generated by an air-drag model defined by a system of second-order, nonlinear differential equations:

$$m\,\ddot{\mathbf{p}} = m\,\mathbf{g} + \mathbf{F}_d, \qquad (1)$$

$$\mathbf{F}_d = -\tfrac{1}{2}\,\rho\, C_d\, A\, \|\dot{\mathbf{p}}\|\,\dot{\mathbf{p}}, \qquad (2)$$

where $\rho$ is the air density, $C_d$ is the coefficient of drag, $A$ is the frontal area of the drone in the respective direction, and $m$ is the mass. The air-drag model was used to generate an intuitive trajectory of free ballistic flight.

Equations (1)–(2) are solved using a numerical ODE solver with six initial conditions: the three-dimensional vector $\mathbf{p}_{des}$ as the initial position and the three-dimensional displacement vector $\mathbf{D}$ as a proxy for the initial velocity. $\mathbf{D}$ can be scaled by a coefficient to achieve a desired distance scale for the trajectory; a fixed value of this coefficient was used in our implementation. The three position components of the solution are extracted to form the position vector that defines the trajectory. To command the drone to execute the trajectory, polynomial coefficients were generated as in (Richter et al., 2016).

The coefficients of equations (1) and (2) were selected to model a smooth sphere on a ballistic trajectory. The air density $\rho$ is 1.23 kg/m³, the coefficient of drag $C_d$ is held constant at 0.4, the frontal area $A$ is likewise held constant, and the mass of the virtual body is 10 kg. Like the trajectory itself, these values can be modified to suit specific use cases.
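A minimal sketch of this trajectory generation, assuming SciPy's solve_ivp as the numerical ODE solver; the frontal area, the scaling coefficient k, the flight duration, and the sample count are assumed values (only ρ = 1.23 kg/m³, C_d = 0.4, and m = 10 kg come from the text).

```python
import numpy as np
from scipy.integrate import solve_ivp

RHO = 1.23    # air density [kg/m^3] (from the text)
C_D = 0.4     # drag coefficient of a smooth sphere (from the text)
AREA = 0.05   # frontal area [m^2] -- assumed, value not given in the text
MASS = 10.0   # mass of the virtual body [kg] (from the text)
G = np.array([0.0, 0.0, -9.81])  # gravity [m/s^2]

def ballistic_ode(t, state):
    """State = [x, y, z, vx, vy, vz]; quadratic air drag opposes velocity."""
    vel = state[3:]
    drag = -0.5 * RHO * C_D * AREA * np.linalg.norm(vel) * vel / MASS
    return np.concatenate([vel, G + drag])

def generate_trajectory(p_des, D, k=-8.0, duration=3.0, n_samples=90):
    """Hover setpoint p_des is the initial position; the scaled pull
    vector k*D acts as a proxy for the initial velocity."""
    y0 = np.concatenate([np.asarray(p_des, float), k * np.asarray(D, float)])
    sol = solve_ivp(ballistic_ode, (0.0, duration), y0,
                    t_eval=np.linspace(0.0, duration, n_samples))
    return sol.y[:3].T  # (n_samples, 3) positions along the trajectory
```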

2.2. Transition between the Slingshot and the Projectile Modes

While hovering without human interaction, the drone stabilizes itself and experiences only small displacements. All displacements under a certain threshold $D_{thr}$ are not considered as inputs. During pointing, the velocity of the drone displacement is small, because small displacements cause large changes in the trajectory (governed by the scaling coefficient described above). In addition, aiming is almost always performed slowly by the user. Therefore, the drone maintains a small velocity, mostly under a threshold $v_{thr}$. Based on this, we assume that while the following conditions hold,

$$\|\mathbf{D}\| > D_{thr} \quad \text{and} \quad \|\dot{\mathbf{p}}\| < v_{thr}, \qquad (3)$$

the drone is in the slingshot mode. We update the trajectory in every loop while (3) is true. When the user releases the holder, the drone accelerates towards the default hover position and its velocity exceeds $v_{thr}$ (the trajectory is no longer updated). When the drone approaches the hover position $\mathbf{p}_{des}$, it becomes a projectile and starts to follow the last valid trajectory towards the defined point in space to interact.
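The transition logic can be summarized as a small state machine. This is a hedged sketch under assumed threshold values; D_THR, V_THR, and the generate_trajectory helper from the previous sketch are illustrative, not from the paper.

```python
import numpy as np

D_THR = 0.05  # displacement threshold [m]  -- assumed value
V_THR = 0.30  # velocity threshold [m/s]    -- assumed value

class SlingshotStateMachine:
    """Tracks slingshot vs. projectile mode per the conditions in (3)."""

    def __init__(self):
        self.last_trajectory = None

    def update(self, p, p_des, v):
        D = np.asarray(p, float) - np.asarray(p_des, float)
        pulled = np.linalg.norm(D) > D_THR
        slow = np.linalg.norm(v) < V_THR
        if pulled and slow:
            # Slingshot mode: regenerate the trajectory every control loop.
            self.last_trajectory = generate_trajectory(p_des, D)
            return "slingshot"
        if not pulled and self.last_trajectory is not None:
            # Holder released and drone back near the hover point:
            # fly the last valid trajectory as a projectile.
            return "projectile"
        return "hover"  # trajectory frozen while the drone returns
```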

2.3. Human-SlingDrone Interaction Strategy

First, the SlingDrone approaches the human and starts to hover, entering the slingshot mode (Fig. 1(a)). The user estimates the position of the drone using visual feedback from VR coupled with the haptic sensation from the airflow below the quadrotor, which helps to locate and catch the hand grip. After grabbing the holder, the human starts to pull it. When the magnitude of the current displacement vector $\|\mathbf{D}\|$ exceeds the threshold, the trajectory is visualized in VR. Now the human can aim the SlingDrone by pulling the holder. When the user is satisfied with the pointing or selection, he or she releases the holder. The drone enters the projectile mode (Fig. 1(c)) and performs the maneuvers and the interaction with the surroundings.

2.4. Implementation

The SlingDrone itself, the user's hand, and the object of interest are real objects; they are tracked by a motion capture system and visualized in the virtual scene.

2.4.1. Aerial Platform

We used a Crazyflie 2.0 quadrotor for the verification flight tests. To achieve high-quality tracking of the quadrotor during the experiments, we used a Vicon motion capture system with 12 cameras (Vantage V5) covering the flight space. We used the Robot Operating System (ROS) Kinetic framework to run the developed software, along with the ROS stack (Preiss et al., 2017; Hönig et al., 2015) for the Crazyflie 2.0. The position and attitude update rate was 100 Hz. Before verifying the proposed approach, we ensured that we could perform a stable and smooth flight following a desired trajectory. To this end, all PID coefficients of the position controller were set to the default values for the Crazyflie 2.0, according to (Preiss et al., 2017) (x- and y-axes: 0.4, 0.2, and 0.05; z-axis: 1.25, 0.4, and 0.05). Finally, to achieve a stable displacement of the SlingDrone during the interaction, we set all positional integral terms of the PID controller to zero.
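As an illustration, zeroing the positional integral gains could look like the following cflib sketch; the radio URI is a placeholder, and the posCtlPid parameter names are assumed from the standard Crazyflie firmware, so they may differ across firmware versions.

```python
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M'  # placeholder radio address

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie()) as scf:
    # Zero the positional integral terms so the drone holds a steady
    # offset (rather than winding up) while the user pulls the leash.
    for axis in ('x', 'y', 'z'):
        scf.cf.param.set_value(f'posCtlPid.{axis}Ki', '0')
```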

An elastic wire (100 mm in length) is connected to the bottom of the SlingDrone, as shown in Fig. 1(a), with a holder at its end.

2.4.2. VR Development

A Virtual Reality application was created in Unity3D to provide the user with an immersive experience. Models of the quadcopter, the human hand, and the remote object were simulated in VR. The size of each virtual object changes with the distance between the operator and the object, as in real life. Vicon motion capture cameras were used to transfer the position and rotation tracking data of each object to the Unity3D engine. The operator wears an HTC Vive headset to experience the VR application.

The virtual trajectory is updated 30 times per second, so the human receives visual feedback in VR about the potential flight path (to avoid obstacles) and about the destination point.
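A hedged sketch of this 30 Hz update loop as a ROS (Kinetic-era rospy) node that republishes the latest generated path for visualization; the topic name, frame id, and message type are illustrative choices, not from the paper.

```python
import rospy
from geometry_msgs.msg import PoseStamped
from nav_msgs.msg import Path

def publish_trajectory_loop(get_current_trajectory):
    """Republish the latest trajectory at 30 Hz for the VR visualization."""
    rospy.init_node('slingdrone_trajectory_viz')
    pub = rospy.Publisher('/slingdrone/trajectory', Path, queue_size=1)
    rate = rospy.Rate(30)  # 30 updates per second, as in the text
    while not rospy.is_shutdown():
        path = Path()
        path.header.stamp = rospy.Time.now()
        path.header.frame_id = 'world'
        for x, y, z in get_current_trajectory():  # iterable of (x, y, z)
            pose = PoseStamped()
            pose.header = path.header
            pose.pose.position.x = x
            pose.pose.position.y = y
            pose.pose.position.z = z
            path.poses.append(pose)
        pub.publish(path)
        rate.sleep()
```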

Figure 2. Workflow of the user study experiment: (a) no interaction between the human and the drone; (b) the user points towards the object of interest; (c) a pink ball represents the projectile.

2.5. Application: Grabbing a Remote Object

The goal of the proposed application is to grab a remote object using a magnetic gripper mounted on the Crazyflie 2.0 drone. As the hand grip for the user, we used a magnet, which also serves to attract the remote object of interest.

During pointing, when the end point of the generated trajectory hits the remote object to be picked up, the user releases the holder. The drone enters the projectile mode (Fig. 1(c)) and starts to follow the defined trajectory, representing the slowed flight of a passive projectile. When the quadrotor approaches the end of the trajectory, it searches for the object of interest by moving in a square with a 0.15 m side, to increase the chances of grabbing it. After that, the drone flies back to the user to perform the delivery, and the user experiences a tangible sensation of the object of interest.
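The square search at the trajectory end point can be expressed as a short waypoint generator; a minimal sketch, assuming the square is axis-aligned and flown at the arrival height (only the 0.15 m side length comes from the text).

```python
def square_search_waypoints(center, side=0.15):
    """Corner waypoints of an axis-aligned square search pattern
    (0.15 m side) around the end point of the trajectory."""
    cx, cy, cz = center
    h = side / 2.0
    return [
        (cx - h, cy - h, cz),
        (cx + h, cy - h, cz),
        (cx + h, cy + h, cz),
        (cx - h, cy + h, cz),
        (cx - h, cy - h, cz),  # close the loop at the first corner
    ]
```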

3. User Study

Seven right-handed users (22 to 28 years old, six male and one female) took part in the experiments. The protocol of the experiment was approved by the Skolkovo Institute of Science and Technology review board, and all participants gave informed consent.

3.1. Experimental Methods

As described above, the user's hand, the drone, and the object of interest are visualized in VR. The drone hovered near the human at a height of 1.5 m, as shown in Fig. 2(a). For the experiment, we decided to focus on the pointing functionality of SlingDrone purely in VR. Therefore, the drone was kept in the slingshot mode for the whole experiment: after the user performed the aiming (with the visualized trajectory shown in Fig. 2(b)) and released the hand grip, the drone did not transition to the projectile mode but simply moved back to the default hover position. Then a virtual ball appeared, became the projectile, and flew along the desired trajectory in the virtual scene (Fig. 2(c)). All subjects were asked to evaluate the pointing capabilities of the SlingDrone technology for 5 minutes. Pointing is only one part of the idea behind SlingDrone (the other is the physical flight for interaction), so the experimental results cannot be extrapolated to the whole technology.

3.2. Experimental Results and Discussion

All participants responded positively about the convenience of the device. After the experiment, we asked the subjects to answer a questionnaire of 8 questions using bipolar Likert-type seven-point scales. The results are presented in Table 1.

All participants reported that it was easy to learn how to use the SlingDrone (in fact, it took 10–30 seconds to understand the workflow), which is supported by question 1 in Table 1. According to the users, it was slightly hard to grab the holder, since it is not visualized in VR; in future work, the holder will be visualized. For the trajectory representation, we used small white balls thrown from the position of the drone. Therefore, when a user changed the pulling direction, the trajectory did not change instantly, since it takes some time for the balls to complete their flight. The users rated this fact negatively. In the future, we plan to replace the balls with a dashed-line trajectory that can change instantly. Surprisingly, despite the small size of the quadrotor, most users reported that they felt some force feedback from the drone, which provides additional information about the magnitude of the input.

Question 5 shows that the users were satisfied with the actual destination of the virtual projectile. Moreover, pulling a drone resembles operating a slingshot, which can achieve acceptable accuracy in certain cases. We therefore hypothesize that SlingDrone could potentially achieve high pointing accuracy. We did not measure the accuracy of pointing during the experiment; this will be done in future work.

 #  Statement                                                                Mean  SD
 1  It was easy to learn how to use SlingDrone.                              6.4   0.49
 2  It was easy to grab and keep the holder of the drone.                    5.4   1.02
 3  The visualized trajectory in VR clearly represented the flight path.     5.6   0.80
 4  I felt the force response from the drone when pulling the rope.          5.8   1.47
 5  I was satisfied with the actual destination of the virtual projectile.   5.2   0.74
 6  I was tired at the end of the experiment.                                1.4   0.48
 7  I felt comfortable when I used SlingDrone.                               6.0   0.89
 8  I felt safe when I used SlingDrone.                                      7.0   0.00

Table 1. Evaluation of the participants' experience. Volunteers evaluated these statements, presented in random order, using a 7-point Likert scale (1 = completely disagree, 7 = completely agree). Means and standard deviations are presented.

4. Conclusion and Future Work

In this paper, we presented SlingDrone, a novel interaction technology that combines the advantages of real and virtual environments. One of its core features is that SlingDrone provides a powerful and easy-to-use 3D pointing technology. SlingDrone can therefore also be used to point for other robots: the intersection between the virtual trajectory and the floor can serve as a desired position for a ground-based robot such as a car.

As a prospective development, we plan to introduce a SlingDrone placement mode, in which the user moves the SlingDrone through space to the most suitable location by pulling the rope. This can be done as a preliminary setup step before the actual pointing.

To provide deeper immersion in the pointing process, a vibromotor can be used instead of the magnetic hand grip. The vibromotor would be connected to the drone via a coiled elastic wire and powered and controlled from the onboard electronics. Different tactile patterns could deliver additional information about the potential interaction with the surroundings. For selection applications, however, a push button may be more suitable than a vibromotor.

In future work, we plan to conduct a user study to evaluate participants' experience of using SlingDrone and to estimate the accuracy and precision of pointing and flight. The SlingDrone platform can also be implemented in Augmented Reality (AR), which would provide more direct information about the trajectory to the goal and additional safety.

References

  • M. Billinghurst and T. Starner (1999) Wearable devices: new ways to manage information. Computer 32 (1), pp. 57–64.
  • S. Braley, C. Rubens, T. R. Merritt, and R. Vertegaal (2018) GridDrones. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18).
  • E. Freund and J. Rossmann (1999) Projective virtual reality: bridging the gap between virtual reality and robotics. IEEE Transactions on Robotics and Automation 15 (3), pp. 411–422.
  • W. Hönig, C. Milanes, L. Scaria, T. Phan, M. Bolas, and N. Ayanian (2015) Mixed reality for robotics. In IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems, pp. 5382–5387.
  • B. Howard and S. Howard. Lightglove: wrist-worn virtual typing and pointing. In Proceedings of the Fifth International Symposium on Wearable Computers.
  • L. Labazanova, A. Tleugazy, E. Tsykunov, and D. Tsetserukou (2019) SwarmGlove: a wearable tactile device for navigation of swarm of drones in VR environment. In Lecture Notes in Electrical Engineering, pp. 304–309.
  • M. Nielsen, M. Störring, T. B. Moeslund, and E. Granum (2004) A procedure for developing intuitive and ergonomic gesture interfaces for HCI. In Gesture-Based Communication in Human-Computer Interaction, pp. 409–420.
  • J. A. Preiss, W. Hönig, G. S. Sukhatme, and N. Ayanian (2017) Crazyswarm: a large nano-quadcopter swarm. In IEEE International Conference on Robotics and Automation (ICRA).
  • C. Richter, A. Bry, and N. Roy (2016) Polynomial trajectory planning for aggressive quadrotor flight in dense indoor environments. In Robotics Research, pp. 649–666.
  • M. Speicher, A. M. Feit, P. Ziegler, and A. Krüger (2018) Selection-based text entry in virtual reality. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18).
  • E. Tsykunov, R. Agishev, R. Ibrahimov, L. Labazanova, A. Tleugazy, and D. Tsetserukou (2019) SwarmTouch: guiding a swarm of micro-quadrotors with impedance control using a wearable tactile interface. IEEE Transactions on Haptics 12 (3), pp. 363–374.
  • P. P. Valentini and E. Pezzuti (2016) Accuracy in fingertip tracking using leap motion controller for interactive virtual applications. International Journal on Interactive Design and Manufacturing (IJIDeM) 11 (3), pp. 641–650.
  • L. Yuan, C. Reardon, G. Warnell, and G. Loianno (2019) Human gaze-driven spatial tasking of an autonomous MAV. IEEE Robotics and Automation Letters 4 (2), pp. 1343–1350.