Companion Unmanned Aerial Vehicles: A Survey

01/14/2020
by   Chun Fui Liew, et al.
The University of Tokyo

Recent technological advancements in small-scale unmanned aerial vehicles (UAVs) have led to the development of companion UAVs. Similar to conventional companion robots, companion UAVs have the potential to assist us in our daily lives and to help alleviate social isolation and loneliness. In contrast to ground companion robots, companion UAVs can fly and therefore possess unique interaction characteristics. Our goals in this work are to provide a bird's-eye view of companion UAV research and to identify lessons learned and guidelines for the design of companion UAVs. We tackle two major challenges towards these goals: we first use a coordinated way to gather top-quality human-drone interaction (HDI) papers from three sources, and then propose a perceptual map of UAVs to summarize current research efforts in HDI. While simple, the proposed perceptual map comprehensively covers the efforts that have been made to realize companion UAVs and guides our discussion coherently. We also discuss patterns we noticed in the literature and some lessons learned throughout the review. In addition, we recommend several areas that are worth exploring and suggest a few guidelines to enhance HDI research with companion UAVs.

1 Introduction

Companion robots are robots designed to have social interaction and emotional connection with people. For example, companion robots such as Paro (a therapeutic robotic seal for the elderly) Wada and Shibata (2007), EmotiRob (a companion robotic bear for children) Tallec et al. (2011), Aibo (a companion robotic dog) Kertész and Turunen (2019), and Jibo (a companion home robot) Jibo (2017) can interact socially with people. One potential application of companion robots is to serve as personal assistants. One could also argue that companion robots are similar to pets and help to alleviate social isolation or loneliness.

Recently, technological advancements have led to a new class of companion robots—the companion unmanned aerial vehicle (UAV). Compared to conventional companion robots, companion UAVs have some distinctive characteristics—notably their capability to fly (and hence new interaction capabilities) and their strict design constraints such as safety concerns, noise, and flight time and payload limitations. While many technical difficulties and social design questions remain, various concepts, design sketches, and prototypes have been proposed to demonstrate the idea of a companion UAV, including a flying jogging companion Graether and Mueller (2012), a flying assistant that could interact with us in daily tasks Vink et al. (2014), a human-centered drone designed to fly in crowded environments Yeh et al. (2017), a flying smart agent that could assist users through active physical participation Agrawal et al. (2015), flying “fairies” Murphy et al. (2011); Duncan et al. (2010) and flying lampshades SPARKED (2016a, b) that could dance with humans on stage, a flying ball for augmented sports Nitta et al. (2014), a flying humanoid balloon to accompany children Cooney et al. (2012), a companion UAV that can react to human emotions Malliaraki (2017), and a moving projector platform for street games Kljun et al. (2015).

Our goals in this survey are to provide a bird’s-eye view of companion UAV research and to identify lessons learned, guidelines, and best practices for the design of companion UAVs. There are two major challenges towards these goals: (i) to find a coordinated way to identify top-quality human-drone interaction (HDI) works from the huge amount of available literature, and (ii) to find a suitable framework or organizing principle to discuss the vast aspects of existing HDI works.

To tackle the first challenge, i.e., to cover top-quality companion UAV works as comprehensively as possible in this review, we use a coordinated way to gather HDI papers from three major sources. First, we systematically identified 1,973 high-quality UAV papers from more than forty-eight thousand papers that have appeared in the top robotics journals and conferences since 2001. In a nutshell, this identification process involves both automated and manual steps (more details in Section 3). Second, from the identified UAV papers, we tagged the papers with several key topics, studied those related to human-robot interaction (HRI) and analyzed their references, and continued to track down HDI-related papers. Third, we included HDI papers recommended by reviewers during our past journal submission.

To tackle the second challenge, i.e., to find a suitable framework to discuss the vast aspects of existing HDI works, we propose a perceptual map of UAVs (more details in Section 4) as a high-level framework to organize current research efforts in HDI. In the proposed perceptual map, we categorize UAVs based on the degree of autonomy and the degree of sociability. This basic categorization leads to four distinct categories, namely the remotely-controlled UAV, the autonomous UAV, the social UAV, and the companion UAV. Looking at the research and development of companion UAVs through this perceptual map, we can identify two main directions of ongoing effort. Moreover, we find this perceptual map easy to understand, and it guides our discussion coherently.

This work emphasizes the proximate interaction between a human and a companion UAV in the HDI field. Note that proximate interaction is also called collocated interaction in some literature Wojciechowska et al. (2019); Hedayati et al. (2018). In the following sections, we first briefly explain the definition of a UAV and the different types of UAVs (Section 2) in order to facilitate the discussion in this work. Next, we describe the methodology we used to identify top-quality UAV papers from the literature (Section 3). Then, we discuss the perceptual map of UAVs (Section 4), followed by discussions of research efforts towards realizing companion UAVs from the engineering (Section 5) and sociability (Section 6) perspectives. In Section 7 and Section 8, we discuss several observations and lessons learned, along with guidelines and recommendations for realizing a companion UAV. Section 9 concludes with future research directions for companion UAVs.

2 UAV Background

We first explain the UAV definition and introduce some common UAV types to facilitate the following discussion. We recommend the handbook of UAVs Valavanis and Vachtsevanos (2014) to readers interested in more technical details of UAVs.

2.1 UAV Definition

UAVs, commonly known as drones, are aircraft that can perform flight missions without a human pilot onboard FAA (2017). In general, UAVs can be viewed as flying robots. The degree of autonomy varies, but modern UAVs are often able to hover stably at a point in 3D space. UAVs with a higher degree of autonomy offer more functions such as automatic take-off and landing, path planning, and obstacle avoidance. In the literature, UAVs have several other names such as micro aerial vehicle (MAV), unmanned aerial system (UAS), vertical take-off and landing aircraft (VTOL), multicopter, rotorcraft, and aerial robot. In this work, we use the terms “UAV” and “drone” interchangeably.

2.2 UAV Types

Conventionally, UAVs are classified into fixed-wing, multirotor, blimp, or balloon types based on their flying principle. By the end of this review, one will observe that most UAV prototypes described in this work are multirotor UAVs. We speculate this is due to the market availability of multirotor UAVs. In Section 7, we will argue more rigorously that blimp or balloon UAVs could be a better form for companion UAVs.

It is worth noting that Floreano & Wood Floreano and Wood (2015) classified UAVs based on flight time and UAV mass (a simplified plot is shown in Fig. 1). In general, flapping-wing UAVs are small and have a short flight time. Blimp/balloon UAVs are lightweight and have a longer flight time. Rotor-type and fixed-wing UAVs are usually heavier. In Section 7, we discuss the safety and noise issues of different types of UAVs in more detail.

Figure 1: UAV types based on flight time and UAV mass (inspired by Floreano & Wood Floreano and Wood (2015)).

3 UAV Paper Identification Process

The UAV paper identification process involves three major steps. We first used a script to automatically collect more than forty-eight thousand titles and abstracts published since 2001 from fourteen top journal/conference web pages. The seven journals are IEEE Transactions on Robotics (TRO), IEEE/ASME Transactions on Mechatronics (TME), The International Journal of Robotics Research (IJRR), Robotics and Autonomous Systems (RAS), IEEE Robotics and Automation Letters (RA-L), ACM Journal on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), and ACM Transactions on Human-Robot Interaction (THRI); the seven conferences are the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE International Conference on Robotics and Automation (ICRA), ACM/IEEE International Conference on Human-Robot Interaction (HRI), IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), ACM Conference on Human Factors in Computing Systems (CHI), ACM International Conference on Ubiquitous Computing (UbiComp), and ACM International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI). We also manually reviewed the hard copies of the IROS and ICRA tables of contents from 2001 to 2004, as we found that not all UAV papers from those years are listed on IEEE Xplore.

Then, we designed a list of keywords (Table 1) to search for drone papers systematically in the titles and abstracts collected in the first step. Note that we searched for both the full name of each keyword (e.g., unmanned aerial vehicle) and its abbreviation (i.e., UAV) with an automatic script. The keywords include most of the words that describe a UAV. For example, the word “sUAV” (small UAV) is detected by the keyword “UAV”. Similarly, the words “quadcopter” and “quadrotor” are detected by the keywords “copter” and “rotor”. As long as one of the keywords is detected, the paper passes this automated screening.

acrobatic, aerial, aero, aeroplane, air, aircraft, airplane, airship, balloon, bat, bee, bird, blimp, copter, dragonfly, drone, flap, flapping, flight, fly, flying, glide, glider, gliding, hover, hovering, kite, rotor, rotorcraft, soar, soaring, micro aerial vehicle (MAV), unmanned aerial vehicle (UAV), unmanned aircraft system (UAS), vertical takeoff and landing (VTOL)

Table 1: The 35 keywords used to search for drone papers systematically in the collected titles and abstracts.
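
To make the screening step concrete, below is a minimal Python sketch of the keyword matching described above. The keyword subset mirrors Table 1, but the record format and function names are illustrative assumptions rather than our actual pipeline.

```python
# Minimal sketch of the automated keyword screening (step two).
KEYWORDS = [
    "uav", "uas", "mav", "vtol", "drone", "copter", "rotor", "rotorcraft",
    "blimp", "balloon", "aerial", "aircraft", "hover", "flying",
    "unmanned aerial vehicle", "unmanned aircraft system",
    "micro aerial vehicle", "vertical takeoff and landing",
]  # subset of Table 1 for brevity

def is_candidate_drone_paper(title: str, abstract: str) -> bool:
    """Return True if any keyword appears in the title or abstract.

    Substring matching is intentional: "sUAV" is caught by "uav" and
    "quadrotor" is caught by "rotor", as discussed in the text.
    """
    text = f"{title} {abstract}".lower()
    return any(keyword in text for keyword in KEYWORDS)

# Hypothetical usage on (title, abstract) records collected in step one.
records = [
    ("Vision-based landing of a quadrotor", "We present an onboard method..."),
    ("Legged locomotion over rough terrain", "We study a hexapod robot..."),
]
candidates = [r for r in records if is_candidate_drone_paper(*r)]
print(len(candidates))  # -> 1; the second record is screened out
```

Papers passing this automated filter then go through the manual screening described next.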

Finally, we performed a manual screening to reject non-drone papers. We read the abstract, section titles, related work, and experimental results of all the papers from the second step. If a paper passes all five criteria below, we consider it a drone paper for this survey. (Details on the paper category analysis can be found in our previous survey paper Liew et al. (2017); details about online sharing and regular updates can be found in Appendix A.)

  1. The paper must have more than two pages; we do not consider workshop and poster papers.

  2. The paper must have at least one page of flight-related results. These can be simulation/experiment results, prototyping/fabrication results, or insights/discussion/lessons learned. One exception is a survey/review paper, which normally does not present experimental results. Papers with details or photos of the UAV hardware are a plus. Note that the experimental results do not necessarily need to be a successful flight; e.g., flapping-wing UAVs normally have on-the-bench test results.

  3. In topics related to computer vision, the images must be collected from a UAV’s onboard camera rather than a manually moving camera.

  4. In topics related to computer vision, the images must be collected by the authors themselves. This is important, as authors who collected the dataset themselves often provide insights about their data collection and experiment results.

  5. A paper that proposes a general method, e.g., path planning, must include related work and experimental results on drones. This is important, as some authors mention that their method can be applied to a UAV but provide no experimental results to verify the claim.

It is interesting to note that using the keyword “air” in the second step increases the number of false entries (since the keyword is used in many contexts) but helps to identify some rare drone-related papers that have only the keyword “air” in the title and abstract. By manually filtering the list in the third step, we successfully identified two such drone papers Latscha et al. (2014); Butzke et al. (2015). Similarly, using the keyword “bee” helped to identify a rare drone paper Das et al. (2016). On the other hand, we chose not to use the keyword “wing” because it matches many false entries such as “following” and “knowing”.

4 Perceptual Map of UAVs

Having a framework that could cover all the related companion UAV works in both engineering and social interaction topics is challenging, as these papers have different motivations, methodologies, and results. Companion UAV works focusing on engineering usually emphasize devising new hardware designs or new autonomous functions, while companion UAV works focusing on social interaction usually emphasize participatory design and social experiments with users. To this end, we propose to categorize related works in this survey based on a perceptual map spanned by the degree of autonomy (corresponding to engineering works) and the degree of sociability (corresponding to social interaction works) (Fig. 2).

4.1 The Four UAV Categories

Figure 2: Perceptual map for UAVs based on the degree of autonomy and sociability, with the major topics found in the literature.

The perceptual map of UAVs has four categories: the remote-controlled UAV, the autonomous UAV, the social UAV, and the companion UAV. Traditionally, UAVs are controlled manually by human operators and have low degrees of autonomy and sociability. Gradually, along the vertical axis (degree of autonomy), researchers have been improving the autonomy aspects of UAVs, such as better reactive control with more sensors and better path planning algorithms. Essentially, autonomous UAVs are less dependent on human operators and are able to perform simple flight tasks autonomously.

At the same time, along the horizontal axis (degree of sociability), researchers have been improving the social aspects of UAVs, such as designing UAV movements that are more comfortable for humans or building intuitive interfaces that let us better understand a UAV’s attention. Most HRI researchers focus on the social aspects of UAVs and usually perform user studies using Wizard-of-Oz experiments (a common experimental setting in which participants interact with a robot that they believe to be autonomous, but which is in fact manually controlled by a human behind the scenes). Unlike an autonomous UAV, whose main purpose is to achieve a task efficiently from an engineering point of view, a social UAV aims to work with humans harmoniously, i.e., to ease user acceptance and relieve the user’s cognitive burden. For example, a “social” fire-fighting drone might need a design that makes nearby humans understand its fire-fighting purpose during an emergency Khan and Neustaedter (2019a).

We first coined the phrase “companion UAV” and consider a companion UAV to be one that possesses high degrees of both autonomy and sociability Liew (2016). In addition to autonomy skills, such as motion planning and obstacle avoidance, companion UAVs must also feature sociability skills such as making users feel safe and letting them understand the UAV’s intentions. It is worth noting that the term “companion UAV” used in a prior work Hrabia et al. (2017) has a different meaning, where the UAV was designed to support a ground robot rather than to interact with a person.

To consolidate the idea of the perceptual map, we can use a package-delivery drone as an example. In the autonomous UAV sense, the package-delivery drone focuses on accomplishing the delivery task from the engineering perspective. Note that this alone is a challenging task, as one needs to figure out how to perform the flight efficiently, how to detect the landing location, how to avoid obstacles during the flight, etc. On the other hand, in the social UAV sense, the package-delivery drone should also acknowledge people for successful interactions, e.g., signaling that it has seen a person within a certain distance and time window. This design aspect has also been raised and investigated recently by Jensen et al. Jensen et al. (2018).
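
To state the four quadrants compactly, the small sketch below maps numeric scores for autonomy and sociability onto the categories of the perceptual map; normalizing both axes to [0, 1] and splitting them at 0.5 is our illustrative convention, not a threshold from the literature.

```python
def uav_category(autonomy: float, sociability: float) -> str:
    """Classify a UAV on the perceptual map.

    Both scores are assumed to be normalized to [0, 1]; 0.5 marks the
    midpoint between the low and high halves of each axis.
    """
    if autonomy >= 0.5 and sociability >= 0.5:
        return "companion UAV"
    if autonomy >= 0.5:
        return "autonomous UAV"
    if sociability >= 0.5:
        return "social UAV"
    return "remote-controlled UAV"
```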

4.2 Link with Previous Study

It should be noted that “social UAV” has been mentioned frequently in past literature. One representative example is a survey paper on social drones Baytaş et al. (2019). In that study, Baytaş et al. define a “social drone” as an autonomous flyer operating near human users. Literally, this definition is similar to the definition of “companion UAV” here, but upon careful investigation, their notion of “social drone” is closer to the meaning of “social UAV” in our perceptual map. Most papers considered by Baytaş et al. were not truly autonomous, i.e., a drone with pre-programmed behaviors and motions is considered autonomous by them. In our opinion, their categorization is not precise enough; e.g., while the teleoperated drone of Jones et al. (2016) is categorized as a “social drone” by them, we consider that drone a remotely-controlled drone in our context, as it is neither autonomous nor social. They also included poster and short papers in their review, but it is unclear how they categorized short and poster papers that lack implementation details. In contrast, our work covers more papers in a more coordinated and systematic way.

4.3 Research Efforts Towards Companion UAVs

Designing companion UAVs is challenging as it involves both technical/engineering issues and social/emotional design questions. We believe that, for this reason, most UAV works identified in this survey focus on a single issue, either an engineering or a social one, rather than ambitiously tackling both in one paper. As shown in Fig. 2, there are two main efforts towards realizing companion UAVs in the literature: the first moves from the remote-controlled UAV towards the autonomous UAV (blue arrow), and the second moves from the remote-controlled UAV towards the social UAV (red arrow).

The blue arrow in Fig. 2 signifies the efforts of robotics developers in realizing companion UAVs. In the literature, these efforts have a distinctive feature: the authors include engineering details, be it about the UAV hardware, control algorithms, or visual tracking methods. From the companion UAV works identified in Section 3, the topics of human-following UAVs and user interfaces clearly emerge in this area. In Section 5, we discuss these sub-topics in more detail.

The red arrow in Fig. 2 signifies the efforts of HDI researchers in realizing companion UAVs. In the literature, these efforts have a distinctive feature: the authors performed Wizard-of-Oz experiments or carried out online surveys using HDI videos. From the companion UAV works identified in Section 3, the topics of social perception of UAVs, emotion and intention expression of UAVs, and gesture and physical interaction with UAVs clearly emerge in this area. In Section 6, we discuss these sub-topics in more detail.

5 From Remote-Controlled to Autonomous UAVs

Developing companion UAVs that are autonomous and sociable is not a straightforward task, and most companion UAV works focus on a single topic. In this section, we summarize the engineering efforts towards realizing a companion UAV, with a focus on the human-following and control interface topics.

5.1 Human Following UAVs

Pestana et al. used a UAV’s onboard camera and an object tracking algorithm to realize a human-following application Pestana et al. (2013). Higuchi et al. also performed human following with the UAV’s onboard camera by using a color-based particle filter Higuchi et al. (2011). Papachristos et al. demonstrated a human tracking application with the UAV’s onboard stereo camera Papachristos et al. (2015). All of these prototypes focused on the functional design of the system and did not carry out social interaction experiments. Moreover, these systems have special requirements such as manual initialization of the user location Pestana et al. (2013), the need for the user to wear a shirt of a specific color Higuchi et al. (2011), or the need for the user to move so that the image tracker starts working Papachristos et al. (2015).
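
To give a sense of how such color-based followers operate, the following OpenCV sketch segments a target shirt color in an onboard camera frame and converts the centroid offset and apparent size into yaw and forward-velocity commands. The HSV thresholds, gains, and command interface are illustrative assumptions, not the implementations of the cited works.

```python
import cv2
import numpy as np

def shirt_tracking_command(frame_bgr,
                           lower_hsv=(100, 120, 60),
                           upper_hsv=(130, 255, 255)):
    """Return (yaw_rate, forward_vel) that re-centers a colored shirt.

    The HSV range (a blue shirt here) and the proportional gains are
    placeholder values; real systems tune them per user and per camera.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    moments = cv2.moments(mask)
    if moments["m00"] < 1e3:              # target lost: hover in place
        return 0.0, 0.0
    cx = moments["m10"] / moments["m00"]  # horizontal centroid of the target
    h, w = mask.shape
    yaw_rate = 0.002 * (w / 2 - cx)       # turn so the target stays centered
    # Apparent size (mask "mass") is a crude proxy for distance to the user.
    desired_mass = 0.05 * h * w * 255
    forward_vel = 1e-6 * (desired_mass - moments["m00"])
    return yaw_rate, forward_vel
```

In a real system, such commands would be fed to the flight controller’s velocity interface at the camera frame rate.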

By integrating a visual SLAM technique and a vision-based human tracking algorithm, Lim & Sinha presented a UAV that can map the human walking path in real time Lim and Sinha (2015). Naseer et al. proposed a UAV that can perform human following and gesture recognition with an onboard Xtion depth camera Naseer et al. (2013). More recently, Yao et al. integrated a face detector and a feature tracker to achieve robust human tracking with a miniature robotic blimp Yao et al. (2017). Note that these systems are not able to track and follow the user robustly in every situation, e.g., when the user is occluded by other objects or people. To tackle the occlusion problem, Hepp et al. presented a human-following system based on ultra-wideband (UWB) radio and released their implementation as open-source software Hepp et al. (2016).

A few works on human-following UAVs focus on filming. Huang et al. combined a stereo camera-based human following method with a dynamic planning strategy to film a human action in a more expressive manner Huang et al. (2018). Zhou et al. designed a flying drone that could keep tracking human motion using a normal color camera Zhou et al. (2018). Bentz et al. presented an assistive aerial robot that could observe the regions most interesting to the human and broadcast these views to the human’s augmented reality display Bentz et al. (2019). This reduced the human’s head motions and improved reaction time.

5.2 User Interfaces for UAVs

In 2013, Monajjemi et al. presented a method to command a team of UAVs using face and hand gestures Monajjemi et al. (2013). Later, Monajjemi et al. extended their work by commanding a team of two UAVs using not only face engagement and hand gestures but also voice and touch interfaces Monajjemi et al. (2014). Similar to Monajjemi’s work on multi-modal interaction, MohaimenianPour & Vaughan MohaimenianPour and Vaughan (2018) and Nagi et al. Nagi et al. (2014) realized UAV control with hands and faces by relying on visual object detectors and simple preset rules.
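
The “simple preset rules” in these interfaces typically map a discrete gesture label from the detector to a fixed flight command. Below is a minimal sketch of such a rule table; the gesture labels and command values are illustrative assumptions, not those of the cited systems.

```python
from dataclasses import dataclass

@dataclass
class VelocityCommand:
    vx: float = 0.0   # forward velocity (m/s)
    vz: float = 0.0   # vertical velocity (m/s)
    yaw: float = 0.0  # yaw rate (rad/s)

# Hypothetical gesture labels produced by a hand/face detector.
GESTURE_RULES = {
    "both_hands_up":   VelocityCommand(vz=+0.5),   # climb
    "both_hands_down": VelocityCommand(vz=-0.5),   # descend / land
    "left_hand_out":   VelocityCommand(yaw=+0.3),  # turn left
    "right_hand_out":  VelocityCommand(yaw=-0.3),  # turn right
    "face_engaged":    VelocityCommand(vx=+0.3),   # approach the user
}

def gesture_to_command(label: str) -> VelocityCommand:
    # Unknown or missing gesture -> hover in place.
    return GESTURE_RULES.get(label, VelocityCommand())
```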

In contrast to Monajjemi’s multi-modal interaction, Sun et al. focused on piloting a drone with gesture recognition by combining a visual tracker with a skin pixel detector for robust performance Sun et al. (2017). Similarly, Lichtenstern et al. demonstrated a system in which a user controls multiple UAVs using hand gestures Lichtenstern et al. (2012).

Costante et al. aimed to improve the hand gesture interface of UAVs by proposing a new transfer learning algorithm that can exploit both generic and user-specific hand gesture data online Costante et al. (2014). Burke & Lasenby presented a very fast and simple classification method to control a UAV with pantomimic gestures, in which the main idea is to use a gesture similar to the desired UAV action as the gesture command Burke and Lasenby (2015).

More recently, Bruce et al. proposed the use of facial expressions for 3D trajectory control of UAVs Bruce et al. (2017). We have also previously demonstrated a drone that could react to the user’s facial expression Liew et al. (2012). In contrast to facial expressions, Huang et al. directed a UAV in a known environment via natural language commands Huang et al. (2010).

6 From Remote-Controlled to Social UAVs

This section summarizes efforts in sociability studies towards realizing companion UAVs and offers a different perspective from the recent survey of social drones Baytaş et al. (2019). We first discuss the social perception of UAVs, followed by the emotion and intention expression of UAVs through motions, lights, or displays. Then, we briefly describe related works on gesture interaction and physical interaction with UAVs.

6.1 Social Perception of UAVs

Designing companion UAVs that invite social interaction is important. Wojciechowska et al. investigated the best way for a flying robot to approach a person Wojciechowska et al. (2019). Yeh et al. found that a drone with a circular body shape, a face, and a voice could reduce the proximate distance between a social drone and the user Yeh et al. (2017). In addition, there are also studies on user perception of UAVs, focusing on assistance during emergency situations Khan and Neustaedter (2019b), privacy and security issues Chang et al. (2017), and autonomous behaviors Nowacka et al. (2015).

Different from the social interaction works mentioned above, Abtahi et al. explored touch interaction in HDI and found that participants preferred interacting with a safe-to-touch drone Abtahi et al. (2017). In particular, users felt safer and found the interaction less mentally demanding when interacting with the safe-to-touch drone.

6.2 Emotion and Intention Expression of UAVs

Dancers use various kinds of motion to express their emotions. Sharma et al. used Laban motion analysis (a common method used by artists to express emotions) to let UAVs express their affective states Sharma et al. (2013). Aiming to deliver an opera performance, Eriksson et al. described their method of interactively designing expressive drone motions with a choreographer Eriksson et al. (2019).

Similarly, Cauchard et al. presented a model for UAVs to express emotions via movements Cauchard et al. (2016), believing that encoding these emotions into movements could help users to comprehend the UAV’s internal states.

In contrast to emotion expression, Szafir et al. used the UAV’s motion to express the robot’s intention Szafir et al. (2014). Walker et al. expanded this work by visualizing robot motion intent using an augmented reality technique Walker et al. (2018). Colley et al. also investigated drone motion as direct guidance for pedestrians, rather than equipping drones with a display or indicators Colley et al. (2017).

Duncan et al. had a similar idea and presented an initial study on UAVs communicating their internal states to bystanders via flight patterns Duncan et al. (2018). In their seminal work, Firestone et al. performed a participatory design study with users on how UAVs can communicate internal states effectively via flight patterns Firestone et al. (2019).

LED lights have also been used by UAVs to express emotion and intent. Arroyo et al. described a social UAV that performs four different expressions with head movements and two color LED eyes Arroyo et al. (2014). Szafir et al. also used a ring of sixty-four color LEDs as a reliable cue for the UAV to convey intention to the user Szafir et al. (2015).

Instead of LED lights, some works rely on displays or projectors to convey information to users, including a small drone with an OLED display for telepresence Gomes et al. (2016), a flying display system for crowd control during emergency situations Schneegass et al. (2014), a flying projector-screen system with two UAVs Nozaki (2014), and flying UAVs with onboard projectors for social group interactions Scheible et al. (2013), interactive map applications Brock et al. (2018), navigation guidance Knierim et al. (2018), and gesture interaction Cauchard et al. (2019).

6.3 Gesture Interaction with UAVs

Inspired by human interaction with birds, Ng & Sharlin studied the effectiveness of a few hand gestures for commanding a UAV Ng and Sharlin (2011). Participants were very engaged during the gesture interaction and spoke to the UAV as if it were a pet. Cauchard et al. performed similar Wizard-of-Oz experiments, and most participants likewise interacted with the UAV as if it were a pet Cauchard et al. (2015). E et al. later extended this experiment to a different cultural setting and found similar results E et al. (2017).

Aiming to increase the naturalness of HDI, Peshkova et al. surveyed gesture interaction techniques that have been applied for UAV control based on three mental models: the imitative class (controls the UAV motions with the user’s body movements), the instrumented class (controls the UAV motions with a physical controller or an imaginary object), and the intelligent class (interacts with a UAV as if the UAV is an intelligent agent) Peshkova et al. (2017).

On the other hand, Pfeil et al. studied the effectiveness of different upper-body interaction techniques for UAV control Pfeil et al. (2013), covering all three interaction classes mentioned by Peshkova et al. Peshkova et al. (2017). They found that the proxy technique, in which the user moves the UAV as if grasping it in his/her hand, was the best of the five developed interaction techniques.

6.4 Physical Interaction with UAVs

Physical interaction is rare in the UAV literature compared to gesture interaction. Knierim et al. used physical interaction with a flying robot as novel input (touching/dragging the flying robot) and output (the flying robot generates force feedback) modalities for a user interface Knierim et al. (2018). Abtahi et al. proposed a haptic interaction system (physical HDI with a virtual reality display in their context), where an actual UAV is used to enhance the user’s physical perception in a virtual reality environment Abtahi et al. (2019). Soto et al. explored the idea of using a leashed UAV as a navigator to guide visually impaired people Soto et al. (2017).

7 Observation and Lessons Learned

Throughout the review process and from personal experience, we noticed several patterns in the literature and learned a few lessons in designing companion UAVs. We discuss these observations (including exploratory ideas) in this section, covering: (i) UAV form, (ii) appearance design, (iii) integrated human-accompanying model, (iv) integrated human-sensing interface, (v) safety concerns, (vi) noise issues and sound design, and (vii) tactile interaction. Note that several aspects mentioned in this section could potentially be improved by drawing inspiration from the human-computer interaction or human-robot interaction literature. In the next section, we present more concise guidelines and recommendations towards realizing companion UAVs.

7.1 UAV Form

Almost all papers considered in this work use a multirotor UAV as the platform to demonstrate UAV flight or to carry out social experiments. From our long experience working with drones, we agree that multirotors are more convenient for experiments (in terms of availability) and presentation (in terms of flight quality), but their noise level is too annoying for companion UAVs.

We argue that a blimp or balloon type UAV is a more suitable form for companion UAVs. We highlight two technical blimp design papers that suggest an alternative form for companion UAVs. First, Song et al. used a novel idea of hiding the propulsion unit in the center of a blimp and designed a blimp that is safe to touch and interact with Song et al. (2018). Second, Yamada et al. designed a blimp with a micro-blower that has no rotating blades (hence safer and quieter) Yamada et al. (2019). It is also worth mentioning that Drew et al. designed a small flying robot using electrohydrodynamic thrust with no moving parts, but it is tiny and cannot handle a large payload Drew et al. (2018).

7.2 Appearance Design

Appearance design of drones is important, as the design affects users’ perception Wojciechowska et al. (2019). A few HDI studies mentioned in Section 6.1 investigated user perception of drones. For example, a round-shaped flying robot was found to have higher social acceptance Yeh et al. (2017), and emergency response drones with a prominent appearance that is easily distinguishable from recreational drones can gain user trust Khan and Neustaedter (2019b). HDI researchers have also mentioned the importance of color in drone design Chang et al. (2017). At the time of writing, no study has investigated color design for companion UAVs; the most closely related study we can find uses the color of balloons to visualize the surrounding air quality Kuznetsov et al. (2011).

7.3 Human Accompanying Model

A few papers demonstrated the human-following capability of companion UAVs (Section 5.1), and Wojciechowska et al. investigated the best way for a flying robot to approach a person Wojciechowska et al. (2019). We noticed that there is no general model to unify the various human-accompanying behaviors of companion UAVs, including approaching, following, side-by-side walking, leading or guiding, and flying above the user (to help observe things far away). This observation also applies to ground robots. With a unified human-accompanying model, companion UAVs are expected to be able to truly accompany and interact with a human more naturally. For more details, we have summarized related works on various human-accompanying modes of both flying robots and mobile robots in our previous work Liew (2016).

7.4 Human Sensing Interface

Humans achieve natural interaction with each other using face, gesture, touch, and voice modalities simultaneously. A few papers demonstrated HDI with multiple modalities, but most papers focus on a single modality. It is not straightforward to realize a companion UAV with all modalities, since researchers/engineers often focus on methods for a single modality. Recently, great effort in deep learning has led to more integrated human-sensing interfaces, such as the OpenPose library, which tracks the human body, face, and hands simultaneously Cao et al. (2019). We should leverage these powerful tools more in order to extend companion UAV research. As a good example, Huang et al. utilized the OpenPose library for human following and realized a high-level autonomous filming function with their UAV Huang et al. (2018). A standard human-sensing interface (updated regularly) is crucial for facilitating HDI studies: not only can it accelerate HDI progress, but it also makes comparison studies more effective.
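
As a concrete illustration of reusing such an integrated interface, the sketch below converts one person’s 2D body keypoints into a simple following setpoint. The neck and mid-hip indices follow the OpenPose BODY_25 ordering as we understand it, and the gains and thresholds are placeholder values, not those of the cited systems.

```python
NECK, MID_HIP = 1, 8  # keypoint indices assumed from the OpenPose BODY_25 layout

def follow_setpoint(keypoints, image_width, target_torso_px=220):
    """Turn one person's keypoints (list of (x, y, confidence)) into a
    (yaw_rate, forward_vel) setpoint that keeps the person centered and
    at a roughly constant apparent size. Gains are illustrative only.
    """
    neck, hip = keypoints[NECK], keypoints[MID_HIP]
    if neck[2] < 0.3 or hip[2] < 0.3:       # low-confidence detection: hold position
        return 0.0, 0.0
    torso_px = abs(hip[1] - neck[1])         # apparent torso height ~ inverse distance
    yaw_rate = 0.002 * (image_width / 2 - neck[0])       # re-center the person
    forward_vel = 0.005 * (target_torso_px - torso_px)   # keep a roughly fixed distance
    return yaw_rate, forward_vel
```

The same keypoint stream could simultaneously drive face-based engagement detection and hand-gesture rules, which is precisely the benefit of an integrated interface.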

7.5 Safety Concerns

HDI safety is an important design aspect of companion UAVs. Most UAVs described in this work have sharp rotating propellers that could injure nearby humans, especially their eyes—such accidents have been reported in a medical case report Moskowitz et al. (2018) and in the news Toddler Eye Accident in BBC News (2015). Existing solutions (including commercial UAVs) include ring-shaped protectors around the propellers Parrot (2015b, 2016), net cases that fully cover the propellers Abtahi et al. (2017); Salaan et al. (2019); Zero (2016, 2016), or cages that cover the entire drone Abtahi et al. (2019); Briod et al. (2013, 2013); Kornatowski et al. (2017); Flyability (2014, 2016), but these modifications worsen flight efficiency and shorten flight time (due to the increased payload).

Recently, Lee et al. claimed that a Coanda UAV has several advantages over standard UAVs, such as crash resistance and flight safety, thanks to its unique mechanical design Lee et al. (2017). A UAV with flexible structures could be less harmful when it unavoidably hits a user, as the structure absorbs the crash impact. Based on this idea, UAVs with soft body frames S. Mintchev and Floreano (2017) and flexible propellers Jang et al. (2019) have been proposed.

In addition to physical safety, making users feel safe (reducing cognitive burden) is also important. From our experience, most users are afraid that the UAV is going to crash when it starts to move, because unlike a common ground vehicle, a conventional UAV has to tilt or roll in order to move. We therefore designed a special type of drone—a holonomic UAV that can move horizontally without tilting—and users expressed that the holonomic flight made them feel safer. While several holonomic UAVs exist (papers can be found by searching the “holonomic” keyword in the UAV paper list mentioned in Appendix A), there is no formal HDI study of holonomic UAVs so far, to the best of our knowledge.

7.6 Noise Issue and Sound Design

In addition to safety concerns, noise is also an important issue for HDI. Most UAVs produce unwanted noise with their high-speed, high-power rotating propellers. In our test, the noise of a commercial UAV Parrot (2015a) was measured to be as high as 82 dB one meter away, which is very close to the hazardous level of 85 dB legislated in most countries Johnson et al. (2001). Studies also suggest that noise has a strong association with health issues Bodin et al. (2009) and an increased risk of accidents Maue (2018). These findings suggest that noise should be considered seriously in HDI.

Sound design Lyon (2003) is also a related topic for HDI. For example, car manufacturers harmonically tune engine noise so that their cars sound more comfortable to users Kim et al. (2017). Norman discussed the emotional association of sounds and everyday products in his book Norman (2013). For example, an expensive melodious kettle (when the water is boiling) and a Segway self-balancing scooter (when the motor is rotating) sound exactly two musical octaves apart, making the unwanted noises sound like music. In the robotics context, Moore et al. utilized motor noise to facilitate user interaction Moore et al. (2017), and Song & Yamada proposed to express emotions through sound, color, and vibrations Song and Yamada (2017).

The use of non-vocal, non-verbal, or non-linguistic utterances (NLU) for HDI is also a potential way to enhance a UAV’s character and expressiveness. It should be noted that NLU might be more useful than speech during flight, as the propeller noise makes speech recognition difficult. In movies, robots such as R2-D2 also use non-verbal utterances to enhance their communication with other characters. For more details on the study of NLU in HRI, readers are referred to the PhD thesis of Read Read (2014).

7.7 Tactile Interaction

(We focus on thermal and touch interactions here, which differ subtly from the physical interaction involving force feedback discussed in Section 6.4.)

Peña & Tanaka proposed the use of a robot’s body temperature to express its emotional state Pachamango and Tanaka (2018). Park & Lee studied the effect of temperature with a companion dinosaur robot and found that skin temperature significantly affects users’ perception Park and Lee (2014). It would be interesting and useful to explore this thermal application area; for example, a companion UAV with a warm or hot body frame could signify a state of hard work or exhaustion.

Equipping UAVs with touch sensing capability allows richer HDI. For instance, a UAV could perceive a person’s affection if it could sense the person’s gentle stroke. Prior studies and results from the social robotics field Yohanan and MacLean (2012) could be integrated into a UAV to realize a more personalized companion UAV. In addition, from the engineering perspective, a UAV with touch sensors on its propeller guard could also sense a nearby human instantly and enhance HDI safety.

8 Guidelines and Recommendations

Having presented several observations from the companion UAV literature (and from our experience in designing companion UAVs), this section presents several design and research recommendations for companion UAVs. Note that almost all the topics discussed below are applicable to both engineering development and social interaction experiments.

First, regarding UAV form and appearance design, we recommend two kinds of platforms for engineering or social experiments: (i) a palm-sized multirotor UAV with a cage design (e.g., Abtahi et al. (2019)) for agile and accurate motion, safer interaction (less impact and less chance of being hurt by the propellers), and an affordance that invites touch interaction (e.g., Abtahi et al. (2017)), if noise and flight time are not a big issue; (ii) a hugging-sized blimp with hidden propellers (e.g., Song et al. (2018)) or a novel propulsion unit with no rotating parts (e.g., Yamada et al. (2019)) for quieter and calmer interaction, safer interaction, and longer interaction time, if agile and accurate responses are not a big issue.

Second, we suggest paying more attention to integrated human-accompanying models and human-sensing interfaces in order to support more realistic HDI. A human-accompanying model should integrate the functions of approaching, following, leading, side-by-side walking, and bird’s-eye viewing from above for more natural HDI. Similarly, a human-sensing interface should integrate at least the four modalities of human tracking, hand tracking, face tracking, and voice interaction (e.g., Monajjemi et al. (2014)). On the engineering side, UAVs should also perform environment sensing at the same time so that they can accompany their users without hitting obstacles (e.g., Lim and Sinha (2015)).

Third, more related to social interaction studies, we recommend exploring the ideas of sound design, tactile interaction, and holonomic flight motions for UAVs. As individual interactions mature, we should try integrating visual and audio expression with gesture, physical, and tactile interaction, and investigate long-term HDI.

Fourth, we also encourage HDI researchers to share development code across companion UAV studies to facilitate comparison studies under shared metrics. In addition to our recommendations, one could also draw inspiration from the practices and know-how of the aerospace field Hutchins et al. (2015) and the social robotics field Coeckelbergh (2009); Harper and Virk (2010).

Figure 3: Towards companion UAV from the autonomous UAV (left) and social UAV (right) categories.

Fifth, and last but not least, we suggest that engineering research (i) incorporate findings from HDI studies (such as accompanying a user at a proximate distance that is comfortable to the user) into technical development, and (ii) perform an HDI study after developing a new function in order to confirm its usefulness and social acceptance (efforts from autonomous UAV to companion UAV, corresponding to Fig. 3 (left)). At the same time, we suggest that HDI studies perform experiments with a real drone integrated with autonomous capabilities (such as accompanying a person and avoiding obstacles autonomously) in order to study HDI in more realistic scenarios (efforts from social UAV to companion UAV, corresponding to Fig. 3 (right)).

9 Conclusion

Technological advancements in small-scale UAVs have led to a new class of companion robots—the companion UAV. After identifying and organizing related works from the UAV autonomy and sociability perspectives, we found that recent research efforts in companion UAVs focus on a single issue, either an engineering or a social interaction issue, rather than ambitiously tackling both issues in one paper. While this might be the nature of research (i.e., specializing on a topic), we encourage future works to emphasize both aspects of companion UAVs, as the integration of these two interrelated aspects is essential for effective HDI.

We also listed our observations from this review and proposed guidelines for future companion UAV design and research. In addition to individual topics such as engineering functions and social interaction studies with new modalities, we argue for the importance of devising an integrated human-accompanying model and an integrated human-sensing interface to advance the development of companion UAVs. We also suggest that researchers share the code used in their experiments to facilitate comparison studies and consolidate findings in companion UAV works.

Most of the related papers focus on the development of the UAV itself. In contrast, it is also feasible to design environments that enable easier navigation for companion UAVs. Public issues PytlikZillig et al. (2018) and policy making United Nations OCHA (2014) are also important for facilitating the social acceptance of UAVs. Lastly, while an affective model is an important aspect of companion UAVs, we argue that it is beyond the scope of this paper. We believe that affective models developed for general companion robots will be applicable to companion UAVs once companion UAVs have more mature and integrated autonomous functions and socially interactive capabilities.

Compliance with Ethical Standards

Conflict of interest The authors declare that they have no conflict of interest.

Appendix A UAV Database Update and Online Sharing

The UAV database is shared online via Google Sheets (https://tinyurl.com/drone-paper-list). In the tables (one per year), we list all the UAV-related papers from top journals/conferences since 2001 along with their related topics, abstracts, and details such as a hardware summary. This list is particularly useful for: (i) searching related works on a particular UAV topic, e.g., HRI; (ii) searching related works on a particular UAV type, e.g., blimps; and (iii) searching related works on a particular UAV platform. We believe that this list is not only beneficial for newcomers to the UAV field but also convenient for experienced researchers when citing and comparing related works.

In addition to Google Sheets, we also use the open-source file tagging and organization software TagSpaces (2017). TagSpaces enables readers to search papers effectively with multiple tags and/or keywords. Moreover, since the original papers (PDF files) cannot be shared with readers due to copyright issues, for each paper entry we create an HTML file that contains public information (such as abstract, keywords, country, paper URL, and video URL) for easier reference. To set up TagSpaces and download all the HTML files, please refer to our website at https://sites.google.com/view/drone-survey.

References

  • P. Abtahi, B. Landry, J. Yang, M. Pavone, S. Follmer, and J. A. Landay (2019) Beyond the force: using quadcopters to appropriate objects and the environment for haptics in virtual reality. In Proc. ACM Conference on Human Factors in Computing Systems (CHI), Vol. , pp. 1–13. Cited by: §6.4, §7.5, §8.
  • P. Abtahi, D. Y. Zhao, J. L. E., and J. A. Landay (2017) Drone near me: exploring touch-based human-drone interaction. Proc. ACM Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT) 1 (3), pp. 813–824. Cited by: §6.1, §7.5, §8.
  • H. Agrawal, S. Leigh, and P. Maes (2015) L’evolved: autonomous and ubiquitous utilities as smart agents. In Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), pp. 487–491. Cited by: §1.
  • Aibotix (2016) Aibot X6. External Links: Link Cited by: §7.5.
  • D. Arroyo, C. Lucho, J. Roncal, and F. Cuellar (2014) Daedalus: A sUAV for social environments. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), Vol. , pp. 99. Cited by: §6.2.
  • L. Vink, J. Cauchard, and J. A. Landay (2014) Autonomous Wandering Interface (AWI) - Concept Video. External Links: Link Cited by: §1.
  • M. A. Baytaş, D. Çay, Y. Zhang, M. Obaid, A. E. Yantaç, and M. Fjeld (2019) The design of social drones: a review of studies on autonomous flyers in inhabited environments. In Proc. ACM Conference on Human Factors in Computing Systems (CHI), pp. 1–13. Cited by: §4.2, §6.
  • W. Bentz, S. Dhanjal, and D. Panagou (2019) Unsupervised learning of assistive camera views by an aerial co-robot in augmented reality multitasking environments. In Proc. IEEE International Conference on Robotics and Automation (ICRA), pp. 3003–3009. Cited by: §5.1.
  • T. Bodin, M. Albin, J. Ardö, E. Stroh, P. Östergren, and J. Björk (2009) Road traffic noise and hypertension: Results from a cross-sectional public health survey in southern Sweden. Environmental Health 8. Cited by: §7.6.
  • A. Briod, P. Kornatowski, A. Klaptocz, A. Garnier, M. Pagnamenta, J. Zufferey, and D. Floreano (2013) Contact-based navigation for an autonomous flying robot. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vol. , pp. 3987–3992. Cited by: §7.5.
  • A. Briod, P. Kornatowski, J. Zufferey, and D. Floreano (2013) A collision‐resilient flying robot. Journal of Field Robotics 31 (4), pp. 496–509. Cited by: §7.5.
  • A. M. Brock, J. Chatain, M. Park, T. Fang, M. Hachet, J. A. Landay, and J. R. Cauchard (2018) FlyMap: interacting with maps projected from a drone. In Proc. ACM International Symposium on Pervasive Displays (PerDis), pp. 1–9. Cited by: §6.2.
  • J. Bruce, J. Perron, and R. Vaughan (2017) Ready—aim—fly! Hands-free face-based HRI for 3D trajectory control of UAVs. In Proc. Conference on Computer and Robot Vision (CRV), pp. 307–313. Cited by: §5.2.
  • M. Burke and J. Lasenby (2015) Pantomimic gestures for human-robot interaction. IEEE Transactions on Robotics (T-RO) 31 (5), pp. 1225–1237. Cited by: §5.2.
  • J. Butzke, A. Dornbush, and M. Likhachev (2015) 3-D exploration with an air-ground robotic system. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vol. , pp. 3241–3248. Cited by: §3.
  • Z. Cao, G. Hidalgo, T. Simon, S. Wei, and Y. Sheikh (2019) OpenPose: realtime multi-person 2D pose estimation using part affinity fields. In arXiv preprint arXiv:1812.08008, pp. 1–8. Cited by: §7.4.
  • J. R. Cauchard, J. L. E, K. Y. Zhai, and J. A. Landay (2015) Drone & me: an exploration into natural human-drone interaction. In Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), Vol. , pp. 361–365. Cited by: §6.3.
  • J. R. Cauchard, A. Tamkin, C. Y. Wang, L. Vink, M. Park, T. Fang, and J. A. Landay (2019) Drone.io: a gestural and visual interface for human-drone interaction. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 153–162. Cited by: §6.2.
  • J. R. Cauchard, K. Y. Zhai, M. Spadafora, and J. A. Landay (2016) Emotion encoding in human-drone interaction. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), Vol. , pp. 263–270. Cited by: §6.2.
  • V. Chang, P. Chundury, and M. Chetty (2017) Spiders in the sky: user perceptions of drones, privacy, and security. In Proc. ACM Conference on Human Factors in Computing Systems (CHI), Vol. , pp. 6765–6776. Cited by: §6.1, §7.2.
  • M. Coeckelbergh (2009) Personal robots, appearance, and human good: A methodological reflection on roboethics. International Journal of Social Robotics (IJSR) 1 (3), pp. 217–221. External Links: ISSN 1875-4805, Document, Link Cited by: §8.
  • A. Colley, L. Virtanen, P. Knierim, and J. Häkkilä (2017) Investigating drone motion as pedestrian guidance. In Proc. International Conference on Mobile and Ubiquitous Multimedia (MUM), pp. 143–150. Cited by: §6.2.
  • M. Cooney, F. Zanlungo, S. Nishio, and H. Ishiguro (2012) Designing a flying humanoid robot (FHR): Effects of flight on interactive communication. In Proc. IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Vol. , pp. 364–371. Cited by: §1.
  • G. Costante, E. Bellocchio, P. Valigi, and E. Ricci (2014) Personalizing vision-based gestural interfaces for HRI with UAVs: A transfer learning approach. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vol. , pp. 3319–3326. Cited by: §5.2.
  • B. Das, M. S. Couceiro, and P. A. Vargas (2016) MRoCS: A new multi-robot communication system based on passive action recognition. Robotics and Autonomous Systems (RAS) 82, pp. 46–60. Cited by: §3.
  • D. S. Drew, N. O. Lambert, C. B. Schindler, and K. S. J. Pister (2018) Toward controlled flight of the ionocraft: a flying microrobot using electrohydrodynamic thrust with onboard sensing and no moving parts. IEEE Robotics and Automation Letters (RA-L) 3 (4), pp. 2807–2813. Cited by: §7.1.
  • B. A. Duncan, E. Beachly, A. Bevins, S. Elbaum, and C. Detweiler (2018) Investigation of communicative flight paths for small unmanned aerial systems. In Proc. IEEE International Conference on Robotics and Automation (ICRA), pp. 602–609. Cited by: §6.2.
  • B. A. Duncan, R. R. Murphy, D. Shell, and A. G. Hopper (2010) A midsummer night’s dream: Social proof in HRI. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 91–92. Cited by: §1.
  • J. L. E, I. L.E, J. A. Landay, and J. R. Cauchard (2017) Drone & wo: cultural influences on human-drone interaction techniques. In Proc. ACM Conference on Human Factors in Computing Systems (CHI), Vol. , pp. 6794–6799. Cited by: §6.3.
  • S. Eriksson, Å. Unander-Scharin, V. Trichon, C. Unander-Scharin, H. Kjellström, and K. Höök (2019) Dancing with drones: crafting novel artistic expressions through intercorporeality. In Proc. ACM Conference on Human Factors in Computing Systems (CHI), Vol. , pp. 1–12. Cited by: §6.2.
  • FAA (2017) Unmanned Aircraft Systems. External Links: Link Cited by: §2.1.
  • J. W. Firestone, R. Quiñones, and B. A. Duncan (2019) Learning from users: an elicitation study and taxonomy for communicating small unmanned aerial system states through gestures. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 163–171. Cited by: §6.2.
  • Fleye (2016) Fleye flying robot. External Links: Link Cited by: §7.5.
  • D. Floreano and R. J. Wood (2015) Science, technology and the future of small autonomous drones. Nature 521 (), pp. 460–466. Cited by: Figure 1, §2.2.
  • Flyability (2014) Safe Drone for Inaccessible Places. External Links: Link Cited by: §7.5.
  • A. Gomes, C. Rubens, S. Braley, and R. Vertegaal (2016) BitDrones: Towards using 3D nanocopter displays as interactive self-levitating programmable matter. In Proc. ACM Conference on Human Factors in Computing Systems (CHI), pp. 770–780. Cited by: §6.2.
  • E. Graether and F. Mueller (2012) Joggobot: A flying robot as jogging companion. In Proc. CHI Extended Abstracts on Human Factors in Computing Systems, Vol. , pp. 1063–1066. Cited by: §1.
  • C. Harper and G. Virk (2010) Towards the development of international safety standards for human robot interaction. International Journal of Social Robotics (IJSR) 2 (3), pp. 229–234. External Links: ISSN 1875-4805, Document, Link Cited by: §8.
  • H. Hedayati, M. Walker, and D. Szafir (2018) Improving collocated robot teleoperation with augmented reality. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 78–86. Cited by: §1.
  • B. Hepp, T. Nägeli, and O. Hilliges (2016) Omni-directional person tracking on a flying robot using occlusion-robust ultra-wideband signals. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 189–194. Cited by: §5.1.
  • K. Higuchi, Y. Ishiguro, and J. Rekimoto (2011) Flying eyes: Free-space content creation using autonomous aerial vehicles. In Proc. ACM Conference on Human Factors in Computing Systems (CHI), pp. 561–570. Cited by: §5.1.
  • C. Hrabia, M. Berger, A. Hessler, S. Wypler, J. Brehmer, S. Matern, and S. Albayrak (2017) An autonomous companion UAV for the SpaceBot Cup Competition 2015. In Robot Operating System (ROS): The Complete Reference (Volume 2), pp. 345–385. Cited by: §4.1.
  • A. S. Huang, S. Tellex, A. Bachrach, T. Kollar, D. Roy, and N. Roy (2010) Natural language command of an autonomous micro-air vehicle. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2663–2669. Cited by: §5.2.
  • C. Huang, F. Gao, J. Pan, Z. Yang, W. Qiu, P. Chen, X. Yang, S. Shen, and K. Cheng (2018) ACT: an autonomous drone cinematography system for action scenes. In Proc. IEEE International Conference on Robotics and Automation (ICRA), pp. 7039–7046. Cited by: §5.1, §7.4.
  • A. R. Hutchins, M. Cummings, M. C. Aubert, and S. C. Uzumcu (2015) Toward the development of a low-altitude air traffic control paradigm for networks of small, autonomous unmanned aerial vehicles. In Proc. AIAA Infotech @ Aerospace, pp. 1–8. Cited by: §8.
  • J. Jang, K. Cho, and G. Yang (2019) Design and experimental study of dragonfly-inspired flexible blade to improve safety of drones. IEEE Robotics and Automation Letters (RA-L) 4 (4), pp. 4200–4207. Cited by: §7.5.
  • W. Jensen, S. Hansen, and H. Knoche (2018) Knowing you, seeing me: investigating user preferences in drone-human acknowledgement. In Proc. ACM Conference on Human Factors in Computing Systems (CHI), pp. 1–12. Cited by: §4.1.
  • Jibo (2017) Jibo Robot - The 25 Best Inventions of 2017. External Links: Link Cited by: §1.
  • D. L. Johnson, P. Papadopoulos, N. Watfa, and J. Takala (2001) Exposure criteria: occupational exposure levels. In Occupational exposure to noise: Evaluation, prevention and control, B. Goelzer, C. H. Hansen, and G. A. Sehrndt (Eds.), pp. 79–102. Cited by: §7.6.
  • B. Jones, K. Dillman, R. Tang, A. Tang, E. Sharlin, L. Oehlberg, C. Neustaedter, and S. Bateman (2016) Elevating communication, collaboration, and shared experiences in mobile video through drones. In Proc. ACM Conference on Designing Interactive Systems (DIS), pp. 1123–1135. Cited by: §4.2.
  • C. Kertész and M. Turunen (2019) Exploratory analysis of Sony AIBO users. Journal of AI & Society 34 (3), pp. 625–638. Cited by: §1.
  • M. N. H. Khan and C. Neustaedter (2019a) An exploratory study of the use of drones for assisting firefighters during emergency situations. In Proc. ACM Conference on Human Factors in Computing Systems (CHI), pp. 1–14. Cited by: §4.1.
  • M. N. H. Khan and C. Neustaedter (2019b) An exploratory study of the use of drones for assisting firefighters during emergency situations. In Proc. ACM Conference on Human Factors in Computing Systems (CHI), pp. 1–14. Cited by: §6.1, §7.2.
  • S. Kim, K. Chang, D. C. Park, S. M. Lee, and S. K. Lee (2017) A systematic approach to engine sound design for enhancing sound character by active sound design. SAE International Journal of Passenger Cars - Mechanical Systems 10 (3), pp. 691–702. Cited by: §7.6.
  • M. Kljun, K. Č. Pucihar, M. Lochrie, and P. Egglestone (2015) StreetGamez: a moving projector platform for projected street games. In Proc. Annual Symposium on Computer-Human Interaction in Play (CHI PLAY), pp. 589–594. Cited by: §1.
  • P. Knierim, T. Kosch, A. Achberger, and M. Funk (2018) Flyables: exploring 3D interaction spaces for levitating tangibles. In Proc. International Conference on Tangible, Embedded, and Embodied Interaction (TEI), pp. 329–336. Cited by: §6.4.
  • P. Knierim, S. Maurer, K. Wolf, and M. Funk (2018) Quadcopter-projected in-situ navigation cues for improved location awareness. In Proc. ACM Conference on Human Factors in Computing Systems (CHI), pp. 1–6. Cited by: §6.2.
  • P. M. Kornatowski, S. Mintchev, and D. Floreano (2017) An origami-inspired cargo drone. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 6855–6862. Cited by: §7.5.
  • S. Kuznetsov, G. N. Davis, E. Paulos, M. D. Gross, and J. C. Cheung (2011) Red balloon, green balloon, sensors in the sky. In Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), pp. 237–246. Cited by: §7.2.
  • S. Latscha, M. Kofron, A. Stroffolino, L. Davis, G. Merritt, M. Piccoli, and M. Yim (2014) Design of a hybrid exploration robot for air and land deployment (H.E.R.A.L.D) for urban search and rescue applications. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1868–1873. Cited by: §3.
  • J. Y. Lee, S. H. Song, H. W. Shon, H. R. Choi, and W. Yim (2017) Modeling and control of a saucer type Coandă effect UAV. In Proc. IEEE International Conference on Robotics and Automation (ICRA), pp. 2717–2722. Cited by: §7.5.
  • M. Lichtenstern, M. Frassl, B. Perun, and M. Angermann (2012) A prototyping environment for interaction between a human and a robotic multi-agent system. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 185–186. Cited by: §5.2.
  • C. F. Liew, D. DeLatte, N. Takeishi, and T. Yairi (2017) Recent developments in aerial robotics: A survey and prototypes overview. ArXiv e-prints. External Links: 1711.10085 Cited by: footnote 1.
  • C. F. Liew, N. Yokoya, and T. Yairi (2012) Control of unmanned aerial vehicle using facial expression. In Proc. Japanese Academy of Facial Studies, pp. 1–1. Cited by: §5.2.
  • C. F. Liew (2016) Towards human-robot interaction in flying robots: A user accompanying model and a sensing interface. Ph.D. Thesis, Department of Aeronautics and Astronautics, The University of Tokyo, Japan. External Links: Link Cited by: §4.1, §7.3.
  • H. Lim and S. N. Sinha (2015) Monocular localization of a moving person onboard a quadrotor MAV. In Proc. IEEE International Conference on Robotics and Automation (ICRA), pp. 2182–2189. Cited by: §5.1, §8.
  • R. H. Lyon (2003) Product sound quality: from perception to design. Sound and Vibration 37 (3), pp. 18–23. Cited by: §7.6.
  • Malliaraki (2017) Natural human-drone interaction. External Links: Link Cited by: §1.
  • J. Maue (2018) Noise - European Agency for Safety and Health at Work. External Links: Link Cited by: §7.6.
  • S. MohaimenianPour and R. Vaughan (2018) Hands and faces, fast: mono-camera user detection robust enough to directly control a UAV in flight. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 5224–5231. Cited by: §5.2.
  • V. M. Monajjemi, S. Pourmehr, S. A. Sadat, F. Zhan, J. Wawerla, G. Mori, and R. Vaughan (2014) Integrating multi-modal interfaces to command UAVs. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), p. 106. Cited by: §5.2, §8.
  • V. M. Monajjemi, J. Wawerla, R. Vaughan, and G. Mori (2013) HRI in the sky: Creating and commanding teams of UAVs with a vision-mediated gestural interface. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 617–623. Cited by: §5.2.
  • D. Moore, H. Tennent, N. Martelaro, and W. Ju (2017) Making noise intentional: A study of servo sound perception. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 12–21. Cited by: §7.6.
  • E. E. Moskowitz, Y. M. Siegel-Richman, G. Hertner, and T. Schroeppel (2018) Aerial drone misadventure: a novel case of trauma resulting in ocular globe rupture. American Journal of Ophthalmology Case Reports 10, pp. 35–37. Cited by: §7.5.
  • R. Murphy, D. Shell, A. Guerin, B. Duncan, B. Fine, K. Pratt, and T. Zourntos (2011) A midsummer night’s dream (with flying robots). Autonomous Robots 30 (2), pp. 143–156. Cited by: §1.
  • J. Nagi, A. Giusti, G. A. D. Caro, and L. M. Gambardella (2014) Human control of UAVs using face pose estimates and hand gestures. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 1–2. Cited by: §5.2.
  • T. Naseer, J. Sturm, and D. Cremers (2013) FollowMe: Person following and gesture recognition with a quadrocopter. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 624–630. Cited by: §5.1.
  • W. S. Ng and E. Sharlin (2011) Collocated interaction with flying robots. In Proc. IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), pp. 143–149. Cited by: §6.3.
  • K. Nitta, K. Higuchi, and J. Rekimoto (2014) HoverBall: Augmented sports with a flying ball. In Proc. Augmented Human International Conference, pp. 13:1–13:4. Cited by: §1.
  • D. A. Norman (2013) The design of everyday things: revised and expanded edition. Basic Books. Cited by: §7.6.
  • D. Nowacka, N. Y. Hammerla, C. Elsden, T. Plötz, and D. Kirk (2015) Diri - the actuated helium balloon: a study of autonomous behaviour in interfaces. In Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), pp. 349–360. Cited by: §6.1.
  • H. Nozaki (2014) Flying display: A movable display pairing projector and screen in the air. In Proc. ACM Conference on Human Factors in Computing Systems (CHI), pp. 909–914. Cited by: §6.2.
  • D. P. Pachamango and F. Tanaka (2018) Touch to feel me: Designing a robot for thermo-emotional communication. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 207–208. Cited by: §7.7.
  • C. Papachristos, D. Tzoumanikas, and A. Tzes (2015) Aerial robotic tracking of a generalized mobile target employing visual and spatio-temporal dynamic subject perception. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 4319–4324. Cited by: §5.1.
  • E. Park and J. Lee (2014) I am a warm robot: The effects of temperature in physical human–robot interaction. Robotica 32 (1), pp. 133–142. Cited by: §7.7.
  • Parrot (2015a) Parrot AR.Drone 2.0. External Links: Link Cited by: §7.6.
  • Parrot (2015b) Parrot AR.Drone. External Links: Link Cited by: §7.5.
  • E. Peshkova, M. Hitz, and B. Kaufmann (2017) Natural interaction techniques for an unmanned aerial vehicle system. IEEE Pervasive Computing 16 (1), pp. 34–42. Cited by: §6.3, §6.3.
  • J. Pestana, J. L. Sanchez-Lopez, P. Campoy, and S. Saripalli (2013) Vision based GPS-denied object tracking and following for unmanned aerial vehicles. In Proc. IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), pp. 1–6. Cited by: §5.1.
  • K. P. Pfeil, S. L. Koh, and J. J. LaViola Jr. (2013) Exploring 3D gesture metaphors for interaction with unmanned aerial vehicles. In Proc. ACM International Conference on Intelligent User Interfaces (IUI), pp. 257–266. Cited by: §6.3.
  • L. M. PytlikZillig, B. Duncan, S. Elbaum, and C. Detweiler (2018) A drone by any other name. IEEE Technology and Society Magazine 37 (1), pp. 80–91. Cited by: §9.
  • R. Read (2014) A study of non-linguistic utterances for social human-robot interaction. Ph.D. Thesis, Faculty of Science and Environment, Plymouth University, UK. External Links: Link Cited by: §7.6.
  • S. Mintchev, S. de Rivaz, and D. Floreano (2017) Insect-inspired mechanical resilience for multicopters. IEEE Robotics and Automation Letters (RA-L) 2 (3), pp. 1248–1255. Cited by: §7.5.
  • C. J. Salaan, K. Tadakuma, Y. Okada, Y. Sakai, K. Ohno, and S. Tadokoro (2019) Development and experimental validation of aerial vehicle with passive rotating shell on each rotor. IEEE Robotics and Automation Letters (RA-L) 4 (3), pp. 2568–2575. Cited by: §7.5.
  • J. Scheible, A. Hoth, J. Saal, and H. Su (2013) Displaydrone: a flying robot based interactive display. In Proc. ACM International Symposium on Pervasive Displays (PerDis), pp. 49–54. Cited by: §6.2.
  • S. Schneegass, F. Alt, J. Scheible, A. Schmidt, and H. Su (2014) Midair displays: Exploring the concept of free-floating public displays. In Proc. ACM Conference on Human Factors in Computing Systems (CHI), pp. 2035–2040. Cited by: §6.2.
  • M. Sharma, D. Hildebrandt, G. Newman, J. E. Young, and R. Eskicioglu (2013) Communicating affect via flight path: Exploring use of the Laban effort system for designing affective locomotion paths. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 293–300. Cited by: §6.2.
  • S. H. Song, H. W. Shon, G. Y. Yeon, and H. R. Choi (2018) Design and implementation of cloud-like soft drone S-Cloud. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1–9. Cited by: §7.1, §8.
  • S. Song and S. Yamada (2017) Expressing emotions through color, sound, and vibration with an appearance-constrained social robot. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 2–11. Cited by: §7.6.
  • M. A. Soto, M. Funk, M. Hoppe, R. Boldt, K. Wolf, and N. Henze (2017) DroneNavigator: using leashed and free-floating quadcopters to navigate visually impaired travelers. In Proc. International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS), pp. 300–304. Cited by: §6.4.
  • SPARKED (2016a) SPARKED: A Live Interaction Between Humans and Quadcopters. External Links: Link Cited by: §1.
  • SPARKED (2016b) SPARKED: Behind the Technology. External Links: Link Cited by: §1.
  • T. Sun, S. Nie, D. Yeung, and S. Shen (2017) Gesture-based piloting of an aerial robot using monocular vision. In Proc. IEEE International Conference on Robotics and Automation (ICRA), pp. 5913–5920. Cited by: §5.2.
  • D. Szafir, B. Mutlu, and T. Fong (2014) Communication of intent in assistive free flyers. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), Vol. , pp. 358–365. Cited by: §6.2.
  • D. Szafir, B. Mutlu, and T. Fong (2015) Communicating directionality in flying robots. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 19–26. Cited by: §6.2.
  • TagSpaces (2017) TagSpaces - Your Hackable File Organizer. External Links: Link Cited by: Appendix A.
  • M. L. Tallec, S. Saint-Aimé, C. Jost, J. Villaneau, J. Antoine, S. Letellier-Zarshenas, B. Le-Pévédic, and D. Duhaut (2011) From speech to emotional interaction: EmotiRob project. Human-Robot Personal Relationships, pp. 57–64. Cited by: §1.
  • Toddler Eye Accident in BBC News (2015) Toddler’s Eyeball Sliced in Half by Drone Propeller. External Links: Link Cited by: §7.5.
  • United Nations OCHA (2014) Unmanned aerial vehicles in humanitarian response. Technical report OCHA Policy and Studies Series. Cited by: §9.
  • K. P. Valavanis and G. J. Vachtsevanos (2014) Handbook of unmanned aerial vehicles. Springer Publishing Company, Incorporated. Cited by: §2.
  • Vantage (2016) Snap drone. External Links: Link Cited by: §7.5.
  • K. Wada and T. Shibata (2007) Living with seal robots—its sociopsychological and physiological influences on the elderly at a care house. IEEE Transactions on Robotics 23 (5), pp. 972–980. Cited by: §1.
  • M. Walker, H. Hedayati, J. Lee, and D. Szafir (2018) Communicating robot motion intent with augmented reality. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 316–324. Cited by: §6.2.
  • A. Wojciechowska, J. Frey, E. Mandelblum, Y. Amichai-Hamburger, and J. R. Cauchard (2019) Designing drones: factors and characteristics influencing the perception of flying robots. Proc. ACM Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT) 3 (3), pp. 1–19. Cited by: §7.2.
  • A. Wojciechowska, J. Frey, S. Sass, R. Shafir, and J. R. Cauchard (2019) Collocated human-drone interaction: methodology and approach strategy. In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 172–181. Cited by: §1, §6.1, §7.3.
  • W. Yamada, H. Manabe, and D. Ikeda (2019) ZeRONE: safety drone with blade-free propulsion. In Proc. ACM Conference on Human Factors in Computing Systems (CHI), pp. 1–8. Cited by: §7.1, §8.
  • N. Yao, E. Anaya, Q. Tao, S. Cho, H. Zheng, and F. Zhang (2017) Monocular vision-based human following on miniature robotic blimp. In Proc. IEEE International Conference on Robotics and Automation (ICRA), pp. 3244–3249. Cited by: §5.1.
  • A. Yeh, P. Ratsamee, K. Kiyokawa, Y. Uranishi, T. Mashita, H. Takemura, M. Fjeld, and M. Obaid (2017) Exploring proxemics for human-drone interaction. In Proc. International Conference on Human Agent Interaction (HAI), pp. 81–88. Cited by: §1, §6.1, §7.2.
  • S. Yohanan and K. E. MacLean (2012) The role of affective touch in human-robot interaction: human intent and expectations in touching the haptic creature. International Journal of Social Robotics (IJSR) 4 (2), pp. 163–180. External Links: ISSN 1875-4805, Document, Link Cited by: §7.7.
  • Zero (2016) Hover Camera. External Links: Link Cited by: §7.5.
  • X. Zhou, S. Liu, G. Pavlakos, V. Kumar, and K. Daniilidis (2018) Human motion capture using a drone. In Proc. IEEE International Conference on Robotics and Automation (ICRA), pp. 2027–2033. Cited by: §5.1.