Firefly: Supporting Drone Localization With Visible Light Communication

12/13/2021
by Ricardo Ampudia Hernández, et al.

Drones are not fully trusted yet. Their reliance on radios and cameras for navigation raises safety and privacy concerns. These systems can fail, causing accidents, or be misused for unauthorized recordings. Considering recent regulations allowing commercial drones to operate only at night, we propose a radically new approach where drones obtain navigation information from artificial lighting. In our system, standard light bulbs modulate their intensity to send beacons, and drones decode this information with a simple photodiode. This optical information is combined with the inertial and altitude sensors in the drones to provide localization without the need for radios, GPS or cameras. Our framework is the first to provide 3D drone localization with light, and we evaluate it with a testbed consisting of four light beacons and a mini-drone. We show that our approach can locate the drone within a few decimeters of its actual position and, compared to state-of-the-art positioning methods, reduces the localization error by 42%.


I Introduction

Delivery with autonomous drones represents a fascinating future not only for commercial products but also for medical and food supply [27, 1]. However, as illustrated in a recent report by McKinsey [12], making this future a reality faces many challenges in range, safety and infrastructure support. People still feel nervous, scared or even angry in the presence of drones, and only 11% of users think that drones should be allowed near homes [15]. Due to these reservations, most of the world still has restrictive commercial regulations, from outright bans to limited experimental licenses [17].

A broad commercial adoption of drones is hindered because people have concerns about safety and privacy. Drone operation depends on three key components: GPS [21], RF wireless links [8], and cameras [19]. But what if one of these components fails or its use is prohibited in certain areas? For example, GPS is known to face limitations indoors and in urban canyons [5]; RF signals face ever-increasing spectrum saturation and are prone to interference; and cameras raise privacy concerns (making them undesirable in various areas [23]) and consume orders of magnitude more power than simpler photosensors (impacting the lifetime of mini drones).

Opportunity. A pervasive and dependable infrastructure is crucial for reliable and safe drone operation, yet it has been largely overlooked [12]. Compared to autonomous vehicles, which build upon a comprehensive road network with sensors, cameras, traffic lights and road signs, unmanned drones do not have any such support. To minimize the risk to citizens, recent regulations are allowing drones to fly only at night [20]. This regulatory framework, effective in April 2021, opens the opportunity of transforming the vast presence of lighting in our cities into a pervasive infrastructure for drone navigation.

Vision. Similar to the way old lighthouses provided navigational aid to maritime pilots, Firefly aims at transforming standard light bulbs –such as those present in our roads, streets and buildings– into a modern version of those lighthouses. Light bulbs will play the role of “air traffic control towers”, modulating their intensities to provide navigation services to drones. To achieve this goal, Firefly exploits recent advances in visible light communication (VLC) [26], an emerging technology that transmits information using any type of LED. The core idea behind VLC is similar to communicating with somebody by turning a flashlight on and off. The modulation, however, is done at such high speeds that humans only see a normal light on (no flickering), while drones will be able to decode digital information using a simple photodiode.

Contributions. The use of visible light communication for drones is largely unexplored. A few theoretical studies have proposed the use of drones to provide temporary Internet coverage to people (i.e., mobile lights acting as access points [31]), and a more recent study assesses the link between a drone's camera and a ground light [9]. These studies focus solely on analyzing the communication link with visible light and do not provide any type of localization service to the drone. On the other hand, various methods have been proposed to use visible light for indoor positioning, but our study is the first to show accurate 3D positioning in scenarios with six degrees of freedom. Overall, Firefly provides two main contributions.

Contribution 1: Analytical Framework [Sections III & IV]. Considering that the state-of-the-art already provides notable contributions on visible light positioning [4, 11, 34, 2], we first perform a thorough analysis of the existing localization primitives and show that the approach decomposing the 3D problem into a 2D+H problem (where H stands for height) is the best alternative [22, 4]. After that, we identify the limitations of the 2D+H approach on drones (due to the high number of degrees of freedom) and propose a framework that combines visible light information, together with the inertial and altitude sensors in drones, to attain accurate 3D localization. A key property of our framework is the low complexity of the HW (transmitters and receiver) and SW (localization method).

Contribution 2: Platforms & Evaluation [Section V]. We evaluate our framework using empirical data. We build a testbed consisting of four light beacons and add a PCB with a single photodiode to a mini-drone. The testbed works in the dark as well as in scenarios exposed to ambient light. Our main result shows that drone localization can be achieved with an accuracy of a few decimeters, improving on available state-of-the-art methods by 42%.

Fig. 1: Basic localization with LEDs.
Fig. 2: LED propagation properties.
Fig. 3: Translational movement.
Fig. 4: Rotational movement.

II Background

In this section, we present the background information on the Lambertian patterns of LED lights, as well as the basic localization principles behind this work. We introduce everything in the 2D space for clarity; however, these concepts are also applicable to the 3D space.

Two types of localization techniques are often used in visible light positioning (VLP): those based on the received signal strength (RSS) and those based on the angle of arrival (AOA). In each method, the received power or angle information acquired from different light sources is used to estimate the position of a receiver. In this work, we focus on RSS methods for the reasons detailed in Section III.

In VLP with RSS, LED lights are used as anchor points with known locations, as shown in Fig. 1. Each LED light (represented by TX1 to TX3) broadcasts a beacon and the receiver measures the RSS of each signal. The receiver uses the received power to estimate its distance to the different light sources and obtains its location through a trilateration method. Starting at the transmitter, the propagation of light follows a Lambertian pattern, which is determined by the equation below and depicted in Fig. 2.

$$R(\phi) = \frac{m+1}{2\pi}\cos^{m}(\phi) \qquad (1)$$

where $\phi$ represents the direction relative to the surface normal of the transmitter, and $m$ represents the Lambertian order. The Lambertian order determines the width of the beam from the transmitter. A small value of $m$ leads to a wider beam, while a larger $m$ leads to a narrower beam, similar to a spotlight.

The Lambertian pattern defines the optical wireless channel between the transmitter and the receiver, and it is described in the following equation:

$$H(d, \psi) = \begin{cases} \dfrac{A}{d^{2}}\cos(\psi), & 0 \le \psi \le \Psi_{FoV} \\ 0, & \psi > \Psi_{FoV} \end{cases} \qquad (2)$$

where $d$ is the distance between the transmitter and the receiver; $\psi$ denotes the incidence angle at the receiver; $A$ is the effective sensing area of the receiver; and $\Psi_{FoV}$ represents the field-of-view (FoV) of the receiver, beyond which the receiver is unable to detect the incoming signal. Combining the propagation pattern of the transmitter and the effect of the channel, the received power at the PD can be written as:

$$P_{r} = P_{t}\, G\, R(\phi)\, H(d, \psi) + N \qquad (3)$$

where $P_t$ is the transmitted power; $G$ is the optical gain of the PD [32]; and $N$ represents the sum of the shot and thermal noise of the receiver, as well as the ambient noise.
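To make the link model concrete, the sketch below (ours, not code from the paper; parameter names follow the notation above) evaluates Eqs. 1–3 for a single transmitter-receiver pair:

```python
import numpy as np

def received_power(P_t, G, m, A, d, phi, psi, fov, noise=0.0):
    """Received optical power at the PD, following Eqs. 1-3.

    P_t : transmitted power [W]        G   : optical gain of the PD
    m   : Lambertian order             A   : PD sensing area [m^2]
    d   : TX-RX distance [m]           fov : receiver field of view [rad]
    phi : irradiance angle [rad]       psi : incidence angle [rad]
    """
    if psi > fov:                                   # outside the FoV,
        return noise                                # only noise is seen
    lambertian = (m + 1) / (2 * np.pi) * np.cos(phi) ** m   # Eq. 1
    channel = A / d**2 * np.cos(psi)                        # Eq. 2
    return P_t * G * lambertian * channel + noise           # Eq. 3
```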

In VLP for drones, the mobile receiver has 6 degrees of freedom (DoF), as shown in Fig. 3 and Fig. 4. Compared to the setups of state-of-the-art (SoA) studies, which typically consider static receivers, the movements of drones introduce two important dynamics. First, the distance to a transmitter changes, affecting the irradiance angle $\phi$. Second, any translational movement requires the drone to tilt around its axes, affecting the incidence angle $\psi$. For example, in order to move forward and backward, the drone has to tilt in the pitch axis; and the tilting angle influences the speed of the movement.

The above two effects can have a significant impact on the RSS ($P_r$) and on the location estimation. For example, considering only a deviation in the angles from 20° to 15°, the distance will be overestimated by more than 10%: around 1.5% due to the misalignment at the receiver (effect of $\psi$ in Eq. 2) and 8.5% due to the misalignment at the transmitter (effect of $\phi$ in Eq. 1, assuming a Lambertian order $m = 6$). Thus, compared to a static scenario where the receiver is not moving and the angles are known and fixed, the constant movement and tilting of drones make localization more challenging. Firefly provides a simple method based on the equations presented in this section, relying solely on parameters available in the data sheets of the transmitters and receivers (i.e., $P_t$, $m$, $A$ and $G$).
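As a quick sanity check of these numbers, the snippet below (ours) inverts Eq. 3 for distance at the assumed versus the true angles, reusing the 20°/15° deviation and the Lambertian order $m = 6$ from the example above:

```python
import numpy as np

m = 6                                   # Lambertian order of the example
true, assumed = np.radians(20), np.radians(15)

# d ~ sqrt(cos^m(phi) * cos(psi) / P_r): ratio of estimated to true distance
ratio = np.cos(assumed) / np.cos(true)
rx = np.sqrt(ratio)                     # psi mismatch (Eq. 2) -> ~1.4%
tx = np.sqrt(ratio**m)                  # phi mismatch (Eq. 1) -> ~8.6%

print(f"receiver misalignment:    {100 * (rx - 1):.1f}%")
print(f"transmitter misalignment: {100 * (tx - 1):.1f}%")
print(f"combined overestimation:  {100 * (rx * tx - 1):.1f}%")
```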

III Analysis of the State of the Art

In this section, we introduce the available VLP techniques in three categories: AOA-based 3D, RSS-based 3D and RSS-based 2D+H. We analyze the feasibility of each category for drone localization using the following criteria:

  • The complexity of the transmitter design, which is critical if we are to transform our lighting system with minimal changes.

  • The complexity of the receiver design considering the limited weight capacity and power budget of drones.

  • The complexity of the algorithm, which affects the processing, memory and power requirements in the constrained environment of the drone.

  • Whether tilting is considered, which is important for mobile scenarios as discussed in Section II.

III-A 3D VLP with AOA

In previous studies, AOA-based methods have been shown to be more accurate than RSS-based methods for 3D VLP in simulations and in experimental evaluations under static conditions, achieving up to sub-decimeter accuracy in certain scenarios [25, 33, 30]. However, AOA-based methods have considerably higher complexities in terms of the required infrastructure support, the underlying mathematical framework and the computational cost. For example, in the designs by Yang et al. [30] and by Xie et al. [29], multiple PDs are arranged at different angles in polygonal structures to allow the use of both AOA and RSS information to uniquely determine the position of a receiver. The requirement of multiple PDs placed at different angles increases the profile of the receiver significantly and may not be feasible on small drones. In the design by Zhu et al. [33], very accurate results are reported based on angle difference of arrival (ADOA) information. However, in addition to using a PD array, a complex framework is also required, which results in a high computation time even on a dual-core processor. In the design by Şahin et al. [25], the complexity of the design is transferred from the receiver to the transmitter by proposing the use of Visible Light Access Points (VAPs). VAPs consist of multiple tilted LEDs, which provide AOA information to a single receiver sensor on the drone. However, this custom design makes the transmitter difficult to scale.

In summary, AOA-based methods can achieve high accuracy but require either complex receiver designs and algorithms, which are infeasible for small drones, or elaborate and costly transmitters, which do not scale well to a large lighting infrastructure.

III-B 3D VLP with RSS

Compared to AOA-based methods, RSS-based methods require considerably less infrastructure but impose some stringent constraints on the evaluation setup. For example, in the studies by Li et al. [18] and by Zhuang et al. [35], the receiver and the transmitter are assumed to be parallel to simplify the problem. Even though these designs have lower complexities, the parallel assumption is unlikely to remain valid when drones are flying. In the study by Cai et al. [6], a 3D RSS-based VLP method is demonstrated with an average error of a few centimeters. However, a computationally intensive particle swarm optimization (PSO) algorithm is adopted to solve for the position, and the experimental scenarios consider receivers that remain static and parallel to the transmitters. In the study by Carreño et al. [7], the computationally heavy Genetic Algorithm (GA) is also considered to solve the 3D VLP problem, with comparable accuracy results.

In summary, RSS-based methods have potential for 3D VLP on drones, as they require lower hardware complexity than AOA-based methods. However, their main shortcomings are computationally intensive optimization algorithms and the fact that the methods are accurate only for static receivers that maintain a parallel orientation with the transmitters.

III-C 2D+H VLP with RSS (Indirect-H)

A promising approach to using RSS-based methods for 3D VLP is to decompose the problem space into 2D+H, where the height (H) and the 2D position of the receiver are estimated independently. For example, in the studies by Plets et al. [22] and by Almadani et al. [3], a list of different height values is sequentially assumed and, for each height, the 2D position is solved via trilateration. After the trilateration process is completed for all heights, the algorithm determines the most probable height by minimizing a cost function over the candidate solutions. Both studies achieve 3D VLP with modest infrastructure and algorithmic complexities, and report favorable results for static receivers that are parallel to the transmitters. In the study of [3], however, when tilting is introduced, the positioning error increases more than 15-fold at a tilting angle of 3° and more than 30-fold at 5°. In addition, the experiments are performed in a controlled environment using a high-end PD, which does not reflect the normal working conditions of drones. A key problem of 2D+H methods is that they estimate height indirectly, through RSS measurements. This approach works well for parallel and static receivers, but mobility and tilting degrade its performance. We use these studies as the baseline of comparison for Firefly, and for the remainder of the paper we refer to 2D+H indirect methods as Indirect-H.

To conclude this section, we show the performance of the discussed SoA methods in terms of accuracy (reported positioning error) and LED density in Fig. 5. The density of anchor points is a major indicator of how well a VLP system will perform and provides a basis for comparison [2]. The methods are categorized by type (AOA or RSS) and testing conditions (whether the receivers are parallel to the transmitters in the evaluation or tilted setups are considered). All these studies, including the ones evaluating tilting, consider only static receivers in their experimental evaluations. In order to achieve high localization accuracy for drones while keeping the hardware complexity low, we propose Firefly, which is discussed in Section IV. In Section V-C we compare Firefly against two other SoA approaches in a significantly more challenging mobile scenario. In Fig. 5, we can observe that when those two SoA methods are tested with 6-DoF drones (arrows), their performance decreases dramatically.

Fig. 5: Accuracy vs LED density for different SoA studies. The curve represents the average trend for the static scenarios (without including Firefly and the two end points of the arrows evaluated in our drone testbed).

IV Proposed method

In Firefly, our aim is to keep the hardware complexity as low as possible. Therefore, we consider an infrastructure where each LED transmitter has a single off-the-shelf light with no tilted or custom-made structure. This maintains the simplicity of the transmitters, making them easily scalable. Similarly, on the receiver side, our design requires each drone to be equipped with only a single PD for VLP.

Based on the analysis of Section III, we consider the Indirect-H methods from [3] and [22] as the initial building block because, even though they assume that the transmitter and receiver have to be parallel to each other to obtain an accurate result, they have low hardware and algorithmic complexities. We will first describe the common framework, assumptions and limitations of RSS methods, in particular the Indirect-H approach. After that, we explain our method and the modifications to the common framework that enable accurate VLP in a mobile scenario.

IV-A Common framework of RSS methods

As described in Sec. II, we assume that the parameters from the data sheets of the LED transmitters and the PD receiver are known. Therefore, given the RSS information at the PD, the distances between the transmitters and the PD on a drone can be calculated directly from Eq. 3, if the incidence angle $\psi_i$ and the irradiance angle $\phi_i$ are also known. A common assumption in the SoA studies, including Indirect-H methods, is:

  • Assumption 1 (A1): The transmitters (TX) and receivers (RX) are parallel ($\phi_i = \psi_i$).

This assumption greatly simplifies the problem and leads to the following equation:

$$\cos(\phi_i) = \cos(\psi_i) = \frac{h_r - h_{t_i}}{d_i} \qquad (4)$$

where $h_r$ denotes the height of the receiver; $h_{t_i}$ represents the height of the i-th transmitter; and $d_i$ is the distance between the receiver and the i-th transmitter. In Fig. 6, we show the reference frame and parameters of the system for clarity. By substituting Eq. 4 into Eq. 3, the distance between the receiver and the i-th transmitter is given by:

$$d_i = \left(\frac{(m+1)\,P_t\,G\,A\,(h_r - h_{t_i})^{m+1}}{2\pi P_{r,i}}\right)^{\frac{1}{m+3}} \qquad (5)$$

Fig. 6: System reference and parameters. Without loss of generality, we place the LEDs (transmitters) on the ground.
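A direct implementation of Eq. 5 might look as follows (a minimal sketch under assumption A1; function and variable names are ours):

```python
import numpy as np

def distance_from_rss(P_r, P_t, G, m, A, h_r, h_t):
    """Distance to the i-th transmitter from its RSS (Eq. 5), under A1:
    parallel TX and RX, so cos(phi_i) = cos(psi_i) = (h_r - h_t) / d_i."""
    num = (m + 1) * P_t * G * A * (h_r - h_t) ** (m + 1)
    return (num / (2 * np.pi * P_r)) ** (1.0 / (m + 3))
```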

With this method, good accuracy can be obtained when A1 holds [22]. However, the study of [3] shows that when RSS is used to estimate height, the result is affected by the tilt of the receiver, which in turn has a considerable effect on the overall accuracy of the method. Thus, the Indirect-H approach is inadequate when assumption A1 is broken. We address the limitations of the Indirect-H method to achieve a more accurate location estimation, even when A1 does not hold.

IV-B Firefly

In Firefly, we adopt a multi-sensor fusion approach. We design a direct height estimation method that is robust against tilting using both inertial sensors (IMUs) and barometric sensors. As most off-the-shelf drones are already equipped with both kinds of sensors, no additional hardware is needed. We directly introduce our estimated height into the distance equations of the common framework and perform two iterations to tackle tilts in the irradiance and incidence angles. Our method consists of three steps:

Step 1: The height of the drone is measured using the onboard barometer and IMU with a complementary filter. The Indirect-H method proposed in SoA studies is applied to correct the long-term zero drift of the barometer.

Step 2: We perform a first iteration of the 2D trilateration method assuming that the transmitters and receiver are parallel. These (initial) location estimates tackle the tilt in the irradiance angle and serve as a basis for Step 3.

Step 3: Taking into account the tilting of the drone obtained from the IMU (incidence angle), we perform a second iteration of the 2D trilateration method, which gives us the final localization results.

Fig. 7: Firefly approach. The common framework of the SoA, which assumes a static parallel setup, is enhanced with two novel components: a direct measurement of height via a complementary filter and a 2D VLP method that considers tilt.

IV-B1 Step 1: Direct height estimation with sensor fusion

To obtain a reliable measurement of the height of a drone, two types of sensors are often considered: IMUs and barometric sensors. IMUs consist of accelerometers and gyroscopes that are used to track the position and orientation of an object. Their low cost, small form factor and low power consumption make them an ideal option for drone navigation. While IMUs are good at measuring instantaneous changes, their outputs are prone to noise and drift errors, even over short periods of time [28]. The output of barometers, which use air pressure to measure altitude, is in turn sensitive to rapid variations in air pressure that are uncorrelated with altitude changes. Over longer time periods, however, barometers can provide a stable estimate of height.

Considering that the strengths of both kinds of sensors are complementary, their respective advantages can be fused to obtain an improved measurement, using the long-term reference of the barometer and the short-term agility of the acceleration measurements. In Firefly, we adopt the complementary filter from [14], applying a high-pass filter to the vertical acceleration from the IMU and a low-pass filter to the output of the barometer. Compared to other sensor fusion methods, such as the Kalman filter, the complementary filter is especially suitable for a resource-limited UAV because of its simplicity.

When combined with inertial sensors, barometers can provide accurate height estimation for drones [24]. Nonetheless, due to changing conditions in the environment, barometers are likely to eventually drift from the initial conditions set during calibration; this is known as zero drift. One possible solution is to perform frequent calibrations at defined locations or checkpoints, but this approach is cumbersome. In Firefly, we propose a simple zero drift correction scheme using VLP, which is shown in Fig. 8.

As described in Sec. III-C, the Indirect-H method already provides a way to indirectly estimate the height of a drone using VLP. This solution is not accurate when tilting occurs, but it provides a drift-free height measurement. Thus, we use the Indirect-H measurement to correct the drift problem, but only when it can be trusted, i.e., when the roll and pitch angles of the drone, obtained from the IMU, are below a threshold such as 3°.

In our method, the height estimated by the Indirect-H VLP is denoted as $h_{vlp}$, and the change of height measured by the barometer is denoted as $\Delta h_{baro}$. Overall, $\Delta h_{baro}$ has a higher accuracy than $h_{vlp}$ in detecting changes in altitude, as we show in Section V-C. The two measurements are scaled by constants $c_{vlp}$ and $c_{baro}$, respectively, to obtain the height estimate $h_{est}$; in the following iterations, $h_{vlp}$ then acts as a correction for the barometric component. By choosing a small value for the constant $c_{vlp}$, the Indirect-H height estimation has only a minor influence on the immediate output, which keeps the short-term height estimation accurate. At the same time, the drift-free $h_{vlp}$ acts as a long-term anchor that corrects the zero drift of the barometer. Together with the IMU data, $h_{est}$ is the input to the complementary filter.

Fig. 8: Block diagram for barometer drift correction scheme.
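The sketch below shows one possible realization of Step 1 (our illustration of the scheme in Fig. 8: the filter gain, the VLP weight and the 3° trust threshold are assumptions, not the exact values used in Firefly):

```python
import numpy as np

class HeightEstimator:
    """Step 1 sketch: complementary filter (barometer + vertical IMU
    acceleration) with a weak Indirect-H VLP anchor against zero drift."""

    def __init__(self, alpha=0.98, c_vlp=0.02, tilt_threshold_deg=3.0):
        self.alpha = alpha            # complementary-filter blend (assumed)
        self.c_vlp = c_vlp            # small weight of the VLP anchor (assumed)
        self.tilt_threshold = np.radians(tilt_threshold_deg)
        self.h = 0.0                  # height estimate [m]
        self.v = 0.0                  # vertical velocity estimate [m/s]

    def update(self, dt, acc_z, h_baro, h_vlp=None, roll=0.0, pitch=0.0):
        # High-pass path: integrate gravity-compensated vertical acceleration.
        self.v += acc_z * dt
        h_imu = self.h + self.v * dt
        # Low-pass path: blend in the barometric altitude.
        self.h = self.alpha * h_imu + (1.0 - self.alpha) * h_baro
        # Drift correction: trust the drift-free Indirect-H height only
        # when the drone is close to level (roll/pitch below the threshold).
        if h_vlp is not None and max(abs(roll), abs(pitch)) < self.tilt_threshold:
            self.h = (1.0 - self.c_vlp) * self.h + self.c_vlp * h_vlp
        return self.h
```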

IV-B2 Step 2: Parallel TX-RX

In the previous step, we directly obtained the height of the receiver ($h_r$). Now, we utilize this accurate height to obtain a first estimate of the $x$ and $y$ coordinates of the receiver. This step tackles the issue caused by the incorrect characterization of the irradiance angle $\phi_i$ in Indirect-H: by directly estimating height, we obtain a better approximation of the irradiance angle than the Indirect-H method. For this first iteration, we consider the common framework of RSS methods, where the receiver is parallel to the transmitter; we take the effect of the tilt of the drone (incidence angle) into account in the next step.

In a typical indoor environment the transmitters are fixed, and the irradiance angle $\phi_i$ is not affected by tilting of the receiver. Hence, Eq. 4 remains valid for the irradiance angle, and we can use Eq. 5 to obtain the distance, provided that the height estimate is accurate. Note in Fig. 6 that, given an estimate of $d_i$ (based on RSS measurements), a correct height is critical to obtain a proper characterization of $\phi_i$; this is the strength of Firefly compared to Indirect-H.

With the distance estimates, we can obtain the position of the drone using a standard trilateration procedure. In our system, as we are using visible light for positioning, we can consider the main sources of noise to be the shot and thermal noises [16]. Therefore, we adopt maximum likelihood estimation (MLE) to perform the 2D trilateration, as it takes statistical considerations into account to minimize the error of noisy measurements [13].
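A compact realization of this step is a nonlinear least-squares fit of the (x, y) position to the estimated horizontal distances, which coincides with the MLE under Gaussian noise (a sketch using scipy as our solver of choice; helper names are ours):

```python
import numpy as np
from scipy.optimize import least_squares

def horizontal_distance(d, h_r, h_t):
    """Project a 3D TX-RX distance onto the horizontal plane."""
    return np.sqrt(max(d**2 - (h_r - h_t) ** 2, 0.0))

def trilaterate_2d(anchors_xy, distances_xy, x0=(0.0, 0.0)):
    """2D position from horizontal distances to known anchors.
    A nonlinear least-squares fit, which is the MLE when the distance
    measurements carry i.i.d. Gaussian noise."""
    anchors_xy = np.asarray(anchors_xy, dtype=float)
    distances_xy = np.asarray(distances_xy, dtype=float)

    def residuals(p):
        return np.linalg.norm(anchors_xy - p, axis=1) - distances_xy

    return least_squares(residuals, x0).x
```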

IV-B3 Step 3: VLP with tilting

In the previous step, we obtained the position of the receiver assuming that the incidence and irradiance angles are equal. As explained before, the irradiance angle $\phi_i$ is not affected by tilting in our setup. The incidence angle $\psi_i$, on the other hand, is directly influenced by the 3D orientation of the receiver. We can obtain this information from the IMU, which we already use to estimate the height of the drone. In this step, we perform an additional iteration of the 2D MLE method, using distance measurements that account for tilt ($d_i'$). From Eq. 3 we have:

$$d_i' = \left(\frac{(m+1)\,P_t\,G\,A\,\cos^{m}(\phi_i)\cos(\psi_i)}{2\pi P_{r,i}}\right)^{\frac{1}{2}} \qquad (6)$$

where the irradiance angle $\phi_i$ is given by Eq. 4 and the incidence angle $\psi_i$ is described by [30]:

$$\cos(\psi_i) = \frac{(\mathbf{p}_{t_i} - \mathbf{p}_r)\cdot\hat{\mathbf{n}}_r}{d_i} \qquad (7)$$

with

$$\hat{\mathbf{n}}_r = R_y(\beta)\,R_x(\alpha)\,\hat{\mathbf{n}}_0,$$

where $\alpha$ and $\beta$ are the roll and pitch Euler angles obtained from the IMU, $R_x$ and $R_y$ are the corresponding rotation matrices, $\hat{\mathbf{n}}_0$ is the nominal normal of the receiver, and $\mathbf{p}_{t_i}$ and $\mathbf{p}_r$ are the positions of the i-th transmitter and of the receiver. We can see from Eq. 7 that the first iteration of the 2D trilateration method (Step 2) is necessary, since the position of the receiver ($x_r$, $y_r$ and $h_r$) is required, in addition to its orientation, to compute $\phi_i$ and $\psi_i$.
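A sketch of this computation is shown below (our reconstruction: the rotation convention for roll and pitch is an assumption, and the nominal normal points down toward the ground-mounted LEDs):

```python
import numpy as np

def incidence_cos(p_tx, p_rx, roll, pitch):
    """cos(psi_i): angle between the RX->TX vector and the receiver
    normal tilted by the IMU roll/pitch (our reading of Eq. 7).
    The nominal normal points down, toward the ground-mounted LEDs."""
    ca, sa = np.cos(roll), np.sin(roll)
    cb, sb = np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])   # roll about x
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])   # pitch about y
    n = Ry @ Rx @ np.array([0.0, 0.0, -1.0])                # tilted normal
    v = np.asarray(p_tx, dtype=float) - np.asarray(p_rx, dtype=float)
    return float(np.dot(v, n) / np.linalg.norm(v))

def distance_with_tilt(P_r, P_t, G, m, A, cos_phi, cos_psi):
    """Tilt-aware distance estimate d_i' (Eq. 6)."""
    return np.sqrt((m + 1) * P_t * G * A * cos_phi**m * cos_psi
                   / (2 * np.pi * P_r))
```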

V Experimental Validation

In this section, we evaluate Firefly and compare it with SoA methods in an indoor testbed shown in Fig. 9. We select two methods as reference: the 3D VLP with RSS method in [7] and the 2D+H VLP with RSS method in [22] which have reported favorable results, with errors below 10 cm, as previously shown in Fig. 5. Two metrics are used for comparison: 1) the location error, calculated as the Euclidean distance from ground truth to the estimation of each method, and 2) the algorithm complexity.

V-A Overview of the testbed

We build a testbed in a 2 m x 2 m x 2 m indoor environment, as shown in Fig. 9(a). The testbed has three main components: 1) 4 transmitting LED sources located at fixed positions, 2) the drone, and 3) a ground truth system to determine and control the drone's position during flight. The connections between these components and a schematic overview of their functionalities are shown in Fig. 9(b). The configuration of the system is listed in Table I.

Fig. 9: Overview of the system. (a) The physical setup of the testbed with 1) the 4 transmitters; 2) the UAV; and 3) the ground truth system. (b) Schematic overview of the connections and functionalities of the components.
Parameter                                      Label                   Value
Testbed size                                   -                       2 m x 2 m x 2 m
Position (x, y, z) and frequency ID of each
  transmitter                                  (x_i, y_i, z_i), f_i    -
Transmitter power                              P_t                     -
Lambertian order                               m                       14
Area of the photodiode                         A                       -
FOV of the receiver                            Psi_FoV                 -
TABLE I: Parameters of the system.

V-A1 Transmitter

CorePro LEDspot LV lamps are used as transmitters. Each LED lamp is controlled by a microcontroller to generate a unique frequency identifier (ID) which will be explained in Section V-B.

V-A2 Receiver

We select the Bitcraze Crazyflie 2.1 as the receiver. It is a lightweight, commercially available drone with multiple peripherals, which makes it easy to attach dedicated or customized hardware to perform different functions. To enable VLP on the drone, an OPT101 PD is attached via a custom PCB, as shown in Fig. 10(b). The PD is connected to a 12-bit ADC on the Crazyflie.

Fig. 10: Crazyflie 2.1 mini-drone. (a) Lighthouse deck used for ground truth (top view) and (b) custom PD deck (bottom view).

V-A3 Ground truth system

We use the Lighthouse positioning system from Bitcraze to control the UAV and retrieve its ground truth position. It consists of two parts: the SteamVR base stations positioned above the flying range of the receiver and the positioning deck attached to the top of the Crazyflie, as shown in Fig. 10(a). The ground truth system uses the two infrared (IR) stations to locate the drone with high accuracy. However, these dedicated hardware stations 1) are expensive (more than $200 USD per station, compared to less than $5 USD for an LED lamp); 2) need careful setup and calibration; and 3) cannot utilize the existing indoor illumination infrastructure, unlike Firefly. Thus, the chosen ground truth system is costly and impractical for drone positioning in large-scale applications, but it is an effective research tool to benchmark the performance of Firefly and the SoA methods.

V-B Multiplexing

In Firefly, each LED transmits a unique frequency ID in the form of a square wave by turning itself on and off. In order to access the shared medium, the four LEDs use frequency division multiple access (FDMA) to transmit their unique IDs simultaneously. The combined signal is sampled by the PD at the receiver and then decomposed by a Fast Fourier Transform (FFT) to obtain the RSS ($P_{r,i}$) corresponding to each frequency ID (i.e., each light source). In this way, Firefly is resilient to high-frequency shot and thermal noise as well as to the constant DC component of ambient light. This allows Firefly to work in both dark and illuminated environments with a simple implementation (i.e., FDMA requires no synchronization and no additional protocol). To showcase this robustness, our experiments were carried out in the presence (interference) of external sources of artificial and natural light.

Note that the frequency IDs of the LEDs in Firefly are selected to avoid interference. As the Fourier transform of a periodic square signal contains odd-harmonic components, using appropriately spaced multiples of a selected base frequency, so that no ID coincides with a harmonic of another, resolves the interference problem [10]. The base frequency in Firefly is chosen arbitrarily for the evaluation; any base frequency can be used, as long as the hardware limitations of the receiver and transmitters are taken into account.
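The receiver-side processing can be sketched as follows (ours; the sampling rate, window length and beacon frequencies are placeholder values, not those of the actual testbed):

```python
import numpy as np

def rss_per_beacon(samples, fs, beacon_freqs):
    """Recover the RSS of each LED from one window of PD samples.
    An FFT separates the FDMA beacons; the DC bin (ambient light) and
    out-of-band noise fall into other bins and are simply ignored."""
    n = len(samples)
    spectrum = np.abs(np.fft.rfft(samples)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # Magnitude of the bin closest to each beacon frequency.
    return {f: spectrum[np.argmin(np.abs(freqs - f))] for f in beacon_freqs}

# Example: four IDs spaced so that none falls on another's odd harmonics
# (placeholder frequencies and sampling rate).
fs, base = 10_000, 100                       # [Hz]
beacons = [2 * base, 4 * base, 8 * base, 16 * base]
```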

V-C Evaluation results

We carry out 8 automated flight tests covering a circular path and a range of flight heights. In each test, the drone follows a pre-programmed route with curves and direction changes; as a result, we constantly induce tilting on the drone to test the VLP methods in a real flight scenario. Along the trajectory, the drone samples the visible light signal and executes the FFT in real time to retrieve the ID and the received power of each LED source. Although Firefly can run in real time on the drone, executing the three algorithms simultaneously is too demanding. Therefore, we send the sensor data log to a remote server, as shown in Fig. 9(b), and use Matlab to compare the methods. Table II lists the data transmitted from the drone.

Source           Variable(s)         Units   Description
PD               P_r,i               -       Power received
IMU              a_x, a_y, a_z       [Gs]    Linear acceleration
IMU              roll, pitch, yaw    [°]     Orientation
Barometer        h_baro              [m]     Altitude above sea level
Lighthouse deck  (x, y, z)           [m]     Ground truth position
TABLE II: Sensor variables transmitted to the remote client.

V-C1 Positioning error

Fig. 11 shows the mean error of the three methods across the 8 different tests. We label the method from [22] as Indirect-H and the method from [7] as 3D PSO. Firefly clearly outperforms the other two algorithms in terms of both the mean and the maximum error across all tests. Firefly and Indirect-H are closer to each other in accuracy, while 3D PSO evidently displays the lowest performance of all. Notice that the results reported by Indirect-H and 3D PSO in Fig. 5 are below 10 cm, but that performance degrades significantly when the methods are tested in a mobile scenario with 6 DoF, yielding errors of several decimeters. We now discuss the results obtained by Firefly with respect to each method in detail.

Firefly vs. Indirect-H: Let us take a closer look at the Indirect-H results in one of the tests. In Fig. 12, we show the height estimation of Indirect-H and Firefly, and the receiver angles during the flight test. During lift-off (timestep 0 to 60) and landing (after timestep 200), Indirect-H is able to detect the monotonic changes in altitude, but to a much smaller extent than Firefly. More significantly, during the route (timestep 60 to 200), Indirect-H fails to capture the variation in height. In contrast, Firefly closely resembles the profile of the ground truth and does not show any significant drift over the whole flight test.

Considering the height measurements of all tests, the mean absolute error of Firefly is substantially lower than that of the Indirect-H method. In our tests, the drone is exposed to tilting angles of up to 7°, as shown in Fig. 12 (bottom). Although some of the errors of the Indirect-H method can be attributed to tilting, its height estimation does not accurately reflect the ground truth even when the tilting angle of the receiver is small. For example, in the segment between timestep 100 and 150, the roll and pitch angles are small; even though the receiver and transmitter are close to parallel during this time, we can see in Fig. 12 (top) that the height estimated by the Indirect-H method does not match the ground truth. Our approach considers the input of the indirect method but only with a small weight to compensate for drift, c.f. Fig. 8. As a result, Firefly produces an accurate height estimation using multi-sensor fusion.

In Fig. 13(b), we now analyze the 2D position by looking at the top view of the trajectory for the same test. The Indirect-H estimate roughly captures the circular motion of the drone, but not with the amplitude seen in the ground truth. Firefly achieves a wider amplitude in the x-y plane that is closer to the ground truth. Since the 2D position of the receiver is computed from the same RSS information in both methods, this improvement can mainly be attributed to (1) an accurate height estimation using a sensor-fusion approach and (2) the incorporation of receiver tilt into the equations.

Fig. 13(a) depicts the 3D trajectory of both methods. It supports our previous analysis that Firefly provides a more accurate location than Indirect-H. From the statistical measurements of the position error across all 8 tests, Firefly improves the accuracy by around 42%, as listed in Table III.

Fig. 11: Mean-error plot for the 8 test flights.
Fig. 12: Height estimation (top) and tilting angles of the receiver (bottom) in Test 5.
                    Indirect-H [22]   Firefly   Improvement
Mean error (cm)     40.01             23.19     42.05%
Median error (cm)   41.02             23.62     42.41%
Max. error (cm)     68.98              9.64?    38.35%
TABLE III: Improvement of positioning accuracy.

Firefly vs. 3D PSO: In another test, shown in Fig. 14(a), we look into the estimated trajectory of the 3D PSO method. 3D PSO displays the expected circular motion in the x-y plane, but the altitude estimation is far from the ground truth, reaching the imposed upper bound of the test area (i.e., the 2 m ceiling of the testbed). In Fig. 14(b), the top view of the trajectory for the same test shows that the amplitude of the trajectory is very wide compared to the ground truth. Overall, 3D PSO does not perform as well as the other methods and has the highest mean positioning error of the three.

The inaccurate results of 3D PSO can be explained by its model, which assumes parallel transmitters and receiver. In the method, the gain of the system (i.e., the ratio of the received and transmitted power) is used to estimate the coordinates that most accurately describe the channel loss (see Eq. 2). However, the channel loss cannot be obtained directly from the received power, because the distance is also affected by the height of the receiver. When the assumptions of the model are broken, small angle variations can have a significant effect on the distance calculation, as the height is not known. Thus, the 3D position cannot be determined precisely.

In our tests, we show that 3D RSS methods are not effective strategies to determine the position of UAVs. Although they have provided favorable results when tested under controlled conditions, the position accuracy is severely degraded once tilting and movement are introduced.

Fig. 13: Firefly vs. Indirect-H [22] in Test 5. (a) 3D trajectory; (b) 3D trajectory (top view).
Fig. 14: Firefly vs. 3D PSO [7] in Test 8. (a) 3D trajectory; (b) 3D trajectory (top view).

V-C2 Algorithmic complexity

The 3D PSO method considers multiple particles (candidate solutions) that are evaluated at each iteration. This results in multiple function evaluations and a quadratic complexity, without taking into account additional modifications of the method. In [6] and [7], 20 iterations of the method are allowed, and a particle group size of 200 is considered in the latter study, which results in up to 4000 function evaluations.

In the Indirect-H method, the height is computed by iteratively evaluating a set of candidate heights in the range of interest; a trilateration step is performed for each candidate. Considering a resolution of 1 mm and the height of our setup (2 m), this amounts to 2000 evaluations. The time complexity of this iterative approach is O(n) in the number of candidate heights, if we consider that the computation time of each iteration (consisting of a trilateration step) is constant. A fast search optimization of the algorithm is also proposed in [22] to narrow the search interval, which reduces the computation time by 90%.

In Firefly, we implement the complementary filter proposed in [14] to estimate the height, which consists of two discrete equations that can each be computed in one step. Then, we execute the 2D trilateration method twice. Considering the computation time of the trilateration steps constant (as we do for the Indirect-H method), the time complexity of these steps is O(1). When we apply the long-term drift correction for the barometer, we also use the Indirect-H method to remove zero drift; at those steps, our algorithm therefore inherits the complexity of Indirect-H (with its fast search optimization). However, in our empirical evaluations, we find that the barometer correction can be executed less frequently without a considerable impact on the positioning accuracy: executing the correction only once every several iterations of our method amortizes its cost, while increasing the mean error of Firefly only marginally.

VI Conclusion

This work illustrates, for the first time, that VLC can be used for accurate 3D positioning of drones with six degrees of freedom in a realistic experimental setup. It paves the way for using standard light bulbs to provide navigation services to drones. A novel localization method, Firefly, is proposed by building upon the line of research that decomposes the 3D positioning problem into 2D+H. Compared to SoA studies, it removes the need to impose complex requirements on the transmitter and receiver designs to achieve an accurate positioning result. In addition, it overcomes the limitation of SoA RSS methods, which are restricted to parallel receivers and transmitters and do not consider tilting. By utilizing the on-board sensors of the drone, we accurately estimate height and account for the tilting of the receiver. The algorithm is also lightweight and can be implemented on a drone for real-time execution. As demonstrated by the experimental results, Firefly achieves a mean positioning error of around 23 cm using off-the-shelf LED lights and low-cost sensors, reducing the localization error compared to other SoA methods by around 42% under the same experimental setup.

References

  • [1] Matternet.
  • [2] M. Afzalan and F. Jazizadeh (2019) Indoor positioning based on visible light communication: a performance-based survey of real-world prototypes. ACM Computing Surveys (CSUR) 52 (2), pp. 1–36.
  • [3] Y. Almadani, M. Ijaz, W. Joseph, S. Bastiaens, S. Rajbhandari, B. Adebisi, and D. Plets (2019) A novel 3D visible light positioning method using received signal strength for industrial applications. Electronics 8 (11), pp. 1311.
  • [4] Y. Almadani, M. Ijaz, S. Rajbhandari, B. Adebisi, and U. Raza (2018) Application of visible light communication in an industrial environment. In 2018 11th International Symposium on Communication Systems, Networks and Digital Signal Processing (CSNDSP), pp. 1–6.
  • [5] (2020) Autonomous drone navigation system ends reliance on GPS. NASA Technology Transfer Program.
  • [6] Y. Cai, W. Guan, Y. Wu, C. Xie, Y. Chen, and L. Fang (2017) Indoor high precision three-dimensional positioning system based on visible light communication using particle swarm optimization. IEEE Photonics Journal 9 (6), pp. 1–20.
  • [7] C. Carreño, F. Seguel, P. Adasme, I. Soto, N. Krommenacker, P. Charpentier, and V. Bombardier (2020) Comparison of metaheuristic optimization algorithms for RSS-based 3-D visible light positioning systems. In 2020 South American Colloquium on Visible Light Communications (SACVC), pp. 1–6.
  • [8] H. Chao, Y. Cao, and Y. Chen (2010) Autopilots for small unmanned aerial vehicles: a survey. International Journal of Control, Automation and Systems 8 (1), pp. 36–44.
  • [9] B. Chhaglani, A. S. Anand, N. Garg, and A. Ashok (2020) Evaluating LED-camera communication for drones. In Proceedings of the Workshop on Light Up the IoT (LIOT '20), New York, NY, USA, pp. 18–23.
  • [10] S. De Lausnay, L. De Strycker, J. Goemaere, N. Stevens, and B. Nauwelaers (2015) A visible light positioning system using frequency division multiple access with square waves. In 2015 9th International Conference on Signal Processing and Communication Systems (ICSPCS), pp. 1–7.
  • [11] T. Do and M. Yoo (2016) An in-depth survey of visible light communication based positioning systems. Sensors 16 (5), pp. 678.
  • [12] T. Duvall, A. Green, L. Meredith, and K. Miele (2019, Feb.) Air-mobility solutions: what they'll need to take off. McKinsey.
  • [13] E. Goldoni, A. Savioli, M. Risi, and P. Gamba (2010, May) Experimental analysis of RSSI-based indoor localization with IEEE 802.15.4. pp. 71–77.
  • [14] W. T. Higgins (1975) A comparison of complementary and Kalman filtering. IEEE Transactions on Aerospace and Electronic Systems AES-11 (3), pp. 321–325.
  • [15] P. Hitlin (2017, Dec.) How Americans feel about drones and ways to use them. Pew Research Center.
  • [16] L. Hua, Y. Zhuang, L. Qi, J. Yang, and L. Shi (2018) Noise analysis and modeling in visible light communication using Allan variance. IEEE Access 6, pp. 74320–74327.
  • [17] T. M. Jones (2017, Dec.) International commercial drone regulation and drone delivery services.
  • [18] L. Li, P. Hu, C. Peng, G. Shen, and F. Zhao (2014) Epsilon: a visible light based positioning system. In 11th USENIX Symposium on Networked Systems Design and Implementation (NSDI 14), pp. 331–343.
  • [19] Y. Lu, Z. Xue, G. Xia, and L. Zhang (2018) A survey on vision-based UAV navigation. Geo-spatial Information Science 21 (1), pp. 21–32.
  • [20] U.S. Department of Transportation. The operation of unmanned aircraft systems over people, ruling effective April 21, 2021.
  • [21] A. Patrik, G. Utama, A. A. S. Gunawan, A. Chowanda, J. S. Suroso, R. Shofiyanti, and W. Budiharto (2019, Jun.) GNSS-based navigation systems of autonomous drone for delivering items. Springer International Publishing.
  • [22] D. Plets, Y. Almadani, S. Bastiaens, M. Ijaz, L. Martens, and W. Joseph (2019) Efficient 3D trilateration algorithm for visible light positioning. Journal of Optics 21 (5), pp. 05LT01.
  • [23] S. Rice (2019, Feb.) Eyes in the sky: the public has privacy concerns about drones. Forbes Magazine.
  • [24] A. M. Sabatini and V. Genovese (2014) A sensor fusion method for tracking vertical velocity and height based on inertial and barometric altimeter measurements. Sensors 14 (8), pp. 13324–13347.
  • [25] A. Şahin, Y. S. Eroğlu, I. Güvenç, N. Pala, and M. Yüksel (2015) Hybrid 3-D localization for visible light communication systems. Journal of Lightwave Technology 33 (22), pp. 4589–4599.
  • [26] D. Tsonev, S. Videv, and H. Haas (2013) Light fidelity (Li-Fi): towards all-optical networking. In Broadband Access Communication Technologies VIII.
  • [27] Zipline (2021, Jun.) Vital, on-demand delivery for the world.
  • [28] O. J. Woodman (2007) An introduction to inertial navigation. Technical report, University of Cambridge, Computer Laboratory.
  • [29] B. Xie, K. Chen, G. Tan, M. Lu, Y. Liu, J. Wu, and T. He (2016) LIPS: a light intensity-based positioning system for indoor environments. ACM Transactions on Sensor Networks (TOSN) 12 (4), pp. 1–27.
  • [30] S. Yang, H. Kim, Y. Son, and S. Han (2014) Three-dimensional visible light indoor localization using AOA and RSS with multiple optical receivers. Journal of Lightwave Technology 32 (14), pp. 2480–2485.
  • [31] Y. Yang, M. Chen, C. Guo, C. Feng, and W. Saad (2019) Power efficient visible light communication with unmanned aerial vehicles. IEEE Communications Letters 23 (7), pp. 1272–1275.
  • [32] L. Yin, X. Wu, and H. Haas (2015) Indoor visible light positioning with angle diversity transmitter. In Proceedings of the IEEE Vehicular Technology Conference (VTC Fall), pp. 1–5.
  • [33] B. Zhu, J. Cheng, Y. Wang, J. Yan, and J. Wang (2018) Three-dimensional VLC positioning based on angle difference of arrival with arbitrary tilting angle of receiver. IEEE Journal on Selected Areas in Communications 36 (1), pp. 8–22.
  • [34] Y. Zhuang, L. Hua, L. Qi, J. Yang, P. Cao, Y. Cao, Y. Wu, J. Thompson, and H. Haas (2018) A survey of positioning systems using visible LED lights. IEEE Communications Surveys & Tutorials 20 (3), pp. 1963–1988.
  • [35] Y. Zhuang, L. Hua, Q. Wang, Y. Cao, Z. Gao, L. Qi, J. Yang, and J. Thompson (2019) Visible light positioning and navigation using noise measurement and mitigation. IEEE Transactions on Vehicular Technology 68 (11), pp. 11094–11106.