Detecting Invasive Insects with Unmanned Aerial Vehicles

03/03/2019 · Brian Stumph et al. · USDA and Marquette University

A key aspect of controlling and reducing the effects invasive insect species have on agriculture is obtaining knowledge about their migration patterns. Current state-of-the-art methods of studying these patterns rely on a mark-release-recapture technique, in which insects are marked and released, and researchers later attempt to recapture them. However, this approach requires a human researcher to manually search for the insects in large fields and results in very low recapture rates. In this paper, we propose an automated system for detecting released insects using an unmanned aerial vehicle. The system combines ultraviolet lighting technology, digital cameras, and lightweight computer vision algorithms to detect insects more quickly and accurately than the current state of the art. The efficiency and accuracy of this system will allow for a more comprehensive understanding of invasive insect species' migration patterns. Our experimental results demonstrate that the system can detect real target insects in field conditions with high precision and recall.


I Introduction

Invasive species are often inadvertently transported from their native environments to new habitats, where they tend to have severe negative impacts on food security, public health, economic interests, and native species biodiversity [13]. As an example, in the United States, annual economic losses from invasive species are estimated at $120 billion [26], and these severe impacts are predicted to continue [23].

Understanding invasive pest migration patterns is a key element to the mitigation of their damage to both natural environments and agricultural production. Insect species are highly variable in their dispersal capacity. For example, invasive brown marmorated stink bugs (BMSB) are extremely mobile and can fly 115km in 24 hours [12, 35], whereas invasive emerald ash borers typically fly less than 100m in natural habitats [16]. Therefore, the dispersal capacity of newly identified invasive insect species must be well-defined to determine what strategy (eradication or management) to initiate.

Fig. 1: Fluorescent-marked BMSB detected in a 30 m tree using a hand-held laser equipped with a focusing lens [28]. The inset shows a fluorescent-marked BMSB illuminated by UV light.

Research in this area has focused on the influence of wind and temperature on long-range migrations [33, 5], neglecting short-range dispersal patterns. Although there are studies of insect dispersal, they rely on mark-release-recapture techniques – a laborious, time-consuming, and error-prone approach. In addition, because of the labor-intensive nature of the process, very few of the marked insects are recaptured, typically fewer than 5%. This extremely low recapture rate negatively impacts the accuracy of the resulting dispersal models [29, 17, 20].

In this paper, we describe a novel aerial system that attempts to solve the limitations of the manual methods described above. The proposed approach uses an Unmanned Aerial Vehicle (UAV) to scan a region of interest and detect target insects using a recently developed system of ultraviolet (UV) lights. With the UAV equipped with a UV light source and video camera, we record an aerial video of the area. We later process the video offline to obtain an accurate count of the marked insects released in the field.

The video processing pipeline described in this paper employs multi-level color channel thresholding to segment the marked insects from the background. It consists of three main steps. First, we extract the region of interest (ROI) from the video frames, keeping only the region surrounding the area illuminated by the UV light. Then, we identify the illuminated insects using color channel thresholding in the RGB and HSV color spaces. Finally, we use watershed segmentation [32] to separate nearby insects that might have been identified as a single insect in the previous step. Our experiments demonstrate the detection of marked insects in field conditions.

The contributions and impact of this work are as follows: 1) we propose a novel UAV-based data acquisition system that allows us to quickly search for insects over a large area of interest; 2) we design a low-complexity computer vision algorithm to robustly detect and count the insects observed by our system; and 3) we evaluate the data acquisition system in open-field conditions using industry-standard benchmarks. This work represents a considerably faster, more economical, and more accurate alternative to current manual mark-release-recapture methods for insect pest monitoring. Moreover, because there is no need to recapture the insects once they are detected, our non-intrusive method allows researchers to dynamically localize insects over time.

II Related Work

For over a century, entomologists have used mark-release-recapture techniques to track insect movement. Early efforts used paint and dyes as markers [10]. In recent years, protein markers were developed that can mark thousands of individual insects within minutes [8, 7], enabling more sophisticated studies. However, protein marking involves inherent challenges. First, the proteins degrade quickly under field conditions. In addition, a time-consuming process based on enzyme-linked immunosorbent assays (ELISA) is needed to determine whether a specific protein marker is present [4]. Finally, the ELISA test destroys the target insect, eliminating the possibility of tracking the same insect more than once.

II-A Ultraviolet light tracking

Methods based on fluorescent pigments are a simple alternative that can effectively mark millions of insects in a short amount of time [31, 9]. Fluorescent markings (see Figure 1) glow under UV light by transforming radiant energy from the UV band to longer wavelengths that are detectable by the human eye. However, the detection distances of fluorescent powders under UV light sources are short – less than a few meters – even in dark conditions. Thus, many studies employing fluorescent pigment marking require recaptured organisms to be taken to the laboratory for mark verification under UV lamps [14, 19], rather than verifying marks at the insect's initial location.

Rice et al. [28] described a novel method for detecting fluorescent-marked insects using hand-held UV lasers with focusable lenses, which allow the beam to be widened significantly for a larger search area and increase the detection distance up to 40 m. The method also enables non-destructive scanning from the ground of previously inaccessible habitats, including tree canopies and aquatic habitats. These contributions have increased both recovery rates and the amount of information gathered. UAVs have also been combined with UV lights in other research applications; one example uses a UAV to inspect transmission lines with a UV light that highlights high-voltage corona problems [30], demonstrating the value of UV light in UAV vision operations.

II-B Agricultural applications of unmanned aerial vehicles

Recently, there has been increasing interest in the use of UAVs in agricultural research, with predictions that they will revolutionize spatial ecology [2]. UAVs provide the ability to collect remote data at an unprecedented scale and sampling rate, at a fraction of the cost of previous methods such as satellites or manned aircraft [36]. Current research applications include classification and monitoring methods, such as weed classification [15] and multispectral imaging for crop monitoring [18]. Related to our goal of mapping insects in a field, other work includes aerial mapping of rice crops [22], weed mapping of large fields with UAV swarms [1], and terrain surveying of disjoint fields aided by UAVs and path planning [34]. Lastly, UAVs can support detection and estimation tasks, such as providing vital data on plant stress [3].

The UAV sensing modality is determined by the application needs as well as the payload capacity of the aircraft. Hyperspectral and multispectral cameras are among the most popular sensors, but conventional and thermal cameras, lidar, radar imaging, and even chemical sensing have been applied in specialized areas [24]. Recently, radio tags have been employed to track invasive fish species using UAVs [27]. However, no application of UAVs to tracking insect movement and dispersal in field conditions has been proposed so far.

III Insect Detection System

In this section, we describe the hardware and software components of the proposed detection system. We designed the system under the assumption that the insects have been previously coated with a fluorescent pigment and that data acquisition is carried out at night, with limited artificial illumination other than the UV light system. The insect detection process consists of two primary components: aerial data acquisition, and the insect detection and segmentation algorithm. Data acquisition consists of flying the UAV over the area under study with the camera and UV light source facing down, illuminating and filming any coated insects on the ground. Once data acquisition is completed, the video files are downloaded onto a workstation and used as input to the software pipeline, which applies multi-channel thresholding to detect all the insects in the video.

Fig. 2: a) Matrice 100, equipped with a Zenmuse X3 camera and gimbal unit and a UV light system. b) Close-up view of the bottom of the UAV. The blue circle denotes the camera and the red rectangle indicates the UV LED array.

III-A Hardware components

Our system consists of a UAV equipped with a set of ten ultraviolet lights and a high-resolution camera and gimbal unit. The UAV is a DJI Matrice 100 – a four-propeller UAV that is ready to fly off the shelf yet fully customizable (see Figure 2). A benefit of the Matrice 100 is its extended flight time of up to 40 minutes, which is essential for the practicality of our proposed system, as large fields take considerable time for the UAV to cover.

The camera and gimbal unit is the Zenmuse X3, an all-in-one gimbal and camera that produces high-resolution video with 3-axis stabilization. The gimbal is set to point perpendicular to the ground at all times, and the 3-axis stabilization allows clear, blur-free footage to be collected while the UAV is moving. The camera's parameters can be adjusted through the DJI Ground Station Pro app for iOS devices. All tests were conducted in a low-light environment, so we used an ISO rating of 1600 and a shutter speed of 1/25 second for all video footage.

The UV illumination system is attached to the bottom frame of the UAV and is controlled by a remote transmitter. It consists of high-power violet LEDs with a wavelength of 395 nm, each encapsulated by a narrowing lens that focuses the emitted light. The lights are mounted on an aluminum heatsink (see Figure 2) and powered by a set of four 3.7 V Li-ion batteries with 3400 mAh capacity. The power source is controlled with an RF remote relay switch fastened between the heatsink and the battery pack. The light system is secured to the UAV with four fastening screws so that the UV lights point downward.

Although the camera is mounted on a gimbal, the illumination system is not. This inevitably causes the UV projection to always be perpendicular to the UAV rather than the ground when the UAV is in motion. To minimize the movement of the UV-illuminated area, we installed the UV LED array in the center of the vehicle’s body frame, which corresponds to the pivot point of the pitch, roll, and yaw motions. This layout also improves flight dynamics and stability, as it reduces the moment of inertia caused by the weight of the system (1029g with batteries). Additionally, the UAV is flown at the relatively slow speed of 1 m/s to reduce the angular tilt caused by the forward movement of the UAV.

Fig. 3: Illustrative intermediate steps of the proposed software pipeline. a) Original image multiplied by the thresholded image $D_k$. b) Regional intensity peaks $P_k$. c) Individual insect detections $S_k$, obtained by applying $D_k$ as a mask to $P_k$. d) Binarized $S_k$ with the blob counter added at the top left of the image. e) Original image frame with counter and overlaid bounding boxes around the detected insects.

III-B Software pipeline

The video recorded during the flight, $V$, consists of a sequence of frames

$V = \{F_1, F_2, \ldots, F_n\}$,   (1)

where each frame $F_k \in \mathbb{R}^{H \times W \times C}$, with $H$ and $W$ representing the height and width of the image, and $C$ its number of color channels. Thus, each image pixel $F_k(i, j)$ is a $C$-dimensional vector, where $i$ and $j$ are its coordinates in the image. The recorded video is then used as the input to Algorithm 1, which summarizes the processing steps carried out after each data collection flight.

Input: UAV camera video file $V$.
Output: Video showing the detections and detection counts per frame.
1: for each frame $F_k \in V$ do
2:     Extract an $h \times w$ region of interest $R_k$ from $F_k$.
3:     Create the detection image $D_k$ by thresholding $R_k$ in the HSV and RGB color spaces.
4:     Combine the regional hue maxima $P_k$ of $R_k$ with $D_k$ to split groups of insects into individual detections, creating $S_k$.
5:     Count and locate the insects by analyzing the blobs of the binarized $S_k$.
Algorithm 1: Insect detection video processing algorithm.

1) Step 1: Region of interest selection: To reduce the execution time of our software pipeline, a region of interest $R_k$ is selected around the middle of each frame. A supplemental advantage of selecting a region of interest is that it reduces noise at the edges of the frame, since insects should only be visible within the UV-illuminated portion of the frame. The field of view (FOV) of the light system is significantly narrower than the FOV of the Zenmuse X3. Therefore, we determined the size of the region of interest, $h \times w$, experimentally to ensure that the UV light projection always remains within it. The resulting region is 720 pixels in height and corresponds to 25% of the original image size, which allows for a relative angle of up to 16° between the camera and the light beam.
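As an illustration of this cropping step, the following Python sketch (ours; the authors worked in MATLAB) derives the ROI width from the stated 720-pixel height and 25% area fraction, which is our assumption about how the two constraints combine:

```python
import numpy as np

def extract_roi(frame: np.ndarray, roi_h: int = 720, area_frac: float = 0.25) -> np.ndarray:
    """Crop a centered region of interest so that downstream steps only see
    the UV-illuminated part of the frame. The 720 px height and 25% area
    come from the paper; deriving the width from them is our assumption."""
    H, W = frame.shape[:2]
    roi_w = int(area_frac * H * W / roi_h)  # width implied by the area constraint
    y0 = (H - roi_h) // 2                   # center the crop vertically...
    x0 = (W - roi_w) // 2                   # ...and horizontally
    return frame[y0:y0 + roi_h, x0:x0 + roi_w]
```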

2) Step 2: Thresholding: To identify the insects in $R_k$, we segment its pixels as belonging to the foreground (insects) or background using simple color thresholding. This is feasible because of the distinctive color of the marked insects with respect to the background. The result of this step is $D_k$, an image displaying the detected insects over a black background, as shown in Figure 3(a). To achieve this result, we use two different color space representations, RGB (red, green, blue) and HSV (hue, saturation, value).

The fluorescent powder used shows a pink color when illuminated with UV light. Pink exhibits very low green values compared with its red and blue components, and we enforce this relation between the three color channels in our insect thresholding algorithm. That is, let $p = (R, G, B)$ be a pixel in $R_k$; then $p$ is considered a foreground pixel as long as its green component is lower than both its red and blue components.

RGB thresholding alone, however, is not robust enough against reflective surfaces such as dewy grass. In our case, the reflection of the UV light (violet in color) on some surfaces can result in dark pink pixels, which can be mistaken for insects. To improve the robustness of our method, we incorporate HSV color space thresholding. We use a set of nine calibration images of insects illuminated by the UV light on a non-reflective background to determine $\tau_V$, the average value of the brightness (value) component of the reflected light, and we use this value as a lower detection threshold. That is, for a pixel in the HSV representation of $R_k$ to be considered a foreground pixel, its value component must satisfy $V \geq \tau_V$. The determined value was $\tau_V = 40$. This value-channel threshold additionally differentiates insects from the black background, since black pixels have very low brightness values. We determined experimentally that our method is not sensitive to the number of calibration images: results are identical for any number between five and eighteen.

The bright red and pink colors of the marked insects correspond to both very high and very low values in the hue channel, leading us to impose both an upper and a lower threshold on the hue level, denoted $\tau_H^{hi}$ and $\tau_H^{lo}$, respectively. Conversely, we found that marked insects typically exhibit saturation values in the middle of the saturation spectrum. Thus, we again use upper and lower thresholds for the saturation channel, denoted $\tau_S^{hi}$ and $\tau_S^{lo}$, respectively. That is, for a pixel to be considered a foreground pixel, it must satisfy $H \geq \tau_H^{hi}$ or $H \leq \tau_H^{lo}$, and $\tau_S^{lo} \leq S \leq \tau_S^{hi}$. The following threshold values were found to be optimal in our experiments: $\tau_H^{hi} = 220$, $\tau_H^{lo} = 25$, $\tau_S^{lo} = 90$, $\tau_S^{hi} = 255$.
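The combined rule can be made concrete with a short sketch. The Python/OpenCV version below (ours; the original pipeline is MATLAB) applies the RGB relation and the HSV bounds; it uses OpenCV's COLOR_BGR2HSV_FULL conversion so that hue spans 0–255 and the reported thresholds apply directly, and the mapping of the four reported numbers onto individual bounds follows our reading of the text:

```python
import cv2
import numpy as np

# Threshold values reported in the paper; assigning them to upper/lower
# hue and saturation bounds is our interpretation.
TAU_H_HI, TAU_H_LO = 220, 25   # foreground hue near red: H >= 220 or H <= 25
TAU_S_LO, TAU_S_HI = 90, 255   # saturation band of the pink pigment
TAU_V = 40                     # calibrated brightness floor

def threshold_insects(roi_bgr: np.ndarray) -> np.ndarray:
    """Step 2 sketch: build the binary detection mask D_k by combining the
    RGB rule (green below red and blue, as the pink pigment implies) with
    the HSV bounds."""
    r = roi_bgr[..., 2].astype(np.int16)
    g = roi_bgr[..., 1].astype(np.int16)
    b = roi_bgr[..., 0].astype(np.int16)
    rgb_mask = (g < r) & (g < b)                     # pink: low green vs. red, blue

    # COLOR_BGR2HSV_FULL scales hue to 0..255, matching the reported values.
    h, s, v = cv2.split(cv2.cvtColor(roi_bgr, cv2.COLOR_BGR2HSV_FULL))
    hsv_mask = (((h >= TAU_H_HI) | (h <= TAU_H_LO))  # red hue wraps around 0
                & (s >= TAU_S_LO) & (s <= TAU_S_HI)
                & (v >= TAU_V))                      # brighter than UV reflections

    return (rgb_mask & hsv_mask).astype(np.uint8)    # 1 = insect pixel
```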

3) Step 3: Segmentation: Thresholding alone is not sufficient to identify multiple insects in close proximity to each other. Groups of coated insects appear as a single brightly colored area, and the gap between them is often detected as part of the foreground (Figure 3(a)). To overcome this difficulty, we use watershed segmentation [25] to split groups of insects into individual detections. By applying watershed segmentation to $R_k$, we generate an image of local intensity peaks, $P_k$ (Figure 3(b)). Then we use $D_k$ as a mask for $P_k$, so that only the peaks inside the thresholded blobs of $D_k$ are maintained. The new image is given by

$S_k = D_k \odot P_k$,   (2)

where $\odot$ represents elementwise multiplication. As Figure 3(c) illustrates, all detected insects can be clearly identified in the resulting image $S_k$.
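One possible realization of this peak-masking idea (Eq. 2) uses scikit-image's peak detection and a marker-controlled watershed; the smoothing scale and minimum peak distance below are our assumptions, not values from the paper:

```python
import numpy as np
from scipy import ndimage
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def split_touching_insects(roi_value: np.ndarray, det_mask: np.ndarray) -> np.ndarray:
    """Step 3 sketch: keep only intensity peaks that fall inside the
    thresholded blobs (S_k = D_k (*) P_k), then grow each surviving peak
    back to the blob boundary so touching insects get separate labels."""
    # Smooth so each insect contributes roughly one regional peak.
    smooth = ndimage.gaussian_filter(roi_value.astype(float), sigma=1.5)
    peaks = peak_local_max(smooth, min_distance=3, labels=det_mask.astype(int))
    peak_img = np.zeros_like(det_mask)
    peak_img[tuple(peaks.T)] = 1                     # P_k already masked by D_k

    # Marker-controlled watershed: one marker per peak, confined to D_k.
    markers, _ = ndimage.label(ndimage.binary_dilation(peak_img))
    labels = watershed(-smooth, markers, mask=det_mask.astype(bool))
    return labels                                    # 0 = background, 1..n = insects
```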

4) Step 4: Counting: To count the detected insects in $S_k$, we first binarize the image using the thresholding and segmentation steps described above. We then perform blob analysis using MATLAB's Computer Vision Toolbox (https://www.mathworks.com/products/computer-vision.html), counting the number of blobs (groups of connected foreground pixels) in the image. This step additionally allows us to apply a detection size filter that discards blobs containing fewer than a minimum number of pixels.

To calculate the total number of insects in the video, we maintain a global counter that is incremented whenever the number of blobs in $S_k$ is higher than the number of blobs in the previous frame, $S_{k-1}$. Figure 3(d) shows the result of this step. Note that this method does not account for potential double-counting caused by scanning an image area multiple times; addressing this limitation is part of our future work.
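The paper performs this step with MATLAB's blob analysis; the OpenCV stand-in below shows the same size-filtered counting, where the minimum-area value is an assumed placeholder and reading the counter rule as "add the increase in blob count" is our interpretation:

```python
import cv2
import numpy as np

def count_blobs(binary: np.ndarray, min_area: int = 4):
    """Step 4 sketch: return a bounding box (x, y, w, h) per connected
    foreground region, discarding blobs below the size filter."""
    n, _, stats, _ = cv2.connectedComponentsWithStats(binary.astype(np.uint8))
    return [tuple(stats[i, :4]) for i in range(1, n)       # label 0 = background
            if stats[i, cv2.CC_STAT_AREA] >= min_area]

def update_total(total: int, prev_count: int, curr_count: int) -> int:
    """Global counter: grow only when more blobs appear than in the
    previous frame, so insects that stay in view are not re-added."""
    if curr_count > prev_count:
        total += curr_count - prev_count
    return total
```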

Finally, for visualization purposes, we highlight each blob with a green bounding box, as shown in Figure 3(e).
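Putting the step sketches above together, the per-frame loop of Algorithm 1 might look as follows (an illustrative Python composition of the functions defined under Steps 1–4; the authors' implementation is in MATLAB):

```python
import cv2

def run_pipeline(video_path: str) -> int:
    """Run the full detection pipeline over a flight video and return the
    global insect count."""
    cap = cv2.VideoCapture(video_path)
    total = prev = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break                                          # end of video
        roi = extract_roi(frame)                           # Step 1: crop to UV beam
        det = threshold_insects(roi)                       # Step 2: detection mask D_k
        gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
        labels = split_touching_insects(gray, det)         # Step 3: segmented S_k
        boxes = count_blobs((labels > 0).astype('uint8'))  # Step 4: blob analysis
        total = update_total(total, prev, len(boxes))
        prev = len(boxes)
    cap.release()
    return total
```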

IV Experimental Results

In this section, we evaluate the performance of the proposed system in our testing location, a grassy outdoor field. We compare our method with a baseline approach that uses Otsu’s algorithm [21] to determine the optimal luminance threshold for the video.

IV-A Baseline method

To the best of our knowledge, no automated methods to segment fluorescent insects currently exist. Therefore, we select Otsu's algorithm as a baseline approach for comparison. Figures 4(a) and 4(b) illustrate that when no fluorescent target is illuminated, Otsu's threshold generates significant amounts of clutter. To address this problem, we search the entire video for the maximum threshold value obtained by applying Otsu's algorithm to each frame and use that value for all video frames on a subsequent pass, i.e.,

$\tau^{*} = \max_{k} \tau_k$,   (3)

where $\tau_k$ is Otsu's threshold for frame $F_k$. Figure 4(c) shows the result of applying this method to the two images shown in Figure 4(a).
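Assuming frames are scored on their grayscale luminance, the first pass of this baseline can be sketched in Python/OpenCV as follows (the function name and OpenCV usage are ours):

```python
import cv2

def max_otsu_threshold(video_path: str) -> float:
    """First pass of the baseline: apply Otsu's method to every frame and
    keep the largest threshold (Eq. 3). A second pass would then binarize
    all frames with this single value."""
    cap = cv2.VideoCapture(video_path)
    tau_max = 0.0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # cv2.threshold returns (threshold_used, binarized_image); with
        # THRESH_OTSU the given threshold (0) is ignored and computed instead.
        tau, _ = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        tau_max = max(tau_max, tau)
    cap.release()
    return tau_max
```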



Fig. 4: Binarization of two video frames using Otsu's algorithm. a) Original frames, one without a fluorescent object and the other with an illuminated coated board. b) Binarization using a different Otsu threshold for each frame. c) Binarization of both frames using the maximum threshold value of the video according to Eq. 3.

IV-B Target species and marking procedure

All the experiments described in this section were carried out using real fluorescent-marked insects. In particular, we used 36 fluorescent-marked brown marmorated stink bugs (BMSB) with sizes ranging from 13.5 × 7 mm to 16 × 8 mm (Figure 5(a)). To coat the insects, we placed them individually in a plastic cylinder with 2 g of red fluorescent powder (BioQuip, Rancho Dominguez, CA) and gently shook it for five seconds.

Fig. 5: a) Insects used during the design and evaluation of our system. b) Single insect with a ruler for reference.

IV-C Test site

Due to regulatory restrictions and local weather conflicts, opportunities for testing the method in an outdoor field at night are limited. Therefore, we conducted our tests on a campus field located at Marquette University in Milwaukee, Wisconsin. To mitigate the impact of the illumination at the edges of the field, all tests were performed in an area at the center of the field (Figures 6(a) and 6(b)). In the test area, the average grass height was 9 cm, the average illuminance was 0.1 lux, and the average wind speed during testing was 1.4 km/h.

Fig. 6: a) Insect positions (red dots) in our experimental setting. b) The DJI Ground Station Pro interface displaying the WayPoint routing mission. The white and gray arrow represents the takeoff location.

To make the conditions in our testing field as similar as practically possible to the natural fields insects would inhabit, we placed some insects gently on top of the grass and some closer to the base of the grass, leaving them slightly occluded from above. The arrangement of the insects can be seen in Figure 6(a): a total of 36 insects were arranged in six rows of six, with gaps of 3.05 meters between neighbors. The data was collected with the UAV flying at a height of 10 meters. Under these conditions, the average width of the insects in the video was 7.8 pixels and the average length was 14.6 pixels, with standard deviations of 2.7 and 5.0 pixels, respectively. The UAV flying speed was 1 m/s and the video frame rate was 24 frames per second. Once the system has been set up, the entire data collection session can be performed in approximately three minutes.

IV-D Mission planning

To ensure that all flight tests follow a similar pattern with minimal human error, we use the DJI Ground Station (GS) Pro app (https://www.dji.com/ground-station-pro) to prepare a flight pattern ahead of the experiments. The tool provides planning flexibility, allowing many parameters to be adjusted, such as flight speed, flight altitude, and corner rounding radius. In this project, we used the WayPoint Route mission type (Figure 6(b)), which provides a simple tap-to-mark interface to select the locations to which the UAV should fly. Based on the user-provided locations, the application creates a set of GPS waypoints that intercept these points while also satisfying the image acquisition requirements. As Figure 6(b) shows, our experiments used 10 waypoints with a spacing of 15.24 meters in the vertical (south-to-north) direction and 3.05 meters in the horizontal (west-to-east) direction; a sketch of such a serpentine waypoint grid is given below.
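The mission itself was built interactively in GS Pro, but the geometry is easy to reproduce. The hypothetical helper below (names and flat-earth conversion are ours) generates a serpentine grid of GPS waypoints from the leg length and lateral spacing; with legs=5 it yields the ten waypoints of Figure 6(b):

```python
import math

def waypoint_grid(lat0: float, lon0: float, legs: int = 5,
                  leg_len: float = 15.24, spacing: float = 3.05):
    """Generate a serpentine (boustrophedon) survey pattern starting at
    (lat0, lon0). Metric offsets are converted to degrees with a flat-earth
    approximation, adequate over a field a few tens of meters across."""
    waypoints = []
    m_per_deg_lat = 111_320.0                       # approx. meters per degree
    m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(lat0))
    for i in range(legs):
        x = i * spacing                             # west-to-east offset (m)
        ys = (0.0, leg_len) if i % 2 == 0 else (leg_len, 0.0)  # alternate direction
        for y in ys:                                # leg start and end points
            waypoints.append((lat0 + y / m_per_deg_lat, lon0 + x / m_per_deg_lon))
    return waypoints                                # 2 * legs waypoints
```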

IV-E Insect detection performance

In evaluating detection performance, we chose not to rely on the GPS coordinates of the targets or the UAV as ground truth; this eliminates the effects of localization and georeferencing error from our analysis. Placing insects in a formation with specific GPS coordinates is prone to error, especially with low-precision methods for determining the GPS location of a specific spot in the field. Instead, we evaluate the insect detection method directly on the images by comparing the location and size of the bounding boxes around each detection in $S_k$ with manually labeled ground truth boxes. To be labeled as a ground truth target, an insect must be clearly visible to the naked eye within the UV light beam in the images. A detection is considered a true positive if its intersection over union (IoU) with respect to a ground truth bounding box is higher than 50%. We compute the average recall and precision values of the baseline and proposed methods over the entire video sequence [6], which consists of 2477 frames. Figure 7 shows the precision-recall (PR) curves for our approach and the baseline method: the area under the PR curve is substantially larger for our method than for the baseline, and at equal recall our method maintains a markedly higher precision. Most of the mistakes made by our method stem from the fact that an IoU of 50% is difficult to achieve for very small objects; at lower IoU requirements, the results would improve.
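For reference, the matching criterion can be made explicit with a minimal IoU computation for axis-aligned boxes; the 0.5 cutoff comes from the paper, while the (x, y, width, height) box format is our assumption:

```python
def iou(box_a, box_b) -> float:
    """Intersection over union of two axis-aligned boxes given as
    (x, y, w, h). A detection is a true positive when iou(...) > 0.5."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    ix = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))  # overlap width
    iy = max(0.0, min(ay + ah, by + bh) - max(ay, by))  # overlap height
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0
```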

Fig. 7: Precision and recall curves for the baseline and proposed insect detection algorithms generated using the Multiple Object Tracking Development kit [11].

IV-F Computation time

The uncomplicated design of the detection algorithm allows for quick processing of footage. The image cropping step executes in constant time, and the remaining steps run in $O(N)$ time, where $N = h \times w$ is the resolution of the region of interest. The overall method therefore runs in $O(N)$ time per frame.

V Conclusions and Future Work

We have described a novel system that combines a UAV, an ultraviolet lighting system, and computer vision algorithms to detect fluorescent-coated insects in the field. To the best of our knowledge, this is the first vision-based system for detecting insects in the field. It uses a UV light source to visualize the insects and a color-based detection algorithm that requires minimal calibration. The system was evaluated in terms of detection precision and recall and compared with a baseline approach based on Otsu's algorithm. The proposed system represents a significant advancement of the state of the art, as manual insect recapture rates are much lower, even with long-range laser-based systems [28].

In the future, we intend to incorporate the GPS locations of the insects as the ground truth labels and use the UAV’s GPS location and inertial measurements to generate orthographic projections of the insect locations. Being able to use GPS locations rather than labeled video footage would allow us to directly map the locations of invasive insects and monitor their dispersal.

Acknowledgment

This work is supported by Agriculture and Food Research Initiative Agricultural Engineering grant no. 2018-67021-28318 from the USDA National Institute of Food and Agriculture. Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the view of the U.S. Department of Agriculture.

The authors would like to thank Weihua Liu and Scott Stewart for their assistance with data collection and annotation.

References

  • [1] D. Albani, D. Nardi, and V. Trianni. Field coverage and weed mapping by uav swarms. 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 4319–4325, 2017.
  • [2] K. Anderson and K. J. Gaston. Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Frontiers in Ecology and the Environment, 11(3):138–146, 2013.
• [3] S. Bhandari, A. Raheja, M. Chaichi, R. L. Green, D. Do, F. H. Pham, M. Ansari, J. G. Wolf, T. M. Sherman, and A. Espinas. Lessons learned from UAV-based remote sensing for precision agriculture. 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 458–467, 06 2018.
  • [4] H. Buss, T. P. Chan, K. B. Sluis, N. M. Domigan, and C. C. Winterbourn. Protein carbonyl measurement by a sensitive elisa method. Free Radical Biology and Medicine, 23(3):361–366, 1997.
  • [5] V. A. Drake and A. G. Gatehouse. Insect migration: tracking resources through space and time. Cambridge University Press, 1995.
  • [6] T. Fawcett. An introduction to ROC analysis. Pattern recognition letters, 27(8):861–874, 2006.
• [7] J. R. Hagler and C. G. Jackson. Methods for marking insects: Current techniques and future prospects. Annual Review of Entomology, 46(1):511–543, 2001.
  • [8] J. R. Hagler and C. M. Durand. A new method for immunologically marking prey and its use in predation studies. Entomophaga, 39(3):257–265, 1994.
  • [9] S. W. J. and M. W. C. Marking tephritidae fruit fly adults in hawaii usa for release recovery studies. Proceedings of the Hawaiian Entomological Society, 23(3):437–440, 1981.
  • [10] G. JC, P. WC, and T. RE. Effective malaria control in ricefield district: With observations on experimental mosquito flights. Journal of the American Medical Association, 72(12):844–847, 1919.
  • [11] L. Leal-Taixé, A. Milan, I. Reid, S. Roth, and K. Schindler. MOTChallenge 2015: Towards a benchmark for multi-target tracking. arXiv:1504.01942 [cs], Apr. 2015. arXiv: 1504.01942.
  • [12] D.-H. Lee and T. Leskey. Flight behavior of foraging and overwintering brown marmorated stink bug, Halyomorpha halys (Hemiptera: Pentatomidae). Bulletin of Entomological Research, 105(05):566–573, 2015.
  • [13] J. L. Lockwood, M. F. Hoopes, and M. P. Marchetti. Invasion ecology. John Wiley & Sons, 2013.
  • [14] W. S. Longland and C. Clements. Use of fluorescent pigments in studies of seed caching by rodents. Journal of Mammalogy, 76(4):1260, 1995.
  • [15] P. Lottes, R. Khanna, J. Pfeifer, R. Siegwart, and C. Stachniss. Uav-based crop and weed classification for smart farming. 2017 IEEE International Conference on Robotics and Automation (ICRA), pages 3024–3031, 2017.
  • [16] R. J. Mercader, N. W. Siegert, A. M. Liebhold, and D. G. McCullough. Dispersal of the emerald ash borer, agrilus planipennis, in newly-colonized sites. Agricultural and Forest Entomology, 11(4):421–424, 2009.
  • [17] T. Merckx, R. E. Feber, R. L. Dulieu, M. C. Townsend, M. S. Parsons, N. A. Bourn, P. Riordan, and D. W. Macdonald. Effect of field margins on moths depends on species mobility: field-based evidence for landscape-scale conservation. Agriculture, ecosystems & environment, 129(1):302–309, 2009.
  • [18] A. Montes de Oca, L. Arreola, A. Flores, J. Sanchez, and G. Flores. Low-cost multispectral imaging system for crop monitoring. 2018 International Conference on Unmanned Aircraft Systems (ICUAS), 05 2018.
  • [19] Narisu, J. A. Lockwood, and S. P. Schell. A novel mark-recapture technique and its application to monitoring the direction and distance of local movements of rangeland grasshoppers (orthoptera: Acrididae) in the context of pest management. Journal of Applied Ecology, 36(4):604–617, 1999.
  • [20] W. Nazni, H. Luke, W. W. Rozita, A. Abdullah, I. Sa’diyah, A. Azahari, I. Zamree, S. Tan, H. Lee, and M. Sofian. Determination of the flight range and dispersal of the house fly, musca domestica (l.) using mark release recapture technique. Trop. Biomed, 22(1):53–61, 2005.
  • [21] N. Otsu. A threshold selection method from gray-level histograms. IEEE transactions on systems, man, and cybernetics, 9(1):62–66, 1979.
  • [22] J. P. Rojas B, C. A. Devia P, E. Petro, C. Martinez, I. F. Mondragon B, D. Patino, M. Rebolledo, and J. Colorado. Aerial mapping of rice crops using mosaicing techniques for vegetative index monitoring. 2018 International Conference on Unmanned Aircraft Systems (ICUAS), pages 846–855, 06 2018.
  • [23] D. R. Paini, A. W. Sheppard, D. C. Cook, P. J. De Barro, S. P. Worner, and M. B. Thomas. Global threat to agriculture from invasive species. Proceedings of the National Academy of Sciences of the United States of America, 113(27):7575–7579, 2016.
  • [24] G. Pajares. Overview and current status of remote sensing applications based on unmanned aerial vehicles (uavs). Photogrammetric Engineering & Remote Sensing, 81(4):281–329, 2015.
  • [25] K. Parvati, P. Rao, and M. Mariya Das. Image segmentation using gray-scale morphology and marker-controlled watershed transformation. Discrete Dynamics in Nature and Society, 2008:8, 2009.
  • [26] D. Pimentel, R. Zuniga, and D. Morrison. Update on the environmental and economic costs associated with alien-invasive species in the united states. Ecological Economics, 52(3):273 – 288, 2005.
  • [27] P. A. Plonski, J. V. Hook, C. Peng, N. Noori, and V. Isler. Environment exploration in sensing automation for habitat monitoring. IEEE Transactions on Automation Science and Engineering, 14:25–38, 2017.
  • [28] K. B. Rice, S. J. Fleischer, C. M. De Moraes, M. C. Mescher, J. F. Tooker, and M. Gish. Handheld lasers allow efficient detection of fluorescent marked organisms in the field. PloS one, 10(6):1–9, 2015.
  • [29] R. C. Russell, C. Webb, C. Williams, and S. Ritchie. Mark–release–recapture study to measure dispersal of the mosquito aedes aegypti in cairns, queensland, australia. Medical and veterinary entomology, 19(4):451–457, 2005.
  • [30] N. Rymer, A. J. Moore, and M. Schubert. Inexpensive, lightweight method of detecting coronas with uavs. 2018 International Conference on Unmanned Aircraft Systems (ICUAS), pages 452–457, 06 2018.
  • [31] V. M. Stern and A. Mueller. Techniques of marking insects with micronized fluorescent dust with especial emphasis on marking millions of lygus hesperus for dispersal studies 1. Journal of Economic Entomology, 61(5):1232, 1968.
• [32] H. Sun. A fast watershed algorithm based on chain code and its application in image segmentation. Pattern Recognition Letters, 26(9):1266–1274, 2005.
  • [33] L. Taylor. Insect migration, flight periodicity and the boundary layer. The Journal of Animal Ecology, pages 225–238, 1974.
  • [34] J. Vasquez-Gomez, J.-C. Herrera-Lozada, and M. Olguin-Carbajal. Coverage path planning for surveying disjoint areas. 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 06 2018.
  • [35] N. G. Wiman, V. M. Walton, P. W. Shearer, S. I. Rondon, and J. C. Lee. Factors affecting flight capacity of brown marmorated stink bug, halyomorpha halys (hemiptera: Pentatomidae). Journal of pest science, 88(1):37–47, 2015.
  • [36] C. Zhang and J. M. Kovacs. The application of small unmanned aerial systems for precision agriculture: a review. Precision agriculture, 13(6):693–712, 2012.