Show simple item record

dc.contributor.authorCase, Elizabeth Hillary
dc.date.accessioned2018-10-03T19:28:04Z
dc.date.available2018-10-03T19:28:04Z
dc.date.issued2017-12-30
dc.identifier.otherCase_cornell_0058O_10256
dc.identifier.otherhttp://dissertations.umi.com/cornell:10256
dc.identifier.otherbibid: 10474241
dc.identifier.urihttps://hdl.handle.net/1813/59138
dc.description.abstractIntegrated mosquito control is expensive and resource intensive, and changing climatic factors are predicted to expand the habitat ranges of mosquitoes and other disease-carrying vectors into new regions within the United States. Currently, low-cost unmanned aerial vehicles (UAVs) can be used to photograph and map large areas at centimeter-scale resolution, and they are already starting to be used by vector control personnel to efficiently locate mosquito habitat. However, post-processing of UAV images is still time intensive, often done manually or with programs built for satellite imagery. Moreover, UAVs have never previously been used to assess habitat suitability in the more populated areas preferred by Aedes albopictus, a species that breeds primarily in standing water in artificial containers near human populations. This work explored the use of UAVs and convolutional neural networks for integrated mosquito management. Two neighborhoods comprising 125 houses in a densely populated area of southern New York were surveyed over nine days in 2017 with a UAV. The UAV survey coincided with an entomological survey, which was conducted on a subset of the houses to establish the presence and distribution of mosquito species. 64% of the 629 containers surveyed on all properties could be seen from the UAV, and almost 2,000 additional features were identified in the images (e.g., from houses that were not surveyed). In total, more than 2,500 objects of interest (containers suitable as mosquito habitat, or related features) were identified in the aerial photographs. Two previously published neural network architectures were trained on this novel set of UAV aerial imagery. Single Shot MultiBox Detection (SSD) was used for image segmentation, achieving an average precision of 59%, a recall of 35%, and an overall accuracy of 31%. Separately, a fully convolutional neural net based on the VGG16 architecture, initialized with ImageNet weights and fine-tuned on images of surveyed properties assigned as positive or negative for Ae. albopictus larvae, achieved a binary classification accuracy of 80%. When combined with image segmentation neural networks, unmanned aerial vehicles show promise for identifying potential habitat for Ae. albopictus, increasing the ability of vector control personnel to manage mosquito populations. The neural networks' ability to predict larval presence could be further advanced by expanding training datasets, especially where containers of interest may vary by neighborhood.
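Note: the thesis text itself is not reproduced in this record. As an illustration of the kind of approach the abstract describes (an ImageNet-pretrained VGG16 backbone fine-tuned for binary larvae-positive/negative classification of property images), the following is a minimal sketch in Python/Keras. The directory layout ("data/train", "data/val"), image size, layer sizes, and training hyperparameters are assumptions for illustration, not the author's actual configuration.

    # Sketch: fine-tune an ImageNet-pretrained VGG16 for binary classification
    # of UAV property images (larvae-positive vs. larvae-negative).
    # Paths, image size, and hyperparameters are illustrative assumptions.
    import tensorflow as tf
    from tensorflow.keras import layers, models
    from tensorflow.keras.applications import VGG16
    from tensorflow.keras.applications.vgg16 import preprocess_input

    IMG_SIZE = (224, 224)  # VGG16's native input resolution (assumed here)

    # Convolutional base with ImageNet weights; original classifier head dropped.
    base = VGG16(weights="imagenet", include_top=False, input_shape=IMG_SIZE + (3,))
    base.trainable = False  # freeze the backbone for the first training pass

    model = models.Sequential([
        layers.Lambda(preprocess_input, input_shape=IMG_SIZE + (3,)),  # VGG16 preprocessing
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),  # positive/negative for Ae. albopictus larvae
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="binary_crossentropy", metrics=["accuracy"])

    # Hypothetical directories of labeled property crops: <dir>/positive, <dir>/negative.
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "data/train", label_mode="binary", image_size=IMG_SIZE, batch_size=16)
    val_ds = tf.keras.utils.image_dataset_from_directory(
        "data/val", label_mode="binary", image_size=IMG_SIZE, batch_size=16)

    model.fit(train_ds, validation_data=val_ds, epochs=10)

    # Optional second pass: unfreeze the last VGG16 block and fine-tune at a lower rate.
    base.trainable = True
    for layer in base.layers[:-4]:
        layer.trainable = False
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
                  loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(train_ds, validation_data=val_ds, epochs=5)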
dc.language.isoen_US
dc.subjectComputer science
dc.subjectAedes albopictus
dc.subjectMechanical engineering
dc.subjectneural network
dc.subjectunmanned aerial vehicle
dc.subjectmosquito source control
dc.subjectEntomology
dc.titleMosquitoNet: Investigating the use of unmanned aerial vehicles and neural networks in integrated mosquito management
dc.typedissertation or thesis
thesis.degree.disciplineMechanical Engineering
thesis.degree.grantorCornell University
thesis.degree.levelMaster of Science
thesis.degree.nameM.S., Mechanical Engineering
dc.contributor.chairErickson, David
dc.contributor.committeeMemberMorreale, Stephen J.
dc.contributor.committeeMemberHarrington, Laura C.
dcterms.licensehttps://hdl.handle.net/1813/59810
dc.identifier.doihttps://doi.org/10.7298/X43T9FDN

