English abstract
ABSTRACT: Coral community or coral reef surveys include a variety of methods based on "sampling" the substrates to assess species and substrate abundance and cover. A common characteristic of these methods is the time spent by experts analyzing the data, either underwater as diving experts or in the laboratory as analysts of photographs or video transects. The analysis of such underwater transects requires the identification of substrates (including corals) based on their appearance in the photographs or video frames, which in many cases is time consuming. The aim of this study was to define underwater image features (color and texture) and classification algorithms suitable for semi-automatic annotation of underwater survey videos, which may prove to be a cost-effective and time-efficient tool for reef surveys. A series of video frames was extracted from numerous underwater survey videos of Musandam (Sultanate of Oman) coral communities. From these frames, small sample images (50 x 50 pixels) were extracted from image sections identified by a coral taxonomist based on the morphological features of the colony. A total of 9 substrates, consisting of 6 coral species and 3 non-coral substrates, were chosen for this study, representing > 95% of the cover in those communities. Both color features (8 features based on the HSV transform of the images) and texture features (uniform, rotation-invariant local binary patterns at 3 different scales) were extracted from the sample images and then used in a series of supervised classification algorithms. Three experiments of increasing complexity were designed to evaluate the ability of a computer expert system to identify and quantify relevant coral community reference points. All classification algorithms performed well and correctly distinguished corals from non-living substrates (> 90% correct identification). Because of its speed and simplicity, 3-nearest-neighbor classification was then used to classify 7800 images of 9 substrates (6 coral species and 3 non-coral substrates). The average correct identification across all substrates reached 85%, with some variation between substrates ranging from 72% to 99%. Classification accuracy increased considerably when both color and texture feature vectors were used, but did not vary significantly between the classification methods tested. These results suggest the possibility of creating automatic substrate identification software for coral monitoring programs, which, after some period of supervised training, would considerably reduce both the cost of such programs and the variability in assessment by different experts. The proposed method, using video transects, would also minimize the time spent sampling underwater, thus reducing costs further. Future work will include an assessment of the need for underwater color correction and the development of a continuous learning algorithm based on nearest neighbors.

Keywords: coral; substrate classification; underwater video; coral reefs; image analysis; texture; local binary pattern; machine learning; automated annotation; survey.
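
The sketch below illustrates the kind of pipeline the abstract describes (HSV color statistics plus uniform, rotation-invariant LBP histograms at three scales, classified with 3-nearest neighbors). It is not the authors' code: the exact eight color statistics, the LBP radii, and all function names here are assumptions chosen for illustration.

```python
"""Illustrative sketch of the feature extraction + 3-NN pipeline described in the abstract.
Assumed details: the specific 8 color statistics, the three LBP (P, R) scales, and helper names."""
import numpy as np
from skimage.color import rgb2hsv, rgb2gray
from skimage.feature import local_binary_pattern
from sklearn.neighbors import KNeighborsClassifier

# Three (points, radius) neighborhoods standing in for "3 different scales" (assumed values).
LBP_SCALES = [(8, 1), (16, 2), (24, 3)]


def color_features(patch_rgb):
    """8 color features (assumed set): mean and std of H, S, V, plus 5th/95th percentiles of V."""
    hsv = rgb2hsv(patch_rgb)
    feats = []
    for c in range(3):
        feats.extend([hsv[..., c].mean(), hsv[..., c].std()])
    feats.extend(np.percentile(hsv[..., 2], [5, 95]))
    return np.array(feats)


def texture_features(patch_rgb):
    """Concatenated uniform, rotation-invariant LBP histograms at three scales."""
    gray = (rgb2gray(patch_rgb) * 255).astype(np.uint8)
    hists = []
    for p, r in LBP_SCALES:
        lbp = local_binary_pattern(gray, P=p, R=r, method="uniform")
        hist, _ = np.histogram(lbp, bins=p + 2, range=(0, p + 2), density=True)
        hists.append(hist)
    return np.concatenate(hists)


def feature_vector(patch_rgb):
    """Combined color + texture descriptor for one 50 x 50 sample image."""
    return np.concatenate([color_features(patch_rgb), texture_features(patch_rgb)])


def train_and_classify(train_patches, train_labels, test_patches):
    """Fit a 3-nearest-neighbor classifier on labeled sample images and label new ones."""
    X_train = np.array([feature_vector(p) for p in train_patches])
    X_test = np.array([feature_vector(p) for p in test_patches])
    clf = KNeighborsClassifier(n_neighbors=3)
    clf.fit(X_train, train_labels)
    return clf.predict(X_test)
```

In this reading, each 50 x 50 sample image labeled by the taxonomist becomes one feature vector, and new frames would be annotated by classifying similarly sized patches against that labeled set.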