Automated Analysis of Marine Video With Limited Data
Monitoring the marine environment requires large amounts of data, simply due to its vast size. Therefore, autonomous underwater vehicles and drones are increasingly deployed to acquire numerous photographs. However, ecological conclusions drawn from these images are lagging, as the data require expert annotation and thus realistically cannot be processed manually. This calls for developing automatic classification algorithms dedicated to this type of data. Current out-of-the-box solutions struggle to provide optimal results in these scenarios because marine data differ substantially from everyday imagery. Images taken under water display low contrast and a reduced visibility range, making objects harder to localize and classify, and scale varies dramatically because of the complex three-dimensional structure of the scenes. In addition, the scarcity of labeled marine data prevents training such dedicated networks from scratch. In this work, we demonstrate how transfer learning can be utilized to achieve high-quality results for both detection and classification in the marine environment. We also demonstrate tracking in videos, which enables counting and measuring the organisms. We evaluate the suggested method on two very different marine datasets, one aerial and one underwater.