DEEP OC Phytoplankton

By DEEP-Hybrid-DataCloud Consortium

services, library/tensorflow, library/lasagne, docker

License: Apache License 2.0


The deep learning revolution has brought significant advances in a number of fields [1], primarily linked to image and speech recognition. The standardization of image classification tasks like the ImageNet Large Scale Visual Recognition Challenge [2] has resulted in a reliable way to compare top performing architectures.

This Docker container provides a trained convolutional neural network optimized for identifying phytoplankton from images. The architecture used is an Xception [3] network, implemented in Keras on top of TensorFlow.
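As a rough illustration of this kind of setup, the sketch below builds an Xception backbone with a small classification head using the Keras API bundled with TensorFlow. It is not the container's actual training code; the input size, class count, and classification head are illustrative assumptions.

# Minimal sketch of an Xception-based image classifier in Keras on top of TensorFlow.
# Not the exact code shipped in this container: input size, number of classes and
# the classification head below are illustrative assumptions.
import tensorflow as tf

NUM_CLASSES = 10  # placeholder: the real model uses the phytoplankton class count

base = tf.keras.applications.Xception(
    weights="imagenet",        # start from ImageNet-pretrained weights
    include_top=False,         # drop the original 1000-class head
    input_shape=(299, 299, 3)  # Xception's default input resolution
)

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()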

This service is based on the Image Classification with TensorFlow model.

References

[1]: Yann LeCun, Yoshua Bengio, and Geoffrey Hinton. Deep learning. Nature, 521(7553):436–444, May 2015.

[2]: Olga Russakovsky et al. ImageNet Large Scale Visual Recognition Challenge. International Journal of Computer Vision (IJCV), 115(3):211–252, 2015.

[3]: François Chollet. Xception: Deep Learning with Depthwise Separable Convolutions. arXiv preprint arXiv:1610.02357, 2017.

Run locally on your computer

Using Docker

You can run this model directly on your computer, assuming that you have Docker installed, by following these steps:

$ docker pull deephdc/deep-oc-phytoplankton-classification-tf
$ docker run -ti -p 5000:5000 deephdc/deep-oc-phytoplankton-classification-tf
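Once the container is running, you can check that the API is reachable on the mapped port (5000, as in the run command above) with a short Python snippet using the requests package:

# Minimal check that the container is serving the API on the mapped port (5000).
import requests

response = requests.get("http://127.0.0.1:5000/", timeout=10)
print(response.status_code)  # 200 means the API documentation page is being served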

Using udocker

If you do not have Docker available or you do not want to install it, you can use udocker within a Python virtualenv:

$ virtualenv udocker
$ source udocker/bin/activate
$ git clone https://github.com/indigo-dc/udocker
$ cd udocker
$ pip install .
$ udocker pull deephdc/deep-oc-phytoplankton-classification-tf
$ udocker create deephdc/deep-oc-phytoplankton-classification-tf
$ udocker run -p 5000:5000 deephdc/deep-oc-phytoplankton-classification-tf

Once running, point your browser to http://127.0.0.1:5000/ and you will see the API documentation, where you can test the model's functionality and perform other actions (such as training).
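Predictions can also be requested programmatically. The sketch below posts an image to the service with Python's requests package; the endpoint path, model name, and form-field name are illustrative assumptions, so check the API documentation served at http://127.0.0.1:5000/ for the exact values exposed by this container.

# Minimal sketch: sending an image to the model's prediction endpoint with requests.
# The endpoint path, model name and form-field name below are assumptions; consult
# the API documentation at http://127.0.0.1:5000/ for the exact names.
import requests

API_URL = "http://127.0.0.1:5000"
# Hypothetical endpoint; verify against the API documentation in your browser.
PREDICT_ENDPOINT = API_URL + "/v2/models/phytoplankton-classification-tf/predict/"

with open("example_phytoplankton.png", "rb") as f:
    response = requests.post(
        PREDICT_ENDPOINT,
        files={"data": ("example_phytoplankton.png", f, "image/png")},
        timeout=60,
    )

response.raise_for_status()
print(response.json())  # expected: class labels with associated probabilities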

For more information, refer to the user documentation.