DEEP OC Plant Classification (Theano)

By DEEP-Hybrid-DataCloud Consortium

deprecated/services, library/theano, library/lasagne, docker

License: Apache License 2.0


This service is deprecated. Please refer to the newer TensorFlow version instead.

The deep learning revolution has brought significant advances in a number of fields [1], primarily linked to image and speech recognition. The standardization of image classification tasks like the ImageNet Large Scale Visual Recognition Challenge [2] has resulted in a reliable way to compare top performing architectures.

The use of deep learning for plant classification is not novel [3, 4], but previous work has mainly focused on leaves and has been restricted to a limited number of species, making it of limited use for large-scale biodiversity monitoring.

This Docker container packages a trained convolutional neural network optimized for plant identification from images. The architecture is a ResNet-50 [5], implemented in Lasagne on top of Theano.

The training dataset is the collection of images available in PlantNet under a Creative Commons Attribution-ShareAlike 2.0 license. It consists of around 250K images belonging to more than 6K plant species of Western Europe, distributed across 1,500 genera and 200 families.

A detailed article about this network and the results obtained with it can be found in [6].

References

[1]: Yann LeCun, Yoshua Bengio, and Geoffrey Hinton. Deep learning. Nature, 521(7553):436–444, May 2015.

[2]: Olga Russakovsky et al. ImageNet Large Scale Visual Recognition Challenge. International Journal of Computer Vision (IJCV), 115(3):211–252, 2015.

[3]: Sue Han Lee, Chee Seng Chan, Paul Wilkin, and Paolo Remagnino. Deep-plant: Plant identification with convolutional neural networks, 2015.

[4]: Mads Dyrmann, Henrik Karstoft, and Henrik Skov Midtiby. Plant species classification using deep convolutional neural network. Biosystems Engineering, 151:72–80, 2016.

[5]: He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 770–778).

[6]: Heredia, Ignacio. Large-scale plant classification with deep neural networks. Proceedings of the Computing Frontiers Conference. ACM, 2017.

Run locally on your computer

Using Docker

You can run this model directly on your computer, assuming that you have Docker installed, by following these steps:

$ docker pull deephdc/deep-oc-plant-classification-theano
$ docker run -ti -p 5000:5000 deephdc/deep-oc-plant-classification-theano

Using udocker

If you do not have Docker available or you do not want to install it, you can use udocker within a Python virtualenv:

$ virtualenv udocker
$ source udocker/bin/activate
$ git clone https://github.com/indigo-dc/udocker
$ cd udocker
$ pip install .
$ udocker pull deephdc/deep-oc-plant-classification-theano
$ udocker create deephdc/deep-oc-plant-classification-theano
$ udocker run -p 5000:5000 deephdc/deep-oc-plant-classification-theano

Once running, point your browser to http://127.0.0.1:5000/ and you will see the API documentation, where you can test the model functionality, as well as perform other actions (such as training).
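As an illustration, you could also query the service from the command line once the container is running. The route and form-field name below assume a DEEPaaS-style REST API, and `my_plant.jpg` is a placeholder image, so treat this as a sketch and check the API documentation page for the exact endpoints your version exposes:

```shell
# Hypothetical prediction request against the locally running container;
# the route and "data" field name may differ in your DEEPaaS API version.
$ curl -s -X POST \
    -F "data=@my_plant.jpg" \
    http://127.0.0.1:5000/model/predict
```

The response is a JSON document; the interactive documentation at http://127.0.0.1:5000/ lists the exact routes and parameters accepted.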

For more information, refer to the user documentation.

Run on our pilot e-Infrastructure

To execute this model on our pilot e-Infrastructure you need to be registered in the DEEP IAM.
The following instructions use the orchent CLI; install and configure orchent as shown in this tutorial.

Mesos (CPU)

$ curl -o deep-oc-plant-classification-theano.yml \
    https://raw.githubusercontent.com/indigo-dc/tosca-templates/master/deep-oc/deep-oc-plants-mesos-cpu.yml
$ orchent depcreate deep-oc-plant-classification-theano.yml '{"rclone_conf": "...", "rclone_url": "...", "rclone_vendor": "...", "rclone_user": "...", "rclone_pass": "..."}'

Check the status of your job

$ orchent depshow <Deployment UUID>

Once its state is CREATE_COMPLETE, you will get the endpoint to access the service, e.g.:

"endpoint": "mesos-lb.recas.ba.infn.it:10002"
Point your browser to the provided URL.

For more information, refer to the user documentation.