Citizen science has become a powerful force for scientific inquiry, giving researchers access to vast numbers of data points while connecting non-scientists to the real process of science. This citizen-researcher relationship creates a productive synergy in the creation, execution, and analysis of research projects. With this in mind, a Convolutional Neural Network has been trained to identify seed images in collaboration with the Spanish Royal Botanical Garden.
This Docker container contains a trained Convolutional Neural Network optimized for seed identification from images. The architecture used is an Xception network implemented in Keras on top of TensorFlow.
The PREDICT method expects an RGB image (or the URL of an RGB image) as input and returns a JSON response with the top 5 predictions.
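As a minimal sketch of how a client might call the PREDICT method once the container is running (see the run instructions below): the exact route, port path, and JSON field names below are assumptions, not the documented API, so check the interactive API documentation for the real schema.

```python
# Hypothetical client sketch for the PREDICT endpoint.
# The route and the JSON field names ("predictions", "label", "probability")
# are assumptions about the service's schema, not documented values.
import json
import urllib.request


def build_predict_url(host="localhost", port=5000,
                      route="/v2/models/seeds-classification-tf/predict"):
    """Build the prediction endpoint URL (the route name is an assumption)."""
    return f"http://{host}:{port}{route}"


def top_predictions(response_text, n=5):
    """Extract up to n (label, probability) pairs from a JSON response
    assumed to look like {"predictions": [{"label": ..., "probability": ...}]}."""
    data = json.loads(response_text)
    return [(p["label"], p["probability"]) for p in data.get("predictions", [])[:n]]


if __name__ == "__main__":
    # Send a local image to the running container (requires the Docker
    # service from the steps below to be listening on port 5000).
    with open("seed.jpg", "rb") as f:  # "seed.jpg" is a placeholder filename
        req = urllib.request.Request(
            build_predict_url(),
            data=f.read(),
            headers={"Content-Type": "application/octet-stream"},
        )
    print(top_predictions(urllib.request.urlopen(req).read().decode()))
```

The helper functions only construct the request URL and parse the response, so they can be reused unchanged if the actual endpoint or schema differs slightly.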
As the training dataset, we used a collection of around 28K images from the Spanish Royal Botanical Garden covering 743 species and 493 genera.
This service is based on the Image Classification with TensorFlow model.
Reference: Chollet, François. "Xception: Deep Learning with Depthwise Separable Convolutions." arXiv preprint arXiv:1610.02357 (2017).
Run locally on your computer
You can run this module directly on your computer, assuming that you have Docker installed, by following these steps:
$ docker pull deephdc/deep-oc-seeds-classification-tf
$ docker run -ti -p 5000:5000 deephdc/deep-oc-seeds-classification-tf
If you do not have Docker available or you do not want to install it, you can use udocker within a Python virtualenv:
$ virtualenv udocker
$ source udocker/bin/activate
$ git clone https://github.com/indigo-dc/udocker
$ cd udocker
$ pip install .
$ udocker pull deephdc/deep-oc-seeds-classification-tf
$ udocker create deephdc/deep-oc-seeds-classification-tf
$ udocker run -p 5000:5000 deephdc/deep-oc-seeds-classification-tf
Once running, point your browser to http://localhost:5000 and you will see the API documentation, where you can test the module's functionality and perform other actions (such as training).
For more information, refer to the user documentation.