Welcome to the DEEP Open Catalog!
DEEP-Hybrid-DataCloud is delivering a comprehensive platform to easily develop, build, share and deploy Artificial Intelligence, Machine Learning and Deep Learning modules on top of distributed e-Infrastructures.
In the DEEP Open Catalog you can find ready to use modules in a variety of domains. These modules can be executed on your local laptop, on a production server or on top of computing e-Infrastructures supporting the DEEP-Hybrid-DataCloud stack.
Examples of available modules:

- 2D semantic segmentation on the Vaihingen dataset
- Audio classifier: train your own audio classifier with your custom dataset; it also comes pretrained on the 527 AudioSet classes
- Body pose detection in images
- Speech classifier: train a classifier to sort audio files between different keywords
- A Docker image for developing new modules
- Massive Online Data Streams analysis
Browse the list of available modules and pick the one of your choice.
The DEEP Generic Container is a good starting point to see what the API looks like (it does not provide any actual functionality).
$ docker search deephdc
$ docker run -ti -p 5000:5000 deephdc/deep-oc-generic-container
No Docker possible? Try udocker instead!
$ udocker search deephdc
$ udocker run -p 5000:5000 deephdc/deep-oc-generic-container
Need GPU access?
Use nvidia-docker together with Docker:
$ nvidia-docker run -ti -p 5000:5000 deephdc/deep-oc-generic-container
Or the udocker way:
$ udocker pull deephdc/deep-oc-generic-container
$ udocker create --name=deep-oc-generic-container deephdc/deep-oc-generic-container
$ udocker setup --nvidia deep-oc-generic-container
$ udocker run -p 5000:5000 deep-oc-generic-container
Once the container is running, point your browser to http://127.0.0.1:5000/ to reach the module's API.
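If you prefer to query the API from a script rather than the browser, a minimal Python sketch (standard library only) could look as follows. The `/v2/models` path is an assumption based on the DEEPaaS-style API these containers expose; adjust it if your module serves a different API version.

```python
import json
import urllib.request

# Default address resulting from `docker run -p 5000:5000 ...` above.
BASE_URL = "http://127.0.0.1:5000"

def list_models(base_url: str = BASE_URL):
    """Return the JSON description of the models the module exposes.

    Assumes a DEEPaaS-style `/v2/models` endpoint (an assumption, not
    guaranteed for every module); raises URLError if the container is
    not running.
    """
    with urllib.request.urlopen(f"{base_url}/v2/models") as resp:
        return json.load(resp)

# Example usage (requires the container to be running):
# print(list_models())
```

From there you can inspect the returned metadata to find the module's prediction and training endpoints.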