
Use NVIDIA-Docker to deploy machine learning models

Deploying machine learning models has always been a struggle.

Most of the software industry has adopted container engines like Docker for deploying code to production. However, because accessing hardware resources like GPUs from within Docker was difficult and required hacky, driver-specific workarounds, the machine learning community has largely shied away from this option.

With the recent release of NVIDIA’s nvidia-docker tool, however, accessing GPUs from within Docker is a breeze, and we’re already reaping the benefits here at indico. In this tutorial we’ll walk you through setting up nvidia-docker so you too can deploy machine learning models with ease.
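As a quick preview of how simple this is, assuming the NVIDIA drivers and the nvidia-docker plugin are already installed on the host, a common first sanity check is to run nvidia-smi inside an official CUDA container (the image name shown is just an example; your tag may differ):

    # Pull an official CUDA base image and confirm the container can see the host GPUs
    nvidia-docker run --rm nvidia/cuda nvidia-smi

If this prints the same GPU table you would see on the host, containers launched through nvidia-docker have GPU access, and you can build your model-serving images on top of the same CUDA base images.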

Before we get into the details however, let’s talk briefly about why using Docker for your next data science project may be a good choice. There is certainly a learning curve for the tools in the Docker ecosystem, but the benefits are worth the effort.

Read the complete article here.
