CUDA-based Python docker stack

GPU accelerated, multi-arch (linux/amd64, linux/arm64/v8) docker images, available for Python versions ≥ 3.11.1.
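
For example, pulling a prebuilt image could look as follows (a sketch assuming the 3.12.6 tag has been published; see Run container below for the general tag scheme):

docker pull glcr.b-data.ch/cuda/python/ver:3.12.6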

Build chain

The same as the Python docker stack.

Features

glcr.b-data.ch/cuda/python/ver:*-devel serves as the parent image for glcr.b-data.ch/jupyterlab/cuda/python/base.

Otherwise the same as the Python docker stack, plus the CUDA libraries (e.g. cuDNN and TensorRT/libnvinfer) installed in stage 2 of the build.

Table of Contents

  • Prerequisites
  • Install
  • Usage

Prerequisites

The same as the Python docker stack plus

  • NVIDIA GPU
  • NVIDIA Linux driver
  • NVIDIA Container Toolkit

ℹ️ The host running the GPU accelerated images only requires the NVIDIA driver; the CUDA toolkit does not have to be installed.
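
A quick way to confirm that the driver is working on the host, before involving Docker at all, is to query it directly; the output lists the detected GPUs together with the driver version and the maximum CUDA version it supports:

nvidia-smi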

Install

To install the NVIDIA Container Toolkit, follow NVIDIA's installation instructions for your platform.
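
As an example, on an apt-based distribution (Debian/Ubuntu) the installation typically looks like the following sketch, mirroring NVIDIA's published apt instructions; consult NVIDIA's installation guide for other package managers and the latest commands:

# Add NVIDIA's package repository and signing key
curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | \
  sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg
curl -s -L https://nvidia.github.io/libnvidia-container/stable/deb/nvidia-container-toolkit.list | \
  sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' | \
  sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list

# Install the toolkit
sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit

# Configure the Docker daemon to use the NVIDIA runtime and restart it
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker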

Usage

Build image (ver)

latest:

stage 1

docker build \
  --build-arg BASE_IMAGE=ubuntu \
  --build-arg BASE_IMAGE_TAG=22.04 \
  --build-arg CUDA_IMAGE=nvidia/cuda \
  --build-arg CUDA_VERSION=12.6.1 \
  --build-arg CUDA_IMAGE_SUBTAG=runtime-ubuntu22.04 \
  --build-arg PYTHON_VERSION=3.12.6 \
  -t cuda/python/ver \
  -f ver/latest.Dockerfile .

stage 2

docker build \
  --build-arg BUILD_ON_IMAGE=cuda/python/ver \
  --build-arg CUDNN_VERSION=8.9.7.29 \
  --build-arg CUDNN_CUDA_VERSION_MAJ_MIN=12.2 \
  --build-arg LIBNVINFER_VERSION=10.4.0.26 \
  --build-arg LIBNVINFER_CUDA_VERSION_MAJ_MIN=12.6 \
  --build-arg CUDA_IMAGE_FLAVOR=runtime \
  -t cuda/python/ver \
  -f cuda/latest.Dockerfile .
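
Once both stages have been built, the resulting image can be smoke-tested. The following sketch assumes the GPU prerequisites above are met and that python is on PATH in the image, as in the Python docker stack; nvidia-smi is injected into the container by the NVIDIA Container Toolkit:

# Check the Python version baked into the image
docker run --rm --gpus '"device=all"' cuda/python/ver python --version

# Check that the GPU is visible inside the container
docker run --rm --gpus '"device=all"' cuda/python/ver nvidia-smi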

version:

stage 1

docker build \
  --build-arg BASE_IMAGE=ubuntu \
  --build-arg BASE_IMAGE_TAG=22.04 \
  --build-arg CUDA_IMAGE=nvidia/cuda \
  --build-arg CUDA_IMAGE_SUBTAG=[cudnn8-]runtime-ubuntu22.04 \
  -t cuda/python/ver:MAJOR.MINOR.PATCH \
  -f ver/MAJOR.MINOR.PATCH.Dockerfile .

stage 2

docker build \
  --build-arg BUILD_ON_IMAGE=cuda/python/ver:MAJOR.MINOR.PATCH \
  --build-arg CUDA_IMAGE_FLAVOR=runtime \
  -t cuda/python/ver:MAJOR.MINOR.PATCH \
  -f cuda/MAJOR.MINOR.PATCH.Dockerfile .

For MAJOR.MINOR.PATCH ≥ 3.11.1.
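
A purely illustrative substitution for Python 3.11.1 is sketched below; it assumes the version-pinned Dockerfiles exist in the repository and that the CUDA image pinned for that release is a cudnn8 runtime flavour, so check the corresponding Dockerfiles before building:

# stage 1
docker build \
  --build-arg BASE_IMAGE=ubuntu \
  --build-arg BASE_IMAGE_TAG=22.04 \
  --build-arg CUDA_IMAGE=nvidia/cuda \
  --build-arg CUDA_IMAGE_SUBTAG=cudnn8-runtime-ubuntu22.04 \
  -t cuda/python/ver:3.11.1 \
  -f ver/3.11.1.Dockerfile .

# stage 2
docker build \
  --build-arg BUILD_ON_IMAGE=cuda/python/ver:3.11.1 \
  --build-arg CUDA_IMAGE_FLAVOR=runtime \
  -t cuda/python/ver:3.11.1 \
  -f cuda/3.11.1.Dockerfile .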

Run container

self built:

docker run -it --rm \
  --gpus '"device=all"' \
  cuda/python/ver[:MAJOR.MINOR.PATCH]

from the project's GitLab Container Registries:

docker run -it --rm \
  --gpus '"device=all"' \
  IMAGE[:MAJOR[.MINOR[.PATCH]]]

IMAGE being one of the images published to the project's registries, e.g. glcr.b-data.ch/cuda/python/ver.
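
For example (a sketch assuming the 3.12 tag has been published):

docker run -it --rm \
  --gpus '"device=all"' \
  glcr.b-data.ch/cuda/python/ver:3.12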

See Notes for tweaks.