LocalAI provides a variety of images to support different environments. These images are available on quay.io and Docker Hub.
For an All-in-One image with a pre-configured set of models and backends, see the AIO Images.
For GPU acceleration on Nvidia graphics cards, use the Nvidia/CUDA images; if you don't have a GPU, use the CPU images. If you have an AMD GPU or Apple Silicon, see the build section.
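As a rough illustration, the commands below pull and run a CPU image and an Nvidia/CUDA image. The tag names are placeholders and may differ per release; check the image list for the tags that actually exist for your version, and note that the GPU example assumes the NVIDIA Container Toolkit is installed on the host.

```bash
# CPU-only: run a plain image (tag shown here is illustrative)
docker run -p 8080:8080 --name local-ai -ti localai/localai:latest

# Nvidia GPU: use a CUDA image and expose the GPUs to the container
# (requires the NVIDIA Container Toolkit; tag is illustrative)
docker run -p 8080:8080 --gpus all --name local-ai-gpu -ti \
  localai/localai:latest-gpu-nvidia-cuda-12
```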
Available Image Types:
Images ending with -core are smaller images that ship without pre-downloaded Python dependencies. Use these images if you plan to use only the llama.cpp, stablediffusion-ncn, tinydream, or rwkv backends; if you are not sure which one to use, do not use these images.
Images containing the aio tag are all-in-one images with all features enabled, and come with an opinionated default configuration.
FFmpeg is not included in the default images due to its licensing. If you need FFmpeg, use the images ending with -ffmpeg. Note that FFmpeg is required if you plan to use LocalAI's audio-to-text features.
If you are using old or outdated CPUs without a GPU, you might need to set REBUILD to true as an environment variable, along with options that disable the CPU flags your processor does not support; however, note that inference will be noticeably slower. See also flagset compatibility, and the sketch below.
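As a minimal sketch of that rebuild path, the example below sets REBUILD=true and passes build options that turn off AVX2/AVX512/FMA/F16C. The variable used to pass those options (CMAKE_ARGS here) and the exact flag names are assumptions modeled on the llama.cpp build flags; verify them against the build documentation and the flagset compatibility page for your release.

```bash
# Rebuild LocalAI inside the container for an older CPU.
# REBUILD=true triggers a rebuild from source on startup; the CMAKE_ARGS
# value and the specific -D flags are illustrative and may differ per
# release, so check the build docs for your version before relying on them.
docker run -p 8080:8080 --name local-ai -ti \
  -e REBUILD=true \
  -e CMAKE_ARGS="-DLLAMA_AVX512=OFF -DLLAMA_AVX2=OFF -DLLAMA_FMA=OFF -DLLAMA_F16C=OFF" \
  localai/localai:latest
```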