Running other models

Already have a model file? Skip ahead to Run models manually.

To load models into LocalAI, you can either install them manually or configure LocalAI to pull them from external sources, such as Hugging Face, and configure the model automatically.

To do that, you can point LocalAI to the URL of a YAML configuration file. However, LocalAI also ships with several popular model configurations embedded in the binary. Below is a list of the model configurations that LocalAI has pre-built; see Model customization for how to configure models from URLs.
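As a rough sketch, a YAML model configuration file of the kind described above might look like the following. The model name, backend, and file name here are illustrative placeholders, not values from this page; the exact fields available depend on the backend you use.

```yaml
# Illustrative model configuration (placeholder values).
name: my-model            # the name the model is served under
backend: llama-cpp        # inference backend; depends on the model architecture
parameters:
  model: my-model.gguf    # model file the backend loads
context_size: 4096        # context window size
```

A file like this can live alongside the model file in the models directory, or be referenced by URL as described above.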

There are different categories of models: LLMs, Multimodal LLMs, Embeddings, Audio to Text, and Text to Audio, depending on the backend being used and the model architecture.

Last updated 21 Nov 2024, 01:01 +0100