Overview
LocalAI is a drop-in replacement for OpenAI's API that runs inference locally on your own hardware.
Key Features
- OpenAI Compatible: Drop-in replacement for OpenAI API
- Multiple Formats: Supports GGML and GGUF model formats
- Docker Support: Easy deployment via Docker
- REST API: Standard REST endpoints
- Cross-Platform: Runs on any system with Docker
Installation
Docker (Recommended)
```
docker run -p 8080:8080 localai/localai:latest
```
From Source
```
git clone https://github.com/go-skynet/LocalAI
cd LocalAI
make build
```
Usage
- Start Server: Run the LocalAI server (e.g. via Docker as above)
- Use OpenAI SDK: Use the standard OpenAI Python/Node.js SDKs
- Point to Local: Set `base_url` to `http://localhost:8080`
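The steps above can be sketched with nothing but the Python standard library (the official OpenAI SDK works the same way once its `base_url` is pointed at the local server). The model name `gpt-4` below is a placeholder assumption for whichever model you have loaded into LocalAI:

```python
import json
import urllib.request

# Default port from the Docker command above
BASE_URL = "http://localhost:8080"

payload = {
    # Placeholder: substitute the name of a model installed in your LocalAI instance
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Hello from LocalAI"}],
}
request = urllib.request.Request(
    f"{BASE_URL}/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# Requires a running LocalAI server; uncomment to send the request:
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request shape matches OpenAI's chat-completions endpoint, existing OpenAI client code needs no changes beyond the base URL.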
Supported Models
- Llama models
- GPT-2/J
- BERT
- Stable Diffusion (images)
- And more