LocalAI (via) “Self-hosted, community-driven, local OpenAI-compatible API”. Designed to let you run local models such as those enabled by llama.cpp without rewriting your existing code that calls the OpenAI REST APIs. Reminds me of the various S3-compatible storage APIs that exist today.
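
To sketch what that compatibility means in practice: existing code using the official `openai` Python client can be pointed at LocalAI just by swapping the base URL. This is a minimal illustration, not from the LocalAI docs - the port (8080, which I believe is LocalAI's default) and the model name are assumptions.

```python
from openai import OpenAI

# Point the standard OpenAI client at a local LocalAI server
# instead of api.openai.com - everything else stays the same.
client = OpenAI(
    base_url="http://localhost:8080/v1",  # assumed LocalAI default port
    api_key="not-needed",  # LocalAI doesn't require a real API key
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # whatever model name your LocalAI config exposes
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```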