llama.cpp
Open Source, by the llama.cpp Community
Best For
Local/offline LLM inference in C/C++.
About llama.cpp
Portable LLM inference in plain C/C++ using the GGUF model format, with support for many backends (CUDA, Metal, Vulkan, HIP, SYCL).
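As an illustration, the sketch below shows local inference through the community llama-cpp-python bindings (one of several language bindings; the core project itself is the C/C++ library and CLI). The model path, prompt, and parameters such as n_ctx and n_gpu_layers are placeholder assumptions you would adjust for your own model and hardware.

```python
# Minimal sketch: local, offline inference via the llama-cpp-python bindings.
# Assumes `pip install llama-cpp-python` and a GGUF model file on disk.
from llama_cpp import Llama

llm = Llama(
    model_path="models/example-7b-q4_k_m.gguf",  # placeholder path to any GGUF model
    n_ctx=2048,       # context window size (assumption; tune per model)
    n_gpu_layers=-1,  # offload all layers if a GPU backend was compiled in
)

out = llm("Q: What does llama.cpp do? A:", max_tokens=64)
print(out["choices"][0]["text"])
```

The same flow is available directly from the C API in llama.h, or entirely from the command line via the bundled llama-cli tool.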
Tool Information
- License: Open Source
- Type:
- Cost: Free (OSS)
- Released: 2023
- Supported Languages: C/C++, Python, Go, Node