llama.cpp

Open Source

by llama.cpp Community

Best For

Local/offline LLM inference in C/C++.

About llama.cpp

Portable LLM inference in C/C++ using the GGUF model format, with backends for CUDA, Metal, Vulkan, HIP, and SYCL.
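
To make the GGUF workflow concrete, here is a minimal sketch using the widely used llama-cpp-python binding (one option behind the Python entry under Supported Languages). The model path is hypothetical and stands in for any locally downloaded GGUF file, and n_gpu_layers only offloads work to the GPU when llama.cpp was built with a GPU backend such as CUDA or Metal.

from llama_cpp import Llama

# Load a local GGUF model; the path is hypothetical, so point it at any GGUF file on disk.
llm = Llama(
    model_path="models/example-q4_k_m.gguf",
    n_ctx=2048,        # context window in tokens
    n_gpu_layers=-1,   # offload all layers if a GPU backend was compiled in
    verbose=False,
)

# Plain text completion; stop when the model starts a new "Q:" turn.
out = llm("Q: What does the GGUF file format store? A:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"].strip())

The same kind of prompt-in, text-out loop is also available from llama.cpp's own command-line and server tools.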

Tool Information

License
Open Source
Type
LLM inference engine (library, CLI, and server)
Cost
Free (OSS)
Released
2023
Supported Languages
C/C++, Python, Go, Node

Key Capabilities

Prompts for llama.cpp

Similar Tools

Works Well With

Curated combinations that pair nicely with llama.cpp for faster experimentation.

We're mapping complementary tools for this entry. Until then, explore similar tools above or check recommended stacks on challenge pages.