Packages for keyword “llama”
These packages are available as a package collection, usable in Xcode or SwiftPM.
NexaAI
Run the latest LLMs and VLMs across GPU, NPU, and CPU, with PC (Python/C++) and mobile (Android & iOS) support; runs OpenAI gpt-oss, Granite4, Qwen3VL, Gemma 3n, and more at high speed.
LocalLLMClient
Swift package to run local LLMs on iOS, macOS, and Linux
llmfarm_core
Swift library to work with llama and other large language models.
CameLLM
Run your favourite LLMs locally on macOS from Swift
4 packages.
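All four packages wrap local inference (typically llama.cpp-compatible GGUF models) behind a Swift interface. As a rough orientation only, the sketch below shows the general shape such a client tends to expose: load a model file, then stream generated tokens asynchronously. The protocol, type names, and echo-only implementation here are hypothetical illustrations, not the actual API of NexaAI, LocalLLMClient, llmfarm_core, or CameLLM.

```swift
import Foundation

// Hypothetical protocol sketching the shape a local-LLM Swift client
// commonly exposes; the real packages listed above each define their own API.
protocol LocalLLM {
    /// Load a model from disk (e.g. a llama.cpp-compatible GGUF file).
    init(modelURL: URL) throws

    /// Stream generated text for a prompt, token by token.
    func generate(prompt: String) -> AsyncThrowingStream<String, Error>
}

// Toy conforming type so the sketch compiles and runs: it just echoes the
// prompt word by word instead of calling a real inference engine.
struct EchoLLM: LocalLLM {
    init(modelURL: URL) throws {}

    func generate(prompt: String) -> AsyncThrowingStream<String, Error> {
        AsyncThrowingStream { continuation in
            for word in prompt.split(separator: " ") {
                continuation.yield(String(word) + " ")
            }
            continuation.finish()
        }
    }
}

// Usage: load a (placeholder) model path and print tokens as they arrive.
@main
struct Demo {
    static func main() async throws {
        let llm = try EchoLLM(modelURL: URL(fileURLWithPath: "/path/to/model.gguf"))
        for try await token in llm.generate(prompt: "Hello from a local LLM") {
            print(token, terminator: "")
        }
        print()
    }
}
```

The async-stream pattern shown here is the common design choice for on-device generation in Swift, since tokens are produced incrementally and the UI usually displays them as they arrive.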