Yes, Voxd can run on models from a wide range of Large Language Model (LLM) providers. This flexibility allows you to choose the model that best fits your business needs, whether you prefer leading commercial providers or want to use your own self-hosted models.
Some common examples of supported model providers include:
- OpenAI (such as GPT-4, GPT-3.5)
- Google (like Gemini)
- Anthropic (Claude)
- Microsoft Azure OpenAI Service
- Mistral and other open-source models
- Self-hosted models (using platforms like Hugging Face, vLLM, or your own infrastructure)
Within each provider, you can select from multiple model versions and configurations based on your requirements for speed, accuracy, or privacy.
Voxd's flexible architecture means you’re not locked into a single AI provider. You can switch models or even run different agents on different models, helping you optimize for performance, cost, or compliance.
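As a purely illustrative sketch of what per-agent model selection could look like, the example below assigns different providers and models to different agents. The names used here (`AgentConfig`, `provider`, `model`, and the model identifiers) are hypothetical and are not taken from Voxd's actual configuration API:

```python
# Hypothetical sketch only -- the structure and names are illustrative,
# not Voxd's actual configuration API.
from dataclasses import dataclass

@dataclass
class AgentConfig:
    name: str
    provider: str   # e.g. "openai", "anthropic", "self-hosted"
    model: str      # e.g. "gpt-4", "claude-3-opus", or a self-hosted endpoint name

# Different agents can point at different providers and models,
# e.g. a fast, low-cost model for triage and a stronger model for drafting.
agents = [
    AgentConfig(name="triage", provider="openai", model="gpt-3.5-turbo"),
    AgentConfig(name="drafting", provider="anthropic", model="claude-3-opus"),
    AgentConfig(name="internal-docs", provider="self-hosted", model="mistral-7b-instruct"),
]

for agent in agents:
    print(f"{agent.name}: {agent.provider} / {agent.model}")
```

The same idea applies whichever configuration format is used in practice: each agent carries its own provider and model choice, so swapping one agent to a different model does not affect the others.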