diff --git a/docs/en/concepts/llms.mdx b/docs/en/concepts/llms.mdx
index 98bfbeb234..7222d3b62f 100644
--- a/docs/en/concepts/llms.mdx
+++ b/docs/en/concepts/llms.mdx
@@ -1189,6 +1189,42 @@ In this section, you'll find detailed examples that help you select, configure,
     uv add 'crewai[litellm]'
     ```
+
+
+    [Doubleword](https://doubleword.ai) is an AI model gateway that provides unified routing, management, and security for inference across multiple model providers. It exposes an OpenAI-compatible API.
+
+    Set the following environment variable in your `.env` file:
+    ```toml Code
+    DOUBLEWORD_API_KEY=
+    ```
+
+    Example usage in your CrewAI project:
+    ```python Code
+    import os
+    from crewai import LLM
+
+    llm = LLM(
+        model="openai/Qwen/Qwen3.5-397B-A17B-FP8",
+        base_url="https://api.doubleword.ai/v1",
+        api_key=os.environ["DOUBLEWORD_API_KEY"],
+    )
+    ```
+
+    Doubleword features:
+    - OpenAI-compatible API for any supported model
+    - Unified routing across multiple inference providers
+    - Per-model rate limiting and access control
+    - Request logging and analytics
+
+    See the [Doubleword docs](https://docs.doubleword.ai) for available models and configuration.
+
+    **Note:** This provider uses LiteLLM with the `openai/` prefix. Add it as a dependency to your project:
+    ```bash Code
+    uv add 'crewai[litellm]'
+    ```
+
 ## Streaming Responses
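The patch relies on Doubleword being OpenAI-compatible, so it may help to see what that compatibility means on the wire. Below is a minimal sketch (not part of the docs change) of the chat-completions request an OpenAI-compatible gateway accepts. The `/chat/completions` path and `Authorization: Bearer` header follow the standard OpenAI API convention; `build_chat_request` is a hypothetical helper, and stripping the `openai/` LiteLLM routing prefix from the model name is an assumption about how the request reaches the gateway.

```python
import json

# Hypothetical helper: builds the OpenAI-style chat-completions request an
# OpenAI-compatible gateway such as Doubleword accepts. The endpoint path and
# auth header follow the standard OpenAI API shape (assumed, not Doubleword-specific).
def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    url = f"{base_url.rstrip('/')}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

url, headers, body = build_chat_request(
    "https://api.doubleword.ai/v1",
    "sk-example",  # placeholder, not a real credential
    # "openai/" is LiteLLM routing info, assumed stripped before the gateway sees it:
    "Qwen/Qwen3.5-397B-A17B-FP8",
    "Hello",
)
print(url)  # → https://api.doubleword.ai/v1/chat/completions
```

This is only meant to show why a single `base_url` plus API key is enough configuration: the gateway speaks the same request/response schema as any other OpenAI-compatible endpoint.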