
docs: add Groq via LiteLLMModel example to using_different_models#2217

Open
VANDRANKI wants to merge 1 commit into huggingface:main from VANDRANKI:docs/add-groq-litellm-example

Conversation

@VANDRANKI

Adds a "Using Groq Models" section to docs/source/en/examples/using_different_models.md.

Closes #2163

What is added

  • Install step (smolagents[litellm])
  • API key setup
  • LiteLLMModel initialization with the groq/ prefix
  • Minimal CodeAgent usage example
  • Note on popular Groq models and link to the Groq model catalog

Why

Groq is a widely used inference provider known for very fast inference of open-source models. The doc page already covers Gemini, OpenRouter, and xAI but not Groq. Issue #2163 specifically requested this.

The implementation follows the same pattern as the existing xAI Grok section: `LiteLLMModel` with the provider prefix and an API key.

Addresses huggingface#2163.

Adds a "Using Groq Models" section showing how to configure LiteLLMModel
with the groq/ prefix, which popular Groq models to use, and a minimal
CodeAgent usage example. Section is placed between OpenRouter and xAI
to maintain a logical grouping of third-party provider examples.


Development

Successfully merging this pull request may close these issues.

Add documentation and example for using CodeAgent with Groq via LiteLLMModel
