
Commit ee7dd36

Author: Meridian

docs: README updates for v1.8.0 — DeepSeek provider + HF token env var

Parent: 9a71b93

1 file changed (README.md): 29 additions, 4 deletions
@@ -8,7 +8,7 @@ Open source SDK. Hosted optimization intelligence.
 * **Outcome-aware routing** — Routes each goal to the model+tool path that is actually succeeding in production
 * **Continuous optimization** — Learns from real outcomes using Thompson Sampling. Adapts as models degrade, tools fail, or costs shift
-* **Auto-instrumentation** — Traces OpenAI, Anthropic, and Google AI calls with zero code changes
+* **Auto-instrumentation** — Traces OpenAI, Anthropic, Google AI, and DeepSeek calls with zero code changes. DeepSeek spans and costs are attributed correctly without a separate instrumentor.
 * **TraceCapsule** — Cross-agent context propagation for multi-agent systems
 * **Cost & token tracking** — Real-time cost calculation and token monitoring across all providers
 * **Any model, any modality** — Text LLMs, voice, image, embeddings, classification, translation, anything on HuggingFace
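The Thompson Sampling loop behind the "continuous optimization" bullet can be sketched minimally. This is an illustrative Beta-Bernoulli bandit under assumed semantics, not Kalibr's actual implementation; the path names are placeholders:

```python
import random

class ThompsonRouter:
    """Illustrative Beta-Bernoulli Thompson Sampling over candidate paths."""

    def __init__(self, paths):
        # Beta(1, 1) prior: one pseudo-success and one pseudo-failure per path
        self.stats = {p: {"wins": 1, "losses": 1} for p in paths}

    def pick(self):
        # Sample a success rate from each path's posterior; route to the best draw
        draws = {p: random.betavariate(s["wins"], s["losses"])
                 for p, s in self.stats.items()}
        return max(draws, key=draws.get)

    def record(self, path, success):
        # Update the chosen path's posterior with the observed outcome
        self.stats[path]["wins" if success else "losses"] += 1

router = ThompsonRouter(["deepseek-chat", "gpt-4o-mini"])
for _ in range(100):
    path = router.pick()
    # Pretend one path always succeeds, to show traffic concentrating on it
    router.record(path, success=(path == "deepseek-chat"))
```

Over many recorded outcomes the sampler shifts traffic toward the path with the higher observed success rate while still occasionally exploring the others.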
@@ -78,6 +78,22 @@ router = Router(goal="product_image", paths=["stabilityai/stable-diffusion-xl-ba
 result = router.execute(task="text_to_image", input_data="a product photo")
 ```
 
+## DeepSeek
+
+DeepSeek models work out of the box — no separate SDK, no extra config beyond `DEEPSEEK_API_KEY`:
+
+```python
+from kalibr import Router
+
+router = Router(
+    goal="classify_icp",
+    paths=["deepseek-chat", "gpt-4o-mini", "claude-sonnet-4-20250514"],
+)
+response = router.completion(messages=[{"role": "user", "content": "Is this an ICP fit?"}])
+```
+
+Supported models: `deepseek-chat` (V3), `deepseek-reasoner` (R1), `deepseek-coder`. Kalibr attributes costs and spans correctly for each.
+
 `pip install kalibr`
 
 [![PyPI version](https://img.shields.io/pypi/v/kalibr)](https://pypi.org/project/kalibr/)
@@ -102,7 +118,10 @@ Get your credentials from [dashboard.kalibr.systems/settings](https://dashboard.
 ```bash
 export KALIBR_API_KEY=your-api-key
 export KALIBR_TENANT_ID=your-tenant-id
-export OPENAI_API_KEY=sk-...  # or ANTHROPIC_API_KEY for Claude models
+export OPENAI_API_KEY=sk-...         # OpenAI models
+export ANTHROPIC_API_KEY=sk-ant-...  # Anthropic / Claude models
+export DEEPSEEK_API_KEY=sk-...       # DeepSeek models (deepseek-chat, deepseek-reasoner)
+export HF_API_TOKEN=hf_...           # HuggingFace private models / rate-limit bypass
 ```
 
 Or use autonomous provisioning:
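The per-provider key scheme in the hunk above can be made concrete with a small helper that maps a model name to the environment variable it needs. This is a hypothetical sketch for illustration — `required_env_var` and `missing_keys` are not Kalibr APIs, and the prefix table is an assumption:

```python
import os

# Assumed mapping from model-name prefix to the env var that family would need.
PREFIX_TO_ENV = {
    "gpt-": "OPENAI_API_KEY",
    "claude-": "ANTHROPIC_API_KEY",
    "deepseek-": "DEEPSEEK_API_KEY",
}

def required_env_var(model: str) -> str:
    """Return the env var a given model path would read its key from."""
    for prefix, var in PREFIX_TO_ENV.items():
        if model.startswith(prefix):
            return var
    return "HF_API_TOKEN"  # everything else falls back to HuggingFace

def missing_keys(paths):
    """Report which env vars are unset for a list of candidate paths."""
    return sorted({v for v in map(required_env_var, paths) if not os.environ.get(v)})
```

A preflight call like `missing_keys(["deepseek-chat", "gpt-4o-mini"])` surfaces unset keys before the first routed request fails.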
@@ -282,12 +301,18 @@ pip install kalibr[langchain-all] # LangChain with all providers
 Kalibr auto-instruments OpenAI, Anthropic, Google, and HuggingFace SDKs on import (17 task types across every modality):
 
 ```python
-import kalibr  # Must be first import
+import kalibr  # Must be first import — patches OpenAI, Anthropic, Google, HuggingFace
 from openai import OpenAI
 
 client = OpenAI()
 response = client.chat.completions.create(model="gpt-4o", messages=[...])
-# Traced automatically — cost, latency, tokens, success all captured
+# Traced automatically — cost, latency, tokens captured
+
+# DeepSeek works automatically — same OpenAI SDK, detected by model prefix
+import os
+deepseek = OpenAI(api_key=os.environ["DEEPSEEK_API_KEY"], base_url="https://api.deepseek.com")
+response = deepseek.chat.completions.create(model="deepseek-chat", messages=[...])
+# Span labeled deepseek.chat.completions.create, cost at DeepSeek rates
 ```
 
 Disable with `KALIBR_AUTO_INSTRUMENT=false`.
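The "detected by model prefix" comment in the hunk above can be illustrated with a tiny labeling helper. This is a hypothetical sketch of prefix-based span naming, not Kalibr's internal code; `span_name` is an invented function:

```python
def span_name(model: str, method: str = "chat.completions.create") -> str:
    """Label a span by provider, inferred from the model-name prefix."""
    provider = "deepseek" if model.startswith("deepseek-") else "openai"
    return f"{provider}.{method}"

# A deepseek-* model routed through the OpenAI SDK still gets a DeepSeek-labeled span
label = span_name("deepseek-chat")
```

The same prefix check is what would let cost attribution use DeepSeek's price table rather than OpenAI's for these calls.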
