chore: bump version to 1.8.0 #125

Merged
devonakelley merged 1 commit into main from chore/bump-version-1.8.0
Mar 24, 2026
Conversation

@devonakelley
Contributor

Version bump to 1.8.0 for PyPI release.

Changes since 1.7.0

#121 — kalibr init HuggingFace support

  • Scanner detects all 17 HF InferenceClient task methods + pipeline()
  • Rewriter generates router.execute() with task-appropriate model pairs
  • router.execute() task_method_map covers all 17 PATCHED_METHODS (was 10)
  • All scaffolded Routers now include 2 default paths (Thompson Sampling works from day one)
  • import kalibr enforced as first line in all generated code
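The task-to-model dispatch described above can be sketched as a plain lookup table. This is a minimal, hypothetical illustration of how a generated `router.execute()` might map an HF `InferenceClient` task method to a two-model path pair (the two default paths that let Thompson Sampling start exploring immediately); the map entries, model IDs, and function signature here are assumptions, not kalibr's actual generated code:

```python
# Hypothetical dispatch table: HF InferenceClient task method -> (primary, fallback).
# The real generated map is described as covering all 17 patched task methods
# plus pipeline(); only a few illustrative entries are shown here.
TASK_METHOD_MAP = {
    "text_generation": ("meta-llama/Llama-3.1-8B-Instruct",
                        "mistralai/Mistral-7B-Instruct-v0.3"),
    "summarization": ("facebook/bart-large-cnn", "google/pegasus-xsum"),
    "translation": ("facebook/nllb-200-distilled-600M",
                    "Helsinki-NLP/opus-mt-en-de"),
}

def execute(task_method: str, payload: str) -> dict:
    """Resolve the model pair for a task method and build a dispatch plan.

    Returns both paths so the routing layer can bandit between them.
    """
    if task_method not in TASK_METHOD_MAP:
        raise ValueError(f"unsupported task method: {task_method}")
    primary, fallback = TASK_METHOD_MAP[task_method]
    return {"task": task_method, "paths": [primary, fallback], "input": payload}
```

With two paths per task, a Thompson Sampling router has something to choose between from the first request, rather than degenerating to a single fixed route.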

#123 — HF token + DeepSeek provider

  • HF_API_TOKEN / HUGGING_FACE_HUB_TOKEN passed to InferenceClient
  • deepseek-* models routed correctly via _call_deepseek() instead of falling through to OpenAI

#124 — DeepSeek pricing + vendor attribution

  • DeepSeek added to pricing.py (V3, R1, Coder at correct rates)
  • OpenAI instrumentor detects DeepSeek by model prefix, sets correct llm.vendor and cost
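Vendor attribution plus cost lookup can be sketched in a few lines. Everything below is hypothetical: the rate numbers are placeholders, not the actual values added to pricing.py, and the function names are illustrative, not the instrumentor's API:

```python
def vendor_for_model(model: str) -> str:
    """Attribute a span's llm.vendor by model prefix, as #124 describes
    for requests flowing through the OpenAI instrumentor."""
    return "deepseek" if model.startswith("deepseek-") else "openai"

# Placeholder USD rates per million tokens -- NOT the real pricing.py values.
PRICING = {
    "deepseek-chat": {"input": 0.27, "output": 1.10},
    "deepseek-reasoner": {"input": 0.55, "output": 2.19},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int):
    """Return estimated cost in USD, or None if the model is unpriced."""
    rates = PRICING.get(model)
    if rates is None:
        return None
    return (input_tokens * rates["input"]
            + output_tokens * rates["output"]) / 1_000_000
```

The key point from #124 is that vendor detection and pricing share the same model key, so a `deepseek-*` call made through the OpenAI client still gets a DeepSeek vendor tag and DeepSeek rates rather than OpenAI ones.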
@devonakelley devonakelley merged commit 9a71b93 into main Mar 24, 2026
4 checks passed
