Commit 4209f8b

release: 0.5.0-alpha.3
1 parent 631ab2c commit 4209f8b

4 files changed: +44 / -3 lines changed
.release-please-manifest.json

Lines changed: 1 addition & 1 deletion

@@ -1,3 +1,3 @@
 {
-  ".": "0.5.0-alpha.2"
+  ".": "0.5.0-alpha.3"
 }

CHANGELOG.md

Lines changed: 41 additions & 0 deletions (the new 0.5.0-alpha.3 section below)

# Changelog

## 0.5.0-alpha.3 (2026-02-25)

Full Changelog: [v0.5.0-alpha.2...v0.5.0-alpha.3](https://github.com/llamastack/llama-stack-client-python/compare/v0.5.0-alpha.2...v0.5.0-alpha.3)

### ⚠ BREAKING CHANGES

* improve consistency of post-training API endpoints
### Features

* Add prompt_cache_key parameter support ([6b45699](https://github.com/llamastack/llama-stack-client-python/commit/6b45699185d934a5f8395c5cc3046f6c5aceb770))
* add support for 'frequency_penalty' param to Responses API ([56d39cc](https://github.com/llamastack/llama-stack-client-python/commit/56d39cc9ff9d6f54e303fc377d605ae17bac9584))
* add support for 'presence_penalty' param to Responses API ([4f57d15](https://github.com/llamastack/llama-stack-client-python/commit/4f57d159caba431676dced864f8f0871c3692f7b))
* add support for /responses background parameter ([4f8bf45](https://github.com/llamastack/llama-stack-client-python/commit/4f8bf4526e529a74b9c53cac6df8e4beb2808d60))
* Add top_logprobs parameter support ([2196986](https://github.com/llamastack/llama-stack-client-python/commit/21969867a82596e8be0aeeddbb6d8ccedf3e0f8b))
* add top_p parameter support to responses API ([23e3b9f](https://github.com/llamastack/llama-stack-client-python/commit/23e3b9fcf7a23378c200604d0f57dc5a9e6a8527))
* Add truncation parameter support ([7501365](https://github.com/llamastack/llama-stack-client-python/commit/7501365fe89795e87accfb6b1f2329da25d0efeb))
* improve consistency of post-training API endpoints ([99057fd](https://github.com/llamastack/llama-stack-client-python/commit/99057fdc74bafdf54479674ba75b447cd4681cb6))
* **vector_io:** Implement Contextual Retrieval for improved RAG search quality ([89ec5a7](https://github.com/llamastack/llama-stack-client-python/commit/89ec5a7bf405e688bd404877e49ab1ee9b49bf7e))
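Several of these features add OpenAI-style sampling, logprob, caching, and lifecycle parameters to the Responses API. A minimal sketch of the keyword arguments a caller might now pass to something like `client.responses.create(**params)` — the parameter names come from the changelog entries above, but the model id and all values are illustrative placeholders, not a confirmed call signature:

```python
# Keyword arguments for a Responses API call, per the 0.5.0-alpha.3 changelog.
# Only the parameter *names* are taken from the release notes; everything
# else (model id, input text, values) is made up for illustration.
params = {
    "model": "my-model",                  # placeholder model id
    "input": "Summarize the release notes.",
    "top_p": 0.9,                         # nucleus sampling (new)
    "frequency_penalty": 0.5,             # discourage token repetition (new)
    "presence_penalty": 0.2,              # discourage reusing seen tokens (new)
    "top_logprobs": 5,                    # per-token logprob count (new)
    "truncation": "auto",                 # input truncation strategy (new)
    "background": False,                  # synchronous vs. background run (new)
    "prompt_cache_key": "demo-cache",     # prompt-cache bucket key (new)
}

# The set of parameters introduced in this release, per the Features list.
new_in_alpha3 = {
    "top_p", "frequency_penalty", "presence_penalty",
    "top_logprobs", "truncation", "background", "prompt_cache_key",
}
assert new_in_alpha3 <= params.keys()
```

All of the new parameters are optional, so existing callers are unaffected; only the post-training endpoint changes in this release are flagged as breaking.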
### Bug Fixes

* align chat completion usage schema with OpenAI spec ([3974d5d](https://github.com/llamastack/llama-stack-client-python/commit/3974d5db8270e2548d0cdd54204c1603ca7a84a8))
* Enabled models list works ([#314](https://github.com/llamastack/llama-stack-client-python/issues/314)) ([acd5e64](https://github.com/llamastack/llama-stack-client-python/commit/acd5e64a9e82083192a31f85f9c810291cabcadb))
* **inference:** use flat response message model for chat/completions ([e58e2e4](https://github.com/llamastack/llama-stack-client-python/commit/e58e2e4dee9c9bbb72e4903e30f169991d10e545))
* **responses:** achieve full OpenResponses conformance — 6/6 tests passing ([631ab2c](https://github.com/llamastack/llama-stack-client-python/commit/631ab2c19c7cd33ac81598a795ae8be93bdd5a4b))
* **vector_io:** align Protocol signatures with request models ([ea58fd8](https://github.com/llamastack/llama-stack-client-python/commit/ea58fd88201ef59e580443688100cafe45f305c0))
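The usage-schema fix aligns token accounting with the OpenAI Chat Completions spec, where a response carries a `usage` object with `prompt_tokens`, `completion_tokens`, and `total_tokens`. A small sketch of that shape and its one invariant — the field names follow the OpenAI spec, the counts are invented:

```python
# OpenAI-spec usage block on a chat completion response.
# Invariant: total_tokens is the sum of prompt and completion tokens.
usage = {
    "prompt_tokens": 42,       # tokens consumed by the request
    "completion_tokens": 18,   # tokens generated in the reply
    "total_tokens": 60,        # always prompt + completion
}

assert usage["total_tokens"] == usage["prompt_tokens"] + usage["completion_tokens"]
```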
### Chores

* **api:** minor updates ([17a2705](https://github.com/llamastack/llama-stack-client-python/commit/17a270528b503591de15f9e9fcbc378007b75eda))
* format all `api.md` files ([0e3e262](https://github.com/llamastack/llama-stack-client-python/commit/0e3e2626081ca9268297742990368c7ed6493b40))
* **internal:** add request options to SSE classes ([2ecc682](https://github.com/llamastack/llama-stack-client-python/commit/2ecc682c1fccc86c643ad3da40e5134352745525))
* **internal:** bump dependencies ([612291e](https://github.com/llamastack/llama-stack-client-python/commit/612291e2142b710cdd643af16bbe83e514f7a44e))
* **internal:** fix lint error on Python 3.14 ([a0f6975](https://github.com/llamastack/llama-stack-client-python/commit/a0f69750827b016bb27a52bdd77fcbbacd311020))
* **internal:** make `test_proxy_environment_variables` more resilient ([6bc2bb4](https://github.com/llamastack/llama-stack-client-python/commit/6bc2bb4e81b16d23e20090f45dbd8a53a63c158d))
* **internal:** make `test_proxy_environment_variables` more resilient to env ([44bbae1](https://github.com/llamastack/llama-stack-client-python/commit/44bbae12bb8b4f72d1fb50db29bedd69f30340b7))
* update mock server docs ([92cb087](https://github.com/llamastack/llama-stack-client-python/commit/92cb087355ffa1fd50e3a35b8e888853784c9fe9))

## 0.5.0-alpha.2 (2026-02-05)

Full Changelog: [v0.5.0-alpha.1...v0.5.0-alpha.2](https://github.com/llamastack/llama-stack-client-python/compare/v0.5.0-alpha.1...v0.5.0-alpha.2)

pyproject.toml

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
 [project]
 name = "llama_stack_client"
-version = "0.5.0-alpha.2"
+version = "0.5.0-alpha.3"
 description = "The official Python library for the llama-stack-client API"
 dynamic = ["readme"]
 license = "MIT"

src/llama_stack_client/_version.py

Lines changed: 1 addition & 1 deletion

@@ -7,4 +7,4 @@
 # File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.

 __title__ = "llama_stack_client"
-__version__ = "0.5.0-alpha.2" # x-release-please-version
+__version__ = "0.5.0-alpha.3" # x-release-please-version
