Commit 97f8335

Update dependencies to llama-stack version 0.5.2 and add llama-stack-api; adjust constants and tests accordingly

- Updated `test.containerfile` to rhoai-3.4; base image now built from the upstream Red Hat UBI
- Fixed a type error
- Addressed review comments: bumped versions from 0.5.0 to 0.5.2
- Fixed mypy findings
1 parent: e73778a

File tree

6 files changed: +137, -52 lines

pyproject.toml

Lines changed: 3 additions & 3 deletions
@@ -28,9 +28,9 @@ dependencies = [
     # Used by authentication/k8s integration
     "kubernetes>=30.1.0",
     # Used to call Llama Stack APIs
-    "llama-stack==0.4.3",
-    "llama-stack-client==0.4.3",
-    "llama-stack-api==0.4.4",
+    "llama-stack==0.5.2",
+    "llama-stack-client==0.5.2",
+    "llama-stack-api==0.5.2",
     # Used by Logger
     "rich>=14.0.0",
     # Used by JWK token auth handler

src/constants.py

Lines changed: 1 addition & 1 deletion
@@ -2,7 +2,7 @@

 # Minimal and maximal supported Llama Stack version
 MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"
-MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.4.3"
+MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.5.2"

 UNABLE_TO_PROCESS_RESPONSE = "Unable to process this request"
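For context, the pair of constants reads as an inclusive supported range for the connected llama-stack. A minimal sketch of how such a range check could look — the `is_supported` helper and the naive `X.Y.Z` parsing are assumptions for illustration, not the repo's actual code:

```python
# Hypothetical sketch (not the repo's actual code): using the two
# constants from src/constants.py as an inclusive supported range.
MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"
MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.5.2"


def _as_tuple(version: str) -> tuple[int, ...]:
    # Naive parse; assumes plain numeric "X.Y.Z" version strings.
    return tuple(int(part) for part in version.split("."))


def is_supported(version: str) -> bool:
    # Tuple comparison orders versions numerically, unlike plain string
    # comparison (as strings, "0.2.17" > "0.10.0").
    return (
        _as_tuple(MINIMAL_SUPPORTED_LLAMA_STACK_VERSION)
        <= _as_tuple(version)
        <= _as_tuple(MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION)
    )


print(is_supported("0.5.2"), is_supported("0.2.16"))  # → True False
```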

src/utils/responses.py

Lines changed: 4 additions & 3 deletions
@@ -1139,15 +1139,16 @@ def _extract_text_from_content(

     text_fragments: list[str] = []
     for part in content:
-        if part.type == "input_text":
+        part_type = getattr(part, "type", None)
+        if part_type == "input_text":
             input_text_part = cast(InputTextPart, part)
             if input_text_part.text:
                 text_fragments.append(input_text_part.text.strip())
-        elif part.type == "output_text":
+        elif part_type == "output_text":
             output_text_part = cast(OutputTextPart, part)
             if output_text_part.text:
                 text_fragments.append(output_text_part.text.strip())
-        elif part.type == "refusal":
+        elif part_type == "refusal":
             refusal_part = cast(ContentPartRefusal, part)
             if refusal_part.refusal:
                 text_fragments.append(refusal_part.refusal.strip())
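The change above swaps direct `part.type` access for `getattr(part, "type", None)`, so content parts that lack a `type` attribute fall through to the next branch instead of raising `AttributeError`. A minimal self-contained sketch of the pattern — `InputTextPart` and `OpaquePart` here are simplified stand-ins, not the real llama-stack-api types:

```python
from dataclasses import dataclass


@dataclass
class InputTextPart:
    # Simplified stand-in for the real llama-stack-api InputTextPart.
    type: str
    text: str


class OpaquePart:
    # A part with no "type" attribute at all: a bare `part.type` would
    # raise AttributeError, while getattr(..., None) falls through.
    pass


def extract_text(content: list) -> list[str]:
    # Same defensive pattern as the diff in _extract_text_from_content.
    fragments: list[str] = []
    for part in content:
        part_type = getattr(part, "type", None)
        if part_type == "input_text":
            if part.text:
                fragments.append(part.text.strip())
    return fragments


print(extract_text([InputTextPart("input_text", "  hello  "), OpaquePart()]))  # → ['hello']
```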

test.containerfile

Lines changed: 0 additions & 1 deletion
@@ -1,4 +1,3 @@
-# Upstream llama-stack built from Red Hat UBI
 FROM registry.access.redhat.com/ubi9/ubi-minimal

 USER root

tests/e2e/features/info.feature

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ Feature: Info tests
     When I access REST API endpoint "info" using HTTP GET method
     Then The status code of the response is 200
     And The body of the response has proper name Lightspeed Core Service (LCS) and version 0.4.2
-    And The body of the response has llama-stack version 0.4.3
+    And The body of the response has llama-stack version 0.5.2

   @skip-in-library-mode
   Scenario: Check if info endpoint reports error when llama-stack connection is not working
