LCORE-633: bump-up Llama Stack version to 0.2.19 (#520)
Conversation
Walkthrough
Bumps the Llama Stack dependency from 0.2.18 to 0.2.19 in pyproject.toml and updates the repository constant MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION to "0.2.19". No other code, API, or behavior changes.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes
Pre-merge checks: ✅ 3 passed
Actionable comments posted: 1
📜 Review details
- Configuration used: CodeRabbit UI
- Review profile: CHILL
- Plan: Pro

📒 Files selected for processing (1)
- src/constants.py (1 hunk)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: build-pr
```diff
 # Minimal and maximal supported Llama Stack version
 MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"
-MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.18"
+MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.19"
```
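These paired constants imply a simple range check at startup. A minimal sketch of such a gate (the `parse` and `is_supported` helpers are hypothetical illustrations, not code from this repository):

```python
# Sketch of a version-range gate using the MIN/MAX constants from
# src/constants.py; helper names here are hypothetical.
MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"
MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.19"

def parse(version: str) -> tuple[int, ...]:
    """Turn a dotted version like "0.2.19" into (0, 2, 19) for tuple comparison."""
    return tuple(int(part) for part in version.split("."))

def is_supported(installed: str) -> bool:
    """True when the installed version falls inside the supported range."""
    return (
        parse(MINIMAL_SUPPORTED_LLAMA_STACK_VERSION)
        <= parse(installed)
        <= parse(MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION)
    )

print(is_supported("0.2.19"))  # True
print(is_supported("0.2.20"))  # False
```

Tuple comparison avoids the classic string-comparison pitfall where "0.2.9" would sort above "0.2.19".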
💡 Verification agent
🧩 Analysis chain
LGTM: max supported version bumped to 0.2.19 and matches the PR intent.
Looks consistent with the dependency upgrade. Consider verifying no stale 0.2.18 references and that min/max ordering remains valid.
Run:
🏁 Script executed:

```bash
#!/bin/bash
set -euo pipefail
echo "Check pyproject for llama-stack/client pins (expect 0.2.19):"
rg -nP 'llama-stack(-client)?\s*=?\s*["'\'']?0\.2\.\d+["'\'']?' pyproject.toml || true

echo "Find stale references to 0.2.18 anywhere:"
rg -n '0\.2\.18\b' -g '!**/dist/**' -g '!**/build/**' || true

echo "Validate MIN<=MAX in src/constants.py:"
python - <<'PY'
import re
min_v = max_v = None
for line in open("src/constants.py", encoding="utf-8"):
    if "MINIMAL_SUPPORTED_LLAMA_STACK_VERSION" in line:
        min_v = re.search(r'"([^"]+)"', line).group(1)
    if "MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION" in line:
        max_v = re.search(r'"([^"]+)"', line).group(1)

def parse(v):
    return tuple(map(int, v.split(".")))

assert parse(min_v) <= parse(max_v), f"Invalid range: {min_v} > {max_v}"
print(f"OK: {min_v}..{max_v}")
PY
```

Length of output: 309
🏁 Script executed:

```bash
#!/bin/bash
set -euo pipefail
echo "Dependencies in pyproject.toml:"
rg -n '^\s*llama-stack(-client)?' -C2 pyproject.toml || true
echo "Pin versions in pyproject.toml:"
rg -n '0\.2\.\d+' -C0 pyproject.toml || true
echo "Find stale refs to 0.2.18:"
rg -n '0\.2\.18\b' -C2 . || true
```

Length of output: 3615
Update stale 0.2.18 references in documentation and examples
Multiple example and doc files still pin "0.2.18" (e.g. examples/pyproject.llamastack.toml, src/models/responses.py examples, README.md, docs/openapi.md, docs/openapi.json, docs/output.md, docs/getting_started.md, docs/deployment_guide.md); bump all of these to "0.2.19". The version range in src/constants.py (0.2.17..0.2.19) is valid.
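Such stale pins can be updated mechanically. A minimal sketch (not this repository's tooling), demonstrated on a temporary file standing in for one of the affected docs; in a real checkout you would feed `git grep -l` results into the same `sed` invocation:

```shell
#!/usr/bin/env bash
# Sketch: rewrite stale "0.2.18" pins to "0.2.19", demoed on a temp file.
set -euo pipefail
tmp=$(mktemp)
printf 'llama-stack==0.2.18\n' > "$tmp"   # stand-in for e.g. docs/getting_started.md
sed -i 's/0\.2\.18/0.2.19/g' "$tmp"       # real repo: git grep -l '0\.2\.18' | xargs sed -i 's/0\.2\.18/0.2.19/g'
cat "$tmp"                                # → llama-stack==0.2.19
rm -f "$tmp"
```

Note the escaped dots in the pattern, so `0.2.18` is not matched as "0, any char, 2, any char, 18".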
@radofuchs Hi, where's the problem there? I can't see the logs from the llama-stack container, so I can't tell what's wrong.
Description
LCORE-633: bump-up Llama Stack version to 0.2.19