
chore: remove litellm type conversion #193


Merged
ehhuang merged 1 commit into main from pr193 on Mar 13, 2025

Conversation

ehhuang (Contributor) commented on Mar 11, 2025

Summary:
Moving this type conversion to litellm inference instead; see meta-llama/llama-stack#1565.

Test Plan:
LLAMA_STACK_CONFIG=fireworks pytest -s -v tests/integration/agents/test_agents.py --safety-shield meta-llama/Llama-Guard-3-8B --text-model meta-llama/Llama-3.1-8B-Instruct
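
For context, the conversion this PR removes is the kind of client-side mapping from Llama Stack message objects to the OpenAI-style dicts that litellm accepts. The sketch below is illustrative only: the function name `convert_messages_for_litellm` and its field handling are assumptions, not the actual helper removed here; per the summary, the real conversion now lives in the litellm-based inference provider (meta-llama/llama-stack#1565).

```python
# Illustrative sketch only. `convert_messages_for_litellm` and the field
# handling below are hypothetical stand-ins for the removed client-side
# helper; the real conversion is handled by the litellm inference provider.
from typing import Any, Dict, List


def convert_messages_for_litellm(messages: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
    """Map Llama Stack style chat messages to the OpenAI-style dicts litellm expects."""
    converted: List[Dict[str, Any]] = []
    for msg in messages:
        converted.append(
            {
                "role": msg.get("role", "user"),
                "content": msg.get("content", ""),
            }
        )
    return converted


if __name__ == "__main__":
    # Example: a single user turn converted to the litellm/OpenAI message shape.
    print(convert_messages_for_litellm([{"role": "user", "content": "Hello"}]))
```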

@ehhuang marked this pull request as ready for review on March 11, 2025 20:57
ehhuang added a commit to meta-llama/llama-stack that referenced this pull request on Mar 12, 2025:
Summary:
Supports meta-llama/llama-stack-client-python#193

Test Plan:
LLAMA_STACK_CONFIG=fireworks pytest -s -v tests/integration/agents/test_agents.py --safety-shield meta-llama/Llama-Guard-3-8B --text-model meta-llama/Llama-3.1-8B-Instruct
@ehhuang merged commit ab3f844 into main on Mar 13, 2025
2 checks passed
@ehhuang deleted the pr193 branch on March 13, 2025 05:16