A minimal example demonstrating how to build and coordinate a tool as a service using Orra's Plan Engine. It's Orra's Hello World!
```mermaid
sequenceDiagram
    participant CLI as Orra CLI
    participant PE as Plan Engine
    participant ES as Echo Tool as Service
    participant WH as Webhook Server

    CLI->>PE: Send action
    PE->>ES: Orchestrate task
    ES->>PE: Return echo
    PE->>WH: Send result
    Note over WH: See result in terminal
```
- 🔄 Basic service registration and coordination
- 📡 Real-time WebSocket communication
- ⚡ Reliable message delivery
- 🛡️ Built-in health monitoring
- 🚀 Simple but production-ready patterns
- Docker and Docker Compose
- Poetry
- OpenAI API key for Orra's Plan Engine (`PLAN_CACHE_OPENAI_API_KEY`)
- OpenAI API key or Groq API key for Orra's Plan Engine reasoning models config
- OpenAI API key for the `writer_crew` and `editor` Agents
1. First, set up Orra and the CLI by following the installation instructions.

2. Set up your Orra project:

   ```bash
   # Create project, add a webhook and API key
   orra projects add my-echo-app
   orra webhooks add http://host.docker.internal:8888/webhook
   orra api-keys gen echo-key
   ```

3. Configure the Echo tool as service:

   ```bash
   cd examples/echo-python
   echo "ORRA_API_KEY=echo-key-from-step-2" > .env
   ```

4. Start the webhook server (in a separate terminal):

   ```bash
   # Start the webhook server using the verify subcommand
   orra verify webhooks start http://localhost:8888/webhook
   ```

5. Start and register the Echo service:

   ```bash
   # With Docker
   docker compose up

   # Or locally with Poetry
   poetry install
   poetry run python src/main.py
   ```

6. Try it out:

   ```bash
   # Send a test message
   orra verify run 'Echo this message' --data message:'Hello from Orra!'

   # Check the result
   orra ps
   orra inspect <orchestration-id>
   ```
You should see the result both in the webhook server terminal and through the inspect command.
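To see what the webhook side of this flow looks like, here is a minimal sketch of a receiver similar in spirit to the one `orra verify webhooks start` runs. The port and `/webhook` path mirror the setup steps above; the payload handling is illustrative, not the CLI's actual implementation.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookHandler(BaseHTTPRequestHandler):
    """Accept POSTed JSON results and print them to the terminal."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        print("Received result:", payload)  # the echoed message shows up here
        self.send_response(200)
        self.end_headers()

    def log_message(self, format, *args):
        pass  # silence per-request access logging

def serve(port: int = 8888) -> None:
    """Serve forever on the given port (step 2 registers host.docker.internal:8888/webhook)."""
    HTTPServer(("0.0.0.0", port), WebhookHandler).serve_forever()

# serve()  # uncomment to run standalone
```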
```bash
# This curl command is equivalent to what `orra verify run` performs internally.
# Send an echo orchestration request to the Plan Engine:
curl -X POST http://localhost:8005/orchestrations \
  -H "Authorization: Bearer $ORRA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "action": {
      "type": "echo",
      "content": "Echo this message"
    },
    "data": [
      {
        "field": "message",
        "value": "Hello from curl!"
      }
    ],
    "webhook": "http://host.docker.internal:8888/webhook"
  }'
```
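The same request can be built with nothing but the Python standard library; the endpoint, headers, and payload mirror the curl command above (this assumes a Plan Engine running on port 8005 and `ORRA_API_KEY` set in the environment):

```python
import json
import os
import urllib.request

# Build the same orchestration request the curl command sends.
payload = {
    "action": {"type": "echo", "content": "Echo this message"},
    "data": [{"field": "message", "value": "Hello from Python!"}],
    "webhook": "http://host.docker.internal:8888/webhook",
}
req = urllib.request.Request(
    "http://localhost:8005/orchestrations",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {os.getenv('ORRA_API_KEY', '')}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# With a running Plan Engine, send it:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status, resp.read().decode())
```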
The core Echo service implementation is straightforward:
```python
import os

from orra import OrraService, Task
from pydantic import BaseModel


class EchoInput(BaseModel):
    message: str


class EchoOutput(BaseModel):
    echo: str


service = OrraService(
    name="echo",
    description="Use echo to echo back messages",
    url=os.getenv("ORRA_URL"),
    api_key=os.getenv("ORRA_API_KEY")
)


@service.handler()
async def handle_echo(task: Task[EchoInput]) -> EchoOutput:
    return EchoOutput(echo=task.input.message)
```
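Because the handler is just an async function, its echo logic can be exercised without a running Plan Engine. The sketch below uses hypothetical dataclass stand-ins for `Task` and the models (not the real `orra`/`pydantic` classes) purely to illustrate the input-to-output mapping:

```python
import asyncio
from dataclasses import dataclass

# Hypothetical stand-ins for the SDK types, for illustration only:
@dataclass
class EchoInput:
    message: str

@dataclass
class EchoOutput:
    echo: str

@dataclass
class Task:
    input: EchoInput

async def handle_echo(task: Task) -> EchoOutput:
    # Same body as the real handler: echo the input message back.
    return EchoOutput(echo=task.input.message)

result = asyncio.run(handle_echo(Task(input=EchoInput(message="Hello from Orra!"))))
print(result.echo)  # Hello from Orra!
```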
That's it! Orra provides:
- Service discovery
- Health monitoring
- Reliable task execution
- Error recovery