Router Delegate
Source: examples/patterns/router_delegate.py
Introduction
RouteLLM motivates specialized route selection, AutoGen demonstrates multi-agent delegation patterns, and Human-AI collaboration by design frames why explicit routing supports accountable coordination. This example shows intent-based routing across direct and multi-step agents using a shared runtime surface.
Technical Implementation
1. Configure `Tracer` with JSONL + console output so each run emits machine-readable traces and lifecycle logs.
2. Build the runtime surface (public APIs only) and execute `RouterDelegatePattern.run(...)` with a fixed `request_id`.
3. Configure and invoke `Toolbox` integrations (core/script/MCP/callable) before assembling the final payload.
4. Print a compact JSON payload including `trace_info` for deterministic tests and docs examples.
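The routing decision in step 2 can be sketched in plain Python. The `route` function, its keyword-overlap heuristic, and the shortened description strings below are illustrative stand-ins only; the real `RouterDelegatePattern` receives an `llm_client` and likely delegates the choice to the model rather than using keyword matching.

```python
from typing import Dict


def route(prompt: str, descriptions: Dict[str, str]) -> str:
    # Naive heuristic: score each delegate by how many of its description
    # words appear in the prompt, then pick the highest-scoring delegate.
    prompt_lower = prompt.lower()
    scores = {
        name: sum(1 for word in desc.lower().split() if word in prompt_lower)
        for name, desc in descriptions.items()
    }
    return max(scores, key=scores.get)


descriptions = {
    "direct_llm_agent": "concise textual design summary",
    "json_tool_agent": "count words runtime text analysis tool",
}
print(route("Count the words in the phrase 'modular field service workflow'", descriptions))
# -> json_tool_agent
```

A prompt that mentions counting words overlaps with the tool delegate's description, so the tool-capable agent is selected; a summary-style prompt would fall through to the direct agent.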
```mermaid
flowchart LR
    A["Input prompt or scenario"] --> B["main(): runtime wiring"]
    B --> C["RouterDelegatePattern.run(...)"]
    C --> D["router delegates to specialized agent surfaces"]
    C --> E["Tracer JSONL + console events"]
    D --> F["ExecutionResult/payload"]
    E --> F
    F --> G["Printed JSON output"]
```
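The "Tracer JSONL + console events" branch above amounts to appending one JSON object per lifecycle event to a `.jsonl` file. The `JsonlTracer` class below is a hypothetical stand-in sketching that behavior; the real `Tracer` constructor takes `enabled`, `trace_dir`, `enable_jsonl`, and `enable_console` flags and its event schema is not shown here.

```python
import json
import tempfile
import time
from pathlib import Path


class JsonlTracer:
    # Hypothetical sketch of a JSONL tracer: one JSON object per event,
    # appended to run_<request_id>.jsonl under trace_dir.
    def __init__(self, trace_dir: Path, request_id: str) -> None:
        trace_dir.mkdir(parents=True, exist_ok=True)
        self.path = trace_dir / f"run_{request_id}.jsonl"

    def emit(self, event: str, **fields: object) -> None:
        record = {"event": event, "ts": time.time(), **fields}
        with self.path.open("a", encoding="utf-8") as fh:
            fh.write(json.dumps(record, ensure_ascii=True) + "\n")


# Write to a temp dir so repeated runs stay isolated.
tracer = JsonlTracer(Path(tempfile.mkdtemp()), "demo-001")
tracer.emit("run_started", request_id="demo-001")
tracer.emit("delegate_selected", delegate="json_tool_agent")
events = [json.loads(line)["event"] for line in tracer.path.read_text().splitlines()]
print(events)  # -> ['run_started', 'delegate_selected']
```

Because every line is a self-contained JSON object, the trace file can be tailed, grepped, or replayed event by event without loading the whole run.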
```python
from __future__ import annotations

import json
from pathlib import Path

from design_research_agents import (
    DirectLLMCall,
    LlamaCppServerLLMClient,
    MultiStepAgent,
    Toolbox,
    Tracer,
)
from design_research_agents.patterns import RouterDelegatePattern

_EXAMPLE_LLAMA_CLIENT_KWARGS = {
    "model": "Qwen_Qwen3-4B-Instruct-2507-Q4_K_M.gguf",
    "hf_model_repo_id": "bartowski/Qwen_Qwen3-4B-Instruct-2507-GGUF",
    "api_model": "qwen3-4b-instruct-2507-q4km",
    "context_window": 8192,
    "startup_timeout_seconds": 240.0,
    "request_timeout_seconds": 240.0,
}


def main() -> None:
    """Route one design prompt to the best delegate and print a summary."""
    # Fixed request id keeps traces and docs output deterministic across runs.
    request_id = "example-workflow-router-delegate-design-001"
    tracer = Tracer(
        enabled=True,
        trace_dir=Path("artifacts/examples/traces"),
        enable_jsonl=True,
        enable_console=True,
    )
    # Run the router/delegate pattern using public runtime surfaces. The with
    # statement shuts down the managed client and tool runtime automatically
    # when the example finishes.
    with Toolbox() as tool_runtime, LlamaCppServerLLMClient(**_EXAMPLE_LLAMA_CLIENT_KWARGS) as llm_client:
        direct_llm_agent = DirectLLMCall(llm_client=llm_client, tracer=tracer)
        json_tool_agent = MultiStepAgent(
            mode="json",
            llm_client=llm_client,
            tool_runtime=tool_runtime,
            max_steps=3,
            allowed_tools=("text.word_count",),
            tracer=tracer,
        )

        workflow = RouterDelegatePattern(
            llm_client=llm_client,
            tool_runtime=tool_runtime,
            alternatives={
                "direct_llm_agent": direct_llm_agent,
                "json_tool_agent": json_tool_agent,
            },
            alternative_descriptions={
                "direct_llm_agent": "Use for concise textual design summaries with no runtime tools.",
                "json_tool_agent": "Use for design requests needing runtime text analysis or tool calls.",
            },
            tracer=tracer,
        )

        result = workflow.run(
            prompt=(
                "Count the words in the phrase 'modular field service workflow' using the "
                "appropriate delegate and return only the word_count."
            ),
            request_id=request_id,
        )

        # Print the summarized result as stable, sorted JSON.
        summary = result.summary()
        print(json.dumps(summary, ensure_ascii=True, indent=2, sort_keys=True))


if __name__ == "__main__":
    main()
```
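The `with` statement in `main()` is what guarantees the managed client and tool runtime shut down even when `run(...)` raises. A minimal sketch of that contract, using a toy `ManagedRuntime` class (a hypothetical stand-in, not the real `Toolbox` or `LlamaCppServerLLMClient`):

```python
class ManagedRuntime:
    # Toy resource demonstrating the shutdown guarantee the example relies
    # on: __exit__ runs on both the success path and the error path.
    def __init__(self) -> None:
        self.closed = False

    def __enter__(self) -> "ManagedRuntime":
        return self

    def __exit__(self, exc_type, exc, tb) -> bool:
        self.closed = True
        return False  # never swallow exceptions from the body


runtime = ManagedRuntime()
with runtime:
    pass
print(runtime.closed)  # -> True

failing = ManagedRuntime()
try:
    with failing:
        raise RuntimeError("run(...) blew up")
except RuntimeError:
    pass
print(failing.closed)  # -> True even on the error path
```

Stacking two context managers on one `with` line, as the example does, exits them in reverse order, so the LLM client is shut down before the tool runtime that outlives it lexically.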
Expected Results
Run Command
```shell
PYTHONPATH=src python3 examples/patterns/router_delegate.py
```
Example output shape (values vary by run):
```json
{
  "success": true,
  "final_output": "<example-specific payload>",
  "terminated_reason": "<string-or-null>",
  "error": null,
  "trace": {
    "request_id": "<request-id>",
    "trace_dir": "artifacts/examples/traces",
    "trace_path": "artifacts/examples/traces/run_<timestamp>_<request_id>.jsonl"
  }
}
```
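Because the fixed `request_id` keeps the payload's structure stable while values vary by run, docs tests can assert on shape rather than content. The `check_summary_shape` helper below is hypothetical, not part of the library; it only encodes the key sets shown in the output shape above.

```python
import json

EXPECTED_TOP_KEYS = {"success", "final_output", "terminated_reason", "error", "trace"}
EXPECTED_TRACE_KEYS = {"request_id", "trace_dir", "trace_path"}


def check_summary_shape(payload: str) -> bool:
    # Validate the stable structure only, never run-specific values.
    data = json.loads(payload)
    return (
        set(data) == EXPECTED_TOP_KEYS
        and set(data["trace"]) == EXPECTED_TRACE_KEYS
        and data["trace"]["trace_path"].endswith(".jsonl")
    )


sample = json.dumps({
    "success": True,
    "final_output": "word_count=4",
    "terminated_reason": None,
    "error": None,
    "trace": {
        "request_id": "example-workflow-router-delegate-design-001",
        "trace_dir": "artifacts/examples/traces",
        "trace_path": "artifacts/examples/traces/run_x_example.jsonl",
    },
})
print(check_summary_shape(sample))  # -> True
```

Shape-only checks like this stay green when model outputs or timestamps change, which is the point of printing the summary with `sort_keys=True`.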