# Direct LLM Call
Source: `examples/agents/direct_llm_call.py`
## Introduction
The default built-in path is the OpenAI-compatible HTTP client. This keeps the base install lightweight while still talking to a real endpoint, whether that endpoint is local (for example llama.cpp, vLLM, or SGLang) or remote behind an OpenAI-shaped gateway.
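For readers unfamiliar with the OpenAI-compatible wire format, this is a minimal sketch of the request body such a client sends to `POST {base_url}/chat/completions`. The model name and message are illustrative placeholders, not values required by this project:

```python
import json

# The minimal chat-completions request body: a model name plus a list of
# role-tagged messages. Local servers (llama.cpp, vLLM, SGLang) accept the
# same shape as a remote OpenAI-style gateway.
payload = {
    "model": "qwen2-1.5b-q4",  # placeholder model id
    "messages": [
        {"role": "user", "content": "Write one sentence about enclosure design."}
    ],
}
body = json.dumps(payload)
```

Because every endpoint speaks this one shape, swapping local for remote inference is a matter of changing `base_url` and `model`, not code.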
## Technical Implementation
1. Configure `Tracer` with JSONL + console output so each run emits machine-readable traces and lifecycle logs.
2. Build the runtime surface (public APIs only) and execute `DirectLLMCall.run(...)` with a fixed `request_id`.
3. Capture structured outputs from runtime execution and preserve termination metadata for analysis.
4. Print a compact JSON payload including `trace_info` for deterministic tests and docs examples.
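The last step depends on `json.dumps` options to keep the printed payload byte-stable across runs; a minimal sketch of the idea (the `summary` dict here is a stand-in, not the real result payload):

```python
import json

# Sorted keys plus ASCII-only escaping make the rendered summary
# deterministic, so it can be compared verbatim in docs and tests.
summary = {"trace": {"request_id": "example-direct-llm-design-001"}, "success": True}
rendered = json.dumps(summary, ensure_ascii=True, indent=2, sort_keys=True)
print(rendered)
```

Without `sort_keys=True`, dict insertion order would leak into the output and make string comparisons brittle.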
```mermaid
flowchart LR
    A["Input prompt or scenario"] --> B["main(): runtime wiring"]
    B --> C["DirectLLMCall.run(...)"]
    C --> D["WorkflowRuntime executes one direct call"]
    C --> E["Tracer JSONL + console events"]
    D --> F["ExecutionResult/payload"]
    E --> F
    F --> G["Printed JSON output"]
```
```python
from __future__ import annotations

import json
from pathlib import Path

import design_research_agents as drag


def main() -> None:
    """Execute one direct model call with explicit tracing."""
    # Fixed request id keeps traces and docs output deterministic across runs.
    request_id = "example-direct-llm-design-001"
    tracer = drag.Tracer(
        enabled=True,
        trace_dir=Path("artifacts/examples/traces"),
        enable_jsonl=True,
        enable_console=True,
    )

    # Point the built-in HTTP client at any reachable OpenAI-compatible endpoint.
    with drag.OpenAICompatibleHTTPLLMClient(
        base_url="http://127.0.0.1:8001/v1",
        default_model="qwen2-1.5b-q4",
    ) as llm_client:
        llm = drag.DirectLLMCall(llm_client=llm_client, tracer=tracer)
        prompt = (
            "Write one sentence describing the one primary engineering specification for a "
            "field-repairable wearable sensor enclosure."
        )
        result = llm.run(
            prompt=prompt,
            request_id=request_id,
        )

    # Print the results
    summary = result.summary()
    print(json.dumps(summary, ensure_ascii=True, indent=2, sort_keys=True))


if __name__ == "__main__":
    main()
```
## Expected Results
Run Command:

```bash
PYTHONPATH=src python3 examples/agents/direct_llm_call.py
```
Example output shape (values vary by run):
```json
{
  "success": true,
  "final_output": "<example-specific payload>",
  "terminated_reason": "<string-or-null>",
  "error": null,
  "trace": {
    "request_id": "<request-id>",
    "trace_dir": "artifacts/examples/traces",
    "trace_path": "artifacts/examples/traces/run_<timestamp>_<request_id>.jsonl"
  }
}
```
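The `trace_path` above names a JSONL file: one JSON object (one event) per line. A minimal reader sketch for post-run analysis (the helper name and demo file are illustrative, not part of the library):

```python
import json
from pathlib import Path


def read_trace_events(path: Path) -> list[dict]:
    """Parse a JSONL trace file: each non-empty line is one event object."""
    return [
        json.loads(line)
        for line in path.read_text().splitlines()
        if line.strip()
    ]


# Hypothetical file for illustration; real traces live under
# artifacts/examples/traces/ with a timestamped, request-id-suffixed name.
demo = Path("demo_trace.jsonl")
demo.write_text('{"event": "run_started"}\n{"event": "run_finished"}\n')
events = read_trace_events(demo)
demo.unlink()
```

Line-oriented parsing like this is why JSONL suits tracing: events can be appended as they happen and consumed incrementally, with no need to hold or re-serialize one large JSON document.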