Direct LLM Compiled Execution#

Source: examples/agents/direct_llm_compiled_execution.py

Introduction#

Compiled delegate execution is useful when you want to inspect the bound workflow and tracing metadata before running it. This example makes the intermediate CompiledExecution object explicit so the compile step itself is visible and testable.

Technical Implementation#

  1. Configure Tracer with JSONL + console output so each run emits machine-readable traces and lifecycle logs.

  2. Build the runtime surface (public APIs only) and execute DirectLLMCall.compile(...) to obtain CompiledExecution.

  3. Validate the compiled wrapper shape, then call CompiledExecution.run() with the bound request metadata.

  4. Print a compact JSON payload that includes compile metadata alongside the normalized execution summary.

    flowchart LR
        A["Input prompt or scenario"] --> B["main(): runtime wiring"]
        B --> C["DirectLLMCall.compile(...)"]
        C --> D["CompiledExecution"]
        D --> E["CompiledExecution.run()"]
        E --> F["ExecutionResult/payload"]
        F --> G["Printed JSON output"]
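The value of the explicit compile step is that the bound wrapper can be inspected before anything executes. As a minimal illustration of that compile-then-run pattern in plain Python (this is a hypothetical stand-in, not the `design_research_agents` implementation; `CompiledCall` and `compile_call` are invented names):

```python
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class CompiledCall:
    """Hypothetical stand-in for a compiled execution wrapper.

    Binding the request up front lets callers inspect metadata
    (delegate name, request id) before anything runs.
    """

    delegate_name: str
    request_id: str
    _fn: Callable[[], Any]

    def run(self) -> Any:
        # Execution is deferred until run() is called explicitly.
        return self._fn()


def compile_call(prompt: str, request_id: str) -> CompiledCall:
    # A real implementation would bind an LLM client here; this
    # placeholder just echoes the prompt so the shape stays visible.
    return CompiledCall(
        delegate_name="DirectLLMCall",
        request_id=request_id,
        _fn=lambda: f"echo: {prompt}",
    )


compiled = compile_call("hello", "req-001")
print(compiled.delegate_name, compiled.request_id)  # inspectable before run()
print(compiled.run())
```

The separation makes the compile step unit-testable: assertions can target `delegate_name` and `request_id` without invoking a model at all.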
    
from __future__ import annotations

import json
from pathlib import Path

import design_research_agents as drag


def main() -> None:
    """Compile one direct model call, then run the compiled execution."""
    # A fixed request id keeps traces and docs output deterministic across runs.
    request_id = "example-direct-llm-compiled-design-001"
    tracer = drag.Tracer(
        enabled=True,
        trace_dir=Path("artifacts/examples/traces"),
        enable_jsonl=True,
        enable_console=True,
    )
    # Run the compiled direct LLM example using public runtime APIs. The `with`
    # statement shuts down the managed client automatically when the example finishes.
    with drag.LlamaCppServerLLMClient() as llm_client:
        llm = drag.DirectLLMCall(llm_client=llm_client, tracer=tracer)
        prompt = (
            "Write one sentence describing the one primary engineering specification for a "
            "field-repairable wearable sensor enclosure."
        )
        compiled: drag.CompiledExecution = llm.compile(
            prompt=prompt,
            request_id=request_id,
        )
        result = compiled.run()

    # Print compile metadata alongside the execution summary.
    payload = {
        "compiled_delegate_name": compiled.delegate_name,
        "compiled_request_id": compiled.request_id,
        **result.summary(),
    }
    print(json.dumps(payload, ensure_ascii=True, indent=2, sort_keys=True))


if __name__ == "__main__":
    main()

Expected Results#

Run Command

PYTHONPATH=src python3 examples/agents/direct_llm_compiled_execution.py

Example output shape (values vary by run):

{
  "compiled_delegate_name": "DirectLLMCall",
  "compiled_request_id": "<request-id>",
  "success": true,
  "final_output": "<example-specific payload>",
  "terminated_reason": "<string-or-null>",
  "error": null,
  "trace": {
    "request_id": "<request-id>",
    "trace_dir": "artifacts/examples/traces",
    "trace_path": "artifacts/examples/traces/run_<timestamp>_<request_id>.jsonl"
  }
}
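Each run also writes a JSONL trace under `artifacts/examples/traces`. A quick way to sanity-check one is to count records per line; the sketch below assumes only what the JSONL format guarantees (one JSON object per line) plus a hypothetical `event` key, since the Tracer's exact schema is not specified here:

```python
import json
from pathlib import Path


def summarize_trace(trace_path: Path) -> dict:
    """Count JSONL records grouped by their "event" field.

    The "event" key is an assumption for illustration; real Tracer
    records may use different field names.
    """
    counts: dict = {}
    with trace_path.open() as fh:
        for line in fh:
            line = line.strip()
            if not line:
                continue  # tolerate blank lines
            record = json.loads(line)
            key = record.get("event", "<unknown>")
            counts[key] = counts.get(key, 0) + 1
    return counts


# Demo with a synthetic trace file so the sketch runs standalone.
demo = Path("demo_trace.jsonl")
demo.write_text('{"event": "start"}\n{"event": "llm_call"}\n{"event": "start"}\n')
print(summarize_trace(demo))
demo.unlink()
```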
