Direct LLM With Pinned Skills#

Source: examples/agents/direct_llm_with_pinned_skills.py

Introduction#

This example shows how to preload a trusted project-local skill into a one-shot direct model call. Pinned skills are useful when you want deterministic, constructor-scoped behavior without relying on automatic skill activation.

Technical Implementation#

  1. Build a SkillsConfig that points at the current project root and pins one trusted local skill name.

  2. Construct DirectLLMCall through the public top-level API and pass the skills config at construction time.

  3. Execute one direct request so the pinned skill is injected as system-context before the user prompt.

  4. Print the normalized summary payload for inspection.

from __future__ import annotations

import json
from pathlib import Path

import design_research_agents as drag


def _ensure_example_skill(project_root: Path) -> None:
    skill_dir = project_root / ".agents" / "skills" / "design_brief"
    skill_dir.mkdir(parents=True, exist_ok=True)
    skill_dir.joinpath("SKILL.md").write_text(
        "\n".join(
            [
                "---",
                "name: design_brief",
                "description: Summarize design requirements with concise language.",
                "---",
                "Focus on repairability, clarity, and actionable constraints.",
                "",
            ]
        ),
        encoding="utf-8",
    )


def main() -> None:
    """Run one direct call with a pinned project-local skill."""
    request_id = "example-direct-llm-pinned-skills-001"
    project_root = Path("artifacts/examples/direct_llm_with_pinned_skills_project")
    _ensure_example_skill(project_root)
    skills = drag.SkillsConfig(
        project_root=project_root,
        pinned_skills=("design_brief",),
    )

    with drag.OpenAICompatibleHTTPLLMClient(
        base_url="http://127.0.0.1:8001/v1",
        default_model="qwen2-1.5b-q4",
    ) as llm_client:
        agent = drag.DirectLLMCall(
            llm_client=llm_client,
            system_prompt="You are a careful design research assistant.",
            skills=skills,
        )
        result = agent.run(
            prompt="Summarize the repairability requirements for a wearable device enclosure.",
            request_id=request_id,
        )

    print(json.dumps(result.summary(), ensure_ascii=True, indent=2, sort_keys=True))


if __name__ == "__main__":
    main()
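The SKILL.md file written by _ensure_example_skill uses a minimal YAML-style frontmatter block between two --- markers. A loader can recover the name and description fields with the standard library alone; the helper below is an illustrative sketch, not the library's actual loader:

```python
from pathlib import Path


def parse_skill_frontmatter(path: Path) -> dict[str, str]:
    """Collect ``key: value`` pairs between the opening and closing ``---`` markers."""
    fields: dict[str, str] = {}
    in_frontmatter = False
    for line in path.read_text(encoding="utf-8").splitlines():
        if line.strip() == "---":
            if in_frontmatter:
                break  # closing marker: frontmatter is done, ignore the skill body
            in_frontmatter = True
            continue
        if in_frontmatter and ":" in line:
            key, _, value = line.partition(":")
            fields[key.strip()] = value.strip()
    return fields
```

Against the file generated above, this yields {"name": "design_brief", "description": "Summarize design requirements with concise language."}.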

Expected Results#

Run Command

PYTHONPATH=src python3 examples/agents/direct_llm_with_pinned_skills.py

Example output shape (values vary by run):

{
  "success": true,
  "final_output": "<example-specific payload>",
  "terminated_reason": "<string-or-null>",
  "error": null
}
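Step 3 of the implementation injects the pinned skill as system context before the user prompt. In OpenAI-compatible chat terms this amounts to prepending an extra system message; the message-building helper below is a hypothetical sketch of that ordering, not part of the library's API:

```python
def build_messages(system_prompt: str, skill_text: str, user_prompt: str) -> list[dict[str, str]]:
    # The pinned skill rides alongside the base system prompt, ahead of the
    # user turn, so the model sees it on every request deterministically.
    return [
        {"role": "system", "content": system_prompt},
        {"role": "system", "content": f"Skill: design_brief\n{skill_text}"},
        {"role": "user", "content": user_prompt},
    ]
```

Because the skill is pinned at construction time rather than discovered at runtime, the request payload is the same on every call, which is what makes the behavior reproducible.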
