
# LLMUtils

A Python utility library for managing LLM prompts with template variables and JSON schemas.

## Installation

```bash
# Install from GitHub
uv add git+https://git.project-insanity.de/gmarth/LLMUtils.git

# Or with pip
pip install git+https://git.project-insanity.de/gmarth/LLMUtils.git
```

## Features

- **Smart Prompt Management**: Load and manage prompt templates with variable substitution
- **On-demand Loading**: Prompts are loaded lazily at runtime, keeping startup fast
- **Caching Support**: Optional caching to avoid repeated disk reads
- **JSON Schema Support**: Associate structured output schemas with prompts
- **Variable Validation**: Automatic validation of required template variables
- **Flexible API**: Fill variables at retrieval time or later, on demand
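The `{{variable}}` substitution style used throughout this README can be sketched with a few lines of standard-library code. The helper names below are illustrative, not part of the LLMUtils API:

```python
import re

# Matches {{name}}-style placeholders and captures the variable name
VAR_PATTERN = re.compile(r"\{\{(\w+)\}\}")

def extract_variables(template: str) -> set:
    """Collect the placeholder names a template requires."""
    return set(VAR_PATTERN.findall(template))

def render(template: str, **kwargs) -> str:
    """Substitute every placeholder; unknown names are left untouched."""
    return VAR_PATTERN.sub(
        lambda m: str(kwargs.get(m.group(1), m.group(0))), template
    )

template = "Hello {{name}}, you are {{age}} years old"
print(extract_variables(template))             # {'name', 'age'} (set order varies)
print(render(template, name="Alice", age=30))  # Hello Alice, you are 30 years old
```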

## Quick Start

### Basic Usage

```python
from llmutils.prompt_manager import PromptManager

# Get a prompt template
result = PromptManager.get_prompt('greeting')
print(result.variables)  # Required variables: {'name', 'age'}
print(result.template)   # The template: "Hello {{name}}, you are {{age}} years old"

# Fill the template
filled = result.fill(name='Alice', age=30)
print(filled)  # "Hello Alice, you are 30 years old"
```

### Pre-filling Variables

```python
# Fill variables during retrieval
result = PromptManager.get_prompt('greeting', name='Alice', age=30)
print(result.prompt)  # Already filled: "Hello Alice, you are 30 years old"
```

### Validation

```python
result = PromptManager.get_prompt('greeting')

# Check whether the provided variables are sufficient
if not result.validate(name='Alice'):
    missing = result.get_missing_variables(name='Alice')
    print(f"Missing variables: {missing}")  # {'age'}

# Fill with all required variables
filled = result.fill(name='Alice', age=30)
```

## JSON Schema Support

```python
# Get a prompt with its associated schema
result = PromptManager.get_prompt('task_prompt')

if result.schema:
    print("This prompt has a structured output schema")
    print(result.schema)  # The JSON schema dictionary
```

## Configuration

```python
from pathlib import Path
from llmutils.prompt_manager import PromptManager

# Configure a custom prompts directory (default: ./prompts)
PromptManager.configure(path=Path('/custom/prompts/location'))

# Disable caching for development
PromptManager.configure(caching=False)

# Clear the cache to force a reload
PromptManager.reload_prompts()
```

## Prompt Files

Place your prompt templates in the `prompts/` directory:

- `prompts/greeting.md` - Markdown file containing the template
- `prompts/greeting.json` - Optional JSON schema for structured output

Example prompt template (`greeting.md`):

```markdown
Hello {{name}},

You are {{age}} years old.
```

Example schema (`greeting.json`):

```json
{
  "type": "object",
  "properties": {
    "response": {
      "type": "string"
    }
  }
}
```
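The pairing convention above (same basename, `.md` for the template, `.json` for the optional schema) could be implemented along these lines. `load_prompt_files` is a hypothetical helper for illustration, not part of the LLMUtils API:

```python
import json
import tempfile
from pathlib import Path

def load_prompt_files(prompts_dir: Path, name: str):
    """Read <name>.md as the template and <name>.json as its optional schema."""
    template = (prompts_dir / f"{name}.md").read_text()
    schema_path = prompts_dir / f"{name}.json"
    schema = json.loads(schema_path.read_text()) if schema_path.exists() else None
    return template, schema

# Demo with a throwaway prompts directory
with tempfile.TemporaryDirectory() as tmp:
    prompts = Path(tmp)
    (prompts / "greeting.md").write_text("Hello {{name}}!")
    (prompts / "greeting.json").write_text('{"type": "object"}')

    template, schema = load_prompt_files(prompts, "greeting")
    print(template)  # Hello {{name}}!
    print(schema)    # {'type': 'object'}
```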

## API Reference

### PromptResult Class

The `PromptResult` dataclass returned by `get_prompt()`:

- `template: str` - The original template string
- `name: str` - The prompt name
- `variables: Set[str]` - The required template variables
- `schema: Optional[Dict]` - The associated JSON schema, if any
- `prompt: str` - Property returning the filled prompt, or the raw template if unfilled
- `fill(**kwargs) -> str` - Fill the template with variables
- `validate(**kwargs) -> bool` - Check whether all required variables are provided
- `get_missing_variables(**kwargs) -> Set[str]` - Get the variables still missing
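To make the shape of this interface concrete, here is a self-contained sketch of how such a dataclass could behave. It mirrors the documented attributes and methods but is illustrative only, not the actual LLMUtils implementation:

```python
import re
from dataclasses import dataclass, field
from typing import Dict, Optional, Set

_VAR = re.compile(r"\{\{(\w+)\}\}")

@dataclass
class PromptResultSketch:
    template: str
    name: str
    schema: Optional[Dict] = None
    variables: Set[str] = field(init=False)

    def __post_init__(self):
        # Derive the required variables from the template's placeholders
        self.variables = set(_VAR.findall(self.template))

    def fill(self, **kwargs) -> str:
        missing = self.get_missing_variables(**kwargs)
        if missing:
            raise ValueError(f"Missing variables: {missing}")
        return _VAR.sub(lambda m: str(kwargs[m.group(1)]), self.template)

    def validate(self, **kwargs) -> bool:
        return not self.get_missing_variables(**kwargs)

    def get_missing_variables(self, **kwargs) -> Set[str]:
        return self.variables - set(kwargs)

r = PromptResultSketch("Hello {{name}}, you are {{age}} years old", "greeting")
print(r.validate(name="Alice"))              # False
print(r.get_missing_variables(name="Alice")) # {'age'}
print(r.fill(name="Alice", age=30))          # Hello Alice, you are 30 years old
```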

### PromptManager Methods

- `get_prompt(prompt_name, **kwargs) -> PromptResult` - Get a prompt template
- `get_schema(prompt_name) -> Optional[Dict]` - Get just the schema
- `has_schema(prompt_name) -> bool` - Check whether a prompt has a schema
- `list_prompts() -> Dict` - List all available prompts
- `get_prompt_info(prompt_name) -> Dict` - Get detailed prompt information
- `configure(path=None, caching=None)` - Configure settings
- `reload_prompts()` - Clear the cache

## License

MIT