LLMUtils
A Python utility library for managing LLM prompts with template variables and JSON schemas.
Installation
# Install from GitHub
uv add git+https://git.project-insanity.de/gmarth/LLMUtils.git
# Or with pip
pip install git+https://git.project-insanity.de/gmarth/LLMUtils.git
Features
- Smart Prompt Management: Load and manage prompt templates with variable substitution
- On-demand Loading: Prompts are loaded lazily at runtime, so startup stays fast and only the files you actually use are read
- Caching Support: Optional caching to avoid repeated disk reads
- JSON Schema Support: Associate structured output schemas with prompts
- Variable Validation: Automatic validation of required template variables
- Flexible API: Fill variables at retrieval time or later with fill()
Quick Start
Basic Usage
from llmutils.prompt_manager import PromptManager
# Get a prompt template
result = PromptManager.get_prompt('greeting')
print(result.variables) # See required variables: {'name', 'age'}
print(result.template) # View the template: "Hello {{name}}, you are {{age}} years old"
# Fill the template
filled = result.fill(name='Alice', age=30)
print(filled) # "Hello Alice, you are 30 years old"
Pre-filling Variables
# Fill variables during retrieval
result = PromptManager.get_prompt('greeting', name='Alice', age=30)
print(result.prompt) # Already filled: "Hello Alice, you are 30 years old"
Validation
result = PromptManager.get_prompt('greeting')
# Check if variables are valid
if not result.validate(name='Alice'):
    missing = result.get_missing_variables(name='Alice')
    print(f"Missing variables: {missing}")  # {'age'}
# Fill with all required variables
filled = result.fill(name='Alice', age=30)
JSON Schema Support
# Get prompt with associated schema
result = PromptManager.get_prompt('task_prompt')
if result.schema:
    print("This prompt has a structured output schema")
    print(result.schema)  # The JSON schema dictionary
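The schema can then be passed to any client that supports JSON-schema structured output. Below is a minimal sketch assuming the OpenAI Python SDK; the client, model name, and response_format envelope are OpenAI specifics, not part of LLMUtils:
from openai import OpenAI
from llmutils.prompt_manager import PromptManager

result = PromptManager.get_prompt('task_prompt')
client = OpenAI()  # Reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model='gpt-4o-mini',
    messages=[{'role': 'user', 'content': result.prompt}],
    # OpenAI expects the schema wrapped in its json_schema envelope
    response_format={
        'type': 'json_schema',
        'json_schema': {'name': result.name, 'schema': result.schema},
    },
)
print(response.choices[0].message.content)  # JSON conforming to the schema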
Configuration
from pathlib import Path
from llmutils.prompt_manager import PromptManager
# Configure custom prompts directory (default: ./prompts)
PromptManager.configure(path=Path('/custom/prompts/location'))
# Disable caching for development
PromptManager.configure(caching=False)
# Clear cache to force reload
PromptManager.reload_prompts()
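A hypothetical development session combining these calls (the path is a placeholder):
from pathlib import Path
from llmutils.prompt_manager import PromptManager

# While iterating on templates, disable caching so edits to the
# prompt files are picked up on every get_prompt() call.
PromptManager.configure(path=Path('./prompts'), caching=False)
draft = PromptManager.get_prompt('greeting')

# Once the templates are stable, re-enable caching and drop stale entries.
PromptManager.configure(caching=True)
PromptManager.reload_prompts()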
Prompt Files
Place your prompt templates in the prompts/ directory:
- prompts/greeting.md - Markdown file with the template
- prompts/greeting.json - Optional JSON schema for structured output
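A typical layout, using the files from this README (task_prompt is the prompt from the schema example above):
prompts/
├── greeting.md        # template
├── greeting.json      # optional schema for greeting
├── task_prompt.md
└── task_prompt.json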
Example prompt template (greeting.md):
Hello {{name}},
You are {{age}} years old.
Example schema (greeting.json):
{
  "type": "object",
  "properties": {
    "response": {
      "type": "string"
    }
  }
}
API Reference
PromptResult Class
The PromptResult dataclass returned by get_prompt():
- template: str - The original template string
- name: str - The prompt name
- variables: Set[str] - Required template variables
- schema: Optional[Dict] - Associated JSON schema
- prompt: str - Property that returns the filled prompt or the template
- fill(**kwargs) -> str - Fill the template with variables
- validate(**kwargs) -> bool - Check if all variables are provided
- get_missing_variables(**kwargs) -> Set[str] - Get missing variables
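For orientation, these fields and methods correspond roughly to a dataclass like the one below. This is a minimal sketch reconstructed from the reference above, not the library's actual implementation:
from dataclasses import dataclass, field
from typing import Dict, Optional, Set

@dataclass
class PromptResult:
    # Hypothetical reconstruction of the documented interface.
    template: str
    name: str
    variables: Set[str] = field(default_factory=set)
    schema: Optional[Dict] = None

    @property
    def prompt(self) -> str:
        # The real property returns the filled prompt when variables were
        # supplied at retrieval; this sketch just returns the raw template.
        return self.template

    def validate(self, **kwargs) -> bool:
        # True when every required variable is among the supplied kwargs.
        return self.variables <= set(kwargs)

    def get_missing_variables(self, **kwargs) -> Set[str]:
        return self.variables - set(kwargs)

    def fill(self, **kwargs) -> str:
        # Substitute each {{var}} placeholder with the supplied value.
        text = self.template
        for key, value in kwargs.items():
            text = text.replace('{{' + key + '}}', str(value))
        return text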
PromptManager Methods
- get_prompt(prompt_name, **kwargs) -> PromptResult - Get a prompt template
- get_schema(prompt_name) -> Optional[Dict] - Get just the schema
- has_schema(prompt_name) -> bool - Check if a prompt has a schema
- list_prompts() -> Dict - List all available prompts
- get_prompt_info(prompt_name) -> Dict - Get detailed prompt information
- configure(path=None, caching=None) - Configure settings
- reload_prompts() - Clear the cache
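A short sketch exercising these methods, assuming the greeting prompt from earlier exists in the prompts directory:
from llmutils.prompt_manager import PromptManager

print(PromptManager.list_prompts())               # All available prompts
print(PromptManager.has_schema('greeting'))       # True if greeting.json exists
print(PromptManager.get_schema('greeting'))       # The schema dict, or None
print(PromptManager.get_prompt_info('greeting'))  # Detailed prompt information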
License
MIT