# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

This is a multi-agent roleplay system implementing Stanford's "Generative Agents" memory architecture for believable AI characters with emergent behaviors. The agent system currently calls OpenAI's API directly but is transitioning to a custom LLM connector that supports any OpenAI-compatible API endpoint.

## Key Architecture Components

### Agent System (agents.py)

- **Memory Stream**: Stanford's memory architecture with observations, reflections, and plans
- **Smart Retrieval**: Combines recency (exponential decay), importance (1-10 scale), and relevance (cosine similarity)
- **Auto-Reflection**: Generates insights when the cumulative importance threshold (150) is reached
- **Core classes**: Character, CharacterAgent, MemoryStream, SceneManager
- Currently uses the OpenAI API directly; should be migrated to llm_connector

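The Smart Retrieval scoring above can be sketched as follows. This is a minimal illustration of the three-factor scheme, not the actual agents.py implementation; the function name, weights, and memory fields are assumptions:

```python
import math
import time

def retrieval_score(memory, query_embedding, now=None,
                    decay_rate=0.995, w_recency=1.0, w_importance=1.0, w_relevance=1.0):
    """Score a memory by recency, importance, and relevance (each roughly 0-1)."""
    now = now or time.time()
    # Recency: exponential decay per hour since the memory was last accessed
    hours = (now - memory['last_accessed']) / 3600
    recency = decay_rate ** hours
    # Importance: normalize the 1-10 LLM-assigned score
    importance = memory['importance'] / 10
    # Relevance: cosine similarity between query and memory embeddings
    a, b = query_embedding, memory['embedding']
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    relevance = dot / norm if norm else 0.0
    return w_recency * recency + w_importance * importance + w_relevance * relevance
```

In the Stanford scheme, memories are ranked by this combined score and the top results are included in the agent's prompt context.
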
### LLM Connector Package

- **Custom LLM abstraction** that supports any OpenAI-compatible API
- **Streaming support** with both reasoning and content chunks
- **Type definitions**: LLMBackend (base_url, api_token, model) and LLMMessage
- Environment variables: BACKEND_BASE_URL, BACKEND_API_TOKEN, BACKEND_MODEL

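From the bullets above, the type definitions are presumably along these lines (a sketch for orientation; check llm_connector for the actual definitions, and note that the backend_from_env helper is illustrative, not part of the package):

```python
import os
from typing import Literal, TypedDict

class LLMBackend(TypedDict):
    base_url: str   # OpenAI-compatible endpoint
    api_token: str
    model: str

class LLMMessage(TypedDict):
    role: Literal['system', 'user', 'assistant']
    content: str

def backend_from_env() -> LLMBackend:
    # Build a backend from the environment variables listed above
    return {
        'base_url': os.environ['BACKEND_BASE_URL'],
        'api_token': os.environ['BACKEND_API_TOKEN'],
        'model': os.environ['BACKEND_MODEL'],
    }
```
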
### UI Framework

- **NiceGUI** for the web interface (async components)
- **AsyncElement base class**: simplified async UI component pattern
  - Constructor accepts element_type (default: ui.column) plus element args/kwargs
  - Implement the build() method for async initialization logic
  - Use the create() factory method, which returns the NiceGUI element directly
  - Method chaining is supported on the returned element
- Pages live in the pages/ directory; the main page is MainPage

## Development Commands

```bash
# Install dependencies
uv sync

# Run the application (serves on http://localhost:8080)
uv run python main.py

# Add a new dependency
uv add <package-name>

# Pin the Python version
uv python pin 3.12
```

## Important Development Notes

### AsyncElement Usage

When creating UI components that extend AsyncElement:

```python
class MyComponent(AsyncElement):
    async def build(self, param1: str, param2: int, *args, **kwargs) -> None:
        # Build content directly in self.element
        with self.element:
            ui.label(f'{param1}: {param2}')
            # Add more UI elements...

# Usage - create() returns the NiceGUI element directly, supports method chaining
(await MyComponent.create(element_type=ui.card, param1="test", param2=123)).classes('w-full')

# A different element type can be specified
(await MyComponent.create(element_type=ui.row, param1="test", param2=456)).classes('gap-4')

# Pass element constructor args/kwargs via special keys
await MyComponent.create(
    element_type=ui.column,
    element_args=(),                    # Positional args for the element constructor
    element_kwargs={'classes': 'p-4'},  # Kwargs for the element constructor
    param1="test",                      # build() parameters
    param2=789,
)
```

Key points:

- Constructor accepts element_type (default: ui.column) and element args/kwargs
- build() receives component-specific parameters
- create() returns the NiceGUI element directly (not the AsyncElement instance)
- Method chaining is supported on the returned element
- Use the `with self.element:` context manager to add content in build()

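A base class matching these key points could look roughly like this. This is an assumed reconstruction for reference, not the project's actual components/ code; in particular, the real class defaults element_type to ui.column, which the sketch omits to stay self-contained:

```python
class AsyncElement:
    """Sketch of the async UI component pattern described above."""

    def __init__(self, element_type, element_args=(), element_kwargs=None):
        # Instantiate the underlying (NiceGUI) element up front
        self.element = element_type(*element_args, **(element_kwargs or {}))

    async def build(self, *args, **kwargs) -> None:
        # Subclasses add content here, typically inside `with self.element:`
        raise NotImplementedError

    @classmethod
    async def create(cls, element_type, element_args=(), element_kwargs=None, **build_kwargs):
        self = cls(element_type, element_args, element_kwargs)
        await self.build(**build_kwargs)
        # Return the element itself so callers can chain .classes(), .style(), etc.
        return self.element
```
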
### LLM Integration

The project has two LLM integration approaches:

1. **Legacy** (agents.py): direct OpenAI client usage
2. **Current** (llm_connector): a flexible backend supporting any OpenAI-compatible API

When implementing new features, use the llm_connector package:

```python
import os
from typing import List

from llm_connector import get_response, LLMBackend, LLMMessage

backend: LLMBackend = {
    'base_url': os.environ['BACKEND_BASE_URL'],
    'api_token': os.environ['BACKEND_API_TOKEN'],
    'model': os.environ['BACKEND_MODEL'],
}

messages: List[LLMMessage] = [
    {'role': 'system', 'content': 'You are...'},
    {'role': 'user', 'content': 'Hello'},
]

# Non-streaming
response = await get_response(backend, messages, stream=False)

# Streaming
async for chunk in await get_response(backend, messages, stream=True):
    if 'content' in chunk:
        ...  # Handle content
    if 'reasoning' in chunk:
        ...  # Handle reasoning (if supported)
```

### Project Structure

- `main.py`: Entry point, NiceGUI app configuration
- `agents.py`: Stanford memory architecture implementation (to be integrated)
- `llm_connector/`: Custom LLM integration package
- `components/`: Reusable UI components built on the AsyncElement base class
- `pages/`: UI pages (currently only MainPage)

### Environment Variables

Required in `.env`:

- `BACKEND_BASE_URL`: LLM API endpoint
- `BACKEND_API_TOKEN`: API authentication token
- `BACKEND_MODEL`: Model identifier
- `OPENAI_API_KEY`: Currently needed for agents.py (to be removed)

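For reference, a `.env` file might look like this (all values below are placeholders, not real endpoints, models, or credentials):

```shell
# .env - placeholder values only
BACKEND_BASE_URL=https://llm.example.invalid/v1
BACKEND_API_TOKEN=replace-with-your-token
BACKEND_MODEL=replace-with-model-id
# Only needed until agents.py is migrated to llm_connector
OPENAI_API_KEY=replace-with-openai-key
```
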
## Next Steps for Integration

The agents.py system needs to be:

1. Modified to use llm_connector instead of the direct OpenAI client
2. Integrated into the NiceGUI web interface
3. Extended with UI components for character interaction, memory viewing, and scene management
4. Wired up for real-time streaming of agent responses in the UI