---
wave: 2
depends_on: ["04-GC-01"]
files_modified:
- src/memory/storage/sqlite_manager.py
- tests/test_personality_learning.py
autonomous: false
---
# Gap Closure Plan 2: Implement Missing Methods for Personality Learning Pipeline
**Objective:** Implement the two missing methods (`get_conversations_by_date_range` and `get_conversation_messages`) in SQLiteManager that are required by PersonalityLearner.learn_from_conversations().
**Gap Description:** PersonalityLearner.learn_from_conversations() on lines 84-101 of src/memory/__init__.py calls two methods that don't exist in SQLiteManager:
1. `get_conversations_by_date_range(start_date, end_date)` - called on line 85
2. `get_conversation_messages(conversation_id)` - called on line 99
Without these methods, the personality learning pipeline completely fails, preventing the "Personality layers learn from conversation patterns" requirement from being verified.
**Root Cause:** These helper methods were not implemented in SQLiteManager, though the infrastructure (get_conversation, get_recent_conversations) exists for building them.
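For context, the failing call pattern presumably looks something like the sketch below; only the two SQLiteManager calls are confirmed by the gap description, while the sqlite_manager attribute, the days parameter, and _extract_patterns are illustrative assumptions (the actual code on lines 84-101 of src/memory/__init__.py is authoritative).
```python
# Illustrative sketch of the failing call site in PersonalityLearner.learn_from_conversations().
# Only the two SQLiteManager calls are taken from the gap description; the sqlite_manager
# attribute, the days parameter, and _extract_patterns are assumptions.
from datetime import datetime, timedelta


class PersonalityLearnerSketch:
    def __init__(self, sqlite_manager):
        self.sqlite_manager = sqlite_manager

    def learn_from_conversations(self, days: int = 30):
        end_date = datetime.now()
        start_date = end_date - timedelta(days=days)
        # Missing method #1 (called around line 85 of src/memory/__init__.py)
        conversations = self.sqlite_manager.get_conversations_by_date_range(start_date, end_date)
        patterns = []
        for conversation in conversations:
            # Missing method #2 (called around line 99)
            messages = self.sqlite_manager.get_conversation_messages(conversation["id"])
            patterns.extend(self._extract_patterns(messages))
        return patterns

    def _extract_patterns(self, messages):
        # Hypothetical stand-in for the real pattern-extraction logic.
        return [m["content"] for m in messages if m.get("role") == "user"]
```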
## Tasks
### Task 1: Implement `get_conversations_by_date_range()` in SQLiteManager
PersonalityLearner.learn_from_conversations() needs to fetch all conversations within a date range so it can extract patterns from them. The method queries the conversations table, filtering on the created_at timestamp between the start and end dates.
**Steps:**
1. Open src/memory/storage/sqlite_manager.py
2. Locate a good insertion point in the SQLiteManager class (after the get_recent_conversations method, around line 350)
3. Copy the provided implementation from the Implementation Details section below
4. Add the method to the class with proper indentation
5. Save the file
python3 -c "from src.memory.storage.sqlite_manager import SQLiteManager; import inspect; assert 'get_conversations_by_date_range' in dir(SQLiteManager)"
- Method exists in SQLiteManager class
- Signature: `get_conversations_by_date_range(start_date: datetime, end_date: datetime) -> List[Dict[str, Any]]`
- Method queries conversations table with WHERE created_at BETWEEN start_date AND end_date
- Returns list of conversation dicts with id, title, created_at, metadata
- No syntax errors in the file
### Task 2: Implement `get_conversation_messages()` in SQLiteManager
PersonalityLearner.learn_from_conversations() needs every message in each conversation so it can extract patterns from message content and metadata. This is a straightforward lookup that retrieves all messages for a given conversation_id.
**Steps:**
1. Open src/memory/storage/sqlite_manager.py
2. Locate the method you just added (get_conversations_by_date_range)
3. Copy the get_conversation_messages implementation from the Implementation Details section below
4. Add it directly after get_conversations_by_date_range with proper indentation
5. Save the file
python3 -c "from src.memory.storage.sqlite_manager import SQLiteManager; import inspect; assert 'get_conversation_messages' in dir(SQLiteManager)"
- Method exists in SQLiteManager class
- Signature: `get_conversation_messages(conversation_id: str) -> List[Dict[str, Any]]`
- Method queries messages table with WHERE conversation_id = ?
- Returns list of message dicts with id, role, content, timestamp, metadata
- Messages are ordered by timestamp ascending
### Task 3: Verify the methods work with the PersonalityLearner pipeline
Ensure the new methods integrate properly with PersonalityLearner.learn_from_conversations() and do not cause errors in the pattern-extraction flow.
**Steps:**
1. Create a simple Python test script (see the sketch after this task's done criteria) that:
   - Imports MemoryManager and PersonalityLearner
   - Creates a test MemoryManager instance
   - Calls get_conversations_by_date_range with test dates
   - Calls get_conversation_messages for each returned conversation
   - Verifies both methods return proper data structures
2. Run the script and confirm no AttributeError occurs
python3 -c "from src.memory import MemoryManager, PersonalityLearner; from datetime import datetime, timedelta; mm = MemoryManager(); convs = mm.sqlite_manager.get_conversations_by_date_range(datetime.now() - timedelta(days=30), datetime.now()); print(f'Found {len(convs)} conversations')"
- Both methods can be called without AttributeError
- get_conversations_by_date_range returns list (empty or with conversations)
- get_conversation_messages returns list (empty or with messages)
- Data structures are properly formatted with expected fields
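A minimal smoke-test script for this task could look like the sketch below; it assumes MemoryManager exposes the SQLite layer as sqlite_manager (matching the verification command above), and everything else is illustrative.
```python
# smoke_test_personality_methods.py -- quick check that the two new methods exist and
# return well-formed data. Assumes MemoryManager exposes `sqlite_manager`, as in the
# verification command above.
from datetime import datetime, timedelta

from src.memory import MemoryManager, PersonalityLearner  # noqa: F401 -- import check per step 1

mm = MemoryManager()
start = datetime.now() - timedelta(days=30)
end = datetime.now()

conversations = mm.sqlite_manager.get_conversations_by_date_range(start, end)
assert isinstance(conversations, list), "expected a list of conversation dicts"
print(f"Found {len(conversations)} conversations")

for conversation in conversations:
    messages = mm.sqlite_manager.get_conversation_messages(conversation["id"])
    assert isinstance(messages, list), "expected a list of message dicts"
    for message in messages:
        # Spot-check the fields the learning pipeline relies on.
        for field in ("role", "content", "timestamp", "metadata"):
            assert field in message, f"message missing expected field: {field}"

print("Smoke test passed: no AttributeError, data structures look correct.")
```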
### Task 4: Create an integration test for the complete personality learning pipeline
Write a comprehensive test that exercises the entire personality learning flow, from conversation retrieval through pattern extraction to layer creation. This is the primary verification test for closing this gap.
**Steps:**
1. Create or update tests/test_personality_learning.py (a starting-point sketch follows this task's done criteria)
2. Add a test function that:
   - Initializes MemoryManager with a test database
   - Creates sample conversations with multiple messages
   - Calls PersonalityLearner.learn_from_conversations()
   - Verifies that patterns are extracted and layers are created
3. Run the test to verify the end-to-end pipeline works
4. Confirm all assertions pass
**Verification:**
```bash
python3 -m pytest tests/test_personality_learning.py -v
```
**Done when:**
- Integration test file exists (tests/test_personality_learning.py)
- Test creates sample data and calls personality learning pipeline
- Test verifies patterns are extracted from conversation messages
- Test verifies personality layers are created
- All assertions pass without errors
- End-to-end personality learning pipeline is functional
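The sketch below is one possible starting point for tests/test_personality_learning.py. The data-seeding helpers (create_conversation, add_message), the db_path constructor argument, the PersonalityLearner constructor, and the shape of the learning result are all assumptions that must be aligned with the real APIs before the test will pass.
```python
# tests/test_personality_learning.py -- end-to-end test of the personality learning pipeline.
# NOTE: the seeding helpers (create_conversation/add_message), the db_path argument, the
# PersonalityLearner constructor, and the result shape are assumptions; adapt to the real APIs.
from datetime import datetime, timedelta

import pytest

from src.memory import MemoryManager, PersonalityLearner


@pytest.fixture
def memory_manager(tmp_path):
    # Assumes MemoryManager can be pointed at an isolated test database.
    return MemoryManager(db_path=str(tmp_path / "test_memory.db"))


def test_personality_learning_pipeline(memory_manager):
    # Seed one sample conversation with a few messages (hypothetical seeding helpers).
    conv_id = memory_manager.sqlite_manager.create_conversation(title="Test chat")
    for role, content in [
        ("user", "I prefer concise answers."),
        ("assistant", "Understood, I will keep replies short."),
        ("user", "Also, I like examples in Python."),
    ]:
        memory_manager.sqlite_manager.add_message(conv_id, role=role, content=content)

    # The new retrieval methods should return the seeded data without AttributeError.
    start, end = datetime.now() - timedelta(days=1), datetime.now()
    conversations = memory_manager.sqlite_manager.get_conversations_by_date_range(start, end)
    assert conversations, "expected at least one conversation in the date range"
    assert len(memory_manager.sqlite_manager.get_conversation_messages(conv_id)) == 3

    # Run the full pipeline and check that it reports extracted patterns / created layers.
    learner = PersonalityLearner(memory_manager)
    result = learner.learn_from_conversations()
    assert result is not None
```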
## Implementation Details
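Both snippets assume SQLiteManager already provides a `_get_connection()` helper and a `self.logger` attribute, and that `json`, `datetime.datetime`, and the `typing` names `Any`, `Dict`, and `List` are imported at the top of sqlite_manager.py; confirm those imports exist before pasting the methods in.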
### Method 1: get_conversations_by_date_range
```python
def get_conversations_by_date_range(
    self, start_date: datetime, end_date: datetime
) -> List[Dict[str, Any]]:
    """
    Get all conversations created within a date range.

    Args:
        start_date: Start of date range
        end_date: End of date range

    Returns:
        List of conversation dictionaries with metadata
    """
    try:
        conn = self._get_connection()
        cursor = conn.cursor()
        query = """
            SELECT id, title, created_at, updated_at, metadata, session_id,
                   total_messages, total_tokens
            FROM conversations
            WHERE created_at BETWEEN ? AND ?
            ORDER BY created_at DESC
        """
        cursor.execute(query, (start_date.isoformat(), end_date.isoformat()))
        rows = cursor.fetchall()
        conversations = []
        for row in rows:
            conv_dict = {
                "id": row[0],
                "title": row[1],
                "created_at": row[2],
                "updated_at": row[3],
                "metadata": json.loads(row[4]) if row[4] else {},
                "session_id": row[5],
                "total_messages": row[6],
                "total_tokens": row[7],
            }
            conversations.append(conv_dict)
        return conversations
    except Exception as e:
        self.logger.error(f"Failed to get conversations by date range: {e}")
        return []
```
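Note that the date filter compares text: `start_date.isoformat()` and `end_date.isoformat()` are bound as strings, so the `BETWEEN` clause relies on ISO-8601 timestamps sorting lexicographically in chronological order. That holds only if the existing insert path stores `created_at` in the same ISO format, which is an assumption about the current schema worth confirming.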
### Method 2: get_conversation_messages
```python
def get_conversation_messages(self, conversation_id: str) -> List[Dict[str, Any]]:
    """
    Get all messages for a conversation.

    Args:
        conversation_id: ID of the conversation

    Returns:
        List of message dictionaries with content and metadata
    """
    try:
        conn = self._get_connection()
        cursor = conn.cursor()
        query = """
            SELECT id, conversation_id, role, content, timestamp,
                   token_count, importance_score, metadata, embedding_id
            FROM messages
            WHERE conversation_id = ?
            ORDER BY timestamp ASC
        """
        cursor.execute(query, (conversation_id,))
        rows = cursor.fetchall()
        messages = []
        for row in rows:
            msg_dict = {
                "id": row[0],
                "conversation_id": row[1],
                "role": row[2],
                "content": row[3],
                "timestamp": row[4],
                "token_count": row[5],
                "importance_score": row[6],
                "metadata": json.loads(row[7]) if row[7] else {},
                "embedding_id": row[8],
            }
            messages.append(msg_dict)
        return messages
    except Exception as e:
        self.logger.error(f"Failed to get conversation messages: {e}")
        return []
```
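The `ORDER BY timestamp ASC` clause makes the same assumption for messages: timestamps must be stored in a consistently sortable (ISO-8601 or numeric) format for ascending order to match chronological order.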
## Must-Haves for Verification
- [ ] get_conversations_by_date_range method exists in SQLiteManager
- [ ] Method accepts start_date and end_date as datetime parameters
- [ ] Method returns list of conversation dicts with required fields (id, title, created_at, metadata)
- [ ] get_conversation_messages method exists in SQLiteManager
- [ ] Method accepts conversation_id as string parameter
- [ ] Method returns list of message dicts with required fields (role, content, timestamp, metadata)
- [ ] PersonalityLearner.learn_from_conversations() can execute without AttributeError
- [ ] Pattern extraction pipeline completes successfully with sample data
- [ ] Integration test for complete personality learning pipeline exists and passes
- [ ] Personality layers are created from conversation patterns