
Commit dde8e06

refactor(ph-ai): move chat agent specifics to dedicated classes (#41697)
## Problem

This PR extracts the Chat Agent-related logic and classes so that the base architecture can be reused for other agents, e.g. the Research agent. Additionally, the mode manager is now responsible for tools and system prompts, while the mode definitions only define positive/negative examples and custom tools.

## Changes

- The old `Assistant` class is now `AgentManager`. You no longer need the `Assistant.create` factory; instead, create an `AgentManager` class for your specific agent. The only legacy assistant class still in the repo is `InsightsAssistant`, which is connected to the MCP.
- New `agent`, `chat_agent` and `research_agent` folders: the `agent` folder contains all base implementations, `chat_agent` contains the agent-specific implementation, and `research_agent` is mostly empty and will be filled in with the new agent in the next PRs.
- `agent` contains:
  - `agent_modes/` - Agent mode system
  - `executables.py` - Agent and tool executable implementations (formerly `nodes.py`)
  - `mode_manager.py` - Base mode manager class
  - `factory.py` - Mode definition factory
  - `toolkit.py` - Base toolkit class
  - `utils.py` - Utility functions
  - `feature_flags.py` - Feature flag checks
  - `prompts.py` - Agent-specific prompts
  - `presets/` - Mode preset configurations (product_analytics, sql, session_replay)
  - `compaction_manager.py` - State compaction management
  - `manager.py` - Base agent manager (formerly `assistant/base.py`)
  - `stream_processor.py` - Agent stream processor protocol
  - `agent_executor.py` - Agent executor to run the Temporal workflow and stream from Redis
  - `redis_stream.py` - Class to stream from Redis
- `chat_agent` contains:
  - `manager.py` - `ChatAgentManager` implementation (formerly `assistant/main_assistant.py`)
  - `mode_manager.py` - Chat-specific mode management; adds contextual tools to the default tools, and manages the mode registry and default tooling
  - `graph.py` - Chat agent graph definition (formerly `graph/graph.py`)
  - `loop_graph/` - Agent executor loop graph (formerly `graph/agent_executor/`)
    - `graph.py` - Loop graph implementation
    - `nodes.py` - Loop graph node implementations
  - `stream_processor.py` - Chat-specific stream processor (formerly `utils/stream_processor.py`)
  - `prompts.py` - Chat-specific prompts (cleaned up from `graph/agent_modes/prompts.py`)
- `conversation_summarizer/nodes.py` is now `conversation_summarizer/summarizer.py`, as it's not a node, and is moved to the `utils` folder.
- **MAIN LOGIC CHANGE**: the mode manager is now the source of truth for tools and system prompts. It injects two functions, one to get the system prompt and one to get the tools, into the executables, and it defines the default tools and the mode registry. Modes stay lean, only implementing positive/negative examples and custom tools. The `SwitchModeTool` is also initialized in the mode manager so that we can inject the mode registry and default tools.

## How did you test this code?

- Refactored tests + locally

## Changelog: No
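The mode-manager pattern described above can be sketched as a minimal runnable example. All names below are illustrative stand-ins, not the actual PostHog classes: the manager owns the default tools and system prompt and injects getter functions into an executable, while a mode contributes only its examples and custom tools.

```python
from dataclasses import dataclass, field
from typing import Callable


@dataclass
class ModeDefinition:
    """A lean mode: only examples and custom tools, no prompt/tool logic."""

    name: str
    positive_examples: list[str] = field(default_factory=list)
    negative_examples: list[str] = field(default_factory=list)
    custom_tools: list[str] = field(default_factory=list)


class ModeManager:
    """Source of truth for tools and system prompts."""

    default_tools = ["search", "create_insight"]

    def __init__(self, modes: dict[str, ModeDefinition], active: str):
        self.registry = modes
        self.active = active

    def get_tools(self) -> list[str]:
        # Default tools plus whatever the active mode adds.
        return self.default_tools + self.registry[self.active].custom_tools

    def get_system_prompt(self) -> str:
        mode = self.registry[self.active]
        return f"You are in {mode.name} mode. Examples: {mode.positive_examples}"


class AgentExecutable:
    """Receives the two injected getters instead of owning tool logic."""

    def __init__(self, get_tools: Callable[[], list[str]], get_prompt: Callable[[], str]):
        self._get_tools = get_tools
        self._get_prompt = get_prompt

    def run(self) -> tuple[str, list[str]]:
        return self._get_prompt(), self._get_tools()


sql_mode = ModeDefinition("sql", positive_examples=["write a HogQL query"], custom_tools=["run_sql"])
manager = ModeManager({"sql": sql_mode}, active="sql")
executable = AgentExecutable(manager.get_tools, manager.get_system_prompt)
prompt, tools = executable.run()
print(tools)  # ['search', 'create_insight', 'run_sql']
```

Because the executable only holds callables, switching modes (as `SwitchModeTool` does) only has to mutate the manager's `active` field; the executable picks up the new prompt and tools on its next run.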
1 parent 4676bc8 commit dde8e06

File tree: 87 files changed (+1472, -1151 lines)


ee/api/conversation.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -30,8 +30,8 @@
 )
 from posthog.utils import get_instance_region

-from ee.hogai.agent.executor import AgentExecutor
 from ee.hogai.api.serializers import ConversationSerializer
+from ee.hogai.core.executor import AgentExecutor
 from ee.hogai.utils.aio import async_to_sync
 from ee.hogai.utils.sse import AssistantSSESerializer
 from ee.hogai.utils.types.base import AssistantMode
```

ee/api/max_tools.py

Lines changed: 3 additions & 4 deletions

```diff
@@ -15,7 +15,7 @@
 from posthog.rate_limit import AIBurstRateThrottle, AISustainedRateThrottle
 from posthog.renderers import SafeJSONRenderer

-from ee.hogai.utils.types import AssistantMode, AssistantState
+from ee.hogai.utils.types import AssistantState
 from ee.models.assistant import Conversation


@@ -53,17 +53,16 @@ class MaxToolsViewSet(TeamAndOrgViewSetMixin, GenericViewSet):
         required_scopes=["insight:read", "query:read"],
     )
     def create_and_query_insight(self, request: Request, *args, **kwargs):
-        from ee.hogai.assistant import Assistant
+        from ee.hogai.insights_assistant import InsightsAssistant

         serializer = InsightsToolCallSerializer(data=request.data)
         serializer.is_valid(raise_exception=True)
         conversation = self.get_queryset().create(user=request.user, team=self.team, type=Conversation.Type.TOOL_CALL)
-        assistant = Assistant.create(
+        assistant = InsightsAssistant(
             self.team,
             conversation,
             user=cast(User, request.user),
             is_new_conversation=False,  # we don't care about the conversation id being sent back to the client
-            mode=AssistantMode.INSIGHTS_TOOL,
             initial_state=serializer.validated_data["state"],
         )

```
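The `max_tools.py` change above replaces a factory that took a mode enum with direct construction of a dedicated class. A hypothetical before/after sketch of that design choice (names are illustrative, not the real PostHog signatures):

```python
from enum import Enum


class AssistantMode(Enum):
    ASSISTANT = "assistant"
    INSIGHTS_TOOL = "insights_tool"


class BaseAgentManager:
    """Stand-in for the shared base agent manager."""

    def __init__(self, team: str, conversation: str):
        self.team = team
        self.conversation = conversation


class InsightsAssistant(BaseAgentManager):
    """The mode is implied by the class itself, not passed as an argument."""

    mode = AssistantMode.INSIGHTS_TOOL


# Before: assistant = Assistant.create(team, conversation, mode=AssistantMode.INSIGHTS_TOOL)
# After: each agent is its own class, so the call site picks a class, not a mode.
assistant = InsightsAssistant("team-1", "conv-1")
print(assistant.mode.value)  # insights_tool
```

Dropping the factory means each agent's behavior lives in one class instead of being dispatched from a shared `mode` flag, which is what lets the base architecture be reused for the Research agent.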
ee/api/test/test_conversation.py

Lines changed: 11 additions & 11 deletions

```diff
@@ -102,7 +102,7 @@ def test_create_conversation(self):
         conversation_id = str(uuid.uuid4())

         with patch(
-            "ee.hogai.agent.executor.AgentExecutor.astream",
+            "ee.hogai.core.executor.AgentExecutor.astream",
             return_value=_async_generator(),
         ) as mock_start_workflow_and_stream:
             with patch("ee.api.conversation.StreamingHttpResponse", side_effect=self._create_mock_streaming_response):
@@ -129,7 +129,7 @@ def test_create_conversation(self):

     def test_add_message_to_existing_conversation(self):
         with patch(
-            "ee.hogai.agent.executor.AgentExecutor.astream",
+            "ee.hogai.core.executor.AgentExecutor.astream",
             return_value=_async_generator(),
         ) as mock_start_workflow_and_stream:
             with patch("ee.api.conversation.StreamingHttpResponse", side_effect=self._create_mock_streaming_response):
@@ -183,7 +183,7 @@ def test_invalid_message_format(self):
     def test_rate_limit_burst(self):
         # Create multiple requests to trigger burst rate limit
         with patch(
-            "ee.hogai.agent.executor.AgentExecutor.astream",
+            "ee.hogai.core.executor.AgentExecutor.astream",
             return_value=_async_generator(),
         ):
             with patch("ee.api.conversation.StreamingHttpResponse", side_effect=self._create_mock_streaming_response):
@@ -215,7 +215,7 @@ def test_none_content_with_existing_conversation(self):
             user=self.user, team=self.team, status=Conversation.Status.IN_PROGRESS
         )
         with patch(
-            "ee.hogai.agent.executor.AgentExecutor.astream",
+            "ee.hogai.core.executor.AgentExecutor.astream",
             return_value=_async_generator(),
         ) as mock_stream_conversation:
             with patch("ee.api.conversation.StreamingHttpResponse", side_effect=self._create_mock_streaming_response):
@@ -251,7 +251,7 @@ def test_missing_trace_id(self):

     def test_nonexistent_conversation(self):
         with patch(
-            "ee.hogai.agent.executor.AgentExecutor.astream",
+            "ee.hogai.core.executor.AgentExecutor.astream",
             return_value=_async_generator(),
         ):
             with patch("ee.api.conversation.StreamingHttpResponse", side_effect=self._create_mock_streaming_response):
@@ -284,7 +284,7 @@ def test_unauthenticated_request(self):
         )
         self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)

-    @patch("ee.hogai.agent.executor.AgentExecutor.cancel_workflow")
+    @patch("ee.hogai.core.executor.AgentExecutor.cancel_workflow")
     def test_cancel_conversation(self, mock_cancel):
         conversation = Conversation.objects.create(
             user=self.user,
@@ -327,7 +327,7 @@ def test_cancel_other_teams_conversation(self):
         )
         self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND)

-    @patch("ee.hogai.agent.executor.AgentExecutor.cancel_workflow")
+    @patch("ee.hogai.core.executor.AgentExecutor.cancel_workflow")
     def test_cancel_conversation_with_async_cleanup(self, mock_cancel):
         """Test that cancel endpoint properly handles async cleanup."""
         conversation = Conversation.objects.create(
@@ -347,7 +347,7 @@ def test_cancel_conversation_with_async_cleanup(self, mock_cancel):

         self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)

-    @patch("ee.hogai.agent.executor.AgentExecutor.cancel_workflow")
+    @patch("ee.hogai.core.executor.AgentExecutor.cancel_workflow")
     def test_cancel_conversation_async_cleanup_failure(self, mock_cancel):
         """Test cancel endpoint behavior when async cleanup fails."""
         conversation = Conversation.objects.create(
@@ -415,7 +415,7 @@ def test_stream_from_in_progress_conversation(self):
             user=self.user, team=self.team, status=Conversation.Status.IN_PROGRESS
         )
         with patch(
-            "ee.hogai.agent.executor.AgentExecutor.astream",
+            "ee.hogai.core.executor.AgentExecutor.astream",
             return_value=_async_generator(),
         ) as mock_stream_conversation:
             with patch("ee.api.conversation.StreamingHttpResponse", side_effect=self._create_mock_streaming_response):
@@ -581,7 +581,7 @@ def test_billing_context_validation_valid_data(self):
         conversation = Conversation.objects.create(user=self.user, team=self.team)

         with patch(
-            "ee.hogai.agent.executor.AgentExecutor.astream",
+            "ee.hogai.core.executor.AgentExecutor.astream",
             return_value=_async_generator(),
         ) as mock_start_workflow_and_stream:
             with patch("ee.api.conversation.StreamingHttpResponse", side_effect=self._create_mock_streaming_response):
@@ -605,7 +605,7 @@ def test_billing_context_validation_invalid_data(self):
         conversation = Conversation.objects.create(user=self.user, team=self.team)

         with patch(
-            "ee.hogai.agent.executor.AgentExecutor.astream",
+            "ee.hogai.core.executor.AgentExecutor.astream",
             return_value=_async_generator(),
         ) as mock_start_workflow_and_stream:
             with patch("ee.api.conversation.StreamingHttpResponse", side_effect=self._create_mock_streaming_response):
```

ee/api/test/test_max_tools.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -3,7 +3,7 @@

 from mistralai_azure import AssistantMessage

-from ee.hogai.assistant.insights_assistant import InsightsAssistant
+from ee.hogai.insights_assistant import InsightsAssistant


 class TestMaxToolsAPI(APIBaseTest):
```

ee/hogai/api/serializers.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -7,9 +7,9 @@

 from posthog.exceptions_capture import capture_exception

+from ee.hogai.chat_agent.graph import AssistantGraph
 from ee.hogai.graph.deep_research.graph import DeepResearchAssistantGraph
 from ee.hogai.graph.deep_research.types import DeepResearchState
-from ee.hogai.graph.graph import AssistantGraph
 from ee.hogai.utils.helpers import should_output_assistant_message
 from ee.hogai.utils.types import AssistantState
 from ee.hogai.utils.types.composed import AssistantMaxGraphState
```

ee/hogai/api/test/test_serializers.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -4,7 +4,7 @@
 from posthog.schema import AssistantMessage, AssistantToolCallMessage, ContextMessage

 from ee.hogai.api.serializers import ConversationSerializer
-from ee.hogai.graph.graph import AssistantGraph
+from ee.hogai.chat_agent.graph import AssistantGraph
 from ee.hogai.utils.types import AssistantState
 from ee.models.assistant import Conversation

```
ee/hogai/assistant/__init__.py

Lines changed: 0 additions & 3 deletions
This file was deleted.

ee/hogai/assistant/assistant.py

Lines changed: 0 additions & 52 deletions
This file was deleted.
File renamed without changes.

ee/hogai/graph/graph.py renamed to ee/hogai/chat_agent/graph.py

Lines changed: 6 additions & 7 deletions

```diff
@@ -1,12 +1,8 @@
 from collections.abc import Callable

+from ee.hogai.chat_agent.loop_graph.graph import ChatAgentLoopGraph
 from ee.hogai.django_checkpoint.checkpointer import DjangoCheckpointer
-from ee.hogai.graph.agent_executor import AgentExecutorGraph
-from ee.hogai.graph.title_generator.nodes import TitleGeneratorNode
-from ee.hogai.graph.usage import UsageNode
-from ee.hogai.utils.types.base import AssistantGraphName, AssistantNodeName, AssistantState
-
-from .memory.nodes import (
+from ee.hogai.graph.memory.nodes import (
     MemoryCollectorNode,
     MemoryCollectorToolsNode,
     MemoryInitializerInterruptNode,
@@ -16,9 +12,12 @@
     MemoryOnboardingFinalizeNode,
     MemoryOnboardingNode,
 )
+from ee.hogai.graph.title_generator.nodes import TitleGeneratorNode
+from ee.hogai.graph.usage.nodes import UsageNode
+from ee.hogai.utils.types.base import AssistantGraphName, AssistantNodeName, AssistantState


-class AssistantGraph(AgentExecutorGraph):
+class AssistantGraph(ChatAgentLoopGraph):
     @property
     def graph_name(self) -> AssistantGraphName:
         return AssistantGraphName.ASSISTANT
```
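The rename above boils down to a base-class swap: `AssistantGraph` now extends the chat agent's loop graph instead of the generic `AgentExecutorGraph`, keeping its identity in a `graph_name` property. A toy sketch of that structure (stand-in implementations, not the real classes):

```python
from enum import Enum


class AssistantGraphName(Enum):
    ASSISTANT = "assistant"


class ChatAgentLoopGraph:
    """Stand-in for the chat agent's executor loop graph base class."""

    def compile(self) -> str:
        # The real class would compile a LangGraph-style state graph here.
        return "compiled-loop-graph"


class AssistantGraph(ChatAgentLoopGraph):
    @property
    def graph_name(self) -> AssistantGraphName:
        return AssistantGraphName.ASSISTANT


graph = AssistantGraph()
print(graph.graph_name.value)  # assistant
```

Subclasses inherit all loop behavior from the chat-agent base and only declare which named graph they are, which matches the PR's goal of keeping agent-specific code in `chat_agent` while the loop machinery stays reusable.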
