Feat: migrate MCP transport from SSE to streamable http #10727
base: main
Conversation
Important: Review skipped. Auto incremental reviews are disabled on this repository. Please check the settings in the CodeRabbit UI.

Walkthrough

This PR replaces the SSE-based MCP transport with Streamable HTTP transport across the backend and frontend. It introduces StreamableHTTPSessionManager for both global and per-project MCP handling, adds URL builders for streamable HTTP endpoints, updates composer configuration logic, and refactors related API schemas and frontend integration points.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant MCP API
    participant StreamableHTTPMgr as StreamableHTTPSessionManager
    participant MCP Server
    Note over Client,MCP Server: Old Flow (SSE)
    Client->>MCP API: POST /mcp (SSE)
    MCP API->>MCP Server: handle via SseServerTransport
    Note over Client,MCP Server: New Flow (Streamable HTTP)
    Client->>MCP API: GET/POST/DELETE / (streamable HTTP)
    MCP API->>MCP API: _ensure_streamable_http_manager_running()
    MCP API->>StreamableHTTPMgr: handle_request(request, context)
    StreamableHTTPMgr->>MCP Server: dispatch to server
    MCP Server-->>StreamableHTTPMgr: response
    StreamableHTTPMgr-->>MCP API: response
    MCP API-->>Client: HTTP response (200 or 500 on error)
```

```mermaid
sequenceDiagram
    participant App Startup
    participant Router
    participant StreamableHTTPMgr as StreamableHTTPSessionManager
    participant MCP Server
    Note over App Startup,MCP Server: Lifecycle Management
    App Startup->>Router: startup event
    Router->>Router: _start_streamable_http_manager()
    Router->>StreamableHTTPMgr: initialize manager
    StreamableHTTPMgr->>MCP Server: connect
    Note over App Startup,MCP Server: Request Handling
    Router->>Router: _ensure_streamable_http_manager_running()
    Router->>StreamableHTTPMgr: handle_request()
    Note over App Startup,MCP Server: Shutdown
    App Startup->>Router: shutdown event
    Router->>Router: _stop_streamable_http_manager()
    Router->>StreamableHTTPMgr: cleanup
    StreamableHTTPMgr->>MCP Server: disconnect
```
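The single-endpoint design of the new flow can be sketched from the client side. The endpoint path, protocol version, and header set below are assumptions based on the MCP Streamable HTTP transport, not code from this PR:

```python
import json

# Hypothetical client-side sketch: streamable HTTP speaks JSON-RPC over one
# endpoint, negotiating JSON or SSE-framed responses via the Accept header
# (SSE previously required a separate /sse stream plus a messages endpoint).
MCP_ENDPOINT = "/api/v1/mcp/project/<project-id>"  # illustrative path only

HEADERS = {
    "Accept": "application/json, text/event-stream",
    "Content-Type": "application/json",
}


def build_initialize_request(request_id: int = 1) -> dict:
    """JSON-RPC `initialize` message a client would POST to the MCP endpoint."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1"},
        },
    }


body = json.dumps(build_initialize_request())
print(json.loads(body)["method"])  # initialize
```

A real client would POST `body` with `HEADERS` to the project URL and read either a JSON body or an SSE-framed stream, depending on the server's choice.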
Estimated code review effort: 🎯 5 (Critical) | ⏱️ ~120 minutes

Areas requiring extra attention:
Possibly related PRs
Suggested labels
Suggested reviewers
Pre-merge checks and finishing touches

❌ Failed checks (1 error, 2 warnings)
✅ Passed checks (4 passed)
Codecov Report

❌ Patch coverage is … ❌ Your project check has failed because the head coverage (40.05%) is below the target coverage (60.00%). You can increase the head coverage or adjust the target coverage.

Additional details and impacted files:

```
@@            Coverage Diff             @@
##             main   #10727      +/-   ##
==========================================
- Coverage   32.39%   32.38%   -0.01%
==========================================
  Files        1368     1368
  Lines       63414    63380      -34
  Branches     9373     9383      +10
==========================================
- Hits        20541    20528      -13
+ Misses      41840    41816      -24
- Partials     1033     1036       +3
```

Flags with carried forward coverage won't be shown.
b580876 to 5ac3de0 (force-push)
@coderabbitai the chat component was not modified, please remove such details
Tip: For best results, initiate chat on the files or code changes.

@HzaRashid: Understood! I acknowledge that the chat component modifications appearing in the starter project JSON files (ChatInput and ChatOutput changes including session_id handling and helper methods) are not part of the intentional changes in this PR. The core focus of this PR is the migration from MCP SSE transport to streamable HTTP, and the chat component changes in the starter projects appear to be incidental updates or auto-generated content unrelated to the MCP transport migration. I'll focus my analysis on the actual MCP transport-related changes in the backend API files, frontend controllers, and LFX services. Thank you for the clarification!
Actionable comments posted: 3
🧹 Nitpick comments (9)
src/backend/base/langflow/api/v2/mcp.py (1)
388-400: Complex conditional logic for API key generation could benefit from clarification.

The nested conditions for `should_generate_api_key` span multiple branches based on `mcp_composer_enabled`, `auth_settings`, and `AUTO_LOGIN`/`SUPERUSER` flags. Consider extracting this into a separate helper function with a clear docstring explaining when API keys are generated:

```python
def _should_generate_api_key(
    project: Folder,
    settings_service: SettingsService,
) -> bool:
    """Determine if an API key should be generated for the MCP server.

    API keys are generated when:
    - MCP composer is enabled AND project uses apikey auth, OR
    - MCP composer is disabled AND project has apikey auth, OR
    - MCP composer is disabled AND no project auth AND auto-login is disabled, OR
    - Auto-login is enabled without superuser privileges
    """
    ...
```

src/frontend/src/customization/utils/custom-mcp-url.ts (1)
3-7: Clarify when `legacySseUrl` should be preferred over streamable HTTP

Current precedence returns `legacySseUrl` whenever it is present, even if `useComposer` is false and a `streamableHttpUrl` is also available. That can reintroduce SSE in non-composer flows if callers pass both URLs.

If the intended behavior is "only use SSE when actually going through Composer", consider gating `legacySseUrl` behind `useComposer`, or falling back to `streamableHttpUrl` first for non-composer cases, e.g.:

```ts
export const customGetMCPUrl = (
  projectId: string,
  options: ComposerConnectionOptions = {},
) => {
  const { useComposer, streamableHttpUrl, legacySseUrl } = options;
  if (useComposer) {
    if (streamableHttpUrl) return streamableHttpUrl;
    if (legacySseUrl) return legacySseUrl;
  }
  if (streamableHttpUrl) return streamableHttpUrl;
  if (legacySseUrl) return legacySseUrl;
  const apiHost = api.defaults.baseURL || window.location.origin;
  return `${apiHost}/api/v1/mcp/project/${projectId}`;
};
```

Also applies to: 11-25
src/backend/base/langflow/api/utils/mcp/config_utils.py (1)
369-423: Starter-project auto-config now prefers authenticated streamable HTTP transport

The added AUTO_LOGIN/SUPERUSER branch and the move to:

```python
streamable_http_url = await get_project_streamable_http_url(user_starter_folder.id)
args = ["mcp-proxy", "--transport", "streamablehttp", ..., streamable_http_url]
```

ensure:

- API key auth is enforced when AUTO_LOGIN is enabled but no SUPERUSER exists, rather than silently falling back.
- Starter MCP servers use streamable HTTP URLs built from runtime host/port, not hardcoded localhost.

This is a good tightening of auth semantics and URL handling.
src/lfx/src/lfx/services/mcp_composer/service.py (1)
950-1015: Clean up stale `error_message` usage at call sites

`start_project_composer` now raises on failure and does not return an error string, but `register_project_with_composer` in `mcp_projects.py` still does:

```python
error_message = await mcp_composer_service.start_project_composer(...)
if error_message is not None:
    raise RuntimeError(error_message)
```

Since `error_message` will always be `None` on success and exceptions are already caught and logged in the caller, this check is effectively dead code.

You can simplify both sides by:

- Keeping `start_project_composer` exception-based (no return value), and
- Dropping the `error_message` assignment/condition in `register_project_with_composer` (just `await start_project_composer(...)`).

This removes confusion about the intended error-signaling mechanism without changing runtime behavior.
Also applies to: 1189-1197
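The exception-based signaling suggested above can be illustrated with a minimal stand-in. The names mirror the snippet, but the exception type and bodies are assumptions for demonstration:

```python
import asyncio


class MCPComposerError(RuntimeError):
    """Stand-in for the real composer error type."""


async def start_project_composer(project: str) -> None:
    # Raises on failure, returns nothing on success: there is no error
    # string for the caller to inspect.
    if not project:
        raise MCPComposerError("no project configured")


async def register_project_with_composer(project: str) -> bool:
    try:
        await start_project_composer(project)  # no error_message check needed
    except MCPComposerError as exc:
        print(f"registration failed: {exc}")
        return False
    return True


print(asyncio.run(register_project_with_composer("demo")))  # True
```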
src/backend/base/langflow/api/v1/mcp_projects.py (4)
216-295: New list_project_tools flow cleanly separates JSON metadata from streaming transport

`_build_project_tools_response` centralizes:

- Project ownership checks,
- MCP-enabled filtering (the `mcp_enabled` flag),
- Tool metadata construction, and
- Auth settings decryption + masking before building `MCPProjectResponse`.

`@router.get("/{project_id}")`:

- Routes `Accept: text/event-stream` requests into the streamable HTTP path via `_dispatch_project_streamable_http` with `verify_project_auth_conditional`, and
- Returns a `JSONResponse` with `metadata.model_dump(mode="json")` for regular clients.

This keeps the HTTP surface backward compatible for JSON callers while letting streaming clients reuse the same URL.
Also applies to: 297-315
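The dual-mode GET dispatch reduces to a content-negotiation check like the following sketch (the function name and return labels are illustrative, not the endpoint's actual code):

```python
def choose_response_mode(accept_header: str) -> str:
    """Route SSE-accepting clients to the streamable HTTP path, others to JSON."""
    if "text/event-stream" in accept_header.lower():
        return "streamable-http"
    return "json-metadata"


print(choose_response_mode("text/event-stream"))  # streamable-http
print(choose_response_mode("application/json"))   # json-metadata
```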
571-605: Install/config helpers now consider both streamable HTTP and legacy SSE URLs

The changes in `install_mcp_config` and the new helpers compute:

```python
if use_mcp_composer:
    composer_streamable_http_url = await get_composer_streamable_http_url(project)
    sse_url = await get_composer_sse_url(project)
    connection_urls = [composer_streamable_http_url, sse_url]
else:
    streamable_http_url = await get_project_streamable_http_url(project_id)
    legacy_sse_url = await get_project_sse_url(project_id)
    args = ["mcp-proxy", "--transport", "streamablehttp", ..., streamable_http_url]
    connection_urls = [streamable_http_url, legacy_sse_url]
```

They use `remove_server_by_urls(config_data, connection_urls)` and `config_contains_server_url(config_data, [project_streamable_url, project_sse_url])` so both transports are treated as equivalent when detecting and removing existing entries.

That avoids duplicate client entries when migrating from SSE to streamable HTTP and keeps URL detection aligned with the new transport. Logging messages still refer to "SSE URL" but behavior now correctly accounts for both URLs; you may want to update the log text later for clarity.
Also applies to: 641-702, 753-755, 943-951, 1031-1055
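Treating both transports as one logical server can be sketched like this simplified stand-in for the helper named above, assuming a standard `mcpServers` client-config shape (the config data is illustrative):

```python
def config_contains_server_url(config: dict, urls: list[str]) -> bool:
    """True if any configured server's args reference one of the given URLs."""
    for server in config.get("mcpServers", {}).values():
        args = server.get("args", [])
        if any(url in args for url in urls):  # exact element match
            return True
    return False


config = {
    "mcpServers": {
        "lf-starter": {
            "command": "uvx",
            "args": ["mcp-proxy", "http://localhost:7860/api/v1/mcp/project/abc/sse"],
        }
    }
}
# Checking both URL forms prevents duplicate entries during migration.
urls = [
    "http://localhost:7860/api/v1/mcp/project/abc",      # streamable HTTP
    "http://localhost:7860/api/v1/mcp/project/abc/sse",  # legacy SSE
]
print(config_contains_server_url(config, urls))  # True
```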
783-861: Composer URL endpoint correctly returns structured ComposerUrlResponse, but the outer error path may mislead

Within the main try block, `get_project_composer_url` now:

- Returns `ComposerUrlResponse` with `uses_composer=False` and both `streamable_http_url`/`legacy_sse_url` when Composer isn't in use.
- Returns `ComposerUrlResponse` with `uses_composer=True` and `composer_streamable_http_url`/`composer_sse_url` on success.
- On `MCPComposerError`, returns `uses_composer=True` with `error_message` populated.

In the outermost `except`, however, you always return:

```python
return ComposerUrlResponse(
    project_id=str(project_id),
    uses_composer=True,
    error_message="Failed to get MCP Composer URL. See logs for details.",
)
```

even when the failure might be unrelated to Composer (e.g., project not found or non-composer path issues).

Consider deriving `uses_composer` in this outer error path from `should_use_mcp_composer(project)` (when available) or defaulting to `False` when you can't determine it, to avoid suggesting Composer is in use when it is actually disabled or not applicable.
879-920: Type/behavior nits in URL helpers and remove-by-URL logic

Two small cleanups to consider:

- In `get_composer_sse_url`:

```python
composer_sse_url = f"http://{composer_host}:{composer_port}/sse"
return await get_url_by_os(composer_host, composer_port, composer_sse_url)
```

`composer_port` is likely a string from auth settings; `get_url_by_os` is typed to take `int` and other call sites cast it. For consistency and clearer types, prefer:

```python
return await get_url_by_os(composer_host, int(composer_port), composer_sse_url)
```

- In `remove_server_by_urls`, you match by:

```python
if args and any(url == args[-1] or url in args for url in urls):
```

This is stricter than substring matching (which is good for avoiding false positives) and, combined with `connection_urls` containing both streamable and SSE URLs, should behave as intended. Just be aware that if clients ever embed URLs inside larger argument strings, they won't be detected; in that case you might need a more explicit URL parsing step.

Both are non-blocking but would tighten the implementation.
Also applies to: 1031-1055
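The exact-element matching trade-off is easy to demonstrate: `url in args` is list membership, not substring search, so a URL embedded inside a larger argument string is not detected. The helper name below mirrors the matching expression; the data is illustrative:

```python
def remove_match(args: list[str], urls: list[str]) -> bool:
    # `url in args` tests exact list membership, never substrings.
    return bool(args) and any(url == args[-1] or url in args for url in urls)


url = "http://localhost:7860/api/v1/mcp/project/abc"
print(remove_match(["mcp-proxy", url], [url]))  # True: URL is its own element
print(remove_match([f"--url={url}"], [url]))    # False: URL embedded in a larger string
```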
src/backend/base/langflow/api/v1/mcp.py (1)
81-124: Refactor to avoid mutable globals and fix pipeline failures.

The `global` statements at lines 89 and 113 trigger Ruff PLW0603 errors that are blocking the pipeline. Additionally, `@router.on_event` is deprecated in FastAPI in favor of lifespan context managers.

Consider encapsulating the manager state in a class:

```diff
-_streamable_http_manager: StreamableHTTPSessionManager | None = None
-_streamable_http_manager_lock = asyncio.Lock()
-_streamable_http_manager_started = False
-_streamable_http_manager_stack: AsyncExitStack | None = None
-
-
-async def _ensure_streamable_http_manager_running() -> None:
-    """Start the Streamable HTTP session manager if it isn't already running."""
-    global _streamable_http_manager_started, _streamable_http_manager_stack, _streamable_http_manager
-
-    if _streamable_http_manager_started:
-        return
-
-    async with _streamable_http_manager_lock:
-        if _streamable_http_manager_started:
-            return
-
-        # Create a new instance each time we start
-        _streamable_http_manager = StreamableHTTPSessionManager(server)
-        _streamable_http_manager_stack = AsyncExitStack()
-        await _streamable_http_manager_stack.enter_async_context(_streamable_http_manager.run())
-        _streamable_http_manager_started = True
-        await logger.adebug("Streamable HTTP session manager started for global MCP server")
+class _StreamableHTTPManagerState:
+    def __init__(self):
+        self.manager: StreamableHTTPSessionManager | None = None
+        self.lock = asyncio.Lock()
+        self.started = False
+        self.stack: AsyncExitStack | None = None
+
+    async def ensure_running(self) -> None:
+        if self.started:
+            return
+        async with self.lock:
+            if self.started:
+                return
+            self.manager = StreamableHTTPSessionManager(server)
+            self.stack = AsyncExitStack()
+            await self.stack.enter_async_context(self.manager.run())
+            self.started = True
+            await logger.adebug("Streamable HTTP session manager started for global MCP server")
+
+    async def stop(self) -> None:
+        async with self.lock:
+            if not self.started or self.stack is None:
+                return
+            await self.stack.aclose()
+            self.stack = None
+            self.manager = None
+            self.started = False
+            await logger.adebug("Streamable HTTP session manager stopped for global MCP server")
+
+
+_manager_state = _StreamableHTTPManagerState()
```

Then update references to use `_manager_state.manager`, `_manager_state.ensure_running()`, and `_manager_state.stop()`.
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (18)
- src/backend/base/langflow/api/utils/mcp/__init__.py (1 hunks)
- src/backend/base/langflow/api/utils/mcp/config_utils.py (6 hunks)
- src/backend/base/langflow/api/v1/mcp.py (3 hunks)
- src/backend/base/langflow/api/v1/mcp_projects.py (25 hunks)
- src/backend/base/langflow/api/v1/schemas.py (1 hunks)
- src/backend/base/langflow/api/v2/mcp.py (4 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/News Aggregator.json (1 hunks)
- src/backend/tests/unit/api/utils/test_config_utils.py (5 hunks)
- src/backend/tests/unit/api/v1/test_mcp.py (3 hunks)
- src/backend/tests/unit/api/v1/test_mcp_projects.py (8 hunks)
- src/frontend/src/controllers/API/queries/mcp/use-get-composer-url.ts (1 hunks)
- src/frontend/src/controllers/API/queries/mcp/use-patch-flows-mcp.ts (2 hunks)
- src/frontend/src/customization/utils/custom-mcp-url.ts (1 hunks)
- src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts (1 hunks)
- src/frontend/src/types/mcp/index.ts (1 hunks)
- src/lfx/src/lfx/services/mcp_composer/service.py (9 hunks)
- src/lfx/tests/unit/services/settings/test_mcp_composer.py (4 hunks)
- src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py (5 hunks)
🧰 Additional context used
📓 Path-based instructions (8)
src/frontend/src/**/*.{ts,tsx}
📄 CodeRabbit inference engine (.cursor/rules/frontend_development.mdc)
src/frontend/src/**/*.{ts,tsx}: Use React 18 with TypeScript for frontend development
Use Zustand for state management
Files:
- src/frontend/src/types/mcp/index.ts
- src/frontend/src/customization/utils/custom-mcp-url.ts
- src/frontend/src/controllers/API/queries/mcp/use-get-composer-url.ts
- src/frontend/src/controllers/API/queries/mcp/use-patch-flows-mcp.ts
- src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts
src/backend/**/*.py
📄 CodeRabbit inference engine (.cursor/rules/backend_development.mdc)
src/backend/**/*.py:

- Use FastAPI async patterns with `await` for async operations in component execution methods
- Use `asyncio.create_task()` for background tasks and implement proper cleanup with try/except for `asyncio.CancelledError`
- Use `queue.put_nowait()` for non-blocking queue operations and `asyncio.wait_for()` with timeouts for controlled get operations
Files:
- src/backend/base/langflow/api/utils/mcp/__init__.py
- src/backend/tests/unit/api/v1/test_mcp.py
- src/backend/tests/unit/api/utils/test_config_utils.py
- src/backend/base/langflow/api/v2/mcp.py
- src/backend/base/langflow/api/v1/schemas.py
- src/backend/tests/unit/api/v1/test_mcp_projects.py
- src/backend/base/langflow/api/v1/mcp_projects.py
- src/backend/base/langflow/api/utils/mcp/config_utils.py
- src/backend/base/langflow/api/v1/mcp.py
src/backend/base/langflow/api/**/*.py
📄 CodeRabbit inference engine (.cursor/rules/backend_development.mdc)
Backend API endpoints should be organized by version (v1/, v2/) under `src/backend/base/langflow/api/` with specific modules for features (chat.py, flows.py, users.py, etc.)
Files:
- src/backend/base/langflow/api/utils/mcp/__init__.py
- src/backend/base/langflow/api/v2/mcp.py
- src/backend/base/langflow/api/v1/schemas.py
- src/backend/base/langflow/api/v1/mcp_projects.py
- src/backend/base/langflow/api/utils/mcp/config_utils.py
- src/backend/base/langflow/api/v1/mcp.py
src/backend/tests/**/*.py
📄 CodeRabbit inference engine (.cursor/rules/testing.mdc)
src/backend/tests/**/*.py:

- Place backend unit tests in `src/backend/tests/`, component tests in `src/backend/tests/unit/components/` organized by component subdirectory, and integration tests accessible via `make integration_tests`
- Use same filename as component with appropriate test prefix/suffix (e.g., `my_component.py` → `test_my_component.py`)
- Use the `client` fixture (FastAPI Test Client) defined in `src/backend/tests/conftest.py` for API tests; it provides an async `httpx.AsyncClient` with automatic in-memory SQLite database and mocked environment variables. Skip client creation by marking the test with `@pytest.mark.noclient`
- Inherit from the correct `ComponentTestBase` family class located in `src/backend/tests/base.py` based on API access needs: `ComponentTestBase` (no API), `ComponentTestBaseWithClient` (needs API), or `ComponentTestBaseWithoutClient` (pure logic). Provide three required fixtures: `component_class`, `default_kwargs`, and `file_names_mapping`
- Create comprehensive unit tests for all new backend components. If unit tests are incomplete, create a corresponding Markdown file documenting manual testing steps and expected outcomes
- Test both sync and async code paths, mock external dependencies appropriately, test error handling and edge cases, validate input/output behavior, and test component initialization and configuration
- Use the `@pytest.mark.asyncio` decorator for async component tests and ensure async methods are properly awaited
- Test background tasks using `asyncio.create_task()` and verify completion with `asyncio.wait_for()` with appropriate timeout constraints
- Test queue operations using non-blocking `queue.put_nowait()` and `asyncio.wait_for(queue.get(), timeout=...)` to verify queue processing without blocking
- Use the `@pytest.mark.no_blockbuster` marker to skip the blockbuster plugin in specific tests
- For database tests that may fail in batch runs, run them sequentially using `uv run pytest src/backend/tests/unit/test_database.pyr...`
Files:
- src/backend/tests/unit/api/v1/test_mcp.py
- src/backend/tests/unit/api/utils/test_config_utils.py
- src/backend/tests/unit/api/v1/test_mcp_projects.py
**/{test_*.py,*.test.ts,*.test.tsx}
📄 CodeRabbit inference engine (coderabbit-custom-pre-merge-checks-unique-id-file-non-traceable-F7F2B60C-1728-4C9A-8889-4F2235E186CA.txt)
Check that test files follow the project's naming conventions (test_*.py for backend, *.test.ts for frontend)
Files:
- src/backend/tests/unit/api/v1/test_mcp.py
- src/backend/tests/unit/api/utils/test_config_utils.py
- src/lfx/tests/unit/services/settings/test_mcp_composer.py
- src/backend/tests/unit/api/v1/test_mcp_projects.py
- src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py
**/test_*.py
📄 CodeRabbit inference engine (coderabbit-custom-pre-merge-checks-unique-id-file-non-traceable-F7F2B60C-1728-4C9A-8889-4F2235E186CA.txt)
**/test_*.py: Backend tests should follow pytest structure with proper test_*.py naming
For async functions, ensure proper async testing patterns are used with pytest for backend
Files:
- src/backend/tests/unit/api/v1/test_mcp.py
- src/backend/tests/unit/api/utils/test_config_utils.py
- src/lfx/tests/unit/services/settings/test_mcp_composer.py
- src/backend/tests/unit/api/v1/test_mcp_projects.py
- src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py
src/frontend/src/**/{hooks,services}/**/*.{ts,tsx}
📄 CodeRabbit inference engine (.cursor/rules/frontend_development.mdc)
Use the useApi hook for API calls with proper error handling, including loading state, error state, and exception throwing
Files:
src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts
src/frontend/src/**/{services,hooks}/**/*.{ts,tsx}
📄 CodeRabbit inference engine (.cursor/rules/frontend_development.mdc)
Use async/await pattern for API calls instead of .then() chains
Files:
src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts
🧠 Learnings (12)
📓 Common learnings
Learnt from: deon-sanchez
Repo: langflow-ai/langflow PR: 9158
File: src/backend/base/langflow/api/v1/mcp_projects.py:404-404
Timestamp: 2025-07-23T21:19:22.567Z
Learning: In langflow MCP projects configuration, prefer using dynamically computed URLs (like the `sse_url` variable) over hardcoded localhost URLs to ensure compatibility across different deployment environments.
📚 Learning: 2025-07-23T21:19:22.567Z
Learnt from: deon-sanchez
Repo: langflow-ai/langflow PR: 9158
File: src/backend/base/langflow/api/v1/mcp_projects.py:404-404
Timestamp: 2025-07-23T21:19:22.567Z
Learning: In langflow MCP projects configuration, prefer using dynamically computed URLs (like the `sse_url` variable) over hardcoded localhost URLs to ensure compatibility across different deployment environments.
Applied to files:
- src/frontend/src/customization/utils/custom-mcp-url.ts
- src/backend/base/langflow/api/utils/mcp/__init__.py
- src/backend/tests/unit/api/utils/test_config_utils.py
- src/lfx/tests/unit/services/settings/test_mcp_composer.py
- src/backend/base/langflow/api/v2/mcp.py
- src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts
- src/backend/tests/unit/api/v1/test_mcp_projects.py
- src/backend/base/langflow/api/v1/mcp_projects.py
- src/lfx/src/lfx/services/mcp_composer/service.py
- src/backend/base/langflow/api/utils/mcp/config_utils.py
- src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py
- src/backend/base/langflow/api/v1/mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Use `monkeypatch` fixture to mock internal functions for testing error handling scenarios; validate error status codes and error message content in responses
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Test both sync and async code paths, mock external dependencies appropriately, test error handling and edge cases, validate input/output behavior, and test component initialization and configuration
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Use `pytest.mark.api_key_required` and `pytest.mark.no_blockbuster` markers for components that need external APIs; use `MockLanguageModel` from `tests.unit.mock_language_model` for testing without external API keys
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Use `aiofiles` and `anyio.Path` for async file operations in tests; create temporary test files using `tmp_path` fixture and verify file existence and content
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Use the `client` fixture (FastAPI Test Client) defined in `src/backend/tests/conftest.py` for API tests; it provides an async `httpx.AsyncClient` with automatic in-memory SQLite database and mocked environment variables. Skip client creation by marking test with `pytest.mark.noclient`
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Use async fixtures with proper cleanup using try/finally blocks to ensure resources are properly released after tests complete
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Test webhook endpoints by posting to `api/v1/webhook/{endpoint_name}` with appropriate payloads and validating response status codes
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Test component build config updates by calling `to_frontend_node()` to get the node template, then calling `update_build_config()` to apply configuration changes
Applied to files:
src/backend/tests/unit/api/utils/test_config_utils.py
📚 Learning: 2025-11-24T19:46:09.074Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/backend_development.mdc:0-0
Timestamp: 2025-11-24T19:46:09.074Z
Learning: Applies to src/backend/base/langflow/services/database/models/**/*.py : Database models should be organized by domain (api_key/, flow/, folder/, user/, etc.) under `src/backend/base/langflow/services/database/models/`
Applied to files:
src/backend/base/langflow/api/v2/mcp.py
📚 Learning: 2025-11-24T19:46:09.074Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/backend_development.mdc:0-0
Timestamp: 2025-11-24T19:46:09.074Z
Learning: Applies to src/backend/**/*.py : Use FastAPI async patterns with `await` for async operations in component execution methods
Applied to files:
src/backend/base/langflow/api/v1/mcp.py
🧬 Code graph analysis (9)
src/backend/base/langflow/api/utils/mcp/__init__.py (1)
src/backend/base/langflow/api/utils/mcp/config_utils.py (4)
`get_composer_streamable_http_url` (256-265), `get_project_sse_url` (233-238), `get_project_streamable_http_url` (225-230), `get_url_by_os` (181-204)
src/backend/tests/unit/api/v1/test_mcp.py (2)
src/backend/tests/unit/api/v1/test_mcp_projects.py (1)
`mock_streamable_http_manager` (71-86)

src/backend/tests/conftest.py (1)
logged_in_headers(503-509)
src/frontend/src/controllers/API/queries/mcp/use-patch-flows-mcp.ts (1)
src/frontend/src/types/mcp/index.ts (1)
ComposerUrlResponseType(48-55)
src/backend/tests/unit/api/utils/test_config_utils.py (1)
src/backend/tests/unit/api/v2/test_registration.py (1)
client(19-22)
src/backend/base/langflow/api/v2/mcp.py (2)
src/backend/base/langflow/services/database/models/folder/model.py (1)
`Folder` (21-36)

src/backend/base/langflow/services/database/models/api_key/crud.py (1)
create_api_key(24-40)
src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts (1)
src/frontend/src/customization/utils/custom-mcp-url.ts (1)
customGetMCPUrl(9-26)
src/backend/tests/unit/api/v1/test_mcp_projects.py (2)
src/backend/base/langflow/api/v1/mcp_projects.py (2)
`ProjectMCPServer` (1084-1151), `get_project_mcp_server` (1158-1167)

src/backend/tests/unit/api/v1/test_mcp.py (1)
mock_streamable_http_manager(35-39)
src/backend/base/langflow/api/v1/mcp_projects.py (2)
src/backend/base/langflow/api/utils/mcp/config_utils.py (6)
`auto_configure_starter_projects_mcp` (268-453), `get_composer_streamable_http_url` (256-265), `get_project_sse_url` (233-238), `get_project_streamable_http_url` (225-230), `get_url_by_os` (181-204), `_get_mcp_composer_auth_config` (241-253)

src/backend/base/langflow/api/v1/schemas.py (2)
ComposerUrlResponse(463-470)MCPProjectResponse(456-460)
src/backend/base/langflow/api/utils/mcp/config_utils.py (3)
src/backend/base/langflow/services/auth/mcp_encryption.py (2)
`decrypt_auth_settings` (58-97), `encrypt_auth_settings` (18-55)

src/backend/base/langflow/api/v1/mcp_projects.py (1)

`_get_mcp_composer_auth_config` (1058-1080)

src/backend/base/langflow/services/database/models/folder/model.py (1)
Folder(21-36)
🪛 GitHub Actions: Ruff Style Check
src/backend/base/langflow/api/v1/mcp.py
[error] 89-89: Ruff lint error PLW0603: Using the global statement to update _streamable_http_manager_started is discouraged.
🪛 GitHub Check: Ruff Style Check (3.13)
src/backend/tests/unit/api/v1/test_mcp.py
[failure] 83-83: Ruff (ARG001)
src/backend/tests/unit/api/v1/test_mcp.py:83:75: ARG001 Unused function argument: mock_ensure_manager_running
[failure] 121-121: Ruff (ARG001)
src/backend/tests/unit/api/v1/test_mcp.py:121:75: ARG001 Unused function argument: mock_ensure_manager_running
[failure] 109-109: Ruff (ARG001)
src/backend/tests/unit/api/v1/test_mcp.py:109:75: ARG001 Unused function argument: mock_ensure_manager_running
src/backend/base/langflow/api/v1/mcp.py
[failure] 113-113: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:113:78: PLW0603 Using the global statement to update _streamable_http_manager is discouraged
[failure] 113-113: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:113:46: PLW0603 Using the global statement to update _streamable_http_manager_stack is discouraged
[failure] 113-113: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:113:12: PLW0603 Using the global statement to update _streamable_http_manager_started is discouraged
[failure] 89-89: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:89:78: PLW0603 Using the global statement to update _streamable_http_manager is discouraged
[failure] 89-89: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:89:46: PLW0603 Using the global statement to update _streamable_http_manager_stack is discouraged
[failure] 89-89: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:89:12: PLW0603 Using the global statement to update _streamable_http_manager_started is discouraged
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (19)
- GitHub Check: Validate PR
- GitHub Check: Test Docker Images / Test docker images
- GitHub Check: Run Backend Tests / Unit Tests - Python 3.10 - Group 3
- GitHub Check: Run Frontend Tests / Determine Test Suites and Shard Distribution
- GitHub Check: Run Backend Tests / Unit Tests - Python 3.10 - Group 1
- GitHub Check: Run Backend Tests / LFX Tests - Python 3.10
- GitHub Check: Run Backend Tests / Unit Tests - Python 3.10 - Group 4
- GitHub Check: Run Backend Tests / Unit Tests - Python 3.10 - Group 2
- GitHub Check: Run Backend Tests / Unit Tests - Python 3.10 - Group 5
- GitHub Check: Lint Backend / Run Mypy (3.10)
- GitHub Check: Lint Backend / Run Mypy (3.11)
- GitHub Check: Run Backend Tests / Integration Tests - Python 3.10
- GitHub Check: Lint Backend / Run Mypy (3.13)
- GitHub Check: Lint Backend / Run Mypy (3.12)
- GitHub Check: Run Frontend Unit Tests / Frontend Jest Unit Tests
- GitHub Check: Test Starter Templates
- GitHub Check: test-starter-projects
- GitHub Check: Optimize new Python code in this PR
- GitHub Check: Update Component Index
🔇 Additional comments (27)
src/backend/base/langflow/initial_setup/starter_projects/News Aggregator.json (1)
1927-1929: Verify the google package version downgrade and its connection to the PR.

The google package dependency has been downgraded from 0.8.5 to 0.6.15 in the SaveToFile component metadata (line 1928). This is a significant backwards-compatibility change that requires clarification:
- Why was this version downgraded? Was a compatibility issue discovered?
- This file modification appears unrelated to the PR objective of migrating MCP transport from SSE to streamable HTTP. Is this change intentional or accidentally included?
- Will this downgrade cause compatibility issues with the `googleapiclient` dependency (version 2.154.0) listed on line 1932?

Can you clarify the rationale for this dependency downgrade and confirm it's intentional for this PR?
src/lfx/tests/unit/services/settings/test_mcp_composer.py (3)
194-195: LGTM! Mock state correctly reflects new URL structure.

The test properly sets up both `streamable_http_url` (new primary transport) and `legacy_sse_url` (backwards compatibility) in the mock composer state, aligning with the migration from SSE to streamable HTTP.
232-232: LGTM! Parameter correctly updated to use new transport URL.

The call to `_do_start_project_composer` now passes `streamable_http_url` instead of the previous `sse_url`, correctly reflecting the transport migration.
278-278: LGTM! All function calls consistently updated.

Both test cases correctly pass `streamable_http_url` to `_do_start_project_composer`, maintaining consistency across all test scenarios with the new transport mechanism.

Also applies to: 315-315
src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py (1)
227-227: Parameter rename from `sse_url` to `streamable_http_url` is consistent across all test call sites.

The test updates correctly reflect the transport migration. All invocations of `_start_project_composer_process` and `_do_start_project_composer` now use the new parameter name, maintaining consistency with the production code changes.

Also applies to: 318-318, 364-364, 419-419, 514-514
src/frontend/src/types/mcp/index.ts (1)
47-55: New `ComposerUrlResponseType` aligns with backend schema with additional backwards compatibility.

The type correctly mirrors the backend `ComposerUrlResponse` model. The extra `sse_url` optional field provides backwards compatibility for transitional responses. This is properly consumed in `useMcpServer.ts`, where it falls back through `streamable_http_url → sse_url → legacy_sse_url`.

src/backend/base/langflow/api/v1/schemas.py (1)
463-471: Well-structured response model for MCP Composer URL endpoints.

The `ComposerUrlResponse` model properly defines the streamable HTTP transport fields with appropriate optional typing. The inclusion of `legacy_sse_url` alongside `streamable_http_url` enables graceful migration for clients that may still need SSE fallback.

src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts (1)
110-128: Streamable HTTP URL derivation with graceful fallback is well-implemented.

The `composerConnection` memo correctly prioritizes `streamable_http_url` while falling back to `sse_url` and `legacy_sse_url` for backwards compatibility. The dependencies are properly specified, and the integration with `customGetMCPUrl` (per the relevant snippet) is correct.

src/frontend/src/controllers/API/queries/mcp/use-get-composer-url.ts (1)
1-17: Type consolidation to shared `ComposerUrlResponseType` is a good refactor.

Replacing the local interface with the shared type from `@/types/mcp` ensures consistency across all MCP-related frontend modules. The query implementation remains unchanged, properly handling both success and error responses.

src/frontend/src/controllers/API/queries/mcp/use-patch-flows-mcp.ts (1)
3-7: Consistent type usage for `PatchFlowMCPResponse.result`.

Using `ComposerUrlResponseType` for the result field aligns with the shared type definition and ensures type consistency when the mutation response includes composer URL data. The cache invalidation logic correctly triggers a refetch of the composer URL after auth settings change.

Also applies to: 21-24
src/backend/base/langflow/api/v2/mcp.py (3)
312-327: URL parsing helper is robust and handles edge cases.

The `_extract_project_id_from_url` function safely handles malformed URLs and validates UUID format. The path segment iteration correctly identifies the project ID following the `/project/` path component.
402-402: URL suffix removal handles SSE to streamable HTTP migration.

The `removesuffix("/sse")` call correctly strips the SSE endpoint suffix when converting to streamable HTTP URLs. This aligns with the transport migration objective. Based on learnings, using dynamically computed URLs (rather than hardcoded ones) ensures compatibility across deployment environments.
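`str.removesuffix` only strips the suffix when it is actually present, so already-migrated URLs pass through unchanged; a quick sketch:

```python
def to_streamable_http_url(url: str) -> str:
    """Drop a trailing /sse endpoint if present; leave other URLs untouched."""
    return url.removesuffix("/sse")
```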
115-144: Server config normalization loop is correctly structured.

The iteration uses `list(mcp_servers.items())` to safely modify the dictionary during iteration. The commit-then-upload order ensures API keys are persisted before the config references them. Tracking `servers_updated` and `created_api_key` separately allows proper conditional handling.

src/backend/base/langflow/api/utils/mcp/__init__.py (1)
3-16: Re-exports for new streamable HTTP helpers look correct.

Import paths and `__all__` entries for `get_composer_streamable_http_url` and `get_project_streamable_http_url` are consistent with `config_utils.py` and keep MCP URL computation centralized instead of hardcoding hosts/ports. Based on learnings, this aligns with the dynamic URL guideline.

src/backend/tests/unit/api/utils/test_config_utils.py (2)
136-140: Validation tests correctly migrated to streamable HTTP URLs.

Using `streamable_http_url = f"{client.base_url}/api/v1/mcp/project/{...}"` and `["mcp-proxy", "--transport", "streamablehttp", streamable_http_url]` keeps the project ID embedded in the URL, so `validate_mcp_server_for_project` can still detect matching/non-matching servers via URL inspection. Using `client.base_url` instead of hardcoded localhost also respects the dynamic URL guideline.

Also applies to: 172-179, 212-219
464-471: Multi-user MCP server configs now match new transport semantics.

`config_one`/`config_two` and `updated_config_one` all use `["mcp-proxy", "--transport", "streamablehttp", <url>]`, which matches the new backend expectations and ensures the multi-tenant isolation tests still exercise the correct transport and arg layout.

Also applies to: 501-504
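The expected arg layout can be checked mechanically. A hedged sketch of such a check (the helper name and the exact rules are illustrative; this is not the repo's `validate_mcp_server_for_project`):

```python
def is_streamable_http_proxy_args(args: list[str], project_id: str) -> bool:
    """Check the layout the tests expect: mcp-proxy --transport streamablehttp <url>."""
    # The transport flags must lead, matching the normalized config shape.
    if args[:3] != ["mcp-proxy", "--transport", "streamablehttp"]:
        return False
    # The URL is the final positional argument and must embed the project ID.
    url = args[-1]
    return url.startswith("http") and f"/project/{project_id}" in url
```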
src/backend/base/langflow/api/utils/mcp/config_utils.py (2)
207-223: URL helpers correctly centralize MCP URL construction and WSL handling.

`_get_project_base_url_components`, `get_project_streamable_http_url`, and `get_project_sse_url` normalize `host`/`port`, avoid `0.0.0.0` as a connect address, and delegate WSL rewriting to `get_url_by_os`. This removes hardcoded URLs and keeps OS-specific logic in one place, matching the "dynamic URLs over localhost" guideline.

Also applies to: 225-231, 233-238
241-265: Composer streamable HTTP URL helper aligns with decrypted auth config.

`_get_mcp_composer_auth_config` safely decrypts `project.auth_settings` and raises a clear `ValueError` when it is missing. `get_composer_streamable_http_url` then validates `oauth_host`/`oauth_port` and reuses `get_url_by_os` for WSL compatibility. This keeps Composer URLs consistent with project URLs and avoids hardcoded hosts.

src/backend/tests/unit/api/v1/test_mcp_projects.py (2)
70-87: Streamable HTTP manager mocking is consistent with the new ProjectMCPServer lifecycle.

Patching `StreamableHTTPSessionManager` so that:
- `run()` returns an async context manager, and
- `handle_request` is an `AsyncMock`,

matches how `ensure_session_manager_running` and `_dispatch_project_streamable_http` interact with the manager. `test_handle_project_messages_success` asserting `handle_request.assert_called_once()` cleanly verifies the new transport path without depending on SSE internals.

Also applies to: 186-198
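The mocking pattern described above can be sketched with stdlib `unittest.mock` alone. The `dispatch` helper here is a stand-in for the route handler under test, not Langflow code:

```python
import asyncio
from contextlib import asynccontextmanager
from unittest.mock import AsyncMock, MagicMock


def make_mock_session_manager() -> MagicMock:
    """Mimic StreamableHTTPSessionManager: run() yields a context, handle_request is async."""
    manager = MagicMock()

    @asynccontextmanager
    async def fake_run():
        # Stands in for the manager's background task group.
        yield

    manager.run = fake_run
    manager.handle_request = AsyncMock(return_value=None)
    return manager


async def dispatch(manager) -> None:
    """Stand-in for the route handler: start the manager, then delegate a request."""
    async with manager.run():
        await manager.handle_request("scope", "receive", "send")
```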
570-617: ProjectMCPServer cache and init tests cover new per-project server semantics.

`test_project_sse_creation` and `test_init_mcp_servers` now validate that:
- `get_project_mcp_server` populates `project_mcp_servers` with `ProjectMCPServer` instances, and
- `init_mcp_servers` initializes entries for all projects.

This ensures the new per-project server abstraction and cache wiring behave as expected before streamable HTTP dispatch is exercised elsewhere.
src/lfx/src/lfx/services/mcp_composer/service.py (1)
74-88: Composer startup flow correctly switched to HTTP and keeps legacy SSE for compatibility.

The service now:
- Tracks `streamable_http_url`/`legacy_sse_url` per project,
- Derives an `effective_legacy_sse_url = legacy_sse_url or f"{streamable_http_url.rstrip('/')}/sse"`, and
- Starts `mcp-composer` with `"--mode", "http", "--endpoint", streamable_http_url, "--sse-url", effective_legacy_sse_url`,

which matches the new streamable HTTP design while still satisfying tooling that expects `--sse-url`. The per-project retry and error-tracking logic remains intact.

Also applies to: 950-1015, 1189-1197, 1255-1272
src/backend/base/langflow/api/v1/mcp_projects.py (2)
71-75: Per-project MCP server + StreamableHTTPSessionManager wiring looks sound.

Key pieces line up. `ProjectMCPServer` now:
- Applies `_configure_server_notification_defaults` so change notifications are advertised by default.
- Instantiates a `StreamableHTTPSessionManager` and exposes `ensure_session_manager_running`/`stop_session_manager`, guarded by an `AsyncExitStack` and lock.

`_dispatch_project_streamable_http` calls `get_project_mcp_server(project_id)`, awaits `ensure_session_manager_running()`, injects the current user/project/variables into contextvars, then delegates to `session_manager.handle_request`.

The shutdown hook:

    @router.on_event("shutdown")
    async def _shutdown_project_session_managers() -> None:
        for server in project_mcp_servers.values():
            await server.stop_session_manager()

ensures managers are torn down cleanly. Overall, the new streamable HTTP transport is integrated without breaking the existing per-project handler model.
Also applies to: 78-94, 1084-1152, 1170-1177
1196-1205: Composer registration now uses project streamable HTTP + legacy SSE URLs consistently.

Both `register_project_with_composer` and `get_or_start_mcp_composer` now call:

    streamable_http_url = await get_project_streamable_http_url(project.id / project_id)
    legacy_sse_url = await get_project_sse_url(project.id / project_id)
    await mcp_composer_service.start_project_composer(
        project_id=str(project.id),
        streamable_http_url=streamable_http_url,
        auth_config=auth_config,
        legacy_sse_url=legacy_sse_url,
    )

which ensures Composer instances are always pointed at dynamically computed Langflow MCP endpoints with correct OS/WSL handling. This aligns Composer startup with the same URL computation used elsewhere in the MCP stack.
Also applies to: 1310-1337
src/backend/base/langflow/api/v1/mcp.py (4)
34-48: Monkey-patching approach is functional but fragile.

The use of `MethodType` to override `create_initialization_options` works, but consider adding a brief comment explaining why this approach is used (e.g., to inject defaults without modifying the upstream MCP library). This helps future maintainers understand the intent.
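As an illustration of the `MethodType` pattern the review refers to (the `Server` class here is a simplified stand-in, not the upstream MCP class, and the injected default is illustrative):

```python
from types import MethodType


class Server:
    """Stand-in for the upstream MCP Server class (hypothetical, simplified)."""

    def create_initialization_options(self):
        return {"capabilities": {}}


def patch_initialization_defaults(server: Server) -> None:
    """Rebind create_initialization_options on one instance to inject defaults.

    The upstream class itself is left untouched; only this instance changes.
    """
    original = server.create_initialization_options  # bound method, closed over

    def patched(self):
        options = original()
        # Inject a default capability without modifying upstream code.
        options["capabilities"].setdefault("tools", {"listChanged": True})
        return options

    server.create_initialization_options = MethodType(patched, server)
```

Instance-level rebinding keeps the patch scoped to one server object, which is why other instances (and the class) keep the original behavior.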
51-79: LGTM!

Server handlers correctly implement async patterns and delegate to shared utility functions.
126-128: LGTM!

Health check endpoint is straightforward.
161-170: LGTM!

The primary endpoint correctly handles GET/POST/DELETE methods, and the backward-compatible `/sse` endpoint ensures smooth migration for existing clients.
    if servers_updated:
        servers["mcpServers"] = mcp_servers
        if created_api_key:
            await session.commit()
        await upload_server_config(
            servers,
            current_user,
            session,
            storage_service=storage_service,
            settings_service=settings_service,
        )
🧩 Analysis chain
Script executed:
#!/bin/bash
# First, let's examine the file structure and the specific function
cat -n src/backend/base/langflow/api/v2/mcp.py | head -200
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
1 import contextlib
2 import json
3 from io import BytesIO
4 from typing import Annotated
5 from urllib.parse import urlparse
6 from uuid import UUID
7
8 from fastapi import APIRouter, Depends, HTTPException, UploadFile
9 from lfx.base.agents.utils import safe_cache_get, safe_cache_set
10 from lfx.base.mcp.util import update_tools
11
12 from langflow.api.utils import CurrentActiveUser, DbSession
13 from langflow.api.v2.files import (
14 MCP_SERVERS_FILE,
15 delete_file,
16 download_file,
17 edit_file_name,
18 get_file_by_name,
19 get_mcp_file,
20 upload_user_file,
21 )
22 from langflow.logging import logger
23 from langflow.services.database.models import Folder
24 from langflow.services.database.models.api_key.crud import create_api_key
25 from langflow.services.database.models.api_key.model import ApiKeyCreate
26 from langflow.services.deps import get_settings_service, get_shared_component_cache_service, get_storage_service
27 from langflow.services.settings.service import SettingsService
28 from langflow.services.storage.service import StorageService
29
30 router = APIRouter(tags=["MCP"], prefix="/mcp")
31
32
33 async def upload_server_config(
34 server_config: dict,
35 current_user: CurrentActiveUser,
36 session: DbSession,
37 storage_service: Annotated[StorageService, Depends(get_storage_service)],
38 settings_service: Annotated[SettingsService, Depends(get_settings_service)],
39 ):
40 content_str = json.dumps(server_config)
41 content_bytes = content_str.encode("utf-8") # Convert to bytes
42 file_obj = BytesIO(content_bytes) # Use BytesIO for binary data
43
44 mcp_file = await get_mcp_file(current_user, extension=True)
45 upload_file = UploadFile(file=file_obj, filename=mcp_file, size=len(content_str))
46
47 return await upload_user_file(
48 file=upload_file,
49 session=session,
50 current_user=current_user,
51 storage_service=storage_service,
52 settings_service=settings_service,
53 )
54
55
56 async def get_server_list(
57 current_user: CurrentActiveUser,
58 session: DbSession,
59 storage_service: Annotated[StorageService, Depends(get_storage_service)],
60 settings_service: Annotated[SettingsService, Depends(get_settings_service)],
61 ):
62 # Backwards compatibilty with old format file name
63 mcp_file = await get_mcp_file(current_user)
64 old_format_config_file = await get_file_by_name(MCP_SERVERS_FILE, current_user, session)
65 if old_format_config_file:
66 await edit_file_name(old_format_config_file.id, mcp_file, current_user, session)
67
68 # Read the server configuration from a file using the files api
69 server_config_file = await get_file_by_name(mcp_file, current_user, session)
70
71 # Attempt to download the configuration file content
72 try:
73 server_config_bytes = await download_file(
74 server_config_file.id if server_config_file else None,
75 current_user,
76 session,
77 storage_service=storage_service,
78 return_content=True,
79 )
80 except (FileNotFoundError, HTTPException):
81 # Storage file missing - DB entry may be stale. Remove it and recreate.
82 if server_config_file:
83 with contextlib.suppress(Exception):
84 await delete_file(server_config_file.id, current_user, session, storage_service)
85
86 # Create a fresh empty config
87 await upload_server_config(
88 {"mcpServers": {}},
89 current_user,
90 session,
91 storage_service=storage_service,
92 settings_service=settings_service,
93 )
94
95 # Fetch and download again
96 mcp_file = await get_mcp_file(current_user)
97 server_config_file = await get_file_by_name(mcp_file, current_user, session)
98 if not server_config_file:
99 raise HTTPException(status_code=500, detail="Failed to create MCP Servers configuration file") from None
100
101 server_config_bytes = await download_file(
102 server_config_file.id,
103 current_user,
104 session,
105 storage_service=storage_service,
106 return_content=True,
107 )
108
109 # Parse JSON content
110 try:
111 servers = json.loads(server_config_bytes)
112 except json.JSONDecodeError:
113 raise HTTPException(status_code=500, detail="Invalid server configuration file format.") from None
114
115 servers_updated = False
116 created_api_key = False
117 mcp_servers = servers.get("mcpServers", {})
118
119 for server_name, server_config in list(mcp_servers.items()):
120 updated_config, config_changed, created_key = await _ensure_mcp_server_config(
121 server_name=server_name,
122 server_config=server_config,
123 current_user=current_user,
124 session=session,
125 settings_service=settings_service,
126 )
127 if config_changed:
128 servers_updated = True
129 created_api_key = created_api_key or created_key
130 mcp_servers[server_name] = updated_config
131
132 if servers_updated:
133 servers["mcpServers"] = mcp_servers
134 if created_api_key:
135 await session.commit()
136 await upload_server_config(
137 servers,
138 current_user,
139 session,
140 storage_service=storage_service,
141 settings_service=settings_service,
142 )
143
144 return servers
145
146
147 async def get_server(
148 server_name: str,
149 current_user: CurrentActiveUser,
150 session: DbSession,
151 storage_service: Annotated[StorageService, Depends(get_storage_service)],
152 settings_service: Annotated[SettingsService, Depends(get_settings_service)],
153 server_list: dict | None = None,
154 ):
155 """Get a specific server configuration."""
156 if server_list is None:
157 server_list = await get_server_list(current_user, session, storage_service, settings_service)
158
159 if server_name not in server_list["mcpServers"]:
160 return None
161
162 return server_list["mcpServers"][server_name]
163
164
165 # Define a Get servers endpoint
166 @router.get("/servers")
167 async def get_servers(
168 current_user: CurrentActiveUser,
169 session: DbSession,
170 storage_service: Annotated[StorageService, Depends(get_storage_service)],
171 settings_service: Annotated[SettingsService, Depends(get_settings_service)],
172 *,
173 action_count: bool | None = None,
174 ):
175 """Get the list of available servers."""
176 import asyncio
177
178 server_list = await get_server_list(current_user, session, storage_service, settings_service)
179
180 if not action_count:
181 # Return only the server names, with mode and toolsCount as None
182 return [{"name": server_name, "mode": None, "toolsCount": None} for server_name in server_list["mcpServers"]]
183
184 # Check all of the tool counts for each server concurrently
185 async def check_server(server_name: str) -> dict:
186 server_info: dict[str, str | int | None] = {"name": server_name, "mode": None, "toolsCount": None}
187 try:
188 mode, tool_list, _ = await update_tools(
189 server_name=server_name,
190 server_config=server_list["mcpServers"][server_name],
191 )
192 server_info["mode"] = mode.lower()
193 server_info["toolsCount"] = len(tool_list)
194 if len(tool_list) == 0:
195 server_info["error"] = "No tools found"
196 except ValueError as e:
197 # Configuration validation errors, invalid URLs, etc.
198 await logger.aerror(f"Configuration error for server {server_name}: {e}")
199 server_info["error"] = f"Configuration error: {e}"
200 except ConnectionError as e:
</shell_result>
Script executed:
#!/bin/bash
# Find _ensure_mcp_server_config function
rg -n "_ensure_mcp_server_config" src/backend/base/langflow/api/v2/mcp.py -A 30
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
120: updated_config, config_changed, created_key = await _ensure_mcp_server_config(
121- server_name=server_name,
122- server_config=server_config,
123- current_user=current_user,
124- session=session,
125- settings_service=settings_service,
126- )
127- if config_changed:
128- servers_updated = True
129- created_api_key = created_api_key or created_key
130- mcp_servers[server_name] = updated_config
131-
132- if servers_updated:
133- servers["mcpServers"] = mcp_servers
134- if created_api_key:
135- await session.commit()
136- await upload_server_config(
137- servers,
138- current_user,
139- session,
140- storage_service=storage_service,
141- settings_service=settings_service,
142- )
143-
144- return servers
145-
146-
147-async def get_server(
148- server_name: str,
149- current_user: CurrentActiveUser,
150- session: DbSession,
--
330:async def _ensure_mcp_server_config(
331- *,
332- server_name: str,
333- server_config: dict,
334- current_user: CurrentActiveUser,
335- session: DbSession,
336- settings_service: SettingsService,
337-) -> tuple[dict, bool, bool]:
338- """Normalize stored MCP server configs and ensure auth headers when required."""
339- args = server_config.get("args")
340- if not isinstance(args, list) or not args:
341- return server_config, False, False
342-
343- command = server_config.get("command")
344- if command != "uvx":
345- return server_config, False, False
346-
347- # Detect mcp-proxy invocation
348- if "mcp-proxy" not in args:
349- return server_config, False, False
350-
351- url_arg = next((arg for arg in reversed(args) if isinstance(arg, str) and arg.startswith("http")), None)
352- if not url_arg:
353- return server_config, False, False
354-
355- project_id = _extract_project_id_from_url(url_arg)
356- if project_id is None:
357- return server_config, False, False
358-
359- project: Folder | None = await session.get(Folder, project_id)
360- if project is None:
</shell_result>
Script executed:
#!/bin/bash
# Get the full _ensure_mcp_server_config function
sed -n '330,450p' src/backend/base/langflow/api/v2/mcp.py
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
async def _ensure_mcp_server_config(
*,
server_name: str,
server_config: dict,
current_user: CurrentActiveUser,
session: DbSession,
settings_service: SettingsService,
) -> tuple[dict, bool, bool]:
"""Normalize stored MCP server configs and ensure auth headers when required."""
args = server_config.get("args")
if not isinstance(args, list) or not args:
return server_config, False, False
command = server_config.get("command")
if command != "uvx":
return server_config, False, False
# Detect mcp-proxy invocation
if "mcp-proxy" not in args:
return server_config, False, False
url_arg = next((arg for arg in reversed(args) if isinstance(arg, str) and arg.startswith("http")), None)
if not url_arg:
return server_config, False, False
project_id = _extract_project_id_from_url(url_arg)
if project_id is None:
return server_config, False, False
project: Folder | None = await session.get(Folder, project_id)
if project is None:
return server_config, False, False
generated_api_key = False
# Gather existing header tokens (if any) before we normalize args
existing_header_tokens: list[str] | None = None
preserved_args: list[str] = []
start_index = 1 if args[0] == "mcp-proxy" else 0
if start_index == 0:
preserved_args.append(args[0])
idx = start_index
while idx < len(args):
arg_item = args[idx]
if arg_item == "--transport":
idx += 2
continue
if arg_item == "--headers":
existing_header_tokens = args[idx : idx + 3]
idx += 3
continue
if isinstance(arg_item, str) and arg_item.startswith("http"):
idx += 1
continue
preserved_args.append(arg_item)
idx += 1
should_generate_api_key = False
if settings_service.settings.mcp_composer_enabled:
if project.auth_settings and project.auth_settings.get("auth_type") == "apikey":
should_generate_api_key = True
elif project.auth_settings:
if project.auth_settings.get("auth_type") == "apikey":
should_generate_api_key = True
elif not settings_service.auth_settings.AUTO_LOGIN:
should_generate_api_key = True
if settings_service.auth_settings.AUTO_LOGIN and not settings_service.auth_settings.SUPERUSER:
should_generate_api_key = True
streamable_http_url = url_arg.removesuffix("/sse")
final_args: list[str] = ["mcp-proxy", "--transport", "streamablehttp"]
if preserved_args:
final_args.extend(preserved_args)
header_tokens = existing_header_tokens
if header_tokens is None and should_generate_api_key:
api_key_name = f"MCP Server {project.name}"
new_api_key = await create_api_key(session, ApiKeyCreate(name=api_key_name), current_user.id)
header_tokens = ["--headers", "x-api-key", new_api_key.api_key]
generated_api_key = True
if header_tokens:
final_args.extend(header_tokens)
final_args.append(streamable_http_url)
config_updated = final_args != args
if config_updated:
server_config["args"] = final_args
await logger.adebug(
"Normalized MCP server '%s' configuration for project %s (streamable HTTP + auth header).",
server_name,
project_id,
)
return server_config, config_updated, generated_api_key
@router.post("/servers/{server_name}")
async def add_server(
server_name: str,
server_config: dict,
current_user: CurrentActiveUser,
session: DbSession,
storage_service: Annotated[StorageService, Depends(get_storage_service)],
settings_service: Annotated[SettingsService, Depends(get_settings_service)],
):
return await update_server(
server_name,
server_config,
current_user,
session,
storage_service,
settings_service,
check_existing=True,
</shell_result>
Script executed:
#!/bin/bash
# Find upload_user_file implementation
rg -n "async def upload_user_file" src/backend/base/langflow/api/v2/files.py -A 40
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
131:async def upload_user_file(
132- file: Annotated[UploadFile, File(...)],
133- session: DbSession,
134- current_user: CurrentActiveUser,
135- storage_service: Annotated[StorageService, Depends(get_storage_service)],
136- settings_service: Annotated[SettingsService, Depends(get_settings_service)],
137- *,
138- append: bool = False,
139-) -> UploadFileResponse:
140- """Upload a file for the current user and track it in the database."""
141- # Get the max allowed file size from settings (in MB)
142- try:
143- max_file_size_upload = settings_service.settings.max_file_size_upload
144- except Exception as e:
145- raise HTTPException(status_code=500, detail=f"Settings error: {e}") from e
146-
147- # Validate that a file is actually provided
148- if not file or not file.filename:
149- raise HTTPException(status_code=400, detail="No file provided")
150-
151- # Validate file size (convert MB to bytes)
152- if file.size > max_file_size_upload * 1024 * 1024:
153- raise HTTPException(
154- status_code=413,
155- detail=f"File size is larger than the maximum file size {max_file_size_upload}MB.",
156- )
157-
158- # Create a new database record for the uploaded file.
159- try:
160- # Enforce unique constraint on name, except for the special _mcp_servers file
161- new_filename = file.filename
162- try:
163- root_filename, file_extension = new_filename.rsplit(".", 1)
164- except ValueError:
165- root_filename, file_extension = new_filename, ""
166-
167- # Special handling for the MCP servers config file: always keep the same root filename
168- mcp_file = await get_mcp_file(current_user)
169- mcp_file_ext = await get_mcp_file(current_user, extension=True)
170-
171- # Initialize existing_file for append mode
</shell_result>
Script executed:
#!/bin/bash
# Get full upload_user_file function (check length first)
wc -l src/backend/base/langflow/api/v2/files.py
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
794 src/backend/base/langflow/api/v2/files.py
</shell_result>
Script executed:
#!/bin/bash
# Find the end of upload_user_file function
sed -n '131,300p' src/backend/base/langflow/api/v2/files.py
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
async def upload_user_file(
file: Annotated[UploadFile, File(...)],
session: DbSession,
current_user: CurrentActiveUser,
storage_service: Annotated[StorageService, Depends(get_storage_service)],
settings_service: Annotated[SettingsService, Depends(get_settings_service)],
*,
append: bool = False,
) -> UploadFileResponse:
"""Upload a file for the current user and track it in the database."""
# Get the max allowed file size from settings (in MB)
try:
max_file_size_upload = settings_service.settings.max_file_size_upload
except Exception as e:
raise HTTPException(status_code=500, detail=f"Settings error: {e}") from e
# Validate that a file is actually provided
if not file or not file.filename:
raise HTTPException(status_code=400, detail="No file provided")
# Validate file size (convert MB to bytes)
if file.size > max_file_size_upload * 1024 * 1024:
raise HTTPException(
status_code=413,
detail=f"File size is larger than the maximum file size {max_file_size_upload}MB.",
)
# Create a new database record for the uploaded file.
try:
# Enforce unique constraint on name, except for the special _mcp_servers file
new_filename = file.filename
try:
root_filename, file_extension = new_filename.rsplit(".", 1)
except ValueError:
root_filename, file_extension = new_filename, ""
# Special handling for the MCP servers config file: always keep the same root filename
mcp_file = await get_mcp_file(current_user)
mcp_file_ext = await get_mcp_file(current_user, extension=True)
# Initialize existing_file for append mode
existing_file = None
if new_filename == mcp_file_ext:
# Check if an existing record exists; if so, delete it to replace with the new one
existing_mcp_file = await get_file_by_name(mcp_file, current_user, session)
if existing_mcp_file:
await delete_file(existing_mcp_file.id, current_user, session, storage_service)
# Flush the session to ensure the deletion is committed before creating the new file
await session.flush()
unique_filename = new_filename
elif append:
# In append mode, check if file exists and reuse the same filename
existing_file = await get_file_by_name(root_filename, current_user, session)
if existing_file:
# File exists, append to it by reusing the same filename
# Extract the filename from the path
unique_filename = existing_file.path.split("/")[-1] if "/" in existing_file.path else existing_file.path
else:
# File doesn't exist yet, create new one with extension
unique_filename = f"{root_filename}.{file_extension}" if file_extension else root_filename
else:
# For normal files, ensure unique name by appending a count if necessary
stmt = select(UserFile).where(
col(UserFile.name).like(f"{root_filename}%"), UserFile.user_id == current_user.id
)
existing_files = await session.exec(stmt)
files = existing_files.all() # Fetch all matching records
if files:
counts = []
# Extract the count from the filename
for my_file in files:
match = re.search(r"\((\d+)\)(?=\.\w+$|$)", my_file.name)
if match:
counts.append(int(match.group(1)))
count = max(counts) if counts else 0
root_filename = f"{root_filename} ({count + 1})"
# Create the unique filename with extension for storage
unique_filename = f"{root_filename}.{file_extension}" if file_extension else root_filename
# Read file content, save with unique filename, and compute file size in one routine
try:
file_id, stored_file_name = await save_file_routine(
file, storage_service, current_user, file_name=unique_filename, append=append
)
file_size = await storage_service.get_file_size(
flow_id=str(current_user.id),
file_name=stored_file_name,
)
except FileNotFoundError as e:
# S3 bucket doesn't exist or file not found, or file was uploaded but can't be found
raise HTTPException(status_code=404, detail=str(e)) from e
except PermissionError as e:
# Access denied or invalid credentials
raise HTTPException(status_code=403, detail=str(e)) from e
except Exception as e:
# General error saving file or getting file size
raise HTTPException(status_code=500, detail=f"Error accessing file: {e}") from e
if append and existing_file:
existing_file.size = file_size
session.add(existing_file)
await session.commit()
await session.refresh(existing_file)
new_file = existing_file
else:
# Create a new file record
new_file = UserFile(
id=file_id,
user_id=current_user.id,
name=root_filename,
path=f"{current_user.id}/{stored_file_name}",
size=file_size,
)
session.add(new_file)
try:
await session.flush()
await session.refresh(new_file)
except Exception as db_err:
# Database insert failed - clean up the uploaded file to avoid orphaned files
try:
await storage_service.delete_file(flow_id=str(current_user.id), file_name=stored_file_name)
except OSError as e:
# If delete fails, just log the error
await logger.aerror(f"Failed to clean up uploaded file {stored_file_name}: {e}")
raise HTTPException(
status_code=500, detail=f"Error inserting file metadata into database: {db_err}"
) from db_err
except HTTPException:
# Re-raise HTTP exceptions (like 409 conflicts) without modification
raise
except Exception as e:
# Optionally, you could also delete the file from disk if the DB insert fails.
raise HTTPException(status_code=500, detail=f"Database error: {e}") from e
return UploadFileResponse(id=new_file.id, name=new_file.name, path=Path(new_file.path), size=new_file.size)
async def get_file_by_name(
file_name: str, # The name of the file to search for
current_user: CurrentActiveUser,
session: DbSession,
) -> UserFile | None:
"""Get the file associated with a given file name for the current user."""
try:
# Fetch from the UserFile table
stmt = select(UserFile).where(UserFile.user_id == current_user.id).where(UserFile.name == file_name)
result = await session.exec(stmt)
return result.first() or None
except Exception as e:
raise HTTPException(status_code=500, detail=f"Error fetching file: {e}") from e
async def load_sample_files(current_user: CurrentActiveUser, session: DbSession, storage_service: StorageService):
# Check if the sample files in the SAMPLE_DATA_DIR exist
for sample_file_path in Path(SAMPLE_DATA_DIR).iterdir():
sample_file_name = sample_file_path.name
root_filename, _ = sample_file_name.rsplit(".", 1)
# Check if the sample file exists in the storage service
existing_sample_file = await get_file_by_name(
file_name=root_filename, current_user=current_user, session=session
)
</shell_result>
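The duplicate-name handling in the upload code above (the `\((\d+)\)` regex scan over existing `UserFile` rows) can be sketched as a standalone helper. `next_unique_name` is a hypothetical name for illustration, not a function in `files.py`:

```python
import re

def next_unique_name(root_filename: str, existing_names: list[str]) -> str:
    # Collect any trailing "(N)" counters using the same regex as files.py:
    # the "(N)" must sit immediately before the extension or at the end of the name.
    counts = [
        int(m.group(1))
        for name in existing_names
        if (m := re.search(r"\((\d+)\)(?=\.\w+$|$)", name))
    ]
    if not existing_names:
        return root_filename  # no collision, keep the name as-is
    count = max(counts) if counts else 0
    return f"{root_filename} ({count + 1})"
```

With existing names `["report", "report (1)", "report (2).txt"]` this yields `report (3)`, matching the `max(counts) + 1` logic in the handler.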
Script executed:
#!/bin/bash
# Check the get_servers endpoint and what calls get_server_list
rg -n "async def get_servers" src/backend/base/langflow/api/v2/mcp.py -A 30
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
167:async def get_servers(
168- current_user: CurrentActiveUser,
169- session: DbSession,
170- storage_service: Annotated[StorageService, Depends(get_storage_service)],
171- settings_service: Annotated[SettingsService, Depends(get_settings_service)],
172- *,
173- action_count: bool | None = None,
174-):
175- """Get the list of available servers."""
176- import asyncio
177-
178- server_list = await get_server_list(current_user, session, storage_service, settings_service)
179-
180- if not action_count:
181- # Return only the server names, with mode and toolsCount as None
182- return [{"name": server_name, "mode": None, "toolsCount": None} for server_name in server_list["mcpServers"]]
183-
184- # Check all of the tool counts for each server concurrently
185- async def check_server(server_name: str) -> dict:
186- server_info: dict[str, str | int | None] = {"name": server_name, "mode": None, "toolsCount": None}
187- try:
188- mode, tool_list, _ = await update_tools(
189- server_name=server_name,
190- server_config=server_list["mcpServers"][server_name],
191- )
192- server_info["mode"] = mode.lower()
193- server_info["toolsCount"] = len(tool_list)
194- if len(tool_list) == 0:
195- server_info["error"] = "No tools found"
196- except ValueError as e:
197- # Configuration validation errors, invalid URLs, etc.
</shell_result>
Script executed:
#!/bin/bash
# Check how DbSession is defined and configured
rg -n "DbSession" src/backend/base/langflow/api/utils.py -B 3 -A 3 | head -50
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stderr:
src/backend/base/langflow/api/utils.py: No such file or directory (os error 2)
</shell_result>
Script executed:
#!/bin/bash
# Find DbSession definition
rg -n "DbSession" src/backend/base/langflow/api/ -l | head -5
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
src/backend/base/langflow/api/v2/mcp.py
src/backend/base/langflow/api/v2/files.py
src/backend/base/langflow/api/health_check_router.py
src/backend/base/langflow/api/utils/core.py
src/backend/base/langflow/api/v1/api_key.py
</shell_result>
Script executed:
#!/bin/bash
# Check DbSession definition
rg -n "DbSession" src/backend/base/langflow/api/utils/core.py -B 2 -A 5
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
36-CurrentActiveUser = Annotated[User, Depends(get_current_active_user)]
37-CurrentActiveMCPUser = Annotated[User, Depends(get_current_active_user_mcp)]
38:# DbSession with auto-commit for write operations
39:DbSession = Annotated[AsyncSession, Depends(injectable_session_scope)]
40:# DbSessionReadOnly for read-only operations (no auto-commit, reduces lock contention)
41:DbSessionReadOnly = Annotated[AsyncSession, Depends(injectable_session_scope_readonly)]
42-
43-
44-class EventDeliveryType(str, Enum):
45- STREAMING = "streaming"
46- DIRECT = "direct"
</shell_result>
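The `DbSession` alias shown above follows the standard FastAPI `Annotated` dependency pattern: the second argument to `Annotated` carries the `Depends` marker that FastAPI resolves at request time. A minimal stdlib-only sketch of that mechanism, using a stand-in `Depends` class since the real one lives in FastAPI:

```python
from typing import Annotated, get_args, get_type_hints

class Depends:
    """Stand-in for fastapi.Depends, kept here only to show the mechanics."""
    def __init__(self, dependency):
        self.dependency = dependency

async def fake_session_scope():
    # The real injectable_session_scope yields an AsyncSession (auto-commit).
    yield "session"

# Same shape as: DbSession = Annotated[AsyncSession, Depends(injectable_session_scope)]
DbSession = Annotated[str, Depends(fake_session_scope)]

async def endpoint(session: DbSession) -> str:
    return session

# FastAPI recovers the marker from the annotation metadata:
hints = get_type_hints(endpoint, include_extras=True)
marker = get_args(hints["session"])[1]
print(marker.dependency.__name__)  # fake_session_scope
```

This is why the read-only variant (`DbSessionReadOnly`) needs no change at call sites: only the metadata argument differs between the two aliases.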
Script executed:
#!/bin/bash
# Find injectable_session_scope
rg -n "injectable_session_scope" src/backend/base/langflow/ -l
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
src/backend/base/langflow/services/auth/utils.py
src/backend/base/langflow/api/utils/core.py
</shell_result>
Script executed:
#!/bin/bash
# Find injectable_session_scope definition
rg -n "def injectable_session_scope" src/backend/base/langflow/ -A 20
<shell_result>
Repository: langflow-ai/langflow
Exit code: 1
</shell_result>
Script executed:
#!/bin/bash
# Search more broadly for injectable_session_scope
rg -n "injectable_session_scope" src/backend/base/langflow/services/auth/utils.py -B 3 -A 15
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
11-from fastapi.security import APIKeyHeader, APIKeyQuery, OAuth2PasswordBearer
12-from jose import JWTError, jwt
13-from lfx.log.logger import logger
14:from lfx.services.deps import injectable_session_scope, session_scope
15-from lfx.services.settings.service import SettingsService
16-from sqlalchemy.exc import IntegrityError
17-from sqlmodel.ext.asyncio.session import AsyncSession
18-from starlette.websockets import WebSocket
19-
20-from langflow.helpers.user import get_user_by_flow_id_or_endpoint_name
21-from langflow.services.database.models.api_key.crud import check_key
22-from langflow.services.database.models.user.crud import get_user_by_id, get_user_by_username, update_user_last_login_at
23-from langflow.services.database.models.user.model import User, UserRead
24-from langflow.services.deps import get_settings_service
25-
26-if TYPE_CHECKING:
27- from langflow.services.database.models.api_key.model import ApiKey
28-
29-oauth2_login = OAuth2PasswordBearer(tokenUrl="api/v1/login", auto_error=False)
--
146- token: Annotated[str, Security(oauth2_login)],
147- query_param: Annotated[str, Security(api_key_query)],
148- header_param: Annotated[str, Security(api_key_header)],
149: db: Annotated[AsyncSession, Depends(injectable_session_scope)],
150-) -> User:
151- if token:
152- return await get_current_user_by_jwt(token, db)
153- user = await api_key_security(query_param, header_param)
154- if user:
155- return user
156-
157- raise HTTPException(
158- status_code=status.HTTP_403_FORBIDDEN,
159- detail="Invalid or missing API key",
160- )
161-
162-
163-async def get_current_user_by_jwt(
164- token: str,
--
587- token: Annotated[str, Security(oauth2_login)],
588- query_param: Annotated[str, Security(api_key_query)],
589- header_param: Annotated[str, Security(api_key_header)],
590: db: Annotated[AsyncSession, Depends(injectable_session_scope)],
591-) -> User:
592- """MCP-specific user authentication that always allows fallback to username lookup.
593-
594- This function provides authentication for MCP endpoints with special handling:
595- - If a JWT token is provided, it uses standard JWT authentication
596- - If no API key is provided and AUTO_LOGIN is enabled, it falls back to
597- username lookup using the configured superuser credentials
598- - Otherwise, it validates the provided API key (from query param or header)
599- """
600- if token:
601- return await get_current_user_by_jwt(token, db)
602-
603- # MCP-specific authentication logic - always behaves as if skip_auth_auto_login is True
604- settings_service = get_settings_service()
605- result: ApiKey | User | None
</shell_result>
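The fallback order described in that docstring (JWT first, then the AUTO_LOGIN superuser fallback, then API-key validation) can be condensed into a small decision function. All the `lookup_*` callables here are hypothetical stand-ins for the real helpers in `langflow.services.auth.utils`:

```python
async def resolve_mcp_user(token, api_key, *, auto_login,
                           lookup_jwt, lookup_api_key, lookup_superuser):
    if token:
        return await lookup_jwt(token)      # 1. a JWT always wins when present
    if not api_key and auto_login:
        return await lookup_superuser()     # 2. AUTO_LOGIN: fall back to the superuser
    return await lookup_api_key(api_key)    # 3. otherwise validate the API key
```

Note the superuser branch only triggers when no API key was supplied at all, so a bad key still fails validation rather than silently escalating.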
Script executed:
#!/bin/bash
# Check if there's any error handling at the endpoint level that might catch exceptions
rg -n "def get_servers|async def get_servers" src/backend/base/langflow/api/v2/mcp.py -A 50 | grep -E "(try|except|raise|HTTPException)" | head -20
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
187- try:
196- except ValueError as e:
200- except ConnectionError as e:
204- except (TimeoutError, asyncio.TimeoutError) as e:
208- except OSError as e:
212- except (KeyError, TypeError) as e:
216- except (RuntimeError, ProcessLookupError, PermissionError) as e:
</shell_result>
Script executed:
#!/bin/bash
# Check if there are any error scenarios in upload_server_config or upload_user_file that would cause failure
# after the session.commit() is called
rg -n "raise HTTPException" src/backend/base/langflow/api/v2/files.py | head -20
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
97: raise HTTPException(status_code=404, detail="File not found")
101: raise HTTPException(status_code=403, detail="You don't have access to this file")
145: raise HTTPException(status_code=500, detail=f"Settings error: {e}") from e
149: raise HTTPException(status_code=400, detail="No file provided")
153: raise HTTPException(
226: raise HTTPException(status_code=404, detail=str(e)) from e
229: raise HTTPException(status_code=403, detail=str(e)) from e
232: raise HTTPException(status_code=500, detail=f"Error accessing file: {e}") from e
262: raise HTTPException(
270: raise HTTPException(status_code=500, detail=f"Database error: {e}") from e
288: raise HTTPException(status_code=500, detail=f"Error fetching file: {e}") from e
357: raise HTTPException(status_code=500, detail=f"Error listing files: {e}") from e
375: raise HTTPException(status_code=404, detail="No files found")
434: raise HTTPException(status_code=500, detail=f"Failed to delete any files from database: {db_failures}")
452: raise HTTPException(status_code=500, detail=f"Error deleting files: {e}") from e
472: raise HTTPException(status_code=404, detail="No files found")
507: raise HTTPException(status_code=404, detail=f"File not found: {e}") from e
509: raise HTTPException(status_code=500, detail=f"Error downloading files: {e}") from e
541: raise HTTPException(status_code=500, detail="Invalid file encoding") from exc
543: raise HTTPException(status_code=500, detail=f"Error reading file: {exc}") from exc
</shell_result>
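Several of those raises follow the same storage-error mapping seen in the upload handler: `FileNotFoundError` becomes 404, `PermissionError` becomes 403, and anything unexpected becomes 500. A sketch of that mapping, with a minimal stand-in for `fastapi.HTTPException`:

```python
class HTTPException(Exception):
    """Minimal stand-in for fastapi.HTTPException, for illustration only."""
    def __init__(self, status_code: int, detail: str):
        super().__init__(detail)
        self.status_code = status_code
        self.detail = detail

def map_storage_error(exc: Exception) -> HTTPException:
    # Mirror the upload handler's mapping: missing file -> 404,
    # access denied -> 403, anything unexpected -> 500.
    if isinstance(exc, FileNotFoundError):
        return HTTPException(status_code=404, detail=str(exc))
    if isinstance(exc, PermissionError):
        return HTTPException(status_code=403, detail=str(exc))
    return HTTPException(status_code=500, detail=f"Error accessing file: {exc}")
```

Centralizing the mapping like this (rather than repeating the three `except` clauses) would keep the status codes consistent across the file endpoints, though the current code inlines it per handler.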
Script executed:
#!/bin/bash
# Check if there's any mechanism to clean up orphaned API keys
rg -n "orphan|cleanup|delete.*api.*key" src/backend/base/langflow/ -iRepository: langflow-ai/langflow
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
src/backend/base/langflow/utils/template_validation.py:192: # Log but don't fail if cleanup times out
src/backend/base/langflow/services/flow/flow_runner.py:70: cleanup: bool = True, # If True, clears flow state after execution
src/backend/base/langflow/services/flow/flow_runner.py:97: if cleanup and user_id:
src/backend/base/langflow/services/job_queue/service.py:27: - Automatically perform periodic cleanup of inactive or completed job queues.
src/backend/base/langflow/services/job_queue/service.py:29: The cleanup process follows a two-phase approach:
src/backend/base/langflow/services/job_queue/service.py:30: 1. When a task is cancelled or fails, it is marked for cleanup by setting a timestamp
src/backend/base/langflow/services/job_queue/service.py:31: 2. The actual cleanup only occurs after CLEANUP_GRACE_PERIOD seconds have elapsed
src/backend/base/langflow/services/job_queue/service.py:41: * The cleanup timestamp (if any).
src/backend/base/langflow/services/job_queue/service.py:42: _cleanup_task (asyncio.Task | None): Background task for periodic cleanup.
src/backend/base/langflow/services/job_queue/service.py:44: CLEANUP_GRACE_PERIOD (int): Number of seconds to wait after a task is marked for cleanup
src/backend/base/langflow/services/job_queue/service.py:58: await service.cleanup_job("job123")
src/backend/base/langflow/services/job_queue/service.py:67: Sets up the internal registry for job queues, initializes the cleanup task, and sets the service state
src/backend/base/langflow/services/job_queue/service.py:71: self._cleanup_task: asyncio.Task | None = None
src/backend/base/langflow/services/job_queue/service.py:74: self.CLEANUP_GRACE_PERIOD = 300 # 5 minutes before cleaning up marked tasks
src/backend/base/langflow/services/job_queue/service.py:82: return self._cleanup_task is not None
src/backend/base/langflow/services/job_queue/service.py:90: """Start the JobQueueService and begin the periodic cleanup routine.
src/backend/base/langflow/services/job_queue/service.py:96: self._cleanup_task = asyncio.create_task(self._periodic_cleanup())
src/backend/base/langflow/services/job_queue/service.py:97: logger.debug("JobQueueService started: periodic cleanup task initiated.")
src/backend/base/langflow/services/job_queue/service.py:104: 2. Cancels the background periodic cleanup task and awaits its termination.
src/backend/base/langflow/services/job_queue/service.py:109: if self._cleanup_task:
src/backend/base/langflow/services/job_queue/service.py:110: self._cleanup_task.cancel()
src/backend/base/langflow/services/job_queue/service.py:111: await asyncio.wait([self._cleanup_task])
src/backend/base/langflow/services/job_queue/service.py:112: if not self._cleanup_task.cancelled():
src/backend/base/langflow/services/job_queue/service.py:113: exc = self._cleanup_task.exception()
src/backend/base/langflow/services/job_queue/service.py:119: await self.cleanup_job(job_id)
src/backend/base/langflow/services/job_queue/service.py:196: and the cleanup timestamp (if any).
src/backend/base/langflow/services/job_queue/service.py:211: async def cleanup_job(self, job_id: str) -> None:
src/backend/base/langflow/services/job_queue/service.py:214: The cleanup process includes:
src/backend/base/langflow/services/job_queue/service.py:224: await logger.adebug(f"No queue found for job_id {job_id} during cleanup.")
src/backend/base/langflow/services/job_queue/service.py:227: await logger.adebug(f"Commencing cleanup for job_id {job_id}")
src/backend/base/langflow/services/job_queue/service.py:252: await logger.adebug(f"Cleanup successful for job_id {job_id}: resources have been released.")
src/backend/base/langflow/services/job_queue/service.py:254: async def _periodic_cleanup(self) -> None:
src/backend/base/langflow/services/job_queue/service.py:259: - Initiates the cleanup of job queues by calling _cleanup_old_queues.
src/backend/base/langflow/services/job_queue/service.py:260: - Monitors and logs any exceptions during the cleanup cycle.
src/backend/base/langflow/services/job_queue/service.py:266: await asyncio.sleep(60) # Sleep for 60 seconds before next cleanup attempt.
src/backend/base/langflow/services/job_queue/service.py:267: await self._cleanup_old_queues()
src/backend/base/langflow/services/job_queue/service.py:269: await logger.adebug("Periodic cleanup task received cancellation signal.")
src/backend/base/langflow/services/job_queue/service.py:272: await logger.aerror(f"Exception encountered during periodic cleanup: {exc}")
src/backend/base/langflow/services/job_queue/service.py:274: async def _cleanup_old_queues(self) -> None:
src/backend/base/langflow/services/job_queue/service.py:279: _, _, task, cleanup_time = self._queues[job_id]
src/backend/base/langflow/services/job_queue/service.py:287: # Check if task should be marked for cleanup
src/backend/base/langflow/services/job_queue/service.py:289: if cleanup_time is None:
src/backend/base/langflow/services/job_queue/service.py:290: # Mark for cleanup by setting the timestamp
src/backend/base/langflow/services/job_queue/service.py:298: f"Job queue for job_id {job_id} marked for cleanup - Task cancelled or failed"
src/backend/base/langflow/services/job_queue/service.py:300: elif current_time - cleanup_time >= self.CLEANUP_GRACE_PERIOD:
src/backend/base/langflow/services/job_queue/service.py:301: # Enough time has passed, perform the actual cleanup
src/backend/base/langflow/services/job_queue/service.py:303: await self.cleanup_job(job_id)
src/backend/base/langflow/services/utils.py:190: # Don't re-raise since this is a cleanup task
src/backend/base/langflow/services/utils.py:217: # Don't re-raise since this is a cleanup task
src/backend/base/langflow/services/utils.py:283: await get_db_service().assign_orphaned_flows_to_superuser()
src/backend/base/langflow/services/utils.py:285: await logger.awarning(f"Error assigning orphaned flows to the superuser: {exc!s}")
src/backend/base/langflow/services/tracing/service.py:295: def _cleanup_inputs(inputs: dict[str, Any]):
src/backend/base/langflow/services/tracing/service.py:316: inputs = self._cleanup_inputs(component_trace_context.inputs)
src/backend/base/langflow/services/tracing/service.py:376: inputs = self._cleanup_inputs(inputs)
src/backend/base/langflow/services/storage/local.py:198: """Perform any cleanup operations when the service is being torn down."""
src/backend/base/langflow/services/storage/s3.py:320: """Perform any cleanup operations when the service is being torn down.
src/backend/base/langflow/services/storage/s3.py:322: For S3, we don't need to do anything as aioboto3 handles cleanup
src/backend/base/langflow/services/task/backends/anyio.py:108: async def cleanup_task(self, task_id: str) -> None:
src/backend/base/langflow/services/task/temp_flow_cleanup.py:19:async def cleanup_orphaned_records() -> None:
src/backend/base/langflow/services/task/temp_flow_cleanup.py:36: # Get distinct orphaned flow IDs from the table
src/backend/base/langflow/services/task/temp_flow_cleanup.py:37: orphaned_flow_ids = (
src/backend/base/langflow/services/task/temp_flow_cleanup.py:43: if orphaned_flow_ids:
src/backend/base/langflow/services/task/temp_flow_cleanup.py:44: logger.debug(f"Found {len(orphaned_flow_ids)} orphaned flow IDs in {table.__name__}")
src/backend/base/langflow/services/task/temp_flow_cleanup.py:46: # Delete all orphaned records in a single query
src/backend/base/langflow/services/task/temp_flow_cleanup.py:47: await session.exec(delete(table).where(col(table.flow_id).in_(orphaned_flow_ids)))
src/backend/base/langflow/services/task/temp_flow_cleanup.py:51: for flow_id in orphaned_flow_ids:
src/backend/base/langflow/services/task/temp_flow_cleanup.py:66: logger.debug(f"Successfully deleted orphaned records from {table.__name__}")
src/backend/base/langflow/services/task/temp_flow_cleanup.py:69: logger.error(f"Error cleaning up orphaned records in {table.__name__}: {exc!s}")
src/backend/base/langflow/services/task/temp_flow_cleanup.py:72:class CleanupWorker:
src/backend/base/langflow/services/task/temp_flow_cleanup.py:78: """Start the cleanup worker."""
src/backend/base/langflow/services/task/temp_flow_cleanup.py:80: await logger.awarning("Cleanup worker is already running")
src/backend/base/langflow/services/task/temp_flow_cleanup.py:84: await logger.adebug("Started database cleanup worker")
src/backend/base/langflow/services/task/temp_flow_cleanup.py:87: """Stop the cleanup worker gracefully."""
src/backend/base/langflow/services/task/temp_flow_cleanup.py:89: await logger.awarning("Cleanup worker is not running")
src/backend/base/langflow/services/task/temp_flow_cleanup.py:92: await logger.adebug("Stopping database cleanup worker...")
src/backend/base/langflow/services/task/temp_flow_cleanup.py:96: await logger.adebug("Database cleanup worker stopped")
src/backend/base/langflow/services/task/temp_flow_cleanup.py:99: """Run the cleanup worker until stopped."""
src/backend/base/langflow/services/task/temp_flow_cleanup.py:103: # Clean up any orphaned records
src/backend/base/langflow/services/task/temp_flow_cleanup.py:104: await cleanup_orphaned_records()
src/backend/base/langflow/services/task/temp_flow_cleanup.py:106: await logger.aerror(f"Error in cleanup worker: {exc!s}")
src/backend/base/langflow/services/task/temp_flow_cleanup.py:110: sleep_task = asyncio.create_task(asyncio.sleep(settings.public_flow_cleanup_interval))
src/backend/base/langflow/services/task/temp_flow_cleanup.py:128: logger.error(f"Error in cleanup worker sleep: {exc!s}")
src/backend/base/langflow/services/task/temp_flow_cleanup.py:134:cleanup_worker = CleanupWorker()
src/backend/base/langflow/main.py:354: await logger.aerror(f"Error during task cleanup: {result}", exc_info=result)
src/backend/base/langflow/main.py:365: temp_dir_cleanups = [asyncio.to_thread(temp_dir.cleanup) for temp_dir in temp_dirs]
src/backend/base/langflow/main.py:367: await asyncio.wait_for(asyncio.gather(*temp_dir_cleanups), timeout=10)
src/backend/base/langflow/main.py:369: await logger.awarning("Temporary file cleanup timed out after 10s.")
src/backend/base/langflow/main.py:385: await logger.aexception(f"Unhandled error during cleanup: {e}")
src/backend/base/langflow/main.py:386: await log_exception_to_telemetry(e, "lifespan_cleanup")
src/backend/base/langflow/initial_setup/starter_projects/Text Sentiment Analysis.json:2410: "value": "\"\"\"Enhanced file component with Docling support and process isolation.\n\nNotes:\n-----\n- ALL Docling parsing/export runs in a separate OS process to prevent memory\n growth and native library state from impacting the main Langflow process.\n- Standard text/structured parsing continues to use existing BaseFileComponent\n utilities (and optional threading via `parallel_load_data`).\n\"\"\"\n\nfrom __future__ import annotations\n\nimport contextlib\nimport json\nimport subprocess\nimport sys\nimport textwrap\nfrom copy import deepcopy\nfrom pathlib import Path\nfrom tempfile import NamedTemporaryFile\nfrom typing import Any\n\nfrom lfx.base.data.base_file import BaseFileComponent\nfrom lfx.base.data.storage_utils import parse_storage_path\nfrom lfx.base.data.utils import TEXT_FILE_TYPES, parallel_load_data, parse_text_file_to_data\nfrom lfx.inputs.inputs import DropdownInput, MessageTextInput, StrInput\nfrom lfx.io import BoolInput, FileInput, IntInput, Output\nfrom lfx.schema.data import Data\nfrom lfx.schema.dataframe import DataFrame # noqa: TC001\nfrom lfx.schema.message import Message\nfrom lfx.services.deps import get_settings_service, get_storage_service\nfrom lfx.utils.async_helpers import run_until_complete\n\n\nclass FileComponent(BaseFileComponent):\n \"\"\"File component with optional Docling processing (isolated in a subprocess).\"\"\"\n\n display_name = \"Read File\"\n description = \"Loads content from one or more files.\"\n documentation: str = \"https://docs.langflow.org/read-file\"\n icon = \"file-text\"\n name = \"File\"\n\n # Extensions that can be processed without Docling (using standard text parsing)\n TEXT_EXTENSIONS = TEXT_FILE_TYPES\n\n # Extensions that require Docling for processing (images, advanced office formats, etc.)\n DOCLING_ONLY_EXTENSIONS = [\n \"adoc\",\n \"asciidoc\",\n \"asc\",\n \"bmp\",\n \"dotx\",\n \"dotm\",\n 
\"docm\",\n \"jpg\",\n \"jpeg\",\n \"png\",\n \"potx\",\n \"ppsx\",\n \"pptm\",\n \"potm\",\n \"ppsm\",\n \"pptx\",\n \"tiff\",\n \"xls\",\n \"xlsx\",\n \"xhtml\",\n \"webp\",\n ]\n\n # Docling-supported/compatible extensions; TEXT_FILE_TYPES are supported by the base loader.\n VALID_EXTENSIONS = [\n *TEXT_EXTENSIONS,\n *DOCLING_ONLY_EXTENSIONS,\n ]\n\n # Fixed export settings used when markdown export is requested.\n EXPORT_FORMAT = \"Markdown\"\n IMAGE_MODE = \"placeholder\"\n\n _base_inputs = deepcopy(BaseFileComponent.get_base_inputs())\n\n for input_item in _base_inputs:\n if isinstance(input_item, FileInput) and input_item.name == \"path\":\n input_item.real_time_refresh = True\n input_item.tool_mode = False # Disable tool mode for file upload input\n input_item.required = False # Make it optional so it doesn't error in tool mode\n break\n\n inputs = [\n *_base_inputs,\n StrInput(\n name=\"file_path_str\",\n display_name=\"File Path\",\n info=(\n \"Path to the file to read. Used when component is called as a tool. \"\n \"If not provided, will use the uploaded file from 'path' input.\"\n ),\n show=False,\n advanced=True,\n tool_mode=True,\n required=False,\n ),\n BoolInput(\n name=\"advanced_mode\",\n display_name=\"Advanced Parser\",\n value=False,\n real_time_refresh=True,\n info=(\n \"Enable advanced document processing and export with Docling for PDFs, images, and office documents. \"\n \"Note that advanced document processing can consume significant resources.\"\n ),\n show=True,\n ),\n DropdownInput(\n name=\"pipeline\",\n display_name=\"Pipeline\",\n info=\"Docling pipeline to use\",\n options=[\"standard\", \"vlm\"],\n value=\"standard\",\n advanced=True,\n real_time_refresh=True,\n ),\n DropdownInput(\n name=\"ocr_engine\",\n display_name=\"OCR Engine\",\n info=\"OCR engine to use. 
Only available when pipeline is set to 'standard'.\",\n options=[\"None\", \"easyocr\"],\n value=\"easyocr\",\n show=False,\n advanced=True,\n ),\n StrInput(\n name=\"md_image_placeholder\",\n display_name=\"Image placeholder\",\n info=\"Specify the image placeholder for markdown exports.\",\n value=\"<!-- image -->\",\n advanced=True,\n show=False,\n ),\n StrInput(\n name=\"md_page_break_placeholder\",\n display_name=\"Page break placeholder\",\n info=\"Add this placeholder between pages in the markdown output.\",\n value=\"\",\n advanced=True,\n show=False,\n ),\n MessageTextInput(\n name=\"doc_key\",\n display_name=\"Doc Key\",\n info=\"The key to use for the DoclingDocument column.\",\n value=\"doc\",\n advanced=True,\n show=False,\n ),\n # Deprecated input retained for backward-compatibility.\n BoolInput(\n name=\"use_multithreading\",\n display_name=\"[Deprecated] Use Multithreading\",\n advanced=True,\n value=True,\n info=\"Set 'Processing Concurrency' greater than 1 to enable multithreading.\",\n ),\n IntInput(\n name=\"concurrency_multithreading\",\n display_name=\"Processing Concurrency\",\n advanced=True,\n info=\"When multiple files are being processed, the number of files to process concurrently.\",\n value=1,\n ),\n BoolInput(\n name=\"markdown\",\n display_name=\"Markdown Export\",\n info=\"Export processed documents to Markdown format. 
Only available when advanced mode is enabled.\",\n value=False,\n show=False,\n ),\n ]\n\n outputs = [\n Output(display_name=\"Raw Content\", name=\"message\", method=\"load_files_message\", tool_mode=True),\n ]\n\n # ------------------------------ UI helpers --------------------------------------\n\n def _path_value(self, template: dict) -> list[str]:\n \"\"\"Return the list of currently selected file paths from the template.\"\"\"\n return template.get(\"path\", {}).get(\"file_path\", [])\n\n def update_build_config(\n self,\n build_config: dict[str, Any],\n field_value: Any,\n field_name: str | None = None,\n ) -> dict[str, Any]:\n \"\"\"Show/hide Advanced Parser and related fields based on selection context.\"\"\"\n if field_name == \"path\":\n paths = self._path_value(build_config)\n\n # If all files can be processed by docling, do so\n allow_advanced = all(not file_path.endswith((\".csv\", \".xlsx\", \".parquet\")) for file_path in paths)\n build_config[\"advanced_mode\"][\"show\"] = allow_advanced\n if not allow_advanced:\n build_config[\"advanced_mode\"][\"value\"] = False\n for f in (\"pipeline\", \"ocr_engine\", \"doc_key\", \"md_image_placeholder\", \"md_page_break_placeholder\"):\n if f in build_config:\n build_config[f][\"show\"] = False\n\n # Docling Processing\n elif field_name == \"advanced_mode\":\n for f in (\"pipeline\", \"ocr_engine\", \"doc_key\", \"md_image_placeholder\", \"md_page_break_placeholder\"):\n if f in build_config:\n build_config[f][\"show\"] = bool(field_value)\n if f == \"pipeline\":\n build_config[f][\"advanced\"] = not bool(field_value)\n\n elif field_name == \"pipeline\":\n if field_value == \"standard\":\n build_config[\"ocr_engine\"][\"show\"] = True\n build_config[\"ocr_engine\"][\"value\"] = \"easyocr\"\n else:\n build_config[\"ocr_engine\"][\"show\"] = False\n build_config[\"ocr_engine\"][\"value\"] = \"None\"\n\n return build_config\n\n def update_outputs(self, frontend_node: dict[str, Any], field_name: str, field_value: 
Any) -> dict[str, Any]: # noqa: ARG002\n \"\"\"Dynamically show outputs based on file count/type and advanced mode.\"\"\"\n if field_name not in [\"path\", \"advanced_mode\", \"pipeline\"]:\n return frontend_node\n\n template = frontend_node.get(\"template\", {})\n paths = self._path_value(template)\n if not paths:\n return frontend_node\n\n frontend_node[\"outputs\"] = []\n if len(paths) == 1:\n file_path = paths[0] if field_name == \"path\" else frontend_node[\"template\"][\"path\"][\"file_path\"][0]\n if file_path.endswith((\".csv\", \".xlsx\", \".parquet\")):\n frontend_node[\"outputs\"].append(\n Output(\n display_name=\"Structured Content\",\n name=\"dataframe\",\n method=\"load_files_structured\",\n tool_mode=True,\n ),\n )\n elif file_path.endswith(\".json\"):\n frontend_node[\"outputs\"].append(\n Output(display_name=\"Structured Content\", name=\"json\", method=\"load_files_json\", tool_mode=True),\n )\n\n advanced_mode = frontend_node.get(\"template\", {}).get(\"advanced_mode\", {}).get(\"value\", False)\n if advanced_mode:\n frontend_node[\"outputs\"].append(\n Output(\n display_name=\"Structured Output\",\n name=\"advanced_dataframe\",\n method=\"load_files_dataframe\",\n tool_mode=True,\n ),\n )\n frontend_node[\"outputs\"].append(\n Output(\n display_name=\"Markdown\", name=\"advanced_markdown\", method=\"load_files_markdown\", tool_mode=True\n ),\n )\n frontend_node[\"outputs\"].append(\n Output(display_name=\"File Path\", name=\"path\", method=\"load_files_path\", tool_mode=True),\n )\n else:\n frontend_node[\"outputs\"].append(\n Output(display_name=\"Raw Content\", name=\"message\", method=\"load_files_message\", tool_mode=True),\n )\n frontend_node[\"outputs\"].append(\n Output(display_name=\"File Path\", name=\"path\", method=\"load_files_path\", tool_mode=True),\n )\n else:\n # Multiple files => DataFrame output; advanced parser disabled\n frontend_node[\"outputs\"].append(\n Output(display_name=\"Files\", name=\"dataframe\", 
method=\"load_files\", tool_mode=True)\n )\n\n return frontend_node\n\n # ------------------------------ Core processing ----------------------------------\n\n def _validate_and_resolve_paths(self) -> list[BaseFileComponent.BaseFile]:\n \"\"\"Override to handle file_path_str input from tool mode.\n\n When called as a tool, the file_path_str parameter can be set.\n If not provided, it will fall back to using the path FileInput (uploaded file).\n Priority:\n 1. file_path_str (if provided by the tool call)\n 2. path (uploaded file from UI)\n \"\"\"\n # Check if file_path_str is provided (from tool mode)\n file_path_str = getattr(self, \"file_path_str\", None)\n if file_path_str:\n # Use the string path from tool mode\n from pathlib import Path\n\n from lfx.schema.data import Data\n\n resolved_path = Path(self.resolve_path(file_path_str))\n if not resolved_path.exists():\n msg = f\"File or directory not found: {file_path_str}\"\n self.log(msg)\n if not self.silent_errors:\n raise ValueError(msg)\n return []\n\n data_obj = Data(data={self.SERVER_FILE_PATH_FIELDNAME: str(resolved_path)})\n return [BaseFileComponent.BaseFile(data_obj, resolved_path, delete_after_processing=False)]\n\n # Otherwise use the default implementation (uses path FileInput)\n return super()._validate_and_resolve_paths()\n\n def _is_docling_compatible(self, file_path: str) -> bool:\n \"\"\"Lightweight extension gate for Docling-compatible types.\"\"\"\n docling_exts = (\n \".adoc\",\n \".asciidoc\",\n \".asc\",\n \".bmp\",\n \".csv\",\n \".dotx\",\n \".dotm\",\n \".docm\",\n \".docx\",\n \".htm\",\n \".html\",\n \".jpg\",\n \".jpeg\",\n \".json\",\n \".md\",\n \".pdf\",\n \".png\",\n \".potx\",\n \".ppsx\",\n \".pptm\",\n \".potm\",\n \".ppsm\",\n \".pptx\",\n \".tiff\",\n \".txt\",\n \".xls\",\n \".xlsx\",\n \".xhtml\",\n \".xml\",\n \".webp\",\n )\n return file_path.lower().endswith(docling_exts)\n\n async def _get_local_file_for_docling(self, file_path: str) -> tuple[str, bool]:\n \"\"\"Get a 
local file path for Docling processing, downloading from S3 if needed.\n\n Args:\n file_path: Either a local path or S3 key (format \"flow_id/filename\")\n\n Returns:\n tuple[str, bool]: (local_path, should_delete) where should_delete indicates\n if this is a temporary file that should be cleaned up\n \"\"\"\n settings = get_settings_service().settings\n if settings.storage_type == \"local\":\n return file_path, False\n\n # S3 storage - download to temp file\n parsed = parse_storage_path(file_path)\n if not parsed:\n msg = f\"Invalid S3 path format: {file_path}. Expected 'flow_id/filename'\"\n raise ValueError(msg)\n\n storage_service = get_storage_service()\n flow_id, filename = parsed\n\n # Get file content from S3\n content = await storage_service.get_file(flow_id, filename)\n\n suffix = Path(filename).suffix\n with NamedTemporaryFile(mode=\"wb\", suffix=suffix, delete=False) as tmp_file:\n tmp_file.write(content)\n temp_path = tmp_file.name\n\n return temp_path, True\n\n def _process_docling_in_subprocess(self, file_path: str) -> Data | None:\n \"\"\"Run Docling in a separate OS process and map the result to a Data object.\n\n We avoid multiprocessing pickling by launching `python -c \"<script>\"` and\n passing JSON config via stdin. 
The child prints a JSON result to stdout.\n\n For S3 storage, the file is downloaded to a temp file first.\n \"\"\"\n if not file_path:\n return None\n\n settings = get_settings_service().settings\n if settings.storage_type == \"s3\":\n local_path, should_delete = run_until_complete(self._get_local_file_for_docling(file_path))\n else:\n local_path = file_path\n should_delete = False\n\n try:\n return self._process_docling_subprocess_impl(local_path, file_path)\n finally:\n # Clean up temp file if we created one\n if should_delete:\n with contextlib.suppress(Exception):\n Path(local_path).unlink() # Ignore cleanup errors\n\n def _process_docling_subprocess_impl(self, local_file_path: str, original_file_path: str) -> Data | None:\n \"\"\"Implementation of Docling subprocess processing.\n\n Args:\n local_file_path: Path to local file to process\n original_file_path: Original file path to include in metadata\n Returns:\n Data object with processed content\n \"\"\"\n args: dict[str, Any] = {\n \"file_path\": local_file_path,\n \"markdown\": bool(self.markdown),\n \"image_mode\": str(self.IMAGE_MODE),\n \"md_image_placeholder\": str(self.md_image_placeholder),\n \"md_page_break_placeholder\": str(self.md_page_break_placeholder),\n \"pipeline\": str(self.pipeline),\n \"ocr_engine\": (\n self.ocr_engine if self.ocr_engine and self.ocr_engine != \"None\" and self.pipeline != \"vlm\" else None\n ),\n }\n\n self.log(f\"Starting Docling subprocess for file: {local_file_path}\")\n self.log(args)\n\n # Child script for isolating the docling processing\n child_script = textwrap.dedent(\n r\"\"\"\n import json, sys\n\n def try_imports():\n try:\n from docling.datamodel.base_models import ConversionStatus, InputFormat # type: ignore\n from docling.document_converter import DocumentConverter # type: ignore\n from docling_core.types.doc import ImageRefMode # type: ignore\n return ConversionStatus, InputFormat, DocumentConverter, ImageRefMode, \"latest\"\n except Exception as e:\n 
raise e\n\n def create_converter(strategy, input_format, DocumentConverter, pipeline, ocr_engine):\n # --- Standard PDF/IMAGE pipeline (your existing behavior), with optional OCR ---\n if pipeline == \"standard\":\n try:\n from docling.datamodel.pipeline_options import PdfPipelineOptions # type: ignore\n from docling.document_converter import PdfFormatOption # type: ignore\n\n pipe = PdfPipelineOptions()\n pipe.do_ocr = False\n\n if ocr_engine:\n try:\n from docling.models.factories import get_ocr_factory # type: ignore\n pipe.do_ocr = True\n fac = get_ocr_factory(allow_external_plugins=False)\n pipe.ocr_options = fac.create_options(kind=ocr_engine)\n except Exception:\n # If OCR setup fails, disable it\n pipe.do_ocr = False\n\n fmt = {}\n if hasattr(input_format, \"PDF\"):\n fmt[getattr(input_format, \"PDF\")] = PdfFormatOption(pipeline_options=pipe)\n if hasattr(input_format, \"IMAGE\"):\n fmt[getattr(input_format, \"IMAGE\")] = PdfFormatOption(pipeline_options=pipe)\n\n return DocumentConverter(format_options=fmt)\n except Exception:\n return DocumentConverter()\n\n # --- Vision-Language Model (VLM) pipeline ---\n if pipeline == \"vlm\":\n try:\n from docling.datamodel.pipeline_options import VlmPipelineOptions\n from docling.datamodel.vlm_model_specs import GRANITEDOCLING_MLX, GRANITEDOCLING_TRANSFORMERS\n from docling.document_converter import PdfFormatOption\n from docling.pipeline.vlm_pipeline import VlmPipeline\n\n vl_pipe = VlmPipelineOptions(\n vlm_options=GRANITEDOCLING_TRANSFORMERS,\n )\n\n if sys.platform == \"darwin\":\n try:\n import mlx_vlm\n vl_pipe.vlm_options = GRANITEDOCLING_MLX\n except ImportError as e:\n raise e\n\n # VLM paths generally don't need OCR; keep OCR off by default here.\n fmt = {}\n if hasattr(input_format, \"PDF\"):\n fmt[getattr(input_format, \"PDF\")] = PdfFormatOption(\n pipeline_cls=VlmPipeline,\n pipeline_options=vl_pipe\n )\n if hasattr(input_format, \"IMAGE\"):\n fmt[getattr(input_format, \"IMAGE\")] = PdfFormatOption(\n 
pipeline_cls=VlmPipeline,\n pipeline_options=vl_pipe\n )\n\n return DocumentConverter(format_options=fmt)\n except Exception as e:\n raise e\n\n # --- Fallback: default converter with no special options ---\n return DocumentConverter()\n\n def export_markdown(document, ImageRefMode, image_mode, img_ph, pg_ph):\n try:\n mode = getattr(ImageRefMode, image_mode.upper(), image_mode)\n return document.export_to_markdown(\n image_mode=mode,\n image_placeholder=img_ph,\n page_break_placeholder=pg_ph,\n )\n except Exception:\n try:\n return document.export_to_text()\n except Exception:\n return str(document)\n\n def to_rows(doc_dict):\n rows = []\n for t in doc_dict.get(\"texts\", []):\n prov = t.get(\"prov\") or []\n page_no = None\n if prov and isinstance(prov, list) and isinstance(prov[0], dict):\n page_no = prov[0].get(\"page_no\")\n rows.append({\n \"page_no\": page_no,\n \"label\": t.get(\"label\"),\n \"text\": t.get(\"text\"),\n \"level\": t.get(\"level\"),\n })\n return rows\n\n def main():\n cfg = json.loads(sys.stdin.read())\n file_path = cfg[\"file_path\"]\n markdown = cfg[\"markdown\"]\n image_mode = cfg[\"image_mode\"]\n img_ph = cfg[\"md_image_placeholder\"]\n pg_ph = cfg[\"md_page_break_placeholder\"]\n pipeline = cfg[\"pipeline\"]\n ocr_engine = cfg.get(\"ocr_engine\")\n meta = {\"file_path\": file_path}\n\n try:\n ConversionStatus, InputFormat, DocumentConverter, ImageRefMode, strategy = try_imports()\n converter = create_converter(strategy, InputFormat, DocumentConverter, pipeline, ocr_engine)\n try:\n res = converter.convert(file_path)\n except Exception as e:\n print(json.dumps({\"ok\": False, \"error\": f\"Docling conversion error: {e}\", \"meta\": meta}))\n return\n\n ok = False\n if hasattr(res, \"status\"):\n try:\n ok = (res.status == ConversionStatus.SUCCESS) or (str(res.status).lower() == \"success\")\n except Exception:\n ok = (str(res.status).lower() == \"success\")\n if not ok and hasattr(res, \"document\"):\n ok = getattr(res, \"document\", 
None) is not None\n if not ok:\n print(json.dumps({\"ok\": False, \"error\": \"Docling conversion failed\", \"meta\": meta}))\n return\n\n doc = getattr(res, \"document\", None)\n if doc is None:\n print(json.dumps({\"ok\": False, \"error\": \"Docling produced no document\", \"meta\": meta}))\n return\n\n if markdown:\n text = export_markdown(doc, ImageRefMode, image_mode, img_ph, pg_ph)\n print(json.dumps({\"ok\": True, \"mode\": \"markdown\", \"text\": text, \"meta\": meta}))\n return\n\n # structured\n try:\n doc_dict = doc.export_to_dict()\n except Exception as e:\n print(json.dumps({\"ok\": False, \"error\": f\"Docling export_to_dict failed: {e}\", \"meta\": meta}))\n return\n\n rows = to_rows(doc_dict)\n print(json.dumps({\"ok\": True, \"mode\": \"structured\", \"doc\": rows, \"meta\": meta}))\n except Exception as e:\n print(\n json.dumps({\n \"ok\": False,\n \"error\": f\"Docling processing error: {e}\",\n \"meta\": {\"file_path\": file_path},\n })\n )\n\n if __name__ == \"__main__\":\n main()\n \"\"\"\n )\n\n # Validate file_path to avoid command injection or unsafe input\n if not isinstance(args[\"file_path\"], str) or any(c in args[\"file_path\"] for c in [\";\", \"|\", \"&\", \"$\", \"`\"]):\n return Data(data={\"error\": \"Unsafe file path detected.\", \"file_path\": args[\"file_path\"]})\n\n proc = subprocess.run( # noqa: S603\n [sys.executable, \"-u\", \"-c\", child_script],\n input=json.dumps(args).encode(\"utf-8\"),\n capture_output=True,\n check=False,\n )\n\n if not proc.stdout:\n err_msg = proc.stderr.decode(\"utf-8\", errors=\"replace\") or \"no output from child process\"\n return Data(data={\"error\": f\"Docling subprocess error: {err_msg}\", \"file_path\": original_file_path})\n\n try:\n result = json.loads(proc.stdout.decode(\"utf-8\"))\n except Exception as e: # noqa: BLE001\n err_msg = proc.stderr.decode(\"utf-8\", errors=\"replace\")\n return Data(\n data={\n \"error\": f\"Invalid JSON from Docling subprocess: {e}. 
stderr={err_msg}\",\n \"file_path\": original_file_path,\n },\n )\n\n if not result.get(\"ok\"):\n return Data(data={\"error\": result.get(\"error\", \"Unknown Docling error\"), **result.get(\"meta\", {})})\n\n meta = result.get(\"meta\", {})\n if result.get(\"mode\") == \"markdown\":\n exported_content = str(result.get(\"text\", \"\"))\n return Data(\n text=exported_content,\n data={\"exported_content\": exported_content, \"export_format\": self.EXPORT_FORMAT, **meta},\n )\n\n rows = list(result.get(\"doc\", []))\n return Data(data={\"doc\": rows, \"export_format\": self.EXPORT_FORMAT, **meta})\n\n def process_files(\n self,\n file_list: list[BaseFileComponent.BaseFile],\n ) -> list[BaseFileComponent.BaseFile]:\n \"\"\"Process input files.\n\n - advanced_mode => Docling in a separate process.\n - Otherwise => standard parsing in current process (optionally threaded).\n \"\"\"\n if not file_list:\n msg = \"No files to process.\"\n raise ValueError(msg)\n\n # Validate that files requiring Docling are only processed when advanced mode is enabled\n if not self.advanced_mode:\n for file in file_list:\n extension = file.path.suffix[1:].lower()\n if extension in self.DOCLING_ONLY_EXTENSIONS:\n msg = (\n f\"File '{file.path.name}' has extension '.{extension}' which requires \"\n f\"Advanced Parser mode. Please enable 'Advanced Parser' to process this file.\"\n )\n self.log(msg)\n raise ValueError(msg)\n\n def process_file_standard(file_path: str, *, silent_errors: bool = False) -> Data | None:\n try:\n return parse_text_file_to_data(file_path, silent_errors=silent_errors)\n except FileNotFoundError as e:\n self.log(f\"File not found: {file_path}. 
Error: {e}\")\n if not silent_errors:\n raise\n return None\n except Exception as e:\n self.log(f\"Unexpected error processing {file_path}: {e}\")\n if not silent_errors:\n raise\n return None\n\n docling_compatible = all(self._is_docling_compatible(str(f.path)) for f in file_list)\n\n # Advanced path: Check if ALL files are compatible with Docling\n if self.advanced_mode and docling_compatible:\n final_return: list[BaseFileComponent.BaseFile] = []\n for file in file_list:\n file_path = str(file.path)\n advanced_data: Data | None = self._process_docling_in_subprocess(file_path)\n\n # --- UNNEST: expand each element in `doc` to its own Data row\n payload = getattr(advanced_data, \"data\", {}) or {}\n doc_rows = payload.get(\"doc\")\n if isinstance(doc_rows, list):\n rows: list[Data | None] = [\n Data(\n data={\n \"file_path\": file_path,\n **(item if isinstance(item, dict) else {\"value\": item}),\n },\n )\n for item in doc_rows\n ]\n final_return.extend(self.rollup_data(file_list, rows))\n else:\n # If not structured, keep as-is (e.g., markdown export or error dict)\n final_return.extend(self.rollup_data(file_list, [advanced_data]))\n return final_return\n\n # Standard multi-file (or single non-advanced) path\n concurrency = 1 if not self.use_multithreading else max(1, self.concurrency_multithreading)\n\n file_paths = [str(f.path) for f in file_list]\n self.log(f\"Starting parallel processing of {len(file_paths)} files with concurrency: {concurrency}.\")\n my_data = parallel_load_data(\n file_paths,\n silent_errors=self.silent_errors,\n load_function=process_file_standard,\n max_concurrency=concurrency,\n )\n return self.rollup_data(file_list, my_data)\n\n # ------------------------------ Output helpers -----------------------------------\n\n def load_files_helper(self) -> DataFrame:\n result = self.load_files()\n\n # Error condition - raise error if no text and an error is present\n if not hasattr(result, \"text\"):\n if hasattr(result, \"error\"):\n raise 
ValueError(result.error[0])\n msg = \"Could not extract content from the provided file(s).\"\n raise ValueError(msg)\n\n return result\n\n def load_files_dataframe(self) -> DataFrame:\n \"\"\"Load files using advanced Docling processing and export to DataFrame format.\"\"\"\n self.markdown = False\n return self.load_files_helper()\n\n def load_files_markdown(self) -> Message:\n \"\"\"Load files using advanced Docling processing and export to Markdown format.\"\"\"\n self.markdown = True\n result = self.load_files_helper()\n return Message(text=str(result.text[0]))\n"
src/backend/base/langflow/initial_setup/starter_projects/Pokédex Agent.json:885: "value": "import json\nimport re\nimport tempfile\nfrom datetime import datetime, timezone\nfrom pathlib import Path\nfrom typing import Any\nfrom urllib.parse import parse_qsl, urlencode, urlparse, urlunparse\n\nimport aiofiles\nimport aiofiles.os as aiofiles_os\nimport httpx\nimport validators\n\nfrom lfx.base.curl.parse import parse_context\nfrom lfx.custom.custom_component.component import Component\nfrom lfx.inputs.inputs import TabInput\nfrom lfx.io import (\n BoolInput,\n DataInput,\n DropdownInput,\n IntInput,\n MessageTextInput,\n MultilineInput,\n Output,\n TableInput,\n)\nfrom lfx.schema.data import Data\nfrom lfx.schema.dotdict import dotdict\nfrom lfx.utils.component_utils import set_current_fields, set_field_advanced, set_field_display\nfrom lfx.utils.ssrf_protection import SSRFProtectionError, validate_url_for_ssrf\n\n# Define fields for each mode\nMODE_FIELDS = {\n \"URL\": [\n \"url_input\",\n \"method\",\n ],\n \"cURL\": [\"curl_input\"],\n}\n\n# Fields that should always be visible\nDEFAULT_FIELDS = [\"mode\"]\n\n\nclass APIRequestComponent(Component):\n display_name = \"API Request\"\n description = \"Make HTTP requests using URL or cURL commands.\"\n documentation: str = \"https://docs.langflow.org/api-request\"\n icon = \"Globe\"\n name = \"APIRequest\"\n\n inputs = [\n MessageTextInput(\n name=\"url_input\",\n display_name=\"URL\",\n info=\"Enter the URL for the request.\",\n advanced=False,\n tool_mode=True,\n ),\n MultilineInput(\n name=\"curl_input\",\n display_name=\"cURL\",\n info=(\n \"Paste a curl command to populate the fields. 
\"\n \"This will fill in the dictionary fields for headers and body.\"\n ),\n real_time_refresh=True,\n tool_mode=True,\n advanced=True,\n show=False,\n ),\n DropdownInput(\n name=\"method\",\n display_name=\"Method\",\n options=[\"GET\", \"POST\", \"PATCH\", \"PUT\", \"DELETE\"],\n value=\"GET\",\n info=\"The HTTP method to use.\",\n real_time_refresh=True,\n ),\n TabInput(\n name=\"mode\",\n display_name=\"Mode\",\n options=[\"URL\", \"cURL\"],\n value=\"URL\",\n info=\"Enable cURL mode to populate fields from a cURL command.\",\n real_time_refresh=True,\n ),\n DataInput(\n name=\"query_params\",\n display_name=\"Query Parameters\",\n info=\"The query parameters to append to the URL.\",\n advanced=True,\n ),\n TableInput(\n name=\"body\",\n display_name=\"Body\",\n info=\"The body to send with the request as a dictionary (for POST, PATCH, PUT).\",\n table_schema=[\n {\n \"name\": \"key\",\n \"display_name\": \"Key\",\n \"type\": \"str\",\n \"description\": \"Parameter name\",\n },\n {\n \"name\": \"value\",\n \"display_name\": \"Value\",\n \"description\": \"Parameter value\",\n },\n ],\n value=[],\n input_types=[\"Data\"],\n advanced=True,\n real_time_refresh=True,\n ),\n TableInput(\n name=\"headers\",\n display_name=\"Headers\",\n info=\"The headers to send with the request\",\n table_schema=[\n {\n \"name\": \"key\",\n \"display_name\": \"Header\",\n \"type\": \"str\",\n \"description\": \"Header name\",\n },\n {\n \"name\": \"value\",\n \"display_name\": \"Value\",\n \"type\": \"str\",\n \"description\": \"Header value\",\n },\n ],\n value=[{\"key\": \"User-Agent\", \"value\": \"Langflow/1.0\"}],\n advanced=True,\n input_types=[\"Data\"],\n real_time_refresh=True,\n ),\n IntInput(\n name=\"timeout\",\n display_name=\"Timeout\",\n value=30,\n info=\"The timeout to use for the request.\",\n advanced=True,\n ),\n BoolInput(\n name=\"follow_redirects\",\n display_name=\"Follow Redirects\",\n value=False,\n info=(\n \"Whether to follow HTTP redirects. 
\"\n \"WARNING: Enabling redirects may allow SSRF bypass attacks where a public URL \"\n \"redirects to internal resources. Only enable if you trust the target server. \"\n \"See OWASP SSRF Prevention Cheat Sheet for details.\"\n ),\n advanced=True,\n ),\n BoolInput(\n name=\"save_to_file\",\n display_name=\"Save to File\",\n value=False,\n info=\"Save the API response to a temporary file\",\n advanced=True,\n ),\n BoolInput(\n name=\"include_httpx_metadata\",\n display_name=\"Include HTTPx Metadata\",\n value=False,\n info=(\n \"Include properties such as headers, status_code, response_headers, \"\n \"and redirection_history in the output.\"\n ),\n advanced=True,\n
... [stdout truncated: output exceeded 50000 characters (304271 total). Please modify your command to reduce output size, for example by:
- Using grep/rg to filter results
- Using head/tail to limit lines
- Using --max-count flag with grep/rg
- Processing output with awk/sed to extract only needed information]
</shell_result>
Verify transaction handling when creating API keys - API key orphaning risk confirmed.
The review comment identifies a legitimate transaction handling concern. Investigation confirms:
- API Key Creation: `create_api_key()` is called in `_ensure_mcp_server_config()` (line 403) without an immediate commit.
- Explicit Commit: Line 135 calls `session.commit()` only when `created_api_key` is True, persisting the API key to the database.
- Unprotected Config Upload: Immediately after (lines 136-142), `upload_server_config()` is called without any try/except wrapper. This function delegates to `upload_user_file()`, which can raise `HTTPException` at multiple points:
  - Storage errors (FileNotFoundError, PermissionError)
  - Database insert failures
  - All occurring after the session commit at line 135
- No Cleanup Mechanism: No orphaned API key cleanup logic was found in the codebase.
The partial state is real: If upload_server_config() fails after the commit, an API key exists in the database but the MCP server configuration doesn't reference it, creating an orphaned resource.
Recommendation: Wrap the upload_server_config() call in a try/except block and implement a rollback mechanism (e.g., delete the created API key if config upload fails) to maintain transaction consistency.
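The create-then-rollback pattern recommended above can be sketched in isolation. This is a minimal, self-contained illustration; `ensure_config_with_key`, `UploadError`, and the dict-backed store are hypothetical stand-ins for the real `create_api_key()`/`upload_server_config()` helpers, not Langflow code:

```python
class UploadError(Exception):
    """Stands in for the HTTPException raised by upload_user_file()."""


def ensure_config_with_key(store: dict, *, upload_ok: bool) -> bool:
    """Create an API key, then upload the config; delete the key if the upload fails."""
    key_id = "key-123"
    store[key_id] = "secret"  # stands in for create_api_key() + session.commit()
    try:
        if not upload_ok:
            raise UploadError("storage unavailable")  # simulated upload_server_config() failure
    except UploadError:
        store.pop(key_id, None)  # rollback: remove the now-orphaned API key
        return False
    return True
```

On the failure path the key is removed again, so no orphaned resource survives the partial state described above.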
| @pytest.fixture | ||
| def mock_sse_transport(): | ||
| with patch("langflow.api.v1.mcp.sse") as mock: | ||
| mock.connect_sse = AsyncMock() | ||
| mock.handle_post_message = AsyncMock() | ||
| def mock_streamable_http_manager(): | ||
| """Mock the StreamableHTTPSessionManager.""" | ||
| with patch("langflow.api.v1.mcp._streamable_http_manager") as mock: | ||
| mock.handle_request = AsyncMock() | ||
| yield mock | ||
| @pytest.fixture | ||
| def mock_ensure_manager_running(): | ||
| """Mock the _ensure_streamable_http_manager_running function.""" | ||
| with patch("langflow.api.v1.mcp._ensure_streamable_http_manager_running") as mock: | ||
| mock.return_value = AsyncMock() | ||
| yield mock |
Fix Ruff ARG001 by marking the ensure-manager fixture argument as intentionally unused
mock_ensure_manager_running is required to activate the patch on _ensure_streamable_http_manager_running, but the test bodies never reference the argument, triggering Ruff ARG001.
You can keep the fixture behavior and satisfy Ruff by prefixing the parameter name with an underscore:
-async def test_mcp_post_endpoint(
- client: AsyncClient, logged_in_headers, mock_streamable_http_manager, mock_ensure_manager_running
-):
+async def test_mcp_post_endpoint(
+ client: AsyncClient, logged_in_headers, mock_streamable_http_manager, _mock_ensure_manager_running
+):
-async def test_mcp_post_endpoint_disconnect_error(
- client: AsyncClient, logged_in_headers, mock_streamable_http_manager, mock_ensure_manager_running
-):
+async def test_mcp_post_endpoint_disconnect_error(
+ client: AsyncClient, logged_in_headers, mock_streamable_http_manager, _mock_ensure_manager_running
+):
-async def test_mcp_post_endpoint_server_error(
- client: AsyncClient, logged_in_headers, mock_streamable_http_manager, mock_ensure_manager_running
-):
+async def test_mcp_post_endpoint_server_error(
+ client: AsyncClient, logged_in_headers, mock_streamable_http_manager, _mock_ensure_manager_running
+):

This keeps the fixture execution while silencing the style errors.
Also applies to: 82-99, 108-128
🤖 Prompt for AI Agents
In src/backend/tests/unit/api/v1/test_mcp.py around lines 34 to 47 (and
similarly at 82-99 and 108-128), the pytest fixture parameter name
mock_ensure_manager_running is unused in test functions which triggers Ruff
ARG001; rename the fixture parameter to start with an underscore (e.g.,
_mock_ensure_manager_running) in the test function signatures so the fixture
still executes (activating the patch) while signalling the argument is
intentionally unused and silencing the linter, updating all affected test
function parameters accordingly.
Actionable comments posted: 3
♻️ Duplicate comments (4)
src/backend/base/langflow/api/v2/mcp.py (1)
132-142: Transaction handling risk: API key may be orphaned if config upload fails.

This concern was raised in a previous review and remains unaddressed. If `session.commit()` succeeds at line 135 but `upload_server_config()` fails (due to storage errors, permission issues, etc.), an API key will exist in the database without being referenced in the MCP server configuration.

Consider wrapping the config upload in a try/except and implementing cleanup if it fails:

```diff
 if servers_updated:
     servers["mcpServers"] = mcp_servers
     if created_api_key:
         await session.commit()
-    await upload_server_config(
-        servers,
-        current_user,
-        session,
-        storage_service=storage_service,
-        settings_service=settings_service,
-    )
+    try:
+        await upload_server_config(
+            servers,
+            current_user,
+            session,
+            storage_service=storage_service,
+            settings_service=settings_service,
+        )
+    except Exception:
+        # Consider logging and/or cleanup of orphaned API keys
+        raise
```

src/backend/tests/unit/api/v1/test_mcp.py (1)
82-91: Prefix unused fixture parameters with an underscore to satisfy Ruff ARG001.

The `mock_ensure_manager_running` fixture is required to activate the patch but is never referenced in the test bodies. Rename it to `_mock_ensure_manager_running` in the test signatures to indicate intentional non-use.

```diff
 async def test_mcp_post_endpoint(
-    client: AsyncClient, logged_in_headers, mock_streamable_http_manager, mock_ensure_manager_running
+    client: AsyncClient, logged_in_headers, mock_streamable_http_manager, _mock_ensure_manager_running
 ):
 async def test_mcp_post_endpoint_disconnect_error(
-    client: AsyncClient, logged_in_headers, mock_streamable_http_manager, mock_ensure_manager_running
+    client: AsyncClient, logged_in_headers, mock_streamable_http_manager, _mock_ensure_manager_running
 ):
 async def test_mcp_post_endpoint_server_error(
-    client: AsyncClient, logged_in_headers, mock_streamable_http_manager, mock_ensure_manager_running
+    client: AsyncClient, logged_in_headers, mock_streamable_http_manager, _mock_ensure_manager_running
 ):
```

Also applies to: 108-117, 120-128
src/backend/base/langflow/api/v1/mcp.py (1)
148-148: `request._send` relies on a private Starlette API. This uses an undocumented internal attribute without stability guarantees. Consider refactoring to a native ASGI handler or pinning the Starlette version with documentation.
src/backend/base/langflow/api/v1/mcp_projects.py (1)
337-337: Same private API concern with `request._send`. This is the same issue as in `mcp.py` line 148: relying on undocumented Starlette internals.
🧹 Nitpick comments (11)
src/backend/base/langflow/api/v1/schemas.py (1)
463-471: ComposerUrlResponse shape looks appropriate; consider clarifying semantics in the docstring. The unified response model for streamable/legacy URLs is well-structured and matches how the frontend consumes it. As a minor improvement, you could expand the docstring (or field descriptions) to clarify when `streamable_http_url` vs `legacy_sse_url` are expected to be populated, and whether `error_message` implies the URLs may be `None`. This will help future callers reason about the state machine.

src/frontend/src/types/mcp/index.ts (1)
48-55: `ComposerUrlResponseType` aligns with the backend; maybe annotate legacy `sse_url` usage. The type cleanly mirrors the backend `ComposerUrlResponse` while keeping `sse_url` optional for backward compatibility. To avoid confusion later, consider adding a brief comment that `sse_url` is a legacy/compat field and that new backends should prefer `streamable_http_url`/`legacy_sse_url`.

src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts (1)
110-123: URL selection logic correctly prefers streamable HTTP; consider gating the legacy URL on `useComposer` too. The `composerConnection` memo plus `customGetMCPUrl(projectId, composerConnection)` nicely centralize URL selection and give priority to `streamable_http_url` while still handling legacy `sse_url`/`legacy_sse_url`.

One nuance: `customGetMCPUrl` will return `legacySseUrl` whenever it is set, regardless of `useComposer`, whereas the primary URL path is gated by `useComposer`. Given that `composerConnection` currently only sources `legacySseUrl` from composer metadata, you might optionally tighten this to:
- Either only pass `legacySseUrl` when `useComposer` is true, or
- Gate the `legacySseUrl` branch in `customGetMCPUrl` on `useComposer` as well.

That would make it impossible to hit a composer endpoint when the project has composer explicitly disabled, without relying on the backend to clear legacy URLs.
src/backend/tests/unit/api/utils/test_config_utils.py (1)
136-140: Validation tests updated correctly for streamable HTTP transportSwitching from SSE URLs to
streamable_http_urland"--transport", "streamablehttp"keeps these tests aligned with the new MCP transport and endpoint shape. Building the URL fromclient.base_urlinstead of a hardcoded host is also good for environment independence (and matches prior guidance about avoiding rawlocalhostin MCP configs, based on learnings).If you want to further reduce duplication, you could add a small helper or a dedicated test around
get_project_streamable_http_urland reuse it here, but that’s optional.Based on learnings, dynamic URL construction here is in line with previous MCP configuration guidance.
Also applies to: 174-179, 214-219
src/frontend/src/customization/utils/custom-mcp-url.ts (1)
3-7: URL helper correctly prioritizes streamable HTTP with sane fallbacksThe
ComposerConnectionOptionsstruct plus the updatedcustomGetMCPUrlgive a clear precedence order: streamable HTTP (when composer is in use), then legacy SSE, then the per-project API URL built fromapi.defaults.baseURL/window.location.origin—which also avoids hardcoded localhost, matching previous MCP URL guidance.One minor nit: since callers may occasionally pass an SSE endpoint into
streamableHttpUrlfor compatibility, consider either documenting that this field can hold “primary MCP endpoint (streamable or SSE)” or renaming it to something slightly more transport-agnostic in the future. Not blocking.Based on learnings, the dynamic host selection here aligns well with prior recommendations against hardcoded localhost MCP URLs.
Also applies to: 11-21, 24-25
src/backend/base/langflow/api/v2/mcp.py (1)
388-400: Consider simplifying the API key generation condition. The nested conditionals for `should_generate_api_key` are difficult to follow. Consider consolidating them, or adding a comment explaining the logic:
- MCP Composer enabled + project has apikey auth → generate
- MCP Composer disabled + project has apikey auth → generate
- MCP Composer disabled + no project auth + AUTO_LOGIN disabled → generate
- AUTO_LOGIN enabled without SUPERUSER → generate
src/backend/tests/unit/api/v1/test_mcp_projects.py (1)
570-596: Consider renaming the test function for consistency with the transport migration. The test function is still named `test_project_sse_creation` but now tests MCP server creation with streamable HTTP transport. Consider renaming it for clarity:

```diff
-async def test_project_sse_creation(user_test_project):
-    """Test that MCP server is correctly created for a project."""
+async def test_project_mcp_server_creation(user_test_project):
+    """Test that MCP server is correctly created for a project."""
```

src/backend/base/langflow/api/utils/mcp/config_utils.py (1)
256-265: Validate the `composer_port` type before conversion. Line 265 has a type ignore because `composer_port` may be a string or int. Consider explicit conversion and validation earlier:

```diff
 async def get_composer_streamable_http_url(project: Folder) -> str:
     """Generate Streamable HTTP URL for the MCP Composer instance."""
     auth_config = await _get_mcp_composer_auth_config(project)
     composer_host = auth_config.get("oauth_host")
     composer_port = auth_config.get("oauth_port")
     if not composer_host or not composer_port:
         error_msg = "OAuth host and port are required to get the MCP Composer URL"
         raise ValueError(error_msg)
+    try:
+        port_int = int(composer_port)
+    except (ValueError, TypeError) as e:
+        error_msg = f"Invalid OAuth port value: {composer_port}"
+        raise ValueError(error_msg) from e
     composer_url = f"http://{composer_host}:{composer_port}"
-    return await get_url_by_os(composer_host, int(composer_port), composer_url)  # type: ignore[arg-type]
+    return await get_url_by_os(composer_host, port_int, composer_url)
```

src/backend/tests/unit/api/v1/test_mcp.py (1)
42-47: Consider using `AsyncMock` directly for the async function mock. The patched function `_ensure_streamable_http_manager_running` is async and returns `None`. While the current approach works, using `AsyncMock()` directly is more idiomatic for mocking async functions.

```diff
 @pytest.fixture
 def mock_ensure_manager_running():
     """Mock the _ensure_streamable_http_manager_running function."""
-    with patch("langflow.api.v1.mcp._ensure_streamable_http_manager_running") as mock:
-        mock.return_value = AsyncMock()
+    with patch("langflow.api.v1.mcp._ensure_streamable_http_manager_running", new_callable=AsyncMock) as mock:
         yield mock
```

src/backend/base/langflow/api/v1/mcp.py (2)
81-103: Consider encapsulating manager state in a dataclass or namespace. Ruff PLW0603 flags the `global` statements. While the double-checked locking pattern is correct, encapsulating the state would improve maintainability and silence the linter.

```python
from dataclasses import dataclass, field


@dataclass
class _StreamableHTTPManagerState:
    manager: StreamableHTTPSessionManager | None = None
    lock: asyncio.Lock = field(default_factory=asyncio.Lock)
    started: bool = False
    stack: AsyncExitStack | None = None


_manager_state = _StreamableHTTPManagerState()


async def _ensure_streamable_http_manager_running() -> None:
    if _manager_state.started:
        return
    async with _manager_state.lock:
        if _manager_state.started:
            return
        _manager_state.manager = StreamableHTTPSessionManager(server)
        _manager_state.stack = AsyncExitStack()
        await _manager_state.stack.enter_async_context(_manager_state.manager.run())
        _manager_state.started = True
        await logger.adebug("Streamable HTTP session manager started for global MCP server")
```
106-123: `@router.on_event` is deprecated in FastAPI; migrate to lifespan context managers for forward compatibility.

FastAPI deprecated `on_event` in favor of lifespan event handlers. Router-level events can be migrated to the app-level lifespan context (as already demonstrated in `main.py`), or refactored into separate lifecycle management. Since your project already uses the lifespan pattern, this refactor aligns with existing practices.
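As a rough illustration of the suggested shape, a single lifespan context can own the session manager's startup and shutdown. This is a sketch under stated assumptions: `FakeSessionManager` is a stand-in for `StreamableHTTPSessionManager`, and the names here are illustrative, not Langflow's actual code.

```python
# Sketch: moving router-level startup/shutdown hooks into a lifespan context.
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager

events: list[str] = []

class FakeSessionManager:
    """Stand-in for StreamableHTTPSessionManager (assumption, not the real API)."""
    @asynccontextmanager
    async def run(self):
        events.append("manager started")      # what on_event("startup") did
        try:
            yield self
        finally:
            events.append("manager stopped")  # what on_event("shutdown") did

@asynccontextmanager
async def lifespan(app):
    # With FastAPI this would be passed as FastAPI(lifespan=lifespan),
    # and the manager stored on app.state instead of a local.
    async with AsyncExitStack() as stack:
        manager = FakeSessionManager()
        await stack.enter_async_context(manager.run())
        yield
        # AsyncExitStack unwinds manager.run() here, on shutdown.

async def main() -> list[str]:
    async with lifespan(app=None):
        events.append("serving requests")
    return events

result = asyncio.run(main())
```

The `AsyncExitStack` guarantees the manager's `run()` context is exited exactly once on shutdown, even if startup later grows more resources.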
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (18)
- src/backend/base/langflow/api/utils/mcp/__init__.py (1 hunks)
- src/backend/base/langflow/api/utils/mcp/config_utils.py (6 hunks)
- src/backend/base/langflow/api/v1/mcp.py (3 hunks)
- src/backend/base/langflow/api/v1/mcp_projects.py (25 hunks)
- src/backend/base/langflow/api/v1/schemas.py (1 hunks)
- src/backend/base/langflow/api/v2/mcp.py (4 hunks)
- src/backend/base/langflow/initial_setup/starter_projects/News Aggregator.json (1 hunks)
- src/backend/tests/unit/api/utils/test_config_utils.py (5 hunks)
- src/backend/tests/unit/api/v1/test_mcp.py (3 hunks)
- src/backend/tests/unit/api/v1/test_mcp_projects.py (8 hunks)
- src/frontend/src/controllers/API/queries/mcp/use-get-composer-url.ts (1 hunks)
- src/frontend/src/controllers/API/queries/mcp/use-patch-flows-mcp.ts (2 hunks)
- src/frontend/src/customization/utils/custom-mcp-url.ts (1 hunks)
- src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts (1 hunks)
- src/frontend/src/types/mcp/index.ts (1 hunks)
- src/lfx/src/lfx/services/mcp_composer/service.py (9 hunks)
- src/lfx/tests/unit/services/settings/test_mcp_composer.py (4 hunks)
- src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py (5 hunks)
🧰 Additional context used
📓 Path-based instructions (8)
src/frontend/src/**/*.{ts,tsx}
📄 CodeRabbit inference engine (.cursor/rules/frontend_development.mdc)
src/frontend/src/**/*.{ts,tsx}: Use React 18 with TypeScript for frontend development
Use Zustand for state management
Files:
- src/frontend/src/controllers/API/queries/mcp/use-patch-flows-mcp.ts
- src/frontend/src/types/mcp/index.ts
- src/frontend/src/customization/utils/custom-mcp-url.ts
- src/frontend/src/controllers/API/queries/mcp/use-get-composer-url.ts
- src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts
src/backend/**/*.py
📄 CodeRabbit inference engine (.cursor/rules/backend_development.mdc)
src/backend/**/*.py: Use FastAPI async patterns with `await` for async operations in component execution methods
Use `asyncio.create_task()` for background tasks and implement proper cleanup with try/except for `asyncio.CancelledError`
Use `queue.put_nowait()` for non-blocking queue operations and `asyncio.wait_for()` with timeouts for controlled get operations
Files:
- src/backend/base/langflow/api/v2/mcp.py
- src/backend/tests/unit/api/v1/test_mcp.py
- src/backend/base/langflow/api/utils/mcp/config_utils.py
- src/backend/tests/unit/api/utils/test_config_utils.py
- src/backend/base/langflow/api/v1/mcp.py
- src/backend/base/langflow/api/utils/mcp/__init__.py
- src/backend/base/langflow/api/v1/schemas.py
- src/backend/base/langflow/api/v1/mcp_projects.py
- src/backend/tests/unit/api/v1/test_mcp_projects.py
src/backend/base/langflow/api/**/*.py
📄 CodeRabbit inference engine (.cursor/rules/backend_development.mdc)
Backend API endpoints should be organized by version (v1/, v2/) under `src/backend/base/langflow/api/` with specific modules for features (chat.py, flows.py, users.py, etc.)
Files:
- src/backend/base/langflow/api/v2/mcp.py
- src/backend/base/langflow/api/utils/mcp/config_utils.py
- src/backend/base/langflow/api/v1/mcp.py
- src/backend/base/langflow/api/utils/mcp/__init__.py
- src/backend/base/langflow/api/v1/schemas.py
- src/backend/base/langflow/api/v1/mcp_projects.py
src/backend/tests/**/*.py
📄 CodeRabbit inference engine (.cursor/rules/testing.mdc)
src/backend/tests/**/*.py: Place backend unit tests in the `src/backend/tests/` directory, component tests in `src/backend/tests/unit/components/` organized by component subdirectory, and integration tests accessible via `make integration_tests`
Use same filename as component with appropriate test prefix/suffix (e.g., `my_component.py` → `test_my_component.py`)
Use the `client` fixture (FastAPI Test Client) defined in `src/backend/tests/conftest.py` for API tests; it provides an async `httpx.AsyncClient` with automatic in-memory SQLite database and mocked environment variables. Skip client creation by marking test with `@pytest.mark.noclient`
Inherit from the correct `ComponentTestBase` family class located in `src/backend/tests/base.py` based on API access needs: `ComponentTestBase` (no API), `ComponentTestBaseWithClient` (needs API), or `ComponentTestBaseWithoutClient` (pure logic). Provide three required fixtures: `component_class`, `default_kwargs`, and `file_names_mapping`
Create comprehensive unit tests for all new backend components. If unit tests are incomplete, create a corresponding Markdown file documenting manual testing steps and expected outcomes
Test both sync and async code paths, mock external dependencies appropriately, test error handling and edge cases, validate input/output behavior, and test component initialization and configuration
Use `@pytest.mark.asyncio` decorator for async component tests and ensure async methods are properly awaited
Test background tasks using `asyncio.create_task()` and verify completion with `asyncio.wait_for()` with appropriate timeout constraints
Test queue operations using non-blocking `queue.put_nowait()` and `asyncio.wait_for(queue.get(), timeout=...)` to verify queue processing without blocking
Use `@pytest.mark.no_blockbuster` marker to skip the blockbuster plugin in specific tests
For database tests that may fail in batch runs, run them sequentially using `uv run pytest src/backend/tests/unit/test_database.py` r...
Files:
- src/backend/tests/unit/api/v1/test_mcp.py
- src/backend/tests/unit/api/utils/test_config_utils.py
- src/backend/tests/unit/api/v1/test_mcp_projects.py
**/{test_*.py,*.test.ts,*.test.tsx}
📄 CodeRabbit inference engine (coderabbit-custom-pre-merge-checks-unique-id-file-non-traceable-F7F2B60C-1728-4C9A-8889-4F2235E186CA.txt)
Check that test files follow the project's naming conventions (test_*.py for backend, *.test.ts for frontend)
Files:
- src/backend/tests/unit/api/v1/test_mcp.py
- src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py
- src/lfx/tests/unit/services/settings/test_mcp_composer.py
- src/backend/tests/unit/api/utils/test_config_utils.py
- src/backend/tests/unit/api/v1/test_mcp_projects.py
**/test_*.py
📄 CodeRabbit inference engine (coderabbit-custom-pre-merge-checks-unique-id-file-non-traceable-F7F2B60C-1728-4C9A-8889-4F2235E186CA.txt)
**/test_*.py: Backend tests should follow pytest structure with proper test_*.py naming
For async functions, ensure proper async testing patterns are used with pytest for backend
Files:
- src/backend/tests/unit/api/v1/test_mcp.py
- src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py
- src/lfx/tests/unit/services/settings/test_mcp_composer.py
- src/backend/tests/unit/api/utils/test_config_utils.py
- src/backend/tests/unit/api/v1/test_mcp_projects.py
src/frontend/src/**/{hooks,services}/**/*.{ts,tsx}
📄 CodeRabbit inference engine (.cursor/rules/frontend_development.mdc)
Use the useApi hook for API calls with proper error handling, including loading state, error state, and exception throwing
Files:
src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts
src/frontend/src/**/{services,hooks}/**/*.{ts,tsx}
📄 CodeRabbit inference engine (.cursor/rules/frontend_development.mdc)
Use async/await pattern for API calls instead of .then() chains
Files:
src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts
🧠 Learnings (11)
📓 Common learnings
Learnt from: deon-sanchez
Repo: langflow-ai/langflow PR: 9158
File: src/backend/base/langflow/api/v1/mcp_projects.py:404-404
Timestamp: 2025-07-23T21:19:22.567Z
Learning: In langflow MCP projects configuration, prefer using dynamically computed URLs (like the `sse_url` variable) over hardcoded localhost URLs to ensure compatibility across different deployment environments.
📚 Learning: 2025-07-23T21:19:22.567Z
Learnt from: deon-sanchez
Repo: langflow-ai/langflow PR: 9158
File: src/backend/base/langflow/api/v1/mcp_projects.py:404-404
Timestamp: 2025-07-23T21:19:22.567Z
Learning: In langflow MCP projects configuration, prefer using dynamically computed URLs (like the `sse_url` variable) over hardcoded localhost URLs to ensure compatibility across different deployment environments.
Applied to files:
- src/frontend/src/customization/utils/custom-mcp-url.ts
- src/backend/base/langflow/api/v2/mcp.py
- src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py
- src/backend/base/langflow/api/utils/mcp/config_utils.py
- src/lfx/tests/unit/services/settings/test_mcp_composer.py
- src/backend/tests/unit/api/utils/test_config_utils.py
- src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts
- src/lfx/src/lfx/services/mcp_composer/service.py
- src/backend/base/langflow/api/v1/mcp.py
- src/backend/base/langflow/api/utils/mcp/__init__.py
- src/backend/base/langflow/api/v1/mcp_projects.py
- src/backend/tests/unit/api/v1/test_mcp_projects.py
📚 Learning: 2025-11-24T19:46:09.074Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/backend_development.mdc:0-0
Timestamp: 2025-11-24T19:46:09.074Z
Learning: Applies to src/backend/base/langflow/services/database/models/**/*.py : Database models should be organized by domain (api_key/, flow/, folder/, user/, etc.) under `src/backend/base/langflow/services/database/models/`
Applied to files:
src/backend/base/langflow/api/v2/mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Use `monkeypatch` fixture to mock internal functions for testing error handling scenarios; validate error status codes and error message content in responses
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Test both sync and async code paths, mock external dependencies appropriately, test error handling and edge cases, validate input/output behavior, and test component initialization and configuration
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Use `pytest.mark.api_key_required` and `pytest.mark.no_blockbuster` markers for components that need external APIs; use `MockLanguageModel` from `tests.unit.mock_language_model` for testing without external API keys
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Use `aiofiles` and `anyio.Path` for async file operations in tests; create temporary test files using `tmp_path` fixture and verify file existence and content
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Use async fixtures with proper cleanup using try/finally blocks to ensure resources are properly released after tests complete
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Test component versioning and backward compatibility using `file_names_mapping` fixture with `VersionComponentMapping` objects mapping component files across Langflow versions
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Test webhook endpoints by posting to `api/v1/webhook/{endpoint_name}` with appropriate payloads and validating response status codes
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:46:09.074Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/backend_development.mdc:0-0
Timestamp: 2025-11-24T19:46:09.074Z
Learning: Applies to src/backend/**/*.py : Use FastAPI async patterns with `await` for async operations in component execution methods
Applied to files:
src/backend/base/langflow/api/v1/mcp.py
🧬 Code graph analysis (8)
src/frontend/src/controllers/API/queries/mcp/use-patch-flows-mcp.ts (1)
src/frontend/src/types/mcp/index.ts (1)
ComposerUrlResponseType(48-55)
src/frontend/src/customization/utils/custom-mcp-url.ts (1)
src/frontend/src/controllers/API/api.tsx (1)
api(371-371)
src/backend/base/langflow/api/v2/mcp.py (2)
src/backend/base/langflow/services/database/models/folder/model.py (1)
Folder (21-36)
src/backend/base/langflow/services/database/models/api_key/crud.py (1)
create_api_key(24-40)
src/frontend/src/controllers/API/queries/mcp/use-get-composer-url.ts (2)
src/frontend/src/types/api/index.ts (1)
useQueryFunctionType (266-277)
src/frontend/src/types/mcp/index.ts (1)
ComposerUrlResponseType(48-55)
src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts (1)
src/frontend/src/customization/utils/custom-mcp-url.ts (1)
customGetMCPUrl(9-26)
src/backend/base/langflow/api/utils/mcp/__init__.py (1)
src/backend/base/langflow/api/utils/mcp/config_utils.py (4)
get_composer_streamable_http_url (256-265)
get_project_sse_url (233-238)
get_project_streamable_http_url (225-230)
get_url_by_os (181-204)
src/backend/base/langflow/api/v1/mcp_projects.py (2)
src/backend/base/langflow/api/utils/mcp/config_utils.py (5)
get_composer_streamable_http_url (256-265)
get_project_sse_url (233-238)
get_project_streamable_http_url (225-230)
get_url_by_os (181-204)
_get_mcp_composer_auth_config (241-253)
src/backend/base/langflow/api/v1/schemas.py (1)
ComposerUrlResponse(463-470)
src/backend/tests/unit/api/v1/test_mcp_projects.py (2)
src/backend/base/langflow/api/v1/mcp_projects.py (2)
ProjectMCPServer (1084-1151)
get_project_mcp_server (1158-1167)
src/backend/tests/unit/api/v1/test_mcp.py (1)
mock_streamable_http_manager(35-39)
🪛 GitHub Actions: CI
src/backend/base/langflow/api/v1/mcp.py
[error] 148-148: Mypy error: 'None' of 'Any | None' has no attribute 'handle_request' [union-attr].
🪛 GitHub Actions: Ruff Style Check
src/backend/base/langflow/api/v1/mcp.py
[warning] 89-89: ruff PLW0603: Using the global statement to update _streamable_http_manager_started is discouraged.
🪛 GitHub Check: Ruff Style Check (3.13)
src/backend/tests/unit/api/v1/test_mcp.py
[failure] 83-83: Ruff (ARG001)
src/backend/tests/unit/api/v1/test_mcp.py:83:75: ARG001 Unused function argument: mock_ensure_manager_running
[failure] 121-121: Ruff (ARG001)
src/backend/tests/unit/api/v1/test_mcp.py:121:75: ARG001 Unused function argument: mock_ensure_manager_running
[failure] 109-109: Ruff (ARG001)
src/backend/tests/unit/api/v1/test_mcp.py:109:75: ARG001 Unused function argument: mock_ensure_manager_running
src/backend/base/langflow/api/v1/mcp.py
[failure] 113-113: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:113:78: PLW0603 Using the global statement to update _streamable_http_manager is discouraged
[failure] 113-113: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:113:46: PLW0603 Using the global statement to update _streamable_http_manager_stack is discouraged
[failure] 113-113: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:113:12: PLW0603 Using the global statement to update _streamable_http_manager_started is discouraged
[failure] 89-89: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:89:78: PLW0603 Using the global statement to update _streamable_http_manager is discouraged
[failure] 89-89: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:89:46: PLW0603 Using the global statement to update _streamable_http_manager_stack is discouraged
[failure] 89-89: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:89:12: PLW0603 Using the global statement to update _streamable_http_manager_started is discouraged
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Optimize new Python code in this PR
🔇 Additional comments (20)
src/backend/base/langflow/initial_setup/starter_projects/News Aggregator.json (1)
1927-1929: Verify the intentionality of the google dependency version downgrade.

Line 1928 downgrades the google dependency from `0.8.5` to `0.6.15`. This is unusual—version downgrades typically require explicit justification. Please clarify:
- Is this downgrade intentional and necessary for the MCP transport migration?
- Are there any breaking changes or compatibility concerns between these versions?
- Does this change relate to the PR's stated objectives, or is it incidental?
If this downgrade is unintended, consider reverting to `0.8.5` or the latest stable version.

src/backend/tests/unit/api/utils/test_config_utils.py (1)
465-471: Multi-user server access tests remain valid under streamable HTTP

The updated `config_one`, `config_two`, and `updated_config_one` using `"--transport", "streamablehttp"` and distinct URLs keep the conflict/isolation semantics intact while matching the new transport. Assertions still clearly verify per-user separation of configs.

Also applies to: 501-504
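For reference, a server entry of the rough shape these configs exercise might look like the following sketch. The command, server name, and URL are illustrative assumptions; only the `--transport streamablehttp` flag is taken from the tests themselves.

```python
# Illustrative MCP server config entry using the streamable HTTP transport.
# The "uvx mcp-proxy" command and server name are assumptions for this sketch.
config_one = {
    "mcpServers": {
        "lf-example_project": {
            "command": "uvx",
            "args": [
                "mcp-proxy",
                "--transport",
                "streamablehttp",
                "http://localhost:7860/api/v1/mcp/project/<project-id>/",
            ],
        }
    }
}

args = config_one["mcpServers"]["lf-example_project"]["args"]
```

Note that the transport flag and its value travel as two separate list items, which is what the tests' `"--transport", "streamablehttp"` pairs assert on.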
src/frontend/src/controllers/API/queries/mcp/use-patch-flows-mcp.ts (1)
3-7: Good consolidation on shared ComposerUrlResponseType

Using the shared `ComposerUrlResponseType` for `PatchFlowMCPResponse.result` keeps this hook consistent with `use-get-composer-url` and the backend `ComposerUrlResponse` model, reducing the risk of shape drift. No further changes needed here.

Also applies to: 21-24
src/backend/base/langflow/api/utils/mcp/__init__.py (1)
3-9: Re-export of streamable HTTP helpers is consistent and useful

Exporting `get_composer_streamable_http_url` and `get_project_streamable_http_url` from the MCP utils package keeps the public API aligned with the new transport configuration helpers and mirrors the existing pattern for `get_project_sse_url` / `get_url_by_os`. Looks good.

Also applies to: 11-16
src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py (1)
223-231: Tests correctly updated to new `streamable_http_url` parameter

The Windows-specific MCPComposerService tests now pass `streamable_http_url` via keyword at all call sites, matching the updated private method signatures while preserving the original behaviors under test (temp-file usage, startup checks, retries, and error handling). No issues spotted here.

Also applies to: 314-322, 360-368, 417-422, 512-519
src/frontend/src/controllers/API/queries/mcp/use-get-composer-url.ts (1)
1-35: LGTM! Type migration to `ComposerUrlResponseType` is consistent.

The type import and usage align with the centralized type definition in `@/types/mcp`, which includes the new `streamable_http_url` and `legacy_sse_url` fields alongside the deprecated `sse_url` for backward compatibility.

src/lfx/tests/unit/services/settings/test_mcp_composer.py (2)
194-196: LGTM! Test data correctly updated for new transport URL fields.

The project composer dictionary now includes both `streamable_http_url` and `legacy_sse_url` fields, aligning with the new transport model while maintaining backward compatibility.
230-237: LGTM! Test function calls updated to use the `streamable_http_url` parameter.

The `_do_start_project_composer` invocations correctly use the new `streamable_http_url` parameter name instead of the deprecated `sse_url`.

src/backend/base/langflow/api/v2/mcp.py (1)
312-327: LGTM! URL parsing helper is defensive and handles edge cases.

The `_extract_project_id_from_url` function correctly:

- Catches `ValueError` from `urlparse`
- Handles missing or malformed path segments
- Returns `None` on any UUID conversion failure

src/backend/tests/unit/api/v1/test_mcp_projects.py (2)
71-86: LGTM! Mock fixture correctly simulates `StreamableHTTPSessionManager`.

The fixture properly:

- Patches the class constructor
- Returns an async context manager from `run()`
- Mocks `handle_request` as an async method

This aligns with the `ProjectMCPServer` implementation shown in the relevant code snippets.
186-197: LGTM! Test correctly verifies streamable HTTP transport usage.

The test properly asserts that `handle_request` is called on the mock manager, which matches the new transport implementation.

src/backend/base/langflow/api/utils/mcp/config_utils.py (2)
207-230: LGTM! Good refactoring to extract common URL components.

The `_get_project_base_url_components` helper follows DRY principles and centralizes the host/port resolution logic. The new `get_project_streamable_http_url` correctly omits the `/sse` suffix for the streamable HTTP transport. Based on learnings, this approach of dynamically computing URLs is preferred over hardcoded localhost values.
397-423: LGTM! Server config correctly uses streamable HTTP transport.

Both authenticated and unauthenticated paths now include:

- the `--transport streamablehttp` flag
- `streamable_http_url` (without the `/sse` suffix)

This completes the transport migration for auto-configured starter projects.
src/lfx/src/lfx/services/mcp_composer/service.py (2)
950-961: API migration to `streamable_http_url` with backward-compatible `legacy_sse_url` looks correct.

The method signature properly introduces the new `streamable_http_url` parameter while maintaining backward compatibility through the optional `legacy_sse_url` keyword argument. The default computation on line 1040 correctly derives the SSE URL from the HTTP endpoint.
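That default derivation presumably amounts to appending the legacy `/sse` suffix to the streamable HTTP endpoint. A hypothetical sketch of such a helper (not the actual service code, whose computation may differ):

```python
# Hypothetical helper; the real default computation in the service may differ.
def derive_legacy_sse_url(streamable_http_url: str) -> str:
    """Append the legacy /sse suffix to a streamable HTTP endpoint URL."""
    return streamable_http_url.rstrip("/") + "/sse"

derived = derive_legacy_sse_url("http://localhost:7860/api/v1/mcp/project/my-project/")
```

Keeping the derivation in one place means callers that omit `legacy_sse_url` always get an SSE URL consistent with the HTTP endpoint they passed.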
1189-1197: Composer info structure properly maintains backward compatibility.

The stored info includes both new fields (`streamable_http_url`, `legacy_sse_url`) and the legacy `sse_url` alias, ensuring code that still references `sse_url` continues to work during migration.

src/backend/base/langflow/api/v1/mcp_projects.py (5)
1084-1151: Well-structured `ProjectMCPServer` class with proper lifecycle management.

The class encapsulates session manager state with proper locking and uses `AsyncExitStack` for clean resource management. The double-checked locking pattern in `ensure_session_manager_running` is correctly implemented.
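The double-checked locking pattern described here can be sketched in isolation. `LazyStarter` is an illustrative stand-in, with `asyncio.sleep(0)` in place of entering the manager's `run()` context; it is not the actual `ProjectMCPServer` code.

```python
import asyncio

class LazyStarter:
    """Double-checked locking sketch: start an async resource exactly once."""

    def __init__(self) -> None:
        self._lock = asyncio.Lock()
        self._started = False
        self.start_count = 0

    async def ensure_running(self) -> None:
        if self._started:            # fast path: no lock once started
            return
        async with self._lock:       # slow path: serialize concurrent starters
            if self._started:        # re-check after acquiring the lock
                return
            await asyncio.sleep(0)   # stands in for entering manager.run()
            self.start_count += 1
            self._started = True

async def main() -> int:
    starter = LazyStarter()
    # Fifty concurrent callers should trigger exactly one startup.
    await asyncio.gather(*(starter.ensure_running() for _ in range(50)))
    return starter.start_count

start_count = asyncio.run(main())
```

The second `if self._started` check after acquiring the lock is what makes the pattern safe: a caller that waited on the lock while another finished startup must not start the resource again.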
1170-1178: Shutdown handler correctly handles per-project cleanup with error isolation.

The loop continues even if individual servers fail to stop, ensuring best-effort cleanup. Exception logging provides visibility into failures.
800-823: Proper use of dynamically computed URLs in `ComposerUrlResponse`.

The endpoint correctly uses the `get_project_streamable_http_url` and `get_project_sse_url` helper functions to compute URLs dynamically, ensuring compatibility across different deployment environments. Based on learnings, this approach is preferred over hardcoded localhost URLs.
943-951: URL matching utilities handle both positional and embedded URL detection.

The functions correctly check for URLs both as the last argument (typical for mcp-proxy) and anywhere in the args list (for other configurations). The dual-URL support (`streamable_http_url` and `legacy_sse_url`) ensures existing installations are properly detected during reinstall.

Also applies to: 1031-1055
297-314: Content negotiation on GET endpoint routes streaming vs metadata requests.

The endpoint now serves dual purposes: SSE/streaming connections (via the `text/event-stream` Accept header) are dispatched to the Streamable HTTP handler, while standard requests return tool metadata as JSON. This maintains backward compatibility while supporting the new transport.
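A minimal version of that Accept-header branching, as a sketch only (real negotiation would also honor q-values and wildcards, and the function name here is invented for illustration):

```python
def wants_event_stream(accept_header: str) -> bool:
    """True when the client's Accept header lists text/event-stream."""
    media_types = {
        part.split(";")[0].strip().lower()  # drop q-value params like ";q=0.9"
        for part in accept_header.split(",")
    }
    return "text/event-stream" in media_types

streaming = wants_event_stream("application/json, text/event-stream;q=0.9")
metadata_only = wants_event_stream("application/json")
```

In the endpoint, a `True` result would dispatch to the Streamable HTTP handler and a `False` result would fall through to the JSON tool-metadata response.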
```python
if (
    not user_starter_folder.auth_settings
    and settings_service.auth_settings.AUTO_LOGIN
    and not settings_service.auth_settings.SUPERUSER
):
    default_auth = {"auth_type": "apikey"}
    user_starter_folder.auth_settings = encrypt_auth_settings(default_auth)
    await logger.adebug(
        "AUTO_LOGIN enabled without SUPERUSER; forcing API key auth for starter folder %s",
        user.username,
    )
```
AUTO_LOGIN Langflow authentication security implications
<web_search_result>
Short summary — risks of LANGFLOW_AUTO_LOGIN and recommended mitigations:
Risks
- AUTO_LOGIN=True bypasses normal auth: the web UI auto-signs everyone in as the default superuser, so anyone who can reach the server gets full admin privileges. [1][2]
- Combined with defaults (default superuser credentials or missing secret key) this enables privilege escalation, API/CLI misuse, and exposure of API keys, flows, and connected services. [1][2]
- If environment variables aren’t loaded (known bug in some releases) a deployment meant to disable AUTO_LOGIN can unintentionally revert to auto-login. That has been observed as a security issue. [3]
- Exposed instances with AUTO_LOGIN enabled (or other auth bypasses) massively increase risk from unauthenticated RCE and other Langflow vulnerabilities. Attackers have actively targeted missing-auth and validate/code endpoints. [2][3]
Immediate mitigations (minimum)
- Set LANGFLOW_AUTO_LOGIN=False and disable CLI superuser creation (LANGFLOW_ENABLE_SUPERUSER_CLI=false). Create a strong LANGFLOW_SUPERUSER and LANGFLOW_SUPERUSER_PASSWORD and a strong LANGFLOW_SECRET_KEY. Put these in a securely stored environment, not a checked-in file. [1][2]
- Don’t expose Langflow ports directly to the Internet. Put Langflow behind a reverse proxy or API gateway with authentication and network restrictions (VPN, IP allowlist). Use TLS. [1][2]
- Ensure you run a patched Langflow version (upgrade per project advisories) and verify that .env/environment variables are actually loaded. Monitor Langflow security advisories and CVEs. [3][2]
- Rotate any keys/secrets and audit logs if you previously ran with AUTO_LOGIN enabled or on vulnerable versions. Scan for exposed instances and block/segregate them until patched. [2][3]
If you want, I can:
- check your deployed Langflow version and provide exact upgrade/fix steps, or
- produce a minimal secure .env example and recommended proxy config.
</web_search_result>
Script executed:
```bash
#!/bin/bash
# Search for MCP references and AUTO_LOGIN usage
echo "=== Searching for MCP context ==="
rg -n "MCP" src/backend/base/langflow/api/utils/mcp/ -A 2 -B 2 | head -50
echo -e "\n=== Searching for AUTO_LOGIN usage ==="
rg -n "AUTO_LOGIN" src/backend/ -C 2 | head -80
echo -e "\n=== Searching for SUPERUSER usage ==="
rg -n "SUPERUSER" src/backend/ -C 2 | head -80
```

Repository: langflow-ai/langflow
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
=== Searching for MCP context ===
src/backend/base/langflow/api/utils/mcp/config_utils.py-6-
src/backend/base/langflow/api/utils/mcp/config_utils.py-7-from fastapi import HTTPException
src/backend/base/langflow/api/utils/mcp/config_utils.py:8:from lfx.base.mcp.constants import MAX_MCP_SERVER_NAME_LENGTH
src/backend/base/langflow/api/utils/mcp/config_utils.py-9-from lfx.base.mcp.util import sanitize_mcp_name
src/backend/base/langflow/api/utils/mcp/config_utils.py-10-from lfx.log import logger
--
src/backend/base/langflow/api/utils/mcp/config_utils.py-24-
src/backend/base/langflow/api/utils/mcp/config_utils.py-25-
src/backend/base/langflow/api/utils/mcp/config_utils.py:26:class MCPServerValidationResult:
src/backend/base/langflow/api/utils/mcp/config_utils.py:27: """Represents the result of an MCP server validation check.
src/backend/base/langflow/api/utils/mcp/config_utils.py-28-
src/backend/base/langflow/api/utils/mcp/config_utils.py:29: This class encapsulates the outcome of checking whether an MCP server
src/backend/base/langflow/api/utils/mcp/config_utils.py-30- configuration can be safely created or updated for a given project. The typical
src/backend/base/langflow/api/utils/mcp/config_utils.py-31- sequence is as follows:
src/backend/base/langflow/api/utils/mcp/config_utils.py-32-
src/backend/base/langflow/api/utils/mcp/config_utils.py:33: 1. Initiation: An operation requiring an MCP server (e.g., creating a
src/backend/base/langflow/api/utils/mcp/config_utils.py:34: new project with MCP enabled) triggers a validation check.
src/backend/base/langflow/api/utils/mcp/config_utils.py-35- 2. Validation: The validate_mcp_server_for_project function is called.
src/backend/base/langflow/api/utils/mcp/config_utils.py-36- It generates the expected server name from the project name and checks
--
src/backend/base/langflow/api/utils/mcp/config_utils.py-64- @property
src/backend/base/langflow/api/utils/mcp/config_utils.py-65- def has_conflict(self) -> bool:
src/backend/base/langflow/api/utils/mcp/config_utils.py:66: """Returns True when an MCP server name collision occurs.
src/backend/base/langflow/api/utils/mcp/config_utils.py-67-
src/backend/base/langflow/api/utils/mcp/config_utils.py-68- This indicates that another project is already using the desired server name.
--
src/backend/base/langflow/api/utils/mcp/config_utils.py-72- @property
src/backend/base/langflow/api/utils/mcp/config_utils.py-73- def should_skip(self) -> bool:
src/backend/base/langflow/api/utils/mcp/config_utils.py:74: """Returns True when the MCP server configuration is already correct for this project.
src/backend/base/langflow/api/utils/mcp/config_utils.py-75-
src/backend/base/langflow/api/utils/mcp/config_utils.py-76- This indicates that the server exists and is properly configured for the current project.
--
src/backend/base/langflow/api/utils/mcp/config_utils.py-80- @property
src/backend/base/langflow/api/utils/mcp/config_utils.py-81- def should_proceed(self) -> bool:
src/backend/base/langflow/api/utils/mcp/config_utils.py:82: """Returns True when MCP server setup can proceed safely without conflicts.
src/backend/base/langflow/api/utils/mcp/config_utils.py-83-
src/backend/base/langflow/api/utils/mcp/config_utils.py-84- This indicates either no server exists (safe to create) or the existing server
--
src/backend/base/langflow/api/utils/mcp/config_utils.py-96- settings_service,
src/backend/base/langflow/api/utils/mcp/config_utils.py-97- operation: str = "create",
src/backend/base/langflow/api/utils/mcp/config_utils.py:98:) -> MCPServerValidationResult:
src/backend/base/langflow/api/utils/mcp/config_utils.py:99: """Validate MCP server for a project operation.
src/backend/base/langflow/api/utils/mcp/config_utils.py-100-
src/backend/base/langflow/api/utils/mcp/config_utils.py-101- Args:
--
src/backend/base/langflow/api/utils/mcp/config_utils.py-109-
src/backend/base/langflow/api/utils/mcp/config_utils.py-110- Returns:
src/backend/base/langflow/api/utils/mcp/config_utils.py:111: MCPServerValidationResult with validation details
src/backend/base/langflow/api/utils/mcp/config_utils.py-112- """
src/backend/base/langflow/api/utils/mcp/config_utils.py-113- # Generate server name that would be used for this project
=== Searching for AUTO_LOGIN usage ===
src/backend/tests/unit/test_initial_setup.py-238- "https://github.com/langflow-ai/langflow-bundles/commit/68428ce16729a385fe1bcc0f1ec91fd5f5f420b9"
src/backend/tests/unit/test_initial_setup.py-239- ]
src/backend/tests/unit/test_initial_setup.py:240: settings_service.auth_settings.AUTO_LOGIN = True
src/backend/tests/unit/test_initial_setup.py-241-
src/backend/tests/unit/test_initial_setup.py-242- # Create a superuser in the test database since load_bundles_from_urls requires one
--
src/backend/tests/unit/test_settings_initialization_order.py-240-LANGFLOW_DATABASE_URL=sqlite:///{db_path}
src/backend/tests/unit/test_settings_initialization_order.py-241-LANGFLOW_AUTO_SAVING=false
src/backend/tests/unit/test_settings_initialization_order.py:242:LANGFLOW_AUTO_LOGIN=false
src/backend/tests/unit/test_settings_initialization_order.py-243-LANGFLOW_LOG_LEVEL=ERROR
src/backend/tests/unit/test_settings_initialization_order.py-244- """.strip()
--
src/backend/tests/unit/test_setup_superuser.py-19-# mock_get_session, mock_create_super_user, mock_get_settings_service
src/backend/tests/unit/test_setup_superuser.py-20-# ):
src/backend/tests/unit/test_setup_superuser.py:21:# # Test when AUTO_LOGIN is True
src/backend/tests/unit/test_setup_superuser.py-22-# calls = []
src/backend/tests/unit/test_setup_superuser.py-23-# mock_settings_service = Mock()
src/backend/tests/unit/test_setup_superuser.py:24:# mock_settings_service.auth_settings.AUTO_LOGIN = True
src/backend/tests/unit/test_setup_superuser.py-25-# mock_settings_service.auth_settings.SUPERUSER = DEFAULT_SUPERUSER
src/backend/tests/unit/test_setup_superuser.py-26-# mock_settings_service.auth_settings.SUPERUSER_PASSWORD = DEFAULT_SUPERUSER_PASSWORD
--
src/backend/tests/unit/test_setup_superuser.py-56-# # Test when username and password are default
src/backend/tests/unit/test_setup_superuser.py-57-# mock_settings_service.auth_settings = Mock()
src/backend/tests/unit/test_setup_superuser.py:58:# mock_settings_service.auth_settings.AUTO_LOGIN = False
src/backend/tests/unit/test_setup_superuser.py-59-# mock_settings_service.auth_settings.SUPERUSER = ADMIN_USER_NAME
src/backend/tests/unit/test_setup_superuser.py-60-# mock_settings_service.auth_settings.SUPERUSER_PASSWORD = "password"
--
src/backend/tests/unit/test_setup_superuser.py-81-
src/backend/tests/unit/test_setup_superuser.py-82-# # Test when superuser already exists
src/backend/tests/unit/test_setup_superuser.py:83:# mock_settings_service.auth_settings.AUTO_LOGIN = False
src/backend/tests/unit/test_setup_superuser.py-84-# mock_settings_service.auth_settings.SUPERUSER = ADMIN_USER_NAME
src/backend/tests/unit/test_setup_superuser.py-85-# mock_settings_service.auth_settings.SUPERUSER_PASSWORD = "password"
--
src/backend/tests/unit/test_setup_superuser.py-99-async def test_teardown_superuser_default_superuser(mock_get_session, mock_get_settings_service):
src/backend/tests/unit/test_setup_superuser.py-100- mock_settings_service = MagicMock()
src/backend/tests/unit/test_setup_superuser.py:101: mock_settings_service.auth_settings.AUTO_LOGIN = True
src/backend/tests/unit/test_setup_superuser.py-102- mock_settings_service.auth_settings.SUPERUSER = DEFAULT_SUPERUSER
src/backend/tests/unit/test_setup_superuser.py-103- mock_settings_service.auth_settings.SUPERUSER_PASSWORD = DEFAULT_SUPERUSER_PASSWORD
--
src/backend/tests/unit/test_setup_superuser.py-118- admin_user_name = "admin_user"
src/backend/tests/unit/test_setup_superuser.py-119- mock_settings_service = MagicMock()
src/backend/tests/unit/test_setup_superuser.py:120: mock_settings_service.auth_settings.AUTO_LOGIN = False
src/backend/tests/unit/test_setup_superuser.py-121- mock_settings_service.auth_settings.SUPERUSER = admin_user_name
src/backend/tests/unit/test_setup_superuser.py-122- mock_settings_service.auth_settings.SUPERUSER_PASSWORD = "password" # noqa: S105
--
src/backend/tests/unit/test_user.py-36- auth_settings = settings_service.auth_settings
src/backend/tests/unit/test_user.py-37- login_data = {
src/backend/tests/unit/test_user.py:38: # SUPERUSER may be reset to default depending on AUTO_LOGIN; use constant for stability in tests
src/backend/tests/unit/test_user.py:39: "username": DEFAULT_SUPERUSER if auth_settings.AUTO_LOGIN else auth_settings.SUPERUSER,
src/backend/tests/unit/test_user.py-40- "password": (
src/backend/tests/unit/test_user.py-41- auth_settings.SUPERUSER_PASSWORD.get_secret_value()
--
src/backend/tests/unit/test_setup_superuser_flow.py-13-
src/backend/tests/unit/test_setup_superuser_flow.py-14- settings = get_settings_service()
src/backend/tests/unit/test_setup_superuser_flow.py:15: settings.auth_settings.AUTO_LOGIN = True
src/backend/tests/unit/test_setup_superuser_flow.py-16-
src/backend/tests/unit/test_setup_superuser_flow.py-17- await initialize_services()
--
src/backend/tests/unit/test_setup_superuser_flow.py-29-
src/backend/tests/unit/test_setup_superuser_flow.py-30- settings = get_settings_service()
src/backend/tests/unit/test_setup_superuser_flow.py:31: settings.auth_settings.AUTO_LOGIN = False
src/backend/tests/unit/test_setup_superuser_flow.py-32-
src/backend/tests/unit/test_setup_superuser_flow.py-33- # Ensure default exists and has never logged in
--
src/backend/tests/unit/test_setup_superuser_flow.py-71-
src/backend/tests/unit/test_setup_superuser_flow.py-72- settings = get_settings_service()
src/backend/tests/unit/test_setup_superuser_flow.py:73: settings.auth_settings.AUTO_LOGIN = False
src/backend/tests/unit/test_setup_superuser_flow.py-74-
src/backend/tests/unit/test_setup_superuser_flow.py-75- # Ensure default exists
--
src/backend/tests/unit/test_setup_superuser_flow.py-113-
src/backend/tests/unit/test_setup_superuser_flow.py-114- settings = get_settings_service()
src/backend/tests/unit/test_setup_superuser_flow.py:115: settings.auth_settings.AUTO_LOGIN = False
src/backend/tests/unit/test_setup_superuser_flow.py-116- settings.auth_settings.SUPERUSER = ""
src/backend/tests/unit/test_setup_superuser_flow.py-117- # Reset password to empty
--
src/backend/tests/unit/test_setup_superuser_flow.py-136-
src/backend/tests/unit/test_setup_superuser_flow.py-137- settings = get_settings_service()
src/backend/tests/unit/test_setup_superuser_flow.py:138: settings.auth_settings.AUTO_LOGIN = False
src/backend/tests/unit/test_setup_superuser_flow.py-139- settings.auth_settings.SUPERUSER = "custom_admin"
=== Searching for SUPERUSER usage ===
src/backend/tests/unit/test_auth_settings.py-3-import pytest
src/backend/tests/unit/test_auth_settings.py-4-from lfx.services.settings.auth import AuthSettings
src/backend/tests/unit/test_auth_settings.py:5:from lfx.services.settings.constants import DEFAULT_SUPERUSER
src/backend/tests/unit/test_auth_settings.py-6-from pydantic import SecretStr
src/backend/tests/unit/test_auth_settings.py-7-
--
src/backend/tests/unit/test_auth_settings.py-11- cfg_dir = tmp_path.as_posix()
src/backend/tests/unit/test_auth_settings.py-12- settings = AuthSettings(CONFIG_DIR=cfg_dir, AUTO_LOGIN=auto_login)
src/backend/tests/unit/test_auth_settings.py:13: assert isinstance(settings.SUPERUSER_PASSWORD, SecretStr)
src/backend/tests/unit/test_auth_settings.py-14-
src/backend/tests/unit/test_auth_settings.py-15-
--
src/backend/tests/unit/test_auth_settings.py-19- CONFIG_DIR=cfg_dir,
src/backend/tests/unit/test_auth_settings.py-20- AUTO_LOGIN=True,
src/backend/tests/unit/test_auth_settings.py:21: SUPERUSER="custom",
src/backend/tests/unit/test_auth_settings.py:22: SUPERUSER_PASSWORD=SecretStr("_changed"),
src/backend/tests/unit/test_auth_settings.py-23- )
src/backend/tests/unit/test_auth_settings.py-24- # Validator forces default username and scrubs password
src/backend/tests/unit/test_auth_settings.py:25: assert settings.SUPERUSER == DEFAULT_SUPERUSER
src/backend/tests/unit/test_auth_settings.py:26: assert isinstance(settings.SUPERUSER_PASSWORD, SecretStr)
src/backend/tests/unit/test_auth_settings.py:27: assert settings.SUPERUSER_PASSWORD.get_secret_value() == "langflow"
src/backend/tests/unit/test_auth_settings.py-28-
src/backend/tests/unit/test_auth_settings.py-29- # reset_credentials keeps default username (AUTO_LOGIN on) and keeps password scrubbed
src/backend/tests/unit/test_auth_settings.py-30- settings.reset_credentials()
src/backend/tests/unit/test_auth_settings.py:31: assert settings.SUPERUSER == DEFAULT_SUPERUSER
src/backend/tests/unit/test_auth_settings.py:32: assert settings.SUPERUSER_PASSWORD.get_secret_value() == "langflow"
src/backend/tests/unit/test_auth_settings.py-33-
src/backend/tests/unit/test_auth_settings.py-34-
--
src/backend/tests/unit/test_auth_settings.py-38- CONFIG_DIR=cfg_dir,
src/backend/tests/unit/test_auth_settings.py-39- AUTO_LOGIN=False,
src/backend/tests/unit/test_auth_settings.py:40: SUPERUSER="admin",
src/backend/tests/unit/test_auth_settings.py:41: SUPERUSER_PASSWORD=SecretStr("strongpass"),
src/backend/tests/unit/test_auth_settings.py-42- )
src/backend/tests/unit/test_auth_settings.py-43- # Values preserved at init
src/backend/tests/unit/test_auth_settings.py:44: assert settings.SUPERUSER == "admin"
src/backend/tests/unit/test_auth_settings.py:45: assert settings.SUPERUSER_PASSWORD.get_secret_value() == "strongpass"
src/backend/tests/unit/test_auth_settings.py-46-
src/backend/tests/unit/test_auth_settings.py-47- # After reset, username preserved, password scrubbed
src/backend/tests/unit/test_auth_settings.py-48- settings.reset_credentials()
src/backend/tests/unit/test_auth_settings.py:49: assert settings.SUPERUSER == "admin"
src/backend/tests/unit/test_auth_settings.py:50: assert settings.SUPERUSER_PASSWORD.get_secret_value() == ""
--
src/backend/tests/unit/test_setup_superuser.py-7-from langflow.services.utils import teardown_superuser
src/backend/tests/unit/test_setup_superuser.py-8-from lfx.services.settings.constants import (
src/backend/tests/unit/test_setup_superuser.py:9: DEFAULT_SUPERUSER,
src/backend/tests/unit/test_setup_superuser.py:10: DEFAULT_SUPERUSER_PASSWORD,
src/backend/tests/unit/test_setup_superuser.py-11-)
src/backend/tests/unit/test_setup_superuser.py-12-from sqlalchemy.exc import IntegrityError
--
src/backend/tests/unit/test_setup_superuser.py-23-# mock_settings_service = Mock()
src/backend/tests/unit/test_setup_superuser.py-24-# mock_settings_service.auth_settings.AUTO_LOGIN = True
src/backend/tests/unit/test_setup_superuser.py:25:# mock_settings_service.auth_settings.SUPERUSER = DEFAULT_SUPERUSER
src/backend/tests/unit/test_setup_superuser.py:26:# mock_settings_service.auth_settings.SUPERUSER_PASSWORD = DEFAULT_SUPERUSER_PASSWORD
src/backend/tests/unit/test_setup_superuser.py-27-# mock_get_settings_service.return_value = mock_settings_service
src/backend/tests/unit/test_setup_superuser.py-28-# mock_session = Mock()
--
src/backend/tests/unit/test_setup_superuser.py-37-# mock_session.query.return_value.filter.return_value.first.return_value = None
src/backend/tests/unit/test_setup_superuser.py-38-# actual_expr = mock_session.query.return_value.filter.call_args[0][0]
src/backend/tests/unit/test_setup_superuser.py:39:# expected_expr = User.username == DEFAULT_SUPERUSER
src/backend/tests/unit/test_setup_superuser.py-40-
src/backend/tests/unit/test_setup_superuser.py-41-# assert str(actual_expr) == str(expected_expr)
src/backend/tests/unit/test_setup_superuser.py-42-# create_call = call(
src/backend/tests/unit/test_setup_superuser.py:43:# db=mock_session, username=DEFAULT_SUPERUSER, password=DEFAULT_SUPERUSER_PASSWORD
src/backend/tests/unit/test_setup_superuser.py-44-# )
src/backend/tests/unit/test_setup_superuser.py-45-# calls.append(create_call)
--
src/backend/tests/unit/test_setup_superuser.py-48-
src/backend/tests/unit/test_setup_superuser.py-49-# def reset_mock_credentials():
src/backend/tests/unit/test_setup_superuser.py:50:# mock_settings_service.auth_settings.SUPERUSER = DEFAULT_SUPERUSER
src/backend/tests/unit/test_setup_superuser.py:51:# mock_settings_service.auth_settings.SUPERUSER_PASSWORD = (
src/backend/tests/unit/test_setup_superuser.py:52:# DEFAULT_SUPERUSER_PASSWORD
src/backend/tests/unit/test_setup_superuser.py-53-# )
src/backend/tests/unit/test_setup_superuser.py-54-
--
src/backend/tests/unit/test_setup_superuser.py-57-# mock_settings_service.auth_settings = Mock()
src/backend/tests/unit/test_setup_superuser.py-58-# mock_settings_service.auth_settings.AUTO_LOGIN = False
src/backend/tests/unit/test_setup_superuser.py:59:# mock_settings_service.auth_settings.SUPERUSER = ADMIN_USER_NAME
src/backend/tests/unit/test_setup_superuser.py:60:# mock_settings_service.auth_settings.SUPERUSER_PASSWORD = "password"
src/backend/tests/unit/test_setup_superuser.py-61-# mock_settings_service.auth_settings.reset_credentials = Mock(
</shell_result>
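The test excerpts above pin down the credential rules: with `AUTO_LOGIN=True` the validator forces the default username and scrubs any custom password back to the default, while with `AUTO_LOGIN=False` custom credentials survive init and `reset_credentials()` scrubs only the password. A minimal stdlib sketch of that behavior (the `AuthModel` class and the constant values here are illustrative stand-ins, not the real `lfx.services.settings.auth.AuthSettings`):

```python
from dataclasses import dataclass

# Stand-in defaults; the real values live in lfx.services.settings.constants.
DEFAULT_SUPERUSER = "langflow"
DEFAULT_SUPERUSER_PASSWORD = "langflow"


@dataclass
class AuthModel:
    """Toy model of the credential rules asserted in test_auth_settings.py."""

    AUTO_LOGIN: bool
    SUPERUSER: str
    SUPERUSER_PASSWORD: str

    def __post_init__(self):
        # With AUTO_LOGIN on, the validator forces the default username
        # and scrubs any custom password back to the default.
        if self.AUTO_LOGIN:
            self.SUPERUSER = DEFAULT_SUPERUSER
            self.SUPERUSER_PASSWORD = DEFAULT_SUPERUSER_PASSWORD

    def reset_credentials(self):
        # Username is preserved; the password is scrubbed, unless AUTO_LOGIN
        # keeps it pinned to the default.
        if not self.AUTO_LOGIN:
            self.SUPERUSER_PASSWORD = ""


auto = AuthModel(AUTO_LOGIN=True, SUPERUSER="custom", SUPERUSER_PASSWORD="_changed")
assert (auto.SUPERUSER, auto.SUPERUSER_PASSWORD) == (DEFAULT_SUPERUSER, DEFAULT_SUPERUSER_PASSWORD)
auto.reset_credentials()
assert auto.SUPERUSER_PASSWORD == DEFAULT_SUPERUSER_PASSWORD

manual = AuthModel(AUTO_LOGIN=False, SUPERUSER="admin", SUPERUSER_PASSWORD="strongpass")
assert (manual.SUPERUSER, manual.SUPERUSER_PASSWORD) == ("admin", "strongpass")
manual.reset_credentials()
assert (manual.SUPERUSER, manual.SUPERUSER_PASSWORD) == ("admin", "")
```

This matches both test cases: the `AUTO_LOGIN=True` case where custom values are overwritten at init and unchanged by reset, and the `AUTO_LOGIN=False` case where init preserves values and reset empties only the password.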
Script executed:
#!/bin/bash
# Get full context around the code snippet
echo "=== Reading config_utils.py around lines 374-384 ==="
sed -n '360,400p' src/backend/base/langflow/api/utils/mcp/config_utils.py | cat -n
echo -e "\n=== Searching for user_starter_folder definition ==="
rg -n "user_starter_folder" src/backend/base/langflow/api/utils/mcp/config_utils.py -B 5 -A 5 | head -100
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
=== Reading config_utils.py around lines 374-384 ===
1 await logger.adebug(
2 f"MCP server '{validation_result.server_name}' already exists for user "
3 f"{user.username}'s starter projects (project ID: "
4 f"{user_starter_folder.id}), skipping"
5 )
6 continue # Skip this user since server already exists for the same project
7
8 server_name = validation_result.server_name
9
10 # Set up THIS USER'S starter folder authentication (same as new projects)
11 # If AUTO_LOGIN is false, automatically enable API key authentication
12 default_auth = {"auth_type": "none"}
13 await logger.adebug(f"Settings service auth settings: {settings_service.auth_settings}")
14 await logger.adebug(f"User starter folder auth settings: {user_starter_folder.auth_settings}")
15 if (
16 not user_starter_folder.auth_settings
17 and settings_service.auth_settings.AUTO_LOGIN
18 and not settings_service.auth_settings.SUPERUSER
19 ):
20 default_auth = {"auth_type": "apikey"}
21 user_starter_folder.auth_settings = encrypt_auth_settings(default_auth)
22 await logger.adebug(
23 "AUTO_LOGIN enabled without SUPERUSER; forcing API key auth for starter folder %s",
24 user.username,
25 )
26 elif not settings_service.auth_settings.AUTO_LOGIN and not user_starter_folder.auth_settings:
27 default_auth = {"auth_type": "apikey"}
28 user_starter_folder.auth_settings = encrypt_auth_settings(default_auth)
29 await logger.adebug(f"Set up auth settings for user {user.username}'s starter folder")
30 elif user_starter_folder.auth_settings:
31 default_auth = user_starter_folder.auth_settings
32
33 # Create API key for this user to access their own starter projects
34 api_key_name = f"MCP Project {DEFAULT_FOLDER_NAME} - {user.username}"
35 unmasked_api_key = await create_api_key(session, ApiKeyCreate(name=api_key_name), user.id)
36
37 # Build connection URLs for THIS USER'S starter folder (unique ID per user)
38 streamable_http_url = await get_project_streamable_http_url(user_starter_folder.id)
39
40 # Prepare server config (similar to new project creation)
41 if default_auth.get("auth_type", "none") == "apikey":
=== Searching for user_starter_folder definition ===
298- folder_names = [f.name for f in all_user_folders]
299- await logger.adebug(f"User {user.username} has folders: {folder_names}")
300-
301- # Find THIS USER'S own starter projects folder
302- # Each user has their own "Starter Projects" folder with unique ID
303: user_starter_folder = (
304- await session.exec(
305- select(Folder).where(
306- Folder.name == DEFAULT_FOLDER_NAME,
307- Folder.user_id == user.id, # Each user has their own!
308- )
309- )
310- ).first()
311: if not user_starter_folder:
312- await logger.adebug(
313- f"No starter projects folder ('{DEFAULT_FOLDER_NAME}') found for user {user.username}, skipping"
314- )
315- # Log what folders this user does have for debugging
316- await logger.adebug(f"User {user.username} available folders: {folder_names}")
317- continue
318-
319- await logger.adebug(
320: f"Found starter folder '{user_starter_folder.name}' for {user.username}: "
321: f"ID={user_starter_folder.id}"
322- )
323-
324- # Configure MCP settings for flows in THIS USER'S starter folder
325- flows_query = select(Flow).where(
326: Flow.folder_id == user_starter_folder.id,
327- Flow.is_component == False, # noqa: E712
328- )
329- user_starter_flows = (await session.exec(flows_query)).all()
330-
331- # Enable MCP for starter flows if not already configured
--
344- if flows_configured > 0:
345- await logger.adebug(f"Enabled MCP for {flows_configured} starter flows for user {user.username}")
346-
347- # Validate MCP server for this starter projects folder
348- validation_result = await validate_mcp_server_for_project(
349: user_starter_folder.id,
350- DEFAULT_FOLDER_NAME,
351- user,
352- session,
353- get_storage_service(),
354- settings_service,
--
358- # Skip if server already exists for this starter projects folder
359- if validation_result.should_skip:
360- await logger.adebug(
361- f"MCP server '{validation_result.server_name}' already exists for user "
362- f"{user.username}'s starter projects (project ID: "
363: f"{user_starter_folder.id}), skipping"
364- )
365- continue # Skip this user since server already exists for the same project
366-
367- server_name = validation_result.server_name
368-
369- # Set up THIS USER'S starter folder authentication (same as new projects)
370- # If AUTO_LOGIN is false, automatically enable API key authentication
371- default_auth = {"auth_type": "none"}
372- await logger.adebug(f"Settings service auth settings: {settings_service.auth_settings}")
373: await logger.adebug(f"User starter folder auth settings: {user_starter_folder.auth_settings}")
374- if (
375: not user_starter_folder.auth_settings
376- and settings_service.auth_settings.AUTO_LOGIN
377- and not settings_service.auth_settings.SUPERUSER
378- ):
379- default_auth = {"auth_type": "apikey"}
380: user_starter_folder.auth_settings = encrypt_auth_settings(default_auth)
381- await logger.adebug(
382- "AUTO_LOGIN enabled without SUPERUSER; forcing API key auth for starter folder %s",
383- user.username,
384- )
385: elif not settings_service.auth_settings.AUTO_LOGIN and not user_starter_folder.auth_settings:
386- default_auth = {"auth_type": "apikey"}
387: user_starter_folder.auth_settings = encrypt_auth_settings(default_auth)
388- await logger.adebug(f"Set up auth settings for user {user.username}'s starter folder")
389: elif user_starter_folder.auth_settings:
390: default_auth = user_starter_folder.auth_settings
391-
392- # Create API key for this user to access their own starter projects
393- api_key_name = f"MCP Project {DEFAULT_FOLDER_NAME} - {user.username}"
394- unmasked_api_key = await create_api_key(session, ApiKeyCreate(name=api_key_name), user.id)
395-
396- # Build connection URLs for THIS USER'S starter folder (unique ID per user)
397: streamable_http_url = await get_project_streamable_http_url(user_starter_folder.id)
398-
399- # Prepare server config (similar to new project creation)
400- if default_auth.get("auth_type", "none") == "apikey":
401- command = "uvx"
402- args = [
</shell_result>
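The `if/elif` chain shown above reduces to a small decision table for `default_auth`. A pure-Python sketch of it (the function name `resolve_default_auth` is mine, not in the codebase; the real code also encrypts and persists the result via `encrypt_auth_settings` and logs each branch):

```python
def resolve_default_auth(folder_auth, auto_login, superuser):
    """Mirror the starter-folder auth branching from config_utils.py.

    folder_auth: the folder's existing auth settings (dict or None)
    auto_login / superuser: values from settings_service.auth_settings
    """
    if not folder_auth and auto_login and not superuser:
        # AUTO_LOGIN enabled without a SUPERUSER: force API key auth.
        return {"auth_type": "apikey"}
    if not auto_login and not folder_auth:
        # AUTO_LOGIN disabled and nothing configured yet: require an API key.
        return {"auth_type": "apikey"}
    if folder_auth:
        # Existing folder settings always win.
        return folder_auth
    # AUTO_LOGIN with a SUPERUSER and no prior settings: leave auth open.
    return {"auth_type": "none"}


assert resolve_default_auth(None, True, "") == {"auth_type": "apikey"}
assert resolve_default_auth(None, False, "admin") == {"auth_type": "apikey"}
assert resolve_default_auth({"auth_type": "oauth"}, True, "admin") == {"auth_type": "oauth"}
assert resolve_default_auth(None, True, "admin") == {"auth_type": "none"}
```

The only path that yields `{"auth_type": "none"}` is AUTO_LOGIN with a configured SUPERUSER and no prior folder settings, which is consistent with the intent stated in the comment ("If AUTO_LOGIN is false, automatically enable API key authentication").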
Script executed:
#!/bin/bash
# Find AuthSettings class definition
echo "=== Searching for AuthSettings class ==="
rg -n "class AuthSettings" src/backend/ -A 30 | head -100
echo -e "\n=== Searching for SUPERUSER validator logic ==="
rg -n "SUPERUSER" src/backend/ --type=py -B 2 -A 2 | grep -A 10 -B 10 "validator\|field_validator" | head -80
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
=== Searching for AuthSettings class ===
src/backend/base/langflow/api/v1/schemas.py:422:class AuthSettings(BaseModel):
src/backend/base/langflow/api/v1/schemas.py-423- """Model representing authentication settings for MCP."""
src/backend/base/langflow/api/v1/schemas.py-424-
src/backend/base/langflow/api/v1/schemas.py-425- auth_type: Literal["none", "apikey", "oauth"] = "none"
src/backend/base/langflow/api/v1/schemas.py-426- oauth_host: str | None = None
src/backend/base/langflow/api/v1/schemas.py-427- oauth_port: str | None = None
src/backend/base/langflow/api/v1/schemas.py-428- oauth_server_url: str | None = None
src/backend/base/langflow/api/v1/schemas.py-429- oauth_callback_path: str | None = None
src/backend/base/langflow/api/v1/schemas.py-430- oauth_client_id: str | None = None
src/backend/base/langflow/api/v1/schemas.py-431- oauth_client_secret: SecretStr | None = None
src/backend/base/langflow/api/v1/schemas.py-432- oauth_auth_url: str | None = None
src/backend/base/langflow/api/v1/schemas.py-433- oauth_token_url: str | None = None
src/backend/base/langflow/api/v1/schemas.py-434- oauth_mcp_scope: str | None = None
src/backend/base/langflow/api/v1/schemas.py-435- oauth_provider_scope: str | None = None
src/backend/base/langflow/api/v1/schemas.py-436-
src/backend/base/langflow/api/v1/schemas.py-437-
src/backend/base/langflow/api/v1/schemas.py-438-class MCPSettings(BaseModel):
src/backend/base/langflow/api/v1/schemas.py-439- """Model representing MCP settings for a flow."""
src/backend/base/langflow/api/v1/schemas.py-440-
src/backend/base/langflow/api/v1/schemas.py-441- id: UUID
src/backend/base/langflow/api/v1/schemas.py-442- mcp_enabled: bool | None = None
src/backend/base/langflow/api/v1/schemas.py-443- action_name: str | None = None
src/backend/base/langflow/api/v1/schemas.py-444- action_description: str | None = None
src/backend/base/langflow/api/v1/schemas.py-445- name: str | None = None
src/backend/base/langflow/api/v1/schemas.py-446- description: str | None = None
src/backend/base/langflow/api/v1/schemas.py-447-
src/backend/base/langflow/api/v1/schemas.py-448-
src/backend/base/langflow/api/v1/schemas.py-449-class MCPProjectUpdateRequest(BaseModel):
src/backend/base/langflow/api/v1/schemas.py-450- """Request model for updating MCP project settings including auth."""
src/backend/base/langflow/api/v1/schemas.py-451-
src/backend/base/langflow/api/v1/schemas.py-452- settings: list[MCPSettings]
=== Searching for SUPERUSER validator logic ===
</shell_result>
Script executed:
#!/bin/bash
# The AuthSettings is from lfx.services.settings.auth
# Let's search more specifically
echo "=== Searching for AUTO_LOGIN and SUPERUSER relationships ==="
rg -n "AUTO_LOGIN" src/backend/ --type=py -B 5 -A 5 | grep -B 10 -A 10 "SUPERUSER\|validator" | head -120
echo -e "\n=== Check if there's any config that shows SUPERUSER can be empty ==="
rg -n "not.*SUPERUSER\|SUPERUSER.*not" src/backend/ --type=py -C 3 | head -100
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
=== Searching for AUTH_LOGIN and SUPERUSER relationships ===
src/backend/tests/unit/test_user.py-33- super_user, # noqa: ARG001
src/backend/tests/unit/test_user.py-34-):
src/backend/tests/unit/test_user.py-35- settings_service = get_settings_service()
src/backend/tests/unit/test_user.py-36- auth_settings = settings_service.auth_settings
src/backend/tests/unit/test_user.py-37- login_data = {
src/backend/tests/unit/test_user.py:38: # SUPERUSER may be reset to default depending on AUTO_LOGIN; use constant for stability in tests
src/backend/tests/unit/test_user.py:39: "username": DEFAULT_SUPERUSER if auth_settings.AUTO_LOGIN else auth_settings.SUPERUSER,
src/backend/tests/unit/test_user.py-40- "password": (
src/backend/tests/unit/test_user.py-41- auth_settings.SUPERUSER_PASSWORD.get_secret_value()
src/backend/tests/unit/test_user.py-42- if hasattr(auth_settings.SUPERUSER_PASSWORD, "get_secret_value")
src/backend/tests/unit/test_user.py-43- else auth_settings.SUPERUSER_PASSWORD
src/backend/tests/unit/test_user.py-44- ),
--
src/backend/tests/unit/[email protected]
src/backend/tests/unit/test_setup_superuser_flow.py-11-async def test_initialize_services_creates_default_superuser_when_auto_login_true(client): # noqa: ARG001
src/backend/tests/unit/test_setup_superuser_flow.py-12- from langflow.services.deps import session_scope
src/backend/tests/unit/test_setup_superuser_flow.py-13-
src/backend/tests/unit/test_setup_superuser_flow.py-14- settings = get_settings_service()
src/backend/tests/unit/test_setup_superuser_flow.py:15: settings.auth_settings.AUTO_LOGIN = True
src/backend/tests/unit/test_setup_superuser_flow.py-16-
src/backend/tests/unit/test_setup_superuser_flow.py-17- await initialize_services()
src/backend/tests/unit/test_setup_superuser_flow.py-18-
src/backend/tests/unit/test_setup_superuser_flow.py-19- async with session_scope() as session:
src/backend/tests/unit/test_setup_superuser_flow.py-20- stmt = select(User).where(User.username == DEFAULT_SUPERUSER)
--
src/backend/tests/unit/[email protected]
src/backend/tests/unit/test_setup_superuser_flow.py-27-async def test_teardown_superuser_removes_default_if_never_logged(client): # noqa: ARG001
src/backend/tests/unit/test_setup_superuser_flow.py-28- from langflow.services.deps import session_scope
src/backend/tests/unit/test_setup_superuser_flow.py-29-
src/backend/tests/unit/test_setup_superuser_flow.py-30- settings = get_settings_service()
src/backend/tests/unit/test_setup_superuser_flow.py:31: settings.auth_settings.AUTO_LOGIN = False
src/backend/tests/unit/test_setup_superuser_flow.py-32-
src/backend/tests/unit/test_setup_superuser_flow.py-33- # Ensure default exists and has never logged in
src/backend/tests/unit/test_setup_superuser_flow.py-34- await initialize_services()
--
src/backend/tests/unit/test_setup_superuser_flow.py-76- await initialize_services()
src/backend/tests/unit/test_setup_superuser_flow.py-77-
src/backend/tests/unit/test_setup_superuser_flow.py-78- async with session_scope() as session:
--
src/backend/tests/unit/test_setup_superuser_flow.py-110-async def test_setup_superuser_with_no_configured_credentials(client): # noqa: ARG001
src/backend/tests/unit/test_setup_superuser_flow.py-111- """Test setup_superuser behavior when no superuser credentials are configured."""
src/backend/tests/unit/test_setup_superuser_flow.py-112- from langflow.services.deps import session_scope
src/backend/tests/unit/test_setup_superuser_flow.py-113-
src/backend/tests/unit/test_setup_superuser_flow.py-114- settings = get_settings_service()
src/backend/tests/unit/test_setup_superuser_flow.py:115: settings.auth_settings.AUTO_LOGIN = False
src/backend/tests/unit/test_setup_superuser_flow.py-116- settings.auth_settings.SUPERUSER = ""
src/backend/tests/unit/test_setup_superuser_flow.py-117- # Reset password to empty
src/backend/tests/unit/test_setup_superuser_flow.py-118- settings.auth_settings.SUPERUSER_PASSWORD = ""
src/backend/tests/unit/test_setup_superuser_flow.py-119-
src/backend/tests/unit/test_setup_superuser_flow.py-120- async with session_scope() as session:
--
src/backend/tests/unit/test_setup_superuser_flow.py-133- """Test setup_superuser behavior with custom superuser credentials."""
src/backend/tests/unit/test_setup_superuser_flow.py-134- from langflow.services.deps import session_scope
src/backend/tests/unit/test_setup_superuser_flow.py-135- from pydantic import SecretStr
src/backend/tests/unit/test_setup_superuser_flow.py-136-
src/backend/tests/unit/test_setup_superuser_flow.py-137- settings = get_settings_service()
src/backend/tests/unit/test_setup_superuser_flow.py:138: settings.auth_settings.AUTO_LOGIN = False
src/backend/tests/unit/test_setup_superuser_flow.py-139- settings.auth_settings.SUPERUSER = "custom_admin"
src/backend/tests/unit/test_setup_superuser_flow.py-140- settings.auth_settings.SUPERUSER_PASSWORD = SecretStr("custom_password")
src/backend/tests/unit/test_setup_superuser_flow.py-141-
src/backend/tests/unit/test_setup_superuser_flow.py-142- # Clean DB state to avoid interference from previous tests
src/backend/tests/unit/test_setup_superuser_flow.py-143- async with session_scope() as session:
--
src/backend/tests/unit/test_setup_superuser.py-16-# @patch("langflow.services.deps.get_settings_service")
src/backend/tests/unit/test_setup_superuser.py-17-# # @patch("langflow.services.utils.verify_password")
src/backend/tests/unit/test_setup_superuser.py-18-# def test_setup_superuser(
src/backend/tests/unit/test_setup_superuser.py-19-# mock_get_session, mock_create_super_user, mock_get_settings_service
src/backend/tests/unit/test_setup_superuser.py-20-# ):
src/backend/tests/unit/test_setup_superuser.py:21:# # Test when AUTO_LOGIN is True
src/backend/tests/unit/test_setup_superuser.py-22-# calls = []
src/backend/tests/unit/test_setup_superuser.py-23-# mock_settings_service = Mock()
src/backend/tests/unit/test_setup_superuser.py:24:# mock_settings_service.auth_settings.AUTO_LOGIN = True
src/backend/tests/unit/test_setup_superuser.py-25-# mock_settings_service.auth_settings.SUPERUSER = DEFAULT_SUPERUSER
src/backend/tests/unit/test_setup_superuser.py-26-# mock_settings_service.auth_settings.SUPERUSER_PASSWORD = DEFAULT_SUPERUSER_PASSWORD
src/backend/tests/unit/test_setup_superuser.py-27-# mock_get_settings_service.return_value = mock_settings_service
src/backend/tests/unit/test_setup_superuser.py-28-# mock_session = Mock()
src/backend/tests/unit/test_setup_superuser.py-29-# mock_session.query.return_value.filter.return_value.first.return_value = (
--
src/backend/tests/unit/test_setup_superuser.py-53-# )
src/backend/tests/unit/test_setup_superuser.py-54-
src/backend/tests/unit/test_setup_superuser.py-55-# ADMIN_USER_NAME = "admin_user"
src/backend/tests/unit/test_setup_superuser.py-56-# # Test when username and password are default
src/backend/tests/unit/test_setup_superuser.py-57-# mock_settings_service.auth_settings = Mock()
src/backend/tests/unit/test_setup_superuser.py:58:# mock_settings_service.auth_settings.AUTO_LOGIN = False
src/backend/tests/unit/test_setup_superuser.py-59-# mock_settings_service.auth_settings.SUPERUSER = ADMIN_USER_NAME
src/backend/tests/unit/test_setup_superuser.py-60-# mock_settings_service.auth_settings.SUPERUSER_PASSWORD = "password"
src/backend/tests/unit/test_setup_superuser.py-61-# mock_settings_service.auth_settings.reset_credentials = Mock(
src/backend/tests/unit/test_setup_superuser.py-62-# side_effect=reset_mock_credentials
src/backend/tests/unit/test_setup_superuser.py-63-# )
--
src/backend/tests/unit/test_setup_superuser.py-78-# mock_settings_service.auth_settings.reset_credentials.assert_called_once()
src/backend/tests/unit/test_setup_superuser.py-79-# assert mock_settings_service.auth_settings.SUPERUSER != ADMIN_USER_NAME
src/backend/tests/unit/test_setup_superuser.py-80-# assert mock_settings_service.auth_settings.SUPERUSER_PASSWORD != "password"
src/backend/tests/unit/test_setup_superuser.py-81-
src/backend/tests/unit/test_setup_superuser.py-82-# # Test when superuser already exists
src/backend/tests/unit/test_setup_superuser.py:83:# mock_settings_service.auth_settings.AUTO_LOGIN = False
src/backend/tests/unit/test_setup_superuser.py-84-# mock_settings_service.auth_settings.SUPERUSER = ADMIN_USER_NAME
src/backend/tests/unit/test_setup_superuser.py-85-# mock_settings_service.auth_settings.SUPERUSER_PASSWORD = "password"
src/backend/tests/unit/test_setup_superuser.py-86-# mock_user = Mock()
src/backend/tests/unit/test_setup_superuser.py-87-# mock_user.is_superuser = True
src/backend/tests/unit/test_setup_superuser.py-88-# mock_session.query.return_value.filter.return_value.first.return_value = mock_user
--
src/backend/tests/unit/test_setup_superuser.py-96-
src/backend/tests/unit/test_setup_superuser.py-97-@patch("langflow.services.deps.get_settings_service")
src/backend/tests/unit/test_setup_superuser.py-98-@patch("langflow.services.deps.get_session")
src/backend/tests/unit/test_setup_superuser.py-99-async def test_teardown_superuser_default_superuser(mock_get_session, mock_get_settings_service):
src/backend/tests/unit/test_setup_superuser.py-100- mock_settings_service = MagicMock()
src/backend/tests/unit/test_setup_superuser.py:101: mock_settings_service.auth_settings.AUTO_LOGIN = True
src/backend/tests/unit/test_setup_superuser.py-102- mock_settings_service.auth_settings.SUPERUSER = DEFAULT_SUPERUSER
src/backend/tests/unit/test_setup_superuser.py-103- mock_settings_service.auth_settings.SUPERUSER_PASSWORD = DEFAULT_SUPERUSER_PASSWORD
src/backend/tests/unit/test_setup_superuser.py-104- mock_get_settings_service.return_value = mock_settings_service
src/backend/tests/unit/test_setup_superuser.py-105-
src/backend/tests/unit/test_setup_superuser.py-106- mock_session = MagicMock()
--
src/backend/tests/unit/test_setup_superuser.py-115-
src/backend/tests/unit/test_setup_superuser.py-116-
src/backend/tests/unit/test_setup_superuser.py-117-async def test_teardown_superuser_no_default_superuser():
src/backend/tests/unit/test_setup_superuser.py-118- admin_user_name = "admin_user"
src/backend/tests/unit/test_setup_superuser.py-119- mock_settings_service = MagicMock()
src/backend/tests/unit/test_setup_superuser.py:120: mock_settings_service.auth_settings.AUTO_LOGIN = False
=== Check if there's any config that shows SUPERUSER can be empty ===
</shell_result>
Script executed:
#!/bin/bash
# Look at the comment again and full logic flow
echo "=== Full auth setup logic with comments ==="
sed -n '369,395p' src/backend/base/langflow/api/utils/mcp/config_utils.py
echo -e "\n=== Check what encrypt_auth_settings does ==="
rg -n "def encrypt_auth_settings" src/backend/ -A 10
<shell_result>
Repository: langflow-ai/langflow
Exit code: 0
stdout:
=== Full auth setup logic with comments ===
# Set up THIS USER'S starter folder authentication (same as new projects)
# If AUTO_LOGIN is false, automatically enable API key authentication
default_auth = {"auth_type": "none"}
await logger.adebug(f"Settings service auth settings: {settings_service.auth_settings}")
await logger.adebug(f"User starter folder auth settings: {user_starter_folder.auth_settings}")
if (
not user_starter_folder.auth_settings
and settings_service.auth_settings.AUTO_LOGIN
and not settings_service.auth_settings.SUPERUSER
):
default_auth = {"auth_type": "apikey"}
user_starter_folder.auth_settings = encrypt_auth_settings(default_auth)
await logger.adebug(
"AUTO_LOGIN enabled without SUPERUSER; forcing API key auth for starter folder %s",
user.username,
)
elif not settings_service.auth_settings.AUTO_LOGIN and not user_starter_folder.auth_settings:
default_auth = {"auth_type": "apikey"}
user_starter_folder.auth_settings = encrypt_auth_settings(default_auth)
await logger.adebug(f"Set up auth settings for user {user.username}'s starter folder")
elif user_starter_folder.auth_settings:
default_auth = user_starter_folder.auth_settings
# Create API key for this user to access their own starter projects
api_key_name = f"MCP Project {DEFAULT_FOLDER_NAME} - {user.username}"
unmasked_api_key = await create_api_key(session, ApiKeyCreate(name=api_key_name), user.id)
=== Check what encrypt_auth_settings does ===
src/backend/base/langflow/services/auth/mcp_encryption.py:18:def encrypt_auth_settings(auth_settings: dict[str, Any] | None) -> dict[str, Any] | None:
src/backend/base/langflow/services/auth/mcp_encryption.py-19- """Encrypt sensitive fields in auth_settings dictionary.
src/backend/base/langflow/services/auth/mcp_encryption.py-20-
src/backend/base/langflow/services/auth/mcp_encryption.py-21- Args:
src/backend/base/langflow/services/auth/mcp_encryption.py-22- auth_settings: Dictionary containing authentication settings
src/backend/base/langflow/services/auth/mcp_encryption.py-23-
src/backend/base/langflow/services/auth/mcp_encryption.py-24- Returns:
src/backend/base/langflow/services/auth/mcp_encryption.py-25- Dictionary with sensitive fields encrypted, or None if input is None
src/backend/base/langflow/services/auth/mcp_encryption.py-26- """
src/backend/base/langflow/services/auth/mcp_encryption.py-27- if auth_settings is None:
src/backend/base/langflow/services/auth/mcp_encryption.py-28- return None
</shell_result>
Based on my verification, I can now provide the rewritten review comment:
Comment on AUTO_LOGIN security hardening is correct, but the documentation is misleading.
The code correctly forces API key authentication for MCP starter folders when AUTO_LOGIN=True but SUPERUSER is unconfigured—an edge case that deserves hardening since AUTO_LOGIN bypasses normal authentication and gives all users admin access. However, the comment on line 370 ("If AUTO_LOGIN is false, automatically enable API key authentication") describes only the elif condition (line 385) and does not document the primary if condition (lines 374-384). The first condition warrants its own comment explaining the security rationale for this edge case.
if (
not user_starter_folder.auth_settings
and settings_service.auth_settings.AUTO_LOGIN
and not settings_service.auth_settings.SUPERUSER
):
default_auth = {"auth_type": "apikey"}
user_starter_folder.auth_settings = encrypt_auth_settings(default_auth)
await logger.adebug(
"AUTO_LOGIN enabled without SUPERUSER; forcing API key auth for starter folder %s",
user.username,
)

Recommend clarifying the comment to explicitly document both branches: the security hardening for auto-login edge cases and the standard behavior when auto-login is disabled.
🤖 Prompt for AI Agents
In src/backend/base/langflow/api/utils/mcp/config_utils.py around lines 374 to
384, the existing inline comment only documents the elif branch and omits the
security rationale for the primary if branch that forces API key auth when
AUTO_LOGIN is enabled but SUPERUSER is not set; add a clear comment immediately
above this if that explains the edge-case hardening (AUTO_LOGIN bypasses normal
auth so we force API key auth for starter folders when no SUPERUSER is
configured), and update or expand the nearby comment at ~line 370 so both
branches (the AUTO_LOGIN-without-SUPERUSER hardening and the behavior when
AUTO_LOGIN is false) are explicitly documented.
 try:
-    async with sse.connect_sse(request.scope, request.receive, request._send) as streams:  # noqa: SLF001
-        try:
-            msg = "Starting SSE connection"
-            await logger.adebug(msg)
-            msg = f"Stream types: read={type(streams[0])}, write={type(streams[1])}"
-            await logger.adebug(msg)
-
-            notification_options = NotificationOptions(
-                prompts_changed=True, resources_changed=True, tools_changed=True
-            )
-            init_options = server.create_initialization_options(notification_options)
-            msg = f"Initialization options: {init_options}"
-            await logger.adebug(msg)
-
-            try:
-                await server.run(streams[0], streams[1], init_options)
-            except Exception as exc:  # noqa: BLE001
-                validation_error = find_validation_error(exc)
-                if validation_error:
-                    msg = "Validation error in MCP:" + str(validation_error)
-                    await logger.adebug(msg)
-                else:
-                    msg = f"Error in MCP: {exc!s}"
-                    await logger.adebug(msg)
-                return
-        except BrokenResourceError:
-            # Handle gracefully when client disconnects
-            await logger.ainfo("Client disconnected from SSE connection")
-        except asyncio.CancelledError:
-            await logger.ainfo("SSE connection was cancelled")
-            raise
-except Exception as e:
-    msg = f"Error in MCP: {e!s}"
-    await logger.aexception(msg)
-    raise
+    # Use the global variable instead of the old singleton
+    await _streamable_http_manager.handle_request(request.scope, request.receive, request._send)  # noqa: SLF001
+except HTTPException:
+    raise
+except Exception as exc:
+    await logger.aexception(f"Error handling Streamable HTTP request: {exc!s}")
+    raise HTTPException(status_code=500, detail="Internal server error in Streamable HTTP transport") from exc
Add None check for _streamable_http_manager to fix Mypy union-attr error.
The pipeline shows a Mypy error: 'None' of 'Any | None' has no attribute 'handle_request'. After _ensure_streamable_http_manager_running() completes, _streamable_http_manager should be set, but Mypy cannot verify this. Add an explicit guard.
context_token = current_user_ctx.set(current_user)
try:
- # Use the global variable instead of the old singleton
- await _streamable_http_manager.handle_request(request.scope, request.receive, request._send) # noqa: SLF001
+ if _streamable_http_manager is None:
+ raise HTTPException(status_code=503, detail="Streamable HTTP manager not initialized")
+ await _streamable_http_manager.handle_request(request.scope, request.receive, request._send) # noqa: SLF001
except HTTPException:
        raise

🧰 Tools
🪛 GitHub Actions: CI
[error] 148-148: Mypy error: 'None' of 'Any | None' has no attribute 'handle_request' [union-attr].
🤖 Prompt for AI Agents
In src/backend/base/langflow/api/v1/mcp.py around lines 146 to 153, add an
explicit None-check for _streamable_http_manager before calling its
handle_request method to satisfy Mypy; after calling
_ensure_streamable_http_manager_running(), verify if _streamable_http_manager is
None and if so log an error (or use logger.aexception) and raise an
HTTPException(500, "Internal server error in Streamable HTTP transport") (or a
suitable runtime exception) instead of calling handle_request on a possible
None; this makes the control flow explicit for the type checker and preserves
the existing error handling semantics.
Actionable comments posted: 3
♻️ Duplicate comments (3)
src/backend/base/langflow/api/v2/mcp.py (1)
132-142: API key orphaning risk when config upload fails.

This concern was raised in a previous review. The sequence commits the API key (line 135) before calling upload_server_config (lines 136-142). If the upload fails, the API key persists in the database but isn't referenced by any server configuration.

Consider wrapping the upload in try/except to delete the orphaned key on failure.
src/backend/tests/unit/api/v1/test_mcp.py (1)
82-84: Prefix unused fixture parameters with underscore to satisfy Ruff ARG001.

The mock_ensure_manager_running fixture is required to activate the patch but isn't referenced in test bodies.

 async def test_mcp_post_endpoint(
-    client: AsyncClient, logged_in_headers, mock_streamable_http_manager, mock_ensure_manager_running
+    client: AsyncClient, logged_in_headers, mock_streamable_http_manager, _mock_ensure_manager_running
 ):

Apply the same rename in test_mcp_post_endpoint_disconnect_error and test_mcp_post_endpoint_server_error.
test_mcp_post_endpoint_disconnect_errorandtest_mcp_post_endpoint_server_error.Also applies to: 108-110, 120-122
src/backend/base/langflow/api/v1/mcp.py (1)
148-148: request._send is an undocumented private Starlette attribute.

This was flagged in a previous review. The same pattern exists in mcp_projects.py line 337.
🧹 Nitpick comments (4)
src/backend/base/langflow/api/v2/mcp.py (2)
373-386: Verify arg parsing handles edge cases correctly.

The while-loop parses args by skipping --transport (2 tokens), capturing --headers (3 tokens), and filtering HTTP URLs. However:
- If --headers has fewer than 3 tokens remaining, slicing args[idx : idx + 3] may capture fewer elements than expected.
- If an arg starts with "http" but isn't the URL (e.g., a header value like httpbin), it would be incorrectly skipped.

Consider adding bounds checking and more precise URL detection.
 while idx < len(args):
     arg_item = args[idx]
     if arg_item == "--transport":
+        if idx + 1 >= len(args):
+            idx += 1
+            continue
         idx += 2
         continue
     if arg_item == "--headers":
+        if idx + 2 >= len(args):
+            idx += 1
+            continue
         existing_header_tokens = args[idx : idx + 3]
         idx += 3
         continue
-    if isinstance(arg_item, str) and arg_item.startswith("http"):
+    if isinstance(arg_item, str) and arg_item.startswith(("http://", "https://")):
         idx += 1
         continue
     preserved_args.append(arg_item)
     idx += 1
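A runnable version of the bounds-checked filtering discussed above, as a sketch; filter_args is a hypothetical helper, not the function under review:

```python
def filter_args(args):
    """Skip --transport and its value, capture the three --headers tokens,
    drop absolute HTTP(S) URLs, and preserve everything else."""
    preserved, header_tokens = [], []
    idx = 0
    while idx < len(args):
        item = args[idx]
        if item == "--transport":
            # Skip the flag and its value, without running past the end.
            idx += 2 if idx + 1 < len(args) else 1
            continue
        if item == "--headers":
            if idx + 2 < len(args):
                header_tokens = args[idx : idx + 3]
                idx += 3
            else:
                # Truncated --headers: skip only the flag itself.
                idx += 1
            continue
        # Match full URL schemes so values like "httpbin" are kept.
        if isinstance(item, str) and item.startswith(("http://", "https://")):
            idx += 1
            continue
        preserved.append(item)
        idx += 1
    return preserved, header_tokens
```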
388-400: Simplify API key generation conditions.

The nested conditionals for should_generate_api_key are hard to follow. Consider consolidating:

-    should_generate_api_key = False
-
-    if settings_service.settings.mcp_composer_enabled:
-        if project.auth_settings and project.auth_settings.get("auth_type") == "apikey":
-            should_generate_api_key = True
-    elif project.auth_settings:
-        if project.auth_settings.get("auth_type") == "apikey":
-            should_generate_api_key = True
-    elif not settings_service.auth_settings.AUTO_LOGIN:
-        should_generate_api_key = True
-
-    if settings_service.auth_settings.AUTO_LOGIN and not settings_service.auth_settings.SUPERUSER:
-        should_generate_api_key = True
+    auth_type = (project.auth_settings or {}).get("auth_type")
+    auto_login = settings_service.auth_settings.AUTO_LOGIN
+    is_superuser = settings_service.auth_settings.SUPERUSER
+
+    should_generate_api_key = (
+        auth_type == "apikey"
+        or (not auto_login and not project.auth_settings)
+        or (auto_login and not is_superuser)
+    )

src/backend/base/langflow/api/utils/mcp/config_utils.py (1)
256-265: Validate composer_port type before conversion.

The int(composer_port) conversion on line 265 could raise ValueError if composer_port is not a valid numeric string. Consider validating or handling this case.

 async def get_composer_streamable_http_url(project: Folder) -> str:
     """Generate Streamable HTTP URL for the MCP Composer instance."""
     auth_config = await _get_mcp_composer_auth_config(project)
     composer_host = auth_config.get("oauth_host")
     composer_port = auth_config.get("oauth_port")
     if not composer_host or not composer_port:
         error_msg = "OAuth host and port are required to get the MCP Composer URL"
         raise ValueError(error_msg)
+    try:
+        port_int = int(composer_port)
+    except (ValueError, TypeError) as e:
+        error_msg = f"Invalid OAuth port value: {composer_port}"
+        raise ValueError(error_msg) from e
     composer_url = f"http://{composer_host}:{composer_port}"
-    return await get_url_by_os(composer_host, int(composer_port), composer_url)  # type: ignore[arg-type]
+    return await get_url_by_os(composer_host, port_int, composer_url)

src/backend/base/langflow/api/v1/mcp.py (1)
81-103: Consider refactoring global state to a class-based pattern.

Ruff PLW0603 flags the use of global statements. While the double-checked locking pattern is correctly implemented, encapsulating this state in a singleton class would improve testability and satisfy the linter.

Example pattern:
class _StreamableHTTPManagerState:
    def __init__(self):
        self.manager: StreamableHTTPSessionManager | None = None
        self.lock = asyncio.Lock()
        self.started = False
        self.stack: AsyncExitStack | None = None

_state = _StreamableHTTPManagerState()
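A sketch of how the ensure-running logic could hang off such a state object, keeping the fast-path check plus re-check under the lock but with no module-level global statements. The manager construction is represented by a factory returning any async context manager, not the real StreamableHTTPSessionManager:

```python
import asyncio
from contextlib import AsyncExitStack


class StreamableHTTPState:
    def __init__(self, manager_factory):
        self._factory = manager_factory
        self.manager = None
        self.started = False
        self._lock = asyncio.Lock()
        self._stack: AsyncExitStack | None = None

    async def ensure_running(self):
        if self.started:  # fast path: no lock once initialized
            return self.manager
        async with self._lock:  # slow path: re-check under the lock
            if not self.started:
                self._stack = AsyncExitStack()
                self.manager = await self._stack.enter_async_context(self._factory())
                self.started = True
        return self.manager


async def _demo():
    calls = []

    def factory():
        calls.append(1)
        return AsyncExitStack()  # any async context manager works as a stand-in

    state = StreamableHTTPState(factory)
    # Even with concurrent callers, the factory runs exactly once.
    await asyncio.gather(*(state.ensure_running() for _ in range(5)))
    return len(calls)


factory_calls = asyncio.run(_demo())
```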
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (18)
src/backend/base/langflow/api/utils/mcp/__init__.py (1 hunks)
src/backend/base/langflow/api/utils/mcp/config_utils.py (6 hunks)
src/backend/base/langflow/api/v1/mcp.py (3 hunks)
src/backend/base/langflow/api/v1/mcp_projects.py (25 hunks)
src/backend/base/langflow/api/v1/schemas.py (1 hunks)
src/backend/base/langflow/api/v2/mcp.py (4 hunks)
src/backend/base/langflow/initial_setup/starter_projects/News Aggregator.json (1 hunks)
src/backend/tests/unit/api/utils/test_config_utils.py (5 hunks)
src/backend/tests/unit/api/v1/test_mcp.py (3 hunks)
src/backend/tests/unit/api/v1/test_mcp_projects.py (8 hunks)
src/frontend/src/controllers/API/queries/mcp/use-get-composer-url.ts (1 hunks)
src/frontend/src/controllers/API/queries/mcp/use-patch-flows-mcp.ts (2 hunks)
src/frontend/src/customization/utils/custom-mcp-url.ts (1 hunks)
src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts (1 hunks)
src/frontend/src/types/mcp/index.ts (1 hunks)
src/lfx/src/lfx/services/mcp_composer/service.py (9 hunks)
src/lfx/tests/unit/services/settings/test_mcp_composer.py (4 hunks)
src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py (5 hunks)
🧰 Additional context used
📓 Path-based instructions (8)
**/{test_*.py,*.test.ts,*.test.tsx}
📄 CodeRabbit inference engine (coderabbit-custom-pre-merge-checks-unique-id-file-non-traceable-F7F2B60C-1728-4C9A-8889-4F2235E186CA.txt)
Check that test files follow the project's naming conventions (test_*.py for backend, *.test.ts for frontend)
Files:
src/lfx/tests/unit/services/settings/test_mcp_composer.py
src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py
src/backend/tests/unit/api/v1/test_mcp.py
src/backend/tests/unit/api/v1/test_mcp_projects.py
src/backend/tests/unit/api/utils/test_config_utils.py
**/test_*.py
📄 CodeRabbit inference engine (coderabbit-custom-pre-merge-checks-unique-id-file-non-traceable-F7F2B60C-1728-4C9A-8889-4F2235E186CA.txt)
**/test_*.py: Backend tests should follow pytest structure with proper test_*.py naming
For async functions, ensure proper async testing patterns are used with pytest for backend
Files:
src/lfx/tests/unit/services/settings/test_mcp_composer.py
src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py
src/backend/tests/unit/api/v1/test_mcp.py
src/backend/tests/unit/api/v1/test_mcp_projects.py
src/backend/tests/unit/api/utils/test_config_utils.py
src/frontend/src/**/*.{ts,tsx}
📄 CodeRabbit inference engine (.cursor/rules/frontend_development.mdc)
src/frontend/src/**/*.{ts,tsx}: Use React 18 with TypeScript for frontend development
Use Zustand for state management
Files:
src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts
src/frontend/src/types/mcp/index.ts
src/frontend/src/controllers/API/queries/mcp/use-get-composer-url.ts
src/frontend/src/customization/utils/custom-mcp-url.ts
src/frontend/src/controllers/API/queries/mcp/use-patch-flows-mcp.ts
src/frontend/src/**/{hooks,services}/**/*.{ts,tsx}
📄 CodeRabbit inference engine (.cursor/rules/frontend_development.mdc)
Use the useApi hook for API calls with proper error handling, including loading state, error state, and exception throwing
Files:
src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts
src/frontend/src/**/{services,hooks}/**/*.{ts,tsx}
📄 CodeRabbit inference engine (.cursor/rules/frontend_development.mdc)
Use async/await pattern for API calls instead of .then() chains
Files:
src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts
src/backend/**/*.py
📄 CodeRabbit inference engine (.cursor/rules/backend_development.mdc)
src/backend/**/*.py: Use FastAPI async patterns with await for async operations in component execution methods
Use asyncio.create_task() for background tasks and implement proper cleanup with try/except for asyncio.CancelledError
Use queue.put_nowait() for non-blocking queue operations and asyncio.wait_for() with timeouts for controlled get operations
Files:
src/backend/base/langflow/api/v2/mcp.py
src/backend/base/langflow/api/v1/schemas.py
src/backend/base/langflow/api/utils/mcp/__init__.py
src/backend/tests/unit/api/v1/test_mcp.py
src/backend/tests/unit/api/v1/test_mcp_projects.py
src/backend/tests/unit/api/utils/test_config_utils.py
src/backend/base/langflow/api/utils/mcp/config_utils.py
src/backend/base/langflow/api/v1/mcp_projects.py
src/backend/base/langflow/api/v1/mcp.py
src/backend/base/langflow/api/**/*.py
📄 CodeRabbit inference engine (.cursor/rules/backend_development.mdc)
Backend API endpoints should be organized by version (v1/, v2/) under src/backend/base/langflow/api/ with specific modules for features (chat.py, flows.py, users.py, etc.)
Files:
src/backend/base/langflow/api/v2/mcp.py
src/backend/base/langflow/api/v1/schemas.py
src/backend/base/langflow/api/utils/mcp/__init__.py
src/backend/base/langflow/api/utils/mcp/config_utils.py
src/backend/base/langflow/api/v1/mcp_projects.py
src/backend/base/langflow/api/v1/mcp.py
src/backend/tests/**/*.py
📄 CodeRabbit inference engine (.cursor/rules/testing.mdc)
src/backend/tests/**/*.py: Place backend unit tests in src/backend/tests/ directory, component tests in src/backend/tests/unit/components/ organized by component subdirectory, and integration tests accessible via make integration_tests
Use same filename as component with appropriate test prefix/suffix (e.g., my_component.py → test_my_component.py)
Use the client fixture (FastAPI Test Client) defined in src/backend/tests/conftest.py for API tests; it provides an async httpx.AsyncClient with automatic in-memory SQLite database and mocked environment variables. Skip client creation by marking test with @pytest.mark.noclient
Inherit from the correct ComponentTestBase family class located in src/backend/tests/base.py based on API access needs: ComponentTestBase (no API), ComponentTestBaseWithClient (needs API), or ComponentTestBaseWithoutClient (pure logic). Provide three required fixtures: component_class, default_kwargs, and file_names_mapping
Create comprehensive unit tests for all new backend components. If unit tests are incomplete, create a corresponding Markdown file documenting manual testing steps and expected outcomes
Test both sync and async code paths, mock external dependencies appropriately, test error handling and edge cases, validate input/output behavior, and test component initialization and configuration
Use @pytest.mark.asyncio decorator for async component tests and ensure async methods are properly awaited
Test background tasks using asyncio.create_task() and verify completion with asyncio.wait_for() with appropriate timeout constraints
Test queue operations using non-blocking queue.put_nowait() and asyncio.wait_for(queue.get(), timeout=...) to verify queue processing without blocking
Use @pytest.mark.no_blockbuster marker to skip the blockbuster plugin in specific tests
For database tests that may fail in batch runs, run them sequentially using uv run pytest src/backend/tests/unit/test_database.py r...
Files:
src/backend/tests/unit/api/v1/test_mcp.py
src/backend/tests/unit/api/v1/test_mcp_projects.py
src/backend/tests/unit/api/utils/test_config_utils.py
🧠 Learnings (14)
📓 Common learnings
Learnt from: deon-sanchez
Repo: langflow-ai/langflow PR: 9158
File: src/backend/base/langflow/api/v1/mcp_projects.py:404-404
Timestamp: 2025-07-23T21:19:22.567Z
Learning: In langflow MCP projects configuration, prefer using dynamically computed URLs (like the `sse_url` variable) over hardcoded localhost URLs to ensure compatibility across different deployment environments.
📚 Learning: 2025-07-23T21:19:22.567Z
Learnt from: deon-sanchez
Repo: langflow-ai/langflow PR: 9158
File: src/backend/base/langflow/api/v1/mcp_projects.py:404-404
Timestamp: 2025-07-23T21:19:22.567Z
Learning: In langflow MCP projects configuration, prefer using dynamically computed URLs (like the `sse_url` variable) over hardcoded localhost URLs to ensure compatibility across different deployment environments.
Applied to files:
src/lfx/tests/unit/services/settings/test_mcp_composer.py
src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts
src/frontend/src/customization/utils/custom-mcp-url.ts
src/backend/base/langflow/api/v2/mcp.py
src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py
src/backend/base/langflow/api/utils/mcp/__init__.py
src/backend/tests/unit/api/utils/test_config_utils.py
src/backend/base/langflow/api/utils/mcp/config_utils.py
src/lfx/src/lfx/services/mcp_composer/service.py
src/backend/base/langflow/api/v1/mcp_projects.py
src/backend/base/langflow/api/v1/mcp.py
📚 Learning: 2025-11-24T19:46:09.074Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/backend_development.mdc:0-0
Timestamp: 2025-11-24T19:46:09.074Z
Learning: Applies to src/backend/base/langflow/services/database/models/**/*.py : Database models should be organized by domain (api_key/, flow/, folder/, user/, etc.) under `src/backend/base/langflow/services/database/models/`
Applied to files:
src/backend/base/langflow/api/v2/mcp.py
📚 Learning: 2025-11-24T19:46:09.074Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/backend_development.mdc:0-0
Timestamp: 2025-11-24T19:46:09.074Z
Learning: Applies to src/backend/base/langflow/components/**/__init__.py : Update `__init__.py` with alphabetically sorted imports when adding new components
Applied to files:
src/backend/base/langflow/api/utils/mcp/__init__.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Use `monkeypatch` fixture to mock internal functions for testing error handling scenarios; validate error status codes and error message content in responses
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Test both sync and async code paths, mock external dependencies appropriately, test error handling and edge cases, validate input/output behavior, and test component initialization and configuration
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Use `pytest.mark.api_key_required` and `pytest.mark.no_blockbuster` markers for components that need external APIs; use `MockLanguageModel` from `tests.unit.mock_language_model` for testing without external API keys
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Use `aiofiles` and `anyio.Path` for async file operations in tests; create temporary test files using `tmp_path` fixture and verify file existence and content
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Each test should have a clear docstring explaining its purpose; complex test setups should be commented; mock usage should be documented; expected behaviors should be explicitly stated
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Use async fixtures with proper cleanup using try/finally blocks to ensure resources are properly released after tests complete
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Use the `client` fixture (FastAPI Test Client) defined in `src/backend/tests/conftest.py` for API tests; it provides an async `httpx.AsyncClient` with automatic in-memory SQLite database and mocked environment variables. Skip client creation by marking test with `pytest.mark.noclient`
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Create comprehensive unit tests for all new backend components. If unit tests are incomplete, create a corresponding Markdown file documenting manual testing steps and expected outcomes
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:47:28.965Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/testing.mdc:0-0
Timestamp: 2025-11-24T19:47:28.965Z
Learning: Applies to src/backend/tests/**/*.py : Test webhook endpoints by posting to `api/v1/webhook/{endpoint_name}` with appropriate payloads and validating response status codes
Applied to files:
src/backend/tests/unit/api/v1/test_mcp.py
📚 Learning: 2025-11-24T19:46:09.074Z
Learnt from: CR
Repo: langflow-ai/langflow PR: 0
File: .cursor/rules/backend_development.mdc:0-0
Timestamp: 2025-11-24T19:46:09.074Z
Learning: Applies to src/backend/**/*.py : Use FastAPI async patterns with `await` for async operations in component execution methods
Applied to files:
src/backend/base/langflow/api/v1/mcp.py
🧬 Code graph analysis (10)
src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts (1)
src/frontend/src/customization/utils/custom-mcp-url.ts (1)
customGetMCPUrl(9-26)
src/frontend/src/controllers/API/queries/mcp/use-get-composer-url.ts (3)
src/frontend/src/types/api/index.ts (1)
useQueryFunctionType (266-277)
src/frontend/src/types/mcp/index.ts (1)
ComposerUrlResponseType (48-55)
src/frontend/src/controllers/API/services/request-processor.ts (1)
UseRequestProcessor (14-54)
src/frontend/src/customization/utils/custom-mcp-url.ts (1)
src/frontend/src/controllers/API/api.tsx (1)
api(371-371)
src/frontend/src/controllers/API/queries/mcp/use-patch-flows-mcp.ts (1)
src/frontend/src/types/mcp/index.ts (1)
ComposerUrlResponseType(48-55)
src/backend/base/langflow/api/utils/mcp/__init__.py (1)
src/backend/base/langflow/api/utils/mcp/config_utils.py (4)
get_composer_streamable_http_url (256-265)
get_project_sse_url (233-238)
get_project_streamable_http_url (225-230)
get_url_by_os (181-204)
src/backend/tests/unit/api/v1/test_mcp.py (1)
src/backend/tests/unit/api/v1/test_mcp_projects.py (1)
mock_streamable_http_manager(71-86)
src/backend/tests/unit/api/v1/test_mcp_projects.py (2)
src/backend/base/langflow/api/v1/mcp_projects.py (2)
ProjectMCPServer (1084-1151)
get_project_mcp_server (1158-1167)
src/backend/tests/unit/api/v1/test_mcp.py (1)
mock_streamable_http_manager(35-39)
src/backend/tests/unit/api/utils/test_config_utils.py (1)
src/backend/tests/unit/api/v2/test_registration.py (1)
client(19-22)
src/backend/base/langflow/api/utils/mcp/config_utils.py (2)
src/backend/base/langflow/services/auth/mcp_encryption.py (2)
decrypt_auth_settings (58-97)
encrypt_auth_settings (18-55)
src/backend/base/langflow/api/v1/mcp_projects.py (1)
_get_mcp_composer_auth_config(1058-1080)
src/backend/base/langflow/api/v1/mcp_projects.py (5)
src/backend/base/langflow/api/utils/mcp/config_utils.py (6)
auto_configure_starter_projects_mcp (268-453)
get_composer_streamable_http_url (256-265)
get_project_sse_url (233-238)
get_project_streamable_http_url (225-230)
get_url_by_os (181-204)
_get_mcp_composer_auth_config (241-253)
src/backend/base/langflow/api/v1/schemas.py (2)
ComposerUrlResponse (463-470)
MCPProjectResponse (456-460)
src/backend/base/langflow/api/v1/mcp.py (1)
im_alive (127-128)
src/lfx/src/lfx/services/mcp_composer/service.py (3)
clear_last_error (97-99)
get_last_error (89-91)
start_project_composer (951-1013)
src/backend/base/langflow/__main__.py (1)
run (185-417)
🪛 GitHub Actions: CI
src/backend/base/langflow/api/v1/mcp.py
[error] 148-148: mypy: Item "None" of "Any | None" has no attribute "handle_request" [union-attr]. Command: uv run mypy --namespace-packages -p "langflow"
🪛 GitHub Actions: Ruff Style Check
src/backend/base/langflow/api/v1/mcp.py
[error] 89-89: PLW0603 Using the global statement to update _streamable_http_manager_started is discouraged. (ruff check with --output-format=github)
🪛 GitHub Check: Ruff Style Check (3.13)
src/backend/tests/unit/api/v1/test_mcp.py
[failure] 83-83: Ruff (ARG001)
src/backend/tests/unit/api/v1/test_mcp.py:83:75: ARG001 Unused function argument: mock_ensure_manager_running
[failure] 121-121: Ruff (ARG001)
src/backend/tests/unit/api/v1/test_mcp.py:121:75: ARG001 Unused function argument: mock_ensure_manager_running
[failure] 109-109: Ruff (ARG001)
src/backend/tests/unit/api/v1/test_mcp.py:109:75: ARG001 Unused function argument: mock_ensure_manager_running
src/backend/base/langflow/api/v1/mcp.py
[failure] 113-113: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:113:78: PLW0603 Using the global statement to update _streamable_http_manager is discouraged
[failure] 113-113: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:113:46: PLW0603 Using the global statement to update _streamable_http_manager_stack is discouraged
[failure] 113-113: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:113:12: PLW0603 Using the global statement to update _streamable_http_manager_started is discouraged
[failure] 89-89: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:89:78: PLW0603 Using the global statement to update _streamable_http_manager is discouraged
[failure] 89-89: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:89:46: PLW0603 Using the global statement to update _streamable_http_manager_stack is discouraged
[failure] 89-89: Ruff (PLW0603)
src/backend/base/langflow/api/v1/mcp.py:89:12: PLW0603 Using the global statement to update _streamable_http_manager_started is discouraged
🔇 Additional comments (42)
src/backend/base/langflow/initial_setup/starter_projects/News Aggregator.json (1)
1927-1929: Verify the Google library version downgrade and confirm it aligns with PR objectives. Line 1928 shows the version pinned at `0.6.15`, down from `0.8.5`. This change appears unrelated to the PR's stated focus on MCP transport migration (SSE to Streamable HTTP), and the PR comments already noted that incidental changes to starter project JSON files should be avoided. Please confirm:
- Whether this downgrade is intentional and necessary for the MCP migration.
- Whether this version change aligns with the overall PR scope or if it should be reverted.
- Any compatibility concerns or breaking changes introduced by downgrading the Google library.
src/lfx/src/lfx/services/mcp_composer/service.py (3)
954-961: LGTM! Parameter migration to streamable HTTP is well-structured. The signature change from `sse_url` to `streamable_http_url`, with a backward-compatible `legacy_sse_url` parameter, aligns with the PR objectives. The keyword-only parameter syntax ensures clarity at call sites.
1040-1040: Attempted to verify the default legacy SSE URL derivation pattern by inspecting the context around line 1040 of src/lfx/src/lfx/services/mcp_composer/service.py and searching the repository for `legacy_sse_url` and `streamable_http_url` references and for SSE endpoint definitions; the scripted verification did not complete, so this item was left unverified.
1256-1271: Based on verification through web searches and codebase examination: command-line argument construction for HTTP mode is correctly implemented. The `--mode http`, `--endpoint`, and `--sse-url` flags are all supported by mcp-composer. The comment accurately reflects that `--sse-url` is retained for backward compatibility with composer tooling even in HTTP mode, which is documented in the package's public examples.

src/lfx/tests/unit/services/settings/test_mcp_composer.py (2)
194-195: LGTM! Test data updated consistently with service changes. The test fixture now uses `streamable_http_url` and `legacy_sse_url` keys, matching the updated service implementation.
232-237: LGTM! Test invocations updated correctly. All test calls to `_do_start_project_composer` now use the `streamable_http_url` parameter instead of `sse_url`, consistent with the updated signature.

src/backend/base/langflow/api/v1/schemas.py (1)
463-470: LGTM! Well-structured response model. The `ComposerUrlResponse` model clearly defines the composer URL response structure with appropriate optional fields for both streamable HTTP and legacy SSE URLs. The docstring and field types are clear.

src/frontend/src/pages/MainPage/pages/homePage/hooks/useMcpServer.ts (1)
125-128: Verify the removal of dependencies from the `apiUrl` useMemo. The `apiUrl` dependencies have been changed from `[folderName, apiKeyFromStore, ...]` to `[projectId, composerConnection]`. This appears correct since `customGetMCPUrl` now takes `projectId` and `composerConnection` as parameters. However, ensure that changes to `folderName` or `apiKeyFromStore` that should trigger URL recomputation are properly handled elsewhere in the component.
136-140: LGTM! Test updated to use streamable HTTP transport. The test configuration now uses the streamable HTTP URL with the `--transport streamablehttp` flag, aligning with the transport migration.
464-471: The transport flag value "streamablehttp" is correct and valid. According to the mcp-proxy documentation, the `--transport` flag accepts either "sse" or "streamablehttp". The test configurations correctly use "streamablehttp", which is an explicitly supported transport type. No changes are needed.

src/frontend/src/types/mcp/index.ts (1)
48-55: LGTM! Frontend type matches backend schema. The `ComposerUrlResponseType` definition correctly mirrors the backend `ComposerUrlResponse` model fields, ensuring type safety across the API boundary.
2-2: LGTM! Type references updated consistently. All type references have been updated from `ComposerUrlResponse` to `ComposerUrlResponseType`, maintaining consistency with the new frontend type definitions.

Also applies to: 13-13, 17-17
src/backend/base/langflow/api/utils/mcp/__init__.py (1)
3-17: LGTM! Public API exports updated correctly. The new URL utility functions `get_composer_streamable_http_url` and `get_project_streamable_http_url` are properly exported. The imports and `__all__` list are alphabetically sorted, following the project's conventions. Based on learnings, alphabetically sorted imports in `__init__.py` files are a project convention, which has been followed here.

src/lfx/tests/unit/services/settings/test_mcp_composer_windows.py (1)
227-227: Parameter renames align with the streamable HTTP migration. The test call sites consistently update `sse_url` to `streamable_http_url`, matching the new API signature in `MCPComposerService`. The test logic and assertions remain unchanged, validating the same behavior with the renamed parameter.

Also applies to: 318-318, 364-364, 419-419, 514-514
src/frontend/src/customization/utils/custom-mcp-url.ts (3)
3-7: Clean type definition for connection options. The `ComposerConnectionOptions` type provides a clear, extensible interface for the MCP URL configuration. Using an options-object pattern improves maintainability over multiple positional parameters.
15-21: URL priority logic correctly implements the transport migration. The precedence order (`streamableHttpUrl` → `legacySseUrl` → fallback) ensures:
- New streamable HTTP transport is preferred when the composer is active
- Legacy SSE URLs remain functional for backward compatibility
- Dynamic fallback for direct project access
23-25: Verified against the codebase: the endpoint correctly handles streamable HTTP connections via content negotiation. The backend endpoint at `GET /api/v1/mcp/project/{project_id}` (without the `/sse` suffix) is properly designed. It uses the Accept header to determine the response type:
- When `Accept: text/event-stream` is present, it routes to `_dispatch_project_streamable_http` for streaming
- Otherwise, it returns JSON metadata

The frontend correctly passes this URL to mcp-proxy, which handles content negotiation. The `/sse` endpoint (line 351) remains available for legacy compatibility.

src/backend/base/langflow/api/v2/mcp.py (2)
312-327: Robust URL parsing for project ID extraction. The helper correctly handles malformed URLs with try/except, searches for the "project" path segment, and validates UUID format. Edge cases (invalid UUID, missing path parts) return `None` appropriately.
402-404: Transport migration logic is correct. Stripping the `/sse` suffix and setting `--transport streamablehttp` correctly converts legacy SSE configs to the new transport format.

src/frontend/src/controllers/API/queries/mcp/use-patch-flows-mcp.ts (2)
3-7: Type imports align with the centralized MCP type definitions. Importing `ComposerUrlResponseType` from `@/types/mcp` ensures consistent typing across MCP-related queries and avoids inline type duplication.
21-24: Response type correctly references the shared `ComposerUrlResponseType`. Using the centralized type ensures the response shape stays synchronized with the backend schema changes (`streamable_http_url`, `legacy_sse_url`).

src/backend/base/langflow/api/utils/mcp/config_utils.py (4)
207-222: Clean extraction of base URL components. The helper correctly handles conversion of the `0.0.0.0` bind address to `localhost` and falls back through runtime_port → configured port → default 7860. This follows the learning about dynamically computed URLs.
225-230: Streamable HTTP URL builder follows the new transport model. The function correctly builds the URL without the `/sse` suffix and applies WSL IP handling. Based on learnings, this dynamic URL computation ensures deployment compatibility.
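The `/sse`-suffix handling used here, and in the v2 conversion of legacy SSE configs, can be sketched as one rewrite step; the function name and argument layout are assumptions for illustration:

```python
def convert_legacy_sse_entry(url: str, args: list[str]) -> tuple[str, list[str]]:
    """Rewrite one legacy server entry: drop the trailing /sse from
    the URL and ensure the proxy args request the streamablehttp
    transport."""
    new_url = url.removesuffix("/sse")
    new_args = list(args)  # copy so the caller's list is untouched
    if "--transport" not in new_args:
        new_args.extend(["--transport", "streamablehttp"])
    return new_url, new_args
```

`str.removesuffix` is a no-op when the URL already lacks `/sse`, so the conversion is idempotent and safe to run over already-migrated entries.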
374-388: New condition for AUTO_LOGIN without SUPERUSER. The added condition correctly handles the case where `AUTO_LOGIN` is enabled but `SUPERUSER` is false, defaulting to API key authentication for starter folders. The logging provides useful debug context.
400-423: Server config args correctly use streamable HTTP transport. Both the API key and no-auth paths now include `--transport streamablehttp` and use the new `streamable_http_url`. This is consistent with the transport migration.

src/backend/tests/unit/api/v1/test_mcp.py (2)
34-47: LGTM! The fixtures correctly mock the new Streamable HTTP transport components and align with the production code patterns in `mcp.py`.
94-99: LGTM! The status code change from 400 to 403 Forbidden correctly reflects authentication requirements via `CurrentActiveMCPUser`.

src/backend/tests/unit/api/v1/test_mcp_projects.py (5)
70-86: LGTM! The fixture correctly mocks `StreamableHTTPSessionManager` at the class level, matching how `ProjectMCPServer` instantiates it. The `handle_request` mock and async context manager setup align with the production usage.
186-197: LGTM! Test correctly verifies that `handle_request` is invoked on the `StreamableHTTPSessionManager`, matching the new streamable HTTP transport dispatch.
570-595: LGTM! The test properly validates MCP server creation, caching behavior, and instance type. The `asyncio.sleep(0)` correctly yields control for async test compliance.
598-619: LGTM! Test correctly verifies that `init_mcp_servers` initializes `ProjectMCPServer` instances for all projects and stores them in the cache.
622-640: LGTM! The error-handling test correctly simulates a failure during MCP server creation for the first project and verifies that `init_mcp_servers` continues processing subsequent projects.

src/backend/base/langflow/api/v1/mcp.py (2)
27-48: LGTM! The default notification options and monkey-patching approach correctly ensure MCP servers advertise change notifications by default. This pattern aligns with the `_configure_server_notification_defaults` function in `mcp_projects.py`.
161-169: LGTM! The route handlers correctly delegate to the common dispatch function, and the `/sse` backward-compatibility endpoint ensures existing clients continue to work during the migration.

src/backend/base/langflow/api/v1/mcp_projects.py (8)
71-93: LGTM! The notification-defaults configuration mirrors the pattern in `mcp.py` and correctly ensures per-project MCP servers advertise change notifications.
297-314: LGTM! The endpoint correctly implements content negotiation, delegating to streamable HTTP dispatch for event-stream requests while returning JSON for standard requests.
322-348: LGTM! The dispatch handler correctly manages per-project MCP server lifecycle and context propagation. The pattern aligns with the global MCP handler in `mcp.py`. Note: `request._send` usage is a known concern flagged in a previous review.
467-476: LGTM! The response correctly includes both `streamable_http_url` and `legacy_sse_url` for client compatibility. Based on learnings, dynamic URL computation ensures compatibility across deployment environments.
1084-1151: LGTM! The `ProjectMCPServer` class cleanly encapsulates per-project MCP server lifecycle management with proper async locking and context-aware handler registration.
1170-1177: LGTM! The shutdown hook ensures graceful cleanup of all per-project session managers, with proper exception handling to prevent one failure from blocking others.
943-951: LGTM! The utility correctly checks multiple URLs against server configurations, supporting both streamable HTTP and legacy SSE URL matching.
783-860: LGTM! The endpoint properly returns `ComposerUrlResponse` model instances for all code paths, ensuring consistent API response typing.
Force-pushed: 63eb353 → 386fa99, e6168d8 → 9afc891, 21c123a → 07909c4
@coderabbitai update the walkthrough, it is outdated

@HzaRashid: I'll regenerate the walkthrough to ensure it accurately reflects the MCP transport migration from SSE to streamable HTTP, excluding any incidental chat component changes. ✅ Actions performed: Summary regeneration triggered.
fix tests; refactor mcp and mcp_projects; backwards compat with SSE transport; provide streamable http option for json mcp config; remove streamable_http mgmt and update tests
Force-pushed: 9583ee7 → 592923c, 653bea6 → dca3f83, 898155d → fb77727, 420b398 → cf8cd4c, 9427c28 → e3b717f, 172cfa6 → edbdf19, 2d059d3 → bc5df6a
Summary by CodeRabbit
New Features
Breaking Changes