enh(stream): fix table not exists in processTag #33625
Conversation
Summary of Changes

Hello @wangmm0220, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed. This pull request addresses a bug in the stream processing module where an error could occur if table metadata was not properly handled during tag processing. The changes involve a significant refactoring of the tag processing mechanism, introducing a dedicated function for caching table metadata and streamlining the application of tags to data blocks. This improves the robustness and correctness of stream operations, particularly when dealing with table existence and metadata retrieval, ensuring that stream readers can correctly access table information.
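To make the refactoring concrete, below is a minimal sketch of what a dedicated table-metadata caching step might look like. The function name stCacheTableMetaSketch, the fetchMetaFn callback, and the header names are illustrative assumptions, not symbols from this PR; only the general pattern of checking the hash cache first and falling back to the storage layer is implied by the summary above.

// Illustrative sketch only; names, headers, and the fetch callback are assumptions.
#include <stdint.h>
#include "os.h"      // taosMemoryFree (assumed TDengine os layer header)
#include "thash.h"   // SHashObj, taosHashGet, taosHashPut (assumed TDengine util header)

typedef int32_t (*fetchMetaFn)(void* pVnode, uint64_t uid, void** ppMeta);

static int32_t stCacheTableMetaSketch(SHashObj* pMetaCache, void* pVnode,
                                      fetchMetaFn fetch, uint64_t uid, void** ppMeta) {
  // Fast path: metadata for this uid is already cached.
  void** ppCached = taosHashGet(pMetaCache, &uid, sizeof(uid));
  if (ppCached != NULL && *ppCached != NULL) {
    *ppMeta = *ppCached;
    return 0;
  }

  // Slow path: ask the storage layer. It may report that the table does not
  // exist yet; the caller is expected to tolerate that instead of failing hard.
  void* pMeta = NULL;
  int32_t code = fetch(pVnode, uid, &pMeta);
  if (code != 0) {
    return code;
  }

  // Store the pointer value (not the pointed-to struct): the cache now owns
  // pMeta and must free it when the stream reader is destroyed.
  code = taosHashPut(pMetaCache, &uid, sizeof(uid), &pMeta, sizeof(pMeta));
  if (code != 0) {
    taosMemoryFree(pMeta);  // do not leak the fetched metadata on failure
    return code;
  }

  *ppMeta = pMeta;
  return 0;
}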
/gemini review
Code Review
This pull request refactors stream metadata handling to fix a race condition where a table might not exist during tag processing. The core idea of pre-caching tag and schema information is a solid approach. The changes simplify the processTag function and improve consistency by centralizing SStorageAPI usage. However, I've identified a few critical issues with the implementation of the new caching mechanism, specifically related to incorrect pointer handling when using taosHashPut with isPointerType=true. These could lead to dangling pointers and memory corruption. I have also noted a minor logging issue. Addressing these points will make the solution more robust.
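For context, the hazard this review points at typically looks like the sketch below. This is not the PR's code; the function and variable names are invented, the header names are assumptions, and only the general rule of storing the pointer value rather than a pointer to a temporary is illustrated.

// Hypothetical illustration of the pointer-handling pitfall mentioned above;
// names and headers are assumptions, not the PR's actual code.
#include <stdint.h>
#include "thash.h"         // SHashObj, taosHashPut (assumed TDengine util header)
#include "tdataformat.h"   // STSchema (assumed TDengine common header)

static int32_t cacheSchemaSketch(SHashObj* pSchemaMap, uint64_t uid, STSchema* pSchema) {
  // Store the pointer VALUE: the map copies sizeof(pSchema) bytes, i.e. the
  // address of the heap-allocated schema. The entry stays valid as long as the
  // schema object itself is kept alive and is freed exactly once on cleanup.
  int32_t code = taosHashPut(pSchemaMap, &uid, sizeof(uid), &pSchema, sizeof(pSchema));

  // Typical mistakes a review like this flags:
  //   - passing pSchema (not &pSchema) with sizeof(pSchema), which copies the
  //     first bytes of the struct instead of its address;
  //   - storing the address of a stack-local or soon-to-be-freed object,
  //     leaving the map with a dangling pointer;
  //   - freeing the schema both from the map owner and from the producer,
  //     which corrupts memory on later access.
  return code;
}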
SFilterInfo* pFilterInfo;
SHashObj*    pTableMetaCacheTrigger;
SHashObj*    pTableMetaCacheCalc;
SHashObj*    triggerTableSchemaMapVTable;  // key: uid, value: STSchema*
The schema here includes all columns of the original table; could that cause a memory usage problem?
This schema is not large, so it will not be a problem, except in the extreme case where a stream references a very large number of original tables and each of them has a great many columns. In that case this cache could take up quite a bit of memory. There is no need to handle such an extreme edge case for now; doing so would be fairly cumbersome.
As a follow-up, we should add statistics for member variables like this, so that it is easier to analyze the situation when stream computing memory usage gets high.
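As a rough illustration of that follow-up idea, a sketch like the one below could account for the memory held by these cache maps. The type and function names are invented for illustration and the size estimation is left to the caller; this is not an agreed design.

// Hypothetical per-cache memory accounting for stream reader maps; all names
// here are invented for illustration.
#include <stdint.h>
#include "thash.h"   // SHashObj, taosHashGetSize (assumed TDengine util header)

typedef struct SStreamCacheStats {
  int64_t schemaMapEntries;   // number of cached STSchema pointers
  int64_t schemaMapBytes;     // estimated bytes held by the cached schemas
} SStreamCacheStats;

// Called whenever a schema is inserted into triggerTableSchemaMapVTable; the
// caller passes the estimated size of the schema object it just cached.
static void updateSchemaCacheStats(SStreamCacheStats* pStats, SHashObj* pSchemaMap,
                                   int64_t schemaBytes) {
  pStats->schemaMapEntries = taosHashGetSize(pSchemaMap);
  pStats->schemaMapBytes += schemaBytes;
  // These counters could then be exposed in stream debug logs or monitoring,
  // so high memory usage in stream computing can be traced to specific caches.
}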
Description
Please briefly describe the code changes in this pull request.
Jira: https://jira.taosdata.com:18080/browse/TD-
Checklist
Please check the items in the checklist if applicable.