
Conversation


@RoadRoller RoadRoller commented Jun 23, 2025

  • The AI bot now responds to personal messages and to messages that mention it
  • The existing behavior of responding to direct messages is preserved
  • The checks that decide whether a message needs to be processed were lightly refactored
  • Tests were added for the changed functionality

I would also suggest switching the ai-bot service from the openai library to https://ai-sdk.dev, so it does not depend on OpenAI as a provider, but I could not figure out how to test the tool-calling functionality from the UI and I am afraid of breaking something.

Resolves #9328


Connected to Huly®: UBERF-11731

@BykhovDenis BykhovDenis requested review from Copilot and kristina-fefelova and removed request for kristina-fefelova June 23, 2025 14:39
Contributor

@Copilot Copilot AI left a comment


Pull Request Overview

This pull request refactors the AI bot response logic to improve the handling of personal messages and mentions while preserving existing direct message functionality. Key changes include:

  • Extraction of helper functions (findAiBotSocialIdentity, findAiBotAllSocialIds, isThreadMessage, getMessageDoc, isDirectAvailable, isAiBotShouldReplay, and OnAiBotShouldReplay) to organize message processing logic.
  • Update of message filtering and condition checks to decide when the AI bot should send a response.
  • Removal of duplicate code and streamlining of direct and thread message handling.
Comments suppressed due to low confidence (1)

server-plugins/ai-bot-resources/src/index.ts:154

  • [nitpick] Consider renaming 'isAiBotShouldReplay' to 'isAiBotShouldReply' to more clearly reflect its purpose of determining whether the AI bot should reply to a message.
): Promise<boolean> {

@RoadRoller RoadRoller force-pushed the ai-bot-improvements branch from 4917372 to 6179dba Compare June 23, 2025 19:19
@RoadRoller RoadRoller force-pushed the ai-bot-improvements branch from d245b1c to 018d43c Compare June 30, 2025 13:51
const isDirect =
messageDoc._class === chunter.class.DirectMessage && (await isDirectAvailable(messageDoc as DirectMessage, control))
const isAiBotPersonalChat = messageDoc._id === aiBotPersonId
const isAiBotMentioned = message.message.includes(aiBotPersonId)
Member


It's not enough to check just includes(id). We need to ensure there's a mention node and inspect its attributes to confirm it's indeed referencing the AI bot. You can look at how isDocMentioned works in server-plugins/activity-resources/src/references.ts.
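A minimal sketch of the check the reviewer is asking for. The node shape follows the ProseMirror-style message JSON visible in the transactor logs further down in this thread; the helper names (`hasReferenceTo`, `isAiBotMentioned`) are illustrative, not the project's real functions, which live in server-plugins/activity-resources/src/references.ts:

```typescript
// Minimal markup node shape, matching the {"type","attrs","content"} JSON
// that chat messages are stored as.
interface MarkupNode {
  type: string
  attrs?: Record<string, string>
  content?: MarkupNode[]
}

// Recursively walk the markup tree looking for an actual mention node
// whose attrs.id points at the given person.
function hasReferenceTo (node: MarkupNode, personId: string): boolean {
  if (node.type === 'reference' && node.attrs?.id === personId) {
    return true
  }
  return (node.content ?? []).some((child) => hasReferenceTo(child, personId))
}

// Unlike a raw includes(id) on the serialized string, this only matches a
// real reference node, not the id appearing as plain text.
function isAiBotMentioned (rawMessage: string, aiBotPersonId: string): boolean {
  try {
    const doc: MarkupNode = JSON.parse(rawMessage)
    return hasReferenceTo(doc, aiBotPersonId)
  } catch {
    return false // not markup JSON, so there can be no mention node
  }
}
```

The point of the structured check: a message whose visible text merely contains the id string no longer counts as a mention.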

Author


Thank you, refactored with a proper mention check, as in references.ts.

@iustin05

iustin05 commented Aug 6, 2025

Seems like this is just waiting for a merge? @RoadRoller

@RoadRoller
Author

@iustin05, approval from a reviewer is needed; I don't know why the request is stuck.

@grigoriy-a

up
very nice feature

@grigoriy-a

grigoriy-a commented Sep 30, 2025

Ok, we checked this in prod.
It's not ready for production. When users actively message and tag in chat, especially when generation takes some time (and it does, since the history is first summarized via an LLM and then the actual LLM request is made), Huly just hangs for everyone, with a lot of messages from the Transactor service:

transactor-1  | {"level":"warn","message":"request hang found","params":{"id":...,"meta":{},"method":"findAll","params":["activity:class:ActivityMessage",{"attachedTo":"chunter:space:General","isPinned":true,"space":"chunter:space:General"},{"limit":1,"total":true}],"time":...},"sec":30,"timestamp":"...","total":7,"user":"...","wsId":"..."}

transactor-1  | {"level":"warn","message":"request hang found","params":{"id":...,"meta":{},"method":"findAll","params":["chunter:class:ThreadMessage",{"isPinned":true,"objectId":"chunter:space:General","space":"chunter:space:General"},{"limit":1,"total":true}],"time":...},"sec":30,"timestamp":"...","total":7,"user":"...","wsId":"..."}

transactor-1  | {"level":"warn","message":"request hang found","params":{"id":...,"meta":{},"method":"tx","params":[{"_class":"core:class:TxUpdateDoc","_id":"...","createdOn":...,"modifiedBy":"...","modifiedOn":...,"objectClass":"workbench:class:WorkbenchTab","objectId":"...","objectSpace":"core:space:Workspace","operations":{"location":"/workbench/company/chunter/chunter%3Aspace%3AGeneral%7Cchunter%3Aclass%3AChannel?message","name":"general"},"retrieve":false,"space":"core:space:DerivedTx"}],"time":...},"sec":30,"timestamp":"....","total":7,"user":"...","wsId":"..."}

@grigoriy-a

@RoadRoller it needs to be more asynchronous. Also, the bot doesn't see chat/thread history, which makes it useless in group chats and threads. It should at least fetch the last X messages of chat history.

@grigoriy-a

grigoriy-a commented Sep 30, 2025

Ah yes, I think it's the ai-bot service's problem. The await here (server.ts in pod-ai-aibot) is completely unnecessary, because this endpoint returns no value, but it makes the caller wait, which in turn probably makes txs hang in Huly itself
[screenshot of the await in server.ts]

Edit: no, removing the await doesn't help (at least with the warnings). After messages are sent either to the ai-bot direct or via a tag in a group, I see these warnings appearing and repeating every 10-15 seconds:

transactor-1  | {"level":"warn","message":"request hang found","params":{"id":...,"meta":{},"method":"tx","params":[{"_class":"core:class:TxApplyIf","_id":"...","createdOn":...,"extraNotify":[],"match":[],"measureName":"chunter.create.chunter:class:ChatMessage chunter:class:Channel","modifiedBy":"...","modifiedOn":...,"notMatch":[],"notify":true,"objectSpace":"core:space:Tx","space":"core:space:Tx","txes":[{"_class":"core:class:TxCreateDoc","_id":"...","attachedTo":"...","attachedToClass":"chunter:class:Channel","attributes":{"attachments":0,"message":"{\"type\":\"doc\",\"content\":[{\"type\":\"paragraph\",\"content\":[{\"type\":\"reference\",\"attrs\":{\"id\":\"...\",\"objectclass\":\"contact:class:Person\",\"label\":\"Bot Huly\"}},{\"type\":\"text\",\"text\":\"hey what’s up? \"}]}]}"},"collection":"messages","createdBy":"...","createdOn":...,"modifiedBy":"...","modifiedOn":...,"objectClass":"chunter:class:ChatMessage","objectId":"...","objectSpace":"...","space":"core:space:Tx"}]}],"time":...},"sec":240,"timestamp":"...","total":2,"user":"...","wsId":"..."}

transactor-1  | {"level":"warn","message":"request hang found","params":{"id":...,"meta":{},"method":"tx","params":[{"_class":"core:class:TxApplyIf","_id":"...","createdOn":...,"extraNotify":[],"match":[],"measureName":"chunter.create.chunter:class:ChatMessage chunter:class:DirectMessage","modifiedBy":"...","modifiedOn":...,"notMatch":[],"notify":true,"objectSpace":"core:space:Tx","space":"core:space:Tx","txes":[{"_class":"core:class:TxCreateDoc","_id":"...","attachedTo":"...","attachedToClass":"chunter:class:DirectMessage","attributes":{"attachments":0,"message":"{\"type\":\"doc\",\"content\":[{\"type\":\"paragraph\",\"content\":[{\"type\":\"text\",\"text\":\"so what’s new?\"}]}]}"},"collection":"messages","createdBy":"...","createdOn":...,"modifiedBy":"...","modifiedOn":...,"objectClass":"chunter:class:ChatMessage","objectId":"...","objectSpace":"...","space":"core:space:Tx"}]}],"time":...},"sec":30,"timestamp":"...","total":2,"user":"...","wsId":"..."}

Which later (I think) results in this:

transactor-1  | {"err":{"message":"fetch failed","stack":"TypeError: fetch failed\n    at node:internal/deps/undici/undici:13510:13\n    at async sendAIEvents (/usr/src/app/bundle.js:302894:9)\n    at async OnAiBotShouldReply (/usr/src/app/bundle.js:303176:7)\n    at async Object.OnMessageSend [as op] (/usr/src/app/bundle.js:303031:13)\n    at async Triggers.applyTrigger (/usr/src/app/bundle.js:115485:29)\n    at async ctx.with.count (/usr/src/app/bundle.js:115513:35)\n    at async Triggers.apply (/usr/src/app/bundle.js:115508:13)\n    at async _TriggersMiddleware2.processAsyncTriggers (/usr/src/app/bundle.js:258575:25)\n    at async ctx.contextData.asyncRequests (/usr/src/app/bundle.js:258540:15)\n    at async handleAyncs (/usr/src/app/bundle.js:140747:17)"},"level":"error","message":"Could not send ai events","timestamp":"2025-09-30T16:22:12.181Z"}
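The change being probed here could be sketched as a fire-and-forget wrapper (all names hypothetical; the stack trace shows the real call is sendAIEvents inside the bundled trigger code): instead of awaiting the HTTP delivery inside the trigger, start it, return immediately, and route failures to a handler rather than letting them block or poison the transactor's request pipeline.

```typescript
type AIEvent = { messageId: string, text: string }

// Fire-and-forget: kick off delivery and return synchronously. Failures go
// to onError (e.g. a log or a retry queue) instead of propagating into the
// caller, which in the real system is a transaction trigger.
function sendInBackground (
  deliver: (events: AIEvent[]) => Promise<void>,
  events: AIEvent[],
  onError: (events: AIEvent[], err: unknown) => void
): void {
  void deliver(events).catch((err) => onError(events, err))
}
```

Whether this alone is enough is unclear: the edit above reports that removing a single await did not stop the warnings, so the delivery may need to move into a real queue or a separate worker rather than just being un-awaited.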

Edit 2:
This way there are no warnings:
[screenshot of the changed code]

@grigoriy-a

@RoadRoller suggestion: fetching the full group chat history probably does not make a lot of sense. But thread history, for example, can be useful: it's not much context, it's isolated, and it can be used for summarizing local conversations.

@RoadRoller
Author

RoadRoller commented Oct 9, 2025

@grigoriy-a, I opened this pull request three months ago, and I'm afraid I'm not up to date on what's going on at Huly with AI agents. I think it's worth checking with the Huly team about their plans in this area. I'll close this pull request to avoid any misunderstanding.

@RoadRoller RoadRoller closed this Oct 9, 2025


Development

Successfully merging this pull request may close these issues.

AI Bot does not respond to personal messages and to message that mention it
