I have an OpenAI assistant that I have been using directly via the OpenAI Python SDK, and it has been working smoothly. Recently I had a requirement where I have to interact with multiple assistants to reach a final state.
I was trying out the LangGraph library to orchestrate this flow, with each assistant and function call as a node in the graph.
Please see the code below:
```ts
import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
import { Annotation, MemorySaver, StateGraph, messagesStateReducer } from '@langchain/langgraph';
import { ToolNode } from '@langchain/langgraph/prebuilt';
import { AgentExecutor } from 'langchain/agents';
import { OpenAIAssistantRunnable } from 'langchain/experimental/openai_assistant';
import { CreateCaseFn } from './tools'; // my custom tool

const StateAnnotation = Annotation.Root({
  messages: Annotation<BaseMessage[]>({
    reducer: messagesStateReducer,
  }),
});

const baseAgentTools = [new CreateCaseFn()];
const toolNode = new ToolNode(baseAgentTools);

const baseAgentNode = new OpenAIAssistantRunnable({
  assistantId: process.env.ASST,
  asAgent: true,
});

const agentExecutor = AgentExecutor.fromAgentAndTools({
  agent: baseAgentNode,
  tools: baseAgentTools,
  maxIterations: 1,
  verbose: true,
});

async function callBaseAgent(state: typeof StateAnnotation.State) {
  const messages = state.messages;
  const response = await agentExecutor.invoke(messages);
  return { messages: [response] };
}

function shouldContinue(state: typeof StateAnnotation.State) {
  const messages = state.messages;
  const lastMessage = messages[messages.length - 1] as AIMessage;
  if (lastMessage.tool_calls?.length) {
    return 'create_case_fn';
  }
  return '__end__';
}

const workflow = new StateGraph(StateAnnotation)
  .addNode('base_agent', callBaseAgent)
  .addNode('create_case_fn', toolNode)
  .addEdge('__start__', 'base_agent')
  .addConditionalEdges('base_agent', shouldContinue)
  .addEdge('base_agent', 'create_case_fn');

const checkpointer = new MemorySaver();
const app = workflow.compile({ checkpointer });

const start = async () => {
  const finalState = await app.invoke(
    { messages: [new HumanMessage('what is the weather in sf')] },
    { configurable: { thread_id: '42' } },
  );
  console.log(finalState.messages[finalState.messages.length - 1].content);
};

start();
```
See the logs below:
```
(node:31200) [DEP0040] DeprecationWarning: The `punycode` module is deprecated. Please use a userland alternative instead.
(Use `node --trace-deprecation ...` to show where the warning was created)
[chain/start] [1:chain:AgentExecutor] Entering Chain run with input: {
  "0": {
    "lc": 1,
    "type": "constructor",
    "id": ["langchain_core", "messages", "HumanMessage"],
    "kwargs": {
      "content": "what is the weather in sf",
      "additional_kwargs": {},
      "response_metadata": {}
    }
  }
}
[chain/error] [1:chain:AgentExecutor] [495ms] Chain run errored with error:
"400 Missing required parameter: 'thread.messages[0].content'."
C:\Projects\proj\proj-consumer\node_modules\@langchain\openai\node_modules\openai\src\error.ts:72
    return new BadRequestError(status, error, message, headers);
           ^
BadRequestError: 400 Missing required parameter: 'thread.messages[0].content'.
    at Function.generate (C:\Projects\proj\proj-consumer\node_modules\@langchain\openai\node_modules\openai\src\error.ts:72:14)
    at OpenAI.makeStatusError (C:\Projects\proj\proj-consumer\node_modules\@langchain\openai\node_modules\openai\src\core.ts:435:21)
    at OpenAI.makeRequest (C:\Projects\proj\proj-consumer\node_modules\@langchain\openai\node_modules\openai\src\core.ts:499:24)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async OpenAIAssistantRunnable.invoke (C:\Projects\proj\proj-consumer\node_modules\langchain\dist\experimental\openai_assistant\index.cjs:91:19)
    at async OpenAIAssistantRunnable._streamIterator (C:\Projects\proj\proj-consumer\node_modules\@langchain\core\dist\runnables\base.cjs:165:9)
```
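The 400 says the first thread message reached the API with no content. My current guess (an assumption, not something I have confirmed in the LangChain source) is that `agentExecutor.invoke(messages)` is being handed the raw message array from the graph state, while the executor wants an object input with a `content` field. A minimal sketch of that shape, using `toExecutorInput` as a hypothetical helper of my own (not a LangChain API):

```typescript
// Assumption: AgentExecutor.invoke expects an object input such as
// { content: "..." }, not a bare array of messages. toExecutorInput
// is a hypothetical helper that pulls the newest message's text out
// of the graph state so it can be passed as the run input.
type SimpleMessage = { content: string };

function toExecutorInput(messages: SimpleMessage[]): { content: string } {
  // Use the most recent message in the state as the assistant run input.
  return { content: messages[messages.length - 1].content };
}

// Inside callBaseAgent this would become:
//   const response = await agentExecutor.invoke(toExecutorInput(messages));
console.log(toExecutorInput([{ content: 'what is the weather in sf' }]));
```

If that shape is right, the fix would only touch `callBaseAgent`; the graph wiring itself would stay the same.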
Is there something wrong with the way I am invoking my assistant?
Any help is greatly appreciated.