Hi @LizhongLiu-cims, yes, this looks like a JSON serialization problem. Sometimes the model ignores the JSON schema it is required to produce. This could be related.
Hi @mlejva, in that case, if I want to add memory as a checkpointer to save the history, do I need to write my own memory function instead of using MemorySaver() from langgraph.checkpoint? If so, could you please provide a demo example to follow? Thanks!
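One possible workaround, sketched below under the assumption that the failure comes from a tool returning a non-JSON-serializable object before it enters graph state (the `Result` class and `to_serializable` helper here are stand-ins for illustration, not the real E2B or LangGraph types): coerce each tool output to a JSON-friendly value before it is stored.

```python
import json
from dataclasses import dataclass


@dataclass
class Result:
    """Stand-in for the sandbox Result object that triggered the error."""
    text: str


def to_serializable(value):
    """Return value unchanged if json.dumps accepts it; otherwise fall back to str()."""
    try:
        json.dumps(value)
        return value
    except TypeError:
        return str(value)


# A tool result that json.dumps rejects becomes a plain string,
# which any JSON-based checkpointer can then store:
coerced = to_serializable(Result(text="plot saved"))
print(type(coerced).__name__)
```

With a conversion like this applied to tool outputs (e.g. inside your `execute_tools` wrapper), the state handed to the checkpointer stays serializable, so the stock MemorySaver might work unchanged.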
I tried to run the code here: examples/langgraph-python/langgraph_e2b_python/code_interpreter_tool.py
All my code is the same as the example, except that I added a MemorySaver so the LLM has chat memory for a consistent chat experience.
```python
workflow = MessageGraph()
workflow.add_node("agent", llm.bind_tools(tools))
workflow.add_node("action", lambda x: execute_tools(x, tool_map))

# Conditional: agent -> action OR agent -> END
workflow.add_conditional_edges(
    "agent",
    should_continue,
)

# Always transition action -> agent
workflow.add_edge("action", "agent")
workflow.set_entry_point("agent")

memory = MemorySaver()
app = workflow.compile(checkpointer=memory)

# 4. Invoke the app
config = {"configurable": {"thread_id": str(uuid.uuid4())}}
result = app.invoke(input=("user", "please generate a random line plots"), config=config)
```
I got the error message below when I tried to add the config argument:
```
venv\lib\site-packages\langgraph\serde\jsonplus.py:72, in JsonPlusSerializer._default(self, obj)
     68     return self._encode_constructor_args(
     69         obj.__class__, kwargs={"node": obj.node, "arg": obj.arg}
     70     )
     71 else:
---> 72     raise TypeError(
     73         f"Object of type {obj.__class__.__name__} is not JSON serializable"
     74     )

TypeError: Object of type Result is not JSON serializable
```
I suppose the error occurs because the message format changed to something that json.dumps() cannot handle. Is there a quick way to fix this, like some parser function, maybe?
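For what it's worth, the standard-library `json` module does support a fallback for objects it cannot encode via the `default` parameter. The sketch below only illustrates that mechanism on a stand-in `Result` class; it does not patch LangGraph's own serializer, so it is a demonstration of the idea rather than a drop-in fix:

```python
import json


class Result:
    """Stand-in for the non-serializable object named in the traceback."""
    def __init__(self, text):
        self.text = text


# By default, json.dumps raises TypeError on objects it does not recognize:
try:
    json.dumps({"output": Result("plot")})
except TypeError as e:
    print(e)

# A `default` hook is called for each unencodable object and can
# convert it to something JSON-friendly on the fly:
encoded = json.dumps({"output": Result("plot")}, default=lambda o: str(o.text))
print(encoded)
```

A serializer that accepted such a hook (or a pre-processing step that applies one) would be one way a "parser function" could resolve this class of error.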