LangSmith

LangGraph flowchart

Agent workflow monitoring


Callback functions are added to the LangGraph code so that the running state of the Agent workflow is traced to LangSmith. Example code:

    from langchain_core.callbacks import CallbackManager
    from langchain.callbacks.tracers.langchain import LangChainTracer
    from langchain_google_genai import GoogleGenerativeAI
    from langsmith import traceable

    # LangChainTracer forwards every callback event to LangSmith.
    callback_manager = CallbackManager([LangChainTracer()])

    def init_llm():
        """Initialize the LLM model with the CallbackManager attached."""
        llm = GoogleGenerativeAI(
            model="gemini-2.5-flash",
            callbacks=callback_manager,
        )
        return llm

    @traceable(name="classify node")
    def classify(state):
        """
        Classification node: decide whether the question is a RAG question.
        The chain is run with callback_manager attached so the call is traced.
        """
        chain = (judge_chain | llm).with_config(callbacks=callback_manager)
        ans = chain.invoke({"question": state["question"]}, config={"thinking_budget": 0})
        # The judging prompt answers in Chinese; "是" means "yes".
        return {**state, "is_rag": "是" in ans}
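
For the LangChainTracer above to actually deliver these runs to LangSmith, the standard LangSmith environment variables have to be set before the workflow starts. A minimal sketch; the API key and project name below are placeholders, not values from this project:

    import os

    # Enable LangSmith tracing and choose the project the runs are logged under.
    # Replace the API key placeholder with a real LangSmith key.
    os.environ["LANGCHAIN_TRACING_V2"] = "true"
    os.environ["LANGCHAIN_API_KEY"] = "<your-langsmith-api-key>"
    os.environ["LANGCHAIN_PROJECT"] = "agent-workflow-demo"  # placeholder project name

With these set, runs recorded by both the @traceable decorator and the CallbackManager should appear under the same project in the LangSmith dashboard.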
            
Screenshot of the monitoring dashboard on the LangSmith website:
LangSmith_demo
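
As a companion to the LangGraph flowchart referenced above, here is a minimal sketch of how a traced classify node like the one shown earlier could be wired into a StateGraph, branching on is_rag. The GraphState fields and the rag_answer / direct_answer nodes are illustrative assumptions, not taken from the project:

    from typing import TypedDict
    from langgraph.graph import StateGraph, START, END

    class GraphState(TypedDict):
        question: str
        is_rag: bool
        answer: str

    def rag_answer(state: GraphState) -> GraphState:
        # Placeholder for the retrieval-augmented branch.
        return {**state, "answer": "answered with RAG"}

    def direct_answer(state: GraphState) -> GraphState:
        # Placeholder for the plain-LLM branch.
        return {**state, "answer": "answered directly"}

    builder = StateGraph(GraphState)
    builder.add_node("classify", classify)  # the traced node shown earlier
    builder.add_node("rag_answer", rag_answer)
    builder.add_node("direct_answer", direct_answer)

    builder.add_edge(START, "classify")
    builder.add_conditional_edges(
        "classify",
        lambda state: "rag_answer" if state["is_rag"] else "direct_answer",
    )
    builder.add_edge("rag_answer", END)
    builder.add_edge("direct_answer", END)

    graph = builder.compile()

With tracing enabled, each node execution of the compiled graph should show up as a nested run in LangSmith, matching the nodes in the flowchart.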