LangGraph State Machines: Managing Complex Agent Task Flows in Production
LangGraph is a workflow orchestration framework designed specifically for LLM applications. Its fundamental building blocks are states, the nodes that operate on them, and the transitions (edges) that connect those nodes.
Think of shopping: Browse → Add to cart → Checkout → Payment. LangGraph helps us manage this kind of workflow effectively.
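As a point of reference, here is a minimal, hypothetical sketch of that shopping flow written against LangGraph's StateGraph API; the node functions are placeholders that only record the current step, and the later examples in this article use a subclassing style instead:

from typing import TypedDict

from langgraph.graph import StateGraph, END


class FlowState(TypedDict):
    current_step: str


# Placeholder node functions: each returns a partial state update.
def browse(state: FlowState) -> dict:
    return {"current_step": "browse"}


def add_to_cart(state: FlowState) -> dict:
    return {"current_step": "add_to_cart"}


def checkout(state: FlowState) -> dict:
    return {"current_step": "checkout"}


def payment(state: FlowState) -> dict:
    return {"current_step": "payment"}


graph = StateGraph(FlowState)
graph.add_node("browse", browse)
graph.add_node("add_to_cart", add_to_cart)
graph.add_node("checkout", checkout)
graph.add_node("payment", payment)
graph.set_entry_point("browse")
graph.add_edge("browse", "add_to_cart")
graph.add_edge("add_to_cart", "checkout")
graph.add_edge("checkout", "payment")
graph.add_edge("payment", END)

app = graph.compile()
final_state = app.invoke({"current_step": ""})
print(final_state["current_step"])  # -> "payment"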
States are like checkpoints in the execution of your tasks:
from typing import TypedDict, List

from langgraph.graph import StateGraph


class ShoppingState(TypedDict):
    # Current state
    current_step: str
    # Cart items
    cart_items: List[str]
    # Total amount
    total_amount: float
    # User input
    user_input: str


class ShoppingGraph(StateGraph):
    def __init__(self):
        super().__init__()
        # Define states
        self.add_node("browse", self.browse_products)
        self.add_node("add_to_cart", self.add_to_cart)
        self.add_node("checkout", self.checkout)
        self.add_node("payment", self.payment)
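The node methods referenced above (browse_products, add_to_cart, and so on) are not shown. As an illustration only, one of them might look like the following sketch: it receives the current ShoppingState and returns the updated state.

    # Hypothetical ShoppingGraph method (an assumption, not from the original code)
    def add_to_cart(self, state: ShoppingState) -> ShoppingState:
        # Extract the requested item from the user's last input (very simplified)
        item = state["user_input"].lower().replace("add to cart", "").strip()
        if item:
            state["cart_items"].append(item)
        state["current_step"] = "add_to_cart"
        return state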
State transitions define the "roadmap" of your task flow:
class ShoppingController:
    def define_transitions(self):
        # Add transition rules
        self.graph.add_edge("browse", "add_to_cart")
        self.graph.add_edge("add_to_cart", "browse")
        self.graph.add_edge("add_to_cart", "checkout")
        self.graph.add_edge("checkout", "payment")

    def should_move_to_cart(self, state: ShoppingState) -> bool:
        """Determine if we should transition to cart state"""
        return "add to cart" in state["user_input"].lower()
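The should_move_to_cart predicate above is never wired into the graph. A hedged sketch of how it could drive routing, assuming graph is a StateGraph-like object exposing LangGraph's add_conditional_edges method:

# Sketch: route out of "browse" based on the predicate instead of a fixed edge.
controller = ShoppingController()


def route_from_browse(state: ShoppingState) -> str:
    return "add_to_cart" if controller.should_move_to_cart(state) else "browse"


graph.add_conditional_edges(
    "browse",           # source node
    route_from_browse,  # inspects the state and returns a routing label
    {"add_to_cart": "add_to_cart", "browse": "browse"},  # label -> target node
)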
To guarantee system reliability, we need to persist the state information:
import json

import redis


class StateManager:
    def __init__(self):
        self.redis_client = redis.Redis()

    def save_state(self, session_id: str, state: dict):
        """Save state to Redis"""
        self.redis_client.set(
            f"shopping_state:{session_id}",
            json.dumps(state),
            ex=3600  # 1 hour expiration
        )

    def load_state(self, session_id: str) -> dict:
        """Load state from Redis"""
        state_data = self.redis_client.get(f"shopping_state:{session_id}")
        return json.loads(state_data) if state_data else None
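A short usage sketch of this manager; it assumes a Redis server reachable on the default host and port:

# Persist a session's state, then restore it later (for example in another worker).
manager = StateManager()

state = {"current_step": "add_to_cart", "cart_items": ["book"], "total_amount": 12.5}
manager.save_state("session-42", state)

restored = manager.load_state("session-42")
assert restored == state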
Any step can fail, and we need to handle these situations gracefully:
class ErrorHandler:
    def __init__(self):
        self.max_retries = 3

    async def with_retry(self, func, state: dict):
        """Function execution with retry mechanism"""
        retries = 0
        while retries < self.max_retries:
            try:
                return await func(state)
            except Exception as e:
                retries += 1
                if retries == self.max_retries:
                    return self.handle_final_error(e, state)
                await self.handle_retry(e, state, retries)

    def handle_final_error(self, error, state: dict):
        """Handle final error"""
        # Save error state
        state["error"] = str(error)
        # Rollback to last stable state
        return self.rollback_to_last_stable_state(state)
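Note that handle_retry and rollback_to_last_stable_state are helpers the snippet assumes but does not show. A usage sketch wrapping a single, hypothetical flaky step with the retry helper:

import asyncio


# Hypothetical async step (not from the article): may raise on transient errors.
async def call_payment_gateway(state: dict) -> dict:
    state["payment_status"] = "ok"
    return state


async def main():
    handler = ErrorHandler()
    state = {"current_step": "payment", "total_amount": 12.5}
    result = await handler.with_retry(call_payment_gateway, state)
    print(result)


asyncio.run(main())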
Let's look at a practical example, an intelligent customer service system:
import json
from typing import TypedDict, List

from langgraph.graph import StateGraph, State


class CustomerServiceState(TypedDict):
    conversation_history: List[str]
    current_intent: str
    user_info: dict
    resolved: bool


class CustomerServiceGraph(StateGraph):
    def __init__(self):
        super().__init__()
        # Initialize states
        self.add_node("greeting", self.greet_customer)
        self.add_node("understand_intent", self.analyze_intent)
        self.add_node("handle_query", self.process_query)
        self.add_node("confirm_resolution", self.check_resolution)

    async def greet_customer(self, state: State):
        """Greet customer"""
        response = await self.llm.generate(
            prompt=f"""
            Conversation history: {state['conversation_history']}
            Task: Generate appropriate greeting
            Requirements:
            1. Maintain professional friendliness
            2. Acknowledge returning customers
            3. Ask how to help
            """
        )
        state['conversation_history'].append(f"Assistant: {response}")
        return state

    async def analyze_intent(self, state: State):
        """Understand user intent"""
        response = await self.llm.generate(
            prompt=f"""
            Conversation history: {state['conversation_history']}
            Task: Analyze user intent
            Output format:
            {{
                "intent": "refund/inquiry/complaint/other",
                "confidence": 0.95,
                "details": "specific description"
            }}
            """
        )
        state['current_intent'] = json.loads(response)
        return state
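The edges between these four nodes, along with the process_query and check_resolution methods they reference, are not shown. One plausible wiring, written purely as an assumption in the same style as the shopping example, is a linear flow that loops back until the issue is marked as resolved:

from langgraph.graph import END

# Assumed wiring, not from the original article.
graph = CustomerServiceGraph()
graph.add_edge("greeting", "understand_intent")
graph.add_edge("understand_intent", "handle_query")
graph.add_edge("handle_query", "confirm_resolution")

# Loop back to intent analysis until the conversation is marked as resolved.
graph.add_conditional_edges(
    "confirm_resolution",
    lambda state: "done" if state["resolved"] else "again",
    {"done": END, "again": "understand_intent"},
)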
# Initialize system
graph = CustomerServiceGraph()
state_manager = StateManager()
error_handler = ErrorHandler()


async def handle_customer_query(user_id: str, message: str):
    # Load or create state
    state = state_manager.load_state(user_id) or {
        "conversation_history": [],
        "current_intent": None,
        "user_info": {},
        "resolved": False
    }

    # Add user message
    state["conversation_history"].append(f"User: {message}")

    # Execute state machine flow
    try:
        result = await graph.run(state)
        # Save state
        state_manager.save_state(user_id, result)
        return result["conversation_history"][-1]
    except Exception as e:
        return await error_handler.with_retry(
            graph.run,
            state
        )
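Finally, a small driver sketch; it assumes a running Redis instance and an LLM client configured behind self.llm:

import asyncio


async def demo():
    reply = await handle_customer_query("user-123", "Hi, I'd like a refund for my last order")
    print(reply)


asyncio.run(demo())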
When designing and operating these state machines, a few practices deserve particular attention:

- State design principles
- Transition logic optimization
- Error handling strategy
- Performance optimization

And some common pitfalls to watch out for:

- State explosion
- Deadlock situations
- State consistency
LangGraph state machines provide a powerful solution for managing complex AI agent task flows: explicit states and transitions make the flow easy to reason about, persisted state makes sessions resumable, and retry and rollback logic keeps the system resilient when individual steps fail.