pip install langgraph langchain langchain-google-genai tavily-python python-dotenv
2 Configuring the API Keys (.env)
Create a .env file:
GOOGLE_API_KEY=YOUR_GEMINI_KEY
TAVILY_API_KEY=YOUR_TAVILY_KEY
Load it in your code:
import os
from dotenv import load_dotenv
load_dotenv()
GOOGLE_API_KEY = os.getenv("GOOGLE_API_KEY")
TAVILY_API_KEY = os.getenv("TAVILY_API_KEY")
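A missing key otherwise only surfaces later as an opaque authentication error, so a small guard can fail fast at startup. A minimal sketch; `require_env` is a hypothetical helper, not part of any of the libraries above:

```python
import os

def require_env(name: str) -> str:
    """Return an environment variable's value, or raise a clear error if unset."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# GOOGLE_API_KEY = require_env("GOOGLE_API_KEY")
# TAVILY_API_KEY = require_env("TAVILY_API_KEY")
```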
3 Creating the AgentState
The state keeps track of the message history.
from typing import Annotated, TypedDict
from langgraph.graph.message import add_messages

class AgentState(TypedDict):
    messages: Annotated[list, add_messages]
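`add_messages` is a reducer: when a node returns new messages, they are merged into the existing list rather than overwriting it (new messages are appended; a message with an already-seen `id` replaces the old one). A plain-Python sketch of that merge behavior, not the library's implementation:

```python
def merge_messages(existing: list[dict], updates: list[dict]) -> list[dict]:
    """Append new messages; replace any existing message with a matching id."""
    merged = list(existing)
    index_by_id = {m["id"]: i for i, m in enumerate(merged)}
    for msg in updates:
        if msg["id"] in index_by_id:
            merged[index_by_id[msg["id"]]] = msg  # same id: replace in place
        else:
            merged.append(msg)  # new id: append
    return merged

history = [{"id": "1", "content": "hello"}]
history = merge_messages(history, [{"id": "2", "content": "hi there"}])
```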
4 Creating the LLM (Gemini)
from langchain_google_genai import ChatGoogleGenerativeAI

llm = ChatGoogleGenerativeAI(
    model="gemini-1.5-flash",
    temperature=0,
)
5 Creating a Tool (Tavily Search)
from langchain_community.tools.tavily_search import TavilySearchResults

search_tool = TavilySearchResults(
    tavily_api_key=TAVILY_API_KEY,
    max_results=3,
)
tools = [search_tool]
6 Creating the Agent with LangGraph
We will use the prebuilt ReAct agent that ships with LangGraph:
from langgraph.prebuilt import create_react_agent

agent = create_react_agent(
    model=llm,
    tools=tools,
)
7 Building the Graph Manually
If you prefer to assemble the flow yourself:
from langgraph.graph import StateGraph, END
from langgraph.prebuilt import ToolNode

builder = StateGraph(AgentState)

# LLM node: call the model with the tools bound so it can emit tool calls.
# (We use the raw LLM here, not the prebuilt agent from the previous step,
# which is already a compiled graph with its own tool loop.)
llm_with_tools = llm.bind_tools(tools)

def call_model(state: AgentState):
    return {"messages": [llm_with_tools.invoke(state["messages"])]}

builder.add_node("agent", call_model)

# Tools node: executes any tool calls from the last AI message
builder.add_node("tools", ToolNode(tools))

builder.set_entry_point("agent")
8 Adding the Conditional Edge
Define when to go to the tools node and when to finish.
def route_tools(state):
    last_message = state["messages"][-1]
    if getattr(last_message, "tool_calls", None):
        return "tools"
    return END

builder.add_conditional_edges(
    "agent",
    route_tools,
    {
        "tools": "tools",
        END: END,
    },
)
builder.add_edge("tools", "agent")
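Because `route_tools` only inspects the `tool_calls` attribute of the last message, it can be exercised in isolation with stub objects. A sketch; `StubMessage` is hypothetical, and `END` is redefined here as the `"__end__"` sentinel string so the snippet runs without LangGraph installed:

```python
from dataclasses import dataclass, field

# LangGraph's END constant resolves to this sentinel string (assumption
# made explicit so this sketch is self-contained)
END = "__end__"

@dataclass
class StubMessage:
    content: str
    tool_calls: list = field(default_factory=list)

def route_tools(state):
    last_message = state["messages"][-1]
    # Route to the tools node only when the model asked for a tool call
    if getattr(last_message, "tool_calls", None):
        return "tools"
    return END

# A message carrying tool calls routes to "tools"; a plain answer ends the run
print(route_tools({"messages": [StubMessage("", [{"name": "tavily"}])]}))  # tools
print(route_tools({"messages": [StubMessage("All done!")]}))               # __end__
```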
9 Compiling the Graph
graph = builder.compile()
10 Testing the Agent
A simple query:
response = graph.invoke({
    "messages": [("user", "What is the weather forecast for São Paulo today?")]
})
print(response["messages"][-1].content)