Adding Internet Searches with DuckDuckGo to Llama 3.2 using Ollama and LangChain for AI Agents

Reading time: < 1 minute

Today we will give our AI agent a tool that allows it to perform Internet searches.


First, we will install the necessary libraries:

pip install langchain langchain_community
pip install ddgs

Now we initialize the connection with the Ollama server (https://devcodelight.com/ollama-con-llama-3-2-in-docker/):

from langchain_community.chat_models import ChatOllama
from langchain_core.prompts import ChatPromptTemplate
# ... other imports ...

# ⚠️ YOUR REMOTE OLLAMA SERVER
OLLAMA_SERVER_URL = "URL_OLLAMA"
MODEL_NAME = "llama3.2:3b"

# Initializing the LLM (the rest of the code is the same)
try:
    llm = ChatOllama(
        model=MODEL_NAME,
        base_url=OLLAMA_SERVER_URL,
        temperature=0.7
    )
    print(f"✅ Connected to ChatOllama on: {OLLAMA_SERVER_URL}")
    # ... continue with bind_tools(mis_tools) and agent creation.
except Exception as e:
    print(f"❌ Error initializing or binding: {e}")
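Optionally, before adding any tools, you can run a quick sanity check that the server responds. This is a minimal sketch; the prompt is just an example and is not part of the original setup:

# Optional connectivity test: send a trivial prompt to the Ollama model
reply = llm.invoke("Reply with a short greeting.")
print(reply.content)  # the text content of the returned AIMessage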

Now we are going to create the tool the agent needs to perform DuckDuckGo searches.

from langchain.agents import Tool
from langchain_community.tools import DuckDuckGoSearchResults

search = DuckDuckGoSearchResults()
duck_tool = Tool(
    name="DuckDuckGo",
    func=lambda q: search.run(q),  # run returns a summarized string
    description="Search the web and return a text summary."
)
mis_tools = [duck_tool]
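If you want to verify the tool on its own before handing it to the agent, you can call it directly. A minimal sketch; the query string is just an example:

# Run the DuckDuckGo tool by itself with an example query
print(duck_tool.func("LangChain documentation"))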

And now we add the tools to the model:

from langchain.agents import initialize_agent, AgentType

agent = initialize_agent(
    tools=mis_tools,
    llm=llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True
)

And we use it like this:

question = "Who won the last world cup of football?"
result = agent.invoke(question)
print("💬 Agent's response:", result)
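Note that agent.invoke returns a dictionary containing both the input and the agent's final answer; if you only want the answer text, a small extraction like this works (a minimal sketch based on the code above):

# The AgentExecutor returns a dict like {"input": ..., "output": ...}
print("💬 Agent's response:", result["output"])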
