
DuckDuckGoSearch

This notebook provides a quick overview for getting started with DuckDuckGoSearch. For detailed documentation of all DuckDuckGoSearch features and configurations, head to the API reference.

DuckDuckGoSearch offers a privacy-focused search API designed for LLM Agents. It provides seamless integration with a wide range of data sources, prioritizing user privacy and relevant search results.

Overview

Integration details

| Class | Package | Serializable | PY support | Package latest |
| --- | --- | --- | --- | --- |
| DuckDuckGoSearch | @langchain/community | ❌ | ✅ | NPM - Version |

Setup

The integration lives in the @langchain/community package, along with the duck-duck-scrape dependency:

yarn add @langchain/community duck-duck-scrape

Credentials

It's also helpful (but not required) to set up LangSmith for best-in-class observability:

process.env.LANGCHAIN_TRACING_V2 = "true";
process.env.LANGCHAIN_API_KEY = "your-api-key";

Instantiation

Here we show how to instantiate the DuckDuckGoSearch tool, limiting it to a single search result:

import { DuckDuckGoSearch } from "@langchain/community/tools/duckduckgo_search";

const tool = new DuckDuckGoSearch({ maxResults: 1 });

Invocation

Invoke directly with args

await tool.invoke("What is Anthropic's estimated revenue for 2024?");
[{"title":"Anthropic forecasts more than $850 mln in annualized revenue rate by ...","link":"https://www.reuters.com/technology/anthropic-forecasts-more-than-850-mln-annualized-revenue-rate-by-2024-end-report-2023-12-26/","snippet":"Dec 26 (Reuters) - Artificial intelligence startup <b>Anthropic</b> has projected it will generate more than $850 million in annualized <b>revenue</b> by the end of <b>2024</b>, the Information reported on Tuesday ..."}]
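Note that the tool returns its results as a single JSON-encoded string, not as an array of objects. If you want structured access downstream, you can parse it. A minimal sketch, using a trimmed copy of the output above (the `title`, `link`, and `snippet` fields are taken from that output; the `SearchResult` type is hand-written for illustration, not exported by the package):

```typescript
// DuckDuckGoSearch returns results as a JSON string; parse it for structured access.
type SearchResult = { title: string; link: string; snippet: string };

// A trimmed copy of the raw string output shown above.
const raw =
  '[{"title":"Anthropic forecasts more than $850 mln in annualized revenue rate by ...","link":"https://www.reuters.com/technology/anthropic-forecasts-more-than-850-mln-annualized-revenue-rate-by-2024-end-report-2023-12-26/","snippet":"..."}]';

const results: SearchResult[] = JSON.parse(raw);
console.log(results.length); // 1
console.log(results[0].link); // the Reuters URL from the result above
```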

Invoke with ToolCall

We can also invoke the tool with a model-generated ToolCall, in which case a ToolMessage will be returned:
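A ToolCall here is just a plain object. As a rough sketch of the fields involved (a simplified, hand-written type for illustration, not the official @langchain/core definition):

```typescript
// Simplified sketch of a tool call object (not the official @langchain/core type).
type ToolCallSketch = {
  name: string;                  // must match the tool's registered name
  args: Record<string, unknown>; // must satisfy the tool's input schema
  id: string;                    // links the resulting ToolMessage back to this call
  type: "tool_call";
};

const exampleCall: ToolCallSketch = {
  name: "duckduckgo-search",
  args: { input: "What is Anthropic's estimated revenue for 2024?" },
  id: "tool_call_id",
  type: "tool_call",
};
console.log(exampleCall.name); // duckduckgo-search
```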

// This is usually generated by a model, but we'll create a tool call directly for demo purposes.
const modelGeneratedToolCall = {
  args: {
    input: "What is Anthropic's estimated revenue for 2024?",
  },
  id: "tool_call_id",
  name: tool.name,
  type: "tool_call",
};
await tool.invoke(modelGeneratedToolCall);
ToolMessage {
  "content": "[{\"title\":\"Anthropic forecasts more than $850 mln in annualized revenue rate by ...\",\"link\":\"https://www.reuters.com/technology/anthropic-forecasts-more-than-850-mln-annualized-revenue-rate-by-2024-end-report-2023-12-26/\",\"snippet\":\"Dec 26 (Reuters) - Artificial intelligence startup <b>Anthropic</b> has projected it will generate more than $850 million in annualized <b>revenue</b> by the end of <b>2024</b>, the Information reported on Tuesday ...\"}]",
  "name": "duckduckgo-search",
  "additional_kwargs": {},
  "response_metadata": {},
  "tool_call_id": "tool_call_id"
}

Chaining

We can use our tool in a chain by first binding it to a tool-calling model and then calling it:

Pick your chat model:

Install dependencies

yarn add @langchain/openai 

Add environment variables

OPENAI_API_KEY=your-api-key

Instantiate the model

import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({
  model: "gpt-4o-mini",
  temperature: 0,
});
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { RunnableConfig } from "@langchain/core/runnables";
import { AIMessage } from "@langchain/core/messages";

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  ["human", "{user_input}"],
  ["placeholder", "{messages}"],
]);

// Specifying tool_choice will force the model to call this tool.
const llmWithTools = llm.bindTools([tool], {
  tool_choice: tool.name,
});

const llmChain = prompt.pipe(llmWithTools);

const toolChain = async (
  userInput: string,
  config?: RunnableConfig
): Promise<AIMessage> => {
  const input_ = { user_input: userInput };
  const aiMsg = await llmChain.invoke(input_, config);
  const toolMsgs = await tool.batch(aiMsg.tool_calls, config);
  return llmChain.invoke({ ...input_, messages: [aiMsg, ...toolMsgs] }, config);
};

const toolChainResult = await toolChain(
  "What is Anthropic's estimated revenue for 2024?"
);
const { tool_calls, content } = toolChainResult;

console.log(
  "AIMessage",
  JSON.stringify(
    {
      tool_calls,
      content,
    },
    null,
    2
  )
);
AIMessage {
  "tool_calls": [
    {
      "name": "duckduckgo-search",
      "args": {
        "input": "Anthropic revenue 2024 forecast"
      },
      "type": "tool_call",
      "id": "call_E22L1T1bI6xPrMtL8wrKW5C5"
    }
  ],
  "content": ""
}

With an agent

We can also pass the DuckDuckGoSearch tool to an agent. First, ensure you have the LangGraph package installed.

yarn add @langchain/langgraph

import { DuckDuckGoSearch } from "@langchain/community/tools/duckduckgo_search";
import { createReactAgent } from "@langchain/langgraph/prebuilt";

// Define the tools the agent will have access to.
const toolsForAgent = [new DuckDuckGoSearch({ maxResults: 1 })];

const agentExecutor = createReactAgent({ llm, tools: toolsForAgent });
const exampleQuery = "What is Anthropic's estimated revenue for 2024?";

const events = await agentExecutor.stream(
  { messages: [["user", exampleQuery]] },
  { streamMode: "values" }
);

for await (const event of events) {
  const lastMsg = event.messages[event.messages.length - 1];
  if (lastMsg.tool_calls?.length) {
    console.dir(lastMsg.tool_calls, { depth: null });
  } else if (lastMsg.content) {
    console.log(lastMsg.content);
  }
}
[
  {
    name: 'duckduckgo-search',
    args: { input: 'Anthropic estimated revenue 2024' },
    type: 'tool_call',
    id: 'call_eZwbyemMAu8tgQw4VqJs65hF'
  }
]
[{"title":"Anthropic forecasts more than $850 mln in annualized revenue rate by ...","link":"https://www.reuters.com/technology/anthropic-forecasts-more-than-850-mln-annualized-revenue-rate-by-2024-end-report-2023-12-26/","snippet":"Dec 26 (Reuters) - Artificial intelligence startup <b>Anthropic</b> has projected it will generate more than $850 million in annualized <b>revenue</b> by the end of <b>2024</b>, the Information reported on Tuesday ..."}]
Anthropic is projected to generate more than $850 million in annualized revenue by the end of 2024.

API reference

For detailed documentation of all DuckDuckGoSearch features and configurations, head to the API reference.

