
Tool Calling in Llama 3: A Step-by-step Guide To Build Agents


Robert


Recently, Meta released the Llama 3 and 3.1 family of models, including 8B, 70B, and 405B parameter models. The 3.1 models natively support tool calling, making them some of the best open-source large language models for agentic automation.

This article will discuss tool-calling in LLMs and how to use Groq’s Llama 3.1 tool-calling feature to build capable AI agents.

In this article, we cover:

  1. The basics of tool calling.
  2. How to use Llama 3.1 on Groq Cloud for tool calling.
  3. How to use Composio tools with LlamaIndex to build a research agent.

But before that, let’s understand what tool calling actually is.

Note: We will use the terms tool calling and function calling interchangeably, as they refer to the same concept.

What is Tool Calling?

Tools are functions that allow large language models (LLMs) to interact with external applications. They provide an interface for LLMs to access and retrieve information from outside sources.

Despite the name, LLMs don’t directly call tools themselves. Instead, when they determine that a request requires a tool, they generate a structured schema specifying the appropriate tool and the necessary parameters.

For instance, assume an LLM has access to an internet search tool that accepts a text query. When a user asks it to fetch information about a recent event, the LLM, instead of generating a plain-text answer, generates a JSON payload describing a call to that tool with the query filled in. Your code then executes the call and feeds the result back to the model.

Here is a schematic diagram that showcases how tool calling works.
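To make this concrete, here is a minimal sketch of what a tool definition and the resulting tool call might look like, using the OpenAI-compatible JSON-schema format that Groq's chat API accepts. The tool name `search_internet` and its parameter are hypothetical, purely for illustration.

```python
# A hypothetical "search_internet" tool described in the JSON-schema format
# used by OpenAI-compatible chat APIs such as Groq's.
tools = [
    {
        "type": "function",
        "function": {
            "name": "search_internet",
            "description": "Search the web and return results for a query.",
            "parameters": {
                "type": "object",
                "properties": {
                    "query": {
                        "type": "string",
                        "description": "The text to search for.",
                    }
                },
                "required": ["query"],
            },
        },
    }
]

# Instead of answering in plain text, the model responds with a structured
# tool call along these lines; your code runs it and returns the result:
tool_call = {
    "name": "search_internet",
    "arguments": {"query": "latest Llama 3.1 release news"},
}
```

The key point is that the model only produces the structured call; executing the function and passing its output back is always the application's job.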

What is Groq?

Groq is an AI inference provider that hosts open-source models such as Mistral, Gemma, Whisper, and Llama. It is known for offering some of the fastest AI inference available, which makes it well suited for building capable agentic systems.

This article will show how you can use Groq Cloud to build AI agents using Composio tools.

Create Groq API key

You will need an API key to use the LLMs hosted by Groq Cloud. So, go to Groq Cloud and create one.
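Once you have the key, export it as an environment variable and run a quick sanity check. Here is a minimal sketch, assuming the official `groq` Python SDK and the `llama-3.1-70b-versatile` model ID (available model names on Groq Cloud may change, so check their docs):

```python
import os

from groq import Groq  # pip install groq

# Assumes the key is exported as an environment variable, e.g.
#   export GROQ_API_KEY="gsk_..."
client = Groq(api_key=os.environ["GROQ_API_KEY"])

# Quick sanity check that the key and model work before building the agent.
response = client.chat.completions.create(
    model="llama-3.1-70b-versatile",  # model ID is an assumption; verify on Groq Cloud
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```

If this prints a short greeting, your key is set up correctly and you are ready to wire up tools.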
