TL;DR
I spend a lot of time on Slack and often need well-researched information. For that, I have to go to Google and research topics manually, which feels unproductive in the age of AI.
So, I built a Slack chatbot to access the internet and find relevant information with citations, similar to Perplexity.
Here’s how I built it:
- Configure a SlackBot in the workspace.
- The bot forwards all the messages in the workspace to an event listener.
- Parse the information from the message events and pass it to an AI agent.
- The AI agent, equipped with tools like Exa and Tavily, searches the Internet for the topic and returns the response.
- The agent’s response is then posted as a comment in the main message thread.
Try the Agent live now on Composio Playground👇.
What are AI agents?
Before going ahead, let’s understand what an AI agent is.
AI agents are systems powered by AI models that can autonomously perform tasks, interact with their environment, and make decisions based on their programming and the data they process.
Your AI agent tooling platform 🛠️
Composio is an open-source platform that offers over 150 production-ready tools and integrations such as GitHub, Slack, Code Interpreter, and more to empower AI agents to accomplish complex real-world workflows.
Please help us with a star. 🥹
It would help us to create more articles like this 💖
Star the Composio.dev repository ⭐
Let’s get started 🔥
Start by creating a virtual environment.
python -m venv slack-agent
cd slack-agent
source bin/activate
Now, install the libraries.
pip install composio-core composio-llamaindex
pip install llama-index-llms-openai python-dotenv
A brief description of libraries
- `composio-core` is the main library for accessing and configuring tools and integrations. It also has a CLI to manage integrations and triggers conveniently.
- `composio-llamaindex` is the LlamaIndex plug-in for Composio. It lets you use all the LlamaIndex functionalities with Composio tools.
- `llama-index-llms-openai` is an additional library from LlamaIndex that enables you to use OpenAI models within its framework.
- `python-dotenv` loads environment variables from a `.env` file into your Python project's environment, making it easier to manage configuration settings.
Next, create a `.env` file and add an environment variable for the OpenAI API key.
OPENAI_API_KEY=your-openai-api-key
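To confirm the variable loads as expected, you can run a quick throwaway check (this snippet is only a sanity check and is not part of the final bot):

```python
import os
from dotenv import load_dotenv

load_dotenv()  # Reads key=value pairs from .env into the process environment
print("OpenAI key loaded:", bool(os.getenv("OPENAI_API_KEY")))
```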
Configure the Integrations 🔧
Composio allows you to configure SlackBot without writing any code for the integration. Composio handles all the user authentication and authorization flows, so you can focus on shipping faster.
You can do it from the terminal using Composio’s dedicated CLI.
But before that, log in to Composio from the CLI and update apps by running the following commands.
composio login
composio apps update
Complete the login flow to use the Composio CLI.
Execute the following command to configure a SlackBot integration.
composio add slackbot
Now, finish the authentication flow to add a Slackbot integration.
Once you finish the integration flow, your live integration will appear in the Integrations section.
Once the SlackBot is integrated, go to the apps section in your workspace, get the bot ID, and add it to the `.env` file.
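The code we write later reads this value from the `SLACK_BOT_ID` environment variable, so your `.env` file should now contain both keys, for example:

```
OPENAI_API_KEY=your-openai-api-key
SLACK_BOT_ID=your-bot-member-id
```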
Set up SlackBot Trigger ⚙️
Triggers are predefined conditions that activate your agents when met. Composio offers a built-in event-listener to capture these trigger events.
Here, we set up one SlackBot trigger to fetch the event data when a new message is posted in the workspace and another for when a new message is added to a thread.
composio triggers enable slack-receive-message
composio triggers enable slackbot_receive_thread_reply
Go to the trigger section and add the triggers you need. You can also add triggers from the SlackBot page on the dashboard.
Building the Agentic Workflow 🏗️
Now that we have set up integrations and triggers, let's move on to the coding part.
Step 1: Import packages and define tools
Create a `main.py` file and paste the following code.
import os
from dotenv import load_dotenv
from composio_llamaindex import Action, App, ComposioToolSet
from composio.client.collections import TriggerEventData
from llama_index.core.agent import FunctionCallingAgentWorker
from llama_index.core.llms import ChatMessage
from llama_index.llms.openai import OpenAI
load_dotenv()
llm = OpenAI(model="gpt-4o")
# Bot configuration constants
BOT_USER_ID = os.environ["SLACK_BOT_ID"]  # Bot member ID; replace it with your bot's ID once the bot joins the channel
RESPOND_ONLY_IF_TAGGED = True  # Set to True to have the bot respond only when tagged in a message

# Initialize the Composio toolset and load the tools for the agent
composio_toolset = ComposioToolSet()
composio_tools = composio_toolset.get_tools(
    apps=[App.CODEINTERPRETER, App.EXA, App.FIRECRAWL, App.TAVILY]
)
Here’s what is going on in the above code block:
- We import the packages and modules needed for the project.
- Load the variables from `.env` into the environment with `load_dotenv()`.
- Set up the global variables `BOT_USER_ID` and `RESPOND_ONLY_IF_TAGGED`.
- Initialize the Composio toolset.
- Add Code Interpreter, Exa, Firecrawl, and Tavily services as tools.
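If you want to sanity-check which tools were actually loaded, you can print their names; this assumes `get_tools()` returns LlamaIndex `FunctionTool` objects, which expose `metadata.name`:

```python
# Optional: list the tool names the agent will be able to call
for tool in composio_tools:
    print(tool.metadata.name)
```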
Step 2: Define Agents
Now, define the Slack agent with the tools and the LLM.
# Define the system prompt for the Slack agent
prefix_messages = [
    ChatMessage(
        role="system",
        content=(
            "You are now an integration agent, and whatever you are requested, you will try to execute utilizing your tools."
        ),
    )
]

agent = FunctionCallingAgentWorker(
    tools=composio_tools,
    llm=llm,
    prefix_messages=prefix_messages,
    max_function_calls=10,
    allow_parallel_tool_calls=False,
    verbose=True,
).as_agent()
- We define the agent using `FunctionCallingAgentWorker` with a prefix message as the system prompt. This gives the LLM additional context about its role and expectations.
- The agent is given the tools defined earlier.
- The `max_function_calls` parameter caps the number of tool calls the agent can make in a single run.
- `verbose` is set to `True` to log the complete agent workflow.
Note: `FunctionCallingAgentWorker` only supports LLMs with function-calling abilities, such as GPT, Mistral, and Anthropic models.
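Before wiring the agent to Slack, you can sanity-check it locally with a one-off query (the prompt below is just an example; `agent.chat()` returns a response object whose `.response` attribute holds the final text, as used later in the listener):

```python
# Quick local test; remove this once the Slack listener is in place
test_result = agent.chat("Find the latest news about LlamaIndex and cite your sources.")
print(test_result.response)
```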
Step 3: Defining the Event Listener
The next step is to set up the event listener. This will receive the payloads from the trigger events in Slack.
The payloads contain the required event information, such as channel ID, message text, timestamps, etc. You retrieve the needed information, process it, and perform actions.
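For reference, a trimmed Slack message event payload looks roughly like the dictionary below (illustrative values only); the fields we use are `user`, `text`, `channel`, `ts`, and `thread_ts`:

```python
# Illustrative, trimmed payload shape; real payloads carry many more fields
payload = {
    "event": {
        "type": "message",
        "user": "U0123456789",      # Author of the message
        "text": "<@U0BOT123456> research the latest on RAG evaluation",
        "channel": "C0123456789",   # Channel where the message was posted
        "ts": "1726000000.000100",  # Message timestamp
        # "thread_ts" appears only on replies inside a thread
    }
}
```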
# Create a listener to handle Slack events and triggers for Composio
listener = composio_toolset.create_trigger_listener()
# Callback function for handling new messages in a Slack channel
@listener.callback(filters={"trigger_name": "slackbot_receive_message"})
def callback_new_message(event: TriggerEventData) -> None:
    payload = event.payload
    user_id = payload.get("event", {}).get("user", "")

    # Ignore messages from the bot itself to prevent self-responses
    if user_id == BOT_USER_ID:
        return

    message = payload.get("event", {}).get("text", "")

    # Respond only if the bot is tagged in the message, if configured to do so
    if RESPOND_ONLY_IF_TAGGED and f"<@{BOT_USER_ID}>" not in message:
        print("Bot not tagged, ignoring message")
        return

    # Extract channel and timestamp information from the event payload
    channel_id = payload.get("event", {}).get("channel", "")
    ts = payload.get("event", {}).get("ts", "")
    thread_ts = payload.get("event", {}).get("thread_ts", ts)

    # Process the message and post the response in the same channel or thread
    result = agent.chat(message)
    print(result)

    composio_toolset.execute_action(
        action=Action.SLACKBOT_CHAT_POST_MESSAGE,
        params={
            "channel": channel_id,
            "text": result.response,
            "thread_ts": thread_ts,
        },
    )
listener.listen()
- The callback function `callback_new_message` is invoked when a trigger event from Slack matches `slackbot_receive_message`.
- The user ID is extracted from the event payload. If it matches the bot ID, the message is skipped to prevent self-responses.
- If not, the code checks whether the message mentions the bot ID.
- We extract the message text, channel ID, and timestamps.
- The message is passed to the Slack agent defined earlier.
- The agent’s response is then posted back to Slack in the same thread.
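We also enabled the `slackbot_receive_thread_reply` trigger earlier, but the code above only handles top-level messages. If you want the bot to follow up inside threads as well, a minimal sketch of a second callback, placed before `listener.listen()`, could look like this (it assumes the thread-reply payload mirrors the message event shape):

```python
# Hypothetical handler for replies inside a thread; payload fields are assumed
# to mirror the top-level message event
@listener.callback(filters={"trigger_name": "slackbot_receive_thread_reply"})
def callback_thread_reply(event: TriggerEventData) -> None:
    payload = event.payload
    user_id = payload.get("event", {}).get("user", "")
    if user_id == BOT_USER_ID:
        return  # Ignore the bot's own replies

    message = payload.get("event", {}).get("text", "")
    channel_id = payload.get("event", {}).get("channel", "")
    thread_ts = payload.get("event", {}).get("thread_ts", "")

    result = agent.chat(message)
    composio_toolset.execute_action(
        action=Action.SLACKBOT_CHAT_POST_MESSAGE,
        params={
            "channel": channel_id,
            "text": result.response,
            "thread_ts": thread_ts,
        },
    )
```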
Now, once everything is set up, run the Python file.
Make sure you have set up the Slack bot correctly in your channel. Here is how you can add the Slack Bot to your channel 👇.
When you send a message in Slack tagging the bot, the event listener receives the payload, the agent acts on it, and the response is sent back to the same Slack channel.
Here is a full video of the Slack bot in action.
Let's connect! 🔌
You can join our community to engage with maintainers and contribute as an open-source developer. Don't hesitate to visit our GitHub repository to contribute and create issues related to Composio.
The source code for this tutorial is available here. Also, check out the implementations with other frameworks:
Full code of the AI Slack Bot ✨
Thank you for reading!