10 open-source AI coding tools that every developer should know 🎯

Sunil Kumar Dash · Nov 4 · Dev Community

AI is changing the world as we know it, and for developers, embracing it can significantly boost productivity. It can help you ship new features faster, write test cases, and even find vulnerabilities in your code.

The internet offers many tools, but finding the right one can take time and effort. So, I have compiled a list of AI tools to help you become a better developer.



1. SWE-Kit 👑: Open-source headless IDE for coding agents

As a developer, I have always wanted to build customized AI tools that let me chat with a codebase, automate pushing changes to GitHub, and ship new features automatically. Honestly, I couldn't find a single tool that did it all until this one.

SWE-Kit is a headless IDE with features like LSPs, Code Indexing, and Code RAG. It offers a flexible runtime, which can run on any Docker host or remote server alongside specialized coding toolkits.

These toolkits include integrations with platforms like GitHub, Jira, and Slack, as well as tools such as file search and code indexing, all powered by Composio.

The coding agent built with SWE-Kit scored an impressive 48.60% on SWE-bench Verified.

This benchmark comprises real-world GitHub issues from popular libraries like Django, Scikit-learn, Flask, SymPy, and more.

SweKit SWE-bench Stats
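Concretely, each benchmark instance pairs a GitHub issue with the gold patch that fixed it and the tests that verify the fix. Here is a simplified sketch of what one task looks like (the field values below are made up for illustration, not quoted from the dataset):

```python
# Simplified sketch of a SWE-bench task instance (illustrative values).
task = {
    "repo": "django/django",               # repository the issue was filed against
    "instance_id": "django__django-12345", # unique task identifier (made up)
    "problem_statement": "The bug description, taken from the GitHub issue.",
    "patch": "diff --git a/... (the gold fix, hidden from the agent)",
    "test_patch": "diff --git a/tests/... (tests that must pass after the fix)",
}

# An agent "resolves" the task if its own generated patch makes the tests
# in test_patch pass; the reported score is the percentage resolved.
print(f"{task['instance_id']}: {task['repo']}")
```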

It is compatible with all the major agent frameworks, like LangChain, CrewAI, AutoGen, and LlamaIndex.

You can build and deploy your own:

  • GitHub PR Agent: Automates the review of GitHub PRs.
  • SWE Agent: Writes features, unit tests, documentation, etc., automatically.
  • Chat with Codebase: Lets you chat with any remote or local codebase using the code indexing tool.
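To give a feel for what a code-indexing tool does under the hood, here is a toy retrieval step: chunk the repository, score each chunk against the question, and hand the winners to the LLM as context. This illustrates the general idea only; it is not SWE-Kit's actual implementation, which uses proper indexing and LSP data.

```python
import re

# Toy code index: chunk source files, score chunks by keyword overlap
# with the question, and return the best matches as context for an LLM.
# Illustrative only -- real tools use embeddings and LSP symbol data.

def tokens(text: str) -> set[str]:
    return set(re.findall(r"\w+", text.lower()))

def chunk(source: str, size: int = 4) -> list[str]:
    lines = source.splitlines()
    return ["\n".join(lines[i:i + size]) for i in range(0, len(lines), size)]

def retrieve(question: str, files: dict[str, str], k: int = 2) -> list[str]:
    q = tokens(question)
    chunks = [c for src in files.values() for c in chunk(src)]
    return sorted(chunks, key=lambda c: len(q & tokens(c)), reverse=True)[:k]

files = {"auth.py": "def login(user, password):\n    ...\n\ndef logout(user):\n    ..."}
context = retrieve("How does login validate the password?", files)
print(context[0])  # the chunk defining `login` ranks first
```

A real index swaps keyword overlap for vector similarity, but the retrieve-then-prompt shape is the same.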

Install swekit and composio-core to get started quickly.

pip install composio-core swekit

Install any framework of your choice.

pip install crewai composio-crewai

Now, let's create a coding agent with GitHub access.

composio add github

Generate a new agent scaffolding.

swekit scaffold crewai -o swe_agent

Run the agent.

cd swe_agent/agent
python main.py

This uses Docker as the default workspace environment. For more, see the documentation.

SweKit Image

Visit the site and show support on Product Hunt ⭐


2. Composio: AI integration and tooling platform

Composio is an open-source platform that provides third-party integrations for AI agents, including Linear, Slack, GitHub, Jira, Asana, and more, which you can use to build customized AI agents.

For instance,

  • You can build comprehensive agents to interact with users on Slack and Discord. Respond to queries, guide them through the support process, or schedule follow-up actions.
  • You can build Coding agents to automate bug fixing in GitHub from Jira tickets.
  • Build agents to write code documentation.

Composio is very easy to get started with.

pip install composio-core

Add a GitHub integration.

composio add github

Composio handles user authentication and authorization on your behalf.

Here is how you can use the GitHub integration to star a repository.

from openai import OpenAI
from composio_openai import ComposioToolSet, Action

openai_client = OpenAI(api_key="<OPENAI_API_KEY>")

# Initialise the Composio toolset
composio_toolset = ComposioToolSet(api_key="<COMPOSIO_API_KEY>")

# Get pre-configured GitHub tools
actions = composio_toolset.get_actions(
    actions=[Action.GITHUB_ACTIVITY_STAR_REPO_FOR_AUTHENTICATED_USER]
)

my_task = "Star a repo ComposioHQ/composio on GitHub"

# Create a chat completion request to decide on the action
response = openai_client.chat.completions.create(
    model="gpt-4-turbo",
    tools=actions,  # passing the actions we fetched earlier
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": my_task},
    ],
)

Run this Python script to execute the given instruction using the agent.
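For context, the `actions` list passed to `tools` is a set of OpenAI function-calling schemas that Composio generates from its integrations. A hand-written schema of the same shape, for a hypothetical star-repo function, looks roughly like this (illustrative; the schema Composio actually generates will differ):

```python
# Hand-written OpenAI function-calling tool schema -- the same shape that
# composio_toolset.get_actions() produces for you. Illustrative only;
# the function name and parameters here are made up.
star_repo_tool = {
    "type": "function",
    "function": {
        "name": "github_star_repo",  # hypothetical name
        "description": "Star a GitHub repository for the authenticated user.",
        "parameters": {
            "type": "object",
            "properties": {
                "owner": {"type": "string", "description": "Repository owner"},
                "repo": {"type": "string", "description": "Repository name"},
            },
            "required": ["owner", "repo"],
        },
    },
}

# The model responds with a tool call naming this function plus JSON
# arguments, which the toolkit then executes against the GitHub API.
print(star_repo_tool["function"]["name"])
```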

Composio works with popular frameworks like LangChain, LlamaIndex, and CrewAI.

For more information, visit the official docs, and for more complex examples, see the repository's examples section.

Composio GIF

Star the Composio repository ⭐


3. Aider - The AI Pair-programmer

This is the perfect choice if you're looking for a pair programmer to help you ship code faster.

Aider lets you pair program with LLMs to edit code in your local Git repository. You can start a new project or work with an existing GitHub repo.

You can get started quickly like this:

pip install aider-chat

# Change the directory into a git repo
cd /to/your/git/repo

# Work with Claude 3.5 Sonnet on your repo
export ANTHROPIC_API_KEY=your-key-goes-here
aider

# Work with GPT-4o on your repo
export OPENAI_API_KEY=your-key-goes-here
aider

For more details, see the installation instructions and other documentation.

Aider GIF

Star the Aider repository ⭐


4. Mentat - A GitHub native coding agent

Mentat is an AI tool built to help you tackle any coding task from your command line.

Unlike Copilot, Mentat can coordinate edits across multiple files and locations. And unlike ChatGPT, Mentat understands the context of your project from the start; there's no need to copy and paste!

It has a dedicated CLI tool to communicate directly with codebases and can generate and execute Python code from prompts in the terminal.
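The generate-and-execute loop can be pictured like this: the LLM returns a code string, which is then run in a fresh namespace. This is a toy sketch of the idea only, not Mentat's implementation (Mentat edits files through git and asks for confirmation before applying changes):

```python
# Toy sketch of a "generate code, then execute it" step.
# In place of a real LLM call, generate() returns a canned snippet.
def generate(prompt: str) -> str:
    return "result = sum(range(1, 11))"  # pretend the model wrote this

def execute(code: str) -> dict:
    namespace: dict = {}
    exec(code, namespace)  # run in an isolated namespace, not globals()
    return namespace

ns = execute(generate("Sum the numbers 1 through 10"))
print(ns["result"])  # 55
```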

Follow these steps to run Mentat. First, create a Python virtual environment.

# Python 3.10 or higher is required
python3 -m venv .venv
source .venv/bin/activate

Clone the Mentat repository and install it in editable mode.

git clone https://github.com/AbanteAI/mentat.git
cd mentat

# install with pip in editable mode:
pip install -e .

Add your OpenAI (or other LLM provider's) API key.

export OPENAI_API_KEY=<your key here>

Run Mentat from within your project directory. Mentat uses git, so if your project doesn't already have git set up, run git init. Then you can run Mentat with:

mentat <paths to files or directories>

For more information on Mentat, check the documentation.

Mentat GIF

Star the Mentat repository ⭐


5. AutoCodeRover - Autonomous program improvement

AutoCodeRover offers a fully automated solution for resolving GitHub issues, including bug fixes and feature additions.

By combining LLMs with advanced analysis and debugging capabilities, AutoCodeRover prioritizes patch locations to create and implement patches efficiently.

To get started, set the OPENAI_KEY environment variable (the same applies to any other supported provider):

export OPENAI_KEY=sk-YOUR-OPENAI-API-KEY-HERE

Build and start the Docker image:

docker build -f Dockerfile -t acr .
docker run -it -e OPENAI_KEY="${OPENAI_KEY:-OPENAI_API_KEY}" -p 3000:3000 -p 5000:5000 acr

Check out their official repository for more information.

AutoCodeRover GIF

Star the AutoCodeRover repository ⭐


6. Continue - Leading AI-powered code assistant

You must have heard of Cursor, the popular AI-powered IDE; Continue is similar to it but open source under the Apache license.

It is highly customizable and lets you plug in any language model for autocompletion or chat, which can immensely improve your productivity. You can add Continue to VS Code and JetBrains IDEs.

Key features

  • Chat to understand and iterate on code in the sidebar
  • Autocomplete to receive inline code suggestions as you type
  • Edit to modify code without leaving your current file
  • Actions to establish shortcuts for everyday use cases

For more, check the documentation.

Continue GIF


7. Qodo Merge: Tool for automated pull request analysis

This open-source tool from Codium AI automates GitHub pull request review, analysis, feedback, and suggestions. It can make you more productive with pull requests and is also compatible with other version control platforms like GitLab and Bitbucket.

It has both self-hosted and cloud-hosted solutions.

You will need an OpenAI API key and a GitHub or GitLab access token for a self-hosted solution.

To use it locally, install the library.

pip install pr-agent

Then, run the relevant tool with the script below.

Make sure to fill in the required parameters (user_token, openai_key, pr_url, command):

from pr_agent import cli
from pr_agent.config_loader import get_settings

def main():
    # Fill in the following values
    provider = "github" # GitHub provider
    user_token = "..."  # GitHub user token
    openai_key = "..."  # OpenAI key
    pr_url = "..."      # PR URL, for example 'https://github.com/Codium-ai/pr-agent/pull/809'
    command = "/review" # Command to run (e.g. '/review', '/describe', '/ask="What is the purpose of this PR?"', ...)

    # Setting the configurations
    get_settings().set("CONFIG.git_provider", provider)
    get_settings().set("openai.key", openai_key)
    get_settings().set("github.user_token", user_token)

    # Run the command. Feedback will appear in GitHub PR comments
    cli.run_command(pr_url, command)

if __name__ == '__main__':
    main()

You can also use Docker images or run it from source. The documentation has more on Qodo Merge.

Qodo AI Image

Star the Qodo Merge repository ⭐


8. OpenHands: Platform for AI software developer agents

OpenHands is one of the leading open-source platforms for AI agents and a direct competitor of Devin. An OpenHands agent can build new greenfield projects, add features to existing codebases, debug issues, and more.

Recently, their agent also topped the SWE-bench leaderboard, resolving 53% of issues.

To get started with OpenHands, you need Docker 26.0.0+ or Docker Desktop 4.31.0+, running on Linux, macOS, or WSL.

Pull the Docker image and run the container.

docker pull docker.all-hands.dev/all-hands-ai/runtime:0.12-nikolaik

docker run -it --rm --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.12-nikolaik \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name openhands-app \
    docker.all-hands.dev/all-hands-ai/openhands:0.12

After running the command above, you'll find OpenHands running at http://localhost:3000.

Upon launching OpenHands, you'll see a settings modal. Select an LLM Provider and LLM Model and enter the corresponding API Key. You can change these anytime from the Settings button in the UI.

If your model is not listed, toggle the advanced mode and enter it manually.

OpenHands Screenshots

They provide four ways of working with agents: an interactive GUI, a command-line interface (CLI), and non-interactive use through headless mode or GitHub Actions. Each has its advantages. For more, refer to the documentation.

AllHands AI

Star the OpenHands repository ⭐


9. Cody from Sourcegraph: Coding assistant for IDEs

Cody is an open-source project from Sourcegraph designed to supercharge your coding workflow directly within your IDE, whether that's VS Code, JetBrains, or another editor. Cody leverages advanced search as a coding assistant to pull context from local and remote codebases, enabling seamless access to details about APIs, symbols, and usage patterns at any scale, right from your IDE.

With Cody, you can chat with your codebase, make inline edits, get code suggestions, and enjoy features like auto-completion, all tailored to help you code faster and more effectively.

You can simply install it in your IDE and get started. For more, check the documentation.

CodyAI Image

Star the Cody repository ⭐


10. VannaAI: Chat with your SQL database

I dread writing SQL queries, yet SQL remains one of the most critical technologies in modern software development: almost every company relies on it to interact with relational databases. But as they say, there is an AI tool for everything, and for SQL databases it is Vanna AI.

It is an open-source tool that lets you chat with SQL databases using natural language.

Vanna works in two easy steps: train a RAG "model" on your data, then ask questions in natural language and get back SQL queries that can be set up to run on your database automatically.

VannaAI workflow
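The "RAG model" here is not a fine-tuned LLM: training stores your schema and examples, and each question retrieves the most relevant pieces to build the SQL-generation prompt. A keyword-based toy version of that loop, to show the shape of the idea (illustrative only, not Vanna's implementation):

```python
import re

# Toy version of Vanna's two steps: train() stores schema text,
# ask() retrieves the most relevant stored piece to build the
# SQL-generation prompt. Illustrative only -- Vanna itself uses
# embeddings and a vector database such as ChromaDB.
store: list[str] = []

def train(text: str) -> None:
    store.append(text)

def ask(question: str) -> str:
    q = set(re.findall(r"\w+", question.lower()))
    best = max(store, key=lambda t: len(q & set(re.findall(r"\w+", t.lower()))))
    return f"Schema:\n{best}\n\nQuestion: {question}\nSQL:"  # prompt handed to the LLM

train("CREATE TABLE customers (id INT, name VARCHAR(100))")
train("CREATE TABLE orders (id INT, customer_id INT, total DECIMAL)")
prompt = ask("List all customers by name")
print(prompt)
```

The real library then sends this prompt to your chosen LLM and returns the generated SQL.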


Getting started with Vanna is easy. Install it using pip.

pip install vanna
# The import statement will vary depending on your LLM and vector database. This is an example for OpenAI + ChromaDB

from vanna.openai.openai_chat import OpenAI_Chat
from vanna.chromadb.chromadb_vector import ChromaDB_VectorStore

class MyVanna(ChromaDB_VectorStore, OpenAI_Chat):
    def __init__(self, config=None):
        ChromaDB_VectorStore.__init__(self, config=config)
        OpenAI_Chat.__init__(self, config=config)

vn = MyVanna(config={'api_key': 'sk-...', 'model': 'gpt-4-...'})

# See the documentation for other options


You can train models using your custom data.

Depending on your use case, you may or may not need to run these vn.train commands.

Train with DDL statements.

vn.train(ddl="""
    CREATE TABLE IF NOT EXISTS my_table (
        id INT PRIMARY KEY,
        name VARCHAR(100),
        age INT
    )
""")

Ask questions.

vn.ask("What are the top 10 customers by sales?")

You will get SQL output.

SELECT c.c_name AS customer_name,
       SUM(l.l_extendedprice * (1 - l.l_discount)) AS total_sales
FROM snowflake_sample_data.tpch_sf1.lineitem l
JOIN snowflake_sample_data.tpch_sf1.orders o
  ON l.l_orderkey = o.o_orderkey
JOIN snowflake_sample_data.tpch_sf1.customer c
  ON o.o_custkey = c.c_custkey
GROUP BY customer_name
ORDER BY total_sales DESC
LIMIT 10;

See the documentation for more details.

VannaAI Image

Star the Vanna repository ⭐


Thanks for reading. If you use any other AI tool that has helped you, comment below.
