
LazyGit AI Commit Message

AI‑generated commit messages, directly integrated into LazyGit

If you use LazyGit every day, you already know how it turns Git from a chore into something you can actually enjoy. But there is one part of the workflow that still tends to feel a bit tedious: writing good commit messages. In this post, I show how to plug OpenAI models directly into LazyGit using a tiny one‑file Bash script, so you can get AI‑generated commit messages based on your actual diffs, without waiting for external tools to catch up with the new OpenAI Responses API.

The result is a minimal, focused tool you can drop into your setup today: lgaicm. It behaves like a mini aichat that does exactly one thing: generate commit messages from Git diffs, optimized for LazyGit.

Why AI‑generated commit messages in LazyGit?

Commit messages matter. They are the story of your codebase. But let's be honest:

  • After a long coding session, your brain is often too tired to craft a clean, descriptive message.
  • Keeping consistency (especially with Conventional Commits) is easy to forget.
  • You sometimes end up with messages like fix stuff or wip that are basically useless later.

Large language models are actually great at this. Feed them a Git diff, tell them your preferred commit style, and they will happily produce:

  • Structured summaries of your changes.
  • Messages that match your favorite style, like feat:, fix:, chore:, etc.
  • Multiple alternatives you can pick from.

And LazyGit, with its support for custom commands and menus, is the perfect place to integrate this. You press a key, LazyGit calls your AI helper, shows you a list of generated messages, and you commit with one key press.

The aichat idea, and the OpenAI Responses API roadblock

The inspiration for this whole setup came from a GitHub issue in the LazyGit repo: lazygit/issues/3212.

In that issue, someone proposed a neat approach: integrate AI‑generated commit messages into LazyGit using an external tool called aichat. It's a CLI chat client for various AI providers, and it can be scripted to do commit‑message generation from diffs. Sounds perfect.

But there was a catch.

OpenAI is moving to the Responses API for its newer and recommended models. Meanwhile, in that same LazyGit issue, sigoden, the maintainer of aichat, clearly stated that aichat will not support the new Responses API.

That puts you in an awkward position if, like many developers, you've already subscribed to OpenAI and want to use the newest models in your daily workflow. You essentially have three choices:

  1. Stick to old APIs or models just to keep aichat happy (not ideal).
  2. Fork aichat and re‑architect it around the Responses API, hoping not to break all its other built‑in features.
  3. Build something small and focused that only does commit messages.

Option 2 sounds powerful but quickly becomes a time sink. aichat does a lot of things: multi‑provider support, chat sessions, configuration, templating, etc. Swapping out its core API layer for Responses would mean reading and understanding a lot of code, then carefully testing everything so you don't regress important features.

That is way more work than necessary if you just want one thing:

“Give me a conventional commit message for my current diff, using an OpenAI model that talks via the Responses API.”

The minimal solution: lgaicm, a one‑file Bash script

Instead of forking aichat or waiting for new features, the approach I take is brutally simple:

  • Write a single Bash script that:
    • Generates a Git diff from your working tree.
    • Sends that diff to the OpenAI Responses API.
    • Receives and prints a list of commit message suggestions, one per line.
  • Wire that script into LazyGit as a custom command that powers a menu.

That script lives in its own tiny project: rakotomandimby/lgaicm.

Think of lgaicm as a hyper‑minimal aichat clone that only does one thing: generate commit messages from diffs using OpenAI's Responses API.

Why a one‑file script is actually a feature

A single Bash file may sound too simple, but that's exactly why it works so well:

  • Easy to audit: You can open the script and see everything it does in a few minutes.
  • Easy to fork or tweak: Want to change the model, temperature, or prompt style? It's one file.
  • No dependency hell: No need to install a complex tool or track its versions. It's just Bash + curl + Git.
  • Tailored for one workflow: Instead of a general chat client, lgaicm is optimized for commit message generation in LazyGit.

Getting started with lgaicm

Let's go through the steps to put this into your own setup. You'll need:

  • A working Git + LazyGit environment.
  • An OpenAI API key (with billing set up, since you want the newest models).
  • A shell environment where you can create symlinks and export env vars.

1. Clone the repository

First, clone the lgaicm project locally:

git clone https://github.com/rakotomandimby/lgaicm
cd lgaicm

2. Expose the script on your PATH

The project contains a single Bash script, lgaicm.sh, that you want to invoke from anywhere. The easiest way is to create a symbolic link in a directory that is already on your PATH, such as /usr/local/bin.

ln -s "$(pwd)/lgaicm.sh" /usr/local/bin/lgaicm
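
Depending on your system, writing to /usr/local/bin may require sudo. If you prefer to avoid that, any directory already on your PATH works just as well, for example a user‑local bin directory:

mkdir -p ~/.local/bin
ln -s "$(pwd)/lgaicm.sh" ~/.local/bin/lgaicm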

After that, you should be able to run lgaicm from any Git repository.
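
Once the API key from the next step is in place, you can sanity‑check the setup by calling the script by hand from a repository that has uncommitted changes. The --type flag shown here is the one the LazyGit configuration later in this post passes; adjust it if your version of the script uses a different interface:

cd /path/to/some/repo
lgaicm --type ai-defined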

3. Set your OPENAI_API_KEY

lgaicm uses the OpenAI Responses API directly over HTTP, so you have to provide your API key in the environment. The script expects OPENAI_API_KEY to be set:

export OPENAI_API_KEY="sk-..."  # put your real key here

You can add that line to your shell profile (~/.bashrc, ~/.zshrc, etc.) so it's available in all terminals, including the one LazyGit uses.

Treat your API key like a password: don't commit it to Git, don't paste it into screenshots, and avoid putting it into scripts that live in your repository.
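
A quick way to confirm the key is actually visible to the shell (and therefore to LazyGit) is:

[ -n "${OPENAI_API_KEY:-}" ] && echo "OPENAI_API_KEY is set" || echo "OPENAI_API_KEY is missing"

If the second message shows up in the terminal you launch LazyGit from, the export has not been picked up yet: open a new terminal or source your profile.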

Integrating lgaicm into LazyGit

LazyGit has a very powerful customCommands system. You can attach a keybinding to a command, prompt the user for input, show menus, and then execute shell commands with templated arguments.

Here’s a complete example configuration that wires lgaicm into LazyGit so you can:

  1. Press Ctrl+A inside LazyGit.
  2. Pick a commit type (or let AI auto‑decide).
  3. Choose a commit message from an AI‑generated menu.
  4. Commit with that message.

customCommands:
  - key: <c-a>
    description: AI-powered conventional commit
    context: global
    prompts:
      - type: "menu"
        key: "Type"
        title: "Type of change"
        options:
          - name: "AI defined"
            description: "Let AI analyze and determine the best commit type"
            value: "ai-defined"
          - name: "feat"
            description: "A new feature"
            value: "feat"
          - name: "fix"
            description: "A bug fix"
            value: "fix"
          - name: "chore"
            description: "Other changes that don't modify src or test files"
            value: "chore"
          - name: "ci"
            description: "Changes to CI configuration files and scripts"
            value: "ci"
          - name: "refactor"
            description: "A code change that neither fixes a bug nor adds a feature"
            value: "refactor"
          - name: "test"
            description: "Adding missing tests or correcting existing tests"
            value: "test"
      - type: menuFromCommand
        title: "AI Generated Commit Messages"
        key: CommitMsg
        command: "lgaicm --type {{.Form.Type}}"
    command: "git commit -m \"{{.Form.CommitMsg}}\""
    loadingText: "Generating commit messages..."
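
This YAML goes into your LazyGit configuration file, typically ~/.config/lazygit/config.yml on Linux. Recent LazyGit versions can print the directory they read from (check lazygit --help if your build does not have this flag):

lazygit --print-config-dir

If you already have a customCommands section, append this entry to it rather than adding a second section.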

What this configuration does

Let’s break it down step by step.

Key and description

The top‑level section defines a custom command triggered by <c-a> (Control + A) in any context:

  - key: <c-a>
    description: AI-powered conventional commit
    context: global

You can change the key later if you want, but this gives you a quick entry point for AI commits from anywhere in LazyGit.

Prompt 1: choose the commit type

The first prompt is a static menu to select the type of change:

      - type: "menu"
        key: "Type"
        title: "Type of change"
        options:
          - name: "AI defined"
            description: "Let AI analyze and determine the best commit type"
            value: "ai-defined"
          - name: "feat"
            description: "A new feature"
            value: "feat"
          - name: "fix"
            description: "A bug fix"
            value: "fix"
          - name: "chore"
            description: "Other changes that don't modify src or test files"
            value: "chore"
          - name: "ci"
            description: "Changes to CI configuration files and scripts"
            value: "ci"
          - name: "refactor"
            description: "A code change that neither fixes a bug nor adds a feature"
            value: "refactor"
          - name: "test"
            description: "Adding missing tests or correcting existing tests"
            value: "test"

The selected value is stored under .Form.Type. That value is later passed to the AI script, so it can either:

  • Respect your chosen type (e.g. feat, fix), or
  • If you pick ai-defined, let the AI choose the best commit type based on the diff.
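
Concretely, the selection only changes the argument that gets passed to the script in the next prompt. Picking fix makes LazyGit run:

lgaicm --type fix

while picking “AI defined” runs:

lgaicm --type ai-defined

and leaves the choice of prefix to the model.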

Prompt 2: AI‑generated menu

The second prompt is where the magic happens. It's a menuFromCommand, meaning LazyGit will run a shell command and treat each line of its output as an option:

      - type: menuFromCommand
        title: "AI Generated Commit Messages"
        key: CommitMsg
        command: "lgaicm --type {{.Form.Type}}"

Here, it runs lgaicm --type {{.Form.Type}}, which means:

  • lgaicm inspects your Git diff (for staged or working tree changes, depending on how it's implemented).
  • It sends the diff along with your selected commit type to the OpenAI Responses API.
  • It prints back a list of commit messages, one per line, for LazyGit to display.

You then get an interactive menu inside LazyGit titled “AI Generated Commit Messages,” showing the suggestions from the model. When you choose one, it becomes .Form.CommitMsg.
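
As a purely illustrative example (the real suggestions depend on your diff and the model), the output that feeds the menu is just plain lines such as:

feat: add pagination to the /users endpoint
feat: support cursor-based pagination in the user listing
feat: return paginated results and a next-page token from /users

Each line becomes one selectable entry in the menu.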

Final command: git commit

Finally, the main command is executed with the selected commit message:

    command: "git commit -m \"{{.Form.CommitMsg}}\""
    loadingText: "Generating commit messages..."

While lgaicm calls the OpenAI API and generates the menu options, LazyGit shows the loading text “Generating commit messages...” so you know it's working. When you pick a suggestion, the commit runs immediately.
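
After template substitution, what LazyGit executes is an ordinary git commit. If you picked the first suggestion from the hypothetical menu above, it would be:

git commit -m "feat: add pagination to the /users endpoint"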

What the Bash script actually does (conceptually)

You don't need to know every implementation detail to use lgaicm, but it helps to have a rough mental model of what it's doing under the hood.

  1. Collect the diff: It uses git diff (or possibly git diff --cached) to capture the changes you plan to commit.
  2. Build an OpenAI Responses API request: It constructs a JSON payload including:
    • The model name (e.g., a current Responses‑compatible model).
    • A system prompt explaining that the assistant should generate conventional commit messages.
    • Your diff as input.
    • The desired commit type, if you specified one.
  3. Call the API: It uses curl with the Authorization: Bearer $OPENAI_API_KEY header and the Responses endpoint to get the model's reply.
  4. Parse and print suggestions: It prints the commit messages line by line to stdout. LazyGit uses that to build the selection menu.
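
To make those four steps concrete, here is a minimal sketch of what such a script can look like. It is not the actual lgaicm source (read that on GitHub for the real thing): the model name, the prompt wording, the suggestion count, and the use of jq for JSON handling are all assumptions made for illustration.

#!/usr/bin/env bash
# Minimal illustration of the flow above -- not the real lgaicm implementation.
# Assumes OPENAI_API_KEY is exported and jq is installed.
set -euo pipefail

# LazyGit calls this as: lgaicm --type <value> (see the config earlier).
TYPE="ai-defined"
if [ "${1:-}" = "--type" ]; then TYPE="${2:-ai-defined}"; fi

# 1. Collect the diff: prefer staged changes, fall back to the working tree.
DIFF="$(git diff --cached)"
if [ -z "$DIFF" ]; then DIFF="$(git diff)"; fi

# 2. Build the request: a prompt plus the diff, JSON-escaped with jq.
PROMPT="Generate 5 conventional commit messages for this diff, one per line, no numbering."
if [ "$TYPE" != "ai-defined" ]; then PROMPT="$PROMPT Use the commit type '$TYPE'."; fi
PAYLOAD="$(jq -n --arg prompt "$PROMPT" --arg diff "$DIFF" \
  '{model: "gpt-4.1-mini", input: ($prompt + "\n\n" + $diff)}')"

# 3. Call the Responses API endpoint with the bearer token.
RESPONSE="$(curl -sS https://api.openai.com/v1/responses \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD")"

# 4. Print the model's text output, one suggestion per line, for LazyGit's menu.
echo "$RESPONSE" \
  | jq -r '.output[] | select(.type == "message") | .content[] | select(.type == "output_text") | .text'

The real script may differ in how it parses arguments, prompts the model, and trims the output, but the shape (diff in, curl to /v1/responses, plain lines out) is the part LazyGit cares about.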

Importantly, because this script talks directly to the new Responses API, you are not held back by older SDKs or third‑party tools that don't plan to adopt it.

Why this approach works right now

There are a lot of smart ideas floating around in issues like lazygit/issues/3212, but they often get hung up on:

  • API changes (like OpenAI's shift to Responses).
  • Tool maintainers with different priorities (like aichat dropping support).
  • The complexity of general‑purpose AI chat clients.

lgaicm sidesteps all of that by embracing a few principles:

  • Single responsibility: It only generates commit messages from diffs. Nothing else.
  • Minimal surface area: A single Bash file is easier to keep working than a large application with many features.
  • Direct OpenAI integration: It talks directly to the Responses API, so you're always aligned with how OpenAI expects clients to behave.
  • LazyGit‑first design: The output format—one message per line—is exactly what LazyGit's menuFromCommand expects.

And you can use this today, without waiting on any PRs to be merged in LazyGit, aichat, or any other third‑party tool.

Ideas to extend or customize lgaicm

Because lgaicm is just a simple script, it's very easy to adapt it to your preferences. Some ideas:

  • Adjust the prompt style: Tune the system prompt to make messages shorter, longer, more formal, more imperative, etc.
  • Limit or expand diff context: For extremely large diffs, you might want to restrict the diff (or chunk it) to stay under token limits.
  • Choose different models: Swap in newer or cheaper OpenAI models depending on your needs.
  • Support multiple languages: If you work in a team that uses another language for commit messages, you can tell the model to respond in that language.

All of this can be done by editing one script instead of navigating through a larger codebase.
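
For instance, the “limit diff context” idea from the list above can be as small as capping the diff before it is sent; the 100 KB figure below is an arbitrary example, not something the script necessarily does today:

# Cap very large diffs so the request stays within the model's context window.
DIFF="$(git diff --cached | head -c 100000)"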

Closing thoughts

Integrating AI into your Git workflow doesn't have to be complicated or dependent on big external tools. With LazyGit's custom commands, a small Bash script, and OpenAI's Responses API, you can:

  • Streamline your commits.
  • Maintain consistent, conventional messages.
  • Leverage the latest OpenAI models you're already paying for.

The combination of LazyGit + lgaicm gives you a focused, practical solution for AI‑generated commit messages right now, without having to wait for ecosystem tools to catch up with the Responses API.

If you're curious, explore the script at github.com/rakotomandimby/lgaicm, plug it into your LazyGit config, and enjoy letting AI handle one of the least exciting parts of your Git workflow.
