Chat

The chat interface is the main way to interact with your Gumm assistant. This guide covers everything from sending messages to understanding how tool calls work.


Interface overview

The chat page (/) consists of:

  • Message list — the conversation history, with user messages on the right and assistant responses on the left
  • Input bar — at the bottom of the page; type here to send a message
  • Conversation list — accessible from the sidebar or top of the page

Sending a message

  1. Click in the text field at the bottom of the page
  2. Type your message
  3. Press Enter to send, or click the send button

Use Shift + Enter to add a line break without sending.


Conversations

Each conversation is independent and stored permanently in the database.

  • Start a new conversation — click + New in the top-right area
  • Switch to a previous conversation — click it in the sidebar conversation list
  • Browse all past conversations — go to History (/history)

Conversation titles are generated automatically from the first message. You can rename them from the History page.


How Gumm uses tools (modules)

When Gumm needs to perform an action — play a song, check the weather, create a reminder — it uses the tool system rather than making things up.

Here is what happens behind the scenes when you ask something like “Play my morning playlist on Spotify”:

  1. Your message is sent to the LLM with all active module tool definitions attached
  2. The LLM decides to call the spotify_play tool and returns a structured tool call
  3. Gumm executes the tool (calls the Spotify API via the module)
  4. The result is fed back to the LLM
  5. The LLM composes a natural-language response and sends it to you
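The steps above can be sketched as a simple loop. This is a minimal illustration, not Gumm's actual implementation — the LLM client is stubbed out, and names like handle_message and TOOLS are hypothetical; only spotify_play comes from the example above.

```python
from dataclasses import dataclass

@dataclass
class ToolCall:
    name: str
    args: dict

# Hypothetical registry of active module tools. A disabled module's
# tools would simply be absent from this dict (step: tools attached).
TOOLS = {
    "spotify_play": lambda args: {"status": "playing",
                                  "playlist": args["playlist"]},
}

def fake_llm(messages, tools):
    """Stub LLM: first requests a tool call, then composes the reply."""
    last = messages[-1]
    if last["role"] == "user":
        # Step 2: the model returns a structured tool call.
        return ToolCall("spotify_play", {"playlist": "morning"})
    # Step 5: a tool result is in context, so produce the final answer.
    return f"Now playing your {last['content']['playlist']} playlist."

def handle_message(user_text):
    messages = [{"role": "user", "content": user_text}]    # step 1
    result = fake_llm(messages, TOOLS)
    while isinstance(result, ToolCall):
        output = TOOLS[result.name](result.args)           # step 3
        messages.append({"role": "tool", "content": output})  # step 4
        result = fake_llm(messages, TOOLS)                 # step 5
    return result

print(handle_message("Play my morning playlist on Spotify"))
# → Now playing your morning playlist.
```

The loop matters: a single message may trigger several tool calls in a row, and the model only composes its natural-language reply once it stops returning tool calls.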

From your perspective, you see only the final response. Tool calls happen automatically and transparently.

If a module is disabled, its tools are hidden from the LLM and will not be called.


Attaching files

You can attach files to your messages by clicking the paperclip icon in the input bar. Gumm will include the file content in the context sent to the LLM.

Supported content: text files, code files, and documents. Support for binary files (images, etc.) depends on the capabilities of the model you are using.
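Conceptually, attaching a text file means its contents are folded into the message sent to the LLM. A rough sketch, assuming a simple message format — build_context and the delimiter style are illustrative, not Gumm's real schema:

```python
def build_context(user_text, attachments):
    """Fold attached text files into a single user message.

    attachments: list of (filename, file_contents) pairs.
    """
    parts = [user_text]
    for name, content in attachments:
        # Delimit each file so the model can tell it apart from the prompt.
        parts.append(f"\n--- Attached file: {name} ---\n{content}")
    return {"role": "user", "content": "".join(parts)}

msg = build_context("Summarize this", [("notes.txt", "Buy milk")])
print(msg["content"])
```

The practical consequence: large attachments consume context-window space, so very long files may be truncated or may crowd out conversation history.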


Memory in conversations

Gumm automatically extracts facts from your conversations and stores them in long-term memory. Over time, it learns things like your preferences, your name, your projects, and your habits — and surfaces this context in future conversations.

You can view, edit, and delete memories from the Brain page (/brain).

→ See Brain & Memory for full details.


Rate limits

To protect performance, the chat endpoint is rate-limited:

  • 30 messages per minute sustained
  • 5 messages in 5 seconds burst limit

If you hit a limit, wait a moment and try again. Limits apply per session.
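The two limits above can be modeled as sliding windows. A client-side sketch under that assumption — the server's actual enforcement strategy may differ, and RateLimiter is a hypothetical name:

```python
import time
from collections import deque

# (max messages, window in seconds): 30/min sustained, 5/5s burst.
LIMITS = [(30, 60.0), (5, 5.0)]

class RateLimiter:
    def __init__(self, limits=LIMITS):
        self.limits = limits
        self.sent = deque()  # timestamps of recent messages

    def allow(self, now=None):
        """Return True and record the send if every window has room."""
        now = time.monotonic() if now is None else now
        # Forget timestamps older than the largest window.
        horizon = max(window for _, window in self.limits)
        while self.sent and now - self.sent[0] > horizon:
            self.sent.popleft()
        for max_msgs, window in self.limits:
            recent = sum(1 for t in self.sent if now - t <= window)
            if recent >= max_msgs:
                return False  # this window is full; try again later
        self.sent.append(now)
        return True
```

With these numbers, the sixth message inside a 5-second burst is rejected even though the per-minute budget still has room — waiting out the short window is enough to continue.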