Chat Table

Basic Usage

  1. Every Chat Table needs an LLM Agent. Let's create our agent: go to Chat Table >> Agents >> New Agent.

  2. Let's add and configure our LLM Agent.

Configuration Parameters:

  • Agent ID: Name of your agent.

  • Models: The LLM model to use for your agent.

  • Temperature: Controls the randomness of the output; higher values produce more varied responses.

  • Max tokens: The maximum number of tokens the agent can generate in a single response.

  • Top-p: Nucleus sampling threshold; the model samples only from tokens within the top-p cumulative probability mass.

  • Customize system prompt (optional): Define the behaviour of your LLM Agent.

  • User message (optional): Set an initial message from the user to your LLM Agent.

  • AI response (optional): Conversational opener by your LLM Agent.
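
For orientation, here is a minimal sketch of how these parameters might be grouped into a single configuration object. The field names and values are illustrative assumptions, not the actual Chat Table schema.

```python
# Illustrative sketch only: field names and values are assumptions,
# not the actual Chat Table configuration schema.
agent_config = {
    "agent_id": "customer-support-agent",   # Agent ID: name of your agent
    "model": "openai/gpt-4o-mini",          # Models: the LLM model to use
    "temperature": 0.7,                     # higher values give more varied replies
    "max_tokens": 1024,                     # upper bound on the length of each reply
    "top_p": 0.9,                           # nucleus sampling cutoff
    "system_prompt": "You are a helpful support agent.",        # optional
    "user_message": "Hi, I have a question about my order.",    # optional first user message
    "ai_response": "Hello! How can I help you today?",          # optional conversational opener
}
```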

  3. Select the LLM Agent that you have created and create a new conversation.

  4. You will be brought to the Chat Table interface.

  5. Toggle between Conversation Mode and Table Mode using the toggle bar at the top right of the interface.

  6. Start chatting with your LLM Agent.
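
If you prefer to drive the same flow from a script instead of the interface, the sketch below shows the general shape of such a call using requests. The base URL, endpoint path, header, and payload field are assumptions for illustration; refer to the API reference for the actual interface.

```python
import os
import requests

# All names below are assumptions for illustration; check the API reference.
BASE_URL = "https://api.example.com"                      # assumed base URL
HEADERS = {"Authorization": f"Bearer {os.environ.get('API_KEY', '')}"}

# Send one chat message to an assumed conversation endpoint of the agent.
response = requests.post(
    f"{BASE_URL}/chat-tables/customer-support-agent/conversations",  # assumed path
    json={"message": "What is your refund policy?"},                 # assumed field name
    headers=HEADERS,
    timeout=30,
)
response.raise_for_status()
print(response.json())
```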

Advanced Usage: Chat with your Knowledge Table

  1. After creating your LLM Agent, click on it. You will be brought to a Table View.

  2. Update the LLM Agent configuration to use RAG.

  3. There are a few settings to configure:

  • k: The maximum number of Knowledge Rows that can be fetched as references during RAG.

  • Reranking Model (optional): Reranks the retrieved Knowledge Rows before they are passed to the LLM Agent.

  • Knowledge Table: The table to search for Knowledge Rows as references.

  4. Start chatting.
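
As a rough sketch, enabling RAG programmatically would amount to adding these three settings to the agent configuration. The field names and example values below are assumptions, not the documented schema.

```python
# Illustrative sketch only: field names and example values are assumptions.
rag_settings = {
    "rag_enabled": True,
    "k": 5,                                           # max Knowledge Rows fetched as references
    "reranking_model": "cohere/rerank-english-v3.0",  # optional: rerank rows before the LLM sees them
    "knowledge_table": "product-docs",                # Knowledge Table searched for reference rows
}
# These settings would be merged into the agent configuration shown earlier.
```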
