Chat Table
Every Chat Table needs an LLM Agent. To create one, go to Chat Table >> Agents >> New Agent.
Let's add and configure our LLM Agent.
Configuration Parameters:
Agent ID: Name of your agent.
Models: The LLM model that powers your agent.
Temperature: Controls the randomness of the output; lower values make responses more deterministic, higher values make them more varied.
Max tokens: The maximum number of tokens the model may generate in a response.
Top-p: Nucleus sampling; the model samples only from the smallest set of tokens whose cumulative probability reaches p.
Customize system prompt (optional): Define the behaviour of your LLM Agent.
User message (optional): Set a first message to your LLM Agent.
AI response (optional): A conversational opener from your LLM Agent.
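Taken together, the parameters above can be sketched as a configuration payload. This is an illustrative sketch only: the field names, value ranges, and example values below are assumptions for clarity, not the product's actual API schema.

```python
# Illustrative sketch: field names and ranges are assumptions,
# not the product's actual API schema.

def validate_agent_config(cfg: dict) -> dict:
    """Sanity-check the generation parameters described above."""
    if not cfg.get("agent_id"):
        raise ValueError("Agent ID is required")
    if not 0.0 <= cfg["temperature"] <= 2.0:  # higher = more random output
        raise ValueError("temperature is typically between 0 and 2")
    if not 0.0 < cfg["top_p"] <= 1.0:  # nucleus-sampling cutoff
        raise ValueError("top_p must be in (0, 1]")
    if cfg["max_tokens"] < 1:  # hard cap on response length
        raise ValueError("max_tokens must be positive")
    return cfg

agent = validate_agent_config({
    "agent_id": "support-bot",   # Agent ID (hypothetical value)
    "model": "gpt-4o-mini",      # LLM model (hypothetical value)
    "temperature": 0.7,
    "max_tokens": 512,
    "top_p": 0.9,
    "system_prompt": "You are a helpful support agent.",  # optional
})
print(agent["agent_id"])  # prints "support-bot"
```

In practice you would set these same values through the New Agent form in the UI; the validation above just makes the typical ranges explicit.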
Select the LLM Agent that you have created and create a new conversation.
You will be brought to the Chat Table interface.
Toggle between Conversation Mode and Table Mode using the toggle bar on the top right of the interface.
Start chatting with your LLM Agent.
After creating your LLM Agent, click on it. You will be brought to a Table View.
Update the LLM Agent configuration to use RAG.
There are a few settings to configure:
k: The maximum number of Knowledge Rows that can be fetched as references during RAG.
Reranking Model (optional): Reranks the retrieved Knowledge Rows before they are passed to the LLM Agent.
Knowledge Table: The table to search for Knowledge Rows as references.
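As a sketch of what these settings control, the flow is: fetch up to k Knowledge Rows from the Knowledge Table, optionally rerank them, then pass them to the LLM Agent as references. The scores, row contents, and helper functions below are made-up illustrations, not the product's implementation.

```python
# Illustrative sketch of the RAG settings above. Scores and rows are
# made-up; this is not the product's retrieval code.

def retrieve(rows: list[dict], k: int) -> list[dict]:
    """Fetch up to k Knowledge Rows, ranked by (pretend) relevance score."""
    ranked = sorted(rows, key=lambda r: r["score"], reverse=True)
    return ranked[:k]

def rerank(rows: list[dict]) -> list[dict]:
    """Optional reranking step, e.g. with a dedicated reranking model."""
    return sorted(rows, key=lambda r: r["rerank_score"], reverse=True)

# A toy Knowledge Table with precomputed (fictional) scores.
knowledge_table = [
    {"text": "Refund policy: 30 days.",  "score": 0.91, "rerank_score": 0.75},
    {"text": "Shipping takes 3-5 days.", "score": 0.84, "rerank_score": 0.93},
    {"text": "Support hours: 9am-5pm.",  "score": 0.42, "rerank_score": 0.10},
]

# k=2: only the two best-matching rows become references for the agent.
references = rerank(retrieve(knowledge_table, k=2))
print(references[0]["text"])
```

Note how the reranker can promote a row that the initial retrieval ranked second; that is why a Reranking Model can improve answer quality even though k already limits what is fetched.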
Start chatting.
If you are in Table View, click on the settings icon of the AI (output) column to open its settings.
If you are in Conversation View, click on the settings icon to bring up the Model settings.