JamAI Base Docs
Chat Table


Last updated 1 year ago


Basic Usage

  1. Every chat table needs an LLM Agent, so let's create one first: Chat Table >> Agents >> New Agent.

  2. Add and configure your LLM Agent.

Configuration Parameters:

  • Agent ID: Name of your agent.

  • Models: The LLM that powers your agent.

  • Temperature: Controls randomness; lower values give more focused, deterministic replies, higher values give more varied, creative ones.

  • Max tokens: The maximum number of tokens the model can generate in a single response.

  • Top-p: Nucleus sampling; the model samples only from the smallest set of highest-probability tokens whose cumulative probability reaches p.

  • Customize system prompt (optional): Define the behaviour of your LLM Agent.

  • User message (optional): Seed the conversation with a first user message.

  • AI response (optional): A conversational opener from your LLM Agent.
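Temperature, Max tokens, and Top-p are standard LLM sampling controls rather than anything JamAI-specific. The sketch below (plain Python, not the JamAI SDK; the logits are made up for illustration) shows how temperature scaling and top-p (nucleus) filtering shape which token the model emits next:

```python
import math
import random

def sample_next_token(logits, temperature=0.7, top_p=0.9, rng=None):
    """Sample one token index from raw logits using temperature
    scaling followed by top-p (nucleus) filtering."""
    rng = rng or random.Random(0)
    # Temperature: values < 1 sharpen the distribution (more deterministic),
    # values > 1 flatten it (more varied/creative).
    scaled = [l / temperature for l in logits]
    # Softmax (shifted by the max for numerical stability).
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Top-p: keep only the smallest set of highest-probability tokens
    # whose cumulative probability reaches top_p.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    # Renormalise over the kept set and sample from it.
    mass = sum(probs[i] for i in kept)
    r = rng.random() * mass
    for i in kept:
        r -= probs[i]
        if r <= 0:
            return i
    return kept[-1]

# A near-zero temperature makes sampling effectively greedy.
logits = [2.0, 1.0, 0.1]
print(sample_next_token(logits, temperature=0.01, top_p=1.0))  # → 0, the highest-logit token
```

Max tokens does not affect this per-step choice; it simply caps how many of these sampling steps the model may take in one response.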

  3. Select the LLM Agent you created and create a new conversation.

  4. You will be brought to the Chat Table interface.

  5. Toggle between Conversation Mode and Table Mode using the toggle bar at the top right of the interface.

  6. Start chatting with your LLM Agent.

Advanced Usage: Chat with your Knowledge Table

  1. After creating your LLM Agent, click on it. You will be brought to a Table View.

  2. Update the LLM Agent configuration to use RAG.

  3. There are a few settings to configure:

  • k: The maximum number of Knowledge Rows that can be fetched as references during RAG.

  • Reranking Model (optional): Reranks the retrieved Knowledge Rows before they are passed to the LLM Agent.

  • Knowledge Table: The table to search for Knowledge Rows to use as references.

  4. Start chatting.
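The k, Reranking Model, and Knowledge Table settings above correspond to a standard retrieve-then-rerank RAG flow. The sketch below (plain Python with toy embeddings and a toy word-overlap reranker, not the actual JamAI retrieval stack) illustrates that flow:

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

def retrieve(query_vec, knowledge_rows, k=2):
    """Fetch at most k Knowledge Rows most similar to the query embedding."""
    ranked = sorted(knowledge_rows,
                    key=lambda row: cosine(query_vec, row["embedding"]),
                    reverse=True)
    return ranked[:k]

def rerank(query_terms, candidates):
    """Toy reranker: reorder retrieved rows by query-term overlap
    before they are passed to the LLM Agent as context."""
    def overlap(row):
        return len(set(row["text"].lower().split()) & query_terms)
    return sorted(candidates, key=overlap, reverse=True)

rows = [
    {"text": "chat table stores conversations", "embedding": [1.0, 0.0]},
    {"text": "knowledge table stores documents", "embedding": [0.9, 0.1]},
    {"text": "action table runs pipelines", "embedding": [0.0, 1.0]},
]
top_k = retrieve([1.0, 0.0], rows, k=2)              # the "k" setting
context = rerank({"knowledge", "documents"}, top_k)  # the "Reranking Model" step
print(context[0]["text"])  # → knowledge table stores documents
```

Raising k fetches more context at the cost of a longer prompt; the reranker compensates by pushing the most relevant rows to the front before they reach the agent.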

If you are in Table View, click on the AI (output) column and open its settings.

If you are in Conversation View, open the Model Settings.

Demo of Chat Table Usage
The toggle button is circled in orange.