
Customer Service Chatbot

Build a customer service chatbot that answers questions about your product and embed it into any website.

Step 1 - Load Data into a Knowledge Table

  1. Create a Knowledge Table:

    • Go to >> Project >> Knowledge Table.

    • Create a New Knowledge Table with your desired Table ID (table name) and pick a Text Embedding Model.

  2. Upload Documentation:

    • Open the table that you have just created.

    • Upload your files to populate the Knowledge Table. JamAI Base processes each file into Knowledge Rows that you can search later, as shown in the sketch below.
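
If you prefer to script this step, the Python SDK (see the Python SDK Documentation section) exposes equivalent calls. The sketch below assumes the `jamaibase` package; the project credentials, table ID, embedding model ID, and file path are placeholders, and exact method signatures may differ between SDK versions.

```python
# Minimal sketch, assuming the jamaibase Python SDK; verify method names
# against the Python SDK Documentation for your installed version.
from jamaibase import JamAI, protocol as p

# Placeholders: substitute your own project ID and personal access token.
jamai = JamAI(project_id="your_project_id", token="your_pat")

# Create a Knowledge Table backed by a text embedding model enabled in your project.
jamai.table.create_knowledge_table(
    p.KnowledgeTableSchemaCreate(
        id="product-docs",
        cols=[],
        embedding_model="ellm/BAAI/bge-m3",  # assumption: any embedding model listed under Supported Models
    )
)

# Upload a document; JamAI Base splits and embeds it into Knowledge Rows.
jamai.table.embed_file("docs/product-manual.pdf", "product-docs")
```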

Step 2 - Create and Configure an LLM Agent

  1. Create an LLM Agent:

    • Go to >> Chat Table >> Agents >> New Agent.

    • Add and configure your LLM Agent with the following parameters:

      • Agent ID: Name of your agent.

      • Models: The LLM model to use.

      • Temperature: Set as needed.

      • Max tokens: Set as needed.

      • Top-p: Set as needed.

      • Customize system prompt (optional): Define the behavior of your LLM Agent.

      • User message (optional): Set an opening message from the user to your LLM Agent.

      • AI response (optional): A conversational opener from your LLM Agent.

Step 3 - Configure the LLM Agent to Use RAG (Retrieval-Augmented Generation)

  1. Select the LLM Agent:

    • Click on the LLM Agent to bring up the Table View.

  2. Update Configuration:

    • In Table View, click the more options icon on the AI (output) column and open its settings.

    • Configure the following settings:

      • k: The maximum number of Knowledge Rows fetched as references during RAG.

      • Reranking Model: The model used to rerank the retrieved Knowledge Rows before they are passed to the LLM Agent.

      • Knowledge Table: The table to search for Knowledge Rows as references. (A programmatic equivalent of Steps 2 and 3 is sketched below.)
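
Steps 2 and 3 can also be done with the SDK by creating an agent Chat Table whose AI output column carries a generation config with RAG parameters. The sketch below is an assumption-labelled example: the model IDs, prompt text, and the `LLMGenConfig`/`RAGParams` field names follow the Python SDK documentation and should be checked against your SDK version.

```python
# Minimal sketch, assuming the jamaibase Python SDK; class and field names
# should be verified against the Python SDK Documentation.
from jamaibase import JamAI, protocol as p

jamai = JamAI(project_id="your_project_id", token="your_pat")  # placeholders

jamai.table.create_chat_table(
    p.ChatTableSchemaCreate(
        id="support-agent",
        cols=[
            p.ColumnSchemaCreate(id="User", dtype="str"),
            p.ColumnSchemaCreate(
                id="AI",
                dtype="str",
                gen_config=p.LLMGenConfig(
                    model="openai/gpt-4o-mini",  # assumption: any chat model enabled in your project
                    system_prompt="You are a friendly customer service agent for our product.",
                    temperature=0.1,
                    max_tokens=512,
                    top_p=1.0,
                    rag_params=p.RAGParams(
                        table_id="product-docs",  # the Knowledge Table from Step 1
                        k=5,                      # max Knowledge Rows fetched per query
                        reranking_model="cohere/rerank-multilingual-v3.0",  # assumption: any reranker available in your project
                    ),
                ),
            ),
        ],
    )
)
```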

Step 4 - Create a New Conversation

  1. Start a New Conversation:

    • Select the LLM Agent that you have created and create a new conversation.

    • You will be brought to the Chat Table interface.

  2. Toggle Between Modes:

    • Toggle between Conversation Mode and Table Mode using the toggle bar on the top right of the interface.

  3. Start Chatting:

    • Begin interacting with your LLM Agent.

Step 5 - Deploy the Chatbot

  1. Run the Chatbot:

    • You can run the chatbot within the Chat Table interface.

  2. Embed the Chatbot:

    • To embed the chatbot into your website, generate an API call and integrate the LLM into your site; a backend sketch follows.
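
As an illustration of that API call, the sketch below sends a user message to the agent (or to a conversation table created from it) and streams the AI reply, which a website backend could forward to the browser. Table and column IDs are placeholders, and the streaming chunk attributes are assumptions to confirm against the OpenAPI or Python SDK reference.

```python
# Minimal sketch, assuming the jamaibase Python SDK; the chunk attributes
# (output_column_name, text) follow the SDK docs and may differ by version.
from jamaibase import JamAI, protocol as p

jamai = JamAI(project_id="your_project_id", token="your_pat")  # placeholders

completion = jamai.table.add_table_rows(
    "chat",
    p.RowAddRequest(
        table_id="support-agent",  # or a conversation table created from the agent
        data=[{"User": "How do I reset my password?"}],
        stream=True,
    ),
)

# Stream the AI reply chunk by chunk, e.g. to forward to your website frontend.
for chunk in completion:
    if getattr(chunk, "output_column_name", None) == "AI":
        print(chunk.text, end="", flush=True)
```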

Advanced Usage: Enrich Knowledge Rows

  1. Add Output Column:

    • After creating a new Knowledge Table, add a new Output Column (>> Action >> Add output column).

  2. Setup Knowledge LLM Agent:

    • Configure an LLM Agent that further processes the text content to enrich each Knowledge Row, using the following template (a programmatic sketch follows this list):

      • Column ID: The title of the column.

      • Data Type: str.

      • Models: The LLM models.

      • Temperature: 0.1.

      • Max Tokens: 512.

      • Top-p: 1.0.

      • Customize prompt: Prompt to process the Input columns.

  3. Start Using the Knowledge Table:

    • See the LLM Agent magically process your Knowledge Rows when you upload files to the Knowledge Table.
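
A programmatic version of this template might look like the sketch below, which adds an enrichment column to the Knowledge Table. The `${Text}` column reference, method name, and schema class are assumptions based on the Python SDK documentation; adjust them to your setup.

```python
# Minimal sketch, assuming the jamaibase Python SDK; verify the method and
# schema names against the Python SDK Documentation before use.
from jamaibase import JamAI, protocol as p

jamai = JamAI(project_id="your_project_id", token="your_pat")  # placeholders

jamai.table.add_knowledge_columns(
    p.AddKnowledgeColumnSchema(
        id="product-docs",
        cols=[
            p.ColumnSchemaCreate(
                id="Summary",   # Column ID
                dtype="str",    # Data Type
                gen_config=p.LLMGenConfig(
                    model="openai/gpt-4o-mini",  # Models (assumption: any model enabled in your project)
                    temperature=0.1,
                    max_tokens=512,
                    top_p=1.0,
                    # Customize prompt over the input columns; ${Text} is assumed to
                    # reference the Knowledge Table's text column.
                    prompt="Summarise the following passage:\n\n${Text}",
                ),
            ),
        ],
    )
)
```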


By following these steps, you can create a customer service chatbot using JamAI Base that leverages your documentation to answer user questions effectively.
