Chatbot Creation on Custom Data

Updated 3 August 2023

In this blog, we will talk about AI Chatbots: how they work, what they need, and the key parts that bring them to life.

Before diving in, let’s look at how a Chatbot can help you grow your e-commerce business.

Bagisto has recently added a new product, the AI Chatbot. From simple customer service assistants to cutting-edge AI-powered conversational agents, chatbots have become an integral part of our digital landscape, transforming the way we interact with technology and businesses.

Here are a few use cases of how an AI Chatbot can help grow your business and customer satisfaction:

  1. A 24/7 assistant for your e-commerce store: An AI Chatbot can serve as a round-the-clock assistant on your e-commerce platform, helping users get answers to their queries within seconds.
  2. Policy-related queries: An AI Chatbot can answer users’ questions about your e-commerce policies, such as your return policy, within seconds.
  3. Product-related queries: Users don’t need to browse through hundreds of products; the AI Chatbot can return the exact products along with their URLs.
  4. Discount-related queries: Users love to know whether discounts are available, and the chatbot can tell them instantly.
  5. Customer engagement: Fast responses from the AI Chatbot will increase user engagement on your e-commerce platform.
  6. Saves customers time: Faster answers to their queries and ease in finding the desired products save users a lot of time.

So let’s start our journey to understand the world of Chatbots:

Let’s start by understanding what a Chatbot is

An AI chatbot is essentially a pre-trained Large Language Model (LLM) that is further trained on custom data to handle user queries.

Understanding Large Language Models

A large language model is a model that can understand and generate human language. It is a neural network trained on millions of text documents for hundreds of hours on a large number of GPUs.

Why use a pre-trained large language model?

As we discussed above, training an LLM is costly, computationally expensive, and time-consuming, and it requires a highly skilled team of Data Engineers and Data Scientists.

Once an LLM is trained, however, it can be reused very easily, so it is far more practical to use a pre-trained LLM than to train one from scratch.

To use a pre-trained LLM such as OpenAI’s models, we need to provide an OpenAI API key. Bagisto provides the option to enter this API key and use the Chatbot.
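As a rough illustration (not Bagisto’s internal code), this is how a pre-trained OpenAI model can be called through LangChain’s Python packages once an API key is available; the package name, model name, and prompt below are assumptions for the sketch, not part of the product:

```python
import os

from langchain_openai import ChatOpenAI  # assumed package; install with `pip install langchain-openai`

# Placeholder key; in Bagisto you simply paste your real key into the admin panel.
os.environ["OPENAI_API_KEY"] = "sk-..."

# Ask the pre-trained model a question directly, without any custom data yet.
llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
response = llm.invoke("Explain in one sentence what a return policy is.")
print(response.content)
```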

Structure of a Chatbot

Custom Knowledge Base 

The custom knowledge base is the text data from which you want the Chatbot to answer queries. Suppose you want to deploy a chatbot on your e-commerce website; you would want to train the chatbot to answer users’ queries about your return policy or the products/services you offer.

In that case, your knowledge base would be the data containing your return policy and the list of products/services you offer.
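As a small illustrative sketch (the file names are hypothetical, not part of Bagisto), such a knowledge base could be loaded with LangChain’s TextLoader:

```python
from langchain_community.document_loaders import TextLoader  # assumed package layout

# Hypothetical files containing the return policy and the product/service list.
knowledge_files = ["return_policy.txt", "product_catalog.txt"]

documents = []
for path in knowledge_files:
    documents.extend(TextLoader(path).load())

print(f"Loaded {len(documents)} documents into the knowledge base")
```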


Training the Chatbot on Custom Data

Now this is the tricky part. The knowledge base can be huge, e.g., it can consist of 50,000 words, but pre-trained LLMs like OpenAI’s have token limits. A token limit is roughly the number of words (more precisely, word pieces called tokens) that an LLM can handle at once.

The latest OpenAI LLM, for example, has a token limit of 16,000, so how are we going to train the chatbot to answer queries from 50,000 words of data? This is where LangChain comes in.

LangChain

LangChain is a framework for developing applications powered by language models. It is written in Python and TypeScript and can be used in a variety of environments, including Node.js, Cloudflare Workers, and the browser.


Solving the token limit issue with the help of LangChain

LangChain splits the data into smaller chunks using CharacterTextSplitter and stores all the chunks in the form of embeddings (vectors). Whenever the user asks a query, it first checks which chunk of text is most related to the user’s query; it then sends that chunk along with the query to the LLM to generate the answer.
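A minimal sketch of that chunk-and-embed step, reusing the documents loaded earlier (the chunk size, overlap, and choice of Chroma are assumptions for illustration, not Bagisto’s exact configuration):

```python
from langchain.text_splitter import CharacterTextSplitter
from langchain_community.vectorstores import Chroma
from langchain_openai import OpenAIEmbeddings

# Split the knowledge base into chunks small enough to fit within the LLM's token limit.
splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
chunks = splitter.split_documents(documents)

# Convert each chunk into an embedding vector and store it in a vector database.
vector_store = Chroma.from_documents(chunks, OpenAIEmbeddings())
```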


How does LangChain find the text chunk most related to the user’s query?

Computers only understand numbers, so each piece of text is stored in the form of a long vector, also called an embedding. To find the most related chunk of text, LangChain uses an algorithm, which is simply a mathematical formula that measures the similarity between two vectors. This algorithm is called Cosine Similarity.
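To make the idea concrete, here is the cosine similarity formula written out in plain NumPy; real embeddings have hundreds or thousands of dimensions, so the three-dimensional vectors below are only for illustration:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # cos(theta) = (a . b) / (|a| * |b|); values close to 1 mean the vectors point in nearly the same direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(np.array([1.0, 2.0, 3.0]), np.array([1.0, 2.0, 2.5])))  # ≈ 0.996
```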

So, it converts all the text chunks into vectors and stores them in a vector database. When a query is asked, it converts the query into a vector as well. After that, the vector store applies cosine similarity between the query vector and each text chunk vector and returns the most similar text chunk.
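With LangChain this lookup is a single call on the vector store built above; similarity_search returns the stored chunks whose embeddings are closest to the query’s embedding (the query text is just an example):

```python
query = "How many days do I have to return an item?"

# Returns the k text chunks whose embeddings are most similar to the query embedding.
matches = vector_store.similarity_search(query, k=2)
for doc in matches:
    print(doc.page_content[:200])
```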


Understanding Vector Stores

Vector stores retrieve the most similar text from the vector database. All the back-end vector database operations, such as storing embeddings and searching them, are performed by the vector store. LangChain provides a variety of vector stores.
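Putting the pieces together, one possible way to combine the retrieved chunk and the LLM into an answering pipeline is LangChain’s RetrievalQA chain; this is a generic sketch using the llm and vector_store objects from the earlier snippets, and Bagisto’s own wiring may differ:

```python
from langchain.chains import RetrievalQA

# Wire the pre-trained LLM and the vector store together into a question-answering chain.
qa = RetrievalQA.from_chain_type(llm=llm, retriever=vector_store.as_retriever())

answer = qa.invoke({"query": "What is your return policy?"})
print(answer["result"])
```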

Now, finally, let’s run the Chatbot

Here are a few key points for using Bagisto’s AI Chatbot

In Bagisto’s AI Chatbot, we have provided the option to choose the vector store that works best for you: you can select either the Chroma or the Pinecone vector store.

After choosing a vector store, you have to provide its API key in order to use it.

With these simple steps, harness the power of AI effortlessly and unlock a seamless conversational experience with Bagisto’s AI Chatbot. Leave the technical intricacies to us; now you can focus on engaging your customers and enhancing your online store like never before!
