
Build a chatbot with Google’s PaLM API


In a previous article, I introduced Google’s PaLM 2 foundation model by building a simple application to generate a blog post. In this article, we will explore how to build a chatbot with the PaLM 2 API available on Google Cloud Vertex AI.

For a detailed guide on setting up the environment and configuring the SDK, please refer to that tutorial; this guide assumes you have completed it.
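If you need a quick refresher, the SDK is initialized with your Google Cloud project and region before any model calls are made. The snippet below is only a minimal sketch; the project ID and region are placeholders to replace with your own values.

import vertexai

# Placeholder project ID and region -- replace with your own values
vertexai.init(project="your-project-id", location="us-central1")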

The vertexai.preview.language_models module has multiple classes, including ChatModel, TextEmbeddingModel, and TextGenerationModel. In this article, we will focus on the ChatModel class, which we will use to make the PaLM model act as a physics teacher that responds to our questions.

As a first step, let’s import the appropriate classes from the library.

from vertexai.preview.language_models import ChatModel, InputOutputTextPair

The ChatModel class is responsible for accepting a prompt and returning the answer. The InputOutputTextPair class makes it easy to create question-and-answer pairs that serve as examples for the chatbot.

We then initialize the model object from the pre-trained model chat-bison@001, which is optimized for chat conversations.

model = ChatModel.from_pretrained("chat-bison@001") 

The next step is to define a function that accepts the prompt as an input and returns the response generated by the model as output.

def get_completion(msg):
    # Context steers the model toward the role of a physics teacher
    ctx = "My name is Jani. You are a physics teacher, knowledgeable about the gravitational theory"

    # A sample question-and-answer pair showing the model how to respond
    exp = [
        InputOutputTextPair(
            input_text="How do you define gravity?",
            output_text="Gravity is a force that attracts a body towards the centre of the earth or any other physical body having mass."
        ),
    ]

    # Start a chat session with the context and examples, then send the message
    chat = model.start_chat(context=ctx, examples=exp)

    response = chat.send_message(msg, max_output_tokens=256, temperature=0.2)

    return response

This method captures the essence of building a chatbot. It contains three elements essential to generating a meaningful and relevant response:

  • Context: This customizes the behavior of the chat model by instructing it on the key topic or theme of the conversation. Though optional, context plays an important role in generating accurate responses.
  • Examples: A list of input-output pairs that demonstrate exemplary model output for a given input. You can use examples to change the way the model responds to specific questions.
  • Messages: A chat prompt’s messages are a list of author-content pairs. The model responds to the most recent message, which is the last pair in the list; the pairs preceding it make up the chat session history, as the sketch after this list illustrates.
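To make the role of the message history concrete, here is a minimal sketch that sends two messages in a single chat session; the second question only makes sense because the first exchange is retained as history. It assumes the model object initialized earlier and uses illustrative prompts of my own.

chat = model.start_chat(context="You are a physics teacher, knowledgeable about the gravitational theory")

# The first exchange becomes part of the session history
first = chat.send_message("What is gravity?", max_output_tokens=256, temperature=0.2)

# The model answers the latest message, using the earlier pair as history
follow_up = chat.send_message("How does it differ on the Moon?", max_output_tokens=256, temperature=0.2)
print(follow_up.text)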

Notice how we defined the context and examples. Since we are building a physics chatbot specializing in gravitational theory, both context and examples have references to the topic.

The PaLM model measures the size of prompts and responses in tokens. A token is approximately four characters; 100 tokens correspond to roughly 60 to 80 words. For this conversation, we set the max_output_tokens parameter to 256. You can increase this depending on your use case. Note that the token limit determines how many messages are retained in the chat session history.
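As a rough illustration of that four-characters-per-token approximation (not the actual tokenizer used by the service), a hypothetical helper could estimate prompt size like this:

def estimate_tokens(text):
    # Rough heuristic only: roughly four characters per token
    return max(1, len(text) // 4)

print(estimate_tokens("What is the relationship between gravity and weight?"))  # roughly 13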

The model’s creativity is controlled by the next parameter, temperature, which determines how random the token selection is. Lower temperatures are preferable for prompts that require a specific, less creative response, whereas higher temperatures can yield more diverse and creative responses. The value can range from 0 to 1. We set it to 0.2 because we want accuracy.
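If you want to see the effect yourself, a quick way is to ask the same question at two temperatures and compare the wording. The sketch below is illustrative only and reuses the model object from above; each question runs in its own chat session so the comparison is not influenced by history.

# Low temperature: focused, repeatable phrasing
precise = model.start_chat(context="You are a physics teacher.").send_message(
    "Define escape velocity.", max_output_tokens=256, temperature=0.2)

# Higher temperature: more varied, creative wording
creative = model.start_chat(context="You are a physics teacher.").send_message(
    "Define escape velocity.", max_output_tokens=256, temperature=0.8)

print(precise.text)
print(creative.text)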

With the method in place, let’s construct the prompt.

prompt="What is the relationship between gravity and weight?"

Invoke the method by passing the prompt.
response = get_completion(prompt)
print(response.text)

Let’s ask another question related to gravity.

prompt="What is gravity according to Einstein?"

response = get_completion(prompt)
print(response.text)

Below is the complete code for your reference.

from vertexai.preview.language_models import ChatModel, InputOutputTextPair

model = ChatModel.from_pretrained("chat-bison@001")

def get_completion(msg):
    ctx = "My name is Jani. You are a physics teacher, knowledgeable about the gravitational theory"

    exp = [
        InputOutputTextPair(
            input_text="How do you define gravity?",
            output_text="Gravity is a force that attracts a body towards the centre of the earth or any other physical body having mass."
        ),
    ]

    chat = model.start_chat(context=ctx, examples=exp)

    response = chat.send_message(msg, max_output_tokens=256, temperature=0.2)

    return response

prompt = "What is the relationship between gravity and weight?"
response = get_completion(prompt)
print(response.text)

prompt = "What is gravity according to Einstein?"
response = get_completion(prompt)
print(response.text)

We built a chatbot based on the PaLM 2 large language model in just a few lines of code. In the next article, we will explore the model’s word embedding capabilities. Stay tuned.

Copyright © 2023 IDG Communications, Inc.


