How to Build Your Own Panel AI Chatbots
Powered by OpenAI and LangChain
By: Andrew Huang and Sophia Yang
The open-source project Panel, with its latest version 1.3, has just introduced an exciting and highly anticipated new feature: the Chat Interface widget. This new capability has opened up a world of possibilities, making the creation of AI chatbots more accessible and user-friendly than ever before.
In this post, you’ll learn how to use the ChatInterface widget to build:
- A basic chatbot
- An OpenAI ChatGPT-powered AI chatbot
- A LangChain-powered AI chatbot
Before we get started, you will need to install panel (any version greater than or equal to 1.3.0) and any other packages you might need, such as jupyterlab, openai, and langchain.
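For example, pip install "panel>=1.3.0" jupyterlab openai langchain installs everything in one go (the equivalent conda install works just as well).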
Now you are ready to go!
Use the ChatInterface widget
The brand-new ChatInterface is a high-level widget that provides a user-friendly chat interface for sending messages, with four built-in operations:
- Send: Send messages to the chat log
- Rerun: Resend the most recent user message
- Undo: Remove the most recent messages
- Clear: Clear all chat messages
Curious to know more about how ChatInterface works under the hood? It’s a high-level widget that wraps the middle-level ChatFeed widget, which manages a list of ChatMessage items for displaying chat messages. Check out the docs on ChatInterface, ChatFeed, and ChatMessage to learn more.
1. Build a basic chatbot
With pn.chat.ChatInterface, we can send messages to the chat interface, but how should the system reply? We can define a callback function! In this example, our callback function simply echoes back the user’s message. Even this simple echo already gives us a working chatbot.
"""
Demonstrates how to use the ChatInterface widget to echo back a message.
"""
import panel as pn
pn.extension()
def callback(contents: str, user: str, instance: pn.chat.ChatInterface):
message = f"Echoing {user}: {contents}"
return message
chat_interface = pn.chat.ChatInterface(callback=callback, callback_user="System")
chat_interface.send("Send a message to receive an echo!", user="System", respond=False)
chat_interface.servable()
To serve the app, run panel serve app.py or panel serve app.ipynb.
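While developing, you may also find panel serve app.py --autoreload handy: it reloads the app automatically whenever you save the file. Once the app is running, type anything into the input box and the System user will echo it back, e.g. “Echoing User: hello” if you keep the default user name.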
2. Build a ChatGPT-powered AI chatbot
How can we use OpenAI ChatGPT to reply to messages? We can simply call the OpenAI API in the callback function.
Please make sure to install openai in your environment and add your OpenAI API key to the script. Note that in this example we made the callback function async: this allows collaborative multitasking within a single thread, so IO-bound work like waiting for the OpenAI API happens in the background and the app stays responsive. Because the callback yields the accumulated message as each chunk arrives, ChatInterface updates the reply in place, giving a streaming, ChatGPT-style effect.
"""
Demonstrates how to use the ChatInterface widget to create a chatbot using
OpenAI's GPT-3 API.
"""
import openai
import panel as pn
pn.extension()
openai.api_key = "Add your key here"
async def callback(contents: str, user: str, instance: pn.chat.ChatInterface):
response = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
messages=[{"role": "user", "content": contents}],
stream=True,
)
message = ""
for chunk in response:
message += chunk["choices"][0]["delta"].get("content", "")
yield message
chat_interface = pn.chat.ChatInterface(callback=callback, callback_user="ChatGPT")
chat_interface.send(
"Send a message to get a reply from ChatGPT!", user="System", respond=False
)
chat_interface.servable()
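As a side note, instead of pasting your key directly into the script, you can read it from an environment variable. A minimal sketch, assuming you export OPENAI_API_KEY before running panel serve:

import os

import openai

# Read the API key from the environment instead of hardcoding it
openai.api_key = os.environ["OPENAI_API_KEY"]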
3. Build a LangChain-powered AI chatbot
The Panel ChatInterface also seamlessly integrates with LangChain, leveraging the full spectrum of LangChain’s capabilities.
Here is an example of how we use LangChain’s ConversationChain with ConversationBufferMemory, which stores previous messages and passes them back to the OpenAI API so that the model sees the context of the whole conversation.
Again, please make sure to install langchain in your environment and add your OpenAI API key to the script.
Note that before we dive into the LangChain code, we define a callback_handler. The LangChain interface does not stream its output through a Python generator the way our previous callback did, so we instead pass LangChain a pn.chat.langchain.PanelCallbackHandler (which inherits from `langchain.callbacks.base.BaseCallbackHandler`) wrapped around our chat interface; it forwards each streamed token to the chat interface as it arrives. For more info, check out our docs.
"""
Demonstrates how to use the ChatInterface widget to create a chatbot using
OpenAI's GPT-3 API with LangChain.
"""
import os
import panel as pn
from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
pn.extension()
os.environ["OPENAI_API_KEY"] = "Type your API key here"
async def callback(contents: str, user: str, instance: pn.chat.ChatInterface):
await chain.apredict(input=contents)
chat_interface = pn.chat.ChatInterface(callback=callback, callback_user="ChatGPT")
callback_handler = pn.chat.langchain.PanelCallbackHandler(chat_interface)
llm = ChatOpenAI(streaming=True, callbacks=[callback_handler])
memory = ConversationBufferMemory()
chain = ConversationChain(llm=llm, memory=memory)
chat_interface.send(
"Send a message to get a reply from ChatGPT!", user="System", respond=False
)
chat_interface.servable()
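Because ConversationBufferMemory keeps the full conversation history, follow-up questions like “What did I just ask you?” should be answered with the earlier turns taken into account, which the plain ChatGPT example above cannot do since it only sends the latest message.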
Conclusion
In this blog post, we’ve taken an in-depth look at the exciting new ChatInterface widget in Panel. We started by guiding you through building a basic chatbot using pn.chat.ChatInterface. We elevated your chatbot’s capabilities from there by seamlessly integrating OpenAI ChatGPT. To further enhance your understanding, we also explored the integration of LangChain with Panel’s ChatInterface. If you’re eager to explore more chatbot examples, don’t hesitate to visit this GitHub repository and consider contributing your own.
Armed with these insights and hands-on examples, you’re now well-prepared to embark on your journey of crafting AI chatbots in Panel. Happy coding!
. . .
Originally published on anaconda.com.
By Andrew Huang and Sophia Yang
Sophia Yang is a Senior Data Scientist. Connect with me on LinkedIn, Twitter, and YouTube and join the DS/ML Book Club ❤️