In this tutorial, you’ll learn how to build a simple memory (conversation history) feature into a chatbot using LangChain. We’ll start with a stateless prompt, observe why follow-up questions break, then enhance our chain to carry context across messages for a more natural dialogue.
First, let’s define a minimal chat chain without any history support:
```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai.chat_models import ChatOpenAI

model = ChatOpenAI()
prompt = ChatPromptTemplate.from_messages([
    ("system", "You're an assistant with expertise in {ability}."),
    ("human", "{input}"),
])
base_chain = prompt | model
```
Invoke it twice: once to ask about right-angled triangles, then with a follow-up that lacks context:
```python
# First call: establishing the topic
response1 = base_chain.invoke({
    "ability": "math",
    "input": "What's a right-angled triangle?"
})
print(response1.content)

# Second call: follow-up without history
response2 = base_chain.invoke({
    "ability": "math",
    "input": "What are the other types?"
})
print(response2.content)
# ➜ "Could you please clarify what you're asking about?"
```
Without passing previous messages, the model has no context to answer follow-up questions.
Build a simple list of (role, content) tuples that tracks the dialogue so far:
```python
history = [
    ("human", "What's a right-angled triangle?"),
    ("ai", "A right-angled triangle has one angle of 90 degrees, "
           "with the other two angles summing to 90 degrees."),
]
```
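In a real chatbot you would grow this list as the conversation proceeds, appending each exchange after the model responds. A minimal, framework-free sketch (the `add_turn` helper is hypothetical, not part of LangChain):

```python
def add_turn(history, user_input, ai_output):
    """Append one (human, ai) exchange to the history list."""
    history.append(("human", user_input))
    history.append(("ai", ai_output))
    return history

history = []
add_turn(history, "What's a right-angled triangle?",
         "A right-angled triangle has one 90-degree angle.")
print(len(history))  # 2 entries: one human turn, one ai turn
```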
To use this history, the prompt needs a MessagesPlaceholder that injects it between the system and human messages. Rebuild the chain with one, then pass the history on each request:

```python
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

prompt = ChatPromptTemplate.from_messages([
    ("system", "You're an assistant with expertise in {ability}."),
    MessagesPlaceholder(variable_name="history"),
    ("human", "{input}"),
])
base_chain = prompt | model

response = base_chain.invoke({
    "ability": "math",
    "input": "What are the other types?",
    "history": history
})
print(response.content)
# ➜ "Other types of triangles include equilateral (all sides equal), "
#   "isosceles (two sides equal), and scalene (no sides equal)."
```
By supplying the history key, the chain can reference previous turns and answer accurately.
Here’s the complete, memory-enabled chatbot example:
```python
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai.chat_models import ChatOpenAI

model = ChatOpenAI()
prompt = ChatPromptTemplate.from_messages([
    ("system", "You're an assistant who's good at {ability}."),
    MessagesPlaceholder(variable_name="history"),
    ("human", "{input}"),
])
base_chain = prompt | model

history = [
    ("human", "What's a right-angled triangle?"),
    ("ai", "A right-angled triangle has one angle of 90 degrees, "
           "with the other two angles summing to 90 degrees."),
]

result = base_chain.invoke({
    "ability": "math",
    "input": "What are the other types?",
    "history": history
})
print(result.content)
```
You can also include adjustable parameters—like response length limits—directly in your system prompt:
```python
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai.chat_models import ChatOpenAI

model = ChatOpenAI()
prompt = ChatPromptTemplate.from_messages([
    ("system", "You're an assistant who's good at {ability}. "
               "Respond in 20 words or fewer."),
    MessagesPlaceholder(variable_name="history"),
    ("human", "{input}"),
])
base_chain = prompt | model

history = [
    ("human", "What's a right-angled triangle?"),
    ("ai", "A right-angled triangle has one angle of 90 degrees, "
           "with the other two angles summing to 90 degrees."),
]

response = base_chain.invoke({
    "ability": "math",
    "input": "What are the other types?",
    "history": history
})
print(response.content)
```
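One caveat: because every stored turn is sent back to the model on each request, the prompt grows with the conversation and can eventually exceed the model's context window (and increase token costs). A simple mitigation is to keep only the most recent exchanges. A minimal sketch (the `trim_history` helper is hypothetical, not a LangChain API):

```python
def trim_history(history, max_turns=5):
    """Keep only the last `max_turns` (human, ai) exchanges.

    Each exchange is two messages, so we keep the
    final 2 * max_turns entries of the list.
    """
    return history[-2 * max_turns:]

# Example: 10 exchanges (20 messages), trimmed to the last 5 exchanges
long_history = [("human", f"q{i}") if i % 2 == 0 else ("ai", f"a{i}")
                for i in range(20)]
recent = trim_history(long_history, max_turns=5)
print(len(recent))  # 10 messages
```

Trim the list just before each `invoke` call; more sophisticated strategies, such as summarizing older turns, build on the same idea.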