Conversation Summary Memory
Conversation summary memory summarizes the conversation as it happens and stores that summary in memory. The summary can then be injected into a prompt/chain to represent the conversation so far. This memory type is most useful for longer conversations, where keeping the full message history in the prompt verbatim would consume too many tokens.
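To see what the memory itself produces, independent of any chain, the minimal sketch below saves a single exchange with saveContext and reads the running summary back with loadMemoryVariables. The inspectSummary wrapper and the hard-coded AI reply are purely illustrative; everything else uses the same classes and import paths as the examples that follow.

import { OpenAI } from "langchain/llms/openai";
import { ConversationSummaryMemory } from "langchain/memory";

export const inspectSummary = async () => {
  const memory = new ConversationSummaryMemory({
    memoryKey: "chat_history",
    llm: new OpenAI({ modelName: "gpt-3.5-turbo", temperature: 0 }),
  });

  // Record one human/AI exchange; the memory's LLM folds it into the summary.
  await memory.saveContext(
    { input: "Hi! I'm Jim." },
    { output: "Hello Jim, nice to meet you." }
  );

  // Returns { chat_history: "<summary so far>" }, which is what a chain
  // would interpolate into its prompt.
  console.log(await memory.loadMemoryVariables({}));
};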
Usage, with an LLM
import { OpenAI } from "langchain/llms/openai";
import { ConversationSummaryMemory } from "langchain/memory";
import { LLMChain } from "langchain/chains";
import { PromptTemplate } from "langchain/prompts";

export const run = async () => {
  // The memory keeps a running summary of the conversation, produced by its own LLM.
  const memory = new ConversationSummaryMemory({
    memoryKey: "chat_history",
    llm: new OpenAI({ modelName: "gpt-3.5-turbo", temperature: 0 }),
  });

  // The conversational model is configured separately from the summarizing LLM above.
  const model = new OpenAI({ temperature: 0.9 });
  const prompt =
    PromptTemplate.fromTemplate(`The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.
Current conversation:
{chat_history}
Human: {input}
AI:`);
  const chain = new LLMChain({ llm: model, prompt, memory });

  const res1 = await chain.call({ input: "Hi! I'm Jim." });
  console.log({ res1, memory: await memory.loadMemoryVariables({}) });
  /*
    {
      res1: {
        text: " Hi Jim, I'm AI! It's nice to meet you. I'm an AI programmed to provide information about the environment around me. Do you have any specific questions about the area that I can answer for you?"
      },
      memory: {
        chat_history: 'Jim introduces himself to the AI and the AI responds, introducing itself as a program designed to provide information about the environment. The AI offers to answer any specific questions Jim may have about the area.'
      }
    }
  */

  const res2 = await chain.call({ input: "What's my name?" });
  console.log({ res2, memory: await memory.loadMemoryVariables({}) });
  /*
    {
      res2: { text: ' You told me your name is Jim.' },
      memory: {
        chat_history: 'Jim introduces himself to the AI and the AI responds, introducing itself as a program designed to provide information about the environment. The AI offers to answer any specific questions Jim may have about the area. Jim asks the AI what his name is, and the AI responds that Jim had previously told it his name.'
      }
    }
  */
};
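Note that the model that maintains the summary (the llm passed to ConversationSummaryMemory, here gpt-3.5-turbo at temperature 0) is configured independently of the model that answers the user (the OpenAI instance passed to the LLMChain at temperature 0.9). This lets you keep summarization cheap and deterministic while leaving the conversational model free to be more creative.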
Usage, with a chat model
import { ChatOpenAI } from "langchain/chat_models/openai";
import { ConversationSummaryMemory } from "langchain/memory";
import { LLMChain } from "langchain/chains";
import { PromptTemplate } from "langchain/prompts";

export const run = async () => {
  // Here the summary is generated by a chat model; the rest of the setup is unchanged.
  const memory = new ConversationSummaryMemory({
    memoryKey: "chat_history",
    llm: new ChatOpenAI({ modelName: "gpt-3.5-turbo", temperature: 0 }),
  });

  const model = new ChatOpenAI();
  const prompt =
    PromptTemplate.fromTemplate(`The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.
Current conversation:
{chat_history}
Human: {input}
AI:`);
  const chain = new LLMChain({ llm: model, prompt, memory });

  const res1 = await chain.call({ input: "Hi! I'm Jim." });
  console.log({ res1, memory: await memory.loadMemoryVariables({}) });
  /*
    {
      res1: {
        text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
      },
      memory: {
        chat_history: 'Jim introduces himself to the AI and the AI greets him and offers assistance.'
      }
    }
  */

  const res2 = await chain.call({ input: "What's my name?" });
  console.log({ res2, memory: await memory.loadMemoryVariables({}) });
  /*
    {
      res2: {
        text: "Your name is Jim. It's nice to meet you, Jim. How can I assist you today?"
      },
      memory: {
        chat_history: 'Jim introduces himself to the AI and the AI greets him and offers assistance. The AI addresses Jim by name and asks how it can assist him.'
      }
    }
  */
};
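When working with chat models you may prefer to pass the summary to the model as a message rather than interpolating it into a plain-text prompt. The sketch below is an illustrative variation of the example above: it constructs the memory with returnMessages: true and fills a MessagesPlaceholder with the summary message. It assumes a langchain version that provides ChatPromptTemplate.fromMessages (older releases expose the same functionality as fromPromptMessages), and the runWithMessages wrapper name is hypothetical.

import { ChatOpenAI } from "langchain/chat_models/openai";
import { ConversationSummaryMemory } from "langchain/memory";
import { LLMChain } from "langchain/chains";
import {
  ChatPromptTemplate,
  HumanMessagePromptTemplate,
  MessagesPlaceholder,
  SystemMessagePromptTemplate,
} from "langchain/prompts";

export const runWithMessages = async () => {
  // returnMessages: true makes loadMemoryVariables return the summary as a
  // chat message, so it can fill the MessagesPlaceholder below.
  const memory = new ConversationSummaryMemory({
    memoryKey: "chat_history",
    returnMessages: true,
    llm: new ChatOpenAI({ modelName: "gpt-3.5-turbo", temperature: 0 }),
  });

  // On older langchain releases, use ChatPromptTemplate.fromPromptMessages instead.
  const chatPrompt = ChatPromptTemplate.fromMessages([
    SystemMessagePromptTemplate.fromTemplate(
      "The following is a friendly conversation between a human and an AI."
    ),
    new MessagesPlaceholder("chat_history"),
    HumanMessagePromptTemplate.fromTemplate("{input}"),
  ]);

  const chain = new LLMChain({ llm: new ChatOpenAI(), prompt: chatPrompt, memory });
  const res = await chain.call({ input: "Hi! I'm Jim." });
  console.log({ res, memory: await memory.loadMemoryVariables({}) });
};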