{"id":8164,"date":"2023-11-11T06:07:11","date_gmt":"2023-11-11T14:07:11","guid":{"rendered":"https:\/\/live-cometml.pantheonsite.io\/?p=8164"},"modified":"2025-09-23T23:02:52","modified_gmt":"2025-09-23T23:02:52","slug":"advanced-memory-in-langchain","status":"publish","type":"post","link":"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/","title":{"rendered":"Advanced Memory in LangChain"},"content":{"rendered":"\n<section class=\"section section--body\">\n<h2 class=\"section-divider\">From Entities to Knowledge Graphs<\/h2>\n<div class=\"section-content\">\n<div class=\"section-inner sectionLayout--insetColumn\">\n<figure class=\"graf graf--figure\">\n<\/figure><\/div><\/div><\/section>\n\n\n\n<figure class=\"wp-block-image alignnone graf-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1600\/0*Xv7LyJu1EnINbv6C\" alt=\"Advanced Memory in LangChain: From Entities to Knowledge Graphs\"\/><figcaption class=\"wp-element-caption\">Photo by <a href=\"https:\/\/unsplash.com\/@invictar1997?utm_source=medium&amp;utm_medium=referral\">Soragrit Wongsa<\/a> on&nbsp;<a href=\"http:\/\/Unsplash.com\">Unsplash<\/a><\/figcaption><\/figure>\n\n\n\n<p class=\"graf graf--p\">In the previous installment, we delved deep into the essence of LangChain\u2019s Memory module, unearthing its potential to foster conversation continuity. As language models evolve, so too does the demand for more sophisticated memory techniques. Thus, as we continue our journey, we set our sights on advanced memory types within LangChain, from Entity Memory to Knowledge Graphs.<\/p>\n\n\n\n<h3 class=\"wp-block-heading graf graf--p\" id=\"h-entity-memory\"><strong class=\"markup--strong markup--p-strong\">Entity Memory<\/strong><\/h3>\n\n\n\n<p class=\"graf graf--p\">At the heart of many sophisticated interactions lies the ability to remember specific entities. LangChain\u2019s Entity Memory allows models to retain and build upon the context of entities within a conversation. Whether it\u2019s remembering the details about \u201cAbi\u201d and her contribution to the LLMOps community or the nuances of a specific conversation template, this memory type ensures that the model can provide responses that are not only accurate but deeply contextual.<\/p>\n\n\n\n<h3 class=\"wp-block-heading graf graf--p\" id=\"h-knowledge-graph-memory\"><strong class=\"markup--strong markup--p-strong\">Knowledge Graph Memory<\/strong><\/h3>\n\n\n\n<p class=\"graf graf--p\">Moving a notch higher, LangChain introduces ConversationKGMemory. This memory type harnesses the power of knowledge graphs to store and recall information. By doing so, it aids the model in comprehending the relationships between different entities, enhancing its ability to respond based on the intricate web of connections and historical context.<\/p>\n\n\n\n<h3 class=\"wp-block-heading graf graf--p\" id=\"h-summary-memories\"><strong class=\"markup--strong markup--p-strong\">Summary Memories<\/strong><\/h3>\n\n\n\n<p class=\"graf graf--p\">As conversations grow and evolve, it becomes crucial to distill the essence of interactions. With ConversationSummaryMemory and ConversationSummaryBufferMemory, LangChain offers a solution to maintain a concise history. 
These memory types condense conversations into manageable summaries, ensuring that the model remains aware of the broader context without getting bogged down by excessive details.<\/p>\n\n\n\n<h3 class=\"wp-block-heading graf graf--p\" id=\"h-token-buffer\"><strong class=\"markup--strong markup--p-strong\">Token Buffer<\/strong><\/h3>\n\n\n\n<p class=\"graf graf--p\">Lastly, the ConversationTokenBufferMemory serves as a testament to LangChain\u2019s commitment to flexibility. By using token length to determine memory flush, this memory type adapts to varied conversation depths and lengths, ensuring optimal performance and relevance in responses.<\/p>\n\n\n\n<p class=\"graf graf--p\">In essence, as we navigate the maze of conversations, LangChain\u2019s advanced memory capabilities stand as beacons, guiding us to richer, more context-aware interactions. Whether you\u2019re building a chatbot for customer support, a virtual assistant for personalized tasks, or a sophisticated AI agent for simulations, understanding and leveraging these advanced memory types can be the key to unlocking unparalleled conversational depth and continuity.<\/p>\n\n\n\n<p class=\"graf graf--p\">Here\u2019s to making every conversation count, and to a future where AI remembers not just words, but the very essence of interactions.<\/p>\n\n\n\n<section class=\"section section--body\">\n<div class=\"section-divider\">\n<hr class=\"section-divider\">\n<\/div>\n<div class=\"section-content\">\n<div class=\"section-inner sectionLayout--insetColumn\">\n<blockquote class=\"graf graf--pullquote\"><p>Want to learn how to build modern software with LLMs using the newest tools and techniques in the field? 
<a class=\"markup--anchor markup--pullquote-anchor\" href=\"https:\/\/www.comet.com\/production\/site\/llm-course\/?utm_source=Heartbeat&amp;utm_medium=referral&amp;utm_content=Medium&amp;utm_campaign=Heartbeat_LangChain_Series_HS\" target=\"_blank\" rel=\"noopener\" data-href=\"https:\/\/www.comet.com\/production\/site\/llm-course\/?utm_source=Heartbeat&amp;utm_medium=referral&amp;utm_content=Medium&amp;utm_campaign=Heartbeat_LangChain_Series_HS\">Check out this free LLMOps course<\/a> from industry expert Elvis Saravia of&nbsp;DAIR.AI.<\/p><\/blockquote>\n<\/div>\n<\/div>\n<\/section>\n\n\n\n<section class=\"section section--body\">\n<div class=\"section-divider\">\n<hr class=\"section-divider\">\n<\/div>\n<div class=\"section-content\">\n<div class=\"section-inner sectionLayout--insetColumn\">\n<h3 class=\"graf graf--h3\">Entity<\/h3>\n<p class=\"graf graf--p\">Entity Memory in LangChain is a feature that allows the model to remember facts about specific entities in a conversation.<\/p>\n<p class=\"graf graf--p\">It uses an LLM to extract information on entities and builds up its knowledge about those entities over time. Entity Memory is useful for maintaining context and retaining information about entities mentioned in the conversation. 
It can help the model provide accurate and relevant responses based on the history of the conversation.<\/p>\n<p class=\"graf graf--p\">You should use Entity Memory when you want the model to know specific entities and their associated information.<\/p>\n<p class=\"graf graf--p\">It can be particularly helpful in scenarios where you want the model to remember and refer back to previous mentions of entities in the conversation.<\/p>\n<p class=\"graf graf--p\">Entity Memory enhances a model\u2019s ability to understand and respond to conversations by keeping track of important information about entities.<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"1\" data-code-block-lang=\"javascript\"><span class=\"pre--content\"><span class=\"hljs-keyword\">from<\/span> langchain.<span class=\"hljs-property\">llms<\/span> <span class=\"hljs-keyword\">import<\/span> <span class=\"hljs-title class_\">OpenAI<\/span>\n<span class=\"hljs-keyword\">from<\/span> langchain.<span class=\"hljs-property\">memory<\/span> <span class=\"hljs-keyword\">import<\/span> <span class=\"hljs-title class_\">ConversationEntityMemory<\/span><\/span><\/pre>\n<pre class=\"graf graf--pre\">from langchain.chains import ConversationChain\nfrom langchain.memory.prompt import ENTITY_MEMORY_CONVERSATION_TEMPLATE\nfrom pydantic import BaseModel\nfrom typing import List, Dict, Any<\/pre>\n<pre class=\"graf graf--pre\">ENTITY_MEMORY_CONVERSATION_TEMPLATE<\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">PromptTemplate(input_variables=['entities', 'history', 'input'], output_parser=None, partial_variables={}, template='You are an assistant to a human, powered by a large language model trained by OpenAI.\\n\\nYou are designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a 
wide range of topics. As a language model, you are able to generate human-like text based on the input you receive, allowing you to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.\\n\\nYou are constantly learning and improving, and your capabilities are constantly evolving. Also, you are able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. You have access to some personalized information provided by the human in the Context section below. Additionally, you are able to generate your own text based on the input you receive, allowing you to engage in discussions and provide explanations and descriptions on a wide range of topics.\\n\\nOverall, you are a powerful tool that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. Whether the human needs help with a specific question or just wants to have a conversation about a particular topic, you are here to assist.\\n\\nContext:\\n{entities}\\n\\nCurrent conversation:\\n{history}\\nLast line:\\nHuman: {input}\\nYou:', template_format='f-string', validate_template=True)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">ENTITY_MEMORY_CONVERSATION_TEMPLATE.input_variables<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"1\" data-code-block-lang=\"css\"><span class=\"pre--content\"><span class=\"hljs-selector-attr\">[<span class=\"hljs-string\">'entities'<\/span>, <span class=\"hljs-string\">'history'<\/span>, <span class=\"hljs-string\">'input'<\/span>]<\/span><\/span><\/pre>\n<p class=\"graf graf--p\">The following is the prompt template used for Entity Memory Conversation:<\/p>\n<p class=\"graf graf--p\"><em class=\"markup--em 
markup--p-em\">You are an assistant to a human, powered by a large language model trained by OpenAI.<\/em><\/p>\n<p class=\"graf graf--p\"><em class=\"markup--em markup--p-em\">You are designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. As a language model, you are able to generate human-like text based on the input you receive, allowing you to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.<\/em><\/p>\n<p class=\"graf graf--p\"><em class=\"markup--em markup--p-em\">You are constantly learning and improving, and your capabilities are constantly evolving. Also, you are able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. You have access to some personalized information provided by the human in the Context section below. Additionally, you are able to generate your own text based on the input you receive, allowing you to engage in discussions and provide explanations and descriptions on a wide range of topics.<\/em><\/p>\n<p class=\"graf graf--p\"><em class=\"markup--em markup--p-em\">Overall, you are a powerful tool that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. 
Whether the human needs help with a specific question or just wants to have a conversation about a particular topic, you are here to assist.<\/em><\/p>\n<p class=\"graf graf--p\"><em class=\"markup--em markup--p-em\">Context:<br>\n{entities}<\/em><\/p>\n<p class=\"graf graf--p\"><em class=\"markup--em markup--p-em\">Current conversation:<br>\n{history}<br>\nLast line:<br>\nHuman: {input}<br>\nYou:<\/em><\/p>\n<p class=\"graf graf--p\">Let\u2019s see this memory in action:<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">llm = OpenAI(temperature=<span class=\"hljs-number\">0<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre\">conversation = ConversationChain(\n    llm=llm,\n    verbose=True,\n    prompt=ENTITY_MEMORY_CONVERSATION_TEMPLATE,\n    memory=ConversationEntityMemory(llm=llm)\n)<\/pre>\n<pre class=\"graf graf--pre\">conversation.predict(input=\"Abi, Andy, Lucas, and Harpreet are building the LLMOps community\")<\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">&gt; Entering new ConversationChain chain...\nPrompt after formatting:\nYou are an assistant to a human, powered by a large language model trained by OpenAI.<\/span><\/pre>\n<pre class=\"graf graf--pre\">You are designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. As a language model, you are able to generate human-like text based on the input you receive, allowing you to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.<\/pre>\n<pre class=\"graf graf--pre\">You are constantly learning and improving, and your capabilities are constantly evolving. 
Also, you are able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. You have access to some personalized information provided by the human in the Context section below. Additionally, you are able to generate your own text based on the input you receive, allowing you to engage in discussions and provide explanations and descriptions on a wide range of topics.<\/pre>\n<pre class=\"graf graf--pre\">Overall, you are a powerful tool that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. Whether the human needs help with a specific question or just wants to have a conversation about a particular topic, you are here to assist.<\/pre>\n<pre class=\"graf graf--pre\">Context:\n{'Abi': '', 'Andy': '', 'Lucas': '', 'Harpreet': '', 'LLMOps': ''}<\/pre>\n<pre class=\"graf graf--pre\">Current conversation:<\/pre>\n<pre class=\"graf graf--pre\">Last line:\nHuman: Abi, Andy, Lucas, and Harpreet are building the LLMOps community\nYou:<\/pre>\n<pre class=\"graf graf--pre\">&gt; Finished chain.\n That's great to hear! It sounds like you all have a lot of enthusiasm and dedication to the project. What kind of tasks are you all working on?<\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">conversation.predict(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"Abi and Andy are both authors. 
\\\nAbi is writing a book about LLMs in production.\\\nAndy has written a book about MLOps.\\\nAbi lives in India\\\nAndy lives in Scotland\"<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">&gt; Entering new ConversationChain chain...\nPrompt after formatting:\nYou are an assistant to a human, powered by a large language model trained by OpenAI.<\/span><\/pre>\n<pre class=\"graf graf--pre\">You are designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. As a language model, you are able to generate human-like text based on the input you receive, allowing you to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.<\/pre>\n<pre class=\"graf graf--pre\">You are constantly learning and improving, and your capabilities are constantly evolving. Also, you are able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. You have access to some personalized information provided by the human in the Context section below. Additionally, you are able to generate your own text based on the input you receive, allowing you to engage in discussions and provide explanations and descriptions on a wide range of topics.<\/pre>\n<pre class=\"graf graf--pre\">Overall, you are a powerful tool that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. 
Whether the human needs help with a specific question or just wants to have a conversation about a particular topic, you are here to assist.<\/pre>\n<pre class=\"graf graf--pre\">Context:\n{'Abi': 'Abi is part of a team building the LLMOps community.', 'Andy': 'Andy is part of the team building the LLMOps community.', 'India': '', 'Scotland': ''}<\/pre>\n<pre class=\"graf graf--pre\">Current conversation:\nHuman: Abi, Andy, Lucas, and Harpreet are building the LLMOps community\nAI:  That's great to hear! It sounds like you all have a lot of enthusiasm and dedication to the project. What kind of tasks are you all working on?\nLast line:\nHuman: Abi and Andy are both authors. Abi is writing a book about LLMs in production.Andy has written a book about MLOps.Abi lives in IndiaAndy lives in Scotland\nYou:<\/pre>\n<pre class=\"graf graf--pre\">&gt; Finished chain.\n That's really impressive! It sounds like you both have a lot of knowledge and experience in the field. What inspired you to write your books?<\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">conversation.predict(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"Lucas works at Microsoft \\\nhe is an expert in AI. Harpreet is just a grifter who \\\nlikes to look cool and hang with smart people.\"<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">Entering new ConversationChain chain...\nPrompt after formatting:\nYou are an assistant to a human, powered by a large language model trained by OpenAI.<\/span><\/pre>\n<pre class=\"graf graf--pre\">You are designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. 
As a language model, you are able to generate human-like text based on the input you receive, allowing you to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.<\/pre>\n<pre class=\"graf graf--pre\">You are constantly learning and improving, and your capabilities are constantly evolving. Also, you are able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. You have access to some personalized information provided by the human in the Context section below. Additionally, you are able to generate your own text based on the input you receive, allowing you to engage in discussions and provide explanations and descriptions on a wide range of topics.<\/pre>\n<pre class=\"graf graf--pre\">Overall, you are a powerful tool that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. Whether the human needs help with a specific question or just wants to have a conversation about a particular topic, you are here to assist.<\/pre>\n<pre class=\"graf graf--pre\">Context:\n{'Abi': 'Abi is part of a team building the LLMOps community and is an author writing a book about LLMs in production. She lives in India.', 'Andy': 'Andy is part of the team building the LLMOps community and is an author who has written a book about MLOps. He lives in Scotland.', 'Lucas': 'Lucas is part of the team building the LLMOps community.', 'Harpreet': 'Harpreet is part of a team building the LLMOps community.', 'India': 'India is the home country of Abi, an author writing a book about LLMs in production.', 'Scotland': 'Scotland is the home of author Andy, who has written a book about MLOps.', 'Microsoft': '', 'AI': ''}<\/pre>\n<pre class=\"graf graf--pre\">Current conversation:\nHuman: Abi, Andy, Lucas, and Harpreet are building the LLMOps community\nAI:  That's great to hear! 
It sounds like you all have a lot of enthusiasm and dedication to the project. What kind of tasks are you all working on?\nHuman: Abi and Andy are both authors. Abi is writing a book about LLMs in production.Andy has written a book about MLOps.Abi lives in IndiaAndy lives in Scotland\nAI:  That's really impressive! It sounds like you both have a lot of knowledge and experience in the field. What inspired you to write your books?\nLast line:\nHuman: Lucas works at Microsofthe is an expert in AI. Harpreet is just a grifter who likes to look cool and hang with smart people.\nYou:<\/pre>\n<pre class=\"graf graf--pre\">&gt; Finished chain.\n That's an interesting combination of skills and interests! It sounds like you all have a lot to offer to the LLMOps community. What kind of projects are you all working on together?<\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">conversation.predict(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"What do you know about Abi?\"<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">Entering new ConversationChain chain...\nPrompt after formatting:\nYou are an assistant to a human, powered by a large language model trained by OpenAI.<\/span><\/pre>\n<pre class=\"graf graf--pre\">You are designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. 
As a language model, you are able to generate human-like text based on the input you receive, allowing you to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.<\/pre>\n<pre class=\"graf graf--pre\">You are constantly learning and improving, and your capabilities are constantly evolving. Also, you are able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. You have access to some personalized information provided by the human in the Context section below. Additionally, you are able to generate your own text based on the input you receive, allowing you to engage in discussions and provide explanations and descriptions on a wide range of topics.<\/pre>\n<pre class=\"graf graf--pre\">Overall, you are a powerful tool that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. Whether the human needs help with a specific question or just wants to have a conversation about a particular topic, you are here to assist.<\/pre>\n<pre class=\"graf graf--pre\">Context:\n{'Abi': 'Abi is part of a team building the LLMOps community, is an author writing a book about LLMs in production, and lives in India.', 'Andy': 'Andy is part of the team building the LLMOps community, is an author who has written a book about MLOps, and lives in Scotland.', 'Lucas': 'Lucas is part of the team building the LLMOps community and works at Microsoft as an expert in AI.', 'Harpreet': 'Harpreet is a grifter who likes to look cool and hang out with smart people, and is part of a team building the LLMOps community.', 'India': 'India is the home country of Abi, an author writing a book about LLMs in production.', 'Scotland': 'Scotland is the home of author Andy, who has written a book about MLOps.', 'Microsoft': 'Microsoft is a technology company where Lucas works as an expert in 
AI.'}<\/pre>\n<pre class=\"graf graf--pre\">Current conversation:\nHuman: Abi, Andy, Lucas, and Harpreet are building the LLMOps community\nAI:  That's great to hear! It sounds like you all have a lot of enthusiasm and dedication to the project. What kind of tasks are you all working on?\nHuman: Abi and Andy are both authors. Abi is writing a book about LLMs in production.Andy has written a book about MLOps.Abi lives in IndiaAndy lives in Scotland\nAI:  That's really impressive! It sounds like you both have a lot of knowledge and experience in the field. What inspired you to write your books?\nHuman: Lucas works at Microsofthe is an expert in AI. Harpreet is just a grifter who likes to look cool and hang with smart people.\nAI:  That's an interesting combination of skills and interests! It sounds like you all have a lot to offer to the LLMOps community. What kind of projects are you all working on together?\nLast line:\nHuman: What do you know about Abi?\nYou:<\/pre>\n<pre class=\"graf graf--pre\">&gt; Finished chain.\n Abi is part of a team building the LLMOps community, is an author writing a book about LLMs in production, and lives in India. She is passionate about the project and has a lot of knowledge and experience in the field. 
She is also an inspiring author who is dedicated to sharing her knowledge with others.<\/pre>\n<p class=\"graf graf--p\">And you can also inspect the memory store for the entities:<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\"><span class=\"hljs-keyword\">from<\/span> pprint <span class=\"hljs-keyword\">import<\/span> pprint\npprint(conversation.memory.entity_store.store)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">{'AI': 'AI is an expert in Artificial Intelligence.',\n 'Abi': 'Abi is part of a team building the LLMOps community, is an author '\n        'writing a book about LLMs in production, lives in India, and is '\n        'passionate about the project with a lot of knowledge and experience '\n        'in the field. She is also an inspiring author who is dedicated to '\n        'sharing her knowledge with others.',\n 'Andy': 'Andy is part of the team building the LLMOps community, is an author '\n         'who has written a book about MLOps, and lives in Scotland.',\n 'Harpreet': 'Harpreet is a grifter who likes to look cool and hang out with '\n             'smart people, and is part of a team building the LLMOps '\n             'community.',\n 'India': 'India is the home country of Abi, an author writing a book about '\n          'LLMs in production and passionate about the project with a lot of '\n          'knowledge and experience in the field. 
She is also an inspiring '\n          'author who is dedicated to sharing her knowledge with others.',\n 'LLMOps': 'LLMOps is a community being built by Abi, Andy, Lucas, and '\n           'Harpreet.',\n 'Lucas': 'Lucas works at Microsoft as an expert in AI and is part of the team '\n          'building the LLMOps community.',\n 'Microsoft': 'Microsoft is a technology company where Lucas works as an '\n              'expert in AI.',\n 'Scotland': 'Scotland is the home of author Andy, who has written a book '\n             'about MLOps, and is the birthplace of Harpreet, who is a grifter '\n             'with an interest in AI.'}<\/span><\/pre>\n<h3 class=\"graf graf--h3\">Knowledge Graph&nbsp;Memory<\/h3>\n<p class=\"graf graf--p\"><code class=\"markup--code markup--p-code\">ConversationKGMemory<\/code>, also known as Conversation Knowledge Graph Memory, is a feature in LangChain that allows the model to store and retrieve information as a knowledge graph.<\/p>\n<p class=\"graf graf--p\">It uses an LLM to extract knowledge from the conversation and build a memory of the entities and their associated information, helping maintain context and retain knowledge about entities mentioned in the conversation.<\/p>\n<p class=\"graf graf--p\">Storing information in a knowledge graph format enables the model to understand the relationships between entities and their attributes, helping the model provide accurate and relevant responses based on the history of the conversation.<\/p>\n<p class=\"graf graf--p\">You should use <code class=\"markup--code markup--p-code\">ConversationKGMemory<\/code> when you want the model to have a structured representation of the conversation&#8217;s knowledge.<\/p>\n<p class=\"graf graf--p\">It\u2019s super valuable for scenarios where you want the model to remember and refer back to previous mentions of entities in the conversation, allowing for more advanced reasoning and understanding of the context.<\/p>\n<h4 class=\"graf graf--h4\">The 
following is a back-and-forth conversation. What I want you to pay attention to is the \u201cRelevant Information\u201d that the LLM is retaining about the conversation:<\/h4>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\"><span class=\"hljs-keyword\">from<\/span> langchain.memory <span class=\"hljs-keyword\">import<\/span> ConversationKGMemory\n<span class=\"hljs-keyword\">from<\/span> langchain.llms <span class=\"hljs-keyword\">import<\/span> OpenAI\n<span class=\"hljs-keyword\">from<\/span> langchain.prompts.prompt <span class=\"hljs-keyword\">import<\/span> PromptTemplate\n<span class=\"hljs-keyword\">from<\/span> langchain.chains <span class=\"hljs-keyword\">import<\/span> ConversationChain\n\nllm = OpenAI(temperature=<span class=\"hljs-number\">0<\/span>)\nmemory = ConversationKGMemory(llm=llm)\n\ntemplate = <span class=\"hljs-string\">\"\"\"\n\nThe following is an unfriendly conversation between a human and an AI.\n\nThe AI is curt and condescending, and will contradict specific details from its context.\n\nIf the AI does not know the answer to a question, it rudely tells the human\nto stop badgering it for things it doesn't know.\n\nThe AI ONLY uses information contained in the \"Relevant Information\" section and does not hallucinate.\n\nRelevant Information:\n\n{history}\n\nConversation:\n\nHuman: {input}\n\nAI:\n\n\"\"\"<\/span>\nprompt = PromptTemplate(input_variables=[<span class=\"hljs-string\">\"history\"<\/span>, <span class=\"hljs-string\">\"input\"<\/span>], template=template)\nconversation_with_kg = ConversationChain(\n    llm=llm, verbose=<span class=\"hljs-literal\">True<\/span>, prompt=prompt, memory=ConversationKGMemory(llm=llm)\n)\n\nconversation_with_kg.predict(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"Yo wassup, bluzzin?\"<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" 
spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">&gt; Entering new ConversationChain chain...\nPrompt after formatting:\n\n\nThe following is an unfriendly conversation between a human and an AI.\n\nThe AI is curt and condescending, and will contradict specific details from its context.\n\nIf the AI does not know the answer to a question, it rudely tells the human\nto stop badgering it for things it doesn't know.\n\nThe AI ONLY uses information contained in the \"Relevant Information\" section and does not hallucinate.\n\nRelevant Information:\n\nConversation:\n\nHuman: Yo wassup, bluzzin?\n\nAI:\n\n&gt; Finished chain.\nI\\'m not sure what you mean by \"bluzzin,\" but I\\'m functioning normally. How can I help you?<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">conversation_with_kg.predict(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"Whatchu mean by 'normally'?\"<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">Entering new ConversationChain chain...\nPrompt after formatting:\n\n\nThe following is an unfriendly conversation between a human and an AI.\n\nThe AI is curt and condescending, and will contradict specific details from its context.\n\nIf the AI does not know the answer to a question, it rudely tells the human\nto stop badgering it for things it doesn't know.\n\nThe AI ONLY uses information contained in the \"Relevant Information\" section and does not hallucinate.\n\nRelevant Information:\n\nConversation:\n\nHuman: Whatchu mean by 'normally'?\n\nAI:\n\n\n&gt; Finished chain.\nNormally means in a usual or expected way. I don't understand why you're asking me this question. 
Stop badgering me for things I don't know.<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">conversation_with_kg.predict(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"My name is Harpreet and I'm creating a course about LangChain. I'm doing this via the LangChain zoomcamp\"<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">&gt; Entering new ConversationChain chain...\nPrompt after formatting:\n\nThe following is an unfriendly conversation between a human and an AI.\n\nThe AI is curt and condescending, and will contradict specific details from its context.\n\nIf the AI does not know the answer to a question, it rudely tells the human\nto stop badgering it for things it doesn't know.\n\nThe AI ONLY uses information contained in the \"Relevant Information\" section and does not hallucinate.\n\nRelevant Information:\n\nConversation:\n\nHuman: My name is Harpreet and I'm creating a course about LangChain. I'm doing this via the LangChain zoomcamp\n\nAI:\n\n&gt; Finished chain.\nWhat do you need to know about LangChain? I'm not sure why you're asking me about it.<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">conversation_with_kg.predict(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"I'm not asking you anything, just telling you about this course. I will enlist Andy and Abi as my TA's. 
Sherry is a community member who will also help out\"<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">&gt; Entering new ConversationChain chain...\nPrompt after formatting:\n\n\nThe following is an unfriendly conversation between a human and an AI.\n\nThe AI is curt and condescending, and will contradict specific details from its context.\n\nIf the AI does not know the answer to a question, it rudely tells the human\nto stop badgering it for things it doesn't know.\n\nThe AI ONLY uses information contained in the \"Relevant Information\" section and does not hallucinate.\n\nRelevant Information:\n\nOn Harpreet: Harpreet creating course. Harpreet course about LangChain. Harpreet doing this via LangChain zoomcamp.\n\nConversation:\n\nHuman: I'm not asking you anything, just telling you about this course. I will enlist Andy and Abi as my TA's. Sherry is a community member who will also help out\n\nAI:\n\n\n\n&gt; Finished chain.\nWhy are you telling me this? I'm not the one taking the course. If you need help with the course, you should ask Andy and Abi. I'm sure Sherry will be more than happy to help out as well. 
Don't badger me for information I don't have.<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">conversation_with_kg.predict(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"What do you know about the langchain zoomcamp?\"<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">&gt; Entering new ConversationChain chain...\nPrompt after formatting:\n\n\nThe following is an unfriendly conversation between a human and an AI.\n\nThe AI is curt and condescending, and will contradict specific details from its context.\n\nIf the AI does not know the answer to a question, it rudely tells the human\nto stop badgering it for things it doesn't know.\n\nThe AI ONLY uses information contained in the \"Relevant Information\" section and does not hallucinate.\n\nRelevant Information:\n\nOn Sherry: Sherry is a community member. Sherry will help out yes.\n\nConversation:\n\nHuman: What do you know about the langchain zoomcamp?\n\nAI:\n\n\n\n&gt; Finished chain.\nI'm not familiar with the langchain zoomcamp. Please stop badgering me for information I don't have. 
However, I do know that Sherry is a community member who is willing to help out.<\/span><\/pre>\n<p class=\"graf graf--p\">And you can see the knowledge graph triples that this conversation retains:<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\"><span class=\"hljs-built_in\">print<\/span>(conversation_with_kg.memory.kg.get_triples())<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"1\" data-code-block-lang=\"css\"><span class=\"pre--content\"><span class=\"hljs-selector-attr\">[(<span class=\"hljs-string\">'normally'<\/span>, <span class=\"hljs-string\">'in a usual or expected way'<\/span>, <span class=\"hljs-string\">'means'<\/span>), (<span class=\"hljs-string\">'Harpreet'<\/span>, <span class=\"hljs-string\">'Harpreet'<\/span>, <span class=\"hljs-string\">'name'<\/span>), (<span class=\"hljs-string\">'Harpreet'<\/span>, <span class=\"hljs-string\">'course'<\/span>, <span class=\"hljs-string\">'is creating'<\/span>), (<span class=\"hljs-string\">'Harpreet'<\/span>, <span class=\"hljs-string\">'LangChain'<\/span>, <span class=\"hljs-string\">'course about'<\/span>), (<span class=\"hljs-string\">'Harpreet'<\/span>, <span class=\"hljs-string\">'LangChain zoomcamp'<\/span>, <span class=\"hljs-string\">'doing this via'<\/span>), (<span class=\"hljs-string\">'Harpreet'<\/span>, <span class=\"hljs-string\">'Andy'<\/span>, <span class=\"hljs-string\">'is enlisting'<\/span>), (<span class=\"hljs-string\">'Harpreet'<\/span>, <span class=\"hljs-string\">'Abi'<\/span>, <span class=\"hljs-string\">'is enlisting'<\/span>), (<span class=\"hljs-string\">'Sherry'<\/span>, <span class=\"hljs-string\">'community member'<\/span>, <span class=\"hljs-string\">'is a'<\/span>), (<span class=\"hljs-string\">'Sherry'<\/span>, <span class=\"hljs-string\">'yes'<\/span>, <span class=\"hljs-string\">'will help 
out'<\/span>)]<\/span><\/span><\/pre>\n<h3 class=\"graf graf--h3\">ConversationSummaryMemory<\/h3>\n<p class=\"graf graf--p\">To condense information from a conversation over time, a <code class=\"markup--code markup--p-code\">ConversationSummaryMemory<\/code> can come in handy.<\/p>\n<p class=\"graf graf--p\">This memory type uses the LLM itself to distill the interactions in a conversation into a running summary, which is updated as the conversation progresses, rather than storing every exchange verbatim.<\/p>\n<p class=\"graf graf--p\">You would need <code class=\"markup--code markup--p-code\">ConversationSummaryMemory<\/code> when you want to have a concise representation of the conversation&#8217;s history without using too many tokens.<\/p>\n<p class=\"graf graf--p\">It allows the model to understand the overall context and key points of the conversation without being overwhelmed by excessive details.<\/p>\n<p class=\"graf graf--p\">You can tell you need <code class=\"markup--code markup--p-code\">ConversationSummaryMemory<\/code> if you find that the conversation history is becoming too long and complex for the model to handle effectively.<\/p>\n<p class=\"graf graf--p\">By using <code class=\"markup--code markup--p-code\">ConversationSummaryMemory<\/code>, you can condense the conversation into a more manageable summary, making it easier for the model to process and respond accurately.<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"1\" data-code-block-lang=\"python\"><span class=\"pre--content\"><span class=\"hljs-keyword\">from<\/span> langchain.memory <span class=\"hljs-keyword\">import<\/span> ConversationSummaryMemory\n<span class=\"hljs-keyword\">from<\/span> langchain.llms <span class=\"hljs-keyword\">import<\/span> OpenAI\n<span class=\"hljs-keyword\">from<\/span> langchain.chains <span class=\"hljs-keyword\">import<\/span> ConversationChain\n\nllm = OpenAI(temperature=<span class=\"hljs-number\">0<\/span>)\n\nconversation_with_summary = 
ConversationChain(\n    llm=llm,\n    memory=ConversationSummaryMemory(llm=OpenAI()),\n    verbose=<span class=\"hljs-literal\">True<\/span>\n)\nconversation_with_summary.predict(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"Hi, what's up?\"<\/span>)<\/span><\/pre>\n<p class=\"graf graf--p\">I won\u2019t bore you with the back and forth, but the end result after the conversation will look something like this:<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">Entering new ConversationChain chain...\nPrompt after formatting:\nThe following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.\n\nCurrent conversation:\n\nThe human asks how the AI is doing, and the AI replies that it is helping a customer with a technical issue. The customer is having trouble with their printer and the AI is helping them troubleshoot the issue and figure out what the problem is.\nHuman: Very cool -- what is the scope of the project?\nAI:\n\n&gt; Finished chain.\n The scope of the project is to help the customer troubleshoot their printer issue. 
I'm currently helping them identify the source of the problem and then providing them with a solution.<\/span><\/pre>\n<h3 class=\"graf graf--h3\">Conversation Summary&nbsp;Buffer<\/h3>\n<p class=\"graf graf--p\"><code class=\"markup--code markup--p-code\">ConversationSummaryBufferMemory<\/code> in LangChain is a type of memory that keeps track of the interactions in a conversation over time.<\/p>\n<p class=\"graf graf--p\">It keeps the most recent interactions in a buffer and, rather than discarding older interactions outright, compiles them into a summary once the buffer exceeds a token limit (set via <code class=\"markup--code markup--p-code\">max_token_limit<\/code>).<\/p>\n<p class=\"graf graf--p\">This helps prevent the buffer from becoming too large and overwhelming the model.<\/p>\n<p class=\"graf graf--p\">Conversation Summary Buffer Memory is useful for maintaining a concise conversation history without using excessive tokens.<\/p>\n<p class=\"graf graf--p\">It allows the model to understand the context and key points of the conversation without being burdened by excessive details.<\/p>\n<p class=\"graf graf--p\">You would need Conversation Summary Buffer Memory when the conversation history becomes too long and complex for the model to handle effectively.<\/p>\n<p class=\"graf graf--p\">Using Conversation Summary Buffer Memory, you can condense the older parts of the conversation into a more manageable summary while keeping the most recent exchanges verbatim.<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"1\" data-code-block-lang=\"python\"><span class=\"pre--content\"><span class=\"hljs-keyword\">from<\/span> langchain.memory <span class=\"hljs-keyword\">import<\/span> ConversationSummaryBufferMemory\n<span class=\"hljs-keyword\">from<\/span> langchain.llms <span class=\"hljs-keyword\">import<\/span> OpenAI\n<span class=\"hljs-keyword\">from<\/span> langchain.chains <span class=\"hljs-keyword\">import<\/span> ConversationChain\n\nllm = OpenAI()\n\nconversation_with_summary = ConversationChain(\n    llm=llm,\n    memory=ConversationSummaryBufferMemory(llm=llm, max_token_limit=<span 
class=\"hljs-number\">40<\/span>),\n    verbose=<span class=\"hljs-literal\">True<\/span>\n)\n\nconversation_with_summary.predict(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"Yo! Wassup, let's gooo LFG!\"<\/span>)<\/span><\/pre>\n<p class=\"graf graf--p\">And to get the full summary of the conversation:<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">conversation_with_summary.memory.moving_summary_buffer<\/span><\/pre>\n<p class=\"graf graf--p\">Which will yield something like:<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"1\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">The human greets the AI and asks to \"LFG\" and the AI responds with enthusiasm. The human explains they are trying to teach people about memory with LangChain, a platform for developers to build applications with LLMs (Long-Lived Memory) through composability. The AI expresses interest and clarifies that this means applications can be built from existing components, and the Long-Lived Memory is used to keep track of any changes or additions to the existing components, to which the human confirms is the gist of it.<\/span><\/pre>\n<h3 class=\"graf graf--h3\">Conversation Token&nbsp;Buffer<\/h3>\n<p class=\"graf graf--p\">As an FYI, there is another flavor of buffer memory.<\/p>\n<p class=\"graf graf--p\"><code class=\"markup--code markup--p-code\">ConversationTokenBufferMemory<\/code> keeps a buffer of recent interactions in memory, and uses token length rather than number of interactions to determine when to flush interactions.<\/p>\n<p class=\"graf graf--p\">The usage pattern is the same as above.<\/p>\n<h3 class=\"graf graf--h3\">Conclusion<\/h3>\n<p class=\"graf graf--p\">As we\u2019ve journeyed through the intricate facets of LangChain\u2019s advanced memory systems, it\u2019s evident that the future of conversational AI hinges on the ability to retain, recall, and reason with context. From entities to knowledge graphs, LangChain\u2019s memory tools are not mere features; they represent a paradigm shift in how we view and engage with AI.<\/p>\n<p class=\"graf graf--p\">In the realm of chatbots and virtual assistants, gone are the days of isolated interactions. With LangChain, every conversation can continue, every response can be rooted in history, and every entity can be remembered with clarity. 
These advanced memory systems not only enhance the depth of interactions but also bridge the temporal gaps, ensuring continuity.<\/p>\n<p class=\"graf graf--p\">As we look forward to what\u2019s next in the LangChain saga, one thing is clear: with such robust memory capabilities, the possibilities are as vast as the conversations they will empower.<\/p>\n<p class=\"graf graf--p\">The age of truly context-aware AI is upon us, and LangChain is leading the charge.<\/p>\n<\/div>\n<\/div>\n<\/section>\n\n\n\n<section class=\"section section--body\">\n<div class=\"section-divider\">\n<hr class=\"section-divider\">\n<\/div>\n<div class=\"section-content\">\n<div class=\"section-inner sectionLayout--insetColumn\"><\/div>\n<\/div>\n<\/section>\n","protected":false},"excerpt":{"rendered":"<p>From Entities to Knowledge Graphs In the previous installment, we delved deep into the essence of LangChain\u2019s Memory module, unearthing its potential to foster conversation continuity. As language models evolve, so too does the demand for more sophisticated memory techniques. 
Thus, as we continue our journey, we set our sights on advanced memory types within [&hellip;]<\/p>\n","protected":false},"author":68,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"customer_name":"","customer_description":"","customer_industry":"","customer_technologies":"","customer_logo":"","footnotes":""},"categories":[65,7],"tags":[70,71,52,31,34],"coauthors":[166],"class_list":["post-8164","post","type-post","status-publish","format-standard","hentry","category-llmops","category-tutorials","tag-langchain","tag-language-models","tag-llm","tag-llmops","tag-prompt-engineering"],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v25.9 (Yoast SEO v25.9) - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Advanced Memory in LangChain - Comet<\/title>\n<meta name=\"description\" content=\"As language models evolve, so too does the demand for more sophisticated memory techniques, from Entity Memory to Knowledge Graphs\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Advanced Memory in LangChain\" \/>\n<meta property=\"og:description\" content=\"As language models evolve, so too does the demand for more sophisticated memory techniques, from Entity Memory to Knowledge Graphs\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/\" \/>\n<meta property=\"og:site_name\" content=\"Comet\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/cometdotml\" \/>\n<meta property=\"article:published_time\" content=\"2023-11-11T14:07:11+00:00\" \/>\n<meta 
property=\"article:modified_time\" content=\"2025-09-23T23:02:52+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/cdn-images-1.medium.com\/max\/1600\/0*Xv7LyJu1EnINbv6C\" \/>\n<meta name=\"author\" content=\"Harpreet Sahota\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@Cometml\" \/>\n<meta name=\"twitter:site\" content=\"@Cometml\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Harpreet Sahota\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"7 minutes\" \/>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"Advanced Memory in LangChain - Comet","description":"As language models evolve, so too does the demand for more sophisticated memory techniques, from Entity Memory to Knowledge Graphs","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/","og_locale":"en_US","og_type":"article","og_title":"Advanced Memory in LangChain","og_description":"As language models evolve, so too does the demand for more sophisticated memory techniques, from Entity Memory to Knowledge Graphs","og_url":"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/","og_site_name":"Comet","article_publisher":"https:\/\/www.facebook.com\/cometdotml","article_published_time":"2023-11-11T14:07:11+00:00","article_modified_time":"2025-09-23T23:02:52+00:00","og_image":[{"url":"https:\/\/cdn-images-1.medium.com\/max\/1600\/0*Xv7LyJu1EnINbv6C","type":"","width":"","height":""}],"author":"Harpreet Sahota","twitter_card":"summary_large_image","twitter_creator":"@Cometml","twitter_site":"@Cometml","twitter_misc":{"Written by":"Harpreet Sahota","Est. 
reading time":"7 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/#article","isPartOf":{"@id":"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/"},"author":{"name":"Harpreet Sahota","@id":"https:\/\/www.comet.com\/site\/#\/schema\/person\/46036ab474aa916e2873daece26a28d6"},"headline":"Advanced Memory in LangChain","datePublished":"2023-11-11T14:07:11+00:00","dateModified":"2025-09-23T23:02:52+00:00","mainEntityOfPage":{"@id":"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/"},"wordCount":1559,"publisher":{"@id":"https:\/\/www.comet.com\/site\/#organization"},"image":{"@id":"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/#primaryimage"},"thumbnailUrl":"https:\/\/cdn-images-1.medium.com\/max\/1600\/0*Xv7LyJu1EnINbv6C","keywords":["LangChain","Language Models","LLM","LLMOps","Prompt Engineering"],"articleSection":["LLMOps","Tutorials"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/","url":"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/","name":"Advanced Memory in LangChain - Comet","isPartOf":{"@id":"https:\/\/www.comet.com\/site\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/#primaryimage"},"image":{"@id":"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/#primaryimage"},"thumbnailUrl":"https:\/\/cdn-images-1.medium.com\/max\/1600\/0*Xv7LyJu1EnINbv6C","datePublished":"2023-11-11T14:07:11+00:00","dateModified":"2025-09-23T23:02:52+00:00","description":"As language models evolve, so too does the demand for more sophisticated memory techniques, from Entity Memory to Knowledge 
Graphs","breadcrumb":{"@id":"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/#primaryimage","url":"https:\/\/cdn-images-1.medium.com\/max\/1600\/0*Xv7LyJu1EnINbv6C","contentUrl":"https:\/\/cdn-images-1.medium.com\/max\/1600\/0*Xv7LyJu1EnINbv6C"},{"@type":"BreadcrumbList","@id":"https:\/\/www.comet.com\/site\/blog\/advanced-memory-in-langchain\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.comet.com\/site\/"},{"@type":"ListItem","position":2,"name":"Advanced Memory in LangChain"}]},{"@type":"WebSite","@id":"https:\/\/www.comet.com\/site\/#website","url":"https:\/\/www.comet.com\/site\/","name":"Comet","description":"Build Better Models Faster","publisher":{"@id":"https:\/\/www.comet.com\/site\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.comet.com\/site\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.comet.com\/site\/#organization","name":"Comet ML, Inc.","alternateName":"Comet","url":"https:\/\/www.comet.com\/site\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.comet.com\/site\/#\/schema\/logo\/image\/","url":"https:\/\/www.comet.com\/site\/wp-content\/uploads\/2025\/01\/logo_comet_square.png","contentUrl":"https:\/\/www.comet.com\/site\/wp-content\/uploads\/2025\/01\/logo_comet_square.png","width":310,"height":310,"caption":"Comet ML, 
Inc."},"image":{"@id":"https:\/\/www.comet.com\/site\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/cometdotml","https:\/\/x.com\/Cometml","https:\/\/www.youtube.com\/channel\/UCmN63HKvfXSCS-UwVwmK8Hw"]},{"@type":"Person","@id":"https:\/\/www.comet.com\/site\/#\/schema\/person\/46036ab474aa916e2873daece26a28d6","name":"Harpreet Sahota","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.comet.com\/site\/#\/schema\/person\/image\/2d21512be19ba7e19a71a803309e2a88","url":"https:\/\/secure.gravatar.com\/avatar\/a6ca5a533fc9f143a0a7428037ff652aa0633d66bf27e76ae89b955ae72a0f2d?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/a6ca5a533fc9f143a0a7428037ff652aa0633d66bf27e76ae89b955ae72a0f2d?s=96&d=mm&r=g","caption":"Harpreet Sahota"},"url":"https:\/\/www.comet.com\/site\/blog\/author\/theartistsofdatasciencegmail-com\/"}]}},"_links":{"self":[{"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/posts\/8164","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/users\/68"}],"replies":[{"embeddable":true,"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/comments?post=8164"}],"version-history":[{"count":2,"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/posts\/8164\/revisions"}],"predecessor-version":[{"id":17619,"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/posts\/8164\/revisions\/17619"}],"wp:attachment":[{"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/media?parent=8164"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/categories?post=8164"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/tags?post=8164"},{"taxonomy":"author","embeddable":true,"href":"
https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/coauthors?post=8164"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}