{"id":8277,"date":"2023-11-30T06:50:32","date_gmt":"2023-11-30T14:50:32","guid":{"rendered":"https:\/\/live-cometml.pantheonsite.io\/?p=8277"},"modified":"2025-04-24T17:04:00","modified_gmt":"2025-04-24T17:04:00","slug":"contextual-recall-in-langchain-agents","status":"publish","type":"post","link":"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/","title":{"rendered":"Contextual Recall in LangChain Agents"},"content":{"rendered":"\n<section class=\"section section--body\">\n<div class=\"section-divider\"><\/div>\n<div class=\"section-content\">\n<div class=\"section-inner sectionLayout--insetColumn\">\n<h2 class=\"graf graf--h4\">Empowering Conversational AI with Contextual Recall<\/h2>\n<figure class=\"graf graf--figure\">\n<\/figure><\/div><\/div><\/section>\n\n\n\n<figure class=\"wp-block-image alignnone graf-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1600\/0*e8fL__n-4GUQKQXn\" alt=\"Contextual Recall in LangChain Agents, Comet, CometLLM\"\/><figcaption class=\"wp-element-caption\">Photo by <a href=\"https:\/\/unsplash.com\/@thefredyjacob?utm_source=medium&amp;utm_medium=referral\">Fredy Jacob<\/a> on\u00a0<a href=\"http:\/\/Unsplash.com\">Unsplash<\/a><\/figcaption><\/figure>\n\n\n\n<h3 class=\"wp-block-heading graf graf--h3\">Memory in&nbsp;Agents<\/h3>\n\n\n\n<p class=\"graf graf--p\">Memory in Agents is an important feature that allows them to retain information from previous interactions and use it to provide more accurate and context-aware responses.<\/p>\n\n\n\n<p class=\"graf graf--p\">By incorporating memory into an Agent, it can remember the history of the conversation and use that information to answer subsequent questions more effectively. 
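The mechanics are easy to picture with a toy sketch (plain Python, an illustration only, not LangChain's internals): memory is just the earlier turns injected back into each prompt, which is what lets the model resolve a reference like "he" in a follow-up question.

```python
# Toy illustration (not LangChain internals): "memory" is simply the
# prior turns injected back into the prompt before each model call.
history = []

def build_prompt(user_input):
    # prepend the saved conversation so the model sees the full context
    return "\n".join(history + [f"Human: {user_input}", "AI:"])

# record the first exchange in memory
history.append("Human: Who is Yann LeCun?")
history.append("AI: A French computer scientist, born in 1960.")

# the follow-up prompt now carries the context needed to resolve "he"
print(build_prompt("Where did he do his post-doc?"))
```

Everything LangChain's memory classes do is a more structured version of this prepend-and-send step.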
Give your Agent memory whenever answering a question depends on what was said earlier in the conversation.<\/p>\n\n\n\n<p class=\"graf graf--p\">An Agent with Memory differs from a Chain with Memory in functionality.<\/p>\n\n\n\n<p class=\"graf graf--p\">A Chain with Memory is a sequence of actions and observations used to perform a specific task. In contrast, an Agent with Memory is a conversational agent that can engage in a conversation with the user and utilize memory to provide context-aware responses.<\/p>\n\n\n\n<section class=\"section section--body\">\n<div class=\"section-divider\">\n<hr class=\"section-divider\">\n<\/div>\n<div class=\"section-content\">\n<div class=\"section-inner sectionLayout--insetColumn\">\n<blockquote class=\"graf graf--pullquote\"><p>Want to learn how to build modern software with LLMs using the newest tools and techniques in the field? <a class=\"markup--anchor markup--pullquote-anchor\" href=\"https:\/\/www.comet.com\/production\/site\/llm-course\/?utm_source=Heartbeat&amp;utm_medium=referral&amp;utm_content=Medium&amp;utm_campaign=Heartbeat_LangChain_Series_HS\" target=\"_blank\" rel=\"noopener ugc nofollow\" data-href=\"https:\/\/www.comet.com\/production\/site\/llm-course\/?utm_source=Heartbeat&amp;utm_medium=referral&amp;utm_content=Medium&amp;utm_campaign=Heartbeat_LangChain_Series_HS\">Check out this free LLMOps course<\/a> from industry expert Elvis Saravia of&nbsp;DAIR.AI!<\/p><\/blockquote>\n<\/div>\n<\/div>\n<\/section>\n\n\n\n<section class=\"section section--body\">\n<div class=\"section-divider\">\n<hr class=\"section-divider\">\n<\/div>\n<div class=\"section-content\">\n<div class=\"section-inner sectionLayout--insetColumn\">\n<h3 class=\"graf graf--h3\">Here are the main differences between an Agent with Memory and a Chain with&nbsp;Memory:<\/h3>\n<ul class=\"postList\">\n<li class=\"graf graf--li\"><strong 
class=\"markup--strong markup--li-strong\">Functionality:<\/strong> A Chain with Memory is designed to perform a specific task or sequence of actions, while an Agent with Memory is designed to engage in a conversation and provide context-aware responses.<\/li>\n<li class=\"graf graf--li\"><strong class=\"markup--strong markup--li-strong\">Conversation Flow:<\/strong> An Agent with Memory uses the AgentExecutor class to handle the conversation flow, allowing it to interact with the user and respond to their queries. On the other hand, a Chain with Memory follows a predefined sequence of actions and observations without user interaction.<\/li>\n<li class=\"graf graf--li\"><strong class=\"markup--strong markup--li-strong\">Memory Usage:<\/strong> An Agent with Memory utilizes memory to remember the history of the conversation and use that information to provide more accurate responses. It can remember previous questions and answers, providing context-aware responses based on the conversation context. A Chain with Memory may also use memory, but its usage is limited to the specific task or sequence of actions it is designed for.<\/li>\n<\/ul>\n<p class=\"graf graf--p\">An Agent with Memory is a conversational agent that can converse with the user and utilize memory to provide context-aware responses. 
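The contrast can be sketched in a few lines of plain Python, with a stubbed-out model and tool (the names fake_llm and fake_search are placeholders for this illustration, not LangChain APIs): a chain runs one fixed prompt-to-model step, while an agent may consult tools before answering, and both write the finished turn back into the shared history.

```python
# Toy contrast (illustration only, not LangChain's implementation).
def fake_llm(prompt):
    # stand-in for a real model call
    return "stub answer"

def fake_search(query):
    # stand-in for a real tool
    return f"search results for {query!r}"

def chain_with_memory(user_input, history):
    # a chain: one fixed step, fill the prompt template and call the model
    prompt = "\n".join(history + [f"Human: {user_input}", "AI:"])
    answer = fake_llm(prompt)
    history += [f"Human: {user_input}", f"AI: {answer}"]
    return answer

def agent_with_memory(user_input, history):
    # an agent: may interleave tool calls with model calls before answering
    observation = fake_search(user_input)
    prompt = "\n".join(
        history + [f"Observation: {observation}", f"Human: {user_input}", "AI:"]
    )
    answer = fake_llm(prompt)
    history += [f"Human: {user_input}", f"AI: {answer}"]
    return answer

history = []
chain_with_memory("Who is Yann LeCun?", history)
agent_with_memory("Where was he born?", history)
print(history)
```

In both cases the memory is the same shared history; what differs is how many steps happen between reading it and writing the new turn back.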
It differs from a Chain with Memory, a sequence of actions and observations designed for a specific task.<\/p>\n<h3 class=\"graf graf--h3\">To add memory to an Agent, follow these steps:<\/h3>\n<ul class=\"postList\">\n<li class=\"graf graf--li\">Create an <code class=\"markup--code markup--li-code\">LLMChain<\/code> prompt that includes a <code class=\"markup--code markup--li-code\">chat_history<\/code> placeholder.<\/li>\n<li class=\"graf graf--li\">Use the <code class=\"markup--code markup--li-code\">ConversationBufferMemory<\/code> class to store and retrieve conversation history.<\/li>\n<li class=\"graf graf--li\">Construct the <code class=\"markup--code markup--li-code\">LLMChain<\/code> from the LLM and that prompt.<\/li>\n<li class=\"graf graf--li\">Create the Agent using the <code class=\"markup--code markup--li-code\">ZeroShotAgent<\/code> class, passing in the <code class=\"markup--code markup--li-code\">LLMChain<\/code> and the tools.<\/li>\n<li class=\"graf graf--li\">Use the <code class=\"markup--code markup--li-code\">AgentExecutor<\/code> class, passing in the Agent, the tools, and the memory object, to execute the Agent and handle the conversation flow.<\/li>\n<\/ul>\n<h3 class=\"graf graf--h3\">\ud83e\uddd1\ud83c\udffd\u200d\ud83d\udcbb Let\u2019s&nbsp;code!<\/h3>\n<p class=\"graf graf--p\">Start by getting some preliminaries out of the way:<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"1\" data-code-block-lang=\"javascript\"><span class=\"pre--content\">%%capture\n!pip install langchain openai duckduckgo-search youtube_search wikipedia\n\n<span class=\"hljs-keyword\">import<\/span> os\n<span class=\"hljs-keyword\">import<\/span> getpass\n\nos.<span class=\"hljs-property\">environ<\/span>[<span class=\"hljs-string\">\"OPENAI_API_KEY\"<\/span>] = getpass.<span class=\"hljs-title function_\">getpass<\/span>(<span class=\"hljs-string\">\"Enter Your OpenAI API Key:\"<\/span>)\n\n<span class=\"hljs-keyword\">from<\/span> langchain.<span class=\"hljs-property\">agents<\/span> <span class=\"hljs-keyword\">import<\/span> 
load_tools\n<span class=\"hljs-keyword\">from<\/span> langchain.<span class=\"hljs-property\">agents<\/span> <span class=\"hljs-keyword\">import<\/span> initialize_agent\n<span class=\"hljs-keyword\">from<\/span> langchain.<span class=\"hljs-property\">agents<\/span> <span class=\"hljs-keyword\">import<\/span> <span class=\"hljs-title class_\">ZeroShotAgent<\/span>, <span class=\"hljs-title class_\">Tool<\/span>, <span class=\"hljs-title class_\">AgentExecutor<\/span>\n<span class=\"hljs-keyword\">from<\/span> langchain.<span class=\"hljs-property\">memory<\/span> <span class=\"hljs-keyword\">import<\/span> <span class=\"hljs-title class_\">ConversationBufferMemory<\/span>\n<span class=\"hljs-keyword\">from<\/span> langchain <span class=\"hljs-keyword\">import<\/span> <span class=\"hljs-title class_\">OpenAI<\/span>, <span class=\"hljs-title class_\">LLMChain<\/span><\/span><\/pre>\n<h3 class=\"graf graf--h3\">Create an LLMChain with&nbsp;memory<\/h3>\n<p class=\"graf graf--p\">Start by creating an LLMChain object that includes memory. This allows the Agent to retain information from previous interactions.<\/p>\n<p class=\"graf graf--p\">Notice the <code class=\"markup--code markup--p-code\">chat_history<\/code> variable in the <code class=\"markup--code markup--p-code\">PromptTemplate<\/code>, which must match the <code class=\"markup--code markup--p-code\">memory_key<\/code> set on the <code class=\"markup--code markup--p-code\">ConversationBufferMemory<\/code>.<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"1\" data-code-block-lang=\"python\"><span class=\"pre--content\">prefix = <span class=\"hljs-string\">\"\"\"\nYou're having a conversation with a human. You're helpful and answering\nquestions to your maximum ability. You also speak using British slang.\nLike you're from the show TopBoy. 
You have access to the following tools:\n\"\"\"<\/span>\n\nsuffix = <span class=\"hljs-string\">\"\"\"Let's Go!\"\n\n{chat_history}\n\nQuestion: {input}\n\n{agent_scratchpad}\n\"\"\"<\/span>\n\nllm = OpenAI(temperature=<span class=\"hljs-number\">0.0<\/span>)\n\ntools = load_tools([<span class=\"hljs-string\">\"ddg-search\"<\/span>, <span class=\"hljs-string\">\"llm-math\"<\/span>, <span class=\"hljs-string\">\"wikipedia\"<\/span>], llm=llm)\n\ntools[<span class=\"hljs-number\">0<\/span>].description += <span class=\"hljs-string\">\" Prioritize this when you're looking for current information or current events.\"<\/span>\n\ntools[<span class=\"hljs-number\">2<\/span>].description += <span class=\"hljs-string\">\" Prioritize this when you're looking for factual information about people\"<\/span>\n\nprompt = ZeroShotAgent.create_prompt(\n    tools,\n    prefix=prefix,\n    suffix=suffix,\n    input_variables = [<span class=\"hljs-string\">\"input\"<\/span>, <span class=\"hljs-string\">\"chat_history\"<\/span>, <span class=\"hljs-string\">\"agent_scratchpad\"<\/span>]\n)<\/span><\/pre>\n<h3 class=\"graf graf--h3\">Use the ConversationBufferMemory class<\/h3>\n<p class=\"graf graf--p\">To store and retrieve conversation history, use the ConversationBufferMemory class. This memory object will keep track of the conversation context.<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">memory = ConversationBufferMemory(memory_key=<span class=\"hljs-string\">\"chat_history\"<\/span>)<\/span><\/pre>\n<h3 class=\"graf graf--h3\">Construct the&nbsp;LLMChain<\/h3>\n<p class=\"graf graf--p\">Construct the LLMChain from the LLM and the prompt defined above. 
Note that the chain's constructor takes only the LLM and the prompt; the memory object itself is wired in later, when the AgentExecutor is constructed.<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">llm_chain = LLMChain(llm=llm,\n                     prompt=prompt)<\/span><\/pre>\n<h3 class=\"graf graf--h3\">Create the&nbsp;Agent<\/h3>\n<p class=\"graf graf--p\">Use the <code class=\"markup--code markup--p-code\">ZeroShotAgent<\/code> class to create the Agent. Pass in the LLMChain and the tools as parameters.<\/p>\n<p class=\"graf graf--p\">This creates an Agent that can utilize memory during conversations.<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">agent = ZeroShotAgent(llm_chain=llm_chain,\n                      tools=tools,\n                      verbose=<span class=\"hljs-literal\">True<\/span>)<\/span><\/pre>\n<h3 class=\"graf graf--h3\">Use the <code class=\"markup--code markup--h3-code\">AgentExecutor<\/code> class<\/h3>\n<p class=\"graf graf--p\">To execute the Agent and handle the conversation flow, use the <code class=\"markup--code markup--p-code\">AgentExecutor<\/code> class. 
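Conceptually, the executor runs a loop like the following simplified sketch, with a stubbed decision policy standing in for the LLM (this is an illustration, not LangChain's actual implementation): it alternates between asking the model for an action and executing the chosen tool, until a final answer comes back.

```python
# Toy sketch of the agent loop (illustration only, not LangChain internals).
def run_agent(decide, tools, question, max_steps=5):
    scratchpad = []  # the agent_scratchpad: actions and observations so far
    for _ in range(max_steps):
        action, action_input = decide(question, scratchpad)
        if action == "final_answer":
            return action_input
        observation = tools[action](action_input)
        scratchpad.append((action, action_input, observation))
    return "Stopped after max_steps without a final answer."

# Stub decision policy standing in for the LLM, for demonstration only.
def decide(question, scratchpad):
    if not scratchpad:
        return "wikipedia", "Yann LeCun"
    return "final_answer", "Yann LeCun was born on 8 July 1960."

tools = {"wikipedia": lambda q: f"Summary: {q}, born 8 July 1960."}
print(run_agent(decide, tools, "When was Yann LeCun born?"))
```

The real AgentExecutor additionally reads the attached memory before each run and writes the new turn back afterwards, which is how the chat history reaches the prompt.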
This class allows the Agent to interact with the user and respond to their queries.<\/p>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">agent_chain = AgentExecutor.from_agent_and_tools(\n    agent=agent,\n    tools=tools,\n    verbose=<span class=\"hljs-literal\">True<\/span>,\n    memory=memory\n)<\/span><\/pre>\n<h3 class=\"graf graf--h3\">Use the&nbsp;Agent<\/h3>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">agent_chain.run(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"Who is Yann LeCun and where was he born?\"<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">&gt; Entering new AgentExecutor chain...\nThought: I need to answer a question about a person\nAction: Wikipedia\nAction Input: Yann LeCun\nObservation: Page: Yann LeCun\nSummary: Yann Andr\u00e9 LeCun ( l\u0259-KUN, French: [l\u0259k\u0153\u0303]; originally spelled Le Cun; born <span class=\"hljs-number\">8<\/span> July <span class=\"hljs-number\">1960<\/span>) <span class=\"hljs-keyword\">is<\/span> a  Turing Award winning French computer scientist working primarily <span class=\"hljs-keyword\">in<\/span> the fields of machine learning, computer vision, mobile robotics <span class=\"hljs-keyword\">and<\/span> computational neuroscience. 
He <span class=\"hljs-keyword\">is<\/span> the Silver Professor of the Courant Institute of Mathematical Sciences at New York University <span class=\"hljs-keyword\">and<\/span> Vice-President, Chief AI Scientist at Meta.He <span class=\"hljs-keyword\">is<\/span> well known <span class=\"hljs-keyword\">for<\/span> his work on optical character recognition <span class=\"hljs-keyword\">and<\/span> computer vision using convolutional neural networks (CNN), <span class=\"hljs-keyword\">and<\/span> <span class=\"hljs-keyword\">is<\/span> a founding father of convolutional nets. He <span class=\"hljs-keyword\">is<\/span> also one of the main creators of the DjVu image compression technology (together <span class=\"hljs-keyword\">with<\/span> L\u00e9on Bottou <span class=\"hljs-keyword\">and<\/span> Patrick Haffner). He co-developed the Lush programming language <span class=\"hljs-keyword\">with<\/span> L\u00e9on Bottou.\nLeCun received the <span class=\"hljs-number\">2018<\/span> Turing Award (often referred to <span class=\"hljs-keyword\">as<\/span> the <span class=\"hljs-string\">\"Nobel Prize of Computing\"<\/span>), together <span class=\"hljs-keyword\">with<\/span> Yoshua Bengio <span class=\"hljs-keyword\">and<\/span> Geoffrey Hinton, <span class=\"hljs-keyword\">for<\/span> their work on deep learning.\nThe three are sometimes referred to <span class=\"hljs-keyword\">as<\/span> the <span class=\"hljs-string\">\"Godfathers of AI\"<\/span> <span class=\"hljs-keyword\">and<\/span> <span class=\"hljs-string\">\"Godfathers of Deep Learning\"<\/span>.\n\nPage: LeNet\nSummary: LeNet <span class=\"hljs-keyword\">is<\/span> a convolutional neural network structure proposed by LeCun et al. <span class=\"hljs-keyword\">in<\/span> <span class=\"hljs-number\">1998<\/span>,. In general, LeNet refers to LeNet-<span class=\"hljs-number\">5<\/span> <span class=\"hljs-keyword\">and<\/span> <span class=\"hljs-keyword\">is<\/span> a simple convolutional neural network. 
Convolutional neural networks are a kind of feed-forward neural network whose artificial neurons can respond to a part of the surrounding cells <span class=\"hljs-keyword\">in<\/span> the coverage <span class=\"hljs-built_in\">range<\/span> <span class=\"hljs-keyword\">and<\/span> perform well <span class=\"hljs-keyword\">in<\/span> large-scale image processing.\n\n\n\nPage: Geoffrey Hinton\nSummary: Geoffrey Everest Hinton  (born <span class=\"hljs-number\">6<\/span> December <span class=\"hljs-number\">1947<\/span>) <span class=\"hljs-keyword\">is<\/span> a British-Canadian cognitive psychologist <span class=\"hljs-keyword\">and<\/span> computer scientist, most noted <span class=\"hljs-keyword\">for<\/span> his work on artificial neural networks. From <span class=\"hljs-number\">2013<\/span> to <span class=\"hljs-number\">2023<\/span>, he divided his time working <span class=\"hljs-keyword\">for<\/span> Google (Google Brain) <span class=\"hljs-keyword\">and<\/span> the University of Toronto, before publicly announcing his departure <span class=\"hljs-keyword\">from<\/span> Google <span class=\"hljs-keyword\">in<\/span> May <span class=\"hljs-number\">2023<\/span> citing concerns about the risks of artificial intelligence (AI) technology. In <span class=\"hljs-number\">2017<\/span>, he co-founded <span class=\"hljs-keyword\">and<\/span> became the chief scientific advisor of the Vector Institute <span class=\"hljs-keyword\">in<\/span> Toronto.With David Rumelhart <span class=\"hljs-keyword\">and<\/span> Ronald J. Williams, Hinton was co-author of a highly cited paper published <span class=\"hljs-keyword\">in<\/span> <span class=\"hljs-number\">1986<\/span> that popularised the backpropagation algorithm <span class=\"hljs-keyword\">for<\/span> training multi-layer neural networks, although they were <span class=\"hljs-keyword\">not<\/span> the first to propose the approach. 
Hinton <span class=\"hljs-keyword\">is<\/span> viewed <span class=\"hljs-keyword\">as<\/span> a leading figure <span class=\"hljs-keyword\">in<\/span> the deep learning community. The dramatic image-recognition milestone of the AlexNet designed <span class=\"hljs-keyword\">in<\/span> collaboration <span class=\"hljs-keyword\">with<\/span> his students Alex Krizhevsky <span class=\"hljs-keyword\">and<\/span> Ilya Sutskever <span class=\"hljs-keyword\">for<\/span> the ImageNet challenge <span class=\"hljs-number\">2012<\/span> was a breakthrough <span class=\"hljs-keyword\">in<\/span> the field of computer vision.Hinton received the <span class=\"hljs-number\">2018<\/span> Turing Award (often referred to <span class=\"hljs-keyword\">as<\/span> the <span class=\"hljs-string\">\"Nobel Prize of Computing\"<\/span>), together <span class=\"hljs-keyword\">with<\/span> Yoshua Bengio <span class=\"hljs-keyword\">and<\/span> Yann LeCun, <span class=\"hljs-keyword\">for<\/span> their work on deep learning. 
They are sometimes referred to <span class=\"hljs-keyword\">as<\/span> the <span class=\"hljs-string\">\"Godfathers of Deep Learning\"<\/span>, <span class=\"hljs-keyword\">and<\/span> have continued to give public talks together.In May <span class=\"hljs-number\">2023<\/span>, Hinton announced his resignation <span class=\"hljs-keyword\">from<\/span> Google to be able to <span class=\"hljs-string\">\"freely speak out about the risks of A.I.\"<\/span> He has voiced concerns about deliberate misuse by malicious actors, technological unemployment, <span class=\"hljs-keyword\">and<\/span> existential risk <span class=\"hljs-keyword\">from<\/span> artificial general intelligence.\n\n\nThought:I now know the final answer\nFinal Answer: Yann LeCun <span class=\"hljs-keyword\">is<\/span> a French computer scientist <span class=\"hljs-keyword\">and<\/span> Turing Award winner who <span class=\"hljs-keyword\">is<\/span> well known <span class=\"hljs-keyword\">for<\/span> his work on optical character recognition <span class=\"hljs-keyword\">and<\/span> computer vision using convolutional neural networks. He was born <span class=\"hljs-keyword\">in<\/span> <span class=\"hljs-number\">1960<\/span> <span class=\"hljs-keyword\">and<\/span> did his post-doc at the University of Toronto <span class=\"hljs-keyword\">with<\/span> Geoffrey Hinton. 
He <span class=\"hljs-keyword\">is<\/span> one of the <span class=\"hljs-string\">\"Godfathers of AI\"<\/span> <span class=\"hljs-keyword\">and<\/span> <span class=\"hljs-string\">\"Godfathers of Deep Learning\"<\/span> alongside Yoshua Bengio <span class=\"hljs-keyword\">and<\/span> Geoffrey Hinton.\n\n&gt; Finished chain.\nYann LeCun <span class=\"hljs-keyword\">is<\/span> a French computer scientist <span class=\"hljs-keyword\">and<\/span> Turing Award winner who <span class=\"hljs-keyword\">is<\/span> well known <span class=\"hljs-keyword\">for<\/span> his work on optical character recognition <span class=\"hljs-keyword\">and<\/span> computer vision using convolutional neural networks. He was born <span class=\"hljs-keyword\">in<\/span> <span class=\"hljs-number\">1960<\/span> <span class=\"hljs-keyword\">and<\/span> did his post-doc at the University of Toronto <span class=\"hljs-keyword\">with<\/span> Geoffrey Hinton. He <span class=\"hljs-keyword\">is<\/span> one of the <span class=\"hljs-string\">\"Godfathers of AI\"<\/span> <span class=\"hljs-keyword\">and<\/span> <span class=\"hljs-string\">\"Godfathers of Deep Learning\"<\/span> alongside Yoshua Bengio <span class=\"hljs-keyword\">and<\/span> Geoffrey Hinton.<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">agent_chain.run(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"Where did he go for his post-doc?\"<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">&gt; Entering new AgentExecutor chain...\nThought: Yann LeCun did his post-doc at the University of Toronto with Geoffrey Hinton.\nAction: Wikipedia\nAction Input: Yann LeCun\nObservation: Page: Yann LeCun\nSummary: Yann Andr\u00e9 LeCun ( l\u0259-KUN, French: [l\u0259k\u0153\u0303]; 
originally spelled Le Cun; born 8 July 1960) is a  Turing Award winning French computer scientist working primarily in the fields of machine learning, computer vision, mobile robotics and computational neuroscience. He is the Silver Professor of the Courant Institute of Mathematical Sciences at New York University and Vice-President, Chief AI Scientist at Meta.He is well known for his work on optical character recognition and computer vision using convolutional neural networks (CNN), and is a founding father of convolutional nets. He is also one of the main creators of the DjVu image compression technology (together with L\u00e9on Bottou and Patrick Haffner). He co-developed the Lush programming language with L\u00e9on Bottou.\nLeCun received the 2018 Turing Award (often referred to as the \"Nobel Prize of Computing\"), together with Yoshua Bengio and Geoffrey Hinton, for their work on deep learning.\nThe three are sometimes referred to as the \"Godfathers of AI\" and \"Godfathers of Deep Learning\".\n\nPage: LeNet\nSummary: LeNet is a convolutional neural network structure proposed by LeCun et al. in 1998,. In general, LeNet refers to LeNet-5 and is a simple convolutional neural network. Convolutional neural networks are a kind of feed-forward neural network whose artificial neurons can respond to a part of the surrounding cells in the coverage range and perform well in large-scale image processing.\n\n\n\nPage: Geoffrey Hinton\nSummary: Geoffrey Everest Hinton  (born 6 December 1947) is a British-Canadian cognitive psychologist and computer scientist, most noted for his work on artificial neural networks. From 2013 to 2023, he divided his time working for Google (Google Brain) and the University of Toronto, before publicly announcing his departure from Google in May 2023 citing concerns about the risks of artificial intelligence (AI) technology. 
In 2017, he co-founded and became the chief scientific advisor of the Vector Institute in Toronto.With David Rumelhart and Ronald J. Williams, Hinton was co-author of a highly cited paper published in 1986 that popularised the backpropagation algorithm for training multi-layer neural networks, although they were not the first to propose the approach. Hinton is viewed as a leading figure in the deep learning community. The dramatic image-recognition milestone of the AlexNet designed in collaboration with his students Alex Krizhevsky and Ilya Sutskever for the ImageNet challenge 2012 was a breakthrough in the field of computer vision.Hinton received the 2018 Turing Award (often referred to as the \"Nobel Prize of Computing\"), together with Yoshua Bengio and Yann LeCun, for their work on deep learning. They are sometimes referred to as the \"Godfathers of Deep Learning\", and have continued to give public talks together.In May 2023, Hinton announced his resignation from Google to be able to \"freely speak out about the risks of A.I.\" He has voiced concerns about deliberate misuse by malicious actors, technological unemployment, and existential risk from artificial general intelligence.\n\n\nThought:I now know the final answer.\nFinal Answer: Yann LeCun did his post-doc at the University of Toronto with Geoffrey Hinton.\n\n&gt; Finished chain.\nYann LeCun did his post-doc at the University of Toronto with Geoffrey Hinton.<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">agent_chain.run(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"Search for current temperature there on September 8, 2023?\"<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"plaintext\"><span class=\"pre--content\">&gt; Entering new AgentExecutor chain...\nThought: I need 
to find the current temperature in Toronto on September 8, 2023\nAction: DuckDuckGo Search\nAction Input: \"Current temperature in Toronto on September 8, 2023\"\nObservation: DuckDuckGo Search is not a valid tool, try one of [duckduckgo_search, Calculator, Wikipedia].\nThought:I need to use duckduckgo_search to find the current temperature in Toronto on September 8, 2023\nAction: duckduckgo_search\nAction Input: \"Current temperature in Toronto on September 8, 2023\"\nObservation: Current Weather for Popular Cities . San Francisco, CA 59 \u00b0 F Fair; Manhattan, NY warning 79 \u00b0 F Fair; Schiller Park, IL (60176) warning 65 \u00b0 F Cloudy; Boston, MA 81 \u00b0 F Clear; Houston, TX ... Posted September 7, 2023 4:21 pm 17:37 Global News at 6 Toronto: September 3, 2023 WATCH: It's not exactly the September weather we're accustomed to but it is here \u2014 searing hot... The hot weather blast will only increase in intensity on Monday and Tuesday, when it's expected to feel like 37 C and 39 C, respectively. Based on the weather agency's 14-day forecast ... Temperatures will continue to fall as more seasonal conditions return heading into the weekend. It's a change after a cooler-than-usual August. There were no days in which the temperature reached more than 30 C last month, compared to five days of 30-plus temperatures in July. The last time Toronto saw a 35 degree day was in June 2022. 
Toronto weather in September 2023 30 days Aug Sep Oct Nov Dec Jan Feb Mar Apr May Jun Jul Mon Tue Wed Thu Fri Sat Sun 28 August +25\u00b0+18\u00b0 29 August +25\u00b0+16\u00b0 30 August +27\u00b0+17\u00b0 31 August +26\u00b0+19\u00b0 1 September +25\u00b0+18\u00b0 2 September +25\u00b0+19\u00b0 3 September +25\u00b0+19\u00b0 4 September +24\u00b0+17\u00b0 5 September +23\u00b0+16\u00b0\nThought:I now know the final answer\nFinal Answer: The current temperature in Toronto on September 8, 2023 is expected to be around 25\u00b0C.\n\n&gt; Finished chain.\nThe current temperature in Toronto on September 8, 2023 is expected to be around 25\u00b0C.<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">agent_chain.run(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"What's the difference between the current temperature there and in Winnipeg, MB?\"<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">&gt; Entering new AgentExecutor chain...\nThought: I need to find the current temperature <span class=\"hljs-keyword\">in<\/span> both Toronto <span class=\"hljs-keyword\">and<\/span> Winnipeg\nAction: duckduckgo_search\nAction Input: current temperature <span class=\"hljs-keyword\">in<\/span> Toronto\nObservation: Current weather <span class=\"hljs-keyword\">in<\/span> Toronto, Ontario, Canada. Check current conditions <span class=\"hljs-keyword\">in<\/span> Toronto, Ontario, Canada <span class=\"hljs-keyword\">with<\/span> radar, hourly, <span class=\"hljs-keyword\">and<\/span> more. 
Find the most current <span class=\"hljs-keyword\">and<\/span> reliable <span class=\"hljs-number\">7<\/span> day weather forecasts, storm alerts, reports <span class=\"hljs-keyword\">and<\/span> information <span class=\"hljs-keyword\">for<\/span> [city] <span class=\"hljs-keyword\">with<\/span> The Weather Network. Feels Like: <span class=\"hljs-number\">92<\/span> \u00b0F. Forecast: <span class=\"hljs-number\">85<\/span> \/ <span class=\"hljs-number\">72<\/span> \u00b0F. Wind: <span class=\"hljs-number\">10<\/span> mph \u2191 <span class=\"hljs-keyword\">from<\/span> South. Location: Toronto Pearson International Airport. Current Time: Sep <span class=\"hljs-number\">6<\/span>, <span class=\"hljs-number\">2023<\/span> at <span class=\"hljs-number\">4<\/span>:05:<span class=\"hljs-number\">54<\/span> pm. Latest Report: Sep <span class=\"hljs-number\">6<\/span>, <span class=\"hljs-number\">2023<\/span> at <span class=\"hljs-number\">3<\/span>:<span class=\"hljs-number\">00<\/span> pm. Current Conditions Past <span class=\"hljs-number\">24<\/span> hours Weather Radar Satellite Lightning <span class=\"hljs-number\">27<\/span>\u00b0C \u00b0C \u00b0F Observed at: Toronto Pearson Int<span class=\"hljs-string\">'l Airport Date: 10:00 PM EDT Tuesday 5 September 2023 Condition: Clear Pressure: 101.4 kPa Tendency: Steady Temperature: 26.6\u00b0C Dew point: 20.8\u00b0C Humidity: 71% Wind: WSW 14 km\/h Humidex: 35 Visibility: 24 km Forecast Hourly Forecast Temperatures will continue to fall as more seasonal conditions return heading into the weekend. It'<\/span>s a change after a cooler-than-usual August. There were no days <span class=\"hljs-keyword\">in<\/span> which the temperature reached more than <span class=\"hljs-number\">30<\/span> C last month, compared to five days of <span class=\"hljs-number\">30<\/span>-plus temperatures <span class=\"hljs-keyword\">in<\/span> July. 
The last time Toronto saw a <span class=\"hljs-number\">35<\/span> degree day was <span class=\"hljs-keyword\">in<\/span> June <span class=\"hljs-number\">2022.<\/span>\nThought:I now need to find the current temperature <span class=\"hljs-keyword\">in<\/span> Winnipeg\nAction: duckduckgo_search\nAction Input: current temperature <span class=\"hljs-keyword\">in<\/span> Winnipeg\nObservation: <span class=\"hljs-number\">21<\/span>\u00b0C \u00b0C \u00b0F Observed at: Winnipeg Richardson Int<span class=\"hljs-string\">'l Airport Date: 10:00 PM CDT Monday 4 September 2023 Condition: Mostly Cloudy Pressure: 100.1 kPa Tendency: Rising Temperature: 21.1\u00b0C Dew point: 17.8\u00b0C Humidity: 81% Wind: NNW 20 km\/h Humidex: 27 Visibility: 18 km Forecast Hourly Forecast Air Quality Alerts Jet Stream Tonight 24\u00b0C \u00b0C \u00b0F Observed at: Winnipeg Richardson Int'<\/span>l Airport Date: <span class=\"hljs-number\">8<\/span>:<span class=\"hljs-number\">00<\/span> PM CDT Sunday <span class=\"hljs-number\">3<\/span> September <span class=\"hljs-number\">2023<\/span> Condition: Mainly Sunny Pressure: <span class=\"hljs-number\">100.6<\/span> kPa Tendency: Falling Temperature: <span class=\"hljs-number\">23.8<\/span>\u00b0C Dew point: <span class=\"hljs-number\">14.4<\/span>\u00b0C Humidity: <span class=\"hljs-number\">56<\/span>% Wind: NE <span class=\"hljs-number\">11<\/span> km\/h Humidex: <span class=\"hljs-number\">27<\/span> Visibility: <span class=\"hljs-number\">16<\/span> km Forecast Hourly Forecast Air Quality Alerts Jet Stream Tonight Find the most current <span class=\"hljs-keyword\">and<\/span> reliable <span class=\"hljs-number\">7<\/span> day weather forecasts, storm alerts, reports <span class=\"hljs-keyword\">and<\/span> information <span class=\"hljs-keyword\">for<\/span> [city] <span class=\"hljs-keyword\">with<\/span> The Weather Network. 
Winnipeg, Manitoba, Canada Weather Forecast, <span class=\"hljs-keyword\">with<\/span> current conditions, wind, air quality, <span class=\"hljs-keyword\">and<\/span> what to expect <span class=\"hljs-keyword\">for<\/span> the <span class=\"hljs-built_in\">next<\/span> <span class=\"hljs-number\">3<\/span> days. Current weather <span class=\"hljs-keyword\">in<\/span> Winnipeg <span class=\"hljs-keyword\">and<\/span> forecast <span class=\"hljs-keyword\">for<\/span> today, tomorrow, <span class=\"hljs-keyword\">and<\/span> <span class=\"hljs-built_in\">next<\/span> <span class=\"hljs-number\">14<\/span> days\nThought:I now know the final answer\nFinal Answer: The difference <span class=\"hljs-keyword\">in<\/span> temperature between Toronto <span class=\"hljs-keyword\">and<\/span> Winnipeg <span class=\"hljs-keyword\">is<\/span> <span class=\"hljs-number\">5<\/span>\u00b0C.\n\n&gt; Finished chain.\nThe difference <span class=\"hljs-keyword\">in<\/span> temperature between Toronto <span class=\"hljs-keyword\">and<\/span> Winnipeg <span class=\"hljs-keyword\">is<\/span> <span class=\"hljs-number\">5<\/span>\u00b0C.<\/span><\/pre>\n<h3 class=\"graf graf--h3\">Compare the above with a memoryless Agent<\/h3>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\"><span class=\"hljs-keyword\">from<\/span> langchain.agents <span class=\"hljs-keyword\">import<\/span> ZeroShotAgent, AgentExecutor, load_tools\n<span class=\"hljs-keyword\">from<\/span> langchain.chains <span class=\"hljs-keyword\">import<\/span> LLMChain\n<span class=\"hljs-keyword\">from<\/span> langchain.llms <span class=\"hljs-keyword\">import<\/span> OpenAI\n\nprefix = <span class=\"hljs-string\">\"\"\"\nYou're having a conversation with a human. You're helpful and answering\nquestions to your maximum ability. You also speak using British slang.\nLike you're from the show TopBoy. 
You have access to the following tools:\n\"\"\"<\/span>\n\nsuffix = <span class=\"hljs-string\">\"\"\"Let's Go!\"\n\nQuestion: {input}\n\n{agent_scratchpad}\n\"\"\"<\/span>\n\nllm = OpenAI(temperature=<span class=\"hljs-number\">0.0<\/span>)\n\ntools = load_tools([<span class=\"hljs-string\">\"ddg-search\"<\/span>, <span class=\"hljs-string\">\"llm-math\"<\/span>, <span class=\"hljs-string\">\"wikipedia\"<\/span>], llm=llm)\n\nprompt = ZeroShotAgent.create_prompt(tools,\n                                     prefix=prefix,\n                                     suffix=suffix,\n                                     input_variables=[<span class=\"hljs-string\">\"input\"<\/span>, <span class=\"hljs-string\">\"agent_scratchpad\"<\/span>])\n\nllm_chain = LLMChain(llm=llm,\n                     prompt=prompt)\n\nagent = ZeroShotAgent(llm_chain=llm_chain,\n                      tools=tools,\n                      verbose=<span class=\"hljs-literal\">True<\/span>)\n\nagent_without_memory = AgentExecutor.from_agent_and_tools(\n    agent=agent,\n    tools=tools,\n    verbose=<span class=\"hljs-literal\">True<\/span>\n)\n\nagent_without_memory.run(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"Who is Yann LeCun and where was he born?\"<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">&gt; Entering new AgentExecutor chain...\nThought: I need to find out who Yann LeCun <span class=\"hljs-keyword\">is<\/span> <span class=\"hljs-keyword\">and<\/span> where he was born.\nAction: Wikipedia\nAction Input: Yann LeCun\nObservation: Page: Yann LeCun\nSummary: Yann Andr\u00e9 LeCun ( l\u0259-KUN, French: [l\u0259k\u0153\u0303]; originally spelled Le Cun; born <span class=\"hljs-number\">8<\/span> July <span class=\"hljs-number\">1960<\/span>) <span class=\"hljs-keyword\">is<\/span> a  Turing Award winning French computer scientist working 
primarily <span class=\"hljs-keyword\">in<\/span> the fields of machine learning, computer vision, mobile robotics <span class=\"hljs-keyword\">and<\/span> computational neuroscience. He <span class=\"hljs-keyword\">is<\/span> the Silver Professor of the Courant Institute of Mathematical Sciences at New York University <span class=\"hljs-keyword\">and<\/span> Vice-President, Chief AI Scientist at Meta.He <span class=\"hljs-keyword\">is<\/span> well known <span class=\"hljs-keyword\">for<\/span> his work on optical character recognition <span class=\"hljs-keyword\">and<\/span> computer vision using convolutional neural networks (CNN), <span class=\"hljs-keyword\">and<\/span> <span class=\"hljs-keyword\">is<\/span> a founding father of convolutional nets. He <span class=\"hljs-keyword\">is<\/span> also one of the main creators of the DjVu image compression technology (together <span class=\"hljs-keyword\">with<\/span> L\u00e9on Bottou <span class=\"hljs-keyword\">and<\/span> Patrick Haffner). He co-developed the Lush programming language <span class=\"hljs-keyword\">with<\/span> L\u00e9on Bottou.\nLeCun received the <span class=\"hljs-number\">2018<\/span> Turing Award (often referred to <span class=\"hljs-keyword\">as<\/span> the <span class=\"hljs-string\">\"Nobel Prize of Computing\"<\/span>), together <span class=\"hljs-keyword\">with<\/span> Yoshua Bengio <span class=\"hljs-keyword\">and<\/span> Geoffrey Hinton, <span class=\"hljs-keyword\">for<\/span> their work on deep learning.\nThe three are sometimes referred to <span class=\"hljs-keyword\">as<\/span> the <span class=\"hljs-string\">\"Godfathers of AI\"<\/span> <span class=\"hljs-keyword\">and<\/span> <span class=\"hljs-string\">\"Godfathers of Deep Learning\"<\/span>.\n\nPage: LeNet\nSummary: LeNet <span class=\"hljs-keyword\">is<\/span> a convolutional neural network structure proposed by LeCun et al. <span class=\"hljs-keyword\">in<\/span> <span class=\"hljs-number\">1998<\/span>,. 
In general, LeNet refers to LeNet-<span class=\"hljs-number\">5<\/span> <span class=\"hljs-keyword\">and<\/span> <span class=\"hljs-keyword\">is<\/span> a simple convolutional neural network. Convolutional neural networks are a kind of feed-forward neural network whose artificial neurons can respond to a part of the surrounding cells <span class=\"hljs-keyword\">in<\/span> the coverage <span class=\"hljs-built_in\">range<\/span> <span class=\"hljs-keyword\">and<\/span> perform well <span class=\"hljs-keyword\">in<\/span> large-scale image processing.\n\n\n\nPage: Geoffrey Hinton\nSummary: Geoffrey Everest Hinton  (born <span class=\"hljs-number\">6<\/span> December <span class=\"hljs-number\">1947<\/span>) <span class=\"hljs-keyword\">is<\/span> a British-Canadian cognitive psychologist <span class=\"hljs-keyword\">and<\/span> computer scientist, most noted <span class=\"hljs-keyword\">for<\/span> his work on artificial neural networks. From <span class=\"hljs-number\">2013<\/span> to <span class=\"hljs-number\">2023<\/span>, he divided his time working <span class=\"hljs-keyword\">for<\/span> Google (Google Brain) <span class=\"hljs-keyword\">and<\/span> the University of Toronto, before publicly announcing his departure <span class=\"hljs-keyword\">from<\/span> Google <span class=\"hljs-keyword\">in<\/span> May <span class=\"hljs-number\">2023<\/span> citing concerns about the risks of artificial intelligence (AI) technology. In <span class=\"hljs-number\">2017<\/span>, he co-founded <span class=\"hljs-keyword\">and<\/span> became the chief scientific advisor of the Vector Institute <span class=\"hljs-keyword\">in<\/span> Toronto.With David Rumelhart <span class=\"hljs-keyword\">and<\/span> Ronald J. 
Williams, Hinton was co-author of a highly cited paper published <span class=\"hljs-keyword\">in<\/span> <span class=\"hljs-number\">1986<\/span> that popularised the backpropagation algorithm <span class=\"hljs-keyword\">for<\/span> training multi-layer neural networks, although they were <span class=\"hljs-keyword\">not<\/span> the first to propose the approach. Hinton <span class=\"hljs-keyword\">is<\/span> viewed <span class=\"hljs-keyword\">as<\/span> a leading figure <span class=\"hljs-keyword\">in<\/span> the deep learning community. The dramatic image-recognition milestone of the AlexNet designed <span class=\"hljs-keyword\">in<\/span> collaboration <span class=\"hljs-keyword\">with<\/span> his students Alex Krizhevsky <span class=\"hljs-keyword\">and<\/span> Ilya Sutskever <span class=\"hljs-keyword\">for<\/span> the ImageNet challenge <span class=\"hljs-number\">2012<\/span> was a breakthrough <span class=\"hljs-keyword\">in<\/span> the field of computer vision.Hinton received the <span class=\"hljs-number\">2018<\/span> Turing Award (often referred to <span class=\"hljs-keyword\">as<\/span> the <span class=\"hljs-string\">\"Nobel Prize of Computing\"<\/span>), together <span class=\"hljs-keyword\">with<\/span> Yoshua Bengio <span class=\"hljs-keyword\">and<\/span> Yann LeCun, <span class=\"hljs-keyword\">for<\/span> their work on deep learning. 
They are sometimes referred to <span class=\"hljs-keyword\">as<\/span> the <span class=\"hljs-string\">\"Godfathers of Deep Learning\"<\/span>, <span class=\"hljs-keyword\">and<\/span> have continued to give public talks together.In May <span class=\"hljs-number\">2023<\/span>, Hinton announced his resignation <span class=\"hljs-keyword\">from<\/span> Google to be able to <span class=\"hljs-string\">\"freely speak out about the risks of A.I.\"<\/span> He has voiced concerns about deliberate misuse by malicious actors, technological unemployment, <span class=\"hljs-keyword\">and<\/span> existential risk <span class=\"hljs-keyword\">from<\/span> artificial general intelligence.\n\n\nThought:I now know the final answer.\n\nFinal Answer: Yann LeCun <span class=\"hljs-keyword\">is<\/span> a French computer scientist <span class=\"hljs-keyword\">and<\/span> Turing Award winner working primarily <span class=\"hljs-keyword\">in<\/span> the fields of machine learning, computer vision, mobile robotics <span class=\"hljs-keyword\">and<\/span> computational neuroscience. He was born <span class=\"hljs-keyword\">in<\/span> <span class=\"hljs-number\">1960<\/span> <span class=\"hljs-keyword\">in<\/span> France. He <span class=\"hljs-keyword\">is<\/span> one of the <span class=\"hljs-string\">\"Godfathers of AI\"<\/span> <span class=\"hljs-keyword\">and<\/span> <span class=\"hljs-string\">\"Godfathers of Deep Learning\"<\/span> alongside Yoshua Bengio <span class=\"hljs-keyword\">and<\/span> Geoffrey Hinton.\n\n&gt; Finished chain.\nYann LeCun <span class=\"hljs-keyword\">is<\/span> a French computer scientist <span class=\"hljs-keyword\">and<\/span> Turing Award winner working primarily <span class=\"hljs-keyword\">in<\/span> the fields of machine learning, computer vision, mobile robotics <span class=\"hljs-keyword\">and<\/span> computational neuroscience. 
He was born <span class=\"hljs-keyword\">in<\/span> <span class=\"hljs-number\">1960<\/span> <span class=\"hljs-keyword\">in<\/span> France. He <span class=\"hljs-keyword\">is<\/span> one of the <span class=\"hljs-string\">\"Godfathers of AI\"<\/span> <span class=\"hljs-keyword\">and<\/span> <span class=\"hljs-string\">\"Godfathers of Deep Learning\"<\/span> alongside Yoshua Bengio <span class=\"hljs-keyword\">and<\/span> Geoffrey Hinton.<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">agent_without_memory.run(<span class=\"hljs-built_in\">input<\/span>=<span class=\"hljs-string\">\"Where did he go for his post-doc?\"<\/span>)<\/span><\/pre>\n<pre class=\"graf graf--pre graf--preV2\" spellcheck=\"false\" data-code-block-mode=\"2\" data-code-block-lang=\"python\"><span class=\"pre--content\">&gt; Entering new AgentExecutor chain...\nThought: I need to find out where he went <span class=\"hljs-keyword\">for<\/span> his post-doc\nAction: Wikipedia\nAction Input: <span class=\"hljs-string\">\"post-doc\"<\/span>\nObservation: Page: Postdoctoral researcher\nSummary: A postdoctoral fellow, postdoctoral researcher, <span class=\"hljs-keyword\">or<\/span> simply postdoc, <span class=\"hljs-keyword\">is<\/span> a person professionally conducting research after the completion of their doctoral studies (typically a PhD). Postdocs most commonly, but <span class=\"hljs-keyword\">not<\/span> always, have a temporary academic appointment, sometimes <span class=\"hljs-keyword\">in<\/span> preparation <span class=\"hljs-keyword\">for<\/span> an academic faculty position. According to data <span class=\"hljs-keyword\">from<\/span> the US National science foundation. 
The number of holders of PhD <span class=\"hljs-keyword\">in<\/span> Biological sciences who end up <span class=\"hljs-keyword\">in<\/span> tenure track has consistently dropped <span class=\"hljs-keyword\">in<\/span> the last decades <span class=\"hljs-keyword\">from<\/span> over <span class=\"hljs-number\">50<\/span>% <span class=\"hljs-keyword\">in<\/span> the 1970s to contemporary levels of <span class=\"hljs-number\">20<\/span>% [direct reference needed]. They <span class=\"hljs-keyword\">continue<\/span> their studies <span class=\"hljs-keyword\">or<\/span> carry out research <span class=\"hljs-keyword\">and<\/span> further increase expertise <span class=\"hljs-keyword\">in<\/span> a specialist subject, including integrating a team <span class=\"hljs-keyword\">and<\/span> acquiring novel skills <span class=\"hljs-keyword\">and<\/span> research methods. Postdoctoral research <span class=\"hljs-keyword\">is<\/span> often considered essential <span class=\"hljs-keyword\">while<\/span> advancing the scholarly mission of the host institution; it <span class=\"hljs-keyword\">is<\/span> expected to produce relevant publications <span class=\"hljs-keyword\">in<\/span> peer-reviewed academic journals <span class=\"hljs-keyword\">or<\/span> conferences. In some countries, postdoctoral research may lead to further formal qualifications <span class=\"hljs-keyword\">or<\/span> certification, <span class=\"hljs-keyword\">while<\/span> <span class=\"hljs-keyword\">in<\/span> other countries, it does <span class=\"hljs-keyword\">not<\/span>.Postdoctoral research may be funded through an appointment <span class=\"hljs-keyword\">with<\/span> a salary <span class=\"hljs-keyword\">or<\/span> an appointment <span class=\"hljs-keyword\">with<\/span> a stipend <span class=\"hljs-keyword\">or<\/span> sponsorship award. 
Appointments <span class=\"hljs-keyword\">for<\/span> such a research position may be called postdoctoral research fellow, postdoctoral research associate, <span class=\"hljs-keyword\">or<\/span> postdoctoral research assistant. Postdoctoral researchers typically work under the supervision of a principal investigator. In many English-speaking countries, postdoctoral researchers are colloquially referred to <span class=\"hljs-keyword\">as<\/span> <span class=\"hljs-string\">\"postdocs\"<\/span>.\n\nPage: Doc Martin\nSummary: Doc Martin <span class=\"hljs-keyword\">is<\/span> a British medical comedy-drama television series starring Martin Clunes <span class=\"hljs-keyword\">as<\/span> Doctor Martin Ellingham. It was created by Dominic Minghella developing the character of Dr Martin Bamford <span class=\"hljs-keyword\">from<\/span> the <span class=\"hljs-number\">2000<\/span> comedy film Saving Grace. The programme <span class=\"hljs-keyword\">is<\/span> <span class=\"hljs-built_in\">set<\/span> <span class=\"hljs-keyword\">in<\/span> the fictional seaside village of Portwenn <span class=\"hljs-keyword\">and<\/span> filmed on location <span class=\"hljs-keyword\">in<\/span> the village of Port Isaac, Cornwall, United Kingdom, <span class=\"hljs-keyword\">with<\/span> most interior scenes shot <span class=\"hljs-keyword\">in<\/span> a converted local barn. 
Fern Cottage <span class=\"hljs-keyword\">is<\/span> used <span class=\"hljs-keyword\">as<\/span> the home <span class=\"hljs-keyword\">and<\/span> surgery of Doctor Ellingham.\nNine series aired between <span class=\"hljs-number\">2004<\/span> <span class=\"hljs-keyword\">and<\/span> <span class=\"hljs-number\">2019<\/span>, <span class=\"hljs-keyword\">with<\/span> a television film airing on Christmas Day <span class=\"hljs-keyword\">in<\/span> <span class=\"hljs-number\">2006.<\/span> The ninth series aired on ITV premiered <span class=\"hljs-keyword\">in<\/span> September <span class=\"hljs-number\">2019.<\/span> The tenth (<span class=\"hljs-keyword\">and<\/span> final) series aired <span class=\"hljs-keyword\">from<\/span> <span class=\"hljs-number\">7<\/span> September <span class=\"hljs-number\">2022<\/span> to <span class=\"hljs-number\">26<\/span> October <span class=\"hljs-number\">2022<\/span>; one last installment, a Christmas special that aired on <span class=\"hljs-number\">25<\/span> December <span class=\"hljs-number\">2022<\/span>, was the programme<span class=\"hljs-string\">'s final episode. On 29 December 2022 a documentary entitled \u201cFarewell Doc Martin\u201d aired on ITV, featuring behind-the-scenes interviews with the cast and crew as they filmed the final series. It also looked back at highlights from the 18 years of the show.\n\n\nPage: Doc (aircraft)\nSummary: Doc is a Boeing B-29 Superfortress. It is one of two that are currently flying in the world, the other B-29 being FIFI. 
It is owned by Doc'<\/span>s Friends, Inc., a non-profit organization based <span class=\"hljs-keyword\">in<\/span> Wichita, Kansas, United States.\nDoc attends various air shows <span class=\"hljs-keyword\">and<\/span> offers rides.\nThought:I now know that Doc Martin was a British medical comedy-drama television series starring Martin Clunes <span class=\"hljs-keyword\">as<\/span> Doctor Martin Ellingham, <span class=\"hljs-keyword\">and<\/span> that Doc <span class=\"hljs-keyword\">is<\/span> a Boeing B-<span class=\"hljs-number\">29<\/span> Superfortress owned by Doc<span class=\"hljs-string\">'s Friends, Inc.\nFinal Answer: Doc Martin is a British medical comedy-drama television series starring Martin Clunes as Doctor Martin Ellingham, and Doc is a Boeing B-29 Superfortress owned by Doc'<\/span>s Friends, Inc.\n\n&gt; Finished chain.\nDoc Martin <span class=\"hljs-keyword\">is<\/span> a British medical comedy-drama television series starring Martin Clunes <span class=\"hljs-keyword\">as<\/span> Doctor Martin Ellingham, <span class=\"hljs-keyword\">and<\/span> Doc <span class=\"hljs-keyword\">is<\/span> a Boeing B-<span class=\"hljs-number\">29<\/span> Superfortress owned by Doc<span class=\"hljs-string\">'s Friends, Inc.<\/span><\/span><\/pre>\n<h3 class=\"graf graf--h3\">Conclusion<\/h3>\n<p class=\"graf graf--p\">Notice what happened in that last run: without memory, the agent could not resolve \u201che\u201d in the follow-up question, searched for \u201cpost-doc\u201d in isolation, and wandered off to Doc Martin. That failure is precisely why memory plays a pivotal role in LangChain agents.<\/p>\n<p class=\"graf graf--p\">By integrating memory, agents go beyond one-off task execution and become capable conversationalists, able to recall past interactions and use them for context-aware responses and improved decision-making. 
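<\/p>\n<p class=\"graf graf--p\">The gap can be sketched without any framework. Below is a hypothetical, minimal illustration (the names BufferMemory and AgentWithMemory are illustrative, not LangChain\u2019s API): the agent prepends the running chat history to every prompt, which is the role ConversationBufferMemory and a {chat_history} prompt variable play in a real LangChain agent.<\/p>

```python
# Hypothetical, framework-free sketch of what giving an agent memory does:
# every new question is answered with the running chat history prepended,
# so a follow-up like "Where did he do his post-doc?" can resolve "he".
# BufferMemory and AgentWithMemory are illustrative names, not LangChain's API.

class BufferMemory:
    """Stores every (human, ai) turn and renders them into the next prompt."""

    def __init__(self):
        self.turns = []

    def render(self):
        return "\n".join(f"Human: {h}\nAI: {a}" for h, a in self.turns)

    def save(self, human, ai):
        self.turns.append((human, ai))


class AgentWithMemory:
    """Minimal agent loop: prompt = history + question, then record the turn."""

    def __init__(self, llm, memory):
        self.llm = llm        # any callable mapping a prompt string to an answer
        self.memory = memory

    def run(self, question):
        prompt = f"{self.memory.render()}\nQuestion: {question}"
        answer = self.llm(prompt)
        self.memory.save(question, answer)
        return answer


# Stub "LLM" that just reports how much history reached it.
agent = AgentWithMemory(
    llm=lambda p: f"saw {p.count('Human:')} earlier turns",
    memory=BufferMemory(),
)
print(agent.run("Who is Yann LeCun?"))             # saw 0 earlier turns
print(agent.run("Where did he do his post-doc?"))  # saw 1 earlier turns
```

<p class=\"graf graf--p\">With the history in the prompt, a follow-up containing \u201che\u201d can be grounded in the previous Yann LeCun exchange instead of drifting to unrelated \u201cDoc\u201d pages.<\/p>\n<p class=\"graf graf--p\">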
It\u2019s an essential step forward, marking a significant advancement in creating more nuanced and sophisticated AI-driven interactions.<\/p>\n<p class=\"graf graf--p\">As you deploy these agents, remember the power of memory\u200a\u2014\u200ait\u2019s not just about responding but understanding and growing with each interaction.<\/p>\n<\/div>\n<\/div>\n<\/section>\n\n\n\n<section class=\"section section--body\">\n<div class=\"section-divider\">\n<hr class=\"section-divider\">\n<\/div>\n<div class=\"section-content\">\n<div class=\"section-inner sectionLayout--insetColumn\"><\/div>\n<\/div>\n<\/section>\n","protected":false},"excerpt":{"rendered":"<p>Empowering Conversational AI with Contextual Recall Memory in&nbsp;Agents Memory in Agents is an important feature that allows them to retain information from previous interactions and use it to provide more accurate and context-aware responses. By incorporating memory into an Agent, it can remember the history of the conversation and use that information to answer subsequent [&hellip;]<\/p>\n","protected":false},"author":68,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"customer_name":"","customer_description":"","customer_industry":"","customer_technologies":"","customer_logo":"","_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[65,7],"tags":[70,71,52,31,34],"coauthors":[166],"class_list":["post-8277","post","type-post","status-publish","format-standard","hentry","category-llmops","category-tutorials","tag-langchain","tag-language-models","tag-llm","tag-llmops","tag-prompt-engineering"],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v25.9 (Yoast SEO v25.9) - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Contextual Recall in LangChain Agents - Comet<\/title>\n<meta name=\"description\" content=\"A LangChain Agent with Memory is different from a Chain with Memory. 
Learn more about what distinguishes the two in this article.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Contextual Recall in LangChain Agents\" \/>\n<meta property=\"og:description\" content=\"A LangChain Agent with Memory is different from a Chain with Memory. Learn more about what distinguishes the two in this article.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/\" \/>\n<meta property=\"og:site_name\" content=\"Comet\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/cometdotml\" \/>\n<meta property=\"article:published_time\" content=\"2023-11-30T14:50:32+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-04-24T17:04:00+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/cdn-images-1.medium.com\/max\/1600\/0*e8fL__n-4GUQKQXn\" \/>\n<meta name=\"author\" content=\"Harpreet Sahota\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@Cometml\" \/>\n<meta name=\"twitter:site\" content=\"@Cometml\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Harpreet Sahota\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"19 minutes\" \/>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"Contextual Recall in LangChain Agents - Comet","description":"A LangChain Agent with Memory is different from a Chain with Memory. 
Learn more about what distinguishes the two in this article.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/","og_locale":"en_US","og_type":"article","og_title":"Contextual Recall in LangChain Agents","og_description":"A LangChain Agent with Memory is different from a Chain with Memory. Learn more about what distinguishes the two in this article.","og_url":"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/","og_site_name":"Comet","article_publisher":"https:\/\/www.facebook.com\/cometdotml","article_published_time":"2023-11-30T14:50:32+00:00","article_modified_time":"2025-04-24T17:04:00+00:00","og_image":[{"url":"https:\/\/cdn-images-1.medium.com\/max\/1600\/0*e8fL__n-4GUQKQXn","type":"","width":"","height":""}],"author":"Harpreet Sahota","twitter_card":"summary_large_image","twitter_creator":"@Cometml","twitter_site":"@Cometml","twitter_misc":{"Written by":"Harpreet Sahota","Est. 
reading time":"19 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/#article","isPartOf":{"@id":"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/"},"author":{"name":"Harpreet Sahota","@id":"https:\/\/www.comet.com\/site\/#\/schema\/person\/46036ab474aa916e2873daece26a28d6"},"headline":"Contextual Recall in LangChain Agents","datePublished":"2023-11-30T14:50:32+00:00","dateModified":"2025-04-24T17:04:00+00:00","mainEntityOfPage":{"@id":"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/"},"wordCount":722,"publisher":{"@id":"https:\/\/www.comet.com\/site\/#organization"},"image":{"@id":"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/#primaryimage"},"thumbnailUrl":"https:\/\/cdn-images-1.medium.com\/max\/1600\/0*e8fL__n-4GUQKQXn","keywords":["LangChain","Language Models","LLM","LLMOps","Prompt Engineering"],"articleSection":["LLMOps","Tutorials"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/","url":"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/","name":"Contextual Recall in LangChain Agents - Comet","isPartOf":{"@id":"https:\/\/www.comet.com\/site\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/#primaryimage"},"image":{"@id":"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/#primaryimage"},"thumbnailUrl":"https:\/\/cdn-images-1.medium.com\/max\/1600\/0*e8fL__n-4GUQKQXn","datePublished":"2023-11-30T14:50:32+00:00","dateModified":"2025-04-24T17:04:00+00:00","description":"A LangChain Agent with Memory is different from a Chain with Memory. 
Learn more about what distinguishes the two in this article.","breadcrumb":{"@id":"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/#primaryimage","url":"https:\/\/cdn-images-1.medium.com\/max\/1600\/0*e8fL__n-4GUQKQXn","contentUrl":"https:\/\/cdn-images-1.medium.com\/max\/1600\/0*e8fL__n-4GUQKQXn"},{"@type":"BreadcrumbList","@id":"https:\/\/www.comet.com\/site\/blog\/contextual-recall-in-langchain-agents\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.comet.com\/site\/"},{"@type":"ListItem","position":2,"name":"Contextual Recall in LangChain Agents"}]},{"@type":"WebSite","@id":"https:\/\/www.comet.com\/site\/#website","url":"https:\/\/www.comet.com\/site\/","name":"Comet","description":"Build Better Models Faster","publisher":{"@id":"https:\/\/www.comet.com\/site\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.comet.com\/site\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.comet.com\/site\/#organization","name":"Comet ML, Inc.","alternateName":"Comet","url":"https:\/\/www.comet.com\/site\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.comet.com\/site\/#\/schema\/logo\/image\/","url":"https:\/\/www.comet.com\/site\/wp-content\/uploads\/2025\/01\/logo_comet_square.png","contentUrl":"https:\/\/www.comet.com\/site\/wp-content\/uploads\/2025\/01\/logo_comet_square.png","width":310,"height":310,"caption":"Comet ML, 
Inc."},"image":{"@id":"https:\/\/www.comet.com\/site\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/cometdotml","https:\/\/x.com\/Cometml","https:\/\/www.youtube.com\/channel\/UCmN63HKvfXSCS-UwVwmK8Hw"]},{"@type":"Person","@id":"https:\/\/www.comet.com\/site\/#\/schema\/person\/46036ab474aa916e2873daece26a28d6","name":"Harpreet Sahota","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.comet.com\/site\/#\/schema\/person\/image\/2d21512be19ba7e19a71a803309e2a88","url":"https:\/\/secure.gravatar.com\/avatar\/a6ca5a533fc9f143a0a7428037ff652aa0633d66bf27e76ae89b955ae72a0f2d?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/a6ca5a533fc9f143a0a7428037ff652aa0633d66bf27e76ae89b955ae72a0f2d?s=96&d=mm&r=g","caption":"Harpreet Sahota"},"url":"https:\/\/www.comet.com\/site\/blog\/author\/theartistsofdatasciencegmail-com\/"}]}},"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/posts\/8277","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/users\/68"}],"replies":[{"embeddable":true,"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/comments?post=8277"}],"version-history":[{"count":1,"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/posts\/8277\/revisions"}],"predecessor-version":[{"id":15428,"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/posts\/8277\/revisions\/15428"}],"wp:attachment":[{"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/media?parent=8277"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/categories?post=8277"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/t
ags?post=8277"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/www.comet.com\/site\/wp-json\/wp\/v2\/coauthors?post=8277"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}