How Messages Connect to Long-Term Memory

In the previous lesson, you learned that add_message() stores the message in the short-term graph. That is not all it does.

In this lesson, you will learn how every message call automatically seeds the long-term memory layer through entity extraction.

What add_message() does beyond storage

When you call add_message(), the library runs an entity extraction pipeline on the message content. Any named entities the pipeline finds — people, organizations, locations, events, objects — are automatically created or merged as entity nodes in the long-term memory graph.

For example, if the message content is "Review Jessica Norris account for credit limit increase", the extraction pipeline:

  • identifies "Jessica Norris" and creates or merges it as an :EntityPerson node

  • creates a (Message)-[:MENTIONS]→(EntityPerson) relationship linking the message to that node

This is what keeps the three memory layers connected: every conversation message is a potential seed for the knowledge graph.
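The create-or-merge behavior can be sketched with a toy in-memory graph. This is a minimal illustration only: the class, the stand-in extractor, and the data shapes here are assumptions for the sketch, not the library's real API.

```python
class ToyMemoryGraph:
    def __init__(self):
        self.messages = {}
        self.entities = {}   # entity name -> node dict (merged across messages)
        self.edges = []      # (message_id, "MENTIONS", entity_name)

    def extract_entities(self, text):
        # Stand-in for a real NER pipeline; here just a known-name lookup.
        known_people = {"Jessica Norris"}
        return [name for name in known_people if name in text]

    def add_message(self, message_id, content):
        self.messages[message_id] = content
        for name in self.extract_entities(content):
            # Create-or-merge: reuse the entity node if it already exists.
            self.entities.setdefault(name, {"label": "EntityPerson", "name": name})
            self.edges.append((message_id, "MENTIONS", name))

graph = ToyMemoryGraph()
graph.add_message("m1", "Review Jessica Norris account for credit limit increase")
graph.add_message("m2", "Jessica Norris requested a follow-up call")

print(len(graph.entities))  # 1 -- both mentions merged into one entity node
print(len(graph.edges))     # 2 -- one MENTIONS edge per message
```

Note how the second message does not create a duplicate node: the merge step routes every later mention of "Jessica Norris" to the same entity, which is what makes the graph accumulate knowledge instead of fragmenting it.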

How entity extraction enables cross-session retrieval

Without this connection, short-term and long-term memory would be separate silos. With entity extraction in place, a question asked in session one ("What do you know about Jessica Norris?") can retrieve knowledge extracted from sessions two, five, and twelve — even though the user never explicitly stored anything.
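The retrieval pattern above can be illustrated with a toy traversal. The data shapes and the recall() helper are hypothetical, invented for this sketch; the point is only that MENTIONS edges let one entity node fan out to messages from any past session.

```python
# Messages stored across three different sessions, all mentioning one entity.
messages = {
    "m2":  {"session": 2,  "content": "Jessica Norris asked about a credit limit increase"},
    "m5":  {"session": 5,  "content": "Jessica Norris updated her mailing address"},
    "m12": {"session": 12, "content": "Approved the increase for Jessica Norris"},
}
mentions = [("m2", "Jessica Norris"), ("m5", "Jessica Norris"), ("m12", "Jessica Norris")]

def recall(entity_name):
    # Traverse MENTIONS edges back to every message that referenced the
    # entity, regardless of which session the message came from.
    return [(messages[mid]["session"], messages[mid]["content"])
            for mid, name in mentions if name == entity_name]

for session, content in recall("Jessica Norris"):
    print(f"session {session}: {content}")
```

A question asked in a new session only needs the entity name; the traversal surfaces knowledge from sessions 2, 5, and 12 even though none of it was explicitly stored by the user.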

The extraction pipeline — the three stages, merge strategies, and deduplication — is covered in detail in Module 3. For now, the key point is that you do not need to call anything extra: add_message() handles the connection automatically.

Summary

In this lesson, you learned how messages connect to long-term memory:

  • Automatic extraction — add_message() runs entity extraction on every message; no additional call needed

  • Graph connectivity — extracted entities become :Entity nodes linked to the Message through MENTIONS

  • Cross-session knowledge — entities persist after sessions end, so future sessions can retrieve knowledge from past conversations

  • Full pipeline details — the extraction pipeline (spaCy → GLiNER2 → LLM fallback) is covered in Module 3

In the next lesson, you will test your understanding of short-term memory with a short quiz.
