History
Persistent conversation memory that allows agents to remember past interactions across sessions.
History is Peargent’s persistent conversation memory system. It allows agents and pools to remember past interactions across sessions, enabling continuity, context-awareness, and long-running workflows.
Think of history like a notebook your agent writes in. Each message, tool call, and response is recorded so the agent can look back and recall what happened earlier.
You can pass a HistoryConfig to any Agent or Pool. If a pool receives a history, it overrides individual agent histories so all agents share the same conversation thread. History can be stored using backends such as in-memory, file, SQLite, PostgreSQL, Redis, or custom storage backends.
Creating History
To create a history, pass a HistoryConfig to the create_agent or create_pool function. HistoryConfig is the configuration object that controls how an agent's or pool's conversation history is stored and managed.
By default, HistoryConfig uses the InMemory() storage backend (temporary storage; data is lost when the program exits).
Adding history to Agents:
from peargent import create_agent
from peargent.history import HistoryConfig
from peargent.models import openai
agent = create_agent(
    name="Assistant",
    description="Helpful assistant with memory",
    persona="You are a helpful assistant.",
    model=openai("gpt-4o"),
    history=HistoryConfig()
)
# First conversation
agent.run("My name is Alice")
# Later conversation - agent remembers
agent.run("What's my name?")
# Output: "Your name is Alice"

Adding history to Pools:
from peargent import create_pool
from peargent.history import HistoryConfig
pool = create_pool(
    agents=[agent1, agent2],
    history=HistoryConfig()
)
# First conversation
pool.run("My name is Alice")
# Later conversation - the pool remembers
pool.run("What's my name?")
# Output: "Your name is Alice"

How History Works
Load Conversation
When an agent begins a run, it loads the existing conversation thread from the configured storage backend.
Append Messages
Each new user message, tool call, and agent response is added to the conversation thread in order.
Manage Context
If the conversation grows beyond max_context_messages, the configured strategy (trim or summarize) is applied to keep the context window manageable.
Persist Data
All updates are saved back to the storage backend, ensuring the conversation history is retained across sessions and future runs.
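The four steps above form a simple loop. As a plain-Python sketch of the idea (not Peargent's actual implementation; all names here are illustrative):

```python
# Illustrative sketch of the load -> append -> manage -> persist cycle.
# The function and store shapes are hypothetical, not Peargent's real API.

MAX_CONTEXT_MESSAGES = 20

def run_with_history(store, thread_id, user_message, generate_reply):
    # 1. Load the existing conversation thread from the storage backend.
    messages = store.get(thread_id, [])

    # 2. Append the new user message and the agent's response in order.
    messages.append({"role": "user", "content": user_message})
    reply = generate_reply(messages)
    messages.append({"role": "assistant", "content": reply})

    # 3. Manage context: if the thread grows past the limit,
    #    keep only the most recent messages (trim-style).
    if len(messages) > MAX_CONTEXT_MESSAGES:
        messages = messages[-MAX_CONTEXT_MESSAGES:]

    # 4. Persist the updated thread back to the storage backend.
    store[thread_id] = messages
    return reply

store = {}  # stand-in for a storage backend
run_with_history(store, "t1", "My name is Alice", lambda msgs: "Hi Alice!")
```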
Because history supports many advanced capabilities (custom storage backends, manual thread control, serialization, and low-level message operations), listing every option here would make this page too large. For deeper configuration and advanced usage, see Advanced History.
Storage Backends
History can be stored in different backends depending on your use case. Peargent supports the following backends:
from peargent.history import HistoryConfig
from peargent.storage import InMemory, File, Sqlite, Postgresql, Redis
# InMemory (Default)
# - Fast, temporary storage
# - Data is lost when the program exits
history = HistoryConfig(store=InMemory())
# File (JSON files)
# - Stores conversations as JSON on disk
# - Good for local development or small apps
history = HistoryConfig(store=File(storage_dir="./conversations"))
# SQLite (Local database)
# - Reliable, ACID-compliant
# - Ideal for single-server production
history = HistoryConfig(
    store=Sqlite(
        database_path="./chat.db",
        table_prefix="peargent"
    )
)
# PostgreSQL (Production database)
# - Scalable, supports multi-server deployments
history = HistoryConfig(
    store=Postgresql(
        connection_string="postgresql://user:pass@localhost/dbname",
        table_prefix="peargent"
    )
)
# Redis (Distributed + TTL)
# - Fast, supports key expiration
# - Ideal for cloud deployments and ephemeral memory
history = HistoryConfig(
    store=Redis(
        host="localhost",
        port=6379,
        db=0,
        password=None,
        key_prefix="peargent"
    )
)

To create a custom storage backend, refer to History Management - Custom Storage Backends.
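Conceptually, a storage backend only needs to load and save conversation threads; Advanced History documents Peargent's real interface. The dict-backed sketch below is purely illustrative, and its method names are assumptions rather than Peargent's API:

```python
# Hypothetical in-memory backend illustrating the storage concept.
# Method names are illustrative; see Advanced History for the real interface.

class DictStore:
    def __init__(self):
        self._threads = {}

    def load_thread(self, thread_id):
        # Return a copy of the stored messages, or an empty thread if none exist.
        return list(self._threads.get(thread_id, []))

    def save_thread(self, thread_id, messages):
        # Overwrite the stored thread with the updated message list.
        self._threads[thread_id] = list(messages)

store = DictStore()
store.save_thread("t1", [{"role": "user", "content": "Hello"}])
```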
Auto Context Management
When conversations become too long, Peargent automatically manages the context window to keep prompts efficient and within model limits. This behavior is controlled by the strategy you choose.
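For intuition, context management simply keeps the prompt bounded. A minimal trim-style sketch in plain Python (not Peargent's internals):

```python
# Illustrative only: keep at most `max_messages` items in the context window.

def manage_context(messages, max_messages=20):
    if len(messages) <= max_messages:
        return messages
    # Over the limit: drop the oldest messages, keep the most recent ones.
    return messages[-max_messages:]

history = [f"msg {i}" for i in range(30)]
trimmed = manage_context(history, max_messages=20)
```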
Strategies
smart (Default)
Automatically decides whether to trim or summarize based on the size and importance of the overflow:
- Small overflow → trim (fast)
- Important tool calls → summarize
- Large overflow → aggressive summarization
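The decision logic above might look roughly like this sketch; the function name and thresholds are invented for illustration, and Peargent's actual heuristics may differ:

```python
# Hypothetical sketch of a "smart" strategy decision. Thresholds are made up.

def choose_strategy(overflow_count, has_tool_calls, large_threshold=10):
    if has_tool_calls:
        # Tool-call context is important: summarize instead of dropping it.
        return "summarize"
    if overflow_count >= large_threshold:
        # Large overflow: summarize aggressively.
        return "summarize_aggressive"
    # Small overflow: cheap trim, no LLM call needed.
    return "trim"
```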
history = HistoryConfig(
    auto_manage_context=True,
    strategy="smart"
)

trim_last
Keeps the most recent messages and removes the oldest.
Fast and uses no LLM.
history = HistoryConfig(
    auto_manage_context=True,
    strategy="trim_last",
    max_context_messages=15
)

trim_first
Keeps older messages and removes the newer ones.
history = HistoryConfig(
    auto_manage_context=True,
    strategy="trim_first"
)

summarize
Uses an LLM to summarize older messages, preserving context while reducing size.
from peargent.models import gemini

history = HistoryConfig(
    auto_manage_context=True,
    strategy="summarize",
    summarize_model=gemini("gemini-2.5-flash")  # Fast model for summaries
)

summarize_model is used only with the "summarize" and "smart" strategies. If not provided, the agent's model is used.

Parameters
| Parameter | Type | Default | Description | Required |
|---|---|---|---|---|
| auto_manage_context | bool | False | Automatically manage the context window when conversations get too long | No |
| max_context_messages | int | 20 | Maximum messages before auto-management triggers | No |
| strategy | str | "smart" | Context management strategy: "smart", "trim_last", "trim_first", "summarize" | No |
| summarize_model | Model | None | LLM model for summarization (defaults to the agent's model if not provided) | No |
| store | StorageType | InMemory() | Storage backend: InMemory(), File(), Sqlite(), Postgresql(), Redis() | No |
Learn more about advanced history features including custom storage backends, manual thread control, and all available history methods in Advanced History.