Works with Claude Code, Cursor, and any MCP-compatible AI coding assistant.

---

## AutoGen Integration

Give AutoGen agents persistent memory that survives across sessions.

```python
from agentmemory import MemoryStore
from agentmemory.adapters.autogen import AutoGenMemoryHook, get_autogen_memory_context
import autogen

memory = MemoryStore(agent_id="my-autogen-agent")

# Inject past context into the agent's system_message
context = get_autogen_memory_context(memory, role="Research Assistant",
                                     goal="literature review on LLMs")

assistant = autogen.AssistantAgent(
    name="researcher",
    system_message=context + "\nYou are a helpful research assistant.",
    llm_config={"model": "gpt-4o-mini"},
)

# Hook captures every reply and stores it in memory
hook = AutoGenMemoryHook(memory, importance=6)
assistant.register_reply(
    trigger=autogen.ConversableAgent,
    reply_func=hook.on_agent_reply,
    position=0,
)
```

Install: `pip install "agentcortex[autogen]"`
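In AutoGen, a registered `reply_func` returns a `(final, reply)` pair, and returning `(False, None)` lets the normal reply chain continue after the hook has run. The hook pattern can be sketched standalone; the classes below are stand-ins for illustration, not agentmemory's actual implementation:

```python
class FakeMemory:
    """Stand-in for MemoryStore; records (text, importance) pairs."""
    def __init__(self):
        self.entries = []

    def remember(self, text, importance=5):
        self.entries.append((text, importance))


class MemoryHookSketch:
    """Observe each reply, store it, and let the reply chain continue."""
    def __init__(self, memory, importance=6):
        self.memory = memory
        self.importance = importance

    def on_agent_reply(self, recipient, messages=None, sender=None, config=None):
        if messages:
            self.memory.remember(messages[-1]["content"], importance=self.importance)
        return False, None  # not final -> AutoGen keeps generating the real reply


memory = FakeMemory()
hook = MemoryHookSketch(memory, importance=6)
final, reply = hook.on_agent_reply(None, messages=[{"content": "LLM survey notes"}])
print(memory.entries)  # [('LLM survey notes', 6)]
```

Because the hook is registered at `position=0`, it observes the incoming messages before any other reply function fires, then hands control back by returning a non-final result.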

---

## Qdrant Production Backend

Scale to millions of vectors with a dedicated vector database.

```python
from agentmemory import MemoryStore

# docker run -p 6333:6333 qdrant/qdrant
memory = MemoryStore(
    agent_id="my-agent",
    semantic_backend="qdrant",
    qdrant_url="http://localhost:6333",  # or Qdrant Cloud URL
    embedding_provider="sentence-transformers",
)

memory.remember("Production architecture uses microservices", importance=8)
results = memory.recall("architecture")
```

Install: `pip install "agentcortex[qdrant]"`
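The quick `docker run` above keeps vectors only inside the container. Per Qdrant's own Docker documentation (this is Qdrant behavior, independent of agentmemory), mounting a host volume at `/qdrant/storage` makes the collection survive container restarts:

```shell
# Persist Qdrant's data outside the container so vectors survive restarts
docker run -d -p 6333:6333 \
    -v "$(pwd)/qdrant_storage:/qdrant/storage" \
    qdrant/qdrant
```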

---

## Memory Export / Import (JSON)

Back up and restore episodic memories across machines or agent instances.

```python
from agentmemory import MemoryStore

memory = MemoryStore(agent_id="my-agent")
memory.remember("PostgreSQL is our main database", importance=8)

# Export to JSON file
memory.export_json("backup.json")

# Restore on another machine / new agent
new_memory = MemoryStore(agent_id="new-agent")
count = new_memory.import_json("backup.json")
print(f"Imported {count} memories")

# Merge instead of replacing
new_memory.import_json("backup.json", merge=True)

# Or work with the dict directly
data = memory.export_json()  # no path → returns dict only
new_memory.import_json(data)
```
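The difference between restoring and merging can be illustrated with a standalone sketch. The dict shape below (`agent_id` plus a `memories` list) is hypothetical and for illustration only; the real export schema may differ:

```python
def merge_backups(base: dict, incoming: dict) -> dict:
    """Merge two exported memory dicts, de-duplicating on content."""
    seen = {m["content"] for m in base["memories"]}
    merged = dict(base)
    merged["memories"] = base["memories"] + [
        m for m in incoming["memories"] if m["content"] not in seen
    ]
    return merged


a = {"agent_id": "a", "memories": [
    {"content": "PostgreSQL is our main database", "importance": 8},
]}
b = {"agent_id": "b", "memories": [
    {"content": "PostgreSQL is our main database", "importance": 8},
    {"content": "User prefers functional style", "importance": 5},
]}

merged = merge_backups(a, b)
print(len(merged["memories"]))  # 2 -- the duplicate entry is stored once
```

A plain import replaces the target store's contents; a merge, as sketched here, keeps what is already there and only adds what is new.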

---

## Memory CLI

Inspect and manage memories from the command line.

```bash
# Inspect stored memories
agentmemory inspect --agent-id my-project

# agentmemory — agent: my-project
# ════════════════════════════════════════
# EPISODIC MEMORY (3 entries)
# ────────────────────────────────────────
# #  IMP  Created              Content
# 1  9    2026-02-28 14:23:01  We use PostgreSQL for relational...
# 2  7    2026-02-27 09:14:55  payment/process_transaction.py h...
# 3  5    2026-02-26 18:30:12  User prefers functional style ove...

# Export memories to JSON
agentmemory export --agent-id my-project --output memories.json

# Import memories (restores; use --merge to add alongside existing)
agentmemory import memories.json --agent-id new-project --merge
```

Install: `pip install agentcortex` (the CLI is always included)

---

## Async Support

Use agentmemory in FastAPI, aiohttp, or any async Python application.

```python
import asyncio
from agentmemory import AsyncMemoryStore

async def main():
    # Identical API to MemoryStore — just add await
    memory = AsyncMemoryStore(agent_id="my-async-agent")

    await memory.remember("User prefers Python over JavaScript", importance=7)
    results = await memory.recall("tech stack")
    context = await memory.get_context("What do we know?")

    # Export / import work the same way
    data = await memory.export_json()
    await memory.import_json(data)

    memory.close()

# Or use as an async context manager
async def with_context_manager():
    async with AsyncMemoryStore(agent_id="my-agent") as memory:
        await memory.remember("Context manager closes executor automatically")
        ctx = await memory.get_context()
        print(ctx)

asyncio.run(main())
```

Install: `pip install agentcortex` (`AsyncMemoryStore` is always included)
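The note that the context manager "closes executor automatically" suggests the async store wraps blocking calls in a thread pool. That general pattern can be sketched standalone; the classes below are stand-ins for illustration, not agentmemory's actual implementation:

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor


class SyncStore:
    """Stand-in for a blocking store such as MemoryStore."""
    def __init__(self):
        self.items = []

    def remember(self, text):
        self.items.append(text)

    def recall(self, query):
        return [t for t in self.items if query in t]


class AsyncStoreSketch:
    """Run blocking store calls on a thread pool so the event loop stays free."""
    def __init__(self):
        self._store = SyncStore()
        self._executor = ThreadPoolExecutor(max_workers=1)

    async def remember(self, text):
        loop = asyncio.get_running_loop()
        await loop.run_in_executor(self._executor, self._store.remember, text)

    async def recall(self, query):
        loop = asyncio.get_running_loop()
        return await loop.run_in_executor(self._executor, self._store.recall, query)

    async def __aenter__(self):
        return self

    async def __aexit__(self, *exc):
        self._executor.shutdown()  # what the real context manager presumably does


async def demo():
    async with AsyncStoreSketch() as store:
        await store.remember("User prefers Python")
        return await store.recall("Python")


print(asyncio.run(demo()))  # ['User prefers Python']
```

Routing every call through a single-worker executor also serializes access to the underlying store, which keeps a non-thread-safe backend safe to use from async code.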

---

## Comparison

| | MemGPT | LangChain Memory | **AgentMemory** |

## Roadmap

- [x] AutoGen adapter (`pip install "agentcortex[autogen]"`)
- [x] Qdrant production backend (`pip install "agentcortex[qdrant]"`)
- [x] Memory export/import (JSON) — `memory.export_json()` / `memory.import_json()`
- [x] Memory visualization CLI — `agentmemory inspect / export / import`
- [x] Async support — `AsyncMemoryStore` with full `await` API
- [x] MCP server integration (`pip install "agentcortex[mcp]"`)
---