distill-memory
nowledge-co/community · updated Apr 8, 2026
Persistent knowledge capture for insights, decisions, and procedures that span multiple agent sessions.
- Store decisions with rationale, debugging lessons, repeatable workflows, and durable preferences as searchable memories
- Distinguish between new insights (use `add`) and refinements to existing memories (use `update`) to avoid duplication
- Design memories as atomic, standalone entries with clear titles that remain useful across future sessions
- Ideal for preserving incident learnings
Distill Memory
Save proactively when the conversation produces a decision, preference, plan, procedure, learning, or important context. Do not wait to be asked.
When to Suggest (Moment Detection)
- Breakthrough: extended debugging resolves, user relief ("Finally!", "Aha!"), root cause found
- Decision: options compared, one chosen with rationale, trade-off resolved
- Research: multiple approaches investigated, conclusion reached, optimal path determined
- Twist: unexpected cause-effect, counterintuitive solution, assumption challenged
- Lesson: "next time do X", preventive measure, pattern recognized
- Skip: routine fixes, work in progress, simple Q&A, generic info
Memory Quality
Good (atomic + actionable):
- "React hooks cleanup must return function. Caused leaks."
- "PostgreSQL over MongoDB: ACID needed for transactions."
Poor: vague entries like "Fixed bugs", or a raw conversation transcript
Tool Usage
Use the `nmem` CLI to create memories:

```shell
nmem m add "Insight + context for future use" \
  -t "Searchable title (50-60 chars)" \
  -i 0.8
```
If an existing memory already captures the same decision, workflow, or preference and the new information refines it, update that memory instead of creating a duplicate:
```shell
nmem m update <id> -t "Updated title"
```
Content: Outcome/insight focus, include "why", enough context
Importance: 0.8-1.0 major | 0.5-0.7 useful | 0.3-0.4 minor
Note: For programmatic use, add --json flag to get JSON response
Examples:
```shell
# High-value insight
nmem m add "React hooks cleanup must return function. Caused memory leaks in event listeners." \
  -t "React Hooks Cleanup Pattern" \
  -i 0.9

# Decision with context
nmem m add "Chose PostgreSQL over MongoDB for ACID compliance and complex queries" \
  -t "Database: PostgreSQL" \
  -i 0.9
```
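For programmatic use, the note above suggests the `--json` flag for a machine-readable response. A minimal Python sketch of a wrapper, assuming `nmem` is on PATH; the shape of the JSON response (and the `add_memory` helper name) is an assumption to verify against your `nmem` version's actual output:

```python
import json
import subprocess

def build_add_command(content: str, title: str, importance: float = 0.8) -> list[str]:
    """Build the argv for `nmem m add`, requesting a JSON response."""
    if not 0.0 <= importance <= 1.0:
        raise ValueError("importance must be between 0.0 and 1.0")
    return ["nmem", "m", "add", content, "-t", title, "-i", str(importance), "--json"]

def add_memory(content: str, title: str, importance: float = 0.8) -> dict:
    """Run the CLI and parse its JSON output.

    NOTE: the response fields are an assumption; inspect the real
    --json output before relying on specific keys.
    """
    argv = build_add_command(content, title, importance)
    result = subprocess.run(argv, capture_output=True, text=True, check=True)
    return json.loads(result.stdout)
```

Keeping command construction separate from execution makes the importance-range check testable without the CLI installed.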
Suggestion
- Timing: after a resolution or decision, when the user pauses
- Pattern: "This [type] seems valuable - [essence]. Distill into memory?"
- Frequency: 1-3 per session is typical; quality over quantity
Troubleshooting
- If `nmem` is not in PATH: `pip install nmem-cli`
- For remote servers: create `~/.nowledge-mem/config.json` with `{"apiUrl": "...", "apiKey": "..."}`.
- Run `/status` to check the server connection.
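The remote-server config above can be created from the shell. A minimal sketch, assuming the two keys shown are sufficient; the URL and key values below are placeholders to replace with your server's details:

```shell
# Create the nmem config directory and a minimal config file.
# "apiUrl" and "apiKey" come from the skill's troubleshooting notes;
# the values here are placeholders, not real endpoints or credentials.
mkdir -p ~/.nowledge-mem
cat > ~/.nowledge-mem/config.json <<'EOF'
{
  "apiUrl": "https://example.com",
  "apiKey": "YOUR_API_KEY"
}
EOF
```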
Ratings
4.4 ★ · 27 reviews
- ★★★★★ Advait Harris · Dec 24, 2024
  Registry listing for distill-memory matched our evaluation — installs cleanly and behaves as described in the markdown.
- ★★★★★ Anaya Iyer · Dec 16, 2024
  Useful defaults in distill-memory — fewer surprises than typical one-off scripts, and it plays nicely with `npx skills` flows.
- ★★★★★ Aditi Patel · Dec 12, 2024
  distill-memory fits our agent workflows well — practical, well scoped, and easy to wire into existing repos.
- ★★★★★ Mei Rao · Nov 15, 2024
  distill-memory reduced setup friction for our internal harness; good balance of opinion and flexibility.
- ★★★★★ Aanya Sanchez · Nov 7, 2024
  We added distill-memory from the explainx registry; install was straightforward and the SKILL.md answered most questions upfront.
- ★★★★★ Neel Garcia · Oct 26, 2024
  distill-memory reduced setup friction for our internal harness; good balance of opinion and flexibility.
- ★★★★★ Oshnikdeep · Sep 13, 2024
  distill-memory reduced setup friction for our internal harness; good balance of opinion and flexibility.
- ★★★★★ Kaira Khanna · Sep 13, 2024
  Solid pick for teams standardizing on skills: distill-memory is focused, and the summary matches what you get after install.
- ★★★★★ Mei Mehta · Sep 5, 2024
  distill-memory is among the better-maintained entries we tried; worth keeping pinned for repeat workflows.
- ★★★★★ Advait Martin · Aug 24, 2024
  Solid pick for teams standardizing on skills: distill-memory is focused, and the summary matches what you get after install.

(showing 1-10 of 27)