Add DollhouseMCP/mcp-server to Knowledge & Memory

Mick Darling
2026-04-16 11:45:02 -04:00
parent a56e86528f
commit 4035857382
@@ -1490,6 +1490,7 @@ Persistent memory storage using knowledge graph structures. Enables AI models to
- [dodopayments/contextmcp](https://github.com/dodopayments/context-mcp) 📇 ☁️ 🏠 - Self-hosted MCP server that indexes documentation from various sources and serves it to AI Agents with semantic search.
- [doobidoo/MCP-Context-Provider](https://github.com/doobidoo/MCP-Context-Provider) 📇 🏠 - Static server that provides persistent tool-specific context and rules for AI models
- [doobidoo/mcp-memory-service](https://github.com/doobidoo/mcp-memory-service) 📇 🏠 - Universal memory service providing semantic search, persistent storage, and autonomous memory consolidation
- [DollhouseMCP/mcp-server](https://github.com/DollhouseMCP/mcp-server) [![DollhouseMCP MCP server](https://glama.ai/mcp/servers/DollhouseMCP/mcp-server/badge)](https://glama.ai/mcp/servers/DollhouseMCP/mcp-server) 📇 🏠 🍎 🪟 🐧 - One-line installable MCP server that adds reusable customization elements (personas, skills, templates, agents, memory, and ensembles of collected customization tools) to any MCP client application. Includes dynamic permissioning for safe AI operations, a robust validation architecture, versioning, and a public collection of shareable content. Install: `npx @dollhousemcp/mcp-server@latest --web`.
- [edobusy/agenthold](https://github.com/edobusy/agenthold) [![agenthold MCP server](https://glama.ai/mcp/servers/edobusy/agenthold/badges/score.svg)](https://glama.ai/mcp/servers/edobusy/agenthold) 🐍 🏠 🍎 🪟 🐧 - Shared versioned state store with optimistic concurrency control for coordinating concurrent AI agents. SQLite-backed claim/release locks and append-only audit log.
- [elvismdev/mem0-mcp-selfhosted](https://github.com/elvismdev/mem0-mcp-selfhosted) [![mem0-mcp-selfhosted MCP server](https://glama.ai/mcp/servers/elvismdev/mem0-mcp-selfhosted/badges/score.svg)](https://glama.ai/mcp/servers/elvismdev/mem0-mcp-selfhosted) 🐍 🏠 🍎 🪟 🐧 - Self-hosted mem0 MCP server for Claude Code with Qdrant vector search, Neo4j knowledge graph, and Ollama embeddings. Zero-config OAT auth, split-model graph routing, session hooks for automatic cross-session memory, and 11 tools. Supports both Anthropic and fully local Ollama setups.
- [Cartisien/engram-mcp](https://github.com/Cartisien/engram-mcp) [![engram-mcp MCP server](https://glama.ai/mcp/servers/Cartisien/engram-mcp/badges/score.svg)](https://glama.ai/mcp/servers/Cartisien/engram-mcp) 📇 🏠 🍎 🪟 🐧 - Persistent semantic memory for AI agents. SQLite-backed, local-first, zero config. Semantic search via Ollama embeddings (nomic-embed-text) with keyword fallback. remember, recall, history, forget, and stats tools. Works with Claude Desktop, Cursor, and any MCP client.