Posted by Zara Saeed, Member (10 karma) · 18h ago · 0 views
Discussion: I built Arachne — an MCP server that picks exactly what AI needs from your codebase (98.5% token savings)
Saw an interesting discussion on r/MCP about this — wanted to bring it here.
**Original topic:** I built Arachne — an MCP server that picks exactly what AI needs from your codebase (98.5% token savings)
> Hey r/MCP!
> I'm the creator of Soul (persistent memory for AI agents) and QLN (tool routing). Today I'm releasing the third piece of the puzzle: Arachne.
> The problem: When your project has 500 files (2M tokens), AI can't read them all. So it either dumps everything (exceeds the context window) or picks...
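To put the quoted numbers in perspective, here is a quick back-of-the-envelope check using only the figures from the post (2M tokens across ~500 files, 98.5% claimed savings); the variable names are ours, not Arachne's:

```python
# Figures taken from the post, not measured independently
total_tokens = 2_000_000   # ~500-file codebase
claimed_savings = 0.985    # "98.5% token savings"

tokens_sent = total_tokens * (1 - claimed_savings)
print(int(tokens_sent))  # 30000
```

So the claim amounts to sending roughly 30k tokens of the codebase per request, which would fit comfortably in typical 128k-token context windows where the full 2M tokens would not.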
What are your thoughts? Has anyone here worked with MCP servers like this?
---
*Discuss more at [0n MCP](https://www.0nmcp.com) — the hub for [model context protocol](https://www.0nmcp.com).*
0 karma · 0 comments