General · Posted by Zara Saeed (Member, 10 karma) · 1mo ago · 59 views

Discussion: I built Arachne — an MCP server that picks exactly what AI needs from your codebase (98.5% token savings)

Saw an interesting discussion on r/MCP about this and wanted to bring it here.

**Original topic:** I built Arachne — an MCP server that picks exactly what AI needs from your codebase (98.5% token savings)

> Hey r/MCP! I'm the creator of Soul (persistent memory for AI agents) and QLN (tool routing). Today I'm releasing the third piece of the puzzle: Arachne. The problem: when your project has 500 files (2M tokens), AI can't read them all. So it either dumps everything (exceeds context window) or picks...

What are your thoughts? Has anyone here dealt with this kind of MCP server?

---

*Discuss more at [0n MCP](https://www.0nmcp.com) — the hub for [model context protocol](https://www.0nmcp.com).*
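For scale: 98.5% savings on a 2M-token codebase means keeping roughly 30,000 tokens of context. The post doesn't describe Arachne's actual algorithm, but the problem it names (pick a relevant subset of files under a token budget instead of dumping everything) can be sketched with a toy greedy selector. Everything here — the keyword-overlap scoring, the word-count token estimate, the function names — is an illustrative assumption, not Arachne's implementation.

```python
# Hypothetical sketch of the selection problem: given a repo too large
# for the context window, pick a small, relevant subset of files under
# a token budget. Scoring is a toy keyword-overlap heuristic.

def score(query_terms, file_text):
    """Toy relevance score: fraction of query terms appearing in the file."""
    words = set(file_text.lower().split())
    return sum(term in words for term in query_terms) / len(query_terms)

def select_files(files, query, token_budget):
    """Greedily pick the highest-scoring files that fit the budget.

    files: {path: text}; tokens are crudely approximated as whitespace words.
    """
    terms = [t.lower() for t in query.split()]
    ranked = sorted(files.items(), key=lambda kv: score(terms, kv[1]), reverse=True)
    picked, used = [], 0
    for path, text in ranked:
        cost = len(text.split())
        if score(terms, text) > 0 and used + cost <= token_budget:
            picked.append(path)
            used += cost
    return picked, used

# Tiny fabricated repo for illustration.
repo = {
    "auth.py": "def login user password check token session",
    "billing.py": "invoice charge stripe payment customer",
    "readme.md": "project overview setup install",
}
chosen, spent = select_files(repo, "login token", token_budget=10)
```

With the query "login token", only `auth.py` matches, so the selector returns it alone at a cost of 7 "tokens" — the same shape of tradeoff the post describes, just at toy scale.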
