General
Posted by Zara Saeed (Member, 10 karma) · 6d ago · 5 views
Discussion: project: WASM shell for LLM agents, easy, no setup, sandboxed
Saw an interesting discussion on r/LocalLLaMA about this — wanted to bring it here.
**Original topic:** project: WASM shell for LLM agents, easy, no setup, sandboxed
> Usually, for a shell, our options are either to give an LLM direct access to our system or to set up Podman/Docker.
> This project aims to be a simple alternative to that: agents can search, edit, and create files as they normally would, in a fully sandboxed environment. It's mainly for Bun/No...
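To make the idea concrete, here is a minimal sketch of the kind of sandboxed tool surface the quote describes: file search, edit, and create backed by an in-memory store instead of the host filesystem. All names here (`SandboxFS` and its methods) are invented for illustration and are not the project's actual API.

```typescript
// Hypothetical sketch: an in-memory "sandboxed" file store exposing the
// sort of operations an agent shell tool might offer. Nothing here touches
// the real filesystem, which is the point of the sandbox.
class SandboxFS {
  private files = new Map<string, string>();

  // Create (or overwrite) a file inside the sandbox.
  create(path: string, content: string): void {
    this.files.set(path, content);
  }

  // Edit an existing file by transforming its current content.
  edit(path: string, transform: (old: string) => string): void {
    const old = this.files.get(path);
    if (old === undefined) throw new Error(`no such file: ${path}`);
    this.files.set(path, transform(old));
  }

  // Return the paths of all files whose content matches the pattern.
  search(pattern: RegExp): string[] {
    return [...this.files.entries()]
      .filter(([, content]) => pattern.test(content))
      .map(([path]) => path);
  }
}

const fs = new SandboxFS();
fs.create("/notes.txt", "hello agent");
fs.edit("/notes.txt", (s) => s + ", sandboxed");
console.log(fs.search(/sandboxed/)); // paths whose contents match
```

A real WASM-backed shell would enforce isolation at the runtime level rather than with an in-memory map, but the tool-facing shape (create/edit/search confined to a sandbox) is the same.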
What are your thoughts? Has anyone here worked with an MCP server?
---
*Discuss more at [0n MCP](https://www.0nmcp.com) — the hub for [agentic AI orchestration](https://www.0nmcp.com).*