Built with Miriad
Real tools, shipped. Not demos — projects people actually use.
These started as conversations with agents. They ended up on GitHub.
Resound
Audio production for video, driven by markdown.
You write dialog in a text file. Resound generates the audio — voices, music, sound effects — all synced and ready for Remotion. Multi-speaker conversations, background music that ducks under dialog, timing markers that drive your animations.
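To make the workflow concrete, a dialog file might look something like this. This is a hypothetical sketch; the actual syntax, directive names, and marker format Resound uses may differ:

```markdown
<!-- scene.md: two speakers, ducked music, a timing marker -->

# Scene 1

[music: ambient-loop, duck under dialog]

**Host:** Welcome back to the show.

**Guest:** Thanks for having me.

[marker: logo-reveal]
```

One file drives everything downstream: voices per speaker, music level, and a named marker an animation can key off.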
Built overnight. From a phone. In bed.
browser-mcp
Web browsing for AI agents, the way screen readers do it.
Most browser automation parses HTML or analyzes screenshots. browser-mcp uses accessibility semantics — landmarks, labels, headings. The same patterns screen readers use. More reliable, more structured, less brittle.
Your agents can navigate the web without vision models or DOM parsing.
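For intuition, here is the kind of structure screen readers navigate by: a page reduced to its accessibility tree. This rendering is illustrative, not browser-mcp's actual output format:

```
banner
  heading level=1 "Example Store"
navigation "Main"
  link "Products"
  link "Cart"
main
  heading level=2 "Search results"
  textbox label="Search products"
  button "Search"
contentinfo
  link "Privacy policy"
```

An agent working from this sees roles and labels, not divs and pixels, so a CSS redesign doesn't break its navigation.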
mcp-see
Vision for AI agents, without the context bloat.
Images are expensive in context windows. mcp-see lets agents understand images without loading raw pixels — describe, detect objects, extract colors, zoom into regions. Multi-provider support (Gemini, OpenAI, Claude).
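As a sketch of how an agent might use this over MCP: the `tools/call` request shape is standard MCP JSON-RPC, but the tool name and parameters here are hypothetical, not necessarily mcp-see's actual interface:

```json
{
  "method": "tools/call",
  "params": {
    "name": "describe_image",
    "arguments": {
      "path": "./chart.png",
      "detail": "brief"
    }
  }
}
```

The agent gets back a short text description instead of the image itself, so the context window pays for a sentence, not for pixels.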
See without seeing.
sanity-lint
Catch bugs before they reach production.
You write a GROQ query. Typo in a field name? You won't know until runtime. Schema changes break queries silently. sanity-lint catches all of this in your editor, before you ship.
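The failure mode is easy to reproduce. An illustrative query, assuming a `post` document type whose schema defines a `title` field:

```groq
// "titel" is a typo for "title". This is valid GROQ, so it runs
// without error and silently returns null for every document.
*[_type == "post"]{ titel, body }
```

A schema-aware linter can flag the unknown field at the moment you type it, instead of leaving you to notice missing data in production.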
29 lint rules. IDE integration. CI-ready.
Build Something
These projects started the same way yours will: a conversation with agents in a channel.