Features
Everything we have shipped, in one place
Loom is a desktop-first AI coding environment. The list below is what is in the app today: sessions and agents, memory, runtimes, remote compute, and the parts you do not notice until you need them.
Sessions & Agents
Work in many places at once
Every conversation is its own workspace. Spin them up, branch them, and let agents talk to each other when you need a second pair of hands.
Sessions
Start from a folder, a repo URL, or just a blank chat. Each session keeps its own history, policy, and model preference.
Session forks
Explore an alternate approach without trashing the original. Loom creates a git worktree so both branches run side by side.
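Under the hood this is standard git tooling. As an illustration (not Loom's actual implementation), a fork amounts to creating a worktree on a new branch next to the original checkout:

```python
import pathlib
import subprocess
import tempfile

def fork_session(repo: pathlib.Path, branch: str) -> pathlib.Path:
    """Create a git worktree so a forked session runs beside the original."""
    fork_dir = repo.parent / f"{repo.name}-{branch}"
    subprocess.run(
        ["git", "-C", str(repo), "worktree", "add", "-b", branch, str(fork_dir)],
        check=True, capture_output=True,
    )
    return fork_dir

# Demo against a throwaway repo so the sketch is self-contained.
tmp = pathlib.Path(tempfile.mkdtemp())
repo = tmp / "proj"
repo.mkdir()
subprocess.run(["git", "init", "-q"], cwd=repo, check=True)
(repo / "README.md").write_text("hello\n")
subprocess.run(["git", "add", "."], cwd=repo, check=True)
subprocess.run(
    ["git", "-c", "user.email=demo@example.com", "-c", "user.name=demo",
     "commit", "-qm", "init"],
    cwd=repo, check=True,
)

fork = fork_session(repo, "alt-approach")
```

Both directories now share one object store, so neither branch's work is duplicated or lost.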
Checkpoints
Mark a moment in the conversation. Jump back, resume, or use it as the boundary for downstream knowledge extraction.
Custom agents
Name them, pick an icon, route sessions to them. A light personality layer that carries across all of a session's turns.
Super Agent
An orchestrator that can spawn and message other sessions on your behalf. Useful for multi-step plans that span repos.
Agent-to-agent messaging
Sessions are not sandboxed from each other. One agent can ping another mid-task, hand off context, then continue.
Runtimes
Pick the brain you want
Claude Code and Codex are both first-class runtimes. Switch models mid-conversation without losing context.

Claude Code
Haiku, Sonnet, Opus. The full Claude lineup, talking to files, shell, and the web through Anthropic's CLI.
Codex
GPT-5.4, GPT-5.4 Mini, GPT-5.3 Codex. Drop in for cost-sensitive or speed-sensitive work without restructuring the session.
Model switching
Move between runtimes mid-conversation. Attribution stays correct on every turn so traces always show which model wrote what.
Memory
Your AI remembers
One feature, a lot of power. Loom turns every conversation into searchable, linked long-term memory automatically.
Loom Memory Graph
Conversations are segmented into topics, distilled into reusable knowledge, tagged, and entity-linked. Full-text and semantic search on top. Walk the graph to see how ideas connect. Runs fully on-device with a local embedding model, no data leaves your machine.
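The semantic-search half of this can be pictured with a few lines of code. This is a toy sketch, with a trivial bag-of-words vector standing in for the real local embedding model; the knowledge snippets are made up for the demo:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for a local embedding model: bag-of-words counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

knowledge = [
    "checkpoint marks a resumable moment in a conversation",
    "session forks create a git worktree for side-by-side branches",
    "guard levels control how permissive a session is",
]
vectors = [(doc, embed(doc)) for doc in knowledge]

def search(query: str, k: int = 1) -> list:
    """Rank stored knowledge by similarity to the query, entirely locally."""
    q = embed(query)
    return sorted(vectors, key=lambda dv: cosine(q, dv[1]), reverse=True)[:k]

best = search("fork a session into a git worktree")[0][0]
```

The real system adds topic segmentation, tagging, and entity links on top; the point here is only that similarity ranking needs nothing beyond local vectors.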
Tools & Permissions
Granular control over every action
From 'run anything, I trust you' to 'ask before every shell command', and everything in between.
Guard levels
Four modes (Ignore / Flexible / Default / Strict) decide how permissive each session is. Set a default and override per-session.
Tool allowlists
Allowlist exactly which tools a session can use. Loom's policy decider enforces it before the agent ever runs.
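A policy decision of this shape can be sketched in a few lines. The function and the exact semantics of each guard level are assumptions for illustration, not Loom's real decider:

```python
def decide(tool: str, guard: str, allowlist: set) -> str:
    """Return 'allow' or 'ask' for one tool call.

    Guard semantics here are a guess at how the four modes might rank:
    ignore > flexible > default > strict in permissiveness.
    """
    if guard == "ignore":
        return "allow"   # run anything, full trust
    if guard == "strict":
        return "ask"     # confirm before every action, even allowlisted ones
    if tool in allowlist:
        return "allow"   # explicitly allowlisted
    # Unlisted tools: flexible lets them through, default asks first.
    return "allow" if guard == "flexible" else "ask"
```

The useful property is that the check runs before the tool call, so a misconfigured agent never executes anything the policy would have blocked.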
Secret protection
Inject session-scoped env vars into Bash calls, and redact the values from output before they reach the model's context window.
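The redaction step is simple to picture. A minimal sketch, with a hypothetical secret name and value; the real implementation would handle encodings and partial matches:

```python
def redact(output: str, secrets: dict) -> str:
    """Mask session-scoped secret values before output reaches the model."""
    for name, value in secrets.items():
        if value:
            output = output.replace(value, f"<redacted:{name}>")
    return output

# Hypothetical session-scoped env var injected into a Bash call.
secrets = {"API_KEY": "sk-test-12345"}
raw = "curl -H 'Authorization: Bearer sk-test-12345' https://api.example.com"
safe = redact(raw, secrets)
```

The model sees the placeholder, so the secret can be used by the shell without ever entering the context window.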
Loom tools (MCP)
Built-in MCP server gives the agent access to memory queries, checkpoint creation, inter-agent messaging, and more.
GitHub integration
Authenticated MCP for issues, PRs, and repo operations. Plug in your GitHub token once and it works across sessions.
Custom MCP servers
Drop your own MCP endpoints into a session's config. Loom exposes them the same way as built-ins.
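Loom's exact config schema isn't shown here. As an assumption for illustration, many MCP clients accept a declaration along these lines, where the server name, command, and env values are all placeholders:

```json
{
  "mcpServers": {
    "docs-search": {
      "command": "npx",
      "args": ["-y", "my-docs-mcp"],
      "env": { "DOCS_ROOT": "/home/me/docs" }
    }
  }
}
```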
Environment
Browser and terminal, built in
You do not need to leave Loom to see what the agent sees. Real web pages and real shells, embedded in the app.
Embedded browser
A real Chromium view inside Loom, not a screenshot. The agent can navigate, and you can browse alongside.
Browser → session context
Select text or capture a region of a page and drop it into the chat. Agents receive both the text and the screenshot.
Embedded terminal
A full xterm drawer pinned to the bottom. Shares the session's cwd, so you and the agent are in the same place.
Remote runtime
Offload the heavy lifting
Rent a persistent machine in the cloud, target projects at it, and let Loom's UI stay responsive while the agent works.
Remote instance
A dedicated container you rent. Claude Code spawns there, not on your laptop. Compute load and battery drain stay off your device.
Setup terminal
An in-app xterm that drops you straight into the remote machine's Claude so /login and other setup land in one pane.
Remote file browser
Inspect the workspace directly from Loom. Files the agent creates appear in the tree on the same screen you are chatting in.
Auto stop & wake
After 60 minutes idle the instance stops so you do not pay for compute. Your next message wakes it automatically.
Cloud access
LLMs and credits, handled
Skip per-provider API key juggling. Loom routes LLM traffic through a single proxy and meters it with credits.
Cloud credits
A unified currency for every LLM call on the platform. Balance and transaction log visible inside Loom.
Cloud LLM proxy
Claude and GPT both reachable through withloom's authenticated proxy. No personal API keys to manage or rotate.
Data
Bring your history, keep it yours
Move into Loom without losing what came before, and leave with everything if you ever want to.
Claude Code history import
Existing ~/.claude conversations land in Loom as structured sessions, ready for memory extraction the moment they arrive.
Data export
A complete snapshot of your sessions, memory graph, and knowledge. Yours to archive or migrate.
Ready to try it?
Loom is in preview. Request early access and we will get you set up with a build tuned for your platform.