i've hated current chat interfaces for a long time, for a few reasons:
1. Context switching hell — the moment you want to explore a different angle you either muddy the current thread or start over and lose everything you built up.
2. Memory degradation — LLMs get worse as conversations grow. Some queries need no memory, some need temporary context, some need everything. Branching solves this cleanly.
3. Parallel execution — why wait for one response when you can run multiple simultaneously? Every chat app makes you wait. That's absurd.
4. Model lock-in — OpenAI won't show you Claude's answer. Anthropic won't show you GPT's. LM Canvas connects to 300+ models and has no reason to hide any of them from you.
so i built a spatial canvas for working with LLMs — branch conversations, run prompts in parallel, compare models side by side. you can also import your ChatGPT conversations and they convert into branchable nodes on the canvas.
please give me the spiciest / rawest feedback you have to offer. i'll be in the comments all day / night :)