Rabbitholes

Freemium

A tool to create and navigate exploratory AI conversations.

Rabbitholes is an AI productivity tool that uses an infinite canvas for non-linear, exploratory conversations. Key features include node-based branching, support for local and cloud LLMs via a bring-your-own-key model, and the ability to integrate PDFs, images, and web links directly into the workflow. It is designed for researchers, students, and knowledge workers who need deep context management without the limitations of standard chat interfaces. (verified: 2026-01-29)

Pricing: Freemium
Last verified: Jan 29, 2026

Key facts

Pricing

Freemium

Use cases

  • Researchers and students organizing complex information across an infinite canvas to prevent context pollution during long-form study sessions (verified: 2026-01-29)
  • Knowledge workers managing multiple simultaneous AI queries by branching conversations into separate nodes for better structural clarity (verified: 2026-01-29)
  • Content creators integrating external files and websites into AI chats to generate outlines or summaries from specific source materials (verified: 2026-01-29)

Strengths

  • The infinite canvas interface allows users to ask multiple questions simultaneously and cherry-pick context from specific nodes for new branches (verified: 2026-01-29)
  • Users maintain full control over their data through local storage on their own devices and the ability to export chats as JSON or markdown (verified: 2026-01-29)
  • The platform supports a wide range of providers including Ollama for local LLMs, Groq, Anthropic, and custom API integrations (verified: 2026-01-29)

Limitations

  • Users must provide their own API keys from third-party providers to enable AI model functionality within the application (verified: 2026-01-29)
  • The software requires a manual download and installation process on a supported operating system, followed by license key activation (verified: 2026-01-29)

Last verified

Jan 29, 2026



FAQ

How does the platform handle user data and the storage of conversation histories?

All chat data and conversation histories are stored locally on the user's own device rather than on external servers. Users have the option to export their canvas data as either JSON or markdown files for external backup or use in other applications (verified: 2026-01-29).
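The review does not document the export schema itself, so the field names below (`nodes`, `title`, `prompt`, `response`) are assumptions for illustration only. This sketch shows the kind of post-processing a locally stored JSON export makes possible, such as flattening a canvas into a single markdown transcript:

```python
import json

# Hypothetical export structure -- the actual Rabbitholes JSON schema is
# not documented in this review, so these field names are assumptions.
canvas_export = json.loads("""
{
  "nodes": [
    {"title": "Root question",
     "prompt": "What is spaced repetition?",
     "response": "A scheduling technique for reviewing material..."},
    {"title": "Branch: tooling",
     "prompt": "Which apps implement it?",
     "response": "Several flashcard tools build on this idea..."}
  ]
}
""")

def nodes_to_markdown(export: dict) -> str:
    """Flatten an exported canvas into one markdown transcript."""
    sections = []
    for node in export["nodes"]:
        sections.append(
            f"## {node['title']}\n\n"
            f"**Prompt:** {node['prompt']}\n\n"
            f"{node['response']}"
        )
    return "\n\n".join(sections)

print(nodes_to_markdown(canvas_export))
```

Because the data lives on the user's own device, this kind of script can run entirely offline against the exported file.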

What types of external media and files are supported for integration into the canvas nodes?

The tool supports a variety of file formats, including PDF, PNG, JPG, DOCX, PPTX, and XLS. Additionally, users can add websites and YouTube links as nodes to provide specific context for their AI-driven exploratory conversations (verified: 2026-01-29).

Does the interface allow for switching between different AI models or providers within a single project?

The interface allows users to switch between different models at any point during their workflow. It supports cloud providers like Anthropic and Google, as well as local LLMs through Ollama and other custom provider settings (verified: 2026-01-29).
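A conceptual sketch of how a bring-your-own-key tool can route each node's prompt to a different backend; the `Provider` type, the endpoint entries, and the registry here are illustrative assumptions, not Rabbitholes' actual internals:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Provider:
    name: str
    base_url: str
    api_key: Optional[str]  # local backends such as Ollama need no key

# Illustrative registry: keys come from the user, not the tool vendor.
PROVIDERS = {
    "anthropic": Provider("anthropic", "https://api.anthropic.com", "sk-..."),
    "ollama": Provider("ollama", "http://localhost:11434", None),
}

def route(provider_name: str, prompt: str) -> Tuple[str, str]:
    """Resolve which endpoint a node's prompt would be sent to."""
    p = PROVIDERS[provider_name]
    return p.base_url, prompt

# Switching models mid-workflow is just a different lookup per node:
endpoint, _ = route("ollama", "Summarize the attached PDF")
```

The design point is that each canvas node can carry its own provider choice, so a cloud model and a local model can coexist in one project.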