Self-hosted AI research platform that connects Large Language Models to internal and external knowledge sources for collaborative chat and analysis
Key facts
Pricing
Freemium
Use cases
- Research teams connecting Large Language Models to internal knowledge sources for real-time collaborative chat and data analysis (verified: 2026-01-30)
- Knowledge managers centralizing information from over 50 file extensions and external platforms like Slack, Jira, and Notion (verified: 2026-01-30)
- Privacy-focused organizations deploying self-hosted AI research agents that work with local LLMs like vLLM and Ollama (verified: 2026-01-30)
Strengths
- Supports over 50 file extensions, including documents, images, and video files, for building personal knowledge bases (verified: 2026-01-30)
- Role-Based Access Control provides granular permissions for documents, chats, and connectors, with specific roles such as Owner, Admin, Editor, and Viewer (verified: 2026-01-30)
- Flexible integration through the OpenAI spec and LiteLLM enables connections to virtually any inference provider (verified: 2026-01-30)
Limitations
- Requires users to manage their own self-hosted deployment and infrastructure for local operation (verified: 2026-01-30)
- Users must provide and configure their own inference providers or local LLM instances to enable chat functionality (verified: 2026-01-30)
Last verified
Jan 30, 2026
FAQ
What specific external data sources does the platform integrate with for research?
The tool integrates with search engines like SearxNG and Tavily, productivity suites including Google Drive and Microsoft Teams, and project management tools such as Jira, Linear, and ClickUp. It also connects to communication platforms like Slack and Discord to pull data into a centralized knowledge base for real-time AI interaction (verified: 2026-01-30).
How does the platform manage team collaboration and data access security?
Collaboration is managed through Role-Based Access Control for search spaces, where owners invite members with specific roles to manage permissions for documents, chats, and settings. This system ensures that granular permissions are applied to all shared knowledge bases, allowing teams to collaborate securely while maintaining strict data governance (verified: 2026-01-30).
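The role model described above can be sketched generically. This is an illustrative sketch only: the permission sets below are assumptions inferred from the role names (Owner, Admin, Editor, Viewer), not the platform's actual policy.

```python
# Illustrative sketch of role-based access control over a search space.
# The specific permission sets are assumptions based only on the role
# names mentioned above, not the platform's real policy.
ROLE_PERMISSIONS = {
    "Owner":  {"read", "write", "manage_members", "manage_connectors", "delete"},
    "Admin":  {"read", "write", "manage_members", "manage_connectors"},
    "Editor": {"read", "write"},
    "Viewer": {"read"},
}

def can(role: str, action: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(can("Editor", "write"))           # True
print(can("Viewer", "manage_members"))  # False
```

Unknown roles fall back to an empty permission set, so access is denied by default.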
Does the software support the use of local Large Language Models?
The software works with local LLMs such as vLLM and Ollama, which allows for private, self-hosted deployments that do not require external cloud providers. By using the OpenAI specification and LiteLLM, users can connect virtually any inference provider to their local instance to maintain full control over their data processing (verified: 2026-01-30).
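Because the connection goes through the OpenAI chat-completions spec, pointing a client at a local Ollama server reduces to building a standard request against that endpoint. The sketch below shows the shape of such a request; the base URL (Ollama's default port 11434) and model name are assumptions to adjust for your deployment.

```python
# Sketch: building an OpenAI-spec chat-completions request for a local
# Ollama server. Base URL and model name are assumptions; adjust to
# match your deployment. No network call is made here.
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat-completions POST request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://localhost:11434", "llama3", "Summarize my notes.")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
```

Sending the request with `urllib.request.urlopen(req)` (or any HTTP client) would return an OpenAI-style JSON response, which is what lets LiteLLM-compatible tooling treat a local instance like any other provider.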