Agentic AI Tools
Cherry Studio Setup Guide
Set up Cherry Studio as your multi-model AI workspace with knowledge bases, MCP servers, and CLI agent integration for research workflows
Beginner · 20–30 min · API key required
Reference page for: Cherry Studio knowledge bases and MCP wiring. Other pages link here for the central GUI workflow.
Overview
Cherry Studio is your all-in-one research workspace: a powerful GUI that integrates chat, knowledge bases, MCP servers, and CLI coding agents in one interface. It's the central hub of the Research Memex approach.
Key Features:
- Multiple AI model access through one interface (100+ models)
- Knowledge Base for loading your literature corpus
- MCP servers for enhanced capabilities (Zotero, web search, etc.)
- Code Tools - Launch CLI agents (Claude Code, Gemini CLI, etc.) from within Cherry Studio
- Conversation forking for exploring different analytical paths
- Export to markdown for Obsidian integration
Tip
Official Documentation: For complete Cherry Studio features, see Cherry AI Docs | Installation Guide
Download and Install Cherry Studio
Visit Cherry Studio GitHub Releases and download the version for your operating system.
- Open the downloaded .dmg file
- Drag Cherry Studio to your Applications folder
- First launch: Right-click → Open (to bypass the macOS security warning)
Initial Configuration
When you first open Cherry Studio, you'll see:
- Welcome screen with model selection
- API configuration section
- Settings panel
Navigate to settings by clicking the Settings icon (gear icon) in the sidebar, then select API Keys or Model Configuration.
Info
API Keys Explained: Before configuring providers, you may want to understand API keys, free tiers, and provider options. See the API Keys Setup Guide for comprehensive information on getting API access from Google, Anthropic, DeepSeek, and other providers.
Configure API Provider
You'll need an API key from a provider to access AI models. See the API Keys Setup Guide for detailed instructions on getting keys from Google AI Studio (free), OpenRouter, or other providers.
Info
Following the Systematic Review course? Your instructor will provide a shared OpenRouter API key for the class. Skip the API setup guide and use the provided key instead.
Navigate to API Settings
- In Cherry Studio, click the Settings icon (gear icon)
- In the settings menu, select API Keys
Add Your Provider
- In the API Keys panel, click the Add Provider button
- Select your provider (Google AI Studio, OpenRouter, etc.)
Enter and Test API Key
- A configuration window will appear
- Paste your API key into the field (it starts with sk- or a similar prefix)
- Click the Test Connection button; you should see a green "Success" message
- Click Save
You're now ready to use AI models in Cherry Studio!
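If you ever want to sanity-check a key outside the GUI, here is a minimal Python sketch of the OpenAI-compatible request format that OpenRouter uses (the key and model name below are placeholders; actually sending the request requires network access and a valid key):

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Assemble a chat-completion request in the OpenAI-compatible
    format OpenRouter expects. Sending it is left to the caller."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        OPENROUTER_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# To actually send it, uncomment and substitute a real key:
# req = build_request("sk-or-...", "deepseek/deepseek-chat", "Say hello")
# print(urllib.request.urlopen(req).read().decode())
```

This is the same request Cherry Studio makes on your behalf; seeing it spelled out helps when debugging "API Key Invalid" errors later.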
Optional: Additional API Providers
Optional Providers for Specific Models
While OpenRouter provides access to most models you'll need, you may want to configure additional providers for specific models or embedding services:
DeepSeek Provider
Optional - If you want direct access to DeepSeek models:
- Click Add Provider → DeepSeek
- Create account at platform.deepseek.com
- Recommended Model: DeepSeek V3.2
- Note: All DeepSeek models are also available via OpenRouter
Moonshot AI (Kimi)
Optional - For direct access to Kimi models:
- Click Add Provider → Custom
- Name: "Moonshot AI (Kimi)"
- Create account at platform.moonshot.ai
- Base URL: https://api.moonshot.ai/v1
- Recommended Model: Kimi K2.5
- Note: Kimi models are also available via OpenRouter
Google AI Studio Provider
Recommended Free Backup - For generous daily limits:
- Create account at aistudio.google.com
- Get API key at aistudio.google.com/app/apikey
- In Cherry Studio: Add Provider → Google Gemini
- Paste your API key (starts with AIza…)
Available Models:
- gemini-1.5-flash (1,500 requests/day; high-volume work)
- gemini-2.5-pro (100 requests/day; complex analysis)
- gemini-embedding-experimental-0307 (document similarity)
Use Cases:
- Processing large literature collections (1M token context)
- Backup when course API credits are low
- Cost-free experimentation
- Document embeddings and semantic search
Daily Limits Reset: Midnight Pacific Time
Which Models Should I Add?
Now that you have configured your API providers, you might be wondering which models to add to your Cherry Studio interface.
For detailed recommendations on which models are best suited for different research tasks, their costs, and how to configure them, please refer to the AI Model Reference Guide. It provides a comprehensive overview to help you make informed choices.
MCP Servers - Give Your AI Superpowers
What Are MCPs?
MCP (Model Context Protocol) gives your AI access to external tools and data. Without MCPs, the AI can only chat. With MCPs, it can:
Real examples of what MCPs enable:
- 📚 Search your Zotero library: "Find all papers about organizational learning from 2020-2024"
- 📁 Read/write files: "Analyze the methodology section in Chapter3.docx"
- 🌐 Search the web in real-time: "What's the latest research on AI in education published this month?"
- 🧠 Step-by-step reasoning: "Break down this complex theory comparison in 5 structured steps"
- 🔗 Multi-model access: "Get Gemini's perspective on this analysis" (via Vox MCP)
The magic: MCPs turn chat into CAPABILITY. Your AI becomes a research partner, not just a text generator!
Recommended MCPs for Research
Essential MCPs:
- @cherry/filesystem: AI can read your files and analyze documents
- @cherry/sequentialthinking: step-by-step structured reasoning
Powerful Additions:
- Zotero MCP - Direct library access and search
- Web Search MCP - Real-time information
Advanced MCPs:
- Lotus Wisdom MCP - Contemplative problem-solving
- Vox MCP - Multi-model AI gateway (8+ providers)
Tip
New to MCPs? See the MCP Explorer Guide for detailed installation instructions. Following the course? Session 2 covers MCP setup in depth.
How to Install MCPs
In Cherry Studio:
- Settings → MCP Servers
- Click "Add Server"
- Choose from library or paste MCP URL
- Configure and test
Want to explore ALL available MCPs? See MCP Explorer Guide for:
- Complete MCP catalog
- Installation instructions for each MCP
- Cherry Studio, Claude Code, and other client configs
- Use cases and examples
Official MCP docs: Cherry Studio MCP Guide | MCP Protocol
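Under the hood, most MCP clients describe a server with a small JSON entry, and Cherry Studio's Add Server form maps onto the same fields. A sketch for the reference filesystem server is below; the path is a placeholder, and Cherry Studio's exact schema may differ by version, so treat this as the general shape rather than a copy-paste config:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/your/research"]
    }
  }
}
```

The key idea: each server is just a command the client launches, plus arguments scoping what it can touch (here, one research folder).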
Code Tools - Launch CLI Agents (Advanced)
Access CLI Power from the GUI
Cherry Studio v1.5.7+ includes Code Tools - a feature that lets you launch command-line AI agents (Claude Code, Gemini CLI, Qwen Code, OpenAI Codex) directly from the Cherry Studio interface!
Why use Code Tools?
- Access CLI agent capabilities without leaving Cherry Studio
- No separate terminal setup needed
- Integrated with your API keys and models
- Perfect for file-based research workflows
Enable Code Tools
- Ensure you're running Cherry Studio v1.5.7 or higher
- Settings → Navigation → Set navigation bar to Top position
- Create a new tab or conversation
- Click the Code (</>) icon in the toolbar
Select a CLI Agent
Choose from available CLI tools:
- Claude Code: Premium, excellent for research workflows
- Gemini CLI: Free Google power, 1M context window
- Qwen Code: Alibaba's open-source alternative
- OpenAI Codex: GPT-based coding agent
For Research Memex, we recommend:
- Beginners → Gemini CLI (free, powerful)
- Advanced → Claude Code (best quality)
- Experimenters → Qwen Code (open source)
Configure and Launch
- Select a compatible AI model from your configured providers
- Set working directory (your research project folder)
- Configure environment variables if needed
- Click Launch Agent
- The CLI agent runs in an embedded terminal within Cherry Studio!
Warning
Token Usage: Code Tools consume significant API tokens! Monitor your usage carefully, especially with complex file operations.
Tip
Official Guide: For detailed Code Tools tutorial, see Cherry Studio Code Tools Documentation
When to use Code Tools vs standalone CLI:
- Use Code Tools: When you want GUI convenience and integrated workflow
- Use standalone CLI: When you prefer terminal-native experience and want full control
For standalone CLI setup, see the CLI Setup Guide.
Test Your Setup
Create Test Conversation
- Click New Chat in the sidebar
- Select a model (start with GPT-5.3 Instant or DeepSeek-chat)
- Type this test prompt: Please summarize the key components of a systematic review according to PRISMA guidelines in 3 bullet points.
- Press Enter or click Send
Expected Response: You should receive a concise summary within 5-10 seconds.
Verify Model Access
Test each configured model:
- Create new conversation
- Select different model from dropdown
- Send same test prompt
- Compare responses
Test MCP Tools
Verify MCP servers are working:
- Test Zotero: "Search my Zotero for systematic review papers"
- Test Sequential Thinking: "Help me plan a literature review in 5 steps using sequential thinking"
- Test Web Search: "Find recent papers on AI in management"
Expected: Each command should return relevant results
Set Up Knowledge Base
Prepare Your Documents
- Export your curated papers from Zotero as PDFs
- Create a folder: systematic-review-papers
- Place all PDFs in this folder
Create Knowledge Base in Cherry Studio
- Click Knowledge Base in sidebar
- Click Create New Collection
- Name it: "My Systematic Review"
- Click Add Documents
- Select your PDF folder
- Wait for processing (1-2 min per 10 papers)
Advanced Options:
- OCR Processing: Enable for scanned PDFs (requires v1.4.8+)
- Intent Recognition: Better search accuracy with powerful models
- Multiple formats: Supports PDF, TXT, Markdown, Word, etc.
Enable Knowledge Base in Conversations
- Start new conversation
- Click Knowledge icon in chat toolbar
- Select your collection
- The AI now has access to your papers!
Pro Tip: Enable "Intent Recognition" in knowledge base settings for more accurate search results when asking complex research questions.
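Behind the scenes, knowledge-base search works by embedding your question and each document chunk as vectors, then ranking chunks by similarity to the question. A minimal sketch of that ranking step follows; the tiny three-dimensional "embeddings" are made up for illustration (real embedding models return hundreds of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for real embeddings of a query and two chunks.
query = [0.9, 0.1, 0.3]
chunks = {
    "methods section": [0.8, 0.2, 0.4],
    "acknowledgements": [0.1, 0.9, 0.0],
}
best = max(chunks, key=lambda name: cosine_similarity(query, chunks[name]))
print(best)  # the chunk whose embedding points most nearly the same way
```

This is why text-based PDFs matter: if a scanned page yields no text, it yields no useful embedding, and the chunk can never be retrieved.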
Knowledge Management with Obsidian
Why Obsidian?
Obsidian is a powerful markdown editor that creates a "second brain" for your research:
- Local storage: Your notes stay on your computer
- Bidirectional linking: Connect ideas across papers
- Zotero integration: Seamless citation management
- Graph view: Visualize connections in your research
- MCP accessibility: AI can read your knowledge base
Quick Setup Overview
For detailed instructions, see the Obsidian Setup Guide
Essential Steps:
- Install Obsidian from obsidian.md
- Create vault: "Systematic-Review-Research"
- Install plugins:
- Zotero Integration (multiple options available)
- Dataview (for literature tables)
- Local REST API (for MCP access)
- Configure integration with Zotero (requires Better BibTeX)
- Set up MCP for Cherry Studio access
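As an example of what the Dataview plugin enables, a query like the following builds a literature table from note frontmatter. The folder path and field names here are assumptions that depend on your vault layout and note template, so adjust them to match yours:

```text
TABLE authors, year, journal
FROM "Research/01-Literature-Notes"
SORT year DESC
```

Placed in any note, this renders a live, sortable table of every literature note in that folder.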
Cherry Studio → Obsidian Workflow
Set Up Folder Structure
Create this structure in your Obsidian vault:
/Research/
  /01-Literature-Notes/    # Individual paper notes from Zotero
  /02-AI-Conversations/    # Exported Cherry Studio chats
  /03-Synthesis/           # Your analysis and connections
  /04-Protocol/            # Review protocol development
  /05-Daily-Notes/         # Research journal
/Templates/                # Note templates
Export from Cherry Studio to Obsidian
In a Cherry Studio conversation:
- Click the Export button (or Cmd/Ctrl + E)
- Choose Markdown format
- Select Save to Folder
- Navigate to /02-AI-Conversations/
- Name format: YYYY-MM-DD-Topic.md
The exported file includes:
- Complete conversation history
- Model used and timestamps
- Any code blocks or tables
- Referenced papers (if using Zotero MCP)
Create Literature Note Template
Save this in /Templates/literature-note.md:
# {{title}}
## Metadata
- **Authors**: {{authors}}
- **Year**: {{year}}
- **Journal**: {{publicationTitle}}
- **DOI**: {{DOI}}
- **Tags**: {{tags}}
- **Zotero**: [Open in Zotero]({{zoteroLink}})
## Summary
*AI-generated or your summary*
## Key Contributions
-
## Methodology
-
## Limitations
-
## Relevance to My Research
-
## Connections
- Related papers:
- Contradicts:
- Extends:
## Annotations
{{annotations}}
Bidirectional Workflow Benefits
- From Zotero to Obsidian: Import papers with annotations
- From Cherry Studio to Obsidian: Export AI analysis
- Within Obsidian: Link papers, find patterns, build arguments
- Back to Cherry Studio: Copy synthesis for further AI analysis
Advanced Integration: Cherry Studio can also connect directly to Obsidian via MCP or data settings. See: Cherry Studio Obsidian Integration
Troubleshooting
'API Key Invalid' Error
- Double-check key is copied completely (no spaces)
- Ensure you have credits in your account
- Try regenerating the API key
'Rate Limit Exceeded'
- Wait 60 seconds and try again
- Switch to a different model temporarily
- Check your API provider's rate limits
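The "wait and retry" advice generalizes to exponential backoff if you ever hit rate limits from a script. A sketch, where RuntimeError stands in for whatever rate-limit exception your provider's SDK actually raises:

```python
import time

def call_with_backoff(call, max_retries=4, base_delay=1.0):
    """Retry `call` with exponentially growing waits between attempts."""
    for attempt in range(max_retries):
        try:
            return call()
        except RuntimeError:                  # stand-in for a rate-limit error
            time.sleep(base_delay * (2 ** attempt))
    return call()                             # final try; errors now propagate

# Toy usage: a callable that fails twice, then succeeds.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("rate limited")
    return "ok"

print(call_with_backoff(flaky, base_delay=0.01))  # ok
```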
'Connection Failed'
- Check internet connection
- Verify firewall isn't blocking Cherry Studio
- Try using a different API provider
Knowledge Base Not Working
- Ensure PDFs are text-based (not scanned images)
- Check file size (max 10MB per file recommended)
- Try re-importing documents
Model Not Responding
- Check API key configuration
- Verify you have credits remaining
- Try a different model to isolate issue
Quick Reference Card
Keyboard Shortcuts
- New Chat: Cmd/Ctrl + N
- Fork Conversation: Cmd/Ctrl + Shift + F
- Search Conversations: Cmd/Ctrl + F
- Export Chat: Cmd/Ctrl + E
- Settings: Cmd/Ctrl + ,
Document Processing Options Comparison
| Feature | MinerU (Free) | Mistral API | Direct Text |
|---|---|---|---|
| Daily Limit | 500 documents | Unlimited | Unlimited |
| Cost | Free | $0.10-0.20/doc | Free |
| Quality | Good | Excellent | Basic |
| Math Formulas | ✓ LaTeX | ✓ LaTeX | ✗ |
| Tables | ✓ Preserved | ✓ Enhanced | Partial |
| Images | ✓ Extracted | ✓ OCR | ✗ |
| Multi-column | ✓ | ✓ | ✗ |
| Speed | Fast | Moderate | Instant |
Best Practices
- Choose the right model for the task (see the AI Model Reference Guide): if unsure, start with cheaper or free models for initial exploration and switch to expensive models once you're more confident
- Fork conversations before trying different approaches (available in Cherry Studio, ChatWise, ChatGPT, Claude.ai)
- Export important conversations immediately
- Natively supported in Cherry Studio, ChatWise
- Alternatively, for ChatGPT, Claude.ai, Gemini web interface, use Save my Chatbot Chrome & Firefox extension (https://save.hugocollin.com/) or Obsidian Web Clipper (https://obsidian.md/clipper)
- Start a new conversation if you notice performance degradation
What's Next
Cherry Studio is the GUI hub. Once you're comfortable, the natural step is a CLI agent — see the Claude Code Setup Guide for paired terminal work.
External references
- Cherry AI Docs
- MCP Installation
- Knowledge Base Guide
- GitHub Repository
- OpenRouter API Docs
- MCP Protocol Docs
If Cherry Studio doesn't fit, ChatWise (chatwise.ai) is a close alternative; the OpenRouter playground and provider web UIs (ChatGPT, Claude.ai, Gemini) cover the gap for one-off chats.
Checklist
By the end of this guide, you should have:
- Downloaded and installed Cherry Studio
- Created at least one API account (OpenRouter recommended)
- Added $5-10 in API credits
- Successfully sent a test message to any added model
- Enabled and tested at least one MCP server (Zotero, Sequential Thinking, Web Search)
- (Optional) Tested Code Tools by launching a CLI agent
- Installed Obsidian and created a vault (a folder on your computer)
- Exported a conversation to markdown format and imported it to Obsidian
- Created and tested a knowledge base in Cherry Studio with your seed papers
- (Optional) Configured OCR and intent recognition for knowledge base
For more information
If you plan to experiment with command‑line tools or provider‑specific keys later, see the CLI Setup Guide (optional).
For API key setup and provider configuration, see the API Keys Setup Guide.
For model selection and recommended settings (temperature, reasoning effort), refer to the AI Model Reference Guide.