This repository is a practical introduction to Anthropic’s Model Context Protocol (MCP) with full multi-LLM support.
It demonstrates how to build MCP servers and clients, working with four LLM providers:
- Claude (native Anthropic SDK)
- Grok (xAI)
- Gemini (Google)
- Copilot (GitHub)
The orch/ layer uses all four LLMs together to train and improve orchestration logic.
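The fan-out idea behind orch/ can be sketched roughly as follows. This is an illustrative sketch only — the names (`make_stub`, `CLIENTS`, `fan_out`) are hypothetical and not the repo's actual API; real clients would wrap the Anthropic, xAI, Google, and GitHub SDKs.

```python
# Hypothetical sketch of a multi-LLM fan-out, loosely modeled on the idea
# behind orch/ (names and structure are illustrative, not the repo's API).
from typing import Callable, Dict

# Each "client" is just a callable taking a prompt and returning a reply.
# Real implementations would call the provider SDKs instead of echoing.
def make_stub(provider: str) -> Callable[[str], str]:
    return lambda prompt: f"[{provider}] {prompt}"

CLIENTS: Dict[str, Callable[[str], str]] = {
    "claude": make_stub("claude"),
    "grok": make_stub("grok"),
    "gemini": make_stub("gemini"),
    "copilot": make_stub("copilot"),
}

def fan_out(prompt: str) -> Dict[str, str]:
    """Send one prompt to every configured LLM and collect the replies."""
    return {name: client(prompt) for name, client in CLIENTS.items()}

if __name__ == "__main__":
    for name, reply in fan_out("Summarize the MCP handshake.").items():
        print(name, "->", reply)
```

Comparing the four replies side by side is one simple way an orchestration layer can score and refine its routing logic.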
- ⚡ Full MCP server + client implementation
- 🌐 Multi-LLM support (Claude • Grok • Gemini • Copilot)
- 🚀 orch/ layer trained with all four LLMs
- 🧪 Ultra-modern Python setup with uv
- 📁 Clean, well-documented structure
- 🔧 Ready to run in minutes
- 🇿🇦 Built in South Africa
👩‍💻 About Me

Kholofelo Robyn Rababalela
Freelance Web Developer · Computer Engineering Student
📍 Cape Town, Western Cape, South Africa

🔗 Connect With Me

LinkedIn · Ko-fi (Support my open-source work) · PayPal

Made with ❤️ in South Africa 🇿🇦
Star ⭐ this repo if you found it useful!
```shell
# 1. Clone the repo
git clone https://github.com/RobynAwesome/Introduction-to-MCP.git
cd Introduction-to-MCP

# 2. Install dependencies with uv
uv pip install -e .

# 3. Activate the virtual environment (Windows)
.\.CLI_Project\Scripts\activate
# or if using the default venv:
# .\.venv\Scripts\activate

# 4. Run the example
python main.py
```
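Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages, and a client session opens with an `initialize` request. The sketch below builds (but does not send) such a message; the field values are illustrative, and the exact shape should be checked against the current MCP specification.

```python
# Build an MCP-style "initialize" JSON-RPC 2.0 request (illustrative values;
# consult the MCP spec for the current protocol revision and capabilities).
import json

def build_initialize_request(request_id: int = 1) -> str:
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # example spec revision string
            "capabilities": {},               # client capabilities (none here)
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }
    return json.dumps(request)

if __name__ == "__main__":
    print(build_initialize_request())
```

The server answers with its own capabilities, after which the client can list and call the tools the server exposes.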
