

Introduction to MCP

Multi-LLM Model Context Protocol (MCP) Servers & Clients


👋 About the Project

This repository is a practical introduction to Anthropic’s Model Context Protocol (MCP) with full multi-LLM support.

It demonstrates how to build MCP servers and clients using:

  • Claude (native Anthropic SDK)
  • Grok (xAI)
  • Gemini (Google)
  • Copilot (GitHub)

The `orch/` layer uses all four LLMs together to train and improve its orchestration logic.
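
To make the idea concrete, here is a hypothetical sketch of what a layer that fans one prompt out to several LLM backends might look like. The class and function names (`LLMBackend`, `Orchestrator`, `ask_all`) are illustrative only and do not reflect this repository's actual `orch/` code.

```python
# Hypothetical sketch of a multi-LLM orchestration layer.
# All names here are illustrative, not the repository's real API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class LLMBackend:
    """One LLM provider: a name plus a prompt -> completion function."""
    name: str
    complete: Callable[[str], str]

class Orchestrator:
    """Fans a single prompt out to every backend and collects the replies."""
    def __init__(self, backends: list[LLMBackend]):
        self.backends = backends

    def ask_all(self, prompt: str) -> dict[str, str]:
        return {b.name: b.complete(prompt) for b in self.backends}

# Stub backends standing in for real Claude / Grok / Gemini / Copilot clients.
backends = [
    LLMBackend("claude", lambda p: f"[claude] {p}"),
    LLMBackend("grok", lambda p: f"[grok] {p}"),
]
orch = Orchestrator(backends)
print(orch.ask_all("ping"))
```

In a real setup each `complete` callable would wrap the provider's SDK call, so the orchestration logic stays independent of any one vendor.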


🛠️ Tech Stack

Python · uv · Anthropic MCP SDK

✨ Features

  • ⚡ Full MCP server + client implementation
  • 🌐 Multi-LLM support (Claude • Grok • Gemini • Copilot)
  • 🚀 orch/ layer trained with all four LLMs
  • 🧪 Ultra-modern Python setup with uv
  • 📁 Clean, well-documented structure
  • 🔧 Ready to run in minutes
  • 🇿🇦 Built in South Africa

📊 MCP Architecture

(MCP architecture diagram)

👩‍💻 About Me

Kholofelo Robyn Rababalela · Freelance Web Developer · Computer Engineering Student
📍 Cape Town, Western Cape, South Africa

🔗 Connect With Me

LinkedIn · Ko-fi (support my open-source work) · PayPal

Made with ❤️ in South Africa 🇿🇦. Star ⭐ this repo if you found it useful!

🚀 Quick Start

```bash
# 1. Clone the repo
git clone https://github.com/RobynAwesome/Introduction-to-MCP.git
cd Introduction-to-MCP

# 2. Install dependencies with uv
uv pip install -e .

# 3. Activate the virtual environment (Windows)
.\.CLI_Project\Scripts\activate
# or, if using the default venv:
# .\.venv\Scripts\activate

# 4. Run the example
python main.py
```

About

This repository focuses on building both MCP servers and clients using the Python SDK. It covers MCP's three core primitives (tools, resources, and prompts) and shows how they integrate with Claude, Gemini, Copilot, and Grok to create powerful applications without writing extensive integration code.
