A Model Context Protocol (MCP) server implementation for Pharo Smalltalk, enabling AI language models to interact with live Pharo environments through a standardized protocol.
LLM-Pharo-MCP implements the Model Context Protocol specification, providing a bridge between AI assistants (Claude, GPT, etc.) and live Pharo Smalltalk images. Through MCP, an AI model can invoke tools, read resources, and use prompts exposed by the running image.
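As a rough sketch of what this looks like from the Pharo side, the snippet below starts a server from a playground. The class and message names (`MCPServer`, `start`) are illustrative assumptions, not the project's actual API; see the API Reference for the real entry points.

```smalltalk
"Hypothetical sketch: class and selector names are illustrative only."
| server |
server := MCPServer new.   "assumed server class name"
server start.              "assumed message to begin accepting MCP requests"
```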
| Document | Description |
|---|---|
| Quick Start | Installation and first steps to get the server running |
| Architecture | System architecture and design decisions |
| API Reference | Complete class and method reference |
| Protocol Guide | MCP protocol details and message formats |
| Tools Guide | How to create and register tools |
| Resources Guide | How to expose resources and resource templates |
| Prompts Guide | How to define and use prompts |
| Transport Guide | Transport layer configuration |
| Testing Guide | How to test your MCP server and extensions |
This implementation targets MCP protocol version 2025-03-26.
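MCP is built on JSON-RPC 2.0, and a session begins with an `initialize` request that negotiates the protocol version. A client targeting this server would send a message along these lines (the `clientInfo` values are placeholders):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "1.0.0" }
  }
}
```

The server replies with its own capabilities and the agreed protocol version; see the Protocol Guide for the full message formats.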