ChatMCP is an AI chat client that implements the Model Context Protocol (MCP), enabling interaction with any available LLM (Large Language Model) backend through MCP servers.
To use ChatMCP, you need to install either `uvx` or `npx`, configure your LLM API key and endpoint, and then install the MCP server you want to communicate with. Detailed setup instructions are available in the repository.
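As a rough illustration of that last step, many MCP clients register servers through a JSON configuration that names the launcher command (`uvx` or `npx`) and its arguments. The exact file name and schema ChatMCP uses may differ; this is a minimal sketch following the common MCP convention, using the reference `mcp-server-fetch` server as an example:

```json
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```

With an entry like this, the client launches the server as a subprocess and exposes its tools to the LLM during chat.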
Yes! ChatMCP supports various LLMs, including OpenAI, Claude, and Ollama.
Yes! The project is open-source and free for everyone to use and contribute to.
You can submit issues or feature requests directly in the GitHub repository under the Issues section.