mcp-client-langchain-py
mcp-client-langchain-py is a simple CLI client for Model Context Protocol (MCP) servers, built in Python with LangChain. It lets developers quickly test MCP servers, explore their capabilities, and prototype integrations from the command line.
Simple MCP Client to Explore MCP Servers

Quickly test and explore MCP servers from the command line!
A simple, text-based CLI client for Model Context Protocol (MCP) servers built with LangChain and Python.
Suitable for testing MCP servers, exploring their capabilities, and prototyping integrations.
Internally it uses a LangChain ReAct Agent and the utility function convert_mcp_to_langchain_tools() from langchain_mcp_tools.
A TypeScript equivalent of this utility is available here.
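For orientation, here is a minimal sketch of how those pieces fit together. It is illustrative only: the signatures follow the langchain_mcp_tools documentation, and the model choice and query are arbitrary, not this tool's actual internals.

import asyncio

from langchain_anthropic import ChatAnthropic
from langchain_mcp_tools import convert_mcp_to_langchain_tools
from langgraph.prebuilt import create_react_agent

async def main():
    mcp_servers = {
        "us-weather": {
            "command": "npx",
            "args": ["-y", "@h1deya/mcp-server-weather"],
        },
    }
    # Convert the MCP servers' tools into LangChain-compatible tools;
    # cleanup() later shuts down the spawned server processes.
    tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)
    try:
        agent = create_react_agent(
            ChatAnthropic(model="claude-3-5-haiku-latest"), tools
        )
        result = await agent.ainvoke(
            {"messages": [("user", "Are there any weather alerts in California?")]}
        )
        print(result["messages"][-1].content)
    finally:
        await cleanup()

asyncio.run(main())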
Prerequisites
- Python 3.11+
- [optional] uv (uvx) installed to run Python package-based MCP servers
- [optional] npm 7+ (npx) to run Node.js package-based MCP servers
- LLM API key(s) from OpenAI, Anthropic, Google AI Studio (for GenAI/Gemini), xAI, Cerebras, and/or Groq, as needed
Quick Start
Install the mcp-chat tool. This can take up to a few minutes to complete:
pip install mcp-chat
Configure LLM and MCP server settings via the configuration file, llm_mcp_config.json5:
code llm_mcp_config.json5
The following is a simple configuration for quick testing:
{ "llm": { "provider": "openai", "model": "gpt-5-mini", // "provider": "anthropic", "model": "claude-3-5-haiku-latest", // "provider": "google_genai", "model": "gemini-2.5-flash", // "provider": "xai", "model": "grok-3-mini", // "provider": "cerebras", "model": "gpt-oss-120b", // "provider": "groq", "model": "openai/gpt-oss-20b", }, "mcp_servers": { "us-weather": { // US weather only "command": "npx", "args": ["-y", "@h1deya/mcp-server-weather"] }, }, "example_queries": [ "Tell me how LLMs work in a few sentences", "Are there any weather alerts in California?", ], }
Set up API keys
echo "ANTHROPIC_API_KEY=sk-ant-... OPENAI_API_KEY=sk-proj-... GOOGLE_API_KEY=AI... XAI_API_KEY=xai-... CEREBRAS_API_KEY=csk-... GROQ_API_KEY=gsk_..." > .env code .env
Run the tool
mcp-chat
By default, it reads the configuration file, llm_mcp_config.json5, from the current directory.
Then, it applies the environment variables specified in the .env file, as well as the ones that are already defined.
Features
- Easy setup: Works out of the box with popular MCP servers
- Flexible configuration: JSON5 config with environment variable support
- Multiple LLM/API providers: OpenAI, Anthropic, Google (GenAI), xAI, Cerebras, Groq
- Command & URL servers: Support for both local and remote MCP servers
- Local MCP Server logging: Save stdio MCP server logs with customizable log directory
- Interactive testing: Example queries for the convenience of repeated testing
Limitations
- Tool Return Types: Currently, only text results of tool calls are supported. Internally, it uses LangChain's response_format: 'content' (the default), which only supports text strings. While MCP tools can return multiple content types (text, images, etc.), this library currently filters and uses only text content (see the sketch after this list).
- MCP Features: Only MCP Tools are supported. Other MCP features like Resources, Prompts, and Sampling are not implemented.
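A hypothetical illustration of that text-only filtering follows; the data shapes mirror MCP's content items, but the filtering code itself is illustrative and not taken from this library:

# An MCP tool result may carry mixed content items; only the
# text parts survive conversion into a LangChain tool result.
mcp_result_content = [
    {"type": "text", "text": "Alert: high winds in CA"},
    {"type": "image", "data": "<base64...>", "mimeType": "image/png"},
]

text_only = "\n".join(
    item["text"] for item in mcp_result_content if item["type"] == "text"
)
print(text_only)  # "Alert: high winds in CA" -- the image is dropped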
Usage
Basic Usage
mcp-chat
By default, it reads the configuration file, llm_mcp_config.json5, from the current directory.
Then, it applies the environment variables specified in the .env file, as well as the ones that are already defined.
It outputs local MCP server logs to the current directory.
With Options
# Specify the config file to use
mcp-chat --config my-config.json5
# Store local (stdio) MCP server logs in specific directory
mcp-chat --log-dir ./logs
# Enable verbose logging
mcp-chat --verbose
# Show help
mcp-chat --help
Supported Model/API Providers
- OpenAI: gpt-5-mini, gpt-4.1-nano, etc.
- Anthropic: claude-sonnet-4-0, claude-3-5-haiku-latest, etc.
- Google (GenAI): gemini-2.5-flash, gemini-2.5-pro, etc.
- xAI: grok-3-mini, grok-4, etc.
- Cerebras: gpt-oss-120b, etc.
- Groq: openai/gpt-oss-20b, openai/gpt-oss-120b, etc.
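One way a provider/model pair from the config could be turned into a LangChain chat model is sketched below, using LangChain's init_chat_model helper; this is illustrative, not necessarily what the tool does internally:

from langchain.chat_models import init_chat_model

# Map the config's "provider"/"model" pair onto a chat model instance
llm = init_chat_model("gpt-4.1-nano", model_provider="openai")
# e.g., init_chat_model("claude-3-5-haiku-latest", model_provider="anthropic")
#       init_chat_model("gemini-2.5-flash", model_provider="google_genai")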
Configuration
Create a llm_mcp_config.json5 file:
- The configuration file format for MCP servers follows the same structure as Claude for Desktop, with one difference: the key name mcpServers has been changed to mcp_servers to follow the snake_case convention commonly used in JSON configuration files.
- The file format is JSON5, where comments and trailing commas are allowed.
- The format is further extended to replace ${...} notations with the values of the corresponding environment variables (a sketch of this substitution follows the example below).
- Keep all the credentials and private info in the .env file and refer to them with ${...} notation as needed.
{
"llm": {
"provider": "openai",
"model": "gpt-4.1-nano",
// model: "gpt-5-mini",
},
// "llm": {
// "provider": "anthropic",
// "model": "claude-3-5-haiku-latest",
// // "model": "claude-sonnet-4-0",
// },
// "llm": {
// "provider": "google_genai",
// "model": "gemini-2.5-flash",
// // "model": "gemini-2.5-pro",
// },
// "llm": {
// "provider": "xai",
// "model": "grok-3-mini",
// // "model": "grok-4",
// },
// "llm": {
// "provider": "cerebras",
// "model": "gpt-oss-120b",
// },
// "llm": {
// "provider": "groq",
// "model": "openai/gpt-oss-20b",
// // "model": "openai/gpt-oss-120b",
// },
"example_queries": [
"Tell me how LLMs work in a few sentences",
"Are there any weather alerts in California?",
"Read the news headlines on bbc.com",
],
"mcp_servers": {
// Local MCP server that uses `npx`
"weather": {
"command": "npx",
"args": [ "-y", "@h1deya/mcp-server-weather" ]
},
// Another local server that uses `uvx`
"fetch": {
"command": "uvx",
"args": [ "mcp-server-fetch" ]
},
// Embedding the value of an environment variable
"brave-search": {
"command": "npx",
"args": [ "-y", "@modelcontextprotocol/server-brave-search" ],
"env": { "BRAVE_API_KEY": "${BRAVE_API_KEY}" }
},
// Remote MCP server via URL
// Auto-detection: tries Streamable HTTP first, falls back to SSE
"remote-mcp-server": {
"url": "https://api.example.com/..."
},
// Server with authentication
"github": {
"type": "http", // recommended to specify the protocol explicitly when authentication is used
"url": "https://api.githubcopilot.com/mcp/",
"headers": {
"Authorization": "Bearer ${GITHUB_PERSONAL_ACCESS_TOKEN}"
}
},
// For MCP servers that require OAuth, consider using "mcp-remote"
"notion": {
"command": "npx",
"args": ["-y", "mcp-remote", "https://mcp.notion.com/mcp"],
},
}
}
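The ${...} substitution described above could be implemented roughly as follows. This is a hypothetical sketch; the expand_env_vars helper is invented for illustration and is not part of this project's API:

import os
import re

def expand_env_vars(raw: str) -> str:
    # Replace each ${VAR} occurrence with the value of the VAR
    # environment variable (empty string if VAR is unset)
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), raw)

os.environ["BRAVE_API_KEY"] = "BSA-example"
print(expand_env_vars('"env": { "BRAVE_API_KEY": "${BRAVE_API_KEY}" }'))
# "env": { "BRAVE_API_KEY": "BSA-example" }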
Environment Variables
Create a .env file for API keys:
OPENAI_API_KEY=sk-proj-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=AI...
XAI_API_KEY=xai-...
CEREBRAS_API_KEY=csk-...
GROQ_API_KEY=gsk_...
# Other services as needed
GITHUB_PERSONAL_ACCESS_TOKEN=github_pat_...
BRAVE_API_KEY=BSA...
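As noted earlier, .env values supplement rather than replace variables already defined in the environment. Assuming python-dotenv is used for loading (an assumption, not confirmed by this README), that behavior corresponds to:

from dotenv import load_dotenv

# override=False (the default) keeps variables that are already set,
# so real environment variables win over .env entries
load_dotenv(override=False)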
Popular MCP Servers to Try
There are quite a few useful MCP servers already available.
Troubleshooting
- Make sure your configuration and .env files are correct, especially the spelling of the API keys
- Check the local MCP server logs
- Use the --verbose flag to view detailed logs
- Refer to the Debugging Section in the MCP documentation
Change Log
Can be found here
Building from Source
See README_DEV.md for details.
License
MIT License - see LICENSE file for details.
Contributing
Issues and pull requests welcome! This tool aims to make MCP server testing as simple as possible.