MCP implementation that enables communication between MCP servers and OpenAI-compatible LLMs
Description:
A bridge connecting MCP servers to OpenAI-compatible LLMs through function calling
Category: Integration & Connectivity
Overview: This implementation provides a bidirectional protocol translation layer that enables communication between MCP servers and LLM endpoints. It primarily targets the OpenAI API, but it also works with local endpoints that implement the OpenAI API spec, such as Ollama and LM Studio.
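To make the translation concrete, here is an illustrative sketch (an assumption, not the project's actual code) of the MCP-to-OpenAI direction: an MCP tool definition, which carries a name, a description, and a JSON Schema under inputSchema, is wrapped into the tool entry the OpenAI chat completions API expects:

def mcp_tool_to_openai_tool(tool: dict) -> dict:
    # MCP tools expose name/description/inputSchema; OpenAI function
    # calling wants the same JSON Schema under "parameters" inside a
    # {"type": "function", ...} wrapper.
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool.get("inputSchema", {"type": "object", "properties": {}}),
        },
    }

The reverse direction works the same way in spirit: arguments the model supplies in a tool call are handed back to the MCP server as a tool invocation.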
Key features:
- Bidirectional translation between MCP tool schemas and OpenAI function-calling schemas
- Works with the OpenAI API as well as local OpenAI-compatible endpoints (Ollama, LM Studio)
- Example configuration pairing the bridge with an MCP SQLite server (see below)
Installation:
curl -LsSf https://astral.sh/uv/install.sh | sh
git clone https://github.com/bartolli/mcp-llm-bridge.git
cd mcp-llm-bridge
uv venv
source .venv/bin/activate
uv pip install -e .
Configuration Example:
{
  "mcp-llm-bridge": {
    "command": "uvx",
    "args": ["mcp-server-sqlite", "--db-path", "test.db"],
    "env": {
      "OPENAI_API_KEY": "your_key",
      "OPENAI_MODEL": "gpt-4o"
    }
  }
}
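For illustration, a bridge process might consume this block roughly as follows; the config.json filename and the loading code are assumptions rather than the project's documented API:

import json
import os

from openai import OpenAI

with open("config.json") as f:
    cfg = json.load(f)["mcp-llm-bridge"]

# The env block supplies the credentials and the model choice.
os.environ.update(cfg["env"])

client = OpenAI()  # reads OPENAI_API_KEY from the environment
model = os.environ["OPENAI_MODEL"]

# cfg["command"] and cfg["args"] describe how to spawn the MCP server,
# here: uvx mcp-server-sqlite --db-path test.db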
Supported LLM Endpoints:
- OpenAI API (primary target)
- Local endpoints implementing the OpenAI API spec, such as Ollama and LM Studio (see the sketch below)
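As a sketch of what "OpenAI-compatible" means in practice, the same client library can be pointed at a local server by overriding the base URL. The URL below is Ollama's default OpenAI-compatible endpoint (LM Studio's default is http://localhost:1234/v1); the model name is an assumption and should be whatever you have pulled locally:

from openai import OpenAI

# Ollama ignores the API key, but the client requires a non-empty value.
local = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
resp = local.chat.completions.create(
    model="llama3.2",  # assumption: any locally available model name
    messages=[{"role": "user", "content": "Say hello"}],
)
print(resp.choices[0].message.content)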
Testing:
uv pip install -e ".[test]"
python -m pytest -v tests/
Usage:
python -m mcp_llm_bridge.main
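Each user turn then follows the standard function-calling round trip. The sketch below is a hypothetical reconstruction of that loop, and call_mcp_tool stands in for whatever mechanism the bridge uses to invoke a tool on the MCP server:

import json

def run_turn(client, model, tools, messages, call_mcp_tool):
    # Loop until the model answers in plain text instead of calling a tool.
    while True:
        resp = client.chat.completions.create(
            model=model, messages=messages, tools=tools
        )
        msg = resp.choices[0].message
        if not msg.tool_calls:
            return msg.content
        messages.append(msg)  # keep the assistant's tool calls in history
        for call in msg.tool_calls:
            # Hypothetical: execute the tool on the MCP server.
            result = call_mcp_tool(
                call.function.name, json.loads(call.function.arguments)
            )
            messages.append({
                "role": "tool",
                "tool_call_id": call.id,
                "content": str(result),
            })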
Licensed under MIT. Compatible with various OpenAI models and with local OpenAI-compatible implementations.