Model Context Protocol (MCP) Demo with LangChain MCP Adapters and Ollama

Model Context Protocol (MCP) is an open-source protocol that lets large language models (LLMs) interact seamlessly with external tools, services, and data sources through a standardized interface, simplifying integration and extending AI capabilities.

What is MCP Server?

The MCP Server acts as a translator layer that allows LLMs to interact with external services (like math operations or weather data) in a standardized way. It converts service functionality into a format that LLMs can understand and use.
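This "translator layer" works by having each server publish machine-readable tool descriptors that any MCP client can discover. A minimal sketch of what such a descriptor looks like (field names follow the MCP `tools/list` result shape; the specific `add` tool here is an invented example):

```python
import json

def describe_add_tool() -> dict:
    """Return a tool descriptor like those an MCP math server advertises.

    Each tool carries a name, a description, and a JSON Schema for its
    input, so an LLM client knows exactly how to call it.
    """
    return {
        "name": "add",
        "description": "Add two integers",
        "inputSchema": {
            "type": "object",
            "properties": {
                "a": {"type": "integer"},
                "b": {"type": "integer"},
            },
            "required": ["a", "b"],
        },
    }

descriptor = describe_add_tool()
print(json.dumps(descriptor["inputSchema"]["required"]))  # → ["a", "b"]
```

Because the descriptor is plain JSON Schema rather than service-specific code, the same client logic can consume a math tool, a weather tool, or any other service without modification.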

How to use MCP Server?

1. Start the desired MCP server (math or weather)
2. Connect your LLM application via the MCP client
3. The server will advertise available tools
4. Your LLM can now use these tools through standardized requests
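Step 1 can be sketched with the official `mcp` Python SDK's FastMCP helper (the server name, tool functions, and file name here are illustrative; check your installed SDK version for exact APIs):

```python
# math_server.py -- a minimal MCP math server sketch.

def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

def serve() -> None:
    # Imported lazily so this file also loads without the SDK installed.
    from mcp.server.fastmcp import FastMCP

    server = FastMCP("Math")
    server.tool()(add)       # register plain functions as MCP tools
    server.tool()(multiply)
    server.run(transport="stdio")  # advertise tools over stdio (step 3)

# To start the server (step 1 above), call serve() from a __main__ block.
```

Once running over stdio, the server answers `tools/list` requests automatically; the functions' type hints and docstrings become the advertised tool schemas.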

Use Cases

- Building AI assistants that need real-world data
- Creating workflows that combine LLMs with existing APIs
- Developing applications where LLMs need to perform calculations
- Integrating LLMs with business databases securely

Key Features

Standardized Interface: Provides a consistent way for LLMs to interact with diverse services, similar to how REST standardized web services.
Multi-Service Support: Can connect to multiple services simultaneously (demonstrated in multiclient.py).
LangChain Integration: Works seamlessly with the LangChain framework through langchain-mcp-adapters.

Pros and Cons

Advantages
- Simplifies integration of LLMs with real-world tools
- Reduces maintenance overhead with a standardized protocol
- Future-proof architecture that supports new services
- Open-source and extensible

Limitations
- Requires setup of server components
- Currently has limited pre-built integrations
- Performance depends on underlying service response times

Getting Started

1. Install Requirements: Ensure you have Python installed along with the required packages (LangChain, Ollama).
2. Start Servers: Launch the desired MCP servers in separate terminal windows.
3. Run Client: Execute either the single-service client or the multi-service client.
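The multi-service client can be sketched with langchain-mcp-adapters' `MultiServerMCPClient` plus a LangGraph ReAct agent; the server paths, the HTTP URL, and the Ollama model name below are placeholders, and the exact adapter API may differ across versions:

```python
# multiclient.py sketch -- one agent connected to both demo servers.

SERVER_CONFIG = {
    "math": {
        "command": "python",
        "args": ["math_server.py"],  # placeholder path to the math server
        "transport": "stdio",
    },
    "weather": {
        "url": "http://localhost:8000/mcp",  # placeholder weather server URL
        "transport": "streamable_http",
    },
}

async def main() -> None:
    # Imported lazily so this file parses without the packages installed.
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langchain_ollama import ChatOllama
    from langgraph.prebuilt import create_react_agent

    client = MultiServerMCPClient(SERVER_CONFIG)
    tools = await client.get_tools()  # tools from every configured server
    agent = create_react_agent(ChatOllama(model="llama3.1"), tools)
    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "What is (3 + 5) * 12?"}]}
    )
    print(result["messages"][-1].content)

# Run with: asyncio.run(main())
```

Because both servers are declared in one config dict, the agent sees math and weather tools side by side and picks the right one per request.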

Example Scenarios

Financial Calculation: An AI assistant helping with shopping calculations
Travel Planning: Checking weather for trip planning

Frequently Asked Questions

Do I need to modify my LLM to use MCP?
Can I add my own custom services?
Is MCP secure for sensitive data?

Additional Resources

LangChain MCP Adapters: Library for using MCP with LangChain
Ollama Documentation: For managing local LLM models
MCP Protocol Discussion: Community forum for MCP development