# syncable-cli-mcp-server

High-performance **Model Context Protocol** (MCP) server for code analysis, security scanning, and project insights—written in Rust 🦀.

---

## Related Project

This MCP server exposes the capabilities of the [`syncable-cli`](https://crates.io/crates/syncable-cli) tool to AI agents. While `syncable-cli` is a standalone CLI for interacting with Syncable workspaces, this server acts as a bridge that lets AI agents and other clients access those CLI features programmatically via the Model Context Protocol (MCP). The two projects complement each other.

---

## Table of Contents

* [Features](#features)
* [Installation](#installation)
  * [CLI Binaries](#cli-binaries)
  * [Add to PATH](#add-to-path)
  * [Python Client Example](#python-client-example)
  * [LangGraph Agent Integration](#langgraph-agent-integration)
* [Configuration](#configuration)
* [Testing](#-testing)
* [License](#license)
* [Acknowledgments](#acknowledgments)
* [Known Issues](#known-issues)

---

## Features

* **Fast & Scalable**: Built with async Rust on the Tokio runtime
* **Multi-Protocol**: Supports both stdio and SSE (Server-Sent Events) transports
* **Security Scanning**: Static analysis and vulnerability detection
* **Extensible**: Easily add new MCP handlers and custom tools
* **Production-Ready**: Optimized release profile, structured logging, and CI integration

---

## Installation

`rust-mcp-server-syncable-cli` is published on [crates.io](https://crates.io/crates/rust-mcp-server-syncable-cli). You need a recent Rust toolchain (1.70+ recommended). AI agents built with LangGraph or similar frameworks can connect to this MCP server for code scanning.

### CLI Binaries

Install the server binaries from [crates.io](https://crates.io/crates/rust-mcp-server-syncable-cli):

```bash
cargo install rust-mcp-server-syncable-cli
```

This installs two binaries into your Cargo `bin` directory (usually `~/.cargo/bin`):

- `mcp-stdio` — stdin/stdout-based MCP server
- `mcp-sse` — HTTP/SSE-based MCP server

---

### Add to PATH

If you see a warning like:

> be sure to add `/Users/yourname/.cargo/bin` to your PATH to be able to run the installed binaries

Add the following to your shell profile.

For **zsh** (default on recent macOS):
```bash
echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc
```

For **bash**:
```bash
echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.bash_profile
source ~/.bash_profile
```

Verify installation:
```bash
which mcp-stdio
which mcp-sse
```
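
Alternatively, here is a small cross-platform check from Python using only the standard library; it reports whether each binary is resolvable on `PATH`:

```python
import shutil

# Look up each installed binary on PATH (portable alternative to `which`).
for exe in ("mcp-stdio", "mcp-sse"):
    path = shutil.which(exe)
    print(f"{exe}: {path or 'not found'}")
```

If a binary prints `not found`, revisit the PATH setup above.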

---

### Python Client Example

You can connect to the MCP server from Python using the [mcp](https://pypi.org/project/mcp/) client library or similar.
Below is an example using `mcp.client.stdio` to launch and communicate with the Rust MCP server via stdio:

```python
import asyncio
from mcp.client.session import ClientSession
from mcp.client.stdio import StdioServerParameters, stdio_client

async def main():
    async with stdio_client(
        StdioServerParameters(command="mcp-stdio")
    ) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Tools:", tools)
            about_info_result = await session.call_tool("about_info", {})
            print("About info result:", about_info_result)
            code_analyze_result = await session.call_tool("analysis_scan", {"path": ".", "display": "matrix"})
            print("Code analysis result:", code_analyze_result)
            security_scan_result = await session.call_tool("security_scan", {"path": "."})
            print("Security scan result:", security_scan_result)
            dependency_scan_result = await session.call_tool("dependency_scan", {"path": "."})
            print("Dependency scan result:", dependency_scan_result)

asyncio.run(main())
```

#### Using HTTP/SSE Mode

If you prefer HTTP/SSE, start the server with:

```bash
mcp-sse
```

Then connect from Python:

```python
import asyncio
from mcp.client.session import ClientSession
from mcp.client.sse import sse_client
from utils import render_utility_result  # Adjust import if needed

async def main():
    server_url = "http://127.0.0.1:8008/sse"
    async with sse_client(server_url) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List available tools
            tools = await session.list_tools()
            print("Tools:")
            render_utility_result(tools)

            # Call the 'about_info' tool
            about_info_result = await session.call_tool("about_info", {})
            print("About info result:")
            render_utility_result(about_info_result)

            # Call the 'analysis_scan' tool
            code_analyze_result = await session.call_tool("analysis_scan", {"path": "../", "display": "matrix"})
            print("Code analysis result:")
            render_utility_result(code_analyze_result)

            # Call the 'security_scan' tool
            security_scan_result = await session.call_tool("security_scan", {"path": "../"})
            print("Security scan result:")
            render_utility_result(security_scan_result)

            # Call the 'dependency_scan' tool
            dependency_scan_result = await session.call_tool("dependency_scan", {"path": "../"})
            print("Dependency scan result:")
            render_utility_result(dependency_scan_result)

if __name__ == "__main__":
    asyncio.run(main())
```

---

### LangGraph Agent Integration

You can use the LangGraph framework to connect to this MCP server in both stdio and SSE modes. Below are example Python scripts for each mode.

#### Using Stdio Mode

This example launches the `mcp-stdio` binary and connects via stdio:

```python
import asyncio
import os
from dotenv import load_dotenv

from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
import openai

load_dotenv()
openai.api_key = os.getenv("OPENAI_API_KEY")

async def main():
    client = MultiServerMCPClient({
        "syncable_cli": {
            # Adjust this path if needed—just needs to point
            # at your compiled mcp-stdio binary.
            "command": "../rust-mcp-server-syncable-cli/target/release/mcp-stdio",
            "args": [],            # no extra args
            "transport": "stdio",  # stdio transport
        }
    })

    tools = await client.get_tools()
    print(f"Fetched {len(tools)} tools:")
    for t in tools:
        print(f"  • {t.name}")

    agent = create_react_agent("openai:gpt-4o", tools)

    tests = [
        ("about_info",      "Call the 'about_info' tool."),
        ("analysis_scan",   "Call 'analysis_scan' on path '../' with display 'matrix'."),
        ("security_scan",   "Call 'security_scan' on path '../'."),
        ("dependency_scan", "Call 'dependency_scan' on path '../'."),
    ]

    for name, prompt in tests:
        print(f"\n--- {name} → {prompt}")
        resp = await agent.ainvoke({
            "messages": [{"role": "user", "content": prompt}]
        })
        print(resp)

if __name__ == "__main__":
    asyncio.run(main())
```

#### Using HTTP/SSE Mode

This example connects to the MCP server via HTTP/SSE:

```python
import asyncio
import os
from dotenv import load_dotenv

from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
import openai

load_dotenv()
openai.api_key = os.getenv("OPENAI_API_KEY")

async def main():
    # Use /sse here, since `mcp-sse` prints "Server is available at .../sse"
    client = MultiServerMCPClient({
        "demo": {
            "url": "http://127.0.0.1:8008/sse",
            "transport": "sse",
        }
    })

    tools = await client.get_tools()
    print(f"Fetched {len(tools)} tools from MCP server:")
    for t in tools:
        print(f"  • {t.name}")

    agent = create_react_agent("openai:gpt-4o", tools)

    prompts = [
        ("about_info",      "Call the 'about_info' tool."),
        ("analysis_scan",   "Call the 'analysis_scan' tool on path '../' with display 'matrix'."),
        ("security_scan",   "Call the 'security_scan' tool on path '../'."),
        ("dependency_scan", "Call the 'dependency_scan' tool on path '../'."),
    ]

    for name, prompt in prompts:
        print(f"\n--- Invoking {name} ---")
        resp = await agent.ainvoke({
            "messages": [{"role": "user", "content": prompt}]
        })
        print(resp)

if __name__ == "__main__":
    asyncio.run(main())
```

---

## Configuration

The SSE server port can be configured with the `MCP_PORT` environment variable.

- **`MCP_PORT`**: Sets the port for the SSE server.
  - **Default**: `8008`

Example of running the server on a custom port:

```bash
MCP_PORT=9000 mcp-sse
```
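
On the client side, you can read the same variable so the SSE URL stays in sync with whatever port the server was started on. A minimal sketch (the `/sse` path and default port match the examples above):

```python
import os

# Mirror the server's MCP_PORT setting; fall back to the default 8008.
port = int(os.environ.get("MCP_PORT", "8008"))
server_url = f"http://127.0.0.1:{port}/sse"
print(server_url)
```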

---

## 🛠️ Features

- **Multi-Transport:** Connect via stdio or SSE to the Rust MCP server.
- **Tooling:** List and invoke tools such as `about_info`, `analysis_scan`, `security_scan`, and `dependency_scan`.
- **LangGraph Integration:** Example agents using [LangGraph](https://github.com/langchain-ai/langgraph).
- **Extensible:** Easily add new tools or adapt to other agent frameworks.

---

## 🧪 Testing

Run the Rust test suite:

```bash
cargo test
```

---

## License

Licensed under the [MIT License](LICENSE).

---

## Acknowledgments

- Built on [rust-mcp-sdk](https://crates.io/crates/rust-mcp-sdk)
- Inspired by [Syncable CLI MCP Server](https://github.com/syncable-dev/syncable-cli-mcp-server)
- Thanks to the Rust and Python communities!

## Known Issues

- The LangGraph SSE integration was not functioning well while under development. (Fixed: JSON output is set to `true`.)
- When JSON output is enabled, the stdio transport has a message size limit of about 8 KB, which can cause the program to hang if the analysis scan result is too large. If you really need the stdio transport, try disabling JSON output in `analysis_scan`.
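
As a rough guard against this limit, a client can check the encoded size of a JSON payload before relying on the stdio transport. This is only a sketch: the 8 KB figure is the approximate limit mentioned above, and the payload is fabricated for illustration:

```python
import json

STDIO_LIMIT = 8 * 1024  # approximate stdio message size limit noted above

# Fabricated stand-in for a large analysis_scan result.
result = {"findings": [{"id": i, "detail": "x" * 40} for i in range(200)]}
payload = json.dumps(result)

if len(payload.encode("utf-8")) > STDIO_LIMIT:
    print("Result exceeds the stdio limit; prefer the SSE transport "
          "or disable JSON output for analysis_scan.")
```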