feat(mcp): add context length guard to prevent oversized requests #1416
Conversation
- Add MaxContext field to Config (default 0 = no limit)
- Add WithMaxContext() option for setting model context limits
- Add context_guard.go: token estimation + message truncation
- Integrate guard into both BuildMCPRequestBody and BuildRequestBodyFromRequest
- Support both map[string]string and map[string]any message formats
- Truncate oldest non-system messages when estimated tokens exceed the limit
- Always preserve system messages and keep at least 1 non-system message
- Log a warning when truncation occurs, for debugging

Usage: mcp.NewDeepSeekClient(mcp.WithMaxContext(131072))
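The truncation rules listed above can be sketched roughly as follows. This is a minimal illustration, not the PR's actual code: the `Message` type, `estimateTokens`, `truncateMessages`, and the ~4-characters-per-token heuristic are all assumptions.

```go
package main

import "fmt"

// Message is a simplified stand-in for a chat request message.
type Message struct {
	Role    string
	Content string
}

// estimateTokens uses a rough heuristic of ~4 characters per token
// (an assumption; the PR's estimator may differ).
func estimateTokens(msgs []Message) int {
	total := 0
	for _, m := range msgs {
		total += len(m.Content)/4 + 1
	}
	return total
}

// truncateMessages drops the oldest non-system messages until the
// estimate fits maxContext. System messages are always preserved,
// and at least one non-system message is kept. Note: this sketch
// reslices in place, so it mutates the input slice's backing array.
func truncateMessages(msgs []Message, maxContext int) []Message {
	if maxContext <= 0 { // 0 = no limit
		return msgs
	}
	for estimateTokens(msgs) > maxContext {
		nonSystem := 0
		for _, m := range msgs {
			if m.Role != "system" {
				nonSystem++
			}
		}
		if nonSystem <= 1 {
			break // keep at least one non-system message
		}
		for i, m := range msgs {
			if m.Role != "system" {
				msgs = append(msgs[:i], msgs[i+1:]...)
				break
			}
		}
	}
	return msgs
}

func main() {
	msgs := []Message{
		{"system", "You are helpful."},
		{"user", "old question ..."},
		{"assistant", "old answer ..."},
		{"user", "current question"},
	}
	// A tiny limit to force truncation of the two oldest
	// non-system messages; prints "system" then "user".
	for _, m := range truncateMessages(msgs, 10) {
		fmt.Println(m.Role)
	}
}
```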
🤖 Advisory Check Results

These are advisory checks to help improve code quality. They won't block your PR from being merged.

📋 PR Information
Title Format: ✅ Good - Follows Conventional Commits

🔧 Backend Checks
Go Formatting: Files needing formatting
Go Vet: ✅ Good

Fix locally:
go fmt ./...   # Format code
go vet ./...   # Check for issues
go test ./...  # Run tests

⚛️ Frontend Checks
Build & Type Check: ✅ Success

Fix locally:
cd web
npm run build  # Test build (includes type checking)

📖 Resources
Questions? Feel free to ask in the comments! 🙏

These checks are advisory and won't block your PR from being merged. This comment is automatically generated from pr-checks-run.yml.
Problem
When nofx calls AI models via claw402, messages can exceed a model's context limit (e.g. DeepSeek's 131K), causing 400 errors.
Solution
- MaxContext config field (default 0 = no limit, backward compatible)
- Supports both map[string]string and map[string]any message formats

Usage

mcp.NewDeepSeekClient(mcp.WithMaxContext(131072))
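The WithMaxContext() option presumably follows Go's usual functional-options pattern. A minimal sketch under that assumption (the Option type and the newConfig helper are hypothetical; only MaxContext and WithMaxContext appear in the PR):

```go
package main

import "fmt"

// Config holds client settings; MaxContext of 0 means no limit,
// which keeps existing callers backward compatible.
type Config struct {
	MaxContext int
}

// Option mutates a Config during client construction.
type Option func(*Config)

// WithMaxContext sets the model's context window in tokens.
func WithMaxContext(n int) Option {
	return func(c *Config) { c.MaxContext = n }
}

// newConfig applies options over defaults (hypothetical helper).
func newConfig(opts ...Option) *Config {
	c := &Config{} // MaxContext defaults to 0 = no limit
	for _, o := range opts {
		o(c)
	}
	return c
}

func main() {
	c := newConfig(WithMaxContext(131072)) // e.g. DeepSeek's 131K window
	fmt.Println(c.MaxContext)              // prints 131072
}
```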
Files Changed
- mcp/config.go — add MaxContext field
- mcp/options.go — add WithMaxContext() option
- mcp/context_guard.go — token estimation + truncation (both message types)
- mcp/client.go — integrate guard in BuildMCPRequestBody + BuildRequestBodyFromRequest
- mcp/context_guard_test.go — 6 test cases, all passing
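Since the guard must handle both map[string]string and map[string]any message shapes, a type switch is the natural way to read each message's content. An illustrative sketch only; the helper name contentOf is an assumption:

```go
package main

import "fmt"

// contentOf extracts the "content" field from either message shape
// the guard supports. Non-string content in map[string]any yields "".
func contentOf(msg any) string {
	switch m := msg.(type) {
	case map[string]string:
		return m["content"]
	case map[string]any:
		if s, ok := m["content"].(string); ok {
			return s
		}
	}
	return ""
}

func main() {
	a := map[string]string{"role": "user", "content": "hi"}
	b := map[string]any{"role": "user", "content": "hello"}
	fmt.Println(contentOf(a), contentOf(b)) // prints: hi hello
}
```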