Commit a77a2e5
fix: custom API endpoints for Codex/Claude/Gemini engines
- codex_engine: declare OPENAI_BASE_URL → AWF LLM proxy when firewall enabled
- claude_engine: declare ANTHROPIC_BASE_URL → AWF LLM proxy when firewall enabled
- awf_helpers: add GEMINI_API_BASE_URL → --gemini-api-target support
- codex_mcp: inject an openai-compat model_provider into config.toml when
  OPENAI_BASE_URL is set in engine.env, so any model name (e.g. openrouter/free)
  is accepted without the built-in openai provider falling back to gpt-5.3-codex
Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>

1 parent 9fe3771
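As a rough illustration of the codex_mcp change, the injected provider block could look something like the following sketch. The table name, proxy address, and key names here are assumptions for illustration, not taken from the diff:

```toml
# Hypothetical sketch of the provider block codex_mcp injects into
# config.toml; the actual keys and values written by the tool may differ.

# Route model requests through the custom provider instead of the
# built-in openai provider (which would fall back to its default model).
model_provider = "openai-compat"

[model_providers.openai-compat]
name = "openai-compat"
# Base URL taken from OPENAI_BASE_URL in engine.env; this placeholder
# address stands in for the AWF LLM proxy endpoint.
base_url = "http://127.0.0.1:8080/v1"
env_key = "OPENAI_API_KEY"
```

With an explicit provider entry like this, arbitrary model names (e.g. openrouter/free) pass through to the proxy unchanged rather than being validated against the built-in provider's model list.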
63 files changed (+394 −0 lines)

File tree:
- .github/workflows
- pkg/workflow