
feat: add CometAPI as LLM provider#2002

Open
KushalLukhi wants to merge 1 commit into mckaywrigley:main from KushalLukhi:feat/add-cometapi-provider

Conversation

@KushalLukhi

Description

Adds CometAPI as a first-class LLM provider. CometAPI provides OpenAI-compatible endpoints with competitive pricing.

Closes #1975

Changes

  • New provider file: lib/models/llm/cometapi-llm-list.ts
    • GPT-4o mini, GPT-4o, GPT-4 Turbo, GPT-3.5 Turbo
  • New API endpoint: app/api/chat/cometapi/route.ts
  • Type definitions updated (ModelProvider, LLMID, Supabase types)
  • Environment variable: COMETAPI_API_KEY
  • Base URL: https://api.cometapi.com/v1
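The new provider list file could look roughly like the sketch below. The field names are assumed to mirror the repo's other provider lists; they are not taken from the PR diff itself.

```typescript
// Hypothetical sketch of lib/models/llm/cometapi-llm-list.ts.
// Field names are assumptions based on how provider lists are
// typically structured in this repo, not the PR's actual code.
interface CometAPILLM {
  modelId: string;   // ID sent to the CometAPI endpoint
  modelName: string; // label shown in the model picker
  provider: string;  // "cometapi" routes requests to the new API route
}

export const COMETAPI_LLM_LIST: CometAPILLM[] = [
  { modelId: "gpt-4o-mini", modelName: "GPT-4o mini", provider: "cometapi" },
  { modelId: "gpt-4o", modelName: "GPT-4o", provider: "cometapi" },
  { modelId: "gpt-4-turbo", modelName: "GPT-4 Turbo", provider: "cometapi" },
  { modelId: "gpt-3.5-turbo", modelName: "GPT-3.5 Turbo", provider: "cometapi" }
];
```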

Environment Setup

Add to .env.local:

COMETAPI_API_KEY=your_api_key

Or set per-user in profile settings.
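Since CometAPI is OpenAI-compatible, the route can forward chat requests by pointing a standard chat-completions call at the base URL with the configured key. The helper below is a hypothetical sketch of that request construction, not the PR's actual route code; `buildCometRequest` and its shape are assumptions.

```typescript
// Hedged sketch: build an OpenAI-compatible chat-completions request
// for CometAPI. In the real route, apiKey would come from
// process.env.COMETAPI_API_KEY or the user's profile settings.
const COMETAPI_BASE_URL = "https://api.cometapi.com/v1";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

export function buildCometRequest(
  apiKey: string,
  model: string,
  messages: ChatMessage[]
) {
  return {
    url: `${COMETAPI_BASE_URL}/chat/completions`,
    method: "POST" as const,
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json"
    },
    // stream: true requests server-sent events, matching the
    // streaming behavior the PR description mentions
    body: JSON.stringify({ model, messages, stream: true })
  };
}
```

The route would then pass this to `fetch` and stream the response body back to the client.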

Benefits

  • Provider redundancy for reliability
  • Cost flexibility with competitive pricing
  • OpenAI-compatible (streaming, chat completions)
  • No workflow changes required

Testing

  • Model selection works in UI
  • API requests route to CometAPI
  • Streaming responses work
  • API key validation works
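The streaming check can be exercised with a small parser for OpenAI-style SSE chunks. This is a sketch under the assumption that CometAPI's stream format matches OpenAI's `data: {...}` / `data: [DONE]` convention.

```typescript
// Hedged sketch: extract the text deltas from an OpenAI-style
// server-sent-events stream body. Assumes CometAPI emits the same
// chunk shape as OpenAI's chat-completions streaming API.
export function extractDeltas(sseText: string): string[] {
  const deltas: string[] = [];
  for (const line of sseText.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue; // skip blanks/comments
    const payload = trimmed.slice(5).trim();
    if (payload === "[DONE]") break; // end-of-stream sentinel
    const parsed = JSON.parse(payload);
    const content = parsed.choices?.[0]?.delta?.content;
    if (typeof content === "string") deltas.push(content);
  }
  return deltas;
}
```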



