feat(app): adding Preferences/Wingman for OpenAI and ollama (vibe coded)#1558

Open
dvorka wants to merge 1 commit into enh-1539/ollama-llm-choice from enh-1539/ollama-llm-choice-ii-vibe

Conversation


dvorka commented Feb 8, 2026

No description provided.


augmentcode bot commented Feb 8, 2026

🤖 Augment PR Summary

Summary: Adds a new “Wingman2” Preferences tab for managing multiple LLM provider profiles (OpenAI/ollama).
Changes:

  • Registers new dialog sources/headers in app/app.pro.
  • Implements Wingman2Tab UI with provider dropdown, details panel, and Add/Edit/Remove/Test actions.
  • Adds dialogs to select a provider type and configure OpenAI/ollama settings (API key/URL + model selection).
  • Extends Configuration with LlmProviderConfig, provider collection APIs, active-provider tracking, and a legacy-migration helper.
  • Fixes OllamaWingman::listModels() to store the parsed model name.
  • Enhances OpenAiWingman::listModels() with an HTTP GET to /v1/models and fallback defaults.
  • Adds a detailed design doc for Wingman2 LLM configuration.
Technical Notes: Provider “probe” routines currently do basic validation and mark configs as valid/invalid for UI feedback.



augmentcode bot left a comment


Review completed. 7 suggestions posted.


Comment augment review to trigger a new review at any time.


#include <QtWidgets>

#include "../../lib/src/config/configuration.h"

The relative include ../../lib/src/config/configuration.h resolves to app/src/lib/... from this header, which doesn’t exist in the repo, so these new dialogs likely won’t compile. Same issue also appears in the OpenAI/ollama config dialog headers.

Severity: high

Other Locations
  • app/src/qt/dialogs/openai_config_dialog.h:25
  • app/src/qt/dialogs/openai_config_dialog.h:26
  • app/src/qt/dialogs/ollama_config_dialog.h:25
  • app/src/qt/dialogs/ollama_config_dialog.h:26


QPushButton* addProviderButton;

QGroupBox* providerDetailsGroup;
QLabel* providerTypeLabel;

Wingman2Tab declares providerTypeLabel/modelLabel/statusLabel members, but the .cpp constructs local QLabel*s instead; these members remain uninitialized. This is easy to accidentally dereference later and crash.

Severity: low


    : QWidget(parent),
      config(Configuration::getInstance())
{
helpLabel = new QLabel(

The help text says Wingman2 config is “to be used by Wingman”, but the current runtime initialization path still appears to read only the legacy wingmanProvider/wingmanOpenAi*/wingmanOllama* fields. This may make the new tab misleading if Wingman doesn’t actually consume llmProviders yet.

Severity: medium


// populate providers combo
llmProvidersCombo->clear();

vector<LlmProviderConfig>& providers = config.getLlmProviders();

Wingman2Tab::refresh() populates from config.getLlmProviders() directly; since migrateFromLegacyWingmanConfig() isn’t invoked here, existing Wingman users may see an empty provider list even if legacy config is set.

Severity: medium
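The fix the reviewer hints at can be sketched in isolation: run the one-time legacy migration before the UI reads the provider list, so existing Wingman users see their provider on first refresh. Names here (LegacyConfig fields, getLlmProvidersMigrated) are illustrative stand-ins, not the PR's actual API:

```cpp
#include <string>
#include <vector>

// Hypothetical provider record mirroring the shape used in the PR.
struct LlmProviderConfig {
    std::string id;
    std::string apiKey;
};

struct Config {
    std::vector<LlmProviderConfig> llmProviders;
    std::string legacyOpenAiApiKey;  // stand-in for the legacy wingmanOpenAiApiKey field

    // Migrate legacy config exactly once, then hand back the provider list.
    std::vector<LlmProviderConfig>& getLlmProvidersMigrated() {
        if (llmProviders.empty() && !legacyOpenAiApiKey.empty()) {
            llmProviders.push_back({"openai-migrated", legacyOpenAiApiKey});
        }
        return llmProviders;
    }
};
```

Wingman2Tab::refresh() would then call the migrating accessor (or invoke migrateFromLegacyWingmanConfig() explicitly) instead of reading the raw vector.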


if (providerType == WINGMAN_PROVIDER_OPENAI) {
    OpenAiConfigDialog configDialog(this);
    if (configDialog.exec() == QDialog::Accepted) {
        config.addLlmProvider(configDialog.getProviderConfig());

Providers added via config.addLlmProvider() appear to be stored only in memory; MarkdownConfigurationRepresentation currently persists only the legacy Wingman fields. That means Wingman2 providers/activeLlmProviderId may be lost after restart.

Severity: high
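To illustrate the persistence gap, here is a minimal, hypothetical round-trip of the new provider fields through a flat tab-separated format. The repo's real persistence goes through MarkdownConfigurationRepresentation, which would need equivalent read/write support for llmProviders and activeLlmProviderId:

```cpp
#include <sstream>
#include <string>
#include <vector>

// Illustrative subset of the PR's provider record.
struct LlmProviderConfig {
    std::string id;
    std::string url;
    std::string llmModel;
};

// Serialize one provider per line: id <TAB> url <TAB> model.
std::string serializeProviders(const std::vector<LlmProviderConfig>& ps) {
    std::ostringstream out;
    for (const auto& p : ps) {
        out << p.id << '\t' << p.url << '\t' << p.llmModel << '\n';
    }
    return out.str();
}

// Parse the same format back into provider records.
std::vector<LlmProviderConfig> parseProviders(const std::string& s) {
    std::vector<LlmProviderConfig> ps;
    std::istringstream in{s};
    std::string line;
    while (std::getline(in, line)) {
        std::istringstream fields{line};
        LlmProviderConfig p;
        std::getline(fields, p.id, '\t');
        std::getline(fields, p.url, '\t');
        std::getline(fields, p.llmModel, '\t');
        ps.push_back(p);
    }
    return ps;
}
```

Whatever concrete format the repo chooses, the point is that every field the Wingman2 UI edits must survive a save/load cycle.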



// temporarily set API key to fetch models
string originalKey = config.getWingmanOpenAiApiKey();
if (!apiKey.empty()) {

If the API key is provided only via MINDFORGER_OPENAI_API_KEY, handleRefresh() doesn’t inject it into config before constructing OpenAiWingman, so the refresh call likely sends an empty Authorization header and fails.

Severity: medium
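A small sketch of the fallback the reviewer suggests: prefer the configured key, otherwise read MINDFORGER_OPENAI_API_KEY from the environment before constructing OpenAiWingman. The function name is hypothetical:

```cpp
#include <cstdlib>
#include <string>

// Resolve the OpenAI API key: dialog/config value first, then the
// MINDFORGER_OPENAI_API_KEY environment variable, else empty.
std::string resolveOpenAiApiKey(const std::string& configuredKey) {
    if (!configuredKey.empty()) {
        return configuredKey;
    }
    const char* envKey = std::getenv("MINDFORGER_OPENAI_API_KEY");
    return envKey ? std::string{envKey} : std::string{};
}
```

handleRefresh() could inject the resolved key into config before the model-list fetch, so the Authorization header is never empty when the user relies on the environment variable.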


modelValue->setText(QString::fromStdString(provider->llmModel));

if (provider->isValid) {
    statusValue->setText(tr("Configured ✓"));

The status text uses a Unicode checkmark in tr("Configured ✓"); this can be problematic for fonts/encoding and also conflicts with the repo guideline about avoiding ✓/✗ symbols in user-facing messaging.

Severity: low



Copilot AI left a comment


Pull request overview

Adds a new “Wingman2” preferences workflow to manage multiple LLM provider configurations (OpenAI + ollama), alongside groundwork in Configuration for storing providers, plus improvements to model discovery and a small ollama bug fix.

Changes:

  • Introduces Wingman2 UI (tab + dialogs) to add/configure LLM providers and select an active provider.
  • Extends Configuration with a LlmProviderConfig model and basic CRUD/probe/migration APIs.
  • Enhances OpenAI model listing via /v1/models and fixes an ollama model-list parsing bug.

Reviewed changes

Copilot reviewed 19 out of 19 changed files in this pull request and generated 22 comments.

Summary per file:

  • vibe/designs/wingman2-llm-configuration.md — Design spec for Wingman2 provider configuration UX, storage, and probing.
  • lib/src/mind/ai/llm/openai_wingman.h — Adds declaration for HTTP model listing helper.
  • lib/src/mind/ai/llm/openai_wingman.cpp — Implements OpenAI /v1/models model fetching and fallback defaults.
  • lib/src/mind/ai/llm/ollama_wingman.cpp — Fixes model name push bug in model listing.
  • lib/src/config/configuration.h — Adds Wingman2 provider config struct, defaults, and new public APIs.
  • lib/src/config/configuration.cpp — Implements provider CRUD, active selection, probe stubs, and migration helper.
  • app/src/qt/dialogs/configuration_dialog.h — Adds Wingman2Tab declaration and members.
  • app/src/qt/dialogs/configuration_dialog.cpp — Adds Wingman2 tab UI/handlers and wires it into Preferences show/save.
  • app/src/qt/dialogs/openai_config_dialog.{h,cpp} — New OpenAI provider dialog (API key + model + refresh/probe/add).
  • app/src/qt/dialogs/ollama_config_dialog.{h,cpp} — New ollama provider dialog (URL + model + refresh/probe/add).
  • app/src/qt/dialogs/add_llm_provider_dialog.{h,cpp} — New dialog to choose provider type before configuring.
  • app/app.pro — Registers the new dialog sources/headers with qmake.
  • lib/src/app_info.h — Updates legal copyright year range.
  • build/ubuntu/debian/copyright — Updates Debian packaging copyright year range.
  • build/debian/debian/copyright — Updates Debian packaging copyright year range.
  • .github/copilot-instructions.md — Adds repository-specific Copilot guidance.


Comment on lines 48 to +82
OpenAiWingman::OpenAiWingman()
    : Wingman(WingmanLlmProviders::WINGMAN_PROVIDER_OPENAI),
      config(Configuration::getInstance()),
      llmModels{},
      defaultLlmModel{LLM_GPT_35_TURBO}
{
    // API key to be always read from config as it might be reconfigured
    // during MindForger run
    MF_DEBUG(
        "OpenAiWingman::OpenAiWingman() apiKey: '"
        << config.getWingmanOpenAiApiKey() << "'" << endl);

    listModels();
}

OpenAiWingman::~OpenAiWingman()
{
}

std::vector<std::string>& OpenAiWingman::listModels()
{
    llmModels.clear();

    // try to fetch models from OpenAI API
    try {
        listModelsHttpGet();
    } catch (...) {
        MF_DEBUG("OpenAiWingman::listModels() failed to fetch from API, using defaults" << endl);
    }

    // if API call failed or returned no models, use defaults
    if (llmModels.empty()) {
        llmModels.push_back(LLM_GPT_35_TURBO);
        llmModels.push_back(LLM_GPT_4);
    }

Copilot AI Feb 8, 2026


OpenAiWingman::listModels() now performs a blocking network call (via listModelsHttpGet()) and is invoked from the constructor. This can stall UI flows that instantiate OpenAiWingman (e.g., model refresh dialog) and makes behavior depend on network latency/availability. Consider avoiding network I/O in the constructor and adding explicit timeouts (Qt + cURL) so failures return quickly.
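One way to address this, sketched here with illustrative names rather than the PR's actual classes, is to defer the network fetch until the model list is first requested, keeping construction cheap and deterministic:

```cpp
#include <functional>
#include <string>
#include <vector>

// Lazily-populated model list: the (possibly slow) fetch callback runs only
// on the first get(), never in a constructor.
class LazyModelList {
    std::vector<std::string> models;
    bool fetched{false};
    std::function<std::vector<std::string>()> fetch;
public:
    explicit LazyModelList(std::function<std::vector<std::string>()> f)
        : fetch{std::move(f)} {}

    const std::vector<std::string>& get() {
        if (!fetched) {        // network call deferred until actually needed
            models = fetch();
            fetched = true;
        }
        return models;
    }
};
```

The fetch callback itself would still want explicit Qt/cURL timeouts so that a slow endpoint fails fast instead of blocking the dialog.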

Comment on lines +163 to +168
MF_DEBUG(
    "OpenAiWingman::listModelsHttpGet() parsed response:" << endl
    << ">>>"
    << httpResponseJson.dump(4)
    << "<<<"
    << endl);

Copilot AI Feb 8, 2026


listModelsHttpGet() logs the full parsed JSON response (httpResponseJson.dump(4)). The /v1/models payload can be large, which may bloat logs and add noticeable overhead. Consider logging only a summary (e.g., model count / first N IDs) or gating the full dump behind a more verbose debug flag.

Suggested change
MF_DEBUG(
    "OpenAiWingman::listModelsHttpGet() parsed response:" << endl
    << ">>>"
    << httpResponseJson.dump(4)
    << "<<<"
    << endl);
std::size_t modelsCount = 0;
if (httpResponseJson.contains("data") && httpResponseJson["data"].is_array()) {
    modelsCount = httpResponseJson["data"].size();
}
MF_DEBUG(
    "OpenAiWingman::listModelsHttpGet() parsed response: "
    << "data array size = " << modelsCount << endl);

Comment on lines +43 to +45
AddLlmProviderDialog(const AddLlmProviderDialog&&) = delete;
AddLlmProviderDialog& operator=(const AddLlmProviderDialog&) = delete;
AddLlmProviderDialog& operator=(const AddLlmProviderDialog&&) = delete;

Copilot AI Feb 8, 2026


AddLlmProviderDialog also declares deleted move operations as const AddLlmProviderDialog&& / operator=(const AddLlmProviderDialog&&). This isn’t the normal move signature and can be misleading; use AddLlmProviderDialog&& (or omit move deletions if not needed).

Suggested change
AddLlmProviderDialog(const AddLlmProviderDialog&&) = delete;
AddLlmProviderDialog& operator=(const AddLlmProviderDialog&) = delete;
AddLlmProviderDialog& operator=(const AddLlmProviderDialog&&) = delete;
AddLlmProviderDialog(AddLlmProviderDialog&&) = delete;
AddLlmProviderDialog& operator=(const AddLlmProviderDialog&) = delete;
AddLlmProviderDialog& operator=(AddLlmProviderDialog&&) = delete;

Comment on lines +421 to +423
// Wingman2: collection of configured LLM providers
std::vector<LlmProviderConfig> llmProviders;
std::string activeLlmProviderId;

Copilot AI Feb 8, 2026

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

New Wingman2 state (llmProviders, activeLlmProviderId) is added, but Configuration::clear() currently doesn't reset these fields. This can lead to stale providers persisting across clear() calls (tests, reloads, etc.). Ensure clear() clears the providers vector and active ID to keep configuration resets complete.
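A minimal sketch of the suggested fix, using a stand-in struct rather than the real Configuration class — clear() must reset the new Wingman2 fields along with everything else:

```cpp
#include <string>
#include <vector>

// Illustrative provider record.
struct LlmProviderConfig {
    std::string id;
};

// Stand-in for the Wingman2 slice of Configuration state.
struct ConfigState {
    std::vector<LlmProviderConfig> llmProviders;
    std::string activeLlmProviderId;

    void clear() {
        llmProviders.clear();         // forget all configured providers
        activeLlmProviderId.clear();  // and the active selection
    }
};
```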

Comment on lines +1339 to +1361
string errorMessage;
bool success = false;

if (provider->providerType == WINGMAN_PROVIDER_OPENAI) {
    success = config.probeOpenAiProvider(
        provider->apiKey, provider->llmModel, errorMessage);
} else if (provider->providerType == WINGMAN_PROVIDER_OLLAMA) {
    success = config.probeOllamaProvider(
        provider->url, provider->llmModel, errorMessage);
}

if (success) {
    QMessageBox::information(
        this,
        tr("Connection Test"),
        tr("Provider configuration is valid."));
} else {
    QMessageBox::critical(
        this,
        tr("Connection Test"),
        tr("Provider configuration test failed: %1")
            .arg(QString::fromStdString(errorMessage)));
}

Copilot AI Feb 8, 2026


handleTestConnection() reports success/failure but never updates provider->isValid, so the status in the “Selected Provider Details” panel won’t reflect the latest test result. Consider setting provider->isValid = success (and persisting via updateLlmProvider if needed) and then refreshing the UI.

Comment on lines +106 to +114
void OpenAiConfigDialog::show()
{
    // load current config if any
    apiKeyEdit->setText(QString::fromStdString(config.getWingmanOpenAiApiKey()));
    llmModelCombo->setCurrentText(LLM_MODEL_GPT35_TURBO);
    configValid = false;

    QDialog::show();
}

Copilot AI Feb 8, 2026

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

OpenAiConfigDialog::show() is used to populate fields from Configuration, but the dialog is shown via exec() in Wingman2Tab, and exec() does not call your overridden show() implementation. As a result, the API key/model initialization logic won’t run. Move initialization into the constructor, showEvent(), or call an explicit init method before exec().

- Library dependencies are stored in `deps/` - each have its own build style.
- Library code lives under `lib/`.
- Qt application code lives under `app/`.
- Tests code lives under `lib/tests/`.

Copilot AI Feb 8, 2026


The Copilot project instructions claim tests live under lib/tests/, but this repo uses lib/test/ (and the design doc also references lib/test/src/...). Update this path to avoid steering contributors/tools to a non-existent directory.

Suggested change
- Tests code lives under `lib/tests/`.
- Tests code lives under `lib/test/`.

Comment on lines +203 to +219
// generate unique ID using timestamp
auto now = chrono::system_clock::now();
auto timestamp = chrono::duration_cast<chrono::seconds>(now.time_since_epoch()).count();

// extract host from URL for display name
string host = url;
size_t pos = url.find("://");
if (pos != string::npos) {
    host = url.substr(pos + 3);
}

providerConfig.id = "ollama-" + to_string(timestamp);
providerConfig.displayName = "ollama " + model + " @ " + host;
providerConfig.providerType = WINGMAN_PROVIDER_OLLAMA;
providerConfig.url = url;
providerConfig.llmModel = model;
providerConfig.isValid = configValid;

Copilot AI Feb 8, 2026

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Provider IDs are generated using seconds-resolution timestamps (e.g., ollama-<seconds>), which can collide if the user adds providers quickly. Since IDs are used as stable keys for update/remove/active selection, consider generating IDs with higher entropy (ms + random/UUID) or checking for collisions and regenerating.
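A sketch of the higher-entropy scheme the reviewer suggests — millisecond timestamp plus a random hex suffix instead of seconds-only resolution. The function name is hypothetical:

```cpp
#include <chrono>
#include <random>
#include <sstream>
#include <string>

// Generate a provider ID like "ollama-1770000000123-3fa2c1":
// millisecond timestamp plus a 24-bit random suffix, so two providers
// added within the same second (or millisecond) still get distinct IDs.
std::string makeProviderId(const std::string& prefix) {
    using namespace std::chrono;
    auto ms = duration_cast<milliseconds>(
        system_clock::now().time_since_epoch()).count();
    static std::mt19937_64 rng{std::random_device{}()};
    std::uniform_int_distribution<unsigned> dist{0, 0xFFFFFF};
    std::ostringstream id;
    id << prefix << '-' << ms << '-' << std::hex << dist(rng);
    return id.str();
}
```

Since IDs are used as stable keys for update/remove/active selection, addLlmProvider() should still check for collisions and regenerate on the (now vanishingly rare) duplicate.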

return getLlmProviderById(activeLlmProviderId);
}

void Configuration::addLlmProvider(const LlmProviderConfig& provider) {

Copilot AI Feb 8, 2026


addLlmProvider() blindly appends the provider without enforcing unique IDs (despite id being documented as unique). This can create duplicate IDs (e.g., two providers added within the same second using timestamp IDs) and then getLlmProviderById()/setActiveLlmProvider() become ambiguous. Consider rejecting duplicates (and surfacing an error) or auto-renaming/regenerating the ID on collision.

Suggested change
void Configuration::addLlmProvider(const LlmProviderConfig& provider) {
void Configuration::addLlmProvider(const LlmProviderConfig& provider) {
    // Enforce unique provider IDs to avoid ambiguous lookups.
    if (!provider.id.empty() && getLlmProviderById(provider.id) != nullptr) {
        MF_DEBUG("Configuration::addLlmProvider() duplicate ID, provider not added: " << provider.id << endl);
        return;
    }

MF_DEBUG("Configuration::removeLlmProvider() removed: " << id << endl);
}

void Configuration::setActiveLlmProvider(const string& id) {

Copilot AI Feb 8, 2026


setActiveLlmProvider() sets activeLlmProviderId without verifying the ID exists in llmProviders. This can leave the config pointing at a non-existent provider (e.g., after update/remove sequences or malformed persisted data). Consider validating the ID (no-op with error/log, or clear active selection) to keep getActiveLlmProvider() consistent.

Suggested change
void Configuration::setActiveLlmProvider(const string& id) {
void Configuration::setActiveLlmProvider(const string& id) {
    // Only set active provider if the ID is known; keep configuration consistent.
    if (!id.empty() && getLlmProviderById(id) == nullptr) {
        MF_DEBUG("Configuration::setActiveLlmProvider() attempted to set unknown provider id: " << id << endl);
        return;
    }
