
Implement container-based CI workflow with Docker image caching#6

Merged
Geekdude merged 9 commits into main from github-ci
Feb 6, 2026
Conversation

@Geekdude
Member

@Geekdude Geekdude commented Feb 5, 2026

  • Add Dockerfile (.github/Dockerfile) to build image with Python 3.11, hatch, and dependencies
  • Build Docker image in initial build job and push to ghcr.io with SHA and latest tags
  • Add automatic cleanup to retain only 5 most recent images (cost management)
  • Refactor test, coverage, and docs jobs to use container (eliminates venv recreation)
  • Enable CI on all branches and PRs for full merge request support
  • Ensure consistent environment across all dependent jobs
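A minimal sketch of such a Dockerfile, reconstructed from the fragments quoted in the review threads below (the base image is an assumption; the full file is not shown in this conversation):

```dockerfile
FROM python:3.11-slim   # base image assumed; actual tag not shown in this PR

WORKDIR /app

# Copy source code
COPY . .

# Install hatch (and the mkdocs plugin used by the docs job)
RUN pip install --no-cache-dir hatch hatch-mkdocs

# Convert SSH URLs to HTTPS in pyproject.toml
RUN bash ./scripts/pyproject_ssh_to_https.sh

# Install the package and all dependencies so CI jobs skip venv setup
RUN pip install --no-cache-dir .
```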

Copilot AI review requested due to automatic review settings February 5, 2026 18:09
@github-actions

github-actions Bot commented Feb 5, 2026

Format results

Metrics

lint_errors 0

Formatter output (last 200 lines)

Creating environment: hatch-static-analysis
Checking dependencies
Syncing dependencies
cmd [1] | ruff check --fix .
All checks passed!
cmd [2] | ruff format .
1 file reformatted, 21 files left unchanged


Copilot AI left a comment


Pull request overview

This PR implements a container-based CI workflow that builds a Docker image containing Python 3.11, hatch, and project dependencies, then uses this image across multiple CI jobs to eliminate redundant environment setup. The workflow includes automatic image cleanup to manage storage costs.

Changes:

  • Added Dockerfile that builds an image with Python 3.11, hatch, and all project dependencies pre-installed
  • Refactored CI workflow to build and push Docker images to GitHub Container Registry with SHA and latest tags
  • Migrated docs, test, and coverage jobs to use the container image instead of recreating virtual environments

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 12 comments.

File Description
.github/Dockerfile New Dockerfile that installs Python dependencies, hatch, and converts SSH URLs to HTTPS for package installation
.github/workflows/ci.yml Refactored build job to create Docker image; updated docs, test, and coverage jobs to run in containers; added image cleanup step
Comments suppressed due to low confidence (1)

.github/workflows/ci.yml:222

  • When using GitHub Actions containers, the actions/checkout action may encounter permission issues with git operations due to ownership mismatches between the container user and the checked-out files. Consider adding 'options: --user root' to the container configuration or using 'git config --global --add safe.directory /github/workspace' in a setup step to avoid potential git permission errors.
          name: coverage
          path: |


Comment thread .github/workflows/ci.yml Outdated
- name: Clean up old images
uses: actions/delete-package-versions@v4
with:
package-name: 'ChatHPC-app'

Copilot AI Feb 5, 2026


The package name 'ChatHPC-app' appears to be hardcoded, but this may not match the actual GitHub repository name format. GitHub container registry packages typically use lowercase names. The repository appears to be 'ORNL/ChatHPC-app' based on the README, so the package name in GHCR would likely be 'chathpc-app' (lowercase). This mismatch could cause the cleanup action to fail or target the wrong package.

Suggested change
package-name: 'ChatHPC-app'
package-name: ${{ toLower(github.event.repository.name) }}

Comment thread .github/Dockerfile Outdated
RUN pip install --no-cache-dir hatch hatch-mkdocs

# Convert SSH URLs to HTTPS in pyproject.toml
RUN ./scripts/pyproject_ssh_to_https.sh

Copilot AI Feb 5, 2026


The script execution assumes that the script file is executable, but the COPY command in Docker does not preserve file permissions. Consider either running the script with an explicit shell (e.g., 'RUN bash ./scripts/pyproject_ssh_to_https.sh') or adding a RUN command to make it executable before running it (e.g., 'RUN chmod +x ./scripts/pyproject_ssh_to_https.sh && ./scripts/pyproject_ssh_to_https.sh').

Suggested change
RUN ./scripts/pyproject_ssh_to_https.sh
RUN bash ./scripts/pyproject_ssh_to_https.sh

Comment on lines +28 to +31
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build and push Docker image
uses: docker/build-push-action@v5
with:

Copilot AI Feb 5, 2026


Building the Docker image with the entire source code context (context: .) may include sensitive files or unnecessary build artifacts. Consider using a .dockerignore file to exclude unnecessary files and reduce the image size and build time. Common exclusions include .git/, .venv/, __pycache__/, *.pyc, .pytest_cache/, coverage_html_report/, and test-results.xml.
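An example .dockerignore along these lines (entries taken directly from the comment above):

```text
.git/
.venv/
__pycache__/
*.pyc
.pytest_cache/
coverage_html_report/
test-results.xml
```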

@@ -15,43 +15,48 @@ jobs:
runs-on: ubuntu-latest

Copilot AI Feb 5, 2026


The build job pushes images to ghcr.io and uses actions/delete-package-versions to clean up old images, but there are no explicit permissions declared for the job. While GITHUB_TOKEN has default permissions, it's a best practice to explicitly declare required permissions using 'permissions:' block. Consider adding 'permissions: packages: write' to ensure the job has the necessary permissions to push images and delete package versions.

Suggested change
runs-on: ubuntu-latest
runs-on: ubuntu-latest
permissions:
packages: write

Comment thread .github/Dockerfile Outdated
Comment on lines +11 to +21
# Copy source code
COPY . .

# Install hatch
RUN pip install --no-cache-dir hatch hatch-mkdocs

# Convert SSH URLs to HTTPS in pyproject.toml
RUN ./scripts/pyproject_ssh_to_https.sh

# Install the package and all dependencies
RUN pip install --no-cache-dir .

Copilot AI Feb 5, 2026


The Dockerfile copies all source code before installing hatch and dependencies. This means any source code change will invalidate the Docker cache for all subsequent layers, including package installations. Consider reordering the Dockerfile to copy only pyproject.toml and necessary dependency files first, install dependencies, and then copy the rest of the source code. This would improve build times by leveraging Docker layer caching when only source code changes.
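A sketch of the reordering Copilot describes, with file names taken from the diff hunks quoted in this thread (to fully cache dependency installation, a real version would also need to install dependencies from metadata alone before the final COPY):

```dockerfile
FROM python:3.11-slim
WORKDIR /app

# Dependency metadata changes rarely; keep these layers cacheable
COPY pyproject.toml ./
COPY scripts/pyproject_ssh_to_https.sh scripts/
RUN pip install --no-cache-dir hatch hatch-mkdocs
RUN bash ./scripts/pyproject_ssh_to_https.sh

# Source changes now invalidate only the layers below this point
COPY . .
RUN pip install --no-cache-dir .
```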

Comment thread .github/workflows/ci.yml
Comment on lines 163 to 164
hatch run docs:build
- name: Upload docs site

Copilot AI Feb 5, 2026


When using GitHub Actions containers, the actions/checkout action may encounter permission issues with git operations due to ownership mismatches between the container user and the checked-out files. Consider adding 'options: --user root' to the container configuration or using 'git config --global --add safe.directory /github/workspace' in a setup step to avoid potential git permission errors.
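A sketch of the container configuration the reviewer suggests; the job name, image name, and tag are assumptions, not the merged file:

```yaml
docs:
  runs-on: ubuntu-latest
  container:
    image: ghcr.io/ornl/chathpc-app:${{ github.sha }}  # image name assumed
    options: --user root
  steps:
    - uses: actions/checkout@v4
    # Avoid "dubious ownership" git errors inside the container
    - run: git config --global --add safe.directory "$GITHUB_WORKSPACE"
```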

Comment thread .github/workflows/ci.yml Outdated
Comment on lines 183 to 184
hatch run test -- --junit-xml=test-results.xml
- name: Publish test results

Copilot AI Feb 5, 2026


When using GitHub Actions containers, the actions/checkout action may encounter permission issues with git operations due to ownership mismatches between the container user and the checked-out files. Consider adding 'options: --user root' to the container configuration or using 'git config --global --add safe.directory /github/workspace' in a setup step to avoid potential git permission errors.

package-name: 'ChatHPC-app'
package-type: 'container'
min-versions-to-keep: 5
delete-only-untagged-versions: false

Copilot AI Feb 5, 2026


The 'delete-only-untagged-versions' parameter is set to false, which means this cleanup will delete tagged versions (including the 'latest' tag) when more than 5 versions exist. This could break concurrent or subsequent jobs that depend on the 'latest' tag if they start after cleanup completes. Consider setting this to true to only delete untagged versions, or restructure the cleanup to run in a separate workflow after all jobs complete.

Suggested change
delete-only-untagged-versions: false
delete-only-untagged-versions: true

Comment thread Dockerfile
Comment on lines +11 to +12
# Copy source code
COPY . .

Copilot AI Feb 5, 2026


The Docker image includes the entire source code at build time (line 12: COPY . .), but the container jobs also check out the code again (line 166: actions/checkout@v4). This creates a redundant copy of the source code and could lead to confusion about which version is being used. Consider either removing the COPY command from the Dockerfile and only checking out in the workflow, or removing the checkout steps from container jobs and relying on the image's built-in source.

with:
context: .
file: .github/Dockerfile
push: true

Copilot AI Feb 5, 2026


The Docker image is tagged with both the commit SHA and 'latest', but dependent jobs reference the SHA tag specifically (e.g., line 164, 184, 222). This is good for reproducibility, but if the image cleanup (lines 37-43) runs before dependent jobs start and there are more than 5 versions, it could delete the SHA-tagged image that the jobs depend on, causing failures. The cleanup should either run after all dependent jobs complete or be moved to a separate scheduled workflow.
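One way to apply this suggestion is a dedicated cleanup job gated on the dependent jobs; the job names below are assumptions based on the discussion, and the action parameters mirror those quoted earlier in this review:

```yaml
cleanup:
  runs-on: ubuntu-latest
  needs: [docs, test, coverage]   # assumed job names
  if: always()                    # run even if a dependent job failed
  permissions:
    packages: write
  steps:
    - uses: actions/delete-package-versions@v4
      with:
        package-name: 'chathpc-app'
        package-type: 'container'
        min-versions-to-keep: 5
        delete-only-untagged-versions: true
```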

@github-actions

github-actions Bot commented Feb 5, 2026

Test Results

29 tests  ±0   29 ✅ ±0   9s ⏱️ +5s
 1 suites ±0    0 💤 ±0 
 1 files   ±0    0 ❌ ±0 

Results for commit fb86557. ± Comparison against base commit 4cdcb3d.

♻️ This comment has been updated with latest results.

@Geekdude Geekdude force-pushed the github-ci branch 7 times, most recently from 598fc64 to 8d0c4ed on February 5, 2026 at 20:47
@github-actions

github-actions Bot commented Feb 5, 2026

☂️ Python Coverage

current status: ✅

Overall Coverage

Lines Covered Coverage Threshold Status
1148 491 43% 0% 🟢

New Files

No new covered files...

Modified Files

No covered modified files...

updated for commit: fb86557 by action🐍

@Geekdude Geekdude force-pushed the github-ci branch 3 times, most recently from 29b1cb6 to 63b1ba3 on February 5, 2026 at 21:13
@Geekdude Geekdude self-assigned this Feb 5, 2026
Collaborator

@williamfgc williamfgc left a comment


LGTM, nice job! I'd just pick a more representative name (e.g. CI-GitHub-runners). We can generate a badge after this is merged. Thanks!

@@ -1,5 +1,11 @@
name: CI
Collaborator


I'd pick a more representative name in case we want to add CI on ExCL in the future.

Member Author


Thanks, I've renamed it.

- Add Dockerfile (.github/Dockerfile) to build image with Python 3.11, hatch, and dependencies
- Build Docker image in initial build job and push to ghcr.io with SHA and latest tags
- Convert repository name to lowercase for Docker tags (Docker requirement)
- Add automatic cleanup to retain only 5 most recent images (cost management)
- Refactor all jobs (format, docs, test, scripts, coverage) to use container
- All jobs now depend on build and use consistent pre-built environment
- Run SSH-to-HTTPS conversion after checkout in container jobs (fixes git clone issue)
- Enable CI on all branches and PRs for full merge request support
- Ensure consistent environment across all jobs
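The lowercase conversion mentioned in the commit message above can be illustrated with a small shell snippet; the repository name here is taken from the review discussion as a stand-in for ${GITHUB_REPOSITORY}, and this is not the exact workflow step:

```shell
# Docker image names must be lowercase, so derive the tag from the repo name.
# "ORNL/ChatHPC-app" stands in for the real ${GITHUB_REPOSITORY} value.
REPO="ORNL/ChatHPC-app"
IMAGE="ghcr.io/$(echo "$REPO" | tr '[:upper:]' '[:lower:]')"
echo "$IMAGE"   # ghcr.io/ornl/chathpc-app
```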
This commit modernizes the GitHub CI workflow to match improvements from
the GitLab CI merge (feature-improve-python-template), bringing the two
CI systems into sync.

Major changes:

Docker image:
- Migrate from pip/hatch to uv for dependency management
- Install uv via official installer script
- Use `uv venv --seed --python 3.11` and `uv sync --all-groups`
- Activate venv by default via PATH environment variable
- Maintain aggressive cleanup of __pycache__/.pyc files for space efficiency
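A sketch of the uv-based image setup described in these bullets; the installer location and cleanup commands are assumptions, as the commit's actual Dockerfile is not shown here:

```dockerfile
FROM python:3.11-slim
WORKDIR /app

# Install uv via the official installer script (per the commit message)
RUN apt-get update && apt-get install -y --no-install-recommends curl ca-certificates \
    && curl -LsSf https://astral.sh/uv/install.sh | sh
ENV PATH="/root/.local/bin:$PATH"   # uv install location; may vary by version

COPY . .

# Create a seeded venv and sync all dependency groups
RUN uv venv --seed --python 3.11 && uv sync --all-groups

# Activate the venv by default via PATH
ENV PATH="/app/.venv/bin:$PATH"

# Aggressive cleanup of __pycache__/.pyc files for space efficiency
RUN find /app -type d -name __pycache__ -prune -exec rm -rf {} + \
    && find /app -name '*.pyc' -delete
```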

CI workflow:
- Add test step in build job to verify image with `uv run python`
- Update all job commands to use `uv run` prefix:
  * check_format: `uv run hatch fmt`
  * docs: `uv run hatch run docs:build`
  * test: `uv run hatch run test`
  * coverage: `uv run hatch run cov-html`

New check_type job:
- Add type checking with `uv run --all-groups ty check`
- Provide comprehensive reporting (artifacts, summaries, PR comments)
- Set as continue-on-error to allow workflow to proceed on failures
- Capture metrics (error count, exit code) and full output
- Post sticky PR comments with type check results

Benefits:
- Faster dependency resolution and installation with uv
- Better caching and reproducibility
- Consistency between GitLab and GitHub CI pipelines
- Enhanced visibility into type checking issues
@github-actions

github-actions Bot commented Feb 6, 2026

Type check results

Metrics

type_errors 7
exit_code 0

Type checker output (last 200 lines)

help: Remove the unused suppression comment
1111 |         test_data = load_json_yaml_arg(test_dataset, False)
1112 |         test_data_len = len(test_data)
1113 | 
     -         for i, item in tqdm(enumerate(test_data), "Test", total=test_data_len):  # type: ignore
1114 +         for i, item in tqdm(enumerate(test_data), "Test", total=test_data_len):
1115 |             item_mapped = map_keywords(item)
1116 |             if ollama_model is not None:
1117 |                 response = ollama_chat_evaluate(self.config, ollama_model, **item_mapped)

warning[unused-type-ignore-comment]: Unused blanket `type: ignore` directive
    --> src/chathpc/app/app.py:1150:50
     |
1148 |                 logger.info("Saved test results as markdown to {file}", file=save_test_data_path_name + ".md")
1149 |
1150 |         if "answer" in next(iter(results), {}):  # type: ignore
     |                                                  ^^^^^^^^^^^^^^
1151 |             errors = 0
1152 |             for d in results:
     |
help: Remove the unused suppression comment
1147 |                 save_md(save_test_data_path_name, md)
1148 |                 logger.info("Saved test results as markdown to {file}", file=save_test_data_path_name + ".md")
1149 | 
     -         if "answer" in next(iter(results), {}):  # type: ignore
1150 +         if "answer" in next(iter(results), {}):
1151 |             errors = 0
1152 |             for d in results:
1153 |                 if ignore_minor(d["response"]) != ignore_minor(d["answer"]):

warning[unused-type-ignore-comment]: Unused blanket `type: ignore` directive
  --> src/chathpc/app/utils/datastore.py:64:38
   |
62 |         filename = add_unique_postfix(filename)
63 |
64 |     with open(filename, "wb") as f:  # type: ignore
   |                                      ^^^^^^^^^^^^^^
65 |         pickle.dump(data, f)
   |
help: Remove the unused suppression comment
61 |     if not override:
62 |         filename = add_unique_postfix(filename)
63 | 
   -     with open(filename, "wb") as f:  # type: ignore
64 +     with open(filename, "wb") as f:
65 |         pickle.dump(data, f)
66 | 
67 |     return filename

warning[unused-type-ignore-comment]: Unused blanket `type: ignore` directive
  --> src/chathpc/app/utils/datastore.py:95:28
   |
93 |             data = value
94 |         with open(filename, "w") as f:
95 |             f.write(data)  # type: ignore
   |                            ^^^^^^^^^^^^^^
96 |     return data
   |
help: Remove the unused suppression comment
92 |         else:
93 |             data = value
94 |         with open(filename, "w") as f:
   -             f.write(data)  # type: ignore
95 +             f.write(data)
96 |     return data
97 | 
98 | 

warning[unused-type-ignore-comment]: Unused blanket `type: ignore` directive
   --> src/chathpc/app/utils/datastore.py:109:37
    |
107 |         filename = add_unique_postfix(filename)
108 |
109 |     with open(filename, "w") as f:  # type: ignore
    |                                     ^^^^^^^^^^^^^^
110 |         f.write(data)
    |
help: Remove the unused suppression comment
106 |     if not override:
107 |         filename = add_unique_postfix(filename)
108 | 
    -     with open(filename, "w") as f:  # type: ignore
109 +     with open(filename, "w") as f:
110 |         f.write(data)
111 | 
112 |     return filename

warning[unused-type-ignore-comment]: Unused blanket `type: ignore` directive
   --> src/chathpc/app/utils/datastore.py:153:37
    |
151 |         filename = add_unique_postfix(filename)
152 |
153 |     with open(filename, "w") as f:  # type: ignore
    |                                     ^^^^^^^^^^^^^^
154 |         json.dump(data, f, indent=4, separators=(",", ": "), sort_keys=False, allow_nan=True)
    |
help: Remove the unused suppression comment
150 |     if not override:
151 |         filename = add_unique_postfix(filename)
152 | 
    -     with open(filename, "w") as f:  # type: ignore
153 +     with open(filename, "w") as f:
154 |         json.dump(data, f, indent=4, separators=(",", ": "), sort_keys=False, allow_nan=True)
155 | 
156 |     return filename

warning[unused-type-ignore-comment]: Unused blanket `type: ignore` directive
   --> src/chathpc/app/utils/datastore.py:184:28
    |
182 |             data = value
183 |         with open(filename, "w") as f:
184 |             f.write(data)  # type: ignore
    |                            ^^^^^^^^^^^^^^
185 |     return data
    |
help: Remove the unused suppression comment
181 |         else:
182 |             data = value
183 |         with open(filename, "w") as f:
    -             f.write(data)  # type: ignore
184 +             f.write(data)
185 |     return data
186 | 
187 | 

warning[unused-type-ignore-comment]: Unused blanket `type: ignore` directive
   --> src/chathpc/app/utils/datastore.py:198:37
    |
196 |         filename = add_unique_postfix(filename)
197 |
198 |     with open(filename, "w") as f:  # type: ignore
    |                                     ^^^^^^^^^^^^^^
199 |         f.write(data)
    |
help: Remove the unused suppression comment
195 |     if not override:
196 |         filename = add_unique_postfix(filename)
197 | 
    -     with open(filename, "w") as f:  # type: ignore
198 +     with open(filename, "w") as f:
199 |         f.write(data)
200 | 
201 |     return filename

warning[unused-type-ignore-comment]: Unused blanket `type: ignore` directive
  --> tests/test_preferences.py:19:36
   |
18 |     def test_test_config(self):
19 |         preferences = AppConfig()  # type: ignore
   |                                    ^^^^^^^^^^^^^^
20 |         assert preferences.data_file == Path("files/data_file.json")
   |
help: Remove the unused suppression comment
16 |         os.environ["CHATHPC_PROMPT_TEMPLATE"] = "prompt_template"
17 | 
18 |     def test_test_config(self):
   -         preferences = AppConfig()  # type: ignore
19 +         preferences = AppConfig()
20 |         assert preferences.data_file == Path("files/data_file.json")
21 | 
22 |     def tearDown(self):

warning[unused-type-ignore-comment]: Unused blanket `type: ignore` directive
  --> tests/test_preferences.py:40:36
   |
39 |     def test_create(self):
40 |         preferences = AppConfig()  # type: ignore
   |                                    ^^^^^^^^^^^^^^
41 |         assert preferences.data_file == Path("files/data_file.json"), "incorrect default data_file"
   |
help: Remove the unused suppression comment
37 |         os.environ.pop("CHATHPC_PROMPT_TEMPLATE")
38 | 
39 |     def test_create(self):
   -         preferences = AppConfig()  # type: ignore
40 +         preferences = AppConfig()
41 |         assert preferences.data_file == Path("files/data_file.json"), "incorrect default data_file"
42 | 
43 |     def test_update(self):

warning[unused-type-ignore-comment]: Unused blanket `type: ignore` directive
  --> tests/test_preferences.py:44:36
   |
43 |     def test_update(self):
44 |         preferences = AppConfig()  # type: ignore
   |                                    ^^^^^^^^^^^^^^
45 |         preferences.data_file = Path("new_file.json")
46 |         assert preferences.data_file == Path("new_file.json"), "incorrect default data_file"
   |
help: Remove the unused suppression comment
41 |         assert preferences.data_file == Path("files/data_file.json"), "incorrect default data_file"
42 | 
43 |     def test_update(self):
   -         preferences = AppConfig()  # type: ignore
44 +         preferences = AppConfig()
45 |         preferences.data_file = Path("new_file.json")
46 |         assert preferences.data_file == Path("new_file.json"), "incorrect default data_file"
47 | 

Found 27 diagnostics

@Geekdude Geekdude merged commit bd07b89 into main Feb 6, 2026
15 checks passed
@Geekdude Geekdude deleted the github-ci branch February 6, 2026 21:02