Description
Current Behavior
Issue: spin configure gha <environment> command updates ALL environments
Hi! I first raised this a long time ago (about a year now, so it's partly my fault for not opening an issue on GitHub sooner). I would like this very important command to be fixed, because it shouldn't be updating the other environments when I'm clearly specifying the environment that should be affected.
As you can see in the image, I specified "staging", but somehow the "production" environment gets mentioned all throughout the steps. It even asks for the SSH password twice: once for "staging" and once for "production". You can see in the last step that I'm telling the truth.
I tried the spin maintain <environment> command. This one works as expected since it only affects the specified environment. Therefore, the issue with spin configure gha <environment> is clearly a bug.
Expected Behavior
It should only affect the environment specified.
Steps To Reproduce
- In spin.yml, add two environments to the "servers" section. The names are up to you; for example, "staging" and "production". (I specified two IP addresses on my end, of course.) A hypothetical sketch of this section follows after these steps.
- Run this command: spin configure gha staging.
- You'll notice that the production server is also being affected. If you have set up an SSH password, you'll be asked for it twice as well.
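For illustration, here is a minimal sketch of what such a servers section could look like. Everything in it is hypothetical: the server_name/environment/address keys are assumed from Spin's documented examples (check the Spin docs for the exact schema), and the names and TEST-NET addresses are placeholders, not values from my actual project.

```yaml
# Hypothetical spin.yml excerpt with two environments under "servers"
# (placeholder names and TEST-NET addresses; verify the exact keys
# against the Spin documentation):
servers:
  - server_name: staging-server
    environment: staging
    address: 203.0.113.10   # placeholder staging IP
  - server_name: production-server
    environment: production
    address: 203.0.113.20   # placeholder production IP
```

With a layout like this, spin configure gha staging should only ever touch the staging entry, yet the production entry is processed as well.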
Environment
v3.1.1 [stable] (User Installed)
Operating System Version:
ProductName: macOS
ProductVersion: 26.1
BuildVersion: 25B78
Docker Info:
Client:
 Version: 28.5.2
 Context: orbstack
 Debug Mode: false
 Plugins:
  buildx: Docker Buildx (Docker Inc.)
   Version: v0.29.1
   Path: /Users/jsluchavez/.docker/cli-plugins/docker-buildx
  compose: Docker Compose (Docker Inc.)
   Version: v2.40.3
   Path: /Users/jsluchavez/.docker/cli-plugins/docker-compose
Server:
 Containers: 93
  Running: 18
  Paused: 0
  Stopped: 75
 Images: 168
 Server Version: 28.5.2
 Storage Driver: overlay2
  Backing Filesystem: btrfs
  Supports d_type: true
  Using metacopy: false
  Native Overlay Diff: true
  userxattr: false
 Logging Driver: json-file
 Cgroup Driver: cgroupfs
 Cgroup Version: 2
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local splunk syslog
 CDI spec directories:
  /etc/cdi
  /var/run/cdi
 Swarm: inactive
 Runtimes: io.containerd.runc.v2 runc
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: 1c4457e00facac03ce1d75f7b6777a7a851e5c41
 runc version: d842d7719497cc3b774fd71620278ac9e17710e0
 init version: de40ad0
 Security Options:
  seccomp
   Profile: builtin
  cgroupns
 Kernel Version: 6.17.8-orbstack-00308-g8f9c941121b1
 Operating System: OrbStack
 OSType: linux
 Architecture: aarch64
 CPUs: 8
 Total Memory: 2.901GiB
 Name: orbstack
 ID: c205a1bd-58fe-4a98-bb2e-81cf8f906247
 Docker Root Dir: /var/lib/docker
 Debug Mode: false
 Experimental: false
 Insecure Registries:
  ::1/128
  127.0.0.0/8
 Live Restore Enabled: false
 Product License: Community Engine
 Default Address Pools:
  Base: 192.168.97.0/24, Size: 24
  Base: 192.168.107.0/24, Size: 24
  Base: 192.168.117.0/24, Size: 24
  Base: 192.168.147.0/24, Size: 24
  Base: 192.168.148.0/24, Size: 24
  Base: 192.168.155.0/24, Size: 24
  Base: 192.168.156.0/24, Size: 24
  Base: 192.168.158.0/24, Size: 24
  Base: 192.168.163.0/24, Size: 24
  Base: 192.168.164.0/24, Size: 24
  Base: 192.168.165.0/24, Size: 24
  Base: 192.168.166.0/24, Size: 24
  Base: 192.168.167.0/24, Size: 24
  Base: 192.168.171.0/24, Size: 24
  Base: 192.168.172.0/24, Size: 24
  Base: 192.168.181.0/24, Size: 24
  Base: 192.168.183.0/24, Size: 24
  Base: 192.168.186.0/24, Size: 24
  Base: 192.168.207.0/24, Size: 24
  Base: 192.168.214.0/24, Size: 24
  Base: 192.168.215.0/24, Size: 24
  Base: 192.168.216.0/24, Size: 24
  Base: 192.168.223.0/24, Size: 24
  Base: 192.168.227.0/24, Size: 24
  Base: 192.168.228.0/24, Size: 24
  Base: 192.168.229.0/24, Size: 24
  Base: 192.168.237.0/24, Size: 24
  Base: 192.168.239.0/24, Size: 24
  Base: 192.168.242.0/24, Size: 24
  Base: 192.168.247.0/24, Size: 24
  Base: fd07:b51a:cc66:d000::/56, Size: 64

Anything else?
No response