Memory usage grows very quickly when running commands with large output #9699

@kov

Description


The simplest way to reproduce this is to run !find / and watch the memory usage of the opencode process. This happens both with the bash tool and with the interactive command. Memory usage grows at a much higher rate than you would expect from the size of the command output.

The problem, I believe, is that we accumulate the output with string concatenation, constantly allocating bigger and bigger strings. That happens so quickly that the GC has little opportunity to catch up, which leads to the very rapid growth.
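For illustration only, here is a minimal sketch of the allocation pattern I mean (hypothetical function names, not opencode's actual code), contrasted with buffering the chunks and joining once at the end:

```ts
// Hypothetical sketch of the two accumulation strategies, not opencode's code.

// Repeated string concatenation creates an intermediate string object on
// every chunk as the accumulated output grows, so total allocation for n
// bytes of output can approach O(n^2) and floods the GC with garbage.
function accumulateByConcat(chunks: Iterable<string>): string {
  let output = "";
  for (const chunk of chunks) {
    output += chunk; // new intermediate string per chunk
  }
  return output;
}

// Collecting chunks in an array and joining once at the end keeps total
// allocation roughly proportional to the output size.
function accumulateByJoin(chunks: Iterable<string>): string {
  const parts: string[] = [];
  for (const chunk of chunks) {
    parts.push(chunk);
  }
  return parts.join("");
}
```

Chunked buffering like this (or capping/truncating the buffered output past a limit) would also produce far less short-lived garbage for the GC to chase.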

I suspect this is one of the main causes of the large memory usage problems people keep being bitten by, but I found no report specifically about the bash tool / interactive command buffer, so I am creating this one to associate with the pull request.

Plugins

No response

OpenCode version

git

Steps to reproduce

  1. Open opencode
  2. Run !find /
  3. Watch memory usage over time

Screenshot and/or share link

Running !find /, I've seen memory usage climb to almost 70% in some cases. The OOM killer was working overtime yesterday because of this; I had to add a new swap file just to get through the task I was working on.


Operating System

Fedora 43

Terminal

GNOME Terminal, and the macOS terminal over SSH.

Metadata

Labels

bug (Something isn't working), perf (Indicates a performance issue or need for optimization)
