This guide shows you how to create structured XML context from your Git repository for LLM analysis.
The `gpt prep gr` command generates XML representations of your repository files for LLM analysis, refactoring, or documentation tasks.
To generate context for specific files and send it to a model:

```nu
gpt prep gr main.py utils.py config.json | gpt -p kilo
```

To create XML context for all files tracked by Git:

```nu
git ls-files | lines | gpt prep gr
```

Use `bat` with XML syntax highlighting to preview the structure before sending it to an LLM:

```nu
git ls-files ./gpt | lines | gpt prep gr | bat -l xml
```
Filter the file list before generating context:

```nu
# Only Python files
git ls-files | lines | where ($it | str ends-with ".py") | gpt prep gr

# Multiple file types
git ls-files | lines | where (($it | str ends-with ".py") or ($it | str ends-with ".nu")) | gpt prep gr

# Exclude test files
git ls-files | lines | where not ($it | str contains "tests/") | gpt prep gr

# Only files under ./src
git ls-files ./src | lines | gpt prep gr
```

The generated XML has this structure:
```xml
<context type="git-repo" path="/path/to/repo" origin="https://github.com/user/repo" caveats="XML special characters have been escaped. Be sure to unescape them before processing">
  <file name="main.py">
def main():
    print("Hello, world!")
  </file>
  <file name="utils.py">
def helper_function():
    return "utility"
  </file>
</context>
```

To control how file contents are read, pass a closure with `--with-content`:

```nu
# Custom closure to process file content
git ls-files | lines | gpt prep gr --with-content {|| head -n 20}
```

This reads only the first 20 lines of each file.
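The `caveats` attribute warns that file contents are XML-escaped. If you post-process the context with a real XML parser rather than an LLM, the parser handles the unescaping for you. A minimal Python sketch, assuming the structure shown in the example above (the element and attribute names come from that example, not from a formal schema):

```python
# Parse a generated context document and recover file contents.
# The structure is assumed from the example above, not a formal schema.
import xml.etree.ElementTree as ET

context = (
    '<context type="git-repo" path="/path/to/repo">'
    '<file name="main.py">def main():\n'
    '    print(&quot;Hello, world!&quot;)\n'
    '</file>'
    '</context>'
)

root = ET.fromstring(context)
for f in root.findall("file"):
    # ElementTree unescapes &quot;, &lt;, &amp;, etc. in text nodes automatically
    print(f.get("name"), "->", f.text)
```

Note that a conforming XML parser performs the unescaping during parsing; the caveat matters mainly when the raw text is consumed directly, as by an LLM.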
Add guidance for the model with `--instructions`:

```nu
git ls-files ./src | lines | gpt prep gr --instructions "Focus on code architecture and design patterns"
```

Combine context generation with a prompt in a single pipeline:

```nu
# Generate context and request review
git ls-files ./src | lines | gpt prep gr | "Review this codebase for potential improvements" | gpt -p kilo

# Generate API documentation
git ls-files | lines | where ($it | str ends-with ".py") | gpt prep gr | "Generate API documentation" | gpt -p kilo --bookmark "api-docs"

# Analyze system architecture
git ls-files | lines | gpt prep gr --instructions "Focus on system architecture and component relationships" | "Analyze the overall architecture and suggest improvements" | gpt -p kilo

# Get refactoring suggestions for specific modules
git ls-files ./legacy | lines | gpt prep gr | "Suggest refactoring strategies for this legacy code" | gpt -p kilo

# Include relevant files for bug analysis
["src/main.py", "src/parser.py", "tests/test_parser.py"] | gpt prep gr | "Analyze these files for potential bugs in the parser logic" | gpt -p kilo
```

For large repositories, be selective about what you include:
```nu
# Core modules only
["src/core/", "src/api/"] | each {|dir| git ls-files $dir | lines } | flatten | gpt prep gr

# Recently changed files
git diff --name-only HEAD~10 | lines | gpt prep gr
```

Break large repositories into chunks:
```nu
# Analyze backend separately
git ls-files ./backend | lines | gpt prep gr | "Analyze backend architecture" | gpt --bookmark "backend-analysis" -p kilo

# Analyze frontend separately
git ls-files ./frontend | lines | gpt prep gr | "Analyze frontend architecture" | gpt --bookmark "frontend-analysis" -p kilo

# Compare architectures
"Compare the backend and frontend architectures" | gpt --continues [backend-analysis, frontend-analysis] -p kilo
```

Bookmark an initial analysis, then ask follow-up questions with `-r`:

```nu
# Initial analysis
git ls-files ./src | lines | gpt prep gr | "What are the main components?" | gpt --bookmark "code-review" -p kilo

# Follow-up questions
"What design patterns are used?" | gpt -r -p milli
"Are there any code smells?" | gpt -r -p milli
```

Combine code context with document context:

```nu
# Include both code and documentation
let doc = (gpt document ./README.md)
git ls-files ./src | lines | gpt prep gr | "How well does the code match the documentation?" | gpt --continues $doc.id -p kilo
```

- Be selective - Don't include unnecessary files (tests, generated files, etc.)
- Use appropriate models - Simple questions can use smaller models like `milli`
- Break up large requests - Chunk analysis for very large codebases
- Preview context - Use `bat -l xml` to check structure before sending
- Save analysis - Use bookmarks for important code review sessions
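To make "break up large requests" concrete, here is a plain-Python sketch of splitting a tracked-file list into fixed-size batches, each of which could become its own `gpt prep gr` context. The file names and chunk size are illustrative only:

```python
# Split a file list into fixed-size batches for separate analysis passes.
# File names and the chunk size of 4 are illustrative, not prescribed.
def chunk(files, size):
    return [files[i:i + size] for i in range(0, len(files), size)]

tracked = [f"src/module_{n}.py" for n in range(10)]
batches = chunk(tracked, 4)
for batch in batches:
    # Each batch would be piped to `gpt prep gr` as its own context
    print(len(batch), batch[0])
```

Keeping each batch well under the model's context window leaves room for the prompt and the response.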
See the commands reference for complete context generation options.