# AI Integration Details
Relevant source files
- harbinger/src/harbinger/worker/genai/prompts.py
- harbinger/src/harbinger/worker/genai/tools.py
- harbinger/src/harbinger/worker/workflows.py
- ansible/roles/harbinger/defaults/main.yaml
- harbinger/src/harbinger/job_templates/router.py
- harbinger/src/harbinger/database/schemas.py
- harbinger/src/harbinger/worker/activities.py
Harbinger integrates Artificial Intelligence (AI) capabilities to enhance automation, analysis, and operational efficiency within red team engagements. This integration primarily focuses on leveraging generative AI models to automate the creation of playbook templates from natural language descriptions, summarize complex task outputs, and generate various suggestions for operational tasks. The AI components interact with core system workflows and data models to provide intelligent assistance throughout the engagement lifecycle.
The AI integration streamlines the process of translating operational requirements into executable playbooks and extracts actionable intelligence from raw command and control (C2) outputs, reducing manual effort and accelerating decision-making.
## AI-Powered Playbook Generation

Harbinger provides an AI-powered endpoint to generate playbook templates directly from markdown-formatted README content. This feature allows users to describe desired playbook functionality in natural language, which the AI model then converts into a structured YAML playbook definition.
The FastAPI endpoint responsible for this functionality is located in harbinger/src/harbinger/job_templates/router.py.
```python
# harbinger/src/harbinger/job_templates/router.py:126-130
@router.post(
    "/playbooks/ai",  # Path as used in the frontend
    response_model=schemas.GeneratedYamlOutput,  # Use the new output schema
    tags=["templates", "ai"],  # Add relevant tags for docs
    summary="Generate Playbook Template YAML from README",
    description="Accepts README markdown content and uses an AI model to ...",
)
```
Sources: harbinger/src/harbinger/job_templates/router.py:126-130
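As a hedged illustration of the request/response contract, the sketch below round-trips the payload shapes implied by the `ReadmeInput` and `GeneratedYamlOutput` schemas. The helper function names are hypothetical, and a real client would send the payload over HTTP with a library such as requests or httpx:

```python
import json

def build_playbook_ai_request(readme_markdown: str) -> dict:
    """Build the JSON body for POST /playbooks/ai (ReadmeInput shape)."""
    return {"readme": readme_markdown}

def parse_playbook_ai_response(body: str) -> str:
    """Pull the generated YAML out of a GeneratedYamlOutput response body."""
    return json.loads(body)["yaml"]

# Round-trip the shapes without a live server:
payload = build_playbook_ai_request("# Run whoami on the target host")
response_body = json.dumps({"yaml": "name: whoami\nsteps: []"})
print(parse_playbook_ai_response(response_body))
```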
The input and output schemas for this endpoint are defined in harbinger/src/harbinger/database/schemas.py.
```python
# harbinger/src/harbinger/database/schemas.py:96-99
class ReadmeInput(BaseModel):
    """Schema for receiving README content."""

    readme: str = Field(..., description="The README content in markdown format.")


# harbinger/src/harbinger/database/schemas.py:102-104
class GeneratedYamlOutput(BaseModel):
    """Schema for returning the AI-generated YAML."""

    yaml: str = Field(..., description="The generated playbook template YAML string.")
```
Sources: harbinger/src/harbinger/database/schemas.py:96-99, harbinger/src/harbinger/database/schemas.py:102-104
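To make the validation behavior concrete, here is a minimal, framework-free sketch of what the `ReadmeInput` check enforces. Pydantic performs this automatically in the real code; the function name here is hypothetical:

```python
def validate_readme_input(payload: dict) -> str:
    """Mimic ReadmeInput: 'readme' is required and must be a string."""
    if "readme" not in payload:
        raise ValueError("field required: readme")
    value = payload["readme"]
    if not isinstance(value, str):
        raise ValueError("readme must be a string")
    return value

print(validate_readme_input({"readme": "# My playbook"}))
```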
The process flow for AI-powered playbook generation is as follows:
```mermaid
graph TD
    A[User provides README Markdown] --> B{FastAPI Endpoint /playbooks/ai};
    B --> C[ReadmeInput Schema Validation];
    C --> D["AI Model (e.g., via prompts.py)"];
    D --> E[Generated YAML String];
    E --> F[GeneratedYamlOutput Schema Validation];
    F --> G[Return YAML to User];
```
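The flow above can be sketched as a single handler function. `model_call` stands in for the real generative-model client, and the prompt wording is an assumption, not Harbinger's actual prompt:

```python
def generate_playbook_from_readme(readme: str, model_call) -> dict:
    """Validate the README, prompt the model, and wrap the YAML output
    in the GeneratedYamlOutput shape."""
    if not readme.strip():                        # input validation
        raise ValueError("readme must not be empty")
    prompt = f"Convert this README to a YAML playbook:\n\n{readme}"
    yaml_text = model_call(prompt)                # AI model interaction (stubbed)
    return {"yaml": yaml_text}                    # output schema shape

fake_model = lambda prompt: "name: demo\nsteps: []"
print(generate_playbook_from_readme("# Demo", fake_model))
```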
## AI in Workflows and Activities

AI capabilities are integrated into various background workflows and activities to provide intelligent automation and analysis. These activities are typically executed by the Temporal worker and interact with the AI models through defined prompts and tools.
Key AI-related workflows and the activities they leverage include:
- RunPlaybook
- RunC2Job
- ParseFile
- CreateTimeline
- SyncAll
- CreateSummaries
- CreateC2ImplantSuggestion
- CreateDomainSuggestion
- CreateFileSuggestion
- CreateChecklist
- PlaybookDetectionRisk
- PrivEscSuggestions
Sources: harbinger/src/harbinger/database/router.py:27-36, harbinger/src/harbinger/worker/workflows.py:23-33
The harbinger/src/harbinger/worker/activities.py file defines the activity functions that use these AI prompts and tools; it imports them from the harbinger.worker.genai package, alongside concurrent.futures for parallel execution.
```python
# harbinger/src/harbinger/worker/activities.py:31-32
from harbinger.worker.genai import prompts, tools
import concurrent.futures
```
Sources: harbinger/src/harbinger/worker/activities.py:31-32
A high-level view of how AI activities fit into the worker architecture:
```mermaid
graph TD
    A[Temporal Workflow] --> B(Call AI Activity);
    B --> C{"AI Activity, e.g. create_summaries"};
    C --> D[Load AI Prompts/Tools];
    D --> E[Interact with AI Model];
    E --> F[Process AI Output];
    F --> G[Update Database/Return Result];
```
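The concurrent.futures import shown earlier suggests that AI calls may be fanned out in parallel. The sketch below shows what such an activity body might look like; the prompt text and `call_model` client are assumptions, not Harbinger's actual implementation:

```python
import concurrent.futures

# Hypothetical prompt text (assumption).
SUMMARY_PROMPT = "Summarize the following C2 task output:\n\n{output}"

def create_summaries(task_outputs, call_model):
    """Render one prompt per task output and summarize them in parallel."""
    prompts = [SUMMARY_PROMPT.format(output=o) for o in task_outputs]
    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(call_model, prompts))

# Stub model client for demonstration:
fake_model = lambda prompt: "summary of: " + prompt.splitlines()[-1]
print(create_summaries(["whoami -> SYSTEM"], fake_model))
```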
## Prompts and Tools

The AI integration relies on structured prompts and defined tools to guide the generative models.
AI prompts are defined in harbinger/src/harbinger/worker/genai/prompts.py. These prompts instruct the AI model on the desired output format and content. Examples include prompts for converting READMEs to YAML, summarizing task outputs, and generating various suggestions.
```python
# harbinger/src/harbinger/worker/genai/prompts.py:17-21
README_TO_YAML_PROMPT = """
You are an expert red team operator. Your task is to convert a README into a YAML playbook.
...
"""
```
Sources: harbinger/src/harbinger/worker/genai/prompts.py:17-21
Another example is a prompt for summarizing C2 task output:
```python
# harbinger/src/harbinger/worker/genai/prompts.py:36-39
SUMMARIZE_TASK_OUTPUT_PROMPT = """
You are an expert red team operator. Your task is to summarize the following C2 task output.
...
"""
```
Sources: harbinger/src/harbinger/worker/genai/prompts.py:36-39
AI tools, defined in harbinger/src/harbinger/worker/genai/tools.py, allow the AI model to interact with external systems or perform specific actions. These tools are typically functions exposed to the AI model.
```python
# harbinger/src/harbinger/worker/genai/tools.py:20-22
def get_tool_config() -> List[rg.Tool]:
    return [
        rg.Tool(
            name="get_playbook_template_schema",
            description="""
            Get the JSON schema for a playbook template.
            ...
```
Sources: harbinger/src/harbinger/worker/genai/tools.py:20-22
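Since the `rg.Tool` constructor isn't fully shown above, here is a framework-free sketch of the same idea: each tool pairs a name and description (for the model) with a callable (for the runtime). The schema contents are illustrative, not Harbinger's real playbook template schema:

```python
import json

# Illustrative stand-in schema -- not the real playbook template schema.
PLAYBOOK_TEMPLATE_SCHEMA = {
    "type": "object",
    "properties": {"name": {"type": "string"}, "steps": {"type": "array"}},
    "required": ["name", "steps"],
}

def get_playbook_template_schema() -> str:
    """Tool body: return the JSON schema for a playbook template."""
    return json.dumps(PLAYBOOK_TEMPLATE_SCHEMA)

def get_tool_config() -> list:
    """Mirror of tools.get_tool_config, using plain dicts instead of rg.Tool."""
    return [
        {
            "name": "get_playbook_template_schema",
            "description": "Get the JSON schema for a playbook template.",
            "fn": get_playbook_template_schema,
        }
    ]

print(get_tool_config()[0]["name"])
```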
## Data Models

The AI integration uses specific Pydantic models for inputs and outputs, ensuring data consistency and validation.
The key models for this flow are `ReadmeInput` and `GeneratedYamlOutput`, defined above. The relationship between a README, the AI model, and the stored playbook template can be visualized as:

```mermaid
graph TD
    A[Playbook Template YAML] -->|Generate from README| B[AI Model];
    B --> C[Generated Playbook Template YAML];
    C --> D[Database];
```
## Configuration
The `ansible/roles/harbinger/defaults/main.yaml` file defines default configuration variables for the Harbinger deployment. While it doesn't contain specific AI model configurations, it sets up the environment where AI services would run, such as:
```yaml
# ansible/roles/harbinger/defaults/main.yaml:2-3
harbinger_version: "0.1.0"
harbinger_release_version: "0.1.0"
```
Sources: ansible/roles/harbinger/defaults/main.yaml:2-3
Direct AI model specific configurations (e.g., API keys, model endpoints) are not present in the provided defaults/main.yaml file, suggesting they are managed elsewhere, possibly through environment variables or application settings.
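A sketch of how such settings might be read from the environment at startup; the variable names `HARBINGER_AI_API_KEY` and `HARBINGER_AI_ENDPOINT` are hypothetical, not confirmed by the source:

```python
import os

def load_ai_settings(env=None):
    """Read AI model settings from the environment, with safe defaults.
    The variable names here are hypothetical, not Harbinger's actual keys."""
    env = dict(os.environ) if env is None else env
    return {
        "api_key": env.get("HARBINGER_AI_API_KEY", ""),
        "model_endpoint": env.get("HARBINGER_AI_ENDPOINT", "https://example.invalid/v1"),
    }

print(load_ai_settings({"HARBINGER_AI_API_KEY": "secret"}))
```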
## Summary

Harbinger's AI integration is designed to automate and enhance key aspects of red team operations. By leveraging generative AI for playbook creation, task summarization, and intelligent suggestions, the system aims to improve efficiency and provide richer insights. The architecture supports a modular approach, separating AI prompting and tooling from core workflow activities and API endpoints, allowing for flexible expansion of AI capabilities in the future.