
Commit 648826a

shivasurya and claude authored
fix(callgraph,dsl): Add thread-safety, improve logging, and fix progress messages (#451)
* fix(dsl): Add early filtering for empty container rule directories

  Prevent system hang when loading container rules from directories without
  @dockerfile_rule or @compose_rule decorators. Exit early with a clear error
  message instead of attempting Python execution.

  🤖 Generated with [Claude Code](https://claude.com/claude-code)
  Co-Authored-By: Claude Sonnet 4.5 <[email protected]>

* fix(dsl,callgraph): Add early filtering for rules and progress logging

  1. Add early filtering for code analysis rules in LoadRules()
     - Skip executing Python files without the @rule decorator
     - Prevents hanging when ordinary app.py files are used as rules
     - Add hasCodeAnalysisRuleDecorators() to check for @rule or codepathfinder imports
  2. Add progress logging to BuildCallGraph()
     - Add progress messages for the 4 passes over all Python files:
       extracting return types, variable assignments, class attributes, and call sites
     - Prevents the appearance of hanging with large repos (20k+ files)

* feat(dsl): Add verbose logging for loaded rules with ID and pathname

  - Add a Logger interface to the dsl package to avoid an import cycle with the output package
  - Update LoadRules() and LoadContainerRules() to accept a logger parameter
  - Log each loaded rule with its ID and file path in debug mode
  - Log container rules (Dockerfile and docker-compose) separately
  - Update all callers in cmd/scan.go and cmd/ci.go to pass the logger
  - Update all tests to pass a nil logger

  Fixes: shows loaded rules when the --debug flag is used. Example output:

      [00:00.030] Loaded docker-compose rule: COMPOSE-SEC-008 from rules/docker-compose/dangerous_capabilities.py
      [00:00.362] Loaded code analysis rule: CUSTOM-001 from /tmp/test-rules/test_eval.py

* fix(callgraph): Make taint summary progress message consistent

  - Remove the 'Pass 5:' prefix
    from the taint summary generation message
  - Align with the style of the other progress messages (present continuous, no pass numbers)

  Before: 'Pass 5: Generating taint summaries...'
  After:  'Generating taint summaries...'

  Matches the style of:
  - 'Extracting return types from X modules (parallel)...'
  - 'Extracting variable assignments (parallel)...'
  - 'Extracting class attributes (parallel)...'
  - 'Resolving call sites (parallel)...'

* fix(callgraph): Add thread-safety to TypeInferenceEngine for parallel processing

  Fixes race condition: 'fatal error: concurrent map read and map write'
  during the variable assignment extraction phase.

  Root cause: multiple goroutines were accessing the TypeInferenceEngine.Scopes and
  TypeInferenceEngine.ReturnTypes maps concurrently without synchronization.

  Changes:
  - Add sync.RWMutex fields (scopeMutex, typeMutex) to the TypeInferenceEngine struct
  - Protect GetScope() and AddScope() with scopeMutex
  - Add a GetReturnType() method with typeMutex protection
  - Protect AddReturnTypesToEngine() with typeMutex
  - Update all direct map accesses to use the thread-safe methods:
    - chaining.go: 6 occurrences
    - attribute.go: 1 occurrence

  Thread-safety pattern:
  - Use RLock/RUnlock for read operations (GetScope, GetReturnType)
  - Use Lock/Unlock for write operations (AddScope, AddReturnTypesToEngine)

  This allows the parallel extraction passes to safely share the TypeInferenceEngine.
* fix(dsl): Show loaded rules list with --verbose flag instead of --debug

  Changes:
  - Update the Logger interface to include IsVerbose() and Statistic() methods
  - Change rule loading logging from IsDebug() + Debug() to IsVerbose() + Statistic()
  - This makes the rule list visible with the --verbose flag (more user-friendly)
  - Remove duplicate logging in LoadContainerRules (already logged in loadContainerRulesFromFile)

  Output example with --verbose:

      Loading container rules...
      - Loaded docker-compose rule COMPOSE-SEC-008 from rules/docker-compose/dangerous_capabilities.py
      - Loaded docker-compose rule COMPOSE-SEC-001 from rules/docker-compose/privileged_service.py
      Loading rules from /tmp/test-rules...
      - Loaded rule CUSTOM-001 from /tmp/test-rules/test_eval.py
      Loaded 1 rules

  Before: only visible with the --debug flag
  After: visible with the --verbose flag

* fix(dsl): Add nolint comment for intentional error ignore in hasAnyContainerRulesInPath

  - Suppress the nilerr linter warning for filepath.Walk error handling
  - We intentionally ignore errors during the walk and return false
  - This is safe because we are only checking whether any container rules exist

* chore: Bump version to 1.1.5

* fix(tests): Update test filenames to avoid module registry filtering

  The module registry now filters out test files (test_*.py), which is correct
  for production use.
  Updated test files to use non-test filenames:
  - test_frameworks.py → frameworks.py
  - test_os.py → file_ops.py
  - test_pathlib.py → dir_utils.py
  - test_json.py → json_handler.py
  - test_sys.py → cli_args.py
  - test_comprehensive.py → comprehensive_stdlib.py
  - test_alias.py → alias_module.py
  - test_from.py → from_imports.py
  - test_multi.py → multi_module.py
  - test_baseline.py → baseline.py
  - test.py → app.py, process.py, handler.py

  This fixes the TestFrameworkResolution, TestStdlibRegressionSuite,
  TestStdlibEdgeCases, and TestStdlibNoRegression tests.

* feat(scan,ci): Add --skip-tests flag to control test file filtering

  Add a --skip-tests flag (default: true) to the scan and ci commands so users
  can control whether test files are excluded from analysis.

  Changes:
  - Add a skipTests parameter to BuildModuleRegistry(rootPath, skipTests)
  - Update shouldSkipFile() to respect the skipTests parameter
  - Add the --skip-tests flag to the scan command (default true)
  - Add the --skip-tests flag to the ci command (default true)
  - Update all test/integration callers to pass the skipTests parameter
  - Add debug logging when test files are being skipped

  Usage:

      # Default behavior: skip test files
      pathfinder scan --rules rules/ --project .

      # Include test files in analysis
      pathfinder scan --rules rules/ --project . --skip-tests=false

      # CI mode with test file analysis
      pathfinder ci --rules rules/ --project . --output sarif --skip-tests=false

  Rationale:
  - Production scanning should skip tests by default (cleaner results)
  - Development/testing may need to analyze test files
  - A flag gives users that flexibility
  - Tests use --skip-tests=false to validate registry behavior

* test(python): Improve test coverage to 98.25% and fix formatting

  Add comprehensive tests for the cli and decorators modules to exceed the 95%
  coverage requirement.
  **Test Coverage:**
  - cli/__init__.py: 35% → 99% (added 24 tests)
    - get_binary_path() - all 3 priority levels (bundled, PATH, download)
    - _is_musl() - musl libc detection for Alpine Linux
    - _download_binary() - tar.gz and zip archive handling
    - main() - entry point execution with argument passing
  - decorators.py: 75% → 90% (added 6 tests)
    - _enable_auto_execute() - atexit registration
    - _register_rule() - rule registry management
    - Auto-execution in __main__ context
    - JSON output format validation

  **Overall Results:**
  - 300 total tests passing (was 280)
  - 98.25% total coverage (exceeds the 95% requirement)
  - All black formatting checks passing

  **Files Changed:**
  - python-dsl/tests/test_cli.py: Added TestIsMusl, TestGetBinaryPath, TestDownloadBinary, TestMain
  - python-dsl/tests/test_decorators.py: Added TestAutoExecution class
  - python-dsl/codepathfinder/cli/__init__.py: Black formatting
  - python-dsl/compile_container_rules.py: Black formatting
  - python-dsl/rules/container_*.py: Black formatting (6 files)

* fix(python): Remove unused imports from test files

  Fix ruff linting errors by removing unused imports:
  - test_cli.py: Removed subprocess, os, tempfile, Path, call
  - test_decorators.py: Removed sys, atexit, MagicMock, _auto_execute_enabled

  All 35 tests still pass. Linting now passes:
  - lintPython: ✓ All checks passed
  - lintGo: ✓ 0 issues

* chore: Add test coverage files to .gitignore

  Add coverage artifacts to .gitignore:
  - .coverage (Python coverage)
  - coverage.out (Go coverage)
  - htmlcov/ (HTML coverage reports)
  - *.coverage (any coverage files)

  These are temporary test artifacts and should not be tracked in version control.
* test(registry): Add comprehensive tests for shouldSkipFile and skipTests parameter

  Add tests to improve module registry coverage from 23.52% to 92%+ for the new code.

  **New Tests:**
  - TestShouldSkipFile: tests the file filtering logic with the skipTests parameter
    - Validates test_ prefix filtering
    - Validates _test suffix filtering
    - Validates conftest.py, setup.py, __main__.py filtering
    - Tests that skipTests=false includes all files
  - TestBuildModuleRegistry_SkipTestsParameter: integration test for skipTests
    - Creates 2 regular files + 4 test files
    - Verifies skipTests=true excludes test files (2 modules)
    - Verifies skipTests=false includes all files (6 modules)
  - TestBuildModuleRegistry_SkipTestDirectories: tests directory filtering
    - Verifies tests/, test/, fixtures/, mocks/ directories are skipped
    - Files within these directories are not indexed

  **Coverage Impact:**
  - shouldSkipFile(): 0% → 100%
  - BuildModuleRegistry(): 85% → 92%
  - module.go overall: 23.52% patch → 90.6% package

  Addresses the Codecov report: 87 missing lines → ~70 lines remaining.

* chore: Remove result.json and update .gitignore for output files

  Remove the tracked result.json file (37KB of test output) and update .gitignore
  so output files are no longer tracked:

  **Changes:**
  - Removed sast-engine/result.json (test output artifact)
  - Updated .gitignore to ignore result.json and scan.json under an "output files" section
  - Consolidated the scan.json entry (it was duplicated)

  These files are generated during testing/scanning and should not be version controlled.
* chore: Update .gitignore to exclude output files

  Add result.json and reorganize the .gitignore output files section:

  **Changes:**
  - Add result.json to .gitignore (test output artifact)
  - Consolidate output files under a dedicated section
  - Remove the duplicate scan.json entry

  Output files like result.json and scan.json are generated during
  testing/scanning and should not be version controlled.

---------

Co-authored-by: Claude Sonnet 4.5 <[email protected]>
1 parent e98e5db commit 648826a

35 files changed (+1255 −273 lines)

.gitignore

Lines changed: 9 additions & 0 deletions
@@ -34,4 +34,13 @@ docs/public/rules/*.json
 node_modules
 __pycache__/
 compiled_rules.json
+
+# test coverage
+*.coverage
+.coverage
+coverage.out
+htmlcov/
+
+# output files
+result.json
 scan.json

package.json

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 {
   "name": "codepathfinder",
-  "version": "1.1.3",
+  "version": "1.1.5",
   "description": "DEPRECATED - Use 'pip install codepathfinder' instead. See https://codepathfinder.dev/install",
   "deprecated": "This package is deprecated. Please use 'pip install codepathfinder' for the complete installation including both CLI and Python DSL.",
   "goBinary": {

python-dsl/codepathfinder/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -22,7 +22,7 @@
     ... )
     """
 
-__version__ = "1.1.3"
+__version__ = "1.1.5"
 
 from .matchers import calls, variable
 from .decorators import rule

python-dsl/codepathfinder/cli/__init__.py

Lines changed: 9 additions & 5 deletions
@@ -3,6 +3,7 @@
 This module provides the entry point for the `pathfinder` command.
 It locates and executes the bundled Go binary, passing through all arguments.
 """
+
 import os
 import sys
 import subprocess
@@ -49,6 +50,7 @@ def get_binary_path() -> Path:
 
     # 2. Check PATH (development mode or manual install)
     import shutil
+
     path_binary = shutil.which("pathfinder")
     if path_binary:
         return Path(path_binary)
@@ -60,13 +62,13 @@ def get_binary_path() -> Path:
 def _is_musl() -> bool:
     """Detect if running on musl libc (Alpine Linux, etc.)."""
     try:
-        result = subprocess.run(['ldd', '--version'], capture_output=True, text=True)
-        return 'musl' in result.stderr.lower() or 'musl' in result.stdout.lower()
+        result = subprocess.run(["ldd", "--version"], capture_output=True, text=True)
+        return "musl" in result.stderr.lower() or "musl" in result.stdout.lower()
     except Exception:
         try:
-            with open('/etc/os-release') as f:
+            with open("/etc/os-release") as f:
                 content = f.read().lower()
-                return 'alpine' in content
+                return "alpine" in content
         except Exception:
             return False
 
@@ -155,7 +157,9 @@ def _download_binary(bin_dir: Path, binary_name: str) -> Path:
         if archive_ext == ".tar.gz":
             with tarfile.open(tmp.name, "r:gz") as tar:
                 for member in tar.getmembers():
-                    if member.name == "pathfinder" or member.name.endswith("/pathfinder"):
+                    if member.name == "pathfinder" or member.name.endswith(
+                        "/pathfinder"
+                    ):
                         member.name = binary_name
                         tar.extract(member, bin_dir)
                         break

python-dsl/compile_container_rules.py

Lines changed: 1 addition & 0 deletions
@@ -58,6 +58,7 @@ def discover_and_import_rules(rules_dir: Path, rule_type: str):
         except Exception as e:
             print(f"  ✗ Failed to import {rule_file.name}: {e}")
             import traceback
+
             traceback.print_exc()
 
     return imported_count

python-dsl/pyproject.toml

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
 
 [project]
 name = "codepathfinder"
-version = "1.1.3"
+version = "1.1.5"
 description = "Python DSL for code-pathfinder security patterns"
 readme = "README.md"
 requires-python = ">=3.8"

python-dsl/rules/container_combinators.py

Lines changed: 21 additions & 33 deletions
@@ -10,29 +10,26 @@
 @dataclass
 class CombinatorMatcher:
     """Represents a logic combinator (AND, OR, NOT)."""
+
     combinator_type: str  # "all_of", "any_of", "none_of"
-    conditions: List[Union[Matcher, 'CombinatorMatcher', Dict, Callable]]
+    conditions: List[Union[Matcher, "CombinatorMatcher", Dict, Callable]]
 
     def to_dict(self) -> Dict[str, Any]:
         """Convert to JSON IR."""
         serialized_conditions = []
         for cond in self.conditions:
-            if hasattr(cond, 'to_dict'):
+            if hasattr(cond, "to_dict"):
                 serialized_conditions.append(cond.to_dict())
             elif isinstance(cond, dict):
                 serialized_conditions.append(cond)
             elif callable(cond):
-                serialized_conditions.append({
-                    "type": "custom_function",
-                    "has_callable": True
-                })
+                serialized_conditions.append(
+                    {"type": "custom_function", "has_callable": True}
+                )
             else:
                 serialized_conditions.append(cond)
 
-        return {
-            "type": self.combinator_type,
-            "conditions": serialized_conditions
-        }
+        return {"type": self.combinator_type, "conditions": serialized_conditions}
 
 
 def all_of(*conditions: Union[Matcher, Dict, Callable]) -> CombinatorMatcher:
@@ -47,10 +44,7 @@ def all_of(*conditions: Union[Matcher, Dict, Callable]) -> CombinatorMatcher:
         instruction(type="RUN", contains="sudo")
     )
     """
-    return CombinatorMatcher(
-        combinator_type="all_of",
-        conditions=list(conditions)
-    )
+    return CombinatorMatcher(combinator_type="all_of", conditions=list(conditions))
 
 
 def any_of(*conditions: Union[Matcher, Dict, Callable]) -> CombinatorMatcher:
@@ -65,10 +59,7 @@ def any_of(*conditions: Union[Matcher, Dict, Callable]) -> CombinatorMatcher:
         instruction(type="FROM", base_image="scratch")
     )
     """
-    return CombinatorMatcher(
-        combinator_type="any_of",
-        conditions=list(conditions)
-    )
+    return CombinatorMatcher(combinator_type="any_of", conditions=list(conditions))
 
 
 def none_of(*conditions: Union[Matcher, Dict, Callable]) -> CombinatorMatcher:
@@ -83,26 +74,25 @@ def none_of(*conditions: Union[Matcher, Dict, Callable]) -> CombinatorMatcher:
         instruction(type="USER", user_name_not="root")
     )
     """
-    return CombinatorMatcher(
-        combinator_type="none_of",
-        conditions=list(conditions)
-    )
+    return CombinatorMatcher(combinator_type="none_of", conditions=list(conditions))
 
 
 @dataclass
 class SequenceMatcher:
     """Represents instruction sequence validation."""
+
     sequence_type: str  # "after" or "before"
     instruction: Union[str, Matcher, Dict]
     reference: Union[str, Matcher, Dict]
     not_followed_by: bool = False
 
     def to_dict(self) -> Dict[str, Any]:
         """Convert to JSON IR."""
+
         def serialize_ref(ref):
             if isinstance(ref, str):
                 return {"instruction": ref}
-            elif hasattr(ref, 'to_dict'):
+            elif hasattr(ref, "to_dict"):
                 return ref.to_dict()
             elif isinstance(ref, dict):
                 return ref
@@ -112,14 +102,14 @@ def serialize_ref(ref):
             "type": f"instruction_{self.sequence_type}",
             "instruction": serialize_ref(self.instruction),
             "reference": serialize_ref(self.reference),
-            "not_followed_by": self.not_followed_by
+            "not_followed_by": self.not_followed_by,
         }
 
 
 def instruction_after(
     instruction: Union[str, Matcher],
     after: Union[str, Matcher],
-    not_followed_by: bool = False
+    not_followed_by: bool = False,
 ) -> SequenceMatcher:
     """
     Check that an instruction appears after another.
@@ -138,14 +128,14 @@ def instruction_after(
         sequence_type="after",
         instruction=instruction,
         reference=after,
-        not_followed_by=not_followed_by
+        not_followed_by=not_followed_by,
     )
 
 
 def instruction_before(
     instruction: Union[str, Matcher],
     before: Union[str, Matcher],
-    not_followed_by: bool = False
+    not_followed_by: bool = False,
 ) -> SequenceMatcher:
     """
     Check that an instruction appears before another.
@@ -157,21 +147,19 @@ def instruction_before(
         sequence_type="before",
         instruction=instruction,
         reference=before,
-        not_followed_by=not_followed_by
+        not_followed_by=not_followed_by,
     )
 
 
 @dataclass
 class StageMatcher:
     """Matcher for multi-stage build stage queries."""
+
     stage_type: str
     params: Dict[str, Any] = field(default_factory=dict)
 
     def to_dict(self) -> Dict[str, Any]:
-        return {
-            "type": f"stage_{self.stage_type}",
-            **self.params
-        }
+        return {"type": f"stage_{self.stage_type}", **self.params}
 
 
 def stage(
@@ -213,7 +201,7 @@ def final_stage_has(
     if instruction is not None:
         if isinstance(instruction, str):
             params["instruction"] = instruction
-        elif hasattr(instruction, 'to_dict'):
+        elif hasattr(instruction, "to_dict"):
             params["instruction"] = instruction.to_dict()
     if missing_instruction is not None:
         params["missing_instruction"] = missing_instruction

python-dsl/rules/container_decorators.py

Lines changed: 10 additions & 4 deletions
@@ -12,6 +12,7 @@
 @dataclass
 class RuleMetadata:
     """Metadata for a container security rule."""
+
     id: str
     name: str = ""
     severity: str = "MEDIUM"
@@ -24,6 +25,7 @@ class RuleMetadata:
 @dataclass
 class DockerfileRuleDefinition:
     """Complete definition of a Dockerfile rule."""
+
     metadata: RuleMetadata
     matcher: Dict[str, Any]
     rule_function: Callable
@@ -32,6 +34,7 @@ class DockerfileRuleDefinition:
 @dataclass
 class ComposeRuleDefinition:
     """Complete definition of a docker-compose rule."""
+
     metadata: RuleMetadata
     matcher: Dict[str, Any]
     rule_function: Callable
@@ -63,6 +66,7 @@ def _output_rules():
 
     # Compile rules to JSON IR format
     from . import container_ir
+
     compiled = container_ir.compile_all_rules()
 
     # Output to stdout for Go loader to capture
@@ -100,12 +104,13 @@ def dockerfile_rule(
         def container_runs_as_root():
             return missing(instruction="USER")
     """
+
     def decorator(func: Callable) -> Callable:
         # Get matcher from function
         matcher_result = func()
 
         # Convert to dict if it's a Matcher object
-        if hasattr(matcher_result, 'to_dict'):
+        if hasattr(matcher_result, "to_dict"):
             matcher_dict = matcher_result.to_dict()
         elif isinstance(matcher_result, dict):
             matcher_dict = matcher_result
@@ -115,7 +120,7 @@ def decorator(func: Callable) -> Callable:
         # Create rule definition
         metadata = RuleMetadata(
             id=id,
-            name=name or func.__name__.replace('_', ' ').title(),
+            name=name or func.__name__.replace("_", " ").title(),
             severity=severity,
             category=category,
             cwe=cwe,
@@ -154,10 +159,11 @@ def compose_rule(
         def privileged_service():
             return service_has(key="privileged", equals=True)
     """
+
     def decorator(func: Callable) -> Callable:
         matcher_result = func()
 
-        if hasattr(matcher_result, 'to_dict'):
+        if hasattr(matcher_result, "to_dict"):
             matcher_dict = matcher_result.to_dict()
         elif isinstance(matcher_result, dict):
             matcher_dict = matcher_result
@@ -166,7 +172,7 @@ def decorator(func: Callable) -> Callable:
 
         metadata = RuleMetadata(
             id=id,
-            name=name or func.__name__.replace('_', ' ').title(),
+            name=name or func.__name__.replace("_", " ").title(),
             severity=severity,
             category=category,
             cwe=cwe,

python-dsl/rules/container_ir.py

Lines changed: 1 addition & 1 deletion
@@ -100,5 +100,5 @@ def write_ir_file(filepath: str, pretty: bool = True):
         pretty: If True, format with indentation.
     """
     json_str = compile_to_json(pretty=pretty)
-    with open(filepath, 'w') as f:
+    with open(filepath, "w") as f:
         f.write(json_str)

python-dsl/rules/container_matchers.py

Lines changed: 4 additions & 4 deletions
@@ -9,19 +9,18 @@
 @dataclass
 class Matcher:
     """Base class for all matchers."""
+
     type: str
     params: Dict[str, Any] = field(default_factory=dict)
 
     def to_dict(self) -> Dict[str, Any]:
         """Convert matcher to dictionary for JSON IR."""
-        return {
-            "type": self.type,
-            **self.params
-        }
+        return {"type": self.type, **self.params}
 
 
 # --- Dockerfile Matchers ---
 
+
 def instruction(
     type: str,
     # FROM instruction
@@ -158,6 +157,7 @@ def missing(
 
 # --- docker-compose Matchers ---
 
+
 def service_has(
     key: str,
     equals: Optional[Any] = None,
