🐞 fix(cli): enable --model.pre_processor.image_size via CLI (fixes #3460) #3482
AbhayKumarDas wants to merge 2 commits into open-edge-platform:main
@AbhayKumarDas thanks for the PR. The current solution touches a lot of files and introduces another argument in all the models (which is itself already part of the pre-processor). The CLI already exposes the pre-processor via `--model.pre_processor`.
Thanks @ashwinvaidya17 for the detailed feedback; this makes sense. I agree that exposing `image_size` through model constructors is not the cleanest approach, and handling it through the pre-processor aligns better with the existing design. I will explore restructuring this using a `configure_pre_processor`-style pattern, similar to how Lightning CLI handles optimizers, and update the PR accordingly. Thanks for pointing me in the right direction.
Good evening @ashwinvaidya17, I have updated the implementation based on the feedback and reworked the design to align with the existing pre-processor architecture instead of introducing model-level parameters. The changes now use the `configure_pre_processor` flow with CLI-based argument injection, keeping the solution scoped to the CLI and avoiding modifications across model files. I have also added validation scenarios to ensure correctness across different models and edge cases. Sharing the validation document here for reference: [Click to View]. Would appreciate a quick review when you get a chance.
Problem
After the pre-processor refactor, changing the image size from the CLI requires rebuilding the entire preprocessing pipeline, specifying `PreProcessor`, `Compose`, `Resize`, and `Normalize`, just to change one number. For something as commonly adjusted as image size, this creates unnecessary friction.
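For illustration, the pre-fix workaround looked roughly like the following YAML (class paths are approximate and shown only to convey the verbosity, not copied from the repository):

```yaml
model:
  class_path: anomalib.models.Padim
  init_args:
    pre_processor:
      class_path: anomalib.pre_processing.PreProcessor
      init_args:
        transform:
          class_path: torchvision.transforms.v2.Compose
          init_args:
            transforms:
              - class_path: torchvision.transforms.v2.Resize
                init_args:
                  size: [240, 240]
              - class_path: torchvision.transforms.v2.Normalize
                init_args:
                  mean: [0.485, 0.456, 0.406]
                  std: [0.229, 0.224, 0.225]
```

All of this boilerplate exists to change a single `size` value.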
Previous approach and mentor feedback
My initial attempt added
image_sizeas a pass-through parameter to every model's__init__, forwarding it toAnomalibModule. While functional, this touched 22+ model files with identical boilerplate and duplicated a concept that already belongs to the pre-processor. Every new model would need the same pass-through — not scalable.@ashwinvaidya17 pointed out that the CLI already exposes
--model.pre_processor(defaults toTrue), and every model already has aconfigure_pre_processormethod that acceptsimage_size. The infrastructure was already there. He suggested following the Lightning CLIconfigure_optimizerspattern, where a method on the module is overridden via the CLI usingfunctools.partialas anomalib already uses this exact pattern for optimizers atcli.py:587.Design considerations
Before settling on the current approach, I evaluated a simpler alternative: replace `model.pre_processor` after model instantiation by calling `configure_pre_processor(image_size=...)` post-init. This would have been fewer lines, but it breaks five models (CsFlow, Fastflow, Ganomaly, UFlow, and ReverseDistillation), which read `self.input_size` during `__init__` to construct their torch model architectures. By the time we'd replace the pre-processor, the model would already be built with the wrong input dimensions.

The chosen approach patches `configure_pre_processor` on the model class before instantiation, so the correct image size flows through from the start. This mirrors how Lightning CLI overrides `configure_optimizers`: the method is temporarily replaced with a `partial` that has the CLI arguments baked in, then restored after instantiation.

How it works
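Before walking through the mechanism, here is a toy sketch of why the rejected post-init replacement comes too late. The class below is hypothetical (not the real Fastflow), but it mimics models that read `input_size` in `__init__`:

```python
# Hypothetical model: like CsFlow/Fastflow, it reads input_size in __init__
# to fix its architecture dimensions once, at construction time.
class ToyFlowModel:
    def __init__(self, input_size: tuple = (256, 256)) -> None:
        self.input_size = input_size
        # Sub-network sized here, once, from input_size:
        self.feature_dim = input_size[0] // 8


model = ToyFlowModel()
# Swapping the pre-processor (or input_size) after the fact cannot
# resize the already-built architecture:
model.input_size = (128, 128)
assert model.feature_dim == 32  # still sized for 256x256, not 128x128
```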
The solution has two stages, both in `cli.py`.

The first stage handles a parsing constraint. `pre_processor` is typed as `nn.Module | bool` in `AnomalibModule.__init__`, so jsonargparse rejects `--model.pre_processor.image_size` at parse time. To solve this, `_extract_pre_processor_args` intercepts the raw CLI arguments before `parse_args`, extracts any `--model.pre_processor.*` entries, and removes them from the arg list. The `pre_processor` parameter then falls back to its default value `True`, meaning "use `configure_pre_processor()` to create the default."

The second stage injects the extracted arguments into the model's
`configure_pre_processor` method. Before `parser.instantiate_classes()` creates the model, `_patch_configure_pre_processor` resolves the model class from config, saves the original method descriptor, and replaces it with `staticmethod(partial(original, **extracted_kwargs))`. When the model's `__init__` calls `_resolve_component(True, ..., self.configure_pre_processor)`, the partial fires with `image_size` already bound. After instantiation, the original method is restored in a `try/finally` block, so there are no permanent side effects on the class.

This approach correctly handles both `@staticmethod` and `@classmethod` overrides across different models. For models that inherit `configure_pre_processor` from the base class without their own override, `delattr` removes the temporary patch and restores normal MRO resolution.

What doesn't break
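To make the no-side-effects claim concrete, here is a minimal sketch of the patch-and-restore flow (toy class and helper names, not the actual `cli.py` code). The final assertions show the class behaves normally once the patch is removed:

```python
from functools import partial


class ToyModel:
    def __init__(self) -> None:
        # Mirrors _resolve_component(True, ...): the default pre-processor
        # is created through configure_pre_processor during __init__.
        self.pre_processor = self.configure_pre_processor()

    @classmethod
    def configure_pre_processor(cls, image_size=(256, 256)):
        return {"image_size": image_size}


def instantiate_with_patch(model_cls, **kwargs):
    """Bind CLI kwargs into configure_pre_processor, build, then restore."""
    original = model_cls.__dict__["configure_pre_processor"]  # descriptor
    bound = original.__get__(None, model_cls)  # resolve classmethod binding
    model_cls.configure_pre_processor = staticmethod(partial(bound, **kwargs))
    try:
        return model_cls()
    finally:
        # Restore the saved descriptor: no permanent change to the class.
        model_cls.configure_pre_processor = original


model = instantiate_with_patch(ToyModel, image_size=(128, 128))
assert model.pre_processor == {"image_size": (128, 128)}
# The class is restored: a plain instantiation uses the default again.
assert ToyModel().pre_processor == {"image_size": (256, 256)}
```

For a model that only inherits the method, the same idea applies with `delattr` in the `finally` block instead of reassignment, so attribute lookup falls back to the base class.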
When no `--model.pre_processor.*` args are passed, `_pre_processor_kwargs` stays empty, no patching occurs, and the original code path runs untouched. Model-specific overrides are respected: PatchCore keeps its `center_crop_size` default, WinClip and UFlow still warn and ignore `image_size`, and DRAEM and EfficientAd still skip normalization. Custom `PreProcessor` instances passed via `--model.pre_processor <class_path>` are unaffected, since there are no sub-args to extract.

This design is also forward-compatible with exposing additional pre-processor parameters in the future. Any keyword argument accepted by a model's `configure_pre_processor` can be passed via `--model.pre_processor.<key>` without further code changes; for example, PatchCore's `center_crop_size` already works today.

Example usage
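A sketch of the intended invocation (the flag itself comes from this PR; the model and data names are illustrative and may differ in your setup):

```
# Change only the image size; the rest of the pipeline keeps its defaults:
anomalib train --model Padim --data MVTecAD \
    --model.pre_processor.image_size "[240, 240]"

# Any other configure_pre_processor kwarg works the same way, e.g.:
anomalib train --model Patchcore --data MVTecAD \
    --model.pre_processor.center_crop_size "[224, 224]"
```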
Files changed
src/anomalib/cli/cli.py — 1 file, +114 lines, 0 model files touched
Previous Approach Overview
This PR fixes #3460 by allowing users to directly configure input image size from the CLI using `--model.image_size`.

Before this change, modifying image size required redefining the entire preprocessing pipeline, which made a very simple task unnecessarily complicated. This update restores a clean and intuitive way to control input resolution from the CLI.
Problem
After the pre-processor refactor, image resizing became tied to the `PreProcessor` configuration. Because of this, users could not change the image size directly from the CLI.

In practice, this meant users had to either reconstruct the full `PreProcessor` using CLI arguments or write a YAML configuration just to change a single parameter. For something as commonly adjusted as image size, this created unnecessary friction.

Root Cause
The CLI is built on top of `jsonargparse`, which determines available arguments based on the `__init__` signatures of model classes.

Although `image_size` logically belongs to the base class (`AnomalibModule`), it was not exposed in the constructors of model subclasses like `Padim` or `Stfpm`. As a result, the CLI could not recognize `--model.image_size`, even though the base module had the capability to handle it.

Solution
The solution follows the existing architecture instead of introducing new CLI-level logic.
First, `image_size` is added to the constructor of every model and forwarded to `AnomalibModule`. This makes the parameter visible to `jsonargparse`, allowing it to be used directly from the CLI. This approach is consistent with how other parameters such as `pre_processor`, `post_processor`, `evaluator`, and `visualizer` are already handled.

Second, the base module (`AnomalibModule`) is extended to apply the image size dynamically. When `image_size` is provided and the default pre-processor is being used, the preprocessing pipeline is modified at runtime. If a custom `PreProcessor` is provided by the user, the behavior remains unchanged.

Instead of rebuilding the preprocessing pipeline, the implementation updates it in-place. If a `Resize` transform already exists, it is replaced with the new size. If it does not exist, a `Resize` transform is added at the beginning. Existing attributes such as interpolation and antialiasing are preserved, ensuring consistent behavior.

Architecture
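The replace-or-prepend logic described above can be sketched as follows, using stand-in transform classes rather than the real torchvision ones:

```python
# Stand-in transforms, only to illustrate the update logic.
class Resize:
    def __init__(self, size, interpolation="bilinear", antialias=True):
        self.size = size
        self.interpolation = interpolation
        self.antialias = antialias


class Normalize:
    pass


def set_image_size(transforms: list, image_size) -> None:
    """Replace an existing Resize in-place, preserving its other attributes,
    or prepend a new Resize if the pipeline has none."""
    for i, t in enumerate(transforms):
        if isinstance(t, Resize):
            transforms[i] = Resize(
                image_size,
                interpolation=t.interpolation,
                antialias=t.antialias,
            )
            return
    transforms.insert(0, Resize(image_size))


pipeline = [Resize((256, 256), antialias=False), Normalize()]
set_image_size(pipeline, (128, 128))
assert pipeline[0].size == (128, 128)
assert pipeline[0].antialias is False  # existing attribute preserved

bare = [Normalize()]
set_image_size(bare, (64, 64))
assert isinstance(bare[0], Resize)  # Resize prepended when absent
```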
This approach keeps the design aligned with the existing system:
- `jsonargparse` automatically handles CLI parsing based on these constructors

No changes were made to `cli.py`, and no special argument parsing logic was introduced. This keeps the solution clean, scalable, and consistent with the framework's design principles.

Example Usage
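Under this earlier design, the intended invocation would have looked roughly like this (model and data names illustrative):

```
anomalib train --model Padim --data MVTecAD --model.image_size "[240, 240]"
```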