feat(models, training): Optimise spectral transform#985

Draft
samhatfield wants to merge 9 commits into main from feat/combine_two_shts_into_one
Conversation

Collaborator

@samhatfield samhatfield commented Mar 18, 2026

Description

This PR implements two optimisations to spectral transforms:

  1. reduces the number of transforms required to compute the spectral L2 loss from two to one
  2. batches FFTs of the same length together

What problem does this change solve?

Optimisation 1

The L2 spectral loss is currently computed as follows:

pred_spectral = self._to_spectral_flat(pred)
target_spectral = self._to_spectral_flat(target)
diff = torch.abs(pred_spectral - target_spectral) ** 2

A spectral transform is a linear operation, so it is not necessary to transform each field individually. Instead, you can transform the residual itself, which reduces the number of calls to _to_spectral_flat from two to one:

diff = torch.abs(self._to_spectral_flat(pred - target)) ** 2

This improves the overall speed of training at N320 by about 5% when spectral losses are used.
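The equivalence above rests only on linearity of the transform. A minimal sketch, using NumPy's rfft as a hypothetical stand-in for _to_spectral_flat (any linear transform behaves the same way):

```python
import numpy as np

# Hypothetical stand-in for self._to_spectral_flat; any linear transform works.
def to_spectral(x):
    return np.fft.rfft(x, axis=-1)

rng = np.random.default_rng(0)
pred = rng.standard_normal((4, 64))    # (batch, grid points)
target = rng.standard_normal((4, 64))

# Original: two transforms, one per field.
two_calls = np.abs(to_spectral(pred) - to_spectral(target)) ** 2

# Optimised: a single transform of the residual.
one_call = np.abs(to_spectral(pred - target)) ** 2

print(np.allclose(two_calls, one_call))  # True, by linearity of the FFT
```

Note the identity holds for the squared modulus of the spectral residual, not for quantities like |F(pred)|² - |F(target)|², which are nonlinear in each field separately.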

Optimisation 2

For classic reduced grids like N320, many latitudes have the same number of points. It makes sense to group these together and execute a single batched FFT per group.

This provides a further ~20% performance improvement at N320.

Note that optimisation 2 does nothing for octahedral grids, which have a different FFT length at every latitude. There, graphs are still the best solution.
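The grouping idea can be sketched as follows, again with NumPy's rfft as a stand-in and made-up per-latitude point counts (not the real N320 values): rows of equal length are stacked and transformed in one batched call, then scattered back to their original positions.

```python
import numpy as np
from collections import defaultdict

# Hypothetical per-latitude point counts for a reduced grid (illustrative only).
nlons = [20, 24, 24, 28, 28, 28, 24, 20]
rng = np.random.default_rng(0)
rows = [rng.standard_normal(n) for n in nlons]

# Naive: one FFT call per latitude row.
naive = [np.fft.rfft(r) for r in rows]

# Batched: group row indices by length, one batched FFT per group.
groups = defaultdict(list)
for idx, r in enumerate(rows):
    groups[len(r)].append(idx)

batched = [None] * len(rows)
for length, idxs in groups.items():
    stacked = np.stack([rows[i] for i in idxs])  # (group size, length)
    out = np.fft.rfft(stacked, axis=-1)          # single call for the group
    for j, i in enumerate(idxs):
        batched[i] = out[j]

print(all(np.allclose(a, b) for a, b in zip(naive, batched)))  # True
```

For the hypothetical counts above this replaces eight FFT calls with three (one each for lengths 20, 24, and 28); on a grid like N320, where many of the hundreds of latitudes share a length, the reduction in call overhead is correspondingly larger.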

What issue or task does this change relate to?

#599 #974

Additional notes

As a contributor to the Anemoi framework, please ensure that your changes include unit tests, updates to any affected dependencies and documentation, and have been tested in a parallel setting (i.e., with multiple GPUs). As a reviewer, you are also responsible for verifying these aspects and requesting changes if they are not adequately addressed. For guidelines about those please refer to https://anemoi.readthedocs.io/en/latest/

By opening this pull request, I affirm that all authors agree to the Contributor License Agreement.

@sahahner sahahner added the ATS Approval Not Needed No approval needed by ATS label Mar 18, 2026
@samhatfield samhatfield changed the title from feat(models): Only call spectral transform once on residual to feat(models): Optimised spectral transform Mar 18, 2026
@samhatfield samhatfield changed the title from feat(models): Optimised spectral transform to feat(models): Optimise spectral transform Mar 18, 2026
Contributor

@OpheliaMiralles OpheliaMiralles left a comment

Looks OK in theory. I'm not personally using the SpectralL2Loss, but it shouldn't make any difference.

@github-project-automation github-project-automation bot moved this from To be triaged to For merging in Anemoi-dev Mar 19, 2026
@samhatfield
Collaborator Author

Sorry I should have marked it as a draft... I'm not sure what the etiquette is for anemoi PRs. This is not yet ready for merging. Still playing around with a few things offline and consulting people in other channels to make sure what I'm doing here is okay.

@sahahner sahahner mentioned this pull request Mar 19, 2026
@samhatfield samhatfield changed the title feat(models): Optimise spectral transform feat(models, training): Optimise spectral transform Mar 19, 2026
@floriankrb floriankrb marked this pull request as draft April 1, 2026 09:42
Labels: ATS Approval Not Needed
Projects

Status: For merging

4 participants