@y-guyon
Contributor

@y-guyon y-guyon commented Jun 18, 2025

Based on #2599.

Note that "(Default: psnr)" was incorrect.

@maryla-uc
Contributor

With this PR, using -a sharpness=0 does nothing, because setting tune=iq afterwards overrides it.
This also happens when manually setting -a sharpness=0 -a tune=iq. It only works as expected with -a tune=iq -a sharpness=0.
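For illustration, the ordering issue looks like this on the avifenc command line (file names are hypothetical):

```
# sharpness=0 is silently overridden: tune=iq is applied afterwards and resets it
avifenc -a sharpness=0 -a tune=iq input.png output.avif

# works as expected: tune=iq defaults are applied first, then sharpness=0 overrides
avifenc -a tune=iq -a sharpness=0 input.png output.avif
```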

@juliobbv-p

@maryla-uc good catch! tune iq sets up many optimized defaults (including --sharpness), so the codec control call to set tune iq should happen before the -a parameters.

@y-guyon
Contributor Author

y-guyon commented Jul 23, 2025

> @maryla-uc good catch! tune iq sets up many optimized defaults (including --sharpness), so the codec control call to set tune iq should happen before the -a parameters.

Done.

@castano

castano commented Oct 3, 2025

Any idea what's holding up this PR?

@y-guyon
Contributor Author

y-guyon commented Oct 3, 2025

> Any idea what's holding up this PR?

There are a few consequences of TUNE=IQ to take into consideration before using it by default:

  • The same quality encoding setting no longer leads to the same file size and visual distortion, so the default encoding settings may produce noticeably different output after this change.
  • TUNE=IQ is slower to encode. One option may be to raise the default speed from 6 to 7, but that could also have side effects on output file size and decoding time.

@juliobbv-p

@castano another thing worth mentioning: tune IQ is already considered production quality (on libaom 3.13.1 and newer). If a project doesn't depend on specific tune ssim behavior, it can switch to tune IQ right away.

For example, The Guardian has been using tune IQ for a few months, and we've received positive feedback from the website admin in the form of overall better image quality and consistency. Tune IQ's improvement in bit allocation by itself can be seen as a "bug fix".

I wouldn't wait for this PR to be merged as a sign of feature "readiness". On the contrary, we're just trying to be super thorough in understanding the inherent differences in bitrate allocation and encode time with tune IQ across images, compared to tune SSIM.

@castano

castano commented Oct 4, 2025

I think it makes sense to evaluate the consequences of enabling tune IQ before making it the default choice, but it would be helpful to first add support for tune IQ as a user option, and only later make it the default after more users have had the chance to evaluate it and provide feedback.

I have clients interested in using tune IQ, but they won't maintain a custom build of libavif with modifications that they have to merge on every update.

Similarly, tools that use libavif won't offer this option until it's officially supported by libavif, so not having this exposed also delays its upstream adoption.

@juliobbv-p

juliobbv-p commented Oct 5, 2025

@castano you can enable tune IQ in avifenc with the following parameter: -a c:tune=iq. This is the official way to switch between different tunings, so it's guaranteed to never change.

avifenc in turn passes in the parameter to libavif as a key ("tune")/ value ("iq") pair in the following manner:

static avifBool avifCodecSpecificOptionsAdd(avifCodecSpecificOptions * options, const char * keyValue)
{
    const char * value = strchr(keyValue, '=');
    if (value) {
        // Keep the parts on the left and on the right of the equal sign,
        // but not the equal sign itself.
        const size_t keyLength = value - keyValue;
        return avifCodecSpecificOptionsAddKeyValue(options, keyValue, keyLength, value + 1);
    }
    // Pass in a non-NULL, empty string. Codecs can use the mere existence of a key as a boolean value.
    return avifCodecSpecificOptionsAddKeyValue(options, keyValue, strlen(keyValue), "");
}

avifenc.c has the necessary code to talk to libavif, including passing in codec-specific options.

@castano

castano commented Oct 6, 2025

Hmm... I haven't tried avifenc, but I didn't see the iq option in the aomOptionDefs struct, so I assumed that would not work without additional changes:

https://github.com/AOMediaCodec/libavif/blob/40cb2c1fbff1e14b6c81816b73a364490b12ffb6/src/codec_aom.c#L397C39-L397C49

Ah, I see it gets routed to aom_codec_set_option, never mind!

@wantehchang
Collaborator

Ignacio: The aomOptionDefs struct is only used with older versions of libaom that don't have the aom_codec_set_option() function. In that case, the feature detection macro HAVE_AOM_CODEC_SET_OPTION is not defined.

For testing this pull request, please use the latest version of libaom, v3.13.1 if you can. Thanks!

@y-guyon y-guyon changed the title Use AOM_TUNE_IQ by default Use AOM_TUNE_IQ by default for YUV Jan 5, 2026
@y-guyon y-guyon requested a review from wantehchang January 16, 2026 10:56
Co-authored-by: Vincent Rabaud <[email protected]>
src/codec_aom.c Outdated
#if defined(AOM_HAVE_TUNE_IQ) && defined(AOM_USAGE_ALL_INTRA) && defined(ALL_INTRA_HAS_SPEEDS_7_TO_9)
// AOM_TUNE_IQ is favored for its low perceptual distortion on luma and chroma samples.
// AOM_TUNE_IQ sets --deltaq-mode=6 which can only be used in all intra mode.
if (image->matrixCoefficients != AVIF_MATRIX_COEFFICIENTS_IDENTITY && (addImageFlags & AVIF_ADD_IMAGE_FLAG_SINGLE)) {
Collaborator

IMPORTANT: If the addImageFlags & AVIF_ADD_IMAGE_FLAG_SINGLE test is intended to detect whether all intra mode is used, ideally we should wait until we have set the aomUsage variable and then test aomUsage == AOM_USAGE_ALL_INTRA.

This requires moving some code around and possibly saving some variables such as aomUsage and useTuneIq in the codec->internal struct.

Would you like me to take a stab at this and attach a patch to this pull request?

Contributor Author

> Would you like me to take a stab at this and attach a patch to this pull request?

Thank you for the proposal. Could you do that in a separate pull request, so that this PR only switches tune=IQ on by default?

Collaborator

Yes, we can reorder the code in a separate pull request.

Optional: If the addImageFlags & AVIF_ADD_IMAGE_FLAG_SINGLE check is intended to detect whether all intra mode is used rather than whether we are encoding a still image, it would be good to use a local variable whose name clarifies the purpose of the check:

            avifBool useAllIntra = (addImageFlags & AVIF_ADD_IMAGE_FLAG_SINGLE) != 0;
            if (image->matrixCoefficients != AVIF_MATRIX_COEFFICIENTS_IDENTITY && useAllIntra) {

Contributor Author

Done and in #2973.

Collaborator

This PR also includes the changes in #2973. Is that intended? Or is #2973 a prerequisite for this PR?

@wantehchang
Copy link
Collaborator

Yannis: Sorry about the late review. I missed your review request.

Collaborator

@wantehchang wantehchang left a comment

LGTM. Thanks!


Add non-RGB to avifenc help.
Base tune default on aomUsage.
// Speed 8: RealTime CpuUsed 8
// Speed 9: RealTime CpuUsed 9
// Speed 10: RealTime CpuUsed 9
unsigned int libavifDefaultAomUsage = AOM_USAGE_GOOD_QUALITY;
Collaborator

Nit: Remove "Default" from the variable name.

This AOM usage won't be overridden by something else, so it isn't a default value.

I also think the "libavif" prefix is unnecessary. You didn't add the "libavif" prefix to "aomCpuUsed".

libavifDefaultAomUsage = AOM_USAGE_REALTIME;
#endif
}
}
Collaborator

I suggest also moving the following code that sets up aomCpuUsed:

        // aom_codec.h says: aom_codec_version() == (major<<16 | minor<<8 | patch)
        static const int aomVersion_2_0_0 = (2 << 16);
        const int aomVersion = aom_codec_version();
        if ((aomVersion < aomVersion_2_0_0) && (image->depth > 8)) {
            // Due to a known issue with libaom v1.0.0-errata1-avif, 10bpc and
            // 12bpc image encodes will call the wrong variant of
            // aom_subtract_block when cpu-used is 7 or 8, and crash. Until we get
            // a new tagged release from libaom with the fix and can verify we're
            // running with that version of libaom, we must avoid using
            // cpu-used=7/8 on any >8bpc image encodes.
            //
            // Context:
            //   * https://github.com/AOMediaCodec/libavif/issues/49
            //   * https://bugs.chromium.org/p/aomedia/issues/detail?id=2587
            //
            // Continued bug tracking here:
            //   * https://github.com/AOMediaCodec/libavif/issues/56

            if (aomCpuUsed > 6) {
                aomCpuUsed = 6;
            }
        }

