How were the default AVIF quality and effort settings chosen? #4227
Comments
Hi Eric, you've provided a great sample image here for comparing image codecs with all that fine, oblique line detail on the animal fur.

The AVIF `quality` setting is a linear 1-100 scale. Within libheif this is mapped to libaom's "constant quality level" value (0-63 range, best to worst):

```c
int cq_level = ((100 - quality) * 63 + 50) / 100;
```

In terms of video vs still images, libheif always passes libaom a single still frame to encode. It's all rather CPU intensive, so as you probably know AVIF remains very much a format suited to encode-once, decode-many scenarios, such as with the superb 11ty.
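For a rough feel of where sharp's 1-100 `quality` values land on libaom's 0-63 scale, here is a quick sketch replicating that integer mapping in JavaScript (the helper name and sample values are mine, not part of sharp's API):

```js
// Replicates libheif's quality -> cq_level integer mapping shown above.
function cqLevel(quality) {
  return Math.floor(((100 - quality) * 63 + 50) / 100);
}

console.log(cqLevel(100)); // 0  (best)
console.log(cqLevel(70));  // 19
console.log(cqLevel(50));  // 32 (sharp's default AVIF quality)
console.log(cqLevel(1));   // 62 (worst)
```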
Thank you for the prompt and super-helpful response! Would it be accurate to say:
Given the existing complaints about Sharp's slow/expensive AVIF encoding, and the wide range of use cases Sharp serves, it probably doesn't make sense to bump Sharp's default effort parameter. It might (after some more testing) make sense to bump it downstream in eleventy-img, though. However, if my understanding above is correct, it might make sense to bump Sharp's AVIF quality default, because still images are always going to show their artifacts more than video frames that flash by in an instant. Or maybe the (single) image I tested with is an outlier, and the AVIFs Sharp outputs by default are mostly fine for most use cases most of the time.
The default `quality` is 50, the midpoint of the scale.

There was some research carried out a couple of years ago using sharp and dssim to calculate comparable "quality" settings for JPEG/WebP/AVIF: https://www.industrialempathy.com/posts/avif-webp-quality-settings

Interestingly, its summary that a JPEG quality of 80 is equivalent to an AVIF quality of 64 happens to align with the AVIF `quality` suggested for "visually lossless" output. Given sharp's default JPEG `quality` is 80, perhaps there's a case for raising the default AVIF `quality` towards 64. Would you be able to run some experiments with this?
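If it helps with testing, here is a minimal sketch of encoding one source image at those reportedly comparable settings, using sharp's documented `jpeg()` and `avif()` options (the input and output filenames are placeholders):

```js
const sharp = require("sharp");

// JPEG at sharp's current default quality of 80...
sharp("input.png").jpeg({ quality: 80 }).toFile("out-q80.jpg");

// ...and AVIF at the roughly equivalent quality of 64 per the article above.
sharp("input.png").avif({ quality: 64 }).toFile("out-q64.avif");
```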
Yes! But I'm about to leave for a week of vacation, so not until I get back the week of October 14th.
@eeeps Were you able to make any progress with this? |
Question about an existing feature
What are you trying to achieve?
I am trying to decide which AVIF quality settings to use when using https://github.com/11ty/eleventy-img/, and found that tool's defaults (which rely on Sharp's defaults) too low for my taste, for both quality and effort, in initial testing.
Before embarking on a bunch of tests, I'm looking for some context on how these scales work and how these values were chosen. For instance: is Sharp also just using libaom's defaults (which are probably more appropriate for video than for still images)?
When you searched for similar issues, what did you find that might be related?
I found a few issues where people are complaining about how expensive AVIF encoding is, and some information about how Sharp's settings map to libaom's, but nothing about how the scales work or how the defaults (which are both exactly in the middle of their range...) were chosen.
Please provide sample image(s) that help explain this question
Here's the image that caused me to want to increase the defaults when using eleventy-img (all of the texture in the dog's face fur vanishes, and in this usage context I care much less about build times than about compression performance):
Original: https://o.img.rodeo/w_800/dogs/2.png
Default settings (quality 50, effort 4): https://o.img.rodeo/w3fr2wrmmrf3pfrtmal7.avif – compared to original
Custom settings (quality 70, effort 7): https://o.img.rodeo/vahj96nhy8xchvlodlbo.avif – compared to original
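For anyone reproducing this: the custom settings above can be passed through to sharp via eleventy-img's `sharpAvifOptions` option, which (to the best of my understanding) is forwarded directly to sharp's `avif()`. A sketch, with the source path and width mirroring the example links:

```js
const Image = require("@11ty/eleventy-img");

(async () => {
  await Image("dogs/2.png", {
    widths: [800],
    formats: ["avif"],
    // Forwarded as-is to sharp's avif()
    sharpAvifOptions: { quality: 70, effort: 7 },
  });
})();
```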