Here's something I've wondered about for a while — how does the Auto Quality setting determine the appropriate quality for a mesh? I've often found that the result is blurry compared to setting the quality myself.
Here's an example at 1080p, with a Canvas that's configured for 2160p.
[Screenshot 1: Quality 80 (manual)]
[Screenshot 2: Quality 40 (auto), top text box only]

In the second screenshot, the top text box is set to Auto Quality while the bottom is manually set to 80. Auto Quality picks 40 for the top box, and if you view the image at full resolution, you can see that it's a little blurry.
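For completeness, here's roughly what I mean by "configured for 2160p". I set it up in the Inspector, but in code it would amount to something like this (standard Canvas Scaler, nothing unusual):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Roughly equivalent to my Inspector setup: scale with screen size,
// 3840x2160 reference resolution.
[RequireComponent(typeof(CanvasScaler))]
public class CanvasSetup : MonoBehaviour
{
    private void Awake()
    {
        var scaler = GetComponent<CanvasScaler>();
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(3840f, 2160f);
    }
}
```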
Here's TMP for comparison:
[Screenshot 3: TMP]
Now, TMP might not be a good point of reference, since its system seems to work differently (there's no quality setting at all). But to my eye it looks the best of the three (roughly comparable to a quality of 70).
Anyway, I could probably work around this by having a script set the quality to "auto * 1.5" or something like that, but I'm wondering if I'm setting this up incorrectly, or if I should be using some other method to determine quality.
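For reference, the workaround I have in mind is something like the sketch below. The numbers in my screenshots line up suspiciously well (80 manual vs. 40 auto is exactly the 0.5 scale factor of a 2160p-reference canvas on a 1080p screen), so I'm assuming auto quality is basically the base quality times the canvas scale factor. The actual quality assignment is left as a TODO because I'm not sure which property the text component exposes for it.

```csharp
using UnityEngine;

// Sketch of the "auto * 1.5" workaround. Assumes auto quality is roughly
// (base quality) * (canvas scale factor); the quality assignment itself is
// a TODO since I haven't checked the component's API for it.
public class QualityBoost : MonoBehaviour
{
    [SerializeField] private Canvas canvas;            // the 2160p-reference canvas
    [SerializeField] private int baseQuality = 80;     // what I'd pick manually at full reference resolution
    [SerializeField] private float multiplier = 1.5f;  // fudge factor over the auto value

    private void Start()
    {
        // canvas.scaleFactor is 0.5 when a 2160p-reference canvas is shown at 1080p,
        // which is exactly how 80 (manual) becomes 40 (auto) in my screenshots.
        int boosted = Mathf.RoundToInt(baseQuality * canvas.scaleFactor * multiplier);

        // TODO: assign "boosted" to the text component's quality setting here.
        Debug.Log($"Would set quality to {boosted}");
    }
}
```

That would get me to roughly 60 in this case, but it still feels like I'm guessing at what Auto Quality is doing, hence the question.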