It's not just that.
The SDR spec intends for content to be viewed at a reference brightness of 100 nits, but this is not enforced, and most HDR TVs are capable of displaying it at 5× that level - or more.
The HDR spec, by contrast, encodes content at absolute brightness levels meant to be reproduced exactly as mastered, and most displays cannot reach even 1/5 of the format's maximum (the PQ curve goes up to 10,000 nits) - so there is no headroom to push the image brighter without compressing the dynamic range.
This means that many people end up watching SDR out-of-spec and a lot brighter than intended, while HDR content is forced in-spec (or closer to it) and appears much darker.
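To put rough, illustrative numbers on it (typical figures, not any particular model): SDR mastered for 100 nits but shown on a TV pushing 500 nits is a 5× boost over the intended level, while an HDR title mastered to 1,000 or 4,000 nits on a display that peaks around 600 nits has to tone-map its highlights down - there's nowhere to go but below the mastered brightness.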
If you compare a calibrated SDR image to an HDR one, the HDR image is generally going to be brighter - though often it's specific elements of the picture (the highlights) that are brighter, rather than the entire image.
In a dark room - the environment HDR is intended to be viewed in - a high average picture level can be uncomfortable to watch, so a lot of HDR content does not raise the average brightness significantly above SDR levels.
I think a lot of people are going to prefer the look of high-brightness SDR to an HDR image.
And many HDR TVs will also render SDR with much more vivid/saturated colors than they produce in HDR.
Again: it's not how the content was intended to be viewed, but many people don't care about that and just want a bright, vibrant image.