I'm not sure why you would say it's not really HDR. It is. It's the base standard; the others use the same 10-bit dynamic color range. The 400/500/600 just refer to the maximum brightness of the display, something that may be a benefit but isn't related to whether or not something is "true" HDR. HDR10 is true HDR, and HDR600 etc. displays are also HDR10, with the added benefit of increased brightness.
HDR600 etc. don't just specify a minimum required brightness but also a maximum allowed black level. Together those give you an actual dynamic range; 10-bit is not a range, it's a resolution.
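To illustrate the range-vs-resolution point with a minimal Python sketch (the constants are the published SMPTE ST 2084 / PQ values; the rest is just for illustration): every HDR10 signal addresses the same 0 to 10,000 nit PQ range, and the bit depth only determines how finely that range is sampled.

    # PQ (SMPTE ST 2084) EOTF: maps a normalized code value to absolute luminance.
    # Constants as published in the ST 2084 spec.
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32

    def pq_to_nits(e):
        """Normalized signal e in [0, 1] -> luminance in cd/m^2 (nits)."""
        p = e ** (1 / m2)
        return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

    # 10 bits = 1024 code values sampling the same 0..10,000 nit curve;
    # 12 bits would sample the same curve with 4096 steps. More bits
    # mean finer steps, not a wider range.
    for code in (0, 256, 512, 768, 1023):
        print(code, round(pq_to_nits(code / 1023), 2))

(Full-range coding assumed for simplicity; real video signals often use limited range, but the point stands.)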
HDR10 is a 10-bit color depth, not a resolution. 1080p is generally considered the minimum resolution, but that has nothing to do with the "10" in HDR10:

> HDR10 Media Profile, more commonly known as HDR10, was announced on August 27, 2015, by the Consumer Technology Association and uses the wide-gamut Rec. 2020 color space, a bit depth of 10 bits [0]
Also, no, the 400/600 etc. is just brightness: "HDR10 and HDR 400 are the same, except HDR 400 mentions the level of brightness on the display." [1]
You may be thinking of DisplayHDR True Black, which is a further enhancement, but something different. Deeper blacks can also be achieved by non-OLED displays that support full-array local dimming (FALD), and it looks like most displays that support FALD are also HDR displays.
> Also, no, the 400/600 etc. is just brightness: "HDR10 and HDR 400 are the same, except HDR 400 mentions the level of brightness on the display." [1]
VESA [0] disagrees with you. Note the Maximum Black Level Luminance restrictions. Minimum brightness + maximum black level = minimum contrast. That's why legacy display technologies with mediocre static contrast (TN, VA, IPS) need backlight hacks (local dimming) to support those specifications.
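Concretely, the arithmetic looks like this (a sketch; the luminance figures are my reading of VESA's DisplayHDR spec table, minimum peak luminance and maximum black level luminance in cd/m², so verify against the current CTS):

    # Minimum guaranteed contrast implied by a few DisplayHDR tiers.
    # (min peak luminance, max black level luminance), both in cd/m^2.
    tiers = {
        "DisplayHDR 400": (400, 0.40),
        "DisplayHDR 600": (600, 0.10),
        "DisplayHDR 1000": (1000, 0.05),
        "DisplayHDR True Black 400": (400, 0.0005),
    }

    for name, (peak, black) in tiers.items():
        print(f"{name}: at least {peak / black:,.0f}:1 contrast")

A typical IPS panel's native ~1000:1 static contrast clears the 400 tier on its own, which is why anything above that needs local dimming or an emissive panel.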