An HDRI with theoretically unlimited range, where even the pixels that represent the sun hold their true values, is the ideal scenario for lighting a scene, since technically nothing is being 'faked'. The most reliable way to produce one is to create a digital HDRI in a program like Terragen and save the resulting image as a 32-bit EXR. It has to be 32-bit: at lower bit depths the extremely bright pixels will most likely be stored as 'NaN's (not a number) or overflow, and will light your scene incorrectly.
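A minimal sketch of why the bit depth matters, using NumPy to mimic the pixel types an EXR can store. The 100,000 value is the rough sun-to-sky ratio discussed below; 16-bit half floats (a common default EXR pixel type) top out at 65504, so a true sun value overflows and stops being a usable number:

```python
import numpy as np

# A plausible linear sun value: ~100,000x a sky pixel of 1.0.
sun_value = 100_000.0

# 16-bit half float: the value overflows to infinity and is lost.
as_half = np.float16(sun_value)
print(as_half)  # inf

# 32-bit float: the value survives intact.
as_full = np.float32(sun_value)
print(as_full)  # 100000.0
```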
A photographic HDRI will never properly capture the full range of the sun, no matter how many exposures you take: the physical limits of shutter speed and aperture size on 99% of cameras cannot reduce the light hitting the sensor enough for the sun to be captured at a value lower than 1. To fix this, you can go into a program like Nuke and roto the correct sun values into the photographic HDRI. On a perfectly clear day near the equator, the sun is roughly 100,000 times the brightness of the sky. So, to accurately reproduce the sun, find the average luminance of a clear patch of sky, multiply it by 100,000, and you will have something close to a real sun. If there is cloud cover, you will have to estimate the percentage by which the sun needs to be reduced.
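The estimate above can be sketched as a small helper. The function name and cloud-cover parameter are hypothetical, and the 100,000 ratio is the rough clear-sky figure from the text, not a calibrated constant:

```python
def estimate_sun_value(avg_sky_luminance, cloud_attenuation=0.0):
    """Estimate a linear sun pixel value for painting into an HDRI.

    avg_sky_luminance: average linear value of a clear patch of sky.
    cloud_attenuation: fraction (0.0-1.0) by which clouds dim the sun.
    """
    SUN_TO_SKY_RATIO = 100_000  # rough ratio for a clear day near the equator
    return avg_sky_luminance * SUN_TO_SKY_RATIO * (1.0 - cloud_attenuation)

# A sky patch averaging 0.8 in linear units, no clouds:
print(estimate_sun_value(0.8))        # 80000.0
# The same sky, with clouds cutting the sun by half:
print(estimate_sun_value(0.8, 0.5))   # 40000.0
```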
If the HDRI you are using has not clipped any values (every light source is accurately represented), then exposing the image down will behave predictably. But if any values are clipped, exposing down breaks the linear relationship between the light and the scene, and the lighting will be incorrect.
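A toy illustration of that breakage. Exposing a linear image down by N stops means multiplying every pixel by 2**-N; in an unclipped HDRI the sun-to-sky ratio survives that scaling, while a clipped sun (captured at 1.0, the same as the sky) stays wrongly dim at every exposure. The values here are assumed for the example, not measured:

```python
def expose(value, stops):
    # Exposure in linear light: each stop halves or doubles the value.
    return value * 2.0 ** stops

sky = 1.0
true_sun = 100_000.0   # unclipped HDRI holds the real sun value
clipped_sun = 1.0      # camera clipped the sun to white

# Unclipped: the 100,000:1 ratio survives a 10-stop exposure change.
print(expose(true_sun, -10) / expose(sky, -10))     # 100000.0

# Clipped: the sun is no brighter than the sky, so the scene
# receives far too little sunlight at any exposure.
print(expose(clipped_sun, -10) / expose(sky, -10))  # 1.0
```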