Comment 8 for bug 589485

Yves-Alexis Perez (corsac) wrote:

> The 'screen size' as reported in the core protocol now respects the DPI value
> given by the user or config file and ignores the actual monitor DPI of any
> connected monitor.

Hmmh, but what's the point? Over the past few years, my xorg.conf has completely disappeared, either because the configuration is done somewhere else or because all the settings are correctly autodetected by X. Why do I now have to revert that and specify the display size (which the user usually doesn't really know anyway, while the server does)?
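
If the answer really is "put it back in xorg.conf", I suppose the knob is something like the snippet below. The 597x336 mm figures are just made-up example values for a 24" panel, everyone would have to measure or look up their own, and the section still has to be tied to the right output in the rest of the config:

  Section "Monitor"
      Identifier  "Monitor0"
      # physical size in millimetres; the server derives the DPI from this
      DisplaySize 597 336
  EndSection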

> The real monitor size is reported through the RandR extension if you really
> need to know the physical size of the display.

Yeah, so the user needs to use X to get his display size, then edit xorg.conf to tell X the display size. I find that a bit painful and inconsistent.
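
Just to illustrate the inconsistency: the physical size is right there in the xrandr output, while the core protocol now reports whatever matches 96 dpi. The output below is only an example I typed from memory, the exact numbers will obviously differ per setup:

  $ xrandr | grep connected
  DVI-0 connected 1920x1200+0+0 (normal left inverted right x axis y axis) 518mm x 324mm

  $ xdpyinfo | grep -E 'dimensions|resolution'
    dimensions:    1920x1200 pixels (508x318 millimeters)
    resolution:    96x96 dots per inch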

> This is intentional and follows the practice seen in many other desktop
> environments where the logical DPI of the screen is used as an application and
> font scaling factor.

Hmhm, I don't get it. I'm not really comfortable with all that, but if my screen is a high-DPI one and my OS supports it, why would I need to do what other crappy OSes do and stick with a low-resolution, 96 DPI screen?

> If you don't like it, you're welcome to configure the X server to taste; we
> haven't removed any of those knobs. And, you can even use xrandr to change
> things on the fly.

That's good, but I really don't know why it's necessary. Wasn't autodetection working correctly for the majority of users? Couldn't the people it doesn't work for just use the DisplaySize option?
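
For the record, I suppose the on-the-fly knob you mention is something along these lines (I may be misremembering the exact option names, so take them with a grain of salt):

  # tell the server which logical DPI to report
  $ xrandr --dpi 144

  # or report a specific physical size (in millimetres) for the framebuffer
  $ xrandr --fbmm 518x324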

Cheers,