Comment 91 for bug 589485

Sergio Callegari (callegar) wrote:

A couple more notes.

IMHO, the biggest issues with the current situation are the following:

1) The physical display size is available via xrandr; however, xrandr delivers it in a format that is rather awkward to parse.
2) The hardware only provides a physical display size, not the expected viewing distance. Only with *both* ingredients can one reliably compute a good 'virtual' dpi value to pass to applications in order to get correctly sized fonts, icons and graphical elements (a sketch of such a computation follows this list).
3) When the desktop environment (DE) alone decides the dpi value, rather than xorg, there is always the risk of ending up with an unusual working environment when the user (i) tries a different DE, or (ii) has issues with the DE at startup. For these reasons, it would be good to have xorg provide a sane environment even before the DE starts.
4) For some reason some Java apps seem to ignore the dpi value set by the DE; I'm told that Matlab is among them.
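
For what it's worth, here is a minimal sketch of what I mean in points 1 and 2. The parsing regex, the 60 cm reference viewing distance and the distance-scaling formula are my own assumptions, not anything xorg or any DE actually does today:

#!/usr/bin/env python3
"""Sketch: derive a 'virtual' dpi from xrandr's physical size plus an
assumed viewing distance."""

import re
import subprocess

# Matches lines such as:
#   HDMI-1 connected primary 1920x1080+0+0 (normal ...) 527mm x 296mm
OUTPUT_RE = re.compile(
    r"^(?P<name>\S+) connected.*?(?P<w>\d+)x(?P<h>\d+)\+\d+\+\d+"
    r".*?(?P<wmm>\d+)mm x (?P<hmm>\d+)mm", re.MULTILINE)

REFERENCE_DISTANCE_CM = 60.0   # assumed "desktop monitor" viewing distance


def physical_dpi(width_px, width_mm):
    """Dots per inch straight from the EDID-reported physical width."""
    return width_px / (width_mm / 25.4)


def virtual_dpi(width_px, width_mm, viewing_distance_cm):
    """Scale the physical dpi so that text keeps the angular size it would
    have on a desktop monitor viewed at the reference distance."""
    return physical_dpi(width_px, width_mm) * (
        viewing_distance_cm / REFERENCE_DISTANCE_CM)


if __name__ == "__main__":
    xrandr_out = subprocess.run(
        ["xrandr"], capture_output=True, text=True, check=True).stdout
    for m in OUTPUT_RE.finditer(xrandr_out):
        w_px, w_mm = int(m.group("w")), int(m.group("wmm"))
        if w_mm == 0:          # some outputs (e.g. projectors) report 0mm
            continue
        print(f"{m.group('name')}: physical {physical_dpi(w_px, w_mm):.0f} dpi, "
              f"virtual {virtual_dpi(w_px, w_mm, REFERENCE_DISTANCE_CM):.0f} dpi "
              f"at {REFERENCE_DISTANCE_CM:.0f} cm")

For example, a 1920-pixel-wide panel reported as 527 mm works out to roughly 93 dpi physically; entering a 250 cm TV viewing distance would roughly quadruple the virtual value, which is exactly the effect one wants for across-the-room viewing.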

So, please:
1) in the short run, re-introduce the xorg patch with the switch that lets one control whether the system should pick the EDID dpi or not, as the NVIDIA driver does.
2) in the long run, it may make sense to have some heuristics in place capable of guessing whether a monitor is a desktop monitor (normal viewing distance), a TV (larger than normal viewing distance), a projector (larger than normal viewing distance /and/ a screen size that may vary with distance) or a pad (smaller than normal viewing distance and a small screen); a rough sketch of such a heuristic follows below.
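
Something as simple as the following would already be a start; the size thresholds and assumed distances here are illustrative guesses of mine, not measured values:

import math


def classify_display(width_mm, height_mm):
    """Return (category, assumed_viewing_distance_cm) from the EDID size."""
    diagonal_in = math.hypot(width_mm, height_mm) / 25.4
    if diagonal_in == 0:
        # Projectors commonly report 0mm x 0mm: the image size
        # depends on the throw distance, so no fixed guess is possible.
        return "projector", None
    if diagonal_in < 13:
        return "pad/tablet", 35.0       # held close, small screen
    if diagonal_in <= 32:
        return "desktop monitor", 60.0  # normal viewing distance
    return "tv", 250.0                  # across-the-room viewing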