Comment 19 for bug 589485

Nick Bowler (nbowler) wrote:

(In reply to comment #18)
> http://en.wikipedia.org/wiki/Font_size
> A point is a unit of measure, 0.353 mm.
> If I set a character to be 10 points high, it has to be 3.5mm high on
> a screen.

Just to clarify, setting the font size to 10pt defines the size of an
"em", whose exact meaning depends on the selected font (it might not
correspond exactly to the height of any character on screen).
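
To make the arithmetic concrete, here is a rough sketch (illustrative
Python, not anything from the Xorg tree) of how a point size maps to
millimetres, and to pixels at a given DPI:

  # 1 pt = 1/72 inch; 1 inch = 25.4 mm.
  def pt_to_mm(points):
      return points * 25.4 / 72.0

  def pt_to_px(points, dpi):
      return points * dpi / 72.0

  print(pt_to_mm(10))       # ~3.53 mm, independent of the display
  print(pt_to_px(10, 96))   # ~13.3 px at 96 DPI
  print(pt_to_px(10, 147))  # ~20.4 px on a denser panel

The physical size is fixed by the point value alone; the pixel count is
what depends on the DPI the server reports.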

> I understand this might not be the most natural behaviour on
> projectors, but the vast majority of people use screens; there's no
> need to cause them trouble.

The main problem here is that our method of specifying font sizes is not
well suited for devices such as projectors or TVs because it does not
take the viewing distance into account. However, lying about the DPI
doesn't actually improve the situation.
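
As a very rough illustration of why distance matters (made-up numbers,
not a proposal), compare the angular size of a nominal 10pt glyph at
desktop and living-room viewing distances:

  import math

  def angular_size_deg(size_mm, distance_mm):
      return math.degrees(2 * math.atan((size_mm / 2) / distance_mm))

  em_mm = 10 * 25.4 / 72.0                # ~3.53 mm nominal 10pt em
  print(angular_size_deg(em_mm, 500))     # ~0.40 deg at 50 cm (monitor)
  print(angular_size_deg(em_mm, 3000))    # ~0.07 deg at 3 m (TV/projector)

A pure physical-size model breaks down at TV distances, but that is a
separate problem from reporting a wrong DPI on ordinary monitors.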

> Fiddling gratuitously with the DPI makes default configurations almost
> unusable on high resolution screens (fonts are rendered too small).

And when it doesn't make fonts unreadable, it makes them ugly.

> Even the default of 96dpi doesn't make sense, this resolution is
> getting less and less common every day.

I have owned exactly one display device in my lifetime with this pixel
density: a 17" LCD monitor with 1280x1024 pixels, which works out to
roughly 96 DPI. Most of my CRTs have a higher pixel density, and most of
my other "external" LCDs have a lower one. My laptops have a
significantly higher pixel density than all my other devices. So from my
personal experience, 96 is almost always the wrong choice. The number
seems to have come out of nowhere and makes little sense as a default.
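
For what it's worth, the arithmetic behind that observation is simple
(illustrative values below; only the 17" panel is one I actually own):

  import math

  def dpi(px_w, px_h, diagonal_inches):
      return math.hypot(px_w, px_h) / diagonal_inches

  print(dpi(1280, 1024, 17.0))   # ~96 DPI -- the one panel mentioned above
  print(dpi(1920, 1200, 15.4))   # ~147 DPI -- a fairly ordinary laptop panel
  print(dpi(1920, 1080, 40.0))   # ~55 DPI -- a 1080p television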

> Please reconsider this change in behaviour.
> What bug was it supposed to fix?

The commit message says

  Reporting the EDID values in the core means applications get
  inconsistent font sizes in the default configuration.

This makes no sense: physical font sizes are consistent across devices
only when the reported DPI correctly reflects reality! This change
*causes* font sizes to be inconsistent.
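
To see why, consider what a hard-coded 96 does to a nominal 10pt request
on panels of different density (again just illustrative arithmetic, not
code from the driver):

  def rendered_mm(points, reported_dpi, actual_dpi):
      px = points * reported_dpi / 72.0   # pixels the toolkit will draw
      return px * 25.4 / actual_dpi       # physical size of those pixels

  print(rendered_mm(10, 96, 96))    # ~3.5 mm on a true 96 DPI panel
  print(rendered_mm(10, 96, 147))   # ~2.3 mm on a 147 DPI laptop panel
  print(rendered_mm(10, 147, 147))  # ~3.5 mm again once DPI matches reality

The hard-coded value only gives consistent pixel counts; the physical
size, which is what a point is supposed to describe, varies with
whatever the display actually is.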