Comment 32 for bug 199331

Gww-u (gww-u) wrote:

If the latin1 bit is not set on a normal font, then Windows won't use the font for anything useful. So FontForge pretty much always sets this bit when outputting normal fonts. When outputting symbol fonts it does not set this bit, as it isn't applicable there.
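
For reference, here is a minimal sketch, not part of the original exchange, of how one could check which of those bits a generated font actually carries; it uses the Python fontTools library, and the filename is just a placeholder.

   # Sketch: inspect ulCodePageRange1 of a generated TrueType font with fontTools.
   from fontTools.ttLib import TTFont

   font = TTFont("marlett.ttf")   # placeholder path
   os2 = font["OS/2"]

   # Per the OpenType OS/2 spec, bit 0 of ulCodePageRange1 is Latin 1 (cp1252)
   # and bit 31 is the Symbol character set.
   latin1 = bool(os2.ulCodePageRange1 & (1 << 0))
   symbol = bool(os2.ulCodePageRange1 & (1 << 31))
   print("Latin1 bit:", latin1, "Symbol bit:", symbol)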

The root of the problem, as I keep saying (five times? six?), is that fontforge no longer generates a 3,0 cmap entry for marlett.sfd (unless you specifically request a symbol encoding). This has a number of implications, including the way the OS/2 code pages are defaulted.
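
As a hedged illustration (again not from the original post), one way to see whether a generated file actually contains a (3,0) cmap subtable is with fontTools; the filename below is a placeholder.

   # Sketch: look for a Microsoft Symbol (platform 3, encoding 0) cmap subtable.
   from fontTools.ttLib import TTFont

   font = TTFont("marlett.ttf")        # placeholder path
   cmap = font["cmap"]
   symbol_sub = cmap.getcmap(3, 0)     # Microsoft Symbol subtable, or None
   unicode_sub = cmap.getcmap(3, 1)    # Microsoft Unicode BMP, for comparison
   print("has (3,0) symbol cmap:", symbol_sub is not None)
   print("has (3,1) unicode cmap:", unicode_sub is not None)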

If you don't want fontforge to default the setting of the code pages/unicode ranges, then you can set them explicitly in Element->Font Info->OS/2->Charsets.
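
The same setting can presumably also be made from a script rather than the dialog. The following rough sketch uses FontForge's Python interface and assumes the os2_codepages attribute (a 2-tuple covering ulCodePageRange1/2); check the documentation of your FontForge version before relying on it.

   # Sketch: set the OS/2 code page ranges explicitly instead of letting
   # FontForge default them (assumes the os2_codepages attribute exists).
   import fontforge

   font = fontforge.open("marlett.sfd")
   font.os2_codepages = (1 << 31, 0)   # claim only the Symbol code page
   font.generate("marlett.ttf")
   font.close()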

No, this doesn't count as a fontforge bug. If you believe you have a fontforge bug, please report it on the fontforge mailing list. The Wine bug tracker is not an appropriate place.

>That still doesn't answer the question why fontforge now sets Latin1 *and*
>Symbol bits in the ulCodePageRange1 field in the OS/2 TrueType header, while
>previously it only set the Symbol one.
When I use fontforge to generate a truetype font with a symbol encoding, it does *NOT* set the latin1 bit. For example:
   Open("marlett.sfd")
   Generate("marlett.sym.ttf")

>Also, there is a thing called backwards compatibility. A behaviour of
>fontforge that was valid for years is now suddenly called broken, making
>previously valid .sfd files useless.
As I pointed out in my previous post, marlett.sfd is not a valid sfd file. It claims an encoding (Adobe Symbol) which it does not have.
  There seems to be an assumption that FontForge's symbol encoding (which is Adobe's) means the same as the symbol cmap type. That is not the case.
  The behavior you are depending on was never documented.

Now can we please leave this topic?
  The old behavior was wrong. Marlett.sfd is mildly wrong. You have a fix which works.