This appears to be the case for all the digit glyphs, not just the subscripts. For example, only three.glyph has a Unicode parameter (`<unicode hex="0033"/>`), and if you inspect any of the variations on three, they all show a Unicode value of 0033.
As a result, the browser falls back to other installed fonts to display those numeric glyphs, and if no other fonts are available, the .notdef slashed rectangle is shown. Unfortunately, that's the situation I'm in.
Describe the bug
Specifically, the subscript glyphs with Unicode values U+2080–U+2089, which are named zero.subs through nine.subs, have no Unicode value set. This is also inconsistent with the superscript glyphs U+2070–U+2079, which are named uni2070 through uni2079 and do have the correct Unicode values set.
Steps to reproduce the bug
Inspect subscript glyphs U+2080 to U+2089
Expected Behavior
Subscript glyphs U+2080 to U+2089 should have correct unicode values set.
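As a sketch of the expected state, the following Python snippet builds the codepoint-to-glyph-name mapping described above (zero.subs through nine.subs at U+2080–U+2089, per this report) and checks a cmap-style dict for missing subscript codepoints. The function names here are hypothetical; a real cmap dict could be obtained with fontTools (shown only as a comment).

```python
# Sketch: verify that the subscript digits U+2080..U+2089 are mapped.
# A cmap-style dict {codepoint: glyph_name} could be obtained with fontTools:
#   from fontTools.ttLib import TTFont
#   cmap = TTFont("font.ttf").getBestCmap()

DIGIT_NAMES = ["zero", "one", "two", "three", "four",
               "five", "six", "seven", "eight", "nine"]

def expected_subscript_map():
    """Expected codepoint -> glyph-name mapping for the subscript digits."""
    return {0x2080 + i: f"{name}.subs" for i, name in enumerate(DIGIT_NAMES)}

def missing_subscripts(cmap):
    """Return the subscript codepoints U+2080..U+2089 absent from cmap."""
    return [cp for cp in range(0x2080, 0x208A) if cp not in cmap]
```

With the bug present, `missing_subscripts` would report all ten codepoints; once the Unicode values are set, it should return an empty list.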
Related code
No response
Screenshots
From fontdrop.info:
![image](https://user-images.githubusercontent.com/95587932/230953990-cb61e804-beed-40e2-add9-c8e3d5fe553a.png)
![image](https://user-images.githubusercontent.com/95587932/230954171-d658891b-309b-47f1-ab97-6dab5faa40f0.png)
System setup
Additional context
No response