ttf-parser-0.24.1/.cargo_vcs_info.json0000644000000001360000000000100132070ustar { "git": { "sha1": "dd0dcf1fd18d208414377e327c39b56f84c8778d" }, "path_in_vcs": "" }ttf-parser-0.24.1/.github/workflows/main.yml000064400000000000000000000026751046102023000170550ustar 00000000000000name: Rust on: [push, pull_request] env: CARGO_TERM_COLOR: always jobs: build: runs-on: ubuntu-latest strategy: matrix: rust: - 1.51.0 - stable steps: - name: Checkout uses: actions/checkout@v2 - name: Install toolchain uses: actions-rs/toolchain@v1 with: profile: minimal toolchain: ${{ matrix.rust }} override: true - name: Build with no default features run: cargo build --no-default-features --features=no-std-float - name: Build with std run: cargo build --no-default-features --features=std - name: Build with variable-fonts run: cargo build --no-default-features --features=variable-fonts,no-std-float - name: Build with all features run: cargo build --all-features - name: Run tests run: cargo test - name: Build C API working-directory: c-api run: cargo build --no-default-features - name: Build C API with variable-fonts working-directory: c-api run: cargo build --no-default-features --features=variable-fonts - name: Test C API working-directory: c-api run: | cargo build gcc test.c -o test -L./target/debug/ -lttfparser -Werror -fsanitize=address env LD_LIBRARY_PATH=./target/debug/ ./test - name: Build benches working-directory: benches run: cargo bench dummy # `cargo build` will not actually build it ttf-parser-0.24.1/.gitignore000064400000000000000000000000651046102023000137700ustar 00000000000000target Cargo.lock .directory .DS_Store .vscode .idea ttf-parser-0.24.1/CHANGELOG.md000064400000000000000000000501211046102023000136070ustar 00000000000000# Change Log All notable changes to this project will be documented in this file. The format is based on [Keep a Changelog](http://keepachangelog.com/) and this project adheres to [Semantic Versioning](http://semver.org/). 
## [Unreleased] ## [0.24.1] - 2024-08-05 ### Fixed - (`kerx`) `kerx::SubtablesIter` wasn't updating the current subtable index. ## [0.24.0] - 2024-07-02 ### Changed - Make `core_maths` dependency optional. When building for `no_std` one must enable `no-std-float` build feature now. ## [0.23.0] - 2024-07-02 ### Changed - Use `core_maths` instead of `libm`. Should simplify the build process. Thanks to [LaurenzV](https://github.com/LaurenzV). ### Removed - `no-std-float` build flag. Should be handled automatically now. ## [0.22.0] - 2024-06-29 ### Added - `Face::glyph_phantom_points` - `hvar::Table::right_side_bearing_offset`. Thanks to [LaurenzV](https://github.com/LaurenzV). - `vvar::Table::advance_offset`. Thanks to [LaurenzV](https://github.com/LaurenzV). - `vvar::Table::top_side_bearing_offset`. Thanks to [LaurenzV](https://github.com/LaurenzV). - `vvar::Table::bottom_side_bearing_offset`. Thanks to [LaurenzV](https://github.com/LaurenzV). - `vvar::Table::vertical_origin_offset`. Thanks to [LaurenzV](https://github.com/LaurenzV). - `colr::Table::clip_box`. Thanks to [LaurenzV](https://github.com/LaurenzV). ### Changed - `no_std` build of `ttf-parser` requires `--features=no-std-float` now. This is because we need trigonometry functions to flatten transforms in `COLR`. Thanks to [LaurenzV](https://github.com/LaurenzV). - `colr::Painter` no longer has `push_translate`, `push_scale`, `push_rotate` and `push_skew`. Only `push_transform` left. Thanks to [LaurenzV](https://github.com/LaurenzV). - Split `hvar::Table` into `hvar::Table` and `vvar::Table`. Previously, we treated both `HVAR` and `VVAR` tables as identical, but `VVAR` actually has additional fields. Thanks to [LaurenzV](https://github.com/LaurenzV). - Rename `hvar::Table::side_bearing_offset` into `hvar::Table::left_side_bearing_offset`. Thanks to [LaurenzV](https://github.com/LaurenzV). 
### Fixed - `Face::glyph_hor_advance` and `Face::glyph_ver_advance` include `gvar`'s phantom points when `HVAR`/`VVAR` tables are missing. Affects only variable fonts. - (`CFF`) Allow MoveTo with width commands in nested subroutines. - `opentype_layout::LookupFlags::mark_attachment_type` parsing. - (`CFF`) Allow empty charsets in `cff::parse_charset`. Thanks to [LaurenzV](https://github.com/LaurenzV). - (`gvar`) Empty sub-glyphs/components is no longer an error. Thanks to [LaurenzV](https://github.com/LaurenzV). - (`GSUB`/`GPOS`) Allow `NULL` offsets in `ChainedContextLookup` Format2 subtables. Thanks to [LaurenzV](https://github.com/LaurenzV). - `Face::glyph_y_origin` properly handles variable fonts now. Thanks to [LaurenzV](https://github.com/LaurenzV). - (`kerx`) Fix `AnchorPoints` parsing. Thanks to [LaurenzV](https://github.com/LaurenzV). ### Removed - `push_translate`, `push_scale`, `push_rotate` and `push_skew` from `colr::Painter`. Use `colr::Painter::push_transform` instead. Thanks to [LaurenzV](https://github.com/LaurenzV). ## [0.21.1] - 2024-05-11 ### Fixed - Delta set length calculation in variable fonts. Thanks to [LaurenzV](https://github.com/LaurenzV).
Got broken in the previous version. ## [0.21.0] - 2024-05-10 ### Added - `COLR` / `CPAL` v1 support. Thanks to [LaurenzV](https://github.com/LaurenzV). ### Changed - Replace `Face::is_bitmap_embedding_allowed` with `Face::is_outline_embedding_allowed`. The old one had a bool flag flipped. Thanks to [Fuzzyzilla](https://github.com/Fuzzyzilla). - Increase lenience of embed permissions for older OS/2 versions. Thanks to [Fuzzyzilla](https://github.com/Fuzzyzilla). - Bump MSRV to 1.51 ## [0.20.0] - 2023-10-15 ### Added - `COLR` / `CPAL` v0 support. Thanks to [laurmaedje](https://github.com/laurmaedje). ### Changed - `svg::SvgDocumentsList` returns `svg::SvgDocument` and not just `&[u8]` now. Thanks to [wjian23](https://github.com/wjian23). - `Face::set_variation` allows duplicated axes now. ## [0.19.2] - 2023-09-13 ### Added - `cff::Table::glyph_cid` ## [0.19.1] - 2023-06-20 ### Fixed - `cff::Table::glyph_width` returns a correct width when subroutines are present. ## [0.19.0] - 2023-04-17 ### Added - `bdat`, `bloc`, `EBDT` and `EBLC` tables support. Thanks to [dzamkov](https://github.com/dzamkov). - `BitmapMono`, `BitmapMonoPacked`, `BitmapGray2`, `BitmapGray2Packed`, `BitmapGray4`, `BitmapGray4Packed`, `BitmapGray8` and `BitmapPremulBgra32` variants to `RasterImageFormat`. ### Fixed - `CBLC` table parsing. Thanks to [dzamkov](https://github.com/dzamkov). ## [0.18.1] - 2023-01-10 ### Fixed - (`MATH`) Handle NULL offsets. Thanks to [laurmaedje](https://github.com/laurmaedje). ## [0.18.0] - 2022-12-25 ### Added - `Face::permissions` - `Face::is_subsetting_allowed` - `Face::is_bitmap_embedding_allowed` - `Face::unicode_ranges` - `os2::Table::permissions` - `os2::Table::is_subsetting_allowed` - `os2::Table::is_bitmap_embedding_allowed` - `os2::Table::unicode_ranges` - `name::Name::language` - `Language` enum with all Windows languages. ### Changed - Using a non-zero index in `Face::parse` for a regular font will return `FaceParsingError::FaceIndexOutOfBounds` now. 
Thanks to [Pietrek14](https://github.com/Pietrek14). ## [0.17.0] - 2022-09-28 ### Added - `MATH` table support. Thanks to [ruifengx](https://github.com/ruifengx) and [laurmaedje](https://github.com/laurmaedje). ### Fixed - (CFF) Fix large tables parsing. ## [0.16.0] - 2022-09-18 ### Added - CFF Encoding support. - `cff::Table::glyph_index` - `cff::Table::glyph_index_by_name` - `cff::Table::glyph_width` - `cff::Table::number_of_glyphs` - `cff::Table::matrix` - `post::Table::glyph_name` - `post::Table::glyph_index_by_name` - `post::Table::names` - `Face::glyph_index_by_name` - `RawFace` fields and `TableRecord` struct are public now. ### Changed - `Face::from_slice` was replaced by `Face::parse`. - `RawFace::from_slice` was replaced by `RawFace::parse`. - `post::Table::names` is a method and not a field now. - Use `post::Table::glyph_name` instead of `post::Table::names.get()`. ### Fixed - (hmtx/vmtx) Allow missing additional side bearings. - (loca) Allow incomplete table. - Reduce strictness of some table length checks. - (post) `post::Names::len` was returning a wrong value. Now this method is gone completely. You can use `post::Table::names().count()` instead. ## [0.15.2] - 2022-06-17 ### Fixed - Missing advance and side bearing offsets in `HVAR`/`VVAR` is not an error. Simply ignore them. ## [0.15.1] - 2022-06-04 ### Fixed - (cmap) `cmap::Subtable4::glyph_index` correctly handles malformed glyph offsets now. - (cmap) `cmap::Subtable4::codepoints` no longer includes `0xFFFF` codepoint. - (SVG) Fixed table parsing. Thanks to [Shubhamj280](https://github.com/Shubhamj280) ## [0.15.0] - 2022-02-20 ### Added - `apple-layout` build feature. - `ankr`, `feat`, `kerx`, `morx` and `trak` tables. - `kern` AAT subtable format 1. - `RawFace` ### Changed - The `parser` module is private now again. ## [0.14.0] - 2021-12-28 ### Changed - (cmap) `cmap::Subtable::glyph_index` and `cmap::Subtable::glyph_variation_index` accept `u32` instead of `char` now. 
- (glyf) ~7% faster outline parsing. ## [0.13.4] - 2021-11-23 ### Fixed - (CFF) Panic during `seac` resolving. - (CFF) Stack overflow during `seac` resolving. ## [0.13.3] - 2021-11-19 ### Fixed - (glyf) Endless loop during malformed file parsing. ## [0.13.2] - 2021-10-28 ### Added - `gvar-alloc` build feature that unlocks `gvar` table limits by using heap. Thanks to [OrionNebula](https://github.com/OrionNebula) ## [0.13.1] - 2021-10-27 ### Fixed - `Face::line_gap` logic. ## [0.13.0] - 2021-10-24 ### Added - Complete GSUB and GPOS tables support. Available under the `opentype-layout` feature. - Public access to all supported TrueType tables. This allows a low-level, but still safe, access to internal data layout, which can be used for performance optimization, like caching. - `Style` enum and `Face::style` method. - `Face::glyph_name` can be disabled via the `glyph-names` feature to reduce binary size. ### Changed - Improved ascender/descender/line_gap resolving logic. - `Face` methods: `has_glyph_classes`, `glyph_class`, `glyph_mark_attachment_class`, `is_mark_glyph` and `glyph_variation_delta` are moved to `gdef::Table`. - The `Names` struct is no longer an iterator, but a container. You have to call `into_iter()` manually. - The `VariationAxes` struct is no longer an iterator, but a container. You have to call `into_iter()` manually. - Most of the `Name` struct methods become public fields. - `Face::units_per_em` no longer returns `Option`. - (`cmap`) Improved subtable 12 performance. Thanks to [xnuk](https://github.com/xnuk) ### Removed - (c-api) `ttfp_glyph_class`, `ttfp_get_glyph_class`, `ttfp_get_glyph_mark_attachment_class`, `ttfp_is_mark_glyph`, `ttfp_glyph_variation_delta` and `ttfp_has_table`. - `TableName` enum and `Face::has_table`. Tables can be access directly now. - `Face::character_mapping_subtables`. Use `Face::tables().cmap` instead. - `Face::kerning_subtables`. Use `Face::tables().kern` instead. 
### Fixed - `Iterator::count` implementation for `cmap::Subtables`, `name::Names` and `LazyArrayIter32`. ## [0.12.3] - 2021-06-27 ### Changed - (`glyf`) Always use a calculated bbox. ## [0.12.2] - 2021-06-11 ### Fixed - `Face::glyph_bounding_box` for variable `glyf`. - (`glyf`) Do not skip glyphs with zero-sized bbox. ## [0.12.1] - 2021-05-24 ### Added - Support Format 13 subtables in `cmap::Subtable::is_unicode`. Thanks to [csmulhern](https://github.com/csmulhern) - Derive more traits by default. Thanks to [dhardy](https://github.com/dhardy) ## [0.12.0] - 2021-02-14 ### Changed - `Face::ascender` and `Face::descender` will use [usWinAscent](https://docs.microsoft.com/en-us/typography/opentype/spec/os2#uswinascent) and [usWinDescent](https://docs.microsoft.com/en-us/typography/opentype/spec/os2#uswindescent) when `USE_TYPO_METRICS` flag is not set in `OS/2` table. Previously, those values were ignored and [hhea::ascender](https://docs.microsoft.com/en-us/typography/opentype/spec/hhea#ascender) and [hhea::descender](https://docs.microsoft.com/en-us/typography/opentype/spec/hhea#descender) were used. Now `hhea` table values will be used only when `OS/2` table is not present. - `Face::outline_glyph` and `Face::glyph_bounding_box` in case of a `glyf` table can fallback to a calculated bbox when the embedded bbox is malformed now. ## [0.11.0] - 2021-02-04 ### Added - `FaceTables`, which allowed to load `Face` not only from a single chunk of data, but also in a per-table way. Which is useful for WOFF parsing. No changes to the API. Thanks to [fschutt](https://github.com/fschutt) ## [0.10.1] - 2021-01-21 ### Changed - Update a font used for tests. ## [0.10.0] - 2021-01-16 ### Added - `variable-fonts` build feature. Enabled by default. By disabling it you can reduce `ttf-parser` binary size overhead almost twice. ### Changed - (`gvar`) Increase the maximum number of variation tuples from 16 to 32. Increases stack usage and makes `gvar` parsing 10% slower now. 
### Fixed - (`CFF`) Fix `seac` processing. Thanks to [wezm](https://github.com/wezm) ## [0.9.0] - 2020-12-05 ### Removed - `kern` AAT subtable 1 aka `kern::state_machine`. Mainly because it's useless without a proper shaping. ## [0.8.3] - 2020-11-15 ### Added - `Face::glyph_variation_delta` ### Fixed - `Iterator::nth` implementation for `cmap::Subtables` and `Names`. ## [0.8.2] - 2020-07-31 ### Added - `cmap::Subtable::codepoints` ### Fixed - (cmap) Incorrectly returning glyph ID `0` instead of `None` for format 0 - (cmap) Possible invalid glyph mapping for format 2 ## [0.8.1] - 2020-07-29 ### Added - `Face::is_monospaced` - `Face::italic_angle` - `Face::typographic_ascender` - `Face::typographic_descender` - `Face::typographic_line_gap` - `Face::capital_height` ## [0.8.0] - 2020-07-21 ### Added - Allow `true` magic. - `FaceParsingError` - `NormalizedCoordinate` - `Face::variation_coordinates` - `Face::has_non_default_variation_coordinates` - `Face::glyph_name` can lookup CFF names too. - `Face::table_data` - `Face::character_mapping_subtables` ### Changed - (CFF,CFF2) 10% faster parsing. - `Face::from_slice` returns `Result` now. - `Name::platform_id` returns `PlatformId` instead of `Option` now. - The `cmap` module became public. ### Fixed - `Face::width` parsing. - Possible u32 overflow on 32-bit platforms during `Face::from_slice`. - (cmap) `Face::glyph_variation_index` processing when the encoding table has only one glyph. ## [0.7.0] - 2020-07-16 ### Added - (CFF) CID fonts support. - (CFF) `seac` support. - `Font::global_bounding_box` ### Changed - Rename `Font` to `Face`, because this is what it actually is. - Rename `Font::from_data` to `Font::from_slice` to match serde and other libraries. - Rename `Name::name_utf8` to `Name::to_string`. ### Removed - `Font::family_name` and `Font::post_script_name`. They were a bit confusing. 
Prefer: ``` face.names().find(|name| name.name_id() == name_id::FULL_NAME).and_then(|name| name.to_string()) ``` ## [0.6.2] - 2020-07-02 ### Added - `Name::is_unicode` - `Font::family_name` will load names with Windows Symbol encoding now. ### Fixed - `Font::glyph_bounding_box` will apply variation in case of `gvar` fonts. ## [0.6.1] - 2020-05-19 ### Fixed - (`kern`) Support fonts that ignore the subtable size limit. ## [0.6.0] - 2020-05-18 ### Added - `sbix`, `CBLC`, `CBDT` and `SVG` tables support. - `Font::glyph_raster_image` and `Font::glyph_svg_image`. - `Font::kerning_subtables` with subtable formats 0..3 support. ### Changed - (c-api) The library doesn't allocate `ttfp_font` anymore. All allocations should be handled by the caller from now. ### Removed - `Font::glyphs_kerning`. Use `Font::kerning_subtables` instead. - (c-api) `ttfp_create_font` and `ttfp_destroy_font`. Use `ttfp_font_size_of` + `ttfp_font_init` instead. ```c ttfp_font *font = (ttfp_font*)alloca(ttfp_font_size_of()); ttfp_font_init(font_data, font_data_size, 0, font); ``` - Logging support. We haven't used it anyway. ### Fixed - (`gvar`) Integer overflow. - (`cmap`) Integer overflow during subtable format 2 parsing. - (`CFF`, `CFF2`) DICT number parsing. - `Font::glyph_*_advance` will return `None` when glyph ID is larger than the number of metrics in the table. - Ignore variation offset in `Font::glyph_*_advance` and `Font::glyph_*_side_bearing` when `HVAR`/`VVAR` tables are missing. Previously returned `None` which is incorrect. ## [0.5.0] - 2020-03-19 ### Added - Variable fonts support. - C API. - `gvar`, `CFF2`, `avar`, `fvar`, `HVAR`, `VVAR` and `MVAR` tables support. - `Font::variation_axes` - `Font::set_variation` - `Font::is_variable` - `Tag` type. ### Fixed - Multiple issues due to arithmetic overflow. 
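The variable fonts support added in 0.5.0 (`Font::set_variation`, plus `avar`/`fvar` parsing) ultimately maps user-space axis values into normalized `[-1, 1]` design coordinates. Below is a minimal, stdlib-only sketch of the default normalization described by the OpenType `fvar` specification, ignoring the optional `avar` remapping step; `normalize_axis` is a hypothetical helper for illustration, not ttf-parser's actual API or implementation:

```rust
/// Hypothetical sketch of default OpenType axis-value normalization
/// (per the `fvar` spec), without the `avar` segment mapping.
fn normalize_axis(min: f32, default: f32, max: f32, value: f32) -> f32 {
    // Out-of-range user values are clamped to the axis range first.
    let v = value.clamp(min, max);
    if v < default {
        // Map [min, default] onto [-1, 0].
        if default == min { 0.0 } else { -(default - v) / (default - min) }
    } else if v > default {
        // Map [default, max] onto [0, 1].
        if max == default { 0.0 } else { (v - default) / (max - default) }
    } else {
        0.0
    }
}

fn main() {
    // A typical `wght` axis: min 100, default 400, max 900.
    assert_eq!(normalize_axis(100.0, 400.0, 900.0, 400.0), 0.0);
    assert_eq!(normalize_axis(100.0, 400.0, 900.0, 900.0), 1.0);
    assert_eq!(normalize_axis(100.0, 400.0, 900.0, 100.0), -1.0);
    assert_eq!(normalize_axis(100.0, 400.0, 900.0, 650.0), 0.5);
}
```

The clamp-then-normalize order matters: a user value of 1000 on the axis above normalizes to exactly 1.0 rather than overshooting.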
## [0.4.0] - 2020-02-24 **A major rewrite.** ### Added - `Font::glyph_bounding_box` - `Font::glyph_name` - `Font::has_glyph_classes` - `Font::glyph_class` - `Font::glyph_mark_attachment_class` - `Font::is_mark_glyph` - `Font::glyph_y_origin` - `Font::vertical_ascender` - `Font::vertical_descender` - `Font::vertical_height` - `Font::vertical_line_gap` - Optional `log` dependency. ### Changed - `Font::outline_glyph` now accepts `&mut dyn OutlineBuilder` and not `&mut impl OutlineBuilder`. - `Font::ascender`, `Font::descender` and `Font::line_gap` will check `USE_TYPO_METRICS` flag in OS/2 table now. - `glyph_hor_metrics` was split into `glyph_hor_advance` and `glyph_hor_side_bearing`. - `glyph_ver_metrics` was split into `glyph_ver_advance` and `glyph_ver_side_bearing`. - `CFFError` is no longer public. ### Removed - `Error` enum. All methods will return `Option` now. - All `unsafe`. ### Fixed - `glyph_hor_side_bearing` parsing when the number of metrics is less than the total number of glyphs. - Multiple CFF parsing fixes. The parser is more strict now. ## [0.3.0] - 2019-09-26 ### Added - `no_std` compatibility. ### Changed - The library has one `unsafe` block now. - 35% faster `family_name()` method. - 25% faster `from_data()` method for TrueType fonts. - The `Name` struct has a new API. Public fields became public functions and data is parsed on demand and not beforehand. ## [0.2.2] - 2019-08-12 ### Fixed - Allow format 12 subtables with *Unicode full repertoire* in `cmap`. ## [0.2.1] - 2019-08-12 ### Fixed - Check that `cmap` subtable encoding is Unicode. ## [0.2.0] - 2019-07-10 ### Added - CFF support. - Basic kerning support. - All `cmap` subtable formats except Mixed Coverage (8) are supported. - Vertical metrics querying from the `vmtx` table. - OpenType fonts are allowed now. ### Changed - A major rewrite. TrueType tables are no longer public. - Use `GlyphId` instead of `u16`. ### Removed - `GDEF` table parsing. 
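Several entries above revolve around the `OutlineBuilder` callback interface (notably 0.4.0, where `Font::outline_glyph` started taking `&mut dyn OutlineBuilder`): the parser drives a builder that the caller implements. The following self-contained sketch mirrors that pattern with a stand-in trait and an SVG-path collector; the real trait lives in ttf-parser and also has `quad_to`/`curve_to` callbacks, so this is an illustration of the shape of the API, not the crate's definition:

```rust
use std::fmt::Write;

/// Stand-in for ttf-parser's `OutlineBuilder` callback trait (simplified:
/// the real one also has `quad_to` and `curve_to`).
trait OutlineBuilder {
    fn move_to(&mut self, x: f32, y: f32);
    fn line_to(&mut self, x: f32, y: f32);
    fn close(&mut self);
}

/// Collects the callbacks into an SVG path string, a common consumer.
#[derive(Default)]
struct SvgPath(String);

impl OutlineBuilder for SvgPath {
    fn move_to(&mut self, x: f32, y: f32) { write!(self.0, "M {} {} ", x, y).unwrap(); }
    fn line_to(&mut self, x: f32, y: f32) { write!(self.0, "L {} {} ", x, y).unwrap(); }
    fn close(&mut self) { self.0.push('Z'); }
}

fn main() {
    let mut path = SvgPath::default();
    // A real caller would pass `&mut path` to `Face::outline_glyph` and let
    // the parser invoke the callbacks; here we drive them by hand.
    path.move_to(10.0, 0.0);
    path.line_to(10.0, 20.0);
    path.close();
    assert_eq!(path.0, "M 10 0 L 10 20 Z");
}
```

Taking `&mut dyn OutlineBuilder` (rather than `impl OutlineBuilder`) keeps `outline_glyph` non-generic, which avoids monomorphization bloat for a method every consumer calls.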
[Unreleased]: https://github.com/RazrFalcon/ttf-parser/compare/v0.24.1...HEAD [0.24.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.24.0...v0.24.1 [0.24.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.23.0...v0.24.0 [0.23.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.22.0...v0.23.0 [0.22.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.21.1...v0.22.0 [0.21.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.21.0...v0.21.1 [0.21.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.20.0...v0.21.0 [0.20.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.19.2...v0.20.0 [0.19.2]: https://github.com/RazrFalcon/ttf-parser/compare/v0.19.1...v0.19.2 [0.19.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.19.0...v0.19.1 [0.19.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.18.1...v0.19.0 [0.18.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.18.0...v0.18.1 [0.18.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.17.0...v0.18.0 [0.17.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.16.0...v0.17.0 [0.16.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.15.2...v0.16.0 [0.15.2]: https://github.com/RazrFalcon/ttf-parser/compare/v0.15.1...v0.15.2 [0.15.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.15.0...v0.15.1 [0.15.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.14.0...v0.15.0 [0.14.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.13.4...v0.14.0 [0.13.4]: https://github.com/RazrFalcon/ttf-parser/compare/v0.13.3...v0.13.4 [0.13.3]: https://github.com/RazrFalcon/ttf-parser/compare/v0.13.2...v0.13.3 [0.13.2]: https://github.com/RazrFalcon/ttf-parser/compare/v0.13.1...v0.13.2 [0.13.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.13.0...v0.13.1 [0.13.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.12.3...v0.13.0 [0.12.3]: https://github.com/RazrFalcon/ttf-parser/compare/v0.12.2...v0.12.3 [0.12.2]: 
https://github.com/RazrFalcon/ttf-parser/compare/v0.12.1...v0.12.2 [0.12.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.12.0...v0.12.1 [0.12.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.11.0...v0.12.0 [0.11.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.10.1...v0.11.0 [0.10.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.10.0...v0.10.1 [0.10.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.9.0...v0.10.0 [0.9.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.8.3...v0.9.0 [0.8.3]: https://github.com/RazrFalcon/ttf-parser/compare/v0.8.2...v0.8.3 [0.8.2]: https://github.com/RazrFalcon/ttf-parser/compare/v0.8.1...v0.8.2 [0.8.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.8.0...v0.8.1 [0.8.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.7.0...v0.8.0 [0.7.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.6.2...v0.7.0 [0.6.2]: https://github.com/RazrFalcon/ttf-parser/compare/v0.6.1...v0.6.2 [0.6.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.6.0...v0.6.1 [0.6.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.5.0...v0.6.0 [0.5.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.4.0...v0.5.0 [0.4.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.3.0...v0.4.0 [0.3.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.2.2...v0.3.0 [0.2.2]: https://github.com/RazrFalcon/ttf-parser/compare/v0.2.1...v0.2.2 [0.2.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.2.0...v0.2.1 [0.2.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.1.0...v0.2.0 ttf-parser-0.24.1/Cargo.lock0000644000000040300000000000100111570ustar # This file is automatically @generated by Cargo. # It is not intended for manual editing. 
version = 3 [[package]] name = "arrayref" version = "0.3.7" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "6b4930d2cb77ce62f89ee5d5289b4ac049559b1c45539271f5ed4fdc7db34545" [[package]] name = "base64" version = "0.22.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "72b3254f16251a8381aa12e40e3c4d2f0199f8c6508fbecb9d91f575e0fbb8c6" [[package]] name = "bytemuck" version = "1.15.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "5d6d68c57235a3a081186990eca2867354726650f42f7516ca50c28d6281fd15" [[package]] name = "core_maths" version = "0.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "e3b02505ccb8c50b0aa21ace0fc08c3e53adebd4e58caa18a36152803c7709a3" dependencies = [ "libm", ] [[package]] name = "libm" version = "0.2.8" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "4ec2a862134d2a7d32d7983ddcdd1c4923530833c9f2ea1a44fc5fa473989058" [[package]] name = "pico-args" version = "0.5.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "5be167a7af36ee22fe3115051bc51f6e6c7054c9348e28deb4f49bd6f705a315" [[package]] name = "strict-num" version = "0.1.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "6637bab7722d379c8b41ba849228d680cc12d0a45ba1fa2b48f2a30577a06731" [[package]] name = "tiny-skia-path" version = "0.11.4" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "9c9e7fc0c2e86a30b117d0462aa261b72b7a99b7ebd7deb3a14ceda95c5bdc93" dependencies = [ "arrayref", "bytemuck", "strict-num", ] [[package]] name = "ttf-parser" version = "0.24.1" dependencies = [ "base64", "core_maths", "pico-args", "tiny-skia-path", "xmlwriter", ] [[package]] name = "xmlwriter" version = "0.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "ec7a2a501ed189703dba8b08142f057e887dfc4b2cc4db2d343ac6376ba3e0b9" 
ttf-parser-0.24.1/Cargo.toml0000644000000026760000000000100112200ustar # THIS FILE IS AUTOMATICALLY GENERATED BY CARGO # # When uploading crates to the registry Cargo will automatically # "normalize" Cargo.toml files for maximal compatibility # with all versions of Cargo and also rewrite `path` dependencies # to registry (e.g., crates.io) dependencies. # # If you are reading this file be aware that the original Cargo.toml # will likely look very different (and much more reasonable). # See Cargo.toml.orig for the original contents. [package] edition = "2018" name = "ttf-parser" version = "0.24.1" authors = ["Yevhenii Reizner "] exclude = ["benches/**"] description = "A high-level, safe, zero-allocation font parser for TrueType, OpenType, and AAT." documentation = "https://docs.rs/ttf-parser/" readme = "README.md" keywords = [ "ttf", "truetype", "opentype", ] categories = ["parser-implementations"] license = "MIT OR Apache-2.0" repository = "https://github.com/RazrFalcon/ttf-parser" [dependencies.core_maths] version = "0.1.0" optional = true [dev-dependencies.base64] version = "0.22.1" [dev-dependencies.pico-args] version = "0.5" [dev-dependencies.tiny-skia-path] version = "0.11.4" [dev-dependencies.xmlwriter] version = "0.1" [features] apple-layout = [] default = [ "std", "opentype-layout", "apple-layout", "variable-fonts", "glyph-names", ] glyph-names = [] gvar-alloc = ["std"] no-std-float = ["core_maths"] opentype-layout = [] std = [] variable-fonts = [] ttf-parser-0.24.1/Cargo.toml.orig0000644000000036550000000000100121550ustar [package] name = "ttf-parser" version = "0.24.1" authors = ["Yevhenii Reizner "] keywords = ["ttf", "truetype", "opentype"] categories = ["parser-implementations"] license = "MIT OR Apache-2.0" description = "A high-level, safe, zero-allocation font parser for TrueType, OpenType, and AAT." 
repository = "https://github.com/RazrFalcon/ttf-parser" documentation = "https://docs.rs/ttf-parser/" readme = "README.md" edition = "2018" exclude = ["benches/**"] [dependencies] core_maths = { version = "0.1.0", optional = true } # only for no_std builds [features] default = ["std", "opentype-layout", "apple-layout", "variable-fonts", "glyph-names"] # Enables the use of the standard library. # When disabled, the `no-std-float` feature must be enabled instead. std = [] no-std-float = ["core_maths"] # Enables variable fonts support. Increases binary size almost twice. # Includes avar, CFF2, fvar, gvar, HVAR, MVAR and VVAR tables. variable-fonts = [] # Enables GDEF, GPOS, GSUB and MATH tables. opentype-layout = [] # Enables ankr, feat, format1 subtable in kern, kerx, morx and trak tables. apple-layout = [] # Enables glyph name query via `Face::glyph_name`. # TrueType fonts do not store default glyph names, to reduce file size, # which means we have to store them in ttf-parser. And there are almost 500 of them. # By disabling this feature a user can reduce binary size a bit. glyph-names = [] # Enables heap allocations during gvar table parsing used by Apple's variable fonts. # Due to the way gvar table is structured, we cannot avoid allocations. # By default, only up to 32 variable tuples will be allocated on the stack, # while the spec allows up to 4095. Most variable fonts use 10-20 tuples, # so our limit is suitable for most of the cases. But if you need full support, you have to # enable this feature. 
gvar-alloc = ["std"] [dev-dependencies] base64 = "0.22.1" pico-args = "0.5" tiny-skia-path = "0.11.4" xmlwriter = "0.1" ttf-parser-0.24.1/LICENSE-APACHE000064400000000000000000000251371046102023000137300ustar 00000000000000 Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. 
Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the 
Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. 
Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS APPENDIX: How to apply the Apache License to your work. 
To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives. Copyright [yyyy] [name of copyright owner] Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. ttf-parser-0.24.1/LICENSE-MIT000064400000000000000000000020451046102023000134340ustar 00000000000000Copyright (c) 2018 Yevhenii Reizner Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ttf-parser-0.24.1/README.md000064400000000000000000000273271046102023000132670ustar 00000000000000## ttf-parser

![Build Status](https://github.com/RazrFalcon/ttf-parser/workflows/Rust/badge.svg)
[![Crates.io](https://img.shields.io/crates/v/ttf-parser.svg)](https://crates.io/crates/ttf-parser)
[![Documentation](https://docs.rs/ttf-parser/badge.svg)](https://docs.rs/ttf-parser)
[![Rust 1.51+](https://img.shields.io/badge/rust-1.51+-orange.svg)](https://www.rust-lang.org)
![](https://img.shields.io/badge/unsafe-forbidden-brightgreen.svg)

A high-level, safe, zero-allocation font parser for [TrueType](https://docs.microsoft.com/en-us/typography/truetype/), [OpenType](https://docs.microsoft.com/en-us/typography/opentype/spec/), and [AAT](https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6AATIntro.html).

Can be used as a Rust or C library.

### Features

- A high-level API for most common properties, hiding all parsing and data resolving logic.
- A low-level, but safe API to access TrueType table data.
- Highly configurable. You can disable most of the features, reducing binary size. You can also parse TrueType tables separately, without loading the whole font/face.
- Zero heap allocations.
- Zero unsafe.
- Zero dependencies.
- `no_std`/WASM compatible.
- A basic [C API](./c-api).
- Fast.
- Stateless. All parsing methods are immutable.
- Simple and maintainable code (no magic numbers).

### Safety

- The library must not panic. Any panic is considered a critical bug and should be reported.
- The library forbids unsafe code.
- No heap allocations, so a crash due to OOM is not possible.
- All recursive methods have a depth limit.
- Technically, it should use less than 64KiB of stack in the worst-case scenario.
- Most arithmetic operations are checked.
- Most numeric casts are checked.

### Alternatives

It's very hard to compare different libraries, so we are using a table-based comparison. There are roughly three types of TrueType tables:

- A table with a list of properties (like `head`, `OS/2`, etc.).
  If a library tries to parse it at all, we mark it as supported.
- A table that contains a single type of data (`glyf`, `CFF` (kinda), `hmtx`, etc.).
Can only be supported or not. - A table that contains multiple subtables (`cmap`, `kern`, `GPOS`, etc.).
Can be partially supported and we note which subtables are actually supported. | Feature/Library | ttf-parser | FreeType | stb_truetype | | ----------------- | :--------------------: | :-----------------: | :----------------------------: | | Memory safe | ✓ | | | | Thread safe | ✓ | | ~ (mostly reentrant) | | Zero allocation | ✓ | | | | Variable fonts | ✓ | ✓ | | | Rendering | -1 | ✓ | ~ (very primitive) | | `ankr` table | ✓ | | | | `avar` table | ✓ | ✓ | | | `bdat` table | ~ (no 4) | ✓ | | | `bloc` table | ✓ | ✓ | | | `CBDT` table | ~ (no 8, 9) | ✓ | | | `CBLC` table | ✓ | ✓ | | | `COLR` table | ✓ | ✓ | | | `CPAL` table | ✓ | ✓ | | | `CFF ` table | ✓ | ✓ | ~ (no `seac` support) | | `CFF2` table | ✓ | ✓ | | | `cmap` table | ~ (no 8) | ✓ | ~ (no 2,8,10,14; Unicode-only) | | `EBDT` table | ~ (no 8, 9) | ✓ | | | `EBLC` table | ✓ | ✓ | | | `feat` table | ✓ | | | | `fvar` table | ✓ | ✓ | | | `gasp` table | | ✓ | | | `GDEF` table | ~ | | | | `glyf` table | ~2 | ✓ | ~2 | | `GPOS` table | ✓ | | ~ (only 2) | | `GSUB` table | ✓ | | | | `gvar` table | ✓ | ✓ | | | `head` table | ✓ | ✓ | ✓ | | `hhea` table | ✓ | ✓ | ✓ | | `hmtx` table | ✓ | ✓ | ✓ | | `HVAR` table | ✓ | ✓ | | | `kern` table | ✓ | ~ (only 0) | ~ (only 0) | | `kerx` table | ✓ | | | | `MATH` table | ✓ | | | | `maxp` table | ✓ | ✓ | ✓ | | `morx` table | ✓ | | | | `MVAR` table | ✓ | ✓ | | | `name` table | ✓ | ✓ | | | `OS/2` table | ✓ | ✓ | | | `post` table | ✓ | ✓ | | | `sbix` table | ~ (PNG only) | ~ (PNG only) | | | `SVG ` table | ✓ | ✓ | ✓ | | `trak` table | ✓ | | | | `vhea` table | ✓ | ✓ | | | `vmtx` table | ✓ | ✓ | | | `VORG` table | ✓ | ✓ | | | `VVAR` table | ✓ | ✓ | | | Language | Rust + C API | C | C | | Tested version | 0.17.0 | 2.12.0 | 1.24 | | License | MIT / Apache-2.0 | FTL / GPLv2 | public domain | Legend: - ✓ - supported - ~ - partial - *nothing* - not supported Notes: 1. 
   While `ttf-parser` doesn't support rendering by itself, there are multiple rendering libraries on top of it: [rusttype](https://gitlab.redox-os.org/redox-os/rusttype), [ab-glyph](https://github.com/alexheretic/ab-glyph) and [fontdue](https://github.com/mooman219/fontdue).
2. Matching points are not supported.

### Performance

TrueType fonts are designed for fast querying, so most of the methods are very fast. The main exception is glyph outlining. Glyphs can be stored using two different methods: the [Glyph Data Format](https://docs.microsoft.com/en-us/typography/opentype/spec/glyf) and the [Compact Font Format](http://wwwimages.adobe.com/content/dam/Adobe/en/devnet/font/pdfs/5176.CFF.pdf) (PDF). The first one is fairly simple, which makes it faster to process. The second one is basically a tiny language with a stack-based VM, which makes it way harder to process.

The [benchmark](./benches/outline/) tests how long it takes to outline all glyphs in a font.

x86 (AMD 3700X)

| Table/Library | ttf-parser     | FreeType   | stb_truetype   |
| ------------- | -------------: | ---------: | -------------: |
| `glyf`        | `0.901 ms`     | `1.171 ms` | **`0.675 ms`** |
| `gvar`        | **`2.972 ms`** | `4.132 ms` | -              |
| `CFF`         | **`1.197 ms`** | `5.647 ms` | `2.813 ms`     |
| `CFF2`        | **`1.968 ms`** | `6.392 ms` | -              |

ARM (Apple M1)

| Table/Library | ttf-parser     | FreeType   | stb_truetype |
| ------------- | -------------: | ---------: | -----------: |
| `glyf`        | **`0.550 ms`** | `0.854 ms` | `0.703 ms`   |
| `gvar`        | **`2.270 ms`** | `4.594 ms` | -            |
| `CFF`         | **`1.054 ms`** | `5.223 ms` | `3.262 ms`   |
| `CFF2`        | **`1.765 ms`** | `5.995 ms` | -            |

**Note:** FreeType is surprisingly slow, so I'm worried that I've messed something up.

And here are some methods benchmarks:

```text
test outline_glyph_276_from_cff2 ... bench: 867 ns/iter (+/- 15)
test from_data_otf_cff ... bench: 968 ns/iter (+/- 13)
test from_data_otf_cff2 ... bench: 887 ns/iter (+/- 25)
test outline_glyph_276_from_cff ...
bench: 678 ns/iter (+/- 41) test outline_glyph_276_from_glyf ... bench: 649 ns/iter (+/- 11) test outline_glyph_8_from_cff2 ... bench: 534 ns/iter (+/- 14) test from_data_ttf ... bench: 467 ns/iter (+/- 11) test glyph_name_post_276 ... bench: 223 ns/iter (+/- 5) test outline_glyph_8_from_cff ... bench: 315 ns/iter (+/- 13) test outline_glyph_8_from_glyf ... bench: 291 ns/iter (+/- 5) test family_name ... bench: 183 ns/iter (+/- 102) test glyph_name_cff_276 ... bench: 62 ns/iter (+/- 1) test glyph_index_u41 ... bench: 16 ns/iter (+/- 0) test glyph_name_cff_8 ... bench: 5 ns/iter (+/- 0) test glyph_name_post_8 ... bench: 2 ns/iter (+/- 0) test subscript_metrics ... bench: 2 ns/iter (+/- 0) test glyph_hor_advance ... bench: 2 ns/iter (+/- 0) test glyph_hor_side_bearing ... bench: 2 ns/iter (+/- 0) test glyph_name_8 ... bench: 1 ns/iter (+/- 0) test ascender ... bench: 1 ns/iter (+/- 0) test underline_metrics ... bench: 1 ns/iter (+/- 0) test strikeout_metrics ... bench: 1 ns/iter (+/- 0) test x_height ... bench: 1 ns/iter (+/- 0) test units_per_em ... bench: 0.5 ns/iter (+/- 0) test width ... bench: 0.2 ns/iter (+/- 0) ``` ### License Licensed under either of - Apache License, Version 2.0 ([LICENSE-APACHE](LICENSE-APACHE) or http://www.apache.org/licenses/LICENSE-2.0) - MIT license ([LICENSE-MIT](LICENSE-MIT) or http://opensource.org/licenses/MIT) at your option. ### Contribution Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions. 
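The "zero heap allocations" and "no panics" claims above come down to reading big-endian values straight out of the input slice with checked bounds. As a rough, self-contained illustration of the idea (this is *not* the crate's actual internals; `read_u16`, `read_u32`, and `sfnt_header` are made-up names), here is how the sfnt header at the start of a font file — a 4-byte version tag followed by a `u16` table count — can be parsed without allocating or panicking on truncated input:

```rust
// Minimal sketch of zero-allocation, bounds-checked big-endian parsing,
// in the spirit of ttf-parser (illustrative only, not its real code).

fn read_u16(data: &[u8], offset: usize) -> Option<u16> {
    // Checked slicing instead of indexing: malformed input yields None, not a panic.
    let bytes = data.get(offset..offset + 2)?;
    Some(u16::from_be_bytes([bytes[0], bytes[1]]))
}

fn read_u32(data: &[u8], offset: usize) -> Option<u32> {
    let bytes = data.get(offset..offset + 4)?;
    Some(u32::from_be_bytes([bytes[0], bytes[1], bytes[2], bytes[3]]))
}

/// Reads the sfnt header: returns (version, number of tables).
fn sfnt_header(data: &[u8]) -> Option<(u32, u16)> {
    let version = read_u32(data, 0)?;
    // 0x00010000 = TrueType outlines, 0x4F54544F = b"OTTO" (CFF outlines).
    if version != 0x0001_0000 && version != 0x4F54_544F {
        return None;
    }
    let num_tables = read_u16(data, 4)?;
    Some((version, num_tables))
}

fn main() {
    // A fake 6-byte header: version 1.0, 12 tables.
    let data = [0x00, 0x01, 0x00, 0x00, 0x00, 0x0C];
    assert_eq!(sfnt_header(&data), Some((0x0001_0000, 12)));
    // Truncated input is rejected instead of panicking.
    assert_eq!(sfnt_header(&data[..3]), None);
    println!("ok");
}
```

Because every read returns `Option` and all slicing goes through `get`, a malformed font can never index out of bounds — errors simply propagate up via `?`, which is what makes the "must not panic" guarantee tractable.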
ttf-parser-0.24.1/examples/font-info.rs000064400000000000000000000071721046102023000160710ustar 00000000000000fn main() { let args: Vec<_> = std::env::args().collect(); if args.len() != 2 { println!("Usage:\n\tfont-info font.ttf"); std::process::exit(1); } let font_data = std::fs::read(&args[1]).unwrap(); let now = std::time::Instant::now(); let face = match ttf_parser::Face::parse(&font_data, 0) { Ok(f) => f, Err(e) => { eprint!("Error: {}.", e); std::process::exit(1); } }; let mut family_names = Vec::new(); for name in face.names() { if name.name_id == ttf_parser::name_id::FULL_NAME && name.is_unicode() { if let Some(family_name) = name.to_string() { let language = name.language(); family_names.push(format!( "{} ({}, {})", family_name, language.primary_language(), language.region() )); } } } let post_script_name = face .names() .into_iter() .find(|name| name.name_id == ttf_parser::name_id::POST_SCRIPT_NAME && name.is_unicode()) .and_then(|name| name.to_string()); println!("Family names: {:?}", family_names); println!("PostScript name: {:?}", post_script_name); println!("Units per EM: {:?}", face.units_per_em()); println!("Ascender: {}", face.ascender()); println!("Descender: {}", face.descender()); println!("Line gap: {}", face.line_gap()); println!("Global bbox: {:?}", face.global_bounding_box()); println!("Number of glyphs: {}", face.number_of_glyphs()); println!("Underline: {:?}", face.underline_metrics()); println!("X height: {:?}", face.x_height()); println!("Weight: {:?}", face.weight()); println!("Width: {:?}", face.width()); println!("Regular: {}", face.is_regular()); println!("Italic: {}", face.is_italic()); println!("Bold: {}", face.is_bold()); println!("Oblique: {}", face.is_oblique()); println!("Strikeout: {:?}", face.strikeout_metrics()); println!("Subscript: {:?}", face.subscript_metrics()); println!("Superscript: {:?}", face.superscript_metrics()); println!("Permissions: {:?}", face.permissions()); println!("Variable: {:?}", face.is_variable()); 
    #[cfg(feature = "opentype-layout")]
    {
        if let Some(ref table) = face.tables().gpos {
            print_opentype_layout("positioning", table);
        }

        if let Some(ref table) = face.tables().gsub {
            print_opentype_layout("substitution", table);
        }
    }

    #[cfg(feature = "variable-fonts")]
    {
        if face.is_variable() {
            println!("Variation axes:");
            for axis in face.variation_axes() {
                println!(
                    "    {} {}..{}, default {}",
                    axis.tag, axis.min_value, axis.max_value, axis.def_value
                );
            }
        }
    }

    println!("Elapsed: {}us", now.elapsed().as_micros());
}

fn print_opentype_layout(name: &str, table: &ttf_parser::opentype_layout::LayoutTable) {
    println!("OpenType {}:", name);
    println!("  Scripts:");
    for script in table.scripts {
        println!("    {}", script.tag);
        if script.languages.is_empty() {
            println!("      No languages");
            continue;
        }

        println!("      Languages:");
        for lang in script.languages {
            println!("        {}", lang.tag);
        }
    }

    let mut features: Vec<_> = table.features.into_iter().map(|f| f.tag).collect();
    features.dedup();
    println!("  Features:");
    for feature in features {
        println!("    {}", feature);
    }
}
ttf-parser-0.24.1/examples/font2svg.rs000064400000000000000000000521441046102023000157410ustar 00000000000000#![allow(clippy::too_many_arguments)]

use base64::engine::general_purpose::STANDARD;
use std::io::Write;
use std::path::PathBuf;
use ttf_parser as ttf;
use ttf_parser::colr::{ClipBox, Paint};
use ttf_parser::{RgbaColor, Transform};

const FONT_SIZE: f64 = 128.0;
const COLUMNS: u32 = 100;

const HELP: &str = "\
Usage:
    font2svg font.ttf out.svg
    font2svg --variations 'wght:500;wdth:200' font.ttf out.svg
    font2svg --colr-palette 1 colr-font.ttf out.svg
";

struct Args {
    #[allow(dead_code)]
    variations: Vec<ttf::Variation>,
    colr_palette: u16,
    ttf_path: PathBuf,
    svg_path: PathBuf,
}

fn main() {
    let args = match parse_args() {
        Ok(v) => v,
        Err(e) => {
            eprintln!("Error: {}.", e);
            print!("{}", HELP);
            std::process::exit(1);
        }
    };

    if let Err(e) = process(args) {
        eprintln!("Error: {}.", e);
        std::process::exit(1);
    }
}

fn parse_args() -> Result<Args, Box<dyn std::error::Error>> {
    let mut args =
pico_args::Arguments::from_env();

    if args.contains(["-h", "--help"]) {
        print!("{}", HELP);
        std::process::exit(0);
    }

    let variations = args.opt_value_from_fn("--variations", parse_variations)?;
    let colr_palette: u16 = args.opt_value_from_str("--colr-palette")?.unwrap_or(0);
    let free = args.finish();
    if free.len() != 2 {
        return Err("invalid number of arguments".into());
    }

    Ok(Args {
        variations: variations.unwrap_or_default(),
        colr_palette,
        ttf_path: PathBuf::from(&free[0]),
        svg_path: PathBuf::from(&free[1]),
    })
}

fn parse_variations(s: &str) -> Result<Vec<ttf::Variation>, &'static str> {
    let mut variations = Vec::new();
    for part in s.split(';') {
        let mut iter = part.split(':');

        let axis = iter.next().ok_or("failed to parse a variation")?;
        let axis = ttf::Tag::from_bytes_lossy(axis.as_bytes());

        let value = iter.next().ok_or("failed to parse a variation")?;
        let value: f32 = value.parse().map_err(|_| "failed to parse a variation")?;

        variations.push(ttf::Variation { axis, value });
    }

    Ok(variations)
}

fn process(args: Args) -> Result<(), Box<dyn std::error::Error>> {
    let font_data = std::fs::read(&args.ttf_path)?;

    // Exclude IO operations.
let now = std::time::Instant::now(); #[allow(unused_mut)] let mut face = ttf::Face::parse(&font_data, 0)?; if face.is_variable() { #[cfg(feature = "variable-fonts")] { for variation in args.variations { face.set_variation(variation.axis, variation.value) .ok_or("failed to create variation coordinates")?; } } } if face.tables().colr.is_some() { if let Some(total) = face.color_palettes() { if args.colr_palette >= total.get() { return Err(format!("only {} palettes are available", total).into()); } } } let num_glyphs = face.number_of_glyphs(); let units_per_em = face.units_per_em(); let scale = FONT_SIZE / units_per_em as f64; let cell_size = face.height() as f64 * FONT_SIZE / units_per_em as f64; let rows = (num_glyphs as f64 / COLUMNS as f64).ceil() as u32; let mut svg = xmlwriter::XmlWriter::with_capacity( num_glyphs as usize * 512, xmlwriter::Options::default(), ); svg.start_element("svg"); svg.write_attribute("xmlns", "http://www.w3.org/2000/svg"); svg.write_attribute("xmlns:xlink", "http://www.w3.org/1999/xlink"); svg.write_attribute_fmt( "viewBox", format_args!( "{} {} {} {}", 0, 0, cell_size * COLUMNS as f64, cell_size * rows as f64 ), ); draw_grid(num_glyphs, cell_size, &mut svg); let mut path_buf = String::with_capacity(256); let mut row = 0; let mut column = 0; let mut gradient_index = 1; let mut clip_path_index = 1; for id in 0..num_glyphs { let gid = ttf::GlyphId(id); let x = column as f64 * cell_size; let y = row as f64 * cell_size; svg.start_element("text"); svg.write_attribute("x", &(x + 2.0)); svg.write_attribute("y", &(y + cell_size - 4.0)); svg.write_attribute("font-size", "36"); svg.write_attribute("fill", "gray"); svg.write_text_fmt(format_args!("{}", &id)); svg.end_element(); if face.is_color_glyph(gid) { color_glyph( x, y, &face, args.colr_palette, gid, cell_size, scale, &mut gradient_index, &mut clip_path_index, &mut svg, &mut path_buf, ); } else if let Some(img) = face.glyph_raster_image(gid, u16::MAX) { svg.start_element("image"); 
svg.write_attribute("x", &(x + 2.0 + img.x as f64)); svg.write_attribute("y", &(y - img.y as f64)); svg.write_attribute("width", &img.width); svg.write_attribute("height", &img.height); svg.write_attribute_raw("xlink:href", |buf| { buf.extend_from_slice(b"data:image/png;base64, "); let mut enc = base64::write::EncoderWriter::new(buf, &STANDARD); enc.write_all(img.data).unwrap(); enc.finish().unwrap(); }); svg.end_element(); } else if let Some(img) = face.glyph_svg_image(gid) { svg.start_element("image"); svg.write_attribute("x", &(x + 2.0)); svg.write_attribute("y", &(y + cell_size)); svg.write_attribute("width", &cell_size); svg.write_attribute("height", &cell_size); svg.write_attribute_raw("xlink:href", |buf| { buf.extend_from_slice(b"data:image/svg+xml;base64, "); let mut enc = base64::write::EncoderWriter::new(buf, &STANDARD); enc.write_all(img.data).unwrap(); enc.finish().unwrap(); }); svg.end_element(); } else { glyph_to_path(x, y, &face, gid, cell_size, scale, &mut svg, &mut path_buf); } column += 1; if column == COLUMNS { column = 0; row += 1; } } println!("Elapsed: {}ms", now.elapsed().as_micros() as f64 / 1000.0); std::fs::write(&args.svg_path, svg.end_document())?; Ok(()) } fn draw_grid(n_glyphs: u16, cell_size: f64, svg: &mut xmlwriter::XmlWriter) { let columns = COLUMNS; let rows = (n_glyphs as f64 / columns as f64).ceil() as u32; let width = columns as f64 * cell_size; let height = rows as f64 * cell_size; svg.start_element("path"); svg.write_attribute("fill", "none"); svg.write_attribute("stroke", "black"); svg.write_attribute("stroke-width", "5"); let mut path = String::with_capacity(256); use std::fmt::Write; let mut x = 0.0; for _ in 0..=columns { write!(&mut path, "M {} {} L {} {} ", x, 0.0, x, height).unwrap(); x += cell_size; } let mut y = 0.0; for _ in 0..=rows { write!(&mut path, "M {} {} L {} {} ", 0.0, y, width, y).unwrap(); y += cell_size; } path.pop(); svg.write_attribute("d", &path); svg.end_element(); } struct Builder<'a>(&'a mut 
String); impl Builder<'_> { fn finish(&mut self) { if !self.0.is_empty() { self.0.pop(); // remove trailing space } } } impl ttf::OutlineBuilder for Builder<'_> { fn move_to(&mut self, x: f32, y: f32) { use std::fmt::Write; write!(self.0, "M {} {} ", x, y).unwrap() } fn line_to(&mut self, x: f32, y: f32) { use std::fmt::Write; write!(self.0, "L {} {} ", x, y).unwrap() } fn quad_to(&mut self, x1: f32, y1: f32, x: f32, y: f32) { use std::fmt::Write; write!(self.0, "Q {} {} {} {} ", x1, y1, x, y).unwrap() } fn curve_to(&mut self, x1: f32, y1: f32, x2: f32, y2: f32, x: f32, y: f32) { use std::fmt::Write; write!(self.0, "C {} {} {} {} {} {} ", x1, y1, x2, y2, x, y).unwrap() } fn close(&mut self) { self.0.push_str("Z ") } } fn glyph_to_path( x: f64, y: f64, face: &ttf::Face, glyph_id: ttf::GlyphId, cell_size: f64, scale: f64, svg: &mut xmlwriter::XmlWriter, path_buf: &mut String, ) { path_buf.clear(); let mut builder = Builder(path_buf); let bbox = match face.outline_glyph(glyph_id, &mut builder) { Some(v) => v, None => return, }; builder.finish(); let bbox_w = (bbox.x_max as f64 - bbox.x_min as f64) * scale; let dx = (cell_size - bbox_w) / 2.0; let y = y + cell_size + face.descender() as f64 * scale; let transform = format!("matrix({} 0 0 {} {} {})", scale, -scale, x + dx, y); svg.start_element("path"); svg.write_attribute("d", path_buf); svg.write_attribute("transform", &transform); svg.end_element(); { let bbox_h = (bbox.y_max as f64 - bbox.y_min as f64) * scale; let bbox_x = x + dx + bbox.x_min as f64 * scale; let bbox_y = y - bbox.y_max as f64 * scale; svg.start_element("rect"); svg.write_attribute("x", &bbox_x); svg.write_attribute("y", &bbox_y); svg.write_attribute("width", &bbox_w); svg.write_attribute("height", &bbox_h); svg.write_attribute("fill", "none"); svg.write_attribute("stroke", "green"); svg.end_element(); } } // NOTE: this is not a feature-full implementation and just a demo. 
struct GlyphPainter<'a> {
    face: &'a ttf::Face<'a>,
    svg: &'a mut xmlwriter::XmlWriter,
    path_buf: &'a mut String,
    gradient_index: usize,
    clip_path_index: usize,
    palette_index: u16,
    transform: ttf::Transform,
    outline_transform: ttf::Transform,
    transforms_stack: Vec<ttf::Transform>,
}

impl<'a> GlyphPainter<'a> {
    fn write_gradient_stops(&mut self, stops: ttf::colr::GradientStopsIter) {
        for stop in stops {
            self.svg.start_element("stop");
            self.svg.write_attribute("offset", &stop.stop_offset);
            self.svg.write_color_attribute("stop-color", stop.color);
            let opacity = f32::from(stop.color.alpha) / 255.0;
            self.svg.write_attribute("stop-opacity", &opacity);
            self.svg.end_element();
        }
    }

    fn paint_solid(&mut self, color: ttf::RgbaColor) {
        self.svg.start_element("path");
        self.svg.write_color_attribute("fill", color);
        let opacity = f32::from(color.alpha) / 255.0;
        self.svg.write_attribute("fill-opacity", &opacity);
        self.svg
            .write_transform_attribute("transform", self.outline_transform);
        self.svg.write_attribute("d", self.path_buf);
        self.svg.end_element();
    }

    fn paint_linear_gradient(&mut self, gradient: ttf::colr::LinearGradient<'a>) {
        let gradient_id = format!("lg{}", self.gradient_index);
        self.gradient_index += 1;

        let gradient_transform = paint_transform(self.outline_transform, self.transform);

        // TODO: We ignore x2, y2. Have to apply them somehow.
        // TODO: The way spreadMode works in ttf and svg is a bit different. In SVG, the spreadMode
        // will always be applied based on x1/y1 and x2/y2. However, in TTF the spreadMode will
        // be applied from the first/last stop. So if we have a gradient with x1=0 x2=1, and
        // a stop at x=0.4 and x=0.6, then in SVG we will always see a padding, while in ttf
        // we will see the actual spreadMode. We need to account for that somehow.
self.svg.start_element("linearGradient"); self.svg.write_attribute("id", &gradient_id); self.svg.write_attribute("x1", &gradient.x0); self.svg.write_attribute("y1", &gradient.y0); self.svg.write_attribute("x2", &gradient.x1); self.svg.write_attribute("y2", &gradient.y1); self.svg.write_attribute("gradientUnits", &"userSpaceOnUse"); self.svg.write_spread_method_attribute(gradient.extend); self.svg .write_transform_attribute("gradientTransform", gradient_transform); self.write_gradient_stops(gradient.stops( self.palette_index, #[cfg(feature = "variable-fonts")] self.face.variation_coordinates(), )); self.svg.end_element(); self.svg.start_element("path"); self.svg .write_attribute_fmt("fill", format_args!("url(#{})", gradient_id)); self.svg .write_transform_attribute("transform", self.outline_transform); self.svg.write_attribute("d", self.path_buf); self.svg.end_element(); } fn paint_radial_gradient(&mut self, gradient: ttf::colr::RadialGradient<'a>) { let gradient_id = format!("rg{}", self.gradient_index); self.gradient_index += 1; self.svg.start_element("radialGradient"); self.svg.write_attribute("id", &gradient_id); self.svg.write_attribute("cx", &gradient.x1); self.svg.write_attribute("cy", &gradient.y1); self.svg.write_attribute("r", &gradient.r1); self.svg.write_attribute("fr", &gradient.r0); self.svg.write_attribute("fx", &gradient.x0); self.svg.write_attribute("fy", &gradient.y0); self.svg.write_attribute("gradientUnits", &"userSpaceOnUse"); self.svg.write_spread_method_attribute(gradient.extend); self.svg .write_transform_attribute("gradientTransform", self.transform); self.write_gradient_stops(gradient.stops( self.palette_index, #[cfg(feature = "variable-fonts")] self.face.variation_coordinates(), )); self.svg.end_element(); self.svg.start_element("path"); self.svg .write_attribute_fmt("fill", format_args!("url(#{})", gradient_id)); self.svg .write_transform_attribute("transform", self.outline_transform); self.svg.write_attribute("d", self.path_buf); 
self.svg.end_element(); } fn paint_sweep_gradient(&mut self, _: ttf::colr::SweepGradient<'a>) { println!("Warning: sweep gradients are not supported.") } } fn paint_transform(outline_transform: Transform, transform: Transform) -> Transform { let outline_transform = tiny_skia_path::Transform::from_row( outline_transform.a, outline_transform.b, outline_transform.c, outline_transform.d, outline_transform.e, outline_transform.f, ); let gradient_transform = tiny_skia_path::Transform::from_row( transform.a, transform.b, transform.c, transform.d, transform.e, transform.f, ); let gradient_transform = outline_transform .invert() .unwrap() .pre_concat(gradient_transform); ttf_parser::Transform { a: gradient_transform.sx, b: gradient_transform.ky, c: gradient_transform.kx, d: gradient_transform.sy, e: gradient_transform.tx, f: gradient_transform.ty, } } impl GlyphPainter<'_> { fn clip_with_path(&mut self, path: &str) { let clip_id = format!("cp{}", self.clip_path_index); self.clip_path_index += 1; self.svg.start_element("clipPath"); self.svg.write_attribute("id", &clip_id); self.svg.start_element("path"); self.svg .write_transform_attribute("transform", self.outline_transform); self.svg.write_attribute("d", &path); self.svg.end_element(); self.svg.end_element(); self.svg.start_element("g"); self.svg .write_attribute_fmt("clip-path", format_args!("url(#{})", clip_id)); } } impl<'a> ttf::colr::Painter<'a> for GlyphPainter<'a> { fn outline_glyph(&mut self, glyph_id: ttf::GlyphId) { self.path_buf.clear(); let mut builder = Builder(self.path_buf); match self.face.outline_glyph(glyph_id, &mut builder) { Some(v) => v, None => return, }; builder.finish(); // We have to write outline using the current transform. self.outline_transform = self.transform; } fn push_layer(&mut self, mode: ttf::colr::CompositeMode) { self.svg.start_element("g"); use ttf::colr::CompositeMode; // TODO: Need to figure out how to represent the other blend modes // in SVG. 
        let mode = match mode {
            CompositeMode::SourceOver => "normal",
            CompositeMode::Screen => "screen",
            CompositeMode::Overlay => "overlay",
            CompositeMode::Darken => "darken",
            CompositeMode::Lighten => "lighten",
            CompositeMode::ColorDodge => "color-dodge",
            CompositeMode::ColorBurn => "color-burn",
            CompositeMode::HardLight => "hard-light",
            CompositeMode::SoftLight => "soft-light",
            CompositeMode::Difference => "difference",
            CompositeMode::Exclusion => "exclusion",
            CompositeMode::Multiply => "multiply",
            CompositeMode::Hue => "hue",
            CompositeMode::Saturation => "saturation",
            CompositeMode::Color => "color",
            CompositeMode::Luminosity => "luminosity",
            _ => {
                println!("Warning: unsupported blend mode: {:?}", mode);
                "normal"
            }
        };
        self.svg.write_attribute_fmt(
            "style",
            format_args!("mix-blend-mode: {}; isolation: isolate", mode),
        );
    }

    fn pop_layer(&mut self) {
        self.svg.end_element(); // g
    }

    fn push_transform(&mut self, transform: ttf::Transform) {
        self.transforms_stack.push(self.transform);
        self.transform = ttf::Transform::combine(self.transform, transform);
    }

    fn paint(&mut self, paint: Paint<'a>) {
        match paint {
            Paint::Solid(color) => self.paint_solid(color),
            Paint::LinearGradient(lg) => self.paint_linear_gradient(lg),
            Paint::RadialGradient(rg) => self.paint_radial_gradient(rg),
            Paint::SweepGradient(sg) => self.paint_sweep_gradient(sg),
        }
    }

    fn pop_transform(&mut self) {
        if let Some(ts) = self.transforms_stack.pop() {
            self.transform = ts
        }
    }

    fn push_clip(&mut self) {
        self.clip_with_path(&self.path_buf.clone());
    }

    fn pop_clip(&mut self) {
        self.svg.end_element();
    }

    fn push_clip_box(&mut self, clipbox: ClipBox) {
        let x_min = clipbox.x_min;
        let x_max = clipbox.x_max;
        let y_min = clipbox.y_min;
        let y_max = clipbox.y_max;

        let clip_path = format!(
            "M {} {} L {} {} L {} {} L {} {} Z",
            x_min, y_min, x_max, y_min, x_max, y_max, x_min, y_max
        );

        self.clip_with_path(&clip_path);
    }
}

fn color_glyph(
    x: f64,
    y: f64,
    face: &ttf::Face,
    palette_index: u16,
    glyph_id: ttf::GlyphId,
    cell_size: f64,
    scale: f64,
    gradient_index: &mut usize,
    clip_path_index: &mut usize,
    svg: &mut xmlwriter::XmlWriter,
    path_buf: &mut String,
) {
    let y = y + cell_size + face.descender() as f64 * scale;
    let transform = format!("matrix({} 0 0 {} {} {})", scale, -scale, x, y);
    svg.start_element("g");
    svg.write_attribute("transform", &transform);

    let mut painter = GlyphPainter {
        face,
        svg,
        path_buf,
        gradient_index: *gradient_index,
        clip_path_index: *clip_path_index,
        palette_index,
        transform: ttf::Transform::default(),
        outline_transform: ttf::Transform::default(),
        transforms_stack: vec![ttf::Transform::default()],
    };

    face.paint_color_glyph(
        glyph_id,
        palette_index,
        RgbaColor::new(0, 0, 0, 255),
        &mut painter,
    );
    *gradient_index = painter.gradient_index;
    *clip_path_index = painter.clip_path_index;

    svg.end_element();
}

trait XmlWriterExt {
    fn write_color_attribute(&mut self, name: &str, ts: ttf::RgbaColor);
    fn write_transform_attribute(&mut self, name: &str, ts: ttf::Transform);
    fn write_spread_method_attribute(&mut self, method: ttf::colr::GradientExtend);
}

impl XmlWriterExt for xmlwriter::XmlWriter {
    fn write_color_attribute(&mut self, name: &str, color: ttf::RgbaColor) {
        self.write_attribute_fmt(
            name,
            format_args!("rgb({}, {}, {})", color.red, color.green, color.blue),
        );
    }

    fn write_transform_attribute(&mut self, name: &str, ts: ttf::Transform) {
        if ts.is_default() {
            return;
        }

        self.write_attribute_fmt(
            name,
            format_args!(
                "matrix({} {} {} {} {} {})",
                ts.a, ts.b, ts.c, ts.d, ts.e, ts.f
            ),
        );
    }

    fn write_spread_method_attribute(&mut self, extend: ttf::colr::GradientExtend) {
        self.write_attribute(
            "spreadMethod",
            match extend {
                ttf::colr::GradientExtend::Pad => &"pad",
                ttf::colr::GradientExtend::Repeat => &"repeat",
                ttf::colr::GradientExtend::Reflect => &"reflect",
            },
        );
    }
}

ttf-parser-0.24.1/examples/wasm/.gitignore
*.wasm

ttf-parser-0.24.1/examples/wasm/README.md
# ttf-parser as a WebAssembly module

## Build

```sh
rustup target add wasm32-unknown-unknown
cargo build --target wasm32-unknown-unknown --release --manifest-path ../../c-api/Cargo.toml
cp ../../c-api/target/wasm32-unknown-unknown/release/ttfparser.wasm .
```

## Run

You can use any webserver that can serve `index.html`. Here is a Python example:

```sh
python -m http.server
```

ttf-parser-0.24.1/examples/wasm/TTC.ttc (binary TrueType collection; contents omitted)

ttf-parser-0.24.1/examples/wasm/index.html

ttf-parser in WebAssembly
(supports font files drag and drop)

TTC.ttc:
ttfp_fonts_in_collection():
ttfp_is_variable():
ttfp_get_weight():
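The labels above come from the wasm demo page, which calls the C API's `ttfp_fonts_in_collection()` on the bundled `TTC.ttc`. As a standalone sketch (not the crate's API — the function name and test bytes here are made up for illustration), a TrueType Collection begins with a fixed 12-byte header: the `ttcf` tag, a 32-bit version, and a big-endian `numFonts` count, which is all such a call needs to read:

```rust
// Minimal sketch: extract the font count from a TrueType Collection header.
// Header layout (per the OpenType spec): tag 'ttcf', u16 major + u16 minor
// version, then a big-endian u32 numFonts. Plain .ttf/.otf files start with
// a different tag and have no such header.
fn fonts_in_collection(data: &[u8]) -> Option<u32> {
    if data.get(0..4)? != b"ttcf" {
        return None; // not a collection
    }
    let n = data.get(8..12)?;
    Some(u32::from_be_bytes([n[0], n[1], n[2], n[3]]))
}

fn main() {
    // A fabricated header: 'ttcf', version 1.0, numFonts = 2.
    let header = [
        b't', b't', b'c', b'f', // tag
        0, 1, 0, 0, // version 1.0
        0, 0, 0, 2, // numFonts
    ];
    assert_eq!(fonts_in_collection(&header), Some(2));
    assert_eq!(fonts_in_collection(b"OTTO"), None);
}
```

Using `Option`-returning slice access keeps truncated inputs from panicking, mirroring the defensive style used throughout the crate.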

ttf-parser-0.24.1/src/aat.rs

/*! A collection of [Apple Advanced Typography](
https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6AATIntro.html)
related types. */

use core::num::NonZeroU16;

use crate::parser::{FromData, LazyArray16, NumFrom, Offset, Offset16, Offset32, Stream};
use crate::GlyphId;

/// Predefined states.
pub mod state {
    #![allow(missing_docs)]
    pub const START_OF_TEXT: u16 = 0;
}

/// Predefined classes.
///
/// Search for _Class Code_ in [Apple Advanced Typography Font Tables](
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6Tables.html).
pub mod class {
    #![allow(missing_docs)]
    pub const END_OF_TEXT: u8 = 0;
    pub const OUT_OF_BOUNDS: u8 = 1;
    pub const DELETED_GLYPH: u8 = 2;
}

/// A State Table entry.
///
/// Used by legacy and extended tables.
#[derive(Clone, Copy, Debug)]
pub struct GenericStateEntry<T: FromData> {
    /// A new state.
    pub new_state: u16,
    /// Entry flags.
    pub flags: u16,
    /// Additional data.
    ///
    /// Use `()` if no data expected.
    pub extra: T,
}

impl<T: FromData> FromData for GenericStateEntry<T> {
    const SIZE: usize = 4 + T::SIZE;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(GenericStateEntry {
            new_state: s.read::<u16>()?,
            flags: s.read::<u16>()?,
            extra: s.read::<T>()?,
        })
    }
}

impl<T: FromData> GenericStateEntry<T> {
    /// Checks that entry has an offset.
    #[inline]
    pub fn has_offset(&self) -> bool {
        self.flags & 0x3FFF != 0
    }

    /// Returns a value offset.
    ///
    /// Used by kern::format1 subtable.
    #[inline]
    pub fn value_offset(&self) -> ValueOffset {
        ValueOffset(self.flags & 0x3FFF)
    }

    /// If set, reset the kerning data (clear the stack).
    #[inline]
    pub fn has_reset(&self) -> bool {
        self.flags & 0x2000 != 0
    }

    /// If set, advance to the next glyph before going to the new state.
    #[inline]
    pub fn has_advance(&self) -> bool {
        self.flags & 0x4000 == 0
    }

    /// If set, push this glyph on the kerning stack.
    #[inline]
    pub fn has_push(&self) -> bool {
        self.flags & 0x8000 != 0
    }

    /// If set, remember this glyph as the marked glyph.
    ///
    /// Used by kerx::format4 subtable.
    ///
    /// Yes, the same as [`has_push`](Self::has_push).
    #[inline]
    pub fn has_mark(&self) -> bool {
        self.flags & 0x8000 != 0
    }
}

/// A legacy state entry used by [StateTable].
pub type StateEntry = GenericStateEntry<()>;

/// A type-safe wrapper for a kerning value offset.
#[derive(Clone, Copy, Debug)]
pub struct ValueOffset(u16);

impl ValueOffset {
    /// Returns the next offset.
    ///
    /// After reaching u16::MAX will start from 0.
    #[inline]
    pub fn next(self) -> Self {
        ValueOffset(self.0.wrapping_add(u16::SIZE as u16))
    }
}

/// A [State Table](
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6Tables.html).
///
/// Also called `STHeader`.
///
/// Currently used by `kern` table.
#[derive(Clone)]
pub struct StateTable<'a> {
    number_of_classes: u16,
    first_glyph: GlyphId,
    class_table: &'a [u8],
    state_array_offset: u16,
    state_array: &'a [u8],
    entry_table: &'a [u8],
    actions: &'a [u8],
}

impl<'a> StateTable<'a> {
    pub(crate) fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);

        let number_of_classes: u16 = s.read()?;
        // Note that in format1 subtable, offsets are not from the subtable start,
        // but from subtable start + `header_size`.
        // So there is no need to subtract the `header_size`.
        let class_table_offset = s.read::<Offset16>()?.to_usize();
        let state_array_offset = s.read::<Offset16>()?.to_usize();
        let entry_table_offset = s.read::<Offset16>()?.to_usize();
        // Ignore `values_offset` since we don't use it.

        // Parse class subtable.
        let mut s = Stream::new_at(data, class_table_offset)?;
        let first_glyph: GlyphId = s.read()?;
        let number_of_glyphs: u16 = s.read()?;
        // The class table contains u8, so it's easier to use just a slice
        // instead of a LazyArray.
        let class_table = s.read_bytes(usize::from(number_of_glyphs))?;

        Some(StateTable {
            number_of_classes,
            first_glyph,
            class_table,
            state_array_offset: state_array_offset as u16,
            // We don't know the actual data size and it's kinda expensive to calculate.
            // So we are simply storing all the data past the offset.
            // Despite the fact that they may overlap.
            state_array: data.get(state_array_offset..)?,
            entry_table: data.get(entry_table_offset..)?,
            // `ValueOffset` defines an offset from the start of the subtable data.
            // We do not check that the provided offset is actually after `values_offset`.
            actions: data,
        })
    }

    /// Returns a glyph class.
    #[inline]
    pub fn class(&self, glyph_id: GlyphId) -> Option<u8> {
        if glyph_id.0 == 0xFFFF {
            return Some(class::DELETED_GLYPH);
        }

        let idx = glyph_id.0.checked_sub(self.first_glyph.0)?;
        self.class_table.get(usize::from(idx)).copied()
    }

    /// Returns a class entry.
    #[inline]
    pub fn entry(&self, state: u16, mut class: u8) -> Option<StateEntry> {
        if u16::from(class) >= self.number_of_classes {
            class = class::OUT_OF_BOUNDS;
        }

        let entry_idx = self
            .state_array
            .get(usize::from(state) * usize::from(self.number_of_classes) + usize::from(class))?;

        Stream::read_at(self.entry_table, usize::from(*entry_idx) * StateEntry::SIZE)
    }

    /// Returns kerning at offset.
    #[inline]
    pub fn kerning(&self, offset: ValueOffset) -> Option<i16> {
        Stream::read_at(self.actions, usize::from(offset.0))
    }

    /// Produces a new state.
    #[inline]
    pub fn new_state(&self, state: u16) -> u16 {
        let n = (i32::from(state) - i32::from(self.state_array_offset))
            / i32::from(self.number_of_classes);

        use core::convert::TryFrom;
        u16::try_from(n).unwrap_or(0)
    }
}

impl core::fmt::Debug for StateTable<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "StateTable {{ ... }}")
    }
}

/// An [Extended State Table](
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6Tables.html).
///
/// Also called `STXHeader`.
///
/// Currently used by `kerx` and `morx` tables.
#[derive(Clone)]
pub struct ExtendedStateTable<'a, T> {
    number_of_classes: u32,
    lookup: Lookup<'a>,
    state_array: &'a [u8],
    entry_table: &'a [u8],
    entry_type: core::marker::PhantomData<T>,
}

impl<'a, T: FromData> ExtendedStateTable<'a, T> {
    // TODO: make private
    /// Parses an Extended State Table from a stream.
    ///
    /// `number_of_glyphs` is from the `maxp` table.
    pub fn parse(number_of_glyphs: NonZeroU16, s: &mut Stream<'a>) -> Option<Self> {
        let data = s.tail()?;

        let number_of_classes = s.read::<u32>()?;
        // Note that offsets are not from the subtable start,
        // but from subtable start + `header_size`.
        // So there is no need to subtract the `header_size`.
        let lookup_table_offset = s.read::<Offset32>()?.to_usize();
        let state_array_offset = s.read::<Offset32>()?.to_usize();
        let entry_table_offset = s.read::<Offset32>()?.to_usize();

        Some(ExtendedStateTable {
            number_of_classes,
            lookup: Lookup::parse(number_of_glyphs, data.get(lookup_table_offset..)?)?,
            // We don't know the actual data size and it's kinda expensive to calculate.
            // So we are simply storing all the data past the offset.
            // Despite the fact that they may overlap.
            state_array: data.get(state_array_offset..)?,
            entry_table: data.get(entry_table_offset..)?,
            entry_type: core::marker::PhantomData,
        })
    }

    /// Returns a glyph class.
    #[inline]
    pub fn class(&self, glyph_id: GlyphId) -> Option<u16> {
        if glyph_id.0 == 0xFFFF {
            return Some(u16::from(class::DELETED_GLYPH));
        }

        self.lookup.value(glyph_id)
    }

    /// Returns a class entry.
    #[inline]
    pub fn entry(&self, state: u16, mut class: u16) -> Option<GenericStateEntry<T>> {
        if u32::from(class) >= self.number_of_classes {
            class = u16::from(class::OUT_OF_BOUNDS);
        }

        let state_idx =
            usize::from(state) * usize::num_from(self.number_of_classes) + usize::from(class);

        let entry_idx: u16 = Stream::read_at(self.state_array, state_idx * u16::SIZE)?;
        Stream::read_at(
            self.entry_table,
            usize::from(entry_idx) * GenericStateEntry::<T>::SIZE,
        )
    }
}

impl<T> core::fmt::Debug for ExtendedStateTable<'_, T> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "ExtendedStateTable {{ ... }}")
    }
}

/// A [lookup table](
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6Tables.html).
///
/// u32 values in Format10 tables will be truncated to u16.
/// u64 values in Format10 tables are not supported.
#[derive(Clone)]
pub struct Lookup<'a> {
    data: LookupInner<'a>,
}

impl<'a> Lookup<'a> {
    /// Parses a lookup table from raw data.
    ///
    /// `number_of_glyphs` is from the `maxp` table.
    #[inline]
    pub fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
        LookupInner::parse(number_of_glyphs, data).map(|data| Self { data })
    }

    /// Returns a value associated with the specified glyph.
    #[inline]
    pub fn value(&self, glyph_id: GlyphId) -> Option<u16> {
        self.data.value(glyph_id)
    }
}

impl core::fmt::Debug for Lookup<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Lookup {{ ... }}")
    }
}

#[derive(Clone)]
enum LookupInner<'a> {
    Format1(LazyArray16<'a, u16>),
    Format2(BinarySearchTable<'a, LookupSegment>),
    Format4(BinarySearchTable<'a, LookupSegment>, &'a [u8]),
    Format6(BinarySearchTable<'a, LookupSingle>),
    Format8 {
        first_glyph: u16,
        values: LazyArray16<'a, u16>,
    },
    Format10 {
        value_size: u16,
        first_glyph: u16,
        glyph_count: u16,
        data: &'a [u8],
    },
}

impl<'a> LookupInner<'a> {
    fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let format = s.read::<u16>()?;
        match format {
            0 => {
                let values = s.read_array16::<u16>(number_of_glyphs.get())?;
                Some(Self::Format1(values))
            }
            2 => {
                let bsearch = BinarySearchTable::<LookupSegment>::parse(s.tail()?)?;
                Some(Self::Format2(bsearch))
            }
            4 => {
                let bsearch = BinarySearchTable::<LookupSegment>::parse(s.tail()?)?;
                Some(Self::Format4(bsearch, data))
            }
            6 => {
                let bsearch = BinarySearchTable::<LookupSingle>::parse(s.tail()?)?;
                Some(Self::Format6(bsearch))
            }
            8 => {
                let first_glyph = s.read::<u16>()?;
                let glyph_count = s.read::<u16>()?;
                let values = s.read_array16::<u16>(glyph_count)?;
                Some(Self::Format8 {
                    first_glyph,
                    values,
                })
            }
            10 => {
                let value_size = s.read::<u16>()?;
                let first_glyph = s.read::<u16>()?;
                let glyph_count = s.read::<u16>()?;
                Some(Self::Format10 {
                    value_size,
                    first_glyph,
                    glyph_count,
                    data: s.tail()?,
                })
            }
            _ => None,
        }
    }

    fn value(&self, glyph_id: GlyphId) -> Option<u16> {
        match self {
            Self::Format1(values) => values.get(glyph_id.0),
            Self::Format2(ref bsearch) => bsearch.get(glyph_id).map(|v| v.value),
            Self::Format4(ref bsearch, data) => {
                // In format 4, LookupSegment contains an offset to a list of u16 values.
                // One value for each glyph in the LookupSegment range.
                let segment = bsearch.get(glyph_id)?;
                let index = glyph_id.0.checked_sub(segment.first_glyph)?;
                let offset = usize::from(segment.value) + u16::SIZE * usize::from(index);
                Stream::read_at::<u16>(data, offset)
            }
            Self::Format6(ref bsearch) => bsearch.get(glyph_id).map(|v| v.value),
            Self::Format8 {
                first_glyph,
                values,
            } => {
                let idx = glyph_id.0.checked_sub(*first_glyph)?;
                values.get(idx)
            }
            Self::Format10 {
                value_size,
                first_glyph,
                glyph_count,
                data,
            } => {
                let idx = glyph_id.0.checked_sub(*first_glyph)?;
                let mut s = Stream::new(data);
                match value_size {
                    1 => s.read_array16::<u8>(*glyph_count)?.get(idx).map(u16::from),
                    2 => s.read_array16::<u16>(*glyph_count)?.get(idx),
                    // TODO: we should return u32 here, but this is not supported yet
                    4 => s
                        .read_array16::<u32>(*glyph_count)?
                        .get(idx)
                        .map(|n| n as u16),
                    _ => None, // 8 is also supported
                }
            }
        }
    }
}

/// A binary searching table as defined at
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6Tables.html
#[derive(Clone)]
struct BinarySearchTable<'a, T: BinarySearchValue> {
    values: LazyArray16<'a, T>,
    len: NonZeroU16, // values length excluding termination segment
}

impl<'a, T: BinarySearchValue + core::fmt::Debug> BinarySearchTable<'a, T> {
    #[inline(never)]
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let segment_size = s.read::<u16>()?;
        let number_of_segments = s.read::<u16>()?;
        s.advance(6); // search_range + entry_selector + range_shift

        if usize::from(segment_size) != T::SIZE {
            return None;
        }

        if number_of_segments == 0 {
            return None;
        }

        let values = s.read_array16::<T>(number_of_segments)?;

        // 'The number of termination values that need to be included is table-specific.
        // The value that indicates binary search termination is 0xFFFF.'
        let mut len = number_of_segments;
        if values.last()?.is_termination() {
            len = len.checked_sub(1)?;
        }

        Some(BinarySearchTable {
            len: NonZeroU16::new(len)?,
            values,
        })
    }

    fn get(&self, key: GlyphId) -> Option<T> {
        let mut min = 0;
        let mut max = (self.len.get() as isize) - 1;
        while min <= max {
            let mid = (min + max) / 2;
            let v = self.values.get(mid as u16)?;
            match v.contains(key) {
                core::cmp::Ordering::Less => max = mid - 1,
                core::cmp::Ordering::Greater => min = mid + 1,
                core::cmp::Ordering::Equal => return Some(v),
            }
        }

        None
    }
}

trait BinarySearchValue: FromData {
    fn is_termination(&self) -> bool;
    fn contains(&self, glyph_id: GlyphId) -> core::cmp::Ordering;
}

#[derive(Clone, Copy, Debug)]
struct LookupSegment {
    last_glyph: u16,
    first_glyph: u16,
    value: u16,
}

impl FromData for LookupSegment {
    const SIZE: usize = 6;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(LookupSegment {
            last_glyph: s.read::<u16>()?,
            first_glyph: s.read::<u16>()?,
            value: s.read::<u16>()?,
        })
    }
}

impl BinarySearchValue for LookupSegment {
    #[inline]
    fn is_termination(&self) -> bool {
        self.last_glyph == 0xFFFF && self.first_glyph == 0xFFFF
    }

    #[inline]
    fn contains(&self, id: GlyphId) -> core::cmp::Ordering {
        if id.0 < self.first_glyph {
            core::cmp::Ordering::Less
        } else if id.0 <= self.last_glyph {
            core::cmp::Ordering::Equal
        } else {
            core::cmp::Ordering::Greater
        }
    }
}

#[derive(Clone, Copy, Debug)]
struct LookupSingle {
    glyph: u16,
    value: u16,
}

impl FromData for LookupSingle {
    const SIZE: usize = 4;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(LookupSingle {
            glyph: s.read::<u16>()?,
            value: s.read::<u16>()?,
        })
    }
}

impl BinarySearchValue for LookupSingle {
    #[inline]
    fn is_termination(&self) -> bool {
        self.glyph == 0xFFFF
    }

    #[inline]
    fn contains(&self, id: GlyphId) -> core::cmp::Ordering {
        id.0.cmp(&self.glyph)
    }
}

ttf-parser-0.24.1/src/delta_set.rs

use core::convert::TryFrom;

use crate::parser::Stream;

#[derive(Clone, Copy, Debug)]
pub(crate) struct DeltaSetIndexMap<'a> {
    data: &'a [u8],
}

impl<'a> DeltaSetIndexMap<'a> {
    #[inline]
    pub(crate) fn new(data: &'a [u8]) -> Self {
        DeltaSetIndexMap { data }
    }

    #[inline]
    pub(crate) fn map(&self, mut index: u32) -> Option<(u16, u16)> {
        let mut s = Stream::new(self.data);
        let format = s.read::<u16>()?;
        let entry_format = s.read::<u16>()?;
        let map_count = if format == 0 {
            s.read::<u16>()? as u32
        } else {
            s.read::<u32>()?
        };

        if map_count == 0 {
            return None;
        }

        // 'If a given glyph ID is greater than mapCount-1, then the last entry is used.'
        if index >= map_count {
            index = map_count - 1;
        }

        let entry_size = ((entry_format >> 4) & 3) + 1;
        let inner_index_bit_count = u32::from((entry_format & 0xF) + 1);

        s.advance(usize::from(entry_size) * usize::try_from(index).ok()?);

        let mut n = 0u32;
        for b in s.read_bytes(usize::from(entry_size))? {
            n = (n << 8) + u32::from(*b);
        }

        let outer_index = n >> inner_index_bit_count;
        let inner_index = n & ((1 << inner_index_bit_count) - 1);
        Some((
            u16::try_from(outer_index).ok()?,
            u16::try_from(inner_index).ok()?,
        ))
    }
}

ttf-parser-0.24.1/src/ggg/chained_context.rs

use super::{ClassDefinition, Coverage, SequenceLookupRecord};
use crate::parser::{FromSlice, LazyArray16, LazyOffsetArray16, Offset, Offset16, Stream};

/// A [Chained Contextual Lookup Subtable](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#chseqctxt1).
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub enum ChainedContextLookup<'a> {
    /// Simple glyph contexts.
    Format1 {
        coverage: Coverage<'a>,
        sets: ChainedSequenceRuleSets<'a>,
    },
    /// Class-based glyph contexts.
    Format2 {
        coverage: Coverage<'a>,
        backtrack_classes: ClassDefinition<'a>,
        input_classes: ClassDefinition<'a>,
        lookahead_classes: ClassDefinition<'a>,
        sets: ChainedSequenceRuleSets<'a>,
    },
    /// Coverage-based glyph contexts.
    Format3 {
        coverage: Coverage<'a>,
        backtrack_coverages: LazyOffsetArray16<'a, Coverage<'a>>,
        input_coverages: LazyOffsetArray16<'a, Coverage<'a>>,
        lookahead_coverages: LazyOffsetArray16<'a, Coverage<'a>>,
        lookups: LazyArray16<'a, SequenceLookupRecord>,
    },
}

impl<'a> ChainedContextLookup<'a> {
    pub(crate) fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()? {
            1 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let count = s.read::<u16>()?;
                let offsets = s.read_array16(count)?;
                Some(Self::Format1 {
                    coverage,
                    sets: ChainedSequenceRuleSets::new(data, offsets),
                })
            }
            2 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;

                let mut parse_func = || match s.read::<Option<Offset16>>()? {
                    Some(offset) => Some(ClassDefinition::parse(data.get(offset.to_usize()..)?)?),
                    None => Some(ClassDefinition::Empty),
                };

                let backtrack_classes = parse_func()?;
                let input_classes = parse_func()?;
                let lookahead_classes = parse_func()?;
                let count = s.read::<u16>()?;
                let offsets = s.read_array16(count)?;
                Some(Self::Format2 {
                    coverage,
                    backtrack_classes,
                    input_classes,
                    lookahead_classes,
                    sets: LazyOffsetArray16::new(data, offsets),
                })
            }
            3 => {
                let backtrack_count = s.read::<u16>()?;
                let backtrack_coverages = s.read_array16(backtrack_count)?;
                let input_count = s.read::<u16>()?;
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let input_coverages = s.read_array16(input_count.checked_sub(1)?)?;
                let lookahead_count = s.read::<u16>()?;
                let lookahead_coverages = s.read_array16(lookahead_count)?;
                let lookup_count = s.read::<u16>()?;
                let lookups = s.read_array16(lookup_count)?;
                Some(Self::Format3 {
                    coverage,
                    backtrack_coverages: LazyOffsetArray16::new(data, backtrack_coverages),
                    input_coverages: LazyOffsetArray16::new(data, input_coverages),
                    lookahead_coverages: LazyOffsetArray16::new(data, lookahead_coverages),
                    lookups,
                })
            }
            _ => None,
        }
    }

    /// Returns the subtable coverage.
    #[inline]
    pub fn coverage(&self) -> Coverage<'a> {
        match self {
            Self::Format1 { coverage, .. } => *coverage,
            Self::Format2 { coverage, .. } => *coverage,
            Self::Format3 { coverage, .. } => *coverage,
        }
    }
}

/// A list of [`ChainedSequenceRule`] sets.
pub type ChainedSequenceRuleSets<'a> = LazyOffsetArray16<'a, ChainedSequenceRuleSet<'a>>;

/// A set of [`ChainedSequenceRule`].
pub type ChainedSequenceRuleSet<'a> = LazyOffsetArray16<'a, ChainedSequenceRule<'a>>;

impl<'a> FromSlice<'a> for ChainedSequenceRuleSet<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        Self::parse(data)
    }
}

/// A [Chained Sequence Rule](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#chained-sequence-context-format-1-simple-glyph-contexts).
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub struct ChainedSequenceRule<'a> {
    /// Contains either glyph IDs or glyph Classes.
    pub backtrack: LazyArray16<'a, u16>,
    pub input: LazyArray16<'a, u16>,
    /// Contains either glyph IDs or glyph Classes.
    pub lookahead: LazyArray16<'a, u16>,
    pub lookups: LazyArray16<'a, SequenceLookupRecord>,
}

impl<'a> FromSlice<'a> for ChainedSequenceRule<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let backtrack_count = s.read::<u16>()?;
        let backtrack = s.read_array16(backtrack_count)?;
        let input_count = s.read::<u16>()?;
        let input = s.read_array16(input_count.checked_sub(1)?)?;
        let lookahead_count = s.read::<u16>()?;
        let lookahead = s.read_array16(lookahead_count)?;
        let lookup_count = s.read::<u16>()?;
        let lookups = s.read_array16(lookup_count)?;
        Some(Self {
            backtrack,
            input,
            lookahead,
            lookups,
        })
    }
}

ttf-parser-0.24.1/src/ggg/context.rs

use super::{ClassDefinition, Coverage, LookupIndex};
use crate::parser::{FromData, FromSlice, LazyArray16, LazyOffsetArray16, Stream};

/// A [Contextual Lookup Subtable](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#seqctxt1).
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub enum ContextLookup<'a> {
    /// Simple glyph contexts.
    Format1 {
        coverage: Coverage<'a>,
        sets: SequenceRuleSets<'a>,
    },
    /// Class-based glyph contexts.
    Format2 {
        coverage: Coverage<'a>,
        classes: ClassDefinition<'a>,
        sets: SequenceRuleSets<'a>,
    },
    /// Coverage-based glyph contexts.
    Format3 {
        coverage: Coverage<'a>,
        coverages: LazyOffsetArray16<'a, Coverage<'a>>,
        lookups: LazyArray16<'a, SequenceLookupRecord>,
    },
}

impl<'a> ContextLookup<'a> {
    pub(crate) fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()? {
            1 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let count = s.read::<u16>()?;
                let offsets = s.read_array16(count)?;
                Some(Self::Format1 {
                    coverage,
                    sets: SequenceRuleSets::new(data, offsets),
                })
            }
            2 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let classes = ClassDefinition::parse(s.read_at_offset16(data)?)?;
                let count = s.read::<u16>()?;
                let offsets = s.read_array16(count)?;
                Some(Self::Format2 {
                    coverage,
                    classes,
                    sets: SequenceRuleSets::new(data, offsets),
                })
            }
            3 => {
                let input_count = s.read::<u16>()?;
                let lookup_count = s.read::<u16>()?;
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let coverages = s.read_array16(input_count.checked_sub(1)?)?;
                let lookups = s.read_array16(lookup_count)?;
                Some(Self::Format3 {
                    coverage,
                    coverages: LazyOffsetArray16::new(data, coverages),
                    lookups,
                })
            }
            _ => None,
        }
    }

    /// Returns the subtable coverage.
    #[inline]
    pub fn coverage(&self) -> Coverage<'a> {
        match self {
            Self::Format1 { coverage, .. } => *coverage,
            Self::Format2 { coverage, .. } => *coverage,
            Self::Format3 { coverage, .. } => *coverage,
        }
    }
}

/// A list of [`SequenceRuleSet`]s.
pub type SequenceRuleSets<'a> = LazyOffsetArray16<'a, SequenceRuleSet<'a>>;

impl<'a> FromSlice<'a> for SequenceRuleSet<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        Self::parse(data)
    }
}

impl<'a> FromSlice<'a> for SequenceRule<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let input_count = s.read::<u16>()?;
        let lookup_count = s.read::<u16>()?;
        let input = s.read_array16(input_count.checked_sub(1)?)?;
        let lookups = s.read_array16(lookup_count)?;
        Some(Self { input, lookups })
    }
}

/// A set of [`SequenceRule`]s.
pub type SequenceRuleSet<'a> = LazyOffsetArray16<'a, SequenceRule<'a>>;

/// A sequence rule.
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub struct SequenceRule<'a> {
    pub input: LazyArray16<'a, u16>,
    pub lookups: LazyArray16<'a, SequenceLookupRecord>,
}

/// A sequence rule record.
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub struct SequenceLookupRecord {
    pub sequence_index: u16,
    pub lookup_list_index: LookupIndex,
}

impl FromData for SequenceLookupRecord {
    const SIZE: usize = 4;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(Self {
            sequence_index: s.read::<u16>()?,
            lookup_list_index: s.read::<LookupIndex>()?,
        })
    }
}

ttf-parser-0.24.1/src/ggg/feature_variations.rs

use super::{Feature, FeatureIndex, RecordListItem, VariationIndex};
use crate::parser::{FromData, LazyArray16, LazyArray32};
use crate::parser::{Offset, Offset32, Stream};
use crate::{NormalizedCoordinate, Tag};

/// A [Feature Variations Table](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#featurevariations-table).
#[derive(Clone, Copy, Debug)]
pub struct FeatureVariations<'a> {
    data: &'a [u8],
    records: LazyArray32<'a, FeatureVariationRecord>,
}

impl<'a> FeatureVariations<'a> {
    pub(crate) fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);

        let major_version = s.read::<u16>()?;
        s.skip::<u16>(); // minor version
        if major_version != 1 {
            return None;
        }

        let count = s.read::<u32>()?;
        let records = s.read_array32(count)?;
        Some(Self { data, records })
    }

    /// Returns a [`VariationIndex`] for variation coordinates.
    pub fn find_index(&self, coords: &[NormalizedCoordinate]) -> Option<VariationIndex> {
        for i in 0..self.records.len() {
            let record = self.records.get(i)?;
            let offset = record.conditions.to_usize();
            let set = ConditionSet::parse(self.data.get(offset..)?)?;
            if set.evaluate(coords) {
                return Some(i);
            }
        }
        None
    }

    /// Returns a [`Feature`] at specified indices.
    pub fn find_substitute(
        &self,
        feature_index: FeatureIndex,
        variation_index: VariationIndex,
    ) -> Option<Feature<'a>> {
        let offset = self.records.get(variation_index)?.substitutions.to_usize();
        let subst = FeatureTableSubstitution::parse(self.data.get(offset..)?)?;
        subst.find_substitute(feature_index)
    }
}

#[derive(Clone, Copy, Debug)]
struct FeatureVariationRecord {
    conditions: Offset32,
    substitutions: Offset32,
}

impl FromData for FeatureVariationRecord {
    const SIZE: usize = 8;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(Self {
            conditions: s.read::<Offset32>()?,
            substitutions: s.read::<Offset32>()?,
        })
    }
}

#[derive(Clone, Copy, Debug)]
struct ConditionSet<'a> {
    data: &'a [u8],
    conditions: LazyArray16<'a, Offset32>,
}

impl<'a> ConditionSet<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let count = s.read::<u16>()?;
        let conditions = s.read_array16(count)?;
        Some(Self { data, conditions })
    }

    fn evaluate(&self, coords: &[NormalizedCoordinate]) -> bool {
        self.conditions.into_iter().all(|offset| {
            self.data
                .get(offset.to_usize()..)
                .and_then(Condition::parse)
                .map_or(false, |c| c.evaluate(coords))
        })
    }
}

#[derive(Clone, Copy, Debug)]
enum Condition {
    Format1 {
        axis_index: u16,
        filter_range_min: i16,
        filter_range_max: i16,
    },
}

impl Condition {
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let format = s.read::<u16>()?;
        match format {
            1 => {
                let axis_index = s.read::<u16>()?;
                let filter_range_min = s.read::<i16>()?;
                let filter_range_max = s.read::<i16>()?;
                Some(Self::Format1 {
                    axis_index,
                    filter_range_min,
                    filter_range_max,
                })
            }
            _ => None,
        }
    }

    fn evaluate(&self, coords: &[NormalizedCoordinate]) -> bool {
        let Self::Format1 {
            axis_index,
            filter_range_min,
            filter_range_max,
        } = *self;

        let coord = coords
            .get(usize::from(axis_index))
            .map(|c| c.get())
            .unwrap_or(0);

        filter_range_min <= coord && coord <= filter_range_max
    }
}

#[derive(Clone, Copy, Debug)]
struct FeatureTableSubstitution<'a> {
    data: &'a [u8],
    records: LazyArray16<'a, FeatureTableSubstitutionRecord>,
}

impl<'a> FeatureTableSubstitution<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);

        let major_version = s.read::<u16>()?;
        s.skip::<u16>(); // minor version
        if major_version != 1 {
            return None;
        }

        let count = s.read::<u16>()?;
        let records = s.read_array16(count)?;
        Some(Self { data, records })
    }

    fn find_substitute(&self, feature_index: FeatureIndex) -> Option<Feature<'a>> {
        for record in self.records {
            if record.feature_index == feature_index {
                let offset = record.feature.to_usize();
                // TODO: set tag
                return Feature::parse(Tag::from_bytes(b"DFLT"), self.data.get(offset..)?);
            }
        }
        None
    }
}

#[derive(Clone, Copy, Debug)]
struct FeatureTableSubstitutionRecord {
    feature_index: FeatureIndex,
    feature: Offset32,
}

impl FromData for FeatureTableSubstitutionRecord {
    const SIZE: usize = 6;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(Self {
            feature_index: s.read::<FeatureIndex>()?,
            feature: s.read::<Offset32>()?,
        })
    }
}

ttf-parser-0.24.1/src/ggg/layout_table.rs

// Suppresses `minor_version` variable warning.
#![allow(unused_variables)]

#[cfg(feature = "variable-fonts")]
use super::FeatureVariations;
use super::LookupList;
#[cfg(feature = "variable-fonts")]
use crate::parser::Offset32;
use crate::parser::{FromData, LazyArray16, Offset, Offset16, Stream};
use crate::Tag;

/// A [Layout Table](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#table-organization).
#[derive(Clone, Copy, Debug)]
pub struct LayoutTable<'a> {
    /// A list of all supported scripts.
    pub scripts: ScriptList<'a>,
    /// A list of all supported features.
    pub features: FeatureList<'a>,
    /// A list of all lookups.
    pub lookups: LookupList<'a>,
    /// Used to substitute an alternate set of lookup tables
    /// to use for any given feature under specified conditions.
    #[cfg(feature = "variable-fonts")]
    pub variations: Option<FeatureVariations<'a>>,
}

impl<'a> LayoutTable<'a> {
    pub(crate) fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);

        let major_version = s.read::<u16>()?;
        let minor_version = s.read::<u16>()?;
        if major_version != 1 {
            return None;
        }

        let scripts = ScriptList::parse(s.read_at_offset16(data)?)?;
        let features = FeatureList::parse(s.read_at_offset16(data)?)?;
        let lookups = LookupList::parse(s.read_at_offset16(data)?)?;

        #[cfg(feature = "variable-fonts")]
        {
            let mut variations_offset = None;
            if minor_version >= 1 {
                variations_offset = s.read::<Option<Offset32>>()?;
            }

            let variations = match variations_offset {
                Some(offset) => data
                    .get(offset.to_usize()..)
                    .and_then(FeatureVariations::parse),
                None => None,
            };

            Some(Self {
                scripts,
                features,
                lookups,
                variations,
            })
        }

        #[cfg(not(feature = "variable-fonts"))]
        {
            Some(Self {
                scripts,
                features,
                lookups,
            })
        }
    }
}

/// An index in [`ScriptList`].
pub type ScriptIndex = u16;
/// An index in [`LanguageSystemList`].
pub type LanguageIndex = u16;
/// An index in [`FeatureList`].
pub type FeatureIndex = u16;
/// An index in [`LookupList`].
pub type LookupIndex = u16;
/// An index in [`FeatureVariations`].
pub type VariationIndex = u32;

/// A trait to parse item in [`RecordList`].
///
/// Internal use only.
pub trait RecordListItem<'a>: Sized {
    /// Parses raw data.
    fn parse(tag: Tag, data: &'a [u8]) -> Option<Self>;
}

/// A data storage used by [`ScriptList`], [`LanguageSystemList`] and [`FeatureList`] data types.
#[derive(Clone, Copy, Debug)]
pub struct RecordList<'a, T: RecordListItem<'a>> {
    data: &'a [u8],
    records: LazyArray16<'a, TagRecord>,
    data_type: core::marker::PhantomData<T>,
}

impl<'a, T: RecordListItem<'a>> RecordList<'a, T> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let count = s.read::<u16>()?;
        let records = s.read_array16(count)?;
        Some(Self {
            data,
            records,
            data_type: core::marker::PhantomData,
        })
    }

    /// Returns a number of items in the RecordList.
    pub fn len(&self) -> u16 {
        self.records.len()
    }

    /// Checks that RecordList is empty.
    pub fn is_empty(&self) -> bool {
        self.records.is_empty()
    }

    /// Returns RecordList value by index.
    pub fn get(&self, index: u16) -> Option<T> {
        let record = self.records.get(index)?;
        self.data
            .get(record.offset.to_usize()..)
            .and_then(|data| T::parse(record.tag, data))
    }

    /// Returns RecordList value by [`Tag`].
    pub fn find(&self, tag: Tag) -> Option<T> {
        let record = self
            .records
            .binary_search_by(|record| record.tag.cmp(&tag))
            .map(|p| p.1)?;
        self.data
            .get(record.offset.to_usize()..)
            .and_then(|data| T::parse(record.tag, data))
    }

    /// Returns RecordList value index by [`Tag`].
    pub fn index(&self, tag: Tag) -> Option<u16> {
        self.records
            .binary_search_by(|record| record.tag.cmp(&tag))
            .map(|p| p.0)
    }
}

impl<'a, T: RecordListItem<'a>> IntoIterator for RecordList<'a, T> {
    type Item = T;
    type IntoIter = RecordListIter<'a, T>;

    #[inline]
    fn into_iter(self) -> Self::IntoIter {
        RecordListIter {
            list: self,
            index: 0,
        }
    }
}

/// An iterator over [`RecordList`] values.
#[allow(missing_debug_implementations)]
pub struct RecordListIter<'a, T: RecordListItem<'a>> {
    list: RecordList<'a, T>,
    index: u16,
}

impl<'a, T: RecordListItem<'a>> Iterator for RecordListIter<'a, T> {
    type Item = T;

    fn next(&mut self) -> Option<Self::Item> {
        if self.index < self.list.len() {
            self.index += 1;
            self.list.get(self.index - 1)
        } else {
            None
        }
    }
}

/// A list of [`Script`] records.
pub type ScriptList<'a> = RecordList<'a, Script<'a>>;
/// A list of [`LanguageSystem`] records.
pub type LanguageSystemList<'a> = RecordList<'a, LanguageSystem<'a>>;
/// A list of [`Feature`] records.
pub type FeatureList<'a> = RecordList<'a, Feature<'a>>;

#[derive(Clone, Copy, Debug)]
struct TagRecord {
    tag: Tag,
    offset: Offset16,
}

impl FromData for TagRecord {
    const SIZE: usize = 6;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(Self {
            tag: s.read::<Tag>()?,
            offset: s.read::<Offset16>()?,
        })
    }
}

/// A [Script Table](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#script-table-and-language-system-record).
#[derive(Clone, Copy, Debug)]
pub struct Script<'a> {
    /// Script tag.
    pub tag: Tag,
    /// Default language.
    pub default_language: Option<LanguageSystem<'a>>,
    /// List of supported languages, excluding the default one. Listed alphabetically.
    pub languages: LanguageSystemList<'a>,
}

impl<'a> RecordListItem<'a> for Script<'a> {
    fn parse(tag: Tag, data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);

        let mut default_language = None;
        if let Some(offset) = s.read::<Option<Offset16>>()? {
            default_language =
                LanguageSystem::parse(Tag::from_bytes(b"dflt"), data.get(offset.to_usize()..)?);
        }

        let mut languages = RecordList::parse(s.tail()?)?;
        // Offsets are relative to this table.
        languages.data = data;

        Some(Self {
            tag,
            default_language,
            languages,
        })
    }
}

/// A [Language System Table](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#language-system-table).
#[derive(Clone, Copy, Debug)]
pub struct LanguageSystem<'a> {
    /// Language tag.
    pub tag: Tag,
    /// Index of a feature required for this language system.
    pub required_feature: Option<FeatureIndex>,
    /// Array of indices into the FeatureList, in arbitrary order.
    pub feature_indices: LazyArray16<'a, FeatureIndex>,
}

impl<'a> RecordListItem<'a> for LanguageSystem<'a> {
    fn parse(tag: Tag, data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let _lookup_order = s.read::<Offset16>()?; // Unsupported.
        let required_feature = match s.read::<FeatureIndex>()? {
            0xFFFF => None,
            v => Some(v),
        };
        let count = s.read::<u16>()?;
        let feature_indices = s.read_array16(count)?;
        Some(Self {
            tag,
            required_feature,
            feature_indices,
        })
    }
}

/// A [Feature](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#feature-table).
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub struct Feature<'a> {
    pub tag: Tag,
    pub lookup_indices: LazyArray16<'a, LookupIndex>,
}

impl<'a> RecordListItem<'a> for Feature<'a> {
    fn parse(tag: Tag, data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let _params_offset = s.read::<Offset16>()?; // Unsupported.
        let count = s.read::<u16>()?;
        let lookup_indices = s.read_array16(count)?;
        Some(Self {
            tag,
            lookup_indices,
        })
    }
}

ttf-parser-0.24.1/src/ggg/lookup.rs

use crate::parser::{
    FromData, FromSlice, LazyArray16, LazyOffsetArray16, Offset, Offset16, Offset32, Stream,
};

/// A list of [`Lookup`] values.
pub type LookupList<'a> = LazyOffsetArray16<'a, Lookup<'a>>;

/// A [Lookup Table](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#lookup-table).
#[derive(Clone, Copy, Debug)]
pub struct Lookup<'a> {
    /// Lookup qualifiers.
    pub flags: LookupFlags,
    /// Available subtables.
    pub subtables: LookupSubtables<'a>,
    /// Index into GDEF mark glyph sets structure.
    pub mark_filtering_set: Option<u16>,
}

impl<'a> FromSlice<'a> for Lookup<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let kind = s.read::<u16>()?;
        let flags = s.read::<LookupFlags>()?;
        let count = s.read::<u16>()?;
        let offsets = s.read_array16(count)?;

        let mut mark_filtering_set: Option<u16> = None;
        if flags.use_mark_filtering_set() {
            mark_filtering_set = Some(s.read::<u16>()?);
        }

        Some(Self {
            flags,
            subtables: LookupSubtables {
                kind,
                data,
                offsets,
            },
            mark_filtering_set,
        })
    }
}

/// A trait for parsing Lookup subtables.
///
/// Internal use only.
pub trait LookupSubtable<'a>: Sized {
    /// Parses raw data.
    fn parse(data: &'a [u8], kind: u16) -> Option<Self>;
}

/// A list of lookup subtables.
#[derive(Clone, Copy)]
pub struct LookupSubtables<'a> {
    kind: u16,
    data: &'a [u8],
    offsets: LazyArray16<'a, Offset16>,
}

impl core::fmt::Debug for LookupSubtables<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "LookupSubtables {{ ... }}")
    }
}

impl<'a> LookupSubtables<'a> {
    /// Returns a number of items in the LookupSubtables.
    #[inline]
    pub fn len(&self) -> u16 {
        self.offsets.len()
    }

    /// Checks if there are any items.
    pub fn is_empty(&self) -> bool {
        self.offsets.is_empty()
    }

    /// Parses a subtable at index.
    ///
    /// Accepts either
    /// [`PositioningSubtable`](crate::gpos::PositioningSubtable)
    /// or [`SubstitutionSubtable`](crate::gsub::SubstitutionSubtable).
    ///
    /// Technically, we can enforce it at compile time, but it makes code too convoluted.
    pub fn get<T: LookupSubtable<'a>>(&self, index: u16) -> Option<T> {
        let offset = self.offsets.get(index)?.to_usize();
        let data = self.data.get(offset..)?;
        T::parse(data, self.kind)
    }

    /// Creates an iterator over subtables.
    ///
    /// We cannot use `IntoIterator` here, because we have to use user-provided base type.
    #[allow(clippy::should_implement_trait)]
    pub fn into_iter<T: LookupSubtable<'a>>(self) -> LookupSubtablesIter<'a, T> {
        LookupSubtablesIter {
            data: self,
            index: 0,
            data_type: core::marker::PhantomData,
        }
    }
}

/// An iterator over lookup subtables.
#[allow(missing_debug_implementations)]
pub struct LookupSubtablesIter<'a, T: LookupSubtable<'a>> {
    data: LookupSubtables<'a>,
    index: u16,
    data_type: core::marker::PhantomData<T>,
}

impl<'a, T: LookupSubtable<'a>> Iterator for LookupSubtablesIter<'a, T> {
    type Item = T;

    fn next(&mut self) -> Option<Self::Item> {
        if self.index < self.data.len() {
            self.index += 1;
            self.data.get(self.index - 1)
        } else {
            None
        }
    }
}

/// Lookup table flags.
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub struct LookupFlags(pub u16);

#[rustfmt::skip]
#[allow(missing_docs)]
impl LookupFlags {
    #[inline] pub fn right_to_left(self) -> bool { self.0 & 0x0001 != 0 }
    #[inline] pub fn ignore_base_glyphs(self) -> bool { self.0 & 0x0002 != 0 }
    #[inline] pub fn ignore_ligatures(self) -> bool { self.0 & 0x0004 != 0 }
    #[inline] pub fn ignore_marks(self) -> bool { self.0 & 0x0008 != 0 }
    #[inline] pub fn ignore_flags(self) -> bool { self.0 & 0x000E != 0 }
    #[inline] pub fn use_mark_filtering_set(self) -> bool { self.0 & 0x0010 != 0 }
    #[inline] pub fn mark_attachment_type(self) -> u8 { ((self.0 & 0xFF00) >> 8) as u8 }
}

impl FromData for LookupFlags {
    const SIZE: usize = 2;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        u16::parse(data).map(Self)
    }
}

pub(crate) fn parse_extension_lookup<'a, T: 'a>(
    data: &'a [u8],
    parse: impl FnOnce(&'a [u8], u16) -> Option<T>,
) -> Option<T> {
    let mut s = Stream::new(data);
    let format = s.read::<u16>()?;
    match format {
        1 => {
            let kind = s.read::<u16>()?;
            let offset = s.read::<Offset32>()?.to_usize();
            parse(data.get(offset..)?, kind)
        }
        _ => None,
    }
}

ttf-parser-0.24.1/src/ggg/mod.rs

//! Common data types used by GDEF/GPOS/GSUB tables.
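Since `Coverage::Format2` resolves glyphs through a binary search over sorted, non-overlapping glyph ranges, a standalone sketch of that lookup may help. This is illustrative only, not part of the crate: the hypothetical helper `coverage_index` works on plain `(start, end, first_index)` tuples instead of parsed `RangeRecord`s.

```rust
// Illustrative sketch of the range-record lookup used by Coverage Format 2.
// Each record is (start_glyph, end_glyph, coverage_index_of_start); records
// are sorted and non-overlapping, so lookup is O(log n).
fn coverage_index(records: &[(u16, u16, u16)], glyph: u16) -> Option<u16> {
    let i = records
        .binary_search_by(|&(start, end, _)| {
            if glyph < start {
                core::cmp::Ordering::Greater
            } else if glyph <= end {
                core::cmp::Ordering::Equal
            } else {
                core::cmp::Ordering::Less
            }
        })
        .ok()?;
    let (start, _, value) = records[i];
    // The record stores the coverage index of its first glyph;
    // the rest of the range follows consecutively.
    value.checked_add(glyph - start)
}

fn main() {
    // Two ranges: glyphs 10..=19 map to indices 0..=9, glyphs 40..=44 to 10..=14.
    let records = [(10u16, 19u16, 0u16), (40, 44, 10)];
    assert_eq!(coverage_index(&records, 12), Some(2));
    assert_eq!(coverage_index(&records, 41), Some(11));
    assert_eq!(coverage_index(&records, 30), None);
}
```

The same comparator shape (returning `Greater` when the glyph precedes the range) is what lets a single `binary_search_by` answer "which range contains this glyph" rather than "which element equals this glyph".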
// A heavily modified port of https://github.com/RazrFalcon/rustybuzz implementation
// originally written by https://github.com/laurmaedje

use crate::parser::{FromData, FromSlice, LazyArray16, Stream};
use crate::GlyphId;

mod chained_context;
mod context;
#[cfg(feature = "variable-fonts")]
mod feature_variations;
mod layout_table;
mod lookup;

pub use chained_context::*;
pub use context::*;
#[cfg(feature = "variable-fonts")]
pub use feature_variations::*;
pub use layout_table::*;
pub use lookup::*;

/// A record that describes a range of glyph IDs.
#[derive(Clone, Copy, Debug)]
pub struct RangeRecord {
    /// First glyph ID in the range
    pub start: GlyphId,
    /// Last glyph ID in the range
    pub end: GlyphId,
    /// Coverage Index of first glyph ID in range.
    pub value: u16,
}

impl LazyArray16<'_, RangeRecord> {
    /// Returns a [`RangeRecord`] for a glyph.
    pub fn range(&self, glyph: GlyphId) -> Option<RangeRecord> {
        self.binary_search_by(|record| {
            if glyph < record.start {
                core::cmp::Ordering::Greater
            } else if glyph <= record.end {
                core::cmp::Ordering::Equal
            } else {
                core::cmp::Ordering::Less
            }
        })
        .map(|p| p.1)
    }
}

impl FromData for RangeRecord {
    const SIZE: usize = 6;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(RangeRecord {
            start: s.read::<GlyphId>()?,
            end: s.read::<GlyphId>()?,
            value: s.read::<u16>()?,
        })
    }
}

/// A [Coverage Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#coverage-table).
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub enum Coverage<'a> {
    Format1 {
        /// Array of glyph IDs. Sorted.
        glyphs: LazyArray16<'a, GlyphId>,
    },
    Format2 {
        /// Array of glyph ranges. Ordered by `RangeRecord.start`.
        records: LazyArray16<'a, RangeRecord>,
    },
}

impl<'a> FromSlice<'a> for Coverage<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()? {
            1 => {
                let count = s.read::<u16>()?;
                let glyphs = s.read_array16(count)?;
                Some(Self::Format1 { glyphs })
            }
            2 => {
                let count = s.read::<u16>()?;
                let records = s.read_array16(count)?;
                Some(Self::Format2 { records })
            }
            _ => None,
        }
    }
}

impl<'a> Coverage<'a> {
    /// Checks that glyph is present.
    pub fn contains(&self, glyph: GlyphId) -> bool {
        self.get(glyph).is_some()
    }

    /// Returns the coverage index of the glyph or `None` if it is not covered.
    pub fn get(&self, glyph: GlyphId) -> Option<u16> {
        match self {
            Self::Format1 { glyphs } => glyphs.binary_search(&glyph).map(|p| p.0),
            Self::Format2 { records } => {
                let record = records.range(glyph)?;
                let offset = glyph.0 - record.start.0;
                record.value.checked_add(offset)
            }
        }
    }
}

/// A value of [Class Definition Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#class-definition-table).
pub type Class = u16;

/// A [Class Definition Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#class-definition-table).
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub enum ClassDefinition<'a> {
    Format1 {
        start: GlyphId,
        classes: LazyArray16<'a, Class>,
    },
    Format2 {
        records: LazyArray16<'a, RangeRecord>,
    },
    Empty,
}

impl<'a> ClassDefinition<'a> {
    #[inline]
    pub(crate) fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()? {
            1 => {
                let start = s.read::<GlyphId>()?;
                let count = s.read::<u16>()?;
                let classes = s.read_array16(count)?;
                Some(Self::Format1 { start, classes })
            }
            2 => {
                let count = s.read::<u16>()?;
                let records = s.read_array16(count)?;
                Some(Self::Format2 { records })
            }
            _ => None,
        }
    }

    /// Returns the glyph class of the glyph (zero if it is not defined).
    pub fn get(&self, glyph: GlyphId) -> Class {
        match self {
            Self::Format1 { start, classes } => glyph
                .0
                .checked_sub(start.0)
                .and_then(|index| classes.get(index)),
            Self::Format2 { records } => records.range(glyph).map(|record| record.value),
            Self::Empty => Some(0),
        }
        .unwrap_or(0)
    }
}

ttf-parser-0.24.1/src/language.rs

#[rustfmt::skip]
static TABLE: &[(u16, Language, &str, &str)] = &[
    (0x0000, Language::Unknown, "Unknown", "Unknown"),
    (0x0436, Language::Afrikaans_SouthAfrica, "Afrikaans", "South Africa"),
    (0x041C, Language::Albanian_Albania, "Albanian", "Albania"),
    (0x0484, Language::Alsatian_France, "Alsatian", "France"),
    (0x045E, Language::Amharic_Ethiopia, "Amharic", "Ethiopia"),
    (0x1401, Language::Arabic_Algeria, "Arabic", "Algeria"),
    (0x3C01, Language::Arabic_Bahrain, "Arabic", "Bahrain"),
    (0x0C01, Language::Arabic_Egypt, "Arabic", "Egypt"),
    (0x0801, Language::Arabic_Iraq, "Arabic", "Iraq"),
    (0x2C01, Language::Arabic_Jordan, "Arabic", "Jordan"),
    (0x3401, Language::Arabic_Kuwait, "Arabic", "Kuwait"),
    (0x3001, Language::Arabic_Lebanon, "Arabic", "Lebanon"),
    (0x1001, Language::Arabic_Libya, "Arabic", "Libya"),
    (0x1801, Language::Arabic_Morocco, "Arabic", "Morocco"),
    (0x2001, Language::Arabic_Oman, "Arabic", "Oman"),
    (0x4001, Language::Arabic_Qatar, "Arabic", "Qatar"),
    (0x0401, Language::Arabic_SaudiArabia, "Arabic", "Saudi Arabia"),
    (0x2801, Language::Arabic_Syria, "Arabic", "Syria"),
    (0x1C01, Language::Arabic_Tunisia, "Arabic", "Tunisia"),
    (0x3801, Language::Arabic_UAE, "Arabic", "U.A.E."),
    (0x2401, Language::Arabic_Yemen, "Arabic", "Yemen"),
    (0x042B, Language::Armenian_Armenia, "Armenian", "Armenia"),
    (0x044D, Language::Assamese_India, "Assamese", "India"),
    (0x082C, Language::Azeri_Cyrillic_Azerbaijan, "Azeri (Cyrillic)", "Azerbaijan"),
    (0x042C, Language::Azeri_Latin_Azerbaijan, "Azeri (Latin)", "Azerbaijan"),
    (0x046D, Language::Bashkir_Russia, "Bashkir", "Russia"),
    (0x042D,
Language::Basque_Basque, "Basque", "Basque"), (0x0423, Language::Belarusian_Belarus, "Belarusian", "Belarus"), (0x0845, Language::Bengali_Bangladesh, "Bengali", "Bangladesh"), (0x0445, Language::Bengali_India, "Bengali", "India"), (0x201A, Language::Bosnian_Cyrillic_BosniaAndHerzegovina, "Bosnian (Cyrillic)", "Bosnia and Herzegovina"), (0x141A, Language::Bosnian_Latin_BosniaAndHerzegovina, "Bosnian (Latin)", "Bosnia and Herzegovina"), (0x047E, Language::Breton_France, "Breton", "France"), (0x0402, Language::Bulgarian_Bulgaria, "Bulgarian", "Bulgaria"), (0x0403, Language::Catalan_Catalan, "Catalan", "Catalan"), (0x0C04, Language::Chinese_HongKongSAR, "Chinese", "Hong Kong S.A.R."), (0x1404, Language::Chinese_MacaoSAR, "Chinese", "Macao S.A.R."), (0x0804, Language::Chinese_PeoplesRepublicOfChina, "Chinese", "People's Republic of China"), (0x1004, Language::Chinese_Singapore, "Chinese", "Singapore"), (0x0404, Language::Chinese_Taiwan, "Chinese", "Taiwan"), (0x0483, Language::Corsican_France, "Corsican", "France"), (0x041A, Language::Croatian_Croatia, "Croatian", "Croatia"), (0x101A, Language::Croatian_Latin_BosniaAndHerzegovina, "Croatian (Latin)", "Bosnia and Herzegovina"), (0x0405, Language::Czech_CzechRepublic, "Czech", "Czech Republic"), (0x0406, Language::Danish_Denmark, "Danish", "Denmark"), (0x048C, Language::Dari_Afghanistan, "Dari", "Afghanistan"), (0x0465, Language::Divehi_Maldives, "Divehi", "Maldives"), (0x0813, Language::Dutch_Belgium, "Dutch", "Belgium"), (0x0413, Language::Dutch_Netherlands, "Dutch", "Netherlands"), (0x0C09, Language::English_Australia, "English", "Australia"), (0x2809, Language::English_Belize, "English", "Belize"), (0x1009, Language::English_Canada, "English", "Canada"), (0x2409, Language::English_Caribbean, "English", "Caribbean"), (0x4009, Language::English_India, "English", "India"), (0x1809, Language::English_Ireland, "English", "Ireland"), (0x2009, Language::English_Jamaica, "English", "Jamaica"), (0x4409, 
Language::English_Malaysia, "English", "Malaysia"), (0x1409, Language::English_NewZealand, "English", "New Zealand"), (0x3409, Language::English_RepublicOfThePhilippines, "English", "Republic of the Philippines"), (0x4809, Language::English_Singapore, "English", "Singapore"), (0x1C09, Language::English_SouthAfrica, "English", "South Africa"), (0x2C09, Language::English_TrinidadAndTobago, "English", "Trinidad and Tobago"), (0x0809, Language::English_UnitedKingdom, "English", "United Kingdom"), (0x0409, Language::English_UnitedStates, "English", "United States"), (0x3009, Language::English_Zimbabwe, "English", "Zimbabwe"), (0x0425, Language::Estonian_Estonia, "Estonian", "Estonia"), (0x0438, Language::Faroese_FaroeIslands, "Faroese", "Faroe Islands"), (0x0464, Language::Filipino_Philippines, "Filipino", "Philippines"), (0x040B, Language::Finnish_Finland, "Finnish", "Finland"), (0x080C, Language::French_Belgium, "French", "Belgium"), (0x0C0C, Language::French_Canada, "French", "Canada"), (0x040C, Language::French_France, "French", "France"), (0x140c, Language::French_Luxembourg, "French", "Luxembourg"), (0x180C, Language::French_PrincipalityOfMonaco, "French", "Principality of Monaco"), (0x100C, Language::French_Switzerland, "French", "Switzerland"), (0x0462, Language::Frisian_Netherlands, "Frisian", "Netherlands"), (0x0456, Language::Galician_Galician, "Galician", "Galician"), (0x0437, Language::Georgian_Georgia, "Georgian", "Georgia"), (0x0C07, Language::German_Austria, "German", "Austria"), (0x0407, Language::German_Germany, "German", "Germany"), (0x1407, Language::German_Liechtenstein, "German", "Liechtenstein"), (0x1007, Language::German_Luxembourg, "German", "Luxembourg"), (0x0807, Language::German_Switzerland, "German", "Switzerland"), (0x0408, Language::Greek_Greece, "Greek", "Greece"), (0x046F, Language::Greenlandic_Greenland, "Greenlandic", "Greenland"), (0x0447, Language::Gujarati_India, "Gujarati", "India"), (0x0468, Language::Hausa_Latin_Nigeria, "Hausa 
(Latin)", "Nigeria"), (0x040D, Language::Hebrew_Israel, "Hebrew", "Israel"), (0x0439, Language::Hindi_India, "Hindi", "India"), (0x040E, Language::Hungarian_Hungary, "Hungarian", "Hungary"), (0x040F, Language::Icelandic_Iceland, "Icelandic", "Iceland"), (0x0470, Language::Igbo_Nigeria, "Igbo", "Nigeria"), (0x0421, Language::Indonesian_Indonesia, "Indonesian", "Indonesia"), (0x045D, Language::Inuktitut_Canada, "Inuktitut", "Canada"), (0x085D, Language::Inuktitut_Latin_Canada, "Inuktitut (Latin)", "Canada"), (0x083C, Language::Irish_Ireland, "Irish", "Ireland"), (0x0434, Language::isiXhosa_SouthAfrica, "isiXhosa", "South Africa"), (0x0435, Language::isiZulu_SouthAfrica, "isiZulu", "South Africa"), (0x0410, Language::Italian_Italy, "Italian", "Italy"), (0x0810, Language::Italian_Switzerland, "Italian", "Switzerland"), (0x0411, Language::Japanese_Japan, "Japanese", "Japan"), (0x044B, Language::Kannada_India, "Kannada", "India"), (0x043F, Language::Kazakh_Kazakhstan, "Kazakh", "Kazakhstan"), (0x0453, Language::Khmer_Cambodia, "Khmer", "Cambodia"), (0x0486, Language::Kiche_Guatemala, "K'iche", "Guatemala"), (0x0487, Language::Kinyarwanda_Rwanda, "Kinyarwanda", "Rwanda"), (0x0441, Language::Kiswahili_Kenya, "Kiswahili", "Kenya"), (0x0457, Language::Konkani_India, "Konkani", "India"), (0x0412, Language::Korean_Korea, "Korean", "Korea"), (0x0440, Language::Kyrgyz_Kyrgyzstan, "Kyrgyz", "Kyrgyzstan"), (0x0454, Language::Lao_LaoPDR, "Lao", "Lao P.D.R."), (0x0426, Language::Latvian_Latvia, "Latvian", "Latvia"), (0x0427, Language::Lithuanian_Lithuania, "Lithuanian", "Lithuania"), (0x082E, Language::LowerSorbian_Germany, "Lower Sorbian", "Germany"), (0x046E, Language::Luxembourgish_Luxembourg, "Luxembourgish", "Luxembourg"), (0x042F, Language::Macedonian_NorthMacedonia, "Macedonian", "North Macedonia"), (0x083E, Language::Malay_BruneiDarussalam, "Malay", "Brunei Darussalam"), (0x043E, Language::Malay_Malaysia, "Malay", "Malaysia"), (0x044C, Language::Malayalam_India, "Malayalam", 
"India"), (0x043A, Language::Maltese_Malta, "Maltese", "Malta"), (0x0481, Language::Maori_NewZealand, "Maori", "New Zealand"), (0x047A, Language::Mapudungun_Chile, "Mapudungun", "Chile"), (0x044E, Language::Marathi_India, "Marathi", "India"), (0x047C, Language::Mohawk_Mohawk, "Mohawk", "Mohawk"), (0x0450, Language::Mongolian_Cyrillic_Mongolia, "Mongolian (Cyrillic)", "Mongolia"), (0x0850, Language::Mongolian_Traditional_PeoplesRepublicOfChina, "Mongolian (Traditional)", "People's Republic of China"), (0x0461, Language::Nepali_Nepal, "Nepali", "Nepal"), (0x0414, Language::Norwegian_Bokmal_Norway, "Norwegian (Bokmal)", "Norway"), (0x0814, Language::Norwegian_Nynorsk_Norway, "Norwegian (Nynorsk)", "Norway"), (0x0482, Language::Occitan_France, "Occitan", "France"), (0x0448, Language::Odia_India, "Odia (formerly Oriya)", "India"), (0x0463, Language::Pashto_Afghanistan, "Pashto", "Afghanistan"), (0x0415, Language::Polish_Poland, "Polish", "Poland"), (0x0416, Language::Portuguese_Brazil, "Portuguese", "Brazil"), (0x0816, Language::Portuguese_Portugal, "Portuguese", "Portugal"), (0x0446, Language::Punjabi_India, "Punjabi", "India"), (0x046B, Language::Quechua_Bolivia, "Quechua", "Bolivia"), (0x086B, Language::Quechua_Ecuador, "Quechua", "Ecuador"), (0x0C6B, Language::Quechua_Peru, "Quechua", "Peru"), (0x0418, Language::Romanian_Romania, "Romanian", "Romania"), (0x0417, Language::Romansh_Switzerland, "Romansh", "Switzerland"), (0x0419, Language::Russian_Russia, "Russian", "Russia"), (0x243B, Language::Sami_Inari_Finland, "Sami (Inari)", "Finland"), (0x103B, Language::Sami_Lule_Norway, "Sami (Lule)", "Norway"), (0x143B, Language::Sami_Lule_Sweden, "Sami (Lule)", "Sweden"), (0x0C3B, Language::Sami_Northern_Finland, "Sami (Northern)", "Finland"), (0x043B, Language::Sami_Northern_Norway, "Sami (Northern)", "Norway"), (0x083B, Language::Sami_Northern_Sweden, "Sami (Northern)", "Sweden"), (0x203B, Language::Sami_Skolt_Finland, "Sami (Skolt)", "Finland"), (0x183B, 
Language::Sami_Southern_Norway, "Sami (Southern)", "Norway"), (0x1C3B, Language::Sami_Southern_Sweden, "Sami (Southern)", "Sweden"), (0x044F, Language::Sanskrit_India, "Sanskrit", "India"), (0x1C1A, Language::Serbian_Cyrillic_BosniaAndHerzegovina, "Serbian (Cyrillic)", "Bosnia and Herzegovina"), (0x0C1A, Language::Serbian_Cyrillic_Serbia, "Serbian (Cyrillic)", "Serbia"), (0x181A, Language::Serbian_Latin_BosniaAndHerzegovina, "Serbian (Latin)", "Bosnia and Herzegovina"), (0x081A, Language::Serbian_Latin_Serbia, "Serbian (Latin)", "Serbia"), (0x046C, Language::SesothoSaLeboa_SouthAfrica, "Sesotho sa Leboa", "South Africa"), (0x0432, Language::Setswana_SouthAfrica, "Setswana", "South Africa"), (0x045B, Language::Sinhala_SriLanka, "Sinhala", "Sri Lanka"), (0x041B, Language::Slovak_Slovakia, "Slovak", "Slovakia"), (0x0424, Language::Slovenian_Slovenia, "Slovenian", "Slovenia"), (0x2C0A, Language::Spanish_Argentina, "Spanish", "Argentina"), (0x400A, Language::Spanish_Bolivia, "Spanish", "Bolivia"), (0x340A, Language::Spanish_Chile, "Spanish", "Chile"), (0x240A, Language::Spanish_Colombia, "Spanish", "Colombia"), (0x140A, Language::Spanish_CostaRica, "Spanish", "Costa Rica"), (0x1C0A, Language::Spanish_DominicanRepublic, "Spanish", "Dominican Republic"), (0x300A, Language::Spanish_Ecuador, "Spanish", "Ecuador"), (0x440A, Language::Spanish_ElSalvador, "Spanish", "El Salvador"), (0x100A, Language::Spanish_Guatemala, "Spanish", "Guatemala"), (0x480A, Language::Spanish_Honduras, "Spanish", "Honduras"), (0x080A, Language::Spanish_Mexico, "Spanish", "Mexico"), (0x4C0A, Language::Spanish_Nicaragua, "Spanish", "Nicaragua"), (0x180A, Language::Spanish_Panama, "Spanish", "Panama"), (0x3C0A, Language::Spanish_Paraguay, "Spanish", "Paraguay"), (0x280A, Language::Spanish_Peru, "Spanish", "Peru"), (0x500A, Language::Spanish_PuertoRico, "Spanish", "Puerto Rico"), (0x0C0A, Language::Spanish_ModernSort_Spain, "Spanish (Modern Sort)", "Spain"), (0x040A, 
Language::Spanish_TraditionalSort_Spain, "Spanish (Traditional Sort)", "Spain"), (0x540A, Language::Spanish_UnitedStates, "Spanish", "United States"), (0x380A, Language::Spanish_Uruguay, "Spanish", "Uruguay"), (0x200A, Language::Spanish_Venezuela, "Spanish", "Venezuela"), (0x081D, Language::Swedish_Finland, "Swedish", "Finland"), (0x041D, Language::Swedish_Sweden, "Swedish", "Sweden"), (0x045A, Language::Syriac_Syria, "Syriac", "Syria"), (0x0428, Language::Tajik_Cyrillic_Tajikistan, "Tajik (Cyrillic)", "Tajikistan"), (0x085F, Language::Tamazight_Latin_Algeria, "Tamazight (Latin)", "Algeria"), (0x0449, Language::Tamil_India, "Tamil", "India"), (0x0444, Language::Tatar_Russia, "Tatar", "Russia"), (0x044A, Language::Telugu_India, "Telugu", "India"), (0x041E, Language::Thai_Thailand, "Thai", "Thailand"), (0x0451, Language::Tibetan_PRC, "Tibetan", "PRC"), (0x041F, Language::Turkish_Turkey, "Turkish", "Turkey"), (0x0442, Language::Turkmen_Turkmenistan, "Turkmen", "Turkmenistan"), (0x0480, Language::Uighur_PRC, "Uighur", "PRC"), (0x0422, Language::Ukrainian_Ukraine, "Ukrainian", "Ukraine"), (0x042E, Language::UpperSorbian_Germany, "Upper Sorbian", "Germany"), (0x0420, Language::Urdu_IslamicRepublicOfPakistan, "Urdu", "Islamic Republic of Pakistan"), (0x0843, Language::Uzbek_Cyrillic_Uzbekistan, "Uzbek (Cyrillic)", "Uzbekistan"), (0x0443, Language::Uzbek_Latin_Uzbekistan, "Uzbek (Latin)", "Uzbekistan"), (0x042A, Language::Vietnamese_Vietnam, "Vietnamese", "Vietnam"), (0x0452, Language::Welsh_UnitedKingdom, "Welsh", "United Kingdom"), (0x0488, Language::Wolof_Senegal, "Wolof", "Senegal"), (0x0485, Language::Yakut_Russia, "Yakut", "Russia"), (0x0478, Language::Yi_PRC, "Yi", "PRC"), (0x046A, Language::Yoruba_Nigeria, "Yoruba", "Nigeria"), ]; /// A [`Name`](crate::name::Name) language. /// /// Consists of Language + Region pairs. 
/// /// #[allow(missing_docs)] #[allow(non_camel_case_types)] #[derive(Clone, Copy, PartialEq, Eq, Debug)] pub enum Language { Unknown = 0, Afrikaans_SouthAfrica, Albanian_Albania, Alsatian_France, Amharic_Ethiopia, Arabic_Algeria, Arabic_Bahrain, Arabic_Egypt, Arabic_Iraq, Arabic_Jordan, Arabic_Kuwait, Arabic_Lebanon, Arabic_Libya, Arabic_Morocco, Arabic_Oman, Arabic_Qatar, Arabic_SaudiArabia, Arabic_Syria, Arabic_Tunisia, Arabic_UAE, Arabic_Yemen, Armenian_Armenia, Assamese_India, Azeri_Cyrillic_Azerbaijan, Azeri_Latin_Azerbaijan, Bashkir_Russia, Basque_Basque, Belarusian_Belarus, Bengali_Bangladesh, Bengali_India, Bosnian_Cyrillic_BosniaAndHerzegovina, Bosnian_Latin_BosniaAndHerzegovina, Breton_France, Bulgarian_Bulgaria, Catalan_Catalan, Chinese_HongKongSAR, Chinese_MacaoSAR, Chinese_PeoplesRepublicOfChina, Chinese_Singapore, Chinese_Taiwan, Corsican_France, Croatian_Croatia, Croatian_Latin_BosniaAndHerzegovina, Czech_CzechRepublic, Danish_Denmark, Dari_Afghanistan, Divehi_Maldives, Dutch_Belgium, Dutch_Netherlands, English_Australia, English_Belize, English_Canada, English_Caribbean, English_India, English_Ireland, English_Jamaica, English_Malaysia, English_NewZealand, English_RepublicOfThePhilippines, English_Singapore, English_SouthAfrica, English_TrinidadAndTobago, English_UnitedKingdom, English_UnitedStates, English_Zimbabwe, Estonian_Estonia, Faroese_FaroeIslands, Filipino_Philippines, Finnish_Finland, French_Belgium, French_Canada, French_France, French_Luxembourg, French_PrincipalityOfMonaco, French_Switzerland, Frisian_Netherlands, Galician_Galician, Georgian_Georgia, German_Austria, German_Germany, German_Liechtenstein, German_Luxembourg, German_Switzerland, Greek_Greece, Greenlandic_Greenland, Gujarati_India, Hausa_Latin_Nigeria, Hebrew_Israel, Hindi_India, Hungarian_Hungary, Icelandic_Iceland, Igbo_Nigeria, Indonesian_Indonesia, Inuktitut_Canada, Inuktitut_Latin_Canada, Irish_Ireland, isiXhosa_SouthAfrica, isiZulu_SouthAfrica, Italian_Italy, 
Italian_Switzerland, Japanese_Japan, Kannada_India, Kazakh_Kazakhstan, Khmer_Cambodia, Kiche_Guatemala, Kinyarwanda_Rwanda, Kiswahili_Kenya, Konkani_India, Korean_Korea, Kyrgyz_Kyrgyzstan, Lao_LaoPDR, Latvian_Latvia, Lithuanian_Lithuania, LowerSorbian_Germany, Luxembourgish_Luxembourg, Macedonian_NorthMacedonia, Malay_BruneiDarussalam, Malay_Malaysia, Malayalam_India, Maltese_Malta, Maori_NewZealand, Mapudungun_Chile, Marathi_India, Mohawk_Mohawk, Mongolian_Cyrillic_Mongolia, Mongolian_Traditional_PeoplesRepublicOfChina, Nepali_Nepal, Norwegian_Bokmal_Norway, Norwegian_Nynorsk_Norway, Occitan_France, Odia_India, Pashto_Afghanistan, Polish_Poland, Portuguese_Brazil, Portuguese_Portugal, Punjabi_India, Quechua_Bolivia, Quechua_Ecuador, Quechua_Peru, Romanian_Romania, Romansh_Switzerland, Russian_Russia, Sami_Inari_Finland, Sami_Lule_Norway, Sami_Lule_Sweden, Sami_Northern_Finland, Sami_Northern_Norway, Sami_Northern_Sweden, Sami_Skolt_Finland, Sami_Southern_Norway, Sami_Southern_Sweden, Sanskrit_India, Serbian_Cyrillic_BosniaAndHerzegovina, Serbian_Cyrillic_Serbia, Serbian_Latin_BosniaAndHerzegovina, Serbian_Latin_Serbia, SesothoSaLeboa_SouthAfrica, Setswana_SouthAfrica, Sinhala_SriLanka, Slovak_Slovakia, Slovenian_Slovenia, Spanish_Argentina, Spanish_Bolivia, Spanish_Chile, Spanish_Colombia, Spanish_CostaRica, Spanish_DominicanRepublic, Spanish_Ecuador, Spanish_ElSalvador, Spanish_Guatemala, Spanish_Honduras, Spanish_Mexico, Spanish_Nicaragua, Spanish_Panama, Spanish_Paraguay, Spanish_Peru, Spanish_PuertoRico, Spanish_ModernSort_Spain, Spanish_TraditionalSort_Spain, Spanish_UnitedStates, Spanish_Uruguay, Spanish_Venezuela, Swedish_Finland, Swedish_Sweden, Syriac_Syria, Tajik_Cyrillic_Tajikistan, Tamazight_Latin_Algeria, Tamil_India, Tatar_Russia, Telugu_India, Thai_Thailand, Tibetan_PRC, Turkish_Turkey, Turkmen_Turkmenistan, Uighur_PRC, Ukrainian_Ukraine, UpperSorbian_Germany, Urdu_IslamicRepublicOfPakistan, Uzbek_Cyrillic_Uzbekistan, Uzbek_Latin_Uzbekistan, 
Vietnamese_Vietnam, Welsh_UnitedKingdom, Wolof_Senegal, Yakut_Russia, Yi_PRC, Yoruba_Nigeria, } impl Language { pub(crate) fn windows_language(id: u16) -> Self { if let Some(index) = TABLE.iter().position(|v| v.0 == id) { TABLE[index].1 } else { Self::Unknown } } /// Returns the primary language. pub fn primary_language(&self) -> &'static str { TABLE[*self as usize].2 } /// Returns a language region. pub fn region(&self) -> &'static str { TABLE[*self as usize].3 } } impl core::fmt::Display for Language { fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result { write!(f, "{} ({})", self.primary_language(), self.region()) } } ttf-parser-0.24.1/src/lib.rs000064400000000000000000002336741046102023000137210ustar 00000000000000/*! A high-level, safe, zero-allocation font parser for: * [TrueType](https://docs.microsoft.com/en-us/typography/truetype/), * [OpenType](https://docs.microsoft.com/en-us/typography/opentype/spec/), and * [AAT](https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6AATIntro.html) fonts. Font parsing starts with a [`Face`]. ## Features - A high-level API for most common properties, hiding all parsing and data resolving logic. - A low-level, but safe API to access TrueType tables data. - Highly configurable. You can disable most of the features, reducing binary size. You can also parse TrueType tables separately, without loading the whole font/face. - Zero heap allocations. - Zero unsafe. - Zero dependencies. - `no_std`/WASM compatible. - Fast. - Stateless. All parsing methods are immutable. - Simple and maintainable code (no magic numbers). ## Safety - The library must not panic. Any panic considered as a critical bug and should be reported. - The library forbids unsafe code. - No heap allocations, so crash due to OOM is not possible. - All recursive methods have a depth limit. - Technically, should use less than 64KiB of stack in worst case scenario. - Most of arithmetic operations are checked. 
- Most of numeric casts are checked. */ #![no_std] #![forbid(unsafe_code)] #![warn(missing_docs)] #![warn(missing_copy_implementations)] #![warn(missing_debug_implementations)] #![allow(clippy::get_first)] // we use it for readability #![allow(clippy::identity_op)] // we use it for readability #![allow(clippy::too_many_arguments)] #![allow(clippy::collapsible_else_if)] #![allow(clippy::field_reassign_with_default)] #![allow(clippy::upper_case_acronyms)] #![allow(clippy::bool_assert_comparison)] #[cfg(feature = "std")] #[macro_use] extern crate std; #[cfg(not(any(feature = "std", feature = "no-std-float")))] compile_error!("You have to activate either the `std` or the `no-std-float` feature."); #[cfg(not(feature = "std"))] use core_maths::CoreFloat; #[cfg(feature = "apple-layout")] mod aat; #[cfg(feature = "variable-fonts")] mod delta_set; #[cfg(feature = "opentype-layout")] mod ggg; mod language; mod parser; mod tables; #[cfg(feature = "variable-fonts")] mod var_store; use head::IndexToLocationFormat; pub use parser::{Fixed, FromData, LazyArray16, LazyArray32, LazyArrayIter16, LazyArrayIter32}; use parser::{NumFrom, Offset, Offset32, Stream, TryNumFrom}; #[cfg(feature = "variable-fonts")] pub use fvar::VariationAxis; pub use language::Language; pub use name::{name_id, PlatformId}; pub use os2::{Permissions, ScriptMetrics, Style, UnicodeRanges, Weight, Width}; pub use tables::CFFError; #[cfg(feature = "apple-layout")] pub use tables::{ankr, feat, kerx, morx, trak}; #[cfg(feature = "variable-fonts")] pub use tables::{avar, cff2, fvar, gvar, hvar, mvar, vvar}; pub use tables::{cbdt, cblc, cff1 as cff, vhea}; pub use tables::{ cmap, colr, cpal, glyf, head, hhea, hmtx, kern, loca, maxp, name, os2, post, sbix, svg, vorg, }; #[cfg(feature = "opentype-layout")] pub use tables::{gdef, gpos, gsub, math}; #[cfg(feature = "opentype-layout")] pub mod opentype_layout { //! This module contains //! 
[OpenType Layout](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#overview) //! supplementary tables implementation. pub use crate::ggg::*; } #[cfg(feature = "apple-layout")] pub mod apple_layout { //! This module contains //! [Apple Advanced Typography Layout]( //! https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6AATIntro.html) //! supplementary tables implementation. pub use crate::aat::*; } /// A type-safe wrapper for glyph ID. #[repr(transparent)] #[derive(Clone, Copy, Ord, PartialOrd, Eq, PartialEq, Default, Debug, Hash)] pub struct GlyphId(pub u16); impl FromData for GlyphId { const SIZE: usize = 2; #[inline] fn parse(data: &[u8]) -> Option<Self> { u16::parse(data).map(GlyphId) } } /// A TrueType font magic. /// /// https://docs.microsoft.com/en-us/typography/opentype/spec/otff#organization-of-an-opentype-font #[derive(Clone, Copy, PartialEq, Debug)] enum Magic { TrueType, OpenType, FontCollection, } impl FromData for Magic { const SIZE: usize = 4; #[inline] fn parse(data: &[u8]) -> Option<Self> { match u32::parse(data)? { 0x00010000 | 0x74727565 => Some(Magic::TrueType), 0x4F54544F => Some(Magic::OpenType), 0x74746366 => Some(Magic::FontCollection), _ => None, } } } /// A variation coordinate in a normalized coordinate system. /// /// Basically any number in a -1.0..1.0 range. /// Where 0 is a default value. /// /// The number is stored as f2.14 #[repr(transparent)] #[derive(Clone, Copy, PartialEq, Eq, Default, Debug)] pub struct NormalizedCoordinate(i16); impl From<i16> for NormalizedCoordinate { /// Creates a new coordinate. /// /// The provided number will be clamped to the -16384..16384 range. #[inline] fn from(n: i16) -> Self { NormalizedCoordinate(parser::i16_bound(-16384, n, 16384)) } } impl From<f32> for NormalizedCoordinate { /// Creates a new coordinate. /// /// The provided number will be clamped to the -1.0..1.0 range.
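/// /// # Example /// /// A small sketch of the f32-to-f2.14 conversion: /// /// ``` /// let c = ttf_parser::NormalizedCoordinate::from(0.5f32); /// assert_eq!(c.get(), 8192); /// ```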
#[inline] fn from(n: f32) -> Self { NormalizedCoordinate((parser::f32_bound(-1.0, n, 1.0) * 16384.0) as i16) } } impl NormalizedCoordinate { /// Returns the coordinate value as f2.14. #[inline] pub fn get(self) -> i16 { self.0 } } /// A font variation value. /// /// # Example /// /// ``` /// use ttf_parser::{Variation, Tag}; /// /// Variation { axis: Tag::from_bytes(b"wght"), value: 500.0 }; /// ``` #[derive(Clone, Copy, PartialEq, Debug)] pub struct Variation { /// An axis tag name. pub axis: Tag, /// An axis value. pub value: f32, } /// A 4-byte tag. #[repr(transparent)] #[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash)] pub struct Tag(pub u32); impl Tag { /// Creates a `Tag` from bytes. /// /// # Example /// /// ```rust /// println!("{}", ttf_parser::Tag::from_bytes(b"name")); /// ``` #[inline] pub const fn from_bytes(bytes: &[u8; 4]) -> Self { Tag(((bytes[0] as u32) << 24) | ((bytes[1] as u32) << 16) | ((bytes[2] as u32) << 8) | (bytes[3] as u32)) } /// Creates a `Tag` from bytes. /// /// In case of empty data will return `Tag` set to 0. /// /// When `bytes` are shorter than 4, will set missing bytes to ` `. /// /// Data after first 4 bytes is ignored. #[inline] pub fn from_bytes_lossy(bytes: &[u8]) -> Self { if bytes.is_empty() { return Tag::from_bytes(&[0, 0, 0, 0]); } let mut iter = bytes.iter().cloned().chain(core::iter::repeat(b' ')); Tag::from_bytes(&[ iter.next().unwrap(), iter.next().unwrap(), iter.next().unwrap(), iter.next().unwrap(), ]) } /// Returns tag as 4-element byte array. #[inline] pub const fn to_bytes(self) -> [u8; 4] { [ (self.0 >> 24 & 0xff) as u8, (self.0 >> 16 & 0xff) as u8, (self.0 >> 8 & 0xff) as u8, (self.0 >> 0 & 0xff) as u8, ] } /// Returns tag as 4-element byte array. #[inline] pub const fn to_chars(self) -> [char; 4] { [ (self.0 >> 24 & 0xff) as u8 as char, (self.0 >> 16 & 0xff) as u8 as char, (self.0 >> 8 & 0xff) as u8 as char, (self.0 >> 0 & 0xff) as u8 as char, ] } /// Checks if tag is null / `[0, 0, 0, 0]`. 
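/// /// # Example /// /// ``` /// assert!(ttf_parser::Tag(0).is_null()); /// assert!(!ttf_parser::Tag::from_bytes(b"name").is_null()); /// ```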
#[inline] pub const fn is_null(&self) -> bool { self.0 == 0 } /// Returns tag value as `u32` number. #[inline] pub const fn as_u32(&self) -> u32 { self.0 } } impl core::fmt::Debug for Tag { #[inline] fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result { write!(f, "Tag({})", self) } } impl core::fmt::Display for Tag { #[inline] fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result { let b = self.to_chars(); write!( f, "{}{}{}{}", b.get(0).unwrap_or(&' '), b.get(1).unwrap_or(&' '), b.get(2).unwrap_or(&' '), b.get(3).unwrap_or(&' ') ) } } impl FromData for Tag { const SIZE: usize = 4; #[inline] fn parse(data: &[u8]) -> Option<Self> { u32::parse(data).map(Tag) } } /// A line metrics. /// /// Used for underline and strikeout. #[repr(C)] #[derive(Clone, Copy, PartialEq, Eq, Debug)] pub struct LineMetrics { /// Line position. pub position: i16, /// Line thickness. pub thickness: i16, } /// A rectangle. /// /// Doesn't guarantee that `x_min` <= `x_max` and/or `y_min` <= `y_max`. #[repr(C)] #[allow(missing_docs)] #[derive(Clone, Copy, PartialEq, Eq, Debug)] pub struct Rect { pub x_min: i16, pub y_min: i16, pub x_max: i16, pub y_max: i16, } impl Rect { #[inline] fn zero() -> Self { Self { x_min: 0, y_min: 0, x_max: 0, y_max: 0, } } /// Returns rect's width. #[inline] pub fn width(&self) -> i16 { self.x_max - self.x_min } /// Returns rect's height. #[inline] pub fn height(&self) -> i16 { self.y_max - self.y_min } } /// A rectangle described by the left-lower and upper-right points. #[derive(Clone, Copy, Debug, PartialEq)] pub struct RectF { /// The horizontal minimum of the rect. pub x_min: f32, /// The vertical minimum of the rect. pub y_min: f32, /// The horizontal maximum of the rect. pub x_max: f32, /// The vertical maximum of the rect.
pub y_max: f32, } impl RectF { #[inline] fn new() -> Self { RectF { x_min: f32::MAX, y_min: f32::MAX, x_max: f32::MIN, y_max: f32::MIN, } } #[inline] fn is_default(&self) -> bool { self.x_min == f32::MAX && self.y_min == f32::MAX && self.x_max == f32::MIN && self.y_max == f32::MIN } #[inline] fn extend_by(&mut self, x: f32, y: f32) { self.x_min = self.x_min.min(x); self.y_min = self.y_min.min(y); self.x_max = self.x_max.max(x); self.y_max = self.y_max.max(y); } #[inline] fn to_rect(self) -> Option<Rect> { Some(Rect { x_min: i16::try_num_from(self.x_min)?, y_min: i16::try_num_from(self.y_min)?, x_max: i16::try_num_from(self.x_max)?, y_max: i16::try_num_from(self.y_max)?, }) } } /// An affine transform. #[derive(Clone, Copy, PartialEq)] pub struct Transform { /// The 'a' component of the transform. pub a: f32, /// The 'b' component of the transform. pub b: f32, /// The 'c' component of the transform. pub c: f32, /// The 'd' component of the transform. pub d: f32, /// The 'e' component of the transform. pub e: f32, /// The 'f' component of the transform. pub f: f32, } impl Transform { /// Creates a new transform with the specified components. #[inline] pub fn new(a: f32, b: f32, c: f32, d: f32, e: f32, f: f32) -> Self { Transform { a, b, c, d, e, f } } /// Creates a new translation transform. #[inline] pub fn new_translate(tx: f32, ty: f32) -> Self { Transform::new(1.0, 0.0, 0.0, 1.0, tx, ty) } /// Creates a new rotation transform. #[inline] pub fn new_rotate(angle: f32) -> Self { let cc = (angle * core::f32::consts::PI).cos(); let ss = (angle * core::f32::consts::PI).sin(); Transform::new(cc, ss, -ss, cc, 0.0, 0.0) } /// Creates a new skew transform. #[inline] pub fn new_skew(skew_x: f32, skew_y: f32) -> Self { let x = (skew_x * core::f32::consts::PI).tan(); let y = (skew_y * core::f32::consts::PI).tan(); Transform::new(1.0, y, -x, 1.0, 0.0, 0.0) } /// Creates a new scale transform.
#[inline] pub fn new_scale(sx: f32, sy: f32) -> Self { Transform::new(sx, 0.0, 0.0, sy, 0.0, 0.0) } /// Combines two transforms with each other. #[inline] pub fn combine(ts1: Self, ts2: Self) -> Self { Transform { a: ts1.a * ts2.a + ts1.c * ts2.b, b: ts1.b * ts2.a + ts1.d * ts2.b, c: ts1.a * ts2.c + ts1.c * ts2.d, d: ts1.b * ts2.c + ts1.d * ts2.d, e: ts1.a * ts2.e + ts1.c * ts2.f + ts1.e, f: ts1.b * ts2.e + ts1.d * ts2.f + ts1.f, } } #[inline] fn apply_to(&self, x: &mut f32, y: &mut f32) { let tx = *x; let ty = *y; *x = self.a * tx + self.c * ty + self.e; *y = self.b * tx + self.d * ty + self.f; } /// Checks whether a transform is the identity transform. #[inline] pub fn is_default(&self) -> bool { // A direct float comparison is fine in our case. self.a == 1.0 && self.b == 0.0 && self.c == 0.0 && self.d == 1.0 && self.e == 0.0 && self.f == 0.0 } } impl Default for Transform { #[inline] fn default() -> Self { Transform { a: 1.0, b: 0.0, c: 0.0, d: 1.0, e: 0.0, f: 0.0, } } } impl core::fmt::Debug for Transform { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!( f, "Transform({} {} {} {} {} {})", self.a, self.b, self.c, self.d, self.e, self.f ) } } /// A float point. #[derive(Clone, Copy, Debug)] pub struct PointF { /// The X-axis coordinate. pub x: f32, /// The Y-axis coordinate. pub y: f32, } /// Phantom points. /// /// Available only for variable fonts with the `gvar` table. #[derive(Clone, Copy, Debug)] pub struct PhantomPoints { /// Left side bearing point. pub left: PointF, /// Right side bearing point. pub right: PointF, /// Top side bearing point. pub top: PointF, /// Bottom side bearing point. pub bottom: PointF, } /// A RGBA color in the sRGB color space. #[allow(missing_docs)] #[derive(Clone, Copy, PartialEq, Eq, Debug)] pub struct RgbaColor { pub red: u8, pub green: u8, pub blue: u8, pub alpha: u8, } impl RgbaColor { /// Creates a new `RgbaColor`. 
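/// /// # Example /// /// ``` /// let c = ttf_parser::RgbaColor::new(90, 195, 255, 255); /// assert_eq!(c.green, 195); /// ```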
#[inline] pub fn new(red: u8, green: u8, blue: u8, alpha: u8) -> Self { Self { blue, green, red, alpha, } } pub(crate) fn apply_alpha(&mut self, alpha: f32) { self.alpha = (((f32::from(self.alpha) / 255.0) * alpha) * 255.0) as u8; } } /// A trait for glyph outline construction. pub trait OutlineBuilder { /// Appends a MoveTo segment. /// /// Start of a contour. fn move_to(&mut self, x: f32, y: f32); /// Appends a LineTo segment. fn line_to(&mut self, x: f32, y: f32); /// Appends a QuadTo segment. fn quad_to(&mut self, x1: f32, y1: f32, x: f32, y: f32); /// Appends a CurveTo segment. fn curve_to(&mut self, x1: f32, y1: f32, x2: f32, y2: f32, x: f32, y: f32); /// Appends a ClosePath segment. /// /// End of a contour. fn close(&mut self); } struct DummyOutline; impl OutlineBuilder for DummyOutline { fn move_to(&mut self, _: f32, _: f32) {} fn line_to(&mut self, _: f32, _: f32) {} fn quad_to(&mut self, _: f32, _: f32, _: f32, _: f32) {} fn curve_to(&mut self, _: f32, _: f32, _: f32, _: f32, _: f32, _: f32) {} fn close(&mut self) {} } /// A glyph raster image format. #[allow(missing_docs)] #[derive(Clone, Copy, PartialEq, Eq, Debug)] pub enum RasterImageFormat { PNG, /// A monochrome bitmap. /// /// The most significant bit of the first byte corresponds to the top-left pixel, proceeding /// through succeeding bits moving left to right. The data for each row is padded to a byte /// boundary, so the next row begins with the most significant bit of a new byte. 1 corresponds /// to black, and 0 to white. BitmapMono, /// A packed monochrome bitmap. /// /// The most significant bit of the first byte corresponds to the top-left pixel, proceeding /// through succeeding bits moving left to right. Data is tightly packed with no padding. 1 /// corresponds to black, and 0 to white. BitmapMonoPacked, /// A grayscale bitmap with 2 bits per pixel. 
/// /// The most significant bits of the first byte corresponds to the top-left pixel, proceeding /// through succeeding bits moving left to right. The data for each row is padded to a byte /// boundary, so the next row begins with the most significant bit of a new byte. BitmapGray2, /// A packed grayscale bitmap with 2 bits per pixel. /// /// The most significant bits of the first byte corresponds to the top-left pixel, proceeding /// through succeeding bits moving left to right. Data is tightly packed with no padding. BitmapGray2Packed, /// A grayscale bitmap with 4 bits per pixel. /// /// The most significant bits of the first byte corresponds to the top-left pixel, proceeding /// through succeeding bits moving left to right. The data for each row is padded to a byte /// boundary, so the next row begins with the most significant bit of a new byte. BitmapGray4, /// A packed grayscale bitmap with 4 bits per pixel. /// /// The most significant bits of the first byte corresponds to the top-left pixel, proceeding /// through succeeding bits moving left to right. Data is tightly packed with no padding. BitmapGray4Packed, /// A grayscale bitmap with 8 bits per pixel. /// /// The first byte corresponds to the top-left pixel, proceeding through succeeding bytes /// moving left to right. BitmapGray8, /// A color bitmap with 32 bits per pixel. /// /// The first group of four bytes corresponds to the top-left pixel, proceeding through /// succeeding pixels moving left to right. Each byte corresponds to a color channel and the /// channels within a pixel are in blue, green, red, alpha order. Color values are /// pre-multiplied by the alpha. For example, the color "full-green with half translucency" /// is encoded as `\x00\x80\x00\x80`, and not `\x00\xFF\x00\x80`. BitmapPremulBgra32, } /// A glyph's raster image. /// /// Note, that glyph metrics are in pixels and not in font units. 
#[derive(Clone, Copy, PartialEq, Eq, Debug)] pub struct RasterGlyphImage<'a> { /// Horizontal offset. pub x: i16, /// Vertical offset. pub y: i16, /// Image width. /// /// It doesn't guarantee that this value is the same as set in the `data`. pub width: u16, /// Image height. /// /// It doesn't guarantee that this value is the same as set in the `data`. pub height: u16, /// A pixels per em of the selected strike. pub pixels_per_em: u16, /// An image format. pub format: RasterImageFormat, /// A raw image data. It's up to the caller to decode it. pub data: &'a [u8], } /// A raw table record. #[derive(Clone, Copy, Debug)] #[allow(missing_docs)] pub struct TableRecord { pub tag: Tag, #[allow(dead_code)] pub check_sum: u32, pub offset: u32, pub length: u32, } impl FromData for TableRecord { const SIZE: usize = 16; #[inline] fn parse(data: &[u8]) -> Option<Self> { let mut s = Stream::new(data); Some(TableRecord { tag: s.read::<Tag>()?, check_sum: s.read::<u32>()?, offset: s.read::<u32>()?, length: s.read::<u32>()?, }) } } #[cfg(feature = "variable-fonts")] const MAX_VAR_COORDS: usize = 64; #[cfg(feature = "variable-fonts")] #[derive(Clone)] struct VarCoords { data: [NormalizedCoordinate; MAX_VAR_COORDS], len: u8, } #[cfg(feature = "variable-fonts")] impl Default for VarCoords { fn default() -> Self { Self { data: [NormalizedCoordinate::default(); MAX_VAR_COORDS], len: u8::default(), } } } #[cfg(feature = "variable-fonts")] impl VarCoords { #[inline] fn as_slice(&self) -> &[NormalizedCoordinate] { &self.data[0..usize::from(self.len)] } #[inline] fn as_mut_slice(&mut self) -> &mut [NormalizedCoordinate] { let end = usize::from(self.len); &mut self.data[0..end] } } /// A list of font face parsing errors. #[derive(Clone, Copy, PartialEq, Eq, Debug)] pub enum FaceParsingError { /// An attempt to read out of bounds detected. /// /// Should occur only on malformed fonts. MalformedFont, /// Face data must start with `0x00010000`, `0x74727565`, `0x4F54544F` or `0x74746366`.
UnknownMagic, /// The face index is larger than the number of faces in the font. FaceIndexOutOfBounds, /// The `head` table is missing or malformed. NoHeadTable, /// The `hhea` table is missing or malformed. NoHheaTable, /// The `maxp` table is missing or malformed. NoMaxpTable, } impl core::fmt::Display for FaceParsingError { fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result { match self { FaceParsingError::MalformedFont => write!(f, "malformed font"), FaceParsingError::UnknownMagic => write!(f, "unknown magic"), FaceParsingError::FaceIndexOutOfBounds => write!(f, "face index is out of bounds"), FaceParsingError::NoHeadTable => write!(f, "the head table is missing or malformed"), FaceParsingError::NoHheaTable => write!(f, "the hhea table is missing or malformed"), FaceParsingError::NoMaxpTable => write!(f, "the maxp table is missing or malformed"), } } } #[cfg(feature = "std")] impl std::error::Error for FaceParsingError {} /// A raw font face. /// /// You are probably looking for [`Face`]. This is a low-level type. /// /// Unlike [`Face`], [`RawFace`] parses only face table records. /// Meaning all you can get from this type is a raw (`&[u8]`) data of a requested table. /// Then you can either parse just a single table from a font/face or populate [`RawFaceTables`] /// manually before passing it to [`Face::from_raw_tables`]. #[derive(Clone, Copy)] pub struct RawFace<'a> { /// The input font file data. pub data: &'a [u8], /// An array of table records. pub table_records: LazyArray16<'a, TableRecord>, } impl<'a> RawFace<'a> { /// Creates a new [`RawFace`] from raw data. /// /// `index` indicates the specific font face in a font collection. /// Use [`fonts_in_collection`] to get the total number of font faces. /// Set to 0 if unsure. /// /// While we do reuse [`FaceParsingError`], `No*Table` errors will not be thrown.
#[deprecated(since = "0.16.0", note = "use `parse` instead")] pub fn from_slice(data: &'a [u8], index: u32) -> Result<Self, FaceParsingError> { Self::parse(data, index) } /// Creates a new [`RawFace`] from raw data. /// /// `index` indicates the specific font face in a font collection. /// Use [`fonts_in_collection`] to get the total number of font faces. /// Set to 0 if unsure. /// /// While we do reuse [`FaceParsingError`], `No*Table` errors will not be thrown. pub fn parse(data: &'a [u8], index: u32) -> Result<Self, FaceParsingError> { // https://docs.microsoft.com/en-us/typography/opentype/spec/otff#organization-of-an-opentype-font let mut s = Stream::new(data); // Read **font** magic. let magic = s.read::<Magic>().ok_or(FaceParsingError::UnknownMagic)?; if magic == Magic::FontCollection { s.skip::<u32>(); // version let number_of_faces = s.read::<u32>().ok_or(FaceParsingError::MalformedFont)?; let offsets = s .read_array32::<Offset32>(number_of_faces) .ok_or(FaceParsingError::MalformedFont)?; let face_offset = offsets .get(index) .ok_or(FaceParsingError::FaceIndexOutOfBounds)?; // Face offset is from the start of the font data, // so we have to adjust it to the current parser offset. let face_offset = face_offset .to_usize() .checked_sub(s.offset()) .ok_or(FaceParsingError::MalformedFont)?; s.advance_checked(face_offset) .ok_or(FaceParsingError::MalformedFont)?; // Read **face** magic. // Each face in a font collection also starts with a magic. let magic = s.read::<Magic>().ok_or(FaceParsingError::UnknownMagic)?; // And face in a font collection can't be another collection.
if magic == Magic::FontCollection { return Err(FaceParsingError::UnknownMagic); } } else { // When reading from a regular font (not a collection) disallow index to be non-zero // Basically treat the font as a one-element collection if index != 0 { return Err(FaceParsingError::FaceIndexOutOfBounds); } } let num_tables = s.read::<u16>().ok_or(FaceParsingError::MalformedFont)?; s.advance(6); // searchRange (u16) + entrySelector (u16) + rangeShift (u16) let table_records = s .read_array16::<TableRecord>(num_tables) .ok_or(FaceParsingError::MalformedFont)?; Ok(RawFace { data, table_records, }) } /// Returns the raw data of a selected table. pub fn table(&self, tag: Tag) -> Option<&'a [u8]> { let (_, table) = self .table_records .binary_search_by(|record| record.tag.cmp(&tag))?; let offset = usize::num_from(table.offset); let length = usize::num_from(table.length); let end = offset.checked_add(length)?; self.data.get(offset..end) } } impl core::fmt::Debug for RawFace<'_> { fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result { write!(f, "RawFace {{ ... }}") } } /// A list of all supported tables as raw data. /// /// This type should be used in tandem with /// [`Face::from_raw_tables()`](struct.Face.html#method.from_raw_tables). /// /// This allows loading font faces not only from TrueType font files, /// but from any source. Mainly used for parsing WOFF. #[allow(missing_docs)] #[allow(missing_debug_implementations)] #[derive(Clone, Default)] pub struct RawFaceTables<'a> { // Mandatory tables.
pub head: &'a [u8], pub hhea: &'a [u8], pub maxp: &'a [u8], pub bdat: Option<&'a [u8]>, pub bloc: Option<&'a [u8]>, pub cbdt: Option<&'a [u8]>, pub cblc: Option<&'a [u8]>, pub cff: Option<&'a [u8]>, pub cmap: Option<&'a [u8]>, pub colr: Option<&'a [u8]>, pub cpal: Option<&'a [u8]>, pub ebdt: Option<&'a [u8]>, pub eblc: Option<&'a [u8]>, pub glyf: Option<&'a [u8]>, pub hmtx: Option<&'a [u8]>, pub kern: Option<&'a [u8]>, pub loca: Option<&'a [u8]>, pub name: Option<&'a [u8]>, pub os2: Option<&'a [u8]>, pub post: Option<&'a [u8]>, pub sbix: Option<&'a [u8]>, pub svg: Option<&'a [u8]>, pub vhea: Option<&'a [u8]>, pub vmtx: Option<&'a [u8]>, pub vorg: Option<&'a [u8]>, #[cfg(feature = "opentype-layout")] pub gdef: Option<&'a [u8]>, #[cfg(feature = "opentype-layout")] pub gpos: Option<&'a [u8]>, #[cfg(feature = "opentype-layout")] pub gsub: Option<&'a [u8]>, #[cfg(feature = "opentype-layout")] pub math: Option<&'a [u8]>, #[cfg(feature = "apple-layout")] pub ankr: Option<&'a [u8]>, #[cfg(feature = "apple-layout")] pub feat: Option<&'a [u8]>, #[cfg(feature = "apple-layout")] pub kerx: Option<&'a [u8]>, #[cfg(feature = "apple-layout")] pub morx: Option<&'a [u8]>, #[cfg(feature = "apple-layout")] pub trak: Option<&'a [u8]>, #[cfg(feature = "variable-fonts")] pub avar: Option<&'a [u8]>, #[cfg(feature = "variable-fonts")] pub cff2: Option<&'a [u8]>, #[cfg(feature = "variable-fonts")] pub fvar: Option<&'a [u8]>, #[cfg(feature = "variable-fonts")] pub gvar: Option<&'a [u8]>, #[cfg(feature = "variable-fonts")] pub hvar: Option<&'a [u8]>, #[cfg(feature = "variable-fonts")] pub mvar: Option<&'a [u8]>, #[cfg(feature = "variable-fonts")] pub vvar: Option<&'a [u8]>, } /// Parsed face tables. /// /// Unlike [`Face`], provides a low-level parsing abstraction over TrueType tables. /// Useful when you need a direct access to tables data. /// /// Also, used when high-level API is problematic to implement. /// A good example would be OpenType layout tables (GPOS/GSUB). 
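/// /// # Example /// /// A sketch of accessing parsed tables through [`Face::tables`] /// (assumes a `font.ttf` file is available): /// /// ```no_run /// let data = std::fs::read("font.ttf").unwrap(); /// let face = ttf_parser::Face::parse(&data, 0).unwrap(); /// let units_per_em = face.tables().head.units_per_em; /// ```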
#[allow(missing_docs)] #[allow(missing_debug_implementations)] #[derive(Clone)] pub struct FaceTables<'a> { // Mandatory tables. pub head: head::Table, pub hhea: hhea::Table, pub maxp: maxp::Table, pub bdat: Option<cbdt::Table<'a>>, pub cbdt: Option<cbdt::Table<'a>>, pub cff: Option<cff::Table<'a>>, pub cmap: Option<cmap::Table<'a>>, pub colr: Option<colr::Table<'a>>, pub ebdt: Option<cbdt::Table<'a>>, pub glyf: Option<glyf::Table<'a>>, pub hmtx: Option<hmtx::Table<'a>>, pub kern: Option<kern::Table<'a>>, pub name: Option<name::Table<'a>>, pub os2: Option<os2::Table<'a>>, pub post: Option<post::Table<'a>>, pub sbix: Option<sbix::Table<'a>>, pub svg: Option<svg::Table<'a>>, pub vhea: Option<vhea::Table>, pub vmtx: Option<hmtx::Table<'a>>, pub vorg: Option<vorg::Table<'a>>, #[cfg(feature = "opentype-layout")] pub gdef: Option<gdef::Table<'a>>, #[cfg(feature = "opentype-layout")] pub gpos: Option<opentype_layout::LayoutTable<'a>>, #[cfg(feature = "opentype-layout")] pub gsub: Option<opentype_layout::LayoutTable<'a>>, #[cfg(feature = "opentype-layout")] pub math: Option<math::Table<'a>>, #[cfg(feature = "apple-layout")] pub ankr: Option<ankr::Table<'a>>, #[cfg(feature = "apple-layout")] pub feat: Option<feat::Table<'a>>, #[cfg(feature = "apple-layout")] pub kerx: Option<kerx::Table<'a>>, #[cfg(feature = "apple-layout")] pub morx: Option<morx::Table<'a>>, #[cfg(feature = "apple-layout")] pub trak: Option<trak::Table<'a>>, #[cfg(feature = "variable-fonts")] pub avar: Option<avar::Table<'a>>, #[cfg(feature = "variable-fonts")] pub cff2: Option<cff2::Table<'a>>, #[cfg(feature = "variable-fonts")] pub fvar: Option<fvar::Table<'a>>, #[cfg(feature = "variable-fonts")] pub gvar: Option<gvar::Table<'a>>, #[cfg(feature = "variable-fonts")] pub hvar: Option<hvar::Table<'a>>, #[cfg(feature = "variable-fonts")] pub mvar: Option<mvar::Table<'a>>, #[cfg(feature = "variable-fonts")] pub vvar: Option<vvar::Table<'a>>, } /// A font face. /// /// Provides a high-level API for working with TrueType fonts. /// If you're not familiar with how TrueType works internally, you should use this type. /// If you do know and want a bit more low-level access - checkout [`FaceTables`]. /// /// Note that `Face` doesn't own the font data and doesn't allocate anything in heap. /// Therefore you cannot "store" it. The idea is that you should parse the `Face` /// when needed, get required data and forget about it. /// That's why the initial parsing is highly optimized and should not become a bottleneck.
/// /// If you still want to store `Face` - checkout /// [owned_ttf_parser](https://crates.io/crates/owned_ttf_parser). Requires `unsafe`. /// /// While `Face` is technically copyable, we disallow it because it's almost 2KB big. #[derive(Clone)] pub struct Face<'a> { raw_face: RawFace<'a>, tables: FaceTables<'a>, // Parsed tables. #[cfg(feature = "variable-fonts")] coordinates: VarCoords, } impl<'a> Face<'a> { /// Creates a new [`Face`] from raw data. /// /// `index` indicates the specific font face in a font collection. /// Use [`fonts_in_collection`] to get the total number of font faces. /// Set to 0 if unsure. /// /// This method will do some parsing and sanitization, /// but in general can be considered free. No significant performance overhead. /// /// Required tables: `head`, `hhea` and `maxp`. /// /// If an optional table has invalid data it will be skipped. #[deprecated(since = "0.16.0", note = "use `parse` instead")] pub fn from_slice(data: &'a [u8], index: u32) -> Result<Self, FaceParsingError> { Self::parse(data, index) } /// Creates a new [`Face`] from raw data. /// /// `index` indicates the specific font face in a font collection. /// Use [`fonts_in_collection`] to get the total number of font faces. /// Set to 0 if unsure. /// /// This method will do some parsing and sanitization, /// but in general can be considered free. No significant performance overhead. /// /// Required tables: `head`, `hhea` and `maxp`. /// /// If an optional table has invalid data it will be skipped.
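/// /// # Example /// /// A minimal sketch (assumes a `font.ttf` file is available): /// /// ```no_run /// let data = std::fs::read("font.ttf").unwrap(); /// let face = ttf_parser::Face::parse(&data, 0).unwrap(); /// println!("glyphs: {}", face.number_of_glyphs()); /// ```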
pub fn parse(data: &'a [u8], index: u32) -> Result<Self, FaceParsingError> { let raw_face = RawFace::parse(data, index)?; let raw_tables = Self::collect_tables(raw_face); #[allow(unused_mut)] let mut face = Face { raw_face, #[cfg(feature = "variable-fonts")] coordinates: VarCoords::default(), tables: Self::parse_tables(raw_tables)?, }; #[cfg(feature = "variable-fonts")] { if let Some(ref fvar) = face.tables.fvar { face.coordinates.len = fvar.axes.len().min(MAX_VAR_COORDS as u16) as u8; } } Ok(face) } fn collect_tables(raw_face: RawFace<'a>) -> RawFaceTables<'a> { let mut tables = RawFaceTables::default(); for record in raw_face.table_records { let start = usize::num_from(record.offset); let end = match start.checked_add(usize::num_from(record.length)) { Some(v) => v, None => continue, }; let table_data = raw_face.data.get(start..end); match &record.tag.to_bytes() { b"bdat" => tables.bdat = table_data, b"bloc" => tables.bloc = table_data, b"CBDT" => tables.cbdt = table_data, b"CBLC" => tables.cblc = table_data, b"CFF " => tables.cff = table_data, #[cfg(feature = "variable-fonts")] b"CFF2" => tables.cff2 = table_data, b"COLR" => tables.colr = table_data, b"CPAL" => tables.cpal = table_data, b"EBDT" => tables.ebdt = table_data, b"EBLC" => tables.eblc = table_data, #[cfg(feature = "opentype-layout")] b"GDEF" => tables.gdef = table_data, #[cfg(feature = "opentype-layout")] b"GPOS" => tables.gpos = table_data, #[cfg(feature = "opentype-layout")] b"GSUB" => tables.gsub = table_data, #[cfg(feature = "opentype-layout")] b"MATH" => tables.math = table_data, #[cfg(feature = "variable-fonts")] b"HVAR" => tables.hvar = table_data, #[cfg(feature = "variable-fonts")] b"MVAR" => tables.mvar = table_data, b"OS/2" => tables.os2 = table_data, b"SVG " => tables.svg = table_data, b"VORG" => tables.vorg = table_data, #[cfg(feature = "variable-fonts")] b"VVAR" => tables.vvar = table_data, #[cfg(feature = "apple-layout")] b"ankr" => tables.ankr = table_data, #[cfg(feature = "variable-fonts")] b"avar" =>
tables.avar = table_data, b"cmap" => tables.cmap = table_data, #[cfg(feature = "apple-layout")] b"feat" => tables.feat = table_data, #[cfg(feature = "variable-fonts")] b"fvar" => tables.fvar = table_data, b"glyf" => tables.glyf = table_data, #[cfg(feature = "variable-fonts")] b"gvar" => tables.gvar = table_data, b"head" => tables.head = table_data.unwrap_or_default(), b"hhea" => tables.hhea = table_data.unwrap_or_default(), b"hmtx" => tables.hmtx = table_data, b"kern" => tables.kern = table_data, #[cfg(feature = "apple-layout")] b"kerx" => tables.kerx = table_data, b"loca" => tables.loca = table_data, b"maxp" => tables.maxp = table_data.unwrap_or_default(), #[cfg(feature = "apple-layout")] b"morx" => tables.morx = table_data, b"name" => tables.name = table_data, b"post" => tables.post = table_data, b"sbix" => tables.sbix = table_data, #[cfg(feature = "apple-layout")] b"trak" => tables.trak = table_data, b"vhea" => tables.vhea = table_data, b"vmtx" => tables.vmtx = table_data, _ => {} } } tables } /// Creates a new [`Face`] from provided [`RawFaceTables`]. 
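/// /// # Example /// /// A sketch of populating [`RawFaceTables`] by hand via [`RawFace`] /// (assumes a `font.ttf` file is available): /// /// ```no_run /// use ttf_parser::{Face, RawFace, RawFaceTables, Tag}; /// /// let data = std::fs::read("font.ttf").unwrap(); /// let raw = RawFace::parse(&data, 0).unwrap(); /// let mut tables = RawFaceTables::default(); /// tables.head = raw.table(Tag::from_bytes(b"head")).unwrap(); /// tables.hhea = raw.table(Tag::from_bytes(b"hhea")).unwrap(); /// tables.maxp = raw.table(Tag::from_bytes(b"maxp")).unwrap(); /// let face = Face::from_raw_tables(tables).unwrap(); /// ```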
    pub fn from_raw_tables(raw_tables: RawFaceTables<'a>) -> Result<Self, FaceParsingError> {
        #[allow(unused_mut)]
        let mut face = Face {
            raw_face: RawFace {
                data: &[],
                table_records: LazyArray16::default(),
            },
            #[cfg(feature = "variable-fonts")]
            coordinates: VarCoords::default(),
            tables: Self::parse_tables(raw_tables)?,
        };

        #[cfg(feature = "variable-fonts")]
        {
            if let Some(ref fvar) = face.tables.fvar {
                face.coordinates.len = fvar.axes.len().min(MAX_VAR_COORDS as u16) as u8;
            }
        }

        Ok(face)
    }

    fn parse_tables(raw_tables: RawFaceTables<'a>) -> Result<FaceTables<'a>, FaceParsingError> {
        let head = head::Table::parse(raw_tables.head).ok_or(FaceParsingError::NoHeadTable)?;
        let hhea = hhea::Table::parse(raw_tables.hhea).ok_or(FaceParsingError::NoHheaTable)?;
        let maxp = maxp::Table::parse(raw_tables.maxp).ok_or(FaceParsingError::NoMaxpTable)?;

        let hmtx = raw_tables.hmtx.and_then(|data| {
            hmtx::Table::parse(hhea.number_of_metrics, maxp.number_of_glyphs, data)
        });

        let vhea = raw_tables.vhea.and_then(vhea::Table::parse);
        let vmtx = if let Some(vhea) = vhea {
            raw_tables.vmtx.and_then(|data| {
                hmtx::Table::parse(vhea.number_of_metrics, maxp.number_of_glyphs, data)
            })
        } else {
            None
        };

        let loca = raw_tables.loca.and_then(|data| {
            loca::Table::parse(maxp.number_of_glyphs, head.index_to_location_format, data)
        });
        let glyf = if let Some(loca) = loca {
            raw_tables
                .glyf
                .and_then(|data| glyf::Table::parse(loca, data))
        } else {
            None
        };

        let bdat = if let Some(bloc) = raw_tables.bloc.and_then(cblc::Table::parse) {
            raw_tables
                .bdat
                .and_then(|data| cbdt::Table::parse(bloc, data))
        } else {
            None
        };

        let cbdt = if let Some(cblc) = raw_tables.cblc.and_then(cblc::Table::parse) {
            raw_tables
                .cbdt
                .and_then(|data| cbdt::Table::parse(cblc, data))
        } else {
            None
        };

        let ebdt = if let Some(eblc) = raw_tables.eblc.and_then(cblc::Table::parse) {
            raw_tables
                .ebdt
                .and_then(|data| cbdt::Table::parse(eblc, data))
        } else {
            None
        };

        let cpal = raw_tables.cpal.and_then(cpal::Table::parse);
        let colr = if let Some(cpal) = cpal {
            raw_tables
                .colr
                .and_then(|data| colr::Table::parse(cpal, data))
        } else {
            None
        };

        Ok(FaceTables {
            head,
            hhea,
            maxp,
            bdat,
            cbdt,
            cff: raw_tables.cff.and_then(cff::Table::parse),
            cmap: raw_tables.cmap.and_then(cmap::Table::parse),
            colr,
            ebdt,
            glyf,
            hmtx,
            kern: raw_tables.kern.and_then(kern::Table::parse),
            name: raw_tables.name.and_then(name::Table::parse),
            os2: raw_tables.os2.and_then(os2::Table::parse),
            post: raw_tables.post.and_then(post::Table::parse),
            sbix: raw_tables
                .sbix
                .and_then(|data| sbix::Table::parse(maxp.number_of_glyphs, data)),
            svg: raw_tables.svg.and_then(svg::Table::parse),
            vhea: raw_tables.vhea.and_then(vhea::Table::parse),
            vmtx,
            vorg: raw_tables.vorg.and_then(vorg::Table::parse),
            #[cfg(feature = "opentype-layout")]
            gdef: raw_tables.gdef.and_then(gdef::Table::parse),
            #[cfg(feature = "opentype-layout")]
            gpos: raw_tables
                .gpos
                .and_then(opentype_layout::LayoutTable::parse),
            #[cfg(feature = "opentype-layout")]
            gsub: raw_tables
                .gsub
                .and_then(opentype_layout::LayoutTable::parse),
            #[cfg(feature = "opentype-layout")]
            math: raw_tables.math.and_then(math::Table::parse),
            #[cfg(feature = "apple-layout")]
            ankr: raw_tables
                .ankr
                .and_then(|data| ankr::Table::parse(maxp.number_of_glyphs, data)),
            #[cfg(feature = "apple-layout")]
            feat: raw_tables.feat.and_then(feat::Table::parse),
            #[cfg(feature = "apple-layout")]
            kerx: raw_tables
                .kerx
                .and_then(|data| kerx::Table::parse(maxp.number_of_glyphs, data)),
            #[cfg(feature = "apple-layout")]
            morx: raw_tables
                .morx
                .and_then(|data| morx::Table::parse(maxp.number_of_glyphs, data)),
            #[cfg(feature = "apple-layout")]
            trak: raw_tables.trak.and_then(trak::Table::parse),
            #[cfg(feature = "variable-fonts")]
            avar: raw_tables.avar.and_then(avar::Table::parse),
            #[cfg(feature = "variable-fonts")]
            cff2: raw_tables.cff2.and_then(cff2::Table::parse),
            #[cfg(feature = "variable-fonts")]
            fvar: raw_tables.fvar.and_then(fvar::Table::parse),
            #[cfg(feature = "variable-fonts")]
            gvar: raw_tables.gvar.and_then(gvar::Table::parse),
            #[cfg(feature = "variable-fonts")]
            hvar: raw_tables.hvar.and_then(hvar::Table::parse),
            #[cfg(feature = "variable-fonts")]
            mvar: raw_tables.mvar.and_then(mvar::Table::parse),
            #[cfg(feature = "variable-fonts")]
            vvar: raw_tables.vvar.and_then(vvar::Table::parse),
        })
    }

    /// Returns low-level face tables.
    #[inline]
    pub fn tables(&self) -> &FaceTables<'a> {
        &self.tables
    }

    /// Returns the `RawFace` used to create this `Face`.
    ///
    /// Useful if you want to parse the data manually.
    ///
    /// Available only for faces created using [`Face::parse()`](struct.Face.html#method.parse).
    #[inline]
    pub fn raw_face(&self) -> &RawFace<'a> {
        &self.raw_face
    }

    /// Returns the raw data of a selected table.
    ///
    /// Useful if you want to parse the data manually.
    ///
    /// Available only for faces created using [`Face::parse()`](struct.Face.html#method.parse).
    #[deprecated(since = "0.16.0", note = "use `self.raw_face().table()` instead")]
    #[inline]
    pub fn table_data(&self, tag: Tag) -> Option<&'a [u8]> {
        self.raw_face.table(tag)
    }

    /// Returns a list of names.
    ///
    /// Contains the face name and other strings.
    #[inline]
    pub fn names(&self) -> name::Names<'a> {
        self.tables.name.unwrap_or_default().names
    }

    /// Checks that the face is marked as *Regular*.
    ///
    /// Returns `false` when the OS/2 table is not present.
    #[inline]
    pub fn is_regular(&self) -> bool {
        self.tables
            .os2
            .map(|s| s.style() == Style::Normal)
            .unwrap_or(false)
    }

    /// Checks that the face is marked as *Italic*.
    ///
    /// Returns `false` when the OS/2 table is not present.
    #[inline]
    pub fn is_italic(&self) -> bool {
        self.tables
            .os2
            .map(|s| s.style() == Style::Italic)
            .unwrap_or(false)
    }

    /// Checks that the face is marked as *Bold*.
    ///
    /// Returns `false` when the OS/2 table is not present.
    #[inline]
    pub fn is_bold(&self) -> bool {
        self.tables.os2.map(|os2| os2.is_bold()).unwrap_or(false)
    }

    /// Checks that the face is marked as *Oblique*.
    ///
    /// Returns `false` when the OS/2 table is not present or when its version is < 4.
    #[inline]
    pub fn is_oblique(&self) -> bool {
        self.tables
            .os2
            .map(|s| s.style() == Style::Oblique)
            .unwrap_or(false)
    }

    /// Returns the face style.
    #[inline]
    pub fn style(&self) -> Style {
        self.tables.os2.map(|os2| os2.style()).unwrap_or_default()
    }

    /// Checks that the face is marked as *Monospaced*.
    ///
    /// Returns `false` when the `post` table is not present.
    #[inline]
    pub fn is_monospaced(&self) -> bool {
        self.tables
            .post
            .map(|post| post.is_monospaced)
            .unwrap_or(false)
    }

    /// Checks that the face is variable.
    ///
    /// Simply checks for the presence of the `fvar` table.
    #[inline]
    pub fn is_variable(&self) -> bool {
        #[cfg(feature = "variable-fonts")]
        {
            // `fvar::Table::parse` already checked that `axisCount` is non-zero.
            self.tables.fvar.is_some()
        }

        #[cfg(not(feature = "variable-fonts"))]
        {
            false
        }
    }

    /// Returns the face's weight.
    ///
    /// Returns `Weight::Normal` when the OS/2 table is not present.
    #[inline]
    pub fn weight(&self) -> Weight {
        self.tables.os2.map(|os2| os2.weight()).unwrap_or_default()
    }

    /// Returns the face's width.
    ///
    /// Returns `Width::Normal` when the OS/2 table is not present or when the value is invalid.
    #[inline]
    pub fn width(&self) -> Width {
        self.tables.os2.map(|os2| os2.width()).unwrap_or_default()
    }

    /// Returns the face's italic angle.
    ///
    /// Returns `None` when the `post` table is not present.
    #[inline]
    pub fn italic_angle(&self) -> Option<f32> {
        self.tables.post.map(|table| table.italic_angle)
    }

    // Read https://github.com/freetype/freetype/blob/49270c17011491227ec7bd3fb73ede4f674aa065/src/sfnt/sfobjs.c#L1279
    // to learn more about the logic behind the following functions.

    /// Returns the horizontal face ascender.
    ///
    /// This method is affected by variation axes.
    #[inline]
    pub fn ascender(&self) -> i16 {
        if let Some(os_2) = self.tables.os2 {
            if os_2.use_typographic_metrics() {
                let value = os_2.typographic_ascender();
                return self.apply_metrics_variation(Tag::from_bytes(b"hasc"), value);
            }
        }

        let mut value = self.tables.hhea.ascender;
        if value == 0 {
            if let Some(os_2) = self.tables.os2 {
                value = os_2.typographic_ascender();
                if value == 0 {
                    value = os_2.windows_ascender();
                    value = self.apply_metrics_variation(Tag::from_bytes(b"hcla"), value);
                } else {
                    value = self.apply_metrics_variation(Tag::from_bytes(b"hasc"), value);
                }
            }
        }

        value
    }

    /// Returns the horizontal face descender.
    ///
    /// This method is affected by variation axes.
    #[inline]
    pub fn descender(&self) -> i16 {
        if let Some(os_2) = self.tables.os2 {
            if os_2.use_typographic_metrics() {
                let value = os_2.typographic_descender();
                return self.apply_metrics_variation(Tag::from_bytes(b"hdsc"), value);
            }
        }

        let mut value = self.tables.hhea.descender;
        if value == 0 {
            if let Some(os_2) = self.tables.os2 {
                value = os_2.typographic_descender();
                if value == 0 {
                    value = os_2.windows_descender();
                    value = self.apply_metrics_variation(Tag::from_bytes(b"hcld"), value);
                } else {
                    value = self.apply_metrics_variation(Tag::from_bytes(b"hdsc"), value);
                }
            }
        }

        value
    }

    /// Returns the face's height.
    ///
    /// This method is affected by variation axes.
    #[inline]
    pub fn height(&self) -> i16 {
        self.ascender() - self.descender()
    }

    /// Returns the horizontal face line gap.
    ///
    /// This method is affected by variation axes.
    #[inline]
    pub fn line_gap(&self) -> i16 {
        if let Some(os_2) = self.tables.os2 {
            if os_2.use_typographic_metrics() {
                let value = os_2.typographic_line_gap();
                return self.apply_metrics_variation(Tag::from_bytes(b"hlgp"), value);
            }
        }

        let mut value = self.tables.hhea.line_gap;
        // For the line gap, we have to check that the ascender or descender is 0,
        // not the line gap itself.
        if self.tables.hhea.ascender == 0 || self.tables.hhea.descender == 0 {
            if let Some(os_2) = self.tables.os2 {
                if os_2.typographic_ascender() != 0 || os_2.typographic_descender() != 0 {
                    value = os_2.typographic_line_gap();
                    value = self.apply_metrics_variation(Tag::from_bytes(b"hlgp"), value);
                } else {
                    value = 0;
                }
            }
        }

        value
    }

    /// Returns the horizontal typographic face ascender.
    ///
    /// Prefer `Face::ascender` unless you explicitly want this. This is a more
    /// low-level alternative.
    ///
    /// This method is affected by variation axes.
    ///
    /// Returns `None` when the OS/2 table is not present.
    #[inline]
    pub fn typographic_ascender(&self) -> Option<i16> {
        self.tables.os2.map(|table| {
            let v = table.typographic_ascender();
            self.apply_metrics_variation(Tag::from_bytes(b"hasc"), v)
        })
    }

    /// Returns the horizontal typographic face descender.
    ///
    /// Prefer `Face::descender` unless you explicitly want this. This is a more
    /// low-level alternative.
    ///
    /// This method is affected by variation axes.
    ///
    /// Returns `None` when the OS/2 table is not present.
    #[inline]
    pub fn typographic_descender(&self) -> Option<i16> {
        self.tables.os2.map(|table| {
            let v = table.typographic_descender();
            self.apply_metrics_variation(Tag::from_bytes(b"hdsc"), v)
        })
    }

    /// Returns the horizontal typographic face line gap.
    ///
    /// Prefer `Face::line_gap` unless you explicitly want this. This is a more
    /// low-level alternative.
    ///
    /// This method is affected by variation axes.
    ///
    /// Returns `None` when the OS/2 table is not present.
    #[inline]
    pub fn typographic_line_gap(&self) -> Option<i16> {
        self.tables.os2.map(|table| {
            let v = table.typographic_line_gap();
            self.apply_metrics_variation(Tag::from_bytes(b"hlgp"), v)
        })
    }

    /// Returns the vertical face ascender.
    ///
    /// This method is affected by variation axes.
    #[inline]
    pub fn vertical_ascender(&self) -> Option<i16> {
        self.tables
            .vhea
            .map(|vhea| vhea.ascender)
            .map(|v| self.apply_metrics_variation(Tag::from_bytes(b"vasc"), v))
    }

    /// Returns the vertical face descender.
    ///
    /// This method is affected by variation axes.
    #[inline]
    pub fn vertical_descender(&self) -> Option<i16> {
        self.tables
            .vhea
            .map(|vhea| vhea.descender)
            .map(|v| self.apply_metrics_variation(Tag::from_bytes(b"vdsc"), v))
    }

    /// Returns the vertical face height.
    ///
    /// This method is affected by variation axes.
    #[inline]
    pub fn vertical_height(&self) -> Option<i16> {
        Some(self.vertical_ascender()? - self.vertical_descender()?)
    }

    /// Returns the vertical face line gap.
    ///
    /// This method is affected by variation axes.
    #[inline]
    pub fn vertical_line_gap(&self) -> Option<i16> {
        self.tables
            .vhea
            .map(|vhea| vhea.line_gap)
            .map(|v| self.apply_metrics_variation(Tag::from_bytes(b"vlgp"), v))
    }

    /// Returns the face's units per EM.
    ///
    /// Guaranteed to be in the 16..=16384 range.
    #[inline]
    pub fn units_per_em(&self) -> u16 {
        self.tables.head.units_per_em
    }

    /// Returns the face's x height.
    ///
    /// This method is affected by variation axes.
    ///
    /// Returns `None` when the OS/2 table is not present or when its version is < 2.
    #[inline]
    pub fn x_height(&self) -> Option<i16> {
        self.tables
            .os2
            .and_then(|os_2| os_2.x_height())
            .map(|v| self.apply_metrics_variation(Tag::from_bytes(b"xhgt"), v))
    }

    /// Returns the face's capital height.
    ///
    /// This method is affected by variation axes.
    ///
    /// Returns `None` when the OS/2 table is not present or when its version is < 2.
    #[inline]
    pub fn capital_height(&self) -> Option<i16> {
        self.tables
            .os2
            .and_then(|os_2| os_2.capital_height())
            .map(|v| self.apply_metrics_variation(Tag::from_bytes(b"cpht"), v))
    }

    /// Returns the face's underline metrics.
    ///
    /// This method is affected by variation axes.
    ///
    /// Returns `None` when the `post` table is not present.
    #[inline]
    pub fn underline_metrics(&self) -> Option<LineMetrics> {
        let mut metrics = self.tables.post?.underline_metrics;
        if self.is_variable() {
            self.apply_metrics_variation_to(Tag::from_bytes(b"undo"), &mut metrics.position);
            self.apply_metrics_variation_to(Tag::from_bytes(b"unds"), &mut metrics.thickness);
        }
        Some(metrics)
    }

    /// Returns the face's strikeout metrics.
    ///
    /// This method is affected by variation axes.
    ///
    /// Returns `None` when the OS/2 table is not present.
    #[inline]
    pub fn strikeout_metrics(&self) -> Option<LineMetrics> {
        let mut metrics = self.tables.os2?.strikeout_metrics();
        if self.is_variable() {
            self.apply_metrics_variation_to(Tag::from_bytes(b"stro"), &mut metrics.position);
            self.apply_metrics_variation_to(Tag::from_bytes(b"strs"), &mut metrics.thickness);
        }
        Some(metrics)
    }

    /// Returns the face's subscript metrics.
    ///
    /// This method is affected by variation axes.
    ///
    /// Returns `None` when the OS/2 table is not present.
    #[inline]
    pub fn subscript_metrics(&self) -> Option<ScriptMetrics> {
        let mut metrics = self.tables.os2?.subscript_metrics();
        if self.is_variable() {
            self.apply_metrics_variation_to(Tag::from_bytes(b"sbxs"), &mut metrics.x_size);
            self.apply_metrics_variation_to(Tag::from_bytes(b"sbys"), &mut metrics.y_size);
            self.apply_metrics_variation_to(Tag::from_bytes(b"sbxo"), &mut metrics.x_offset);
            self.apply_metrics_variation_to(Tag::from_bytes(b"sbyo"), &mut metrics.y_offset);
        }
        Some(metrics)
    }

    /// Returns the face's superscript metrics.
    ///
    /// This method is affected by variation axes.
    ///
    /// Returns `None` when the OS/2 table is not present.
    #[inline]
    pub fn superscript_metrics(&self) -> Option<ScriptMetrics> {
        let mut metrics = self.tables.os2?.superscript_metrics();
        if self.is_variable() {
            self.apply_metrics_variation_to(Tag::from_bytes(b"spxs"), &mut metrics.x_size);
            self.apply_metrics_variation_to(Tag::from_bytes(b"spys"), &mut metrics.y_size);
            self.apply_metrics_variation_to(Tag::from_bytes(b"spxo"), &mut metrics.x_offset);
            self.apply_metrics_variation_to(Tag::from_bytes(b"spyo"), &mut metrics.y_offset);
        }
        Some(metrics)
    }

    /// Returns face permissions.
    ///
    /// Returns `None` in case of a malformed value.
    #[inline]
    pub fn permissions(&self) -> Option<Permissions> {
        self.tables.os2?.permissions()
    }

    /// Checks if the face allows embedding a subset, further restricted by [`Self::permissions`].
    #[inline]
    pub fn is_subsetting_allowed(&self) -> bool {
        self.tables
            .os2
            .map(|t| t.is_subsetting_allowed())
            .unwrap_or(false)
    }

    /// Checks if the face allows outline data to be embedded.
    ///
    /// If false, only bitmaps may be embedded in accordance with [`Self::permissions`].
    ///
    /// If the font contains no bitmaps and this flag is not set, it implies no embedding is allowed.
    #[inline]
    pub fn is_outline_embedding_allowed(&self) -> bool {
        self.tables
            .os2
            .map(|t| t.is_outline_embedding_allowed())
            .unwrap_or(false)
    }

    /// Returns [Unicode Ranges](https://docs.microsoft.com/en-us/typography/opentype/spec/os2#ur).
    #[inline]
    pub fn unicode_ranges(&self) -> UnicodeRanges {
        self.tables
            .os2
            .map(|t| t.unicode_ranges())
            .unwrap_or_default()
    }

    /// Returns the total number of glyphs in the face.
    ///
    /// Never zero.
    ///
    /// The value was already parsed, so this function doesn't involve any parsing.
    #[inline]
    pub fn number_of_glyphs(&self) -> u16 {
        self.tables.maxp.number_of_glyphs.get()
    }

    /// Resolves a Glyph ID for a code point.
    ///
    /// Returns `None` instead of `0` when the glyph is not found.
    ///
    /// All subtable formats except Mixed Coverage (8) are supported.
    ///
    /// If you need more low-level control, prefer `Face::tables().cmap`.
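    ///
    /// # Example
    ///
    /// A usage sketch, not part of the original docs; the font path is
    /// hypothetical:
    ///
    /// ```no_run
    /// let data = std::fs::read("font.ttf").unwrap();
    /// let face = ttf_parser::Face::parse(&data, 0).unwrap();
    /// // `glyph_index` walks the Unicode `cmap` subtables in order
    /// // and returns the first match, or `None` if the font has no
    /// // glyph for this code point.
    /// let glyph_id = face.glyph_index('A');
    /// ```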
    #[inline]
    pub fn glyph_index(&self, code_point: char) -> Option<GlyphId> {
        for subtable in self.tables.cmap?.subtables {
            if !subtable.is_unicode() {
                continue;
            }

            if let Some(id) = subtable.glyph_index(u32::from(code_point)) {
                return Some(id);
            }
        }

        None
    }

    /// Resolves a Glyph ID for a glyph name.
    ///
    /// Uses the `post` and `CFF` tables as sources.
    ///
    /// Returns `None` when no glyph is associated with `name`.
    #[cfg(feature = "glyph-names")]
    #[inline]
    pub fn glyph_index_by_name(&self, name: &str) -> Option<GlyphId> {
        if let Some(name) = self
            .tables
            .post
            .and_then(|post| post.glyph_index_by_name(name))
        {
            return Some(name);
        }

        if let Some(name) = self
            .tables
            .cff
            .as_ref()
            .and_then(|cff| cff.glyph_index_by_name(name))
        {
            return Some(name);
        }

        None
    }

    /// Resolves a variation of a Glyph ID from two code points.
    ///
    /// Implemented according to
    /// [Unicode Variation Sequences](
    /// https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-14-unicode-variation-sequences).
    ///
    /// Returns `None` instead of `0` when the glyph is not found.
    #[inline]
    pub fn glyph_variation_index(&self, code_point: char, variation: char) -> Option<GlyphId> {
        for subtable in self.tables.cmap?.subtables {
            if let cmap::Format::UnicodeVariationSequences(ref table) = subtable.format {
                return match table.glyph_index(u32::from(code_point), u32::from(variation))? {
                    cmap::GlyphVariationResult::Found(v) => Some(v),
                    cmap::GlyphVariationResult::UseDefault => self.glyph_index(code_point),
                };
            }
        }

        None
    }

    /// Returns the glyph's horizontal advance.
    ///
    /// This method is affected by variation axes.
    #[inline]
    pub fn glyph_hor_advance(&self, glyph_id: GlyphId) -> Option<u16> {
        #[cfg(feature = "variable-fonts")]
        {
            let mut advance = self.tables.hmtx?.advance(glyph_id)? as f32;

            if self.is_variable() {
                // Ignore the variation offset when `hvar` is not set.
                if let Some(hvar) = self.tables.hvar {
                    if let Some(offset) = hvar.advance_offset(glyph_id, self.coords()) {
                        // We can't use `round()` in `no_std`, so this is the next best thing.
                        advance += offset + 0.5;
                    }
                } else if let Some(points) = self.glyph_phantom_points(glyph_id) {
                    // We can't use `round()` in `no_std`, so this is the next best thing.
                    advance += points.right.x + 0.5
                }
            }

            u16::try_num_from(advance)
        }

        #[cfg(not(feature = "variable-fonts"))]
        {
            self.tables.hmtx?.advance(glyph_id)
        }
    }

    /// Returns the glyph's vertical advance.
    ///
    /// This method is affected by variation axes.
    #[inline]
    pub fn glyph_ver_advance(&self, glyph_id: GlyphId) -> Option<u16> {
        #[cfg(feature = "variable-fonts")]
        {
            let mut advance = self.tables.vmtx?.advance(glyph_id)? as f32;

            if self.is_variable() {
                // Ignore the variation offset when `vvar` is not set.
                if let Some(vvar) = self.tables.vvar {
                    if let Some(offset) = vvar.advance_offset(glyph_id, self.coords()) {
                        // We can't use `round()` in `no_std`, so this is the next best thing.
                        advance += offset + 0.5;
                    }
                } else if let Some(points) = self.glyph_phantom_points(glyph_id) {
                    // We can't use `round()` in `no_std`, so this is the next best thing.
                    advance += points.bottom.y + 0.5
                }
            }

            u16::try_num_from(advance)
        }

        #[cfg(not(feature = "variable-fonts"))]
        {
            self.tables.vmtx?.advance(glyph_id)
        }
    }

    /// Returns the glyph's horizontal side bearing.
    ///
    /// This method is affected by variation axes.
    #[inline]
    pub fn glyph_hor_side_bearing(&self, glyph_id: GlyphId) -> Option<i16> {
        #[cfg(feature = "variable-fonts")]
        {
            let mut bearing = self.tables.hmtx?.side_bearing(glyph_id)? as f32;

            if self.is_variable() {
                // Ignore the variation offset when `hvar` is not set.
                if let Some(hvar) = self.tables.hvar {
                    if let Some(offset) = hvar.left_side_bearing_offset(glyph_id, self.coords()) {
                        // We can't use `round()` in `no_std`, so this is the next best thing.
                        bearing += offset + 0.5;
                    }
                }
            }

            i16::try_num_from(bearing)
        }

        #[cfg(not(feature = "variable-fonts"))]
        {
            self.tables.hmtx?.side_bearing(glyph_id)
        }
    }

    /// Returns the glyph's vertical side bearing.
    ///
    /// This method is affected by variation axes.
    #[inline]
    pub fn glyph_ver_side_bearing(&self, glyph_id: GlyphId) -> Option<i16> {
        #[cfg(feature = "variable-fonts")]
        {
            let mut bearing = self.tables.vmtx?.side_bearing(glyph_id)? as f32;

            if self.is_variable() {
                // Ignore the variation offset when `vvar` is not set.
                if let Some(vvar) = self.tables.vvar {
                    if let Some(offset) = vvar.top_side_bearing_offset(glyph_id, self.coords()) {
                        // We can't use `round()` in `no_std`, so this is the next best thing.
                        bearing += offset + 0.5;
                    }
                }
            }

            i16::try_num_from(bearing)
        }

        #[cfg(not(feature = "variable-fonts"))]
        {
            self.tables.vmtx?.side_bearing(glyph_id)
        }
    }

    /// Returns the glyph's vertical origin according to the
    /// [Vertical Origin Table](https://docs.microsoft.com/en-us/typography/opentype/spec/vorg).
    ///
    /// This method is affected by variation axes.
    pub fn glyph_y_origin(&self, glyph_id: GlyphId) -> Option<i16> {
        #[cfg(feature = "variable-fonts")]
        {
            let mut origin = self.tables.vorg.map(|vorg| vorg.glyph_y_origin(glyph_id))? as f32;

            if self.is_variable() {
                // Ignore the variation offset when `vvar` is not set.
                if let Some(vvar) = self.tables.vvar {
                    if let Some(offset) = vvar.vertical_origin_offset(glyph_id, self.coords()) {
                        // We can't use `round()` in `no_std`, so this is the next best thing.
                        origin += offset + 0.5;
                    }
                }
            }

            i16::try_num_from(origin)
        }

        #[cfg(not(feature = "variable-fonts"))]
        {
            self.tables.vorg.map(|vorg| vorg.glyph_y_origin(glyph_id))
        }
    }

    /// Returns the glyph's name.
    ///
    /// Uses the `post` and `CFF` tables as sources.
    ///
    /// Returns `None` when no name is associated with the glyph.
    #[cfg(feature = "glyph-names")]
    #[inline]
    pub fn glyph_name(&self, glyph_id: GlyphId) -> Option<&str> {
        if let Some(name) = self.tables.post.and_then(|post| post.glyph_name(glyph_id)) {
            return Some(name);
        }

        if let Some(name) = self
            .tables
            .cff
            .as_ref()
            .and_then(|cff1| cff1.glyph_name(glyph_id))
        {
            return Some(name);
        }

        None
    }

    /// Outlines a glyph and returns its tight bounding box.
    ///
    /// **Warning**: since `ttf-parser` is a pull parser,
    /// `OutlineBuilder` will emit segments even when the outline is partially malformed.
    /// You must check the `outline_glyph()` result before using
    /// `OutlineBuilder`'s output.
    ///
    /// The `gvar`, `glyf`, `CFF` and `CFF2` tables are supported.
    /// They will be accessed in this specific order.
    ///
    /// This method is affected by variation axes.
    ///
    /// Returns `None` when the glyph has no outline or on error.
    ///
    /// # Example
    ///
    /// ```
    /// use std::fmt::Write;
    /// use ttf_parser;
    ///
    /// struct Builder(String);
    ///
    /// impl ttf_parser::OutlineBuilder for Builder {
    ///     fn move_to(&mut self, x: f32, y: f32) {
    ///         write!(&mut self.0, "M {} {} ", x, y).unwrap();
    ///     }
    ///
    ///     fn line_to(&mut self, x: f32, y: f32) {
    ///         write!(&mut self.0, "L {} {} ", x, y).unwrap();
    ///     }
    ///
    ///     fn quad_to(&mut self, x1: f32, y1: f32, x: f32, y: f32) {
    ///         write!(&mut self.0, "Q {} {} {} {} ", x1, y1, x, y).unwrap();
    ///     }
    ///
    ///     fn curve_to(&mut self, x1: f32, y1: f32, x2: f32, y2: f32, x: f32, y: f32) {
    ///         write!(&mut self.0, "C {} {} {} {} {} {} ", x1, y1, x2, y2, x, y).unwrap();
    ///     }
    ///
    ///     fn close(&mut self) {
    ///         write!(&mut self.0, "Z ").unwrap();
    ///     }
    /// }
    ///
    /// let data = std::fs::read("tests/fonts/demo.ttf").unwrap();
    /// let face = ttf_parser::Face::parse(&data, 0).unwrap();
    /// let mut builder = Builder(String::new());
    /// let bbox = face.outline_glyph(ttf_parser::GlyphId(1), &mut builder).unwrap();
    /// assert_eq!(builder.0, "M 173 267 L 369 267 L 270 587 L 173 267 Z M 6 0 L 224 656 \
    ///                        L 320 656 L 541 0 L 452 0 L 390 200 L 151 200 L 85 0 L 6 0 Z ");
    /// assert_eq!(bbox, ttf_parser::Rect { x_min: 6, y_min: 0, x_max: 541, y_max: 656 });
    /// ```
    #[inline]
    pub fn outline_glyph(
        &self,
        glyph_id: GlyphId,
        builder: &mut dyn OutlineBuilder,
    ) -> Option<Rect> {
        #[cfg(feature = "variable-fonts")]
        {
            if let Some(ref gvar) = self.tables.gvar {
                return gvar.outline(self.tables.glyf?, self.coords(), glyph_id, builder);
            }
        }

        if let Some(table) = self.tables.glyf {
            return table.outline(glyph_id, builder);
        }

        if let Some(ref cff) = self.tables.cff {
            return cff.outline(glyph_id, builder).ok();
        }

        #[cfg(feature = "variable-fonts")]
        {
            if let Some(ref cff2) = self.tables.cff2 {
                return cff2.outline(self.coords(), glyph_id, builder).ok();
            }
        }

        None
    }

    /// Returns a tight glyph bounding box.
    ///
    /// This is just a shorthand for `outline_glyph()` since only the `glyf` table stores
    /// a bounding box. We ignore `glyf` table bboxes because they can be malformed.
    /// In the case of CFF and variable fonts we have to actually outline
    /// a glyph to find its bounding box.
    ///
    /// When a glyph is defined by a raster or a vector image,
    /// which can be obtained via `glyph_image()`,
    /// the bounding box must be calculated manually and this method will return `None`.
    ///
    /// Note: the returned bbox is not validated in any way. A font file can have a glyph bbox
    /// set to zero/negative width and/or height and this is perfectly ok.
    /// For calculated bboxes, zero width and/or height is also perfectly fine.
    ///
    /// This method is affected by variation axes.
    #[inline]
    pub fn glyph_bounding_box(&self, glyph_id: GlyphId) -> Option<Rect> {
        self.outline_glyph(glyph_id, &mut DummyOutline)
    }

    /// Returns a bounding box that is large enough to enclose any glyph from the face.
    #[inline]
    pub fn global_bounding_box(&self) -> Rect {
        self.tables.head.global_bbox
    }

    /// Returns a reference to a glyph's raster image.
    ///
    /// A font can define a glyph using a raster or a vector image instead of a simple outline,
    /// which is primarily used for emojis. This method should be used to access raster images.
    ///
    /// `pixels_per_em` allows selecting a preferred image size. The chosen size will
    /// be closer to an upper one. So when a font has 64px and 96px images and `pixels_per_em`
    /// is set to 72, the 96px image will be returned.
    /// To get the largest image simply use `std::u16::MAX`.
    ///
    /// Note that this method will return an encoded image.
    /// It should be decoded by the caller. We don't validate or preprocess it in any way.
    ///
    /// Also, a font can contain both images and outlines. So when this method returns `None`,
    /// you should also try `outline_glyph()` afterwards.
    ///
    /// There are multiple ways an image can be stored in a TrueType font
    /// and this method supports most of them.
    /// This includes `sbix`, `bloc` + `bdat`, `EBLC` + `EBDT` and `CBLC` + `CBDT`.
    /// The font's tables will be accessed in this specific order.
    #[inline]
    pub fn glyph_raster_image(
        &self,
        glyph_id: GlyphId,
        pixels_per_em: u16,
    ) -> Option<RasterGlyphImage<'a>> {
        if let Some(table) = self.tables.sbix {
            if let Some(strike) = table.best_strike(pixels_per_em) {
                return strike.get(glyph_id);
            }
        }

        if let Some(bdat) = self.tables.bdat {
            return bdat.get(glyph_id, pixels_per_em);
        }

        if let Some(ebdt) = self.tables.ebdt {
            return ebdt.get(glyph_id, pixels_per_em);
        }

        if let Some(cbdt) = self.tables.cbdt {
            return cbdt.get(glyph_id, pixels_per_em);
        }

        None
    }

    /// Returns a reference to a glyph's SVG image.
    ///
    /// A font can define a glyph using a raster or a vector image instead of a simple outline,
    /// which is primarily used for emojis. This method should be used to access SVG images.
    ///
    /// Note that this method will return just the SVG data. It should be rendered
    /// or even decompressed (in the case of SVGZ) by the caller.
    /// We don't validate or preprocess it in any way.
    ///
    /// Also, a font can contain both images and outlines. So when this method returns `None`,
    /// you should also try `outline_glyph()` afterwards.
    #[inline]
    pub fn glyph_svg_image(&self, glyph_id: GlyphId) -> Option<svg::SvgDocument<'a>> {
        self.tables.svg.and_then(|svg| svg.documents.find(glyph_id))
    }

    /// Returns `true` if the glyph can be colored/painted using the `COLR`+`CPAL` tables.
    ///
    /// See [`paint_color_glyph`](Face::paint_color_glyph) for details.
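    ///
    /// # Example
    ///
    /// A usage sketch, not part of the original docs; the font path and
    /// glyph id are hypothetical:
    ///
    /// ```no_run
    /// let data = std::fs::read("emoji.ttf").unwrap();
    /// let face = ttf_parser::Face::parse(&data, 0).unwrap();
    /// let glyph_id = ttf_parser::GlyphId(42);
    /// if face.is_color_glyph(glyph_id) {
    ///     // Paint it via `paint_color_glyph` with a `colr::Painter` impl.
    /// } else {
    ///     // Fall back to `outline_glyph`.
    /// }
    /// ```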
    pub fn is_color_glyph(&self, glyph_id: GlyphId) -> bool {
        self.tables()
            .colr
            .map(|colr| colr.contains(glyph_id))
            .unwrap_or(false)
    }

    /// Returns the number of palettes stored in the `COLR`+`CPAL` tables.
    ///
    /// See [`paint_color_glyph`](Face::paint_color_glyph) for details.
    pub fn color_palettes(&self) -> Option<NonZeroU16> {
        Some(self.tables().colr?.palettes.palettes())
    }

    /// Paints a color glyph from the `COLR` table.
    ///
    /// A font can have multiple palettes, which you can check via
    /// [`color_palettes`](Face::color_palettes).
    /// If unsure, just pass 0 to the `palette` argument, which is the default.
    ///
    /// A font can define a glyph using layers of colored shapes instead of a
    /// simple outline, which is primarily used for emojis. This method should
    /// be used to access glyphs defined in the `COLR` table.
    ///
    /// Also, a font can contain both a layered definition and outlines. So
    /// when this method returns `None` you should also try
    /// [`outline_glyph`](Face::outline_glyph) afterwards.
    ///
    /// Returns `None` if the glyph has no `COLR` definition or if the glyph
    /// definition is malformed.
    ///
    /// See `examples/font2svg.rs` for usage examples.
    #[inline]
    pub fn paint_color_glyph(
        &self,
        glyph_id: GlyphId,
        palette: u16,
        foreground_color: RgbaColor,
        painter: &mut dyn colr::Painter<'a>,
    ) -> Option<()> {
        self.tables.colr?.paint(
            glyph_id,
            palette,
            painter,
            #[cfg(feature = "variable-fonts")]
            self.coords(),
            foreground_color,
        )
    }

    /// Returns an iterator over variation axes.
    #[cfg(feature = "variable-fonts")]
    #[inline]
    pub fn variation_axes(&self) -> LazyArray16<'a, VariationAxis> {
        self.tables.fvar.map(|fvar| fvar.axes).unwrap_or_default()
    }

    /// Sets a variation axis coordinate.
    ///
    /// This is one of the only two mutable methods in the library.
    /// We can simplify the API a lot by storing the variable coordinates
    /// in the face object itself.
    ///
    /// Since coordinates are stored on the stack, we allow only 64 of them.
    ///
    /// Returns `None` when the face is not variable or doesn't have such an axis.
    #[cfg(feature = "variable-fonts")]
    pub fn set_variation(&mut self, axis: Tag, value: f32) -> Option<()> {
        if !self.is_variable() {
            return None;
        }

        if usize::from(self.variation_axes().len()) >= MAX_VAR_COORDS {
            return None;
        }

        for (i, var_axis) in self.variation_axes().into_iter().enumerate() {
            if var_axis.tag == axis {
                self.coordinates.data[i] = var_axis.normalized_value(value);
            }
        }

        // TODO: optimize
        if let Some(avar) = self.tables.avar {
            // Ignore error.
            let _ = avar.map_coordinates(self.coordinates.as_mut_slice());
        }

        Some(())
    }

    /// Returns the current normalized variation coordinates.
    #[cfg(feature = "variable-fonts")]
    #[inline]
    pub fn variation_coordinates(&self) -> &[NormalizedCoordinate] {
        self.coordinates.as_slice()
    }

    /// Checks that the face has non-default variation coordinates.
    #[cfg(feature = "variable-fonts")]
    #[inline]
    pub fn has_non_default_variation_coordinates(&self) -> bool {
        self.coordinates.as_slice().iter().any(|c| c.0 != 0)
    }

    /// Parses the glyph's phantom points.
    ///
    /// Available only for variable fonts with the `gvar` table.
    #[cfg(feature = "variable-fonts")]
    pub fn glyph_phantom_points(&self, glyph_id: GlyphId) -> Option<PhantomPoints> {
        let glyf = self.tables.glyf?;
        let gvar = self.tables.gvar?;
        gvar.phantom_points(glyf, self.coords(), glyph_id)
    }

    #[cfg(feature = "variable-fonts")]
    #[inline]
    fn metrics_var_offset(&self, tag: Tag) -> f32 {
        self.tables
            .mvar
            .and_then(|table| table.metric_offset(tag, self.coords()))
            .unwrap_or(0.0)
    }

    #[inline]
    fn apply_metrics_variation(&self, tag: Tag, mut value: i16) -> i16 {
        self.apply_metrics_variation_to(tag, &mut value);
        value
    }

    #[cfg(feature = "variable-fonts")]
    #[inline]
    fn apply_metrics_variation_to(&self, tag: Tag, value: &mut i16) {
        if self.is_variable() {
            let v = f32::from(*value) + self.metrics_var_offset(tag);
            // TODO: Should probably round it, but `f32::round` is not available in core.
            if let Some(v) = i16::try_num_from(v) {
                *value = v;
            }
        }
    }

    #[cfg(not(feature = "variable-fonts"))]
    #[inline]
    fn apply_metrics_variation_to(&self, _: Tag, _: &mut i16) {}

    #[cfg(feature = "variable-fonts")]
    #[inline]
    fn coords(&self) -> &[NormalizedCoordinate] {
        self.coordinates.as_slice()
    }
}

impl core::fmt::Debug for Face<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result {
        write!(f, "Face()")
    }
}

/// Returns the number of fonts stored in a TrueType font collection.
///
/// Returns `None` if the provided data is not a TrueType font collection.
#[inline]
pub fn fonts_in_collection(data: &[u8]) -> Option<u32> {
    let mut s = Stream::new(data);
    if s.read::<Magic>()? != Magic::FontCollection {
        return None;
    }

    s.skip::<u32>(); // version
    s.read::<u32>()
}
ttf-parser-0.24.1/src/parser.rs000064400000000000000000000546031046102023000144360ustar 00000000000000//! Binary parsing utils.
//!
//! This module should not be used directly, unless you're planning to parse
//! some tables manually.

use core::convert::{TryFrom, TryInto};
use core::ops::Range;

/// A trait for parsing raw binary data of fixed size.
///
/// This is a low-level, internal trait that should not be used directly.
pub trait FromData: Sized {
    /// Object's raw data size.
    ///
    /// Not always the same as `mem::size_of`.
    const SIZE: usize;

    /// Parses an object from raw data.
    fn parse(data: &[u8]) -> Option<Self>;
}

/// A trait for parsing raw binary data of variable size.
///
/// This is a low-level, internal trait that should not be used directly.
pub trait FromSlice<'a>: Sized {
    /// Parses an object from raw data.
    fn parse(data: &'a [u8]) -> Option<Self>;
}

impl FromData for () {
    const SIZE: usize = 0;

    #[inline]
    fn parse(_: &[u8]) -> Option<Self> {
        Some(())
    }
}

impl FromData for u8 {
    const SIZE: usize = 1;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        data.get(0).copied()
    }
}

impl FromData for i8 {
    const SIZE: usize = 1;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        data.get(0).copied().map(|n| n as i8)
    }
}

impl FromData for u16 {
    const SIZE: usize = 2;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        data.try_into().ok().map(u16::from_be_bytes)
    }
}

impl FromData for i16 {
    const SIZE: usize = 2;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        data.try_into().ok().map(i16::from_be_bytes)
    }
}

impl FromData for u32 {
    const SIZE: usize = 4;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        data.try_into().ok().map(u32::from_be_bytes)
    }
}

impl FromData for i32 {
    const SIZE: usize = 4;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        data.try_into().ok().map(i32::from_be_bytes)
    }
}

impl FromData for u64 {
    const SIZE: usize = 8;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        data.try_into().ok().map(u64::from_be_bytes)
    }
}

/// A u24 number.
///
/// Stored as u32, but encoded as 3 bytes in the font.
///
/// <https://docs.microsoft.com/en-us/typography/opentype/spec/otff#data-types>
#[derive(Clone, Copy, Debug)]
pub struct U24(pub u32);

impl FromData for U24 {
    const SIZE: usize = 3;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let data: [u8; 3] = data.try_into().ok()?;
        Some(U24(u32::from_be_bytes([0, data[0], data[1], data[2]])))
    }
}

/// A 16-bit signed fixed number with the low 14 bits of fraction (2.14).
#[derive(Clone, Copy, Debug)]
pub struct F2DOT14(pub i16);

impl F2DOT14 {
    /// Converts i16 to f32.
    #[inline]
    pub fn to_f32(self) -> f32 {
        f32::from(self.0) / 16384.0
    }

    #[cfg(feature = "variable-fonts")]
    #[inline]
    pub fn apply_float_delta(&self, delta: f32) -> f32 {
        self.to_f32() + (delta as f64 * (1.0 / 16384.0)) as f32
    }
}

impl FromData for F2DOT14 {
    const SIZE: usize = 2;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        i16::parse(data).map(F2DOT14)
    }
}

/// A 32-bit signed fixed-point number (16.16).
#[derive(Clone, Copy, Debug)]
pub struct Fixed(pub f32);

impl FromData for Fixed {
    const SIZE: usize = 4;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        // TODO: is it safe to cast?
        i32::parse(data).map(|n| Fixed(n as f32 / 65536.0))
    }
}

impl Fixed {
    #[cfg(feature = "variable-fonts")]
    #[inline]
    pub(crate) fn apply_float_delta(&self, delta: f32) -> f32 {
        self.0 + (delta as f64 * (1.0 / 65536.0)) as f32
    }
}

/// A safe u32 to usize casting.
///
/// Rust doesn't implement `From<u32> for usize`,
/// because it has to support 16 bit targets.
/// We don't, so we can allow this.
pub trait NumFrom<T>: Sized {
    /// Converts u32 into usize.
    fn num_from(_: T) -> Self;
}

impl NumFrom<u32> for usize {
    #[inline]
    fn num_from(v: u32) -> Self {
        #[cfg(any(target_pointer_width = "32", target_pointer_width = "64"))]
        {
            v as usize
        }

        // compilation error on 16 bit targets
    }
}

impl NumFrom<char> for usize {
    #[inline]
    fn num_from(v: char) -> Self {
        #[cfg(any(target_pointer_width = "32", target_pointer_width = "64"))]
        {
            v as usize
        }

        // compilation error on 16 bit targets
    }
}

/// Just like TryFrom, but for numeric types not supported by the Rust's std.
pub trait TryNumFrom<T>: Sized {
    /// Casts between numeric types.
    fn try_num_from(_: T) -> Option<Self>;
}

impl TryNumFrom<f32> for u8 {
    #[inline]
    fn try_num_from(v: f32) -> Option<Self> {
        i32::try_num_from(v).and_then(|v| u8::try_from(v).ok())
    }
}

impl TryNumFrom<f32> for i16 {
    #[inline]
    fn try_num_from(v: f32) -> Option<Self> {
        i32::try_num_from(v).and_then(|v| i16::try_from(v).ok())
    }
}

impl TryNumFrom<f32> for u16 {
    #[inline]
    fn try_num_from(v: f32) -> Option<Self> {
        i32::try_num_from(v).and_then(|v| u16::try_from(v).ok())
    }
}

#[allow(clippy::manual_range_contains)]
impl TryNumFrom<f32> for i32 {
    #[inline]
    fn try_num_from(v: f32) -> Option<Self> {
        // Based on https://github.com/rust-num/num-traits/blob/master/src/cast.rs

        // Float as int truncates toward zero, so we want to allow values
        // in the exclusive range `(MIN-1, MAX+1)`.

        // We can't represent `MIN-1` exactly, but there's no fractional part
        // at this magnitude, so we can just use a `MIN` inclusive boundary.
        const MIN: f32 = i32::MIN as f32;
        // We can't represent `MAX` exactly, but it will round up to exactly
        // `MAX+1` (a power of two) when we cast it.
        const MAX_P1: f32 = i32::MAX as f32;
        if v >= MIN && v < MAX_P1 {
            Some(v as i32)
        } else {
            None
        }
    }
}

/// A slice-like container that converts internal binary data only on access.
///
/// Array values are stored in a continuous data chunk.
#[derive(Clone, Copy)]
pub struct LazyArray16<'a, T> {
    data: &'a [u8],
    data_type: core::marker::PhantomData<T>,
}

impl<T> Default for LazyArray16<'_, T> {
    #[inline]
    fn default() -> Self {
        LazyArray16 {
            data: &[],
            data_type: core::marker::PhantomData,
        }
    }
}

impl<'a, T: FromData> LazyArray16<'a, T> {
    /// Creates a new `LazyArray`.
    #[inline]
    pub fn new(data: &'a [u8]) -> Self {
        LazyArray16 {
            data,
            data_type: core::marker::PhantomData,
        }
    }

    /// Returns a value at `index`.
    #[inline]
    pub fn get(&self, index: u16) -> Option<T> {
        if index < self.len() {
            let start = usize::from(index) * T::SIZE;
            let end = start + T::SIZE;
            self.data.get(start..end).and_then(T::parse)
        } else {
            None
        }
    }

    /// Returns the last value.
    #[inline]
    pub fn last(&self) -> Option<T> {
        if !self.is_empty() {
            self.get(self.len() - 1)
        } else {
            None
        }
    }

    /// Returns a sub-array.
    #[inline]
    pub fn slice(&self, range: Range<u16>) -> Option<Self> {
        let start = usize::from(range.start) * T::SIZE;
        let end = usize::from(range.end) * T::SIZE;
        Some(LazyArray16 {
            data: self.data.get(start..end)?,
            ..LazyArray16::default()
        })
    }

    /// Returns the array's length.
    #[inline]
    pub fn len(&self) -> u16 {
        (self.data.len() / T::SIZE) as u16
    }

    /// Checks if the array is empty.
    #[inline]
    pub fn is_empty(&self) -> bool {
        self.len() == 0
    }

    /// Performs a binary search by the specified `key`.
    #[inline]
    pub fn binary_search(&self, key: &T) -> Option<(u16, T)>
    where
        T: Ord,
    {
        self.binary_search_by(|p| p.cmp(key))
    }

    /// Performs a binary search using the specified closure.
    #[inline]
    pub fn binary_search_by<F>(&self, mut f: F) -> Option<(u16, T)>
    where
        F: FnMut(&T) -> core::cmp::Ordering,
    {
        // Based on the Rust std implementation.

        use core::cmp::Ordering;
        let mut size = self.len();
        if size == 0 {
            return None;
        }

        let mut base = 0;
        while size > 1 {
            let half = size / 2;
            let mid = base + half;
            // mid is always in [0, size), which means mid is >= 0 and < size.
            // mid >= 0: by definition
            // mid < size: mid = size / 2 + size / 4 + size / 8 ...
            let cmp = f(&self.get(mid)?);
            base = if cmp == Ordering::Greater { base } else { mid };
            size -= half;
        }

        // base is always in [0, size) because base <= mid.
        let value = self.get(base)?;
        if f(&value) == Ordering::Equal {
            Some((base, value))
        } else {
            None
        }
    }
}

impl<'a, T: FromData + core::fmt::Debug + Copy> core::fmt::Debug for LazyArray16<'a, T> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        f.debug_list().entries(*self).finish()
    }
}

impl<'a, T: FromData> IntoIterator for LazyArray16<'a, T> {
    type Item = T;
    type IntoIter = LazyArrayIter16<'a, T>;

    #[inline]
    fn into_iter(self) -> Self::IntoIter {
        LazyArrayIter16 {
            data: self,
            index: 0,
        }
    }
}

/// An iterator over `LazyArray16`.
#[derive(Clone, Copy)]
#[allow(missing_debug_implementations)]
pub struct LazyArrayIter16<'a, T> {
    data: LazyArray16<'a, T>,
    index: u16,
}

impl<T: FromData> Default for LazyArrayIter16<'_, T> {
    #[inline]
    fn default() -> Self {
        LazyArrayIter16 {
            data: LazyArray16::new(&[]),
            index: 0,
        }
    }
}

impl<'a, T: FromData> Iterator for LazyArrayIter16<'a, T> {
    type Item = T;

    #[inline]
    fn next(&mut self) -> Option<Self::Item> {
        self.index += 1; // TODO: check
        self.data.get(self.index - 1)
    }

    #[inline]
    fn count(self) -> usize {
        usize::from(self.data.len().saturating_sub(self.index))
    }
}

/// A slice-like container that converts internal binary data only on access.
///
/// This is a low-level, internal structure that should not be used directly.
#[derive(Clone, Copy)]
pub struct LazyArray32<'a, T> {
    data: &'a [u8],
    data_type: core::marker::PhantomData<T>,
}

impl<T> Default for LazyArray32<'_, T> {
    #[inline]
    fn default() -> Self {
        LazyArray32 {
            data: &[],
            data_type: core::marker::PhantomData,
        }
    }
}

impl<'a, T: FromData> LazyArray32<'a, T> {
    /// Creates a new `LazyArray`.
    #[inline]
    pub fn new(data: &'a [u8]) -> Self {
        LazyArray32 {
            data,
            data_type: core::marker::PhantomData,
        }
    }

    /// Returns a value at `index`.
    #[inline]
    pub fn get(&self, index: u32) -> Option<T> {
        if index < self.len() {
            let start = usize::num_from(index) * T::SIZE;
            let end = start + T::SIZE;
            self.data.get(start..end).and_then(T::parse)
        } else {
            None
        }
    }

    /// Returns the array's length.
    #[inline]
    pub fn len(&self) -> u32 {
        (self.data.len() / T::SIZE) as u32
    }

    /// Checks if the array is empty.
    pub fn is_empty(&self) -> bool {
        self.len() == 0
    }

    /// Performs a binary search by the specified `key`.
    #[inline]
    pub fn binary_search(&self, key: &T) -> Option<(u32, T)>
    where
        T: Ord,
    {
        self.binary_search_by(|p| p.cmp(key))
    }

    /// Performs a binary search using the specified closure.
    #[inline]
    pub fn binary_search_by<F>(&self, mut f: F) -> Option<(u32, T)>
    where
        F: FnMut(&T) -> core::cmp::Ordering,
    {
        // Based on the Rust std implementation.
use core::cmp::Ordering; let mut size = self.len(); if size == 0 { return None; } let mut base = 0; while size > 1 { let half = size / 2; let mid = base + half; // mid is always in [0, size), that means mid is >= 0 and < size. // mid >= 0: by definition // mid < size: mid = size / 2 + size / 4 + size / 8 ... let cmp = f(&self.get(mid)?); base = if cmp == Ordering::Greater { base } else { mid }; size -= half; } // base is always in [0, size) because base <= mid. let value = self.get(base)?; if f(&value) == Ordering::Equal { Some((base, value)) } else { None } } } impl<'a, T: FromData + core::fmt::Debug + Copy> core::fmt::Debug for LazyArray32<'a, T> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { f.debug_list().entries(*self).finish() } } impl<'a, T: FromData> IntoIterator for LazyArray32<'a, T> { type Item = T; type IntoIter = LazyArrayIter32<'a, T>; #[inline] fn into_iter(self) -> Self::IntoIter { LazyArrayIter32 { data: self, index: 0, } } } /// An iterator over `LazyArray32`. #[derive(Clone, Copy)] #[allow(missing_debug_implementations)] pub struct LazyArrayIter32<'a, T> { data: LazyArray32<'a, T>, index: u32, } impl<'a, T: FromData> Iterator for LazyArrayIter32<'a, T> { type Item = T; #[inline] fn next(&mut self) -> Option { self.index += 1; // TODO: check self.data.get(self.index - 1) } #[inline] fn count(self) -> usize { usize::num_from(self.data.len().saturating_sub(self.index)) } } /// A [`LazyArray16`]-like container, but data is accessed by offsets. /// /// Unlike [`LazyArray16`], internal storage is not continuous. /// /// Multiple offsets can point to the same data. #[derive(Clone, Copy)] pub struct LazyOffsetArray16<'a, T: FromSlice<'a>> { data: &'a [u8], // Zero offsets must be ignored, therefore we're using `Option`. offsets: LazyArray16<'a, Option>, data_type: core::marker::PhantomData, } impl<'a, T: FromSlice<'a>> LazyOffsetArray16<'a, T> { /// Creates a new `LazyOffsetArray16`. 
#[allow(dead_code)] pub fn new(data: &'a [u8], offsets: LazyArray16<'a, Option>) -> Self { Self { data, offsets, data_type: core::marker::PhantomData, } } /// Parses `LazyOffsetArray16` from raw data. #[allow(dead_code)] pub fn parse(data: &'a [u8]) -> Option { let mut s = Stream::new(data); let count = s.read::()?; let offsets = s.read_array16(count)?; Some(Self { data, offsets, data_type: core::marker::PhantomData, }) } /// Returns a value at `index`. #[inline] pub fn get(&self, index: u16) -> Option { let offset = self.offsets.get(index)??.to_usize(); self.data.get(offset..).and_then(T::parse) } /// Returns array's length. #[inline] pub fn len(&self) -> u16 { self.offsets.len() } /// Checks if array is empty. #[inline] #[allow(dead_code)] pub fn is_empty(&self) -> bool { self.len() == 0 } } impl<'a, T: FromSlice<'a> + core::fmt::Debug + Copy> core::fmt::Debug for LazyOffsetArray16<'a, T> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { f.debug_list().entries(*self).finish() } } /// An iterator over [`LazyOffsetArray16`] values. #[derive(Clone, Copy)] #[allow(missing_debug_implementations)] pub struct LazyOffsetArrayIter16<'a, T: FromSlice<'a>> { array: LazyOffsetArray16<'a, T>, index: u16, } impl<'a, T: FromSlice<'a>> IntoIterator for LazyOffsetArray16<'a, T> { type Item = T; type IntoIter = LazyOffsetArrayIter16<'a, T>; #[inline] fn into_iter(self) -> Self::IntoIter { LazyOffsetArrayIter16 { array: self, index: 0, } } } impl<'a, T: FromSlice<'a>> Iterator for LazyOffsetArrayIter16<'a, T> { type Item = T; fn next(&mut self) -> Option { if self.index < self.array.len() { self.index += 1; self.array.get(self.index - 1) } else { None } } #[inline] fn count(self) -> usize { usize::from(self.array.len().saturating_sub(self.index)) } } /// A streaming binary parser. #[derive(Clone, Default, Debug)] pub struct Stream<'a> { data: &'a [u8], offset: usize, } impl<'a> Stream<'a> { /// Creates a new `Stream` parser. 
    #[inline]
    pub fn new(data: &'a [u8]) -> Self {
        Stream { data, offset: 0 }
    }

    /// Creates a new `Stream` parser at offset.
    ///
    /// Returns `None` when `offset` is out of bounds.
    #[inline]
    pub fn new_at(data: &'a [u8], offset: usize) -> Option<Self> {
        if offset <= data.len() {
            Some(Stream { data, offset })
        } else {
            None
        }
    }

    /// Checks that the stream reached the end of the data.
    #[inline]
    pub fn at_end(&self) -> bool {
        self.offset >= self.data.len()
    }

    /// Jumps to the end of the stream.
    ///
    /// Useful to indicate that we parsed all the data.
    #[inline]
    pub fn jump_to_end(&mut self) {
        self.offset = self.data.len();
    }

    /// Returns the current offset.
    #[inline]
    pub fn offset(&self) -> usize {
        self.offset
    }

    /// Returns the trailing data.
    ///
    /// Returns `None` when the `Stream` has reached the end.
    #[inline]
    pub fn tail(&self) -> Option<&'a [u8]> {
        self.data.get(self.offset..)
    }

    /// Advances by `FromData::SIZE`.
    ///
    /// Doesn't check bounds.
    #[inline]
    pub fn skip<T: FromData>(&mut self) {
        self.advance(T::SIZE);
    }

    /// Advances by the specified `len`.
    ///
    /// Doesn't check bounds.
    #[inline]
    pub fn advance(&mut self, len: usize) {
        self.offset += len;
    }

    /// Advances by the specified `len` and checks for bounds.
    #[inline]
    pub fn advance_checked(&mut self, len: usize) -> Option<()> {
        if self.offset + len <= self.data.len() {
            self.advance(len);
            Some(())
        } else {
            None
        }
    }

    /// Parses the type from the stream.
    ///
    /// Returns `None` when there is not enough data left in the stream
    /// or the type parsing failed.
    #[inline]
    pub fn read<T: FromData>(&mut self) -> Option<T> {
        self.read_bytes(T::SIZE).and_then(T::parse)
    }

    /// Parses the type from the stream at offset.
    #[inline]
    pub fn read_at<T: FromData>(data: &[u8], offset: usize) -> Option<T> {
        data.get(offset..offset + T::SIZE).and_then(T::parse)
    }

    /// Reads N bytes from the stream.
    #[inline]
    pub fn read_bytes(&mut self, len: usize) -> Option<&'a [u8]> {
        // An integer overflow here on 32-bit systems is almost guaranteed to be caused
        // by incorrect parsing logic on the caller side.
        // Simply using `checked_add` here would silently swallow errors,
        // which is not what we want.
        debug_assert!(self.offset as u64 + len as u64 <= u32::MAX as u64);

        let v = self.data.get(self.offset..self.offset + len)?;
        self.advance(len);
        Some(v)
    }

    /// Reads the next `count` types as a slice.
    #[inline]
    pub fn read_array16<T: FromData>(&mut self, count: u16) -> Option<LazyArray16<'a, T>> {
        let len = usize::from(count) * T::SIZE;
        self.read_bytes(len).map(LazyArray16::new)
    }

    /// Reads the next `count` types as a slice.
    #[inline]
    pub fn read_array32<T: FromData>(&mut self, count: u32) -> Option<LazyArray32<'a, T>> {
        let len = usize::num_from(count) * T::SIZE;
        self.read_bytes(len).map(LazyArray32::new)
    }

    #[allow(dead_code)]
    #[inline]
    pub fn read_at_offset16(&mut self, data: &'a [u8]) -> Option<&'a [u8]> {
        let offset = self.read::<Offset16>()?.to_usize();
        data.get(offset..)
    }
}

/// Common offset methods.
pub trait Offset {
    /// Converts the offset to `usize`.
    fn to_usize(&self) -> usize;
}

/// A type-safe u16 offset.
#[derive(Clone, Copy, Debug)]
pub struct Offset16(pub u16);

impl Offset for Offset16 {
    #[inline]
    fn to_usize(&self) -> usize {
        usize::from(self.0)
    }
}

impl FromData for Offset16 {
    const SIZE: usize = 2;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        u16::parse(data).map(Offset16)
    }
}

impl FromData for Option<Offset16> {
    const SIZE: usize = Offset16::SIZE;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let offset = Offset16::parse(data)?;
        if offset.0 != 0 {
            Some(Some(offset))
        } else {
            Some(None)
        }
    }
}

/// A type-safe u24 offset.
#[derive(Clone, Copy, Debug)]
pub struct Offset24(pub u32);

impl Offset for Offset24 {
    #[inline]
    fn to_usize(&self) -> usize {
        usize::num_from(self.0)
    }
}

impl FromData for Offset24 {
    const SIZE: usize = 3;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        U24::parse(data).map(|n| Offset24(n.0))
    }
}

impl FromData for Option<Offset24> {
    const SIZE: usize = Offset24::SIZE;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let offset = Offset24::parse(data)?;
        if offset.0 != 0 {
            Some(Some(offset))
        } else {
            Some(None)
        }
    }
}

/// A type-safe u32 offset.
#[derive(Clone, Copy, Debug)] pub struct Offset32(pub u32); impl Offset for Offset32 { #[inline] fn to_usize(&self) -> usize { usize::num_from(self.0) } } impl FromData for Offset32 { const SIZE: usize = 4; #[inline] fn parse(data: &[u8]) -> Option { u32::parse(data).map(Offset32) } } impl FromData for Option { const SIZE: usize = Offset32::SIZE; #[inline] fn parse(data: &[u8]) -> Option { let offset = Offset32::parse(data)?; if offset.0 != 0 { Some(Some(offset)) } else { Some(None) } } } #[inline] pub fn i16_bound(min: i16, val: i16, max: i16) -> i16 { use core::cmp; cmp::max(min, cmp::min(max, val)) } #[inline] pub fn f32_bound(min: f32, val: f32, max: f32) -> f32 { debug_assert!(min.is_finite()); debug_assert!(val.is_finite()); debug_assert!(max.is_finite()); if val > max { return max; } else if val < min { return min; } val } ttf-parser-0.24.1/src/tables/ankr.rs000064400000000000000000000051211046102023000153400ustar 00000000000000//! An [Anchor Point Table]( //! https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6ankr.html) implementation. use core::num::NonZeroU16; use crate::aat; use crate::parser::{FromData, LazyArray32, Offset, Offset32, Stream}; use crate::GlyphId; /// An anchor point. #[allow(missing_docs)] #[derive(Clone, Copy, PartialEq, Eq, Default, Debug)] pub struct Point { pub x: i16, pub y: i16, } impl FromData for Point { const SIZE: usize = 4; #[inline] fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(Point { x: s.read::()?, y: s.read::()?, }) } } /// An [Anchor Point Table]( /// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6ankr.html). #[derive(Clone)] pub struct Table<'a> { lookup: aat::Lookup<'a>, // Ideally, Glyphs Data can be represented as an array, // but Apple's spec doesn't specify that Glyphs Data members have padding or not. // Meaning we cannot simply iterate over them. 
glyphs_data: &'a [u8], } impl core::fmt::Debug for Table<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "Table {{ ... }}") } } impl<'a> Table<'a> { /// Parses a table from raw data. /// /// `number_of_glyphs` is from the `maxp` table. pub fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option { let mut s = Stream::new(data); let version = s.read::()?; if version != 0 { return None; } s.skip::(); // reserved // TODO: we should probably check that offset is larger than the header size (8) let lookup_table = s.read_at_offset32(data)?; let glyphs_data = s.read_at_offset32(data)?; Some(Table { lookup: aat::Lookup::parse(number_of_glyphs, lookup_table)?, glyphs_data, }) } /// Returns a list of anchor points for the specified glyph. pub fn points(&self, glyph_id: GlyphId) -> Option> { let offset = self.lookup.value(glyph_id)?; let mut s = Stream::new_at(self.glyphs_data, usize::from(offset))?; let number_of_points = s.read::()?; s.read_array32::(number_of_points) } } trait StreamExt<'a> { fn read_at_offset32(&mut self, data: &'a [u8]) -> Option<&'a [u8]>; } impl<'a> StreamExt<'a> for Stream<'a> { fn read_at_offset32(&mut self, data: &'a [u8]) -> Option<&'a [u8]> { let offset = self.read::()?.to_usize(); data.get(offset..) } } ttf-parser-0.24.1/src/tables/avar.rs000064400000000000000000000113321046102023000153370ustar 00000000000000//! An [Axis Variations Table]( //! https://docs.microsoft.com/en-us/typography/opentype/spec/avar) implementation. use core::convert::TryFrom; use crate::parser::{FromData, LazyArray16, Stream}; use crate::NormalizedCoordinate; /// An axis value map. #[derive(Clone, Copy, Debug)] pub struct AxisValueMap { /// A normalized coordinate value obtained using default normalization. pub from_coordinate: i16, /// The modified, normalized coordinate value. 
pub to_coordinate: i16, } impl FromData for AxisValueMap { const SIZE: usize = 4; #[inline] fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(AxisValueMap { from_coordinate: s.read::()?, to_coordinate: s.read::()?, }) } } /// A list of segment maps. /// /// Can be empty. /// /// The internal data layout is not designed for random access, /// therefore we're not providing the `get()` method and only an iterator. #[derive(Clone, Copy)] pub struct SegmentMaps<'a> { count: u16, data: &'a [u8], } impl<'a> SegmentMaps<'a> { /// Returns the number of segments. pub fn len(&self) -> u16 { self.count } /// Checks if there are any segments. pub fn is_empty(&self) -> bool { self.count == 0 } } impl core::fmt::Debug for SegmentMaps<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "SegmentMaps {{ ... }}") } } impl<'a> IntoIterator for SegmentMaps<'a> { type Item = LazyArray16<'a, AxisValueMap>; type IntoIter = SegmentMapsIter<'a>; #[inline] fn into_iter(self) -> Self::IntoIter { SegmentMapsIter { stream: Stream::new(self.data), } } } /// An iterator over maps. #[allow(missing_debug_implementations)] pub struct SegmentMapsIter<'a> { stream: Stream<'a>, } impl<'a> Iterator for SegmentMapsIter<'a> { type Item = LazyArray16<'a, AxisValueMap>; fn next(&mut self) -> Option { let count = self.stream.read::()?; self.stream.read_array16::(count) } } /// An [Axis Variations Table]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/avar). #[derive(Clone, Copy, Debug)] pub struct Table<'a> { /// The segment maps array — one segment map for each axis /// in the order of axes specified in the `fvar` table. pub segment_maps: SegmentMaps<'a>, } impl<'a> Table<'a> { /// Parses a table from raw data. 
pub fn parse(data: &'a [u8]) -> Option { let mut s = Stream::new(data); let version = s.read::()?; if version != 0x00010000 { return None; } s.skip::(); // reserved Some(Self { segment_maps: SegmentMaps { // TODO: check that `axisCount` is the same as in `fvar`? count: s.read::()?, data: s.tail()?, }, }) } /// Maps coordinates. pub fn map_coordinates(&self, coordinates: &mut [NormalizedCoordinate]) -> Option<()> { if usize::from(self.segment_maps.count) != coordinates.len() { return None; } for (map, coord) in self.segment_maps.into_iter().zip(coordinates) { *coord = NormalizedCoordinate::from(map_value(&map, coord.0)?); } Some(()) } } fn map_value(map: &LazyArray16, value: i16) -> Option { // This code is based on harfbuzz implementation. if map.is_empty() { return Some(value); } else if map.len() == 1 { let record = map.get(0)?; return Some(value - record.from_coordinate + record.to_coordinate); } let record_0 = map.get(0)?; if value <= record_0.from_coordinate { return Some(value - record_0.from_coordinate + record_0.to_coordinate); } let mut i = 1; while i < map.len() && value > map.get(i)?.from_coordinate { i += 1; } if i == map.len() { i -= 1; } let record_curr = map.get(i)?; let curr_from = record_curr.from_coordinate; let curr_to = record_curr.to_coordinate; if value >= curr_from { return Some(value - curr_from + curr_to); } let record_prev = map.get(i - 1)?; let prev_from = record_prev.from_coordinate; let prev_to = record_prev.to_coordinate; if prev_from == curr_from { return Some(prev_to); } let curr_from = i32::from(curr_from); let curr_to = i32::from(curr_to); let prev_from = i32::from(prev_from); let prev_to = i32::from(prev_to); let denom = curr_from - prev_from; let k = (curr_to - prev_to) * (i32::from(value) - prev_from) + denom / 2; let value = prev_to + k / denom; i16::try_from(value).ok() } ttf-parser-0.24.1/src/tables/cbdt.rs000064400000000000000000000125611046102023000153270ustar 00000000000000//! A [Color Bitmap Data Table]( //! 
https://docs.microsoft.com/en-us/typography/opentype/spec/cbdt) implementation. use crate::cblc::{self, BitmapDataFormat, Metrics, MetricsFormat}; use crate::parser::{NumFrom, Stream}; use crate::{GlyphId, RasterGlyphImage, RasterImageFormat}; /// A [Color Bitmap Data Table]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/cbdt). /// /// EBDT and bdat also share the same structure, so this is re-used for them. #[derive(Clone, Copy)] pub struct Table<'a> { locations: cblc::Table<'a>, data: &'a [u8], } impl<'a> Table<'a> { /// Parses a table from raw data. pub fn parse(locations: cblc::Table<'a>, data: &'a [u8]) -> Option { Some(Self { locations, data }) } /// Returns a raster image for the glyph. pub fn get(&self, glyph_id: GlyphId, pixels_per_em: u16) -> Option> { let location = self.locations.get(glyph_id, pixels_per_em)?; let mut s = Stream::new_at(self.data, location.offset)?; let metrics = match location.format.metrics { MetricsFormat::Small => { let height = s.read::()?; let width = s.read::()?; let bearing_x = s.read::()?; let bearing_y = s.read::()?; s.skip::(); // advance Metrics { x: bearing_x, y: bearing_y, width, height, } } MetricsFormat::Big => { let height = s.read::()?; let width = s.read::()?; let hor_bearing_x = s.read::()?; let hor_bearing_y = s.read::()?; s.skip::(); // hor_advance s.skip::(); // ver_bearing_x s.skip::(); // ver_bearing_y s.skip::(); // ver_advance Metrics { x: hor_bearing_x, y: hor_bearing_y, width, height, } } MetricsFormat::Shared => location.metrics, }; match location.format.data { BitmapDataFormat::ByteAligned { bit_depth } => { let row_len = (u32::from(metrics.width) * u32::from(bit_depth) + 7) / 8; let data_len = row_len * u32::from(metrics.height); let data = s.read_bytes(usize::num_from(data_len))?; Some(RasterGlyphImage { x: i16::from(metrics.x), // `y` in CBDT is a bottom bound, not top one. 
y: i16::from(metrics.y) - i16::from(metrics.height), width: u16::from(metrics.width), height: u16::from(metrics.height), pixels_per_em: location.ppem, format: match bit_depth { 1 => RasterImageFormat::BitmapMono, 2 => RasterImageFormat::BitmapGray2, 4 => RasterImageFormat::BitmapGray4, 8 => RasterImageFormat::BitmapGray8, 32 => RasterImageFormat::BitmapPremulBgra32, _ => return None, }, data, }) } BitmapDataFormat::BitAligned { bit_depth } => { let data_len = { let w = u32::from(metrics.width); let h = u32::from(metrics.height); let d = u32::from(bit_depth); (w * h * d + 7) / 8 }; let data = s.read_bytes(usize::num_from(data_len))?; Some(RasterGlyphImage { x: i16::from(metrics.x), // `y` in CBDT is a bottom bound, not top one. y: i16::from(metrics.y) - i16::from(metrics.height), width: u16::from(metrics.width), height: u16::from(metrics.height), pixels_per_em: location.ppem, format: match bit_depth { 1 => RasterImageFormat::BitmapMonoPacked, 2 => RasterImageFormat::BitmapGray2Packed, 4 => RasterImageFormat::BitmapGray4Packed, 8 => RasterImageFormat::BitmapGray8, 32 => RasterImageFormat::BitmapPremulBgra32, _ => return None, }, data, }) } BitmapDataFormat::PNG => { let data_len = s.read::()?; let data = s.read_bytes(usize::num_from(data_len))?; Some(RasterGlyphImage { x: i16::from(metrics.x), // `y` in CBDT is a bottom bound, not top one. y: i16::from(metrics.y) - i16::from(metrics.height), width: u16::from(metrics.width), height: u16::from(metrics.height), pixels_per_em: location.ppem, format: RasterImageFormat::PNG, data, }) } } } } impl core::fmt::Debug for Table<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "Table {{ ... }}") } } ttf-parser-0.24.1/src/tables/cblc.rs000064400000000000000000000220021046102023000153050ustar 00000000000000//! A [Color Bitmap Location Table]( //! https://docs.microsoft.com/en-us/typography/opentype/spec/cblc) implementation. 
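The two size computations that `cbdt.rs` above relies on can be illustrated in isolation: the `(width * bit_depth + 7) / 8` byte-aligned row stride, and the 5-byte small-metrics record layout shared by CBDT/EBDT. This is a hedged, standalone sketch; the function and struct names are illustrative and are not part of the crate:

```rust
/// Row length in bytes for a byte-aligned bitmap row,
/// as computed in the `ByteAligned` branch above.
fn byte_aligned_row_len(width: u32, bit_depth: u32) -> u32 {
    (width * bit_depth + 7) / 8
}

#[derive(Debug, PartialEq)]
struct SmallMetrics {
    height: u8,
    width: u8,
    bearing_x: i8,
    bearing_y: i8,
    advance: u8,
}

/// Parses the 5-byte small glyph metrics record
/// (height, width, bearingX, bearingY, advance).
fn parse_small_metrics(data: &[u8]) -> Option<SmallMetrics> {
    Some(SmallMetrics {
        height: *data.get(0)?,
        width: *data.get(1)?,
        bearing_x: *data.get(2)? as i8,
        bearing_y: *data.get(3)? as i8,
        advance: *data.get(4)?,
    })
}

fn main() {
    // A 10px-wide, 1-bit-per-pixel row occupies 2 bytes: (10 * 1 + 7) / 8.
    assert_eq!(byte_aligned_row_len(10, 1), 2);
    // 32-bit BGRA rows are simply width * 4 bytes.
    assert_eq!(byte_aligned_row_len(10, 32), 40);

    // Bearings are signed: 0xFE reads back as -2.
    let m = parse_small_metrics(&[12, 10, 0xFE, 11, 13]).unwrap();
    assert_eq!(m.width, 10);
    assert_eq!(m.bearing_x, -2);
}
```

Note how the `+ 7` rounds a partial trailing byte up, which is why 1-, 2- and 4-bit-deep rows still start on byte boundaries in the byte-aligned formats.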
use crate::parser::{FromData, NumFrom, Offset, Offset16, Offset32, Stream};
use crate::GlyphId;

#[derive(Clone, Copy, PartialEq, Debug)]
pub(crate) struct BitmapFormat {
    pub metrics: MetricsFormat,
    pub data: BitmapDataFormat,
}

#[derive(Clone, Copy, PartialEq, Debug)]
pub(crate) enum MetricsFormat {
    Small,
    Big,
    Shared,
}

#[derive(Clone, Copy, PartialEq, Debug)]
pub(crate) enum BitmapDataFormat {
    ByteAligned { bit_depth: u8 },
    BitAligned { bit_depth: u8 },
    PNG,
}

#[derive(Clone, Copy, Default, Debug)]
pub(crate) struct Metrics {
    pub x: i8,
    pub y: i8,
    pub width: u8,
    pub height: u8,
}

#[derive(Clone, Copy, Debug)]
pub(crate) struct Location {
    pub format: BitmapFormat,
    pub offset: usize,
    pub metrics: Metrics,
    pub ppem: u16,
}

#[derive(Clone, Copy)]
struct BitmapSizeTable {
    subtable_array_offset: Offset32,
    number_of_subtables: u32,
    ppem: u16,
    bit_depth: u8,
    // Many fields are omitted.
}

fn select_bitmap_size_table(
    glyph_id: GlyphId,
    pixels_per_em: u16,
    mut s: Stream,
) -> Option<BitmapSizeTable> {
    let subtable_count = s.read::<u32>()?;
    let orig_s = s.clone();

    let mut idx = None;
    let mut max_ppem = 0;
    let mut bit_depth_for_max_ppem = 0;
    for i in 0..subtable_count {
        // Check that the current subtable contains a provided glyph id.
        s.advance(40); // Jump to `start_glyph_index`.
        let start_glyph_id = s.read::<GlyphId>()?;
        let end_glyph_id = s.read::<GlyphId>()?;
        let ppem_x = u16::from(s.read::<u8>()?);
        s.advance(1); // ppem_y
        let bit_depth = s.read::<u8>()?;
        s.advance(1); // flags
        if !(start_glyph_id..=end_glyph_id).contains(&glyph_id) {
            continue;
        }

        // Select a best matching subtable based on `pixels_per_em`.
        if (pixels_per_em <= ppem_x && ppem_x < max_ppem)
            || (pixels_per_em > max_ppem && ppem_x > max_ppem)
        {
            idx = Some(usize::num_from(i));
            max_ppem = ppem_x;
            bit_depth_for_max_ppem = bit_depth;
        }
    }

    let mut s = orig_s;
    s.advance(idx? * 48); // 48 is BitmapSize Table size
    let subtable_array_offset = s.read::<Offset32>()?;
    s.skip::<u32>(); // index_tables_size
    let number_of_subtables = s.read::<u32>()?;

    Some(BitmapSizeTable {
        subtable_array_offset,
        number_of_subtables,
        ppem: max_ppem,
        bit_depth: bit_depth_for_max_ppem,
    })
}

#[derive(Clone, Copy)]
struct IndexSubtableInfo {
    start_glyph_id: GlyphId,
    offset: usize, // absolute offset
}

fn select_index_subtable(
    data: &[u8],
    size_table: BitmapSizeTable,
    glyph_id: GlyphId,
) -> Option<IndexSubtableInfo> {
    let mut s = Stream::new_at(data, size_table.subtable_array_offset.to_usize())?;
    for _ in 0..size_table.number_of_subtables {
        let start_glyph_id = s.read::<GlyphId>()?;
        let end_glyph_id = s.read::<GlyphId>()?;
        let offset = s.read::<Offset32>()?;
        if (start_glyph_id..=end_glyph_id).contains(&glyph_id) {
            let offset = size_table.subtable_array_offset.to_usize() + offset.to_usize();
            return Some(IndexSubtableInfo {
                start_glyph_id,
                offset,
            });
        }
    }

    None
}

#[derive(Clone, Copy)]
struct GlyphIdOffsetPair {
    glyph_id: GlyphId,
    offset: Offset16,
}

impl FromData for GlyphIdOffsetPair {
    const SIZE: usize = 4;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(GlyphIdOffsetPair {
            glyph_id: s.read::<GlyphId>()?,
            offset: s.read::<Offset16>()?,
        })
    }
}

// TODO: rewrite
/// A [Color Bitmap Location Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/cblc).
///
/// EBLC and bloc also share the same structure, so this is re-used for them.
#[derive(Clone, Copy)]
pub struct Table<'a> {
    data: &'a [u8],
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        Some(Self { data })
    }

    pub(crate) fn get(&self, glyph_id: GlyphId, pixels_per_em: u16) -> Option<Location> {
        let mut s = Stream::new(self.data);

        // The CBLC table version is a bit tricky, so we are ignoring it for now.
        // The CBLC table is based on the EBLC table, which was based on the `bloc` table.
        // And before the CBLC table specification was finished, some fonts,
        // notably Noto Emoji, have used version 2.0, but the final spec allows only 3.0.
        // So there are perfectly valid fonts in the wild, which have an invalid version.
        s.skip::<u32>(); // version

        let size_table = select_bitmap_size_table(glyph_id, pixels_per_em, s)?;
        let info = select_index_subtable(self.data, size_table, glyph_id)?;

        let mut s = Stream::new_at(self.data, info.offset)?;
        let index_format = s.read::<u16>()?;
        let image_format = s.read::<u16>()?;
        let mut image_offset = s.read::<Offset32>()?.to_usize();

        let bit_depth = size_table.bit_depth;
        let image_format = match image_format {
            1 => BitmapFormat {
                metrics: MetricsFormat::Small,
                data: BitmapDataFormat::ByteAligned { bit_depth },
            },
            2 => BitmapFormat {
                metrics: MetricsFormat::Small,
                data: BitmapDataFormat::BitAligned { bit_depth },
            },
            5 => BitmapFormat {
                metrics: MetricsFormat::Shared,
                data: BitmapDataFormat::BitAligned { bit_depth },
            },
            6 => BitmapFormat {
                metrics: MetricsFormat::Big,
                data: BitmapDataFormat::ByteAligned { bit_depth },
            },
            7 => BitmapFormat {
                metrics: MetricsFormat::Big,
                data: BitmapDataFormat::BitAligned { bit_depth },
            },
            17 => BitmapFormat {
                metrics: MetricsFormat::Small,
                data: BitmapDataFormat::PNG,
            },
            18 => BitmapFormat {
                metrics: MetricsFormat::Big,
                data: BitmapDataFormat::PNG,
            },
            19 => BitmapFormat {
                metrics: MetricsFormat::Shared,
                data: BitmapDataFormat::PNG,
            },
            _ => return None, // Invalid format.
        };

        // TODO: I wasn't able to find fonts with index 4 and 5, so they are untested.
        let glyph_diff = glyph_id.0.checked_sub(info.start_glyph_id.0)?;

        let mut metrics = Metrics::default();
        match index_format {
            1 => {
                s.advance(usize::from(glyph_diff) * Offset32::SIZE);
                let offset = s.read::<Offset32>()?;
                image_offset += offset.to_usize();
            }
            2 => {
                let image_size = s.read::<u32>()?;
                image_offset += usize::from(glyph_diff).checked_mul(usize::num_from(image_size))?;
                metrics.height = s.read::<u8>()?;
                metrics.width = s.read::<u8>()?;
                metrics.x = s.read::<i8>()?;
                metrics.y = s.read::<i8>()?;
            }
            3 => {
                s.advance(usize::from(glyph_diff) * Offset16::SIZE);
                let offset = s.read::<Offset16>()?;
                image_offset += offset.to_usize();
            }
            4 => {
                let num_glyphs = s.read::<u32>()?;
                let num_glyphs = num_glyphs.checked_add(1)?;
                let pairs = s.read_array32::<GlyphIdOffsetPair>(num_glyphs)?;
                let pair = pairs.into_iter().find(|pair| pair.glyph_id == glyph_id)?;
                image_offset += pair.offset.to_usize();
            }
            5 => {
                let image_size = s.read::<u32>()?;
                metrics.height = s.read::<u8>()?;
                metrics.width = s.read::<u8>()?;
                metrics.x = s.read::<i8>()?;
                metrics.y = s.read::<i8>()?;
                s.skip::<u8>(); // hor_advance
                s.skip::<i8>(); // ver_bearing_x
                s.skip::<i8>(); // ver_bearing_y
                s.skip::<u8>(); // ver_advance
                let num_glyphs = s.read::<u32>()?;
                let glyphs = s.read_array32::<GlyphId>(num_glyphs)?;
                let (index, _) = glyphs.binary_search(&glyph_id)?;
                image_offset = image_offset.checked_add(
                    usize::num_from(index).checked_mul(usize::num_from(image_size))?,
                )?;
            }
            _ => return None, // Invalid format.
        }

        Some(Location {
            format: image_format,
            offset: image_offset,
            metrics,
            ppem: size_table.ppem,
        })
    }
}

impl core::fmt::Debug for Table<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Table {{ ...
}}")
    }
}

ttf-parser-0.24.1/src/tables/cff/argstack.rs

use super::CFFError;

pub struct ArgumentsStack<'a> {
    pub data: &'a mut [f32],
    pub len: usize,
    pub max_len: usize,
}

impl<'a> ArgumentsStack<'a> {
    #[inline]
    pub fn len(&self) -> usize {
        self.len
    }

    #[inline]
    pub fn is_empty(&self) -> bool {
        self.len == 0
    }

    #[inline]
    pub fn push(&mut self, n: f32) -> Result<(), CFFError> {
        if self.len == self.max_len {
            Err(CFFError::ArgumentsStackLimitReached)
        } else {
            self.data[self.len] = n;
            self.len += 1;
            Ok(())
        }
    }

    #[inline]
    pub fn at(&self, index: usize) -> f32 {
        self.data[index]
    }

    #[inline]
    pub fn pop(&mut self) -> f32 {
        debug_assert!(!self.is_empty());
        self.len -= 1;
        self.data[self.len]
    }

    #[inline]
    pub fn reverse(&mut self) {
        if self.is_empty() {
            return;
        }

        // Reverse only the actual data and not the whole stack.
        let (first, _) = self.data.split_at_mut(self.len);
        first.reverse();
    }

    #[inline]
    pub fn clear(&mut self) {
        self.len = 0;
    }
}

impl core::fmt::Debug for ArgumentsStack<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        f.debug_list().entries(&self.data[..self.len]).finish()
    }
}

ttf-parser-0.24.1/src/tables/cff/cff1.rs

//! A [Compact Font Format Table](
//! https://docs.microsoft.com/en-us/typography/opentype/spec/cff) implementation.
// Useful links: // http://wwwimages.adobe.com/content/dam/Adobe/en/devnet/font/pdfs/5176.CFF.pdf // http://wwwimages.adobe.com/content/dam/Adobe/en/devnet/font/pdfs/5177.Type2.pdf // https://github.com/opentypejs/opentype.js/blob/master/src/tables/cff.js use core::convert::TryFrom; use core::num::NonZeroU16; use core::ops::Range; use super::argstack::ArgumentsStack; use super::charset::{parse_charset, Charset}; use super::charstring::CharStringParser; use super::dict::DictionaryParser; use super::encoding::{parse_encoding, Encoding, STANDARD_ENCODING}; use super::index::{parse_index, skip_index, Index}; #[cfg(feature = "glyph-names")] use super::std_names::STANDARD_NAMES; use super::{calc_subroutine_bias, conv_subroutine_index, Builder, CFFError, IsEven, StringId}; use crate::parser::{LazyArray16, NumFrom, Stream, TryNumFrom}; use crate::{DummyOutline, GlyphId, OutlineBuilder, Rect, RectF}; // Limits according to the Adobe Technical Note #5176, chapter 4 DICT Data. const MAX_OPERANDS_LEN: usize = 48; // Limits according to the Adobe Technical Note #5177 Appendix B. const STACK_LIMIT: u8 = 10; const MAX_ARGUMENTS_STACK_LEN: usize = 48; const TWO_BYTE_OPERATOR_MARK: u8 = 12; /// Enumerates some operators defined in the Adobe Technical Note #5177. 
mod operator { pub const HORIZONTAL_STEM: u8 = 1; pub const VERTICAL_STEM: u8 = 3; pub const VERTICAL_MOVE_TO: u8 = 4; pub const LINE_TO: u8 = 5; pub const HORIZONTAL_LINE_TO: u8 = 6; pub const VERTICAL_LINE_TO: u8 = 7; pub const CURVE_TO: u8 = 8; pub const CALL_LOCAL_SUBROUTINE: u8 = 10; pub const RETURN: u8 = 11; pub const ENDCHAR: u8 = 14; pub const HORIZONTAL_STEM_HINT_MASK: u8 = 18; pub const HINT_MASK: u8 = 19; pub const COUNTER_MASK: u8 = 20; pub const MOVE_TO: u8 = 21; pub const HORIZONTAL_MOVE_TO: u8 = 22; pub const VERTICAL_STEM_HINT_MASK: u8 = 23; pub const CURVE_LINE: u8 = 24; pub const LINE_CURVE: u8 = 25; pub const VV_CURVE_TO: u8 = 26; pub const HH_CURVE_TO: u8 = 27; pub const SHORT_INT: u8 = 28; pub const CALL_GLOBAL_SUBROUTINE: u8 = 29; pub const VH_CURVE_TO: u8 = 30; pub const HV_CURVE_TO: u8 = 31; pub const HFLEX: u8 = 34; pub const FLEX: u8 = 35; pub const HFLEX1: u8 = 36; pub const FLEX1: u8 = 37; pub const FIXED_16_16: u8 = 255; } /// Enumerates some operators defined in the Adobe Technical Note #5176, /// Table 9 Top DICT Operator Entries mod top_dict_operator { pub const CHARSET_OFFSET: u16 = 15; pub const ENCODING_OFFSET: u16 = 16; pub const CHAR_STRINGS_OFFSET: u16 = 17; pub const PRIVATE_DICT_SIZE_AND_OFFSET: u16 = 18; pub const FONT_MATRIX: u16 = 1207; pub const ROS: u16 = 1230; pub const FD_ARRAY: u16 = 1236; pub const FD_SELECT: u16 = 1237; } /// Enumerates some operators defined in the Adobe Technical Note #5176, /// Table 23 Private DICT Operators mod private_dict_operator { pub const LOCAL_SUBROUTINES_OFFSET: u16 = 19; pub const DEFAULT_WIDTH: u16 = 20; pub const NOMINAL_WIDTH: u16 = 21; } /// Enumerates Charset IDs defined in the Adobe Technical Note #5176, Table 22 mod charset_id { pub const ISO_ADOBE: usize = 0; pub const EXPERT: usize = 1; pub const EXPERT_SUBSET: usize = 2; } /// Enumerates Charset IDs defined in the Adobe Technical Note #5176, Table 16 mod encoding_id { pub const STANDARD: usize = 0; pub const EXPERT: usize = 
1;
}

#[derive(Clone, Copy, Debug)]
pub(crate) enum FontKind<'a> {
    SID(SIDMetadata<'a>),
    CID(CIDMetadata<'a>),
}

#[derive(Clone, Copy, Default, Debug)]
pub(crate) struct SIDMetadata<'a> {
    local_subrs: Index<'a>,
    /// Can be zero.
    default_width: f32,
    /// Can be zero.
    nominal_width: f32,
    encoding: Encoding<'a>,
}

#[derive(Clone, Copy, Default, Debug)]
pub(crate) struct CIDMetadata<'a> {
    fd_array: Index<'a>,
    fd_select: FDSelect<'a>,
}

/// An affine transformation matrix.
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub struct Matrix {
    pub sx: f32,
    pub ky: f32,
    pub kx: f32,
    pub sy: f32,
    pub tx: f32,
    pub ty: f32,
}

impl Default for Matrix {
    fn default() -> Self {
        Self {
            sx: 0.001,
            ky: 0.0,
            kx: 0.0,
            sy: 0.001,
            tx: 0.0,
            ty: 0.0,
        }
    }
}

#[derive(Default)]
struct TopDict {
    charset_offset: Option<usize>,
    encoding_offset: Option<usize>,
    char_strings_offset: usize,
    private_dict_range: Option<Range<usize>>,
    matrix: Matrix,
    has_ros: bool,
    fd_array_offset: Option<usize>,
    fd_select_offset: Option<usize>,
}

fn parse_top_dict(s: &mut Stream) -> Option<TopDict> {
    let mut top_dict = TopDict::default();

    let index = parse_index::<u16>(s)?;

    // The Top DICT INDEX should have only one dictionary.
    let data = index.get(0)?;

    let mut operands_buffer = [0.0; MAX_OPERANDS_LEN];
    let mut dict_parser = DictionaryParser::new(data, &mut operands_buffer);
    while let Some(operator) = dict_parser.parse_next() {
        match operator.get() {
            top_dict_operator::CHARSET_OFFSET => {
                top_dict.charset_offset = dict_parser.parse_offset();
            }
            top_dict_operator::ENCODING_OFFSET => {
                top_dict.encoding_offset = dict_parser.parse_offset();
            }
            top_dict_operator::CHAR_STRINGS_OFFSET => {
                top_dict.char_strings_offset = dict_parser.parse_offset()?;
            }
            top_dict_operator::PRIVATE_DICT_SIZE_AND_OFFSET => {
                top_dict.private_dict_range = dict_parser.parse_range();
            }
            top_dict_operator::FONT_MATRIX => {
                dict_parser.parse_operands()?;
                let operands = dict_parser.operands();
                if operands.len() == 6 {
                    top_dict.matrix = Matrix {
                        sx: operands[0] as f32,
                        ky: operands[1] as f32,
                        kx: operands[2] as f32,
                        sy: operands[3] as f32,
                        tx: operands[4] as f32,
                        ty: operands[5] as f32,
                    };
                }
            }
            top_dict_operator::ROS => {
                top_dict.has_ros = true;
            }
            top_dict_operator::FD_ARRAY => {
                top_dict.fd_array_offset = dict_parser.parse_offset();
            }
            top_dict_operator::FD_SELECT => {
                top_dict.fd_select_offset = dict_parser.parse_offset();
            }
            _ => {}
        }
    }

    Some(top_dict)
}

// TODO: move to integration
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn private_dict_size_overflow() {
        let data = &[
            0x00, 0x01, // count: 1
            0x01, // offset size: 1
            0x01, // index [0]: 1
            0x0C, // index [1]: 12
            0x1D, 0x7F, 0xFF, 0xFF, 0xFF, // length: i32::MAX
            0x1D, 0x7F, 0xFF, 0xFF, 0xFF, // offset: i32::MAX
            0x12, // operator: 18 (private)
        ];

        let top_dict = parse_top_dict(&mut Stream::new(data)).unwrap();
        assert_eq!(top_dict.private_dict_range, Some(2147483647..4294967294));
    }

    #[test]
    fn private_dict_negative_char_strings_offset() {
        let data = &[
            0x00, 0x01, // count: 1
            0x01, // offset size: 1
            0x01, // index [0]: 1
            0x03, // index [1]: 3
            // Item 0
            0x8A, // offset: -1
            0x11, // operator: 17 (char_string)
        ];

        assert!(parse_top_dict(&mut Stream::new(data)).is_none());
    }
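An aside on the test data above: CFF DICT operands use a variable-length integer encoding (Adobe TN #5176, "DICT Data"), which is why `0x8A` reads as `-1` and `0x1D` introduces a 5-byte value. A standalone sketch of that decoding rule (the function and names here are ours, not part of the crate's API):

```rust
// Decodes a single CFF DICT integer operand, returning the value and the
// number of bytes consumed. Based on the operand encoding table in
// Adobe TN #5176; real numbers (0x1E) and operators are not handled here.
fn parse_dict_int(data: &[u8]) -> Option<(i32, usize)> {
    let b0 = *data.first()?;
    match b0 {
        // One byte: value is b0 - 139, covering -107..=107.
        32..=246 => Some((i32::from(b0) - 139, 1)),
        // Two bytes, positive: (b0 - 247) * 256 + b1 + 108.
        247..=250 => {
            let b1 = *data.get(1)?;
            Some(((i32::from(b0) - 247) * 256 + i32::from(b1) + 108, 2))
        }
        // Two bytes, negative: -(b0 - 251) * 256 - b1 - 108.
        251..=254 => {
            let b1 = *data.get(1)?;
            Some((-(i32::from(b0) - 251) * 256 - i32::from(b1) - 108, 2))
        }
        // Three bytes: a big-endian i16.
        28 => Some((i32::from(i16::from_be_bytes([*data.get(1)?, *data.get(2)?])), 3)),
        // Five bytes: a big-endian i32 (the 0x1D case used in the tests).
        29 => {
            let n = i32::from_be_bytes([*data.get(1)?, *data.get(2)?, *data.get(3)?, *data.get(4)?]);
            Some((n, 5))
        }
        _ => None, // real number or operator
    }
}

fn main() {
    // 0x8A encodes -1; 0x1D 0x7F 0xFF 0xFF 0xFF encodes i32::MAX.
    assert_eq!(parse_dict_int(&[0x8A]), Some((-1, 1)));
    assert_eq!(parse_dict_int(&[0x1D, 0x7F, 0xFF, 0xFF, 0xFF]), Some((i32::MAX, 5)));
    println!("ok");
}
```

This is also why the overflow test works: two 5-byte `i32::MAX` operands followed by operator 18 produce a private DICT range that must be rejected by later bounds checks rather than by the DICT parser itself.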
    #[test]
    fn private_dict_no_char_strings_offset_operand() {
        let data = &[
            0x00, 0x01, // count: 1
            0x01, // offset size: 1
            0x01, // index [0]: 1
            0x02, // index [1]: 2
            // Item 0
            // <-- No number here.
            0x11, // operator: 17 (char_string)
        ];

        assert!(parse_top_dict(&mut Stream::new(data)).is_none());
    }

    #[test]
    fn negative_private_dict_offset_and_size() {
        let data = &[
            0x00, 0x01, // count: 1
            0x01, // offset size: 1
            0x01, // index [0]: 1
            0x04, // index [1]: 4
            // Item 0
            0x8A, // length: -1
            0x8A, // offset: -1
            0x12, // operator: 18 (private)
        ];

        let top_dict = parse_top_dict(&mut Stream::new(data)).unwrap();
        assert!(top_dict.private_dict_range.is_none());
    }
}

#[derive(Default, Debug)]
struct PrivateDict {
    local_subroutines_offset: Option<usize>,
    default_width: Option<f32>,
    nominal_width: Option<f32>,
}

fn parse_private_dict(data: &[u8]) -> PrivateDict {
    let mut dict = PrivateDict::default();
    let mut operands_buffer = [0.0; MAX_OPERANDS_LEN];
    let mut dict_parser = DictionaryParser::new(data, &mut operands_buffer);
    while let Some(operator) = dict_parser.parse_next() {
        if operator.get() == private_dict_operator::LOCAL_SUBROUTINES_OFFSET {
            dict.local_subroutines_offset = dict_parser.parse_offset();
        } else if operator.get() == private_dict_operator::DEFAULT_WIDTH {
            dict.default_width = dict_parser.parse_number().map(|n| n as f32);
        } else if operator.get() == private_dict_operator::NOMINAL_WIDTH {
            dict.nominal_width = dict_parser.parse_number().map(|n| n as f32);
        }
    }

    dict
}

fn parse_font_dict(data: &[u8]) -> Option<Range<usize>> {
    let mut operands_buffer = [0.0; MAX_OPERANDS_LEN];
    let mut dict_parser = DictionaryParser::new(data, &mut operands_buffer);
    while let Some(operator) = dict_parser.parse_next() {
        if operator.get() == top_dict_operator::PRIVATE_DICT_SIZE_AND_OFFSET {
            return dict_parser.parse_range();
        }
    }

    None
}

/// In CID fonts, to get local subroutines we have to:
/// 1. Find Font DICT index via FDSelect by GID.
/// 2. Get Font DICT data from FDArray using this index.
/// 3.
Get a Private DICT offset from a Font DICT.
/// 4. Get a local subroutine offset from Private DICT.
/// 5. Parse a local subroutine at offset.
fn parse_cid_local_subrs<'a>(
    data: &'a [u8],
    glyph_id: GlyphId,
    cid: &CIDMetadata,
) -> Option<Index<'a>> {
    let font_dict_index = cid.fd_select.font_dict_index(glyph_id)?;
    let font_dict_data = cid.fd_array.get(u32::from(font_dict_index))?;
    let private_dict_range = parse_font_dict(font_dict_data)?;
    let private_dict_data = data.get(private_dict_range.clone())?;
    let private_dict = parse_private_dict(private_dict_data);
    let subroutines_offset = private_dict.local_subroutines_offset?;

    // 'The local subroutines offset is relative to the beginning
    // of the Private DICT data.'
    let start = private_dict_range.start.checked_add(subroutines_offset)?;
    let subrs_data = data.get(start..)?;
    let mut s = Stream::new(subrs_data);
    parse_index::<u16>(&mut s)
}

struct CharStringParserContext<'a> {
    metadata: &'a Table<'a>,
    width: Option<f32>,
    stems_len: u32,
    has_endchar: bool,
    has_seac: bool,
    glyph_id: GlyphId, // Required to parse local subroutine in CID fonts.
    local_subrs: Option<Index<'a>>,
}

fn parse_char_string(
    data: &[u8],
    metadata: &Table,
    glyph_id: GlyphId,
    width_only: bool,
    builder: &mut dyn OutlineBuilder,
) -> Result<(Rect, Option<f32>), CFFError> {
    let local_subrs = match metadata.kind {
        FontKind::SID(ref sid) => Some(sid.local_subrs),
        FontKind::CID(_) => None, // Will be resolved on request.
}; let mut ctx = CharStringParserContext { metadata, width: None, stems_len: 0, has_endchar: false, has_seac: false, glyph_id, local_subrs, }; let mut inner_builder = Builder { builder, bbox: RectF::new(), }; let stack = ArgumentsStack { data: &mut [0.0; MAX_ARGUMENTS_STACK_LEN], // 192B len: 0, max_len: MAX_ARGUMENTS_STACK_LEN, }; let mut parser = CharStringParser { stack, builder: &mut inner_builder, x: 0.0, y: 0.0, has_move_to: false, is_first_move_to: true, width_only, }; _parse_char_string(&mut ctx, data, 0, &mut parser)?; if width_only { return Ok((Rect::zero(), ctx.width)); } if !ctx.has_endchar { return Err(CFFError::MissingEndChar); } let bbox = parser.builder.bbox; // Check that bbox was changed. if bbox.is_default() { return Err(CFFError::ZeroBBox); } let rect = bbox.to_rect().ok_or(CFFError::BboxOverflow)?; Ok((rect, ctx.width)) } fn _parse_char_string( ctx: &mut CharStringParserContext, char_string: &[u8], depth: u8, p: &mut CharStringParser, ) -> Result<(), CFFError> { let mut s = Stream::new(char_string); while !s.at_end() { let op = s.read::().ok_or(CFFError::ReadOutOfBounds)?; match op { 0 | 2 | 9 | 13 | 15 | 16 | 17 => { // Reserved. return Err(CFFError::InvalidOperator); } operator::HORIZONTAL_STEM | operator::VERTICAL_STEM | operator::HORIZONTAL_STEM_HINT_MASK | operator::VERTICAL_STEM_HINT_MASK => { // y dy {dya dyb}* hstem // x dx {dxa dxb}* vstem // y dy {dya dyb}* hstemhm // x dx {dxa dxb}* vstemhm // If the stack length is uneven, than the first value is a `width`. let len = if p.stack.len().is_odd() && ctx.width.is_none() { ctx.width = Some(p.stack.at(0)); p.stack.len() - 1 } else { p.stack.len() }; ctx.stems_len += len as u32 >> 1; // We are ignoring the hint operators. 
p.stack.clear(); } operator::VERTICAL_MOVE_TO => { let mut i = 0; if p.stack.len() == 2 { i += 1; if ctx.width.is_none() { ctx.width = Some(p.stack.at(0)); } } p.parse_vertical_move_to(i)?; } operator::LINE_TO => { p.parse_line_to()?; } operator::HORIZONTAL_LINE_TO => { p.parse_horizontal_line_to()?; } operator::VERTICAL_LINE_TO => { p.parse_vertical_line_to()?; } operator::CURVE_TO => { p.parse_curve_to()?; } operator::CALL_LOCAL_SUBROUTINE => { if p.stack.is_empty() { return Err(CFFError::InvalidArgumentsStackLength); } if depth == STACK_LIMIT { return Err(CFFError::NestingLimitReached); } // Parse and remember the local subroutine for the current glyph. // Since it's a pretty complex task, we're doing it only when // a local subroutine is actually requested by the glyphs charstring. if ctx.local_subrs.is_none() { if let FontKind::CID(ref cid) = ctx.metadata.kind { ctx.local_subrs = parse_cid_local_subrs(ctx.metadata.table_data, ctx.glyph_id, cid); } } if let Some(local_subrs) = ctx.local_subrs { let subroutine_bias = calc_subroutine_bias(local_subrs.len()); let index = conv_subroutine_index(p.stack.pop(), subroutine_bias)?; let char_string = local_subrs .get(index) .ok_or(CFFError::InvalidSubroutineIndex)?; _parse_char_string(ctx, char_string, depth + 1, p)?; } else { return Err(CFFError::NoLocalSubroutines); } if ctx.has_endchar && !ctx.has_seac { if !s.at_end() { return Err(CFFError::DataAfterEndChar); } break; } } operator::RETURN => { break; } TWO_BYTE_OPERATOR_MARK => { // flex let op2 = s.read::().ok_or(CFFError::ReadOutOfBounds)?; match op2 { operator::HFLEX => p.parse_hflex()?, operator::FLEX => p.parse_flex()?, operator::HFLEX1 => p.parse_hflex1()?, operator::FLEX1 => p.parse_flex1()?, _ => return Err(CFFError::UnsupportedOperator), } } operator::ENDCHAR => { if p.stack.len() == 4 || (ctx.width.is_none() && p.stack.len() == 5) { // Process 'seac'. 
let accent_char = seac_code_to_glyph_id(&ctx.metadata.charset, p.stack.pop()) .ok_or(CFFError::InvalidSeacCode)?; let base_char = seac_code_to_glyph_id(&ctx.metadata.charset, p.stack.pop()) .ok_or(CFFError::InvalidSeacCode)?; let dy = p.stack.pop(); let dx = p.stack.pop(); if ctx.width.is_none() && !p.stack.is_empty() { ctx.width = Some(p.stack.pop()) } ctx.has_seac = true; if depth == STACK_LIMIT { return Err(CFFError::NestingLimitReached); } let base_char_string = ctx .metadata .char_strings .get(u32::from(base_char.0)) .ok_or(CFFError::InvalidSeacCode)?; _parse_char_string(ctx, base_char_string, depth + 1, p)?; p.x = dx; p.y = dy; let accent_char_string = ctx .metadata .char_strings .get(u32::from(accent_char.0)) .ok_or(CFFError::InvalidSeacCode)?; _parse_char_string(ctx, accent_char_string, depth + 1, p)?; } else if p.stack.len() == 1 && ctx.width.is_none() { ctx.width = Some(p.stack.pop()); } if !p.is_first_move_to { p.is_first_move_to = true; p.builder.close(); } if !s.at_end() { return Err(CFFError::DataAfterEndChar); } ctx.has_endchar = true; break; } operator::HINT_MASK | operator::COUNTER_MASK => { let mut len = p.stack.len(); // We are ignoring the hint operators. p.stack.clear(); // If the stack length is uneven, than the first value is a `width`. 
if len.is_odd() { len -= 1; if ctx.width.is_none() { ctx.width = Some(p.stack.at(0)); } } ctx.stems_len += len as u32 >> 1; s.advance(usize::num_from((ctx.stems_len + 7) >> 3)); } operator::MOVE_TO => { let mut i = 0; if p.stack.len() == 3 { i += 1; if ctx.width.is_none() { ctx.width = Some(p.stack.at(0)); } } p.parse_move_to(i)?; } operator::HORIZONTAL_MOVE_TO => { let mut i = 0; if p.stack.len() == 2 { i += 1; if ctx.width.is_none() { ctx.width = Some(p.stack.at(0)); } } p.parse_horizontal_move_to(i)?; } operator::CURVE_LINE => { p.parse_curve_line()?; } operator::LINE_CURVE => { p.parse_line_curve()?; } operator::VV_CURVE_TO => { p.parse_vv_curve_to()?; } operator::HH_CURVE_TO => { p.parse_hh_curve_to()?; } operator::SHORT_INT => { let n = s.read::().ok_or(CFFError::ReadOutOfBounds)?; p.stack.push(f32::from(n))?; } operator::CALL_GLOBAL_SUBROUTINE => { if p.stack.is_empty() { return Err(CFFError::InvalidArgumentsStackLength); } if depth == STACK_LIMIT { return Err(CFFError::NestingLimitReached); } let subroutine_bias = calc_subroutine_bias(ctx.metadata.global_subrs.len()); let index = conv_subroutine_index(p.stack.pop(), subroutine_bias)?; let char_string = ctx .metadata .global_subrs .get(index) .ok_or(CFFError::InvalidSubroutineIndex)?; _parse_char_string(ctx, char_string, depth + 1, p)?; if ctx.has_endchar && !ctx.has_seac { if !s.at_end() { return Err(CFFError::DataAfterEndChar); } break; } } operator::VH_CURVE_TO => { p.parse_vh_curve_to()?; } operator::HV_CURVE_TO => { p.parse_hv_curve_to()?; } 32..=246 => { p.parse_int1(op)?; } 247..=250 => { p.parse_int2(op, &mut s)?; } 251..=254 => { p.parse_int3(op, &mut s)?; } operator::FIXED_16_16 => { p.parse_fixed(&mut s)?; } } if p.width_only && ctx.width.is_some() { break; } } // TODO: 'A charstring subroutine must end with either an endchar or a return operator.' 
Ok(()) } fn seac_code_to_glyph_id(charset: &Charset, n: f32) -> Option { let code = u8::try_num_from(n)?; let sid = STANDARD_ENCODING[usize::from(code)]; let sid = StringId(u16::from(sid)); match charset { Charset::ISOAdobe => { // ISO Adobe charset only defines string ids up to 228 (zcaron) if code <= 228 { Some(GlyphId(sid.0)) } else { None } } Charset::Expert | Charset::ExpertSubset => None, _ => charset.sid_to_gid(sid), } } #[derive(Clone, Copy, Debug)] enum FDSelect<'a> { Format0(LazyArray16<'a, u8>), Format3(&'a [u8]), // It's easier to parse it in-place. } impl Default for FDSelect<'_> { fn default() -> Self { FDSelect::Format0(LazyArray16::default()) } } impl FDSelect<'_> { fn font_dict_index(&self, glyph_id: GlyphId) -> Option { match self { FDSelect::Format0(ref array) => array.get(glyph_id.0), FDSelect::Format3(data) => { let mut s = Stream::new(data); let number_of_ranges = s.read::()?; if number_of_ranges == 0 { return None; } // 'A sentinel GID follows the last range element and serves // to delimit the last range in the array.' // So we can simply increase the number of ranges by one. 
let number_of_ranges = number_of_ranges.checked_add(1)?; // Range is: GlyphId + u8 let mut prev_first_glyph = s.read::()?; let mut prev_index = s.read::()?; for _ in 1..number_of_ranges { let curr_first_glyph = s.read::()?; if (prev_first_glyph..curr_first_glyph).contains(&glyph_id) { return Some(prev_index); } else { prev_index = s.read::()?; } prev_first_glyph = curr_first_glyph; } None } } } } fn parse_fd_select<'a>(number_of_glyphs: u16, s: &mut Stream<'a>) -> Option> { let format = s.read::()?; match format { 0 => Some(FDSelect::Format0(s.read_array16::(number_of_glyphs)?)), 3 => Some(FDSelect::Format3(s.tail()?)), _ => None, } } fn parse_sid_metadata<'a>( data: &'a [u8], top_dict: TopDict, encoding: Encoding<'a>, ) -> Option> { let mut metadata = SIDMetadata::default(); metadata.encoding = encoding; let private_dict = if let Some(range) = top_dict.private_dict_range.clone() { parse_private_dict(data.get(range)?) } else { return Some(FontKind::SID(metadata)); }; metadata.default_width = private_dict.default_width.unwrap_or(0.0); metadata.nominal_width = private_dict.nominal_width.unwrap_or(0.0); if let (Some(private_dict_range), Some(subroutines_offset)) = ( top_dict.private_dict_range, private_dict.local_subroutines_offset, ) { // 'The local subroutines offset is relative to the beginning // of the Private DICT data.' if let Some(start) = private_dict_range.start.checked_add(subroutines_offset) { let data = data.get(start..data.len())?; let mut s = Stream::new(data); metadata.local_subrs = parse_index::(&mut s)?; } } Some(FontKind::SID(metadata)) } fn parse_cid_metadata(data: &[u8], top_dict: TopDict, number_of_glyphs: u16) -> Option { let (charset_offset, fd_array_offset, fd_select_offset) = match ( top_dict.charset_offset, top_dict.fd_array_offset, top_dict.fd_select_offset, ) { (Some(a), Some(b), Some(c)) => (a, b, c), _ => return None, // charset, FDArray and FDSelect must be set. 
}; if charset_offset <= charset_id::EXPERT_SUBSET { // 'There are no predefined charsets for CID fonts.' // Adobe Technical Note #5176, chapter 18 CID-keyed Fonts return None; } let mut metadata = CIDMetadata::default(); metadata.fd_array = { let mut s = Stream::new_at(data, fd_array_offset)?; parse_index::(&mut s)? }; metadata.fd_select = { let mut s = Stream::new_at(data, fd_select_offset)?; parse_fd_select(number_of_glyphs, &mut s)? }; Some(FontKind::CID(metadata)) } /// A [Compact Font Format Table]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/cff). #[derive(Clone, Copy)] pub struct Table<'a> { // The whole CFF table. // Used to resolve a local subroutine in a CID font. table_data: &'a [u8], #[allow(dead_code)] strings: Index<'a>, global_subrs: Index<'a>, charset: Charset<'a>, number_of_glyphs: NonZeroU16, matrix: Matrix, char_strings: Index<'a>, kind: FontKind<'a>, } impl<'a> Table<'a> { /// Parses a table from raw data. pub fn parse(data: &'a [u8]) -> Option { let mut s = Stream::new(data); // Parse Header. let major = s.read::()?; s.skip::(); // minor let header_size = s.read::()?; s.skip::(); // Absolute offset if major != 1 { return None; } // Jump to Name INDEX. It's not necessarily right after the header. if header_size > 4 { s.advance(usize::from(header_size) - 4); } // Skip Name INDEX. skip_index::(&mut s)?; let top_dict = parse_top_dict(&mut s)?; // Must be set, otherwise there are nothing to parse. if top_dict.char_strings_offset == 0 { return None; } // String INDEX. let strings = parse_index::(&mut s)?; // Parse Global Subroutines INDEX. let global_subrs = parse_index::(&mut s)?; let char_strings = { let mut s = Stream::new_at(data, top_dict.char_strings_offset)?; parse_index::(&mut s)? }; // 'The number of glyphs is the value of the count field in the CharStrings INDEX.' 
let number_of_glyphs = u16::try_from(char_strings.len()) .ok() .and_then(NonZeroU16::new)?; let charset = match top_dict.charset_offset { Some(charset_id::ISO_ADOBE) => Charset::ISOAdobe, Some(charset_id::EXPERT) => Charset::Expert, Some(charset_id::EXPERT_SUBSET) => Charset::ExpertSubset, Some(offset) => { let mut s = Stream::new_at(data, offset)?; parse_charset(number_of_glyphs, &mut s)? } None => Charset::ISOAdobe, // default }; let matrix = top_dict.matrix; let kind = if top_dict.has_ros { parse_cid_metadata(data, top_dict, number_of_glyphs.get())? } else { // Only SID fonts are allowed to have an Encoding. let encoding = match top_dict.encoding_offset { Some(encoding_id::STANDARD) => Encoding::new_standard(), Some(encoding_id::EXPERT) => Encoding::new_expert(), Some(offset) => parse_encoding(&mut Stream::new_at(data, offset)?)?, None => Encoding::new_standard(), // default }; parse_sid_metadata(data, top_dict, encoding)? }; Some(Self { table_data: data, strings, global_subrs, charset, number_of_glyphs, matrix, char_strings, kind, }) } /// Returns a total number of glyphs in the font. /// /// Never zero. #[inline] pub fn number_of_glyphs(&self) -> u16 { self.number_of_glyphs.get() } /// Returns a font transformation matrix. #[inline] pub fn matrix(&self) -> Matrix { self.matrix } /// Outlines a glyph. pub fn outline( &self, glyph_id: GlyphId, builder: &mut dyn OutlineBuilder, ) -> Result { let data = self .char_strings .get(u32::from(glyph_id.0)) .ok_or(CFFError::NoGlyph)?; parse_char_string(data, self, glyph_id, false, builder).map(|v| v.0) } /// Resolves a Glyph ID for a code point. /// /// Similar to [`Face::glyph_index`](crate::Face::glyph_index) but 8bit /// and uses CFF encoding and charset tables instead of TrueType `cmap`. 
pub fn glyph_index(&self, code_point: u8) -> Option { match self.kind { FontKind::SID(ref sid_meta) => { match sid_meta.encoding.code_to_gid(&self.charset, code_point) { Some(id) => Some(id), None => { // Try using the Standard encoding otherwise. // Custom Encodings does not guarantee to include all glyphs. Encoding::new_standard().code_to_gid(&self.charset, code_point) } } } FontKind::CID(_) => None, } } /// Returns a glyph width. /// /// This value is different from outline bbox width and is stored separately. /// /// Technically similar to [`Face::glyph_hor_advance`](crate::Face::glyph_hor_advance). pub fn glyph_width(&self, glyph_id: GlyphId) -> Option { match self.kind { FontKind::SID(ref sid) => { let data = self.char_strings.get(u32::from(glyph_id.0))?; let (_, width) = parse_char_string(data, self, glyph_id, true, &mut DummyOutline).ok()?; let width = width .map(|w| sid.nominal_width + w) .unwrap_or(sid.default_width); u16::try_from(width as i32).ok() } FontKind::CID(_) => None, } } /// Returns a glyph ID by a name. #[cfg(feature = "glyph-names")] pub fn glyph_index_by_name(&self, name: &str) -> Option { match self.kind { FontKind::SID(_) => { let sid = if let Some(index) = STANDARD_NAMES.iter().position(|n| *n == name) { StringId(index as u16) } else { let index = self .strings .into_iter() .position(|n| n == name.as_bytes())?; StringId((STANDARD_NAMES.len() + index) as u16) }; self.charset.sid_to_gid(sid) } FontKind::CID(_) => None, } } /// Returns a glyph name. #[cfg(feature = "glyph-names")] pub fn glyph_name(&self, glyph_id: GlyphId) -> Option<&'a str> { match self.kind { FontKind::SID(_) => { let sid = self.charset.gid_to_sid(glyph_id)?; let sid = usize::from(sid.0); match STANDARD_NAMES.get(sid) { Some(name) => Some(name), None => { let idx = u32::try_from(sid - STANDARD_NAMES.len()).ok()?; let name = self.strings.get(idx)?; core::str::from_utf8(name).ok() } } } FontKind::CID(_) => None, } } /// Returns the CID corresponding to a glyph ID. 
/// /// Returns `None` if this is not a CIDFont. #[cfg(feature = "glyph-names")] pub fn glyph_cid(&self, glyph_id: GlyphId) -> Option { match self.kind { FontKind::SID(_) => None, FontKind::CID(_) => self.charset.gid_to_sid(glyph_id).map(|id| id.0), } } } impl core::fmt::Debug for Table<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "Table {{ ... }}") } } ttf-parser-0.24.1/src/tables/cff/cff2.rs000064400000000000000000000442311046102023000157700ustar 00000000000000//! A [Compact Font Format 2 Table]( //! https://docs.microsoft.com/en-us/typography/opentype/spec/cff2) implementation. // https://docs.microsoft.com/en-us/typography/opentype/spec/cff2charstr use core::convert::TryFrom; use core::ops::Range; use super::argstack::ArgumentsStack; use super::charstring::CharStringParser; use super::dict::DictionaryParser; use super::index::{parse_index, Index}; use super::{calc_subroutine_bias, conv_subroutine_index, Builder, CFFError}; use crate::parser::{NumFrom, Stream, TryNumFrom}; use crate::var_store::*; use crate::{GlyphId, NormalizedCoordinate, OutlineBuilder, Rect, RectF}; // https://docs.microsoft.com/en-us/typography/opentype/spec/cff2#7-top-dict-data // 'Operators in DICT may be preceded by up to a maximum of 513 operands.' 
const MAX_OPERANDS_LEN: usize = 513; // https://docs.microsoft.com/en-us/typography/opentype/spec/cff2charstr#appendix-b-cff2-charstring-implementation-limits const STACK_LIMIT: u8 = 10; const MAX_ARGUMENTS_STACK_LEN: usize = 513; const TWO_BYTE_OPERATOR_MARK: u8 = 12; // https://docs.microsoft.com/en-us/typography/opentype/spec/cff2charstr#4-charstring-operators mod operator { pub const HORIZONTAL_STEM: u8 = 1; pub const VERTICAL_STEM: u8 = 3; pub const VERTICAL_MOVE_TO: u8 = 4; pub const LINE_TO: u8 = 5; pub const HORIZONTAL_LINE_TO: u8 = 6; pub const VERTICAL_LINE_TO: u8 = 7; pub const CURVE_TO: u8 = 8; pub const CALL_LOCAL_SUBROUTINE: u8 = 10; pub const VS_INDEX: u8 = 15; pub const BLEND: u8 = 16; pub const HORIZONTAL_STEM_HINT_MASK: u8 = 18; pub const HINT_MASK: u8 = 19; pub const COUNTER_MASK: u8 = 20; pub const MOVE_TO: u8 = 21; pub const HORIZONTAL_MOVE_TO: u8 = 22; pub const VERTICAL_STEM_HINT_MASK: u8 = 23; pub const CURVE_LINE: u8 = 24; pub const LINE_CURVE: u8 = 25; pub const VV_CURVE_TO: u8 = 26; pub const HH_CURVE_TO: u8 = 27; pub const SHORT_INT: u8 = 28; pub const CALL_GLOBAL_SUBROUTINE: u8 = 29; pub const VH_CURVE_TO: u8 = 30; pub const HV_CURVE_TO: u8 = 31; pub const HFLEX: u8 = 34; pub const FLEX: u8 = 35; pub const HFLEX1: u8 = 36; pub const FLEX1: u8 = 37; pub const FIXED_16_16: u8 = 255; } // https://docs.microsoft.com/en-us/typography/opentype/spec/cff2#table-9-top-dict-operator-entries mod top_dict_operator { pub const CHAR_STRINGS_OFFSET: u16 = 17; pub const VARIATION_STORE_OFFSET: u16 = 24; pub const FONT_DICT_INDEX_OFFSET: u16 = 1236; } // https://docs.microsoft.com/en-us/typography/opentype/spec/cff2#table-10-font-dict-operator-entries mod font_dict_operator { pub const PRIVATE_DICT_SIZE_AND_OFFSET: u16 = 18; } // https://docs.microsoft.com/en-us/typography/opentype/spec/cff2#table-16-private-dict-operators mod private_dict_operator { pub const LOCAL_SUBROUTINES_OFFSET: u16 = 19; } #[derive(Clone, Copy, Default)] struct TopDictData { 
    char_strings_offset: usize,
    font_dict_index_offset: Option<usize>,
    variation_store_offset: Option<usize>,
}

fn parse_top_dict(data: &[u8]) -> Option<TopDictData> {
    let mut dict_data = TopDictData::default();
    let mut operands_buffer = [0.0; MAX_OPERANDS_LEN];
    let mut dict_parser = DictionaryParser::new(data, &mut operands_buffer);
    while let Some(operator) = dict_parser.parse_next() {
        if operator.get() == top_dict_operator::CHAR_STRINGS_OFFSET {
            dict_data.char_strings_offset = dict_parser.parse_offset()?;
        } else if operator.get() == top_dict_operator::FONT_DICT_INDEX_OFFSET {
            dict_data.font_dict_index_offset = dict_parser.parse_offset();
        } else if operator.get() == top_dict_operator::VARIATION_STORE_OFFSET {
            dict_data.variation_store_offset = dict_parser.parse_offset();
        }
    }

    // Must be set, otherwise there is nothing to parse.
    if dict_data.char_strings_offset == 0 {
        return None;
    }

    Some(dict_data)
}

fn parse_font_dict(data: &[u8]) -> Option<Range<usize>> {
    let mut private_dict_range = None;

    let mut operands_buffer = [0.0; MAX_OPERANDS_LEN];
    let mut dict_parser = DictionaryParser::new(data, &mut operands_buffer);
    while let Some(operator) = dict_parser.parse_next() {
        if operator.get() == font_dict_operator::PRIVATE_DICT_SIZE_AND_OFFSET {
            dict_parser.parse_operands()?;
            let operands = dict_parser.operands();
            if operands.len() == 2 {
                let len = usize::try_from(operands[0] as i32).ok()?;
                let start = usize::try_from(operands[1] as i32).ok()?;
                let end = start.checked_add(len)?;
                private_dict_range = Some(start..end);
            }
            break;
        }
    }

    private_dict_range
}

fn parse_private_dict(data: &[u8]) -> Option<usize> {
    let mut subroutines_offset = None;
    let mut operands_buffer = [0.0; MAX_OPERANDS_LEN];
    let mut dict_parser = DictionaryParser::new(data, &mut operands_buffer);
    while let Some(operator) = dict_parser.parse_next() {
        if operator.get() == private_dict_operator::LOCAL_SUBROUTINES_OFFSET {
            dict_parser.parse_operands()?;
            let operands = dict_parser.operands();
            if operands.len() == 1 {
                subroutines_offset = usize::try_from(operands[0]
as i32).ok();
            }
            break;
        }
    }

    subroutines_offset
}

/// CFF2 allows up to 65535 scalars, but an average font will have 3-5.
/// So 64 is more than enough.
const SCALARS_MAX: u8 = 64;

#[derive(Clone, Copy)]
pub(crate) struct Scalars {
    d: [f32; SCALARS_MAX as usize], // 256B
    len: u8,
}

impl Default for Scalars {
    fn default() -> Self {
        Scalars {
            d: [0.0; SCALARS_MAX as usize],
            len: 0,
        }
    }
}

impl Scalars {
    pub fn len(&self) -> u8 {
        self.len
    }

    pub fn clear(&mut self) {
        self.len = 0;
    }

    pub fn at(&self, i: u8) -> f32 {
        if i < self.len {
            self.d[usize::from(i)]
        } else {
            0.0
        }
    }

    pub fn push(&mut self, n: f32) -> Option<()> {
        if self.len < SCALARS_MAX {
            self.d[usize::from(self.len)] = n;
            self.len += 1;
            Some(())
        } else {
            None
        }
    }
}

struct CharStringParserContext<'a> {
    metadata: &'a Table<'a>,
    coordinates: &'a [NormalizedCoordinate],
    scalars: Scalars,
    had_vsindex: bool,
    had_blend: bool,
    stems_len: u32,
}

impl CharStringParserContext<'_> {
    fn update_scalars(&mut self, index: u16) -> Result<(), CFFError> {
        self.scalars.clear();

        let indices = self
            .metadata
            .item_variation_store
            .region_indices(index)
            .ok_or(CFFError::InvalidItemVariationDataIndex)?;
        for index in indices {
            let scalar = self
                .metadata
                .item_variation_store
                .regions
                .evaluate_region(index, self.coordinates);
            self.scalars
                .push(scalar)
                .ok_or(CFFError::BlendRegionsLimitReached)?;
        }

        Ok(())
    }
}

fn parse_char_string(
    data: &[u8],
    metadata: &Table,
    coordinates: &[NormalizedCoordinate],
    builder: &mut dyn OutlineBuilder,
) -> Result<Rect, CFFError> {
    let mut ctx = CharStringParserContext {
        metadata,
        coordinates,
        scalars: Scalars::default(),
        had_vsindex: false,
        had_blend: false,
        stems_len: 0,
    };

    // Load scalars at default index.
    ctx.update_scalars(0)?;

    let mut inner_builder = Builder {
        builder,
        bbox: RectF::new(),
    };

    let stack = ArgumentsStack {
        data: &mut [0.0; MAX_ARGUMENTS_STACK_LEN], // 2052B
        len: 0,
        max_len: MAX_ARGUMENTS_STACK_LEN,
    };
    let mut parser = CharStringParser {
        stack,
        builder: &mut inner_builder,
        x: 0.0,
        y: 0.0,
        has_move_to: false,
        is_first_move_to: true,
        width_only: false,
    };
    _parse_char_string(&mut ctx, data, 0, &mut parser)?;
    // let _ = _parse_char_string(&mut ctx, data, 0.0, 0.0, &mut stack, 0, &mut inner_builder)?;

    let bbox = parser.builder.bbox;

    // Check that bbox was changed.
    if bbox.is_default() {
        return Err(CFFError::ZeroBBox);
    }

    bbox.to_rect().ok_or(CFFError::BboxOverflow)
}

fn _parse_char_string(
    ctx: &mut CharStringParserContext,
    char_string: &[u8],
    depth: u8,
    p: &mut CharStringParser,
) -> Result<(), CFFError> {
    let mut s = Stream::new(char_string);
    while !s.at_end() {
        let op = s.read::<u8>().ok_or(CFFError::ReadOutOfBounds)?;
        match op {
            0 | 2 | 9 | 11 | 13 | 14 | 17 => {
                // Reserved.
                return Err(CFFError::InvalidOperator);
            }
            operator::HORIZONTAL_STEM
            | operator::VERTICAL_STEM
            | operator::HORIZONTAL_STEM_HINT_MASK
            | operator::VERTICAL_STEM_HINT_MASK => {
                // y dy {dya dyb}* hstem
                // x dx {dxa dxb}* vstem
                // y dy {dya dyb}* hstemhm
                // x dx {dxa dxb}* vstemhm
                ctx.stems_len += p.stack.len() as u32 >> 1;

                // We are ignoring the hint operators.
                p.stack.clear();
            }
            operator::VERTICAL_MOVE_TO => {
                p.parse_vertical_move_to(0)?;
            }
            operator::LINE_TO => {
                p.parse_line_to()?;
            }
            operator::HORIZONTAL_LINE_TO => {
                p.parse_horizontal_line_to()?;
            }
            operator::VERTICAL_LINE_TO => {
                p.parse_vertical_line_to()?;
            }
            operator::CURVE_TO => {
                p.parse_curve_to()?;
            }
            operator::CALL_LOCAL_SUBROUTINE => {
                if p.stack.is_empty() {
                    return Err(CFFError::InvalidArgumentsStackLength);
                }

                if depth == STACK_LIMIT {
                    return Err(CFFError::NestingLimitReached);
                }

                let subroutine_bias = calc_subroutine_bias(ctx.metadata.local_subrs.len());
                let index = conv_subroutine_index(p.stack.pop(), subroutine_bias)?;
                let char_string = ctx
                    .metadata
                    .local_subrs
                    .get(index)
                    .ok_or(CFFError::InvalidSubroutineIndex)?;
                _parse_char_string(ctx, char_string, depth + 1, p)?;
            }
            TWO_BYTE_OPERATOR_MARK => {
                // flex
                let op2 = s.read::<u8>().ok_or(CFFError::ReadOutOfBounds)?;
                match op2 {
                    operator::HFLEX => p.parse_hflex()?,
                    operator::FLEX => p.parse_flex()?,
                    operator::HFLEX1 => p.parse_hflex1()?,
                    operator::FLEX1 => p.parse_flex1()?,
                    _ => return Err(CFFError::UnsupportedOperator),
                }
            }
            operator::VS_INDEX => {
                // |- ivs vsindex (15) |-

                // `vsindex` must precede the first `blend` operator, and may occur only once.
                if ctx.had_blend || ctx.had_vsindex {
                    // TODO: maybe add a custom error
                    return Err(CFFError::InvalidOperator);
                }

                if p.stack.len() != 1 {
                    return Err(CFFError::InvalidArgumentsStackLength);
                }

                let index = u16::try_num_from(p.stack.pop())
                    .ok_or(CFFError::InvalidItemVariationDataIndex)?;
                ctx.update_scalars(index)?;
                ctx.had_vsindex = true;

                p.stack.clear();
            }
            operator::BLEND => {
                // num(0)..num(n-1), delta(0,0)..delta(k-1,0),
                // delta(0,1)..delta(k-1,1) .. delta(0,n-1)..delta(k-1,n-1)
                // n blend (16) val(0)..val(n-1)

                ctx.had_blend = true;

                let n = u16::try_num_from(p.stack.pop())
                    .ok_or(CFFError::InvalidNumberOfBlendOperands)?;
                let k = ctx.scalars.len();

                let len = usize::from(n) * (usize::from(k) + 1);
                if p.stack.len() < len {
                    return Err(CFFError::InvalidArgumentsStackLength);
                }

                let start = p.stack.len() - len;
                for i in (0..n).rev() {
                    for j in 0..k {
                        let delta = p.stack.pop();
                        p.stack.data[start + usize::from(i)] += delta * ctx.scalars.at(k - j - 1);
                    }
                }
            }
            operator::HINT_MASK | operator::COUNTER_MASK => {
                ctx.stems_len += p.stack.len() as u32 >> 1;
                s.advance(usize::num_from((ctx.stems_len + 7) >> 3));

                // We are ignoring the hint operators.
                p.stack.clear();
            }
            operator::MOVE_TO => {
                p.parse_move_to(0)?;
            }
            operator::HORIZONTAL_MOVE_TO => {
                p.parse_horizontal_move_to(0)?;
            }
            operator::CURVE_LINE => {
                p.parse_curve_line()?;
            }
            operator::LINE_CURVE => {
                p.parse_line_curve()?;
            }
            operator::VV_CURVE_TO => {
                p.parse_vv_curve_to()?;
            }
            operator::HH_CURVE_TO => {
                p.parse_hh_curve_to()?;
            }
            operator::SHORT_INT => {
                let n = s.read::<i16>().ok_or(CFFError::ReadOutOfBounds)?;
                p.stack.push(f32::from(n))?;
            }
            operator::CALL_GLOBAL_SUBROUTINE => {
                if p.stack.is_empty() {
                    return Err(CFFError::InvalidArgumentsStackLength);
                }

                if depth == STACK_LIMIT {
                    return Err(CFFError::NestingLimitReached);
                }

                let subroutine_bias = calc_subroutine_bias(ctx.metadata.global_subrs.len());
                let index = conv_subroutine_index(p.stack.pop(), subroutine_bias)?;
                let char_string = ctx
                    .metadata
                    .global_subrs
                    .get(index)
                    .ok_or(CFFError::InvalidSubroutineIndex)?;
                _parse_char_string(ctx, char_string, depth + 1, p)?;
            }
            operator::VH_CURVE_TO => {
                p.parse_vh_curve_to()?;
            }
            operator::HV_CURVE_TO => {
                p.parse_hv_curve_to()?;
            }
            32..=246 => {
                p.parse_int1(op)?;
            }
            247..=250 => {
                p.parse_int2(op, &mut s)?;
            }
            251..=254 => {
                p.parse_int3(op, &mut s)?;
            }
            operator::FIXED_16_16 => {
                p.parse_fixed(&mut s)?;
            }
        }
    }

    Ok(())
}

/// A [Compact Font Format 2 Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/cff2).
#[derive(Clone, Copy, Default)]
pub struct Table<'a> {
    global_subrs: Index<'a>,
    local_subrs: Index<'a>,
    char_strings: Index<'a>,
    item_variation_store: ItemVariationStore<'a>,
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);

        // Parse Header.
        let major = s.read::<u8>()?;
        s.skip::<u8>(); // minor
        let header_size = s.read::<u8>()?;
        let top_dict_length = s.read::<u16>()?;

        if major != 2 {
            return None;
        }

        // Jump to Top DICT. It's not necessarily right after the header.
        if header_size > 5 {
            s.advance(usize::from(header_size) - 5);
        }

        let top_dict_data = s.read_bytes(usize::from(top_dict_length))?;
        let top_dict = parse_top_dict(top_dict_data)?;

        let mut metadata = Self::default();

        // Parse Global Subroutines INDEX.
        metadata.global_subrs = parse_index::<u32>(&mut s)?;

        metadata.char_strings = {
            let mut s = Stream::new_at(data, top_dict.char_strings_offset)?;
            parse_index::<u32>(&mut s)?
        };

        if let Some(offset) = top_dict.variation_store_offset {
            let mut s = Stream::new_at(data, offset)?;
            s.skip::<u16>(); // length
            metadata.item_variation_store = ItemVariationStore::parse(s)?;
        }

        // TODO: simplify
        if let Some(offset) = top_dict.font_dict_index_offset {
            let mut s = Stream::new_at(data, offset)?;
            'outer: for font_dict_data in parse_index::<u32>(&mut s)? {
                if let Some(private_dict_range) = parse_font_dict(font_dict_data) {
                    // 'Private DICT size and offset, from start of the CFF2 table.'
                    let private_dict_data = data.get(private_dict_range.clone())?;
                    if let Some(subroutines_offset) = parse_private_dict(private_dict_data) {
                        // 'The local subroutines offset is relative to the beginning
                        // of the Private DICT data.'
                        if let Some(start) =
                            private_dict_range.start.checked_add(subroutines_offset)
                        {
                            let data = data.get(start..data.len())?;
                            let mut s = Stream::new(data);
                            metadata.local_subrs = parse_index::<u32>(&mut s)?;
                            break 'outer;
                        }
                    }
                }
            }
        }

        Some(metadata)
    }

    /// Outlines a glyph.
    pub fn outline(
        &self,
        coordinates: &[NormalizedCoordinate],
        glyph_id: GlyphId,
        builder: &mut dyn OutlineBuilder,
    ) -> Result<Rect, CFFError> {
        let data = self
            .char_strings
            .get(u32::from(glyph_id.0))
            .ok_or(CFFError::NoGlyph)?;
        parse_char_string(data, self, coordinates, builder)
    }
}

impl core::fmt::Debug for Table<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Table {{ ... }}")
    }
}
ttf-parser-0.24.1/src/tables/cff/charset.rs000064400000000000000000000202131046102023000165730ustar 00000000000000use super::StringId;
use crate::parser::{FromData, LazyArray16, Stream};
use crate::GlyphId;
use core::num::NonZeroU16;

/// The Expert Encoding conversion as defined in the Adobe Technical Note #5176 Appendix C.
#[rustfmt::skip]
#[cfg(feature = "glyph-names")]
const EXPERT_ENCODING: &[u16] = &[
    0, 1, 229, 230, 231, 232, 233, 234, 235, 236, 237, 238, 13, 14, 15, 99,
    239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 27, 28, 249, 250, 251, 252,
    253, 254, 255, 256, 257, 258, 259, 260, 261, 262, 263, 264, 265, 266, 109, 110,
    267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 277, 278, 279, 280, 281, 282,
    283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298,
    299, 300, 301, 302, 303, 304, 305, 306, 307, 308, 309, 310, 311, 312, 313, 314,
    315, 316, 317, 318, 158, 155, 163, 319, 320, 321, 322, 323, 324, 325, 326, 150,
    164, 169, 327, 328, 329, 330, 331, 332, 333, 334, 335, 336, 337, 338, 339, 340,
    341, 342, 343, 344, 345, 346, 347, 348, 349, 350, 351, 352, 353, 354, 355, 356,
    357, 358, 359, 360, 361, 362, 363, 364, 365, 366, 367, 368, 369, 370, 371, 372,
    373, 374, 375, 376, 377, 378,
];

/// The Expert Subset Encoding conversion as defined in the Adobe Technical Note #5176 Appendix C.
#[rustfmt::skip]
#[cfg(feature = "glyph-names")]
const EXPERT_SUBSET_ENCODING: &[u16] = &[
    0, 1, 231, 232, 235, 236, 237, 238, 13, 14, 15, 99, 239, 240, 241, 242,
    243, 244, 245, 246, 247, 248, 27, 28, 249, 250, 251, 253, 254, 255, 256, 257,
    258, 259, 260, 261, 262, 263, 264, 265, 266, 109, 110, 267, 268, 269, 270, 272,
    300, 301, 302, 305, 314, 315, 158, 155, 163, 320, 321, 322, 323, 324, 325, 326,
    150, 164, 169, 327, 328, 329, 330, 331, 332, 333, 334, 335, 336, 337, 338, 339,
    340, 341, 342, 343, 344, 345, 346
];

#[derive(Clone, Copy, Debug)]
pub(crate) struct Format1Range {
    first: StringId,
    left: u8,
}

impl FromData for Format1Range {
    const SIZE: usize = 3;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(Format1Range {
            first: s.read::<StringId>()?,
            left: s.read::<u8>()?,
        })
    }
}

#[derive(Clone, Copy, Debug)]
pub(crate) struct Format2Range {
    first: StringId,
    left: u16,
}

impl FromData for Format2Range {
    const SIZE: usize = 4;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(Format2Range {
            first: s.read::<StringId>()?,
            left: s.read::<u16>()?,
        })
    }
}

#[derive(Clone, Copy, Debug)]
pub(crate) enum Charset<'a> {
    ISOAdobe,
    Expert,
    ExpertSubset,
    Format0(LazyArray16<'a, StringId>),
    Format1(LazyArray16<'a, Format1Range>),
    Format2(LazyArray16<'a, Format2Range>),
}

impl Charset<'_> {
    pub fn sid_to_gid(&self, sid: StringId) -> Option<GlyphId> {
        if sid.0 == 0 {
            return Some(GlyphId(0));
        }

        match self {
            Charset::ISOAdobe | Charset::Expert | Charset::ExpertSubset => None,
            Charset::Format0(ref array) => {
                // First glyph is omitted, so we have to add 1.
                array
                    .into_iter()
                    .position(|n| n == sid)
                    .map(|n| GlyphId(n as u16 + 1))
            }
            Charset::Format1(array) => {
                let mut glyph_id = GlyphId(1);
                for range in *array {
                    let last = u32::from(range.first.0) + u32::from(range.left);
                    if range.first <= sid && u32::from(sid.0) <= last {
                        glyph_id.0 += sid.0 - range.first.0;
                        return Some(glyph_id);
                    }

                    glyph_id.0 += u16::from(range.left) + 1;
                }

                None
            }
            Charset::Format2(array) => {
                // The same as format 1, but Range::left is u16.
                let mut glyph_id = GlyphId(1);
                for range in *array {
                    let last = u32::from(range.first.0) + u32::from(range.left);
                    if sid >= range.first && u32::from(sid.0) <= last {
                        glyph_id.0 += sid.0 - range.first.0;
                        return Some(glyph_id);
                    }

                    glyph_id.0 += range.left + 1;
                }

                None
            }
        }
    }

    #[cfg(feature = "glyph-names")]
    pub fn gid_to_sid(&self, gid: GlyphId) -> Option<StringId> {
        match self {
            Charset::ISOAdobe => {
                if gid.0 <= 228 {
                    Some(StringId(gid.0))
                } else {
                    None
                }
            }
            Charset::Expert => EXPERT_ENCODING
                .get(usize::from(gid.0))
                .cloned()
                .map(StringId),
            Charset::ExpertSubset => EXPERT_SUBSET_ENCODING
                .get(usize::from(gid.0))
                .cloned()
                .map(StringId),
            Charset::Format0(ref array) => {
                if gid.0 == 0 {
                    Some(StringId(0))
                } else {
                    array.get(gid.0 - 1)
                }
            }
            Charset::Format1(array) => {
                if gid.0 == 0 {
                    Some(StringId(0))
                } else {
                    let mut sid = gid.0 - 1;
                    for range in *array {
                        if sid <= u16::from(range.left) {
                            sid = sid.checked_add(range.first.0)?;
                            return Some(StringId(sid));
                        }

                        sid = sid.checked_sub(u16::from(range.left) + 1)?;
                    }

                    None
                }
            }
            Charset::Format2(array) => {
                if gid.0 == 0 {
                    Some(StringId(0))
                } else {
                    let mut sid = gid.0 - 1;
                    for range in *array {
                        if sid <= range.left {
                            sid = sid.checked_add(range.first.0)?;
                            return Some(StringId(sid));
                        }

                        sid = sid.checked_sub(range.left.checked_add(1)?)?;
                    }

                    None
                }
            }
        }
    }
}

pub(crate) fn parse_charset<'a>(
    number_of_glyphs: NonZeroU16,
    s: &mut Stream<'a>,
) -> Option<Charset<'a>> {
    // -1 everywhere, since `.notdef` is omitted.
    let format = s.read::<u8>()?;
    match format {
        0 => Some(Charset::Format0(
            s.read_array16::<StringId>(number_of_glyphs.get() - 1)?,
        )),
        1 => {
            // The number of ranges is not defined, so we have to
            // read until no glyphs are left.
            let mut count = 0;
            {
                let mut s = s.clone();
                let mut total_left = number_of_glyphs.get() - 1;
                while total_left > 0 {
                    s.skip::<StringId>(); // first
                    let left = s.read::<u8>()?;
                    total_left = total_left.checked_sub(u16::from(left) + 1)?;
                    count += 1;
                }
            }

            s.read_array16::<Format1Range>(count).map(Charset::Format1)
        }
        2 => {
            // The same as format 1, but Range::left is u16.
            let mut count = 0;
            {
                let mut s = s.clone();
                let mut total_left = number_of_glyphs.get() - 1;
                while total_left > 0 {
                    s.skip::<StringId>(); // first
                    let left = s.read::<u16>()?.checked_add(1)?;
                    total_left = total_left.checked_sub(left)?;
                    count += 1;
                }
            }

            s.read_array16::<Format2Range>(count).map(Charset::Format2)
        }
        _ => None,
    }
}
ttf-parser-0.24.1/src/tables/cff/charstring.rs000064400000000000000000000421131046102023000173110ustar 00000000000000use super::argstack::ArgumentsStack;
use super::{f32_abs, Builder, CFFError, IsEven};
use crate::parser::{Fixed, Stream};

pub(crate) struct CharStringParser<'a> {
    pub stack: ArgumentsStack<'a>,
    pub builder: &'a mut Builder<'a>,
    pub x: f32,
    pub y: f32,
    pub has_move_to: bool,
    pub is_first_move_to: bool,
    pub width_only: bool, // Exit right after the glyph width is parsed.
} impl CharStringParser<'_> { #[inline] pub fn parse_move_to(&mut self, offset: usize) -> Result<(), CFFError> { // dx1 dy1 if self.stack.len() != offset + 2 { return Err(CFFError::InvalidArgumentsStackLength); } if self.is_first_move_to { self.is_first_move_to = false; } else { self.builder.close(); } self.has_move_to = true; self.x += self.stack.at(offset + 0); self.y += self.stack.at(offset + 1); self.builder.move_to(self.x, self.y); self.stack.clear(); Ok(()) } #[inline] pub fn parse_horizontal_move_to(&mut self, offset: usize) -> Result<(), CFFError> { // dx1 if self.stack.len() != offset + 1 { return Err(CFFError::InvalidArgumentsStackLength); } if self.is_first_move_to { self.is_first_move_to = false; } else { self.builder.close(); } self.has_move_to = true; self.x += self.stack.at(offset); self.builder.move_to(self.x, self.y); self.stack.clear(); Ok(()) } #[inline] pub fn parse_vertical_move_to(&mut self, offset: usize) -> Result<(), CFFError> { // dy1 if self.stack.len() != offset + 1 { return Err(CFFError::InvalidArgumentsStackLength); } if self.is_first_move_to { self.is_first_move_to = false; } else { self.builder.close(); } self.has_move_to = true; self.y += self.stack.at(offset); self.builder.move_to(self.x, self.y); self.stack.clear(); Ok(()) } #[inline] pub fn parse_line_to(&mut self) -> Result<(), CFFError> { // {dxa dya}+ if !self.has_move_to { return Err(CFFError::MissingMoveTo); } if self.stack.len().is_odd() { return Err(CFFError::InvalidArgumentsStackLength); } let mut i = 0; while i < self.stack.len() { self.x += self.stack.at(i + 0); self.y += self.stack.at(i + 1); self.builder.line_to(self.x, self.y); i += 2; } self.stack.clear(); Ok(()) } #[inline] pub fn parse_horizontal_line_to(&mut self) -> Result<(), CFFError> { // dx1 {dya dxb}* // {dxa dyb}+ if !self.has_move_to { return Err(CFFError::MissingMoveTo); } if self.stack.is_empty() { return Err(CFFError::InvalidArgumentsStackLength); } let mut i = 0; while i < self.stack.len() { self.x += 
self.stack.at(i); i += 1; self.builder.line_to(self.x, self.y); if i == self.stack.len() { break; } self.y += self.stack.at(i); i += 1; self.builder.line_to(self.x, self.y); } self.stack.clear(); Ok(()) } #[inline] pub fn parse_vertical_line_to(&mut self) -> Result<(), CFFError> { // dy1 {dxa dyb}* // {dya dxb}+ if !self.has_move_to { return Err(CFFError::MissingMoveTo); } if self.stack.is_empty() { return Err(CFFError::InvalidArgumentsStackLength); } let mut i = 0; while i < self.stack.len() { self.y += self.stack.at(i); i += 1; self.builder.line_to(self.x, self.y); if i == self.stack.len() { break; } self.x += self.stack.at(i); i += 1; self.builder.line_to(self.x, self.y); } self.stack.clear(); Ok(()) } #[inline] pub fn parse_curve_to(&mut self) -> Result<(), CFFError> { // {dxa dya dxb dyb dxc dyc}+ if !self.has_move_to { return Err(CFFError::MissingMoveTo); } if self.stack.len() % 6 != 0 { return Err(CFFError::InvalidArgumentsStackLength); } let mut i = 0; while i < self.stack.len() { let x1 = self.x + self.stack.at(i + 0); let y1 = self.y + self.stack.at(i + 1); let x2 = x1 + self.stack.at(i + 2); let y2 = y1 + self.stack.at(i + 3); self.x = x2 + self.stack.at(i + 4); self.y = y2 + self.stack.at(i + 5); self.builder.curve_to(x1, y1, x2, y2, self.x, self.y); i += 6; } self.stack.clear(); Ok(()) } #[inline] pub fn parse_curve_line(&mut self) -> Result<(), CFFError> { // {dxa dya dxb dyb dxc dyc}+ dxd dyd if !self.has_move_to { return Err(CFFError::MissingMoveTo); } if self.stack.len() < 8 { return Err(CFFError::InvalidArgumentsStackLength); } if (self.stack.len() - 2) % 6 != 0 { return Err(CFFError::InvalidArgumentsStackLength); } let mut i = 0; while i < self.stack.len() - 2 { let x1 = self.x + self.stack.at(i + 0); let y1 = self.y + self.stack.at(i + 1); let x2 = x1 + self.stack.at(i + 2); let y2 = y1 + self.stack.at(i + 3); self.x = x2 + self.stack.at(i + 4); self.y = y2 + self.stack.at(i + 5); self.builder.curve_to(x1, y1, x2, y2, self.x, self.y); i += 6; } 
self.x += self.stack.at(i + 0); self.y += self.stack.at(i + 1); self.builder.line_to(self.x, self.y); self.stack.clear(); Ok(()) } #[inline] pub fn parse_line_curve(&mut self) -> Result<(), CFFError> { // {dxa dya}+ dxb dyb dxc dyc dxd dyd if !self.has_move_to { return Err(CFFError::MissingMoveTo); } if self.stack.len() < 8 { return Err(CFFError::InvalidArgumentsStackLength); } if (self.stack.len() - 6).is_odd() { return Err(CFFError::InvalidArgumentsStackLength); } let mut i = 0; while i < self.stack.len() - 6 { self.x += self.stack.at(i + 0); self.y += self.stack.at(i + 1); self.builder.line_to(self.x, self.y); i += 2; } let x1 = self.x + self.stack.at(i + 0); let y1 = self.y + self.stack.at(i + 1); let x2 = x1 + self.stack.at(i + 2); let y2 = y1 + self.stack.at(i + 3); self.x = x2 + self.stack.at(i + 4); self.y = y2 + self.stack.at(i + 5); self.builder.curve_to(x1, y1, x2, y2, self.x, self.y); self.stack.clear(); Ok(()) } #[inline] pub fn parse_hh_curve_to(&mut self) -> Result<(), CFFError> { // dy1? {dxa dxb dyb dxc}+ if !self.has_move_to { return Err(CFFError::MissingMoveTo); } let mut i = 0; // The odd argument count indicates an Y position. if self.stack.len().is_odd() { self.y += self.stack.at(0); i += 1; } if (self.stack.len() - i) % 4 != 0 { return Err(CFFError::InvalidArgumentsStackLength); } while i < self.stack.len() { let x1 = self.x + self.stack.at(i + 0); let y1 = self.y; let x2 = x1 + self.stack.at(i + 1); let y2 = y1 + self.stack.at(i + 2); self.x = x2 + self.stack.at(i + 3); self.y = y2; self.builder.curve_to(x1, y1, x2, y2, self.x, self.y); i += 4; } self.stack.clear(); Ok(()) } #[inline] pub fn parse_vv_curve_to(&mut self) -> Result<(), CFFError> { // dx1? {dya dxb dyb dyc}+ if !self.has_move_to { return Err(CFFError::MissingMoveTo); } let mut i = 0; // The odd argument count indicates an X position. 
if self.stack.len().is_odd() { self.x += self.stack.at(0); i += 1; } if (self.stack.len() - i) % 4 != 0 { return Err(CFFError::InvalidArgumentsStackLength); } while i < self.stack.len() { let x1 = self.x; let y1 = self.y + self.stack.at(i + 0); let x2 = x1 + self.stack.at(i + 1); let y2 = y1 + self.stack.at(i + 2); self.x = x2; self.y = y2 + self.stack.at(i + 3); self.builder.curve_to(x1, y1, x2, y2, self.x, self.y); i += 4; } self.stack.clear(); Ok(()) } #[inline] pub fn parse_hv_curve_to(&mut self) -> Result<(), CFFError> { // dx1 dx2 dy2 dy3 {dya dxb dyb dxc dxd dxe dye dyf}* dxf? // {dxa dxb dyb dyc dyd dxe dye dxf}+ dyf? if !self.has_move_to { return Err(CFFError::MissingMoveTo); } if self.stack.len() < 4 { return Err(CFFError::InvalidArgumentsStackLength); } self.stack.reverse(); while !self.stack.is_empty() { if self.stack.len() < 4 { return Err(CFFError::InvalidArgumentsStackLength); } let x1 = self.x + self.stack.pop(); let y1 = self.y; let x2 = x1 + self.stack.pop(); let y2 = y1 + self.stack.pop(); self.y = y2 + self.stack.pop(); self.x = x2; if self.stack.len() == 1 { self.x += self.stack.pop(); } self.builder.curve_to(x1, y1, x2, y2, self.x, self.y); if self.stack.is_empty() { break; } if self.stack.len() < 4 { return Err(CFFError::InvalidArgumentsStackLength); } let x1 = self.x; let y1 = self.y + self.stack.pop(); let x2 = x1 + self.stack.pop(); let y2 = y1 + self.stack.pop(); self.x = x2 + self.stack.pop(); self.y = y2; if self.stack.len() == 1 { self.y += self.stack.pop() } self.builder.curve_to(x1, y1, x2, y2, self.x, self.y); } debug_assert!(self.stack.is_empty()); Ok(()) } #[inline] pub fn parse_vh_curve_to(&mut self) -> Result<(), CFFError> { // dy1 dx2 dy2 dx3 {dxa dxb dyb dyc dyd dxe dye dxf}* dyf? // {dya dxb dyb dxc dxd dxe dye dyf}+ dxf? 
if !self.has_move_to { return Err(CFFError::MissingMoveTo); } if self.stack.len() < 4 { return Err(CFFError::InvalidArgumentsStackLength); } self.stack.reverse(); while !self.stack.is_empty() { if self.stack.len() < 4 { return Err(CFFError::InvalidArgumentsStackLength); } let x1 = self.x; let y1 = self.y + self.stack.pop(); let x2 = x1 + self.stack.pop(); let y2 = y1 + self.stack.pop(); self.x = x2 + self.stack.pop(); self.y = y2; if self.stack.len() == 1 { self.y += self.stack.pop(); } self.builder.curve_to(x1, y1, x2, y2, self.x, self.y); if self.stack.is_empty() { break; } if self.stack.len() < 4 { return Err(CFFError::InvalidArgumentsStackLength); } let x1 = self.x + self.stack.pop(); let y1 = self.y; let x2 = x1 + self.stack.pop(); let y2 = y1 + self.stack.pop(); self.y = y2 + self.stack.pop(); self.x = x2; if self.stack.len() == 1 { self.x += self.stack.pop(); } self.builder.curve_to(x1, y1, x2, y2, self.x, self.y); } debug_assert!(self.stack.is_empty()); Ok(()) } #[inline] pub fn parse_flex(&mut self) -> Result<(), CFFError> { // dx1 dy1 dx2 dy2 dx3 dy3 dx4 dy4 dx5 dy5 dx6 dy6 fd if !self.has_move_to { return Err(CFFError::MissingMoveTo); } if self.stack.len() != 13 { return Err(CFFError::InvalidArgumentsStackLength); } let dx1 = self.x + self.stack.at(0); let dy1 = self.y + self.stack.at(1); let dx2 = dx1 + self.stack.at(2); let dy2 = dy1 + self.stack.at(3); let dx3 = dx2 + self.stack.at(4); let dy3 = dy2 + self.stack.at(5); let dx4 = dx3 + self.stack.at(6); let dy4 = dy3 + self.stack.at(7); let dx5 = dx4 + self.stack.at(8); let dy5 = dy4 + self.stack.at(9); self.x = dx5 + self.stack.at(10); self.y = dy5 + self.stack.at(11); self.builder.curve_to(dx1, dy1, dx2, dy2, dx3, dy3); self.builder.curve_to(dx4, dy4, dx5, dy5, self.x, self.y); self.stack.clear(); Ok(()) } #[inline] pub fn parse_flex1(&mut self) -> Result<(), CFFError> { // dx1 dy1 dx2 dy2 dx3 dy3 dx4 dy4 dx5 dy5 d6 if !self.has_move_to { return Err(CFFError::MissingMoveTo); } if self.stack.len() != 
11 { return Err(CFFError::InvalidArgumentsStackLength); } let dx1 = self.x + self.stack.at(0); let dy1 = self.y + self.stack.at(1); let dx2 = dx1 + self.stack.at(2); let dy2 = dy1 + self.stack.at(3); let dx3 = dx2 + self.stack.at(4); let dy3 = dy2 + self.stack.at(5); let dx4 = dx3 + self.stack.at(6); let dy4 = dy3 + self.stack.at(7); let dx5 = dx4 + self.stack.at(8); let dy5 = dy4 + self.stack.at(9); if f32_abs(dx5 - self.x) > f32_abs(dy5 - self.y) { self.x = dx5 + self.stack.at(10); } else { self.y = dy5 + self.stack.at(10); } self.builder.curve_to(dx1, dy1, dx2, dy2, dx3, dy3); self.builder.curve_to(dx4, dy4, dx5, dy5, self.x, self.y); self.stack.clear(); Ok(()) } #[inline] pub fn parse_hflex(&mut self) -> Result<(), CFFError> { // dx1 dx2 dy2 dx3 dx4 dx5 dx6 if !self.has_move_to { return Err(CFFError::MissingMoveTo); } if self.stack.len() != 7 { return Err(CFFError::InvalidArgumentsStackLength); } let dx1 = self.x + self.stack.at(0); let dy1 = self.y; let dx2 = dx1 + self.stack.at(1); let dy2 = dy1 + self.stack.at(2); let dx3 = dx2 + self.stack.at(3); let dy3 = dy2; let dx4 = dx3 + self.stack.at(4); let dy4 = dy2; let dx5 = dx4 + self.stack.at(5); let dy5 = self.y; self.x = dx5 + self.stack.at(6); self.builder.curve_to(dx1, dy1, dx2, dy2, dx3, dy3); self.builder.curve_to(dx4, dy4, dx5, dy5, self.x, self.y); self.stack.clear(); Ok(()) } #[inline] pub fn parse_hflex1(&mut self) -> Result<(), CFFError> { // dx1 dy1 dx2 dy2 dx3 dx4 dx5 dy5 dx6 if !self.has_move_to { return Err(CFFError::MissingMoveTo); } if self.stack.len() != 9 { return Err(CFFError::InvalidArgumentsStackLength); } let dx1 = self.x + self.stack.at(0); let dy1 = self.y + self.stack.at(1); let dx2 = dx1 + self.stack.at(2); let dy2 = dy1 + self.stack.at(3); let dx3 = dx2 + self.stack.at(4); let dy3 = dy2; let dx4 = dx3 + self.stack.at(5); let dy4 = dy2; let dx5 = dx4 + self.stack.at(6); let dy5 = dy4 + self.stack.at(7); self.x = dx5 + self.stack.at(8); self.builder.curve_to(dx1, dy1, dx2, dy2, dx3, 
dy3);
        self.builder.curve_to(dx4, dy4, dx5, dy5, self.x, self.y);

        self.stack.clear();
        Ok(())
    }

    #[inline]
    pub fn parse_int1(&mut self, op: u8) -> Result<(), CFFError> {
        let n = i16::from(op) - 139;
        self.stack.push(f32::from(n))?;
        Ok(())
    }

    #[inline]
    pub fn parse_int2(&mut self, op: u8, s: &mut Stream) -> Result<(), CFFError> {
        let b1 = s.read::<u8>().ok_or(CFFError::ReadOutOfBounds)?;
        let n = (i16::from(op) - 247) * 256 + i16::from(b1) + 108;
        debug_assert!((108..=1131).contains(&n));
        self.stack.push(f32::from(n))?;
        Ok(())
    }

    #[inline]
    pub fn parse_int3(&mut self, op: u8, s: &mut Stream) -> Result<(), CFFError> {
        let b1 = s.read::<u8>().ok_or(CFFError::ReadOutOfBounds)?;
        let n = -(i16::from(op) - 251) * 256 - i16::from(b1) - 108;
        debug_assert!((-1131..=-108).contains(&n));
        self.stack.push(f32::from(n))?;
        Ok(())
    }

    #[inline]
    pub fn parse_fixed(&mut self, s: &mut Stream) -> Result<(), CFFError> {
        let n = s.read::<Fixed>().ok_or(CFFError::ReadOutOfBounds)?;
        self.stack.push(n.0)?;
        Ok(())
    }
}
ttf-parser-0.24.1/src/tables/cff/dict.rs000064400000000000000000000205221046102023000160700ustar 00000000000000use core::convert::TryFrom;
use core::ops::Range;

use crate::Stream;

// Limits according to the Adobe Technical Note #5176, chapter 4 DICT Data.
const TWO_BYTE_OPERATOR_MARK: u8 = 12;
const FLOAT_STACK_LEN: usize = 64;
const END_OF_FLOAT_FLAG: u8 = 0xf;

#[derive(Clone, Copy, Debug)]
pub struct Operator(pub u16);

impl Operator {
    #[inline]
    pub fn get(self) -> u16 {
        self.0
    }
}

pub struct DictionaryParser<'a> {
    data: &'a [u8],
    // The current offset.
    offset: usize,
    // Offset to the last operands start.
    operands_offset: usize,
    // Actual operands.
    //
    // While CFF can contain only i32 and f32 values, we have to store operands as f64
    // since f32 cannot represent the whole i32 range.
    // Meaning we have a choice of storing operands as f64 or as enum of i32/f32.
    // In both cases the type size would be 8 bytes, so it's easier to simply use f64.
    operands: &'a mut [f64],
    // The number of operands in the `operands` array.
    operands_len: u16,
}

impl<'a> DictionaryParser<'a> {
    #[inline]
    pub fn new(data: &'a [u8], operands_buffer: &'a mut [f64]) -> Self {
        DictionaryParser {
            data,
            offset: 0,
            operands_offset: 0,
            operands: operands_buffer,
            operands_len: 0,
        }
    }

    #[inline(never)]
    pub fn parse_next(&mut self) -> Option<Operator> {
        let mut s = Stream::new_at(self.data, self.offset)?;
        self.operands_offset = self.offset;
        while !s.at_end() {
            let b = s.read::<u8>()?;
            // 0..=21 bytes are operators.
            if is_dict_one_byte_op(b) {
                let mut operator = u16::from(b);

                // Check that operator is two byte long.
                if b == TWO_BYTE_OPERATOR_MARK {
                    // Use a 1200 'prefix' to make two byte operators more readable.
                    // 12 3 => 1203
                    operator = 1200 + u16::from(s.read::<u8>()?);
                }

                self.offset = s.offset();
                return Some(Operator(operator));
            } else {
                skip_number(b, &mut s)?;
            }
        }

        None
    }

    /// Parses operands of the current operator.
    ///
    /// In the DICT structure, operands are defined before an operator.
    /// So we are trying to find an operator first, and only then can we actually parse the operands.
    ///
    /// Since this method is pretty expensive and we do not care about most of the operators,
    /// we can speed up parsing by parsing operands only for required operators.
    ///
    /// We still have to "skip" operands during operator search (see `skip_number()`),
    /// but it's still faster than a naive method.
    pub fn parse_operands(&mut self) -> Option<()> {
        let mut s = Stream::new_at(self.data, self.operands_offset)?;
        self.operands_len = 0;
        while !s.at_end() {
            let b = s.read::<u8>()?;
            // 0..=21 bytes are operators.
            if is_dict_one_byte_op(b) {
                break;
            } else {
                let op = parse_number(b, &mut s)?;
                self.operands[usize::from(self.operands_len)] = op;
                self.operands_len += 1;

                if usize::from(self.operands_len) >= self.operands.len() {
                    break;
                }
            }
        }

        Some(())
    }

    #[inline]
    pub fn operands(&self) -> &[f64] {
        &self.operands[..usize::from(self.operands_len)]
    }

    #[inline]
    pub fn parse_number(&mut self) -> Option<f64> {
        self.parse_operands()?;
        self.operands().get(0).cloned()
    }

    #[inline]
    pub fn parse_offset(&mut self) -> Option<usize> {
        self.parse_operands()?;
        let operands = self.operands();
        if operands.len() == 1 {
            usize::try_from(operands[0] as i32).ok()
        } else {
            None
        }
    }

    #[inline]
    pub fn parse_range(&mut self) -> Option<Range<usize>> {
        self.parse_operands()?;
        let operands = self.operands();
        if operands.len() == 2 {
            let len = usize::try_from(operands[0] as i32).ok()?;
            let start = usize::try_from(operands[1] as i32).ok()?;
            let end = start.checked_add(len)?;
            Some(start..end)
        } else {
            None
        }
    }
}

// One-byte CFF DICT Operators according to the
// Adobe Technical Note #5176, Appendix H CFF DICT Encoding.
pub fn is_dict_one_byte_op(b: u8) -> bool {
    match b {
        0..=27 => true,
        28..=30 => false,  // numbers
        31 => true,        // Reserved
        32..=254 => false, // numbers
        255 => true,       // Reserved
    }
}

// Adobe Technical Note #5177, Table 3 Operand Encoding
pub fn parse_number(b0: u8, s: &mut Stream) -> Option<f64> {
    match b0 {
        28 => {
            let n = i32::from(s.read::<i16>()?);
            Some(f64::from(n))
        }
        29 => {
            let n = s.read::<i32>()?;
            Some(f64::from(n))
        }
        30 => parse_float(s),
        32..=246 => {
            let n = i32::from(b0) - 139;
            Some(f64::from(n))
        }
        247..=250 => {
            let b1 = i32::from(s.read::<u8>()?);
            let n = (i32::from(b0) - 247) * 256 + b1 + 108;
            Some(f64::from(n))
        }
        251..=254 => {
            let b1 = i32::from(s.read::<u8>()?);
            let n = -(i32::from(b0) - 251) * 256 - b1 - 108;
            Some(f64::from(n))
        }
        _ => None,
    }
}

fn parse_float(s: &mut Stream) -> Option<f64> {
    let mut data = [0u8; FLOAT_STACK_LEN];
    let mut idx = 0;

    loop {
        let b1: u8 = s.read()?;
        let nibble1 = b1 >> 4;
        let nibble2 = b1 & 15;

        if nibble1 == END_OF_FLOAT_FLAG {
            break;
        }

        idx = parse_float_nibble(nibble1, idx, &mut data)?;

        if nibble2 == END_OF_FLOAT_FLAG {
            break;
        }

        idx = parse_float_nibble(nibble2, idx, &mut data)?;
    }

    let s = core::str::from_utf8(&data[..idx]).ok()?;
    let n = s.parse().ok()?;
    Some(n)
}

// Adobe Technical Note #5176, Table 5 Nibble Definitions
fn parse_float_nibble(nibble: u8, mut idx: usize, data: &mut [u8]) -> Option<usize> {
    if idx == FLOAT_STACK_LEN {
        return None;
    }

    match nibble {
        0..=9 => {
            data[idx] = b'0' + nibble;
        }
        10 => {
            data[idx] = b'.';
        }
        11 => {
            data[idx] = b'E';
        }
        12 => {
            if idx + 1 == FLOAT_STACK_LEN {
                return None;
            }

            data[idx] = b'E';
            idx += 1;
            data[idx] = b'-';
        }
        13 => {
            return None;
        }
        14 => {
            data[idx] = b'-';
        }
        _ => {
            return None;
        }
    }

    idx += 1;
    Some(idx)
}

// Just like `parse_number`, but doesn't actually parse the data.
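The real-number (operator 30) nibble scheme used by `parse_float` can be exercised in isolation. A minimal sketch with a hypothetical `parse_real` helper (not the crate's API), assuming the Table 5 nibble meanings — digits, `.`, `E`, `E-`, `-`, and the 0xf terminator:

```rust
// Decodes a CFF DICT real number: two nibbles per byte, built up as an
// ASCII string and then parsed as f64. Nibble 0xd is reserved.
fn parse_real(bytes: &[u8]) -> Option<f64> {
    let mut s = String::new();
    'outer: for &b in bytes {
        for nibble in [b >> 4, b & 15] {
            match nibble {
                0..=9 => s.push((b'0' + nibble) as char),
                0xa => s.push('.'),
                0xb => s.push('E'),
                0xc => s.push_str("E-"),
                0xe => s.push('-'),
                0xf => break 'outer, // end-of-number flag
                _ => return None,
            }
        }
    }
    s.parse().ok()
}

fn main() {
    // Nibbles e,2,a,2,5,f spell "-2.25".
    assert_eq!(parse_real(&[0xe2, 0xa2, 0x5f]), Some(-2.25));
    println!("ok");
}
```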
pub fn skip_number(b0: u8, s: &mut Stream) -> Option<()> {
    match b0 {
        28 => s.skip::<u16>(),
        29 => s.skip::<u32>(),
        30 => {
            while !s.at_end() {
                let b1 = s.read::<u8>()?;
                let nibble1 = b1 >> 4;
                let nibble2 = b1 & 15;
                if nibble1 == END_OF_FLOAT_FLAG || nibble2 == END_OF_FLOAT_FLAG {
                    break;
                }
            }
        }
        32..=246 => {}
        247..=250 => s.skip::<u8>(),
        251..=254 => s.skip::<u8>(),
        _ => return None,
    }

    Some(())
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn parse_dict_number() {
        assert_eq!(
            parse_number(0xFA, &mut Stream::new(&[0x7C])).unwrap(),
            1000.0
        );
        assert_eq!(
            parse_number(0xFE, &mut Stream::new(&[0x7C])).unwrap(),
            -1000.0
        );
        assert_eq!(
            parse_number(0x1C, &mut Stream::new(&[0x27, 0x10])).unwrap(),
            10000.0
        );
        assert_eq!(
            parse_number(0x1C, &mut Stream::new(&[0xD8, 0xF0])).unwrap(),
            -10000.0
        );
        assert_eq!(
            parse_number(0x1D, &mut Stream::new(&[0x00, 0x01, 0x86, 0xA0])).unwrap(),
            100000.0
        );
        assert_eq!(
            parse_number(0x1D, &mut Stream::new(&[0xFF, 0xFE, 0x79, 0x60])).unwrap(),
            -100000.0
        );
    }
}
ttf-parser-0.24.1/src/tables/cff/encoding.rs000064400000000000000000000134531046102023000167400ustar 00000000000000use super::charset::Charset;
use super::StringId;
use crate::parser::{FromData, LazyArray16, Stream};
use crate::GlyphId;

/// The Standard Encoding as defined in the Adobe Technical Note #5176 Appendix B.
#[rustfmt::skip] pub const STANDARD_ENCODING: [u8; 256] = [ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 0, 111, 112, 113, 114, 0, 115, 116, 117, 118, 119, 120, 121, 122, 0, 123, 0, 124, 125, 126, 127, 128, 129, 130, 131, 0, 132, 133, 0, 134, 135, 136, 137, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 138, 0, 139, 0, 0, 0, 0, 140, 141, 142, 143, 0, 0, 0, 0, 0, 144, 0, 0, 0, 145, 0, 0, 146, 147, 148, 149, 0, 0, 0, 0, ]; #[derive(Clone, Copy, Debug)] pub(crate) struct Format1Range { first: u8, left: u8, } impl FromData for Format1Range { const SIZE: usize = 2; #[inline] fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(Format1Range { first: s.read::()?, left: s.read::()?, }) } } #[derive(Clone, Copy, Debug)] pub(crate) struct Supplement { code: u8, name: StringId, } impl FromData for Supplement { const SIZE: usize = 3; #[inline] fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(Supplement { code: s.read::()?, name: s.read::()?, }) } } #[derive(Clone, Copy, Default, Debug)] pub(crate) struct Encoding<'a> { kind: EncodingKind<'a>, supplemental: LazyArray16<'a, Supplement>, } #[derive(Clone, Copy, Debug)] pub(crate) enum EncodingKind<'a> { Standard, Expert, Format0(LazyArray16<'a, u8>), Format1(LazyArray16<'a, Format1Range>), } impl Default for EncodingKind<'_> { fn default() -> Self { Self::Standard } } impl Encoding<'_> { pub fn 
new_standard() -> Self { Encoding { kind: EncodingKind::Standard, supplemental: LazyArray16::default(), } } pub fn new_expert() -> Self { Encoding { kind: EncodingKind::Expert, supplemental: LazyArray16::default(), } } pub fn code_to_gid(&self, charset: &Charset, code: u8) -> Option { if !self.supplemental.is_empty() { if let Some(ref s) = self.supplemental.into_iter().find(|s| s.code == code) { return charset.sid_to_gid(s.name); } } let index = usize::from(code); match self.kind { // Standard encodings store a StringID/SID and not GlyphID/GID. // Therefore we have to get SID first and then convert it to GID via Charset. // Custom encodings (FormatN) store GID directly. // // Indexing for predefined encodings never fails, // because `code` is always `u8` and encodings have 256 entries. // // We treat `Expert` as `Standard` as well, since we allow only 8bit codepoints. EncodingKind::Standard | EncodingKind::Expert => { let sid = StringId(u16::from(STANDARD_ENCODING[index])); charset.sid_to_gid(sid) } EncodingKind::Format0(ref table) => { // +1 because .notdef is implicit. table .into_iter() .position(|c| c == code) .map(|i| (i + 1) as u16) .map(GlyphId) } EncodingKind::Format1(ref table) => { // Starts from 1 because .notdef is implicit. let mut gid: u16 = 1; for range in table.into_iter() { let end = range.first.saturating_add(range.left); if (range.first..=end).contains(&code) { gid += u16::from(code - range.first); return Some(GlyphId(gid)); } else { gid += u16::from(range.left) + 1; } } None } } } } pub(crate) fn parse_encoding<'a>(s: &mut Stream<'a>) -> Option> { let format = s.read::()?; // The first high-bit in format indicates that a Supplemental encoding is present. // Check it and clear. let has_supplemental = format & 0x80 != 0; let format = format & 0x7f; let count = u16::from(s.read::()?); let kind = match format { // TODO: read_array8? 
0 => s.read_array16::(count).map(EncodingKind::Format0)?, 1 => s .read_array16::(count) .map(EncodingKind::Format1)?, _ => return None, }; let supplemental = if has_supplemental { let count = u16::from(s.read::()?); s.read_array16::(count)? } else { LazyArray16::default() }; Some(Encoding { kind, supplemental }) } ttf-parser-0.24.1/src/tables/cff/index.rs000064400000000000000000000126751046102023000162660ustar 00000000000000use crate::parser::{FromData, NumFrom, Stream, U24}; pub trait IndexSize: FromData { fn to_u32(self) -> u32; } impl IndexSize for u16 { fn to_u32(self) -> u32 { u32::from(self) } } impl IndexSize for u32 { fn to_u32(self) -> u32 { self } } #[inline] pub fn parse_index<'a, T: IndexSize>(s: &mut Stream<'a>) -> Option> { let count = s.read::()?; parse_index_impl(count.to_u32(), s) } #[inline(never)] fn parse_index_impl<'a>(count: u32, s: &mut Stream<'a>) -> Option> { if count == 0 || count == u32::MAX { return Some(Index::default()); } let offset_size = s.read::()?; let offsets_len = (count + 1).checked_mul(offset_size.to_u32())?; let offsets = VarOffsets { data: s.read_bytes(usize::num_from(offsets_len))?, offset_size, }; // Last offset indicates a Data Index size. 
match offsets.last() { Some(last_offset) => { let data = s.read_bytes(usize::num_from(last_offset))?; Some(Index { data, offsets }) } None => Some(Index::default()), } } #[inline] pub fn skip_index(s: &mut Stream) -> Option<()> { let count = s.read::()?; skip_index_impl(count.to_u32(), s) } #[inline(never)] fn skip_index_impl(count: u32, s: &mut Stream) -> Option<()> { if count == 0 || count == u32::MAX { return Some(()); } let offset_size = s.read::()?; let offsets_len = (count + 1).checked_mul(offset_size.to_u32())?; let offsets = VarOffsets { data: s.read_bytes(usize::num_from(offsets_len))?, offset_size, }; if let Some(last_offset) = offsets.last() { s.advance(usize::num_from(last_offset)); } Some(()) } #[derive(Clone, Copy, Debug)] pub struct VarOffsets<'a> { pub data: &'a [u8], pub offset_size: OffsetSize, } impl<'a> VarOffsets<'a> { pub fn get(&self, index: u32) -> Option { if index >= self.len() { return None; } let start = usize::num_from(index) * self.offset_size.to_usize(); let mut s = Stream::new_at(self.data, start)?; let n: u32 = match self.offset_size { OffsetSize::Size1 => u32::from(s.read::()?), OffsetSize::Size2 => u32::from(s.read::()?), OffsetSize::Size3 => s.read::()?.0, OffsetSize::Size4 => s.read::()?, }; // Offsets are offset by one byte in the font, // so we have to shift them back. 
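The one-byte shift noted above can be tried on a toy INDEX. A minimal sketch with a hypothetical `object_range` helper (not the crate's API): stored offsets are 1-based, so each is shifted back before slicing the object data.

```rust
// Resolves the byte range of object `index` from a CFF INDEX offset array.
// Offsets in the font are 1-based; subtract one to get real positions.
fn object_range(offsets: &[u32], index: usize) -> Option<(usize, usize)> {
    let start = offsets.get(index)?.checked_sub(1)? as usize;
    let end = offsets.get(index + 1)?.checked_sub(1)? as usize;
    Some((start, end))
}

fn main() {
    // Two objects, "ab" and "cde": stored offsets are [1, 3, 6].
    let offsets = [1u32, 3, 6];
    let data = b"abcde";
    let (s, e) = object_range(&offsets, 0).unwrap();
    assert_eq!(&data[s..e], b"ab");
    let (s, e) = object_range(&offsets, 1).unwrap();
    assert_eq!(&data[s..e], b"cde");
    println!("ok");
}
```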
n.checked_sub(1) } #[inline] pub fn last(&self) -> Option { if !self.is_empty() { self.get(self.len() - 1) } else { None } } #[inline] pub fn len(&self) -> u32 { self.data.len() as u32 / self.offset_size as u32 } #[inline] pub fn is_empty(&self) -> bool { self.len() == 0 } } #[derive(Clone, Copy, Debug)] pub struct Index<'a> { pub data: &'a [u8], pub offsets: VarOffsets<'a>, } impl<'a> Default for Index<'a> { #[inline] fn default() -> Self { Index { data: b"", offsets: VarOffsets { data: b"", offset_size: OffsetSize::Size1, }, } } } impl<'a> IntoIterator for Index<'a> { type Item = &'a [u8]; type IntoIter = IndexIter<'a>; #[inline] fn into_iter(self) -> Self::IntoIter { IndexIter { data: self, offset_index: 0, } } } impl<'a> Index<'a> { #[inline] pub fn len(&self) -> u32 { // Last offset points to the byte after the `Object data`. We should skip it. self.offsets.len().saturating_sub(1) } pub fn get(&self, index: u32) -> Option<&'a [u8]> { let next_index = index.checked_add(1)?; // make sure we do not overflow let start = usize::num_from(self.offsets.get(index)?); let end = usize::num_from(self.offsets.get(next_index)?); self.data.get(start..end) } } pub struct IndexIter<'a> { data: Index<'a>, offset_index: u32, } impl<'a> Iterator for IndexIter<'a> { type Item = &'a [u8]; #[inline] fn next(&mut self) -> Option { if self.offset_index == self.data.len() { return None; } let index = self.offset_index; self.offset_index += 1; self.data.get(index) } } #[derive(Clone, Copy, PartialEq, Eq, Debug)] pub enum OffsetSize { Size1 = 1, Size2 = 2, Size3 = 3, Size4 = 4, } impl OffsetSize { #[inline] pub fn to_u32(self) -> u32 { self as u32 } #[inline] pub fn to_usize(self) -> usize { self as usize } } impl FromData for OffsetSize { const SIZE: usize = 1; #[inline] fn parse(data: &[u8]) -> Option { match data.get(0)? 
{ 1 => Some(OffsetSize::Size1), 2 => Some(OffsetSize::Size2), 3 => Some(OffsetSize::Size3), 4 => Some(OffsetSize::Size4), _ => None, } } } #[cfg(test)] mod tests { use super::*; #[test] fn parse_offset_size() { assert_eq!(core::mem::size_of::(), 1); assert_eq!(Stream::new(&[0x00]).read::(), None); assert_eq!( Stream::new(&[0x01]).read::(), Some(OffsetSize::Size1) ); assert_eq!(Stream::new(&[0x05]).read::(), None); } } ttf-parser-0.24.1/src/tables/cff/mod.rs000064400000000000000000000061251046102023000157270ustar 00000000000000mod argstack; pub mod cff1; #[cfg(feature = "variable-fonts")] pub mod cff2; mod charset; mod charstring; mod dict; mod encoding; mod index; #[cfg(feature = "glyph-names")] mod std_names; use core::convert::TryFrom; use crate::parser::{FromData, TryNumFrom}; use crate::{OutlineBuilder, RectF}; /// A list of errors that can occur during a CFF glyph outlining. #[allow(missing_docs)] #[derive(Clone, Copy, PartialEq, Eq, Debug)] pub enum CFFError { NoGlyph, ReadOutOfBounds, ZeroBBox, InvalidOperator, UnsupportedOperator, MissingEndChar, DataAfterEndChar, NestingLimitReached, ArgumentsStackLimitReached, InvalidArgumentsStackLength, BboxOverflow, MissingMoveTo, InvalidSubroutineIndex, NoLocalSubroutines, InvalidSeacCode, #[cfg(feature = "variable-fonts")] InvalidItemVariationDataIndex, #[cfg(feature = "variable-fonts")] InvalidNumberOfBlendOperands, #[cfg(feature = "variable-fonts")] BlendRegionsLimitReached, } pub(crate) struct Builder<'a> { builder: &'a mut dyn OutlineBuilder, bbox: RectF, } impl<'a> Builder<'a> { #[inline] fn move_to(&mut self, x: f32, y: f32) { self.bbox.extend_by(x, y); self.builder.move_to(x, y); } #[inline] fn line_to(&mut self, x: f32, y: f32) { self.bbox.extend_by(x, y); self.builder.line_to(x, y); } #[inline] fn curve_to(&mut self, x1: f32, y1: f32, x2: f32, y2: f32, x: f32, y: f32) { self.bbox.extend_by(x1, y1); self.bbox.extend_by(x2, y2); self.bbox.extend_by(x, y); self.builder.curve_to(x1, y1, x2, y2, x, y); } 
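The `Builder` wrapper above grows a running bounding box as each outline point is emitted. A minimal standalone sketch of that accumulation (simplified stand-in types, not the crate's `RectF`):

```rust
// Accumulates a bounding box over emitted outline points, the same
// min/max extension Builder::move_to/line_to/curve_to perform.
struct BBox {
    x_min: f32,
    y_min: f32,
    x_max: f32,
    y_max: f32,
}

impl BBox {
    fn new() -> Self {
        BBox { x_min: f32::MAX, y_min: f32::MAX, x_max: f32::MIN, y_max: f32::MIN }
    }

    fn extend_by(&mut self, x: f32, y: f32) {
        self.x_min = self.x_min.min(x);
        self.y_min = self.y_min.min(y);
        self.x_max = self.x_max.max(x);
        self.y_max = self.y_max.max(y);
    }
}

fn main() {
    let mut bbox = BBox::new();
    for &(x, y) in &[(10.0, 20.0), (-5.0, 0.0), (30.0, -15.0)] {
        bbox.extend_by(x, y);
    }
    assert_eq!((bbox.x_min, bbox.y_min, bbox.x_max, bbox.y_max), (-5.0, -15.0, 30.0, 20.0));
    println!("ok");
}
```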
#[inline] fn close(&mut self) { self.builder.close(); } } /// A type-safe wrapper for string ID. #[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Debug)] pub struct StringId(u16); impl FromData for StringId { const SIZE: usize = 2; #[inline] fn parse(data: &[u8]) -> Option { u16::parse(data).map(StringId) } } pub trait IsEven { fn is_even(&self) -> bool; fn is_odd(&self) -> bool; } impl IsEven for usize { #[inline] fn is_even(&self) -> bool { (*self) & 1 == 0 } #[inline] fn is_odd(&self) -> bool { !self.is_even() } } #[cfg(feature = "std")] #[inline] pub fn f32_abs(n: f32) -> f32 { n.abs() } #[cfg(not(feature = "std"))] #[inline] pub fn f32_abs(n: f32) -> f32 { if n.is_sign_negative() { -n } else { n } } #[inline] pub fn conv_subroutine_index(index: f32, bias: u16) -> Result { conv_subroutine_index_impl(index, bias).ok_or(CFFError::InvalidSubroutineIndex) } #[inline] fn conv_subroutine_index_impl(index: f32, bias: u16) -> Option { let index = i32::try_num_from(index)?; let bias = i32::from(bias); let index = index.checked_add(bias)?; u32::try_from(index).ok() } // Adobe Technical Note #5176, Chapter 16 "Local / Global Subrs INDEXes" #[inline] pub fn calc_subroutine_bias(len: u32) -> u16 { if len < 1240 { 107 } else if len < 33900 { 1131 } else { 32768 } } ttf-parser-0.24.1/src/tables/cff/std_names.rs000064400000000000000000000141161046102023000171240ustar 00000000000000pub const STANDARD_NAMES: &[&str] = &[ ".notdef", "space", "exclam", "quotedbl", "numbersign", "dollar", "percent", "ampersand", "quoteright", "parenleft", "parenright", "asterisk", "plus", "comma", "hyphen", "period", "slash", "zero", "one", "two", "three", "four", "five", "six", "seven", "eight", "nine", "colon", "semicolon", "less", "equal", "greater", "question", "at", "A", "B", "C", "D", "E", "F", "G", "H", "I", "J", "K", "L", "M", "N", "O", "P", "Q", "R", "S", "T", "U", "V", "W", "X", "Y", "Z", "bracketleft", "backslash", "bracketright", "asciicircum", "underscore", "quoteleft", "a", "b", "c", 
"d", "e", "f", "g", "h", "i", "j", "k", "l", "m", "n", "o", "p", "q", "r", "s", "t", "u", "v", "w", "x", "y", "z", "braceleft", "bar", "braceright", "asciitilde", "exclamdown", "cent", "sterling", "fraction", "yen", "florin", "section", "currency", "quotesingle", "quotedblleft", "guillemotleft", "guilsinglleft", "guilsinglright", "fi", "fl", "endash", "dagger", "daggerdbl", "periodcentered", "paragraph", "bullet", "quotesinglbase", "quotedblbase", "quotedblright", "guillemotright", "ellipsis", "perthousand", "questiondown", "grave", "acute", "circumflex", "tilde", "macron", "breve", "dotaccent", "dieresis", "ring", "cedilla", "hungarumlaut", "ogonek", "caron", "emdash", "AE", "ordfeminine", "Lslash", "Oslash", "OE", "ordmasculine", "ae", "dotlessi", "lslash", "oslash", "oe", "germandbls", "onesuperior", "logicalnot", "mu", "trademark", "Eth", "onehalf", "plusminus", "Thorn", "onequarter", "divide", "brokenbar", "degree", "thorn", "threequarters", "twosuperior", "registered", "minus", "eth", "multiply", "threesuperior", "copyright", "Aacute", "Acircumflex", "Adieresis", "Agrave", "Aring", "Atilde", "Ccedilla", "Eacute", "Ecircumflex", "Edieresis", "Egrave", "Iacute", "Icircumflex", "Idieresis", "Igrave", "Ntilde", "Oacute", "Ocircumflex", "Odieresis", "Ograve", "Otilde", "Scaron", "Uacute", "Ucircumflex", "Udieresis", "Ugrave", "Yacute", "Ydieresis", "Zcaron", "aacute", "acircumflex", "adieresis", "agrave", "aring", "atilde", "ccedilla", "eacute", "ecircumflex", "edieresis", "egrave", "iacute", "icircumflex", "idieresis", "igrave", "ntilde", "oacute", "ocircumflex", "odieresis", "ograve", "otilde", "scaron", "uacute", "ucircumflex", "udieresis", "ugrave", "yacute", "ydieresis", "zcaron", "exclamsmall", "Hungarumlautsmall", "dollaroldstyle", "dollarsuperior", "ampersandsmall", "Acutesmall", "parenleftsuperior", "parenrightsuperior", "twodotenleader", "onedotenleader", "zerooldstyle", "oneoldstyle", "twooldstyle", "threeoldstyle", "fouroldstyle", "fiveoldstyle", 
"sixoldstyle", "sevenoldstyle", "eightoldstyle", "nineoldstyle", "commasuperior", "threequartersemdash", "periodsuperior", "questionsmall", "asuperior", "bsuperior", "centsuperior", "dsuperior", "esuperior", "isuperior", "lsuperior", "msuperior", "nsuperior", "osuperior", "rsuperior", "ssuperior", "tsuperior", "ff", "ffi", "ffl", "parenleftinferior", "parenrightinferior", "Circumflexsmall", "hyphensuperior", "Gravesmall", "Asmall", "Bsmall", "Csmall", "Dsmall", "Esmall", "Fsmall", "Gsmall", "Hsmall", "Ismall", "Jsmall", "Ksmall", "Lsmall", "Msmall", "Nsmall", "Osmall", "Psmall", "Qsmall", "Rsmall", "Ssmall", "Tsmall", "Usmall", "Vsmall", "Wsmall", "Xsmall", "Ysmall", "Zsmall", "colonmonetary", "onefitted", "rupiah", "Tildesmall", "exclamdownsmall", "centoldstyle", "Lslashsmall", "Scaronsmall", "Zcaronsmall", "Dieresissmall", "Brevesmall", "Caronsmall", "Dotaccentsmall", "Macronsmall", "figuredash", "hypheninferior", "Ogoneksmall", "Ringsmall", "Cedillasmall", "questiondownsmall", "oneeighth", "threeeighths", "fiveeighths", "seveneighths", "onethird", "twothirds", "zerosuperior", "foursuperior", "fivesuperior", "sixsuperior", "sevensuperior", "eightsuperior", "ninesuperior", "zeroinferior", "oneinferior", "twoinferior", "threeinferior", "fourinferior", "fiveinferior", "sixinferior", "seveninferior", "eightinferior", "nineinferior", "centinferior", "dollarinferior", "periodinferior", "commainferior", "Agravesmall", "Aacutesmall", "Acircumflexsmall", "Atildesmall", "Adieresissmall", "Aringsmall", "AEsmall", "Ccedillasmall", "Egravesmall", "Eacutesmall", "Ecircumflexsmall", "Edieresissmall", "Igravesmall", "Iacutesmall", "Icircumflexsmall", "Idieresissmall", "Ethsmall", "Ntildesmall", "Ogravesmall", "Oacutesmall", "Ocircumflexsmall", "Otildesmall", "Odieresissmall", "OEsmall", "Oslashsmall", "Ugravesmall", "Uacutesmall", "Ucircumflexsmall", "Udieresissmall", "Yacutesmall", "Thornsmall", "Ydieresissmall", "001.000", "001.001", "001.002", "001.003", "Black", "Bold", 
"Book", "Light", "Medium", "Regular", "Roman", "Semibold", ]; ttf-parser-0.24.1/src/tables/cmap/format0.rs000064400000000000000000000032701046102023000167000ustar 00000000000000use crate::parser::{NumFrom, Stream}; use crate::GlyphId; /// A [format 0](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-0-byte-encoding-table) /// subtable. #[derive(Clone, Copy, Debug)] pub struct Subtable0<'a> { /// Just a list of 256 8bit glyph IDs. pub glyph_ids: &'a [u8], } impl<'a> Subtable0<'a> { /// Parses a subtable from raw data. pub fn parse(data: &'a [u8]) -> Option { let mut s = Stream::new(data); s.skip::(); // format s.skip::(); // length s.skip::(); // language let glyph_ids = s.read_bytes(256)?; Some(Self { glyph_ids }) } /// Returns a glyph index for a code point. pub fn glyph_index(&self, code_point: u32) -> Option { let glyph_id = *self.glyph_ids.get(usize::num_from(code_point))?; // Make sure that the glyph is not zero, the array always has 256 ids, // but some codepoints may be mapped to zero. if glyph_id != 0 { Some(GlyphId(u16::from(glyph_id))) } else { None } } /// Calls `f` for each codepoint defined in this table. pub fn codepoints(&self, mut f: impl FnMut(u32)) { for (i, glyph_id) in self.glyph_ids.iter().enumerate() { // In contrast to every other format, here we take a look at the glyph // id and check whether it is zero because otherwise this method would // always simply call `f` for `0..256` which would be kind of pointless // (this array always has length 256 even when the face has fewer glyphs). if *glyph_id != 0 { f(i as u32); } } } } ttf-parser-0.24.1/src/tables/cmap/format10.rs000064400000000000000000000026701046102023000167640ustar 00000000000000use crate::parser::{LazyArray32, Stream}; use crate::GlyphId; /// A [format 10](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-10-trimmed-array) /// subtable. #[derive(Clone, Copy, Debug)] pub struct Subtable10<'a> { /// First character code covered. 
pub first_code_point: u32, /// Array of glyph indices for the character codes covered. pub glyphs: LazyArray32<'a, GlyphId>, } impl<'a> Subtable10<'a> { /// Parses a subtable from raw data. pub fn parse(data: &'a [u8]) -> Option { let mut s = Stream::new(data); s.skip::(); // format s.skip::(); // reserved s.skip::(); // length s.skip::(); // language let first_code_point = s.read::()?; let count = s.read::()?; let glyphs = s.read_array32::(count)?; Some(Self { first_code_point, glyphs, }) } /// Returns a glyph index for a code point. pub fn glyph_index(&self, code_point: u32) -> Option { let idx = code_point.checked_sub(self.first_code_point)?; self.glyphs.get(idx) } /// Calls `f` for each codepoint defined in this table. pub fn codepoints(&self, mut f: impl FnMut(u32)) { for i in 0..self.glyphs.len() { if let Some(code_point) = self.first_code_point.checked_add(i) { f(code_point); } } } } ttf-parser-0.24.1/src/tables/cmap/format12.rs000064400000000000000000000045301046102023000167630ustar 00000000000000use core::convert::TryFrom; use crate::parser::{FromData, LazyArray32, Stream}; use crate::GlyphId; #[derive(Clone, Copy)] pub struct SequentialMapGroup { pub start_char_code: u32, pub end_char_code: u32, pub start_glyph_id: u32, } impl FromData for SequentialMapGroup { const SIZE: usize = 12; #[inline] fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(SequentialMapGroup { start_char_code: s.read::()?, end_char_code: s.read::()?, start_glyph_id: s.read::()?, }) } } /// A [format 12](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-12-segmented-coverage) /// subtable. #[derive(Clone, Copy)] pub struct Subtable12<'a> { groups: LazyArray32<'a, SequentialMapGroup>, } impl<'a> Subtable12<'a> { /// Parses a subtable from raw data. 
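The format 10 "trimmed array" lookup above is just an offset into one contiguous run of glyph IDs. A minimal standalone sketch (hypothetical free function, not the crate's API):

```rust
// Format 10 lookup: glyphs[code_point - first_code_point], with
// checked_sub rejecting code points below the covered range.
fn glyph_index(first_code_point: u32, glyphs: &[u16], cp: u32) -> Option<u16> {
    let idx = cp.checked_sub(first_code_point)?;
    glyphs.get(idx as usize).copied()
}

fn main() {
    let glyphs = [5u16, 6, 7]; // covers 0x10000..=0x10002
    assert_eq!(glyph_index(0x10000, &glyphs, 0x10001), Some(6));
    assert_eq!(glyph_index(0x10000, &glyphs, 0xFFFF), None); // below the range
    assert_eq!(glyph_index(0x10000, &glyphs, 0x10003), None); // past the array
    println!("ok");
}
```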
pub fn parse(data: &'a [u8]) -> Option { let mut s = Stream::new(data); s.skip::(); // format s.skip::(); // reserved s.skip::(); // length s.skip::(); // language let count = s.read::()?; let groups = s.read_array32::(count)?; Some(Self { groups }) } /// Returns a glyph index for a code point. pub fn glyph_index(&self, code_point: u32) -> Option { let (_, group) = self.groups.binary_search_by(|range| { use core::cmp::Ordering; if range.start_char_code > code_point { Ordering::Greater } else if range.end_char_code < code_point { Ordering::Less } else { Ordering::Equal } })?; let id = group .start_glyph_id .checked_add(code_point)? .checked_sub(group.start_char_code)?; u16::try_from(id).ok().map(GlyphId) } /// Calls `f` for each codepoint defined in this table. pub fn codepoints(&self, mut f: impl FnMut(u32)) { for group in self.groups { for code_point in group.start_char_code..=group.end_char_code { f(code_point); } } } } impl core::fmt::Debug for Subtable12<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "Subtable12 {{ ... }}") } } ttf-parser-0.24.1/src/tables/cmap/format13.rs000064400000000000000000000034021046102023000167610ustar 00000000000000// https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-13-many-to-one-range-mappings use core::convert::TryFrom; use super::format12::SequentialMapGroup; use crate::parser::{LazyArray32, Stream}; use crate::GlyphId; /// A [format 13](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-13-segmented-coverage) /// subtable. #[derive(Clone, Copy)] pub struct Subtable13<'a> { groups: LazyArray32<'a, SequentialMapGroup>, } impl<'a> Subtable13<'a> { /// Parses a subtable from raw data. 
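The format 12 segmented-coverage lookup above maps a contiguous code-point range onto contiguous glyph IDs, located by binary search. A minimal standalone sketch (simplified `Group` type and free function, not the crate's API):

```rust
// Format 12 lookup: binary-search the sorted groups, then compute
// start_glyph_id + (code_point - start_char_code).
struct Group {
    start_char_code: u32,
    end_char_code: u32,
    start_glyph_id: u32,
}

fn glyph_index(groups: &[Group], cp: u32) -> Option<u32> {
    let i = groups
        .binary_search_by(|g| {
            if g.start_char_code > cp {
                core::cmp::Ordering::Greater
            } else if g.end_char_code < cp {
                core::cmp::Ordering::Less
            } else {
                core::cmp::Ordering::Equal
            }
        })
        .ok()?;
    let g = &groups[i];
    g.start_glyph_id.checked_add(cp)?.checked_sub(g.start_char_code)
}

fn main() {
    let groups = [
        Group { start_char_code: 0x41, end_char_code: 0x5A, start_glyph_id: 1 },
        Group { start_char_code: 0x1F600, end_char_code: 0x1F64F, start_glyph_id: 100 },
    ];
    assert_eq!(glyph_index(&groups, 0x42), Some(2)); // 'B'
    assert_eq!(glyph_index(&groups, 0x1F600), Some(100));
    assert_eq!(glyph_index(&groups, 0x40), None);
    println!("ok");
}
```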
pub fn parse(data: &'a [u8]) -> Option { let mut s = Stream::new(data); s.skip::(); // format s.skip::(); // reserved s.skip::(); // length s.skip::(); // language let count = s.read::()?; let groups = s.read_array32::(count)?; Some(Self { groups }) } /// Returns a glyph index for a code point. pub fn glyph_index(&self, code_point: u32) -> Option { for group in self.groups { let start_char_code = group.start_char_code; if code_point >= start_char_code && code_point <= group.end_char_code { return u16::try_from(group.start_glyph_id).ok().map(GlyphId); } } None } /// Calls `f` for each codepoint defined in this table. pub fn codepoints(&self, mut f: impl FnMut(u32)) { for group in self.groups { for code_point in group.start_char_code..=group.end_char_code { f(code_point); } } } } impl core::fmt::Debug for Subtable13<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "Subtable13 {{ ... }}") } } ttf-parser-0.24.1/src/tables/cmap/format14.rs000064400000000000000000000102341046102023000167630ustar 00000000000000use crate::parser::{FromData, LazyArray32, Offset, Offset32, Stream, U24}; use crate::GlyphId; #[derive(Clone, Copy)] struct VariationSelectorRecord { var_selector: u32, default_uvs_offset: Option, non_default_uvs_offset: Option, } impl FromData for VariationSelectorRecord { const SIZE: usize = 11; #[inline] fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(VariationSelectorRecord { var_selector: s.read::()?.0, default_uvs_offset: s.read::>()?, non_default_uvs_offset: s.read::>()?, }) } } #[derive(Clone, Copy)] struct UVSMappingRecord { unicode_value: u32, glyph_id: GlyphId, } impl FromData for UVSMappingRecord { const SIZE: usize = 5; #[inline] fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(UVSMappingRecord { unicode_value: s.read::()?.0, glyph_id: s.read::()?, }) } } #[derive(Clone, Copy)] struct UnicodeRangeRecord { start_unicode_value: u32, additional_count: u8, } impl 
UnicodeRangeRecord { fn contains(&self, c: u32) -> bool { // Never overflows, since `start_unicode_value` is actually u24. let end = self.start_unicode_value + u32::from(self.additional_count); (self.start_unicode_value..=end).contains(&c) } } impl FromData for UnicodeRangeRecord { const SIZE: usize = 4; #[inline] fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(UnicodeRangeRecord { start_unicode_value: s.read::()?.0, additional_count: s.read::()?, }) } } /// A result of a variation glyph mapping. #[derive(Clone, Copy, PartialEq, Eq, Debug)] pub enum GlyphVariationResult { /// Glyph was found in the variation encoding table. Found(GlyphId), /// Glyph should be looked in other, non-variation tables. /// /// Basically, you should use `Encoding::glyph_index` or `Face::glyph_index` /// in this case. UseDefault, } /// A [format 14](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-14-unicode-variation-sequences) /// subtable. #[derive(Clone, Copy)] pub struct Subtable14<'a> { records: LazyArray32<'a, VariationSelectorRecord>, // The whole subtable data. data: &'a [u8], } impl<'a> Subtable14<'a> { /// Parses a subtable from raw data. pub fn parse(data: &'a [u8]) -> Option { let mut s = Stream::new(data); s.skip::(); // format s.skip::(); // length let count = s.read::()?; let records = s.read_array32::(count)?; Some(Self { records, data }) } /// Returns a glyph index for a code point. 
pub fn glyph_index(&self, code_point: u32, variation: u32) -> Option { let (_, record) = self .records .binary_search_by(|v| v.var_selector.cmp(&variation))?; if let Some(offset) = record.default_uvs_offset { let data = self.data.get(offset.to_usize()..)?; let mut s = Stream::new(data); let count = s.read::()?; let ranges = s.read_array32::(count)?; for range in ranges { if range.contains(code_point) { return Some(GlyphVariationResult::UseDefault); } } } if let Some(offset) = record.non_default_uvs_offset { let data = self.data.get(offset.to_usize()..)?; let mut s = Stream::new(data); let count = s.read::()?; let uvs_mappings = s.read_array32::(count)?; let (_, mapping) = uvs_mappings.binary_search_by(|v| v.unicode_value.cmp(&code_point))?; return Some(GlyphVariationResult::Found(mapping.glyph_id)); } None } } impl core::fmt::Debug for Subtable14<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "Subtable14 {{ ... }}") } } ttf-parser-0.24.1/src/tables/cmap/format2.rs000064400000000000000000000126521046102023000167060ustar 00000000000000// This table has a pretty complex parsing algorithm. 
// A detailed explanation can be found here: // https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-2-high-byte-mapping-through-table // https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6cmap.html // https://github.com/fonttools/fonttools/blob/a360252709a3d65f899915db0a5bd753007fdbb7/Lib/fontTools/ttLib/tables/_c_m_a_p.py#L360 use core::convert::TryFrom; use crate::parser::{FromData, LazyArray16, Stream}; use crate::GlyphId; #[derive(Clone, Copy)] struct SubHeaderRecord { first_code: u16, entry_count: u16, id_delta: i16, id_range_offset: u16, } impl FromData for SubHeaderRecord { const SIZE: usize = 8; #[inline] fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(SubHeaderRecord { first_code: s.read::()?, entry_count: s.read::()?, id_delta: s.read::()?, id_range_offset: s.read::()?, }) } } /// A [format 2](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-2-high-byte-mapping-through-table) /// subtable. #[derive(Clone, Copy)] pub struct Subtable2<'a> { sub_header_keys: LazyArray16<'a, u16>, sub_headers_offset: usize, sub_headers: LazyArray16<'a, SubHeaderRecord>, // The whole subtable data. data: &'a [u8], } impl<'a> Subtable2<'a> { /// Parses a subtable from raw data. pub fn parse(data: &'a [u8]) -> Option { let mut s = Stream::new(data); s.skip::(); // format s.skip::(); // length s.skip::(); // language let sub_header_keys = s.read_array16::(256)?; // The maximum index in a sub_header_keys is a sub_headers count. let sub_headers_count = sub_header_keys.into_iter().map(|n| n / 8).max()? + 1; // Remember sub_headers offset before reading. Will be used later. let sub_headers_offset = s.offset(); let sub_headers = s.read_array16::(sub_headers_count)?; Some(Self { sub_header_keys, sub_headers_offset, sub_headers, data, }) } /// Returns a glyph index for a code point. /// /// Returns `None` when `code_point` is larger than `u16`. 
    pub fn glyph_index(&self, code_point: u32) -> Option<GlyphId> {
        // This subtable supports code points only in a u16 range.
        let code_point = u16::try_from(code_point).ok()?;

        let high_byte = code_point >> 8;
        let low_byte = code_point & 0x00FF;

        let i = if code_point < 0xff {
            // 'SubHeader 0 is special: it is used for single-byte character codes.'
            0
        } else {
            // 'Array that maps high bytes to subHeaders: value is subHeader index × 8.'
            self.sub_header_keys.get(high_byte)? / 8
        };

        let sub_header = self.sub_headers.get(i)?;

        let first_code = sub_header.first_code;
        let range_end = first_code.checked_add(sub_header.entry_count)?;
        if low_byte < first_code || low_byte >= range_end {
            return None;
        }

        // SubHeaderRecord::id_range_offset points to SubHeaderRecord::first_code
        // in the glyphIndexArray. So we have to advance to our code point.
        let index_offset = usize::from(low_byte.checked_sub(first_code)?) * u16::SIZE;

        // 'The value of the idRangeOffset is the number of bytes
        // past the actual location of the idRangeOffset'.
        let offset =
                self.sub_headers_offset
                // Advance to required subheader.
                + SubHeaderRecord::SIZE * usize::from(i + 1)
                // Move back to idRangeOffset start.
                - u16::SIZE
                // Use defined offset.
                + usize::from(sub_header.id_range_offset)
                // Advance to required index in the glyphIndexArray.
                + index_offset;

        let glyph: u16 = Stream::read_at(self.data, offset)?;
        if glyph == 0 {
            return None;
        }

        u16::try_from((i32::from(glyph) + i32::from(sub_header.id_delta)) % 65536)
            .ok()
            .map(GlyphId)
    }

    /// Calls `f` for each codepoint defined in this table.
    pub fn codepoints(&self, f: impl FnMut(u32)) {
        let _ = self.codepoints_inner(f);
    }

    #[inline]
    fn codepoints_inner(&self, mut f: impl FnMut(u32)) -> Option<()> {
        for first_byte in 0u16..256 {
            let i = self.sub_header_keys.get(first_byte)? / 8;
            let sub_header = self.sub_headers.get(i)?;
            let first_code = sub_header.first_code;

            if i == 0 {
                // This is a single byte code.
                let range_end = first_code.checked_add(sub_header.entry_count)?;
                if first_byte >= first_code && first_byte < range_end {
                    f(u32::from(first_byte));
                }
            } else {
                // This is a two byte code.
                let base = first_code.checked_add(first_byte << 8)?;
                for k in 0..sub_header.entry_count {
                    let code_point = base.checked_add(k)?;
                    f(u32::from(code_point));
                }
            }
        }

        Some(())
    }
}

impl core::fmt::Debug for Subtable2<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Subtable2 {{ ... }}")
    }
}
ttf-parser-0.24.1/src/tables/cmap/format4.rs000064400000000000000000000103621046102023000167040ustar 00000000000000
use core::convert::TryFrom;

use crate::parser::{LazyArray16, Stream};
use crate::GlyphId;

/// A [format 4](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-4-segment-mapping-to-delta-values)
/// subtable.
#[derive(Clone, Copy)]
pub struct Subtable4<'a> {
    start_codes: LazyArray16<'a, u16>,
    end_codes: LazyArray16<'a, u16>,
    id_deltas: LazyArray16<'a, i16>,
    id_range_offsets: LazyArray16<'a, u16>,
    id_range_offset_pos: usize,
    // The whole subtable data.
    data: &'a [u8],
}

impl<'a> Subtable4<'a> {
    /// Parses a subtable from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        s.advance(6); // format + length + language
        let seg_count_x2 = s.read::<u16>()?;
        if seg_count_x2 < 2 {
            return None;
        }

        let seg_count = seg_count_x2 / 2;
        s.advance(6); // searchRange + entrySelector + rangeShift

        let end_codes = s.read_array16::<u16>(seg_count)?;
        s.skip::<u16>(); // reservedPad
        let start_codes = s.read_array16::<u16>(seg_count)?;
        let id_deltas = s.read_array16::<i16>(seg_count)?;
        let id_range_offset_pos = s.offset();
        let id_range_offsets = s.read_array16::<u16>(seg_count)?;

        Some(Self {
            start_codes,
            end_codes,
            id_deltas,
            id_range_offsets,
            id_range_offset_pos,
            data,
        })
    }

    /// Returns a glyph index for a code point.
    ///
    /// Returns `None` when `code_point` is larger than `u16`.
    pub fn glyph_index(&self, code_point: u32) -> Option<GlyphId> {
        // This subtable supports code points only in a u16 range.
        let code_point = u16::try_from(code_point).ok()?;

        // A custom binary search.
        let mut start = 0;
        let mut end = self.start_codes.len();
        while end > start {
            let index = (start + end) / 2;
            let end_value = self.end_codes.get(index)?;
            if end_value >= code_point {
                let start_value = self.start_codes.get(index)?;
                if start_value > code_point {
                    end = index;
                } else {
                    let id_range_offset = self.id_range_offsets.get(index)?;
                    let id_delta = self.id_deltas.get(index)?;
                    if id_range_offset == 0 {
                        return Some(GlyphId(code_point.wrapping_add(id_delta as u16)));
                    } else if id_range_offset == 0xFFFF {
                        // Some malformed fonts have 0xFFFF as the last offset,
                        // which is invalid and should be ignored.
                        return None;
                    }

                    let delta = (u32::from(code_point) - u32::from(start_value)) * 2;
                    let delta = u16::try_from(delta).ok()?;

                    let id_range_offset_pos =
                        (self.id_range_offset_pos + usize::from(index) * 2) as u16;
                    let pos = id_range_offset_pos.wrapping_add(delta);
                    let pos = pos.wrapping_add(id_range_offset);

                    let glyph_array_value: u16 = Stream::read_at(self.data, usize::from(pos))?;

                    // 0 indicates missing glyph.
                    if glyph_array_value == 0 {
                        return None;
                    }

                    let glyph_id = (glyph_array_value as i16).wrapping_add(id_delta);
                    return u16::try_from(glyph_id).ok().map(GlyphId);
                }
            } else {
                start = index + 1;
            }
        }

        None
    }

    /// Calls `f` for each codepoint defined in this table.
    pub fn codepoints(&self, mut f: impl FnMut(u32)) {
        for (start, end) in self.start_codes.into_iter().zip(self.end_codes) {
            // 0xFFFF value is special and indicates codes end.
            if start == end && start == 0xFFFF {
                break;
            }

            for code_point in start..=end {
                f(u32::from(code_point));
            }
        }
    }
}

impl core::fmt::Debug for Subtable4<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Subtable4 {{ ...
}}")
    }
}
ttf-parser-0.24.1/src/tables/cmap/format6.rs000064400000000000000000000032101046102023000167000ustar 00000000000000
use core::convert::TryFrom;

use crate::parser::{LazyArray16, Stream};
use crate::GlyphId;

/// A [format 6](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-6-trimmed-table-mapping)
/// subtable.
#[derive(Clone, Copy, Debug)]
pub struct Subtable6<'a> {
    /// First character code of subrange.
    pub first_code_point: u16,
    /// Array of glyph indexes for character codes in the range.
    pub glyphs: LazyArray16<'a, GlyphId>,
}

impl<'a> Subtable6<'a> {
    /// Parses a subtable from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        s.skip::<u16>(); // format
        s.skip::<u16>(); // length
        s.skip::<u16>(); // language
        let first_code_point = s.read::<u16>()?;
        let count = s.read::<u16>()?;
        let glyphs = s.read_array16::<GlyphId>(count)?;
        Some(Self {
            first_code_point,
            glyphs,
        })
    }

    /// Returns a glyph index for a code point.
    ///
    /// Returns `None` when `code_point` is larger than `u16`.
    pub fn glyph_index(&self, code_point: u32) -> Option<GlyphId> {
        // This subtable supports code points only in a u16 range.
        let code_point = u16::try_from(code_point).ok()?;
        let idx = code_point.checked_sub(self.first_code_point)?;
        self.glyphs.get(idx)
    }

    /// Calls `f` for each codepoint defined in this table.
    pub fn codepoints(&self, mut f: impl FnMut(u32)) {
        for i in 0..self.glyphs.len() {
            if let Some(code_point) = self.first_code_point.checked_add(i) {
                f(u32::from(code_point));
            }
        }
    }
}
ttf-parser-0.24.1/src/tables/cmap/mod.rs000064400000000000000000000232541046102023000161130ustar 00000000000000
/*!
A [Character to Glyph Index Mapping Table](
https://docs.microsoft.com/en-us/typography/opentype/spec/cmap) implementation.

This module provides a low-level alternative to
[`Face::glyph_index`](../struct.Face.html#method.glyph_index) and
[`Face::glyph_variation_index`](../struct.Face.html#method.glyph_variation_index) methods.
*/

use crate::parser::{FromData, LazyArray16, Offset, Offset32, Stream};
use crate::{name::PlatformId, GlyphId};

mod format0;
mod format10;
mod format12;
mod format13;
mod format14;
mod format2;
mod format4;
mod format6;

pub use format0::Subtable0;
pub use format10::Subtable10;
pub use format12::Subtable12;
pub use format13::Subtable13;
pub use format14::{GlyphVariationResult, Subtable14};
pub use format2::Subtable2;
pub use format4::Subtable4;
pub use format6::Subtable6;

/// A character encoding subtable variant.
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub enum Format<'a> {
    ByteEncodingTable(Subtable0<'a>),
    HighByteMappingThroughTable(Subtable2<'a>),
    SegmentMappingToDeltaValues(Subtable4<'a>),
    TrimmedTableMapping(Subtable6<'a>),
    MixedCoverage, // unsupported
    TrimmedArray(Subtable10<'a>),
    SegmentedCoverage(Subtable12<'a>),
    ManyToOneRangeMappings(Subtable13<'a>),
    UnicodeVariationSequences(Subtable14<'a>),
}

/// A character encoding subtable.
#[derive(Clone, Copy, Debug)]
pub struct Subtable<'a> {
    /// Subtable platform.
    pub platform_id: PlatformId,
    /// Subtable encoding.
    pub encoding_id: u16,
    /// A subtable format.
    pub format: Format<'a>,
}

impl<'a> Subtable<'a> {
    /// Checks that the current encoding is Unicode compatible.
    #[inline]
    pub fn is_unicode(&self) -> bool {
        // https://docs.microsoft.com/en-us/typography/opentype/spec/name#windows-encoding-ids
        const WINDOWS_UNICODE_BMP_ENCODING_ID: u16 = 1;
        const WINDOWS_UNICODE_FULL_REPERTOIRE_ENCODING_ID: u16 = 10;

        match self.platform_id {
            PlatformId::Unicode => true,
            PlatformId::Windows if self.encoding_id == WINDOWS_UNICODE_BMP_ENCODING_ID => true,
            PlatformId::Windows => {
                // "Note: Subtable format 13 has the same structure as format 12; it differs only
                // in the interpretation of the startGlyphID/glyphID fields".
                let is_format_12_compatible = matches!(
                    self.format,
                    Format::SegmentedCoverage(..) | Format::ManyToOneRangeMappings(..)
                );

                // "Fonts that support Unicode supplementary-plane characters (U+10000 to U+10FFFF)
                // on the Windows platform must have a format 12 subtable for platform ID 3,
                // encoding ID 10."
                self.encoding_id == WINDOWS_UNICODE_FULL_REPERTOIRE_ENCODING_ID
                    && is_format_12_compatible
            }
            _ => false,
        }
    }

    /// Maps a character to a glyph ID.
    ///
    /// This is a low-level method and unlike `Face::glyph_index` it doesn't
    /// check that the current encoding is Unicode.
    /// It simply maps a `u32` codepoint number to a glyph ID.
    ///
    /// Returns `None`:
    /// - when glyph ID is `0`.
    /// - when format is `MixedCoverage`, since it's not supported.
    /// - when format is `UnicodeVariationSequences`. Use `glyph_variation_index` instead.
    #[inline]
    pub fn glyph_index(&self, code_point: u32) -> Option<GlyphId> {
        match self.format {
            Format::ByteEncodingTable(ref subtable) => subtable.glyph_index(code_point),
            Format::HighByteMappingThroughTable(ref subtable) => subtable.glyph_index(code_point),
            Format::SegmentMappingToDeltaValues(ref subtable) => subtable.glyph_index(code_point),
            Format::TrimmedTableMapping(ref subtable) => subtable.glyph_index(code_point),
            Format::MixedCoverage => None,
            Format::TrimmedArray(ref subtable) => subtable.glyph_index(code_point),
            Format::SegmentedCoverage(ref subtable) => subtable.glyph_index(code_point),
            Format::ManyToOneRangeMappings(ref subtable) => subtable.glyph_index(code_point),
            // This subtable should be accessed via glyph_variation_index().
            Format::UnicodeVariationSequences(_) => None,
        }
    }

    /// Resolves a variation of a glyph ID from two code points.
    ///
    /// Returns `None`:
    /// - when glyph ID is `0`.
    /// - when format is not `UnicodeVariationSequences`.
    #[inline]
    pub fn glyph_variation_index(
        &self,
        code_point: u32,
        variation: u32,
    ) -> Option<GlyphVariationResult> {
        match self.format {
            Format::UnicodeVariationSequences(ref subtable) => {
                subtable.glyph_index(code_point, variation)
            }
            _ => None,
        }
    }

    /// Calls `f` for all codepoints contained in this subtable.
    ///
    /// This is a low-level method and it doesn't check that the current
    /// encoding is Unicode. It simply calls the function `f` for all `u32`
    /// codepoints that are present in this subtable.
    ///
    /// Note that this may list codepoints for which `glyph_index` still returns
    /// `None` because this method finds all codepoints which were _defined_ in
    /// this subtable. The subtable may still map them to glyph ID `0`.
    ///
    /// Returns without doing anything:
    /// - when format is `MixedCoverage`, since it's not supported.
    /// - when format is `UnicodeVariationSequences`, since it's not supported.
    pub fn codepoints<F: FnMut(u32)>(&self, f: F) {
        match self.format {
            Format::ByteEncodingTable(ref subtable) => subtable.codepoints(f),
            Format::HighByteMappingThroughTable(ref subtable) => subtable.codepoints(f),
            Format::SegmentMappingToDeltaValues(ref subtable) => subtable.codepoints(f),
            Format::TrimmedTableMapping(ref subtable) => subtable.codepoints(f),
            Format::MixedCoverage => {} // unsupported
            Format::TrimmedArray(ref subtable) => subtable.codepoints(f),
            Format::SegmentedCoverage(ref subtable) => subtable.codepoints(f),
            Format::ManyToOneRangeMappings(ref subtable) => subtable.codepoints(f),
            Format::UnicodeVariationSequences(_) => {} // unsupported
        };
    }
}

#[derive(Clone, Copy)]
struct EncodingRecord {
    platform_id: PlatformId,
    encoding_id: u16,
    offset: Offset32,
}

impl FromData for EncodingRecord {
    const SIZE: usize = 8;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(EncodingRecord {
            platform_id: s.read::<PlatformId>()?,
            encoding_id: s.read::<u16>()?,
            offset: s.read::<Offset32>()?,
        })
    }
}

/// A list of subtables.
#[derive(Clone, Copy, Default)]
pub struct Subtables<'a> {
    data: &'a [u8],
    records: LazyArray16<'a, EncodingRecord>,
}

impl core::fmt::Debug for Subtables<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Subtables {{ ... }}")
    }
}

impl<'a> Subtables<'a> {
    /// Returns a subtable at an index.
    pub fn get(&self, index: u16) -> Option<Subtable<'a>> {
        let record = self.records.get(index)?;
        let data = self.data.get(record.offset.to_usize()..)?;
        let format = match Stream::read_at::<u16>(data, 0)? {
            0 => Format::ByteEncodingTable(Subtable0::parse(data)?),
            2 => Format::HighByteMappingThroughTable(Subtable2::parse(data)?),
            4 => Format::SegmentMappingToDeltaValues(Subtable4::parse(data)?),
            6 => Format::TrimmedTableMapping(Subtable6::parse(data)?),
            8 => Format::MixedCoverage, // unsupported
            10 => Format::TrimmedArray(Subtable10::parse(data)?),
            12 => Format::SegmentedCoverage(Subtable12::parse(data)?),
            13 => Format::ManyToOneRangeMappings(Subtable13::parse(data)?),
            14 => Format::UnicodeVariationSequences(Subtable14::parse(data)?),
            _ => return None,
        };

        Some(Subtable {
            platform_id: record.platform_id,
            encoding_id: record.encoding_id,
            format,
        })
    }

    /// Returns the number of subtables.
    #[inline]
    pub fn len(&self) -> u16 {
        self.records.len()
    }

    /// Checks if there are any subtables.
    pub fn is_empty(&self) -> bool {
        self.records.is_empty()
    }
}

impl<'a> IntoIterator for Subtables<'a> {
    type Item = Subtable<'a>;
    type IntoIter = SubtablesIter<'a>;

    #[inline]
    fn into_iter(self) -> Self::IntoIter {
        SubtablesIter {
            subtables: self,
            index: 0,
        }
    }
}

/// An iterator over [`Subtables`].
#[allow(missing_debug_implementations)]
pub struct SubtablesIter<'a> {
    subtables: Subtables<'a>,
    index: u16,
}

impl<'a> Iterator for SubtablesIter<'a> {
    type Item = Subtable<'a>;

    #[inline]
    fn next(&mut self) -> Option<Self::Item> {
        if self.index < self.subtables.len() {
            self.index += 1;
            self.subtables.get(self.index - 1)
        } else {
            None
        }
    }
}

/// A [Character to Glyph Index Mapping Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/cmap).
#[derive(Clone, Copy, Debug)]
pub struct Table<'a> {
    /// A list of subtables.
    pub subtables: Subtables<'a>,
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        s.skip::<u16>(); // version
        let count = s.read::<u16>()?;
        let records = s.read_array16::<EncodingRecord>(count)?;
        Some(Table {
            subtables: Subtables { data, records },
        })
    }
}
ttf-parser-0.24.1/src/tables/colr.rs000064400000000000001777371046102023000153570ustar 00000000000000
//! A [Color Table](
//! https://docs.microsoft.com/en-us/typography/opentype/spec/colr) implementation.

// NOTE: Parts of the implementation have been inspired by
// [skrifa](https://github.com/googlefonts/fontations/tree/main/skrifa).

#[cfg(feature = "variable-fonts")]
use crate::delta_set::DeltaSetIndexMap;
use crate::parser::{FromData, LazyArray16, Offset, Offset24, Offset32, Stream, F2DOT14};
#[cfg(feature = "variable-fonts")]
use crate::var_store::ItemVariationStore;
#[cfg(feature = "variable-fonts")]
use crate::NormalizedCoordinate;
use crate::{cpal, Fixed, LazyArray32, RectF, Transform};
use crate::{GlyphId, RgbaColor};

/// A [base glyph](
/// https://learn.microsoft.com/en-us/typography/opentype/spec/colr#baseglyph-and-layer-records).
#[derive(Clone, Copy, Debug)]
struct BaseGlyphRecord {
    glyph_id: GlyphId,
    first_layer_index: u16,
    num_layers: u16,
}

/// A [ClipBox](https://learn.microsoft.com/en-us/typography/opentype/spec/colr#baseglyphlist-layerlist-and-cliplist).
pub type ClipBox = RectF;

/// A paint.
#[derive(Clone, Debug)]
pub enum Paint<'a> {
    /// A paint with a solid color.
    Solid(RgbaColor),
    /// A paint with a linear gradient.
    LinearGradient(LinearGradient<'a>),
    /// A paint with a radial gradient.
    RadialGradient(RadialGradient<'a>),
    /// A paint with a sweep gradient.
    SweepGradient(SweepGradient<'a>),
}

/// A [clip record](
/// https://learn.microsoft.com/en-us/typography/opentype/spec/colr#baseglyphlist-layerlist-and-cliplist).
#[derive(Clone, Copy, Debug)]
struct ClipRecord {
    /// The first glyph ID for the range covered by this record.
pub start_glyph_id: GlyphId, /// The last glyph ID, *inclusive*, for the range covered by this record. pub end_glyph_id: GlyphId, /// The offset to the clip box. pub clip_box_offset: Offset24, } impl FromData for ClipRecord { const SIZE: usize = 7; fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(ClipRecord { start_glyph_id: s.read::()?, end_glyph_id: s.read::()?, clip_box_offset: s.read::()?, }) } } impl ClipRecord { /// Returns the glyphs range. pub fn glyphs_range(&self) -> core::ops::RangeInclusive { self.start_glyph_id..=self.end_glyph_id } } /// A [clip list]( /// https://learn.microsoft.com/en-us/typography/opentype/spec/colr#baseglyphlist-layerlist-and-cliplist). #[derive(Clone, Copy, Debug, Default)] struct ClipList<'a> { data: &'a [u8], records: LazyArray32<'a, ClipRecord>, } impl<'a> ClipList<'a> { pub fn get( &self, index: u32, #[cfg(feature = "variable-fonts")] variation_data: &VariationData, #[cfg(feature = "variable-fonts")] coords: &[NormalizedCoordinate], ) -> Option { let record = self.records.get(index)?; let offset = record.clip_box_offset.to_usize(); self.data.get(offset..).and_then(|data| { let mut s = Stream::new(data); let format = s.read::()?; #[cfg(not(feature = "variable-fonts"))] let deltas = [0.0, 0.0, 0.0, 0.0]; #[cfg(feature = "variable-fonts")] let deltas = if format == 2 { let mut var_s = s.clone(); var_s.advance(8); let var_index_base = var_s.read::()?; variation_data.read_deltas::<4>(var_index_base, coords) } else { [0.0, 0.0, 0.0, 0.0] }; Some(ClipBox { x_min: s.read::()? as f32 + deltas[0], y_min: s.read::()? as f32 + deltas[1], x_max: s.read::()? as f32 + deltas[2], y_max: s.read::()? as f32 + deltas[3], }) }) } /// Returns a ClipBox by glyph ID. 
#[inline] pub fn find( &self, glyph_id: GlyphId, #[cfg(feature = "variable-fonts")] variation_data: &VariationData, #[cfg(feature = "variable-fonts")] coords: &[NormalizedCoordinate], ) -> Option { let index = self .records .into_iter() .position(|v| v.glyphs_range().contains(&glyph_id))?; self.get( index as u32, #[cfg(feature = "variable-fonts")] variation_data, #[cfg(feature = "variable-fonts")] coords, ) } } impl FromData for BaseGlyphRecord { const SIZE: usize = 6; fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(Self { glyph_id: s.read::()?, first_layer_index: s.read::()?, num_layers: s.read::()?, }) } } /// A [layer]( /// https://learn.microsoft.com/en-us/typography/opentype/spec/colr#baseglyph-and-layer-records). #[derive(Clone, Copy, Debug)] struct LayerRecord { glyph_id: GlyphId, palette_index: u16, } impl FromData for LayerRecord { const SIZE: usize = 4; fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(Self { glyph_id: s.read::()?, palette_index: s.read::()?, }) } } /// A [BaseGlyphPaintRecord]( /// https://learn.microsoft.com/en-us/typography/opentype/spec/colr#baseglyphlist-layerlist-and-cliplist). #[derive(Clone, Copy, Debug)] struct BaseGlyphPaintRecord { glyph_id: GlyphId, paint_table_offset: Offset32, } impl FromData for BaseGlyphPaintRecord { const SIZE: usize = 6; fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(Self { glyph_id: s.read::()?, paint_table_offset: s.read::()?, }) } } /// A [gradient extend]( /// https://learn.microsoft.com/en-us/typography/opentype/spec/colr#baseglyphlist-layerlist-and-cliplist). #[derive(Clone, Copy, Debug, PartialEq)] pub enum GradientExtend { /// The `Pad` gradient extend mode. Pad, /// The `Repeat` gradient extend mode. Repeat, /// The `Reflect` gradient extend mode. 
Reflect, } impl FromData for GradientExtend { const SIZE: usize = 1; fn parse(data: &[u8]) -> Option { match data[0] { 0 => Some(Self::Pad), 1 => Some(Self::Repeat), 2 => Some(Self::Reflect), _ => None, } } } /// A [color stop]( /// https://learn.microsoft.com/en-us/typography/opentype/spec/colr#color-references-colorstop-and-colorline). #[derive(Clone, Copy, Debug)] struct ColorStopRaw { stop_offset: F2DOT14, palette_index: u16, alpha: F2DOT14, } impl FromData for ColorStopRaw { const SIZE: usize = 6; fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(Self { stop_offset: s.read::()?, palette_index: s.read::()?, alpha: s.read::()?, }) } } /// A [var color stop]( /// https://learn.microsoft.com/en-us/typography/opentype/spec/colr#color-references-colorstop-and-colorline). #[cfg(feature = "variable-fonts")] #[derive(Clone, Copy, Debug)] struct VarColorStopRaw { stop_offset: F2DOT14, palette_index: u16, alpha: F2DOT14, var_index_base: u32, } #[cfg(feature = "variable-fonts")] impl FromData for VarColorStopRaw { const SIZE: usize = 10; fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(Self { stop_offset: s.read::()?, palette_index: s.read::()?, alpha: s.read::()?, var_index_base: s.read::()?, }) } } #[derive(Clone)] struct NonVarColorLine<'a> { extend: GradientExtend, colors: LazyArray16<'a, ColorStopRaw>, palettes: cpal::Table<'a>, foreground_color: RgbaColor, } impl NonVarColorLine<'_> { // TODO: Color stops should be sorted, but hard to do without allocations fn get(&self, palette: u16, index: u16) -> Option { let info = self.colors.get(index)?; let mut color = if info.palette_index == u16::MAX { self.foreground_color } else { self.palettes.get(palette, info.palette_index)? 
}; color.apply_alpha(info.alpha.to_f32()); Some(ColorStop { stop_offset: info.stop_offset.to_f32(), color, }) } } #[cfg(feature = "variable-fonts")] impl VarColorLine<'_> { // TODO: Color stops should be sorted, but hard to do without allocations fn get( &self, palette: u16, index: u16, #[cfg(feature = "variable-fonts")] variation_data: VariationData, #[cfg(feature = "variable-fonts")] coordinates: &[NormalizedCoordinate], ) -> Option { let info = self.colors.get(index)?; let mut color = if info.palette_index == u16::MAX { self.foreground_color } else { self.palettes.get(palette, info.palette_index)? }; let deltas = variation_data.read_deltas::<2>(info.var_index_base, coordinates); let stop_offset = info.stop_offset.apply_float_delta(deltas[0]); color.apply_alpha(info.alpha.apply_float_delta(deltas[1])); Some(ColorStop { stop_offset, color }) } } #[cfg(feature = "variable-fonts")] #[derive(Clone)] struct VarColorLine<'a> { extend: GradientExtend, colors: LazyArray16<'a, VarColorStopRaw>, palettes: cpal::Table<'a>, foreground_color: RgbaColor, } #[derive(Clone)] enum ColorLine<'a> { #[cfg(feature = "variable-fonts")] VarColorLine(VarColorLine<'a>), NonVarColorLine(NonVarColorLine<'a>), } /// A [gradient extend]( /// https://learn.microsoft.com/en-us/typography/opentype/spec/colr#baseglyphlist-layerlist-and-cliplist). #[derive(Clone, Copy, Debug)] pub struct ColorStop { /// The offset of the color stop. pub stop_offset: f32, /// The color of the color stop. pub color: RgbaColor, } /// A [linear gradient](https://learn.microsoft.com/en-us/typography/opentype/spec/colr#formats-4-and-5-paintlineargradient-paintvarlineargradient) #[derive(Clone)] pub struct LinearGradient<'a> { /// The `x0` value. pub x0: f32, /// The `y0` value. pub y0: f32, /// The `x1` value. pub x1: f32, /// The `y1` value. pub y1: f32, /// The `x2` value. pub x2: f32, /// The `y2` value. pub y2: f32, /// The extend. 
pub extend: GradientExtend, #[cfg(feature = "variable-fonts")] variation_data: VariationData<'a>, color_line: ColorLine<'a>, } impl<'a> core::fmt::Debug for LinearGradient<'a> { fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result { f.debug_struct("LinearGradient") .field("x0", &self.x0) .field("y0", &self.y0) .field("x1", &self.x1) .field("y1", &self.y1) .field("x2", &self.x2) .field("y2", &self.y2) .field("extend", &self.extend) .field( "stops", &self.stops( 0, #[cfg(feature = "variable-fonts")] &[], ), ) .finish() } } impl<'a> LinearGradient<'a> { /// Returns an iterator over the stops of the linear gradient. Stops need to be sorted /// manually by the caller. pub fn stops<'b>( &'b self, palette: u16, #[cfg(feature = "variable-fonts")] coords: &'b [NormalizedCoordinate], ) -> GradientStopsIter<'a, 'b> { GradientStopsIter { color_line: &self.color_line, palette, index: 0, #[cfg(feature = "variable-fonts")] variation_data: self.variation_data, #[cfg(feature = "variable-fonts")] coords, } } } /// A [radial gradient](https://learn.microsoft.com/en-us/typography/opentype/spec/colr#formats-6-and-7-paintradialgradient-paintvarradialgradient) #[derive(Clone)] pub struct RadialGradient<'a> { /// The `x0` value. pub x0: f32, /// The `y0` value. pub y0: f32, /// The `r0` value. pub r0: f32, /// The `r1` value. pub r1: f32, /// The `x1` value. pub x1: f32, /// The `y1` value. pub y1: f32, /// The extend. 
pub extend: GradientExtend, #[cfg(feature = "variable-fonts")] variation_data: VariationData<'a>, color_line: ColorLine<'a>, } impl<'a> core::fmt::Debug for RadialGradient<'a> { fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result { f.debug_struct("RadialGradient") .field("x0", &self.x0) .field("y0", &self.y0) .field("r0", &self.r0) .field("r1", &self.r1) .field("x1", &self.x1) .field("y1", &self.y1) .field("extend", &self.extend) .field( "stops", &self.stops( 0, #[cfg(feature = "variable-fonts")] &[], ), ) .finish() } } impl<'a> RadialGradient<'a> { /// Returns an iterator over the stops of the radial gradient. Stops need to be sorted /// manually by the caller. pub fn stops<'b>( &'b self, palette: u16, #[cfg(feature = "variable-fonts")] coords: &'a [NormalizedCoordinate], ) -> GradientStopsIter<'a, 'b> { GradientStopsIter { color_line: &self.color_line, palette, index: 0, #[cfg(feature = "variable-fonts")] variation_data: self.variation_data, #[cfg(feature = "variable-fonts")] coords, } } } /// A [sweep gradient](https://learn.microsoft.com/en-us/typography/opentype/spec/colr#formats-8-and-9-paintsweepgradient-paintvarsweepgradient) #[derive(Clone)] pub struct SweepGradient<'a> { /// The x of the center. pub center_x: f32, /// The y of the center. pub center_y: f32, /// The start angle. pub start_angle: f32, /// The end angle. pub end_angle: f32, /// The extend. 
pub extend: GradientExtend, #[cfg(feature = "variable-fonts")] variation_data: VariationData<'a>, color_line: ColorLine<'a>, } impl<'a> core::fmt::Debug for SweepGradient<'a> { fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result { f.debug_struct("SweepGradient") .field("center_x", &self.center_x) .field("center_y", &self.center_y) .field("start_angle", &self.start_angle) .field("end_angle", &self.end_angle) .field("extend", &self.extend) .field( "stops", &self.stops( 0, #[cfg(feature = "variable-fonts")] &[], ), ) .finish() } } impl<'a> SweepGradient<'a> { // TODO: Make API nicer so that variable coordinates don't // need to be passed by the caller (same for radial and linear gradient) /// Returns an iterator over the stops of the sweep gradient. Stops need to be sorted /// manually by the caller. pub fn stops<'b>( &'b self, palette: u16, #[cfg(feature = "variable-fonts")] coords: &'a [NormalizedCoordinate], ) -> GradientStopsIter<'a, 'b> { GradientStopsIter { color_line: &self.color_line, palette, index: 0, #[cfg(feature = "variable-fonts")] variation_data: self.variation_data, #[cfg(feature = "variable-fonts")] coords, } } } /// An iterator over stops of a gradient. 
#[derive(Clone, Copy)] pub struct GradientStopsIter<'a, 'b> { color_line: &'b ColorLine<'a>, palette: u16, index: u16, #[cfg(feature = "variable-fonts")] variation_data: VariationData<'a>, #[cfg(feature = "variable-fonts")] coords: &'b [NormalizedCoordinate], } impl Iterator for GradientStopsIter<'_, '_> { type Item = ColorStop; fn next(&mut self) -> Option { let len = match self.color_line { #[cfg(feature = "variable-fonts")] ColorLine::VarColorLine(vcl) => vcl.colors.len(), ColorLine::NonVarColorLine(nvcl) => nvcl.colors.len(), }; if self.index == len { return None; } let index = self.index; self.index = self.index.checked_add(1)?; match self.color_line { #[cfg(feature = "variable-fonts")] ColorLine::VarColorLine(vcl) => { vcl.get(self.palette, index, self.variation_data, self.coords) } ColorLine::NonVarColorLine(nvcl) => nvcl.get(self.palette, index), } } } impl core::fmt::Debug for GradientStopsIter<'_, '_> { fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result { f.debug_list().entries(*self).finish() } } /// A [composite mode](https://learn.microsoft.com/en-us/typography/opentype/spec/colr#format-32-paintcomposite) #[derive(Clone, Copy, PartialEq, Debug)] pub enum CompositeMode { /// The composite mode 'Clear'. Clear, /// The composite mode 'Source'. Source, /// The composite mode 'Destination'. Destination, /// The composite mode 'SourceOver'. SourceOver, /// The composite mode 'DestinationOver'. DestinationOver, /// The composite mode 'SourceIn'. SourceIn, /// The composite mode 'DestinationIn'. DestinationIn, /// The composite mode 'SourceOut'. SourceOut, /// The composite mode 'DestinationOut'. DestinationOut, /// The composite mode 'SourceAtop'. SourceAtop, /// The composite mode 'DestinationAtop'. DestinationAtop, /// The composite mode 'Xor'. Xor, /// The composite mode 'Plus'. Plus, /// The composite mode 'Screen'. Screen, /// The composite mode 'Overlay'. Overlay, /// The composite mode 'Darken'. 
Darken, /// The composite mode 'Lighten'. Lighten, /// The composite mode 'ColorDodge'. ColorDodge, /// The composite mode 'ColorBurn'. ColorBurn, /// The composite mode 'HardLight'. HardLight, /// The composite mode 'SoftLight'. SoftLight, /// The composite mode 'Difference'. Difference, /// The composite mode 'Exclusion'. Exclusion, /// The composite mode 'Multiply'. Multiply, /// The composite mode 'Hue'. Hue, /// The composite mode 'Saturation'. Saturation, /// The composite mode 'Color'. Color, /// The composite mode 'Luminosity'. Luminosity, } impl FromData for CompositeMode { const SIZE: usize = 1; fn parse(data: &[u8]) -> Option { match data[0] { 0 => Some(Self::Clear), 1 => Some(Self::Source), 2 => Some(Self::Destination), 3 => Some(Self::SourceOver), 4 => Some(Self::DestinationOver), 5 => Some(Self::SourceIn), 6 => Some(Self::DestinationIn), 7 => Some(Self::SourceOut), 8 => Some(Self::DestinationOut), 9 => Some(Self::SourceAtop), 10 => Some(Self::DestinationAtop), 11 => Some(Self::Xor), 12 => Some(Self::Plus), 13 => Some(Self::Screen), 14 => Some(Self::Overlay), 15 => Some(Self::Darken), 16 => Some(Self::Lighten), 17 => Some(Self::ColorDodge), 18 => Some(Self::ColorBurn), 19 => Some(Self::HardLight), 20 => Some(Self::SoftLight), 21 => Some(Self::Difference), 22 => Some(Self::Exclusion), 23 => Some(Self::Multiply), 24 => Some(Self::Hue), 25 => Some(Self::Saturation), 26 => Some(Self::Color), 27 => Some(Self::Luminosity), _ => None, } } } /// A trait for color glyph painting. /// /// See [COLR](https://learn.microsoft.com/en-us/typography/opentype/spec/colr) for details. pub trait Painter<'a> { /// Outline a glyph and store it. fn outline_glyph(&mut self, glyph_id: GlyphId); /// Paint the stored outline using the provided color. fn paint(&mut self, paint: Paint<'a>); /// Push a new clip path using the currently stored outline. fn push_clip(&mut self); /// Push a new clip path using the clip box. 
    fn push_clip_box(&mut self, clipbox: ClipBox);
    /// Pop the last clip path.
    fn pop_clip(&mut self);
    /// Push a new layer with the given composite mode.
    fn push_layer(&mut self, mode: CompositeMode);
    /// Pop the last layer.
    fn pop_layer(&mut self);
    /// Push a transform.
    fn push_transform(&mut self, transform: Transform);
    /// Pop the last transform.
    fn pop_transform(&mut self);
}

/// A [Color Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/colr).
///
/// Supports versions 0 and 1.
#[derive(Clone, Copy, Debug)]
pub struct Table<'a> {
    pub(crate) palettes: cpal::Table<'a>,
    data: &'a [u8],
    version: u8,

    // v0
    base_glyphs: LazyArray16<'a, BaseGlyphRecord>,
    layers: LazyArray16<'a, LayerRecord>,

    // v1
    base_glyph_paints_offset: Offset32,
    base_glyph_paints: LazyArray32<'a, BaseGlyphPaintRecord>,
    layer_paint_offsets_offset: Offset32,
    layer_paint_offsets: LazyArray32<'a, Offset32>,
    clip_list_offsets_offset: Offset32,
    clip_list: ClipList<'a>,
    #[cfg(feature = "variable-fonts")]
    var_index_map: Option<DeltaSetIndexMap<'a>>,
    #[cfg(feature = "variable-fonts")]
    item_variation_store: Option<ItemVariationStore<'a>>,
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    pub fn parse(palettes: cpal::Table<'a>, data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);

        let version = s.read::<u16>()?;
        if version > 1 {
            return None;
        }

        let num_base_glyphs = s.read::<u16>()?;
        let base_glyphs_offset = s.read::<Offset32>()?;
        let layers_offset = s.read::<Offset32>()?;
        let num_layers = s.read::<u16>()?;

        let base_glyphs = Stream::new_at(data, base_glyphs_offset.to_usize())?
            .read_array16::<BaseGlyphRecord>(num_base_glyphs)?;

        let layers = Stream::new_at(data, layers_offset.to_usize())?
            .read_array16::<LayerRecord>(num_layers)?;

        let mut table = Self {
            version: version as u8,
            data,
            palettes,
            base_glyphs,
            layers,
            base_glyph_paints_offset: Offset32(0), // the actual value doesn't matter
            base_glyph_paints: LazyArray32::default(),
            layer_paint_offsets_offset: Offset32(0),
            layer_paint_offsets: LazyArray32::default(),
            clip_list_offsets_offset: Offset32(0),
            clip_list: ClipList::default(),
            #[cfg(feature = "variable-fonts")]
            item_variation_store: None,
            #[cfg(feature = "variable-fonts")]
            var_index_map: None,
        };

        if version == 0 {
            return Some(table);
        }

        table.base_glyph_paints_offset = s.read::<Offset32>()?;
        let layer_list_offset = s.read::<Option<Offset32>>()?;
        let clip_list_offset = s.read::<Option<Offset32>>()?;
        #[cfg(feature = "variable-fonts")]
        let var_index_map_offset = s.read::<Option<Offset32>>()?;
        #[cfg(feature = "variable-fonts")]
        let item_variation_offset = s.read::<Option<Offset32>>()?;

        {
            let mut s = Stream::new_at(data, table.base_glyph_paints_offset.to_usize())?;
            let count = s.read::<u32>()?;
            table.base_glyph_paints = s.read_array32::<BaseGlyphPaintRecord>(count)?;
        }

        if let Some(offset) = layer_list_offset {
            table.layer_paint_offsets_offset = offset;
            let mut s = Stream::new_at(data, offset.to_usize())?;
            let count = s.read::<u32>()?;
            table.layer_paint_offsets = s.read_array32::<Offset32>(count)?;
        }

        if let Some(offset) = clip_list_offset {
            table.clip_list_offsets_offset = offset;
            let clip_data = data.get(offset.to_usize()..)?;
            let mut s = Stream::new(clip_data);
            s.skip::<u8>(); // Format
            let count = s.read::<u32>()?;
            table.clip_list = ClipList {
                data: clip_data,
                records: s.read_array32::<ClipRecord>(count)?,
            };
        }

        #[cfg(feature = "variable-fonts")]
        {
            if let Some(offset) = item_variation_offset {
                let item_var_data = data.get(offset.to_usize()..)?;
                let s = Stream::new(item_var_data);
                let var_store = ItemVariationStore::parse(s)?;
                table.item_variation_store = Some(var_store);
            }
        }

        #[cfg(feature = "variable-fonts")]
        {
            if let Some(offset) = var_index_map_offset {
                let var_index_map_data = data.get(offset.to_usize()..)?;
                let var_index_map = DeltaSetIndexMap::new(var_index_map_data);
                table.var_index_map = Some(var_index_map);
            }
        }

        Some(table)
    }

    /// Returns `true` if the current table has version 0.
    ///
    /// A simple table can only emit `outline_glyph` and `paint`
    /// [`Painter`] methods.
    pub fn is_simple(&self) -> bool {
        self.version == 0
    }

    fn get_v0(&self, glyph_id: GlyphId) -> Option<BaseGlyphRecord> {
        self.base_glyphs
            .binary_search_by(|base| base.glyph_id.cmp(&glyph_id))
            .map(|v| v.1)
    }

    fn get_v1(&self, glyph_id: GlyphId) -> Option<BaseGlyphPaintRecord> {
        self.base_glyph_paints
            .binary_search_by(|base| base.glyph_id.cmp(&glyph_id))
            .map(|v| v.1)
    }

    #[cfg(feature = "variable-fonts")]
    fn variation_data(&self) -> VariationData<'a> {
        VariationData {
            variation_store: self.item_variation_store,
            delta_map: self.var_index_map,
        }
    }

    /// Whether the table contains a definition for the given glyph.
    pub fn contains(&self, glyph_id: GlyphId) -> bool {
        self.get_v1(glyph_id).is_some() || self.get_v0(glyph_id).is_some()
    }

    /// Returns the clip box for a glyph.
    pub fn clip_box(
        &self,
        glyph_id: GlyphId,
        #[cfg(feature = "variable-fonts")] coords: &[NormalizedCoordinate],
    ) -> Option<ClipBox> {
        self.clip_list.find(
            glyph_id,
            #[cfg(feature = "variable-fonts")]
            &self.variation_data(),
            #[cfg(feature = "variable-fonts")]
            coords,
        )
    }

    // This method should only be called from outside, not from within `colr.rs`.
    // From inside, you should always call `paint_impl`, so that the recursion stack can
    // be passed on and any kind of recursion can be prevented.
    /// Paints the color glyph.
    pub fn paint(
        &self,
        glyph_id: GlyphId,
        palette: u16,
        painter: &mut dyn Painter<'a>,
        #[cfg(feature = "variable-fonts")] coords: &[NormalizedCoordinate],
        foreground_color: RgbaColor,
    ) -> Option<()> {
        let mut recursion_stack = RecursionStack {
            stack: [0; 64],
            len: 0,
        };
        self.paint_impl(
            glyph_id,
            palette,
            painter,
            &mut recursion_stack,
            #[cfg(feature = "variable-fonts")]
            coords,
            foreground_color,
        )
    }

    fn paint_impl(
        &self,
        glyph_id: GlyphId,
        palette: u16,
        painter: &mut dyn Painter<'a>,
        recursion_stack: &mut RecursionStack,
        #[cfg(feature = "variable-fonts")] coords: &[NormalizedCoordinate],
        foreground_color: RgbaColor,
    ) -> Option<()> {
        if let Some(base) = self.get_v1(glyph_id) {
            self.paint_v1(
                base,
                palette,
                painter,
                recursion_stack,
                #[cfg(feature = "variable-fonts")]
                coords,
                foreground_color,
            )
        } else if let Some(base) = self.get_v0(glyph_id) {
            self.paint_v0(base, palette, painter, foreground_color)
        } else {
            None
        }
    }

    fn paint_v0(
        &self,
        base: BaseGlyphRecord,
        palette: u16,
        painter: &mut dyn Painter,
        foreground_color: RgbaColor,
    ) -> Option<()> {
        let start = base.first_layer_index;
        let end = start.checked_add(base.num_layers)?;
        let layers = self.layers.slice(start..end)?;

        for layer in layers {
            if layer.palette_index == 0xFFFF {
                // A special case.
                painter.outline_glyph(layer.glyph_id);
                painter.paint(Paint::Solid(foreground_color));
            } else {
                let color = self.palettes.get(palette, layer.palette_index)?;
                painter.outline_glyph(layer.glyph_id);
                painter.paint(Paint::Solid(color));
            }
        }

        Some(())
    }

    fn paint_v1(
        &self,
        base: BaseGlyphPaintRecord,
        palette: u16,
        painter: &mut dyn Painter<'a>,
        recursion_stack: &mut RecursionStack,
        #[cfg(feature = "variable-fonts")] coords: &[NormalizedCoordinate],
        foreground_color: RgbaColor,
    ) -> Option<()> {
        let clip_box = self.clip_box(
            base.glyph_id,
            #[cfg(feature = "variable-fonts")]
            coords,
        );
        if let Some(clip_box) = clip_box {
            painter.push_clip_box(clip_box);
        }

        self.parse_paint(
            self.base_glyph_paints_offset.to_usize() + base.paint_table_offset.to_usize(),
            palette,
            painter,
            recursion_stack,
            #[cfg(feature = "variable-fonts")]
            coords,
            foreground_color,
        );

        if clip_box.is_some() {
            painter.pop_clip();
        }

        Some(())
    }

    fn parse_paint(
        &self,
        offset: usize,
        palette: u16,
        painter: &mut dyn Painter<'a>,
        recursion_stack: &mut RecursionStack,
        #[cfg(feature = "variable-fonts")] coords: &[NormalizedCoordinate],
        foreground_color: RgbaColor,
    ) -> Option<()> {
        let mut s = Stream::new_at(self.data, offset)?;
        let format = s.read::<u8>()?;

        // Cycle detected
        if recursion_stack.contains(offset) {
            return None;
        }

        recursion_stack.push(offset).ok()?;
        let result = self.parse_paint_impl(
            offset,
            palette,
            painter,
            recursion_stack,
            &mut s,
            format,
            #[cfg(feature = "variable-fonts")]
            coords,
            foreground_color,
        );
        recursion_stack.pop();

        result
    }

    fn parse_paint_impl(
        &self,
        offset: usize,
        palette: u16,
        painter: &mut dyn Painter<'a>,
        recursion_stack: &mut RecursionStack,
        s: &mut Stream,
        format: u8,
        #[cfg(feature = "variable-fonts")] coords: &[NormalizedCoordinate],
        foreground_color: RgbaColor,
    ) -> Option<()> {
        match format {
            1 => {
                // PaintColrLayers
                let layers_count = s.read::<u8>()?;
                let first_layer_index = s.read::<u32>()?;

                for i in 0..layers_count {
                    let index = first_layer_index.checked_add(u32::from(i))?;
                    let paint_offset = self.layer_paint_offsets.get(index)?;
                    self.parse_paint(
                        self.layer_paint_offsets_offset.to_usize() + paint_offset.to_usize(),
                        palette,
                        painter,
                        recursion_stack,
                        #[cfg(feature = "variable-fonts")]
                        coords,
                        foreground_color,
                    );
                }
            }
            2 => {
                // PaintSolid
                let palette_index = s.read::<u16>()?;
                let alpha = s.read::<F2Dot14>()?;

                let mut color = if palette_index == u16::MAX {
                    foreground_color
                } else {
                    self.palettes.get(palette, palette_index)?
                };

                color.apply_alpha(alpha.to_f32());
                painter.paint(Paint::Solid(color));
            }
            #[cfg(feature = "variable-fonts")]
            3 => {
                // PaintVarSolid
                let palette_index = s.read::<u16>()?;
                let alpha = s.read::<F2Dot14>()?;
                let var_index_base = s.read::<u32>()?;

                let deltas = self
                    .variation_data()
                    .read_deltas::<1>(var_index_base, coords);

                let mut color = if palette_index == u16::MAX {
                    foreground_color
                } else {
                    self.palettes.get(palette, palette_index)?
                };

                color.apply_alpha(alpha.apply_float_delta(deltas[0]));
                painter.paint(Paint::Solid(color));
            }
            4 => {
                // PaintLinearGradient
                let color_line_offset = s.read::<Offset24>()?;
                let color_line =
                    self.parse_color_line(offset + color_line_offset.to_usize(), foreground_color)?;

                painter.paint(Paint::LinearGradient(LinearGradient {
                    x0: s.read::<i16>()? as f32,
                    y0: s.read::<i16>()? as f32,
                    x1: s.read::<i16>()? as f32,
                    y1: s.read::<i16>()? as f32,
                    x2: s.read::<i16>()? as f32,
                    y2: s.read::<i16>()? as f32,
                    extend: color_line.extend,
                    #[cfg(feature = "variable-fonts")]
                    variation_data: self.variation_data(),
                    color_line: ColorLine::NonVarColorLine(color_line),
                }))
            }
            #[cfg(feature = "variable-fonts")]
            5 => {
                // PaintVarLinearGradient
                let var_color_line_offset = s.read::<Offset24>()?;
                let color_line = self.parse_var_color_line(
                    offset + var_color_line_offset.to_usize(),
                    foreground_color,
                )?;

                let mut var_s = s.clone();
                var_s.advance(12);
                let var_index_base = var_s.read::<u32>()?;

                let deltas = self
                    .variation_data()
                    .read_deltas::<6>(var_index_base, coords);

                painter.paint(Paint::LinearGradient(LinearGradient {
                    x0: s.read::<i16>()? as f32 + deltas[0],
                    y0: s.read::<i16>()? as f32 + deltas[1],
                    x1: s.read::<i16>()? as f32 + deltas[2],
                    y1: s.read::<i16>()? as f32 + deltas[3],
                    x2: s.read::<i16>()? as f32 + deltas[4],
                    y2: s.read::<i16>()? as f32 + deltas[5],
                    extend: color_line.extend,
                    variation_data: self.variation_data(),
                    color_line: ColorLine::VarColorLine(color_line),
                }))
            }
            6 => {
                // PaintRadialGradient
                let color_line_offset = s.read::<Offset24>()?;
                let color_line =
                    self.parse_color_line(offset + color_line_offset.to_usize(), foreground_color)?;

                painter.paint(Paint::RadialGradient(RadialGradient {
                    x0: s.read::<i16>()? as f32,
                    y0: s.read::<i16>()? as f32,
                    r0: s.read::<u16>()? as f32,
                    x1: s.read::<i16>()? as f32,
                    y1: s.read::<i16>()? as f32,
                    r1: s.read::<u16>()? as f32,
                    extend: color_line.extend,
                    #[cfg(feature = "variable-fonts")]
                    variation_data: self.variation_data(),
                    color_line: ColorLine::NonVarColorLine(color_line),
                }))
            }
            #[cfg(feature = "variable-fonts")]
            7 => {
                // PaintVarRadialGradient
                let color_line_offset = s.read::<Offset24>()?;
                let color_line = self.parse_var_color_line(
                    offset + color_line_offset.to_usize(),
                    foreground_color,
                )?;

                let mut var_s = s.clone();
                var_s.advance(12);
                let var_index_base = var_s.read::<u32>()?;

                let deltas = self
                    .variation_data()
                    .read_deltas::<6>(var_index_base, coords);

                painter.paint(Paint::RadialGradient(RadialGradient {
                    x0: s.read::<i16>()? as f32 + deltas[0],
                    y0: s.read::<i16>()? as f32 + deltas[1],
                    r0: s.read::<u16>()? as f32 + deltas[2],
                    x1: s.read::<i16>()? as f32 + deltas[3],
                    y1: s.read::<i16>()? as f32 + deltas[4],
                    r1: s.read::<u16>()? as f32 + deltas[5],
                    extend: color_line.extend,
                    variation_data: self.variation_data(),
                    color_line: ColorLine::VarColorLine(color_line),
                }))
            }
            8 => {
                // PaintSweepGradient
                let color_line_offset = s.read::<Offset24>()?;
                let color_line =
                    self.parse_color_line(offset + color_line_offset.to_usize(), foreground_color)?;

                painter.paint(Paint::SweepGradient(SweepGradient {
                    center_x: s.read::<i16>()? as f32,
                    center_y: s.read::<i16>()? as f32,
                    start_angle: s.read::<F2Dot14>()?.to_f32(),
                    end_angle: s.read::<F2Dot14>()?.to_f32(),
                    extend: color_line.extend,
                    color_line: ColorLine::NonVarColorLine(color_line),
                    #[cfg(feature = "variable-fonts")]
                    variation_data: self.variation_data(),
                }))
            }
            #[cfg(feature = "variable-fonts")]
            9 => {
                // PaintVarSweepGradient
                let color_line_offset = s.read::<Offset24>()?;
                let color_line = self.parse_var_color_line(
                    offset + color_line_offset.to_usize(),
                    foreground_color,
                )?;

                let mut var_s = s.clone();
                var_s.advance(8);
                let var_index_base = var_s.read::<u32>()?;

                let deltas = self
                    .variation_data()
                    .read_deltas::<4>(var_index_base, coords);

                painter.paint(Paint::SweepGradient(SweepGradient {
                    center_x: s.read::<i16>()? as f32 + deltas[0],
                    center_y: s.read::<i16>()? as f32 + deltas[1],
                    start_angle: s.read::<F2Dot14>()?.apply_float_delta(deltas[2]),
                    end_angle: s.read::<F2Dot14>()?.apply_float_delta(deltas[3]),
                    extend: color_line.extend,
                    color_line: ColorLine::VarColorLine(color_line),
                    variation_data: self.variation_data(),
                }))
            }
            10 => {
                // PaintGlyph
                let paint_offset = s.read::<Offset24>()?;
                let glyph_id = s.read::<GlyphId>()?;
                painter.outline_glyph(glyph_id);
                painter.push_clip();

                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    #[cfg(feature = "variable-fonts")]
                    coords,
                    foreground_color,
                );

                painter.pop_clip();
            }
            11 => {
                // PaintColrGlyph
                let glyph_id = s.read::<GlyphId>()?;
                self.paint_impl(
                    glyph_id,
                    palette,
                    painter,
                    recursion_stack,
                    #[cfg(feature = "variable-fonts")]
                    coords,
                    foreground_color,
                );
            }
            12 => {
                // PaintTransform
                let paint_offset = s.read::<Offset24>()?;
                let ts_offset = s.read::<Offset24>()?;
                let mut s = Stream::new_at(self.data, offset + ts_offset.to_usize())?;
                let ts = Transform {
                    a: s.read::<Fixed>().map(|n| n.0)?,
                    b: s.read::<Fixed>().map(|n| n.0)?,
                    c: s.read::<Fixed>().map(|n| n.0)?,
                    d: s.read::<Fixed>().map(|n| n.0)?,
                    e: s.read::<Fixed>().map(|n| n.0)?,
                    f: s.read::<Fixed>().map(|n| n.0)?,
                };

                painter.push_transform(ts);
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    #[cfg(feature = "variable-fonts")]
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
            }
            #[cfg(feature = "variable-fonts")]
            13 => {
                // PaintVarTransform
                let paint_offset = s.read::<Offset24>()?;
                let ts_offset = s.read::<Offset24>()?;
                let mut s = Stream::new_at(self.data, offset + ts_offset.to_usize())?;

                let mut var_s = s.clone();
                var_s.advance(24);
                let var_index_base = var_s.read::<u32>()?;

                let deltas = self
                    .variation_data()
                    .read_deltas::<6>(var_index_base, coords);

                let ts = Transform {
                    a: s.read::<Fixed>()?.apply_float_delta(deltas[0]),
                    b: s.read::<Fixed>()?.apply_float_delta(deltas[1]),
                    c: s.read::<Fixed>()?.apply_float_delta(deltas[2]),
                    d: s.read::<Fixed>()?.apply_float_delta(deltas[3]),
                    e: s.read::<Fixed>()?.apply_float_delta(deltas[4]),
                    f: s.read::<Fixed>()?.apply_float_delta(deltas[5]),
                };

                painter.push_transform(ts);
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
            }
            14 => {
                // PaintTranslate
                let paint_offset = s.read::<Offset24>()?;
                let tx = f32::from(s.read::<i16>()?);
                let ty = f32::from(s.read::<i16>()?);

                painter.push_transform(Transform::new_translate(tx, ty));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    #[cfg(feature = "variable-fonts")]
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
            }
            #[cfg(feature = "variable-fonts")]
            15 => {
                // PaintVarTranslate
                let paint_offset = s.read::<Offset24>()?;

                let mut var_s = s.clone();
                var_s.advance(4);
                let var_index_base = var_s.read::<u32>()?;

                let deltas = self
                    .variation_data()
                    .read_deltas::<2>(var_index_base, coords);

                let tx = f32::from(s.read::<i16>()?) + deltas[0];
                let ty = f32::from(s.read::<i16>()?) + deltas[1];

                painter.push_transform(Transform::new_translate(tx, ty));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
            }
            16 => {
                // PaintScale
                let paint_offset = s.read::<Offset24>()?;
                let sx = s.read::<F2Dot14>()?.to_f32();
                let sy = s.read::<F2Dot14>()?.to_f32();

                painter.push_transform(Transform::new_scale(sx, sy));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    #[cfg(feature = "variable-fonts")]
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
            }
            #[cfg(feature = "variable-fonts")]
            17 => {
                // PaintVarScale
                let paint_offset = s.read::<Offset24>()?;

                let mut var_s = s.clone();
                var_s.advance(4);
                let var_index_base = var_s.read::<u32>()?;

                let deltas = self
                    .variation_data()
                    .read_deltas::<2>(var_index_base, coords);

                let sx = s.read::<F2Dot14>()?.apply_float_delta(deltas[0]);
                let sy = s.read::<F2Dot14>()?.apply_float_delta(deltas[1]);

                painter.push_transform(Transform::new_scale(sx, sy));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
            }
            18 => {
                // PaintScaleAroundCenter
                let paint_offset = s.read::<Offset24>()?;
                let sx = s.read::<F2Dot14>()?.to_f32();
                let sy = s.read::<F2Dot14>()?.to_f32();
                let center_x = f32::from(s.read::<i16>()?);
                let center_y = f32::from(s.read::<i16>()?);

                painter.push_transform(Transform::new_translate(center_x, center_y));
                painter.push_transform(Transform::new_scale(sx, sy));
                painter.push_transform(Transform::new_translate(-center_x, -center_y));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    #[cfg(feature = "variable-fonts")]
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
                painter.pop_transform();
                painter.pop_transform();
            }
            #[cfg(feature = "variable-fonts")]
            19 => {
                // PaintVarScaleAroundCenter
                let paint_offset = s.read::<Offset24>()?;

                let mut var_s = s.clone();
                var_s.advance(8);
                let var_index_base = var_s.read::<u32>()?;

                let deltas = self
                    .variation_data()
                    .read_deltas::<4>(var_index_base, coords);

                let sx = s.read::<F2Dot14>()?.apply_float_delta(deltas[0]);
                let sy = s.read::<F2Dot14>()?.apply_float_delta(deltas[1]);
                let center_x = f32::from(s.read::<i16>()?) + deltas[2];
                let center_y = f32::from(s.read::<i16>()?) + deltas[3];

                painter.push_transform(Transform::new_translate(center_x, center_y));
                painter.push_transform(Transform::new_scale(sx, sy));
                painter.push_transform(Transform::new_translate(-center_x, -center_y));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
                painter.pop_transform();
                painter.pop_transform();
            }
            20 => {
                // PaintScaleUniform
                let paint_offset = s.read::<Offset24>()?;
                let scale = s.read::<F2Dot14>()?.to_f32();

                painter.push_transform(Transform::new_scale(scale, scale));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    #[cfg(feature = "variable-fonts")]
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
            }
            #[cfg(feature = "variable-fonts")]
            21 => {
                // PaintVarScaleUniform
                let paint_offset = s.read::<Offset24>()?;

                let mut var_s = s.clone();
                var_s.advance(2);
                let var_index_base = var_s.read::<u32>()?;

                let deltas = self
                    .variation_data()
                    .read_deltas::<1>(var_index_base, coords);

                let scale = s.read::<F2Dot14>()?.apply_float_delta(deltas[0]);

                painter.push_transform(Transform::new_scale(scale, scale));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
            }
            22 => {
                // PaintScaleUniformAroundCenter
                let paint_offset = s.read::<Offset24>()?;
                let scale = s.read::<F2Dot14>()?.to_f32();
                let center_x = f32::from(s.read::<i16>()?);
                let center_y = f32::from(s.read::<i16>()?);

                painter.push_transform(Transform::new_translate(center_x, center_y));
                painter.push_transform(Transform::new_scale(scale, scale));
                painter.push_transform(Transform::new_translate(-center_x, -center_y));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    #[cfg(feature = "variable-fonts")]
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
                painter.pop_transform();
                painter.pop_transform();
            }
            #[cfg(feature = "variable-fonts")]
            23 => {
                // PaintVarScaleUniformAroundCenter
                let paint_offset = s.read::<Offset24>()?;

                let mut var_s = s.clone();
                var_s.advance(6);
                let var_index_base = var_s.read::<u32>()?;

                let deltas = self
                    .variation_data()
                    .read_deltas::<3>(var_index_base, coords);

                let scale = s.read::<F2Dot14>()?.apply_float_delta(deltas[0]);
                let center_x = f32::from(s.read::<i16>()?) + deltas[1];
                let center_y = f32::from(s.read::<i16>()?) + deltas[2];

                painter.push_transform(Transform::new_translate(center_x, center_y));
                painter.push_transform(Transform::new_scale(scale, scale));
                painter.push_transform(Transform::new_translate(-center_x, -center_y));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
                painter.pop_transform();
                painter.pop_transform();
            }
            24 => {
                // PaintRotate
                let paint_offset = s.read::<Offset24>()?;
                let angle = s.read::<F2Dot14>()?.to_f32();

                painter.push_transform(Transform::new_rotate(angle));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    #[cfg(feature = "variable-fonts")]
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
            }
            #[cfg(feature = "variable-fonts")]
            25 => {
                // PaintVarRotate
                let paint_offset = s.read::<Offset24>()?;

                let mut var_s = s.clone();
                var_s.advance(2);
                let var_index_base = var_s.read::<u32>()?;

                let deltas = self
                    .variation_data()
                    .read_deltas::<1>(var_index_base, coords);

                let angle = s.read::<F2Dot14>()?.apply_float_delta(deltas[0]);

                painter.push_transform(Transform::new_rotate(angle));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
            }
            26 => {
                // PaintRotateAroundCenter
                let paint_offset = s.read::<Offset24>()?;
                let angle = s.read::<F2Dot14>()?.to_f32();
                let center_x = f32::from(s.read::<i16>()?);
                let center_y = f32::from(s.read::<i16>()?);

                painter.push_transform(Transform::new_translate(center_x, center_y));
                painter.push_transform(Transform::new_rotate(angle));
                painter.push_transform(Transform::new_translate(-center_x, -center_y));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    #[cfg(feature = "variable-fonts")]
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
                painter.pop_transform();
                painter.pop_transform();
            }
            #[cfg(feature = "variable-fonts")]
            27 => {
                // PaintVarRotateAroundCenter
                let paint_offset = s.read::<Offset24>()?;

                let mut var_s = s.clone();
                var_s.advance(6);
                let var_index_base = var_s.read::<u32>()?;

                let deltas = self
                    .variation_data()
                    .read_deltas::<3>(var_index_base, coords);

                let angle = s.read::<F2Dot14>()?.apply_float_delta(deltas[0]);
                let center_x = f32::from(s.read::<i16>()?) + deltas[1];
                let center_y = f32::from(s.read::<i16>()?) + deltas[2];

                painter.push_transform(Transform::new_translate(center_x, center_y));
                painter.push_transform(Transform::new_rotate(angle));
                painter.push_transform(Transform::new_translate(-center_x, -center_y));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
                painter.pop_transform();
                painter.pop_transform();
            }
            28 => {
                // PaintSkew
                let paint_offset = s.read::<Offset24>()?;
                let skew_x = s.read::<F2Dot14>()?.to_f32();
                let skew_y = s.read::<F2Dot14>()?.to_f32();

                painter.push_transform(Transform::new_skew(skew_x, skew_y));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    #[cfg(feature = "variable-fonts")]
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
            }
            #[cfg(feature = "variable-fonts")]
            29 => {
                // PaintVarSkew
                let paint_offset = s.read::<Offset24>()?;

                let mut var_s = s.clone();
                var_s.advance(4);
                let var_index_base = var_s.read::<u32>()?;

                let deltas = self
                    .variation_data()
                    .read_deltas::<2>(var_index_base, coords);

                let skew_x = s.read::<F2Dot14>()?.apply_float_delta(deltas[0]);
                let skew_y = s.read::<F2Dot14>()?.apply_float_delta(deltas[1]);

                painter.push_transform(Transform::new_skew(skew_x, skew_y));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
            }
            30 => {
                // PaintSkewAroundCenter
                let paint_offset = s.read::<Offset24>()?;
                let skew_x = s.read::<F2Dot14>()?.to_f32();
                let skew_y = s.read::<F2Dot14>()?.to_f32();
                let center_x = f32::from(s.read::<i16>()?);
                let center_y = f32::from(s.read::<i16>()?);

                painter.push_transform(Transform::new_translate(center_x, center_y));
                painter.push_transform(Transform::new_skew(skew_x, skew_y));
                painter.push_transform(Transform::new_translate(-center_x, -center_y));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    #[cfg(feature = "variable-fonts")]
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
                painter.pop_transform();
                painter.pop_transform();
            }
            #[cfg(feature = "variable-fonts")]
            31 => {
                // PaintVarSkewAroundCenter
                let paint_offset = s.read::<Offset24>()?;

                let mut var_s = s.clone();
                var_s.advance(8);
                let var_index_base = var_s.read::<u32>()?;

                let deltas = self
                    .variation_data()
                    .read_deltas::<4>(var_index_base, coords);

                let skew_x = s.read::<F2Dot14>()?.apply_float_delta(deltas[0]);
                let skew_y = s.read::<F2Dot14>()?.apply_float_delta(deltas[1]);
                let center_x = f32::from(s.read::<i16>()?) + deltas[2];
                let center_y = f32::from(s.read::<i16>()?) + deltas[3];

                painter.push_transform(Transform::new_translate(center_x, center_y));
                painter.push_transform(Transform::new_skew(skew_x, skew_y));
                painter.push_transform(Transform::new_translate(-center_x, -center_y));
                self.parse_paint(
                    offset + paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    coords,
                    foreground_color,
                );
                painter.pop_transform();
                painter.pop_transform();
                painter.pop_transform();
            }
            32 => {
                // PaintComposite
                let source_paint_offset = s.read::<Offset24>()?;
                let composite_mode = s.read::<CompositeMode>()?;
                let backdrop_paint_offset = s.read::<Offset24>()?;

                painter.push_layer(CompositeMode::SourceOver);
                self.parse_paint(
                    offset + backdrop_paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    #[cfg(feature = "variable-fonts")]
                    coords,
                    foreground_color,
                );
                painter.push_layer(composite_mode);
                self.parse_paint(
                    offset + source_paint_offset.to_usize(),
                    palette,
                    painter,
                    recursion_stack,
                    #[cfg(feature = "variable-fonts")]
                    coords,
                    foreground_color,
                );
                painter.pop_layer();
                painter.pop_layer();
            }
            _ => {}
        }

        Some(())
    }

    fn parse_color_line(
        &self,
        offset: usize,
        foreground_color: RgbaColor,
    ) -> Option<NonVarColorLine<'a>> {
        let mut s = Stream::new_at(self.data, offset)?;
        let extend = s.read::<GradientExtend>()?;
        let count = s.read::<u16>()?;
        let colors = s.read_array16::<ColorStopRaw>(count)?;
        Some(NonVarColorLine {
            extend,
            colors,
            foreground_color,
            palettes: self.palettes,
        })
    }

    #[cfg(feature = "variable-fonts")]
    fn parse_var_color_line(
        &self,
        offset: usize,
        foreground_color: RgbaColor,
    ) -> Option<VarColorLine<'a>> {
        let mut s = Stream::new_at(self.data, offset)?;
        let extend = s.read::<GradientExtend>()?;
        let count = s.read::<u16>()?;
        let colors = s.read_array16::<VarColorStopRaw>(count)?;
        Some(VarColorLine {
            extend,
            colors,
            foreground_color,
            palettes: self.palettes,
        })
    }
}

struct RecursionStack {
    // The limit of 64 is chosen arbitrarily and not from the spec. But we have to stop somewhere...
    stack: [usize; 64],
    len: usize,
}

impl RecursionStack {
    #[inline]
    pub fn is_empty(&self) -> bool {
        self.len == 0
    }

    #[inline]
    pub fn push(&mut self, offset: usize) -> Result<(), ()> {
        if self.len == self.stack.len() {
            Err(())
        } else {
            self.stack[self.len] = offset;
            self.len += 1;
            Ok(())
        }
    }

    #[inline]
    pub fn contains(&self, offset: usize) -> bool {
        if let Some(offsets) = self.stack.get(..self.len) {
            return offsets.contains(&offset);
        }

        false
    }

    #[inline]
    pub fn pop(&mut self) {
        debug_assert!(!self.is_empty());
        self.len -= 1;
    }
}

#[cfg(feature = "variable-fonts")]
#[derive(Clone, Copy, Debug, Default)]
struct VariationData<'a> {
    variation_store: Option<ItemVariationStore<'a>>,
    delta_map: Option<DeltaSetIndexMap<'a>>,
}

#[cfg(feature = "variable-fonts")]
impl VariationData<'_> {
    // Inspired from `fontations`.
    fn read_deltas<const N: usize>(
        &self,
        var_index_base: u32,
        coordinates: &[NormalizedCoordinate],
    ) -> [f32; N] {
        const NO_VARIATION_DELTAS: u32 = 0xFFFFFFFF;

        let mut deltas = [0.0; N];

        if coordinates.is_empty()
            || self.variation_store.is_none()
            || var_index_base == NO_VARIATION_DELTAS
        {
            return deltas;
        }

        let variation_store = self.variation_store.as_ref().unwrap();

        for (i, delta) in deltas.iter_mut().enumerate() {
            *delta = self
                .delta_map
                .and_then(|d| d.map(var_index_base + i as u32))
                .and_then(|d| variation_store.parse_delta(d.0, d.1, coordinates))
                .unwrap_or(0.0);
        }

        deltas
    }
}
ttf-parser-0.24.1/src/tables/cpal.rs000064400000000000000000000045671046102023000153410ustar 00000000000000
//! A [Color Palette Table](
//! https://docs.microsoft.com/en-us/typography/opentype/spec/cpal) implementation.

use core::num::NonZeroU16;

use crate::parser::{FromData, LazyArray16, Offset, Offset32, Stream};
use crate::RgbaColor;

/// A [Color Palette Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/cpal).
#[derive(Clone, Copy, Debug)]
pub struct Table<'a> {
    color_indices: LazyArray16<'a, u16>,
    colors: LazyArray16<'a, BgraColor>,
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let version = s.read::<u16>()?;
        if version > 1 {
            return None;
        }

        s.skip::<u16>(); // number of palette entries
        let num_palettes = s.read::<u16>()?;
        if num_palettes == 0 {
            return None; // zero palettes is an error
        }

        let num_colors = s.read::<u16>()?;
        let color_records_offset = s.read::<Offset32>()?;
        let color_indices = s.read_array16::<u16>(num_palettes)?;

        let colors = Stream::new_at(data, color_records_offset.to_usize())?
            .read_array16::<BgraColor>(num_colors)?;

        Some(Self {
            color_indices,
            colors,
        })
    }

    /// Returns the number of palettes.
    pub fn palettes(&self) -> NonZeroU16 {
        // Already checked during parsing.
        NonZeroU16::new(self.color_indices.len()).unwrap()
    }

    /// Returns the color at the given index into the given palette.
    pub fn get(&self, palette_index: u16, palette_entry: u16) -> Option<RgbaColor> {
        let index = self
            .color_indices
            .get(palette_index)?
            .checked_add(palette_entry)?;
        self.colors.get(index).map(|c| c.to_rgba())
    }
}

#[derive(Clone, Copy, PartialEq, Eq, Debug)]
struct BgraColor {
    blue: u8,
    green: u8,
    red: u8,
    alpha: u8,
}

impl BgraColor {
    #[inline]
    fn to_rgba(self) -> RgbaColor {
        RgbaColor::new(self.red, self.green, self.blue, self.alpha)
    }
}

impl FromData for BgraColor {
    const SIZE: usize = 4;

    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(Self {
            blue: s.read::<u8>()?,
            green: s.read::<u8>()?,
            red: s.read::<u8>()?,
            alpha: s.read::<u8>()?,
        })
    }
}
ttf-parser-0.24.1/src/tables/feat.rs000064400000000000000000000117731046102023000153360ustar 00000000000000
//! A [Feature Name Table](
//! https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6feat.html) implementation.

use crate::parser::{FromData, LazyArray16, Offset, Offset32, Stream};

#[derive(Clone, Copy, Debug)]
struct FeatureNameRecord {
    feature: u16,
    setting_table_records_count: u16,
    // Offset from the beginning of the table.
    setting_table_offset: Offset32,
    flags: u8,
    default_setting_index: u8,
    name_index: u16,
}

impl FromData for FeatureNameRecord {
    const SIZE: usize = 12;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(FeatureNameRecord {
            feature: s.read::<u16>()?,
            setting_table_records_count: s.read::<u16>()?,
            setting_table_offset: s.read::<Offset32>()?,
            flags: s.read::<u8>()?,
            default_setting_index: s.read::<u8>()?,
            name_index: s.read::<u16>()?,
        })
    }
}

/// A setting name.
#[derive(Clone, Copy, Debug)]
pub struct SettingName {
    /// The setting.
    pub setting: u16,
    /// The `name` table index for the feature's name in a 256..32768 range.
    pub name_index: u16,
}

impl FromData for SettingName {
    const SIZE: usize = 4;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(SettingName {
            setting: s.read::<u16>()?,
            name_index: s.read::<u16>()?,
        })
    }
}

/// A feature name.
#[derive(Clone, Copy, Debug)]
pub struct FeatureName<'a> {
    /// The feature's ID.
    pub feature: u16,
    /// The feature's setting names.
    pub setting_names: LazyArray16<'a, SettingName>,
    /// The index of the default setting in `setting_names`.
    pub default_setting_index: u8,
    /// If set, the feature's settings are mutually exclusive.
    pub exclusive: bool,
    /// The `name` table index for the feature's name in a 256..32768 range.
    pub name_index: u16,
}

/// A list of feature names.
#[derive(Clone, Copy)]
pub struct FeatureNames<'a> {
    data: &'a [u8],
    records: LazyArray16<'a, FeatureNameRecord>,
}

impl<'a> FeatureNames<'a> {
    /// Returns a feature name at an index.
pub fn get(&self, index: u16) -> Option> { let record = self.records.get(index)?; let data = self.data.get(record.setting_table_offset.to_usize()..)?; let mut s = Stream::new(data); let setting_names = s.read_array16::(record.setting_table_records_count)?; Some(FeatureName { feature: record.feature, setting_names, default_setting_index: if record.flags & 0x40 != 0 { record.default_setting_index } else { 0 }, exclusive: record.flags & 0x80 != 0, name_index: record.name_index, }) } /// Finds a feature name by ID. pub fn find(&self, feature: u16) -> Option> { let index = self .records .binary_search_by(|name| name.feature.cmp(&feature)) .map(|(i, _)| i)?; self.get(index) } /// Returns the number of feature names. pub fn len(&self) -> u16 { self.records.len() } /// Checks if there are any feature names. pub fn is_empty(&self) -> bool { self.records.is_empty() } } impl<'a> core::fmt::Debug for FeatureNames<'a> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { f.debug_list().entries(*self).finish() } } impl<'a> IntoIterator for FeatureNames<'a> { type Item = FeatureName<'a>; type IntoIter = FeatureNamesIter<'a>; #[inline] fn into_iter(self) -> Self::IntoIter { FeatureNamesIter { names: self, index: 0, } } } /// An iterator over [`FeatureNames`]. #[allow(missing_debug_implementations)] pub struct FeatureNamesIter<'a> { names: FeatureNames<'a>, index: u16, } impl<'a> Iterator for FeatureNamesIter<'a> { type Item = FeatureName<'a>; fn next(&mut self) -> Option { if self.index < self.names.len() { self.index += 1; self.names.get(self.index - 1) } else { None } } } /// A [Feature Name Table]( /// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6feat.html). #[derive(Clone, Copy, Debug)] pub struct Table<'a> { /// A list of feature names. Sorted by `FeatureName.feature`. pub names: FeatureNames<'a>, } impl<'a> Table<'a> { /// Parses a table from raw data. 
pub fn parse(data: &'a [u8]) -> Option { let mut s = Stream::new(data); let version = s.read::()?; if version != 0x00010000 { return None; } let count = s.read::()?; s.advance_checked(6)?; // reserved let records = s.read_array16::(count)?; Some(Table { names: FeatureNames { data, records }, }) } } ttf-parser-0.24.1/src/tables/fvar.rs000064400000000000000000000057441046102023000153560ustar 00000000000000//! A [Font Variations Table]( //! https://docs.microsoft.com/en-us/typography/opentype/spec/fvar) implementation. use core::num::NonZeroU16; use crate::parser::{f32_bound, Fixed, FromData, LazyArray16, Offset, Offset16, Stream}; use crate::{NormalizedCoordinate, Tag}; /// A [variation axis](https://docs.microsoft.com/en-us/typography/opentype/spec/fvar#variationaxisrecord). #[repr(C)] #[allow(missing_docs)] #[derive(Clone, Copy, PartialEq, Debug)] pub struct VariationAxis { pub tag: Tag, pub min_value: f32, pub def_value: f32, pub max_value: f32, /// An axis name in the `name` table. pub name_id: u16, pub hidden: bool, } impl FromData for VariationAxis { const SIZE: usize = 20; fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); let tag = s.read::()?; let min_value = s.read::()?; let def_value = s.read::()?; let max_value = s.read::()?; let flags = s.read::()?; let name_id = s.read::()?; Some(VariationAxis { tag, min_value: def_value.0.min(min_value.0), def_value: def_value.0, max_value: def_value.0.max(max_value.0), name_id, hidden: (flags >> 3) & 1 == 1, }) } } impl VariationAxis { /// Returns a normalized variation coordinate for this axis. 
    pub(crate) fn normalized_value(&self, mut v: f32) -> NormalizedCoordinate {
        // Based on
        // https://docs.microsoft.com/en-us/typography/opentype/spec/avar#overview
        v = f32_bound(self.min_value, v, self.max_value);
        if v == self.def_value {
            v = 0.0;
        } else if v < self.def_value {
            v = (v - self.def_value) / (self.def_value - self.min_value);
        } else {
            v = (v - self.def_value) / (self.max_value - self.def_value);
        }

        NormalizedCoordinate::from(v)
    }
}

/// A [Font Variations Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/fvar).
#[derive(Clone, Copy, Debug)]
pub struct Table<'a> {
    /// A list of variation axes.
    pub axes: LazyArray16<'a, VariationAxis>,
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let version = s.read::<u32>()?;
        if version != 0x00010000 {
            return None;
        }

        let axes_array_offset = s.read::<Offset16>()?;
        s.skip::<u16>(); // reserved
        let axis_count = s.read::<u16>()?;

        // 'If axisCount is zero, then the font is not functional as a variable font,
        // and must be treated as a non-variable font;
        // any variation-specific tables or data is ignored.'
        let axis_count = NonZeroU16::new(axis_count)?;

        let mut s = Stream::new_at(data, axes_array_offset.to_usize())?;
        let axes = s.read_array16::<VariationAxis>(axis_count.get())?;

        Some(Table { axes })
    }
}

ttf-parser-0.24.1/src/tables/gdef.rs

//! A [Glyph Definition Table](
//! https://docs.microsoft.com/en-us/typography/opentype/spec/gdef) implementation.

use crate::opentype_layout::{Class, ClassDefinition, Coverage};
use crate::parser::{FromSlice, LazyArray16, Offset, Offset16, Offset32, Stream};
use crate::GlyphId;

#[cfg(feature = "variable-fonts")]
use crate::var_store::ItemVariationStore;
#[cfg(feature = "variable-fonts")]
use crate::NormalizedCoordinate;

/// A [glyph class](https://docs.microsoft.com/en-us/typography/opentype/spec/gdef#glyph-class-definition-table).
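// Illustrative sketch (not part of the crate): the fvar axis normalization above,
// simplified to plain f32 without the `NormalizedCoordinate` fixed-point wrapper.
// All names here are made up for the example.

```rust
// Maps a user value into [-1.0, 1.0] relative to the axis default,
// clamping out-of-range input first, as the fvar/avar spec describes.
fn normalize(min: f32, def: f32, max: f32, v: f32) -> f32 {
    let v = v.max(min).min(max); // clamp into the axis range
    if v == def {
        0.0
    } else if v < def {
        (v - def) / (def - min)
    } else {
        (v - def) / (max - def)
    }
}

fn main() {
    // A typical `wght` axis: 100..=900 with a default of 400.
    assert_eq!(normalize(100.0, 400.0, 900.0, 400.0), 0.0);
    assert_eq!(normalize(100.0, 400.0, 900.0, 100.0), -1.0);
    assert_eq!(normalize(100.0, 400.0, 900.0, 900.0), 1.0);
    assert_eq!(normalize(100.0, 400.0, 900.0, 650.0), 0.5);
    // Out-of-range values are clamped before normalizing.
    assert_eq!(normalize(100.0, 400.0, 900.0, 1000.0), 1.0);
}
```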
#[allow(missing_docs)] #[derive(Clone, Copy, Eq, PartialEq, Ord, PartialOrd, Debug, Hash)] pub enum GlyphClass { Base = 1, Ligature = 2, Mark = 3, Component = 4, } /// A [Glyph Definition Table](https://docs.microsoft.com/en-us/typography/opentype/spec/gdef). #[allow(missing_debug_implementations)] #[derive(Clone, Copy, Default)] pub struct Table<'a> { glyph_classes: Option>, mark_attach_classes: Option>, mark_glyph_coverage_offsets: Option<(&'a [u8], LazyArray16<'a, Offset32>)>, #[cfg(feature = "variable-fonts")] variation_store: Option>, } impl<'a> Table<'a> { /// Parses a table from raw data. pub fn parse(data: &'a [u8]) -> Option { let mut s = Stream::new(data); let version = s.read::()?; if !(version == 0x00010000 || version == 0x00010002 || version == 0x00010003) { return None; } let glyph_class_def_offset = s.read::>()?; s.skip::(); // attachListOffset s.skip::(); // ligCaretListOffset let mark_attach_class_def_offset = s.read::>()?; let mut mark_glyph_sets_def_offset: Option = None; if version > 0x00010000 { mark_glyph_sets_def_offset = s.read::>()?; } #[allow(unused_mut)] #[allow(unused_variables)] let mut var_store_offset: Option = None; #[cfg(feature = "variable-fonts")] { if version > 0x00010002 { var_store_offset = s.read::>()?; } } let mut table = Table::default(); if let Some(offset) = glyph_class_def_offset { if let Some(subdata) = data.get(offset.to_usize()..) { table.glyph_classes = ClassDefinition::parse(subdata); } } if let Some(offset) = mark_attach_class_def_offset { if let Some(subdata) = data.get(offset.to_usize()..) { table.mark_attach_classes = ClassDefinition::parse(subdata); } } if let Some(offset) = mark_glyph_sets_def_offset { if let Some(subdata) = data.get(offset.to_usize()..) 
{ let mut s = Stream::new(subdata); let format = s.read::()?; if format == 1 { if let Some(count) = s.read::() { if let Some(array) = s.read_array16::(count) { table.mark_glyph_coverage_offsets = Some((subdata, array)); } } } } } #[cfg(feature = "variable-fonts")] { if let Some(offset) = var_store_offset { if let Some(subdata) = data.get(offset.to_usize()..) { let s = Stream::new(subdata); table.variation_store = ItemVariationStore::parse(s); } } } Some(table) } /// Checks that face has /// [Glyph Class Definition Table]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/gdef#glyph-class-definition-table). #[inline] pub fn has_glyph_classes(&self) -> bool { self.glyph_classes.is_some() } /// Returns glyph's class according to /// [Glyph Class Definition Table]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/gdef#glyph-class-definition-table). /// /// Returns `None` when *Glyph Class Definition Table* is not set /// or glyph class is not set or invalid. #[inline] pub fn glyph_class(&self, glyph_id: GlyphId) -> Option { match self.glyph_classes?.get(glyph_id) { 1 => Some(GlyphClass::Base), 2 => Some(GlyphClass::Ligature), 3 => Some(GlyphClass::Mark), 4 => Some(GlyphClass::Component), _ => None, } } /// Returns glyph's mark attachment class according to /// [Mark Attachment Class Definition Table]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/gdef#mark-attachment-class-definition-table). /// /// All glyphs not assigned to a class fall into Class 0. #[inline] pub fn glyph_mark_attachment_class(&self, glyph_id: GlyphId) -> Class { self.mark_attach_classes .map(|def| def.get(glyph_id)) .unwrap_or(0) } /// Checks that glyph is a mark according to /// [Mark Glyph Sets Table]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/gdef#mark-glyph-sets-table). /// /// `set_index` allows checking a specific glyph coverage set. /// Otherwise all sets will be checked. 
#[inline] pub fn is_mark_glyph(&self, glyph_id: GlyphId, set_index: Option) -> bool { is_mark_glyph_impl(self, glyph_id, set_index).is_some() } /// Returns glyph's variation delta at a specified index according to /// [Item Variation Store Table]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/gdef#item-variation-store-table). #[cfg(feature = "variable-fonts")] #[inline] pub fn glyph_variation_delta( &self, outer_index: u16, inner_index: u16, coordinates: &[NormalizedCoordinate], ) -> Option { self.variation_store .and_then(|store| store.parse_delta(outer_index, inner_index, coordinates)) } } #[inline(never)] fn is_mark_glyph_impl(table: &Table, glyph_id: GlyphId, set_index: Option) -> Option<()> { let (data, offsets) = table.mark_glyph_coverage_offsets?; if let Some(set_index) = set_index { if let Some(offset) = offsets.get(set_index) { let table = Coverage::parse(data.get(offset.to_usize()..)?)?; if table.contains(glyph_id) { return Some(()); } } } else { for offset in offsets { let table = Coverage::parse(data.get(offset.to_usize()..)?)?; if table.contains(glyph_id) { return Some(()); } } } None } ttf-parser-0.24.1/src/tables/glyf.rs000064400000000000000000000501741046102023000153560ustar 00000000000000//! A [Glyph Data Table]( //! https://docs.microsoft.com/en-us/typography/opentype/spec/glyf) implementation. use core::num::NonZeroU16; use crate::parser::{LazyArray16, NumFrom, Stream, F2DOT14}; use crate::{loca, GlyphId, OutlineBuilder, Rect, RectF, Transform}; pub(crate) struct Builder<'a> { pub builder: &'a mut dyn OutlineBuilder, pub transform: Transform, is_default_ts: bool, // `bool` is faster than `Option` or `is_default`. // We have to always calculate the bbox, because `gvar` doesn't store one // and in case of a malformed bbox in `glyf`. 
pub bbox: RectF, first_on_curve: Option, first_off_curve: Option, last_off_curve: Option, } impl<'a> Builder<'a> { #[inline] pub fn new(transform: Transform, bbox: RectF, builder: &'a mut dyn OutlineBuilder) -> Self { Builder { builder, transform, is_default_ts: transform.is_default(), bbox, first_on_curve: None, first_off_curve: None, last_off_curve: None, } } #[inline] fn move_to(&mut self, mut x: f32, mut y: f32) { if !self.is_default_ts { self.transform.apply_to(&mut x, &mut y); } self.bbox.extend_by(x, y); self.builder.move_to(x, y); } #[inline] fn line_to(&mut self, mut x: f32, mut y: f32) { if !self.is_default_ts { self.transform.apply_to(&mut x, &mut y); } self.bbox.extend_by(x, y); self.builder.line_to(x, y); } #[inline] fn quad_to(&mut self, mut x1: f32, mut y1: f32, mut x: f32, mut y: f32) { if !self.is_default_ts { self.transform.apply_to(&mut x1, &mut y1); self.transform.apply_to(&mut x, &mut y); } self.bbox.extend_by(x1, y1); self.bbox.extend_by(x, y); self.builder.quad_to(x1, y1, x, y); } // Useful links: // // - https://developer.apple.com/fonts/TrueType-Reference-Manual/RM01/Chap1.html // - https://stackoverflow.com/a/20772557 #[inline] pub fn push_point(&mut self, x: f32, y: f32, on_curve_point: bool, last_point: bool) { let p = Point { x, y }; if self.first_on_curve.is_none() { if on_curve_point { self.first_on_curve = Some(p); self.move_to(p.x, p.y); } else { if let Some(offcurve) = self.first_off_curve { let mid = offcurve.lerp(p, 0.5); self.first_on_curve = Some(mid); self.last_off_curve = Some(p); self.move_to(mid.x, mid.y); } else { self.first_off_curve = Some(p); } } } else { match (self.last_off_curve, on_curve_point) { (Some(offcurve), true) => { self.last_off_curve = None; self.quad_to(offcurve.x, offcurve.y, p.x, p.y); } (Some(offcurve), false) => { self.last_off_curve = Some(p); let mid = offcurve.lerp(p, 0.5); self.quad_to(offcurve.x, offcurve.y, mid.x, mid.y); } (None, true) => { self.line_to(p.x, p.y); } (None, false) => { 
self.last_off_curve = Some(p); } } } if last_point { self.finish_contour(); } } #[inline] fn finish_contour(&mut self) { if let (Some(offcurve1), Some(offcurve2)) = (self.first_off_curve, self.last_off_curve) { self.last_off_curve = None; let mid = offcurve2.lerp(offcurve1, 0.5); self.quad_to(offcurve2.x, offcurve2.y, mid.x, mid.y); } if let (Some(p), Some(offcurve1)) = (self.first_on_curve, self.first_off_curve) { self.quad_to(offcurve1.x, offcurve1.y, p.x, p.y); } else if let (Some(p), Some(offcurve2)) = (self.first_on_curve, self.last_off_curve) { self.quad_to(offcurve2.x, offcurve2.y, p.x, p.y); } else if let Some(p) = self.first_on_curve { self.line_to(p.x, p.y); } self.first_on_curve = None; self.first_off_curve = None; self.last_off_curve = None; self.builder.close(); } } #[derive(Clone, Copy, Debug)] pub(crate) struct CompositeGlyphInfo { pub glyph_id: GlyphId, pub transform: Transform, #[allow(dead_code)] pub flags: CompositeGlyphFlags, } #[derive(Clone)] pub(crate) struct CompositeGlyphIter<'a> { stream: Stream<'a>, } impl<'a> CompositeGlyphIter<'a> { #[inline] pub fn new(data: &'a [u8]) -> Self { CompositeGlyphIter { stream: Stream::new(data), } } } impl<'a> Iterator for CompositeGlyphIter<'a> { type Item = CompositeGlyphInfo; #[inline] fn next(&mut self) -> Option { let flags = CompositeGlyphFlags(self.stream.read::()?); let glyph_id = self.stream.read::()?; let mut ts = Transform::default(); if flags.args_are_xy_values() { if flags.arg_1_and_2_are_words() { ts.e = f32::from(self.stream.read::()?); ts.f = f32::from(self.stream.read::()?); } else { ts.e = f32::from(self.stream.read::()?); ts.f = f32::from(self.stream.read::()?); } } if flags.we_have_a_two_by_two() { ts.a = self.stream.read::()?.to_f32(); ts.b = self.stream.read::()?.to_f32(); ts.c = self.stream.read::()?.to_f32(); ts.d = self.stream.read::()?.to_f32(); } else if flags.we_have_an_x_and_y_scale() { ts.a = self.stream.read::()?.to_f32(); ts.d = self.stream.read::()?.to_f32(); } else if 
flags.we_have_a_scale() { ts.a = self.stream.read::()?.to_f32(); ts.d = ts.a; } if !flags.more_components() { // Finish the iterator even if stream still has some data. self.stream.jump_to_end(); } Some(CompositeGlyphInfo { glyph_id, transform: ts, flags, }) } } // Due to some optimization magic, using f32 instead of i16 // makes the code ~10% slower. At least on my machine. // I guess it's due to the fact that with i16 the struct // fits into the machine word. #[derive(Clone, Copy, Debug)] pub(crate) struct GlyphPoint { pub x: i16, pub y: i16, /// Indicates that a point is a point on curve /// and not a control point. pub on_curve_point: bool, pub last_point: bool, } #[derive(Clone, Default)] pub(crate) struct GlyphPointsIter<'a> { endpoints: EndpointsIter<'a>, flags: FlagsIter<'a>, x_coords: CoordsIter<'a>, y_coords: CoordsIter<'a>, pub points_left: u16, // Number of points left in the glyph. } #[cfg(feature = "variable-fonts")] impl GlyphPointsIter<'_> { #[inline] pub fn current_contour(&self) -> u16 { self.endpoints.index - 1 } } impl<'a> Iterator for GlyphPointsIter<'a> { type Item = GlyphPoint; #[inline] fn next(&mut self) -> Option { self.points_left = self.points_left.checked_sub(1)?; // TODO: skip empty contours let last_point = self.endpoints.next(); let flags = self.flags.next()?; Some(GlyphPoint { x: self .x_coords .next(flags.x_short(), flags.x_is_same_or_positive_short()), y: self .y_coords .next(flags.y_short(), flags.y_is_same_or_positive_short()), on_curve_point: flags.on_curve_point(), last_point, }) } } /// A simple flattening iterator for glyph's endpoints. /// /// Translates endpoints like: 2 4 7 /// into flags: 0 0 1 0 1 0 0 1 #[derive(Clone, Copy, Default)] struct EndpointsIter<'a> { endpoints: LazyArray16<'a, u16>, // Each endpoint indicates a contour end. 
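// Illustrative sketch (not the crate's API): the endpoint "flattening" that
// `EndpointsIter` performs incrementally — contour endpoints like [2, 4, 7]
// become per-point last-point flags 0 0 1 0 1 0 0 1.

```rust
// Naive reference version of the flattening: a point is the last point of a
// contour exactly when its index appears in the endpoints array.
fn flatten_endpoints(endpoints: &[u16], points_total: u16) -> Vec<bool> {
    (0..points_total).map(|i| endpoints.contains(&i)).collect()
}

fn main() {
    assert_eq!(
        flatten_endpoints(&[2, 4, 7], 8),
        vec![false, false, true, false, true, false, false, true]
    );
}
```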
index: u16, left: u16, } impl<'a> EndpointsIter<'a> { #[inline] fn new(endpoints: LazyArray16<'a, u16>) -> Option { Some(EndpointsIter { endpoints, index: 1, left: endpoints.get(0)?, }) } #[inline] fn next(&mut self) -> bool { if self.left == 0 { if let Some(end) = self.endpoints.get(self.index) { let prev = self.endpoints.get(self.index - 1).unwrap_or(0); // Malformed font can have endpoints not in increasing order, // so we have to use checked_sub. self.left = end.saturating_sub(prev); self.left = self.left.saturating_sub(1); } // Always advance the index, so we can check the current contour number. if let Some(n) = self.index.checked_add(1) { self.index = n; } true } else { self.left -= 1; false } } } #[derive(Clone, Default)] struct FlagsIter<'a> { stream: Stream<'a>, // Number of times the `flags` should be used // before reading the next one from `stream`. repeats: u8, flags: SimpleGlyphFlags, } impl<'a> FlagsIter<'a> { #[inline] fn new(data: &'a [u8]) -> Self { FlagsIter { stream: Stream::new(data), repeats: 0, flags: SimpleGlyphFlags(0), } } } impl<'a> Iterator for FlagsIter<'a> { type Item = SimpleGlyphFlags; #[inline] fn next(&mut self) -> Option { if self.repeats == 0 { self.flags = SimpleGlyphFlags(self.stream.read::().unwrap_or(0)); if self.flags.repeat_flag() { self.repeats = self.stream.read::().unwrap_or(0); } } else { self.repeats -= 1; } Some(self.flags) } } #[derive(Clone, Default)] struct CoordsIter<'a> { stream: Stream<'a>, prev: i16, // Points are stored as deltas, so we have to keep the previous one. } impl<'a> CoordsIter<'a> { #[inline] fn new(data: &'a [u8]) -> Self { CoordsIter { stream: Stream::new(data), prev: 0, } } #[inline] fn next(&mut self, is_short: bool, is_same_or_short: bool) -> i16 { // See https://docs.microsoft.com/en-us/typography/opentype/spec/glyf#simple-glyph-description // for details about Simple Glyph Flags processing. // We've already checked the coords data, so it's safe to fallback to 0. 
let mut n = 0; if is_short { n = i16::from(self.stream.read::().unwrap_or(0)); if !is_same_or_short { n = -n; } } else if !is_same_or_short { n = self.stream.read::().unwrap_or(0); } self.prev = self.prev.wrapping_add(n); self.prev } } #[derive(Clone, Copy, Debug)] struct Point { x: f32, y: f32, } impl Point { #[inline] fn lerp(self, other: Point, t: f32) -> Point { Point { x: self.x + t * (other.x - self.x), y: self.y + t * (other.y - self.y), } } } // https://docs.microsoft.com/en-us/typography/opentype/spec/glyf#simple-glyph-description #[derive(Clone, Copy, Default)] struct SimpleGlyphFlags(u8); #[rustfmt::skip] impl SimpleGlyphFlags { #[inline] fn on_curve_point(self) -> bool { self.0 & 0x01 != 0 } #[inline] fn x_short(self) -> bool { self.0 & 0x02 != 0 } #[inline] fn y_short(self) -> bool { self.0 & 0x04 != 0 } #[inline] fn repeat_flag(self) -> bool { self.0 & 0x08 != 0 } #[inline] fn x_is_same_or_positive_short(self) -> bool { self.0 & 0x10 != 0 } #[inline] fn y_is_same_or_positive_short(self) -> bool { self.0 & 0x20 != 0 } } // https://docs.microsoft.com/en-us/typography/opentype/spec/glyf#composite-glyph-description #[derive(Clone, Copy, Debug)] pub(crate) struct CompositeGlyphFlags(u16); #[rustfmt::skip] impl CompositeGlyphFlags { #[inline] pub fn arg_1_and_2_are_words(self) -> bool { self.0 & 0x0001 != 0 } #[inline] pub fn args_are_xy_values(self) -> bool { self.0 & 0x0002 != 0 } #[inline] pub fn we_have_a_scale(self) -> bool { self.0 & 0x0008 != 0 } #[inline] pub fn more_components(self) -> bool { self.0 & 0x0020 != 0 } #[inline] pub fn we_have_an_x_and_y_scale(self) -> bool { self.0 & 0x0040 != 0 } #[inline] pub fn we_have_a_two_by_two(self) -> bool { self.0 & 0x0080 != 0 } } // It's not defined in the spec, so we are using our own value. 
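// Illustrative sketch (not part of the crate): `Point::lerp` above is used with
// t = 0.5 to reconstruct TrueType's implied on-curve points — between two
// consecutive off-curve control points there is an implicit on-curve point at
// their midpoint.

```rust
#[derive(Clone, Copy, Debug, PartialEq)]
struct Point {
    x: f32,
    y: f32,
}

impl Point {
    // Linear interpolation; t = 0.5 yields the midpoint.
    fn lerp(self, other: Point, t: f32) -> Point {
        Point {
            x: self.x + t * (other.x - self.x),
            y: self.y + t * (other.y - self.y),
        }
    }
}

fn main() {
    // Two consecutive off-curve control points...
    let c1 = Point { x: 0.0, y: 0.0 };
    let c2 = Point { x: 10.0, y: 4.0 };
    // ...imply an on-curve point at their midpoint.
    assert_eq!(c1.lerp(c2, 0.5), Point { x: 5.0, y: 2.0 });
}
```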
pub(crate) const MAX_COMPONENTS: u8 = 32; #[allow(clippy::comparison_chain)] #[inline] fn outline_impl( loca_table: loca::Table, glyf_table: &[u8], data: &[u8], depth: u8, builder: &mut Builder, ) -> Option> { if depth >= MAX_COMPONENTS { return None; } let mut s = Stream::new(data); let number_of_contours = s.read::()?; s.advance(8); // Skip bbox. We use calculated one. if number_of_contours > 0 { // Simple glyph. // u16 casting is safe, since we already checked that the value is positive. let number_of_contours = NonZeroU16::new(number_of_contours as u16)?; for point in parse_simple_outline(s.tail()?, number_of_contours)? { builder.push_point( f32::from(point.x), f32::from(point.y), point.on_curve_point, point.last_point, ); } } else if number_of_contours < 0 { // Composite glyph. for comp in CompositeGlyphIter::new(s.tail()?) { if let Some(range) = loca_table.glyph_range(comp.glyph_id) { if let Some(glyph_data) = glyf_table.get(range) { let transform = Transform::combine(builder.transform, comp.transform); let mut b = Builder::new(transform, builder.bbox, builder.builder); outline_impl(loca_table, glyf_table, glyph_data, depth + 1, &mut b)?; // Take updated bbox. builder.bbox = b.bbox; } } } } if builder.bbox.is_default() { return Some(None); } Some(builder.bbox.to_rect()) } #[inline] pub(crate) fn parse_simple_outline( glyph_data: &[u8], number_of_contours: NonZeroU16, ) -> Option { let mut s = Stream::new(glyph_data); let endpoints = s.read_array16::(number_of_contours.get())?; let points_total = endpoints.last()?.checked_add(1)?; // Contours with a single point should be ignored. // But this is not an error, so we should return an "empty" iterator. if points_total == 1 { return Some(GlyphPointsIter::default()); } // Skip instructions byte code. 
let instructions_len = s.read::()?; s.advance(usize::from(instructions_len)); let flags_offset = s.offset(); let (x_coords_len, y_coords_len) = resolve_coords_len(&mut s, points_total)?; let x_coords_offset = s.offset(); let y_coords_offset = x_coords_offset + usize::num_from(x_coords_len); let y_coords_end = y_coords_offset + usize::num_from(y_coords_len); Some(GlyphPointsIter { endpoints: EndpointsIter::new(endpoints)?, flags: FlagsIter::new(glyph_data.get(flags_offset..x_coords_offset)?), x_coords: CoordsIter::new(glyph_data.get(x_coords_offset..y_coords_offset)?), y_coords: CoordsIter::new(glyph_data.get(y_coords_offset..y_coords_end)?), points_left: points_total, }) } /// Resolves coordinate arrays length. /// /// The length depends on *Simple Glyph Flags*, so we have to process them all to find it. fn resolve_coords_len(s: &mut Stream, points_total: u16) -> Option<(u32, u32)> { let mut flags_left = u32::from(points_total); let mut repeats; let mut x_coords_len = 0; let mut y_coords_len = 0; while flags_left > 0 { let flags = SimpleGlyphFlags(s.read::()?); // The number of times a glyph point repeats. repeats = if flags.repeat_flag() { let repeats = s.read::()?; u32::from(repeats) + 1 } else { 1 }; if repeats > flags_left { return None; } // No need to check for `*_coords_len` overflow since u32 is more than enough. // Non-obfuscated code below. // Branchless version is surprisingly faster. // // if flags.x_short() { // // Coordinate is 1 byte long. // x_coords_len += repeats; // } else if !flags.x_is_same_or_positive_short() { // // Coordinate is 2 bytes long. // x_coords_len += repeats * 2; // } // if flags.y_short() { // // Coordinate is 1 byte long. // y_coords_len += repeats; // } else if !flags.y_is_same_or_positive_short() { // // Coordinate is 2 bytes long. 
// y_coords_len += repeats * 2; // } x_coords_len += (flags.0 & 0x02 != 0) as u32 * repeats; x_coords_len += (flags.0 & (0x02 | 0x10) == 0) as u32 * (repeats * 2); y_coords_len += (flags.0 & 0x04 != 0) as u32 * repeats; y_coords_len += (flags.0 & (0x04 | 0x20) == 0) as u32 * (repeats * 2); flags_left -= repeats; } Some((x_coords_len, y_coords_len)) } /// A [Glyph Data Table]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/glyf). #[derive(Clone, Copy)] pub struct Table<'a> { pub(crate) data: &'a [u8], loca_table: loca::Table<'a>, } impl core::fmt::Debug for Table<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "Table {{ ... }}") } } impl<'a> Table<'a> { /// Parses a table from raw data. #[inline] pub fn parse(loca_table: loca::Table<'a>, data: &'a [u8]) -> Option { Some(Table { loca_table, data }) } /// Outlines a glyph. #[inline] pub fn outline(&self, glyph_id: GlyphId, builder: &mut dyn OutlineBuilder) -> Option { let mut b = Builder::new(Transform::default(), RectF::new(), builder); let glyph_data = self.get(glyph_id)?; outline_impl(self.loca_table, self.data, glyph_data, 0, &mut b)? } /// The bounding box of the glyph. Unlike the `outline` method, this method does not /// calculate the bounding box manually by outlining the glyph, but instead uses the /// bounding box in the `glyf` program. As a result, this method will be much faster, /// but the bounding box could be more inaccurate. #[inline] pub fn bbox(&self, glyph_id: GlyphId) -> Option { let glyph_data = self.get(glyph_id)?; let mut s = Stream::new(glyph_data); // number of contours let _ = s.read::()?; Some(Rect { x_min: s.read::()?, y_min: s.read::()?, x_max: s.read::()?, y_max: s.read::()?, }) } #[inline] pub(crate) fn get(&self, glyph_id: GlyphId) -> Option<&'a [u8]> { let range = self.loca_table.glyph_range(glyph_id)?; self.data.get(range) } /// Returns the number of points in this outline. 
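// Illustrative sketch (not the crate's API): an exhaustive check that the
// branchless flag accounting in `resolve_coords_len` above matches the
// commented-out branching version, for the X coordinate.

```rust
// Branching version: X_SHORT (0x02) means 1 byte per point; if neither short
// nor "same as previous" (0x10), it's 2 bytes; otherwise no bytes are stored.
fn x_len_branchy(flags: u8, repeats: u32) -> u32 {
    if flags & 0x02 != 0 {
        repeats
    } else if flags & 0x10 == 0 {
        repeats * 2
    } else {
        0
    }
}

// Branchless version, mirroring the arithmetic used in `resolve_coords_len`.
fn x_len_branchless(flags: u8, repeats: u32) -> u32 {
    (flags & 0x02 != 0) as u32 * repeats + (flags & (0x02 | 0x10) == 0) as u32 * (repeats * 2)
}

fn main() {
    // The two formulations agree for every flag byte and repeat count.
    for flags in 0u8..=0xFF {
        for repeats in 0..4 {
            assert_eq!(x_len_branchy(flags, repeats), x_len_branchless(flags, repeats));
        }
    }
}
```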
pub(crate) fn outline_points(&self, glyph_id: GlyphId) -> u16 { self.outline_points_impl(glyph_id).unwrap_or(0) } fn outline_points_impl(&self, glyph_id: GlyphId) -> Option { let data = self.get(glyph_id)?; let mut s = Stream::new(data); let number_of_contours = s.read::()?; // Skip bbox. s.advance(8); if number_of_contours > 0 { // Simple glyph. let number_of_contours = NonZeroU16::new(number_of_contours as u16)?; let glyph_points = parse_simple_outline(s.tail()?, number_of_contours)?; Some(glyph_points.points_left) } else if number_of_contours < 0 { // Composite glyph. let components = CompositeGlyphIter::new(s.tail()?); Some(components.clone().count() as u16) } else { // An empty glyph. None } } } ttf-parser-0.24.1/src/tables/gpos.rs000064400000000000000000000760321046102023000153660ustar 00000000000000//! A [Glyph Positioning Table](https://docs.microsoft.com/en-us/typography/opentype/spec/gpos) //! implementation. // A heavily modified port of https://github.com/RazrFalcon/rustybuzz implementation // originally written by https://github.com/laurmaedje use core::convert::TryFrom; use crate::opentype_layout::ChainedContextLookup; use crate::opentype_layout::{Class, ClassDefinition, ContextLookup, Coverage, LookupSubtable}; use crate::parser::{ FromData, FromSlice, LazyArray16, LazyArray32, NumFrom, Offset, Offset16, Stream, }; use crate::GlyphId; /// A [Device Table]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#devVarIdxTbls) /// hinting values. #[derive(Clone, Copy)] pub struct HintingDevice<'a> { start_size: u16, end_size: u16, delta_format: u16, delta_values: LazyArray16<'a, u16>, } impl HintingDevice<'_> { /// Returns X-axis delta. pub fn x_delta(&self, units_per_em: u16, pixels_per_em: Option<(u16, u16)>) -> Option { let ppem = pixels_per_em.map(|(x, _)| x)?; self.get_delta(ppem, units_per_em) } /// Returns Y-axis delta. 
pub fn y_delta(&self, units_per_em: u16, pixels_per_em: Option<(u16, u16)>) -> Option { let ppem = pixels_per_em.map(|(_, y)| y)?; self.get_delta(ppem, units_per_em) } fn get_delta(&self, ppem: u16, scale: u16) -> Option { let f = self.delta_format; debug_assert!(matches!(f, 1..=3)); if ppem == 0 || ppem < self.start_size || ppem > self.end_size { return None; } let s = ppem - self.start_size; let byte = self.delta_values.get(s >> (4 - f))?; let bits = byte >> (16 - (((s & ((1 << (4 - f)) - 1)) + 1) << f)); let mask = 0xFFFF >> (16 - (1 << f)); let mut delta = i64::from(bits & mask); if delta >= i64::from((mask + 1) >> 1) { delta -= i64::from(mask + 1); } i32::try_from(delta * i64::from(scale) / i64::from(ppem)).ok() } } impl core::fmt::Debug for HintingDevice<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "HintingDevice {{ ... }}") } } /// A [Device Table](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#devVarIdxTbls) /// indexes into [Item Variation Store]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/otvarcommonformats#IVS). #[allow(missing_docs)] #[derive(Clone, Copy, Debug)] pub struct VariationDevice { pub outer_index: u16, pub inner_index: u16, } /// A [Device Table]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#devVarIdxTbls). 
#[allow(missing_docs)] #[derive(Clone, Copy, Debug)] pub enum Device<'a> { Hinting(HintingDevice<'a>), Variation(VariationDevice), } impl<'a> Device<'a> { pub(crate) fn parse(data: &'a [u8]) -> Option { let mut s = Stream::new(data); let first = s.read::()?; let second = s.read::()?; let format = s.read::()?; match format { 1..=3 => { let start_size = first; let end_size = second; let count = (1 + (end_size - start_size)) >> (4 - format); let delta_values = s.read_array16(count)?; Some(Self::Hinting(HintingDevice { start_size, end_size, delta_format: format, delta_values, })) } 0x8000 => Some(Self::Variation(VariationDevice { outer_index: first, inner_index: second, })), _ => None, } } } #[derive(Clone, Copy, Default, Debug)] struct ValueFormatFlags(u8); #[rustfmt::skip] impl ValueFormatFlags { #[inline] fn x_placement(self) -> bool { self.0 & 0x01 != 0 } #[inline] fn y_placement(self) -> bool { self.0 & 0x02 != 0 } #[inline] fn x_advance(self) -> bool { self.0 & 0x04 != 0 } #[inline] fn y_advance(self) -> bool { self.0 & 0x08 != 0 } #[inline] fn x_placement_device(self) -> bool { self.0 & 0x10 != 0 } #[inline] fn y_placement_device(self) -> bool { self.0 & 0x20 != 0 } #[inline] fn x_advance_device(self) -> bool { self.0 & 0x40 != 0 } #[inline] fn y_advance_device(self) -> bool { self.0 & 0x80 != 0 } // The ValueRecord struct constrain either i16 values or Offset16 offsets // and the total size depend on how many flags are enabled. fn size(self) -> usize { // The high 8 bits are not used, so make sure we ignore them using 0xFF. u16::SIZE * usize::num_from(self.0.count_ones()) } } impl FromData for ValueFormatFlags { const SIZE: usize = 2; #[inline] fn parse(data: &[u8]) -> Option { // There is no data in high 8 bits, so skip it. Some(Self(data[1])) } } /// A [Value Record](https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#value-record). 
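// Illustrative sketch (not the crate's API): how `ValueFormatFlags::size` above
// derives a GPOS ValueRecord's byte length — each set bit in the low byte adds
// one 16-bit field (an i16 value or an Offset16).

```rust
// Each enabled flag contributes one 2-byte field to the record.
fn value_record_size(flags: u8) -> usize {
    2 * flags.count_ones() as usize
}

fn main() {
    // X_PLACEMENT | Y_PLACEMENT | X_ADVANCE | Y_ADVANCE -> four i16 values.
    assert_eq!(value_record_size(0x0F), 8);
    // A single X_ADVANCE field.
    assert_eq!(value_record_size(0x04), 2);
    // No flags set: an empty record.
    assert_eq!(value_record_size(0x00), 0);
}
```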
#[derive(Clone, Copy, Default, Debug)] pub struct ValueRecord<'a> { /// Horizontal adjustment for placement, in design units. pub x_placement: i16, /// Vertical adjustment for placement, in design units. pub y_placement: i16, /// Horizontal adjustment for advance, in design units — only used for horizontal layout. pub x_advance: i16, /// Vertical adjustment for advance, in design units — only used for vertical layout. pub y_advance: i16, /// A [`Device`] table with horizontal adjustment for placement. pub x_placement_device: Option>, /// A [`Device`] table with vertical adjustment for placement. pub y_placement_device: Option>, /// A [`Device`] table with horizontal adjustment for advance. pub x_advance_device: Option>, /// A [`Device`] table with vertical adjustment for advance. pub y_advance_device: Option>, } impl<'a> ValueRecord<'a> { // Returns `None` only on parsing error. fn parse( table_data: &'a [u8], s: &mut Stream, flags: ValueFormatFlags, ) -> Option> { let mut record = ValueRecord::default(); if flags.x_placement() { record.x_placement = s.read::()?; } if flags.y_placement() { record.y_placement = s.read::()?; } if flags.x_advance() { record.x_advance = s.read::()?; } if flags.y_advance() { record.y_advance = s.read::()?; } if flags.x_placement_device() { if let Some(offset) = s.read::>()? { record.x_placement_device = table_data.get(offset.to_usize()..).and_then(Device::parse) } } if flags.y_placement_device() { if let Some(offset) = s.read::>()? { record.y_placement_device = table_data.get(offset.to_usize()..).and_then(Device::parse) } } if flags.x_advance_device() { if let Some(offset) = s.read::>()? { record.x_advance_device = table_data.get(offset.to_usize()..).and_then(Device::parse) } } if flags.y_advance_device() { if let Some(offset) = s.read::>()? 
            {
                record.y_advance_device =
                    table_data.get(offset.to_usize()..).and_then(Device::parse)
            }
        }
        Some(record)
    }
}

/// An array of
/// [Value Records](https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#value-record).
#[derive(Clone, Copy)]
pub struct ValueRecordsArray<'a> {
    // We have to store the original table data, because a ValueRecord can have
    // an offset to a Device table and that offset is from the beginning of the table.
    table_data: &'a [u8],
    // A slice that contains all ValueRecords.
    data: &'a [u8],
    // Number of records.
    len: u16,
    // Size of a single record.
    value_len: usize,
    // Flags, used during ValueRecord parsing.
    flags: ValueFormatFlags,
}

impl<'a> ValueRecordsArray<'a> {
    fn parse(
        table_data: &'a [u8],
        count: u16,
        flags: ValueFormatFlags,
        s: &mut Stream<'a>,
    ) -> Option<Self> {
        Some(Self {
            table_data,
            flags,
            len: count,
            value_len: flags.size(),
            data: s.read_bytes(usize::from(count) * flags.size())?,
        })
    }

    /// Returns the array's length.
    #[inline]
    pub fn len(&self) -> u16 {
        self.len
    }

    /// Checks if the array is empty.
    pub fn is_empty(&self) -> bool {
        self.len == 0
    }

    /// Returns a [`ValueRecord`] at index.
    pub fn get(&self, index: u16) -> Option<ValueRecord<'a>> {
        let start = usize::from(index) * self.value_len;
        let end = start + self.value_len;
        let data = self.data.get(start..end)?;
        let mut s = Stream::new(data);
        ValueRecord::parse(self.table_data, &mut s, self.flags)
    }
}

impl core::fmt::Debug for ValueRecordsArray<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "ValueRecordsArray {{ ... }}")
    }
}

/// A [Single Adjustment Positioning Subtable](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#SP).
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub enum SingleAdjustment<'a> {
    Format1 {
        coverage: Coverage<'a>,
        value: ValueRecord<'a>,
    },
    Format2 {
        coverage: Coverage<'a>,
        values: ValueRecordsArray<'a>,
    },
}

impl<'a> SingleAdjustment<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()?
        {
            1 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let flags = s.read::<ValueFormatFlags>()?;
                let value = ValueRecord::parse(data, &mut s, flags)?;
                Some(Self::Format1 { coverage, value })
            }
            2 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let flags = s.read::<ValueFormatFlags>()?;
                let count = s.read::<u16>()?;
                let values = ValueRecordsArray::parse(data, count, flags, &mut s)?;
                Some(Self::Format2 { coverage, values })
            }
            _ => None,
        }
    }

    /// Returns the subtable coverage.
    #[inline]
    pub fn coverage(&self) -> Coverage<'a> {
        match self {
            Self::Format1 { coverage, .. } => *coverage,
            Self::Format2 { coverage, .. } => *coverage,
        }
    }
}

/// A [`ValueRecord`] pairs set used by [`PairAdjustment`].
#[derive(Clone, Copy)]
pub struct PairSet<'a> {
    data: &'a [u8],
    flags: (ValueFormatFlags, ValueFormatFlags),
    record_len: u8,
}

impl<'a> PairSet<'a> {
    fn parse(data: &'a [u8], flags: (ValueFormatFlags, ValueFormatFlags)) -> Option<Self> {
        let mut s = Stream::new(data);
        let count = s.read::<u16>()?;
        // Max len is 34, so u8 is just enough.
        let record_len = (GlyphId::SIZE + flags.0.size() + flags.1.size()) as u8;
        let data = s.read_bytes(usize::from(count) * usize::from(record_len))?;
        Some(Self {
            data,
            flags,
            record_len,
        })
    }

    #[inline]
    fn binary_search(&self, second: GlyphId) -> Option<&'a [u8]> {
        // Based on the Rust std implementation.
        let mut size = self.data.len() / usize::from(self.record_len);
        if size == 0 {
            return None;
        }

        let get_record = |index| {
            let start = index * usize::from(self.record_len);
            let end = start + usize::from(self.record_len);
            self.data.get(start..end)
        };

        let get_glyph = |data: &[u8]| GlyphId(u16::from_be_bytes([data[0], data[1]]));

        let mut base = 0;
        while size > 1 {
            let half = size / 2;
            let mid = base + half;
            // mid is always in [0, size), which means mid is >= 0 and < size.
            // mid >= 0: by definition
            // mid < size: mid = size / 2 + size / 4 + size / 8 ...
            let cmp = get_glyph(get_record(mid)?).cmp(&second);
            base = if cmp == core::cmp::Ordering::Greater {
                base
            } else {
                mid
            };
            size -= half;
        }

        // base is always in [0, size) because base <= mid.
        let value = get_record(base)?;
        if get_glyph(value).cmp(&second) == core::cmp::Ordering::Equal {
            Some(value)
        } else {
            None
        }
    }

    /// Returns a [`ValueRecord`] pair using the second glyph.
    pub fn get(&self, second: GlyphId) -> Option<(ValueRecord<'a>, ValueRecord<'a>)> {
        let record_data = self.binary_search(second)?;
        let mut s = Stream::new(record_data);
        s.skip::<GlyphId>();
        Some((
            ValueRecord::parse(self.data, &mut s, self.flags.0)?,
            ValueRecord::parse(self.data, &mut s, self.flags.1)?,
        ))
    }
}

impl core::fmt::Debug for PairSet<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "PairSet {{ ... }}")
    }
}

// Essentially a `LazyOffsetArray16`, but stores the additional data required to parse [`PairSet`].
/// A list of [`PairSet`]s.
#[derive(Clone, Copy)]
pub struct PairSets<'a> {
    data: &'a [u8],
    // Zero offsets must be ignored, therefore we're using `Option<Offset16>`.
    offsets: LazyArray16<'a, Option<Offset16>>,
    flags: (ValueFormatFlags, ValueFormatFlags),
}

impl<'a> PairSets<'a> {
    fn new(
        data: &'a [u8],
        offsets: LazyArray16<'a, Option<Offset16>>,
        flags: (ValueFormatFlags, ValueFormatFlags),
    ) -> Self {
        Self {
            data,
            offsets,
            flags,
        }
    }

    /// Returns a value at `index`.
    #[inline]
    pub fn get(&self, index: u16) -> Option<PairSet<'a>> {
        let offset = self.offsets.get(index)??.to_usize();
        self.data
            .get(offset..)
            .and_then(|data| PairSet::parse(data, self.flags))
    }

    /// Returns the array's length.
    #[inline]
    pub fn len(&self) -> u16 {
        self.offsets.len()
    }

    /// Checks if the array is empty.
    pub fn is_empty(&self) -> bool {
        self.offsets.is_empty()
    }
}

impl core::fmt::Debug for PairSets<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "PairSets {{ ... }}")
    }
}

/// A [`ValueRecord`] pairs matrix used by [`PairAdjustment`].
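The lower-bound loop in `PairSet::binary_search` can be sketched standalone. This is not the crate's API; `RECORD_LEN`, `find_record`, and the sample buffer are made up for illustration. It searches a flat buffer of fixed-size records sorted by a leading big-endian `u16` key, keeping `base` at the last candidate that is not greater than the key.

```rust
// Minimal sketch: each record is a 2-byte big-endian key plus a 2-byte payload.
const RECORD_LEN: usize = 4;

fn find_record(data: &[u8], key: u16) -> Option<&[u8]> {
    let mut size = data.len() / RECORD_LEN;
    if size == 0 {
        return None;
    }
    let get = |i: usize| &data[i * RECORD_LEN..(i + 1) * RECORD_LEN];
    let key_of = |r: &[u8]| u16::from_be_bytes([r[0], r[1]]);

    // Lower-bound loop: `base` tracks the last record with key <= `key`.
    let mut base = 0;
    while size > 1 {
        let half = size / 2;
        let mid = base + half;
        if key_of(get(mid)) <= key {
            base = mid;
        }
        size -= half;
    }
    let rec = get(base);
    (key_of(rec) == key).then(|| rec)
}

fn main() {
    // Three records with keys 2, 5, 9 and payload bytes 0xAA, 0xBB, 0xCC.
    let data = [0, 2, 0, 0xAA, 0, 5, 0, 0xBB, 0, 9, 0, 0xCC];
    assert_eq!(find_record(&data, 5).map(|r| r[3]), Some(0xBB));
    assert_eq!(find_record(&data, 2).map(|r| r[3]), Some(0xAA));
    assert!(find_record(&data, 7).is_none());
}
```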
#[derive(Clone, Copy)]
pub struct ClassMatrix<'a> {
    // We have to store the table's original slice,
    // because offsets in ValueRecords are from the beginning of the table.
    table_data: &'a [u8],
    matrix: &'a [u8],
    counts: (u16, u16),
    flags: (ValueFormatFlags, ValueFormatFlags),
    record_len: u8,
}

impl<'a> ClassMatrix<'a> {
    fn parse(
        table_data: &'a [u8],
        counts: (u16, u16),
        flags: (ValueFormatFlags, ValueFormatFlags),
        s: &mut Stream<'a>,
    ) -> Option<Self> {
        let count = usize::num_from(u32::from(counts.0) * u32::from(counts.1));
        // Max len is 32, so u8 is just enough.
        let record_len = (flags.0.size() + flags.1.size()) as u8;
        let matrix = s.read_bytes(count * usize::from(record_len))?;
        Some(Self {
            table_data,
            matrix,
            counts,
            flags,
            record_len,
        })
    }

    /// Returns a [`ValueRecord`] pair using the specified classes.
    pub fn get(&self, classes: (u16, u16)) -> Option<(ValueRecord<'a>, ValueRecord<'a>)> {
        if classes.0 >= self.counts.0 || classes.1 >= self.counts.1 {
            return None;
        }

        let idx = usize::from(classes.0) * usize::from(self.counts.1) + usize::from(classes.1);
        let record = self.matrix.get(idx * usize::from(self.record_len)..)?;

        let mut s = Stream::new(record);
        Some((
            ValueRecord::parse(self.table_data, &mut s, self.flags.0)?,
            ValueRecord::parse(self.table_data, &mut s, self.flags.1)?,
        ))
    }
}

impl core::fmt::Debug for ClassMatrix<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "ClassMatrix {{ ... }}")
    }
}

/// A [Pair Adjustment Positioning Subtable](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#PP).
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub enum PairAdjustment<'a> {
    Format1 {
        coverage: Coverage<'a>,
        sets: PairSets<'a>,
    },
    Format2 {
        coverage: Coverage<'a>,
        classes: (ClassDefinition<'a>, ClassDefinition<'a>),
        matrix: ClassMatrix<'a>,
    },
}

impl<'a> PairAdjustment<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()?
        {
            1 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let flags = (
                    s.read::<ValueFormatFlags>()?,
                    s.read::<ValueFormatFlags>()?,
                );
                let count = s.read::<u16>()?;
                let offsets = s.read_array16(count)?;
                Some(Self::Format1 {
                    coverage,
                    sets: PairSets::new(data, offsets, flags),
                })
            }
            2 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let flags = (
                    s.read::<ValueFormatFlags>()?,
                    s.read::<ValueFormatFlags>()?,
                );
                let classes = (
                    ClassDefinition::parse(s.read_at_offset16(data)?)?,
                    ClassDefinition::parse(s.read_at_offset16(data)?)?,
                );
                let counts = (s.read::<u16>()?, s.read::<u16>()?);
                Some(Self::Format2 {
                    coverage,
                    classes,
                    matrix: ClassMatrix::parse(data, counts, flags, &mut s)?,
                })
            }
            _ => None,
        }
    }

    /// Returns the subtable coverage.
    #[inline]
    pub fn coverage(&self) -> Coverage<'a> {
        match self {
            Self::Format1 { coverage, .. } => *coverage,
            Self::Format2 { coverage, .. } => *coverage,
        }
    }
}

#[derive(Clone, Copy)]
struct EntryExitRecord {
    entry_anchor_offset: Option<Offset16>,
    exit_anchor_offset: Option<Offset16>,
}

impl FromData for EntryExitRecord {
    const SIZE: usize = 4;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(Self {
            entry_anchor_offset: s.read::<Option<Offset16>>()?,
            exit_anchor_offset: s.read::<Option<Offset16>>()?,
        })
    }
}

/// A list of entry and exit [`Anchor`] pairs.
#[derive(Clone, Copy)]
pub struct CursiveAnchorSet<'a> {
    data: &'a [u8],
    records: LazyArray16<'a, EntryExitRecord>,
}

impl<'a> CursiveAnchorSet<'a> {
    /// Returns an entry [`Anchor`] at index.
    pub fn entry(&self, index: u16) -> Option<Anchor<'a>> {
        let offset = self.records.get(index)?.entry_anchor_offset?.to_usize();
        self.data.get(offset..).and_then(Anchor::parse)
    }

    /// Returns an exit [`Anchor`] at index.
    pub fn exit(&self, index: u16) -> Option<Anchor<'a>> {
        let offset = self.records.get(index)?.exit_anchor_offset?.to_usize();
        self.data.get(offset..).and_then(Anchor::parse)
    }

    /// Returns the number of items.
    pub fn len(&self) -> u16 {
        self.records.len()
    }

    /// Checks if the set is empty.
    pub fn is_empty(&self) -> bool {
        self.records.is_empty()
    }
}

impl core::fmt::Debug for CursiveAnchorSet<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "CursiveAnchorSet {{ ... }}")
    }
}

/// A [Cursive Attachment Positioning Subtable](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#CAP).
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub struct CursiveAdjustment<'a> {
    pub coverage: Coverage<'a>,
    pub sets: CursiveAnchorSet<'a>,
}

impl<'a> CursiveAdjustment<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()? {
            1 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let count = s.read::<u16>()?;
                let records = s.read_array16(count)?;
                Some(Self {
                    coverage,
                    sets: CursiveAnchorSet { data, records },
                })
            }
            _ => None,
        }
    }
}

/// A [Mark-to-Base Attachment Positioning Subtable](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#MBP).
#[derive(Clone, Copy, Debug)]
pub struct MarkToBaseAdjustment<'a> {
    /// A mark coverage.
    pub mark_coverage: Coverage<'a>,
    /// A base coverage.
    pub base_coverage: Coverage<'a>,
    /// A list of mark anchors.
    pub marks: MarkArray<'a>,
    /// An anchors matrix.
    pub anchors: AnchorMatrix<'a>,
}

impl<'a> MarkToBaseAdjustment<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()? {
            1 => {
                let mark_coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let base_coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let class_count = s.read::<u16>()?;
                let marks = MarkArray::parse(s.read_at_offset16(data)?)?;
                let anchors = AnchorMatrix::parse(s.read_at_offset16(data)?, class_count)?;
                Some(Self {
                    mark_coverage,
                    base_coverage,
                    marks,
                    anchors,
                })
            }
            _ => None,
        }
    }
}

/// A [Mark-to-Ligature Attachment Positioning Subtable](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#MLP).
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub struct MarkToLigatureAdjustment<'a> {
    pub mark_coverage: Coverage<'a>,
    pub ligature_coverage: Coverage<'a>,
    pub marks: MarkArray<'a>,
    pub ligature_array: LigatureArray<'a>,
}

impl<'a> MarkToLigatureAdjustment<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()? {
            1 => {
                let mark_coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let ligature_coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let class_count = s.read::<u16>()?;
                let marks = MarkArray::parse(s.read_at_offset16(data)?)?;
                let ligature_array = LigatureArray::parse(s.read_at_offset16(data)?, class_count)?;
                Some(Self {
                    mark_coverage,
                    ligature_coverage,
                    marks,
                    ligature_array,
                })
            }
            _ => None,
        }
    }
}

/// An array of ligature anchor matrices.
#[derive(Clone, Copy)]
pub struct LigatureArray<'a> {
    data: &'a [u8],
    class_count: u16,
    offsets: LazyArray16<'a, Offset16>,
}

impl<'a> LigatureArray<'a> {
    fn parse(data: &'a [u8], class_count: u16) -> Option<Self> {
        let mut s = Stream::new(data);
        let count = s.read::<u16>()?;
        let offsets = s.read_array16(count)?;
        Some(Self {
            data,
            class_count,
            offsets,
        })
    }

    /// Returns an [`AnchorMatrix`] at index.
    pub fn get(&self, index: u16) -> Option<AnchorMatrix<'a>> {
        let offset = self.offsets.get(index)?.to_usize();
        let data = self.data.get(offset..)?;
        AnchorMatrix::parse(data, self.class_count)
    }

    /// Returns the array length.
    pub fn len(&self) -> u16 {
        self.offsets.len()
    }

    /// Checks if the array is empty.
    pub fn is_empty(&self) -> bool {
        self.offsets.is_empty()
    }
}

impl core::fmt::Debug for LigatureArray<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "LigatureArray {{ ...
 }}")
    }
}

#[derive(Clone, Copy)]
struct MarkRecord {
    class: Class,
    mark_anchor: Offset16,
}

impl FromData for MarkRecord {
    const SIZE: usize = 4;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(Self {
            class: s.read::<Class>()?,
            mark_anchor: s.read::<Offset16>()?,
        })
    }
}

/// A [Mark Array](https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#mark-array-table).
#[derive(Clone, Copy)]
pub struct MarkArray<'a> {
    data: &'a [u8],
    array: LazyArray16<'a, MarkRecord>,
}

impl<'a> MarkArray<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let count = s.read::<u16>()?;
        let array = s.read_array16(count)?;
        Some(Self { data, array })
    }

    /// Returns the contained data at index.
    pub fn get(&self, index: u16) -> Option<(Class, Anchor<'a>)> {
        let record = self.array.get(index)?;
        let anchor = self
            .data
            .get(record.mark_anchor.to_usize()..)
            .and_then(Anchor::parse)?;
        Some((record.class, anchor))
    }

    /// Returns the array length.
    pub fn len(&self) -> u16 {
        self.array.len()
    }

    /// Checks if the array is empty.
    pub fn is_empty(&self) -> bool {
        self.array.is_empty()
    }
}

impl core::fmt::Debug for MarkArray<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "MarkArray {{ ... }}")
    }
}

/// An [Anchor Table](https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#anchor-tables).
///
/// The *Anchor Table Format 2: Design Units Plus Contour Point* is not supported.
#[derive(Clone, Copy, Debug)]
pub struct Anchor<'a> {
    /// Horizontal value, in design units.
    pub x: i16,
    /// Vertical value, in design units.
    pub y: i16,
    /// A [`Device`] table with horizontal value.
    pub x_device: Option<Device<'a>>,
    /// A [`Device`] table with vertical value.
    pub y_device: Option<Device<'a>>,
}

impl<'a> Anchor<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let format = s.read::<u16>()?;
        if !matches!(format, 1..=3) {
            return None;
        }

        let mut table = Anchor {
            x: s.read::<i16>()?,
            y: s.read::<i16>()?,
            x_device: None,
            y_device: None,
        };

        // Note: Format 2 is not handled since there is currently no way to
        // get a glyph contour point by index.

        if format == 3 {
            table.x_device = s
                .read::<Option<Offset16>>()?
                .and_then(|offset| data.get(offset.to_usize()..))
                .and_then(Device::parse);

            table.y_device = s
                .read::<Option<Offset16>>()?
                .and_then(|offset| data.get(offset.to_usize()..))
                .and_then(Device::parse);
        }

        Some(table)
    }
}

/// An [`Anchor`] parsing helper.
#[derive(Clone, Copy)]
pub struct AnchorMatrix<'a> {
    data: &'a [u8],
    /// Number of rows in the matrix.
    pub rows: u16,
    /// Number of columns in the matrix.
    pub cols: u16,
    matrix: LazyArray32<'a, Option<Offset16>>,
}

impl<'a> AnchorMatrix<'a> {
    fn parse(data: &'a [u8], cols: u16) -> Option<Self> {
        let mut s = Stream::new(data);
        let rows = s.read::<u16>()?;
        let count = u32::from(rows) * u32::from(cols);
        let matrix = s.read_array32(count)?;
        Some(Self {
            data,
            rows,
            cols,
            matrix,
        })
    }

    /// Returns an [`Anchor`] at position.
    pub fn get(&self, row: u16, col: u16) -> Option<Anchor<'a>> {
        let idx = u32::from(row) * u32::from(self.cols) + u32::from(col);
        let offset = self.matrix.get(idx)??.to_usize();
        Anchor::parse(self.data.get(offset..)?)
    }
}

impl core::fmt::Debug for AnchorMatrix<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "AnchorMatrix {{ ... }}")
    }
}

/// A [Mark-to-Mark Attachment Positioning Subtable](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#MMP).
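The row-major indexing used by `AnchorMatrix::get` can be sketched standalone. This is not the crate's API; `anchor_index` is a made-up helper. Widening to `u32` before multiplying matters: two `u16` counts can overflow a `u16` product.

```rust
// Minimal sketch: the anchor for (row, col) lives at `row * cols + col`
// in a row-major matrix, computed in u32 to avoid u16 overflow.
fn anchor_index(row: u16, col: u16, cols: u16) -> u32 {
    u32::from(row) * u32::from(cols) + u32::from(col)
}

fn main() {
    // A 3x4 matrix: row 2, column 3 is the last cell (index 11).
    assert_eq!(anchor_index(2, 3, 4), 11);
    assert_eq!(anchor_index(0, 0, 4), 0);
    // A product that would overflow u16 arithmetic is fine in u32.
    assert_eq!(anchor_index(1, 0, 65535), 65535);
}
```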
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub struct MarkToMarkAdjustment<'a> {
    pub mark1_coverage: Coverage<'a>,
    pub mark2_coverage: Coverage<'a>,
    pub marks: MarkArray<'a>,
    pub mark2_matrix: AnchorMatrix<'a>,
}

impl<'a> MarkToMarkAdjustment<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()? {
            1 => {
                let mark1_coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let mark2_coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let class_count = s.read::<u16>()?;
                let marks = MarkArray::parse(s.read_at_offset16(data)?)?;
                let mark2_matrix = AnchorMatrix::parse(s.read_at_offset16(data)?, class_count)?;
                Some(Self {
                    mark1_coverage,
                    mark2_coverage,
                    marks,
                    mark2_matrix,
                })
            }
            _ => None,
        }
    }
}

/// A glyph positioning
/// [lookup subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#table-organization)
/// enumeration.
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub enum PositioningSubtable<'a> {
    Single(SingleAdjustment<'a>),
    Pair(PairAdjustment<'a>),
    Cursive(CursiveAdjustment<'a>),
    MarkToBase(MarkToBaseAdjustment<'a>),
    MarkToLigature(MarkToLigatureAdjustment<'a>),
    MarkToMark(MarkToMarkAdjustment<'a>),
    Context(ContextLookup<'a>),
    ChainContext(ChainedContextLookup<'a>),
}

impl<'a> LookupSubtable<'a> for PositioningSubtable<'a> {
    fn parse(data: &'a [u8], kind: u16) -> Option<Self> {
        match kind {
            1 => SingleAdjustment::parse(data).map(Self::Single),
            2 => PairAdjustment::parse(data).map(Self::Pair),
            3 => CursiveAdjustment::parse(data).map(Self::Cursive),
            4 => MarkToBaseAdjustment::parse(data).map(Self::MarkToBase),
            5 => MarkToLigatureAdjustment::parse(data).map(Self::MarkToLigature),
            6 => MarkToMarkAdjustment::parse(data).map(Self::MarkToMark),
            7 => ContextLookup::parse(data).map(Self::Context),
            8 => ChainedContextLookup::parse(data).map(Self::ChainContext),
            9 => crate::ggg::parse_extension_lookup(data, Self::parse),
            _ => None,
        }
    }
}

impl<'a> PositioningSubtable<'a> {
    /// Returns the subtable coverage.
    #[inline]
    pub fn coverage(&self) -> Coverage<'a> {
        match self {
            Self::Single(t) => t.coverage(),
            Self::Pair(t) => t.coverage(),
            Self::Cursive(t) => t.coverage,
            Self::MarkToBase(t) => t.mark_coverage,
            Self::MarkToLigature(t) => t.mark_coverage,
            Self::MarkToMark(t) => t.mark1_coverage,
            Self::Context(t) => t.coverage(),
            Self::ChainContext(t) => t.coverage(),
        }
    }
}

ttf-parser-0.24.1/src/tables/gsub.rs

//! A [Glyph Substitution Table](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub)
//! implementation.

// A heavily modified port of the https://github.com/RazrFalcon/rustybuzz implementation,
// originally written by https://github.com/laurmaedje

use crate::opentype_layout::{ChainedContextLookup, ContextLookup, Coverage, LookupSubtable};
use crate::parser::{FromSlice, LazyArray16, LazyOffsetArray16, Stream};
use crate::GlyphId;

/// A [Single Substitution Subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#SS).
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub enum SingleSubstitution<'a> {
    Format1 {
        coverage: Coverage<'a>,
        delta: i16,
    },
    Format2 {
        coverage: Coverage<'a>,
        substitutes: LazyArray16<'a, GlyphId>,
    },
}

impl<'a> SingleSubstitution<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()? {
            1 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let delta = s.read::<i16>()?;
                Some(Self::Format1 { coverage, delta })
            }
            2 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let count = s.read::<u16>()?;
                let substitutes = s.read_array16(count)?;
                Some(Self::Format2 {
                    coverage,
                    substitutes,
                })
            }
            _ => None,
        }
    }

    /// Returns the subtable coverage.
    #[inline]
    pub fn coverage(&self) -> Coverage<'a> {
        match self {
            Self::Format1 { coverage, .. } => *coverage,
            Self::Format2 { coverage, ..
            } => *coverage,
        }
    }
}

/// A sequence of glyphs for a
/// [Multiple Substitution Subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#MS).
#[derive(Clone, Copy, Debug)]
pub struct Sequence<'a> {
    /// A list of substitute glyphs.
    pub substitutes: LazyArray16<'a, GlyphId>,
}

impl<'a> FromSlice<'a> for Sequence<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let count = s.read::<u16>()?;
        let substitutes = s.read_array16(count)?;
        Some(Self { substitutes })
    }
}

/// A list of [`Sequence`] tables.
pub type SequenceList<'a> = LazyOffsetArray16<'a, Sequence<'a>>;

/// A [Multiple Substitution Subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#MS).
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub struct MultipleSubstitution<'a> {
    pub coverage: Coverage<'a>,
    pub sequences: SequenceList<'a>,
}

impl<'a> MultipleSubstitution<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()? {
            1 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let count = s.read::<u16>()?;
                let offsets = s.read_array16(count)?;
                Some(Self {
                    coverage,
                    sequences: SequenceList::new(data, offsets),
                })
            }
            _ => None,
        }
    }
}

/// A list of glyphs for an
/// [Alternate Substitution Subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#AS).
#[derive(Clone, Copy, Debug)]
pub struct AlternateSet<'a> {
    /// Array of alternate glyph IDs, in arbitrary order.
    pub alternates: LazyArray16<'a, GlyphId>,
}

impl<'a> FromSlice<'a> for AlternateSet<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let count = s.read::<u16>()?;
        let alternates = s.read_array16(count)?;
        Some(Self { alternates })
    }
}

/// A set of [`AlternateSet`]s.
pub type AlternateSets<'a> = LazyOffsetArray16<'a, AlternateSet<'a>>;

/// An [Alternate Substitution Subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#AS).
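The `delta` in `SingleSubstitution::Format1` above follows the OpenType spec: the substitute glyph id is the covered glyph id plus the delta, with modulo-65536 wrapping. A standalone sketch (not a call into this crate; `apply_delta` is a made-up helper):

```rust
// Minimal sketch of GSUB LookupType 1, Format 1 semantics:
// substitute = (glyph_id + delta) mod 65536.
fn apply_delta(glyph_id: u16, delta: i16) -> u16 {
    // `as u16` reinterprets the signed delta; wrapping_add gives mod-65536.
    glyph_id.wrapping_add(delta as u16)
}

fn main() {
    assert_eq!(apply_delta(10, 5), 15);
    // A negative delta wraps around modulo 65536.
    assert_eq!(apply_delta(2, -4), 65534);
}
```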
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub struct AlternateSubstitution<'a> {
    pub coverage: Coverage<'a>,
    pub alternate_sets: AlternateSets<'a>,
}

impl<'a> AlternateSubstitution<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()? {
            1 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let count = s.read::<u16>()?;
                let offsets = s.read_array16(count)?;
                Some(Self {
                    coverage,
                    alternate_sets: AlternateSets::new(data, offsets),
                })
            }
            _ => None,
        }
    }
}

/// Glyph components for one ligature.
#[derive(Clone, Copy, Debug)]
pub struct Ligature<'a> {
    /// Ligature to substitute.
    pub glyph: GlyphId,
    /// Glyph components for one ligature.
    pub components: LazyArray16<'a, GlyphId>,
}

impl<'a> FromSlice<'a> for Ligature<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let glyph = s.read::<GlyphId>()?;
        let count = s.read::<u16>()?;
        // The component count includes the first glyph (matched via Coverage),
        // so the array stores `count - 1` components.
        let components = s.read_array16(count.checked_sub(1)?)?;
        Some(Self { glyph, components })
    }
}

/// A [`Ligature`] set.
pub type LigatureSet<'a> = LazyOffsetArray16<'a, Ligature<'a>>;

impl<'a> FromSlice<'a> for LigatureSet<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        Self::parse(data)
    }
}

/// A list of [`Ligature`] sets.
pub type LigatureSets<'a> = LazyOffsetArray16<'a, LigatureSet<'a>>;

/// A [Ligature Substitution Subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#LS).
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub struct LigatureSubstitution<'a> {
    pub coverage: Coverage<'a>,
    pub ligature_sets: LigatureSets<'a>,
}

impl<'a> LigatureSubstitution<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()?
        {
            1 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let count = s.read::<u16>()?;
                let offsets = s.read_array16(count)?;
                Some(Self {
                    coverage,
                    ligature_sets: LigatureSets::new(data, offsets),
                })
            }
            _ => None,
        }
    }
}

/// A [Reverse Chaining Contextual Single Substitution Subtable](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#RCCS).
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub struct ReverseChainSingleSubstitution<'a> {
    pub coverage: Coverage<'a>,
    pub backtrack_coverages: LazyOffsetArray16<'a, Coverage<'a>>,
    pub lookahead_coverages: LazyOffsetArray16<'a, Coverage<'a>>,
    pub substitutes: LazyArray16<'a, GlyphId>,
}

impl<'a> ReverseChainSingleSubstitution<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        match s.read::<u16>()? {
            1 => {
                let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
                let backtrack_count = s.read::<u16>()?;
                let backtrack_coverages = s.read_array16(backtrack_count)?;
                let lookahead_count = s.read::<u16>()?;
                let lookahead_coverages = s.read_array16(lookahead_count)?;
                let substitute_count = s.read::<u16>()?;
                let substitutes = s.read_array16(substitute_count)?;
                Some(Self {
                    coverage,
                    backtrack_coverages: LazyOffsetArray16::new(data, backtrack_coverages),
                    lookahead_coverages: LazyOffsetArray16::new(data, lookahead_coverages),
                    substitutes,
                })
            }
            _ => None,
        }
    }
}

/// A glyph substitution
/// [lookup subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#table-organization)
/// enumeration.
#[allow(missing_docs)]
#[derive(Clone, Copy, Debug)]
pub enum SubstitutionSubtable<'a> {
    Single(SingleSubstitution<'a>),
    Multiple(MultipleSubstitution<'a>),
    Alternate(AlternateSubstitution<'a>),
    Ligature(LigatureSubstitution<'a>),
    Context(ContextLookup<'a>),
    ChainContext(ChainedContextLookup<'a>),
    ReverseChainSingle(ReverseChainSingleSubstitution<'a>),
}

impl<'a> LookupSubtable<'a> for SubstitutionSubtable<'a> {
    fn parse(data: &'a [u8], kind: u16) -> Option<Self> {
        match kind {
            1 => SingleSubstitution::parse(data).map(Self::Single),
            2 => MultipleSubstitution::parse(data).map(Self::Multiple),
            3 => AlternateSubstitution::parse(data).map(Self::Alternate),
            4 => LigatureSubstitution::parse(data).map(Self::Ligature),
            5 => ContextLookup::parse(data).map(Self::Context),
            6 => ChainedContextLookup::parse(data).map(Self::ChainContext),
            7 => crate::ggg::parse_extension_lookup(data, Self::parse),
            8 => ReverseChainSingleSubstitution::parse(data).map(Self::ReverseChainSingle),
            _ => None,
        }
    }
}

impl<'a> SubstitutionSubtable<'a> {
    /// Returns the subtable coverage.
    #[inline]
    pub fn coverage(&self) -> Coverage<'a> {
        match self {
            Self::Single(t) => t.coverage(),
            Self::Multiple(t) => t.coverage,
            Self::Alternate(t) => t.coverage,
            Self::Ligature(t) => t.coverage,
            Self::Context(t) => t.coverage(),
            Self::ChainContext(t) => t.coverage(),
            Self::ReverseChainSingle(t) => t.coverage,
        }
    }

    /// Checks that the current subtable is *Reverse Chaining Contextual Single*.
    #[inline]
    pub fn is_reverse(&self) -> bool {
        matches!(self, Self::ReverseChainSingle(_))
    }
}

ttf-parser-0.24.1/src/tables/gvar.rs

//! A [Glyph Variations Table](
//! https://docs.microsoft.com/en-us/typography/opentype/spec/gvar) implementation.

// https://docs.microsoft.com/en-us/typography/opentype/spec/otvarcommonformats#tuple-variation-store

// We do have to call clone for readability on some types.
#![allow(clippy::clone_on_copy)]
#![allow(clippy::neg_cmp_op_on_partial_ord)]

use core::cmp;
use core::convert::TryFrom;
use core::num::NonZeroU16;

use crate::parser::{LazyArray16, Offset, Offset16, Offset32, Stream, F2DOT14};
use crate::{glyf, PhantomPoints, PointF};
use crate::{GlyphId, NormalizedCoordinate, OutlineBuilder, Rect, RectF, Transform};

/// 'The TrueType rasterizer dynamically generates 'phantom' points for each glyph
/// that represent horizontal and vertical advance widths and side bearings,
/// and the variation data within the `gvar` table includes data for these phantom points.'
///
/// We don't actually use them, but they are required during deltas parsing.
const PHANTOM_POINTS_LEN: usize = 4;

#[derive(Clone, Copy)]
enum GlyphVariationDataOffsets<'a> {
    Short(LazyArray16<'a, Offset16>),
    Long(LazyArray16<'a, Offset32>),
}

#[derive(Clone, Copy, Default, Debug)]
struct PointAndDelta {
    x: i16,
    y: i16,
    x_delta: f32,
    y_delta: f32,
}

// This structure will be used by the `VariationTuples` stack buffer,
// so it has to be as small as possible.
#[derive(Clone, Copy, Default)]
struct VariationTuple<'a> {
    set_points: Option<SetPointsIter<'a>>,
    deltas: PackedDeltasIter<'a>,
    /// The last parsed point with delta in the contour.
    /// Used during delta resolving.
    prev_point: Option<PointAndDelta>,
}

/// The maximum number of variation tuples stored on the stack.
///
/// The TrueType spec allows up to 4095 tuples, which is way larger
/// than what we support. But in reality, an average font will have fewer than 10 tuples.
/// We can avoid heap allocations if the number of tuples is less than this number.
const MAX_STACK_TUPLES_LEN: u16 = 32;

/// A list of variation tuples, possibly stored on the heap.
///
/// This is the only part of the `gvar` algorithm that actually allocates data.
/// This is probably unavoidable due to the `gvar` structure,
/// since we have to iterate all tuples in parallel.
enum VariationTuples<'a> {
    Stack {
        headers: [VariationTuple<'a>; MAX_STACK_TUPLES_LEN as usize],
        len: u16,
    },
    #[cfg(feature = "gvar-alloc")]
    Heap {
        vec: std::vec::Vec<VariationTuple<'a>>,
    },
}

impl<'a> Default for VariationTuples<'a> {
    fn default() -> Self {
        Self::Stack {
            headers: [VariationTuple::default(); MAX_STACK_TUPLES_LEN as usize],
            len: 0,
        }
    }
}

impl<'a> VariationTuples<'a> {
    /// Attempt to reserve up to `capacity` total slots for variation tuples.
    #[cfg(feature = "gvar-alloc")]
    fn reserve(&mut self, capacity: u16) -> bool {
        // If the requested capacity exceeds the configured maximum stack tuple size ...
        if capacity > MAX_STACK_TUPLES_LEN {
            // ... and we're currently on the stack, move to the heap.
            if let Self::Stack { headers, len } = self {
                let mut vec = std::vec::Vec::with_capacity(capacity as usize);
                for header in headers.iter_mut().take(*len as usize) {
                    let header = core::mem::take(header);
                    vec.push(header);
                }
                *self = Self::Heap { vec };
                return true;
            }
        }

        // Otherwise ...
        match self {
            // ... extend the vec capacity to hold our new elements ...
            Self::Heap { vec } if vec.len() < capacity as usize => {
                vec.reserve(capacity as usize - vec.len());
                true
            }
            // ... or do nothing if the vec is already large enough or we're on the stack.
            _ => true,
        }
    }

    /// Attempt to reserve up to `capacity` total slots for variation tuples.
    #[cfg(not(feature = "gvar-alloc"))]
    fn reserve(&mut self, capacity: u16) -> bool {
        capacity <= MAX_STACK_TUPLES_LEN
    }

    /// Get the number of tuples stored in the structure.
    #[cfg_attr(not(feature = "gvar-alloc"), allow(dead_code))]
    fn len(&self) -> u16 {
        match self {
            Self::Stack { len, .. } => *len,
            #[cfg(feature = "gvar-alloc")]
            Self::Heap { vec } => vec.len() as u16,
        }
    }

    /// Append a new tuple header to the list.
    /// This may panic if the list can't hold a new header.
    #[cfg(feature = "gvar-alloc")]
    fn push(&mut self, header: VariationTuple<'a>) {
        // Reserve space for the new element.
        // This may fail and result in a later panic, but that matches pre-heap behavior.
        self.reserve(self.len() + 1);

        match self {
            Self::Stack { headers, len } => {
                headers[usize::from(*len)] = header;
                *len += 1;
            }
            Self::Heap { vec } => vec.push(header),
        }
    }

    /// Append a new tuple header to the list.
    /// This may panic if the list can't hold a new header.
    #[cfg(not(feature = "gvar-alloc"))]
    #[inline]
    fn push(&mut self, header: VariationTuple<'a>) {
        match self {
            Self::Stack { headers, len } => {
                headers[usize::from(*len)] = header;
                *len += 1;
            }
        }
    }

    /// Remove all tuples from the structure.
    fn clear(&mut self) {
        match self {
            Self::Stack { len, .. } => *len = 0,
            #[cfg(feature = "gvar-alloc")]
            Self::Heap { vec } => vec.clear(),
        }
    }

    #[inline]
    fn as_mut_slice(&mut self) -> &mut [VariationTuple<'a>] {
        match self {
            Self::Stack { headers, len } => &mut headers[0..usize::from(*len)],
            #[cfg(feature = "gvar-alloc")]
            Self::Heap { vec } => vec.as_mut_slice(),
        }
    }

    fn apply(
        &mut self,
        all_points: glyf::GlyphPointsIter,
        points: glyf::GlyphPointsIter,
        point: glyf::GlyphPoint,
    ) -> Option<PointF> {
        let mut x = f32::from(point.x);
        let mut y = f32::from(point.y);

        for tuple in self.as_mut_slice() {
            if let Some(ref mut set_points) = tuple.set_points {
                if set_points.next()? {
                    if let Some((x_delta, y_delta)) = tuple.deltas.next() {
                        // Remember the last set point and delta.
                        tuple.prev_point = Some(PointAndDelta {
                            x: point.x,
                            y: point.y,
                            x_delta,
                            y_delta,
                        });

                        x += x_delta;
                        y += y_delta;
                    } else {
                        // If there are no more deltas, we have to resolve them manually.
                        let set_points = set_points.clone();
                        let (x_delta, y_delta) = infer_deltas(
                            tuple,
                            set_points,
                            points.clone(),
                            all_points.clone(),
                            point,
                        );

                        x += x_delta;
                        y += y_delta;
                    }
                } else {
                    // Point is not referenced, so we have to resolve it.
                    let set_points = set_points.clone();
                    let (x_delta, y_delta) =
                        infer_deltas(tuple, set_points, points.clone(), all_points.clone(), point);

                    x += x_delta;
                    y += y_delta;
                }

                if point.last_point {
                    tuple.prev_point = None;
                }
            } else {
                if let Some((x_delta, y_delta)) = tuple.deltas.next() {
                    x += x_delta;
                    y += y_delta;
                }
            }
        }

        Some(PointF { x, y })
    }

    // This is just like `apply()`, but without `infer_deltas`,
    // since we use it only for component points and not a contour.
    // And since there are no contours and no points, `infer_deltas()` will do nothing.
    fn apply_null(&mut self) -> Option<PointF> {
        let mut x = 0.0;
        let mut y = 0.0;

        for tuple in self.as_mut_slice() {
            if let Some(ref mut set_points) = tuple.set_points {
                if set_points.next()? {
                    if let Some((x_delta, y_delta)) = tuple.deltas.next() {
                        x += x_delta;
                        y += y_delta;
                    }
                }
            } else {
                if let Some((x_delta, y_delta)) = tuple.deltas.next() {
                    x += x_delta;
                    y += y_delta;
                }
            }
        }

        Some(PointF { x, y })
    }
}

#[derive(Clone, Copy, Default, Debug)]
struct TupleVariationHeaderData {
    scalar: f32,
    has_private_point_numbers: bool,
    serialized_data_len: u16,
}

// https://docs.microsoft.com/en-us/typography/opentype/spec/otvarcommonformats#tuplevariationheader
fn parse_variation_tuples<'a>(
    count: u16,
    coordinates: &[NormalizedCoordinate],
    shared_tuple_records: &LazyArray16<F2DOT14>,
    shared_point_numbers: Option<PackedPointsIter<'a>>,
    points_len: u16,
    mut main_s: Stream<'a>,
    mut serialized_s: Stream<'a>,
    tuples: &mut VariationTuples<'a>,
) -> Option<()> {
    debug_assert!(core::mem::size_of::<VariationTuple>() <= 80);

    // `TupleVariationHeader` has a variable size, so we cannot use a `LazyArray`.
    for _ in 0..count {
        let header = parse_tuple_variation_header(coordinates, shared_tuple_records, &mut main_s)?;
        if !(header.scalar > 0.0) {
            // Serialized data for headers with non-positive scalar should be skipped.
            serialized_s.advance(usize::from(header.serialized_data_len));
            continue;
        }

        let serialized_data_start = serialized_s.offset();

        // Resolve point numbers source.
        let point_numbers = if header.has_private_point_numbers {
            PackedPointsIter::new(&mut serialized_s)?
        } else {
            shared_point_numbers.clone()
        };

        // TODO: this
        // Since the packed representation can include zero values,
        // it is possible for a given point number to be repeated in the derived point number list.
        // In that case, there will be multiple delta values in the deltas data
        // associated with that point number. All of these deltas must be applied
        // cumulatively to the given point.

        let deltas_count = if let Some(point_numbers) = point_numbers.clone() {
            u16::try_from(point_numbers.clone().count()).ok()?
        } else {
            points_len
        };

        let deltas = {
            // Use `checked_sub` in case we went over the `serialized_data_len`.
            let left = usize::from(header.serialized_data_len)
                .checked_sub(serialized_s.offset() - serialized_data_start)?;
            let deltas_data = serialized_s.read_bytes(left)?;
            PackedDeltasIter::new(header.scalar, deltas_count, deltas_data)
        };

        let tuple = VariationTuple {
            set_points: point_numbers.map(SetPointsIter::new),
            deltas,
            prev_point: None,
        };

        tuples.push(tuple);
    }

    Some(())
}

// https://docs.microsoft.com/en-us/typography/opentype/spec/otvarcommonformats#tuplevariationheader
fn parse_tuple_variation_header(
    coordinates: &[NormalizedCoordinate],
    shared_tuple_records: &LazyArray16<F2DOT14>,
    s: &mut Stream,
) -> Option<TupleVariationHeaderData> {
    const EMBEDDED_PEAK_TUPLE_FLAG: u16 = 0x8000;
    const INTERMEDIATE_REGION_FLAG: u16 = 0x4000;
    const PRIVATE_POINT_NUMBERS_FLAG: u16 = 0x2000;
    const TUPLE_INDEX_MASK: u16 = 0x0FFF;

    let serialized_data_size = s.read::<u16>()?;
    let tuple_index = s.read::<u16>()?;

    let has_embedded_peak_tuple = tuple_index & EMBEDDED_PEAK_TUPLE_FLAG != 0;
    let has_intermediate_region = tuple_index & INTERMEDIATE_REGION_FLAG != 0;
    let has_private_point_numbers = tuple_index & PRIVATE_POINT_NUMBERS_FLAG != 0;
    let tuple_index = tuple_index & TUPLE_INDEX_MASK;

    let axis_count = coordinates.len() as u16;

    let peak_tuple = if has_embedded_peak_tuple {
        s.read_array16::<F2DOT14>(axis_count)?
    } else {
        // Use shared tuples.
        let start = tuple_index.checked_mul(axis_count)?;
        let end = start.checked_add(axis_count)?;
        shared_tuple_records.slice(start..end)?
    };

    let (start_tuple, end_tuple) = if has_intermediate_region {
        (
            s.read_array16::<F2DOT14>(axis_count)?,
            s.read_array16::<F2DOT14>(axis_count)?,
        )
    } else {
        (
            LazyArray16::<F2DOT14>::default(),
            LazyArray16::<F2DOT14>::default(),
        )
    };

    let mut header = TupleVariationHeaderData {
        scalar: 0.0,
        has_private_point_numbers,
        serialized_data_len: serialized_data_size,
    };

    // Calculate the scalar value according to the pseudo-code described at:
    // https://docs.microsoft.com/en-us/typography/opentype/spec/otvaroverview#algorithm-for-interpolation-of-instance-values
    let mut scalar = 1.0;
    for i in 0..axis_count {
        let v = coordinates[usize::from(i)].get();
        let peak = peak_tuple.get(i)?.0;
        if peak == 0 || v == peak {
            continue;
        }

        if has_intermediate_region {
            let start = start_tuple.get(i)?.0;
            let end = end_tuple.get(i)?.0;
            if start > peak || peak > end || (start < 0 && end > 0 && peak != 0) {
                continue;
            }

            if v < start || v > end {
                return Some(header);
            }

            if v < peak {
                if peak != start {
                    scalar *= f32::from(v - start) / f32::from(peak - start);
                }
            } else {
                if peak != end {
                    scalar *= f32::from(end - v) / f32::from(end - peak);
                }
            }
        } else if v == 0 || v < cmp::min(0, peak) || v > cmp::max(0, peak) {
            // 'If the instance coordinate is out of range for some axis, then the
            // region and its associated deltas are not applicable.'
            return Some(header);
        } else {
            scalar *= f32::from(v) / f32::from(peak);
        }
    }

    header.scalar = scalar;
    Some(header)
}

// https://docs.microsoft.com/en-us/typography/opentype/spec/otvarcommonformats#packed-point-numbers
mod packed_points {
    use crate::parser::{FromData, Stream};

    struct Control(u8);

    impl Control {
        const POINTS_ARE_WORDS_FLAG: u8 = 0x80;
        const POINT_RUN_COUNT_MASK: u8 = 0x7F;

        #[inline]
        fn is_points_are_words(&self) -> bool {
            self.0 & Self::POINTS_ARE_WORDS_FLAG != 0
        }

        // 'Mask for the low 7 bits to provide the number of point values in the run, minus one.'
        // So we have to add 1.
        // It will never overflow because of a mask.
        #[inline]
        fn run_count(&self) -> u8 {
            (self.0 & Self::POINT_RUN_COUNT_MASK) + 1
        }
    }

    impl FromData for Control {
        const SIZE: usize = 1;

        #[inline]
        fn parse(data: &[u8]) -> Option<Self> {
            data.get(0).copied().map(Control)
        }
    }

    #[derive(Clone, Copy, PartialEq)]
    enum State {
        Control,
        ShortPoint,
        LongPoint,
    }

    // This structure will be used by the `VariationTuples` stack buffer,
    // so it has to be as small as possible.
    // Therefore we cannot use `Stream` and other abstractions.
    #[derive(Clone, Copy)]
    pub struct PackedPointsIter<'a> {
        data: &'a [u8],
        // u16 is enough, since the maximum number of points is 32767.
        offset: u16,
        state: State,
        points_left: u8,
    }

    impl<'a> PackedPointsIter<'a> {
        // The first Option::None indicates a parsing error.
        // The second Option::None indicates "no points".
        pub fn new<'b>(s: &'b mut Stream<'a>) -> Option<Option<PackedPointsIter<'a>>> {
            // The total amount of points can be set as one or two bytes
            // depending on the first bit.
            let b1 = s.read::<u8>()?;
            let mut count = u16::from(b1);
            if b1 & Control::POINTS_ARE_WORDS_FLAG != 0 {
                let b2 = s.read::<u8>()?;
                count = (u16::from(b1 & Control::POINT_RUN_COUNT_MASK) << 8) | u16::from(b2);
            }

            if count == 0 {
                // No points is not an error.
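The per-axis scalar math in `parse_tuple_variation_header` above can be hard to follow inside the loop. The following is a standalone sketch (not crate code) of the non-intermediate branch, with normalized coordinates modeled as plain `f32` instead of `F2DOT14`:

```rust
// Per-axis scalar for a region without an intermediate start/end tuple.
// `v` is the normalized instance coordinate, `peak` the region's peak;
// both lie in [-1.0, 1.0]. The tuple's final scalar is the product of
// these per-axis contributions.
fn axis_scalar(v: f32, peak: f32) -> f32 {
    if peak == 0.0 || v == peak {
        1.0 // This axis does not scale the deltas.
    } else if v == 0.0 || v < peak.min(0.0) || v > peak.max(0.0) {
        0.0 // Out of range: the region and its deltas are not applicable.
    } else {
        v / peak // Proportional contribution.
    }
}

fn main() {
    // An instance halfway towards a peak at 1.0 applies half of each delta.
    assert_eq!(axis_scalar(0.5, 1.0), 0.5);
    // At the peak the deltas apply fully.
    assert_eq!(axis_scalar(1.0, 1.0), 1.0);
    // On the opposite side of the default the region is skipped entirely.
    assert_eq!(axis_scalar(-0.25, 1.0), 0.0);
}
```

Multiplying these contributions across all axes yields the `scalar` that later multiplies every packed delta in the tuple.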
                return Some(None);
            }

            let start = s.offset();
            let tail = s.tail()?;

            // The actual packed points data size is not stored,
            // so we have to parse the points first to advance the provided stream.
            // Since deltas will be right after points.
            let mut i = 0;
            while i < count {
                let control = s.read::<Control>()?;
                let run_count = u16::from(control.run_count());
                let is_points_are_words = control.is_points_are_words();
                // Do not actually parse the number, simply advance.
                s.advance_checked(
                    if is_points_are_words { 2 } else { 1 } * usize::from(run_count),
                )?;
                i += run_count;
            }

            if i == 0 {
                // No points is not an error.
                return Some(None);
            }

            if i > count {
                // Malformed font.
                return None;
            }

            // Check that points data size is smaller than the storage type
            // used by the iterator.
            let data_len = s.offset() - start;
            if data_len > usize::from(u16::MAX) {
                return None;
            }

            Some(Some(PackedPointsIter {
                data: &tail[0..data_len],
                offset: 0,
                state: State::Control,
                points_left: 0,
            }))
        }
    }

    impl<'a> Iterator for PackedPointsIter<'a> {
        type Item = u16;

        fn next(&mut self) -> Option<Self::Item> {
            if usize::from(self.offset) >= self.data.len() {
                return None;
            }

            if self.state == State::Control {
                let control = Control(self.data[usize::from(self.offset)]);
                self.offset += 1;

                self.points_left = control.run_count();
                self.state = if control.is_points_are_words() {
                    State::LongPoint
                } else {
                    State::ShortPoint
                };

                self.next()
            } else {
                let mut s = Stream::new_at(self.data, usize::from(self.offset))?;
                let point = if self.state == State::LongPoint {
                    self.offset += 2;
                    s.read::<u16>()?
                } else {
                    self.offset += 1;
                    u16::from(s.read::<u8>()?)
                };

                self.points_left -= 1;
                if self.points_left == 0 {
                    self.state = State::Control;
                }

                Some(point)
            }
        }
    }

    // The `PackedPointsIter` will return referenced point numbers as deltas,
    // i.e. 1 2 4 is actually 1 3 7.
    // But this is not very useful in our current algorithm,
    // so we will convert it once again into:
    // false true false true false false false true
    // This way we can iterate glyph points and point numbers in parallel.
    #[derive(Clone, Copy)]
    pub struct SetPointsIter<'a> {
        iter: PackedPointsIter<'a>,
        unref_count: u16,
    }

    impl<'a> SetPointsIter<'a> {
        #[inline]
        pub fn new(mut iter: PackedPointsIter<'a>) -> Self {
            let unref_count = iter.next().unwrap_or(0);
            SetPointsIter { iter, unref_count }
        }

        #[inline]
        pub fn restart(self) -> Self {
            let mut iter = self.iter.clone();
            iter.offset = 0;
            iter.state = State::Control;
            iter.points_left = 0;

            let unref_count = iter.next().unwrap_or(0);
            SetPointsIter { iter, unref_count }
        }
    }

    impl<'a> Iterator for SetPointsIter<'a> {
        type Item = bool;

        #[inline]
        fn next(&mut self) -> Option<Self::Item> {
            if self.unref_count != 0 {
                self.unref_count -= 1;
                return Some(false);
            }

            if let Some(unref_count) = self.iter.next() {
                self.unref_count = unref_count;
                if self.unref_count != 0 {
                    self.unref_count -= 1;
                }
            }

            // Iterator will be returning `Some(true)` after "finished".
            // This is because this iterator will be zipped with the `glyf::GlyphPointsIter`
            // and the number of glyph points can be larger than the amount of set points.
            // Anyway, this is a non-issue in a well-formed font.
            Some(true)
        }
    }

    #[cfg(test)]
    mod tests {
        use super::*;

        struct NewControl {
            deltas_are_words: bool,
            run_count: u8,
        }

        fn gen_control(control: NewControl) -> u8 {
            assert!(control.run_count > 0, "run count cannot be zero");

            let mut n = 0;
            if control.deltas_are_words {
                n |= 0x80;
            }
            n |= (control.run_count - 1) & 0x7F;
            n
        }

        #[test]
        fn empty() {
            let mut s = Stream::new(&[]);
            assert!(PackedPointsIter::new(&mut s).is_none());
        }

        #[test]
        fn single_zero_control() {
            let mut s = Stream::new(&[0]);
            assert!(PackedPointsIter::new(&mut s).unwrap().is_none());
        }

        #[test]
        fn single_point() {
            let data = vec![
                1, // total count
                gen_control(NewControl {
                    deltas_are_words: false,
                    run_count: 1,
                }),
                1,
            ];

            let points_iter = PackedPointsIter::new(&mut Stream::new(&data))
                .unwrap()
                .unwrap();
            let mut iter = SetPointsIter::new(points_iter);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), true); // Endlessly true.
        }

        #[test]
        fn set_0_and_2() {
            let data = vec![
                2, // total count
                gen_control(NewControl {
                    deltas_are_words: false,
                    run_count: 2,
                }),
                0,
                2,
            ];

            let points_iter = PackedPointsIter::new(&mut Stream::new(&data))
                .unwrap()
                .unwrap();
            let mut iter = SetPointsIter::new(points_iter);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), true); // Endlessly true.
        }

        #[test]
        fn set_1_and_2() {
            let data = vec![
                2, // total count
                gen_control(NewControl {
                    deltas_are_words: false,
                    run_count: 2,
                }),
                1,
                1,
            ];

            let points_iter = PackedPointsIter::new(&mut Stream::new(&data))
                .unwrap()
                .unwrap();
            let mut iter = SetPointsIter::new(points_iter);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), true); // Endlessly true.
        }

        #[test]
        fn set_1_and_3() {
            let data = vec![
                2, // total count
                gen_control(NewControl {
                    deltas_are_words: false,
                    run_count: 2,
                }),
                1,
                2,
            ];

            let points_iter = PackedPointsIter::new(&mut Stream::new(&data))
                .unwrap()
                .unwrap();
            let mut iter = SetPointsIter::new(points_iter);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), true); // Endlessly true.
        }

        #[test]
        fn set_2_5_7() {
            let data = vec![
                3, // total count
                gen_control(NewControl {
                    deltas_are_words: false,
                    run_count: 3,
                }),
                2,
                3,
                2,
            ];

            let points_iter = PackedPointsIter::new(&mut Stream::new(&data))
                .unwrap()
                .unwrap();
            let mut iter = SetPointsIter::new(points_iter);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), true); // Endlessly true.
        }

        #[test]
        fn more_than_127_points() {
            let mut data = vec![];
            // total count
            data.push(Control::POINTS_ARE_WORDS_FLAG);
            data.push(150);

            data.push(gen_control(NewControl {
                deltas_are_words: false,
                run_count: 100,
            }));
            for _ in 0..100 {
                data.push(2);
            }

            data.push(gen_control(NewControl {
                deltas_are_words: false,
                run_count: 50,
            }));
            for _ in 0..50 {
                data.push(2);
            }

            let points_iter = PackedPointsIter::new(&mut Stream::new(&data))
                .unwrap()
                .unwrap();
            let mut iter = SetPointsIter::new(points_iter);
            assert_eq!(iter.next().unwrap(), false);
            for _ in 0..150 {
                assert_eq!(iter.next().unwrap(), false);
                assert_eq!(iter.next().unwrap(), true);
            }
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), true); // Endlessly true.
        }

        #[test]
        fn long_points() {
            let data = vec![
                2, // total count
                gen_control(NewControl {
                    deltas_are_words: true,
                    run_count: 2,
                }),
                0,
                2,
                0,
                3,
            ];

            let points_iter = PackedPointsIter::new(&mut Stream::new(&data))
                .unwrap()
                .unwrap();
            let mut iter = SetPointsIter::new(points_iter);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), true); // Endlessly true.
        }

        #[test]
        fn multiple_runs() {
            let data = vec![
                5, // total count
                gen_control(NewControl {
                    deltas_are_words: true,
                    run_count: 2,
                }),
                0,
                2,
                0,
                3,
                gen_control(NewControl {
                    deltas_are_words: false,
                    run_count: 3,
                }),
                2,
                3,
                2,
            ];

            let points_iter = PackedPointsIter::new(&mut Stream::new(&data))
                .unwrap()
                .unwrap();
            let mut iter = SetPointsIter::new(points_iter);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), false);
            assert_eq!(iter.next().unwrap(), true);
            assert_eq!(iter.next().unwrap(), true); // Endlessly true.
        }

        #[test]
        fn runs_overflow() {
            // TrueType allows up to 32767 points.
            let data = vec![0xFF; 0xFFFF * 2];
            assert!(PackedPointsIter::new(&mut Stream::new(&data)).is_none());
        }
    }
}

use packed_points::*;

// https://docs.microsoft.com/en-us/typography/opentype/spec/otvarcommonformats#packed-deltas
mod packed_deltas {
    use crate::parser::Stream;

    struct Control(u8);

    impl Control {
        const DELTAS_ARE_ZERO_FLAG: u8 = 0x80;
        const DELTAS_ARE_WORDS_FLAG: u8 = 0x40;
        const DELTA_RUN_COUNT_MASK: u8 = 0x3F;

        #[inline]
        fn is_deltas_are_zero(&self) -> bool {
            self.0 & Self::DELTAS_ARE_ZERO_FLAG != 0
        }

        #[inline]
        fn is_deltas_are_words(&self) -> bool {
            self.0 & Self::DELTAS_ARE_WORDS_FLAG != 0
        }

        // 'Mask for the low 6 bits to provide the number of delta values in the run, minus one.'
        // So we have to add 1.
        // It will never overflow because of a mask.
        #[inline]
        fn run_count(&self) -> u8 {
            (self.0 & Self::DELTA_RUN_COUNT_MASK) + 1
        }
    }

    #[derive(Clone, Copy, PartialEq, Debug)]
    enum State {
        Control,
        ZeroDelta,
        ShortDelta,
        LongDelta,
    }

    impl Default for State {
        #[inline]
        fn default() -> Self {
            State::Control
        }
    }

    #[derive(Clone, Copy, Default)]
    struct RunState {
        data_offset: u16,
        state: State,
        run_deltas_left: u8,
    }

    impl RunState {
        fn next(&mut self, data: &[u8], scalar: f32) -> Option<f32> {
            if self.state == State::Control {
                if usize::from(self.data_offset) == data.len() {
                    return None;
                }

                let control = Control(Stream::read_at::<u8>(data, usize::from(self.data_offset))?);
                self.data_offset += 1;

                self.run_deltas_left = control.run_count();
                self.state = if control.is_deltas_are_zero() {
                    State::ZeroDelta
                } else if control.is_deltas_are_words() {
                    State::LongDelta
                } else {
                    State::ShortDelta
                };

                self.next(data, scalar)
            } else {
                let mut s = Stream::new_at(data, usize::from(self.data_offset))?;
                let delta = if self.state == State::LongDelta {
                    self.data_offset += 2;
                    f32::from(s.read::<i16>()?) * scalar
                } else if self.state == State::ZeroDelta {
                    0.0
                } else {
                    self.data_offset += 1;
                    f32::from(s.read::<i8>()?) * scalar
                };

                self.run_deltas_left -= 1;
                if self.run_deltas_left == 0 {
                    self.state = State::Control;
                }

                Some(delta)
            }
        }
    }

    // This structure will be used by the `VariationTuples` stack buffer,
    // so it has to be as small as possible.
    // Therefore we cannot use `Stream` and other abstractions.
    #[derive(Clone, Copy, Default)]
    pub struct PackedDeltasIter<'a> {
        data: &'a [u8],
        x_run: RunState,
        y_run: RunState,

        /// A total number of deltas per axis.
        ///
        /// Required only by restart()
        total_count: u16,

        scalar: f32,
    }

    impl<'a> PackedDeltasIter<'a> {
        /// `count` indicates a number of delta pairs.
        pub fn new(scalar: f32, count: u16, data: &'a [u8]) -> Self {
            debug_assert!(core::mem::size_of::<PackedDeltasIter>() <= 32);

            let mut iter = PackedDeltasIter {
                data,
                total_count: count,
                scalar,
                ..PackedDeltasIter::default()
            };

            // 'The packed deltas are arranged with all of the deltas for X coordinates first,
            // followed by the deltas for Y coordinates.'
            // So we have to skip X deltas in the Y deltas iterator.
            //
            // Note that Y deltas don't necessarily start with a Control byte
            // and can actually start in the middle of the X run.
            // So we can't simply split the input data in half
            // and process those chunks separately.
            for _ in 0..count {
                iter.y_run.next(data, scalar);
            }

            iter
        }

        #[inline]
        pub fn restart(self) -> Self {
            PackedDeltasIter::new(self.scalar, self.total_count, self.data)
        }

        #[inline]
        pub fn next(&mut self) -> Option<(f32, f32)> {
            let x = self.x_run.next(self.data, self.scalar)?;
            let y = self.y_run.next(self.data, self.scalar)?;
            Some((x, y))
        }
    }

    #[cfg(test)]
    mod tests {
        use super::*;

        struct NewControl {
            deltas_are_zero: bool,
            deltas_are_words: bool,
            run_count: u8,
        }

        fn gen_control(control: NewControl) -> u8 {
            assert!(control.run_count > 0, "run count cannot be zero");

            let mut n = 0;
            if control.deltas_are_zero {
                n |= 0x80;
            }
            if control.deltas_are_words {
                n |= 0x40;
            }
            n |= (control.run_count - 1) & 0x3F;
            n
        }

        #[test]
        fn empty() {
            let mut iter = PackedDeltasIter::new(1.0, 1, &[]);
            assert!(iter.next().is_none());
        }

        #[test]
        fn single_delta() {
            let data = vec![
                gen_control(NewControl {
                    deltas_are_zero: false,
                    deltas_are_words: false,
                    run_count: 2,
                }),
                2,
                3,
            ];

            let mut iter = PackedDeltasIter::new(1.0, 1, &data);
            assert_eq!(iter.next().unwrap(), (2.0, 3.0));
            assert!(iter.next().is_none());
        }

        #[test]
        fn two_deltas() {
            let data = vec![
                gen_control(NewControl {
                    deltas_are_zero: false,
                    deltas_are_words: false,
                    run_count: 4,
                }),
                2,
                3,
                4,
                5,
            ];

            let mut iter = PackedDeltasIter::new(1.0, 2, &data);
            // Remember that X deltas are defined first.
            assert_eq!(iter.next().unwrap(), (2.0, 4.0));
            assert_eq!(iter.next().unwrap(), (3.0, 5.0));
            assert!(iter.next().is_none());
        }

        #[test]
        fn single_long_delta() {
            let data = vec![
                gen_control(NewControl {
                    deltas_are_zero: false,
                    deltas_are_words: true,
                    run_count: 2,
                }),
                0,
                2,
                0,
                3,
            ];

            let mut iter = PackedDeltasIter::new(1.0, 1, &data);
            assert_eq!(iter.next().unwrap(), (2.0, 3.0));
            assert!(iter.next().is_none());
        }

        #[test]
        fn zeros() {
            let data = vec![gen_control(NewControl {
                deltas_are_zero: true,
                deltas_are_words: false,
                run_count: 4,
            })];

            let mut iter = PackedDeltasIter::new(1.0, 2, &data);
            assert_eq!(iter.next().unwrap(), (0.0, 0.0));
            assert_eq!(iter.next().unwrap(), (0.0, 0.0));
            assert!(iter.next().is_none());
        }

        #[test]
        fn zero_words() {
            // When `deltas_are_zero` is set, `deltas_are_words` should be ignored.
            let data = vec![gen_control(NewControl {
                deltas_are_zero: true,
                deltas_are_words: true,
                run_count: 4,
            })];

            let mut iter = PackedDeltasIter::new(1.0, 2, &data);
            assert_eq!(iter.next().unwrap(), (0.0, 0.0));
            assert_eq!(iter.next().unwrap(), (0.0, 0.0));
            assert!(iter.next().is_none());
        }

        #[test]
        fn zero_runs() {
            let data = vec![
                gen_control(NewControl {
                    deltas_are_zero: true,
                    deltas_are_words: false,
                    run_count: 2,
                }),
                gen_control(NewControl {
                    deltas_are_zero: true,
                    deltas_are_words: false,
                    run_count: 4,
                }),
                gen_control(NewControl {
                    deltas_are_zero: true,
                    deltas_are_words: false,
                    run_count: 6,
                }),
            ];

            let mut iter = PackedDeltasIter::new(1.0, 6, &data);
            // First run.
            assert_eq!(iter.next().unwrap(), (0.0, 0.0));
            // Second run.
            assert_eq!(iter.next().unwrap(), (0.0, 0.0));
            assert_eq!(iter.next().unwrap(), (0.0, 0.0));
            // Third run.
            assert_eq!(iter.next().unwrap(), (0.0, 0.0));
            assert_eq!(iter.next().unwrap(), (0.0, 0.0));
            assert_eq!(iter.next().unwrap(), (0.0, 0.0));
            assert!(iter.next().is_none());
        }

        #[test]
        fn delta_after_zeros() {
            let data = vec![
                gen_control(NewControl {
                    deltas_are_zero: true,
                    deltas_are_words: false,
                    run_count: 2,
                }),
                gen_control(NewControl {
                    deltas_are_zero: false,
                    deltas_are_words: false,
                    run_count: 2,
                }),
                2,
                3,
            ];

            let mut iter = PackedDeltasIter::new(1.0, 2, &data);
            assert_eq!(iter.next().unwrap(), (0.0, 2.0));
            assert_eq!(iter.next().unwrap(), (0.0, 3.0));
            assert!(iter.next().is_none());
        }

        #[test]
        fn unexpected_end_of_data_1() {
            let data = vec![gen_control(NewControl {
                deltas_are_zero: false,
                deltas_are_words: false,
                run_count: 2,
            })];

            let mut iter = PackedDeltasIter::new(1.0, 1, &data);
            assert!(iter.next().is_none());
        }

        #[test]
        fn unexpected_end_of_data_2() {
            // Only X is set.
            let data = vec![
                gen_control(NewControl {
                    deltas_are_zero: false,
                    deltas_are_words: false,
                    run_count: 2,
                }),
                1,
            ];

            let mut iter = PackedDeltasIter::new(1.0, 1, &data);
            assert!(iter.next().is_none());
        }

        #[test]
        fn unexpected_end_of_data_3() {
            let data = vec![gen_control(NewControl {
                deltas_are_zero: false,
                deltas_are_words: true,
                run_count: 2,
            })];

            let mut iter = PackedDeltasIter::new(1.0, 1, &data);
            assert!(iter.next().is_none());
        }

        #[test]
        fn unexpected_end_of_data_4() {
            // X data is too short.
            let data = vec![
                gen_control(NewControl {
                    deltas_are_zero: false,
                    deltas_are_words: true,
                    run_count: 2,
                }),
                1,
            ];

            let mut iter = PackedDeltasIter::new(1.0, 1, &data);
            assert!(iter.next().is_none());
        }

        #[test]
        fn unexpected_end_of_data_6() {
            // Only X is set.
            let data = vec![
                gen_control(NewControl {
                    deltas_are_zero: false,
                    deltas_are_words: true,
                    run_count: 2,
                }),
                0,
                1,
            ];

            let mut iter = PackedDeltasIter::new(1.0, 1, &data);
            assert!(iter.next().is_none());
        }

        #[test]
        fn unexpected_end_of_data_7() {
            // Y data is too short.
            let data = vec![
                gen_control(NewControl {
                    deltas_are_zero: false,
                    deltas_are_words: true,
                    run_count: 2,
                }),
                0,
                1,
                0,
            ];

            let mut iter = PackedDeltasIter::new(1.0, 1, &data);
            assert!(iter.next().is_none());
        }

        #[test]
        fn single_run() {
            let data = vec![
                gen_control(NewControl {
                    deltas_are_zero: false,
                    deltas_are_words: false,
                    run_count: 1,
                }),
                2,
                3,
            ];

            let mut iter = PackedDeltasIter::new(1.0, 1, &data);
            assert!(iter.next().is_none());
        }

        #[test]
        fn too_many_pairs() {
            let data = vec![
                gen_control(NewControl {
                    deltas_are_zero: false,
                    deltas_are_words: false,
                    run_count: 2,
                }),
                2,
                3,
            ];

            // We have only one pair, not 10.
            let mut iter = PackedDeltasIter::new(1.0, 10, &data);
            assert!(iter.next().is_none());
        }

        #[test]
        fn invalid_number_of_pairs() {
            let data = vec![
                gen_control(NewControl {
                    deltas_are_zero: false,
                    deltas_are_words: false,
                    run_count: 2,
                }),
                2,
                3,
                4,
                5,
                6,
                7,
            ];

            // We have 3 pairs, not 4.
            // We don't actually check this, since it will be very expensive.
            // And it should not happen in a well-formed font anyway.
            // So as long as it doesn't panic - we are fine.
            let mut iter = PackedDeltasIter::new(1.0, 4, &data);
            assert_eq!(iter.next().unwrap(), (2.0, 7.0));
            assert!(iter.next().is_none());
        }

        #[test]
        fn mixed_runs() {
            let data = vec![
                gen_control(NewControl {
                    deltas_are_zero: false,
                    deltas_are_words: false,
                    run_count: 3,
                }),
                2,
                3,
                4,
                gen_control(NewControl {
                    deltas_are_zero: false,
                    deltas_are_words: true,
                    run_count: 2,
                }),
                0,
                5,
                0,
                6,
                gen_control(NewControl {
                    deltas_are_zero: true,
                    deltas_are_words: false,
                    run_count: 1,
                }),
            ];

            let mut iter = PackedDeltasIter::new(1.0, 3, &data);
            assert_eq!(iter.next().unwrap(), (2.0, 5.0));
            assert_eq!(iter.next().unwrap(), (3.0, 6.0));
            assert_eq!(iter.next().unwrap(), (4.0, 0.0));
            assert!(iter.next().is_none());
        }

        #[test]
        fn non_default_scalar() {
            let data = vec![
                gen_control(NewControl {
                    deltas_are_zero: false,
                    deltas_are_words: false,
                    run_count: 2,
                }),
                2,
                3,
            ];

            let mut iter = PackedDeltasIter::new(0.5, 1, &data);
            assert_eq!(iter.next().unwrap(), (1.0, 1.5));
            assert!(iter.next().is_none());
        }

        #[test]
        fn runs_overflow() {
            let data = vec![0xFF; 0xFFFF];
            let mut iter = PackedDeltasIter::new(1.0, 0xFFFF, &data);
            // As long as it doesn't panic - we are fine.
            assert_eq!(iter.next().unwrap(), (0.0, 0.0));
        }
    }
}

use packed_deltas::PackedDeltasIter;

/// Infer unreferenced deltas.
///
/// A font can define deltas only for specific points, to reduce the file size.
/// In this case, we have to infer undefined/unreferenced deltas manually,
/// depending on the context.
///
/// This is already a pretty complex task, since deltas should be resolved
/// only inside the current contour (do not confuse with component).
/// And during resolving we can actually wrap around the contour.
/// So if there are no deltas after the current one, we have to use
/// the first delta of the current contour instead.
/// Same goes for the previous delta. If there are no deltas
/// before the current one, we have to use the last one in the current contour.
///
/// In the case of `ttf-parser`, everything becomes even more complex,
/// since we don't actually have a list of points and deltas, only iterators,
/// because of `ttf-parser`'s allocation-free policy.
/// This makes the code even more complicated.
///
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gvar#inferred-deltas-for-un-referenced-point-numbers
fn infer_deltas(
    tuple: &VariationTuple,
    points_set: SetPointsIter,
    // A points iterator that starts after the current point.
    points: glyf::GlyphPointsIter,
    // A points iterator that starts from the first point in the glyph.
    all_points: glyf::GlyphPointsIter,
    curr_point: glyf::GlyphPoint,
) -> (f32, f32) {
    let mut current_contour = points.current_contour();
    if curr_point.last_point && current_contour != 0 {
        // When we parsed the last point of a contour,
        // an iterator had switched to the next contour.
        // So we have to move to the previous one.
        current_contour -= 1;
    }

    let prev_point = if let Some(prev_point) = tuple.prev_point {
        // If a contour already had a delta - just use it.
        prev_point
    } else {
        // If not, find the last point with delta in the current contour.
        let mut last_point = None;
        let mut deltas = tuple.deltas.clone();
        for (point, is_set) in points.clone().zip(points_set.clone()) {
            if is_set {
                if let Some((x_delta, y_delta)) = deltas.next() {
                    last_point = Some(PointAndDelta {
                        x: point.x,
                        y: point.y,
                        x_delta,
                        y_delta,
                    });
                }
            }

            if point.last_point {
                break;
            }
        }

        // If there is no last point, there are no deltas.
        match last_point {
            Some(p) => p,
            None => return (0.0, 0.0),
        }
    };

    let mut next_point = None;
    if !curr_point.last_point {
        // If the current point is not the last one in the contour,
        // find the first set delta in the current contour.
        let mut deltas = tuple.deltas.clone();
        for (point, is_set) in points.clone().zip(points_set.clone()) {
            if is_set {
                if let Some((x_delta, y_delta)) = deltas.next() {
                    next_point = Some(PointAndDelta {
                        x: point.x,
                        y: point.y,
                        x_delta,
                        y_delta,
                    });
                }

                break;
            }

            if point.last_point {
                break;
            }
        }
    }

    if next_point.is_none() {
        // If there were no deltas after the current point,
        // restart from the start of the contour.
        //
        // This is probably the most expensive branch,
        // but nothing we can do about it since `glyf`/`gvar` data structure
        // doesn't allow implementing a reverse iterator.
        // So we have to parse everything once again.

        let mut all_points = all_points.clone();
        let mut deltas = tuple.deltas.clone().restart();
        let mut points_set = points_set.clone().restart();

        let mut contour = 0;
        while let (Some(point), Some(is_set)) = (all_points.next(), points_set.next()) {
            // First, we have to skip already processed contours.
            if contour != current_contour {
                if is_set {
                    let _ = deltas.next();
                }

                contour = all_points.current_contour();
                continue;
            }

            if is_set {
                let (x_delta, y_delta) = deltas.next().unwrap_or((0.0, 0.0));
                next_point = Some(PointAndDelta {
                    x: point.x,
                    y: point.y,
                    x_delta,
                    y_delta,
                });

                break;
            }

            if point.last_point {
                break;
            }
        }
    }

    // If there is no next point, there are no deltas.
    let next_point = match next_point {
        Some(p) => p,
        None => return (0.0, 0.0),
    };

    let dx = infer_delta(
        prev_point.x,
        curr_point.x,
        next_point.x,
        prev_point.x_delta,
        next_point.x_delta,
    );
    let dy = infer_delta(
        prev_point.y,
        curr_point.y,
        next_point.y,
        prev_point.y_delta,
        next_point.y_delta,
    );

    (dx, dy)
}

fn infer_delta(
    prev_point: i16,
    target_point: i16,
    next_point: i16,
    prev_delta: f32,
    next_delta: f32,
) -> f32 {
    if prev_point == next_point {
        if prev_delta == next_delta {
            prev_delta
        } else {
            0.0
        }
    } else if target_point <= prev_point.min(next_point) {
        if prev_point < next_point {
            prev_delta
        } else {
            next_delta
        }
    } else if target_point >= prev_point.max(next_point) {
        if prev_point > next_point {
            prev_delta
        } else {
            next_delta
        }
    } else {
        // 'Target point coordinate is between adjacent point coordinates.'
        //
        // 'Target point delta is derived from the adjacent point deltas
        // using linear interpolation.'
        let target_sub = target_point.checked_sub(prev_point);
        let next_sub = next_point.checked_sub(prev_point);
        let d = if let (Some(target_sub), Some(next_sub)) = (target_sub, next_sub) {
            f32::from(target_sub) / f32::from(next_sub)
        } else {
            return 0.0;
        };

        (1.0 - d) * prev_delta + d * next_delta
    }
}

/// A [Glyph Variations Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gvar).
#[derive(Clone, Copy)]
pub struct Table<'a> {
    axis_count: NonZeroU16,
    shared_tuple_records: LazyArray16<'a, F2DOT14>,
    offsets: GlyphVariationDataOffsets<'a>,
    glyphs_variation_data: &'a [u8],
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);

        let version = s.read::<u32>()?;
        if version != 0x00010000 {
            return None;
        }

        let axis_count = s.read::<u16>()?;
        let shared_tuple_count = s.read::<u16>()?;
        let shared_tuples_offset = s.read::<Offset32>()?;
        let glyph_count = s.read::<u16>()?;
        let flags = s.read::<u16>()?;
        let glyph_variation_data_array_offset = s.read::<Offset32>()?;

        // The axis count cannot be zero.
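The interpolation branch of `infer_delta` above can be illustrated in isolation. The following is a standalone sketch (not crate code) of just that branch, without the overflow checks, for the case where the target coordinate lies strictly between the two adjacent referenced coordinates:

```rust
// Linearly interpolate an inferred delta between two referenced neighbors.
// `prev`/`next` are the adjacent referenced point coordinates with deltas
// `prev_delta`/`next_delta`; `target` is the unreferenced point's coordinate.
fn lerp_delta(prev: i16, target: i16, next: i16, prev_delta: f32, next_delta: f32) -> f32 {
    let d = f32::from(target - prev) / f32::from(next - prev);
    (1.0 - d) * prev_delta + d * next_delta
}

fn main() {
    // A point halfway between its neighbors gets the average of their deltas.
    assert_eq!(lerp_delta(0, 5, 10, 10.0, 20.0), 15.0);
    // A quarter of the way: 0.75 * 10.0 + 0.25 * 20.0 = 12.5.
    assert_eq!(lerp_delta(0, 25, 100, 10.0, 20.0), 12.5);
}
```

The real `infer_delta` additionally clamps targets outside the `[prev, next]` range to the nearer neighbor's delta and guards the subtraction against `i16` overflow.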
        let axis_count = NonZeroU16::new(axis_count)?;

        let shared_tuple_records = {
            let mut sub_s = Stream::new_at(data, shared_tuples_offset.to_usize())?;
            sub_s.read_array16::<F2DOT14>(shared_tuple_count.checked_mul(axis_count.get())?)?
        };

        let glyphs_variation_data = data.get(glyph_variation_data_array_offset.to_usize()..)?;
        let offsets = {
            let offsets_count = glyph_count.checked_add(1)?;
            let is_long_format = flags & 1 == 1; // The first bit indicates a long format.
            if is_long_format {
                GlyphVariationDataOffsets::Long(s.read_array16::<Offset32>(offsets_count)?)
            } else {
                GlyphVariationDataOffsets::Short(s.read_array16::<Offset16>(offsets_count)?)
            }
        };

        Some(Table {
            axis_count,
            shared_tuple_records,
            offsets,
            glyphs_variation_data,
        })
    }

    #[inline]
    fn parse_variation_data(
        &self,
        glyph_id: GlyphId,
        coordinates: &[NormalizedCoordinate],
        points_len: u16,
        tuples: &mut VariationTuples<'a>,
    ) -> Option<()> {
        tuples.clear();

        if coordinates.len() != usize::from(self.axis_count.get()) {
            return None;
        }

        let next_glyph_id = glyph_id.0.checked_add(1)?;

        let (start, end) = match self.offsets {
            GlyphVariationDataOffsets::Short(ref array) => {
                // 'If the short format (Offset16) is used for offsets,
                // the value stored is the offset divided by 2.'
                (
                    array.get(glyph_id.0)?.to_usize() * 2,
                    array.get(next_glyph_id)?.to_usize() * 2,
                )
            }
            GlyphVariationDataOffsets::Long(ref array) => (
                array.get(glyph_id.0)?.to_usize(),
                array.get(next_glyph_id)?.to_usize(),
            ),
        };

        // Ignore empty data.
        if start == end {
            return Some(());
        }

        let data = self.glyphs_variation_data.get(start..end)?;
        parse_variation_data(
            coordinates,
            &self.shared_tuple_records,
            points_len,
            data,
            tuples,
        )
    }

    /// Outlines a glyph.
    pub fn outline(
        &self,
        glyf_table: glyf::Table,
        coordinates: &[NormalizedCoordinate],
        glyph_id: GlyphId,
        builder: &mut dyn OutlineBuilder,
    ) -> Option<Rect> {
        let mut b = glyf::Builder::new(Transform::default(), RectF::new(), builder);
        let glyph_data = glyf_table.get(glyph_id)?;
        outline_var_impl(
            glyf_table,
            self,
            glyph_id,
            glyph_data,
            coordinates,
            0,
            &mut b,
        );
        b.bbox.to_rect()
    }

    pub(crate) fn phantom_points(
        &self,
        glyf_table: glyf::Table,
        coordinates: &[NormalizedCoordinate],
        glyph_id: GlyphId,
    ) -> Option<PhantomPoints> {
        let outline_points = glyf_table.outline_points(glyph_id);
        let mut tuples = VariationTuples::default();
        self.parse_variation_data(glyph_id, coordinates, outline_points, &mut tuples)?;

        // Skip all outline deltas.
        for _ in 0..outline_points {
            tuples.apply_null()?;
        }

        Some(PhantomPoints {
            left: tuples.apply_null()?,
            right: tuples.apply_null()?,
            top: tuples.apply_null()?,
            bottom: tuples.apply_null()?,
        })
    }
}

impl core::fmt::Debug for Table<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Table {{ ... }}")
    }
}

#[allow(clippy::comparison_chain)]
fn outline_var_impl(
    glyf_table: glyf::Table,
    gvar_table: &Table,
    glyph_id: GlyphId,
    data: &[u8],
    coordinates: &[NormalizedCoordinate],
    depth: u8,
    builder: &mut glyf::Builder,
) -> Option<()> {
    if depth >= glyf::MAX_COMPONENTS {
        return None;
    }

    let mut s = Stream::new(data);
    let number_of_contours = s.read::<i16>()?;

    // Skip bbox.
    //
    // In case of a variable font, a bounding box defined in the `glyf` data
    // refers to the default variation values, which is not what we want.
    // Instead, we have to manually calculate the outline's bbox.
    s.advance(8);

    // TODO: This is the most expensive part. Find a way to allocate it only once.
    // `VariationTuples` is a very large struct, so allocate it once.
    let mut tuples = VariationTuples::default();

    if number_of_contours > 0 {
        // Simple glyph.
        let number_of_contours = NonZeroU16::new(number_of_contours as u16)?;
        let mut glyph_points = glyf::parse_simple_outline(s.tail()?, number_of_contours)?;
        let all_glyph_points = glyph_points.clone();
        let points_len = glyph_points.points_left;
        gvar_table.parse_variation_data(glyph_id, coordinates, points_len, &mut tuples)?;

        while let Some(point) = glyph_points.next() {
            let p = tuples.apply(all_glyph_points.clone(), glyph_points.clone(), point)?;
            builder.push_point(p.x, p.y, point.on_curve_point, point.last_point);
        }

        Some(())
    } else if number_of_contours < 0 {
        // Composite glyph.
        //
        // In case of a composite glyph, `gvar` data contains position adjustments
        // for each component.
        // Basically, an additional translation used during transformation.
        // So we have to push zero points manually, instead of parsing the `glyf` data.
        //
        // Details:
        // https://docs.microsoft.com/en-us/typography/opentype/spec/gvar#point-numbers-and-processing-for-composite-glyphs
        let components = glyf::CompositeGlyphIter::new(s.tail()?);
        let components_count = components.clone().count() as u16;
        gvar_table.parse_variation_data(glyph_id, coordinates, components_count, &mut tuples)?;

        for component in components {
            let t = tuples.apply_null()?;

            let mut transform = builder.transform;

            // The variation component offset should be applied only when
            // the ARGS_ARE_XY_VALUES flag is set.
            if component.flags.args_are_xy_values() {
                transform = Transform::combine(transform, Transform::new_translate(t.x, t.y));
            }

            transform = Transform::combine(transform, component.transform);

            let mut b = glyf::Builder::new(transform, builder.bbox, builder.builder);

            if let Some(glyph_data) = glyf_table.get(component.glyph_id) {
                outline_var_impl(
                    glyf_table,
                    gvar_table,
                    component.glyph_id,
                    glyph_data,
                    coordinates,
                    depth + 1,
                    &mut b,
                )?;

                // Take the updated bbox.
                builder.bbox = b.bbox;
            }
        }

        Some(())
    } else {
        // An empty glyph.
        None
    }
}

// https://docs.microsoft.com/en-us/typography/opentype/spec/otvarcommonformats#tuple-variation-store-header
fn parse_variation_data<'a>(
    coordinates: &[NormalizedCoordinate],
    shared_tuple_records: &LazyArray16<F2DOT14>,
    points_len: u16,
    data: &'a [u8],
    tuples: &mut VariationTuples<'a>,
) -> Option<()> {
    const SHARED_POINT_NUMBERS_FLAG: u16 = 0x8000;
    const COUNT_MASK: u16 = 0x0FFF;

    let mut main_stream = Stream::new(data);
    let tuple_variation_count = main_stream.read::<u16>()?;
    let data_offset = main_stream.read::<Offset16>()?;

    // 'The high 4 bits are flags, and the low 12 bits
    // are the number of tuple variation tables for this glyph.'
    let has_shared_point_numbers = tuple_variation_count & SHARED_POINT_NUMBERS_FLAG != 0;
    let tuple_variation_count = tuple_variation_count & COUNT_MASK;

    // 'The number of tuple variation tables can be any number between 1 and 4095.'
    // No need to check for 4095, because that is the 0x0FFF we masked with above.
    if tuple_variation_count == 0 {
        return None;
    }

    // Attempt to reserve space for the tuples we're about to parse.
    // If it fails, bail out.
    if !tuples.reserve(tuple_variation_count) {
        return None;
    }

    // Glyph variation data consists of three parts: header + variation tuples + serialized data.
    // Each tuple has its own chunk in the serialized data.
    // Because of that, we are using two parsing streams: one for tuples and one for serialized data.
    // So we can parse them in parallel and avoid needless allocations.
    let mut serialized_stream = Stream::new_at(data, data_offset.to_usize())?;

    // All tuples in the variation data can reference the same point numbers,
    // which are defined at the start of the serialized data.
    let mut shared_point_numbers = None;
    if has_shared_point_numbers {
        shared_point_numbers = PackedPointsIter::new(&mut serialized_stream)?;
    }

    parse_variation_tuples(
        tuple_variation_count,
        coordinates,
        shared_tuple_records,
        shared_point_numbers,
        points_len.checked_add(PHANTOM_POINTS_LEN as u16)?,
        main_stream,
        serialized_stream,
        tuples,
    )
}

ttf-parser-0.24.1/src/tables/head.rs

//! A [Font Header Table](
//! https://docs.microsoft.com/en-us/typography/opentype/spec/head) implementation.

use crate::parser::{Fixed, Stream};
use crate::Rect;

/// An index format used by the [Index to Location Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/loca).
#[allow(missing_docs)]
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
pub enum IndexToLocationFormat {
    Short,
    Long,
}

/// A [Font Header Table](https://docs.microsoft.com/en-us/typography/opentype/spec/head).
#[derive(Clone, Copy, Debug)]
pub struct Table {
    /// Units per EM.
    ///
    /// Guaranteed to be in the 16..=16384 range.
    pub units_per_em: u16,
    /// A bounding box large enough to enclose any glyph from the face.
    pub global_bbox: Rect,
    /// An index format used by the [Index to Location Table](
    /// https://docs.microsoft.com/en-us/typography/opentype/spec/loca).
    pub index_to_location_format: IndexToLocationFormat,
}

impl Table {
    /// Parses a table from raw data.
    pub fn parse(data: &[u8]) -> Option<Self> {
        // Do not check the exact length, because some fonts include
        // padding in the table's length in table records, which is incorrect.
        if data.len() < 54 {
            return None;
        }

        let mut s = Stream::new(data);
        s.skip::<u32>(); // version
        s.skip::<Fixed>(); // font revision
        s.skip::<u32>(); // checksum adjustment
        s.skip::<u32>(); // magic number
        s.skip::<u16>(); // flags
        let units_per_em = s.read::<u16>()?;
        s.skip::<u64>(); // created time
        s.skip::<u64>(); // modified time
        let x_min = s.read::<i16>()?;
        let y_min = s.read::<i16>()?;
        let x_max = s.read::<i16>()?;
        let y_max = s.read::<i16>()?;
        s.skip::<u16>(); // mac style
        s.skip::<u16>(); // lowest PPEM
        s.skip::<i16>(); // font direction hint
        let index_to_location_format = s.read::<u16>()?;

        if !(16..=16384).contains(&units_per_em) {
            return None;
        }

        let index_to_location_format = match index_to_location_format {
            0 => IndexToLocationFormat::Short,
            1 => IndexToLocationFormat::Long,
            _ => return None,
        };

        Some(Table {
            units_per_em,
            global_bbox: Rect {
                x_min,
                y_min,
                x_max,
                y_max,
            },
            index_to_location_format,
        })
    }
}

ttf-parser-0.24.1/src/tables/hhea.rs

//! A [Horizontal Header Table](
//! https://docs.microsoft.com/en-us/typography/opentype/spec/hhea) implementation.

use crate::parser::Stream;

/// A [Horizontal Header Table](https://docs.microsoft.com/en-us/typography/opentype/spec/hhea).
#[derive(Clone, Copy, Debug)]
pub struct Table {
    /// Face ascender.
    pub ascender: i16,
    /// Face descender.
    pub descender: i16,
    /// Face line gap.
    pub line_gap: i16,
    /// Number of metrics in the `hmtx` table.
    pub number_of_metrics: u16,
}

impl Table {
    /// Parses a table from raw data.
    pub fn parse(data: &[u8]) -> Option<Self> {
        // Do not check the exact length, because some fonts include
        // padding in the table's length in table records, which is incorrect.
        if data.len() < 36 {
            return None;
        }

        let mut s = Stream::new(data);
        s.skip::<u32>(); // version
        let ascender = s.read::<i16>()?;
        let descender = s.read::<i16>()?;
        let line_gap = s.read::<i16>()?;
        s.advance(24);
        let number_of_metrics = s.read::<u16>()?;

        Some(Table {
            ascender,
            descender,
            line_gap,
            number_of_metrics,
        })
    }
}

ttf-parser-0.24.1/src/tables/hmtx.rs

//! A [Horizontal/Vertical Metrics Table](
//! https://docs.microsoft.com/en-us/typography/opentype/spec/hmtx) implementation.

use core::num::NonZeroU16;

use crate::parser::{FromData, LazyArray16, Stream};
use crate::GlyphId;

/// Horizontal/Vertical Metrics.
#[derive(Clone, Copy, Debug)]
pub struct Metrics {
    /// Width/Height advance for `hmtx`/`vmtx`.
    pub advance: u16,
    /// Left/Top side bearing for `hmtx`/`vmtx`.
    pub side_bearing: i16,
}

impl FromData for Metrics {
    const SIZE: usize = 4;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(Metrics {
            advance: s.read::<u16>()?,
            side_bearing: s.read::<i16>()?,
        })
    }
}

/// A [Horizontal/Vertical Metrics Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/hmtx).
///
/// The `hmtx` and `vmtx` tables have the same structure, so we're reusing the same struct for both.
#[derive(Clone, Copy, Debug)]
pub struct Table<'a> {
    /// A list of metrics indexed by glyph ID.
    pub metrics: LazyArray16<'a, Metrics>,
    /// Side bearings for glyph IDs greater than or equal to the number of `metrics` values.
    pub bearings: LazyArray16<'a, i16>,
    /// Sum of long metrics + bearings.
    pub number_of_metrics: u16,
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    ///
    /// - `number_of_metrics` is from the `hhea`/`vhea` table.
    /// - `number_of_glyphs` is from the `maxp` table.
    pub fn parse(
        mut number_of_metrics: u16,
        number_of_glyphs: NonZeroU16,
        data: &'a [u8],
    ) -> Option<Self> {
        if number_of_metrics == 0 {
            return None;
        }

        let mut s = Stream::new(data);
        let metrics = s.read_array16::<Metrics>(number_of_metrics)?;

        // 'If the number_of_metrics is less than the total number of glyphs,
        // then that array is followed by an array for the left side bearing values
        // of the remaining glyphs.'
        let bearings_count = number_of_glyphs.get().checked_sub(number_of_metrics);
        let bearings = if let Some(count) = bearings_count {
            number_of_metrics += count;

            // Some malformed fonts can skip "left side bearing values"
            // even when they are expected.
            // Therefore, if we weren't able to parse them, simply fall back to an empty array.
            // No need to mark the whole table as malformed.
            s.read_array16::<i16>(count).unwrap_or_default()
        } else {
            LazyArray16::default()
        };

        Some(Table {
            metrics,
            bearings,
            number_of_metrics,
        })
    }

    /// Returns advance for a glyph.
    #[inline]
    pub fn advance(&self, glyph_id: GlyphId) -> Option<u16> {
        if glyph_id.0 >= self.number_of_metrics {
            return None;
        }

        if let Some(metrics) = self.metrics.get(glyph_id.0) {
            Some(metrics.advance)
        } else {
            // 'As an optimization, the number of records can be less than the number of glyphs,
            // in which case the advance value of the last record applies
            // to all remaining glyph IDs.'
            self.metrics.last().map(|m| m.advance)
        }
    }

    /// Returns side bearing for a glyph.
    #[inline]
    pub fn side_bearing(&self, glyph_id: GlyphId) -> Option<i16> {
        if let Some(metrics) = self.metrics.get(glyph_id.0) {
            Some(metrics.side_bearing)
        } else {
            // 'If the number_of_metrics is less than the total number of glyphs,
            // then that array is followed by an array for the side bearing values
            // of the remaining glyphs.'
            self.bearings
                .get(glyph_id.0.checked_sub(self.metrics.len())?)
        }
    }
}

ttf-parser-0.24.1/src/tables/hvar.rs

//! A [Horizontal Metrics Variations Table](
//! https://docs.microsoft.com/en-us/typography/opentype/spec/hvar) implementation.

use crate::delta_set::DeltaSetIndexMap;
use crate::parser::{Offset, Offset32, Stream};
use crate::var_store::ItemVariationStore;
use crate::{GlyphId, NormalizedCoordinate};

/// A [Horizontal Metrics Variations Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/hvar).
#[derive(Clone, Copy)]
pub struct Table<'a> {
    data: &'a [u8],
    variation_store: ItemVariationStore<'a>,
    advance_width_mapping_offset: Option<Offset32>,
    lsb_mapping_offset: Option<Offset32>,
    rsb_mapping_offset: Option<Offset32>,
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);

        let version = s.read::<u32>()?;
        if version != 0x00010000 {
            return None;
        }

        let variation_store_offset = s.read::<Offset32>()?;
        let var_store_s = Stream::new_at(data, variation_store_offset.to_usize())?;
        let variation_store = ItemVariationStore::parse(var_store_s)?;

        Some(Table {
            data,
            variation_store,
            advance_width_mapping_offset: s.read::<Option<Offset32>>()?,
            lsb_mapping_offset: s.read::<Option<Offset32>>()?,
            rsb_mapping_offset: s.read::<Option<Offset32>>()?,
        })
    }

    /// Returns the advance width offset for a glyph.
    #[inline]
    pub fn advance_offset(
        &self,
        glyph_id: GlyphId,
        coordinates: &[NormalizedCoordinate],
    ) -> Option<f32> {
        let (outer_idx, inner_idx) = if let Some(offset) = self.advance_width_mapping_offset {
            DeltaSetIndexMap::new(self.data.get(offset.to_usize()..)?).map(glyph_id.0 as u32)?
        } else {
            // 'If there is no delta-set index mapping table for advance widths,
            // then glyph IDs implicitly provide the indices:
            // for a given glyph ID, the delta-set outer-level index is zero,
            // and the glyph ID is the delta-set inner-level index.'
            (0, glyph_id.0)
        };

        self.variation_store
            .parse_delta(outer_idx, inner_idx, coordinates)
    }

    /// Returns the left side bearing offset for a glyph.
    #[inline]
    pub fn left_side_bearing_offset(
        &self,
        glyph_id: GlyphId,
        coordinates: &[NormalizedCoordinate],
    ) -> Option<f32> {
        let set_data = self.data.get(self.lsb_mapping_offset?.to_usize()..)?;
        self.side_bearing_offset(glyph_id, coordinates, set_data)
    }

    /// Returns the right side bearing offset for a glyph.
    #[inline]
    pub fn right_side_bearing_offset(
        &self,
        glyph_id: GlyphId,
        coordinates: &[NormalizedCoordinate],
    ) -> Option<f32> {
        let set_data = self.data.get(self.rsb_mapping_offset?.to_usize()..)?;
        self.side_bearing_offset(glyph_id, coordinates, set_data)
    }

    fn side_bearing_offset(
        &self,
        glyph_id: GlyphId,
        coordinates: &[NormalizedCoordinate],
        set_data: &[u8],
    ) -> Option<f32> {
        let (outer_idx, inner_idx) = DeltaSetIndexMap::new(set_data).map(glyph_id.0 as u32)?;
        self.variation_store
            .parse_delta(outer_idx, inner_idx, coordinates)
    }
}

impl core::fmt::Debug for Table<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Table {{ ... }}")
    }
}

ttf-parser-0.24.1/src/tables/kern.rs

/*!
A [Kerning Table](https://docs.microsoft.com/en-us/typography/opentype/spec/kern) implementation.

Supports both [OpenType](https://docs.microsoft.com/en-us/typography/opentype/spec/kern)
and [Apple Advanced Typography](https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6kern.html)
variants.

Since there is no single correct way to process kerning data, we have to provide access
to kerning subtables, so a caller can implement a kerning algorithm manually.
But we still try to keep the API as high-level as possible.
*/

#[cfg(feature = "apple-layout")]
use crate::aat;
use crate::parser::{FromData, LazyArray16, NumFrom, Offset, Offset16, Stream};
use crate::GlyphId;

#[derive(Clone, Copy, Debug)]
struct OTCoverage(u8);

#[rustfmt::skip]
impl OTCoverage {
    #[inline] fn is_horizontal(self) -> bool { self.0 & (1 << 0) != 0 }
    #[inline] fn has_cross_stream(self) -> bool { self.0 & (1 << 2) != 0 }
}

impl FromData for OTCoverage {
    const SIZE: usize = 1;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        data.get(0).copied().map(OTCoverage)
    }
}

#[derive(Clone, Copy, Debug)]
struct AATCoverage(u8);

#[rustfmt::skip]
impl AATCoverage {
    #[inline] fn is_horizontal(self) -> bool { self.0 & (1 << 7) == 0 }
    #[inline] fn has_cross_stream(self) -> bool { self.0 & (1 << 6) != 0 }
    #[inline] fn is_variable(self) -> bool { self.0 & (1 << 5) != 0 }
}

impl FromData for AATCoverage {
    const SIZE: usize = 1;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        data.get(0).copied().map(AATCoverage)
    }
}

/// A kerning pair.
#[derive(Clone, Copy, Debug)]
pub struct KerningPair {
    /// Glyphs pair.
    ///
    /// In the kern table spec, a kerning pair is stored as two u16,
    /// but we are using one u32, so we can binary search it directly.
    pub pair: u32,
    /// Kerning value.
    pub value: i16,
}

impl KerningPair {
    /// Returns the left glyph ID.
    #[inline]
    pub fn left(&self) -> GlyphId {
        GlyphId((self.pair >> 16) as u16)
    }

    /// Returns the right glyph ID.
    #[inline]
    pub fn right(&self) -> GlyphId {
        GlyphId(self.pair as u16)
    }
}

impl FromData for KerningPair {
    const SIZE: usize = 6;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(KerningPair {
            pair: s.read::<u32>()?,
            value: s.read::<i16>()?,
        })
    }
}

/// A kerning subtable format.
#[allow(missing_docs)]
#[derive(Clone, Debug)]
pub enum Format<'a> {
    Format0(Subtable0<'a>),
    #[cfg(feature = "apple-layout")]
    Format1(aat::StateTable<'a>),
    #[cfg(not(feature = "apple-layout"))]
    Format1,
    Format2(Subtable2<'a>),
    Format3(Subtable3<'a>),
}

/// A kerning subtable.
#[derive(Clone, Debug)]
pub struct Subtable<'a> {
    /// Indicates that the subtable is for horizontal text.
    pub horizontal: bool,
    /// Indicates that the subtable is variable.
    pub variable: bool,
    /// Indicates that the subtable has cross-stream values.
    pub has_cross_stream: bool,
    /// Indicates that the subtable uses a state machine.
    ///
    /// In this case `glyphs_kerning()` will return `None`.
    pub has_state_machine: bool,
    /// Subtable format.
    pub format: Format<'a>,
}

impl<'a> Subtable<'a> {
    /// Returns kerning for a pair of glyphs.
    ///
    /// Returns `None` in case of a state machine based subtable.
    #[inline]
    pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
        match self.format {
            Format::Format0(ref subtable) => subtable.glyphs_kerning(left, right),
            Format::Format2(ref subtable) => subtable.glyphs_kerning(left, right),
            Format::Format3(ref subtable) => subtable.glyphs_kerning(left, right),
            _ => None,
        }
    }
}

/// A list of subtables.
///
/// The internal data layout is not designed for random access,
/// therefore we provide only an iterator and not a `get()` method.
#[derive(Clone, Copy)]
pub struct Subtables<'a> {
    /// Indicates an Apple Advanced Typography format.
    is_aat: bool,
    /// The total number of tables.
    count: u32,
    /// Actual data. Starts right after the `kern` header.
    data: &'a [u8],
}

impl<'a> Subtables<'a> {
    /// Returns the number of subtables.
    pub fn len(&self) -> u32 {
        self.count
    }

    /// Checks if there are any subtables.
    pub fn is_empty(&self) -> bool {
        self.count == 0
    }
}

impl core::fmt::Debug for Subtables<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Subtables {{ ... }}")
    }
}

impl<'a> IntoIterator for Subtables<'a> {
    type Item = Subtable<'a>;
    type IntoIter = SubtablesIter<'a>;

    #[inline]
    fn into_iter(self) -> Self::IntoIter {
        SubtablesIter {
            is_aat: self.is_aat,
            table_index: 0,
            number_of_tables: self.count,
            stream: Stream::new(self.data),
        }
    }
}

/// An iterator over kerning subtables.
#[allow(missing_debug_implementations)]
#[derive(Clone, Default)]
pub struct SubtablesIter<'a> {
    /// Indicates an Apple Advanced Typography format.
    is_aat: bool,
    /// The current table index.
    table_index: u32,
    /// The total number of tables.
    number_of_tables: u32,
    /// Actual data. Starts right after the `kern` header.
    stream: Stream<'a>,
}

impl<'a> Iterator for SubtablesIter<'a> {
    type Item = Subtable<'a>;

    fn next(&mut self) -> Option<Self::Item> {
        if self.table_index == self.number_of_tables {
            return None;
        }

        if self.stream.at_end() {
            return None;
        }

        if self.is_aat {
            const HEADER_SIZE: u8 = 8;

            let table_len = self.stream.read::<u32>()?;
            let coverage = self.stream.read::<AATCoverage>()?;
            let format_id = self.stream.read::<u8>()?;
            self.stream.skip::<u16>(); // variation tuple index

            if format_id > 3 {
                // Unknown format.
                return None;
            }

            // Subtract the header size.
            let data_len = usize::num_from(table_len).checked_sub(usize::from(HEADER_SIZE))?;
            let data = self.stream.read_bytes(data_len)?;

            let format = match format_id {
                0 => Format::Format0(Subtable0::parse(data)?),
                #[cfg(feature = "apple-layout")]
                1 => Format::Format1(aat::StateTable::parse(data)?),
                #[cfg(not(feature = "apple-layout"))]
                1 => Format::Format1,
                2 => Format::Format2(Subtable2::parse(HEADER_SIZE, data)?),
                3 => Format::Format3(Subtable3::parse(data)?),
                _ => return None,
            };

            Some(Subtable {
                horizontal: coverage.is_horizontal(),
                variable: coverage.is_variable(),
                has_cross_stream: coverage.has_cross_stream(),
                has_state_machine: format_id == 1,
                format,
            })
        } else {
            const HEADER_SIZE: u8 = 6;

            self.stream.skip::<u16>(); // version
            let table_len = self.stream.read::<u16>()?;
            // In the OpenType variant, `format` comes first.
            let format_id = self.stream.read::<u8>()?;
            let coverage = self.stream.read::<OTCoverage>()?;

            if format_id != 0 && format_id != 2 {
                // Unknown format.
                return None;
            }

            let data_len = if self.number_of_tables == 1 {
                // An OpenType `kern` table with just one subtable is a special case.
                // The `table_len` property is mainly required to jump to the next subtable,
                // but if there is only one subtable, this property can be ignored.
                // This is abused by some fonts to get around the `u16` size limit.
                self.stream.tail()?.len()
            } else {
                // Subtract the header size.
                usize::from(table_len).checked_sub(usize::from(HEADER_SIZE))?
            };

            let data = self.stream.read_bytes(data_len)?;

            let format = match format_id {
                0 => Format::Format0(Subtable0::parse(data)?),
                2 => Format::Format2(Subtable2::parse(HEADER_SIZE, data)?),
                _ => return None,
            };

            Some(Subtable {
                horizontal: coverage.is_horizontal(),
                variable: false, // Only AAT supports it.
                has_cross_stream: coverage.has_cross_stream(),
                has_state_machine: format_id == 1,
                format,
            })
        }
    }
}

/// A format 0 subtable.
///
/// Ordered List of Kerning Pairs.
#[derive(Clone, Copy, Debug)]
pub struct Subtable0<'a> {
    /// A list of kerning pairs.
    pub pairs: LazyArray16<'a, KerningPair>,
}

impl<'a> Subtable0<'a> {
    /// Parses a subtable from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let number_of_pairs = s.read::<u16>()?;
        s.advance(6); // search_range (u16) + entry_selector (u16) + range_shift (u16)
        let pairs = s.read_array16::<KerningPair>(number_of_pairs)?;
        Some(Self { pairs })
    }

    /// Returns kerning for a pair of glyphs.
    #[inline]
    pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
        let needle = u32::from(left.0) << 16 | u32::from(right.0);
        self.pairs
            .binary_search_by(|v| v.pair.cmp(&needle))
            .map(|(_, v)| v.value)
    }
}

/// A format 2 subtable.
///
/// Simple n x m Array of Kerning Values.
#[derive(Clone, Copy, Debug)]
pub struct Subtable2<'a> {
    // TODO: parse actual structure
    data: &'a [u8],
    header_len: u8,
}

impl<'a> Subtable2<'a> {
    /// Parses a subtable from raw data.
    pub fn parse(header_len: u8, data: &'a [u8]) -> Option<Self> {
        Some(Self { header_len, data })
    }

    /// Returns kerning for a pair of glyphs.
    pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
        let mut s = Stream::new(self.data);
        s.skip::<u16>(); // row_width

        // Offsets are from the beginning of the subtable and not from the `data` start,
        // so we have to subtract the header.
        let header_len = usize::from(self.header_len);
        let left_hand_table_offset = s.read::<Offset16>()?.to_usize().checked_sub(header_len)?;
        let right_hand_table_offset = s.read::<Offset16>()?.to_usize().checked_sub(header_len)?;
        let array_offset = s.read::<Offset16>()?.to_usize().checked_sub(header_len)?;

        // 'The array can be indexed by completing the left-hand and right-hand class mappings,
        // adding the class values to the address of the subtable,
        // and fetching the kerning value to which the new address points.'

        let left_class =
            get_format2_class(left.0, left_hand_table_offset, self.data).unwrap_or(0);
        let right_class =
            get_format2_class(right.0, right_hand_table_offset, self.data).unwrap_or(0);

        // 'Values within the left-hand offset table should not be less than the kerning array offset.'
        if usize::from(left_class) < array_offset {
            return None;
        }

        // Classes are already premultiplied, so we only need to sum them.
        let index = usize::from(left_class) + usize::from(right_class);
        let value_offset = index.checked_sub(header_len)?;
        Stream::read_at::<i16>(self.data, value_offset)
    }
}

pub(crate) fn get_format2_class(glyph_id: u16, offset: usize, data: &[u8]) -> Option<u16> {
    let mut s = Stream::new_at(data, offset)?;
    let first_glyph = s.read::<u16>()?;
    let index = glyph_id.checked_sub(first_glyph)?;

    let number_of_classes = s.read::<u16>()?;
    let classes = s.read_array16::<u16>(number_of_classes)?;
    classes.get(index)
}

/// A format 3 subtable.
///
/// Simple n x m Array of Kerning Indices.
#[derive(Clone, Copy, Debug)]
pub struct Subtable3<'a> {
    // TODO: parse actual structure
    data: &'a [u8],
}

impl<'a> Subtable3<'a> {
    /// Parses a subtable from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        Some(Self { data })
    }

    /// Returns kerning for a pair of glyphs.
    #[inline]
    pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
        let mut s = Stream::new(self.data);
        let glyph_count = s.read::<u16>()?;
        let kerning_values_count = s.read::<u8>()?;
        let left_hand_classes_count = s.read::<u8>()?;
        let right_hand_classes_count = s.read::<u8>()?;
        s.skip::<u8>(); // reserved
        let indices_count = u16::from(left_hand_classes_count) * u16::from(right_hand_classes_count);

        let kerning_values = s.read_array16::<i16>(u16::from(kerning_values_count))?;
        let left_hand_classes = s.read_array16::<u8>(glyph_count)?;
        let right_hand_classes = s.read_array16::<u8>(glyph_count)?;
        let indices = s.read_array16::<u8>(indices_count)?;

        let left_class = left_hand_classes.get(left.0)?;
        let right_class = right_hand_classes.get(right.0)?;

        if left_class > left_hand_classes_count || right_class > right_hand_classes_count {
            return None;
        }

        let index =
            u16::from(left_class) * u16::from(right_hand_classes_count) + u16::from(right_class);
        let index = indices.get(index)?;
        kerning_values.get(u16::from(index))
    }
}

/// A [Kerning Table](https://docs.microsoft.com/en-us/typography/opentype/spec/kern).
#[derive(Clone, Copy, Debug)]
pub struct Table<'a> {
    /// A list of subtables.
    pub subtables: Subtables<'a>,
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        // The `kern` table has two variants: OpenType and Apple.
        // And they both have different headers.
        // There is no robust way to distinguish them, so we have to guess.
        //
        // The OpenType one has the first two bytes (UInt16) as a version set to 0,
        // while the Apple one has the first four bytes (Fixed) set to 1.0.
        // So the first two bytes will be 0x0000 in case of an OpenType format
        // and 0x0001 in case of an Apple format.
        let mut s = Stream::new(data);
        let version = s.read::<u16>()?;
        let subtables = if version == 0 {
            let count = s.read::<u16>()?;
            Subtables {
                is_aat: false,
                count: u32::from(count),
                data: s.tail()?,
            }
        } else {
            s.skip::<u16>(); // Skip the second part of the u32 version.
            // Note that AAT stores the number of tables as u32 and not as u16.
            let count = s.read::<u32>()?;
            Subtables {
                is_aat: true,
                count,
                data: s.tail()?,
            }
        };

        Some(Self { subtables })
    }
}

ttf-parser-0.24.1/src/tables/kerx.rs

//! An [Extended Kerning Table](
//! https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6kerx.html) implementation.

// TODO: find a way to test this table
// This table is basically untested because it uses Apple's State Tables
// and I have no idea how to generate them.

use core::num::NonZeroU16;

use crate::kern::KerningPair;
use crate::parser::{FromData, LazyArray32, NumFrom, Offset, Offset32, Stream};
use crate::{aat, GlyphId};

const HEADER_SIZE: usize = 12;

/// A format 0 subtable.
///
/// Ordered List of Kerning Pairs.
///
/// The same as in `kern`, but uses `LazyArray32` instead of `LazyArray16`.
#[derive(Clone, Copy, Debug)]
pub struct Subtable0<'a> {
    /// A list of kerning pairs.
    pub pairs: LazyArray32<'a, KerningPair>,
}

impl<'a> Subtable0<'a> {
    /// Parses a subtable from raw data.
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let number_of_pairs = s.read::<u32>()?;
        s.advance(12); // search_range (u32) + entry_selector (u32) + range_shift (u32)
        let pairs = s.read_array32::<KerningPair>(number_of_pairs)?;
        Some(Self { pairs })
    }

    /// Returns kerning for a pair of glyphs.
    #[inline]
    pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
        let needle = u32::from(left.0) << 16 | u32::from(right.0);
        self.pairs
            .binary_search_by(|v| v.pair.cmp(&needle))
            .map(|(_, v)| v.value)
    }
}

/// A state machine entry.
#[derive(Clone, Copy, Debug)]
pub struct EntryData {
    /// An action index.
    pub action_index: u16,
}

impl FromData for EntryData {
    const SIZE: usize = 2;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(EntryData {
            action_index: s.read::<u16>()?,
        })
    }
}

/// A format 1 subtable.
///
/// State Table for Contextual Kerning.
#[derive(Clone)]
pub struct Subtable1<'a> {
    /// A state table.
    pub state_table: aat::ExtendedStateTable<'a, EntryData>,
    actions_data: &'a [u8],
}

impl<'a> Subtable1<'a> {
    fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let state_table = aat::ExtendedStateTable::parse(number_of_glyphs, &mut s)?;

        // The actions offset is right after the state table.
        let actions_offset = s.read::<Offset32>()?;
        // The actions offset is from the start of the state table
        // and not from the start of the subtable.
        // And since we don't know the length of the actions data,
        // simply store all the data after the offset.
        let actions_data = data.get(actions_offset.to_usize()..)?;

        Some(Subtable1 {
            state_table,
            actions_data,
        })
    }

    /// Returns kerning at action index.
    #[inline]
    pub fn glyphs_kerning(&self, action_index: u16) -> Option<i16> {
        Stream::read_at(self.actions_data, usize::from(action_index) * i16::SIZE)
    }
}

impl<'a> core::ops::Deref for Subtable1<'a> {
    type Target = aat::ExtendedStateTable<'a, EntryData>;

    fn deref(&self) -> &Self::Target {
        &self.state_table
    }
}

impl core::fmt::Debug for Subtable1<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Subtable1 {{ ... }}")
    }
}

/// A format 2 subtable.
///
/// Simple n x m Array of Kerning Values.
///
/// The same as in `kern`, but uses 32-bit offsets instead of 16-bit ones.
#[derive(Clone, Copy)]
pub struct Subtable2<'a>(&'a [u8]); // TODO: parse actual structure

impl<'a> Subtable2<'a> {
    /// Returns kerning for a pair of glyphs.
    pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
        let mut s = Stream::new(self.0);
        s.skip::<u32>(); // row_width

        // Offsets are from the beginning of the subtable and not from the `data` start,
        // so we have to subtract the header.
        let left_hand_table_offset = s.read::<Offset32>()?.to_usize().checked_sub(HEADER_SIZE)?;
        let right_hand_table_offset = s.read::<Offset32>()?.to_usize().checked_sub(HEADER_SIZE)?;
        let array_offset = s.read::<Offset32>()?.to_usize().checked_sub(HEADER_SIZE)?;

        // 'The array can be indexed by completing the left-hand and right-hand class mappings,
        // adding the class values to the address of the subtable,
        // and fetching the kerning value to which the new address points.'

        let left_class =
            crate::kern::get_format2_class(left.0, left_hand_table_offset, self.0).unwrap_or(0);
        let right_class =
            crate::kern::get_format2_class(right.0, right_hand_table_offset, self.0).unwrap_or(0);

        // 'Values within the left-hand offset table should not be less than the kerning array offset.'
        if usize::from(left_class) < array_offset {
            return None;
        }

        // Classes are already premultiplied, so we only need to sum them.
        let index = usize::from(left_class) + usize::from(right_class);
        let value_offset = index.checked_sub(HEADER_SIZE)?;
        Stream::read_at::<i16>(self.0, value_offset)
    }
}

impl core::fmt::Debug for Subtable2<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Subtable2 {{ ... }}")
    }
}

/// A container of Anchor Points used by [`Subtable4`].
#[derive(Clone, Copy)]
pub struct AnchorPoints<'a>(&'a [u8]);

impl AnchorPoints<'_> {
    /// Returns the mark and current anchor points at action index.
    pub fn get(&self, action_index: u16) -> Option<(u16, u16)> {
        // Each action contains two 16-bit fields, so we must
        // double the action_index to get the correct offset here.
        let offset = usize::from(action_index) * u16::SIZE * 2;
        let mut s = Stream::new_at(self.0, offset)?;
        Some((s.read::<u16>()?, s.read::<u16>()?))
    }
}

impl core::fmt::Debug for AnchorPoints<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "AnchorPoints {{ ... }}")
    }
}

/// A format 4 subtable.
///
/// State Table for Control Point/Anchor Point Positioning.
///
/// Note: I wasn't able to find any fonts that actually use
/// `ControlPointActions` and/or `ControlPointCoordinateActions`,
/// therefore only `AnchorPointActions` is supported.
#[derive(Clone)]
pub struct Subtable4<'a> {
    /// A state table.
    pub state_table: aat::ExtendedStateTable<'a, EntryData>,
    /// Anchor points.
    pub anchor_points: AnchorPoints<'a>,
}

impl<'a> Subtable4<'a> {
    fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let state_table = aat::ExtendedStateTable::parse(number_of_glyphs, &mut s)?;
        let flags = s.read::<u32>()?;
        let action_type = ((flags & 0xC0000000) >> 30) as u8;
        let points_offset = usize::num_from(flags & 0x00FFFFFF);

        // We support only Anchor Point Actions.
        if action_type != 1 {
            return None;
        }

        Some(Self {
            state_table,
            anchor_points: AnchorPoints(data.get(points_offset..)?),
        })
    }
}

impl<'a> core::ops::Deref for Subtable4<'a> {
    type Target = aat::ExtendedStateTable<'a, EntryData>;

    fn deref(&self) -> &Self::Target {
        &self.state_table
    }
}

impl core::fmt::Debug for Subtable4<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Subtable4 {{ ... }}")
    }
}

/// A format 6 subtable.
///
/// Simple Index-based n x m Array of Kerning Values.
#[derive(Clone, Copy)]
pub struct Subtable6<'a> {
    data: &'a [u8],
    number_of_glyphs: NonZeroU16,
}

impl<'a> Subtable6<'a> {
    // TODO: parse actual structure
    fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Self {
        Subtable6 {
            number_of_glyphs,
            data,
        }
    }

    /// Returns kerning for a pair of glyphs.
    pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
        use core::convert::TryFrom;

        let mut s = Stream::new(self.data);
        let flags = s.read::<u32>()?;
        s.skip::<u16>(); // row_count
        s.skip::<u16>(); // col_count
        // All offsets are from the start of the subtable.
        let row_index_table_offset = s.read::<Offset32>()?.to_usize().checked_sub(HEADER_SIZE)?;
        let column_index_table_offset = s.read::<Offset32>()?.to_usize().checked_sub(HEADER_SIZE)?;
        let kerning_array_offset = s.read::<Offset32>()?.to_usize().checked_sub(HEADER_SIZE)?;
        let kerning_vector_offset = s.read::<Offset32>()?.to_usize().checked_sub(HEADER_SIZE)?;

        let row_index_table_data = self.data.get(row_index_table_offset..)?;
        let column_index_table_data = self.data.get(column_index_table_offset..)?;
        let kerning_array_data = self.data.get(kerning_array_offset..)?;
        let kerning_vector_data = self.data.get(kerning_vector_offset..)?;

        let has_long_values = flags & 0x00000001 != 0;
        if has_long_values {
            let l: u32 = aat::Lookup::parse(self.number_of_glyphs, row_index_table_data)?
                .value(left)
                .unwrap_or(0) as u32;

            let r: u32 = aat::Lookup::parse(self.number_of_glyphs, column_index_table_data)?
                .value(right)
                .unwrap_or(0) as u32;

            let array_offset = usize::try_from(l + r).ok()?.checked_mul(i32::SIZE)?;
            let vector_offset: u32 = Stream::read_at(kerning_array_data, array_offset)?;

            Stream::read_at(kerning_vector_data, usize::num_from(vector_offset))
        } else {
            let l: u16 = aat::Lookup::parse(self.number_of_glyphs, row_index_table_data)?
                .value(left)
                .unwrap_or(0);

            let r: u16 = aat::Lookup::parse(self.number_of_glyphs, column_index_table_data)?
                .value(right)
                .unwrap_or(0);

            let array_offset = usize::from(l + r).checked_mul(i16::SIZE)?;
            let vector_offset: u16 = Stream::read_at(kerning_array_data, array_offset)?;

            Stream::read_at(kerning_vector_data, usize::from(vector_offset))
        }
    }
}

impl core::fmt::Debug for Subtable6<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Subtable6 {{ ... }}")
    }
}

/// An extended kerning subtable format.
#[allow(missing_docs)]
#[derive(Clone, Debug)]
pub enum Format<'a> {
    Format0(Subtable0<'a>),
    Format1(Subtable1<'a>),
    Format2(Subtable2<'a>),
    Format4(Subtable4<'a>),
    Format6(Subtable6<'a>),
}

/// A kerning subtable.
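Format 6 performs a two-level lookup: a row class and a column class are summed into an index into a kerning *array*, whose entry is a byte offset into a kerning *vector* holding the actual values. A self-contained sketch of that indirection with plain slices standing in for the big-endian font tables (illustrative only, not the library's API):

```rust
/// Resolves a format 6-style kerning value: `kerning_array` holds byte
/// offsets into `kerning_vector`; the classes select the array entry.
pub fn format6_kerning(
    row_class: u16,
    column_class: u16,
    kerning_array: &[u16],  // byte offsets into `kerning_vector`
    kerning_vector: &[i16], // the actual kerning values
) -> Option<i16> {
    // Guard against u16 overflow, just as the real code uses try_from/checked_mul.
    let index = usize::from(row_class.checked_add(column_class)?);
    let byte_offset = usize::from(*kerning_array.get(index)?);
    // The stored offset is in bytes; each value is 2 bytes wide.
    kerning_vector.get(byte_offset / 2).copied()
}
```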
#[derive(Clone, Debug)]
pub struct Subtable<'a> {
    /// Indicates that subtable is for horizontal text.
    pub horizontal: bool,
    /// Indicates that subtable is variable.
    pub variable: bool,
    /// Indicates that subtable has cross-stream values.
    pub has_cross_stream: bool,
    /// Indicates that subtable uses a state machine.
    ///
    /// In this case `glyphs_kerning()` will return `None`.
    pub has_state_machine: bool,
    /// The tuple count.
    ///
    /// This value is only used with variation fonts and should be 0 for all other fonts.
    pub tuple_count: u32,
    /// Subtable format.
    pub format: Format<'a>,
}

impl<'a> Subtable<'a> {
    /// Returns kerning for a pair of glyphs.
    ///
    /// Returns `None` in case of state machine based subtable.
    #[inline]
    pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
        match self.format {
            Format::Format0(ref subtable) => subtable.glyphs_kerning(left, right),
            Format::Format1(_) => None,
            Format::Format2(ref subtable) => subtable.glyphs_kerning(left, right),
            Format::Format4(_) => None,
            Format::Format6(ref subtable) => subtable.glyphs_kerning(left, right),
        }
    }
}

#[derive(Clone, Copy, Debug)]
struct Coverage(u8);

#[rustfmt::skip]
impl Coverage {
    // TODO: use hex
    #[inline] pub fn is_horizontal(self) -> bool { self.0 & (1 << 7) == 0 }
    #[inline] pub fn has_cross_stream(self) -> bool { self.0 & (1 << 6) != 0 }
    #[inline] pub fn is_variable(self) -> bool { self.0 & (1 << 5) != 0 }
}

/// A list of extended kerning subtables.
///
/// The internal data layout is not designed for random access,
/// therefore we're not providing the `get()` method and only an iterator.
#[derive(Clone, Copy)]
pub struct Subtables<'a> {
    /// The number of glyphs from the `maxp` table.
    number_of_glyphs: NonZeroU16,
    /// The total number of tables.
    number_of_tables: u32,
    /// Actual data. Starts right after the `kerx` header.
    data: &'a [u8],
}

impl core::fmt::Debug for Subtables<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Subtables {{ ... }}")
    }
}

impl<'a> IntoIterator for Subtables<'a> {
    type Item = Subtable<'a>;
    type IntoIter = SubtablesIter<'a>;

    #[inline]
    fn into_iter(self) -> Self::IntoIter {
        SubtablesIter {
            number_of_glyphs: self.number_of_glyphs,
            table_index: 0,
            number_of_tables: self.number_of_tables,
            stream: Stream::new(self.data),
        }
    }
}

/// An iterator over extended kerning subtables.
#[allow(missing_debug_implementations)]
#[derive(Clone)]
pub struct SubtablesIter<'a> {
    /// The number of glyphs from the `maxp` table.
    number_of_glyphs: NonZeroU16,
    /// The current table index.
    table_index: u32,
    /// The total number of tables.
    number_of_tables: u32,
    /// Actual data. Starts right after the `kerx` header.
    stream: Stream<'a>,
}

impl<'a> Iterator for SubtablesIter<'a> {
    type Item = Subtable<'a>;

    fn next(&mut self) -> Option<Self::Item> {
        if self.table_index == self.number_of_tables {
            return None;
        }

        if self.stream.at_end() {
            return None;
        }

        let s = &mut self.stream;
        let table_len = s.read::<u32>()?;
        let coverage = Coverage(s.read::<u8>()?);
        s.skip::<u16>(); // unused
        let raw_format = s.read::<u8>()?;
        let tuple_count = s.read::<u32>()?;

        // Subtract the header size.
        let data_len = usize::num_from(table_len).checked_sub(HEADER_SIZE)?;
        let data = s.read_bytes(data_len)?;

        let format = match raw_format {
            0 => Subtable0::parse(data).map(Format::Format0)?,
            1 => Subtable1::parse(self.number_of_glyphs, data).map(Format::Format1)?,
            2 => Format::Format2(Subtable2(data)),
            4 => Subtable4::parse(self.number_of_glyphs, data).map(Format::Format4)?,
            6 => Format::Format6(Subtable6::parse(self.number_of_glyphs, data)),
            _ => {
                // Unknown format.
                return None;
            }
        };

        self.table_index += 1;

        Some(Subtable {
            horizontal: coverage.is_horizontal(),
            variable: coverage.is_variable(),
            has_cross_stream: coverage.has_cross_stream(),
            has_state_machine: raw_format == 1 || raw_format == 4,
            tuple_count,
            format,
        })
    }
}

/// An [Extended Kerning Table](
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6kerx.html).
#[derive(Clone, Copy, Debug)]
pub struct Table<'a> {
    /// A list of subtables.
    pub subtables: Subtables<'a>,
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    ///
    /// `number_of_glyphs` is from the `maxp` table.
    pub fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        s.skip::<u16>(); // version
        s.skip::<u16>(); // padding
        let number_of_tables = s.read::<u32>()?;
        let subtables = Subtables {
            number_of_glyphs,
            number_of_tables,
            data: s.tail()?,
        };

        Some(Table { subtables })
    }
}
ttf-parser-0.24.1/src/tables/loca.rs
//! An [Index to Location Table](https://docs.microsoft.com/en-us/typography/opentype/spec/loca)
//! implementation.

use core::convert::TryFrom;
use core::num::NonZeroU16;
use core::ops::Range;

use crate::parser::{LazyArray16, NumFrom, Stream};
use crate::{GlyphId, IndexToLocationFormat};

/// An [Index to Location Table](https://docs.microsoft.com/en-us/typography/opentype/spec/loca).
#[derive(Clone, Copy, Debug)]
pub enum Table<'a> {
    /// Short offsets.
    Short(LazyArray16<'a, u16>),
    /// Long offsets.
    Long(LazyArray16<'a, u32>),
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    ///
    /// - `number_of_glyphs` is from the `maxp` table.
    /// - `format` is from the `head` table.
    pub fn parse(
        number_of_glyphs: NonZeroU16,
        format: IndexToLocationFormat,
        data: &'a [u8],
    ) -> Option<Self> {
        // The number of ranges is `maxp.numGlyphs + 1`.
        //
        // Check for overflow first.
        let mut total = if number_of_glyphs.get() == u16::MAX {
            number_of_glyphs.get()
        } else {
            number_of_glyphs.get() + 1
        };

        // By the spec, the number of `loca` offsets is `maxp.numGlyphs + 1`.
        // But some malformed fonts can have fewer glyphs than that.
        // In that case we try to parse only the available offsets
        // and do not return an error, since the expected data length
        // would go beyond the table's length.
        //
        // In case when `loca` has more data than needed we simply ignore the rest.
        let actual_total = match format {
            IndexToLocationFormat::Short => data.len() / 2,
            IndexToLocationFormat::Long => data.len() / 4,
        };
        let actual_total = u16::try_from(actual_total).ok()?;
        total = total.min(actual_total);

        let mut s = Stream::new(data);
        match format {
            IndexToLocationFormat::Short => Some(Table::Short(s.read_array16::<u16>(total)?)),
            IndexToLocationFormat::Long => Some(Table::Long(s.read_array16::<u32>(total)?)),
        }
    }

    /// Returns the number of offsets.
    #[inline]
    pub fn len(&self) -> u16 {
        match self {
            Table::Short(ref array) => array.len(),
            Table::Long(ref array) => array.len(),
        }
    }

    /// Checks if there are any offsets.
    pub fn is_empty(&self) -> bool {
        self.len() == 0
    }

    /// Returns glyph's range in the `glyf` table.
    #[inline]
    pub fn glyph_range(&self, glyph_id: GlyphId) -> Option<Range<usize>> {
        let glyph_id = glyph_id.0;
        if glyph_id == u16::MAX {
            return None;
        }

        // Glyph ID must be smaller than total number of values in a `loca` array.
        if glyph_id + 1 >= self.len() {
            return None;
        }

        let range = match self {
            Table::Short(ref array) => {
                // 'The actual local offset divided by 2 is stored.'
                usize::from(array.get(glyph_id)?) * 2..usize::from(array.get(glyph_id + 1)?) * 2
            }
            Table::Long(ref array) => {
                usize::num_from(array.get(glyph_id)?)..usize::num_from(array.get(glyph_id + 1)?)
            }
        };

        if range.start >= range.end {
            // 'The offsets must be in ascending order.'
            // And range cannot be empty.
            None
        } else {
            Some(range)
        }
    }
}
ttf-parser-0.24.1/src/tables/math.rs
//! A [Math Table](https://docs.microsoft.com/en-us/typography/opentype/spec/math) implementation.

use crate::gpos::Device;
use crate::opentype_layout::Coverage;
use crate::parser::{
    FromData, FromSlice, LazyArray16, LazyOffsetArray16, Offset, Offset16, Stream,
};
use crate::GlyphId;

/// A [Math Value](https://docs.microsoft.com/en-us/typography/opentype/spec/math#mathvaluerecord)
/// with optional device corrections.
#[derive(Clone, Copy, Debug)]
pub struct MathValue<'a> {
    /// The X or Y value in font design units.
    pub value: i16,
    /// Device corrections for this value.
    pub device: Option<Device<'a>>,
}

impl<'a> MathValue<'a> {
    fn parse(data: &'a [u8], parent: &'a [u8]) -> Option<Self> {
        Some(MathValueRecord::parse(data)?.get(parent))
    }
}

/// A math value record with unresolved offset.
#[derive(Clone, Copy)]
struct MathValueRecord {
    value: i16,
    device_offset: Option<Offset16>,
}

impl FromData for MathValueRecord {
    const SIZE: usize = 4;

    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let value = s.read::<i16>()?;
        let device_offset = s.read::<Option<Offset16>>()?;
        Some(MathValueRecord {
            value,
            device_offset,
        })
    }
}

impl MathValueRecord {
    fn get(self, data: &[u8]) -> MathValue {
        let device = self
            .device_offset
            .and_then(|offset| data.get(offset.to_usize()..))
            .and_then(Device::parse);
        MathValue {
            value: self.value,
            device,
        }
    }
}

/// A mapping from glyphs to
/// [Math Values](https://docs.microsoft.com/en-us/typography/opentype/spec/math#mathvaluerecord).
#[derive(Clone, Copy)]
pub struct MathValues<'a> {
    data: &'a [u8],
    coverage: Coverage<'a>,
    records: LazyArray16<'a, MathValueRecord>,
}

impl<'a> FromSlice<'a> for MathValues<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let coverage = s.parse_at_offset16::<Coverage>(data)?;
        let count = s.read::<u16>()?;
        let records = s.read_array16::<MathValueRecord>(count)?;
        Some(MathValues {
            data,
            coverage,
            records,
        })
    }
}

impl<'a> MathValues<'a> {
    /// Returns the value for the glyph or `None` if it is not covered.
    #[inline]
    pub fn get(&self, glyph: GlyphId) -> Option<MathValue<'a>> {
        let index = self.coverage.get(glyph)?;
        Some(self.records.get(index)?.get(self.data))
    }
}

impl core::fmt::Debug for MathValues<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "MathValues {{ ... }}")
    }
}

/// A [Math Constants Table](https://learn.microsoft.com/en-us/typography/opentype/spec/math#mathconstants-table).
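A MATH value record is four bytes: a 16-bit signed value in design units followed by a 16-bit offset to an optional device table, where an offset of 0 encodes "no device table". A standalone sketch of that decoding step with plain big-endian byte handling (illustrative, assumed layout mirroring the record parsed above; the real code resolves the offset against the parent subtable):

```rust
/// Decodes (value, optional device-table offset) from 4 big-endian bytes.
pub fn parse_math_value_record(data: &[u8]) -> Option<(i16, Option<u16>)> {
    let value = i16::from_be_bytes([*data.get(0)?, *data.get(1)?]);
    let raw_offset = u16::from_be_bytes([*data.get(2)?, *data.get(3)?]);
    // A zero offset is the spec's way of encoding a missing device table.
    let device_offset = if raw_offset == 0 { None } else { Some(raw_offset) };
    Some((value, device_offset))
}
```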
#[derive(Clone, Copy)]
pub struct Constants<'a> {
    data: &'a [u8],
}

impl<'a> FromSlice<'a> for Constants<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        Some(Constants { data })
    }
}

impl core::fmt::Debug for Constants<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Constants {{ ... }}")
    }
}

const SCRIPT_PERCENT_SCALE_DOWN_OFFSET: usize = 0;
const SCRIPT_SCRIPT_PERCENT_SCALE_DOWN_OFFSET: usize = 2;
const DELIMITED_SUB_FORMULA_MIN_HEIGHT_OFFSET: usize = 4;
const DISPLAY_OPERATOR_MIN_HEIGHT_OFFSET: usize = 6;
const MATH_LEADING_OFFSET: usize = 8;
const AXIS_HEIGHT_OFFSET: usize = 12;
const ACCENT_BASE_HEIGHT_OFFSET: usize = 16;
const FLATTENED_ACCENT_BASE_HEIGHT_OFFSET: usize = 20;
const SUBSCRIPT_SHIFT_DOWN_OFFSET: usize = 24;
const SUBSCRIPT_TOP_MAX_OFFSET: usize = 28;
const SUBSCRIPT_BASELINE_DROP_MIN_OFFSET: usize = 32;
const SUPERSCRIPT_SHIFT_UP_OFFSET: usize = 36;
const SUPERSCRIPT_SHIFT_UP_CRAMPED_OFFSET: usize = 40;
const SUPERSCRIPT_BOTTOM_MIN_OFFSET: usize = 44;
const SUPERSCRIPT_BASELINE_DROP_MAX_OFFSET: usize = 48;
const SUB_SUPERSCRIPT_GAP_MIN_OFFSET: usize = 52;
const SUPERSCRIPT_BOTTOM_MAX_WITH_SUBSCRIPT_OFFSET: usize = 56;
const SPACE_AFTER_SCRIPT_OFFSET: usize = 60;
const UPPER_LIMIT_GAP_MIN_OFFSET: usize = 64;
const UPPER_LIMIT_BASELINE_RISE_MIN_OFFSET: usize = 68;
const LOWER_LIMIT_GAP_MIN_OFFSET: usize = 72;
const LOWER_LIMIT_BASELINE_DROP_MIN_OFFSET: usize = 76;
const STACK_TOP_SHIFT_UP_OFFSET: usize = 80;
const STACK_TOP_DISPLAY_STYLE_SHIFT_UP_OFFSET: usize = 84;
const STACK_BOTTOM_SHIFT_DOWN_OFFSET: usize = 88;
const STACK_BOTTOM_DISPLAY_STYLE_SHIFT_DOWN_OFFSET: usize = 92;
const STACK_GAP_MIN_OFFSET: usize = 96;
const STACK_DISPLAY_STYLE_GAP_MIN_OFFSET: usize = 100;
const STRETCH_STACK_TOP_SHIFT_UP_OFFSET: usize = 104;
const STRETCH_STACK_BOTTOM_SHIFT_DOWN_OFFSET: usize = 108;
const STRETCH_STACK_GAP_ABOVE_MIN_OFFSET: usize = 112;
const STRETCH_STACK_GAP_BELOW_MIN_OFFSET: usize = 116;
const FRACTION_NUMERATOR_SHIFT_UP_OFFSET: usize = 120;
const FRACTION_NUMERATOR_DISPLAY_STYLE_SHIFT_UP_OFFSET: usize = 124;
const FRACTION_DENOMINATOR_SHIFT_DOWN_OFFSET: usize = 128;
const FRACTION_DENOMINATOR_DISPLAY_STYLE_SHIFT_DOWN_OFFSET: usize = 132;
const FRACTION_NUMERATOR_GAP_MIN_OFFSET: usize = 136;
const FRACTION_NUM_DISPLAY_STYLE_GAP_MIN_OFFSET: usize = 140;
const FRACTION_RULE_THICKNESS_OFFSET: usize = 144;
const FRACTION_DENOMINATOR_GAP_MIN_OFFSET: usize = 148;
const FRACTION_DENOM_DISPLAY_STYLE_GAP_MIN_OFFSET: usize = 152;
const SKEWED_FRACTION_HORIZONTAL_GAP_OFFSET: usize = 156;
const SKEWED_FRACTION_VERTICAL_GAP_OFFSET: usize = 160;
const OVERBAR_VERTICAL_GAP_OFFSET: usize = 164;
const OVERBAR_RULE_THICKNESS_OFFSET: usize = 168;
const OVERBAR_EXTRA_ASCENDER_OFFSET: usize = 172;
const UNDERBAR_VERTICAL_GAP_OFFSET: usize = 176;
const UNDERBAR_RULE_THICKNESS_OFFSET: usize = 180;
const UNDERBAR_EXTRA_DESCENDER_OFFSET: usize = 184;
const RADICAL_VERTICAL_GAP_OFFSET: usize = 188;
const RADICAL_DISPLAY_STYLE_VERTICAL_GAP_OFFSET: usize = 192;
const RADICAL_RULE_THICKNESS_OFFSET: usize = 196;
const RADICAL_EXTRA_ASCENDER_OFFSET: usize = 200;
const RADICAL_KERN_BEFORE_DEGREE_OFFSET: usize = 204;
const RADICAL_KERN_AFTER_DEGREE_OFFSET: usize = 208;
const RADICAL_DEGREE_BOTTOM_RAISE_PERCENT_OFFSET: usize = 212;

impl<'a> Constants<'a> {
    /// Percentage of scaling down for level 1 superscripts and subscripts.
    #[inline]
    pub fn script_percent_scale_down(&self) -> i16 {
        self.read_i16(SCRIPT_PERCENT_SCALE_DOWN_OFFSET)
    }

    /// Percentage of scaling down for level 2 (scriptScript) superscripts and subscripts.
    #[inline]
    pub fn script_script_percent_scale_down(&self) -> i16 {
        self.read_i16(SCRIPT_SCRIPT_PERCENT_SCALE_DOWN_OFFSET)
    }

    /// Minimum height required for a delimited expression (contained within parentheses, etc.) to
    /// be treated as a sub-formula.
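Every constant accessor below is a fixed-offset read into the MATHConstants subtable, with a fallback of 0 when the table is too short rather than a hard failure. A minimal standalone sketch of that pattern (illustrative only; the real code goes through the crate's `Stream::read_at`):

```rust
/// Reads a big-endian i16 at a fixed byte offset, falling back to 0
/// when the data is truncated, as the constants accessors do.
pub fn read_i16_at(data: &[u8], offset: usize) -> i16 {
    data.get(offset..offset + 2)
        .map(|b| i16::from_be_bytes([b[0], b[1]]))
        .unwrap_or(0)
}
```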
    #[inline]
    pub fn delimited_sub_formula_min_height(&self) -> u16 {
        self.read_u16(DELIMITED_SUB_FORMULA_MIN_HEIGHT_OFFSET)
    }

    /// Minimum height of n-ary operators (such as integral and summation) for formulas in display
    /// mode (that is, appearing as standalone page elements, not embedded inline within text).
    #[inline]
    pub fn display_operator_min_height(&self) -> u16 {
        self.read_u16(DISPLAY_OPERATOR_MIN_HEIGHT_OFFSET)
    }

    /// White space to be left between math formulas to ensure proper line spacing.
    #[inline]
    pub fn math_leading(&self) -> MathValue<'a> {
        self.read_record(MATH_LEADING_OFFSET)
    }

    /// Axis height of the font.
    #[inline]
    pub fn axis_height(&self) -> MathValue<'a> {
        self.read_record(AXIS_HEIGHT_OFFSET)
    }

    /// Maximum (ink) height of accent base that does not require raising the accents.
    #[inline]
    pub fn accent_base_height(&self) -> MathValue<'a> {
        self.read_record(ACCENT_BASE_HEIGHT_OFFSET)
    }

    /// Maximum (ink) height of accent base that does not require flattening the accents.
    #[inline]
    pub fn flattened_accent_base_height(&self) -> MathValue<'a> {
        self.read_record(FLATTENED_ACCENT_BASE_HEIGHT_OFFSET)
    }

    /// The standard shift down applied to subscript elements.
    #[inline]
    pub fn subscript_shift_down(&self) -> MathValue<'a> {
        self.read_record(SUBSCRIPT_SHIFT_DOWN_OFFSET)
    }

    /// Maximum allowed height of the (ink) top of subscripts that does not require moving
    /// subscripts further down.
    #[inline]
    pub fn subscript_top_max(&self) -> MathValue<'a> {
        self.read_record(SUBSCRIPT_TOP_MAX_OFFSET)
    }

    /// Minimum allowed drop of the baseline of subscripts relative to the (ink) bottom of the
    /// base.
    #[inline]
    pub fn subscript_baseline_drop_min(&self) -> MathValue<'a> {
        self.read_record(SUBSCRIPT_BASELINE_DROP_MIN_OFFSET)
    }

    /// Standard shift up applied to superscript elements.
    #[inline]
    pub fn superscript_shift_up(&self) -> MathValue<'a> {
        self.read_record(SUPERSCRIPT_SHIFT_UP_OFFSET)
    }

    /// Standard shift of superscripts relative to the base, in cramped style.
    #[inline]
    pub fn superscript_shift_up_cramped(&self) -> MathValue<'a> {
        self.read_record(SUPERSCRIPT_SHIFT_UP_CRAMPED_OFFSET)
    }

    /// Minimum allowed height of the (ink) bottom of superscripts that does not require moving
    /// subscripts further up.
    #[inline]
    pub fn superscript_bottom_min(&self) -> MathValue<'a> {
        self.read_record(SUPERSCRIPT_BOTTOM_MIN_OFFSET)
    }

    /// Maximum allowed drop of the baseline of superscripts relative to the (ink) top of the
    /// base.
    #[inline]
    pub fn superscript_baseline_drop_max(&self) -> MathValue<'a> {
        self.read_record(SUPERSCRIPT_BASELINE_DROP_MAX_OFFSET)
    }

    /// Minimum gap between the superscript and subscript ink.
    #[inline]
    pub fn sub_superscript_gap_min(&self) -> MathValue<'a> {
        self.read_record(SUB_SUPERSCRIPT_GAP_MIN_OFFSET)
    }

    /// The maximum level to which the (ink) bottom of superscript can be pushed to increase the
    /// gap between superscript and subscript, before subscript starts being moved down.
    #[inline]
    pub fn superscript_bottom_max_with_subscript(&self) -> MathValue<'a> {
        self.read_record(SUPERSCRIPT_BOTTOM_MAX_WITH_SUBSCRIPT_OFFSET)
    }

    /// Extra white space to be added after each subscript and superscript.
    #[inline]
    pub fn space_after_script(&self) -> MathValue<'a> {
        self.read_record(SPACE_AFTER_SCRIPT_OFFSET)
    }

    /// Minimum gap between the (ink) bottom of the upper limit, and the (ink) top of the base
    /// operator.
    #[inline]
    pub fn upper_limit_gap_min(&self) -> MathValue<'a> {
        self.read_record(UPPER_LIMIT_GAP_MIN_OFFSET)
    }

    /// Minimum distance between baseline of upper limit and (ink) top of the base operator.
    #[inline]
    pub fn upper_limit_baseline_rise_min(&self) -> MathValue<'a> {
        self.read_record(UPPER_LIMIT_BASELINE_RISE_MIN_OFFSET)
    }

    /// Minimum gap between (ink) top of the lower limit, and (ink) bottom of the base operator.
    #[inline]
    pub fn lower_limit_gap_min(&self) -> MathValue<'a> {
        self.read_record(LOWER_LIMIT_GAP_MIN_OFFSET)
    }

    /// Minimum distance between baseline of the lower limit and (ink) bottom of the base operator.
    #[inline]
    pub fn lower_limit_baseline_drop_min(&self) -> MathValue<'a> {
        self.read_record(LOWER_LIMIT_BASELINE_DROP_MIN_OFFSET)
    }

    /// Standard shift up applied to the top element of a stack.
    #[inline]
    pub fn stack_top_shift_up(&self) -> MathValue<'a> {
        self.read_record(STACK_TOP_SHIFT_UP_OFFSET)
    }

    /// Standard shift up applied to the top element of a stack in display style.
    #[inline]
    pub fn stack_top_display_style_shift_up(&self) -> MathValue<'a> {
        self.read_record(STACK_TOP_DISPLAY_STYLE_SHIFT_UP_OFFSET)
    }

    /// Standard shift down applied to the bottom element of a stack.
    #[inline]
    pub fn stack_bottom_shift_down(&self) -> MathValue<'a> {
        self.read_record(STACK_BOTTOM_SHIFT_DOWN_OFFSET)
    }

    /// Standard shift down applied to the bottom element of a stack in display style.
    #[inline]
    pub fn stack_bottom_display_style_shift_down(&self) -> MathValue<'a> {
        self.read_record(STACK_BOTTOM_DISPLAY_STYLE_SHIFT_DOWN_OFFSET)
    }

    /// Minimum gap between (ink) bottom of the top element of a stack, and the (ink) top of the
    /// bottom element.
    #[inline]
    pub fn stack_gap_min(&self) -> MathValue<'a> {
        self.read_record(STACK_GAP_MIN_OFFSET)
    }

    /// Minimum gap between (ink) bottom of the top element of a stack, and the (ink) top of the
    /// bottom element in display style.
    #[inline]
    pub fn stack_display_style_gap_min(&self) -> MathValue<'a> {
        self.read_record(STACK_DISPLAY_STYLE_GAP_MIN_OFFSET)
    }

    /// Standard shift up applied to the top element of the stretch stack.
    #[inline]
    pub fn stretch_stack_top_shift_up(&self) -> MathValue<'a> {
        self.read_record(STRETCH_STACK_TOP_SHIFT_UP_OFFSET)
    }

    /// Standard shift down applied to the bottom element of the stretch stack.
    #[inline]
    pub fn stretch_stack_bottom_shift_down(&self) -> MathValue<'a> {
        self.read_record(STRETCH_STACK_BOTTOM_SHIFT_DOWN_OFFSET)
    }

    /// Minimum gap between the ink of the stretched element, and the (ink) bottom of the element above.
    #[inline]
    pub fn stretch_stack_gap_above_min(&self) -> MathValue<'a> {
        self.read_record(STRETCH_STACK_GAP_ABOVE_MIN_OFFSET)
    }

    /// Minimum gap between the ink of the stretched element, and the (ink) top of the element below.
    #[inline]
    pub fn stretch_stack_gap_below_min(&self) -> MathValue<'a> {
        self.read_record(STRETCH_STACK_GAP_BELOW_MIN_OFFSET)
    }

    /// Standard shift up applied to the numerator.
    #[inline]
    pub fn fraction_numerator_shift_up(&self) -> MathValue<'a> {
        self.read_record(FRACTION_NUMERATOR_SHIFT_UP_OFFSET)
    }

    /// Standard shift up applied to the numerator in display style.
    #[inline]
    pub fn fraction_numerator_display_style_shift_up(&self) -> MathValue<'a> {
        self.read_record(FRACTION_NUMERATOR_DISPLAY_STYLE_SHIFT_UP_OFFSET)
    }

    /// Standard shift down applied to the denominator.
    #[inline]
    pub fn fraction_denominator_shift_down(&self) -> MathValue<'a> {
        self.read_record(FRACTION_DENOMINATOR_SHIFT_DOWN_OFFSET)
    }

    /// Standard shift down applied to the denominator in display style.
    #[inline]
    pub fn fraction_denominator_display_style_shift_down(&self) -> MathValue<'a> {
        self.read_record(FRACTION_DENOMINATOR_DISPLAY_STYLE_SHIFT_DOWN_OFFSET)
    }

    /// Minimum tolerated gap between the (ink) bottom of the numerator and the ink of the
    /// fraction bar.
    #[inline]
    pub fn fraction_numerator_gap_min(&self) -> MathValue<'a> {
        self.read_record(FRACTION_NUMERATOR_GAP_MIN_OFFSET)
    }

    /// Minimum tolerated gap between the (ink) bottom of the numerator and the ink of the
    /// fraction bar in display style.
    #[inline]
    pub fn fraction_num_display_style_gap_min(&self) -> MathValue<'a> {
        self.read_record(FRACTION_NUM_DISPLAY_STYLE_GAP_MIN_OFFSET)
    }

    /// Thickness of the fraction bar.
    #[inline]
    pub fn fraction_rule_thickness(&self) -> MathValue<'a> {
        self.read_record(FRACTION_RULE_THICKNESS_OFFSET)
    }

    /// Minimum tolerated gap between the (ink) top of the denominator and the ink of the fraction bar.
    #[inline]
    pub fn fraction_denominator_gap_min(&self) -> MathValue<'a> {
        self.read_record(FRACTION_DENOMINATOR_GAP_MIN_OFFSET)
    }

    /// Minimum tolerated gap between the (ink) top of the denominator and the ink of the fraction
    /// bar in display style.
    #[inline]
    pub fn fraction_denom_display_style_gap_min(&self) -> MathValue<'a> {
        self.read_record(FRACTION_DENOM_DISPLAY_STYLE_GAP_MIN_OFFSET)
    }

    /// Horizontal distance between the top and bottom elements of a skewed fraction.
    #[inline]
    pub fn skewed_fraction_horizontal_gap(&self) -> MathValue<'a> {
        self.read_record(SKEWED_FRACTION_HORIZONTAL_GAP_OFFSET)
    }

    /// Vertical distance between the ink of the top and bottom elements of a skewed fraction.
    #[inline]
    pub fn skewed_fraction_vertical_gap(&self) -> MathValue<'a> {
        self.read_record(SKEWED_FRACTION_VERTICAL_GAP_OFFSET)
    }

    /// Distance between the overbar and the (ink) top of the base.
    #[inline]
    pub fn overbar_vertical_gap(&self) -> MathValue<'a> {
        self.read_record(OVERBAR_VERTICAL_GAP_OFFSET)
    }

    /// Thickness of overbar.
    #[inline]
    pub fn overbar_rule_thickness(&self) -> MathValue<'a> {
        self.read_record(OVERBAR_RULE_THICKNESS_OFFSET)
    }

    /// Extra white space reserved above the overbar.
    #[inline]
    pub fn overbar_extra_ascender(&self) -> MathValue<'a> {
        self.read_record(OVERBAR_EXTRA_ASCENDER_OFFSET)
    }

    /// Distance between underbar and (ink) bottom of the base.
    #[inline]
    pub fn underbar_vertical_gap(&self) -> MathValue<'a> {
        self.read_record(UNDERBAR_VERTICAL_GAP_OFFSET)
    }

    /// Thickness of underbar.
    #[inline]
    pub fn underbar_rule_thickness(&self) -> MathValue<'a> {
        self.read_record(UNDERBAR_RULE_THICKNESS_OFFSET)
    }

    /// Extra white space reserved below the underbar.
    #[inline]
    pub fn underbar_extra_descender(&self) -> MathValue<'a> {
        self.read_record(UNDERBAR_EXTRA_DESCENDER_OFFSET)
    }

    /// Space between the (ink) top of the expression and the bar over it.
    #[inline]
    pub fn radical_vertical_gap(&self) -> MathValue<'a> {
        self.read_record(RADICAL_VERTICAL_GAP_OFFSET)
    }

    /// Space between the (ink) top of the expression and the bar over it, in display style.
    #[inline]
    pub fn radical_display_style_vertical_gap(&self) -> MathValue<'a> {
        self.read_record(RADICAL_DISPLAY_STYLE_VERTICAL_GAP_OFFSET)
    }

    /// Thickness of the radical rule.
    #[inline]
    pub fn radical_rule_thickness(&self) -> MathValue<'a> {
        self.read_record(RADICAL_RULE_THICKNESS_OFFSET)
    }

    /// Extra white space reserved above the radical.
    #[inline]
    pub fn radical_extra_ascender(&self) -> MathValue<'a> {
        self.read_record(RADICAL_EXTRA_ASCENDER_OFFSET)
    }

    /// Extra horizontal kern before the degree of a radical, if such is present.
    #[inline]
    pub fn radical_kern_before_degree(&self) -> MathValue<'a> {
        self.read_record(RADICAL_KERN_BEFORE_DEGREE_OFFSET)
    }

    /// Negative kern after the degree of a radical, if such is present.
    #[inline]
    pub fn radical_kern_after_degree(&self) -> MathValue<'a> {
        self.read_record(RADICAL_KERN_AFTER_DEGREE_OFFSET)
    }

    /// Height of the bottom of the radical degree, if such is present, in proportion to the
    /// ascender of the radical sign.
    #[inline]
    pub fn radical_degree_bottom_raise_percent(&self) -> i16 {
        self.read_i16(RADICAL_DEGREE_BOTTOM_RAISE_PERCENT_OFFSET)
    }

    /// Read an `i16` at an offset into the table.
    #[inline]
    fn read_i16(&self, offset: usize) -> i16 {
        Stream::read_at(self.data, offset).unwrap_or(0)
    }

    /// Read a `u16` at an offset into the table.
    #[inline]
    fn read_u16(&self, offset: usize) -> u16 {
        Stream::read_at(self.data, offset).unwrap_or(0)
    }

    /// Read a `MathValueRecord` at an offset into the table.
    #[inline]
    fn read_record(&self, offset: usize) -> MathValue<'a> {
        self.data
            .get(offset..)
            .and_then(|data| MathValue::parse(data, self.data))
            .unwrap_or(MathValue {
                value: 0,
                device: None,
            })
    }
}

/// A [Math Kern Table](https://learn.microsoft.com/en-us/typography/opentype/spec/math#mathkern-table).
#[derive(Clone)]
pub struct Kern<'a> {
    data: &'a [u8],
    heights: LazyArray16<'a, MathValueRecord>,
    kerns: LazyArray16<'a, MathValueRecord>,
}

impl<'a> Kern<'a> {
    /// Number of heights at which the kern value changes.
    pub fn count(&self) -> u16 {
        self.heights.len()
    }

    /// The correction height at the given index.
    ///
    /// The index must be smaller than `count()`.
    pub fn height(&self, index: u16) -> Option<MathValue<'a>> {
        Some(self.heights.get(index)?.get(self.data))
    }

    /// The kern value at the given index.
    ///
    /// The index must be smaller than or equal to `count()`.
    pub fn kern(&self, index: u16) -> Option<MathValue<'a>> {
        Some(self.kerns.get(index)?.get(self.data))
    }
}

impl<'a> FromSlice<'a> for Kern<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let count = s.read::<u16>()?;
        let heights = s.read_array16::<MathValueRecord>(count)?;
        let kerns = s.read_array16::<MathValueRecord>(count + 1)?;
        Some(Kern {
            data,
            heights,
            kerns,
        })
    }
}

impl core::fmt::Debug for Kern<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Kern {{ ... }}")
    }
}

#[derive(Clone, Copy)]
struct KernInfoRecord {
    top_right: Option<Offset16>,
    top_left: Option<Offset16>,
    bottom_right: Option<Offset16>,
    bottom_left: Option<Offset16>,
}

impl FromData for KernInfoRecord {
    const SIZE: usize = 8;

    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(KernInfoRecord {
            top_right: s.read::<Option<Offset16>>()?,
            top_left: s.read::<Option<Offset16>>()?,
            bottom_right: s.read::<Option<Offset16>>()?,
            bottom_left: s.read::<Option<Offset16>>()?,
        })
    }
}

impl KernInfoRecord {
    fn get<'a>(&self, data: &'a [u8]) -> KernInfo<'a> {
        let parse_field = |offset: Option<Offset16>| {
            offset
                .and_then(|offset| data.get(offset.to_usize()..))
                .and_then(Kern::parse)
        };
        KernInfo {
            top_right: parse_field(self.top_right),
            top_left: parse_field(self.top_left),
            bottom_right: parse_field(self.bottom_right),
            bottom_left: parse_field(self.bottom_left),
        }
    }
}

/// An [entry in a Math Kern Info Table](
/// https://learn.microsoft.com/en-us/typography/opentype/spec/math#mathkerninforecord).
#[derive(Clone, Debug)]
pub struct KernInfo<'a> {
    /// The kerning data for the top-right corner.
    pub top_right: Option<Kern<'a>>,
    /// The kerning data for the top-left corner.
    pub top_left: Option<Kern<'a>>,
    /// The kerning data for the bottom-right corner.
    pub bottom_right: Option<Kern<'a>>,
    /// The kerning data for the bottom-left corner.
    pub bottom_left: Option<Kern<'a>>,
}

/// A [Math Kern Info Table](https://docs.microsoft.com/en-us/typography/opentype/spec/math#mathkerninfo-table).
#[derive(Clone, Copy)]
pub struct KernInfos<'a> {
    data: &'a [u8],
    coverage: Coverage<'a>,
    records: LazyArray16<'a, KernInfoRecord>,
}

impl<'a> FromSlice<'a> for KernInfos<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let coverage = s.parse_at_offset16::<Coverage>(data)?;
        let count = s.read::<u16>()?;
        let records = s.read_array16::<KernInfoRecord>(count)?;
        Some(KernInfos {
            data,
            coverage,
            records,
        })
    }
}

impl<'a> KernInfos<'a> {
    /// Returns the kerning info for the glyph or `None` if it is not covered.
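A `Kern` table stores `n` correction heights alongside `n + 1` kern values, so there is always one more value than heights. Under one common reading of the MATH spec, the kern value chosen for a given correction height is the one indexed by how many stored heights lie below it (heights are assumed sorted ascending). A standalone sketch of that index selection (illustrative, not this crate's API):

```rust
/// Picks the kern-value index for a correction height: the number of
/// stored heights strictly below it. With `heights.len()` heights and
/// `heights.len() + 1` kern values, the result is always in bounds.
pub fn kern_index(heights: &[i16], correction_height: i16) -> usize {
    // Assumes `heights` is sorted ascending, as the table format implies.
    heights.iter().take_while(|&&h| h < correction_height).count()
}
```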
    #[inline]
    pub fn get(&self, glyph: GlyphId) -> Option<KernInfo<'a>> {
        let index = self.coverage.get(glyph)?;
        Some(self.records.get(index)?.get(self.data))
    }
}

impl core::fmt::Debug for KernInfos<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "KernInfos {{ ... }}")
    }
}

/// A [Math Glyph Info Table](https://learn.microsoft.com/en-us/typography/opentype/spec/math#mathglyphinfo-table).
#[derive(Clone, Copy, Debug)]
pub struct GlyphInfo<'a> {
    /// Per-glyph italics correction values.
    pub italic_corrections: Option<MathValues<'a>>,
    /// Per-glyph horizontal positions for attaching mathematical accents.
    pub top_accent_attachments: Option<MathValues<'a>>,
    /// Glyphs which are _extended shapes_.
    pub extended_shapes: Option<Coverage<'a>>,
    /// Per-glyph information for mathematical kerning.
    pub kern_infos: Option<KernInfos<'a>>,
}

impl<'a> FromSlice<'a> for GlyphInfo<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(GlyphInfo {
            italic_corrections: s.parse_at_offset16::<MathValues>(data),
            top_accent_attachments: s.parse_at_offset16::<MathValues>(data),
            extended_shapes: s.parse_at_offset16::<Coverage>(data),
            kern_infos: s.parse_at_offset16::<KernInfos>(data),
        })
    }
}

/// Glyph part flags.
#[derive(Clone, Copy, Debug)]
pub struct PartFlags(pub u16);

#[allow(missing_docs)]
impl PartFlags {
    #[inline]
    pub fn extender(self) -> bool {
        self.0 & 0x0001 != 0
    }
}

impl FromData for PartFlags {
    const SIZE: usize = 2;

    fn parse(data: &[u8]) -> Option<Self> {
        u16::parse(data).map(PartFlags)
    }
}

/// Details for a glyph part in an assembly.
#[derive(Clone, Copy, Debug)]
pub struct GlyphPart {
    /// Glyph ID for the part.
    pub glyph_id: GlyphId,
    /// Lengths of the connectors on the start of the glyph, in font design units.
    pub start_connector_length: u16,
    /// Lengths of the connectors on the end of the glyph, in font design units.
    pub end_connector_length: u16,
    /// The full advance of the part, in font design units.
    pub full_advance: u16,
    /// Part flags.
    pub part_flags: PartFlags,
}

impl FromData for GlyphPart {
    const SIZE: usize = 10;

    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(GlyphPart {
            glyph_id: s.read::<GlyphId>()?,
            start_connector_length: s.read::<u16>()?,
            end_connector_length: s.read::<u16>()?,
            full_advance: s.read::<u16>()?,
            part_flags: s.read::<PartFlags>()?,
        })
    }
}

/// A [Glyph Assembly Table](https://learn.microsoft.com/en-us/typography/opentype/spec/math#glyphassembly-table).
#[derive(Clone, Copy, Debug)]
pub struct GlyphAssembly<'a> {
    /// The italics correction of the assembled glyph.
    pub italics_correction: MathValue<'a>,
    /// Parts the assembly is composed of.
    pub parts: LazyArray16<'a, GlyphPart>,
}

impl<'a> FromSlice<'a> for GlyphAssembly<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let italics_correction = s.read::<MathValueRecord>()?.get(data);
        let count = s.read::<u16>()?;
        let parts = s.read_array16::<GlyphPart>(count)?;
        Some(GlyphAssembly {
            italics_correction,
            parts,
        })
    }
}

/// Description of math glyph variants.
#[derive(Clone, Copy, Debug)]
pub struct GlyphVariant {
    /// The ID of the variant glyph.
    pub variant_glyph: GlyphId,
    /// Advance width/height, in design units, of the variant glyph.
    pub advance_measurement: u16,
}

impl FromData for GlyphVariant {
    const SIZE: usize = 4;

    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(GlyphVariant {
            variant_glyph: s.read::<GlyphId>()?,
            advance_measurement: s.read::<u16>()?,
        })
    }
}

/// A [Math Glyph Construction Table](
/// https://learn.microsoft.com/en-us/typography/opentype/spec/math#mathglyphconstruction-table).
#[derive(Clone, Copy, Debug)]
pub struct GlyphConstruction<'a> {
    /// A general recipe on how to construct a variant with large advance width/height.
    pub assembly: Option<GlyphAssembly<'a>>,
    /// Prepared variants of the glyph with varying advances.
    pub variants: LazyArray16<'a, GlyphVariant>,
}

impl<'a> FromSlice<'a> for GlyphConstruction<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let assembly = s.parse_at_offset16::<GlyphAssembly>(data);
        let variant_count = s.read::<u16>()?;
        let variants = s.read_array16::<GlyphVariant>(variant_count)?;
        Some(GlyphConstruction { assembly, variants })
    }
}

/// A mapping from glyphs to
/// [Math Glyph Construction Tables](
/// https://learn.microsoft.com/en-us/typography/opentype/spec/math#mathglyphconstruction-table).
#[derive(Clone, Copy)]
pub struct GlyphConstructions<'a> {
    coverage: Coverage<'a>,
    constructions: LazyOffsetArray16<'a, GlyphConstruction<'a>>,
}

impl<'a> GlyphConstructions<'a> {
    fn new(
        data: &'a [u8],
        coverage: Option<Coverage<'a>>,
        offsets: LazyArray16<'a, Option<Offset16>>,
    ) -> Self {
        GlyphConstructions {
            coverage: coverage.unwrap_or(Coverage::Format1 {
                glyphs: LazyArray16::new(&[]),
            }),
            constructions: LazyOffsetArray16::new(data, offsets),
        }
    }

    /// Returns the construction for the glyph or `None` if it is not covered.
    #[inline]
    pub fn get(&self, glyph: GlyphId) -> Option<GlyphConstruction<'a>> {
        let index = self.coverage.get(glyph)?;
        self.constructions.get(index)
    }
}

impl core::fmt::Debug for GlyphConstructions<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "GlyphConstructions {{ ... }}")
    }
}

/// A [Math Variants Table](
/// https://learn.microsoft.com/en-us/typography/opentype/spec/math#mathvariants-table).
#[derive(Clone, Copy, Debug)]
pub struct Variants<'a> {
    /// Minimum overlap of connecting glyphs during glyph construction, in design units.
    pub min_connector_overlap: u16,
    /// Constructions for shapes growing in the vertical direction.
    pub vertical_constructions: GlyphConstructions<'a>,
    /// Constructions for shapes growing in the horizontal direction.
    pub horizontal_constructions: GlyphConstructions<'a>,
}

impl<'a> FromSlice<'a> for Variants<'a> {
    fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let min_connector_overlap = s.read::<u16>()?;
        let vertical_coverage = s.parse_at_offset16::<Coverage>(data);
        let horizontal_coverage = s.parse_at_offset16::<Coverage>(data);
        let vertical_count = s.read::<u16>()?;
        let horizontal_count = s.read::<u16>()?;
        let vertical_offsets = s.read_array16::<Option<Offset16>>(vertical_count)?;
        let horizontal_offsets = s.read_array16::<Option<Offset16>>(horizontal_count)?;
        Some(Variants {
            min_connector_overlap,
            vertical_constructions: GlyphConstructions::new(
                data,
                vertical_coverage,
                vertical_offsets,
            ),
            horizontal_constructions: GlyphConstructions::new(
                data,
                horizontal_coverage,
                horizontal_offsets,
            ),
        })
    }
}

/// A [Math Table](https://docs.microsoft.com/en-us/typography/opentype/spec/math).
#[derive(Clone, Copy, Debug)]
pub struct Table<'a> {
    /// Math positioning constants.
    pub constants: Option<Constants<'a>>,
    /// Per-glyph positioning information.
    pub glyph_info: Option<GlyphInfo<'a>>,
    /// Variants and assembly recipes for growable glyphs.
    pub variants: Option<Variants<'a>>,
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let major_version = s.read::<u16>()? as u8;
        s.skip::<u16>(); // minor version
        if major_version != 1 {
            return None;
        }
        Some(Table {
            constants: s.parse_at_offset16::<Constants>(data),
            glyph_info: s.parse_at_offset16::<GlyphInfo>(data),
            variants: s.parse_at_offset16::<Variants>(data),
        })
    }
}

trait StreamExt<'a> {
    fn parse_at_offset16<T: FromSlice<'a>>(&mut self, data: &'a [u8]) -> Option<T>;
}

impl<'a> StreamExt<'a> for Stream<'a> {
    fn parse_at_offset16<T: FromSlice<'a>>(&mut self, data: &'a [u8]) -> Option<T> {
        let offset = self.read::<Option<Offset16>>()??.to_usize();
        data.get(offset..).and_then(T::parse)
    }
}
ttf-parser-0.24.1/src/tables/maxp.rs000064400000000000000000000014611046102023000153550ustar 00000000000000
//! A [Maximum Profile Table](
//! https://docs.microsoft.com/en-us/typography/opentype/spec/maxp) implementation.
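As a standalone illustration of the binary layout this module handles (a big-endian `u32` version, which must be 0.5 or 1.0, followed by a non-zero `u16` glyph count), here is a minimal sketch using only the standard library. The function name `maxp_number_of_glyphs` is invented for the example; it is not part of the crate's API.

```rust
/// Parses the start of a raw `maxp` table: a big-endian u32 version
/// (only 0x00005000 and 0x00010000 are accepted) and a u16 numGlyphs.
fn maxp_number_of_glyphs(data: &[u8]) -> Option<u16> {
    if data.len() < 6 {
        return None;
    }
    let version = u32::from_be_bytes([data[0], data[1], data[2], data[3]]);
    if version != 0x0000_5000 && version != 0x0001_0000 {
        return None;
    }
    let n = u16::from_be_bytes([data[4], data[5]]);
    // A zero glyph count is rejected, matching the `NonZeroU16::new` check
    // performed by the real parser.
    if n == 0 {
        None
    } else {
        Some(n)
    }
}
```

The real parser stores the count as `NonZeroU16` so downstream code can rely on at least one glyph being present.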
use core::num::NonZeroU16;

use crate::parser::Stream;

/// A [Maximum Profile Table](https://docs.microsoft.com/en-us/typography/opentype/spec/maxp).
#[derive(Clone, Copy, Debug)]
pub struct Table {
    /// The total number of glyphs in the face.
    pub number_of_glyphs: NonZeroU16,
}

impl Table {
    /// Parses a table from raw data.
    pub fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let version = s.read::<u32>()?;
        if !(version == 0x00005000 || version == 0x00010000) {
            return None;
        }

        let n = s.read::<u16>()?;
        let number_of_glyphs = NonZeroU16::new(n)?;
        Some(Table { number_of_glyphs })
    }
}
ttf-parser-0.24.1/src/tables/mod.rs000064400000000000000000000021161046102023000151650ustar 00000000000000
pub mod cbdt;
pub mod cblc;
mod cff;
pub mod cmap;
pub mod colr;
pub mod cpal;
pub mod glyf;
pub mod head;
pub mod hhea;
pub mod hmtx;
pub mod kern;
pub mod loca;
pub mod maxp;
pub mod name;
pub mod os2;
pub mod post;
pub mod sbix;
pub mod svg;
pub mod vhea;
pub mod vorg;

#[cfg(feature = "opentype-layout")]
pub mod gdef;
#[cfg(feature = "opentype-layout")]
pub mod gpos;
#[cfg(feature = "opentype-layout")]
pub mod gsub;
#[cfg(feature = "opentype-layout")]
pub mod math;

#[cfg(feature = "apple-layout")]
pub mod ankr;
#[cfg(feature = "apple-layout")]
pub mod feat;
#[cfg(feature = "apple-layout")]
pub mod kerx;
#[cfg(feature = "apple-layout")]
pub mod morx;
#[cfg(feature = "apple-layout")]
pub mod trak;

#[cfg(feature = "variable-fonts")]
pub mod avar;
#[cfg(feature = "variable-fonts")]
pub mod fvar;
#[cfg(feature = "variable-fonts")]
pub mod gvar;
#[cfg(feature = "variable-fonts")]
pub mod hvar;
#[cfg(feature = "variable-fonts")]
pub mod mvar;
#[cfg(feature = "variable-fonts")]
pub mod vvar;

pub use cff::cff1;
#[cfg(feature = "variable-fonts")]
pub use cff::cff2;
pub use cff::CFFError;
ttf-parser-0.24.1/src/tables/morx.rs000064400000000000000000000336401046102023000154010ustar 00000000000000
//! An [Extended Glyph Metamorphosis Table](
//!
//! https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6morx.html) implementation.

// Note: We do not have tests for this table because it has a very complicated structure.
// Specifically, the State Machine Tables. I have no idea how to generate them.
// And the fonts that use this table are mainly Apple ones, so we cannot use them for legal reasons.
//
// On the other hand, this table is tested indirectly by https://github.com/RazrFalcon/rustybuzz
// and its roughly 170 tests. Which is pretty good.
// Therefore after applying any changes to this table,
// you have to check that all rustybuzz tests are still passing.

use core::num::NonZeroU16;

use crate::parser::{FromData, LazyArray32, NumFrom, Offset, Offset32, Stream};
use crate::{aat, GlyphId};

/// The feature table is used to compute the sub-feature flags
/// for a list of requested features and settings.
#[derive(Clone, Copy, Debug)]
pub struct Feature {
    /// The type of feature.
    pub kind: u16,
    /// The feature's setting (aka selector).
    pub setting: u16,
    /// Flags for the settings that this feature and setting enable.
    pub enable_flags: u32,
    /// Complement of flags for the settings that this feature and setting disable.
    pub disable_flags: u32,
}

impl FromData for Feature {
    const SIZE: usize = 12;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(Feature {
            kind: s.read::<u16>()?,
            setting: s.read::<u16>()?,
            enable_flags: s.read::<u32>()?,
            disable_flags: s.read::<u32>()?,
        })
    }
}

/// A contextual subtable state table trailing data.
#[derive(Clone, Copy, Debug)]
pub struct ContextualEntryData {
    /// A mark index.
    pub mark_index: u16,
    /// A current index.
    pub current_index: u16,
}

impl FromData for ContextualEntryData {
    const SIZE: usize = 4;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(ContextualEntryData {
            mark_index: s.read::<u16>()?,
            current_index: s.read::<u16>()?,
        })
    }
}

/// A contextual subtable.
#[derive(Clone)] pub struct ContextualSubtable<'a> { /// The contextual glyph substitution state table. pub state: aat::ExtendedStateTable<'a, ContextualEntryData>, offsets_data: &'a [u8], offsets: LazyArray32<'a, Offset32>, number_of_glyphs: NonZeroU16, } impl<'a> ContextualSubtable<'a> { fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option { let mut s = Stream::new(data); let state = aat::ExtendedStateTable::parse(number_of_glyphs, &mut s)?; // While the spec clearly states that this is an // 'offset from the beginning of the state subtable', // it's actually not. Subtable header should not be included. let offset = s.read::()?.to_usize(); // The offsets list is unsized. let offsets_data = data.get(offset..)?; let offsets = LazyArray32::::new(offsets_data); Some(ContextualSubtable { state, offsets_data, offsets, number_of_glyphs, }) } /// Returns a [Lookup](aat::Lookup) at index. pub fn lookup(&self, index: u32) -> Option> { let offset = self.offsets.get(index)?.to_usize(); let lookup_data = self.offsets_data.get(offset..)?; aat::Lookup::parse(self.number_of_glyphs, lookup_data) } } impl core::fmt::Debug for ContextualSubtable<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "ContextualSubtable {{ ... }}") } } /// A ligature subtable. #[derive(Clone, Debug)] pub struct LigatureSubtable<'a> { /// A state table. pub state: aat::ExtendedStateTable<'a, u16>, /// Ligature actions. pub ligature_actions: LazyArray32<'a, u32>, /// Ligature components. pub components: LazyArray32<'a, u16>, /// Ligatures. pub ligatures: LazyArray32<'a, GlyphId>, } impl<'a> LigatureSubtable<'a> { fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option { let mut s = Stream::new(data); let state = aat::ExtendedStateTable::parse(number_of_glyphs, &mut s)?; // Offset are from `ExtendedStateTable`/`data`, not from subtable start. 
let ligature_action_offset = s.read::()?.to_usize(); let component_offset = s.read::()?.to_usize(); let ligature_offset = s.read::()?.to_usize(); // All three arrays are unsized, so we're simply reading/mapping all the data past offset. let ligature_actions = LazyArray32::::new(data.get(ligature_action_offset..)?); let components = LazyArray32::::new(data.get(component_offset..)?); let ligatures = LazyArray32::::new(data.get(ligature_offset..)?); Some(LigatureSubtable { state, ligature_actions, components, ligatures, }) } } /// A contextual subtable state table trailing data. #[derive(Clone, Copy, Debug)] pub struct InsertionEntryData { /// A current insert index. pub current_insert_index: u16, /// A marked insert index. pub marked_insert_index: u16, } impl FromData for InsertionEntryData { const SIZE: usize = 4; #[inline] fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(InsertionEntryData { current_insert_index: s.read::()?, marked_insert_index: s.read::()?, }) } } /// An insertion subtable. #[derive(Clone, Debug)] pub struct InsertionSubtable<'a> { /// A state table. pub state: aat::ExtendedStateTable<'a, InsertionEntryData>, /// Insertion glyphs. pub glyphs: LazyArray32<'a, GlyphId>, } impl<'a> InsertionSubtable<'a> { fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option { let mut s = Stream::new(data); let state = aat::ExtendedStateTable::parse(number_of_glyphs, &mut s)?; let offset = s.read::()?.to_usize(); // TODO: unsized array? // The list is unsized. let glyphs = LazyArray32::::new(data.get(offset..)?); Some(InsertionSubtable { state, glyphs }) } } /// A subtable kind. #[allow(missing_docs)] #[derive(Clone, Debug)] pub enum SubtableKind<'a> { Rearrangement(aat::ExtendedStateTable<'a, ()>), Contextual(ContextualSubtable<'a>), Ligature(LigatureSubtable<'a>), NonContextual(aat::Lookup<'a>), Insertion(InsertionSubtable<'a>), } /// A subtable coverage. 
#[derive(Clone, Copy, Debug)] pub struct Coverage(u8); #[rustfmt::skip] impl Coverage { /// If true, this subtable will process glyphs in logical order /// (or reverse logical order if [`is_vertical`](Self::is_vertical) is also true). #[inline] pub fn is_logical(self) -> bool { self.0 & 0x10 != 0 } /// If true, this subtable will be applied to both horizontal and vertical text /// ([`is_vertical`](Self::is_vertical) should be ignored). #[inline] pub fn is_all_directions(self) -> bool { self.0 & 0x20 != 0 } /// If true, this subtable will process glyphs in descending order. #[inline] pub fn is_backwards(self) -> bool { self.0 & 0x40 != 0 } /// If true, this subtable will only be applied to vertical text. #[inline] pub fn is_vertical(self) -> bool { self.0 & 0x80 != 0 } } /// A subtable in a metamorphosis chain. #[derive(Clone, Debug)] pub struct Subtable<'a> { /// A subtable kind. pub kind: SubtableKind<'a>, /// A subtable coverage. pub coverage: Coverage, /// Subtable feature flags. pub feature_flags: u32, } /// A list of subtables in a metamorphosis chain. /// /// The internal data layout is not designed for random access, /// therefore we're not providing the `get()` method and only an iterator. #[derive(Clone, Copy)] pub struct Subtables<'a> { count: u32, data: &'a [u8], number_of_glyphs: NonZeroU16, } impl<'a> IntoIterator for Subtables<'a> { type Item = Subtable<'a>; type IntoIter = SubtablesIter<'a>; #[inline] fn into_iter(self) -> Self::IntoIter { SubtablesIter { index: 0, count: self.count, stream: Stream::new(self.data), number_of_glyphs: self.number_of_glyphs, } } } impl core::fmt::Debug for Subtables<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "Subtables {{ ... }}") } } /// An iterator over a metamorphosis chain subtables. 
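The four `Coverage` accessors above are plain bit tests on a single coverage byte. This standalone sketch re-implements the same checks (bits 0x10, 0x20, 0x40, 0x80) on a bare `u8`; the free-function names are invented for the example.

```rust
// Mirrors morx::Coverage: each predicate tests one bit of the coverage byte.
fn is_logical(c: u8) -> bool {
    c & 0x10 != 0 // process glyphs in logical order
}
fn is_all_directions(c: u8) -> bool {
    c & 0x20 != 0 // applies to both horizontal and vertical text
}
fn is_backwards(c: u8) -> bool {
    c & 0x40 != 0 // process glyphs in descending order
}
fn is_vertical(c: u8) -> bool {
    c & 0x80 != 0 // applies only to vertical text
}
```

A coverage byte can combine several flags, e.g. `0x90` is both vertical and logical-order.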
#[allow(missing_debug_implementations)] #[derive(Clone)] pub struct SubtablesIter<'a> { index: u32, count: u32, stream: Stream<'a>, number_of_glyphs: NonZeroU16, } impl<'a> Iterator for SubtablesIter<'a> { type Item = Subtable<'a>; fn next(&mut self) -> Option { if self.index == self.count { return None; } let s = &mut self.stream; if s.at_end() { return None; } let len = s.read::()?; let coverage = Coverage(s.read::()?); s.skip::(); // reserved let kind = s.read::()?; let feature_flags = s.read::()?; const HEADER_LEN: usize = 12; let len = usize::num_from(len).checked_sub(HEADER_LEN)?; let subtables_data = s.read_bytes(len)?; let kind = match kind { 0 => { let mut s = Stream::new(subtables_data); let table = aat::ExtendedStateTable::parse(self.number_of_glyphs, &mut s)?; SubtableKind::Rearrangement(table) } 1 => { let table = ContextualSubtable::parse(self.number_of_glyphs, subtables_data)?; SubtableKind::Contextual(table) } 2 => { let table = LigatureSubtable::parse(self.number_of_glyphs, subtables_data)?; SubtableKind::Ligature(table) } // 3 - reserved 4 => SubtableKind::NonContextual(aat::Lookup::parse( self.number_of_glyphs, subtables_data, )?), 5 => { let table = InsertionSubtable::parse(self.number_of_glyphs, subtables_data)?; SubtableKind::Insertion(table) } _ => return None, }; Some(Subtable { kind, coverage, feature_flags, }) } } /// A metamorphosis chain. #[derive(Clone, Copy, Debug)] pub struct Chain<'a> { /// Default chain features. pub default_flags: u32, /// A list of chain features. pub features: LazyArray32<'a, Feature>, /// A list of chain subtables. pub subtables: Subtables<'a>, } /// A list of metamorphosis chains. /// /// The internal data layout is not designed for random access, /// therefore we're not providing the `get()` method and only an iterator. 
#[derive(Clone, Copy)] pub struct Chains<'a> { data: &'a [u8], count: u32, number_of_glyphs: NonZeroU16, } impl<'a> Chains<'a> { fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option { let mut s = Stream::new(data); s.skip::(); // version s.skip::(); // reserved let count = s.read::()?; Some(Chains { count, data: s.tail()?, number_of_glyphs, }) } } impl<'a> IntoIterator for Chains<'a> { type Item = Chain<'a>; type IntoIter = ChainsIter<'a>; #[inline] fn into_iter(self) -> Self::IntoIter { ChainsIter { index: 0, count: self.count, stream: Stream::new(self.data), number_of_glyphs: self.number_of_glyphs, } } } impl core::fmt::Debug for Chains<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "Chains {{ ... }}") } } /// An iterator over metamorphosis chains. #[allow(missing_debug_implementations)] #[derive(Clone)] pub struct ChainsIter<'a> { index: u32, count: u32, stream: Stream<'a>, number_of_glyphs: NonZeroU16, } impl<'a> Iterator for ChainsIter<'a> { type Item = Chain<'a>; fn next(&mut self) -> Option { if self.index == self.count { return None; } if self.stream.at_end() { return None; } let default_flags = self.stream.read::()?; let len = self.stream.read::()?; let features_count = self.stream.read::()?; let subtables_count = self.stream.read::()?; let features = self.stream.read_array32::(features_count)?; const HEADER_LEN: usize = 16; let len = usize::num_from(len) .checked_sub(HEADER_LEN)? .checked_sub(Feature::SIZE * usize::num_from(features_count))?; let subtables_data = self.stream.read_bytes(len)?; let subtables = Subtables { data: subtables_data, count: subtables_count, number_of_glyphs: self.number_of_glyphs, }; Some(Chain { default_flags, features, subtables, }) } } /// An [Extended Glyph Metamorphosis Table]( /// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6morx.html). /// /// Subtable Glyph Coverage used by morx v3 is not supported. 
#[derive(Clone)]
pub struct Table<'a> {
    /// A list of metamorphosis chains.
    pub chains: Chains<'a>,
}

impl core::fmt::Debug for Table<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Table {{ ... }}")
    }
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    ///
    /// `number_of_glyphs` is from the `maxp` table.
    pub fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
        Chains::parse(number_of_glyphs, data).map(|chains| Self { chains })
    }
}
ttf-parser-0.24.1/src/tables/mvar.rs000064400000000000000000000045541046102023000153610ustar 00000000000000
//! A [Metrics Variations Table](
//! https://docs.microsoft.com/en-us/typography/opentype/spec/mvar) implementation.

use crate::parser::{FromData, LazyArray16, Offset, Offset16, Stream};
use crate::var_store::ItemVariationStore;
use crate::{NormalizedCoordinate, Tag};

#[derive(Clone, Copy)]
struct ValueRecord {
    value_tag: Tag,
    delta_set_outer_index: u16,
    delta_set_inner_index: u16,
}

impl FromData for ValueRecord {
    const SIZE: usize = 8;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(ValueRecord {
            value_tag: s.read::<Tag>()?,
            delta_set_outer_index: s.read::<u16>()?,
            delta_set_inner_index: s.read::<u16>()?,
        })
    }
}

/// A [Metrics Variations Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/mvar).
#[derive(Clone, Copy)]
pub struct Table<'a> {
    variation_store: ItemVariationStore<'a>,
    records: LazyArray16<'a, ValueRecord>,
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let version = s.read::<u32>()?;
        if version != 0x00010000 {
            return None;
        }

        s.skip::<u16>(); // reserved
        let value_record_size = s.read::<u16>()?;
        if usize::from(value_record_size) != ValueRecord::SIZE {
            return None;
        }

        let count = s.read::<u16>()?;
        if count == 0 {
            return None;
        }

        let var_store_offset = s.read::<Option<Offset16>>()??.to_usize();
        let records = s.read_array16::<ValueRecord>(count)?;
        let variation_store = ItemVariationStore::parse(Stream::new_at(data, var_store_offset)?)?;

        Some(Table {
            variation_store,
            records,
        })
    }

    /// Returns a metric offset by tag.
    pub fn metric_offset(&self, tag: Tag, coordinates: &[NormalizedCoordinate]) -> Option<f32> {
        let (_, record) = self.records.binary_search_by(|r| r.value_tag.cmp(&tag))?;
        self.variation_store.parse_delta(
            record.delta_set_outer_index,
            record.delta_set_inner_index,
            coordinates,
        )
    }
}

impl core::fmt::Debug for Table<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Table {{ ... }}")
    }
}
ttf-parser-0.24.1/src/tables/name.rs000064400000000000000000000227261046102023000153370ustar 00000000000000
//! A [Naming Table](
//! https://docs.microsoft.com/en-us/typography/opentype/spec/name) implementation.

#[cfg(feature = "std")]
use std::string::String;
#[cfg(feature = "std")]
use std::vec::Vec;

use crate::parser::{FromData, LazyArray16, Offset, Offset16, Stream};
use crate::Language;

/// A list of [name ID](https://docs.microsoft.com/en-us/typography/opentype/spec/name#name-ids)'s.
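The MVAR lookup above binary-searches value records sorted by their four-byte tag. The same pattern can be shown standalone with tags held as big-endian `u32`s; `tag`, `find_record`, and the sample tags' delta-set indices are invented for the example (the tags themselves, like `hasc`, are real MVAR value tags).

```rust
/// A four-byte OpenType tag as a big-endian u32, e.g. `tag(b"hasc")`.
fn tag(bytes: &[u8; 4]) -> u32 {
    u32::from_be_bytes(*bytes)
}

/// Looks up a (outer, inner) delta-set index pair by tag. `records` must be
/// sorted by tag, which is what makes the binary search valid.
fn find_record(records: &[(u32, u16, u16)], t: u32) -> Option<(u16, u16)> {
    let i = records.binary_search_by(|r| r.0.cmp(&t)).ok()?;
    Some((records[i].1, records[i].2))
}
```

The real table then feeds the index pair into the item variation store to compute the actual delta.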
pub mod name_id { #![allow(missing_docs)] pub const COPYRIGHT_NOTICE: u16 = 0; pub const FAMILY: u16 = 1; pub const SUBFAMILY: u16 = 2; pub const UNIQUE_ID: u16 = 3; pub const FULL_NAME: u16 = 4; pub const VERSION: u16 = 5; pub const POST_SCRIPT_NAME: u16 = 6; pub const TRADEMARK: u16 = 7; pub const MANUFACTURER: u16 = 8; pub const DESIGNER: u16 = 9; pub const DESCRIPTION: u16 = 10; pub const VENDOR_URL: u16 = 11; pub const DESIGNER_URL: u16 = 12; pub const LICENSE: u16 = 13; pub const LICENSE_URL: u16 = 14; // RESERVED = 15 pub const TYPOGRAPHIC_FAMILY: u16 = 16; pub const TYPOGRAPHIC_SUBFAMILY: u16 = 17; pub const COMPATIBLE_FULL: u16 = 18; pub const SAMPLE_TEXT: u16 = 19; pub const POST_SCRIPT_CID: u16 = 20; pub const WWS_FAMILY: u16 = 21; pub const WWS_SUBFAMILY: u16 = 22; pub const LIGHT_BACKGROUND_PALETTE: u16 = 23; pub const DARK_BACKGROUND_PALETTE: u16 = 24; pub const VARIATIONS_POST_SCRIPT_NAME_PREFIX: u16 = 25; } /// A [platform ID](https://docs.microsoft.com/en-us/typography/opentype/spec/name#platform-ids). #[allow(missing_docs)] #[derive(Clone, Copy, PartialEq, Eq, Debug)] pub enum PlatformId { Unicode, Macintosh, Iso, Windows, Custom, } impl FromData for PlatformId { const SIZE: usize = 2; #[inline] fn parse(data: &[u8]) -> Option { match u16::parse(data)? 
{ 0 => Some(PlatformId::Unicode), 1 => Some(PlatformId::Macintosh), 2 => Some(PlatformId::Iso), 3 => Some(PlatformId::Windows), 4 => Some(PlatformId::Custom), _ => None, } } } #[inline] fn is_unicode_encoding(platform_id: PlatformId, encoding_id: u16) -> bool { // https://docs.microsoft.com/en-us/typography/opentype/spec/name#windows-encoding-ids const WINDOWS_SYMBOL_ENCODING_ID: u16 = 0; const WINDOWS_UNICODE_BMP_ENCODING_ID: u16 = 1; match platform_id { PlatformId::Unicode => true, PlatformId::Windows => matches!( encoding_id, WINDOWS_SYMBOL_ENCODING_ID | WINDOWS_UNICODE_BMP_ENCODING_ID ), _ => false, } } #[derive(Clone, Copy)] struct NameRecord { platform_id: PlatformId, encoding_id: u16, language_id: u16, name_id: u16, length: u16, offset: Offset16, } impl FromData for NameRecord { const SIZE: usize = 12; #[inline] fn parse(data: &[u8]) -> Option { let mut s = Stream::new(data); Some(NameRecord { platform_id: s.read::()?, encoding_id: s.read::()?, language_id: s.read::()?, name_id: s.read::()?, length: s.read::()?, offset: s.read::()?, }) } } /// A [Name Record](https://docs.microsoft.com/en-us/typography/opentype/spec/name#name-records). #[derive(Clone, Copy)] pub struct Name<'a> { /// A platform ID. pub platform_id: PlatformId, /// A platform-specific encoding ID. pub encoding_id: u16, /// A language ID. pub language_id: u16, /// A [Name ID](https://docs.microsoft.com/en-us/typography/opentype/spec/name#name-ids). /// /// A predefined list of ID's can be found in the [`name_id`](name_id/index.html) module. pub name_id: u16, /// A raw name data. /// /// Can be in any encoding. Can be empty. pub name: &'a [u8], } impl<'a> Name<'a> { /// Returns the Name's data as a UTF-8 string. /// /// Only Unicode names are supported. And since they are stored as UTF-16BE, /// we can't return `&str` and have to allocate a `String`. 
/// /// Supports: /// - Unicode Platform ID /// - Windows Platform ID + Symbol /// - Windows Platform ID + Unicode BMP #[cfg(feature = "std")] #[inline(never)] pub fn to_string(&self) -> Option { if self.is_unicode() { self.name_from_utf16_be() } else { None } } /// Checks that the current Name data has a Unicode encoding. #[inline] pub fn is_unicode(&self) -> bool { is_unicode_encoding(self.platform_id, self.encoding_id) } #[cfg(feature = "std")] #[inline(never)] fn name_from_utf16_be(&self) -> Option { let mut name: Vec = Vec::new(); for c in LazyArray16::::new(self.name) { name.push(c); } String::from_utf16(&name).ok() } /// Returns a Name language. pub fn language(&self) -> Language { if self.platform_id == PlatformId::Windows { Language::windows_language(self.language_id) } else if self.platform_id == PlatformId::Macintosh && self.encoding_id == 0 && self.language_id == 0 { Language::English_UnitedStates } else { Language::Unknown } } } #[cfg(feature = "std")] impl<'a> core::fmt::Debug for Name<'a> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { let name = self.to_string(); f.debug_struct("Name") .field("name", &name.as_deref().unwrap_or("unsupported encoding")) .field("platform_id", &self.platform_id) .field("encoding_id", &self.encoding_id) .field("language_id", &self.language_id) .field("language", &self.language()) .field("name_id", &self.name_id) .finish() } } #[cfg(not(feature = "std"))] impl<'a> core::fmt::Debug for Name<'a> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { f.debug_struct("Name") .field("name", &self.name) .field("platform_id", &self.platform_id) .field("encoding_id", &self.encoding_id) .field("language_id", &self.language_id) .field("language", &self.language()) .field("name_id", &self.name_id) .finish() } } /// A list of face names. #[derive(Clone, Copy, Default)] pub struct Names<'a> { records: LazyArray16<'a, NameRecord>, storage: &'a [u8], } impl<'a> Names<'a> { /// Returns a name at index. 
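`Name::to_string` above decodes Unicode name records, stored as UTF-16BE, by collecting big-endian `u16` code units and running them through `String::from_utf16`. A standalone sketch of the same conversion (the function name is invented for the example):

```rust
/// Decodes raw UTF-16BE name-record bytes into a String.
/// Returns `None` for ill-formed UTF-16, e.g. a lone surrogate.
fn utf16_be_to_string(data: &[u8]) -> Option<String> {
    let units: Vec<u16> = data
        .chunks_exact(2) // a trailing odd byte is ignored
        .map(|c| u16::from_be_bytes([c[0], c[1]]))
        .collect();
    String::from_utf16(&units).ok()
}
```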
pub fn get(&self, index: u16) -> Option> { let record = self.records.get(index)?; let name_start = record.offset.to_usize(); let name_end = name_start + usize::from(record.length); let name = self.storage.get(name_start..name_end)?; Some(Name { platform_id: record.platform_id, encoding_id: record.encoding_id, language_id: record.language_id, name_id: record.name_id, name, }) } /// Returns a number of name records. pub fn len(&self) -> u16 { self.records.len() } /// Checks if there are any name records. pub fn is_empty(&self) -> bool { self.records.is_empty() } } impl core::fmt::Debug for Names<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "Names {{ ... }}") } } impl<'a> IntoIterator for Names<'a> { type Item = Name<'a>; type IntoIter = NamesIter<'a>; #[inline] fn into_iter(self) -> Self::IntoIter { NamesIter { names: self, index: 0, } } } /// An iterator over face names. #[derive(Clone, Copy)] #[allow(missing_debug_implementations)] pub struct NamesIter<'a> { names: Names<'a>, index: u16, } impl<'a> Iterator for NamesIter<'a> { type Item = Name<'a>; fn next(&mut self) -> Option { if self.index < self.names.len() { self.index += 1; self.names.get(self.index - 1) } else { None } } #[inline] fn count(self) -> usize { usize::from(self.names.len().saturating_sub(self.index)) } } /// A [Naming Table]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/name). #[derive(Clone, Copy, Default, Debug)] pub struct Table<'a> { /// A list of names. pub names: Names<'a>, } impl<'a> Table<'a> { /// Parses a table from raw data. pub fn parse(data: &'a [u8]) -> Option { // https://docs.microsoft.com/en-us/typography/opentype/spec/name#naming-table-format-1 const LANG_TAG_RECORD_SIZE: u16 = 4; let mut s = Stream::new(data); let version = s.read::()?; let count = s.read::()?; let storage_offset = s.read::()?.to_usize(); if version == 0 { // Do nothing. 
} else if version == 1 { let lang_tag_count = s.read::()?; let lang_tag_len = lang_tag_count.checked_mul(LANG_TAG_RECORD_SIZE)?; s.advance(usize::from(lang_tag_len)); // langTagRecords } else { // Unsupported version. return None; } let records = s.read_array16::(count)?; if s.offset() < storage_offset { s.advance(storage_offset - s.offset()); } let storage = s.tail()?; Some(Table { names: Names { records, storage }, }) } } ttf-parser-0.24.1/src/tables/os2.rs000064400000000000000000000434121046102023000151150ustar 00000000000000//! A [OS/2 and Windows Metrics Table](https://docs.microsoft.com/en-us/typography/opentype/spec/os2) //! implementation. use crate::parser::Stream; use crate::LineMetrics; const WEIGHT_CLASS_OFFSET: usize = 4; const WIDTH_CLASS_OFFSET: usize = 6; const TYPE_OFFSET: usize = 8; const Y_SUBSCRIPT_X_SIZE_OFFSET: usize = 10; const Y_SUPERSCRIPT_X_SIZE_OFFSET: usize = 18; const Y_STRIKEOUT_SIZE_OFFSET: usize = 26; const Y_STRIKEOUT_POSITION_OFFSET: usize = 28; const UNICODE_RANGES_OFFSET: usize = 42; const SELECTION_OFFSET: usize = 62; const TYPO_ASCENDER_OFFSET: usize = 68; const TYPO_DESCENDER_OFFSET: usize = 70; const TYPO_LINE_GAP_OFFSET: usize = 72; const WIN_ASCENT: usize = 74; const WIN_DESCENT: usize = 76; const X_HEIGHT_OFFSET: usize = 86; const CAP_HEIGHT_OFFSET: usize = 88; /// A face [weight](https://docs.microsoft.com/en-us/typography/opentype/spec/os2#usweightclass). #[allow(missing_docs)] #[derive(Clone, Copy, Eq, PartialEq, Debug, Hash)] pub enum Weight { Thin, ExtraLight, Light, Normal, Medium, SemiBold, Bold, ExtraBold, Black, Other(u16), } impl Weight { /// Returns a numeric representation of a weight. 
#[inline] pub fn to_number(self) -> u16 { match self { Weight::Thin => 100, Weight::ExtraLight => 200, Weight::Light => 300, Weight::Normal => 400, Weight::Medium => 500, Weight::SemiBold => 600, Weight::Bold => 700, Weight::ExtraBold => 800, Weight::Black => 900, Weight::Other(n) => n, } } } impl From for Weight { #[inline] fn from(value: u16) -> Self { match value { 100 => Weight::Thin, 200 => Weight::ExtraLight, 300 => Weight::Light, 400 => Weight::Normal, 500 => Weight::Medium, 600 => Weight::SemiBold, 700 => Weight::Bold, 800 => Weight::ExtraBold, 900 => Weight::Black, _ => Weight::Other(value), } } } impl Default for Weight { #[inline] fn default() -> Self { Weight::Normal } } /// A face [width](https://docs.microsoft.com/en-us/typography/opentype/spec/os2#uswidthclass). #[allow(missing_docs)] #[derive(Clone, Copy, Eq, PartialEq, Ord, PartialOrd, Debug, Hash)] pub enum Width { UltraCondensed, ExtraCondensed, Condensed, SemiCondensed, Normal, SemiExpanded, Expanded, ExtraExpanded, UltraExpanded, } impl Width { /// Returns a numeric representation of a width. #[inline] pub fn to_number(self) -> u16 { match self { Width::UltraCondensed => 1, Width::ExtraCondensed => 2, Width::Condensed => 3, Width::SemiCondensed => 4, Width::Normal => 5, Width::SemiExpanded => 6, Width::Expanded => 7, Width::ExtraExpanded => 8, Width::UltraExpanded => 9, } } } impl Default for Width { #[inline] fn default() -> Self { Width::Normal } } /// Face [permissions](https://docs.microsoft.com/en-us/typography/opentype/spec/os2#fst). #[allow(missing_docs)] #[derive(Clone, Copy, Eq, PartialEq, Ord, PartialOrd, Debug, Hash)] pub enum Permissions { Installable, Restricted, PreviewAndPrint, Editable, } /// A face style. #[derive(Clone, Copy, PartialEq, Eq, Debug, Hash)] pub enum Style { /// A face that is neither italic not obliqued. Normal, /// A form that is generally cursive in nature. Italic, /// A typically-sloped version of the regular face. 
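The `Weight`/`u16` mappings above are exact inverses for the nine named OS/2 weight classes: only the multiples of 100 from 100 through 900 map to a named variant, and everything else round-trips through `Weight::Other`. A one-line sketch of that invariant (the function name is invented for the example):

```rust
/// True when a usWeightClass value maps to one of the nine named classes
/// (Thin..=Black) rather than to `Weight::Other`.
fn is_named_weight_class(value: u16) -> bool {
    (100..=900).contains(&value) && value % 100 == 0
}
```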
Oblique, } impl Default for Style { #[inline] fn default() -> Style { Style::Normal } } /// A script metrics used by subscript and superscript. #[repr(C)] #[derive(Clone, Copy, Eq, PartialEq, Debug, Hash)] pub struct ScriptMetrics { /// Horizontal face size. pub x_size: i16, /// Vertical face size. pub y_size: i16, /// X offset. pub x_offset: i16, /// Y offset. pub y_offset: i16, } // https://docs.microsoft.com/en-us/typography/opentype/spec/os2#fsselection #[derive(Clone, Copy)] struct SelectionFlags(u16); #[rustfmt::skip] impl SelectionFlags { #[inline] fn italic(self) -> bool { self.0 & (1 << 0) != 0 } #[inline] fn bold(self) -> bool { self.0 & (1 << 5) != 0 } // #[inline] fn regular(self) -> bool { self.0 & (1 << 6) != 0 } #[inline] fn use_typo_metrics(self) -> bool { self.0 & (1 << 7) != 0 } #[inline] fn oblique(self) -> bool { self.0 & (1 << 9) != 0 } } /// [Unicode Ranges](https://docs.microsoft.com/en-us/typography/opentype/spec/os2#ur). #[derive(Clone, Copy, Default, Debug)] pub struct UnicodeRanges(u128); impl UnicodeRanges { /// Checks if ranges contain the specified character. 
pub fn contains_char(&self, c: char) -> bool { let range = char_range_index(c); if range >= 0 { self.0 & (1 << range as u128) != 0 } else { false } } } fn char_range_index(c: char) -> i8 { match c as u32 { 0x0000..=0x007F => 0, 0x0080..=0x00FF => 1, 0x0100..=0x017F => 2, 0x0180..=0x024F => 3, 0x0250..=0x02AF => 4, 0x1D00..=0x1DBF => 4, 0x02B0..=0x02FF => 5, 0xA700..=0xA71F => 5, 0x0300..=0x036F => 6, 0x1DC0..=0x1DFF => 6, 0x0370..=0x03FF => 7, 0x2C80..=0x2CFF => 8, 0x0400..=0x052F => 9, 0x2DE0..=0x2DFF => 9, 0xA640..=0xA69F => 9, 0x0530..=0x058F => 10, 0x0590..=0x05FF => 11, 0xA500..=0xA63F => 12, 0x0600..=0x06FF => 13, 0x0750..=0x077F => 13, 0x07C0..=0x07FF => 14, 0x0900..=0x097F => 15, 0x0980..=0x09FF => 16, 0x0A00..=0x0A7F => 17, 0x0A80..=0x0AFF => 18, 0x0B00..=0x0B7F => 19, 0x0B80..=0x0BFF => 20, 0x0C00..=0x0C7F => 21, 0x0C80..=0x0CFF => 22, 0x0D00..=0x0D7F => 23, 0x0E00..=0x0E7F => 24, 0x0E80..=0x0EFF => 25, 0x10A0..=0x10FF => 26, 0x2D00..=0x2D2F => 26, 0x1B00..=0x1B7F => 27, 0x1100..=0x11FF => 28, 0x1E00..=0x1EFF => 29, 0x2C60..=0x2C7F => 29, 0xA720..=0xA7FF => 29, 0x1F00..=0x1FFF => 30, 0x2000..=0x206F => 31, 0x2E00..=0x2E7F => 31, 0x2070..=0x209F => 32, 0x20A0..=0x20CF => 33, 0x20D0..=0x20FF => 34, 0x2100..=0x214F => 35, 0x2150..=0x218F => 36, 0x2190..=0x21FF => 37, 0x27F0..=0x27FF => 37, 0x2900..=0x297F => 37, 0x2B00..=0x2BFF => 37, 0x2200..=0x22FF => 38, 0x2A00..=0x2AFF => 38, 0x27C0..=0x27EF => 38, 0x2980..=0x29FF => 38, 0x2300..=0x23FF => 39, 0x2400..=0x243F => 40, 0x2440..=0x245F => 41, 0x2460..=0x24FF => 42, 0x2500..=0x257F => 43, 0x2580..=0x259F => 44, 0x25A0..=0x25FF => 45, 0x2600..=0x26FF => 46, 0x2700..=0x27BF => 47, 0x3000..=0x303F => 48, 0x3040..=0x309F => 49, 0x30A0..=0x30FF => 50, 0x31F0..=0x31FF => 50, 0x3100..=0x312F => 51, 0x31A0..=0x31BF => 51, 0x3130..=0x318F => 52, 0xA840..=0xA87F => 53, 0x3200..=0x32FF => 54, 0x3300..=0x33FF => 55, 0xAC00..=0xD7AF => 56, // Ignore Non-Plane 0 (57), since this is not a real range. 
0x10900..=0x1091F => 58, 0x4E00..=0x9FFF => 59, 0x2E80..=0x2FDF => 59, 0x2FF0..=0x2FFF => 59, 0x3400..=0x4DBF => 59, 0x20000..=0x2A6DF => 59, 0x3190..=0x319F => 59, 0xE000..=0xF8FF => 60, 0x31C0..=0x31EF => 61, 0xF900..=0xFAFF => 61, 0x2F800..=0x2FA1F => 61, 0xFB00..=0xFB4F => 62, 0xFB50..=0xFDFF => 63, 0xFE20..=0xFE2F => 64, 0xFE10..=0xFE1F => 65, 0xFE30..=0xFE4F => 65, 0xFE50..=0xFE6F => 66, 0xFE70..=0xFEFF => 67, 0xFF00..=0xFFEF => 68, 0xFFF0..=0xFFFF => 69, 0x0F00..=0x0FFF => 70, 0x0700..=0x074F => 71, 0x0780..=0x07BF => 72, 0x0D80..=0x0DFF => 73, 0x1000..=0x109F => 74, 0x1200..=0x139F => 75, 0x2D80..=0x2DDF => 75, 0x13A0..=0x13FF => 76, 0x1400..=0x167F => 77, 0x1680..=0x169F => 78, 0x16A0..=0x16FF => 79, 0x1780..=0x17FF => 80, 0x19E0..=0x19FF => 80, 0x1800..=0x18AF => 81, 0x2800..=0x28FF => 82, 0xA000..=0xA48F => 83, 0xA490..=0xA4CF => 83, 0x1700..=0x177F => 84, 0x10300..=0x1032F => 85, 0x10330..=0x1034F => 86, 0x10400..=0x1044F => 87, 0x1D000..=0x1D24F => 88, 0x1D400..=0x1D7FF => 89, 0xF0000..=0xFFFFD => 90, 0x100000..=0x10FFFD => 90, 0xFE00..=0xFE0F => 91, 0xE0100..=0xE01EF => 91, 0xE0000..=0xE007F => 92, 0x1900..=0x194F => 93, 0x1950..=0x197F => 94, 0x1980..=0x19DF => 95, 0x1A00..=0x1A1F => 96, 0x2C00..=0x2C5F => 97, 0x2D30..=0x2D7F => 98, 0x4DC0..=0x4DFF => 99, 0xA800..=0xA82F => 100, 0x10000..=0x1013F => 101, 0x10140..=0x1018F => 102, 0x10380..=0x1039F => 103, 0x103A0..=0x103DF => 104, 0x10450..=0x1047F => 105, 0x10480..=0x104AF => 106, 0x10800..=0x1083F => 107, 0x10A00..=0x10A5F => 108, 0x1D300..=0x1D35F => 109, 0x12000..=0x123FF => 110, 0x12400..=0x1247F => 110, 0x1D360..=0x1D37F => 111, 0x1B80..=0x1BBF => 112, 0x1C00..=0x1C4F => 113, 0x1C50..=0x1C7F => 114, 0xA880..=0xA8DF => 115, 0xA900..=0xA92F => 116, 0xA930..=0xA95F => 117, 0xAA00..=0xAA5F => 118, 0x10190..=0x101CF => 119, 0x101D0..=0x101FF => 120, 0x102A0..=0x102DF => 121, 0x10280..=0x1029F => 121, 0x10920..=0x1093F => 121, 0x1F030..=0x1F09F => 122, 0x1F000..=0x1F02F => 122, _ => -1, } } /// A 
[OS/2 and Windows Metrics Table](https://docs.microsoft.com/en-us/typography/opentype/spec/os2). #[derive(Clone, Copy)] pub struct Table<'a> { /// Table version. pub version: u8, data: &'a [u8], } impl<'a> Table<'a> { /// Parses a table from raw data. pub fn parse(data: &'a [u8]) -> Option<Self> { let mut s = Stream::new(data); let version = s.read::<u16>()?; let table_len = match version { 0 => 78, 1 => 86, 2 => 96, 3 => 96, 4 => 96, 5 => 100, _ => return None, }; // Do not check the exact length, because some fonts include // padding in table's length in table records, which is incorrect. if data.len() < table_len { return None; } Some(Table { version: version as u8, data, }) } /// Returns weight class. #[inline] pub fn weight(&self) -> Weight { Weight::from(Stream::read_at::<u16>(self.data, WEIGHT_CLASS_OFFSET).unwrap_or(0)) } /// Returns face width. #[inline] pub fn width(&self) -> Width { match Stream::read_at::<u16>(self.data, WIDTH_CLASS_OFFSET).unwrap_or(0) { 1 => Width::UltraCondensed, 2 => Width::ExtraCondensed, 3 => Width::Condensed, 4 => Width::SemiCondensed, 5 => Width::Normal, 6 => Width::SemiExpanded, 7 => Width::Expanded, 8 => Width::ExtraExpanded, 9 => Width::UltraExpanded, _ => Width::Normal, } } /// Returns face permissions. /// /// Returns `None` in case of a malformed value. #[inline] pub fn permissions(&self) -> Option<Permissions> { let n = Stream::read_at::<u16>(self.data, TYPE_OFFSET).unwrap_or(0); if self.version <= 2 { // Version 2 and prior, applications are allowed to take // the most permissive of provided flags let permission = if n & 0xF == 0 { Permissions::Installable } else if n & 8 != 0 { Permissions::Editable } else if n & 4 != 0 { Permissions::PreviewAndPrint } else { Permissions::Restricted }; Some(permission) } else { // Version 3 onwards, flags must be mutually exclusive.
match n & 0xF { 0 => Some(Permissions::Installable), 2 => Some(Permissions::Restricted), 4 => Some(Permissions::PreviewAndPrint), 8 => Some(Permissions::Editable), _ => None, } } } /// Checks if the face allows embedding a subset, further restricted by [`Self::permissions`]. #[inline] pub fn is_subsetting_allowed(&self) -> bool { if self.version <= 1 { // Flag introduced in version 2 true } else { let n = Stream::read_at::<u16>(self.data, TYPE_OFFSET).unwrap_or(0); n & 0x0100 == 0 } } /// Checks if the face allows outline data to be embedded. /// /// If false, only bitmaps may be embedded in accordance with [`Self::permissions`]. /// /// If the font contains no bitmaps and this flag is not set, it implies no embedding is allowed. #[inline] pub fn is_outline_embedding_allowed(&self) -> bool { if self.version <= 1 { // Flag introduced in version 2 true } else { let n = Stream::read_at::<u16>(self.data, TYPE_OFFSET).unwrap_or(0); n & 0x0200 == 0 } } /// Returns subscript metrics. #[inline] pub fn subscript_metrics(&self) -> ScriptMetrics { let mut s = Stream::new_at(self.data, Y_SUBSCRIPT_X_SIZE_OFFSET).unwrap_or_default(); ScriptMetrics { x_size: s.read::<i16>().unwrap_or(0), y_size: s.read::<i16>().unwrap_or(0), x_offset: s.read::<i16>().unwrap_or(0), y_offset: s.read::<i16>().unwrap_or(0), } } /// Returns superscript metrics. #[inline] pub fn superscript_metrics(&self) -> ScriptMetrics { let mut s = Stream::new_at(self.data, Y_SUPERSCRIPT_X_SIZE_OFFSET).unwrap_or_default(); ScriptMetrics { x_size: s.read::<i16>().unwrap_or(0), y_size: s.read::<i16>().unwrap_or(0), x_offset: s.read::<i16>().unwrap_or(0), y_offset: s.read::<i16>().unwrap_or(0), } } /// Returns strikeout metrics. #[inline] pub fn strikeout_metrics(&self) -> LineMetrics { LineMetrics { thickness: Stream::read_at::<i16>(self.data, Y_STRIKEOUT_SIZE_OFFSET).unwrap_or(0), position: Stream::read_at::<i16>(self.data, Y_STRIKEOUT_POSITION_OFFSET).unwrap_or(0), } } /// Returns Unicode ranges.
#[inline] pub fn unicode_ranges(&self) -> UnicodeRanges { let mut s = Stream::new_at(self.data, UNICODE_RANGES_OFFSET).unwrap(); let n1 = s.read::<u32>().unwrap_or(0) as u128; let n2 = s.read::<u32>().unwrap_or(0) as u128; let n3 = s.read::<u32>().unwrap_or(0) as u128; let n4 = s.read::<u32>().unwrap_or(0) as u128; UnicodeRanges(n4 << 96 | n3 << 64 | n2 << 32 | n1) } #[inline] fn fs_selection(&self) -> u16 { Stream::read_at::<u16>(self.data, SELECTION_OFFSET).unwrap_or(0) } /// Returns style. pub fn style(&self) -> Style { let flags = SelectionFlags(self.fs_selection()); if flags.italic() { Style::Italic } else if self.version >= 4 && flags.oblique() { Style::Oblique } else { Style::Normal } } /// Checks if face is bold. /// /// Do not confuse with [`Weight::Bold`]. #[inline] pub fn is_bold(&self) -> bool { SelectionFlags(self.fs_selection()).bold() } /// Checks if typographic metrics should be used. #[inline] pub fn use_typographic_metrics(&self) -> bool { if self.version < 4 { false } else { SelectionFlags(self.fs_selection()).use_typo_metrics() } } /// Returns typographic ascender. #[inline] pub fn typographic_ascender(&self) -> i16 { Stream::read_at::<i16>(self.data, TYPO_ASCENDER_OFFSET).unwrap_or(0) } /// Returns typographic descender. #[inline] pub fn typographic_descender(&self) -> i16 { Stream::read_at::<i16>(self.data, TYPO_DESCENDER_OFFSET).unwrap_or(0) } /// Returns typographic line gap. #[inline] pub fn typographic_line_gap(&self) -> i16 { Stream::read_at::<i16>(self.data, TYPO_LINE_GAP_OFFSET).unwrap_or(0) } /// Returns Windows ascender. #[inline] pub fn windows_ascender(&self) -> i16 { Stream::read_at::<i16>(self.data, WIN_ASCENT).unwrap_or(0) } /// Returns Windows descender. #[inline] pub fn windows_descender(&self) -> i16 { // Should be negated. -Stream::read_at::<i16>(self.data, WIN_DESCENT).unwrap_or(0) } /// Returns x height. /// /// Returns `None` if version is < 2.
#[inline] pub fn x_height(&self) -> Option<i16> { if self.version < 2 { None } else { Stream::read_at::<i16>(self.data, X_HEIGHT_OFFSET) } } /// Returns capital height. /// /// Returns `None` if version is < 2. #[inline] pub fn capital_height(&self) -> Option<i16> { if self.version < 2 { None } else { Stream::read_at::<i16>(self.data, CAP_HEIGHT_OFFSET) } } } impl core::fmt::Debug for Table<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "Table {{ ... }}") } } ttf-parser-0.24.1/src/tables/post.rs //! A [PostScript Table]( //! https://docs.microsoft.com/en-us/typography/opentype/spec/post) implementation. use crate::parser::{Fixed, LazyArray16, Stream}; #[cfg(feature = "glyph-names")] use crate::GlyphId; use crate::LineMetrics; const ITALIC_ANGLE_OFFSET: usize = 4; const UNDERLINE_POSITION_OFFSET: usize = 8; const UNDERLINE_THICKNESS_OFFSET: usize = 10; const IS_FIXED_PITCH_OFFSET: usize = 12; // https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6post.html /// A list of Macintosh glyph names.
#[cfg(feature = "glyph-names")] const MACINTOSH_NAMES: &[&str] = &[ ".notdef", ".null", "nonmarkingreturn", "space", "exclam", "quotedbl", "numbersign", "dollar", "percent", "ampersand", "quotesingle", "parenleft", "parenright", "asterisk", "plus", "comma", "hyphen", "period", "slash", "zero", "one", "two", "three", "four", "five", "six", "seven", "eight", "nine", "colon", "semicolon", "less", "equal", "greater", "question", "at", "A", "B", "C", "D", "E", "F", "G", "H", "I", "J", "K", "L", "M", "N", "O", "P", "Q", "R", "S", "T", "U", "V", "W", "X", "Y", "Z", "bracketleft", "backslash", "bracketright", "asciicircum", "underscore", "grave", "a", "b", "c", "d", "e", "f", "g", "h", "i", "j", "k", "l", "m", "n", "o", "p", "q", "r", "s", "t", "u", "v", "w", "x", "y", "z", "braceleft", "bar", "braceright", "asciitilde", "Adieresis", "Aring", "Ccedilla", "Eacute", "Ntilde", "Odieresis", "Udieresis", "aacute", "agrave", "acircumflex", "adieresis", "atilde", "aring", "ccedilla", "eacute", "egrave", "ecircumflex", "edieresis", "iacute", "igrave", "icircumflex", "idieresis", "ntilde", "oacute", "ograve", "ocircumflex", "odieresis", "otilde", "uacute", "ugrave", "ucircumflex", "udieresis", "dagger", "degree", "cent", "sterling", "section", "bullet", "paragraph", "germandbls", "registered", "copyright", "trademark", "acute", "dieresis", "notequal", "AE", "Oslash", "infinity", "plusminus", "lessequal", "greaterequal", "yen", "mu", "partialdiff", "summation", "product", "pi", "integral", "ordfeminine", "ordmasculine", "Omega", "ae", "oslash", "questiondown", "exclamdown", "logicalnot", "radical", "florin", "approxequal", "Delta", "guillemotleft", "guillemotright", "ellipsis", "nonbreakingspace", "Agrave", "Atilde", "Otilde", "OE", "oe", "endash", "emdash", "quotedblleft", "quotedblright", "quoteleft", "quoteright", "divide", "lozenge", "ydieresis", "Ydieresis", "fraction", "currency", "guilsinglleft", "guilsinglright", "fi", "fl", "daggerdbl", "periodcentered", "quotesinglbase", 
"quotedblbase", "perthousand", "Acircumflex", "Ecircumflex", "Aacute", "Edieresis", "Egrave", "Iacute", "Icircumflex", "Idieresis", "Igrave", "Oacute", "Ocircumflex", "apple", "Ograve", "Uacute", "Ucircumflex", "Ugrave", "dotlessi", "circumflex", "tilde", "macron", "breve", "dotaccent", "ring", "cedilla", "hungarumlaut", "ogonek", "caron", "Lslash", "lslash", "Scaron", "scaron", "Zcaron", "zcaron", "brokenbar", "Eth", "eth", "Yacute", "yacute", "Thorn", "thorn", "minus", "multiply", "onesuperior", "twosuperior", "threesuperior", "onehalf", "onequarter", "threequarters", "franc", "Gbreve", "gbreve", "Idotaccent", "Scedilla", "scedilla", "Cacute", "cacute", "Ccaron", "ccaron", "dcroat", ]; /// An iterator over glyph names. /// /// The `post` table doesn't provide the glyph names count, /// so we have to simply iterate over all of them to find it out. #[derive(Clone, Copy, Default)] pub struct Names<'a> { data: &'a [u8], offset: usize, } impl core::fmt::Debug for Names<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "Names {{ ... }}") } } impl<'a> Iterator for Names<'a> { type Item = &'a str; fn next(&mut self) -> Option<Self::Item> { // Glyph names are stored as Pascal Strings. // Meaning u8 (len) + [u8] (data). if self.offset >= self.data.len() { return None; } let len = self.data[self.offset]; self.offset += 1; // An empty name is an error. if len == 0 { return None; } let name = self.data.get(self.offset..self.offset + usize::from(len))?; self.offset += usize::from(len); core::str::from_utf8(name).ok() } } /// A [PostScript Table](https://docs.microsoft.com/en-us/typography/opentype/spec/post). #[derive(Clone, Copy, Debug)] pub struct Table<'a> { /// Italic angle in counter-clockwise degrees from the vertical. pub italic_angle: f32, /// Underline metrics. pub underline_metrics: LineMetrics, /// Flag that indicates that the font is monospaced.
pub is_monospaced: bool, glyph_indexes: LazyArray16<'a, u16>, names_data: &'a [u8], } impl<'a> Table<'a> { /// Parses a table from raw data. pub fn parse(data: &'a [u8]) -> Option<Self> { // Do not check the exact length, because some fonts include // padding in table's length in table records, which is incorrect. if data.len() < 32 { return None; } let version = Stream::new(data).read::<u32>()?; if !(version == 0x00010000 || version == 0x00020000 || version == 0x00025000 || version == 0x00030000 || version == 0x00040000) { return None; } let italic_angle = Stream::read_at::<Fixed>(data, ITALIC_ANGLE_OFFSET)?.0; let underline_metrics = LineMetrics { position: Stream::read_at::<i16>(data, UNDERLINE_POSITION_OFFSET)?, thickness: Stream::read_at::<i16>(data, UNDERLINE_THICKNESS_OFFSET)?, }; let is_monospaced = Stream::read_at::<u32>(data, IS_FIXED_PITCH_OFFSET)? != 0; let mut names_data: &[u8] = &[]; let mut glyph_indexes = LazyArray16::default(); // Only version 2.0 of the table has data at the end. if version == 0x00020000 { let mut s = Stream::new_at(data, 32)?; let indexes_count = s.read::<u16>()?; glyph_indexes = s.read_array16::<u16>(indexes_count)?; names_data = s.tail()?; } Some(Table { italic_angle, underline_metrics, is_monospaced, names_data, glyph_indexes, }) } /// Returns a glyph name by ID. #[cfg(feature = "glyph-names")] pub fn glyph_name(&self, glyph_id: GlyphId) -> Option<&'a str> { let mut index = self.glyph_indexes.get(glyph_id.0)?; // 'If the name index is between 0 and 257, treat the name index // as a glyph index in the Macintosh standard order.' if usize::from(index) < MACINTOSH_NAMES.len() { Some(MACINTOSH_NAMES[usize::from(index)]) } else { // 'If the name index is between 258 and 65535, then subtract 258 and use that // to index into the list of Pascal strings at the end of the table.' index -= MACINTOSH_NAMES.len() as u16; self.names().nth(usize::from(index)) } } /// Returns a glyph ID by a name.
#[cfg(feature = "glyph-names")] pub fn glyph_index_by_name(&self, name: &str) -> Option<GlyphId> { let id = if let Some(index) = MACINTOSH_NAMES.iter().position(|n| *n == name) { self.glyph_indexes .into_iter() .position(|i| usize::from(i) == index)? } else { let mut index = self.names().position(|n| n == name)?; index += MACINTOSH_NAMES.len(); self.glyph_indexes .into_iter() .position(|i| usize::from(i) == index)? }; Some(GlyphId(id as u16)) } /// Returns an iterator over glyph names. /// /// Default/predefined names are not included. Just the ones in the font file. pub fn names(&self) -> Names<'a> { Names { data: self.names_data, offset: 0, } } } ttf-parser-0.24.1/src/tables/sbix.rs //! A [Standard Bitmap Graphics Table]( //! https://docs.microsoft.com/en-us/typography/opentype/spec/sbix) implementation. use core::convert::TryFrom; use core::num::NonZeroU16; use crate::parser::{FromData, LazyArray16, LazyArray32, Offset, Offset32, Stream}; use crate::{GlyphId, RasterGlyphImage, RasterImageFormat, Tag}; /// A strike of glyphs. #[derive(Clone, Copy)] pub struct Strike<'a> { /// The pixels per EM size for which this strike was designed. pub pixels_per_em: u16, /// The device pixel density (in PPI) for which this strike was designed. pub ppi: u16, offsets: LazyArray16<'a, Offset32>, /// Data from the beginning of the `Strikes` table. data: &'a [u8], } impl<'a> Strike<'a> { fn parse(number_of_glyphs: u16, data: &'a [u8]) -> Option<Self> { let mut s = Stream::new(data); let pixels_per_em = s.read::<u16>()?; let ppi = s.read::<u16>()?; let offsets = s.read_array16(number_of_glyphs)?; Some(Strike { pixels_per_em, ppi, offsets, data, }) } /// Returns glyph data. pub fn get(&self, glyph_id: GlyphId) -> Option<RasterGlyphImage<'a>> { self.get_inner(glyph_id, 0) } fn get_inner(&self, glyph_id: GlyphId, depth: u8) -> Option<RasterGlyphImage<'a>> { // Recursive `dupe`. Bail.
if depth == 10 { return None; } let start = self.offsets.get(glyph_id.0)?.to_usize(); let end = self.offsets.get(glyph_id.0.checked_add(1)?)?.to_usize(); if start == end { return None; } let data_len = end.checked_sub(start)?.checked_sub(8)?; // 8 is a Glyph data header size. let mut s = Stream::new_at(self.data, start)?; let x = s.read::<i16>()?; let y = s.read::<i16>()?; let image_type = s.read::<Tag>()?; let image_data = s.read_bytes(data_len)?; // We do ignore `pdf` and `mask` intentionally, because Apple docs state that: // 'Support for the 'pdf ' and 'mask' data types and sbixDrawOutlines flag // are planned for future releases of iOS and OS X.' let format = match &image_type.to_bytes() { b"png " => RasterImageFormat::PNG, b"dupe" => { // 'The special graphicType of 'dupe' indicates that // the data field contains a glyph ID. The bitmap data for // the indicated glyph should be used for the current glyph.' let glyph_id = GlyphId::parse(image_data)?; // TODO: The spec isn't clear about which x/y values we should use. // The current glyph or the referenced one. return self.get_inner(glyph_id, depth + 1); } _ => { // TODO: support JPEG and TIFF return None; } }; let (width, height) = png_size(image_data)?; Some(RasterGlyphImage { x, y, width, height, pixels_per_em: self.pixels_per_em, format, data: image_data, }) } /// Returns the number of glyphs in this strike. #[inline] pub fn len(&self) -> u16 { // The last offset simply indicates the glyph data end. We don't need it. self.offsets.len() - 1 } /// Checks if there are any glyphs. pub fn is_empty(&self) -> bool { self.len() == 0 } } impl core::fmt::Debug for Strike<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "Strike {{ ... }}") } } /// A list of [`Strike`]s. #[derive(Clone, Copy)] pub struct Strikes<'a> { /// `sbix` table data. data: &'a [u8], // Offsets from the beginning of the `sbix` table. offsets: LazyArray32<'a, Offset32>, // The total number of glyphs in the face + 1.
// From the `maxp` table. number_of_glyphs: u16, } impl<'a> Strikes<'a> { /// Returns a strike at the index. pub fn get(&self, index: u32) -> Option<Strike<'a>> { let offset = self.offsets.get(index)?.to_usize(); let data = self.data.get(offset..)?; Strike::parse(self.number_of_glyphs, data) } /// Returns the number of strikes. #[inline] pub fn len(&self) -> u32 { self.offsets.len() } /// Checks if there are any strikes. pub fn is_empty(&self) -> bool { self.offsets.is_empty() } } impl core::fmt::Debug for Strikes<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "Strikes {{ ... }}") } } impl<'a> IntoIterator for Strikes<'a> { type Item = Strike<'a>; type IntoIter = StrikesIter<'a>; #[inline] fn into_iter(self) -> Self::IntoIter { StrikesIter { strikes: self, index: 0, } } } /// An iterator over [`Strikes`]. #[allow(missing_debug_implementations)] pub struct StrikesIter<'a> { strikes: Strikes<'a>, index: u32, } impl<'a> Iterator for StrikesIter<'a> { type Item = Strike<'a>; fn next(&mut self) -> Option<Self::Item> { if self.index < self.strikes.len() { self.index += 1; self.strikes.get(self.index - 1) } else { None } } } /// A [Standard Bitmap Graphics Table]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/sbix). #[derive(Clone, Copy, Debug)] pub struct Table<'a> { /// A list of [`Strike`]s. pub strikes: Strikes<'a>, } impl<'a> Table<'a> { /// Parses a table from raw data. /// /// - `number_of_glyphs` is from the `maxp` table. pub fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> { let number_of_glyphs = number_of_glyphs.get().checked_add(1)?; let mut s = Stream::new(data); let version = s.read::<u16>()?; if version != 1 { return None; } s.skip::<u16>(); // flags let strikes_count = s.read::<u32>()?; if strikes_count == 0 { return None; } let offsets = s.read_array32::<Offset32>(strikes_count)?; Some(Table { strikes: Strikes { data, offsets, number_of_glyphs, }, }) } /// Selects the best matching [`Strike`] based on `pixels_per_em`.
pub fn best_strike(&self, pixels_per_em: u16) -> Option<Strike<'a>> { let mut idx = 0; let mut max_ppem = 0; for (i, strike) in self.strikes.into_iter().enumerate() { if (pixels_per_em <= strike.pixels_per_em && strike.pixels_per_em < max_ppem) || (pixels_per_em > max_ppem && strike.pixels_per_em > max_ppem) { idx = i as u32; max_ppem = strike.pixels_per_em; } } self.strikes.get(idx) } } // The `sbix` table doesn't store the image size, so we have to parse it manually. // Which is quite simple in case of PNG, but way more complex for JPEG. // Therefore we are omitting it for now. fn png_size(data: &[u8]) -> Option<(u16, u16)> { // PNG stores its size as u32 BE at a fixed offset. let mut s = Stream::new_at(data, 16)?; let width = s.read::<u32>()?; let height = s.read::<u32>()?; // PNG size larger than u16::MAX is an error. Some((u16::try_from(width).ok()?, u16::try_from(height).ok()?)) } ttf-parser-0.24.1/src/tables/svg.rs //! An [SVG Table](https://docs.microsoft.com/en-us/typography/opentype/spec/svg) implementation. use crate::parser::{FromData, LazyArray16, NumFrom, Offset, Offset32, Stream}; use crate::GlyphId; /// An [SVG document]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/svg#svg-document-list). #[derive(Clone, Copy, Debug)] pub struct SvgDocument<'a> { /// The SVG document data. /// /// Can be stored as a string or as gzip compressed data, aka SVGZ. pub data: &'a [u8], /// The first glyph ID for the range covered by this record. pub start_glyph_id: GlyphId, /// The last glyph ID, *inclusive*, for the range covered by this record. pub end_glyph_id: GlyphId, } impl SvgDocument<'_> { /// Returns the glyphs range.
pub fn glyphs_range(&self) -> core::ops::RangeInclusive<GlyphId> { self.start_glyph_id..=self.end_glyph_id } } #[derive(Clone, Copy)] struct SvgDocumentRecord { start_glyph_id: GlyphId, end_glyph_id: GlyphId, svg_doc_offset: Option<Offset32>, svg_doc_length: u32, } impl SvgDocumentRecord { fn glyphs_range(&self) -> core::ops::RangeInclusive<GlyphId> { self.start_glyph_id..=self.end_glyph_id } } impl FromData for SvgDocumentRecord { const SIZE: usize = 12; #[inline] fn parse(data: &[u8]) -> Option<Self> { let mut s = Stream::new(data); Some(SvgDocumentRecord { start_glyph_id: s.read::<GlyphId>()?, end_glyph_id: s.read::<GlyphId>()?, svg_doc_offset: s.read::<Option<Offset32>>()?, svg_doc_length: s.read::<u32>()?, }) } } /// A list of [SVG documents]( /// https://docs.microsoft.com/en-us/typography/opentype/spec/svg#svg-document-list). #[derive(Clone, Copy)] pub struct SvgDocumentsList<'a> { data: &'a [u8], records: LazyArray16<'a, SvgDocumentRecord>, } impl<'a> SvgDocumentsList<'a> { /// Returns SVG document data at index. /// /// `index` is not a GlyphId. You should use [`find()`](SvgDocumentsList::find) instead. #[inline] pub fn get(&self, index: u16) -> Option<SvgDocument<'a>> { let record = self.records.get(index)?; let offset = record.svg_doc_offset?.to_usize(); self.data .get(offset..offset + usize::num_from(record.svg_doc_length)) .map(|data| SvgDocument { data, start_glyph_id: record.start_glyph_id, end_glyph_id: record.end_glyph_id, }) } /// Returns SVG document data by glyph ID. #[inline] pub fn find(&self, glyph_id: GlyphId) -> Option<SvgDocument<'a>> { let index = self .records .into_iter() .position(|v| v.glyphs_range().contains(&glyph_id))?; self.get(index as u16) } /// Returns the number of SVG documents in the list. pub fn len(&self) -> u16 { self.records.len() } /// Checks if the list is empty. pub fn is_empty(&self) -> bool { self.records.is_empty() } } impl core::fmt::Debug for SvgDocumentsList<'_> { fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result { write!(f, "SvgDocumentsList {{ ...
}}") } } impl<'a> IntoIterator for SvgDocumentsList<'a> { type Item = SvgDocument<'a>; type IntoIter = SvgDocumentsListIter<'a>; #[inline] fn into_iter(self) -> Self::IntoIter { SvgDocumentsListIter { list: self, index: 0, } } } /// An iterator over [`SvgDocumentsList`] values. #[derive(Clone, Copy)] #[allow(missing_debug_implementations)] pub struct SvgDocumentsListIter<'a> { list: SvgDocumentsList<'a>, index: u16, } impl<'a> Iterator for SvgDocumentsListIter<'a> { type Item = SvgDocument<'a>; #[inline] fn next(&mut self) -> Option<Self::Item> { if self.index < self.list.len() { self.index += 1; self.list.get(self.index - 1) } else { None } } #[inline] fn count(self) -> usize { usize::from(self.list.len().saturating_sub(self.index)) } } /// An [SVG Table](https://docs.microsoft.com/en-us/typography/opentype/spec/svg). #[derive(Clone, Copy, Debug)] pub struct Table<'a> { /// A list of SVG documents. pub documents: SvgDocumentsList<'a>, } impl<'a> Table<'a> { /// Parses a table from raw data. pub fn parse(data: &'a [u8]) -> Option<Self> { let mut s = Stream::new(data); s.skip::<u16>(); // version let doc_list_offset = s.read::<Option<Offset32>>()??; let mut s = Stream::new_at(data, doc_list_offset.to_usize())?; let count = s.read::<u16>()?; let records = s.read_array16::<SvgDocumentRecord>(count)?; Some(Table { documents: SvgDocumentsList { data: &data[doc_list_offset.0 as usize..], records, }, }) } } ttf-parser-0.24.1/src/tables/trak.rs //! A [Tracking Table]( //! https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6trak.html) implementation. use crate::parser::{Fixed, FromData, LazyArray16, Offset, Offset16, Offset32, Stream}; #[derive(Clone, Copy, Debug)] struct TrackTableRecord { value: Fixed, name_id: u16, offset: Offset16, // Offset from start of the table.
} impl FromData for TrackTableRecord { const SIZE: usize = 8; #[inline] fn parse(data: &[u8]) -> Option<Self> { let mut s = Stream::new(data); Some(TrackTableRecord { value: s.read::<Fixed>()?, name_id: s.read::<u16>()?, offset: s.read::<Offset16>()?, }) } } /// A single track. #[derive(Clone, Copy, Debug)] pub struct Track<'a> { /// A track value. pub value: f32, /// The `name` table index for the track's name. pub name_index: u16, /// A list of tracking values for each size. pub values: LazyArray16<'a, i16>, } /// A list of tracks. #[derive(Clone, Copy, Default, Debug)] pub struct Tracks<'a> { data: &'a [u8], // the whole table records: LazyArray16<'a, TrackTableRecord>, sizes_count: u16, } impl<'a> Tracks<'a> { /// Returns a track at index. pub fn get(&self, index: u16) -> Option<Track<'a>> { let record = self.records.get(index)?; let mut s = Stream::new(self.data.get(record.offset.to_usize()..)?); Some(Track { value: record.value.0, values: s.read_array16::<i16>(self.sizes_count)?, name_index: record.name_id, }) } /// Returns the number of tracks. pub fn len(&self) -> u16 { self.records.len() } /// Checks if there are any tracks. pub fn is_empty(&self) -> bool { self.records.is_empty() } } impl<'a> IntoIterator for Tracks<'a> { type Item = Track<'a>; type IntoIter = TracksIter<'a>; #[inline] fn into_iter(self) -> Self::IntoIter { TracksIter { tracks: self, index: 0, } } } /// An iterator over [`Tracks`]. #[allow(missing_debug_implementations)] pub struct TracksIter<'a> { tracks: Tracks<'a>, index: u16, } impl<'a> Iterator for TracksIter<'a> { type Item = Track<'a>; fn next(&mut self) -> Option<Self::Item> { if self.index < self.tracks.len() { self.index += 1; self.tracks.get(self.index - 1) } else { None } } } /// A track data. #[derive(Clone, Copy, Default, Debug)] pub struct TrackData<'a> { /// A list of tracks. pub tracks: Tracks<'a>, /// A list of sizes.
pub sizes: LazyArray16<'a, Fixed>, } impl<'a> TrackData<'a> { fn parse(offset: usize, data: &'a [u8]) -> Option<Self> { let mut s = Stream::new_at(data, offset)?; let tracks_count = s.read::<u16>()?; let sizes_count = s.read::<u16>()?; let size_table_offset = s.read::<Offset32>()?; // Offset from start of the table. let tracks = Tracks { data, records: s.read_array16::<TrackTableRecord>(tracks_count)?, sizes_count, }; // TODO: Isn't the size table directly after the tracks table?! // Why do we need an offset then? let sizes = { let mut s = Stream::new_at(data, size_table_offset.to_usize())?; s.read_array16::<Fixed>(sizes_count)? }; Some(TrackData { tracks, sizes }) } } /// A [Tracking Table]( /// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6trak.html). #[derive(Clone, Copy, Debug)] pub struct Table<'a> { /// Horizontal track data. pub horizontal: TrackData<'a>, /// Vertical track data. pub vertical: TrackData<'a>, } impl<'a> Table<'a> { /// Parses a table from raw data. pub fn parse(data: &'a [u8]) -> Option<Self> { let mut s = Stream::new(data); let version = s.read::<u32>()?; if version != 0x00010000 { return None; } let format = s.read::<u16>()?; if format != 0 { return None; } let hor_offset = s.read::<Option<Offset16>>()?; let ver_offset = s.read::<Option<Offset16>>()?; s.skip::<u16>(); // reserved let horizontal = if let Some(offset) = hor_offset { TrackData::parse(offset.to_usize(), data)? } else { TrackData::default() }; let vertical = if let Some(offset) = ver_offset { TrackData::parse(offset.to_usize(), data)? } else { TrackData::default() }; Some(Table { horizontal, vertical, }) } } ttf-parser-0.24.1/src/tables/vhea.rs //! A [Vertical Header Table]( //! https://docs.microsoft.com/en-us/typography/opentype/spec/vhea) implementation. use crate::parser::Stream; /// A [Vertical Header Table](https://docs.microsoft.com/en-us/typography/opentype/spec/vhea). #[derive(Clone, Copy, Default, Debug)] pub struct Table { /// Face ascender. pub ascender: i16, /// Face descender.
    pub descender: i16,
    /// Face line gap.
    pub line_gap: i16,
    /// Number of metrics in the `vmtx` table.
    pub number_of_metrics: u16,
}

impl Table {
    /// Parses a table from raw data.
    pub fn parse(data: &[u8]) -> Option<Self> {
        // Do not check the exact length, because some fonts include
        // padding in table's length in table records, which is incorrect.
        if data.len() < 36 {
            return None;
        }

        let mut s = Stream::new(data);
        s.skip::<u32>(); // version
        let ascender = s.read::<i16>()?;
        let descender = s.read::<i16>()?;
        let line_gap = s.read::<i16>()?;
        s.advance(24);
        let number_of_metrics = s.read::<u16>()?;

        Some(Table {
            ascender,
            descender,
            line_gap,
            number_of_metrics,
        })
    }
}
ttf-parser-0.24.1/src/tables/vorg.rs000064400000000000000000000035411046102023000153660ustar 00000000000000
//! A [Vertical Origin Table](
//! https://docs.microsoft.com/en-us/typography/opentype/spec/vorg) implementation.

use crate::parser::{FromData, LazyArray16, Stream};
use crate::GlyphId;

/// Vertical origin metrics for the
/// [Vertical Origin Table](https://docs.microsoft.com/en-us/typography/opentype/spec/vorg).
#[derive(Clone, Copy, Debug)]
pub struct VerticalOriginMetrics {
    /// Glyph ID.
    pub glyph_id: GlyphId,
    /// Y coordinate, in the font's design coordinate system, of the vertical origin.
    pub y: i16,
}

impl FromData for VerticalOriginMetrics {
    const SIZE: usize = 4;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(VerticalOriginMetrics {
            glyph_id: s.read::<GlyphId>()?,
            y: s.read::<i16>()?,
        })
    }
}

/// A [Vertical Origin Table](https://docs.microsoft.com/en-us/typography/opentype/spec/vorg).
#[derive(Clone, Copy, Debug)]
pub struct Table<'a> {
    /// Default origin.
    pub default_y: i16,
    /// A list of metrics for each glyph.
    ///
    /// Ordered by `glyph_id`.
    pub metrics: LazyArray16<'a, VerticalOriginMetrics>,
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let version = s.read::<u32>()?;
        if version != 0x00010000 {
            return None;
        }

        let default_y = s.read::<i16>()?;
        let count = s.read::<u16>()?;
        let metrics = s.read_array16::<VerticalOriginMetrics>(count)?;
        Some(Table { default_y, metrics })
    }

    /// Returns glyph's Y origin.
    pub fn glyph_y_origin(&self, glyph_id: GlyphId) -> i16 {
        self.metrics
            .binary_search_by(|m| m.glyph_id.cmp(&glyph_id))
            .map(|(_, m)| m.y)
            .unwrap_or(self.default_y)
    }
}
ttf-parser-0.24.1/src/tables/vvar.rs000064400000000000000000000076361046102023000154000ustar 00000000000000
//! A [Vertical Metrics Variations Table](
//! https://docs.microsoft.com/en-us/typography/opentype/spec/vvar) implementation.

use crate::delta_set::DeltaSetIndexMap;
use crate::parser::{Offset, Offset32, Stream};
use crate::var_store::ItemVariationStore;
use crate::{GlyphId, NormalizedCoordinate};

/// A [Vertical Metrics Variations Table](
/// https://docs.microsoft.com/en-us/typography/opentype/spec/vvar).
#[derive(Clone, Copy)]
pub struct Table<'a> {
    data: &'a [u8],
    variation_store: ItemVariationStore<'a>,
    advance_height_mapping_offset: Option<Offset32>,
    tsb_mapping_offset: Option<Offset32>,
    bsb_mapping_offset: Option<Offset32>,
    vorg_mapping_offset: Option<Offset32>,
}

impl<'a> Table<'a> {
    /// Parses a table from raw data.
    pub fn parse(data: &'a [u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        let version = s.read::<u32>()?;
        if version != 0x00010000 {
            return None;
        }

        let variation_store_offset = s.read::<Offset32>()?;
        let var_store_s = Stream::new_at(data, variation_store_offset.to_usize())?;
        let variation_store = ItemVariationStore::parse(var_store_s)?;

        Some(Table {
            data,
            variation_store,
            advance_height_mapping_offset: s.read::<Option<Offset32>>()?,
            tsb_mapping_offset: s.read::<Option<Offset32>>()?,
            bsb_mapping_offset: s.read::<Option<Offset32>>()?,
            vorg_mapping_offset: s.read::<Option<Offset32>>()?,
        })
    }

    /// Returns the advance height offset for a glyph.
    #[inline]
    pub fn advance_offset(
        &self,
        glyph_id: GlyphId,
        coordinates: &[NormalizedCoordinate],
    ) -> Option<f32> {
        let (outer_idx, inner_idx) = if let Some(offset) = self.advance_height_mapping_offset {
            DeltaSetIndexMap::new(self.data.get(offset.to_usize()..)?).map(glyph_id.0 as u32)?
        } else {
            // 'If there is no delta-set index mapping table for advance widths,
            // then glyph IDs implicitly provide the indices:
            // for a given glyph ID, the delta-set outer-level index is zero,
            // and the glyph ID is the delta-set inner-level index.'
            (0, glyph_id.0)
        };

        self.variation_store
            .parse_delta(outer_idx, inner_idx, coordinates)
    }

    /// Returns the top side bearing offset for a glyph.
    #[inline]
    pub fn top_side_bearing_offset(
        &self,
        glyph_id: GlyphId,
        coordinates: &[NormalizedCoordinate],
    ) -> Option<f32> {
        let set_data = self.data.get(self.tsb_mapping_offset?.to_usize()..)?;
        self.side_bearing_offset(glyph_id, coordinates, set_data)
    }

    /// Returns the bottom side bearing offset for a glyph.
    #[inline]
    pub fn bottom_side_bearing_offset(
        &self,
        glyph_id: GlyphId,
        coordinates: &[NormalizedCoordinate],
    ) -> Option<f32> {
        let set_data = self.data.get(self.bsb_mapping_offset?.to_usize()..)?;
        self.side_bearing_offset(glyph_id, coordinates, set_data)
    }

    /// Returns the vertical origin offset for a glyph.
    #[inline]
    pub fn vertical_origin_offset(
        &self,
        glyph_id: GlyphId,
        coordinates: &[NormalizedCoordinate],
    ) -> Option<f32> {
        let set_data = self.data.get(self.vorg_mapping_offset?.to_usize()..)?;
        self.side_bearing_offset(glyph_id, coordinates, set_data)
    }

    fn side_bearing_offset(
        &self,
        glyph_id: GlyphId,
        coordinates: &[NormalizedCoordinate],
        set_data: &[u8],
    ) -> Option<f32> {
        let (outer_idx, inner_idx) = DeltaSetIndexMap::new(set_data).map(glyph_id.0 as u32)?;
        self.variation_store
            .parse_delta(outer_idx, inner_idx, coordinates)
    }
}

impl core::fmt::Debug for Table<'_> {
    fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
        write!(f, "Table {{ ... }}")
    }
}
ttf-parser-0.24.1/src/var_store.rs000064400000000000000000000141111046102023000151360ustar 00000000000000
//! Implementation of Item Variation Store
//!
//!

use crate::parser::{FromData, LazyArray16, NumFrom, Stream};
use crate::NormalizedCoordinate;

#[derive(Clone, Copy, Debug)]
pub(crate) struct ItemVariationStore<'a> {
    data: &'a [u8],
    data_offsets: LazyArray16<'a, u32>,
    pub regions: VariationRegionList<'a>,
}

impl<'a> Default for ItemVariationStore<'a> {
    #[inline]
    fn default() -> Self {
        ItemVariationStore {
            data: &[],
            data_offsets: LazyArray16::new(&[]),
            regions: VariationRegionList {
                axis_count: 0,
                regions: LazyArray16::new(&[]),
            },
        }
    }
}

impl<'a> ItemVariationStore<'a> {
    #[inline]
    pub fn parse(mut s: Stream<'a>) -> Option<Self> {
        let data = s.tail()?;

        let mut regions_s = s.clone();
        let format = s.read::<u16>()?;
        if format != 1 {
            return None;
        }

        let region_list_offset = s.read::<u32>()?;
        let count = s.read::<u16>()?;
        let offsets = s.read_array16::<u32>(count)?;

        let regions = {
            regions_s.advance(usize::num_from(region_list_offset));
            // TODO: should be the same as in `fvar`
            let axis_count = regions_s.read::<u16>()?;
            let count = regions_s.read::<u16>()?;
            let total = count.checked_mul(axis_count)?;
            VariationRegionList {
                axis_count,
                regions: regions_s.read_array16::<RegionAxisCoordinatesRecord>(total)?,
            }
        };

        Some(ItemVariationStore {
            data,
            data_offsets: offsets,
            regions,
        })
    }

    pub fn region_indices(&self, index: u16) -> Option<LazyArray16<'a, u16>> {
        // Offsets in bytes from the start of the item variation store
        // to each item variation data subtable.
        let offset = self.data_offsets.get(index)?;
        let mut s = Stream::new_at(self.data, usize::num_from(offset))?;
        s.skip::<u16>(); // item_count
        s.skip::<u16>(); // short_delta_count
        let count = s.read::<u16>()?;
        s.read_array16::<u16>(count)
    }

    pub fn parse_delta(
        &self,
        outer_index: u16,
        inner_index: u16,
        coordinates: &[NormalizedCoordinate],
    ) -> Option<f32> {
        let offset = self.data_offsets.get(outer_index)?;
        let mut s = Stream::new_at(self.data, usize::num_from(offset))?;
        let item_count = s.read::<u16>()?;
        let word_delta_count = s.read::<u16>()?;
        let region_index_count = s.read::<u16>()?;
        let region_indices = s.read_array16::<u16>(region_index_count)?;

        if inner_index >= item_count {
            return None;
        }

        let has_long_words = (word_delta_count & 0x8000) != 0;
        let word_delta_count = word_delta_count & 0x7FFF;

        // From the spec: The length of the data for each row, in bytes, is
        // regionIndexCount + (wordDeltaCount & WORD_DELTA_COUNT_MASK)
        // if the LONG_WORDS flag is not set, or 2 x that amount if the flag is set.
        let mut delta_set_len = word_delta_count + region_index_count;
        if has_long_words {
            delta_set_len *= 2;
        }

        s.advance(usize::from(inner_index).checked_mul(usize::from(delta_set_len))?);

        let mut delta = 0.0;
        let mut i = 0;
        while i < word_delta_count {
            let idx = region_indices.get(i)?;
            let num = if has_long_words {
                // TODO: use f64?
                s.read::<i32>()? as f32
            } else {
                f32::from(s.read::<i16>()?)
            };
            delta += num * self.regions.evaluate_region(idx, coordinates);
            i += 1;
        }

        while i < region_index_count {
            let idx = region_indices.get(i)?;
            let num = if has_long_words {
                f32::from(s.read::<i16>()?)
            } else {
                f32::from(s.read::<i8>()?)
            };
            delta += num * self.regions.evaluate_region(idx, coordinates);
            i += 1;
        }

        Some(delta)
    }
}

#[derive(Clone, Copy, Debug)]
pub struct VariationRegionList<'a> {
    axis_count: u16,
    regions: LazyArray16<'a, RegionAxisCoordinatesRecord>,
}

impl<'a> VariationRegionList<'a> {
    #[inline]
    pub(crate) fn evaluate_region(&self, index: u16, coordinates: &[NormalizedCoordinate]) -> f32 {
        let mut v = 1.0;
        for (i, coord) in coordinates.iter().enumerate() {
            let region = match self.regions.get(index * self.axis_count + i as u16) {
                Some(r) => r,
                None => return 0.0,
            };

            let factor = region.evaluate_axis(coord.get());
            if factor == 0.0 {
                return 0.0;
            }

            v *= factor;
        }

        v
    }
}

#[derive(Clone, Copy, Debug)]
struct RegionAxisCoordinatesRecord {
    start_coord: i16,
    peak_coord: i16,
    end_coord: i16,
}

impl RegionAxisCoordinatesRecord {
    #[inline]
    pub fn evaluate_axis(&self, coord: i16) -> f32 {
        let start = self.start_coord;
        let peak = self.peak_coord;
        let end = self.end_coord;

        if start > peak || peak > end {
            return 1.0;
        }

        if start < 0 && end > 0 && peak != 0 {
            return 1.0;
        }

        if peak == 0 || coord == peak {
            return 1.0;
        }

        if coord <= start || end <= coord {
            return 0.0;
        }

        if coord < peak {
            f32::from(coord - start) / f32::from(peak - start)
        } else {
            f32::from(end - coord) / f32::from(end - peak)
        }
    }
}

impl FromData for RegionAxisCoordinatesRecord {
    const SIZE: usize = 6;

    #[inline]
    fn parse(data: &[u8]) -> Option<Self> {
        let mut s = Stream::new(data);
        Some(RegionAxisCoordinatesRecord {
            start_coord: s.read::<i16>()?,
            peak_coord: s.read::<i16>()?,
            end_coord: s.read::<i16>()?,
        })
    }
}
ttf-parser-0.24.1/tests/bitmap.rs000064400000000000000000000036771046102023000150000ustar 00000000000000
use ttf_parser::{RasterGlyphImage, RasterImageFormat};

// NOTE: Bitmap.otb is an incomplete example font that was created specifically for this test.
// It is under the same license as the other source files in the project.
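The per-axis scalar logic in `RegionAxisCoordinatesRecord::evaluate_axis` above packs several spec edge cases into one function, so here is a minimal standalone sketch of the same "tent" evaluation. `evaluate_axis_sketch` is a hypothetical helper, not part of ttf-parser's API, and it works over `f32` normalized coordinates instead of the raw fixed-point values used by the crate:

```rust
// Standalone sketch (assumed helper, not ttf-parser API) of the per-axis
// "tent" scalar used when applying item variation deltas. The region is
// described by (start, peak, end); the scalar ramps linearly from 0 at the
// edges to 1 at the peak.
fn evaluate_axis_sketch(start: f32, peak: f32, end: f32, coord: f32) -> f32 {
    // Malformed region: ignored, contributes a scalar of 1.0.
    if start > peak || peak > end {
        return 1.0;
    }
    // Region spanning zero with a non-zero peak: also ignored per the spec.
    if start < 0.0 && end > 0.0 && peak != 0.0 {
        return 1.0;
    }
    // A zero peak, or an exact hit on the peak, contributes fully.
    if peak == 0.0 || coord == peak {
        return 1.0;
    }
    // Outside the region: no contribution.
    if coord <= start || end <= coord {
        return 0.0;
    }
    // Linear ramp up towards the peak, then back down towards the end.
    if coord < peak {
        (coord - start) / (peak - start)
    } else {
        (end - coord) / (end - peak)
    }
}
```

`evaluate_region` then multiplies these per-axis scalars together, bailing out early with 0.0 as soon as any axis falls outside its region.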
static FONT_DATA: &[u8] = include_bytes!("fonts/bitmap.otb"); #[test] fn bitmap_font() { let face = ttf_parser::Face::parse(FONT_DATA, 0).unwrap(); assert_eq!(face.units_per_em(), 800); assert_eq!( face.glyph_hor_advance(face.glyph_index('a').unwrap()), Some(500) ); const W: u8 = 0; const B: u8 = 255; assert_eq!( face.glyph_raster_image(face.glyph_index('a').unwrap(), 1), Some(RasterGlyphImage { x: 0, y: 0, width: 4, height: 4, pixels_per_em: 8, format: RasterImageFormat::BitmapGray8, #[rustfmt::skip] data: &[ W, B, B, B, B, W, W, B, B, W, W, B, W, B, B, B ] }) ); assert_eq!( face.glyph_raster_image(face.glyph_index('d').unwrap(), 1), Some(RasterGlyphImage { x: 0, y: 0, width: 4, height: 6, pixels_per_em: 8, format: RasterImageFormat::BitmapGray8, #[rustfmt::skip] data: &[ W, W, W, B, W, W, W, B, W, B, B, B, B, W, W, B, B, W, W, B, W, B, B, B ] }) ); assert_eq!( face.glyph_raster_image(face.glyph_index('\"').unwrap(), 1), Some(RasterGlyphImage { x: 1, y: 4, width: 3, height: 2, pixels_per_em: 8, format: RasterImageFormat::BitmapGray8, #[rustfmt::skip] data: &[ B, W, B, B, W, B, ] }) ); } ttf-parser-0.24.1/tests/fonts/bitmap.otb000064400000000000000000000040301046102023000162510ustar 00000000000000  EBDT)1*EBLC_OS/2X"(dcmapLhead!Z\6hheaz;$hmtxd8maxp namep,~_< ['G['G8    d^KBnP"d8 d8 "-Dd"-Aa8L     #(Eb&4RbxV `>FMg oz+ 0   4 / ? 
U Ve `UntitledRegularBitsNPicas: Untitled: 2023UntitledVersion 1.0UntitledMade with Bits'n'Picas by Kreative Softwarehttp://www.kreativekorp.com/software/bitsnpicas/UntitledRegularBitsNPicas: Untitled: 2023UntitledVersion 1.0UntitledMade with Bits'n'Picas by Kreative Softwarehttp://www.kreativekorp.com/software/bitsnpicas/UntitledRegularBitsNPicas: Untitled: 2023UntitledVersion 1.0UntitledMade with Bits'n'Picas by Kreative Softwarehttp://www.kreativekorp.com/software/bitsnpicas/d$%&'DEFGttf-parser-0.24.1/tests/fonts/colr_1.ttf000064400000000000000000000521001046102023000161660ustar 00000000000000 @COLR0P:CPALSlOS/2EM`H`cmapvh4glyfhead'36hhea $hmtx. locaci?maxp( name`Xpost'K j5r_<Qg ???? Xdddddddddddddd^, 4 G TZcgm  q  x       !2\j !-9EQ]iu)5AMYeq} %1=IUamy !-9EQ]iu)5AMYeq} %1=IUamy !-9EQ]iu #/Qs  I U a m y  ! - 9 E Q ] i u    ) 5 A M Y1! 533##5#522 22(#557(%Kj%(e! #"&554632'4&#"3265e1+*22**2%vG99G-G77G1))190,,0 7,, 1!!!!5!!!! d7!d Dd7!d Dd7!d Dd7!d D1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!! 1! !! ! 7! 1!d7!dLD1!1!&& #5#35#46334&#&222X2222X #5#35#46334&#X;)d;)d;)d;)X);d);d);d);d^ #5#35#466334&&#(E)(E)(E)(E)X)E()E()E()E(,  #5#35#466334&&#6[76[76[76[7X7[67[67[67[6^R #5#35#466334&&#CrECrECrECrEXErCErCErCErC,  #!3!4>3!4.# /Rm>/Rm>/Rm>,/Rm>X>mR/,>mR/,>mR/>mR/R #!3!4>3!4.#R6aH6aH6aH^6aHXHa6^Ha6^Ha6Ha61!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1! 
4 !Tu 2  h BU 4 <COLRv1 Static Test GlyphsRegularCOLRv1 Static Test Glyphs 2024-01-18T17:23:48.554408COLRv1 Static Test Glyphs Regular2024-01-18T17:23:48.554408COLRv1StaticTestGlyphs-RegularCOLRv1 Static Test GlyphsRegularCOLRv1 Static Test Glyphs 2024-01-18T17:23:48.554408COLRv1 Static Test Glyphs Regular2024-01-18T17:23:48.554408COLRv1StaticTestGlyphs-Regular      !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~upem_box_glyph cross_glyphtrianglenegative_crosslinear_repeat_0_1linear_repeat_0.2_0.8linear_repeat_0_1.5linear_repeat_0.5_1.5sweep_0_360_pad_narrowsweep_60_300_pad_narrowsweep_0_90_pad_narrowsweep_90_0_pad_narrowsweep_45_90_pad_narrowsweep_90_45_pad_narrowsweep_247.5_292.5_pad_narrowsweep_-45_45_pad_narrowsweep_45_-45_pad_narrowsweep_270_440_pad_narrowsweep_440_270_pad_narrowsweep_-180_540_pad_narrowsweep_0_360_reflect_narrowsweep_60_300_reflect_narrowsweep_0_90_reflect_narrowsweep_90_0_reflect_narrowsweep_45_90_reflect_narrowsweep_90_45_reflect_narrow 
sweep_247.5_292.5_reflect_narrowsweep_-45_45_reflect_narrowsweep_45_-45_reflect_narrowsweep_270_440_reflect_narrowsweep_440_270_reflect_narrowsweep_-180_540_reflect_narrowsweep_0_360_repeat_narrowsweep_60_300_repeat_narrowsweep_0_90_repeat_narrowsweep_90_0_repeat_narrowsweep_45_90_repeat_narrowsweep_90_45_repeat_narrowsweep_247.5_292.5_repeat_narrowsweep_-45_45_repeat_narrowsweep_45_-45_repeat_narrowsweep_270_440_repeat_narrowsweep_440_270_repeat_narrowsweep_-180_540_repeat_narrowsweep_0_360_pad_widesweep_60_300_pad_widesweep_0_90_pad_widesweep_90_0_pad_widesweep_45_90_pad_widesweep_90_45_pad_widesweep_247.5_292.5_pad_widesweep_-45_45_pad_widesweep_45_-45_pad_widesweep_270_440_pad_widesweep_440_270_pad_widesweep_-180_540_pad_widesweep_0_360_reflect_widesweep_60_300_reflect_widesweep_0_90_reflect_widesweep_90_0_reflect_widesweep_45_90_reflect_widesweep_90_45_reflect_widesweep_247.5_292.5_reflect_widesweep_-45_45_reflect_widesweep_45_-45_reflect_widesweep_270_440_reflect_widesweep_440_270_reflect_widesweep_-180_540_reflect_widesweep_0_360_repeat_widesweep_60_300_repeat_widesweep_0_90_repeat_widesweep_90_0_repeat_widesweep_45_90_repeat_widesweep_90_45_repeat_widesweep_247.5_292.5_repeat_widesweep_-45_45_repeat_widesweep_45_-45_repeat_widesweep_270_440_repeat_widesweep_440_270_repeat_widesweep_-180_540_repeat_wide scale_0.5_1.5_center_500.0_500.0 
scale_1.5_1.5_center_500.0_500.0scale_0.5_1.5_center_0_0scale_1.5_1.5_center_0_0scale_0.5_1.5_center_1000_1000scale_1.5_1.5_center_1000_1000linear_gradient_extend_mode_pad"linear_gradient_extend_mode_repeat#linear_gradient_extend_mode_reflect)radial_contained_gradient_extend_mode_pad,radial_contained_gradient_extend_mode_repeat-radial_contained_gradient_extend_mode_reflect*radial_horizontal_gradient_extend_mode_pad-radial_horizontal_gradient_extend_mode_repeat.radial_horizontal_gradient_extend_mode_reflectrotate_10_center_0_0rotate_-10_center_1000_1000rotate_25_center_500.0_500.0rotate_-15_center_500.0_500.0skew_25_0_center_0_0skew_25_0_center_500.0_500.0skew_0_15_center_0_0skew_0_15_center_500.0_500.0skew_-10_20_center_500.0_500.0skew_-10_20_center_1000_1000 transform_matrix_1_0_0_1_125_125 transform_matrix_1.5_0_0_1.5_0_01transform_matrix_0.9659_0.2588_-0.2588_0.9659_0_0+transform_matrix_1.0_0.0_0.6_1.0_-300.0_0.0 translate_0_0translate_0_100translate_0_-100translate_100_0translate_-100_0translate_200_200translate_-200_-200composite_CLEAR composite_SRCcomposite_DESTcomposite_SRC_OVERcomposite_DEST_OVERcomposite_SRC_INcomposite_DEST_INcomposite_SRC_OUTcomposite_DEST_OUTcomposite_SRC_ATOPcomposite_DEST_ATOP composite_XORcomposite_PLUScomposite_SCREENcomposite_OVERLAYcomposite_DARKENcomposite_LIGHTENcomposite_COLOR_DODGEcomposite_COLOR_BURNcomposite_HARD_LIGHTcomposite_SOFT_LIGHTcomposite_DIFFERENCEcomposite_EXCLUSIONcomposite_MULTIPLYcomposite_HSL_HUEcomposite_HSL_SATURATIONcomposite_HSL_COLORcomposite_HSL_LUMINOSITYforeground_color_linear_alpha_1!foreground_color_linear_alpha_0.3foreground_color_radial_alpha_1!foreground_color_radial_alpha_0.3foreground_color_sweep_alpha_1 foreground_color_sweep_alpha_0.3foreground_color_solid_alpha_1 
foreground_color_solid_alpha_0.3clip_box_top_leftclip_box_bottom_leftclip_box_bottom_rightclip_box_top_rightclip_box_centerclip_shade_top_leftclip_shade_bottom_leftclip_shade_bottom_rightclip_shade_top_rightclip_shade_centerinset_clipped_radial_reflectgradient_p2_skewedcolored_circles_v0colored_circles_v1 circle_r50 circle_r100 circle_r150 circle_r200 circle_r250 circle_r300 circle_r350solid_colorline_alphapaintcolrglyph_cycle_firstpaintcolrglyph_cycle_secondno_cycle_multi_colrglyph,sweep_coincident_angles_forward_blue_red_pad0sweep_coincident_angles_forward_blue_red_reflect/sweep_coincident_angles_forward_blue_red_repeat.sweep_coincident_angles_forward_linen_gray_pad2sweep_coincident_angles_forward_linen_gray_reflect1sweep_coincident_angles_forward_linen_gray_repeat,sweep_coincident_angles_reverse_blue_red_pad0sweep_coincident_angles_reverse_blue_red_reflect/sweep_coincident_angles_reverse_blue_red_repeat.sweep_coincident_angles_reverse_linen_gray_pad2sweep_coincident_angles_reverse_linen_gray_reflect1sweep_coincident_angles_reverse_linen_gray_repeat+sweep_coincident_stops_forward_blue_red_pad/sweep_coincident_stops_forward_blue_red_reflect.sweep_coincident_stops_forward_blue_red_repeat-sweep_coincident_stops_forward_linen_gray_pad1sweep_coincident_stops_forward_linen_gray_reflect0sweep_coincident_stops_forward_linen_gray_repeat+sweep_coincident_stops_reverse_blue_red_pad/sweep_coincident_stops_reverse_blue_red_reflect.sweep_coincident_stops_reverse_blue_red_repeat-sweep_coincident_stops_reverse_linen_gray_pad1sweep_coincident_stops_reverse_linen_gray_reflect0sweep_coincident_stops_reverse_linen_gray_repeat$paint_glyph_nested_identity_identity%paint_glyph_nested_identity_translate)paint_glyph_nested_identity_rotate_origin)paint_glyph_nested_identity_rotate_center%paint_glyph_nested_translate_identity&paint_glyph_nested_translate_translate*paint_glyph_nested_translate_rotate_origin*paint_glyph_nested_translate_rotate_center)paint_glyph_nested_rotate_origin_identity*
paint_glyph_nested_rotate_origin_translate.paint_glyph_nested_rotate_origin_rotate_origin.paint_glyph_nested_rotate_origin_rotate_center)paint_glyph_nested_rotate_center_identity*paint_glyph_nested_rotate_center_translate.paint_glyph_nested_rotate_center_rotate_origin.paint_glyph_nested_rotate_center_rotate_center"(H     9 K]o,>Pbt !"#$%1&C'U(g)y*+,-./01$263H4Z5l6~789:;< = > )? ;@ MA _B qC D E F G H I J K .L @M RN dO vP Q R S T U V W !X /Y CZ U[ k\ ] ^ _ ` a b /c Zd he zf g h i j k l m n 5o \p q r s t u v w x0y6z<{B|H}N~TZ`flrx~0W~"(U (U 6cDq%R`n|.J tdd,  dd, @33@  dd,@`@  dd, @`@ X@ XU* X X X xX fX( TX BX 0X \r X\r  X@@%U@0@ X@ XU* X X X xX fX( TX BX 0X \r X\r  X@@%U@0@ X@ XU* X X X xX fX( TX BX 0X \r X\r  X@@%U@0@ X@ XU* X X X xX fX( TX BX 0X \r X\r  X@ @@@P@ X@ XU* X X X xX fX( TX BX 0X \r X\r  X@ @@@P@ X@ XU* X X X xX fX( TX BX 0X \r X\r  X@ @@@P@ :' ` &`  ` `  ` ` 3 3 3 R g | d@ @@@ d@ @@@ d@ @@@  r   t wd cPU S@U ?,r +r  }}    EBAE   {h kXd [H K8d ;( + 88 ,     "$&(*,.0246 ddd, udd, 8X2X IX2X  X @ @@@  X @ 3@@ @ 3 K > = 0 / " !     _ d@ @@@8@ B  X@U@*@@@  X@U@*@@@  X@U@*@@@  X@U@*@@@  X@U@*@@@  X@U@*@@@  X @ @ @ @  X @ @ @ @  X @ @ @ @  X @ @ @ @  X @ @ @ @  X @ @ @ @  X @ @ @ @  X @ @ @ @  X @ @ @ @  X @ @ @ @  X @ @ @ @  X @ @ @ @    xx  xx xx xx : L ^ n  U b  U Dxx  U &  U  U G (08@HPX`hpx0;FQWbmx         ~ v n f{ ^s Vk Nc F[ >S 6K .C &; 3 +   M  @ M  @ @ @ @ @ @ @ @  @ @j X@@@F@@X$4{4{X.((X qqX X X @X   ` SZbx`ir{dddd*KOO/hJ)*cA$c}šq ttf-parser-0.24.1/tests/fonts/colr_1_LICENSE000064400000000000000000000261351046102023000165450ustar 00000000000000 Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. 
"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. 
"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. 
Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the 
Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. 
Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS APPENDIX: How to apply the Apache License to your work. 
To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives. Copyright [yyyy] [name of copyright owner] Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.ttf-parser-0.24.1/tests/fonts/colr_1_variable.ttf000064400000000000000000001475501046102023000200510ustar 00000000000000COLR$KuCPAL|HVAR<OP/OS/2EM``STATpd¸€tcmapvh4fvarBgglyfgvar5I thead' 6hhea D$hmtx. locaci?(maxph name:  post'K1\_<QE ???? Xdddddddddddddd^, 4 G TZcgm  q  x       !2\j !-9EQ]iu)5AMYeq} %1=IUamy !-9EQ]iu)5AMYeq} %1=IUamy !-9EQ]iu #/Qs  I U a m y  ! - 9 E Q ] i u    ) 5 A M Y1! 533##5#522 22(#557(%Kj%(e! #"&554632'4&#"3265e1+*22**2%vG99G-G77G1))190,,0 7,, 1!!!!5!!!! d7!d Dd7!d Dd7!d Dd7!d D1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!1!! 1! !! ! 7! 
ttf-parser-0.24.1/tests/fonts/demo.ttf

[binary font data omitted]

ttf-parser-0.24.1/tests/fonts/demo.ttx

[TTX XML dump omitted]

ttf-parser-0.24.1/tests/tables/aat.rs

use std::num::NonZeroU16;

use ttf_parser::GlyphId;
use ttf_parser::apple_layout::Lookup;

use crate::{convert, Unit::*};

mod format0 {
    use super::*;

    #[test]
    fn single() {
        let data = convert(&[
            UInt16(0), // format
            UInt16(10), // value
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert_eq!(table.value(GlyphId(0)).unwrap(), 10);
        assert!(table.value(GlyphId(1)).is_none());
    }

    #[test]
    fn not_enough_glyphs() {
        let data = convert(&[
            UInt16(0), // format
            UInt16(10), // value
        ]);

        assert!(Lookup::parse(NonZeroU16::new(2).unwrap(), &data).is_none());
    }

    #[test]
    fn too_many_glyphs() {
        let data = convert(&[
            UInt16(0), // format
            UInt16(10), // value
            UInt16(11), // value <-- will be ignored
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert_eq!(table.value(GlyphId(0)).unwrap(), 10);
        assert!(table.value(GlyphId(1)).is_none());
    }
}

mod format2 {
    use super::*;

    #[test]
    fn single() {
        let data = convert(&[
            UInt16(2), // format
            // Binary Search Table
            UInt16(6), // segment size
            UInt16(1), // number of segments
            UInt16(0), // search range: we don't use it
            UInt16(0), // entry selector: we don't use it
            UInt16(0), // range shift: we don't use it
            // Segment [0]
            UInt16(118), // last glyph
            UInt16(118), // first glyph
            UInt16(10), // value
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert_eq!(table.value(GlyphId(118)).unwrap(), 10);
        assert!(table.value(GlyphId(1)).is_none());
    }

    #[test]
    fn range() {
        let data = convert(&[
            UInt16(2), // format
            // Binary Search Table
            UInt16(6), // segment size
            UInt16(1), // number of segments
            UInt16(0), // search range: we don't use it
            UInt16(0), // entry selector: we don't use it
            UInt16(0), // range shift: we don't use it
            // Segment [0]
            UInt16(7), // last glyph
            UInt16(5), // first glyph
            UInt16(18), // offset
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert!(table.value(GlyphId(4)).is_none());
        assert_eq!(table.value(GlyphId(5)).unwrap(), 18);
        assert_eq!(table.value(GlyphId(6)).unwrap(), 18);
        assert_eq!(table.value(GlyphId(7)).unwrap(), 18);
        assert!(table.value(GlyphId(8)).is_none());
    }
}

mod format4 {
    use super::*;

    #[test]
    fn single() {
        let data = convert(&[
            UInt16(4), // format
            // Binary Search Table
            UInt16(6), // segment size
            UInt16(1), // number of segments
            UInt16(0), // search range: we don't use it
            UInt16(0), // entry selector: we don't use it
            UInt16(0), // range shift: we don't use it
            // Segment [0]
            UInt16(118), // last glyph
            UInt16(118), // first glyph
            UInt16(18), // offset
            // Values [0]
            UInt16(10), // value [0]
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert_eq!(table.value(GlyphId(118)).unwrap(), 10);
        assert!(table.value(GlyphId(1)).is_none());
    }

    #[test]
    fn range() {
        let data = convert(&[
            UInt16(4), // format
            // Binary Search Table
            UInt16(6), // segment size
            UInt16(1), // number of segments
            UInt16(0), // search range: we don't use it
            UInt16(0), // entry selector: we don't use it
            UInt16(0), // range shift: we don't use it
            // Segment [0]
            UInt16(7), // last glyph
            UInt16(5), // first glyph
            UInt16(18), // offset
            // Values [0]
            UInt16(10), // value [0]
            UInt16(11), // value [1]
            UInt16(12), // value [2]
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert!(table.value(GlyphId(4)).is_none());
        assert_eq!(table.value(GlyphId(5)).unwrap(), 10);
        assert_eq!(table.value(GlyphId(6)).unwrap(), 11);
        assert_eq!(table.value(GlyphId(7)).unwrap(), 12);
        assert!(table.value(GlyphId(8)).is_none());
    }
}

mod format6 {
    use super::*;

    #[test]
    fn single() {
        let data = convert(&[
            UInt16(6), // format
            // Binary Search Table
            UInt16(4), // segment size
            UInt16(1), // number of segments
            UInt16(0), // search range: we don't use it
            UInt16(0), // entry selector: we don't use it
            UInt16(0), // range shift: we don't use it
            // Segment [0]
            UInt16(0), // glyph
            UInt16(10), // value
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert_eq!(table.value(GlyphId(0)).unwrap(), 10);
        assert!(table.value(GlyphId(1)).is_none());
    }

    #[test]
    fn multiple() {
        let data = convert(&[
            UInt16(6), // format
            // Binary Search Table
            UInt16(4), // segment size
            UInt16(3), // number of segments
            UInt16(0), // search range: we don't use it
            UInt16(0), // entry selector: we don't use it
            UInt16(0), // range shift: we don't use it
            // Segment [0]
            UInt16(0), // glyph
            UInt16(10), // value
            // Segment [1]
            UInt16(5), // glyph
            UInt16(20), // value
            // Segment [2]
            UInt16(10), // glyph
            UInt16(30), // value
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert_eq!(table.value(GlyphId(0)).unwrap(), 10);
        assert_eq!(table.value(GlyphId(5)).unwrap(), 20);
        assert_eq!(table.value(GlyphId(10)).unwrap(), 30);
        assert!(table.value(GlyphId(1)).is_none());
    }

    // Tests below are indirectly testing BinarySearchTable.

    #[test]
    fn no_segments() {
        let data = convert(&[
            UInt16(6), // format
            // Binary Search Table
            UInt16(4), // segment size
            UInt16(0), // number of segments
            UInt16(0), // search range: we don't use it
            UInt16(0), // entry selector: we don't use it
            UInt16(0), // range shift: we don't use it
        ]);

        assert!(Lookup::parse(NonZeroU16::new(1).unwrap(), &data).is_none());
    }

    #[test]
    fn ignore_termination() {
        let data = convert(&[
            UInt16(6), // format
            // Binary Search Table
            UInt16(4), // segment size
            UInt16(2), // number of segments
            UInt16(0), // search range: we don't use it
            UInt16(0), // entry selector: we don't use it
            UInt16(0), // range shift: we don't use it
            // Segment [0]
            UInt16(0), // glyph
            UInt16(10), // value
            // Segment [1]
            UInt16(0xFFFF), // glyph
            UInt16(0xFFFF), // value
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert!(table.value(GlyphId(0xFFFF)).is_none());
    }

    #[test]
    fn only_termination() {
        let data = convert(&[
            UInt16(6), // format
            // Binary Search Table
            UInt16(4), // segment size
            UInt16(1), // number of segments
            UInt16(0), // search range: we don't use it
            UInt16(0), // entry selector: we don't use it
            UInt16(0), // range shift: we don't use it
            // Segment [0]
            UInt16(0xFFFF), // glyph
            UInt16(0xFFFF), // value
        ]);

        assert!(Lookup::parse(NonZeroU16::new(1).unwrap(), &data).is_none());
    }

    #[test]
    fn invalid_segment_size() {
        let data = convert(&[
            UInt16(6), // format
            // Binary Search Table
            UInt16(8), // segment size <-- must be 4
            UInt16(1), // number of segments
            UInt16(0), // search range: we don't use it
            UInt16(0), // entry selector: we don't use it
            UInt16(0), // range shift: we don't use it
            // Segment [0]
            UInt16(0), // glyph
            UInt16(10), // value
        ]);

        assert!(Lookup::parse(NonZeroU16::new(1).unwrap(), &data).is_none());
    }
}

mod format8 {
    use super::*;

    #[test]
    fn single() {
        let data = convert(&[
            UInt16(8), // format
            UInt16(0), // first glyph
            UInt16(1), // glyphs count
            UInt16(2), // value [0]
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert_eq!(table.value(GlyphId(0)).unwrap(), 2);
        assert!(table.value(GlyphId(1)).is_none());
    }

    #[test]
    fn non_zero_first() {
        let data = convert(&[
            UInt16(8), // format
            UInt16(5), // first glyph
            UInt16(1), // glyphs count
            UInt16(2), // value [0]
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert_eq!(table.value(GlyphId(5)).unwrap(), 2);
        assert!(table.value(GlyphId(1)).is_none());
        assert!(table.value(GlyphId(6)).is_none());
    }
}

mod format10 {
    use super::*;

    #[test]
    fn single() {
        let data = convert(&[
            UInt16(10), // format
            UInt16(1), // value size: u8
            UInt16(0), // first glyph
            UInt16(1), // glyphs count
            UInt8(2), // value [0]
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert_eq!(table.value(GlyphId(0)).unwrap(), 2);
        assert!(table.value(GlyphId(1)).is_none());
    }

    #[test]
    fn invalid_value_size() {
        let data = convert(&[
            UInt16(10), // format
            UInt16(50), // value size <-- invalid
            UInt16(0), // first glyph
            UInt16(1), // glyphs count
            UInt8(2), // value [0]
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert!(table.value(GlyphId(0)).is_none());
    }

    #[test]
    fn unsupported_value_size() {
        let data = convert(&[
            UInt16(10), // format
            UInt16(8), // value size <-- we do not support u64
            UInt16(0), // first glyph
            UInt16(1), // glyphs count
            Raw(&[0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x02]), // value [0]
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert!(table.value(GlyphId(0)).is_none());
    }

    #[test]
    fn u32_value_size() {
        let data = convert(&[
            UInt16(10), // format
            UInt16(4), // value size
            UInt16(0), // first glyph
            UInt16(1), // glyphs count
            UInt32(0xFFFF + 10), // value [0] <-- will be truncated
        ]);

        let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
        assert_eq!(table.value(GlyphId(0)).unwrap(), 9);
    }
}

ttf-parser-0.24.1/tests/tables/ankr.rs

use std::num::NonZeroU16;

use ttf_parser::GlyphId;
use ttf_parser::ankr::{Table, Point};

use crate::{convert, Unit::*};

#[test]
fn empty() {
    let data = convert(&[
        UInt16(0), // version
        UInt16(0), // reserved
        UInt32(0), // offset to lookup table
        UInt32(0), // offset to glyphs data
    ]);

    let _ = Table::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
}

#[test]
fn single() {
    let data = convert(&[
        UInt16(0), // version
        UInt16(0), // reserved
        UInt32(12), // offset to lookup table
        UInt32(12 + 16), // offset to glyphs data

        // Lookup Table
        UInt16(6), // format
        // Binary Search Table
        UInt16(4), // segment size
        UInt16(1), // number of segments
        UInt16(0), // search range: we don't use it
        UInt16(0), // entry selector: we don't use it
        UInt16(0), // range shift: we don't use it
        // Segment [0]
        UInt16(0), // glyph
        UInt16(0), // offset

        // Glyphs Data
        UInt32(1), // number of points
        // Point [0]
        Int16(-5), // x
        Int16(11), // y
    ]);

    let table = Table::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
    let points = table.points(GlyphId(0)).unwrap();
    assert_eq!(points.get(0).unwrap(), Point { x: -5, y: 11 });
}

#[test]
fn two_points() {
    let data = convert(&[
        UInt16(0), // version
        UInt16(0), // reserved
        UInt32(12), // offset to lookup table
        UInt32(12 + 16), // offset to glyphs data

        // Lookup Table
        UInt16(6), // format
        // Binary Search Table
        UInt16(4), // segment size
        UInt16(1), // number of segments
        UInt16(0), // search range: we don't use it
        UInt16(0), // entry selector: we don't use it
        UInt16(0), // range shift: we don't use it
        // Segment [0]
        UInt16(0), // glyph
        UInt16(0), // offset

        // Glyphs Data
        // Glyph Data [0]
        UInt32(2), // number of points
        // Point [0]
        Int16(-5), // x
        Int16(11), // y
        // Point [1]
        Int16(10), // x
        Int16(-40), // y
    ]);

    let table = Table::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
    let points = table.points(GlyphId(0)).unwrap();
    assert_eq!(points.get(0).unwrap(), Point { x: -5, y: 11 });
    assert_eq!(points.get(1).unwrap(), Point { x: 10, y: -40 });
}

#[test]
fn two_glyphs() {
    let data = convert(&[
        UInt16(0), // version
        UInt16(0), // reserved
        UInt32(12), // offset to lookup table
        UInt32(12 + 20), // offset to glyphs data

        // Lookup Table
        UInt16(6), // format
        // Binary Search Table
        UInt16(4), // segment size
        UInt16(2), // number of segments
        UInt16(0), // search range: we don't use it
        UInt16(0), // entry selector: we don't use it
        UInt16(0), // range shift: we don't use it
        // Segment [0]
        UInt16(0), // glyph
        UInt16(0), // offset
        // Segment [1]
        UInt16(1), // glyph
        UInt16(8), // offset

        // Glyphs Data
        // Glyph Data [0]
        UInt32(1), // number of points
        // Point [0]
        Int16(-5), // x
        Int16(11), // y
        // Glyph Data [1]
        UInt32(1), // number of points
        // Point [0]
        Int16(40), // x
        Int16(10), // y
    ]);

    let table = Table::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();

    let points = table.points(GlyphId(0)).unwrap();
    assert_eq!(points.get(0).unwrap(), Point { x: -5, y: 11 });

    let points = table.points(GlyphId(1)).unwrap();
    assert_eq!(points.get(0).unwrap(), Point { x: 40, y: 10 });
}

ttf-parser-0.24.1/tests/tables/cff1.rs

// TODO: simplify/rewrite

use std::fmt::Write;

use ttf_parser::{cff, GlyphId, CFFError, Rect};

struct Builder(String);

impl ttf_parser::OutlineBuilder for Builder {
    fn move_to(&mut self, x: f32, y: f32) {
        write!(&mut self.0, "M {} {} ", x, y).unwrap();
    }

    fn line_to(&mut self, x: f32, y: f32) {
        write!(&mut self.0, "L {} {} ", x, y).unwrap();
    }

    fn quad_to(&mut self, x1: f32, y1: f32, x: f32, y: f32) {
        write!(&mut self.0, "Q {} {} {} {} ", x1, y1, x, y).unwrap();
    }

    fn curve_to(&mut self, x1: f32, y1: f32, x2: f32, y2: f32, x: f32, y: f32) {
        write!(&mut self.0, "C {} {} {} {} {} {} ", x1, y1, x2, y2, x, y).unwrap();
    }

    fn close(&mut self) {
        write!(&mut self.0, "Z ").unwrap();
    }
}

#[allow(dead_code)]
mod operator {
    pub const HORIZONTAL_STEM: u8 = 1;
    pub const VERTICAL_STEM: u8 = 3;
    pub const VERTICAL_MOVE_TO: u8 = 4;
    pub const LINE_TO: u8 = 5;
    pub const HORIZONTAL_LINE_TO: u8 = 6;
    pub const VERTICAL_LINE_TO: u8 = 7;
    pub const CURVE_TO: u8 = 8;
    pub const CALL_LOCAL_SUBROUTINE: u8 = 10;
    pub const RETURN: u8 = 11;
    pub const ENDCHAR: u8 = 14;
    pub const HORIZONTAL_STEM_HINT_MASK: u8 = 18;
    pub const HINT_MASK: u8 = 19;
    pub const COUNTER_MASK: u8 = 20;
    pub const MOVE_TO: u8 = 21;
    pub const HORIZONTAL_MOVE_TO: u8 = 22;
    pub const VERTICAL_STEM_HINT_MASK: u8 = 23;
    pub const CURVE_LINE: u8 = 24;
    pub const LINE_CURVE: u8 = 25;
    pub const VV_CURVE_TO: u8 = 26;
    pub const HH_CURVE_TO: u8 = 27;
    pub const SHORT_INT: u8 = 28;
    pub const CALL_GLOBAL_SUBROUTINE: u8 = 29;
    pub const VH_CURVE_TO: u8 = 30;
    pub const HV_CURVE_TO: u8 = 31;
    pub const HFLEX: u8 = 34;
    pub const FLEX: u8 = 35;
    pub const HFLEX1: u8 = 36;
    pub const FLEX1: u8 = 37;
    pub const FIXED_16_16: u8 = 255;
}

#[allow(dead_code)]
mod top_dict_operator {
    pub const CHARSET_OFFSET: u16 = 15;
    pub const CHAR_STRINGS_OFFSET: u16 = 17;
    pub const PRIVATE_DICT_SIZE_AND_OFFSET: u16 = 18;
    pub const ROS: u16 = 1230;
    pub const FD_ARRAY: u16 = 1236;
    pub const FD_SELECT: u16 = 1237;
}

mod private_dict_operator {
    pub const LOCAL_SUBROUTINES_OFFSET: u16 = 19;
}

#[allow(dead_code)]
#[derive(Clone, Copy)]
enum TtfType {
    Raw(&'static [u8]),
    TrueTypeMagic,
    OpenTypeMagic,
    FontCollectionMagic,
    Int8(i8),
    UInt8(u8),
    Int16(i16),
    UInt16(u16),
    Int32(i32),
    UInt32(u32),
    CFFInt(i32),
}

use TtfType::*;

fn convert(values: &[TtfType]) -> Vec<u8> {
    let mut data = Vec::with_capacity(256);
    for v in values {
        convert_type(*v, &mut data);
    }
    data
}

fn convert_type(value: TtfType, data: &mut Vec<u8>) {
    match value {
        TtfType::Raw(bytes) => { data.extend_from_slice(bytes); }
        TtfType::TrueTypeMagic => { data.extend_from_slice(&[0x00, 0x01, 0x00, 0x00]); }
        TtfType::OpenTypeMagic => { data.extend_from_slice(&[0x4F, 0x54, 0x54, 0x4F]); }
        TtfType::FontCollectionMagic => { data.extend_from_slice(&[0x74, 0x74, 0x63, 0x66]); }
        TtfType::Int8(n) => { data.extend_from_slice(&i8::to_be_bytes(n)); }
        TtfType::UInt8(n) => { data.extend_from_slice(&u8::to_be_bytes(n)); }
        TtfType::Int16(n) => { data.extend_from_slice(&i16::to_be_bytes(n)); }
        TtfType::UInt16(n) => { data.extend_from_slice(&u16::to_be_bytes(n)); }
        TtfType::Int32(n) => { data.extend_from_slice(&i32::to_be_bytes(n)); }
        TtfType::UInt32(n) => { data.extend_from_slice(&u32::to_be_bytes(n)); }
        TtfType::CFFInt(n) => {
            match n {
                -107..=107 => {
                    data.push((n as i16 + 139) as u8);
                }
                108..=1131 => {
                    let n = n - 108;
                    data.push(((n >> 8) + 247) as u8);
                    data.push((n & 0xFF) as u8);
                }
                -1131..=-108 => {
                    let n = -n - 108;
                    data.push(((n >> 8) + 251) as u8);
                    data.push((n & 0xFF) as u8);
                }
                -32768..=32767 => {
                    data.push(28);
                    data.extend_from_slice(&i16::to_be_bytes(n as i16));
                }
                _ => {
                    data.push(29);
                    data.extend_from_slice(&i32::to_be_bytes(n));
                }
            }
        }
    }
}

#[derive(Debug)]
struct Writer {
    data: Vec<u8>,
}

impl Writer {
    fn new() -> Self {
        Writer { data: Vec::with_capacity(256) }
    }

    fn offset(&self) -> usize {
        self.data.len()
    }

    fn write(&mut self, value: TtfType) {
        convert_type(value, &mut self.data);
    }
}

fn gen_cff(
    global_subrs: &[&[TtfType]],
    local_subrs: &[&[TtfType]],
    chars: &[TtfType],
) -> Vec<u8> {
    fn gen_global_subrs(subrs: &[&[TtfType]]) -> Vec<u8> {
        let mut w = Writer::new();
        for v1 in subrs {
            for v2 in v1.iter() {
                w.write(*v2);
            }
        }
        w.data
    }

    fn gen_local_subrs(subrs: &[&[TtfType]]) -> Vec<u8> {
        let mut w = Writer::new();
        for v1 in subrs {
            for v2 in v1.iter() {
                w.write(*v2);
            }
        }
        w.data
    }

    const EMPTY_INDEX_SIZE: usize = 2;
    const INDEX_HEADER_SIZE: usize = 5;

    // TODO: support multiple subrs
    assert!(global_subrs.len() <= 1);
    assert!(local_subrs.len() <= 1);

    let global_subrs_data = gen_global_subrs(global_subrs);
    let local_subrs_data = gen_local_subrs(local_subrs);
    let chars_data = convert(chars);

    assert!(global_subrs_data.len() < 255);
    assert!(local_subrs_data.len() < 255);
    assert!(chars_data.len() < 255);

    let mut w = Writer::new();

    // Header
    w.write(UInt8(1)); // major version
    w.write(UInt8(0)); // minor version
    w.write(UInt8(4)); // header size
    w.write(UInt8(0)); // absolute offset

    // Name INDEX
    w.write(UInt16(0)); // count

    // Top DICT
    // INDEX
    w.write(UInt16(1)); // count
    w.write(UInt8(1)); // offset size
    w.write(UInt8(1)); // index[0]
    let top_dict_idx2 = if local_subrs.is_empty() { 3 } else { 6 };
    w.write(UInt8(top_dict_idx2)); // index[1]

    // Item 0
    let mut charstr_offset = w.offset() + 2;
    charstr_offset += EMPTY_INDEX_SIZE; // String INDEX

    // Global Subroutines INDEX
    if !global_subrs_data.is_empty() {
        charstr_offset += INDEX_HEADER_SIZE + global_subrs_data.len();
    } else {
        charstr_offset += EMPTY_INDEX_SIZE;
    }

    if !local_subrs_data.is_empty() {
        charstr_offset += 3;
    }

    w.write(CFFInt(charstr_offset as i32));
    w.write(UInt8(top_dict_operator::CHAR_STRINGS_OFFSET as u8));

    if !local_subrs_data.is_empty() {
        // Item 1
        w.write(CFFInt(2)); // length
        w.write(CFFInt((charstr_offset + INDEX_HEADER_SIZE + chars_data.len()) as i32)); // offset
        w.write(UInt8(top_dict_operator::PRIVATE_DICT_SIZE_AND_OFFSET as u8));
    }

    // String INDEX
    w.write(UInt16(0)); // count

    // Global Subroutines INDEX
    if global_subrs_data.is_empty() {
        w.write(UInt16(0)); // count
    } else {
        w.write(UInt16(1)); // count
        w.write(UInt8(1)); // offset size
        w.write(UInt8(1)); // index[0]
        w.write(UInt8(global_subrs_data.len() as u8 + 1)); // index[1]
        w.data.extend_from_slice(&global_subrs_data);
    }

    // CharString INDEX
    w.write(UInt16(1)); // count
    w.write(UInt8(1)); // offset size
    w.write(UInt8(1)); // index[0]
    w.write(UInt8(chars_data.len() as u8 + 1)); // index[1]
    w.data.extend_from_slice(&chars_data);

    if !local_subrs_data.is_empty() {
        // The local subroutines offset is relative to the beginning of the Private DICT data.

        // Private DICT
        w.write(CFFInt(2));
        w.write(UInt8(private_dict_operator::LOCAL_SUBROUTINES_OFFSET as u8));

        // Local Subroutines INDEX
        w.write(UInt16(1)); // count
        w.write(UInt8(1)); // offset size
        w.write(UInt8(1)); // index[0]
        w.write(UInt8(local_subrs_data.len() as u8 + 1)); // index[1]
        w.data.extend_from_slice(&local_subrs_data);
    }

    w.data
}

#[test]
fn unsupported_version() {
    let data = convert(&[
        UInt8(10), // major version, only 1 is supported
        UInt8(0), // minor version
        UInt8(4), // header size
        UInt8(0), // absolute offset
    ]);

    assert!(cff::Table::parse(&data).is_none());
}

#[test]
fn non_default_header_size() {
    let data = convert(&[
        // Header
        UInt8(1), // major version
        UInt8(0), // minor version
        UInt8(8), // header size
        UInt8(0), // absolute offset
        // no-op, should be skipped
        UInt8(0), UInt8(0), UInt8(0), UInt8(0),
        // Name INDEX
        UInt16(0), // count
        // Top DICT
        // INDEX
        UInt16(1), // count
        UInt8(1), // offset size
        UInt8(1), // index[0]
        UInt8(3), // index[1]
        // Data
        CFFInt(21),
        UInt8(top_dict_operator::CHAR_STRINGS_OFFSET as u8),
        // String INDEX
        UInt16(0), // count
        // Global Subroutines INDEX
        UInt16(0), // count
        // CharString INDEX
        UInt16(1), // count
        UInt8(1), // offset size
        UInt8(1), // index[0]
        UInt8(4), // index[1]
        // Data
        CFFInt(10),
        UInt8(operator::HORIZONTAL_MOVE_TO),
        UInt8(operator::ENDCHAR),
    ]);

    let table = cff::Table::parse(&data).unwrap();
    let mut builder = Builder(String::new());
    let rect = table.outline(GlyphId(0), &mut builder).unwrap();

    assert_eq!(builder.0, "M 10 0 Z ");
    assert_eq!(rect, Rect { x_min: 10, y_min: 0, x_max: 10, y_max: 0 });
}

fn rect(x_min: i16, y_min: i16, x_max: i16, y_max: i16) -> Rect {
    Rect { x_min, y_min, x_max, y_max }
}

macro_rules! test_cs_with_subrs {
    ($name:ident, $glob:expr, $loc:expr, $values:expr, $path:expr, $rect_res:expr) => {
        #[test]
        fn $name() {
            let data = gen_cff($glob, $loc, $values);
            let table = cff::Table::parse(&data).unwrap();
            let mut builder = Builder(String::new());
            let rect = table.outline(GlyphId(0), &mut builder).unwrap();

            assert_eq!(builder.0, $path);
            assert_eq!(rect, $rect_res);
        }
    };
}

macro_rules! test_cs {
    ($name:ident, $values:expr, $path:expr, $rect_res:expr) => {
        test_cs_with_subrs!($name, &[], &[], $values, $path, $rect_res);
    };
}

macro_rules! test_cs_err {
    ($name:ident, $values:expr, $err:expr) => {
        #[test]
        fn $name() {
            let data = gen_cff(&[], &[], $values);
            let table = cff::Table::parse(&data).unwrap();
            let mut builder = Builder(String::new());
            let res = table.outline(GlyphId(0), &mut builder);

            assert_eq!(res.unwrap_err(), $err);
        }
    };
}

test_cs!(move_to, &[
    CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
    UInt8(operator::ENDCHAR),
], "M 10 20 Z ", rect(10, 20, 10, 20));

test_cs!(move_to_with_width, &[
    CFFInt(5), CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
    UInt8(operator::ENDCHAR),
], "M 10 20 Z ", rect(10, 20, 10, 20));

test_cs!(hmove_to, &[
    CFFInt(10), UInt8(operator::HORIZONTAL_MOVE_TO),
    UInt8(operator::ENDCHAR),
], "M 10 0 Z ", rect(10, 0, 10, 0));

test_cs!(hmove_to_with_width, &[
    CFFInt(10), CFFInt(20), UInt8(operator::HORIZONTAL_MOVE_TO),
    UInt8(operator::ENDCHAR),
], "M 20 0 Z ", rect(20, 0, 20, 0));

test_cs!(vmove_to, &[
    CFFInt(10), UInt8(operator::VERTICAL_MOVE_TO),
    UInt8(operator::ENDCHAR),
], "M 0 10 Z ", rect(0, 10, 0, 10));

test_cs!(vmove_to_with_width, &[
    CFFInt(10), CFFInt(20), UInt8(operator::VERTICAL_MOVE_TO),
    UInt8(operator::ENDCHAR),
], "M 0 20 Z ", rect(0, 20, 0, 20));

// Use only the first width.
test_cs!(two_vmove_to_with_width, &[ CFFInt(10), CFFInt(20), UInt8(operator::VERTICAL_MOVE_TO), CFFInt(10), CFFInt(20), UInt8(operator::VERTICAL_MOVE_TO), UInt8(operator::ENDCHAR), ], "M 0 20 Z M 0 40 Z ", rect(0, 20, 0, 40) ); test_cs!(line_to, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), UInt8(operator::LINE_TO), UInt8(operator::ENDCHAR), ], "M 10 20 L 40 60 Z ", rect(10, 20, 40, 60) ); test_cs!(line_to_with_multiple_pairs, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), UInt8(operator::LINE_TO), UInt8(operator::ENDCHAR), ], "M 10 20 L 40 60 L 90 120 Z ", rect(10, 20, 90, 120) ); test_cs!(hline_to, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), UInt8(operator::HORIZONTAL_LINE_TO), UInt8(operator::ENDCHAR), ], "M 10 20 L 40 20 Z ", rect(10, 20, 40, 20) ); test_cs!(hline_to_with_two_coords, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), UInt8(operator::HORIZONTAL_LINE_TO), UInt8(operator::ENDCHAR), ], "M 10 20 L 40 20 L 40 60 Z ", rect(10, 20, 40, 60) ); test_cs!(hline_to_with_three_coords, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), UInt8(operator::HORIZONTAL_LINE_TO), UInt8(operator::ENDCHAR), ], "M 10 20 L 40 20 L 40 60 L 90 60 Z ", rect(10, 20, 90, 60) ); test_cs!(vline_to, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), UInt8(operator::VERTICAL_LINE_TO), UInt8(operator::ENDCHAR), ], "M 10 20 L 10 50 Z ", rect(10, 20, 10, 50) ); test_cs!(vline_to_with_two_coords, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), UInt8(operator::VERTICAL_LINE_TO), UInt8(operator::ENDCHAR), ], "M 10 20 L 10 50 L 50 50 Z ", rect(10, 20, 50, 50) ); test_cs!(vline_to_with_three_coords, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), UInt8(operator::VERTICAL_LINE_TO), UInt8(operator::ENDCHAR), ], "M 10 20 L 10 50 L 50 50 L 50 
100 Z ", rect(10, 20, 50, 100) ); test_cs!(curve_to, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), CFFInt(70), CFFInt(80), UInt8(operator::CURVE_TO), UInt8(operator::ENDCHAR), ], "M 10 20 C 40 60 90 120 160 200 Z ", rect(10, 20, 160, 200) ); test_cs!(curve_to_with_two_sets_of_coords, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), CFFInt(70), CFFInt(80), CFFInt(90), CFFInt(100), CFFInt(110), CFFInt(120), CFFInt(130), CFFInt(140), UInt8(operator::CURVE_TO), UInt8(operator::ENDCHAR), ], "M 10 20 C 40 60 90 120 160 200 C 250 300 360 420 490 560 Z ", rect(10, 20, 490, 560) ); test_cs!(hh_curve_to, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), UInt8(operator::HH_CURVE_TO), UInt8(operator::ENDCHAR), ], "M 10 20 C 40 20 80 70 140 70 Z ", rect(10, 20, 140, 70) ); test_cs!(hh_curve_to_with_y, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), CFFInt(70), UInt8(operator::HH_CURVE_TO), UInt8(operator::ENDCHAR), ], "M 10 20 C 50 50 100 110 170 110 Z ", rect(10, 20, 170, 110) ); test_cs!(vv_curve_to, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), UInt8(operator::VV_CURVE_TO), UInt8(operator::ENDCHAR), ], "M 10 20 C 10 50 50 100 50 160 Z ", rect(10, 20, 50, 160) ); test_cs!(vv_curve_to_with_x, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), CFFInt(70), UInt8(operator::VV_CURVE_TO), UInt8(operator::ENDCHAR), ], "M 10 20 C 40 60 90 120 90 190 Z ", rect(10, 20, 90, 190) ); #[test] fn only_endchar() { let data = gen_cff(&[], &[], &[UInt8(operator::ENDCHAR)]); let table = cff::Table::parse(&data).unwrap(); let mut builder = Builder(String::new()); assert!(table.outline(GlyphId(0), &mut builder).is_err()); } test_cs_with_subrs!(local_subr, &[], &[&[ CFFInt(30), 
CFFInt(40), UInt8(operator::LINE_TO), UInt8(operator::RETURN), ]], &[ CFFInt(10), UInt8(operator::HORIZONTAL_MOVE_TO), CFFInt(0 - 107), // subr index - subr bias UInt8(operator::CALL_LOCAL_SUBROUTINE), UInt8(operator::ENDCHAR), ], "M 10 0 L 40 40 Z ", rect(10, 0, 40, 40) ); test_cs_with_subrs!(endchar_in_subr, &[], &[&[ CFFInt(30), CFFInt(40), UInt8(operator::LINE_TO), UInt8(operator::ENDCHAR), ]], &[ CFFInt(10), UInt8(operator::HORIZONTAL_MOVE_TO), CFFInt(0 - 107), // subr index - subr bias UInt8(operator::CALL_LOCAL_SUBROUTINE), ], "M 10 0 L 40 40 Z ", rect(10, 0, 40, 40) ); test_cs_with_subrs!(global_subr, &[&[ CFFInt(30), CFFInt(40), UInt8(operator::LINE_TO), UInt8(operator::RETURN), ]], &[], &[ CFFInt(10), UInt8(operator::HORIZONTAL_MOVE_TO), CFFInt(0 - 107), // subr index - subr bias UInt8(operator::CALL_GLOBAL_SUBROUTINE), UInt8(operator::ENDCHAR), ], "M 10 0 L 40 40 Z ", rect(10, 0, 40, 40) ); test_cs_err!(reserved_operator, &[ CFFInt(10), UInt8(2), UInt8(operator::ENDCHAR), ], CFFError::InvalidOperator); test_cs_err!(line_to_without_move_to, &[ CFFInt(10), CFFInt(20), UInt8(operator::LINE_TO), UInt8(operator::ENDCHAR), ], CFFError::MissingMoveTo); test_cs_err!(move_to_with_too_many_coords, &[ CFFInt(10), CFFInt(10), CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(move_to_with_not_enough_coords, &[ CFFInt(10), UInt8(operator::MOVE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(hmove_to_with_too_many_coords, &[ CFFInt(10), CFFInt(10), CFFInt(10), UInt8(operator::HORIZONTAL_MOVE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(hmove_to_with_not_enough_coords, &[ UInt8(operator::HORIZONTAL_MOVE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(vmove_to_with_too_many_coords, &[ CFFInt(10), CFFInt(10), CFFInt(10), UInt8(operator::VERTICAL_MOVE_TO),
UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(vmove_to_with_not_enough_coords, &[ UInt8(operator::VERTICAL_MOVE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(line_to_with_single_coord, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), UInt8(operator::LINE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(line_to_with_odd_number_of_coord, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), UInt8(operator::LINE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(hline_to_without_coords, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), UInt8(operator::HORIZONTAL_LINE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(vline_to_without_coords, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), UInt8(operator::VERTICAL_LINE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(curve_to_with_invalid_num_of_coords_1, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), UInt8(operator::CURVE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(curve_to_with_invalid_num_of_coords_2, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), CFFInt(70), CFFInt(80), CFFInt(90), UInt8(operator::CURVE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(hh_curve_to_with_not_enough_coords, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), UInt8(operator::HH_CURVE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(hh_curve_to_with_too_many_coords, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(30), CFFInt(40), CFFInt(50),
UInt8(operator::HH_CURVE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(vv_curve_to_with_not_enough_coords, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), UInt8(operator::VV_CURVE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(vv_curve_to_with_too_many_coords, &[ CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO), CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(30), CFFInt(40), CFFInt(50), UInt8(operator::VV_CURVE_TO), UInt8(operator::ENDCHAR), ], CFFError::InvalidArgumentsStackLength); test_cs_err!(multiple_endchar, &[ UInt8(operator::ENDCHAR), UInt8(operator::ENDCHAR), ], CFFError::DataAfterEndChar); test_cs_err!(seac_with_not_enough_data, &[ CFFInt(0), CFFInt(0), CFFInt(0), CFFInt(0), UInt8(operator::ENDCHAR), ], CFFError::NestingLimitReached); test_cs_err!(operands_overflow, &[ CFFInt(0), CFFInt(1), CFFInt(2), CFFInt(3), CFFInt(4), CFFInt(5), CFFInt(6), CFFInt(7), CFFInt(8), CFFInt(9), CFFInt(0), CFFInt(1), CFFInt(2), CFFInt(3), CFFInt(4), CFFInt(5), CFFInt(6), CFFInt(7), CFFInt(8), CFFInt(9), CFFInt(0), CFFInt(1), CFFInt(2), CFFInt(3), CFFInt(4), CFFInt(5), CFFInt(6), CFFInt(7), CFFInt(8), CFFInt(9), CFFInt(0), CFFInt(1), CFFInt(2), CFFInt(3), CFFInt(4), CFFInt(5), CFFInt(6), CFFInt(7), CFFInt(8), CFFInt(9), CFFInt(0), CFFInt(1), CFFInt(2), CFFInt(3), CFFInt(4), CFFInt(5), CFFInt(6), CFFInt(7), CFFInt(8), CFFInt(9), ], CFFError::ArgumentsStackLimitReached); test_cs_err!(operands_overflow_with_4_byte_ints, &[ CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000),
CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), ], CFFError::ArgumentsStackLimitReached); test_cs_err!(bbox_overflow, &[ CFFInt(32767), UInt8(operator::HORIZONTAL_MOVE_TO), CFFInt(32767), UInt8(operator::HORIZONTAL_LINE_TO), UInt8(operator::ENDCHAR), ], CFFError::BboxOverflow); #[test] fn endchar_in_subr_with_extra_data_1() { let data = gen_cff( &[], &[&[ CFFInt(30), CFFInt(40), UInt8(operator::LINE_TO), UInt8(operator::ENDCHAR), ]], &[ CFFInt(10), UInt8(operator::HORIZONTAL_MOVE_TO), CFFInt(0 - 107), // subr index - subr bias UInt8(operator::CALL_LOCAL_SUBROUTINE), CFFInt(30), CFFInt(40), UInt8(operator::LINE_TO), ] ); let table = cff::Table::parse(&data).unwrap(); let mut builder = Builder(String::new()); let res = table.outline(GlyphId(0), &mut builder); assert_eq!(res.unwrap_err(), CFFError::DataAfterEndChar); } #[test] fn endchar_in_subr_with_extra_data_2() { let data = gen_cff( &[], &[&[ CFFInt(30), CFFInt(40), UInt8(operator::LINE_TO), UInt8(operator::ENDCHAR), CFFInt(30), CFFInt(40), UInt8(operator::LINE_TO), ]], &[ CFFInt(10), UInt8(operator::HORIZONTAL_MOVE_TO), CFFInt(0 - 107), // subr index - subr bias UInt8(operator::CALL_LOCAL_SUBROUTINE), ] ); let table = cff::Table::parse(&data).unwrap(); let mut builder = Builder(String::new()); let res = table.outline(GlyphId(0), &mut builder); assert_eq!(res.unwrap_err(), CFFError::DataAfterEndChar); } #[test] fn subr_without_return() { let data = gen_cff( &[], &[&[ CFFInt(30), CFFInt(40), UInt8(operator::LINE_TO), UInt8(operator::ENDCHAR), CFFInt(30), CFFInt(40), UInt8(operator::LINE_TO), ]], &[ CFFInt(10), UInt8(operator::HORIZONTAL_MOVE_TO), CFFInt(0 - 107), // subr index - subr bias UInt8(operator::CALL_LOCAL_SUBROUTINE), ] ); let table = 
cff::Table::parse(&data).unwrap(); let mut builder = Builder(String::new()); let res = table.outline(GlyphId(0), &mut builder); assert_eq!(res.unwrap_err(), CFFError::DataAfterEndChar); } #[test] fn recursive_local_subr() { let data = gen_cff( &[], &[&[ CFFInt(0 - 107), // subr index - subr bias UInt8(operator::CALL_LOCAL_SUBROUTINE), ]], &[ CFFInt(10), UInt8(operator::HORIZONTAL_MOVE_TO), CFFInt(0 - 107), // subr index - subr bias UInt8(operator::CALL_LOCAL_SUBROUTINE), ] ); let table = cff::Table::parse(&data).unwrap(); let mut builder = Builder(String::new()); let res = table.outline(GlyphId(0), &mut builder); assert_eq!(res.unwrap_err(), CFFError::NestingLimitReached); } #[test] fn recursive_global_subr() { let data = gen_cff( &[&[ CFFInt(0 - 107), // subr index - subr bias UInt8(operator::CALL_GLOBAL_SUBROUTINE), ]], &[], &[ CFFInt(10), UInt8(operator::HORIZONTAL_MOVE_TO), CFFInt(0 - 107), // subr index - subr bias UInt8(operator::CALL_GLOBAL_SUBROUTINE), ] ); let table = cff::Table::parse(&data).unwrap(); let mut builder = Builder(String::new()); let res = table.outline(GlyphId(0), &mut builder); assert_eq!(res.unwrap_err(), CFFError::NestingLimitReached); } #[test] fn recursive_mixed_subr() { let data = gen_cff( &[&[ CFFInt(0 - 107), // subr index - subr bias UInt8(operator::CALL_LOCAL_SUBROUTINE), ]], &[&[ CFFInt(0 - 107), // subr index - subr bias UInt8(operator::CALL_GLOBAL_SUBROUTINE), ]], &[ CFFInt(10), UInt8(operator::HORIZONTAL_MOVE_TO), CFFInt(0 - 107), // subr index - subr bias UInt8(operator::CALL_GLOBAL_SUBROUTINE), ] ); let table = cff::Table::parse(&data).unwrap(); let mut builder = Builder(String::new()); let res = table.outline(GlyphId(0), &mut builder); assert_eq!(res.unwrap_err(), CFFError::NestingLimitReached); } #[test] fn zero_char_string_offset() { let data = convert(&[ // Header UInt8(1), // major version UInt8(0), // minor version UInt8(4), // header size UInt8(0), // absolute offset // Name INDEX UInt16(0), // count // Top DICT // 
INDEX UInt16(1), // count UInt8(1), // offset size UInt8(1), // index[0] UInt8(3), // index[1] // Data CFFInt(0), // zero offset! UInt8(top_dict_operator::CHAR_STRINGS_OFFSET as u8), ]); assert!(cff::Table::parse(&data).is_none()); } #[test] fn invalid_char_string_offset() { let data = convert(&[ // Header UInt8(1), // major version UInt8(0), // minor version UInt8(4), // header size UInt8(0), // absolute offset // Name INDEX UInt16(0), // count // Top DICT // INDEX UInt16(1), // count UInt8(1), // offset size UInt8(1), // index[0] UInt8(3), // index[1] // Data CFFInt(2), // invalid offset! UInt8(top_dict_operator::CHAR_STRINGS_OFFSET as u8), ]); assert!(cff::Table::parse(&data).is_none()); } // TODO: return from main // TODO: return without endchar // TODO: data after return // TODO: recursive subr // TODO: HORIZONTAL_STEM // TODO: VERTICAL_STEM // TODO: HORIZONTAL_STEM_HINT_MASK // TODO: HINT_MASK // TODO: COUNTER_MASK // TODO: VERTICAL_STEM_HINT_MASK // TODO: CURVE_LINE // TODO: LINE_CURVE // TODO: VH_CURVE_TO // TODO: HFLEX // TODO: FLEX // TODO: HFLEX1 // TODO: FLEX1 ttf-parser-0.24.1/tests/tables/cmap.rs000064400000000000000000000465001046102023000157060ustar 00000000000000mod format0 { use ttf_parser::{cmap, GlyphId}; use crate::{convert, Unit::*}; #[test] fn maps_not_all_256_codepoints() { let mut data = convert(&[ UInt16(0), // format UInt16(262), // subtable size UInt16(0), // language ID ]); // Map (only) codepoint 0x40 to 100. 
data.extend(std::iter::repeat(0).take(256)); data[6 + 0x40] = 100; let subtable = cmap::Subtable0::parse(&data).unwrap(); assert_eq!(subtable.glyph_index(0), None); assert_eq!(subtable.glyph_index(0x40), Some(GlyphId(100))); assert_eq!(subtable.glyph_index(100), None); let mut vec = vec![]; subtable.codepoints(|c| vec.push(c)); assert_eq!(vec, [0x40]); } } mod format2 { use ttf_parser::{cmap, GlyphId}; use crate::{convert, Unit::*}; const U16_SIZE: usize = std::mem::size_of::<u16>(); #[test] fn collect_codepoints() { let mut data = convert(&[ UInt16(2), // format UInt16(534), // subtable size UInt16(0), // language ID ]); // Make only high byte 0x28 multi-byte. data.extend(std::iter::repeat(0x00).take(256 * U16_SIZE)); data[6 + 0x28 * U16_SIZE + 1] = 0x08; data.extend(convert(&[ // First sub header (for single byte mapping) UInt16(254), // first code UInt16(2), // entry count UInt16(0), // id delta: uninteresting UInt16(0), // id range offset: uninteresting // Second sub header (for high byte 0x28) UInt16(16), // first code: (0x28 << 8) + 0x10 = 10256 UInt16(3), // entry count UInt16(0), // id delta: uninteresting UInt16(0), // id range offset: uninteresting ])); // Now only glyph IDs would follow. Not interesting for codepoints. let subtable = cmap::Subtable2::parse(&data).unwrap(); let mut vec = vec![]; subtable.codepoints(|c| vec.push(c)); assert_eq!(vec, [10256, 10257, 10258, 254, 255]); } #[test] fn codepoint_at_range_end() { let mut data = convert(&[ UInt16(2), // format UInt16(532), // subtable size UInt16(0), // language ID ]); // Only single bytes.
data.extend(std::iter::repeat(0x00).take(256 * U16_SIZE)); data.extend(convert(&[ // First sub header (for single byte mapping) UInt16(40), // first code UInt16(2), // entry count UInt16(0), // id delta UInt16(2), // id range offset // Glyph index UInt16(100), // glyph ID [0] UInt16(1000), // glyph ID [1] UInt16(10000), // glyph ID [2] (unused) ])); let subtable = cmap::Subtable2::parse(&data).unwrap(); assert_eq!(subtable.glyph_index(39), None); assert_eq!(subtable.glyph_index(40), Some(GlyphId(100))); assert_eq!(subtable.glyph_index(41), Some(GlyphId(1000))); assert_eq!(subtable.glyph_index(42), None); } } mod format4 { use ttf_parser::{cmap, GlyphId}; use crate::{convert, Unit::*}; #[test] fn single_glyph() { let data = convert(&[ UInt16(4), // format UInt16(32), // subtable size UInt16(0), // language ID UInt16(4), // 2 x segCount UInt16(2), // search range UInt16(0), // entry selector UInt16(2), // range shift // End character codes UInt16(65), // char code [0] UInt16(65535), // char code [1] UInt16(0), // reserved // Start character codes UInt16(65), // char code [0] UInt16(65535), // char code [1] // Deltas Int16(-64), // delta [0] Int16(1), // delta [1] // Offsets into Glyph index array UInt16(0), // offset [0] UInt16(0), // offset [1] ]); let subtable = cmap::Subtable4::parse(&data).unwrap(); assert_eq!(subtable.glyph_index(0x41), Some(GlyphId(1))); assert_eq!(subtable.glyph_index(0x42), None); } #[test] fn continuous_range() { let data = convert(&[ UInt16(4), // format UInt16(32), // subtable size UInt16(0), // language ID UInt16(4), // 2 x segCount UInt16(2), // search range UInt16(0), // entry selector UInt16(2), // range shift // End character codes UInt16(73), // char code [0] UInt16(65535), // char code [1] UInt16(0), // reserved // Start character codes UInt16(65), // char code [0] UInt16(65535), // char code [1] // Deltas Int16(-64), // delta [0] Int16(1), // delta [1] // Offsets into Glyph index array UInt16(0), // offset [0] UInt16(0), // offset 
[1] ]); let subtable = cmap::Subtable4::parse(&data).unwrap(); assert_eq!(subtable.glyph_index(0x40), None); assert_eq!(subtable.glyph_index(0x41), Some(GlyphId(1))); assert_eq!(subtable.glyph_index(0x42), Some(GlyphId(2))); assert_eq!(subtable.glyph_index(0x43), Some(GlyphId(3))); assert_eq!(subtable.glyph_index(0x44), Some(GlyphId(4))); assert_eq!(subtable.glyph_index(0x45), Some(GlyphId(5))); assert_eq!(subtable.glyph_index(0x46), Some(GlyphId(6))); assert_eq!(subtable.glyph_index(0x47), Some(GlyphId(7))); assert_eq!(subtable.glyph_index(0x48), Some(GlyphId(8))); assert_eq!(subtable.glyph_index(0x49), Some(GlyphId(9))); assert_eq!(subtable.glyph_index(0x4A), None); } #[test] fn multiple_ranges() { let data = convert(&[ UInt16(4), // format UInt16(48), // subtable size UInt16(0), // language ID UInt16(8), // 2 x segCount UInt16(4), // search range UInt16(1), // entry selector UInt16(4), // range shift // End character codes UInt16(65), // char code [0] UInt16(69), // char code [1] UInt16(73), // char code [2] UInt16(65535), // char code [3] UInt16(0), // reserved // Start character codes UInt16(65), // char code [0] UInt16(67), // char code [1] UInt16(71), // char code [2] UInt16(65535), // char code [3] // Deltas Int16(-64), // delta [0] Int16(-65), // delta [1] Int16(-66), // delta [2] Int16(1), // delta [3] // Offsets into Glyph index array UInt16(0), // offset [0] UInt16(0), // offset [1] UInt16(0), // offset [2] UInt16(0), // offset [3] ]); let subtable = cmap::Subtable4::parse(&data).unwrap(); assert_eq!(subtable.glyph_index(0x40), None); assert_eq!(subtable.glyph_index(0x41), Some(GlyphId(1))); assert_eq!(subtable.glyph_index(0x42), None); assert_eq!(subtable.glyph_index(0x43), Some(GlyphId(2))); assert_eq!(subtable.glyph_index(0x44), Some(GlyphId(3))); assert_eq!(subtable.glyph_index(0x45), Some(GlyphId(4))); assert_eq!(subtable.glyph_index(0x46), None); assert_eq!(subtable.glyph_index(0x47), Some(GlyphId(5))); assert_eq!(subtable.glyph_index(0x48), 
Some(GlyphId(6))); assert_eq!(subtable.glyph_index(0x49), Some(GlyphId(7))); assert_eq!(subtable.glyph_index(0x4A), None); } #[test] fn unordered_ids() { let data = convert(&[ UInt16(4), // format UInt16(42), // subtable size UInt16(0), // language ID UInt16(4), // 2 x segCount UInt16(2), // search range UInt16(0), // entry selector UInt16(2), // range shift // End character codes UInt16(69), // char code [0] UInt16(65535), // char code [1] UInt16(0), // reserved // Start character codes UInt16(65), // char code [0] UInt16(65535), // char code [1] // Deltas Int16(0), // delta [0] Int16(1), // delta [1] // Offsets into Glyph index array UInt16(4), // offset [0] UInt16(0), // offset [1] // Glyph index array UInt16(1), // glyph ID [0] UInt16(10), // glyph ID [1] UInt16(100), // glyph ID [2] UInt16(1000), // glyph ID [3] UInt16(10000), // glyph ID [4] ]); let subtable = cmap::Subtable4::parse(&data).unwrap(); assert_eq!(subtable.glyph_index(0x40), None); assert_eq!(subtable.glyph_index(0x41), Some(GlyphId(1))); assert_eq!(subtable.glyph_index(0x42), Some(GlyphId(10))); assert_eq!(subtable.glyph_index(0x43), Some(GlyphId(100))); assert_eq!(subtable.glyph_index(0x44), Some(GlyphId(1000))); assert_eq!(subtable.glyph_index(0x45), Some(GlyphId(10000))); assert_eq!(subtable.glyph_index(0x46), None); } #[test] fn unordered_chars_and_ids() { let data = convert(&[ UInt16(4), // format UInt16(64), // subtable size UInt16(0), // language ID UInt16(12), // 2 x segCount UInt16(8), // search range UInt16(2), // entry selector UInt16(4), // range shift // End character codes UInt16(80), // char code [0] UInt16(256), // char code [1] UInt16(336), // char code [2] UInt16(512), // char code [3] UInt16(592), // char code [4] UInt16(65535), // char code [5] UInt16(0), // reserved // Start character codes UInt16(80), // char code [0] UInt16(256), // char code [1] UInt16(336), // char code [2] UInt16(512), // char code [3] UInt16(592), // char code [4] UInt16(65535), // char code [5] // 
Deltas Int16(-79), // delta [0] Int16(-246), // delta [1] Int16(-236), // delta [2] Int16(488), // delta [3] Int16(9408), // delta [4] Int16(1), // delta [5] // Offsets into Glyph index array UInt16(0), // offset [0] UInt16(0), // offset [1] UInt16(0), // offset [2] UInt16(0), // offset [3] UInt16(0), // offset [4] UInt16(0), // offset [5] ]); let subtable = cmap::Subtable4::parse(&data).unwrap(); assert_eq!(subtable.glyph_index(0x40), None); assert_eq!(subtable.glyph_index(0x50), Some(GlyphId(1))); assert_eq!(subtable.glyph_index(0x100), Some(GlyphId(10))); assert_eq!(subtable.glyph_index(0x150), Some(GlyphId(100))); assert_eq!(subtable.glyph_index(0x200), Some(GlyphId(1000))); assert_eq!(subtable.glyph_index(0x250), Some(GlyphId(10000))); assert_eq!(subtable.glyph_index(0x300), None); } #[test] fn no_end_codes() { let data = convert(&[ UInt16(4), // format UInt16(28), // subtable size UInt16(0), // language ID UInt16(4), // 2 x segCount UInt16(2), // search range UInt16(0), // entry selector UInt16(2), // range shift // End character codes UInt16(73), // char code [0] // 0xFF, 0xFF, // char code [1] <-- removed UInt16(0), // reserved // Start character codes UInt16(65), // char code [0] // 0xFF, 0xFF, // char code [1] <-- removed // Deltas Int16(-64), // delta [0] Int16(1), // delta [1] // Offsets into Glyph index array UInt16(0), // offset [0] UInt16(0), // offset [1] ]); assert!(cmap::Subtable4::parse(&data).is_none()); } #[test] fn invalid_segment_count() { let data = convert(&[ UInt16(4), // format UInt16(32), // subtable size UInt16(0), // language ID UInt16(1), // 2 x segCount <-- must be more than 1 UInt16(2), // search range UInt16(0), // entry selector UInt16(2), // range shift // End character codes UInt16(65), // char code [0] UInt16(65535), // char code [1] UInt16(0), // reserved // Start character codes UInt16(65), // char code [0] UInt16(65535), // char code [1] // Deltas Int16(-64), // delta [0] Int16(1), // delta [1] // Offsets into Glyph index 
array UInt16(0), // offset [0] UInt16(0), // offset [1] ]); assert!(cmap::Subtable4::parse(&data).is_none()); } #[test] fn only_end_segments() { let data = convert(&[ UInt16(4), // format UInt16(32), // subtable size UInt16(0), // language ID UInt16(2), // 2 x segCount UInt16(2), // search range UInt16(0), // entry selector UInt16(2), // range shift // End character codes UInt16(65535), // char code [1] UInt16(0), // reserved // Start character codes UInt16(65535), // char code [1] // Deltas Int16(-64), // delta [0] Int16(1), // delta [1] // Offsets into Glyph index array UInt16(0), // offset [0] UInt16(0), // offset [1] ]); let subtable = cmap::Subtable4::parse(&data).unwrap(); // Should not loop forever. assert_eq!(subtable.glyph_index(0x41), None); } #[test] fn invalid_length() { let data = convert(&[ UInt16(4), // format UInt16(16), // subtable size <-- the size should be 32, but we don't check it anyway UInt16(0), // language ID UInt16(4), // 2 x segCount UInt16(2), // search range UInt16(0), // entry selector UInt16(2), // range shift // End character codes UInt16(65), // char code [0] UInt16(65535), // char code [1] UInt16(0), // reserved // Start character codes UInt16(65), // char code [0] UInt16(65535), // char code [1] // Deltas Int16(-64), // delta [0] Int16(1), // delta [1] // Offsets into Glyph index array UInt16(0), // offset [0] UInt16(0), // offset [1] ]); let subtable = cmap::Subtable4::parse(&data).unwrap(); assert_eq!(subtable.glyph_index(0x41), Some(GlyphId(1))); assert_eq!(subtable.glyph_index(0x42), None); } #[test] fn codepoint_out_of_range() { let data = convert(&[ UInt16(4), // format UInt16(32), // subtable size UInt16(0), // language ID UInt16(4), // 2 x segCount UInt16(2), // search range UInt16(0), // entry selector UInt16(2), // range shift // End character codes UInt16(65), // char code [0] UInt16(65535), // char code [1] UInt16(0), // reserved // Start character codes UInt16(65), // char code [0] UInt16(65535), // char code [1] // 
Deltas Int16(-64), // delta [0] Int16(1), // delta [1] // Offsets into Glyph index array UInt16(0), // offset [0] UInt16(0), // offset [1] ]); let subtable = cmap::Subtable4::parse(&data).unwrap(); // Format 4 supports only u16 codepoints, so we bail immediately for anything larger. assert_eq!(subtable.glyph_index(0x1FFFF), None); } #[test] fn zero() { let data = convert(&[ UInt16(4), // format UInt16(42), // subtable size UInt16(0), // language ID UInt16(4), // 2 x segCount UInt16(2), // search range UInt16(0), // entry selector UInt16(2), // range shift // End character codes UInt16(69), // char code [0] UInt16(65535), // char code [1] UInt16(0), // reserved // Start character codes UInt16(65), // char code [0] UInt16(65535), // char code [1] // Deltas Int16(0), // delta [0] Int16(1), // delta [1] // Offsets into Glyph index array UInt16(4), // offset [0] UInt16(0), // offset [1] // Glyph index array UInt16(0), // glyph ID [0] <-- indicates missing glyph UInt16(10), // glyph ID [1] UInt16(100), // glyph ID [2] UInt16(1000), // glyph ID [3] UInt16(10000), // glyph ID [4] ]); let subtable = cmap::Subtable4::parse(&data).unwrap(); assert_eq!(subtable.glyph_index(0x41), None); } #[test] fn invalid_offset() { let data = convert(&[ UInt16(4), // format UInt16(42), // subtable size UInt16(0), // language ID UInt16(4), // 2 x segCount UInt16(2), // search range UInt16(0), // entry selector UInt16(2), // range shift // End character codes UInt16(69), // char code [0] UInt16(65535), // char code [1] UInt16(0), // reserved // Start character codes UInt16(65), // char code [0] UInt16(65535), // char code [1] // Deltas Int16(0), // delta [0] Int16(1), // delta [1] // Offsets into Glyph index array UInt16(4), // offset [0] UInt16(65535), // offset [1] // Glyph index array UInt16(1), // glyph ID [0] ]); let subtable = cmap::Subtable4::parse(&data).unwrap(); assert_eq!(subtable.glyph_index(65535), None); } #[test] fn collect_codepoints() { let data = convert(&[ UInt16(4), // format
UInt16(24), // subtable size UInt16(0), // language ID UInt16(4), // 2 x segCount UInt16(2), // search range UInt16(0), // entry selector UInt16(2), // range shift // End character codes UInt16(34), // char code [0] UInt16(65535), // char code [1] UInt16(0), // reserved // Start character codes UInt16(27), // char code [0] UInt16(65533), // char code [1] // Deltas Int16(0), // delta [0] Int16(1), // delta [1] // Offsets into Glyph index array UInt16(4), // offset [0] UInt16(0), // offset [1] // Glyph index array UInt16(0), // glyph ID [0] UInt16(10), // glyph ID [1] ]); let subtable = cmap::Subtable4::parse(&data).unwrap(); let mut vec = vec![]; subtable.codepoints(|c| vec.push(c)); assert_eq!(vec, [27, 28, 29, 30, 31, 32, 33, 34, 65533, 65534, 65535]); } } ttf-parser-0.24.1/tests/tables/colr.rs000064400000000000000000000473261046102023000157340ustar 00000000000000use crate::{convert, Unit::*}; use ttf_parser::colr::{self, ClipBox, CompositeMode, GradientExtend, Paint, Painter}; use ttf_parser::{cpal, GlyphId, RgbaColor}; #[test] fn basic() { let cpal_data = convert(&[ UInt16(0), // version UInt16(3), // number of palette entries UInt16(1), // number of palettes UInt16(3), // number of colors UInt32(14), // offset to colors UInt16(0), // index of palette 0's first color UInt8(10), UInt8(15), UInt8(20), UInt8(25), // color 0 UInt8(30), UInt8(35), UInt8(40), UInt8(45), // color 1 UInt8(50), UInt8(55), UInt8(60), UInt8(65), // color 2 ]); let colr_data = convert(&[ UInt16(0), // version UInt16(3), // number of base glyphs UInt32(14), // offset to base glyphs UInt32(32), // offset to layers UInt16(4), // number of layers UInt16(2), UInt16(2), UInt16(2), // base glyph 0 (id 2) UInt16(3), UInt16(0), UInt16(3), // base glyph 1 (id 3) UInt16(7), UInt16(1), UInt16(1), // base glyph 2 (id 7) UInt16(10), UInt16(2), // layer 0 UInt16(11), UInt16(1), // layer 1 UInt16(12), UInt16(2), // layer 2 UInt16(13), UInt16(0), // layer 3 ]); let cpal = 
cpal::Table::parse(&cpal_data).unwrap(); let colr = colr::Table::parse(cpal, &colr_data).unwrap(); let paint = |id| { let mut painter = VecPainter(vec![]); colr.paint(GlyphId(id), 0, &mut painter, &[], RgbaColor::new(0, 0, 0, 255)).map(|_| painter.0) }; let a = RgbaColor::new(20, 15, 10, 25); let b = RgbaColor::new(40, 35, 30, 45); let c = RgbaColor::new(60, 55, 50, 65); assert_eq!(cpal.get(0, 0), Some(a)); assert_eq!(cpal.get(0, 1), Some(b)); assert_eq!(cpal.get(0, 2), Some(c)); assert_eq!(cpal.get(0, 3), None); assert_eq!(cpal.get(1, 0), None); assert!(!colr.contains(GlyphId(1))); assert!(colr.contains(GlyphId(2))); assert!(colr.contains(GlyphId(3))); assert!(!colr.contains(GlyphId(4))); assert!(!colr.contains(GlyphId(5))); assert!(!colr.contains(GlyphId(6))); assert!(colr.contains(GlyphId(7))); let a = CustomPaint::Solid(a); let b = CustomPaint::Solid(b); let c = CustomPaint::Solid(c); assert_eq!(paint(1), None); assert_eq!( paint(2).unwrap(), vec![ Command::OutlineGlyph(GlyphId(12)), Command::Paint(c.clone()), Command::OutlineGlyph(GlyphId(13)), Command::Paint(a.clone())] ); assert_eq!(paint(3).unwrap(), vec![ Command::OutlineGlyph(GlyphId(10)), Command::Paint(c.clone()), Command::OutlineGlyph(GlyphId(11)), Command::Paint(b.clone()), Command::OutlineGlyph(GlyphId(12)), Command::Paint(c.clone()), ]); assert_eq!(paint(7).unwrap(), vec![ Command::OutlineGlyph(GlyphId(11)), Command::Paint(b.clone()), ]); } #[derive(Clone, Debug, PartialEq)] struct CustomStop(f32, RgbaColor); #[derive(Clone, Debug, PartialEq)] enum CustomPaint { Solid(RgbaColor), LinearGradient(f32, f32, f32, f32, f32, f32, GradientExtend, Vec<CustomStop>), RadialGradient(f32, f32, f32, f32, f32, f32, GradientExtend, Vec<CustomStop>), SweepGradient(f32, f32, f32, f32, GradientExtend, Vec<CustomStop>), } #[derive(Clone, Debug, PartialEq)] enum Command { OutlineGlyph(GlyphId), Paint(CustomPaint), PushLayer(CompositeMode), PopLayer, Transform(ttf_parser::Transform), PopTransform, PushClip, PushClipBox(ClipBox), PopClip, } struct
VecPainter(Vec<Command>); impl<'a> Painter<'a> for VecPainter { fn outline_glyph(&mut self, glyph_id: GlyphId) { self.0.push(Command::OutlineGlyph(glyph_id)); } fn paint(&mut self, paint: Paint<'a>) { let custom_paint = match paint { Paint::Solid(color) => CustomPaint::Solid(color), Paint::LinearGradient(lg) => CustomPaint::LinearGradient(lg.x0, lg.y0, lg.x1, lg.y1, lg.x2, lg.y2, lg.extend, lg.stops(0, &[]).map(|stop| CustomStop(stop.stop_offset, stop.color)).collect()), Paint::RadialGradient(rg) => CustomPaint::RadialGradient(rg.x0, rg.y0, rg.r0, rg.r1, rg.x1, rg.y1, rg.extend, rg.stops(0, &[]).map(|stop| CustomStop(stop.stop_offset, stop.color)).collect()), Paint::SweepGradient(sg) => CustomPaint::SweepGradient(sg.center_x, sg.center_y, sg.start_angle, sg.end_angle, sg.extend, sg.stops(0, &[]).map(|stop| CustomStop(stop.stop_offset, stop.color)).collect()), }; self.0.push(Command::Paint(custom_paint)); } fn push_layer(&mut self, mode: colr::CompositeMode) { self.0.push(Command::PushLayer(mode)); } fn pop_layer(&mut self) { self.0.push(Command::PopLayer) } fn push_transform(&mut self, transform: ttf_parser::Transform) { self.0.push(Command::Transform(transform)) } fn pop_transform(&mut self) { self.0.push(Command::PopTransform) } fn push_clip(&mut self) { self.0.push(Command::PushClip) } fn push_clip_box(&mut self, clipbox: ClipBox) { self.0.push(Command::PushClipBox(clipbox)) } fn pop_clip(&mut self) { self.0.push(Command::PopClip) } } // A static and variable COLRv1 test font from Google Fonts: // https://github.com/googlefonts/color-fonts static COLR1_STATIC: &[u8] = include_bytes!("../fonts/colr_1.ttf"); static COLR1_VARIABLE: &[u8] = include_bytes!("../fonts/colr_1_variable.ttf"); mod colr1_static { use ttf_parser::{Face, GlyphId, RgbaColor}; use ttf_parser::colr::ClipBox; use ttf_parser::colr::CompositeMode::*; use ttf_parser::colr::GradientExtend::*; use crate::colr::{COLR1_STATIC, Command, CustomStop, VecPainter}; use crate::colr::Command::*; use 
crate::colr::CustomPaint::*; #[test] fn linear_gradient() { let face = Face::parse(COLR1_STATIC, 0).unwrap(); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(9), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert_eq!(vec_painter.0, vec![ PushClipBox(ClipBox { x_min: 100.0, y_min: 250.0, x_max: 900.0, y_max: 950.0 }), OutlineGlyph(GlyphId(9)), PushClip, Paint(LinearGradient(100.0, 250.0, 900.0, 250.0, 100.0, 300.0, Repeat, vec![ CustomStop(0.2000122, RgbaColor { red: 255, green: 0, blue: 0, alpha: 255 }), CustomStop(0.7999878, RgbaColor { red: 0, green: 0, blue: 255, alpha: 255 })])), PopClip, PopClip] ) } #[test] fn sweep_gradient() { let face = Face::parse(COLR1_STATIC, 0).unwrap(); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(13), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert_eq!(vec_painter.0, vec![ PushClipBox(ClipBox { x_min: 0.0, y_min: 0.0, x_max: 1000.0, y_max: 1000.0 }), OutlineGlyph(GlyphId(176)), PushClip, Paint(SweepGradient(500.0, 600.0, -0.666687, 0.666687, Pad, vec![ CustomStop(0.25, RgbaColor { red: 250, green: 240, blue: 230, alpha: 255 }), CustomStop(0.416687, RgbaColor { red: 0, green: 0, blue: 255, alpha: 255 }), CustomStop(0.583313, RgbaColor { red: 255, green: 0, blue: 0, alpha: 255 }), CustomStop(0.75, RgbaColor { red: 47, green: 79, blue: 79, alpha: 255 })])), PopClip, PopClip] ) } #[test] fn scale_around_center() { let face = Face::parse(COLR1_STATIC, 0).unwrap(); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(84), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert_eq!(vec_painter.0, vec![ PushLayer(SourceOver), OutlineGlyph(GlyphId(3)), PushClip, Paint(Solid(RgbaColor { red: 0, green: 0, blue: 255, alpha: 127 })), PopClip, PushLayer(DestinationOver), Transform(ttf_parser::Transform::new_translate(500.0, 500.0)), Transform(ttf_parser::Transform::new_scale(0.5, 1.5)), Transform(ttf_parser::Transform::new_translate(-500.0, -500.0)), 
OutlineGlyph( GlyphId(3)), PushClip, Paint(Solid(RgbaColor { red: 255, green: 165, blue: 0, alpha: 178 })), PopClip, PopTransform, PopTransform, PopTransform, PopLayer, PopLayer] ) } #[test] fn scale() { let face = Face::parse(COLR1_STATIC, 0).unwrap(); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(86), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert!(vec_painter.0.contains(&Transform(ttf_parser::Transform::new_scale(0.5, 1.5)))) } #[test] fn radial_gradient() { let face = Face::parse(COLR1_STATIC, 0).unwrap(); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(93), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert_eq!(vec_painter.0, vec![ PushClipBox(ClipBox { x_min: 0.0, y_min: 0.0, x_max: 1000.0, y_max: 1000.0 }), OutlineGlyph(GlyphId(2)), PushClip, Paint(RadialGradient(166.0, 768.0, 0.0, 256.0, 166.0, 768.0, Pad, vec![ CustomStop(0.0, RgbaColor { red: 0, green: 128, blue: 0, alpha: 255 }), CustomStop(0.5, RgbaColor { red: 255, green: 255, blue: 255, alpha: 255 }), CustomStop(1.0, RgbaColor { red: 255, green: 0, blue: 0, alpha: 255 })])), PopClip, PopClip] ) } #[test] fn rotate() { let face = Face::parse(COLR1_STATIC, 0).unwrap(); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(99), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert!(vec_painter.0.contains(&Transform(ttf_parser::Transform::new_rotate(0.055541992)))) } #[test] fn rotate_around_center() { let face = Face::parse(COLR1_STATIC, 0).unwrap(); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(101), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert_eq!(vec_painter.0, vec![ PushLayer(SourceOver), OutlineGlyph(GlyphId(3)), PushClip, Paint(Solid(RgbaColor { red: 0, green: 0, blue: 255, alpha: 127 })), PopClip, PushLayer(DestinationOver), Transform(ttf_parser::Transform::new_translate(500.0, 500.0)), Transform(ttf_parser::Transform::new_rotate(0.13891602)), 
Transform(ttf_parser::Transform::new_translate(-500.0, -500.0)), OutlineGlyph(GlyphId(3)), PushClip, Paint(Solid(RgbaColor { red: 255, green: 165, blue: 0, alpha: 178 })), PopClip, PopTransform, PopTransform, PopTransform, PopLayer, PopLayer, ] ) } #[test] fn skew() { let face = Face::parse(COLR1_STATIC, 0).unwrap(); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(103), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert!(vec_painter.0.contains(&Transform(ttf_parser::Transform::new_skew(0.13891602, 0.0)))); } #[test] fn skew_around_center() { let face = Face::parse(COLR1_STATIC, 0).unwrap(); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(104), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert_eq!(vec_painter.0, vec![ PushLayer(SourceOver), OutlineGlyph(GlyphId(3)), PushClip, Paint(Solid(RgbaColor { red: 0, green: 0, blue: 255, alpha: 127 })), PopClip, PushLayer(DestinationOver), Transform(ttf_parser::Transform::new_translate(500.0, 500.0)), Transform(ttf_parser::Transform::new_skew(0.13891602, 0.0)), Transform(ttf_parser::Transform::new_translate(-500.0, -500.0)), OutlineGlyph(GlyphId(3)), PushClip, Paint(Solid(RgbaColor { red: 255, green: 165, blue: 0, alpha: 178 })), PopClip, PopTransform, PopTransform, PopTransform, PopLayer, PopLayer]) } #[test] fn transform() { let face = Face::parse(COLR1_STATIC, 0).unwrap(); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(109), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert!(vec_painter.0.contains(&Transform(ttf_parser::Transform { a: 1.0, b: 0.0, c: 0.0, d: 1.0, e: 125.0, f: 125.0 } ))); } #[test] fn translate() { let face = Face::parse(COLR1_STATIC, 0).unwrap(); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(114), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert!(vec_painter.0.contains(&Transform(ttf_parser::Transform::new_translate(0.0, 100.0)))); } #[test] fn composite() { let face = 
Face::parse(COLR1_STATIC, 0).unwrap(); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(131), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert!(vec_painter.0.contains(&Command::PushLayer(Xor))); } #[test] fn cyclic_dependency() { let face = Face::parse(COLR1_STATIC, 0).unwrap(); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(179), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); } } mod colr1_variable { use ttf_parser::{Face, GlyphId, RgbaColor, Tag}; use ttf_parser::colr::ClipBox; use ttf_parser::colr::GradientExtend::*; use crate::colr::{COLR1_STATIC, COLR1_VARIABLE, CustomStop, VecPainter}; use crate::colr::Command::*; use crate::colr::CustomPaint::*; #[test] fn sweep_gradient() { let mut face = Face::parse(COLR1_VARIABLE, 0).unwrap(); face.set_variation(Tag::from_bytes(b"SWPS"), 45.0); face.set_variation(Tag::from_bytes(b"SWPE"), 58.0); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(13), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert!(vec_painter.0.contains(&Paint(SweepGradient(500.0, 600.0, -0.416687, 0.9888916, Pad, vec![ CustomStop(0.25, RgbaColor { red: 250, green: 240, blue: 230, alpha: 255 }), CustomStop(0.416687, RgbaColor { red: 0, green: 0, blue: 255, alpha: 255 }), CustomStop(0.583313, RgbaColor { red: 255, green: 0, blue: 0, alpha: 255 }), CustomStop(0.75, RgbaColor { red: 47, green: 79, blue: 79, alpha: 255 })])) )); } #[test] fn scale_around_center() { let mut face = Face::parse(COLR1_VARIABLE, 0).unwrap(); face.set_variation(Tag::from_bytes(b"SCSX"), 1.1); face.set_variation(Tag::from_bytes(b"SCSY"), -0.9); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(84), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert!(vec_painter.0.contains(&Transform(ttf_parser::Transform::new_scale(1.599942, 0.60009766)))) } #[test] fn scale() { let mut face = Face::parse(COLR1_VARIABLE, 0).unwrap(); 
face.set_variation(Tag::from_bytes(b"SCSX"), 1.1); face.set_variation(Tag::from_bytes(b"SCSY"), -0.9); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(86), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert!(vec_painter.0.contains(&Transform(ttf_parser::Transform::new_scale(1.599942, 0.60009766)))) } #[test] fn radial_gradient() { let face = Face::parse(COLR1_STATIC, 0).unwrap(); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(93), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert_eq!(vec_painter.0, vec![ PushClipBox(ClipBox { x_min: 0.0, y_min: 0.0, x_max: 1000.0, y_max: 1000.0 }), OutlineGlyph(GlyphId(2)), PushClip, Paint(RadialGradient(166.0, 768.0, 0.0, 256.0, 166.0, 768.0, Pad, vec![ CustomStop(0.0, RgbaColor { red: 0, green: 128, blue: 0, alpha: 255 }), CustomStop(0.5, RgbaColor { red: 255, green: 255, blue: 255, alpha: 255 }), CustomStop(1.0, RgbaColor { red: 255, green: 0, blue: 0, alpha: 255 })])), PopClip, PopClip] ) } #[test] fn rotate() { let mut face = Face::parse(COLR1_VARIABLE, 0).unwrap(); face.set_variation(Tag::from_bytes(b"ROTA"), 150.0); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(99), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert!(vec_painter.0.contains(&Transform(ttf_parser::Transform::new_rotate(0.87341005)))) } #[test] fn rotate_around_center() { let mut face = Face::parse(COLR1_VARIABLE, 0).unwrap(); face.set_variation(Tag::from_bytes(b"ROTA"), 150.0); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(101), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert!(vec_painter.0.contains(&Transform(ttf_parser::Transform::new_rotate(0.9336252)))) } #[test] fn skew() { let mut face = Face::parse(COLR1_VARIABLE, 0).unwrap(); face.set_variation(Tag::from_bytes(b"SKXA"), 46.0); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(103), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); 
assert!(vec_painter.0.contains(&Transform(ttf_parser::Transform::new_skew(0.3944702, 0.0)))); } #[test] fn skew_around_center() { let mut face = Face::parse(COLR1_VARIABLE, 0).unwrap(); face.set_variation(Tag::from_bytes(b"SKXA"), 46.0); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(104), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert!(vec_painter.0.contains(&Transform(ttf_parser::Transform::new_skew(0.3944702, 0.0)))); } #[test] fn transform() { let mut face = Face::parse(COLR1_VARIABLE, 0).unwrap(); face.set_variation(Tag::from_bytes(b"TRDX"), 150.0); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(109), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert!(vec_painter.0.contains(&Transform(ttf_parser::Transform { a: 1.0, b: 0.0, c: 0.0, d: 1.0, e: 274.9939, f: 125.0 } ))); } #[test] fn translate() { let mut face = Face::parse(COLR1_VARIABLE, 0).unwrap(); face.set_variation(Tag::from_bytes(b"TLDX"), 100.0); let mut vec_painter = VecPainter(vec![]); face.paint_color_glyph(GlyphId(114), 0, RgbaColor::new(0, 0, 0, 255), &mut vec_painter); assert!(vec_painter.0.contains(&Transform(ttf_parser::Transform::new_translate(99.975586, 100.0)))); } } ttf-parser-0.24.1/tests/tables/feat.rs000064400000000000000000000047501046102023000157060ustar 00000000000000#![allow(clippy::bool_assert_comparison)] use ttf_parser::feat::Table; use crate::{convert, Unit::*}; #[test] fn basic() { let data = convert(&[ Fixed(1.0), // version UInt16(4), // number of features UInt16(0), // reserved UInt32(0), // reserved // Feature Name [0] UInt16(0), // feature UInt16(1), // number of settings UInt32(60), // offset to settings table UInt16(0), // flags: none UInt16(260), // name index // Feature Name [1] UInt16(1), // feature UInt16(1), // number of settings UInt32(64), // offset to settings table UInt16(0), // flags: none UInt16(256), // name index // Feature Name [2] UInt16(3), // feature UInt16(3), // number of settings 
UInt32(68), // offset to settings table Raw(&[0x80, 0x00]), // flags: exclusive UInt16(262), // name index // Feature Name [3] UInt16(6), // feature UInt16(2), // number of settings UInt32(80), // offset to settings table Raw(&[0xC0, 0x01]), // flags: exclusive and other UInt16(258), // name index // Setting Name [0] UInt16(0), // setting UInt16(261), // name index // Setting Name [1] UInt16(2), // setting UInt16(257), // name index // Setting Name [2] UInt16(0), // setting UInt16(268), // name index UInt16(3), // setting UInt16(264), // name index UInt16(4), // setting UInt16(265), // name index // Setting Name [3] UInt16(0), // setting UInt16(259), // name index UInt16(1), // setting UInt16(260), // name index ]); let table = Table::parse(&data).unwrap(); assert_eq!(table.names.len(), 4); let feature0 = table.names.get(0).unwrap(); assert_eq!(feature0.feature, 0); assert_eq!(feature0.setting_names.len(), 1); assert_eq!(feature0.exclusive, false); assert_eq!(feature0.name_index, 260); let feature2 = table.names.get(2).unwrap(); assert_eq!(feature2.feature, 3); assert_eq!(feature2.setting_names.len(), 3); assert_eq!(feature2.exclusive, true); assert_eq!(feature2.setting_names.get(1).unwrap().setting, 3); assert_eq!(feature2.setting_names.get(1).unwrap().name_index, 264); let feature3 = table.names.get(3).unwrap(); assert_eq!(feature3.default_setting_index, 1); assert_eq!(feature3.exclusive, true); } ttf-parser-0.24.1/tests/tables/glyf.rs000064400000000000000000000040601046102023000157220ustar 00000000000000use std::fmt::Write; struct Builder(String); impl ttf_parser::OutlineBuilder for Builder { fn move_to(&mut self, x: f32, y: f32) { write!(&mut self.0, "M {} {} ", x, y).unwrap(); } fn line_to(&mut self, x: f32, y: f32) { write!(&mut self.0, "L {} {} ", x, y).unwrap(); } fn quad_to(&mut self, x1: f32, y1: f32, x: f32, y: f32) { write!(&mut self.0, "Q {} {} {} {} ", x1, y1, x, y).unwrap(); } fn curve_to(&mut self, x1: f32, y1: f32, x2: f32, y2: f32, x: f32, y: f32) 
{ write!(&mut self.0, "C {} {} {} {} {} {} ", x1, y1, x2, y2, x, y).unwrap(); } fn close(&mut self) { write!(&mut self.0, "Z ").unwrap(); } } #[test] fn endless_loop() { let data = b"\x00\x01\x00\x00\x00\x0f\x00\x10\x00PTT-W\x002h\xd7\x81x\x00\ \x00\x00?L\xbaN\x00c\x9a\x9e\x8f\x96\xe3\xfeu\xff\x00\xb2\x00@\x03\x00\xb8\ cvt 5:\x00\x00\x00\xb5\xf8\x01\x00\x03\x9ckEr\x92\xd7\xe6\x98M\xdc\x00\x00\ \x03\xe0\x00\x00\x00dglyf\"\t\x15`\x00\x00\x03\xe0\x00\x00\x00dglyf\"\t\x15\ `\x00\x00\x00 \x00\x00\x00\xfc\x97\x9fmx\x87\xc9\xc8\xfe\x00\x00\xbad\xff\ \xff\xf1\xc8head\xc7\x17\xce[\x00\x00\x00\xfc\x00\x00\x006hhea\x03\xc6\x05\ \xe4\x00\x00\x014\x00\x00\x00$hmtx\xc9\xfdq\xed\x00\x00\xb5\xf8\x01\x00\x03\ \x9ckEr\x92\xd7\xe6\xdch\x00\x00\xc9d\x00\x00\x04 loca\x00M\x82\x11\x00\x00\ \x00\x06\x00\x00\x00\xa0maxp\x17\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00 name\ \xf4\xd6\xfe\xad\x00OTTO\x00\x02gpost5;5\xe1\x00\x00\xb0P\x00\x00\x01\xf0perp%\ \xb0{\x04\x93D\x00\x00\x00\x00\x01\x00\x00\x00\x01\x00\x00\x01\x00\x00\xe1!yf%1\ \x08\x95\x00\x00\x00\x00\x00\xaa\x06\x80fmtx\x02\x00\x00\x00\x00\x00\x00\x00\ \x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\ \x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00a\xcc\xff\ \xce\x03CCCCCCCCC\x00\x00\x00\x00\x00C\x00\x00\x00\x00\xb5\xf8\x01\x00\x00\x9c"; let face = ttf_parser::Face::parse(data, 0).unwrap(); let _ = face.outline_glyph(ttf_parser::GlyphId(0), &mut Builder(String::new())); } ttf-parser-0.24.1/tests/tables/hmtx.rs000064400000000000000000000060471046102023000157500ustar 00000000000000use std::num::NonZeroU16; use ttf_parser::GlyphId; use ttf_parser::hmtx::Table; use crate::{convert, Unit::*}; macro_rules! 
nzu16 { ($n:expr) => { NonZeroU16::new($n).unwrap() }; } #[test] fn simple_case() { let data = convert(&[ UInt16(1), // advance width [0] Int16(2), // side bearing [0] ]); let table = Table::parse(1, nzu16!(1), &data).unwrap(); assert_eq!(table.advance(GlyphId(0)), Some(1)); assert_eq!(table.side_bearing(GlyphId(0)), Some(2)); } #[test] fn empty() { assert!(Table::parse(1, nzu16!(1), &[]).is_none()); } #[test] fn zero_metrics() { let data = convert(&[ UInt16(1), // advance width [0] Int16(2), // side bearing [0] ]); assert!(Table::parse(0, nzu16!(1), &data).is_none()); } #[test] fn smaller_than_glyphs_count() { let data = convert(&[ UInt16(1), // advance width [0] Int16(2), // side bearing [0] Int16(3), // side bearing [1] ]); let table = Table::parse(1, nzu16!(2), &data).unwrap(); assert_eq!(table.advance(GlyphId(0)), Some(1)); assert_eq!(table.side_bearing(GlyphId(0)), Some(2)); assert_eq!(table.advance(GlyphId(1)), Some(1)); assert_eq!(table.side_bearing(GlyphId(1)), Some(3)); } #[test] fn no_additional_side_bearings() { let data = convert(&[ UInt16(1), // advance width [0] Int16(2), // side bearing [0] // A single side bearing should be present here. // We should simply ignore it and not return None during Table parsing. 
]); let table = Table::parse(1, nzu16!(2), &data).unwrap(); assert_eq!(table.advance(GlyphId(0)), Some(1)); assert_eq!(table.side_bearing(GlyphId(0)), Some(2)); } #[test] fn less_metrics_than_glyphs() { let data = convert(&[ UInt16(1), // advance width [0] Int16(2), // side bearing [0] UInt16(3), // advance width [1] Int16(4), // side bearing [1] Int16(5), // side bearing [2] ]); let table = Table::parse(2, nzu16!(1), &data).unwrap(); assert_eq!(table.side_bearing(GlyphId(0)), Some(2)); assert_eq!(table.side_bearing(GlyphId(1)), Some(4)); assert_eq!(table.side_bearing(GlyphId(2)), None); } #[test] fn glyph_out_of_bounds_0() { let data = convert(&[ UInt16(1), // advance width [0] Int16(2), // side bearing [0] ]); let table = Table::parse(1, nzu16!(1), &data).unwrap(); assert_eq!(table.advance(GlyphId(0)), Some(1)); assert_eq!(table.side_bearing(GlyphId(0)), Some(2)); assert_eq!(table.advance(GlyphId(1)), None); assert_eq!(table.side_bearing(GlyphId(1)), None); } #[test] fn glyph_out_of_bounds_1() { let data = convert(&[ UInt16(1), // advance width [0] Int16(2), // side bearing [0] Int16(3), // side bearing [1] ]); let table = Table::parse(1, nzu16!(2), &data).unwrap(); assert_eq!(table.advance(GlyphId(1)), Some(1)); assert_eq!(table.side_bearing(GlyphId(1)), Some(3)); assert_eq!(table.advance(GlyphId(2)), None); assert_eq!(table.side_bearing(GlyphId(2)), None); } ttf-parser-0.24.1/tests/tables/main.rs000064400000000000000000000115761046102023000157170ustar 00000000000000#[rustfmt::skip] mod aat; #[rustfmt::skip] mod ankr; #[rustfmt::skip] mod cff1; #[rustfmt::skip] mod cmap; #[rustfmt::skip] mod colr; #[rustfmt::skip] mod feat; #[rustfmt::skip] mod glyf; #[rustfmt::skip] mod hmtx; #[rustfmt::skip] mod maxp; #[rustfmt::skip] mod sbix; #[rustfmt::skip] mod trak; use ttf_parser::{fonts_in_collection, Face, FaceParsingError}; #[allow(dead_code)] #[derive(Clone, Copy)] pub enum Unit { Raw(&'static [u8]), Int8(i8), UInt8(u8), Int16(i16), UInt16(u16), Int32(i32), 
UInt32(u32), Fixed(f32), } pub fn convert(units: &[Unit]) -> Vec<u8> { let mut data = Vec::with_capacity(256); for v in units { convert_unit(*v, &mut data); } data } fn convert_unit(unit: Unit, data: &mut Vec<u8>) { match unit { Unit::Raw(bytes) => { data.extend_from_slice(bytes); } Unit::Int8(n) => { data.extend_from_slice(&i8::to_be_bytes(n)); } Unit::UInt8(n) => { data.extend_from_slice(&u8::to_be_bytes(n)); } Unit::Int16(n) => { data.extend_from_slice(&i16::to_be_bytes(n)); } Unit::UInt16(n) => { data.extend_from_slice(&u16::to_be_bytes(n)); } Unit::Int32(n) => { data.extend_from_slice(&i32::to_be_bytes(n)); } Unit::UInt32(n) => { data.extend_from_slice(&u32::to_be_bytes(n)); } Unit::Fixed(n) => { data.extend_from_slice(&i32::to_be_bytes((n * 65536.0) as i32)); } } } #[test] fn empty_font() { assert_eq!( Face::parse(&[], 0).unwrap_err(), FaceParsingError::UnknownMagic ); } #[test] fn zero_tables() { use Unit::*; let data = convert(&[ Raw(&[0x00, 0x01, 0x00, 0x00]), // magic UInt16(0), // numTables UInt16(0), // searchRange UInt16(0), // entrySelector UInt16(0), // rangeShift ]); assert_eq!( Face::parse(&data, 0).unwrap_err(), FaceParsingError::NoHeadTable ); } #[test] fn tables_count_overflow() { use Unit::*; let data = convert(&[ Raw(&[0x00, 0x01, 0x00, 0x00]), // magic UInt16(u16::MAX), // numTables UInt16(0), // searchRange UInt16(0), // entrySelector UInt16(0), // rangeShift ]); assert_eq!( Face::parse(&data, 0).unwrap_err(), FaceParsingError::MalformedFont ); } #[test] fn empty_font_collection() { use Unit::*; let data = convert(&[ Raw(&[0x74, 0x74, 0x63, 0x66]), // magic UInt16(0), // majorVersion UInt16(0), // minorVersion UInt32(0), // numFonts ]); assert_eq!(fonts_in_collection(&data), Some(0)); assert_eq!( Face::parse(&data, 0).unwrap_err(), FaceParsingError::FaceIndexOutOfBounds ); } #[test] fn font_collection_num_fonts_overflow_1() { use Unit::*; let data = convert(&[ Raw(&[0x74, 0x74, 0x63, 0x66]), // magic UInt16(0), // majorVersion UInt16(0), // 
minorVersion UInt32(u32::MAX), // numFonts ]); assert_eq!(fonts_in_collection(&data), Some(u32::MAX)); } #[test] #[should_panic] fn font_collection_num_fonts_overflow_2() { use Unit::*; let data = convert(&[ Raw(&[0x74, 0x74, 0x63, 0x66]), // magic UInt16(0), // majorVersion UInt16(0), // minorVersion UInt32(u32::MAX), // numFonts ]); assert_eq!( Face::parse(&data, 0).unwrap_err(), FaceParsingError::MalformedFont ); } #[test] fn font_index_overflow() { use Unit::*; let data = convert(&[ Raw(&[0x74, 0x74, 0x63, 0x66]), // magic UInt16(0), // majorVersion UInt16(0), // minorVersion UInt32(1), // numFonts UInt32(12), // offset [0] ]); assert_eq!(fonts_in_collection(&data), Some(1)); assert_eq!( Face::parse(&data, u32::MAX).unwrap_err(), FaceParsingError::FaceIndexOutOfBounds ); } #[test] fn font_index_overflow_on_regular_font() { use Unit::*; let data = convert(&[ Raw(&[0x00, 0x01, 0x00, 0x00]), // magic UInt16(0), // numTables UInt16(0), // searchRange UInt16(0), // entrySelector UInt16(0), // rangeShift ]); assert_eq!(fonts_in_collection(&data), None); assert_eq!( Face::parse(&data, 1).unwrap_err(), FaceParsingError::FaceIndexOutOfBounds ); } ttf-parser-0.24.1/tests/tables/maxp.rs000064400000000000000000000037031046102023000157310ustar 00000000000000use std::num::NonZeroU16; use ttf_parser::maxp::Table; use crate::{convert, Unit::*}; #[test] fn version_05() { let table = Table::parse(&convert(&[ Fixed(0.3125), // version UInt16(1), // number of glyphs ])).unwrap(); assert_eq!(table.number_of_glyphs, NonZeroU16::new(1).unwrap()); } #[test] fn version_1_full() { let table = Table::parse(&convert(&[ Fixed(1.0), // version UInt16(1), // number of glyphs UInt16(0), // maximum points in a non-composite glyph UInt16(0), // maximum contours in a non-composite glyph UInt16(0), // maximum points in a composite glyph UInt16(0), // maximum contours in a composite glyph UInt16(0), // maximum zones UInt16(0), // maximum twilight points UInt16(0), // number of Storage Area 
locations UInt16(0), // number of FDEFs UInt16(0), // number of IDEFs UInt16(0), // maximum stack depth UInt16(0), // maximum byte count for glyph instructions UInt16(0), // maximum number of components UInt16(0), // maximum levels of recursion ])).unwrap(); assert_eq!(table.number_of_glyphs, NonZeroU16::new(1).unwrap()); } #[test] fn version_1_trimmed() { // We don't really care about the data after the number of glyphs. let table = Table::parse(&convert(&[ Fixed(1.0), // version UInt16(1), // number of glyphs ])).unwrap(); assert_eq!(table.number_of_glyphs, NonZeroU16::new(1).unwrap()); } #[test] fn unknown_version() { let table = Table::parse(&convert(&[ Fixed(0.0), // version UInt16(1), // number of glyphs ])); assert!(table.is_none()); } #[test] fn zero_glyphs() { let table = Table::parse(&convert(&[ Fixed(0.3125), // version UInt16(0), // number of glyphs ])); assert!(table.is_none()); } // TODO: what to do when the number of glyphs is 0xFFFF? // we're actually checking this in loca ttf-parser-0.24.1/tests/tables/sbix.rs000064400000000000000000000076151046102023000157370ustar 00000000000000use std::num::NonZeroU16; use ttf_parser::{GlyphId, RasterImageFormat}; use ttf_parser::sbix::Table; use crate::{convert, Unit::*}; #[test] fn single_glyph() { let data = convert(&[ UInt16(1), // version UInt16(0), // flags UInt32(1), // number of strikes UInt32(12), // strike offset [0] // Strike [0] UInt16(20), // pixels_per_em UInt16(72), // ppi UInt32(12), // glyph data offset [0] UInt32(44), // glyph data offset [1] // Glyph Data [0] UInt16(1), // x UInt16(2), // y Raw(b"png "), // type tag // PNG data, just the part we need Raw(&[0x89, 0x50, 0x4E, 0x47]), Raw(&[0x0D, 0x0A, 0x1A, 0x0A]), Raw(&[0x00, 0x00, 0x00, 0x0D]), Raw(&[0x49, 0x48, 0x44, 0x52]), UInt32(20), // width UInt32(30), // height ]); let table = Table::parse(NonZeroU16::new(1).unwrap(), &data).unwrap(); assert_eq!(table.strikes.len(), 1); let strike = table.strikes.get(0).unwrap(); 
assert_eq!(strike.pixels_per_em, 20); assert_eq!(strike.ppi, 72); assert_eq!(strike.len(), 1); let glyph_data = strike.get(GlyphId(0)).unwrap(); assert_eq!(glyph_data.x, 1); assert_eq!(glyph_data.y, 2); assert_eq!(glyph_data.width, 20); assert_eq!(glyph_data.height, 30); assert_eq!(glyph_data.pixels_per_em, 20); assert_eq!(glyph_data.format, RasterImageFormat::PNG); assert_eq!(glyph_data.data.len(), 24); } #[test] fn duplicate_glyph() { let data = convert(&[ UInt16(1), // version UInt16(0), // flags UInt32(1), // number of strikes UInt32(12), // strike offset [0] // Strike [0] UInt16(20), // pixels_per_em UInt16(72), // ppi UInt32(16), // glyph data offset [0] UInt32(48), // glyph data offset [1] UInt32(58), // glyph data offset [2] // Glyph Data [0] UInt16(1), // x UInt16(2), // y Raw(b"png "), // type tag // PNG data, just the part we need Raw(&[0x89, 0x50, 0x4E, 0x47]), Raw(&[0x0D, 0x0A, 0x1A, 0x0A]), Raw(&[0x00, 0x00, 0x00, 0x0D]), Raw(&[0x49, 0x48, 0x44, 0x52]), UInt32(20), // width UInt32(30), // height // Glyph Data [1] UInt16(3), // x UInt16(4), // y Raw(b"dupe"), // type tag UInt16(0), // glyph id ]); let table = Table::parse(NonZeroU16::new(2).unwrap(), &data).unwrap(); assert_eq!(table.strikes.len(), 1); let strike = table.strikes.get(0).unwrap(); assert_eq!(strike.pixels_per_em, 20); assert_eq!(strike.ppi, 72); assert_eq!(strike.len(), 2); let glyph_data = strike.get(GlyphId(1)).unwrap(); assert_eq!(glyph_data.x, 1); assert_eq!(glyph_data.y, 2); assert_eq!(glyph_data.width, 20); assert_eq!(glyph_data.height, 30); assert_eq!(glyph_data.pixels_per_em, 20); assert_eq!(glyph_data.format, RasterImageFormat::PNG); assert_eq!(glyph_data.data.len(), 24); } #[test] fn recursive() { let data = convert(&[ UInt16(1), // version UInt16(0), // flags UInt32(1), // number of strikes UInt32(12), // strike offset [0] // Strike [0] UInt16(20), // pixels_per_em UInt16(72), // ppi UInt32(16), // glyph data offset [0] UInt32(26), // glyph data offset [1] UInt32(36), // glyph 
data offset [2] // Glyph Data [0] UInt16(1), // x UInt16(2), // y Raw(b"dupe"), // type tag UInt16(0), // glyph id // Glyph Data [1] UInt16(1), // x UInt16(2), // y Raw(b"dupe"), // type tag UInt16(0), // glyph id ]); let table = Table::parse(NonZeroU16::new(2).unwrap(), &data).unwrap(); let strike = table.strikes.get(0).unwrap(); assert!(strike.get(GlyphId(0)).is_none()); assert!(strike.get(GlyphId(1)).is_none()); } ttf-parser-0.24.1/tests/tables/trak.rs000064400000000000000000000061471046102023000157320ustar 00000000000000use ttf_parser::trak::Table; use crate::{convert, Unit::*}; #[test] fn empty() { let data = convert(&[ Fixed(1.0), // version UInt16(0), // format UInt16(0), // horizontal data offset UInt16(0), // vertical data offset UInt16(0), // padding ]); let table = Table::parse(&data).unwrap(); assert_eq!(table.horizontal.tracks.len(), 0); assert_eq!(table.horizontal.sizes.len(), 0); assert_eq!(table.vertical.tracks.len(), 0); assert_eq!(table.vertical.sizes.len(), 0); } #[test] fn basic() { let data = convert(&[ Fixed(1.0), // version UInt16(0), // format UInt16(12), // horizontal data offset UInt16(0), // vertical data offset UInt16(0), // padding // TrackData UInt16(3), // number of tracks UInt16(2), // number of sizes UInt32(44), // offset to size table // TrackTableEntry [0] Fixed(-1.0), // track UInt16(256), // name index UInt16(52), // offset of the two per-size tracking values // TrackTableEntry [1] Fixed(0.0), // track UInt16(258), // name index UInt16(60), // offset of the two per-size tracking values // TrackTableEntry [2] Fixed(1.0), // track UInt16(257), // name index UInt16(56), // offset of the two per-size tracking values // Size [0] Fixed(12.0), // points // Size [1] Fixed(24.0), // points // Per-size tracking values. 
Int16(-15), Int16(-7), Int16(50), Int16(20), Int16(0), Int16(0), ]); let table = Table::parse(&data).unwrap(); assert_eq!(table.horizontal.tracks.len(), 3); assert_eq!(table.horizontal.tracks.get(0).unwrap().value, -1.0); assert_eq!(table.horizontal.tracks.get(1).unwrap().value, 0.0); assert_eq!(table.horizontal.tracks.get(2).unwrap().value, 1.0); assert_eq!(table.horizontal.tracks.get(0).unwrap().name_index, 256); assert_eq!(table.horizontal.tracks.get(1).unwrap().name_index, 258); assert_eq!(table.horizontal.tracks.get(2).unwrap().name_index, 257); assert_eq!(table.horizontal.tracks.get(0).unwrap().values.len(), 2); assert_eq!(table.horizontal.tracks.get(0).unwrap().values.get(0).unwrap(), -15); assert_eq!(table.horizontal.tracks.get(0).unwrap().values.get(1).unwrap(), -7); assert_eq!(table.horizontal.tracks.get(1).unwrap().values.len(), 2); assert_eq!(table.horizontal.tracks.get(1).unwrap().values.get(0).unwrap(), 0); assert_eq!(table.horizontal.tracks.get(1).unwrap().values.get(1).unwrap(), 0); assert_eq!(table.horizontal.tracks.get(2).unwrap().values.len(), 2); assert_eq!(table.horizontal.tracks.get(2).unwrap().values.get(0).unwrap(), 50); assert_eq!(table.horizontal.tracks.get(2).unwrap().values.get(1).unwrap(), 20); assert_eq!(table.horizontal.sizes.len(), 2); assert_eq!(table.horizontal.sizes.get(0).unwrap().0, 12.0); assert_eq!(table.horizontal.sizes.get(1).unwrap().0, 24.0); assert_eq!(table.vertical.tracks.len(), 0); assert_eq!(table.vertical.sizes.len(), 0); }