This produces more granular information on the actual pixel errors
present between two bitmaps. It now also asserts that both bitmaps are
the same size, since we should never compare two differently sized
bitmaps anyway.
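A minimal sketch of what such a comparison could look like (the helper and struct names here are illustrative, not the actual test API):

    #include <AK/Assertions.h>
    #include <LibGfx/Bitmap.h>

    // Illustrative sketch: count differing pixels instead of stopping at the
    // first mismatch, and insist that the two bitmaps have the same size.
    struct DiffResult {
        size_t differing_pixels { 0 };
        size_t total_pixels { 0 };
    };

    static DiffResult diff_bitmaps(Gfx::Bitmap const& a, Gfx::Bitmap const& b)
    {
        // Comparing differently sized bitmaps is always a bug in the caller.
        VERIFY(a.size() == b.size());

        DiffResult result;
        result.total_pixels = static_cast<size_t>(a.width()) * a.height();
        for (int y = 0; y < a.height(); ++y) {
            for (int x = 0; x < a.width(); ++x) {
                if (a.get_pixel(x, y) != b.get_pixel(x, y))
                    ++result.differing_pixels;
            }
        }
        return result;
    }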
By keeping track of the enclosing range around all Unicode ranges of a
FontCascadeList entry, we can quickly reject any code point that's
outside all ranges.
This knocks font_for_code_point() from 7% to 3% in the profile when
scrolling on https://screenshotone.com/
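The quick-reject idea, roughly (all names here are hypothetical; the real code lives in FontCascadeList):

    #include <AK/StdLibExtras.h>
    #include <AK/Types.h>
    #include <AK/Vector.h>

    // Hypothetical sketch: cache the smallest range enclosing all of an
    // entry's unicode-ranges so most lookups bail out after two comparisons.
    struct CodePointRange {
        u32 first { 0 };
        u32 last { 0 };
        bool contains(u32 code_point) const { return code_point >= first && code_point <= last; }
    };

    struct CascadeEntry {
        Vector<CodePointRange> ranges;
        u32 enclosing_first { 0xFFFFFFFF };
        u32 enclosing_last { 0 };

        void recompute_enclosing_range()
        {
            for (auto const& range : ranges) {
                enclosing_first = min(enclosing_first, range.first);
                enclosing_last = max(enclosing_last, range.last);
            }
        }

        bool contains(u32 code_point) const
        {
            if (ranges.is_empty())
                return true; // No unicode-range restriction: covers everything.
            // Fast rejection: outside the enclosing range means outside every range.
            if (code_point < enclosing_first || code_point > enclosing_last)
                return false;
            for (auto const& range : ranges) {
                if (range.contains(code_point))
                    return true;
            }
            return false;
        }
    };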
Now that the Skia backend context is available by the time backing
stores are allocated, there is no need for a separate BackingStore
class. This allows us to get rid of the BackingStore -> PaintingSurface
cache.
Our floating point number parser was based on the fast_float library:
https://github.com/fastfloat/fast_float
However, our implementation only supports 8-bit characters. To support
UTF-16, we will need to be able to convert char16_t-based strings to
numbers as well. This works out-of-the-box with fast_float.
We can also use fast_float for integer parsing.
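For example, fast_float's from_chars() can be handed char16_t iterators directly, and recent releases also provide integer overloads (standalone sketch, not our actual call sites):

    #include <fast_float/fast_float.h>

    #include <string_view>
    #include <system_error>

    // Standalone sketch: parse a double and an int out of char16_t data.
    static bool parse_utf16_double(std::u16string_view input, double& value)
    {
        auto result = fast_float::from_chars(input.data(), input.data() + input.size(), value);
        return result.ec == std::errc {};
    }

    // Integer parsing needs a sufficiently recent fast_float version.
    static bool parse_utf16_int(std::u16string_view input, int& value)
    {
        auto result = fast_float::from_chars(input.data(), input.data() + input.size(), value);
        return result.ec == std::errc {};
    }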
Gfx::Bitmap only supports a bit depth of 8, so we refused to load AVIF
images that didn't have this bit depth.
However, we can tell the libavif decoder to reduce the output depth by
setting avifRGBImage.depth to 8. This allows us to support any input
depth.
Makes images load on https://www.ikea.com/ which uses Cloudflare Images
to re-encode their images to 16-bit AVIF.
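Roughly, the decode path can now request 8-bit RGB output no matter what the source depth is (simplified sketch of the libavif calls involved, assuming libavif >= 1.0 where avifRGBImageAllocatePixels() returns an avifResult):

    #include <avif/avif.h>

    // Simplified sketch: convert whatever depth the decoder produced to 8-bit RGB.
    static avifResult decode_image_to_8bit_rgb(avifDecoder* decoder, avifRGBImage& rgb)
    {
        avifRGBImageSetDefaults(&rgb, decoder->image);
        rgb.depth = 8; // Ask libavif to downconvert 10/12/16-bit sources for us.
        rgb.format = AVIF_RGB_FORMAT_BGRA;

        if (auto result = avifRGBImageAllocatePixels(&rgb); result != AVIF_RESULT_OK)
            return result;
        return avifImageYUVToRGB(decoder->image, &rgb);
    }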
Skia's SkColorSpace::Make() doesn't seem to like ICC v2 profiles
containing table-based L* transfer curves. Now, when
SkColorSpace::Make() fails, we use skcms_ApproximateCurve() to convert
these tables into parametric curves that Skia does support.
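A rough sketch of that fallback, assuming we already have the parsed skcms_ICCProfile (the include paths and exact channel handling differ in the real code):

    // Include paths depend on how Skia and skcms are vendored.
    #include <core/SkColorSpace.h>
    #include <skcms.h>

    static sk_sp<SkColorSpace> make_color_space_with_fallback(skcms_ICCProfile profile)
    {
        if (auto color_space = SkColorSpace::Make(profile))
            return color_space;

        // Approximate each table-based transfer curve with a parametric one.
        if (profile.has_trc) {
            for (auto& curve : profile.trc) {
                if (curve.table_entries == 0)
                    continue; // Already parametric.
                skcms_TransferFunction approximation {};
                float max_error = 0;
                if (skcms_ApproximateCurve(&curve, &approximation, &max_error)) {
                    curve.table_entries = 0;
                    curve.parametric = approximation;
                }
            }
        }
        return SkColorSpace::Make(profile);
    }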
Based very scientifically on what's listed here:
https://harfbuzz.github.io/what-does-harfbuzz-do.html
I've moved the code into LibGfx because including a HarfBuzz header
directly from LibWeb is a little unpleasant. But the Gfx::FontTech enum
follows the CSS definitions for font features for simplicity.
TrueType collections are supported. SVG and Embedded OpenType are not,
but they're not widely supported by other browsers so that's fine.
Most of the features are completely supported by HarfBuzz, so we can
just return true. Graphite support is optional (and it appears we use a
build of HarfBuzz without it) but there's a define we can check.
Incremental Font Transfer is a whole separate thing that we definitely
don't support yet.
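The Graphite check boils down to something like this (sketch only; the enum is a subset of the real Gfx::FontTech, and HB_HAS_GRAPHITE comes from hb-features.h, which recent HarfBuzz releases install):

    // hb-features.h defines HB_HAS_* macros describing how this HarfBuzz
    // build was configured; the include path is an assumption.
    #include <hb-features.h>

    // Illustrative subset of the values the real enum covers.
    enum class FontTech {
        FeaturesOpentype,
        FeaturesAat,
        FeaturesGraphite,
        ColorColrv0,
        Variations,
        Palettes,
        Incremental,
    };

    static bool font_tech_is_supported(FontTech tech)
    {
        switch (tech) {
        case FontTech::FeaturesGraphite:
    #ifdef HB_HAS_GRAPHITE
            return true;
    #else
            return false;
    #endif
        case FontTech::Incremental:
            // Incremental Font Transfer is a separate protocol we don't implement.
            return false;
        default:
            // Everything else is handled by HarfBuzz for us.
            return true;
        }
    }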
This header file is never included by any other headers, and including
it would require adding the fontconfig include paths to the Swift
interop header generation, which is not desirable.
This commit also introduces a GlobalFontConfig class that is now
responsible for fontconfig initialization. That seems sane, even though
FcInit()'s docs state that the routine does nothing if the default
configuration has already been loaded.
The goal is to rely on fontconfig for font directory resolution. It
doesn't seem appropriate to call fontconfig functions from within
LibCore.
I'm not 100% confident that FontDatabase is the correct place, but it
seems okay.
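A minimal sketch of what GlobalFontConfig could look like (the class shape is an assumption; FcInit(), FcConfigGetCurrent(), FcConfigGetFontDirs() and the FcStrList calls are the real fontconfig API):

    #include <AK/ByteString.h>
    #include <AK/Vector.h>
    #include <fontconfig/fontconfig.h>

    // Sketch: a process-wide singleton that owns fontconfig initialization
    // and exposes the configured font directories.
    class GlobalFontConfig {
    public:
        static GlobalFontConfig& the()
        {
            static GlobalFontConfig instance;
            return instance;
        }

        Vector<ByteString> font_directories() const
        {
            Vector<ByteString> directories;
            FcStrList* list = FcConfigGetFontDirs(m_config);
            if (!list)
                return directories;
            while (FcChar8* path = FcStrListNext(list))
                directories.append(reinterpret_cast<char const*>(path));
            FcStrListDone(list);
            return directories;
        }

    private:
        GlobalFontConfig()
        {
            // FcInit() is documented to do nothing if the default configuration
            // is already loaded, so calling it here is safe.
            FcInit();
            m_config = FcConfigGetCurrent();
        }

        FcConfig* m_config { nullptr };
    };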
In the previous fix, we were still drawing IDAT data to the reference
frame even when no fcTL was present. This would cause rendering issues
when subsequent frames use APNG_BLEND_OP_OVER blending mode, as they
would composite over the incorrect reference frame. This commit adds a
simple check to properly skip any frame without an fcTL chunk.
We now properly handle OS/2 format BMPs that use 3 bytes per color
entry instead of 4. While OS/2 2.x officially specified 4 bytes per
color, some tools still produce files with 3-byte entries. We can
identify such files by checking the available color table space.
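The check is roughly: divide the space the file leaves for the color table by the number of palette entries (sketch with hypothetical parameter names):

    #include <AK/Types.h>

    // Sketch: decide whether each palette entry occupies 3 or 4 bytes based on
    // how much room the file actually reserves for the color table, i.e. the
    // distance between the end of the info header and the pixel data offset.
    static u8 bytes_per_color_entry(size_t color_table_size_in_bytes, size_t color_count)
    {
        if (color_count == 0)
            return 4;
        // OS/2 2.x officially uses 4 bytes per entry, but some tools write 3.
        if (color_table_size_in_bytes / color_count == 3)
            return 3;
        return 4;
    }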
This extends the ICO loader to support Windows cursor files. There is
no point in creating a separate loader for this, as the ICO format is
very similar to the CUR format. The only differences are the bytes used
to identify the file and the presence of a hotspot in the CUR header.
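For reference, the on-disk directory layout is shared; only the type field and the meaning of two entry fields change (sketch with hypothetical struct names):

    #include <AK/Types.h>

    // Shared ICO/CUR directory header and entry (little-endian on disk).
    struct [[gnu::packed]] IconDirectory {
        u16 reserved;    // Must be 0.
        u16 image_type;  // 1 = icon (.ico), 2 = cursor (.cur).
        u16 image_count;
    };

    struct [[gnu::packed]] IconDirectoryEntry {
        u8 width;
        u8 height;
        u8 color_count;
        u8 reserved;
        u16 planes_or_hotspot_x;     // ICO: color planes.   CUR: hotspot X.
        u16 bit_count_or_hotspot_y;  // ICO: bits per pixel. CUR: hotspot Y.
        u32 size_in_bytes;
        u32 image_offset;
    };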
Color masks should only be used when the compression type is either
BITFIELDS or ALPHABITFIELDS. Previously they were always read, which
produced corrupted images when the mask fields contained random data.
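In other words, mask reading is now gated on the compression value (tiny sketch; the numeric values are the standard BI_* constants):

    #include <AK/Types.h>

    // BI_BITFIELDS == 3 and BI_ALPHABITFIELDS == 6 in the BMP info header;
    // only those compression types carry meaningful color masks.
    static bool compression_uses_color_masks(u32 compression)
    {
        return compression == 3 || compression == 6;
    }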
Other browsers don't treat BMP files with more than 1024 colors as
invalid. They clamp the palette instead, and now we do the same. This
allows us to load more BMPs.
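i.e. something along these lines when sizing the palette (sketch):

    #include <AK/Types.h>

    // Sketch: clamp the declared palette size to 1024 entries instead of
    // rejecting the file, matching what other browsers do.
    static size_t effective_palette_size(size_t declared_color_count)
    {
        constexpr size_t max_palette_entries = 1024;
        return declared_color_count > max_palette_entries ? max_palette_entries : declared_color_count;
    }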
Typeface::try_load_from_externally_owned_memory() relies on that
external owner keeping the memory around. However, neither WOFF nor
WOFF2 does so - they both create separate ByteBuffers to hold the TTF
data. So, rename them to make it clearer that they don't have any
requirements on the byte owner.
Before this change, IDAT data was mistakenly always included in the
animation. Now we only include frames with explicit fcTL chunks.
As per the PNG spec (third edition):
"The static image may be included as the first frame of the animation
by the presence of a single fcTL chunk before IDAT. Otherwise, the
static image is not part of the animation."
We also fall back to the IDAT data when an APNG has an acTL chunk but
no fcTL chunks. The test image is 062.png from fDAT-inherits-cICP.html
in WPT.
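The decision boils down to something like this (sketch with hypothetical names; the real decoder tracks this while walking the chunk stream):

    #include <AK/Types.h>

    // Sketch, following the PNG third edition wording:
    // - a single fcTL before IDAT means the static image is also frame 0;
    // - otherwise the static image is not part of the animation;
    // - acTL with no fcTL chunks at all means we fall back to the static image.
    struct ApngLayout {
        bool has_acTL { false };
        bool fcTL_before_IDAT { false };
        size_t fcTL_count { 0 };
    };

    static bool idat_is_part_of_animation(ApngLayout const& layout)
    {
        return layout.has_acTL && layout.fcTL_before_IDAT;
    }

    static bool should_fall_back_to_static_image(ApngLayout const& layout)
    {
        return layout.has_acTL && layout.fcTL_count == 0;
    }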