
coloraide's Introduction


ColorAide

Overview

ColorAide is a pure Python, object oriented approach to colors.

>>> from coloraide import Color
>>> c = Color("red")
>>> c.to_string()
'rgb(255 0 0)'
>>> c.convert('hsl').to_string()
'hsl(0 100% 50%)'
>>> c.set("lch.chroma", 30).to_string()
'rgb(173.81 114.29 97.218)'
>>> Color("blue").mix("yellow", space="lch").to_string()
'rgb(255 65.751 107.47)'

ColorAide particularly has a focus on the following:

  • Accurate colors.

  • Proper round tripping (where reasonable).

  • Be generally easy to pick up for the average user.

  • Support modern CSS color spaces and syntax.

  • Make accessible many new and old non-CSS color spaces.

  • Provide a number of useful utilities such as interpolation, color distancing, blending, gamut mapping, filters, correlated color temperature, color vision deficiency simulation, etc.

  • Provide a plugin API to extend supported color spaces and approaches to various utilities.

  • Allow users to configure defaults to their liking.

With ColorAide, you can specify a color, convert it to other color spaces, mix it with other colors, output it in different CSS formats, and much more!

Documentation

https://facelessuser.github.io/coloraide

License

MIT

coloraide's People

Contributors

cfra, facelessuser, kdrag0n


coloraide's Issues

Limit default colors and plugins to just CSS colors?

Should we limit the default registered color spaces to just CSS colors? Or just include them all and let the user subclass and deregister what they don't want? This goes for delta E plugins etc.

Right now we just include them all and the user would need to deregister any spaces they explicitly do not want to include. I have no problem releasing 1.0 this way, but just want to think whether we may prefer the default to be a bit lighter or not.

Static typing

May require a little refactoring to do it well, but this is a biggish project, and typing will help with maintenance.

Gamut mapping according to Color Level 4

The CSS spec has finally posted their gamut mapping algorithm https://drafts.csswg.org/css-color/#binsearch.

It approaches gamut mapping from an SDR perspective. What this means is that any colors that exceed 100% lightness or fall below 0% lightness are set to white or black respectively. This changes the algorithm somewhat from how we currently implement things.

You no longer have to wait for the low and high chroma to converge, as you know going into the binary search that every color will have a suitable lightness and that reducing chroma will always bring the color into gamut. Before, a lightness outside the 0% - 100% range could produce a color that was still out of gamut, meaning you wanted to continue searching until the low and high chroma converged, and then clip the result to be sure you were "in gamut". This is probably more of an HDR gamut mapping approach.

Essentially, we are currently performing something more like an HDR approach, while the CSS spec is sticking with an SDR approach until they deem they've found suitable HDR color spaces with equally suitable gamut mapping.

We should probably provide this SDR gamut mapping approach for those who wish to mimic what browsers do. The differences usually aren't significant, but results can be a little different. Basically, with the SDR approach, we simply return white or black if the color is too light or too dark; if not, we analyze the difference between the clipped color (a clipped form of the chroma-reduced color) and the current chroma-reduced color. If we can get the difference below the JND limit, we return the clipped color; if not, we continue the binary search, adjusting the chroma until we find an optimal reduction: if we are in gamut, we try to raise the chroma, and if we are out of gamut, we try to reduce it.

In general, using the SDR approach will keep the hue a little more consistent as you never land in a situation where you can't quite reach the JND and have to clip. I'm playing with it right now and will have to make a decision. We may or may not just use SDR for now or may provide both.
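The SDR loop described above can be sketched structurally. Everything here is a toy stand-in (a fake one-dimensional "gamut", a fake `delta_e`), not ColorAide's implementation; only the control flow mirrors the CSS binary-search approach:

```python
JND = 0.02  # toy "just noticeable difference" threshold

def in_gamut(l, c):
    # Toy 1-D gamut: allowed chroma shrinks toward the lightness extremes.
    return c <= 4 * l * (1 - l)

def clip(l, c):
    # Toy clip: clamp chroma to the gamut boundary.
    return l, min(c, 4 * l * (1 - l))

def delta_e(a, b):
    # Toy color distance.
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def sdr_gamut_map(l, c):
    # SDR shortcut: too light or too dark maps straight to white/black.
    if l >= 1:
        return 1.0, 0.0
    if l <= 0:
        return 0.0, 0.0
    if in_gamut(l, c):
        return l, c
    lo, hi = 0.0, c
    while hi - lo > 1e-6:
        c = (lo + hi) / 2
        if in_gamut(l, c):
            lo = c  # in gamut: try raising chroma back up
        else:
            clipped = clip(l, c)
            if delta_e(clipped, (l, c)) < JND:
                return clipped  # clipping is now imperceptible
            hi = c  # still too far off: keep reducing chroma
    return clip(l, c)
```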

Handling adjusters when interpolating and mixing

Color.js handles mixing a specific channel by the user providing NaN. We allow you to specify the channel name. If you want to mix multiple channels differently (in one shot), that is currently not possible.

The CSS Color 4 spec allows adjusters to have a different percentage on each channel. You can technically do this by calling mix X number of times, where X is the number of channels you're mixing. Each call would specify a different channel. Should this be easier to do in one shot? Would that complicate the generalized interpolation function?

Should specifying a channel be mixed with channel specific options? For instance, specifying just the hue channel to be mixed and the hue angle option?
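For illustration, a hypothetical one-shot, per-channel mix. This is not the ColorAide API; `mix_channels` and its weights mapping are invented for the sketch:

```python
def mix_channels(color1, color2, weights):
    """color1/color2 map channel name -> value; weights maps channel
    name -> interpolation amount t. Channels absent from `weights`
    keep their value from color1."""
    out = dict(color1)
    for channel, t in weights.items():
        out[channel] = color1[channel] + (color2[channel] - color1[channel]) * t
    return out

a = {'lightness': 0.2, 'chroma': 0.1, 'hue': 30.0}
b = {'lightness': 0.8, 'chroma': 0.3, 'hue': 90.0}
# Mix lightness at 50% and hue at 25% in one call; chroma is untouched.
mixed = mix_channels(a, b, {'lightness': 0.5, 'hue': 0.25})
```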

Should clipping clip hues?

I noticed that colorjs.io, when doing clipping, clips hues. This feels wrong as the spec even mentions that hues wrap. We just make sure the hues wrap and clip everything else. I know clipping is supposed to be "dumb", but since the spec states they wrap, it seems like we should not be clipping them. We should look into this. Comparing to the clipping done in browsers, it seems like we match up just fine.
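The distinction is easy to sketch: clamp non-hue channels to their range, but wrap hue channels around:

```python
def clip_channel(value, low, high):
    # Non-hue channels clamp to their range.
    return min(max(value, low), high)

def wrap_hue(hue):
    # Hue channels wrap instead of clamping.
    return hue % 360

# An out-of-range saturation clamps...
assert clip_channel(120.0, 0.0, 100.0) == 100.0
# ...but an out-of-range hue wraps around rather than pinning at 0/360.
assert wrap_hue(380.0) == 20.0
assert wrap_hue(-40.0) == 320.0
```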

Enhancement: Python doesn't have a built-in cbrt, and ours is not great

In general, Python doesn't have a cube root function. Numpy does, but that is a heavy package that we've been trying to not require.

So, generally, people might suggest: "cube roots are easy! Just use n ** (1/3)." Well, this is what we've been using, and yes, we are in the ballpark, but it introduces errors for some things that should be exact.

>>> 64 ** (1/3)
3.9999999999999996
>>> (64 ** (1/3)) ** 3
63.99999999999998

Not great...we should get an even 4.

So what can we do? Just use Numpy? Well, not exactly. We can actually rework ours to be a little fancier, and a little more accurate. How are we going to do this? Using Newton's Method.

Now, keep in mind, not every value has a perfect root, and not every value can be perfectly calculated with a perfect cube root, so sometimes, we are going to have an approximation.

But here we go, now we've implemented a nth_root function using Newton's Method.

>>> (64 ** (1/3)) ** 3
63.99999999999998
>>> nth_root(64, 3) ** 3
64.0

Much better. Now, how does it work with decimals?

>>> nth_root(64, 1.8) ** 1.8
63.999999999999986
>>> (64 ** (1 / 1.8)) ** 1.8
64.00000000000001

Not bad. Which is better, going over or going under? 🤷 Either way, we are in the ballpark.

>>> (1.2 ** (1 / 1.8)) ** 1.8
1.2
>>> nth_root(1.2, 1.8) ** 1.8
1.2

And we generally handle negative values to avoid imaginary results:

>>> nth_root(-64, 1.8)
-10.079368399158984
>>> nth_root(64, 1.8)
10.079368399158984

Any requested root below 1 will be handled the old-fashioned way as the algorithm won't handle powers less than 1, and anything less than 0 will throw an error.
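A minimal sketch of such an `nth_root` is below. It refines the naive `x ** (1 / n)` guess with Newton's method; details will differ from the actual implementation:

```python
def nth_root(x, n, _iterations=32):
    """Approximate the real nth root of `x` via Newton's method."""
    if n < 0:
        raise ValueError('root must not be negative')
    # Handle negative values up front to avoid imaginary results.
    sign = -1.0 if x < 0 else 1.0
    x = abs(x)
    if n < 1:
        # The iteration below assumes n >= 1; fall back to the naive power.
        return sign * x ** (1 / n)
    if x == 0:
        return 0.0
    y = x ** (1 / n)  # naive estimate as the starting guess
    # Newton's method on f(y) = y**n - x:
    #     y <- ((n - 1) * y + x / y**(n - 1)) / n
    for _ in range(_iterations):
        prev = y
        y = ((n - 1) * y + x / y ** (n - 1)) / n
        if y == prev:
            break
    return sign * y
```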

Now, what does this do? The hope was generally just to make translations a little more stable. Beyond that, I had no real goals; it was more an evaluation. But looking at things, we can see results are a bit closer and better:

>>> Color('white').convert('lch').convert('hsl')
color(hsl 0 0% 100% / 1)
>>> Color('white').convert('lch-d65').convert('hsl')
color(hsl 0 0% 100% / 1)
>>> Color('white').convert('lab').convert('srgb').coords()
[0.9999999999999997, 1.0000000000000002, 0.9999999999999997]
>>> Color('white').convert('lab-d65').convert('srgb').coords()
[0.9999999999999997, 1.0000000000000002, 0.9999999999999997]

This is strictly tweaking nth_root calculations to make them more accurate (cube roots and others). No tweaking or optimizing of transform matrices, no playing around with rounding, just a better root calculation.

It's never going to be perfect, and none of this was to force perfect Lab results; those are just a bonus. Does it do this for ICtCp, OKLab, or Jzazbz? 😆 Nope! And there was no expectation it would, but all things considered, our results are pretty good.

>>> Color('white').convert('jzazbz').convert('srgb').coords()
[1.0000000000000067, 1.0000000000000038, 0.9999999999999908]
>>> Color('white').convert('oklab').convert('srgb').coords()
[1.0000000000000009, 0.999999999999999, 1.0000000000000007]
>>> Color('white').convert('ictcp').convert('srgb').coords()
[0.9999999999999402, 1.000000000000015, 1.0000000000000204]

OKLab can squeeze out a decent white HSL value, but not OKLch:

>>> Color('white').convert('oklab').convert('hsl').coords()
[nan, 0.0, 100.0]
>>> Color('white').convert('oklch').convert('hsl').coords()
[16.193122576085646, -720.3736700421233, 100.00412133090708]

Display P3 has a D65 whitepoint

In the documentation, the white point for Display P3 is given as D50. It should be D65. Display P3 is identical to sRGB in transfer curve, viewing conditions, white point, and image state. The sole difference is the display primaries.

Allow 'playground' like behavior in all code blocks?

This is just an idea, and may not be as practical as it is with JavaScript libraries, but it might be neat to allow any block style code block to be edited and ran using Pyodide.

So, what would need to change?

  • Code blocks would need to be transformed into a compatible format to work the same as Pyodide. That may require us to refactor some things.
  • Our Pyodide code would need to be cleaned up and be able to target any code block on the page, not just the one with the hardcoded id.
  • We may want to pre-render such blocks and defer loading Pyodide until a user clicks an 'Edit' button for the first time. Maybe provide some visual indicating that Pyodide is loading.

I think if we did the above, it would keep docs snappy until a user wanted to edit a block, in that case, the environment would be prepared and edits should be pretty snappy after the first initial load.

Do we need this? No. Is it a high priority? No. But it would be cool 😉. I don't think we are going to try and implement a whole notebook framework (or incorporate existing frameworks), but it is nice to read about a certain feature and immediately be able to play with it on the page.

More unit tests are required

We've generally been moving fast to get something functional out as we had a need to do so. But focus in the future should be towards getting things unit tested to make sure we don't break anything.

  • Coverage for the API. This includes all available methods, properties, and exercising their different parameters.
  • Cover subclassing and overriding default parameters of the Color object. I forgot this during the first phase, so now it is here so I don't forget.
  • Test corner cases through the API to exercise logic in underlying modules. While the first phase of API tests will make sure there is nothing broken with the API, this will dig down deeper into specific cases, like what if we pass a bad type into parameter x. What if we exceed the range of parameter y, etc. We may cover many of these cases in the first API pass, but this pass will ensure we are covering everything.
  • Color space string-specific outputs.
  • Tests demonstrating specific conversion cases.

Is fitting/setting in a different color space than the current useful?

Fitting in different color spaces can have mixed results. If you are in one sRGB color space variant, let's say srgb, and you want to set hsl, fitting and setting work fine.

If you are out of gamut in lch and want to fit to srgb, when you are close to things like zero, the calculation can be ever so slightly off. So you can fit, but the conversion back will give you the same thing you had before, and then you are still out of gamut, though the user may assume they are now in gamut.

This can also occur with channel setting via set. So, the question is, do we need to put a disclaimer? Axe the ability altogether? Something to think about.

Auto minimize and package notebook/playground code

The interactive playground/notebook code should get integrated better with a simple way to minimize and lint the code etc. Probably pull over the framework from pymdown-extensions. We can then build our CSS with SCSS and decrease the payload size of what we are downloading on the pages.

Use proper white point as defined in the CSS spec as opposed to the Bruce Lindbloom one

CSS mentions that xy = (0.31272, 0.32903) should be used to calculate the white point instead of the white point we are using from Bruce Lindbloom: [0.95047, 1, 1.08883]. These xy values will yield a different value (loosely in the ballpark, but still different).

The actual sRGB spec uses the same values but they are rounded off to four decimal places: xy = (0.3127, 0.3290). I imagine not too much of a difference should be noticeable one way or another. The more precise value will likely make things calculate a little better.
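For reference, a white point's XYZ tristimulus values follow from its xy chromaticity (with Y normalized to 1) as X = x / y, Y = 1, Z = (1 - x - y) / y:

```python
def xy_to_xyz(x, y):
    # Y is normalized to 1; X and Z follow from the chromaticity definition.
    return [x / y, 1.0, (1.0 - x - y) / y]

# The CSS D65 chromaticity pair from above:
white = xy_to_xyz(0.31272, 0.32903)
# roughly [0.9504, 1.0, 1.0888]: in the ballpark of, but not identical
# to, the Bruce Lindbloom tristimulus values.
```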

Other spaces?

We probably don't plan to support every color space out there, but we'd probably like to support some others. Maybe:

Potentially others that would be fun to play with.

DOCS: Need to update styling to fix content tab styles

Material broke our overflow styling of content tabs. A fix has been made over at pymdown-extension's styling (the styling we use in these docs). We'll have to update the styling dependency and then regenerate our docs because right now tab overflow on mobile is broken.

What white point should we use for D50 and D65?

I know I kind of keep asking this. Currently, we are using white points as described by the CIE, with xy values rounded to 4 decimal places. Most people do, but not everyone. We used to use the ones on http://www.brucelindbloom.com/. We toyed with using what is described in the CSS spec, which is what we have but with the xy coordinates rounded to 5 decimals.

Obviously, we will never match with everyone, and we ultimately have to choose. But which one before we go out of alpha?

Percent handling in Lab, Lch, Oklab, Oklch, etc.

It seems percentages are actually going to happen for the a and b values of Lab, chroma in Lch, etc.

Percentages will be based on suitable ranges that are decent for things like Display P3. Often ranges are lopsided in a given space, but they are being normalized. So even if a range is something like [-0.2, 0.38], they'll just use [-0.4, 0.4], etc. They are basically adding okay ranges that are easy to handle. This is fine and lets people use percentages if they have a need. The good news is that percentages look like they'll be optional. So while right now lightness is always a percent in Lab, Lch, Oklab, and Oklch, in the future it is likely to be optional. This means we could specify an Oklab color as oklab(0.8 0.2 -0.3) or oklab(80% 0.2 -0.3). This is nice because people can copy real-world Oklab colors into a CSS color.

I assume these percentage conventions will bleed over into the color(oklab l a b) format once they finalize it. Not sure if hues will be a percentage or not. They aren't when it comes to oklab() and friends. It's still early, so we'll wait and see.

Recognize `color()` functions with fallbacks

We should parse the default color() function in such a way as to allow fallback colors. Of course, fallback colors will not contribute to the final color, but we will be able to parse the color function even if fallbacks are provided.

Maybe find a better way to manage conversions

Currently, we just chain conversion functions that came straight from the CSS specifications (though some may vary slightly).

Color.js, which we have used as a reference in many cases seems to use XYZ as a waypoint to do "most" conversions through. Some conversions may be more direct. The color space class in their case carries around these methods. Whether we do things similarly or not, there is likely a better way we can manage conversions than we do now.

We should look into this.

Allow using other chromatic adaptation methods?

We could allow this. In general, it would be easy enough to have the needed matrices pre-calculated.

Now, if we wanted to have different illuminants (other than D50 and D65), then we'd have to calculate these on the fly, which would be slower. I guess if we ever supported color spaces with illuminants other than D50 and D65, we may need to add such functionality. Or at least pre-compile D65 and D50 and then, if you have to deal with an unexpected illuminant, calculate one on the fly 🤷🏻. Something to look at.
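As a sketch of what calculating on the fly would involve, here is a Bradford-style adaptation matrix computed from two white points. The Bradford matrix constants are standard; the tiny matrix helpers just avoid a NumPy dependency:

```python
# Standard Bradford cone-response matrix.
BRADFORD = [
    [0.8951, 0.2664, -0.1614],
    [-0.7502, 1.7135, 0.0367],
    [0.0389, -0.0685, 1.0296],
]

def matmul_v(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def inverse(m):
    # 3x3 inverse via the adjugate.
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [
        [e * i - f * h, c * h - b * i, b * f - c * e],
        [f * g - d * i, a * i - c * g, c * d - a * f],
        [d * h - e * g, b * g - a * h, a * e - b * d],
    ]
    return [[x / det for x in row] for row in adj]

def adaptation_matrix(src_white, dst_white):
    # Transform both whites to cone responses, scale, transform back.
    src = matmul_v(BRADFORD, src_white)
    dst = matmul_v(BRADFORD, dst_white)
    scale = [[dst[0] / src[0], 0.0, 0.0],
             [0.0, dst[1] / src[1], 0.0],
             [0.0, 0.0, dst[2] / src[2]]]
    return matmul(inverse(BRADFORD), matmul(scale, BRADFORD))

D65 = [0.9504, 1.0, 1.0888]  # rounded tristimulus values
D50 = [0.9642, 1.0, 0.8249]
M = adaptation_matrix(D65, D50)
# Adapting the D65 white itself should land (near-exactly) on D50.
adapted = matmul_v(M, D65)
```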

Non-official CSS color names should use the `--` prefix for names with `color()`

Things like color(hsl are not real supported color profiles in CSS. We should probably denote them as custom via color(--hsl. This will give us an avenue in case a color space is one day given an official name in CSS and we are already using the wrong one. It will avoid conflicts. Then we can support both the official name and our old custom one, and phase out the custom one.

Maybe expose clipping as a `.clip()` method instead of just an option for `.fit()`?

In the same way that ∆E76 is exposed as just distance() for Euclidean distance and can also be used via delta_e(method="76"), maybe we can expose .clip() from our gamut fitting/reduction method .fit(). Even with an actual gamut mapping/reduction via chroma reduction present, sometimes clipping can be useful, and it can be cumbersome to type .fit(method='clip'). I think at one time we had one but dropped it during the alpha period.

Alternatively, we could expose all fit approaches as fit_name, so fit_clip, fit_lch_chroma, etc., just like we do with delta_e, but I think .clip() is probably used enough that exposing it just as clip() is more than reasonable. We never really want to remove it either, so maybe it shouldn't even be considered a "plugin" exactly. Clipping should just always be available.

Drop magic string handling for setting an attribute

Drop magic string handling when setting via property (color.red = "20%") or via set (color.set('red', '20%')).

Turns out the color format takes percentages too, and it appears they handle things a bit differently, like always serializing percents to 0 - 1. I'm thinking that instead of all of this magic, we will parse CSS strings and such when creating a color object, but you have to work in numbers when modifying the color object directly.

Color function and percentages

Apparently, the color() function can take percentages for channels. It seems they are suggesting that those percentages should get normalized to 0 - 1.

It is still unclear as the spec states these are equivalent, and looking at this one would assume they normalize lightness to 0 - 1 internally.

lab(67.5345% -8.6911 -41.6019);
color(lab 67.5345% -8.6911 -41.6019);

But then they do this:

#7654CD;
rgb(46.27% 32.94% 80.39%)
color(lab 44.36 36.05 -58.99)
color(xyz 0.2005, 0.14089, 0.4472)

Here it is clear they are not using a percentage for lab, but it is treated just like a percent. This seems to go against their original statement, so how is lab actually represented with color()?

Handle `float('inf')`?

Basically, we can handle any float or integer except Infinity. If we get it, things will break. The options are to:

  1. Handle it like a NaN. Probably the easiest, and can be done simply by having is_nan check math.isfinite(x). No, Infinity is not a NaN, but you have to do something with it 🤔.
  2. Let it break stuff. A user should not use Infinity, so they get what they get 😈.
  3. Capture it and raise a sensible error, but then we need hooks in every possible place the user can set stuff 🤮.
  4. Check Infinity separately, when we check NaN, and raise an error. More reasonable. So if you do a convert or mix or convert to string, the library will see Infinity and throw an error 🤔.
  5. Check Infinity separately, but convert it to something else? But to what, zero? It wouldn't operate the same as NaN in mixing; it would just translate to zero 🤔.
  6. Something else? 🤷🏻
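Option 1 is a one-liner, sketched here: a single finiteness check makes NaN and both infinities indistinguishable to the rest of the library.

```python
import math

def is_nan(value):
    # Treat anything non-finite (NaN or +/-Infinity) as "not a number"
    # for the library's purposes.
    return not math.isfinite(value)

assert is_nan(float('nan'))
assert is_nan(float('inf'))
assert is_nan(float('-inf'))
assert not is_nan(0.5)
```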

Use some more direct conversions for HWB, HSL, and HSV

Currently, especially for HWB, many of the conversions go indirectly through sRGB and then back to their target space. But some conversions could be much more accurate. For instance, converting HWB directly to and from HSV. It may also make more sense to take HWB through HSV to get to HSL or sRGB. We need to evaluate what makes the most sense, but we should optimize conversions to be the cleanest. It was noticed that Colorjs.io does this, most likely to give the most accurate conversion possible.
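For reference, the direct HWB/HSV relationship (hue is shared between the two models): W = (1 - S) * V and B = 1 - V, with the inverse V = 1 - B and S = 1 - W / V. A minimal sketch, ignoring the usual W + B > 1 normalization:

```python
def hsv_to_hwb(h, s, v):
    # W = (1 - S) * V, B = 1 - V
    return h, (1 - s) * v, 1 - v

def hwb_to_hsv(h, w, b):
    # V = 1 - B, S = 1 - W / V (treat V == 0 as achromatic)
    v = 1 - b
    s = 1 - w / v if v else 0.0
    return h, s, v

# Round-tripping HSV -> HWB -> HSV:
h, s, v = hwb_to_hsv(*hsv_to_hwb(120.0, 0.5, 0.8))
```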

Ensure playground always works when deploying docs with a release

We seem to always have an issue with docs not quite getting deployed right. We seem to be able to deploy them manually, but the CI seems to never get it right.

We are currently building the wheels, running the build to ensure everything is properly linked in MkDocs before running gh-deploy, but the recent deploy was still accessing extra-notebook.js instead of extra-notebook-<hash>.js which caused the site to be broken. We need to figure out how to ensure this doesn't break.

Inconsistent use of white points

One of my biggest concerns with using CIELAB and CIELCH was how inconsistent the chroma calculation was. It would swing quite widely, and something always felt off. It bothered me greatly.

Recently I dug into it and found that the reason for the large swing in chroma, and why it was never closer to zero, was that white point usage was inconsistent. As CIELAB was in D50 and sRGB was in D65, there was a conversion to XYZ and then chromatic adaptation. It was in this area that CIELAB was thrown greatly off course.

We can see the heights to which chroma could reach when evaluating sRGB achromatic colors:

srgb: maximum chroma: 0.014636
srgb: maximum a: 0.003817
srgb: maximum b: 0.014130

But after re-evaluating all the matrices with the same chromatic adaptation white point, we can see a vast improvement.

srgb: maximum chroma: 0.000012
srgb: maximum a: 0.000011
srgb: maximum b: 0.000005

I want to code up an experimental branch that adjusts all the matrices. I am interested to see whether color.js (and the CSS spec) fix the XYZ matrices or adjust the chromatic adaptation matrices.

NOTE: I could not exactly figure out how color.js calculated their ProPhoto matrices. I was really, really close, but not spot-on. I'm not sure if their primaries were different or their white point was different from mine. But if all the matrices get updated, it won't matter as we would then match them.

Ref color-js/color.js#87

Maybe gamut mapping is not the best default?

I think gamut mapping is very useful and generally can help give a better color when going from a large color space to a small color space.

But having it enabled by default may not be the best approach. It may be better served as an on-demand option. I say this mainly because, during interpolation, gamut mapping can cause discontinuities. This is because the colors are adjusted to try and represent something that better reflects the intent of the out-of-gamut color, but that isn't what you want when interpolating, as it may produce something similar yet far enough away from the previous color to be quite noticeable.

It seems that clipping may be better in these scenarios, as the color will ramp up to the gamut limit and just be clipped there until it falls back into gamut. This will at least provide smooth transitions.


Notebook?

At this point, we've learned enough that we could totally pull this off. Most likely, we'd have to use Python Markdown as our documentation site's styling expects things to be generated from that. Yes, that's two more dependencies we'd have to pull in, but we'd only have to pull them on the Notebook page.

Do we need this? Nope. Would it still be kind of fun to do? Maybe 😁.

I'm pretty pleased with how things currently work, so it may be a bit before I look into this as I honestly have other things I should get to outside of this project.

Evaluate HSLuv and HPLuv (Implemented via Gist)

Is this worth adding? https://www.hsluv.org/. It is just LCHuv (CIELUV) converted to use saturation instead of chroma.

Document adding new color spaces, ∆E methods, and gamut mapping methods

We've mentioned in the docs that you can create your own color spaces and register them (along with ∆E methods and gamut mapping methods), but we've never actually documented how this can be done. This is mainly because we weren't 100% sure whether the solution was going to change. I think moving forward, and especially before a 1.0 release, we should document how this can be done.

Using `color()` for all colors

It sounds like CSS may be warming up to allowing the color() form for all colors, not just RGB-ish spaces.

It sounds like they may keep percentages specific to the RGB spaces as they are now, using 0% - 100% to represent 0 - 1, but for other spaces, like Lab etc., have 0% to 100% represent the given channel within something like the Display P3 space. So hues would probably map to the full range of hues (assuming they allow hues to be represented with percentages), lightness would be within the normal SDR range, and a and b would map to some +/- value that encompasses the overall min and max that may be used for all displayable Display P3 values.

This actually sounds good, and I am hoping they go this route. I don't imagine I would often be using percentages for things like a and b in Lab, but having some defined sane default sounds like the way to go.

What I'm most excited about is that I won't get burned at some future point as we will finally have a defined, universal method for specifying any color using the color() format. I don't know yet whether color() will have to accept deg for hue and such, but we'll see.

color function should only accept/output percentage format for percent only channels

Playing around with Safari Technology Preview today, I noticed that their lab implementation, when using the color() function, only accepts L as a percent.

Looking over the CSS Level 4 spec, I noticed that the very few examples given also show percentages being used. Nowhere is it explicitly mentioned that, for percentage channels, only percentage inputs should be accepted. Nowhere does it mention that serialization should only output "percentage" channels as percentages. I'm sure the question must have been asked somewhere, or Safari implemented it on the assumption that the inputs of the lab() format should match the inputs of the color() format.

Unfortunately, we were influenced by Colorjs.io's current implementation, and while we did make inquiries about some of the percentage handling, we never explicitly asked whether these "percent" channels should only be accepted as percents. With that said, we recently saw they have an issue open that explicitly mentions what this behavior should look like: color-js/color.js#70. It clearly states that colorjs.io is implemented incorrectly, as it allows percentage channels in the color() form to omit the %.

Basically, we need to require percentage symbols on percentage channels. Luckily this is pretty easy to fix. Each of our spaces has a "range" specifier that describes the range of each channel and what type of channel it is. We simply need to check for Percent type channels (when evaluating string inputs) and ensure their channel input ends in % or fail the match. Additionally, when outputting the color() form to string, we can simply attach a % to the channel if it is a Percent type.
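A sketch of that check. The Percent channel type and helper names here are hypothetical, not ColorAide's internals:

```python
import re

# Hypothetical channel token pattern: a number with an optional '%'.
CHANNEL_RE = re.compile(r'^([+-]?(?:\d+(?:\.\d+)?|\.\d+))(%?)$')

def parse_channel(token, is_percent):
    """Fail the match if a Percent-type channel lacks '%' (or vice versa)."""
    m = CHANNEL_RE.match(token)
    if m is None or (m.group(2) == '%') != is_percent:
        raise ValueError(f'{token!r} does not match the expected channel type')
    return float(m.group(1))

def serialize_channel(value, is_percent):
    # Re-attach the '%' on output for Percent-type channels.
    return f'{value}%' if is_percent else f'{value}'

assert parse_channel('67.5345%', True) == 67.5345
assert parse_channel('-8.6911', False) == -8.6911
assert serialize_channel(67.5345, True) == '67.5345%'
```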


As an aside, technically color() output does not support hues, so the color() output for any of our cylindrical spaces is not valid, but we already know that. color() is simply our default serialization of colors. If, in the future, the CSS spec implements hues in color() and provides additional guidance on how to represent them, and it is different than just accepting a plain number as we do now, then we will adjust accordingly. But since there is no guidance, we will continue to represent hues as a simple number and dump these spaces to the color() form as a generic serialization.

Gamut Mapping and Interpolation with Oklab/Oklch by default?

It seems the CSS spec may require gamut mapping with Oklch. The good thing is that this eliminates the purple shift issue of CIE LAB. This requires two parts:

  1. Distancing with Oklab which is much faster.
  2. A variant of the lch-chroma algorithm that uses oklab instead.

Generally, mapping looks better, but with more extreme colors, such as Rec 2020's blue mapped to sRGB, you get a washed-out blue. Granted this is in comparison to using CIE LCH which gives a very "blue" blue for Rec 2020 blue. Demo. Does that mean Oklch is wrong? Not really, but the mapping works differently for such colors.

Additionally, CSS wants to interpolate with Oklab by default. Interpolation is much better than CIE LAB in most cases. Hue is preserved and things generally blend better, but color interpolation tends to lean towards the dark end. Meaning that compared to CIE LAB, the middle gray is shifted towards the light end. Demo.

If this is what CSS does, we will add the implementation, but we'll have to decide if we wish to make it the default or not.

Handle pre-multiplied alpha values

This shouldn't be too difficult: we just need to premultiply the values with alpha before interpolation, and divide the channels by the interpolated alpha after interpolation. The notable thing is that we should skip angle channels; those are not multiplied before or divided after.
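A sketch of the approach (invented helper names, not the ColorAide implementation):

```python
def lerp(a, b, t):
    return a + (b - a) * t

def mix_premultiplied(c1, a1, c2, a2, t, angle_indexes=()):
    """Interpolate channel lists c1/c2 (alphas a1/a2) with premultiplied
    alpha. Channels listed in `angle_indexes` (hues) are interpolated
    plainly; they are never premultiplied or divided back out."""
    alpha = lerp(a1, a2, t)
    channels = []
    for i, (x, y) in enumerate(zip(c1, c2)):
        if i in angle_indexes:
            channels.append(lerp(x, y, t))
        else:
            # premultiply by each alpha, interpolate, then un-premultiply
            pre = lerp(x * a1, y * a2, t)
            channels.append(pre / alpha if alpha else 0.0)
    return channels, alpha

# Mixing a fully opaque color with a fully transparent one: the
# premultiplied result keeps the opaque color's channel values.
channels, alpha = mix_premultiplied([1.0, 0.25], 1.0, [0.0, 1.0], 0.0, 0.5)
```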

Gamut Mapping and Interpolation

Based on the CSS specification, it is noted that when interpolating in a given color space, if the colors are not in the given color space, they are converted, but if they are not in gamut, they are not gamut mapped. The CSS color-mix spec seemed to indicate that they would be gamut mapped. This seems to have been a mistake and was only mentioned prior to allowing extended ranges with sRGB, Display P3, etc.

What this means is that technically, sRGB and Display P3 can sanely be represented even if their colors are out of gamut, whether in the negative or positive direction. When finally evaluated, they will of course be gamut mapped, but this is unnecessary during interpolation.

Some color spaces, such as HSL, HWB, etc. have no sane extended range. These colors would generally need to be gamut mapped if they were out of range.

It sounds like it may be in our best interest, for better interpolation, not to gamut map when colors are too big for the target interpolation space, except in cases where the color space will not allow a sane interpolation if not restrained.

It sounds like this is currently how the CSS spec is worded for interpolation, though it doesn't specifically mention anything about exceptions for HSL, but there is an open issue to maybe word it as such.

Making the adjustment to avoid gamut mapping colors when converting to a smaller space, we can see the results are quite different from those we get when the colors are gamut mapped before interpolation. Essentially, we avoid truncating the full range of the interpolation; the final colors are still gamut mapped when evaluated for display, in this case in sRGB.
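As a toy illustration of the difference (plain Python, not ColorAide's API, with simple clipping standing in for real gamut mapping), compare interpolating unclamped endpoints against gamut mapping them up front:

```python
def lerp(a, b, t):
    """Linear interpolation between a and b at position t in [0, 1]."""
    return a + (b - a) * t

def mix_unclamped(rgb1, rgb2, t):
    """Interpolate sRGB coordinates channel-wise, leaving out-of-gamut
    values (negative or above 1) untouched."""
    return [lerp(a, b, t) for a, b in zip(rgb1, rgb2)]

def clip_for_display(rgb):
    """Naive clipping, standing in for gamut mapping at display time."""
    return [min(1.0, max(0.0, c)) for c in rgb]

# Out-of-gamut endpoint mixed with an in-gamut color at the halfway point.
wide = [-0.2, 0.5, 1.3]   # out of sRGB gamut in both directions
narrow = [0.4, 0.5, 0.5]

late = clip_for_display(mix_unclamped(wide, narrow, 0.5))
early = mix_unclamped(clip_for_display(wide), clip_for_display(narrow), 0.5)
# `late` and `early` differ: mapping first truncates the interpolation range.
```

Gamut mapping first collapses the wide endpoint before mixing, so every intermediate color is pulled toward the clipped value; mapping only at display time preserves the full sweep of the interpolation.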

(Screenshot: interpolation comparison, 2022-01-04, 9:40:48 PM)

When to release 1.0.0?

Currently, we are on a 0.X.Y release. This can be considered a prerelease, but it is public. I'm trying to decide what, if anything, needs to happen before we call it 1.0.0.

For the most part, the API is finalized. Could it change? Maybe. I'm not sure yet, but overall, most of the bugs have been shaken out.

Overall, I don't think new features should hold up a 1.0 release, they can always be added later.

  • Is there anything else we want to do before 1.0.0? Are there any areas where we still aren't quite happy with how things are?
  • Should we wait to see if they (CSSWG) formally add XYZ D65?
  • Should we wait to see if they (CSSWG) formally add Oklab?
  • Should we give more time to see how much they shake up the syntax? Some browsers are implementing some of the color changes, but there is no guarantee when they are going to be done changing things.
  • Should we be concerned about how none may get handled before 1.0?
  • Should #81 hold us up? Do we need to implement this exactly like CSS?
  • Should we be concerned about the color(space X X X / X) syntax and how we use it for all colors? It is technically only spec'd for rectangular spaces, but 🤷🏻. w3c/csswg-drafts#6741
  • Wait for resolution on percentages. w3c/csswg-drafts#6761
  • Document creating plugins. #99
  • Provide a lighter-weight default Color object that only provides the current, accepted CSS colors? Users would need to subclass and register the additional, provided color spaces if desired. #101
  • Gamut mapping algorithm is up in the air right now. Lch isn't perfect, but the CSS recommended one isn't either. The question is which one is more egregious. #118
  • Percent handling is currently being decided upon, and while it seems to have been settled for the non-color() syntax, things are still to be decided for the color() syntax, assuming they implement it in CSS. #116

Remove gamut mapping shortcut?

Our gamut mapping algorithm is very close to the method recommended by the CSS Level 4 color spec, with one exception: we shortcut the algorithm if flooring the chroma to zero does not yield a color that is in gamut. This makes the algorithm faster, avoiding a wasted binary search for the perfect chroma when no value will bring us into gamut. What this translates to is that the color's lightness is outside the range of the gamut. With the shortcut, such colors usually get mapped to white or black. Without the shortcut, these same colors will be very close to white or black, or may actually still end up as white or black.

I haven't done a great deal of testing to ensure there is no situation in which dropping the shortcut would be preferable, but my gut instinct is that any difference will be virtually unnoticeable. Still, maybe it is worth looking into. Alternatively, if we want to claim we strictly follow the CSS Level 4 spec, we should drop the shortcut so we can claim to follow it exactly.
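For reference, a rough sketch of the chroma-reduction approach as described, shortcut included (the `in_gamut` and `clip` callables are hypothetical stand-ins for the real space-specific operations, not ColorAide's API):

```python
def map_to_gamut(lightness, chroma, hue, in_gamut, clip, epsilon=0.001):
    """Gamut map an LCh-like color by reducing chroma.

    `in_gamut(l, c, h)` tests membership in the target gamut and
    `clip(l, c, h)` naively clamps into it; both are supplied by the caller.
    """
    if in_gamut(lightness, chroma, hue):
        return (lightness, chroma, hue)

    # Shortcut: if even zero chroma is out of gamut, the lightness itself
    # is out of range, so no chroma search can succeed. Just clip.
    if not in_gamut(lightness, 0.0, hue):
        return clip(lightness, chroma, hue)

    # Binary search for the largest chroma that stays in gamut.
    low, high = 0.0, chroma
    while high - low > epsilon:
        mid = (low + high) / 2
        if in_gamut(lightness, mid, hue):
            low = mid
        else:
            high = mid
    return clip(lightness, low, hue)
```

With the shortcut removed, the second check disappears and the binary search runs (and converges to zero chroma) even for colors whose lightness is out of range, producing nearly identical results at a higher cost.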

Docs: Playground doesn't load Pyodide in mobile Firefox (at least on Android)

I don't think there is much I can do, but I thought I'd note here that the playground doesn't work in mobile Firefox on Android. It works great on normal, desktop Firefox, so... 🤷🏻.

If anyone runs into this, this issue exists to acknowledge that the problem is known. I am unaware of any workarounds right now, as Pyodide's official REPL demo won't load in mobile Firefox either, so I imagine they need to fix something on their end.

Refactor color spaces

Move convert, gamut mapping, contrast utilities, etc. out of the color space class. Strip the color space class down to just what specifies the color space (or pare it down as far as we can). Everything else should be a higher-level operation and part of the parent Color() class. This will hopefully keep the spaces from having to know about each other.

Explore using a base conversion color for color spaces conversions

Currently, color spaces manage their conversion points with the general expectation that a D65 XYZ conversion point is always available. While XYZ is generally the preferred conversion point, some colors must go through other color spaces.
They then reuse those spaces' conversion points to provide the multi-step conversion process to XYZ (and sometimes others). With this additional complexity, we must also push chromatic adaptation down into the color space object.

Borrowing this idea from the WICG experimental color API, we'd like to experiment with the idea of the color space defining what color should be used as a conversion point and limit the conversion points to a single point.

For instance: HSL -> sRGB, sRGB -> sRGB Linear, sRGB Linear -> XYZ, HSV -> HSL, etc. Some conversions may have multiple steps: sRGB -> HSV would actually create the conversion chain sRGB -> HSL -> HSV. It's basically already like this, but the color spaces currently manage these chains themselves and may manage multiple chains for different conversion points.

Now, I'm not sure the above WICG suggestion is exactly how we'd approach it, as we'd like to shorten the chain when possible and avoid having something like sRGB go all the way through XYZ just to come back to HSV.

The current draft is here: https://wicg.github.io/color-api/#converting-colorspaces. In some cases, this will take us all the way down to XYZ even though there is a shorter path with less computation. So, we'd like to:

  • First, check whether the color already matches the target space, allowing us to skip conversion entirely.
  • Start by creating a list containing the conversion chain from XYZ to the target color space, along with an index mapping color space names to chain positions. This starts at the target and follows the base conversions until we bottom out at XYZ; all chains should eventually bottom out at XYZ.
  • Then begin converting the color to its base, checking after each step whether the base name is in the space index and, if so, stopping. Bottoming out at XYZ will also kick us out.
  • Now our color should be a base somewhere in our target's conversion chain, so using the index, we can start at that point in the conversion chain, skipping unnecessary steps and providing the optimal conversion path.
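The steps above can be sketched as follows (the `BASES` table is a hypothetical, abbreviated base map for illustration, not ColorAide's actual registry):

```python
# Each space names its single conversion base; XYZ D65 is the root.
BASES = {
    'hsv': 'hsl',
    'hsl': 'srgb',
    'srgb': 'srgb-linear',
    'srgb-linear': 'xyz-d65',
    'lab': 'xyz-d65',
    'xyz-d65': None,
}

def chain_to_root(space):
    """Follow base links from `space` down to the XYZ root."""
    chain = []
    while space is not None:
        chain.append(space)
        space = BASES[space]
    return chain

def conversion_path(source, target):
    """Resolve the shortest conversion path along the base links."""
    if source == target:
        return [source]
    # Chain from the target down to XYZ, with an index by space name.
    target_chain = chain_to_root(target)
    index = {name: i for i, name in enumerate(target_chain)}
    # Walk the source toward the root until we meet the target's chain.
    down = []
    space = source
    while space not in index:
        down.append(space)
        space = BASES[space]
    # Climb the target chain from the meeting point up to the target,
    # skipping every step below it.
    up = target_chain[:index[space] + 1]
    return down + list(reversed(up))
```

For example, sRGB -> HSV resolves to the short chain through HSL rather than dropping all the way to XYZ, while Lab -> HSL meets the target's chain only at the XYZ root.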

I'm not sure what kind of performance hit we may take doing this, but as far as maintainability goes, I believe this would make things even easier.
