Missing the wood for the trees. RGB displays are deliberately made to be 'good enough' rather than 'great'. None of them can display whole regions of the visible spectrum [1]. Before taking issue with tiny amounts of visible banding, maybe it would be better if monitors could actually display a reasonable portion of the visible spectrum...
It's quite obvious that for a gradient from 0x33 to 0x66 (in one component), there simply aren't enough distinct values to prevent banding. In this case, the range covers only 52 distinct values, for a gradient that might span 500-1000 pixels. It really looks quite terrible.
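A quick back-of-the-envelope sketch of that arithmetic (the 1000-pixel width is an assumption):

```python
# Band width for a single-channel gradient from 0x33 to 0x66,
# rendered across an assumed 1000 pixels.
start, end, width_px = 0x33, 0x66, 1000
levels = end - start + 1        # 52 distinct 8-bit values available
band_width = width_px / levels  # ~19 px per band
print(levels, round(band_width, 1))  # 52 19.2
```

At roughly 19 pixels per band, each step is far wider than what's needed for the eye to pick out the edges.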
I should point out that the area of a region in that diagram does not correspond with the ability to distinguish colors in that region. The diagram is misleading, but your point still stands.
It's an overreach to say high-DPI displays eliminate the need for anti-aliasing. You will still get shimmering, pixel popping, and temporal aliasing; it will just be harder to notice. If you have thin shapes like hanging cables or chain-link fences being displayed, even at high resolution you will get pixels coming in and out of existence. Higher-sample AA tends to add spatial "stability" so that slight shifts of viewpoint won't trigger this.
Author here. You're right, high-DPI does not actually eliminate aliasing, and I've toned down the wording slightly (added "nearly"). Certainly high-DPI is a huge step in the right direction even if several aliasing problems remain.
I was simply attempting to say "thank you" to Apple for innovating with high-DPI displays so that I could follow that by saying more or less what you did: many things remain to address image clarity.
I think it wouldn't have been so bad if it were slower, like clouds on a not-very-breezy day. As it is, it felt like it was trying to hypnotize me.
Author here. You're completely right. I did it for fun, not because I "should."
I'm not a designer; just someone who enjoyed playing with SVG and SMIL to make a subtle background effect that I liked.
You can turn off the animation with the menu at the bottom right. Apologies to everyone for burning your CPU cycles so needlessly. Just turn it off if you don't like it.
> Academics have told us that human eyes can't distinguish between the 16,777,216 colors provided by 24-bit depth, so we believe that even though it can't be true.
This is a fallacy. That the human eye cannot distinguish N different colours does not imply that a given colour space of >N colours contains every colour that the human eye can distinguish.
To give an equivalent example that is easier to understand, consider a colour space with 16.7 million shades of red. 16.7 million colours is more than the human eye can distinguish, but there are clearly many colours that the human eye can distinguish that are not in that colour space (notably shades of green, shades of blue, and combinations of red, green, and blue).
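The counting argument can be sketched with deliberately tiny made-up numbers:

```python
# Toy analogy with made-up numbers: suppose the eye could only
# distinguish four colours: black, red, green, blue.
eye = {(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)}

# A colour space with just as many entries -- but all of them reds:
red_space = {(0, 0, 0), (1, 0, 0), (2, 0, 0), (3, 0, 0)}

# "As many colours as the eye can distinguish"...
assert len(red_space) == len(eye)
# ...and yet green and blue are nowhere to be found in it.
assert not eye.issubset(red_space)
```

Having "more colours than the eye can distinguish" says nothing about *which* colours the space covers.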
I think he's disputing that the author's assertion -- that it "can't be true" that we're unable to distinguish the millions of colors in 24-bit sRGB -- actually follows from the argument given. It's important to get the fine details right even if you agree with the overall point (though I'm not sure who's actually right here).
I agree. Displays in general suck. The display on the RMBP is a step in the right direction, but there's still a lot of work to be done. Anyone who settles for "good enough" may as well live life like it's the 16th century. It was "good enough" then too.
My vision for the future? Specifications that fully exceed the human capacity for discernibility.
2880x1800? Nope, I can still see aliasing (particularly in Terminal when I'm coding).
IPS? Nope. Move your head slightly up and down and -- while the chromaticity stays (roughly) the same -- the luminance does not.
Black levels? I can still distinguish a black screen from the bezel, so it needs some work too.
Color depth? See the provided banding examples. Not to mention that the three primaries in most LCD panels form a very small triangle in the chromaticity diagram.
Refresh rates also stink. It's 2012 -- motion should be so fluid it looks real by now.
Anyway, there's a lot of potential for improvement, but I'm afraid that unless you're OCD (like me) or a color scientist, you just don't care too much.
Huh, don't get me started on image quality in general. Look how people constantly watch 4:3 movies horribly stretched on 16:9 screens. Look how bad the motion is on 99% of Blu-ray films (to the point of being unbearable) -- action scenes stutter and jitter except on the most recent hardware.
The number of bytes of memory available on GPUs is hardly the biggest issue when considering rendering at more than 8 bits per channel of color precision. Even producing a 30-bit framebuffer (as has been supported for a while on many desktop GPUs) comes at a performance penalty and is incompatible with many pieces of software and hardware. Modern GPUs do allow 16 bits per channel of precision when rendering, but it comes at a significant cost - various features no longer function, memory bandwidth is devoured, etc. You can't simply say 'we can spare the bits' and wave away all the technological challenges here, especially when the advantage gained from all those costs is comparatively minuscule.
To be fair, this sort of applies to retina displays as well: The hardware put into some of the Retina macs can barely handle the bandwidth demands of realtime rendering at such high resolutions. Until that problem is addressed, you certainly shouldn't be running around demanding >8bpc color precision.
Dithering for gradients could certainly make a minor difference in rendered quality, but I don't think most browser vendors are interested in making rendering slower right now - they're quite busy trying to make it faster, as evidenced by the fact that the OP's site runs like complete garbage even in modern browsers. Even if you spend the cycles and the power to dither, you're basically approximating something like one more bit of precision. Is it really worth the cost for just one more bit? You could at least support the argument for dithering by showing a side-by-side comparison.
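For what it's worth, the "extra bit" a dither buys is cheap to sketch. This is a minimal 2x2 ordered (Bayer) dither, not anything a real browser actually ships:

```python
# Minimal sketch of 2x2 ordered (Bayer) dithering: quantize
# fractional 8-bit values, rounding up or down depending on the
# pixel's position in a repeating threshold pattern.
BAYER_2X2 = [[0.0, 0.5],
             [0.75, 0.25]]  # thresholds in [0, 1)

def dither_row(values, y):
    """Quantize one row of fractional channel values to integers."""
    out = []
    for x, v in enumerate(values):
        base = int(v)
        frac = v - base
        out.append(base + (1 if frac > BAYER_2X2[y % 2][x % 2] else 0))
    return out

# A ramp from 72.0 toward 73.0 (i.e. between #484848 and #494949)
# over 8 pixels -- the dither mixes the two levels spatially instead
# of producing one hard band edge:
ramp = [72 + i / 8 for i in range(8)]
print(dither_row(ramp, 0))  # [72, 72, 73, 72, 73, 73, 73, 73]
```

Spread over two scanlines with the second threshold row, the mix averages out to roughly the intended in-between shade.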
You're right. I trivialize the challenges, and I admit that.
However, my point is that it has been some fifteen years since 24-bit "True Color" arrived. Certainly since then we've increased computing performance to the point where we can manage to throw a few more bits around, yes?
Also, you're right that browser vendors are dutifully concerned about performance. In fact, one reason I enjoyed adding the subtly animated background (see my previous reply apologizing to those who hate it) was that it provides a bit of evidence for another point I've made elsewhere: in 2012, computers that process billions of CPU operations per second, paired with high-powered GPUs, can still get chunky doing relatively trivial 2D animation in a web browser.
Incidentally, on that point, if you happen to have IE 10, check out how well it uses your GPU to do CSS transitions. It doesn't do the SVG/SMIL animation used in the background, but I find it fascinating how effortlessly it executes the animations it does support: http://tiamat.tsotech.com/ie-10-is-no-joke
I'd really like to see Chrome and Firefox catch up with that degree of GPU acceleration.
But in this particular blog entry, I'm asking that some attention return to rendering quality. I'd love to see color banding in gradients disappear soon.
Actually dithering makes a huge difference. I see zero banding whatsoever in the examples posted in this thread, because my sw/hw config does automatic dithering (Xorg fbdev 0.4.2, AMD Fusion E-350 GPU, Lenovo X120e laptop).
Contrast this with other posters, who I presume don't have dithering, and who write comments like "this gradient looks ugly"...
> Modern GPUs actually allow 16 bits per channel of precision when rendering, but it comes at a significant cost - various features no longer function, memory bandwidth is devoured, etc.
I'm not sure that this is true. It's extremely common for games to render lighting into HDR render targets. 16-bit-per-channel render targets are often used to accumulate excess light, which is then processed by successive full-screen pixel shader passes. These passes generally include blurring and flattening into a normal 8-bit-per-channel texture on the swap chain.
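A heavily simplified sketch of that accumulate-then-flatten pipeline (a Reinhard-style operator stands in here for whatever tone mapper a real engine uses):

```python
# Sketch only: accumulate light in a float "16-bit-per-channel style"
# buffer, then flatten to 8 bits per channel in a final pass.
def reinhard(hdr_value):
    """Map [0, inf) HDR light down to the [0, 1) displayable range."""
    return hdr_value / (1.0 + hdr_value)

def flatten_to_8bit(hdr_buffer):
    """The final full-screen pass: tone-map and quantize to 8 bits."""
    return [round(reinhard(v) * 255) for v in hdr_buffer]

# Light can exceed 1.0 in the HDR buffer without clipping; the
# flatten pass compresses it into displayable 8-bit values:
hdr = [0.0, 0.5, 1.0, 4.0, 16.0]
print(flatten_to_8bit(hdr))  # [0, 85, 128, 204, 240]
```

The point being: the high-precision buffer only has to survive until the flatten pass, so the swap chain itself stays 8-bit.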
Furthermore, multiple render targets have been common since before the release of the Xbox 360. Deferred shading engines come into and out of popularity on what seems like a biyearly basis depending on what visual style is popular in games currently.
It seems to me that modern hardware is quite capable of greater color range, but the most demanding applications, games, would rather use that extra video RAM and fill rate for storing and processing surface normals, lighting scalars, material properties, etc. Desktop applications always lag behind games dramatically. Hell, browsers are just now starting to see some of the benefits of modern GPUs.
>8bpc render targets have limited feature sets, especially when talking about older architectures like the XBox 360.
It's true that they're used to accumulate lighting and other information, but that's basically one of the simplest possible use cases for a render target; you're just writing pixel data.
Multisampling, blending, etc - all things used often in 3D rendering - are not necessarily supported on a >8bpc render target. Whether you can do them depends on the drivers and the hardware. The same goes for textures that are >8bpc (filtering might not work, for example).
I ran into this problem just yesterday in a program I was writing when I tried to do blending on a high precision render target. :)
As far as I know, most modern PC GPUs actually use a 32-bit framebuffer with 8 bits of padding for their 24-bit colour mode and do all shader computations in floating point already, so 30-bit colour isn't going to use any more memory bandwidth or shader time.
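The padding point can be illustrated with plain bit packing; the layouts below follow the common RGBA8 and RGB10A2 pixel formats:

```python
# Both an RGBA8 pixel and an RGB10A2 ("30-bit color") pixel occupy
# one 32-bit word, so switching formats costs no extra framebuffer
# memory or bandwidth per pixel.
def pack_rgba8(r, g, b, a):
    """8 bits each for r, g, b, a -- 32 bits total."""
    return (a << 24) | (b << 16) | (g << 8) | r

def pack_rgb10a2(r, g, b, a):
    """10 bits each for r, g, b (0-1023) plus 2 bits of alpha (0-3)."""
    return (a << 30) | (b << 20) | (g << 10) | r

assert pack_rgba8(255, 255, 255, 255) == 0xFFFFFFFF
assert pack_rgb10a2(1023, 1023, 1023, 3) == 0xFFFFFFFF
```

In RGBA8 the alpha byte is often unused padding in an opaque framebuffer; RGB10A2 spends those bits on color instead.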
Doing 4 8-bit computations in parallel is different from doing 3 10-bit computations in parallel (or two 11-bit computations and one 10-bit computation). The hardware required is different.
Modern GPUs are certainly capable of high-precision arithmetic but it is not a safe assumption that all the hardware is designed for it. A lot of it is designed for the stuff it spends most of its time doing, which is 8 bits per channel rasterization.
Not only that, it's such a processor hog it makes my MacBook Air fans whir at full volume.
I disabled JavaScript on the site, hoping that would make it behave, and got... no blog, only this:
> This blog uses a little JavaScript. Nothing dodgy, though, and nothing hosted at third-party sites. Just some jQuery and animation bits. So please, if you'd be so kind, ask Noscript to call off the hounds.
Congratulations, those “animation bits”, which you won't let me turn off, make your site unbearably annoying to read.
You are not the only one. I couldn't keep my eye focus on the text due to the rotating background, and closed the website after reading the first two paragraphs.
Thanks for saying so, though I will admit I don't actually own an iPad and never tested the design there. Apologies to iPad users.
I aim to detect mobile browsers and just disable the background by default on mobile. (Incidentally, on the client side, is using Modernizr's "touch" flag a best practice here? I'm something of a noob.)
I can't see it on my MacBook or external screen. I tried on my iPhone -- assuming that the screen might be lower quality -- but sadly the website blew up and there was no gradient to be found.
"No. No, no, no. (Points to a section of a wall.) That's the new Military Grey bit there, and that's the dowdy, old, nasty Ocean Grey bit there. Or was it the other way round?" -- Rimmer, 'Red Dwarf'
Safari renders both sides as rgba(72, 72, 72, 1.0). I have no idea why. Perhaps (ironically) a dithering algorithm gone wrong. In chrome the divide is visible.
I see banding in gradients quite often; more often in dynamic scenes in games than in more tightly designed static content online.
Of course much of the media we consume has quantized colour, streamed videos especially don't really stand up to any kind of close examination using today's displays, never mind futuristic ones. Banding from limited gamut is not the big problem here, there are much bigger elephants.
It's possible I have the gradients defined in such a way that Safari doesn't render them. Safari is supposed to pick up on the -webkit-linear-gradient, right?
I zoomed in until the bars almost filled the screen and I still didn't see any gradient, despite the text saying it should be so obvious. Kind of defeats the point about 24-bit sucking...
Most screens can't reliably display 24-bit color anyway. In fact, until recently many cheap LCD screens could only display an 18-bit color space (6 bits per channel).
With all due respect, I don't see the gradient break. Maybe because of that gray animation rotating behind that div? This post reminds me of yesterday's HN parody.
mrb, I am envious of your configuration. If I could turn on system-wide dithering, I would do so in a heartbeat. You tempt me to ditch Windows even though I am (embarrassingly enough to admit here) actually fairly happy with Windows overall.
If it's an LCD and you didn't pay a lot for it, then it's doing that.
Graphic designers usually pay for good quality monitors, but they would be well served by also having a bad monitor to check what their work looks like for most people.
The obvious answer (or strawman) might be aesthetics. Yet two bits is sufficient to create art. Each medium has its limitations. Pen and ink, watercolor, clay - why should a computer screen be seen as inherently different?
We tend to view computer art on our own often miscalibrated displays, under variable lighting conditions, and at a variety of resolutions. Computer art is generally mass produced. Yes, it is behind glass, but not in the manner of the Mona Lisa.
I'm not saying that "deep color" isn't worthwhile. Only that a coherent case for its practical advantages wasn't made. The problem wasn't obvious on my screen, and I am biased toward content over form.
Are you really trying to suggest that the difference between the 24-bit color that we already can't reliably reproduce on current monitors and the 30-bit of the author, is the same order of magnitude as between 24-bit and 2-bit monochrome?
No, I am being ironic in order to try to diplomatically say that the comment I was replying to is not hacker news worthy (especially in its original form).
I don't see anything wrong with your image. The gradient looks perfect on my monitor.
Edit: it looks perfect because my sw/hw config (driver Xorg fbdev 0.4.2, AMD Fusion E-350, Lenovo X120e laptop) does automatic dithering. I can see a tiny bit of dithering at the white end of the other gradient posted by nhw: http://file.st/KWb7XS7x Other than that, no banding in either image, and the dithering is invisible in yours. Dithering 24-bit colors really helps.
I'm missing the part about what awesome thing would be possible to do with higher-bit displays. The answer for higher-DPI displays is obvious: everything looks sharper, all the time. While it may be true that 30-bit color would allow the gradient between those two grays to be gradual instead of a single step, when does someone actually want to draw a gradient between such similar shades of gray?
It's not about gradients between gray shades. It's about more shades overall, which is useful in DTP, video and film, digital painting and so on. If you ever have the chance, take a look at a high-quality picture or video on a display like the Eizo CG275W (or a newer one like the 276) connected to a card capable of 30-bit output (Quadros have it, GeForces don't). It's a whole level above what you see on regular monitors, even calibrated ones. OTOH, they are really expensive -- I get by with a calibrator and regular monitors.
Black-and-white pictures on an 8-bit display only have 256 shades of grey. Ever wondered why black-and-white photographs taken on film and displayed in an exhibition have such stunning detail and dynamic range, while the same photographs viewed on an LCD look bland and weak?
"With high-DPI displays, the aliasing problem caused by insufficient pixels has been extinguished." That's not true. Aliasing is an omnipresent problem in the discretized data that every rasterized display is based on. Even though a retina display samples at a much higher rate, it still needs anti-aliasing for optimal results.
I'd like 64-bit displays too. The problem is that there's practically no content and no content pipeline either. Someone has to take the first step here.
And having 2GB of VRAM does not imply that you'll need a 268-million pixel display to use it all. In modern games, the vast majority of that VRAM is used by textures (and we need as much of it as we can get!)
It's exactly the same thing with "retina displays" that only "need" 300 PPI (a rule Apple themselves often break, yet they still use the name). The human eye can discern up to 600 PPI, and in some cases even up to 1200 PPI.
Oh wow, the nineties called and they want their grayscale-only AA back... To make his point he conveniently forgot to talk about RGB decimation / subpixel rendering entirely. WTF!? Seriously...
If he really thinks there's only one step between #484848 and #494949, then I'd suggest reading a bit about MS's ClearType and using a magnifier to take a closer look at how Windows or OS X fonts are rendered. Hint: it's not grayscale anti-aliasing.
I honestly don't think that using dithering "because pixels are small" would provide better results than sub-pixel anti-aliasing.
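The difference between the two approaches can be sketched roughly like this (real ClearType also applies colour filtering to tame fringing, which is omitted here):

```python
# Sketch: render glyph coverage at 3x horizontal resolution, then
# either average each triple into one grey (grayscale AA) or map
# the three subsamples onto the R, G, B stripes of one LCD pixel
# (subpixel AA), tripling effective horizontal resolution.
def grayscale_aa(coverage_3x):
    """One grey value per pixel: average the three subsamples."""
    return [sum(coverage_3x[i:i + 3]) / 3
            for i in range(0, len(coverage_3x), 3)]

def subpixel_aa(coverage_3x):
    """One (r, g, b) triple per pixel: one subsample per stripe."""
    return [tuple(coverage_3x[i:i + 3])
            for i in range(0, len(coverage_3x), 3)]

# A glyph edge partially covering the second pixel:
cov = [1.0, 1.0, 1.0, 1.0, 0.5, 0.0]
print(grayscale_aa(cov))  # [1.0, 0.5] -- edge position is lost
print(subpixel_aa(cov))   # [(1.0, 1.0, 1.0), (1.0, 0.5, 0.0)]
```

The subpixel version keeps where-within-the-pixel the edge falls, which grayscale AA throws away.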
[1] For a reasonable graphical example, see http://en.wikipedia.org/wiki/File:CIExy1931_srgb_gamut.png where the grey region is the entire visible gamut and the coloured triangle is the standard RGB colour space.