Does PWP support 10-bit color, similar to what can be enabled in Photoshop?
Document here:
http://www.amd.com/la/Documents/48108-B ... _Final.pdf
Enabling 10-bit Color
Re: Enabling 10-bit Color
PWP does not support 10-bit color as described in the link you provided. However, 8-bit color provides more intensity levels per channel than the eye can distinguish as long as the monitor gamma is set correctly, so this is mostly marketing hype except for certain special applications.
Jonathan Sachs
Digital Light & Color
Re: Enabling 10-bit Color
In my experience, there are two big steps to amazing and accurate color. The first step is buying a colorimeter and calibrating an IPS-panel LCD monitor; that is essentially JS's advice to set the gamma curves for the three channels correctly. The second step, much more expensive, is getting a monitor that displays most of the AdobeRGB color space, such as the NEC PA series (and calibrating it with a colorimeter that works on such wide-gamut displays).
The eye can see colors that sRGB cannot display. Adobe RGB is much closer to the range of the eye. I believe JS's point is that dividing the R, G, and B channels of Adobe RGB into 256 intervals and combining the three channels surpasses what the eye can do. Mark Dubovoy has an essay on the Luminous Landscape website disagreeing by assertion.
Not to mention the camera and lens; being at the scene when the light is right; and finding a profiled paper that prints to your satisfaction.
Re: Enabling 10-bit Color
I've seen two responses asserting that 256 levels per channel is more than the human eye/brain can perceive. I haven't seen any assertion stating that either person has tested their claim. Have you tested a true 10-bit workflow to see if you can see any difference? I haven't. I would like the option to do that, but I can't if my application doesn't support it.
I agree, charles2, with everything you have said, except that I would add that you also need a spectrophotometer to do noticeably better printer/paper profiling than what you get with the manufacturer's profiles.
My point here is that if we have everything you have stated and are seeing great color, we might be able to get even better color reproduction with a 10-bit workflow.
I agree, charles2 with everything you have said except that I would add you also need a spectrophotometer to do noticeably better printer/paper profiling than what you get with the manufacturer's profiles.
My point here is that if we have everything you have stated and are seeing great color, we might be able to get evern better color reproduction with a 10bit workflow.
Re: Enabling 10-bit Color
It has been thoroughly documented that under ideal conditions the eye can separate about 200 gray levels. If the number was significantly higher than 256, you would be able to see banding in a continuous gradient. The number of distinguishable levels of colors is considerably worse than for neutrals. There is a good argument for more than 8 bit wide lookup tables in the display adapter as this is what defines the gamma curve, but I don't see any conceivable advantage to sending more than 8 bits of image data to the monitor assuming proper calibration.
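For anyone who wants to play with the arithmetic behind estimates like this, here is a minimal sketch. It simply counts how many gray levels fit between an assumed display black point and white point if adjacent levels must differ by an assumed Weber fraction; the contrast ratios and thresholds used are illustrative assumptions, not measurements.
[code]
// Rough arithmetic sketch: how many distinguishable gray levels fit between
// an assumed display black point and white point if adjacent levels must
// differ by an assumed Weber fraction.  Both inputs are illustrative
// assumptions, not measured values.
#include <cmath>
#include <cstdio>

int main() {
    const double contrastRatios[] = {100.0, 1000.0}; // assumed white/black luminance ratios
    const double weberFractions[] = {0.01, 0.02};    // assumed just-noticeable relative steps

    for (double contrast : contrastRatios) {
        for (double weber : weberFractions) {
            // Distinguishable levels form a geometric series L, L(1+w), L(1+w)^2, ...
            // so the count is how many factors of (1+w) fit inside the contrast range.
            int levels = static_cast<int>(std::log(contrast) / std::log(1.0 + weber)) + 1;
            std::printf("contrast %6.0f:1, threshold %.0f%% -> about %3d levels\n",
                        contrast, weber * 100.0, levels);
        }
    }
    return 0;
}
[/code]
The only point of the sketch is that the resulting count depends heavily on the assumed contrast ratio and threshold, which is why carefully controlled conditions are needed to pin the figure down.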
Jonathan Sachs
Digital Light & Color
Re: Enabling 10-bit Color
I use a Radeon Pro W2100 connected to a BenQ SW2700 via DisplayPort and calibrated using an X-Rite i1Display Pro (to set the 14-bit hardware LUT in the monitor).
For general editing of 16-bit TIFF files I use PWP 7 (as I like the way it replicates darkroom processes in software); however, where colour or tonality is critical, I finalise the images in Zoner Photo Studio 17, which displays them using a 10-bit data feed to the monitor.
1. I can see posterisation in 8-bit greyscale ramps (a sketch for generating such test ramps follows at the end of this post).
2. I can see posterisation in 8- or 10-bit greyscale ramps when using 10-bit rendering if I switch between 8-bit and 10-bit data transmission.
3. I can see differences in images when using 10-bit rendering if I switch between 8-bit and 10-bit data transmission.
4. I do not see posterisation of 10-bit greyscale ramps using 10-bit rendering and 10-bit data transmission.
5. My colour vision is above average (I passed the finest gradation level of the RAF's colour vision test).
I accept that many people can't see the difference, but some people can!
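For anyone who wants to reproduce this sort of comparison, here is a minimal sketch that writes horizontal greyscale ramps as 16-bit PGM files, quantised to a chosen bit depth. The format, image size and file names are just convenient choices; view the resulting files in any application that handles 16-bit greyscale images (and, for the 10-bit case, a 10-bit display path).
[code]
// Minimal sketch: write horizontal greyscale ramps as 16-bit PGM files,
// quantised to a chosen bit depth, so an 8-bit and a 10-bit ramp can be
// compared in any viewer that supports 16-bit greyscale images.
#include <cstdint>
#include <cstdio>

static void writeRamp(const char* path, int bits) {
    const int width = 4096, height = 256;
    const int levels = 1 << bits;

    std::FILE* f = std::fopen(path, "wb");
    if (!f) return;
    // Binary PGM header: a maxval of 65535 means 16-bit big-endian samples.
    std::fprintf(f, "P5\n%d %d\n65535\n", width, height);

    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            // Quantise the ramp position to 'bits' levels, then rescale to 16 bits.
            int level = (int)((double)x / (width - 1) * (levels - 1) + 0.5);
            uint16_t v = (uint16_t)((double)level / (levels - 1) * 65535.0 + 0.5);
            unsigned char be[2] = { (unsigned char)(v >> 8), (unsigned char)(v & 0xFF) };
            std::fwrite(be, 1, 2, f);
        }
    }
    std::fclose(f);
}

int main() {
    writeRamp("ramp_8bit.pgm", 8);    // ramp quantised to 256 levels
    writeRamp("ramp_10bit.pgm", 10);  // ramp quantised to 1024 levels
    return 0;
}
[/code]
Comparing the two output files side by side on the same display path is essentially the test described above.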
Re: Enabling 10-bit Color
Seeing posterization on a monitor is not the way to test how many gray levels you can distinguish. The tiniest kink in the monitor calibration curves will produce visible posterization. The experiment used to determine the ability to differentiate gray levels is done with special equipment that uses precisely controlled light sources.
Jonathan Sachs
Digital Light & Color
Re: Enabling 10-bit Color
Hi, yes, I'm aware of that.
This is why I pointed out that feeding 10-bit data to my monitor produces a ramp that I perceive as smooth, whereas I see sharp transitions when sending 8-bit data; so the fundamental linearity (OK, conformance to gamma 2.2) of my monitor is sufficient for my vision, but an 8-bit data feed isn't.
It's also in the published literature that the luminance discrimination threshold varies between individuals by a factor of more than six; so whilst the average person's discrimination threshold can be accommodated within a 256-level gamma 2.2 response curve, this isn't so for everyone. Given that relatively few of us can differentiate an 8-bit feed from a 10-bit data feed, it would be a very sensible commercial decision not to implement the technical complexity necessary for a 10-bit data feed to the monitor!
Incidentally, when using a continuous source and a monochromator (I used to work for a spectrophotometer company), at relatively bright levels and in some parts of the spectrum I could consistently detect changes well below 0.5 ΔE - I was also told that that was theoretically impossible! (Note that - as you are well aware! - due to the tri-stimulus nature of vision, variations of chromaticity are detected using the sum and difference processing of luminance stimuli of the chromophores, hence good chrominance discrimination requires good relative luminance discrimination.)
Let me just reiterate...
I still think it's a very sensible commercial decision not to implement 10-bit output, even though some people actually can benefit slightly from it.
P.S. I've now switched to the PWP 8 Beta and initial impressions are very favourable - thank you.
Re: Enabling 10-bit Color
As I understand it, the Windows driver model is exclusively 8-bit, but there are lookup tables that convert the 8 bits going to the display to 10 or more bits. This lookup table is what is created by a monitor calibrator and it is embedded in the custom monitor profile it creates. A program that runs at startup then extracts the table from the profile and loads it into the display adapter. Thus, no matter how many bits the display may have, only 256 gray levels are ever sent to the monitor, but the extra bits are used to make sure that the 256 gray levels are selected so as to match the desired gamma curve as accurately as possible. Windows supports up to a 16-bit lookup table.
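To make that concrete: the table described above is what Win32 exposes as the display "gamma ramp" - 256 sixteen-bit entries per channel that get loaded into the display adapter. Below is a minimal sketch of loading such a table; it installs a plain identity curve, which merely stands in for the corrections a real calibration loader would read out of the custom monitor profile.
[code]
// Minimal sketch: load a 3 x 256 table of 16-bit entries into the display
// adapter's lookup table via the Win32 gamma ramp API.  A real calibration
// loader would fill this table from the custom monitor profile; here an
// identity curve stands in for it.
#include <windows.h>

#pragma comment(lib, "gdi32.lib")   // MSVC-style library hint

int main() {
    WORD ramp[3][256];
    for (int i = 0; i < 256; ++i) {
        WORD w = (WORD)(i * 257);                 // identity: 0..255 -> 0..65535
        ramp[0][i] = ramp[1][i] = ramp[2][i] = w; // same entries for R, G, B
    }
    HDC screen = GetDC(nullptr);                  // device context for the primary display
    BOOL ok = SetDeviceGammaRamp(screen, ramp);   // the driver may reject implausible ramps
    ReleaseDC(nullptr, screen);
    return ok ? 0 : 1;
}
[/code]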
I didn't say you couldn't distinguish more than 200 gray levels, just that you can't use a monitor connected to a computer to prove it. Some people apparently can also see the moons of Jupiter and the phases of Venus with the naked eye.
Jonathan Sachs
Digital Light & Color
Re: Enabling 10-bit Color
OK, yes, the WDDM defaults to the 8-bit interface, but from Windows 7 onwards it is capable of 10-bit interfacing when using DirectX, Direct3D or OpenGL with the correct video drivers installed (which in turn depends on the characteristics of the matching video card). AMD Radeon Pro and nVidia Quadro cards have this, but relatively few consumer-level cards do.
Even with this, it requires a reboot for the drivers to request 10-bit support from the OS, and to do this successfully the hardware chain is supposed to be checked for 10-bit compliance before the request is made (a request which, indeed, the OS need not honour!). Even after that, the API presented to the application is still 8-bit until the application actively enables 10-bit support via DirectX or Direct3D. Even with Windows 10 (I haven't checked this particular OS) I believe the Windows API remains at 8-bit support; only the DirectX display surface for rendering images within the window display area is 10-bit enabled.
Quite a palaver.
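For what it's worth, the application-side piece of that palaver is comparatively small once the driver and OS cooperate: the program asks DXGI/Direct3D for a back buffer with 10 bits per colour channel and renders into it. Here is a minimal Direct3D 11 sketch (window creation pared to the bone, all real rendering and error handling omitted); whether 10 bits actually reach the panel still depends on the card, driver, cable and monitor, exactly as described above.
[code]
// Minimal sketch of the application-side step: ask Direct3D 11 / DXGI for a
// swap chain whose back buffer uses 10 bits per colour channel
// (DXGI_FORMAT_R10G10B10A2_UNORM).  All real rendering is omitted.
#include <windows.h>
#include <d3d11.h>

#pragma comment(lib, "d3d11.lib")   // MSVC-style library hint

int WINAPI WinMain(HINSTANCE inst, HINSTANCE, LPSTR, int) {
    // Bare-bones window to own the swap chain.
    WNDCLASSA wc = {};
    wc.lpfnWndProc = DefWindowProcA;
    wc.hInstance = inst;
    wc.lpszClassName = "TenBitDemo";
    RegisterClassA(&wc);
    HWND hwnd = CreateWindowA("TenBitDemo", "10-bit swap chain", WS_OVERLAPPEDWINDOW,
                              CW_USEDEFAULT, CW_USEDEFAULT, 800, 600,
                              nullptr, nullptr, inst, nullptr);

    // Ask for a 10-bit-per-channel back buffer.
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferCount = 2;
    desc.BufferDesc.Width = 800;
    desc.BufferDesc.Height = 600;
    desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM; // 10 bits R, G, B + 2 bits alpha
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.OutputWindow = hwnd;
    desc.SampleDesc.Count = 1;
    desc.Windowed = TRUE;

    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    IDXGISwapChain* swapChain = nullptr;
    HRESULT hr = D3D11CreateDeviceAndSwapChain(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION,
        &desc, &swapChain, &device, nullptr, &context);

    if (SUCCEEDED(hr)) {
        // A real application would now render its image data into the
        // 10-bit back buffer and Present() it.
        swapChain->Release();
        context->Release();
        device->Release();
    }
    DestroyWindow(hwnd);
    return SUCCEEDED(hr) ? 0 : 1;
}
[/code]
OpenGL offers an equivalent route via a 10-bit pixel format, but Direct3D is sketched here since that is the path discussed above.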