#2101 closed defect (fixed)
New video card, and all XPRA windows are dark
Reported by: | Nathan Hallquist | Owned by: | Nathan Hallquist |
---|---|---|---|
Priority: | major | Milestone: | 2.5 |
Component: | client | Version: | 2.4.x |
Keywords: | | Cc: |
Description
The card is a Quadro P2000. I am using Windows 10, with the beta client from the XPRA web site.
The NVIDIA control panel (not XPRA settings) lets me switch between "YCbCr422", "YCbCr444" and "RGB" for "output color format" (this is in the "Change Resolution" pane). The XPRA window is dark *unless* I choose YCbCr422 with 8bpc for "Output color depth". With 10bpc and YCbCr422, XPRA becomes darker than the rest of the screen.
The issue is not triggered by the "bpc" setting alone, but also by the "output color format": the window is dark with both 10 and 8 bpc for YCbCr444 and for RGB.
When I choose YCbCr422 (which was, weirdly, the default), some things, like thin text, look distorted. This distortion is a global property of that setting, affecting all windows; it reminds me of a CRT with bad color convergence, really weird. Text looks great with RGB or YCbCr444 on my screen (a wide-format DisplayPort Samsung OLED), so I really don't want to use that setting, but unless I do, XPRA is darker than the rest of the screen.
Screenshots of the XPRA window look perfect: they are not darker than the rest of the screen. In other words, a screenshot of the window looks brighter than the window itself does on the display.
I am going to attach a photo that I took on my cell phone.
Attachments (9)
Change History (25)
Changed 2 years ago: attachment IMG_0251.jpg added
Changed 2 years ago: attachment Capture.PNG added
comment:1 follow-up: 2 Changed 2 years ago
Owner: changed from Antoine Martin to Nathan Hallquist
Inlining the pictures (see the attachments above):
- the NVIDIA control panel
- the darker xpra window
This is probably related to wiki/ImageDepth and in particular #1309.
Can you post the log output with -d opengl? There should be some relevant details in init_formats() during window initialization.
Until I can fix this, you can disable opengl in the client (e.g. xpra attach --opengl=no), which may make it a lot slower.
What is probably needed to fix this:
- either we should not be trying to use 10 bpc in the client (especially if the server doesn't export pixel data with high bit depth) - or at least give the option to use 8 bpc instead
- fix the upsampling / rendering so 8 bpc data is rendered correctly on a 10 bpc display.
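To illustrate the second point with a toy example (a minimal numpy sketch, not xpra code): if 8-bit channel values land in a deeper framebuffer without being rescaled, the maximum ends up at 255 out of 1023, roughly a quarter of full brightness, which would look exactly like the "dark" window described here.
```python
import numpy as np

# 8-bit channel values, held in a wider type:
pixels8 = np.array([0, 128, 255], dtype=np.uint16)

naive = pixels8                               # stays 0..255 in a 0..1023 range: dark
scaled = (pixels8 * 1023 + 127) // 255        # full-range rescale: 255 -> 1023
replicated = (pixels8 << 2) | (pixels8 >> 6)  # cheap bit-replication, same effect

print(naive.max(), scaled.max(), replicated.max())  # 255 1023 1023
```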
> It is dark with both 10 and 8 for YCbCr444 and for RGB.
That's weird, I would not expect standard 8-bit modes to have any problems. RGB and YCbCr444 are equivalent and lossless.
> when I choose YCbCr422 (which was, weirdly, the default) some things, like thin text, looked distorted. This distortion is a global property of that setting affecting all windows. The distortion reminds me of a CRT with bad color convergence, really weird
That's expected due to the horizontal subsampling of chroma.
Can you post a screenshot or picture to be sure?
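(For anyone following along, a toy numpy sketch of the effect, not xpra code: 4:2:2 stores the chroma planes at half the horizontal resolution, so a one-pixel-wide colored stroke gets averaged with its neighbour, which is exactly the "bad convergence" look.)
```python
import numpy as np

# One row of a chroma (Cb) plane: a single "colored" pixel at 255,
# neutral (128) everywhere else.
row = np.array([128, 128, 255, 128, 128, 128], dtype=np.float32)

# 4:2:2 encode: keep one chroma sample per horizontal pair of pixels...
pairs = row.reshape(-1, 2).mean(axis=1)

# ...and decode by duplicating each sample back out.
restored = np.repeat(pairs, 2)
print(restored)  # [128, 128, 191.5, 191.5, 128, 128]: the spike is smeared
```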
comment:2 Changed 2 years ago
Changed 2 years ago: attachment IMG_0255.jpg added
Changed 2 years ago: attachment IMG_0256.jpg added
Changed 2 years ago: attachment YCbCr422.log added
-d opengl with output color format set to 422
Changed 2 years ago: attachment YCbCr444.log added
-d opengl with output color format set to 444
comment:4 Changed 2 years ago
Everything started working correctly when I switched "HDR and WCG" to "off" in Settings (this is the Windows Settings app, not the NVIDIA control panel). See screenshot.
When "HDR and WCG" is on, XPRA is darker than everything else regardless of the color settings (10-bit 422, 8- or 10-bit 444, or RGB), but I also get a nice brightness slider in the Windows settings.
Changed 2 years ago
comment:5 Changed 2 years ago
Another interesting thing: the brightness of the XPRA window is immune to the Windows HDR brightness slider.
comment:6 Changed 2 years ago
Another detail: "native" opengl works great. This problem only pops up with gtk opengl.
comment:7 Changed 2 years ago
> Another detail: "native" opengl works great. This problem only pops up with gtk opengl.
Very interesting, the naive solution would be to switch the default to "native", since that's also going to be the default for GTK3.
But this may in fact hinder us in the long run if the "native" option doesn't support HDR properly. Maybe it does already, I really don't know much about it.
We have a ticket for X11 server HDR support: #1584
Can you attach the -d opengl log output of both "native" and "gtk" backends?
comment:8 Changed 2 years ago
I'm guessing it has something to do with the 48-bit color that XPRA detects:

    gdkgl - version : 10.0
    gdkglext - version : 1.2.0
    green-size : 16
    gtkglext - version : 1.2.0
    has-depth-buffer : True
    has-stencil-buffer : False
    has_alpha : True
    max-viewport-dims : 32768, 32768
    opengl : 4, 6
    pygdkglext - version : 1.0.0
    pyopengl : 3.1.1a1
    red-size : 16
    renderer : Quadro P2000/PCIe/SSE2
    rgba : True
    safe : True
    sample-buffers : 0
    samples : 0
    shading-language-version : 4.60 NVIDIA
    stencil-size : 0
    stereo : False
    texture-size-limit : 32768
    transparency : False
    vendor : NVIDIA Corporation
    zerocopy : True
See slide 54 of:
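As a cross-check, the channel sizes the context actually allocated can be queried directly; a short PyOpenGL sketch (these queries are deprecated in core profiles, but work in the legacy contexts gtkglext creates, and they need a current GL context):
```python
from OpenGL.GL import (glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS,
                       GL_BLUE_BITS, GL_ALPHA_BITS)

# Run with a GL context current: expect [16, 16, 16, 16] for the
# 48-bit-depth visual shown above, [8, 8, 8, 8] for a plain SDR one.
print([int(glGetIntegerv(v))
       for v in (GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS, GL_ALPHA_BITS)])
```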
comment:9 Changed 2 years ago
Can you attach the -d opengl log output of both "native" and "gtk" backends?
comment:10 Changed 2 years ago
I think you are right about GTK supporting HDR somehow. According to its embedded description, the brightness slider only changes SDR windows. With GTK on my HDR setup, XPRA is always darker than the rest of the screen (unless I slide the brightness below 10%), yet it is also immune to the brightness bar, which is only supposed to affect SDR windows.
Regarding logs, I'll do that when I get to work on Monday. My equipment at home isn't nice enough to let me test HDR. :)
Changed 2 years ago: attachment native.log added
Changed 2 years ago
comment:12 Changed 2 years ago
Key bits:
- init format is identical:
    init_formats() texture pixel format=GL_RGB, internal format=RGB8,
        rgb modes=['YUV420P', 'YUV422P', 'YUV444P', 'GBRP', 'BGRA', 'BGRX', 'RGBA', 'RGBX', 'RGB', 'BGR']
But then:
- native:
    ChoosePixelFormat for window 0x350834 and 8 bpc: 0x10
    DescribePixelFormat: {'accum-size': 64, 'alpha-shift': 24, 'aux-buffers': 4,
        'red-size': 8, 'blue-shift': 0, 'red-shift': 16, 'stencil-size': 8,
        'green-shift': 8, 'blue-size': 8, 'accum-green-size': 16, 'alpha-size': 8,
        'double-buffered': True, 'depth-size': 0, 'green-size': 8, 'accum-red-size': 16,
        'depth': 32, 'accum-blue-size': 16, 'rgba': True, 'visible-mask': 0L}
- gtk:
    Config_new_by_mode(<flags GDK_GL_MODE_DOUBLE | GDK_GL_MODE_ALPHA of type GdkGLConfigMode>)=<gtk.gdkgl.Config object at 0x739aaa0 (GdkGLConfigImplWin32 at 0x3f079a0)>
    GL props={'gdkglext': {'version': (1, 2, 0)}, 'has-depth-buffer': True, 'accum-blue-size': 0,
        'stencil-size': 0, 'blue-size': 16, 'gdkgl': {'version': (6, 2)}, 'double-buffered': True,
        'depth-size': 24, 'sample-buffers': 0, 'gtkglext': {'version': (1, 2, 0)},
        'alpha-size': 16, 'rgba': True, 'display_mode': ['ALPHA', 'DOUBLE'], 'samples': 0,
        'red-size': 16, 'has-stencil-buffer': False, 'has_alpha': True, 'pygdkglext': {'version': (1, 0, 0)},
        'stereo': False, 'accum-green-size': 0, 'green-size': 16, 'accum-red-size': 0, 'depth': 48, 'aux-buffers': 4}
The key difference here is that "gtk" chooses a 16-bit per channel mode whereas "native" uses 8-bit.
We use ChoosePixelFormat with bpc=8 in create_wgl_context.
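For reference, a condensed ctypes sketch of what that amounts to (not xpra's actual create_wgl_context; the helper name and default values here are illustrative): ChoosePixelFormat matches the request against the available formats, and asking for 8 bits per channel is what keeps the "native" backend on an SDR-style format.
```python
import ctypes
from ctypes import wintypes

PFD_DOUBLEBUFFER   = 0x00000001
PFD_DRAW_TO_WINDOW = 0x00000004
PFD_SUPPORT_OPENGL = 0x00000020
PFD_TYPE_RGBA      = 0

class PIXELFORMATDESCRIPTOR(ctypes.Structure):
    _fields_ = [
        ("nSize", wintypes.WORD), ("nVersion", wintypes.WORD),
        ("dwFlags", wintypes.DWORD), ("iPixelType", ctypes.c_ubyte),
        ("cColorBits", ctypes.c_ubyte), ("cRedBits", ctypes.c_ubyte),
        ("cRedShift", ctypes.c_ubyte), ("cGreenBits", ctypes.c_ubyte),
        ("cGreenShift", ctypes.c_ubyte), ("cBlueBits", ctypes.c_ubyte),
        ("cBlueShift", ctypes.c_ubyte), ("cAlphaBits", ctypes.c_ubyte),
        ("cAlphaShift", ctypes.c_ubyte), ("cAccumBits", ctypes.c_ubyte),
        ("cAccumRedBits", ctypes.c_ubyte), ("cAccumGreenBits", ctypes.c_ubyte),
        ("cAccumBlueBits", ctypes.c_ubyte), ("cAccumAlphaBits", ctypes.c_ubyte),
        ("cDepthBits", ctypes.c_ubyte), ("cStencilBits", ctypes.c_ubyte),
        ("cAuxBuffers", ctypes.c_ubyte), ("iLayerType", ctypes.c_ubyte),
        ("bReserved", ctypes.c_ubyte), ("dwLayerMask", wintypes.DWORD),
        ("dwVisibleMask", wintypes.DWORD), ("dwDamageMask", wintypes.DWORD),
    ]

def choose_8bpc_pixel_format(hdc):
    # Request a double-buffered RGBA format with exactly 8 bits per channel:
    pfd = PIXELFORMATDESCRIPTOR()
    pfd.nSize = ctypes.sizeof(PIXELFORMATDESCRIPTOR)
    pfd.nVersion = 1
    pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER
    pfd.iPixelType = PFD_TYPE_RGBA
    pfd.cColorBits = 32
    pfd.cRedBits = pfd.cGreenBits = pfd.cBlueBits = pfd.cAlphaBits = 8
    pfd.cDepthBits = 24
    # Returns the index of the closest matching format (0 on failure):
    return ctypes.windll.gdi32.ChoosePixelFormat(hdc, ctypes.byref(pfd))
```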
It seems that 16-bit per channel modes then default to HDR? (The docs are not clear on that; it is implied.)
So r21382 will now prefer "native" over "gtk".
@nathan_lstc: does that make sense, can I close?
At some point when we want to support HDR, we will need to use 16-bit per channel AND either:
- for SDR content: tell mswindows that the content we paint is SDR? (not sure how in opengl)
- use HDR and make sure that pixel buffers carry the colourspace information (and convert SDR to HDR when needed)
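To make the second bullet concrete, a minimal sketch of the SDR-to-HDR half, assuming the Windows scRGB convention where 1.0 in an FP16 swap chain corresponds to SDR reference white (80 nits): 8-bit sRGB content has to be linearized before it is composited into such a surface.
```python
import numpy as np

def srgb_to_linear(c8):
    """Convert 8-bit sRGB channel values to linear [0, 1] scRGB,
    where 1.0 maps to SDR reference white."""
    c = np.asarray(c8, dtype=np.float32) / 255.0
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

print(srgb_to_linear([0, 128, 255]))  # [0.0, ~0.2158, 1.0]
```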
OpenGL links:
- learnopengl: Advanced-Lighting HDR
- glfw: Support HDR and wide color gamut
- Displaying HDR Nuts and Bolts
- Microsoft docs: High Dynamic Range and Wide Color Gamut
comment:13 Changed 2 years ago
I'm satisfied. I have something else to add here: the examples from pygtkglext behave just like XPRA, so other things are affected as well.
comment:14 Changed 2 years ago
I've found a glgears example written using gtkglext-win32 and I cannot find any attribs for gdk_gl_config_new that get 8 bits per channel. Passing in "8" for GDK_GL_RED_SIZE seems to have no effect. (See the sketch below.)
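For reference, the attempt described above would look roughly like this; a sketch only, since I'm assuming the attrib_list form of pygtkglext's Config constructor (mirroring the C-level gdk_gl_config_new attributes), and as noted the driver appears to return a 16-bit-per-channel visual regardless:
```python
import gtk.gdkgl

# Assumed attrib_list spelling of gdk_gl_config_new's attributes;
# as reported above, RED_SIZE=8 seems to have no effect here.
glconfig = gtk.gdkgl.Config(attrib_list=[
    gtk.gdkgl.RGBA,
    gtk.gdkgl.RED_SIZE, 8,
    gtk.gdkgl.GREEN_SIZE, 8,
    gtk.gdkgl.BLUE_SIZE, 8,
    gtk.gdkgl.DOUBLEBUFFER,
])
```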
comment:15 Changed 2 years ago
Resolution: | → fixed |
---|---|
Status: | new → closed |
I'm not going to backport this to 2.4.x, the fix will be in 2.5 only.
We'll revisit HDR in #1584.
comment:16 Changed 3 months ago
This ticket has been moved to: https://github.com/Xpra-org/xpra/issues/2101