
Opened 12 months ago

Closed 2 months ago

Last modified 8 weeks ago

#1309 closed enhancement (fixed)

10-bit color support in opengl client

Reported by: Antoine Martin
Owned by: J. Max Mena
Priority: major
Milestone: 2.0
Component: client
Version: trunk
Keywords: 30-bit colour opengl
Cc:

Description

Split from #909.

The best explanation of the changes required can be found in https://www.nvidia.com/docs/IO/40049/TB-04701-001_v02_new.pdf, see 30-Bit Visual on Linux.

We'll need to tell the server that we want 10-bit colour, and maybe advertise a new YUV or RGB upload mode.
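
As a rough sketch of what that advertisement could look like (the capability keys below are purely hypothetical, not xpra's actual protocol; "r210" is the 10-bit RGB mode that comment:1 ends up adding):

    # Purely hypothetical sketch of a client advertising deep-colour support;
    # the real xpra capability names and packet layout may differ.
    hello_caps = {
        "encoding.bit-depth": 30,                         # hypothetical key
        "encoding.rgb_formats": ["RGB", "RGBA", "r210"],  # "r210" = 2:10:10:10
    }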

Attachments (2)

gl_check.txt (11.2 KB) - added by Antoine Martin 8 months ago.
gl_check.py output
session-info-bit-depth.png (50.0 KB) - added by Antoine Martin 8 months ago.
shows the bit depth on session info


Change History (13)

Changed 8 months ago by Antoine Martin

Attachment: gl_check.txt added

gl_check.py output

comment:1 Changed 8 months ago by Antoine Martin

Owner: changed from Antoine Martin to alas

With r15015, running xpra/client/gl/gl_check.py against a 30-bit display I get the attached gl_check.txt, which shows:

* blue-size                       : 10
* red-size                        : 10
* green-size                      : 10
* depth                           : 30

So we can detect support for 30-bit color and 10-bit per channel.
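
For reference, the core of such a check can be sketched with PyOpenGL (a minimal sketch assuming an active OpenGL context; gl_check.py does a lot more):

    # Query the per-channel bit depths of the current GL context.
    # GL_*_BITS are legacy/compatibility-profile queries.
    from OpenGL.GL import glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS

    def channel_bits():
        return tuple(int(glGetIntegerv(attr))
                     for attr in (GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS))

    # a 30-bit visual returns (10, 10, 10), a 24-bit one (8, 8, 8)
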
And r15018 handles 30-bit modes with native 30-bit upload: "r210" == "GL_UNSIGNED_INT_2_10_10_10_REV".
r15019 fixes the swapped red and blue channels (oops), and r15026 allows us to prefer the high bit depth "r210" plain rgb encoding if the client is using 10-bit depth rendering (jpeg and video encodings will still be used for lossy packets).
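
As an illustration only (not the actual r15018 code), a native 30-bit upload boils down to passing the matching datatype to glTexImage2D:

    # Sketch of a native 30-bit texture upload: one 32-bit word per pixel,
    # 2 bits alpha + 10 bits per colour channel (the "r210" layout).
    from OpenGL.GL import (glTexImage2D, GL_TEXTURE_2D, GL_RGB10_A2, GL_RGBA,
                           GL_UNSIGNED_INT_2_10_10_10_REV)

    def upload_r210(width, height, pixels):
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, width, height, 0,
                     GL_RGBA, GL_UNSIGNED_INT_2_10_10_10_REV, pixels)
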
r15027 shows the bit depth on session info (normal bit depth is 24) - see the attached screenshot session-info-bit-depth.png.

We could probably handle "R210" the same way (as "GL_UNSIGNED_INT_2_10_10_10"), but since I don't have hardware to test with, this is not supported.

@afarr: FYI, we can handle high color depth displays (only tested on Linux).

Last edited 8 months ago by Antoine Martin

Changed 8 months ago by Antoine Martin

Attachment: session-info-bit-depth.png added

shows the bit depth on session info

comment:2 Changed 7 months ago by Antoine Martin

PS: r15094 fixes opengl rendering, which broke because our hacked pygtkglext library is missing the "get_depth" method. OSX clients will not support high bit depths until this is fixed: #1443

comment:3 Changed 7 months ago by Antoine Martin

Milestone: changed from 3.0 to 2.0

comment:4 Changed 7 months ago by Antoine Martin

See new wiki page: wiki/ImageDepth

comment:5 Changed 3 months ago by J. Max Mena

Realistically, we won't be able to test this until we get proper hardware for it. And even then, I have no idea what said proper hardware will be.

@antoine - some input as to what we should be testing with would be nice, but I wouldn't hold my breath on us actually getting said equipment if it involves asking for new hardware.

comment:6 Changed 3 months ago by Antoine Martin

Owner: changed from alas to J. Max Mena

    @antoine - some input as to what we should be testing with would be nice, but I wouldn't hold my breath on us actually getting said equipment if it involves asking for new hardware.

You may already have all you need:

  • a good monitor, most 4k monitors can do 10-bit colour nowadays
  • a recent enough graphics card
  • probably best to use a displayport cable, not HDMI
  • support also varies from OS to OS... (more limited on MS Windows AFAICT)

More nvidia info here: 10-bit per color support on NVIDIA Geforce GPUs

Actually verifying that you are rendering at 10-bit per colour is a bit harder:

  • see comment:1 to ensure 10-bit rendering is available
  • enable 30-bit as per wiki/ImageDepth
  • force RGB only encoding: --encodings=rgb and verify that paint packets come through with the "r210" rgb pixel format (a decoding sketch follows this list)
  • use an application that renders in 30-bit colour and verify that we forward the colours accurately (we should probably write a test opengl app to make it easier: #1553)
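
For that verification step, decoding a captured "r210" pixel word can be sketched like this (assuming RGBA component ordering in the 2:10:10:10 packing; the real layout may differ, see the red/blue swap fixes above):

    def unpack_r210(word):
        # 32-bit word, GL_UNSIGNED_INT_2_10_10_10_REV with RGBA ordering
        # (assumed): red in bits 0-9, green 10-19, blue 20-29, alpha 30-31
        r = word & 0x3FF
        g = (word >> 10) & 0x3FF
        b = (word >> 20) & 0x3FF
        a = (word >> 30) & 0x3
        return r, g, b, a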

Edit: AMD’s 10-bit Video Output Technology seems to indicate that 10-bit color requires a "firepro" workstation card

Last edited 2 months ago by Antoine Martin

comment:7 Changed 2 months ago by Antoine Martin

Updates and fixes:

  • r16297 fixes mixed mode ("r210" vs "BGRA" uploads must use different upload datatypes - see the sketch after this list), and fixes red and blue being swapped when downsampling r210 to RGB (ie: with all non-plain-rgb picture codecs)
  • r16298: make it possible to force the high bit depth code path using the XPRA_FORCE_HIGH_BIT_DEPTH=1 env var
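
A sketch of the distinction the r16297 fix enforces (illustrative only, the actual xpra code differs):

    # "r210" and "BGRA" pixel data must be uploaded with different GL datatypes:
    from OpenGL.GL import GL_UNSIGNED_BYTE, GL_UNSIGNED_INT_2_10_10_10_REV
    UPLOAD_DATATYPE = {
        "BGRA": GL_UNSIGNED_BYTE,                  # 8 bits per channel
        "r210": GL_UNSIGNED_INT_2_10_10_10_REV,    # 10 bits per channel
    }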

The test application is ready in #1553, but it's not really easy to use because it requires opengl: virtualgl can't handle the "r210" pixel format, and the software gl renderer doesn't support it either.
So in order to test, I had to run the xpra server against my main desktop with the nvidia driver configured at 10 bpc.
Then connect a client... The only client I had available for testing was a windows 7 system, and ms windows doesn't do 10 bpc with consumer cards, so I had to swap cards. Then the monitor it was connected to didn't handle 10 bpc, so I had to swap that too. Then the cables were too short. Then I had to make fixes (see this ticket and many other fixes yesterday - bugs you only hit with --use-display, for example).
TLDR: hard to test!

Last edited 2 months ago by Antoine Martin

comment:8 Changed 2 months ago by Antoine Martin

r16303: the "pixel-depth" option can now be used to force the opengl client to use deep color (use any value higher than 30) - even if the display doesn't claim to render deep color.
ie: running the server with --pixel-depth=30 -d compress, and a linux opengl client with --pixel-depth=30 --opengl=yes, I see:

compress:   0.1ms for  499x316  pixels at    0,0    for wid=1     using  rgb24 with ratio   1.6% \
    (  615KB to     9KB), sequence     5, client_options={'lz4': 1, 'rgb_format': 'r210'}

Note the "r210" rgb format. Same result if the client is running on a 30-bit display with --pixel-depth=0 (the default)
Whereas if the client runs on a 24-bit display, or if we force disable deep color with --pixel-depth=24 then we see:

compress:   1.4ms for  499x316  pixels at    0,0    for wid=1     using  rgb24 with ratio   1.3% \
    (  615KB to     7KB), sequence     3, client_options={'lz4': 1, 'rgb_format': 'RGB'}

Remaining issues:

  • dummy driver may be swapping red and blue: #1576
  • ms windows clients don't seem to render in deep color no matter how hard I try (ticket:1553#comment:2 - a GTK issue perhaps?)
  • virtualgl doesn't support 10 bpc: #1577
  • macos not tested (no hardware available for testing)

comment:9 Changed 2 months ago by Antoine Martin

Updates:

  • r16307 adds some of the test code to the xpra source under xpra/client/gtk_base/example and makes it python3 compatible - those examples have a helper shortcut on macos, and EXE binaries on win32
  • r16326 (+r16327 fixup) adds 16-bit opengl rendering support - very useful for forcing a low bit depth and seeing more color banding
  • r16328 (+r16329 fixup) better transparency support for 16-bit mode
  • r16331 improves the color gradient app (see ticket:1553#comment:4)

With these changes, it is now much easier to:

  • see color banding (use the 16-bit client rendering mode; see the sketch after this list)
  • launch the gradient color example (#1553) and compare local and remote
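
To see why the 16-bit mode makes banding so visible, count the distinct levels a gradient can show per channel (a back-of-the-envelope sketch; 16-bit modes typically use 5-6 bits per channel):

    # Distinct levels across a 1024-pixel-wide gradient at various bit depths:
    width = 1024
    for bits in (5, 8, 10):
        levels = len({round(x * (2 ** bits - 1) / (width - 1)) for x in range(width)})
        print("%2d-bit channel: %4d levels" % (bits, levels))
    # -> 32 levels (16-bit mode), 256 (24-bit), 1024 (30-bit)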

For macos, see also #1443

Last edited 2 months ago by Antoine Martin

comment:10 Changed 2 months ago by Antoine Martin

Resolution: fixed
Status: changed from new to closed

Tested on win32 (no luck) and Linux (OK) as part of #1553; for macos testing see #1443. Closing.

comment:11 Changed 8 weeks ago by Antoine Martin

opengl applications running through virtualgl currently require this patch: ticket:1577#comment:2
