Follow up from #1260 (nvenc 7) and #909 (10-bit colour).
According to the NVIDIA Video Codec SDK, HEVC 10-bit encoding (..) requires Pascal-generation GPUs.
According to the docs:
NV_ENC_BUFFER_FORMAT_YUV420_10BIT
10 bit Semi-Planar YUV [Y plane followed by interleaved UV plane]. Each pixel of size 2 bytes. Most Significant 10 bits contain pixel data.
NV_ENC_BUFFER_FORMAT_YUV444_10BIT
10 bit Planar YUV444 [Y plane followed by U and V planes]. Each pixel of size 2 bytes. Most Significant 10 bits contain pixel data.
NV_ENC_BUFFER_FORMAT_ARGB10
10 bit Packed A2R10G10B10. Each pixel of size 2 bytes. Most Significant 10 bits contain pixel data.
NV_ENC_BUFFER_FORMAT_ABGR10
10 bit Packed A2B10G10R10. Each pixel of size 2 bytes. Most Significant 10 bits contain pixel data.
The "each pixel of size 2 bytes" claim looks wrong to me for the packed RGB formats (it is probably true only for the YUV formats): it conflicts with "A2B10G10R10" / "A2R10G10B10", which describe plain 32-bit pixels (2 alpha bits + 3x10 colour bits) just like the data we get from X11, which would be nice. Another option would be to use nvenc via ffmpeg (which has added support for 10-bit HEVC encoding), meh.
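For the semi-planar YUV format at least, "2 bytes per sample" does add up. A minimal sketch of the resulting buffer size for NV_ENC_BUFFER_FORMAT_YUV420_10BIT (the function name is mine, not from the SDK):

```python
def yuv420_10bit_buffer_size(width: int, height: int) -> int:
    # NV_ENC_BUFFER_FORMAT_YUV420_10BIT: Y plane followed by an
    # interleaved UV plane, 2 bytes per sample (10 significant bits
    # stored in the most significant bits of each 16-bit word)
    y_plane = width * height * 2
    # chroma is subsampled 2x2; U and V interleaved gives
    # width * (height // 2) samples in the UV plane
    uv_plane = width * (height // 2) * 2
    return y_plane + uv_plane

# a 1080p frame needs 3 bytes per pixel in total:
print(yuv420_10bit_buffer_size(1920, 1080))   # 6220800
```

This assumes no row padding; real NVENC input buffers have a driver-supplied pitch that must be honoured per row.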
As per the NvEncodeAPI data structures documentation: NV_ENC_BUFFER_FORMAT_ABGR10:
10 bit Packed A2B10G10R10. This is a word-ordered format where a pixel is represented by a 32-bit word with R in the lowest 10 bits, G in the next 10 bits, B in the 10 bits after that and A in the highest 2 bits.
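So each pixel is a 32-bit word, not 2 bytes. A quick sketch of that bit layout (helper names are mine, not part of any API):

```python
def unpack_abgr10(word: int):
    # NV_ENC_BUFFER_FORMAT_ABGR10 layout per the docs quoted above:
    # R in bits 0-9, G in bits 10-19, B in bits 20-29, A in bits 30-31
    r = word & 0x3FF
    g = (word >> 10) & 0x3FF
    b = (word >> 20) & 0x3FF
    a = (word >> 30) & 0x3
    return r, g, b, a

def pack_abgr10(r: int, g: int, b: int, a: int = 3) -> int:
    # inverse of the above; a=3 means fully opaque (2-bit alpha)
    return (r & 0x3FF) | ((g & 0x3FF) << 10) | ((b & 0x3FF) << 20) | ((a & 0x3) << 30)
```

Round-tripping a pixel through pack and unpack returns the original 10-bit components, which is an easy sanity check when wiring up the X11 pixel data.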
Done in r26968.
this ticket has been moved to: https://github.com/Xpra-org/xpra/issues/1308