Performance
Automated Testing
The automated tests can be used to check for regressions between versions or for measuring the impact of specific changes or settings.
At present, the test_measure_perf.py test script must be edited by hand to change test parameters - hopefully this will change in the future.
To take advantage of iptables packet accounting, follow the error message and set up iptables rules matching the port used in the tests, e.g. with the default port 10000:
iptables -I INPUT -p tcp --dport 10000 -j ACCEPT
iptables -I OUTPUT -p tcp --sport 10000 -j ACCEPT
Then you can let the tests run:
./tests/xpra/test_measure_perf.py "name recorded for this test set's samples"
Then collect the CSV data printed at the end, which you can feed to your favourite statistics package or use to produce graphs.
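As a starting point for processing that output, here is a minimal sketch that groups samples by test-set name and computes the mean and standard deviation of one metric. The column names and the sample data are assumptions for illustration - substitute the actual headers printed by test_measure_perf.py:

```python
import csv
import io
import statistics

def summarize(csv_text, metric):
    """Group samples by the test-set name column and compute (mean, pstdev) of one metric."""
    rows = csv.DictReader(io.StringIO(csv_text))
    groups = {}
    for row in rows:
        groups.setdefault(row["name"], []).append(float(row[metric]))
    return {name: (statistics.mean(v), statistics.pstdev(v))
            for name, v in groups.items()}

# Hypothetical sample resembling the script's CSV output (column names are assumptions):
SAMPLE = """name,frames_per_second,pixels_per_second
baseline,30.0,1000000
baseline,34.0,1200000
patched,40.0,1500000
patched,44.0,1700000
"""

print(summarize(SAMPLE, "frames_per_second"))
# -> {'baseline': (32.0, 2.0), 'patched': (42.0, 2.0)}
```

The same function can be pointed at any numeric column to compare two test runs side by side.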
Stress Testing
To drive many connections to many servers from a single client machine, or simply to simulate an extremely high performance client, r2482 adds the ability to skip the client code which normally paints the windows. To use it:
XPRA_USE_FAKE_BACKING=1 xpra attach
The XPRA_FAKE_BACKING_DELAY environment variable controls how long (in milliseconds) the client waits before responding to the server after each draw packet:
XPRA_FAKE_BACKING_DELAY=10 XPRA_USE_FAKE_BACKING=1 xpra attach
Note: the windows are still shown - just not painted.
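To run such a stress test against several servers at once, the two environment variables above can be set for a batch of client invocations. This sketch only builds the environment and command lines rather than spawning anything; the display names are assumptions, and each command would then be launched with subprocess.Popen(cmd, env=env):

```python
import os

def fake_client_commands(displays, delay_ms=10):
    """Build the environment and one 'xpra attach' command line per server display.
    The display list and delay are assumptions - adjust them for your setup."""
    env = dict(os.environ,
               XPRA_USE_FAKE_BACKING="1",
               XPRA_FAKE_BACKING_DELAY=str(delay_ms))
    cmds = [["xpra", "attach", display] for display in displays]
    return env, cmds

env, cmds = fake_client_commands([":100", ":101"], delay_ms=10)
print(cmds)
# -> [['xpra', 'attach', ':100'], ['xpra', 'attach', ':101']]
```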
Tuning / Pending items
The following mostly undocumented tunables can have an impact on performance and should be investigated more thoroughly under different scenarios:
- XPRA_YIELD - see #181
- XPRA_USE_PIL
- MAX_NONVIDEO_OR_INITIAL_PIXELS and XPRA_MAX_NONVIDEO_PIXELS
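One way to investigate these tunables systematically is to sweep every combination of values and run the benchmark once per combination. The tunables and candidate values below are assumptions chosen for illustration:

```python
import itertools
import os

# Hypothetical sweep over two of the tunables (the value sets are assumptions):
SWEEP = {
    "XPRA_YIELD": ["0", "1"],
    "XPRA_USE_PIL": ["0", "1"],
}

def sweep_environments(base=None):
    """Yield (label, env) pairs covering every combination of tunable values."""
    base = dict(os.environ if base is None else base)
    keys = sorted(SWEEP)
    for values in itertools.product(*(SWEEP[k] for k in keys)):
        env = dict(base)
        env.update(zip(keys, values))
        label = ",".join(f"{k}={v}" for k, v in zip(keys, values))
        yield label, env

for label, env in sweep_environments(base={}):
    # Each env would be passed to the benchmark run, with `label` recorded
    # as the name of that test set's samples.
    print(label)
```

Each label can be reused as the sample-set name passed to test_measure_perf.py so the resulting CSV rows identify which tunable combination produced them.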