I am developing a specialized screen sharing solution. The requirements call for it to:
- Work over a Wi-Fi LAN.
- Provide minimal latency and jitter.
I have done all I know how to do on the programming side, but I see great variance in both jitter and latency across different wireless networks and setups (different base stations, ad-hoc vs. managed mode, etc.).
By trial and error I ran across base station settings that affect latency and jitter, such as power-save mode, beacon interval, and DTIM values, but I would be very interested in a set of optimal base station configuration options for minimum latency and jitter.
I understand of course that such optimization may very well decrease bandwidth or cause other undesired artifacts, but I would still be interested to know which knobs to try pushing and what they do.
In estimated order of importance:
Pick a perfectly clean channel and have good signal strength (between -40 and -60 dBm).
Make sure no other traffic on the network is competing for airtime with your app, especially multicast traffic, which gets sent at a low signaling rate and chews up airtime. Don’t use multicast or broadcast in your own app.
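If you control the app’s transport, the no-multicast advice just means fanning frames out to each viewer as unicast. A minimal sketch (the helper name and the (host, port) viewer-list shape are my own, not from any library):

```python
import socket

def send_frame_unicast(frame: bytes, viewers, sock=None):
    """Send one encoded frame to each viewer over unicast UDP.

    Unicast goes out at the full negotiated PHY rate; multicast and
    broadcast frames are transmitted at a low basic rate and burn
    airtime. `viewers` is a list of (host, port) tuples."""
    if sock is None:
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for addr in viewers:
        sock.sendto(frame, addr)
    return sock
```

The trade-off is that airtime cost now grows linearly with the number of viewers, which is why the overprovisioning point below matters.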
Make sure you have more than enough bandwidth for your app. Overprovision your links by about 33%.
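The overprovisioning rule is just arithmetic; as a sanity check, a tiny helper (names are mine):

```python
def required_link_capacity_mbps(stream_mbps: float, headroom: float = 0.33) -> float:
    """Link capacity needed so the stream leaves ~33% headroom,
    per the overprovisioning rule of thumb above."""
    return stream_mbps * (1.0 + headroom)
```

For example, a 30 Mbit/s stream would call for roughly 40 Mbit/s of real, measured throughput on the wireless link.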
Disable 802.11 power save; keep all clients in Constantly Awake Mode (CAM).
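On Linux clients, power save can typically be toggled with `iw`. A hedged sketch wrapping that command (the interface name `wlan0` and the helper names are assumptions; this needs root and real wireless hardware to actually run):

```python
import subprocess

def disable_power_save_cmd(ifname: str = "wlan0") -> list:
    """Build the iw command that forces the client out of 802.11
    power save, i.e. keeps it in Constantly Awake Mode (CAM)."""
    return ["iw", "dev", ifname, "set", "power_save", "off"]

def disable_power_save(ifname: str = "wlan0") -> None:
    """Run the command; raises CalledProcessError on failure."""
    subprocess.run(disable_power_save_cmd(ifname), check=True)
```

Note this setting is per-interface and often reverts on reboot or driver reload, so it belongs in your deployment/startup scripts.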
Disable any AP or client settings or software that could cause the radio to do scans or otherwise go off-channel. These include old things like roaming and channel agility, and new things like Wi-Fi Direct and Apple AirDrop. Don’t run any kind of Wi-Fi network scanner like NetStumbler or inSSIDer in the background. Disable Wi-Fi-based location detection. Watch out for Widgets/Gadgets/Gizmos that list Wi-Fi networks; they often cause scans.
If using 2.4GHz, disable Bluetooth.
Disable NAT on the base station.
Use a low-latency WMM (QoS) queue. Either voice (VO) or video (VI).
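On the app side you can ask for these queues by DSCP-marking your packets: most WMM gear maps DSCP EF (46) to the voice (VO) queue and AF41 (34) to the video (VI) queue. A sketch assuming Linux, where `IP_TOS` is honored (Windows generally ignores TOS set this way); the TOS byte is the DSCP value shifted left two bits:

```python
import socket

DSCP_EF = 46    # Expedited Forwarding -> usually the WMM voice (VO) queue
DSCP_AF41 = 34  # AF41 -> usually the WMM video (VI) queue

def make_marked_udp_socket(dscp: int = DSCP_EF) -> socket.socket:
    """UDP socket whose packets carry the given DSCP, so WMM-capable
    APs and clients put them in a low-latency queue."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, dscp << 2)
    return s
```

Whether the AP honors the marking depends on its WMM/QoS configuration, so verify with a packet capture rather than trusting the socket option.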
Disable frame aggregation: both A-MPDU and A-MSDU.
Prefer IPv4 over IPv6. To this day, there’s still lots of equipment that handles IPv4 via a hardware-assisted “fast path”, but still handles IPv6 via software.
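Preferring IPv4 can be enforced at resolution time by pinning the address family, so the app never silently picks an IPv6 path. A minimal sketch (helper name is mine):

```python
import socket

def resolve_ipv4_only(host: str, port: int):
    """Resolve to IPv4 addresses only, keeping traffic on equipment
    whose hardware fast path handles only IPv4."""
    return socket.getaddrinfo(host, port,
                              family=socket.AF_INET,
                              type=socket.SOCK_DGRAM)
```

Passing `family=socket.AF_INET` is the whole trick; the default (`AF_UNSPEC`) may return IPv6 results first on dual-stack hosts.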
By the way, tweaking Beacon Intervals and DTIM Intervals is likely to do more harm than good overall. Most clients expect Beacon Intervals of about 100 TUs (802.11 Time Units; 1 TU = 1024 microseconds; sometimes called kµsec (kilo-microseconds) or Kiusec (kibi-microseconds)) and DTIM intervals of 1-3 beacons. I’ve seen some poorly written Wi-Fi clients freak out if you change either of those too much (like making either of them more than one second long).
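For reference, on APs driven by hostapd those conventional values correspond to a configuration fragment like the following (the option names are hostapd’s; the values are the defaults most clients expect, not a tuning recommendation):

```
# hostapd.conf fragment: conventional beacon timing
# Beacon Interval in TUs (1 TU = 1024 microseconds)
beacon_int=100
# Send a DTIM every 2 beacons
dtim_period=2
```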
Not really a direct answer, but I’d think that having a good Wi-Fi signal (read: access point near the host station, not many Wi-Fi networks in the area, not many Wi-Fi hosts on the network) would give much greater performance improvements than any tweak.
Also, as far as I know, while there are some really minor tweaks you could do, they are sensitive to the access point/host card pair (results could be different with other cards) and to the network status (overlapping networks, clients on the network, etc.).
All in all, you will probably gain some 1-3% improvement by spending lots of time carefully tuning those parameters, with an even smaller influence on the user experience. I’d say it’s probably not the best way to spend your time, and you could get better results elsewhere (like checking for other nearby networks and making sure channels don’t overlap, or stuff like that).
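Measuring is the only way to tell whether a given tweak actually helped. A minimal sketch of an RFC 3550-style smoothed jitter estimate, assuming your sender paces packets at a fixed interval (function and parameter names are mine):

```python
def interarrival_jitter(arrival_times, send_interval):
    """Smoothed jitter estimate (seconds) from packet arrival
    timestamps (seconds), RFC 3550 style: each receive gap's
    deviation from the constant send interval is folded into a
    running estimate with a 1/16 gain."""
    j = 0.0
    for a, b in zip(arrival_times, arrival_times[1:]):
        d = abs((b - a) - send_interval)
        j += (d - j) / 16.0
    return j
```

Log this continuously while you flip one knob at a time; a change that doesn’t move the measured jitter isn’t worth keeping.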
Hope this helps.
Here are a few general tips:
- Disable features you don’t use (research and test those you do not understand)
- Disable ports you don’t need to open
- Increase timeouts
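On the “increase timeouts” tip: at the app level that usually means a generous socket timeout, so a momentary Wi-Fi stall doesn’t tear down the session. A tiny sketch (helper name and default value are mine):

```python
import socket

def make_patient_socket(timeout_s: float = 10.0) -> socket.socket:
    """TCP socket with a generous timeout, so brief WLAN stalls
    (channel scans, retransmit bursts) don't look like dead peers."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(timeout_s)
    return s
```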