It seems pretty natural that upload from a client device to the AP can be better than download, at least due to maximal ratio combining on APs that have more antennas than the client. But does anyone know what the expected magnitude of that difference is?
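As a back-of-the-envelope sanity check (my own assumption, idealized: perfect channel estimates, no spatial multiplexing), MRC across N receive chains gives up to 10*log10(N) dB of array gain over a single chain, so the uplink SNR advantage at the AP is roughly the difference between the AP's and the client's gains:

```python
import math

def mrc_array_gain_db(n_rx: int) -> float:
    """Idealized MRC array gain over a single receive chain, in dB."""
    return 10 * math.log10(n_rx)

# Hypothetical comparison: 2-antenna client receiving (downlink)
# vs. a 4-antenna AP receiving (uplink).
client_gain = mrc_array_gain_db(2)  # about 3 dB
ap_gain = mrc_array_gain_db(4)      # about 6 dB
print(f"Uplink SNR advantage at the AP: ~{ap_gain - client_gain:.1f} dB")
```

A ~3 dB SNR edge might buy one MCS step in marginal conditions, so on its own it would never explain a 3-4x throughput asymmetry.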
I have been struggling for a while with a customer environment where a 2x2 11ac laptop negotiates close to its highest Tx and Rx data rates most of the time (130 or 156 Mbps; long GI is set for the sake of other tests right now). The Tx and Rx data rates are the same 99% of the time (e.g. 130/130 or 156/156). During our tests there was at most one other device associated with the AP. The channel plan and Tx power levels are still subject to change because of a challenging original deployment (APs in hallways, lots of them to compensate for naturally poor in-room signal, mounting mistakes, and so on).
Working with iperf (server on the wired side), we occasionally get upstream/downstream throughput of about 75, 80, or even 90 Mbps at these data rates. But most of the time, with the same data rates, upstream throughput is much lower, around 20-35 Mbps.
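A quick efficiency check I use (the ~60-70% figure is my own rule of thumb; actual overhead varies with aggregation, GI, and contention): over 802.11n/ac, TCP throughput usually lands around 60-70% of the negotiated PHY rate. The good runs fit that; the bad runs are nowhere near it:

```python
# Rule-of-thumb check: TCP throughput over 802.11n/ac is typically
# ~60-70% of the negotiated PHY rate (assumption; depends on aggregation,
# guard interval, and contention).
def efficiency(throughput_mbps: float, phy_rate_mbps: float) -> float:
    """Fraction of the PHY rate realized as TCP throughput."""
    return throughput_mbps / phy_rate_mbps

good = efficiency(85, 130)  # the 75-90 Mbps runs: consistent with the rule
bad = efficiency(25, 130)   # the 20-35 Mbps runs: far below expectation
print(f"good run: {good:.0%}, bad run: {bad:.0%}")
```

To me that pattern (same reported data rates, wildly different realized efficiency, one direction only) points away from a pure RF explanation.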
Is this ratio even valid? I doubt it.
If there were too much contention overhead or non-Wi-Fi interference, I would expect both upstream and downstream throughput to drop.
Any help in solving this mystery would be appreciated.
The equipment on the wireless side is AP3935 and AP3710; the controller is a C5210. AFAIR we tested both B@AC (the default in their environment) and B@AP, but that is something I will review once more to rule out any issues that could occur on the wired side. I didn't see any wired QoS configuration, though.