Why does 2.4 GHz draw more power than 5 GHz in STA mode (underwater test), while throughput drops?

Setup

  • Hardware: Nordic nRF7002DK as Wi-Fi STA (client).

  • AP & Receiver: a Linux PC creates a Wi-Fi hotspot/SoftAP (NetworkManager/hostapd) and simultaneously runs the UDP receiver on the same machine (a minimal receiver sketch follows this list).

  • Traffic: the STA streams the same JPEG image repeatedly over UDP; only the RF band changes (2.4 GHz vs 5 GHz).

  • Power: system powered at 3.3 V, measured with a precision current monitor (PPK-style); average power is computed from the current trace (see the sketch after this list).

  • Environment: near-water/underwater test; the transmitter is progressively submerged. Water depths: air, 0, −2, −4, −6, −8, −10 cm.

  • Controls: same AP, scripts, STA role, TX power, and payload across runs; only the band differs.
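
For reference, this is roughly what the receiver side looks like; a minimal sketch, assuming a hypothetical port (5005) and a 1 s reporting window, neither of which is specified above:

```python
# udp_rx.py -- minimal UDP receiver with per-second throughput logging.
# Assumptions (not from the post): port 5005, 1 s reporting interval.
import socket
import time

PORT = 5005      # hypothetical port; match whatever the STA sends to
BUF_SIZE = 2048  # larger than the expected UDP payload

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))
sock.settimeout(0.5)  # so the loop can still tick over idle periods

rx_bytes = 0
window_start = time.monotonic()

while True:
    try:
        data, _ = sock.recvfrom(BUF_SIZE)
        rx_bytes += len(data)
    except socket.timeout:
        pass
    now = time.monotonic()
    if now - window_start >= 1.0:
        mbps = rx_bytes * 8 / (now - window_start) / 1e6
        print(f"{mbps:.3f} Mbit/s")
        rx_bytes = 0
        window_start = now
```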
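
And a sketch of how average power and energy-per-byte can be derived from the current trace; the two-column CSV layout (timestamp_us, current_mA) is an assumption, not the actual PPK export format:

```python
# power_avg.py -- average power and energy-per-byte from a current-trace export.
# Assumptions (not from the post): a "timestamp_us,current_mA" CSV export and a
# known byte count delivered during the trace.
import csv

V_SUPPLY = 3.3  # supply voltage from the setup (volts)

def avg_power_mw(csv_path):
    """Mean power in mW over the trace: P = V * mean(I)."""
    currents_ma = []
    with open(csv_path) as f:
        for row in csv.reader(f):
            try:
                currents_ma.append(float(row[1]))  # column 1: current in mA
            except (ValueError, IndexError):
                continue  # skip a header line or malformed rows
    return V_SUPPLY * sum(currents_ma) / len(currents_ma)

def energy_per_byte_uj(csv_path, duration_s, bytes_delivered):
    """Energy cost per delivered byte in microjoules."""
    p_mw = avg_power_mw(csv_path)               # mW = mJ/s
    return p_mw * duration_s * 1000 / bytes_delivered  # mJ -> uJ, per byte

# Example: compare the two bands on equal footing.
# print(avg_power_mw("run_2g4.csv"), avg_power_mw("run_5g.csv"))
```

Energy per delivered byte is often the fairer comparison here, since it folds the throughput drop and the power rise into a single number.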

Observation

  • With increasing depth, 2.4 GHz shows higher average power and lower/more jittery throughput.

  • 5 GHz remains lower and steadier in power with relatively stable throughput. (Plots attached.)

    I expected 2.4 GHz to fare no worse than 5 GHz near water, given its longer wavelength and lower bulk attenuation in water (see the attenuation note below). Instead I see higher power draw and a larger throughput drop on 2.4 GHz.
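
For context on that expectation, the usual plane-wave attenuation constant in a lossy dielectric (a standard result, not from the measurements above) is:

```latex
\alpha = \omega \sqrt{\frac{\mu \varepsilon'}{2}}
         \left[ \sqrt{1 + \left(\frac{\varepsilon''}{\varepsilon'}\right)^{2}} - 1 \right]^{1/2}
```

Below water's Debye relaxation (roughly 17-20 GHz at room temperature), the loss tangent ε''/ε' grows with frequency, so to first order bulk attenuation at 5 GHz should exceed that at 2.4 GHz, which is why the observed 2.4 GHz degradation is counterintuitive.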

Questions

  1. What mechanisms could make 2.4 GHz draw more power than 5 GHz in STA mode under these conditions?

  2. Why does throughput on 2.4 GHz fall so sharply? Are there band-specific rate-control/fallback behaviors that could explain this?

Thanks!
