SHELL_BACKEND_RTT increased current consumption when buffer fills up

We are developing nRF52840 battery powered device, NCS 2.5.2.

We wanted to use the shell for external commands during system testing (current consumption must stay low when the debugger is not connected). The shell is only enabled in the debug build, so this is not critical for the production build.

Everything works as expected when RTT is connected (1.77 mA with the debug probe attached), but when RTT gets disconnected (or the board starts without a debugger connected), the current is ~4 uA until, I assume, the RTT buffer fills up; then the current jumps to 3.45 mA and stays there indefinitely while the application keeps running (BLE/NFC still responding). Changing RTT_BUFFER_SIZE_UP changes how long it takes for the current to jump to 3.45 mA.

My guess is that SHELL_BACKEND_RTT uses the default SEGGER RTT mode (RTT_MODE_BLOCK). Is there a way to force RTT_MODE_OVERWRITE (analogous to CONFIG_LOG_BACKEND_RTT_MODE_OVERWRITE for the log backend)? I also tried CONFIG_SEGGER_RTT_MODE_NO_BLOCK_TRIM=y, but it did not help.
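One workaround that might be worth trying (a sketch of an assumption, not a verified fix) is to reconfigure the RTT up buffer's mode at runtime with SEGGER's own API, so writes are skipped instead of retried when no host is draining the buffer. This assumes the shell backend writes to up buffer 0 and that the shell's own write path honors the buffer flags:

```c
/* Hedged sketch (assumption, untested): force the RTT up buffer into
 * non-blocking "skip" mode at boot so data is dropped when no RTT host
 * is connected, instead of the writer retrying/blocking.
 */
#include <SEGGER_RTT.h>
#include <zephyr/init.h>

static int rtt_up_buffer_nonblock(void)
{
	/* Buffer index 0 is the default up buffer; adjust if the shell
	 * backend is configured to use a different buffer index.
	 */
	SEGGER_RTT_SetFlagsUpBuffer(0, SEGGER_RTT_MODE_NO_BLOCK_SKIP);
	return 0;
}
SYS_INIT(rtt_up_buffer_nonblock, APPLICATION, CONFIG_APPLICATION_INIT_PRIORITY);
```

Note this would only help if the extra current really comes from the SEGGER write path blocking on a full buffer; if the shell backend itself retries in a loop above the SEGGER layer, this flag change may have no effect.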

To reproduce the issue, copy the shell_module sample and add the following configs:
CONFIG_PM_DEVICE=y
CONFIG_SERIAL=n
CONFIG_SHELL_BACKEND_RTT=y
CONFIG_SHELL_BACKEND_SERIAL=n
CONFIG_SHELL_MINIMAL=y
CONFIG_SHELL_LOG_BACKEND=y
CONFIG_SHELL_RTT_RX_POLL_PERIOD=1000 # needed for low-power operation of the shell (saves approx. 30 uA)

Replace the main() function with:

int main(void)
{
	if (IS_ENABLED(CONFIG_SHELL_START_OBSCURED)) {
		login_init();
	}

	while (1) {
		/* Periodic log output slowly fills the RTT up buffer. */
		LOG_WRN("HELLLLLLLLLOOOOOOOO");
		k_sleep(K_MSEC(1000));
	}

	return 0;
}
Flash the code; with the debug port not connected, the current should be <4 uA.
Wait ~20 s for the buffer to fill up.
The current jumps to 3.1 mA and stays up.
The application code continues running (e.g. add some LED blinking to verify).

Connect with an RTT viewer; the RTT buffer gets dumped. Then disconnect RTT. If the current remains high, the MCU may have been left halted (1.8 mA); run nrfjprog --run and the cycle continues: ~20 s of low current, then the jump to 3.1 mA as the buffer fills up again.
  • As a side note:
    CONFIG_SHELL_RTT_RX_POLL_PERIOD=1000 (ms) is critical for low power. I think it would make sense to use a higher default interval when CONFIG_PM_DEVICE=y, or at least to document this under achieving low power consumption.

    Without this config the current is ~32 uA instead of ~4 uA.

  • Hi,

    I see the same as you, and I have not been able to get to the bottom of this yet, but I will continue to look into it.

    Regarding CONFIG_SHELL_RTT_RX_POLL_PERIOD, that is a good comment, but I suspect a lower value is better as the default, since RTT is typically only used for debugging and not in production code or code where you are measuring current consumption. So responsiveness is probably more important than saving some milliamps.
