Current draw when NFCT is active

Hello DevZone,

I have a project where I want to use the NFCT peripheral for both communication and charging. I am using the NRFX_NFCT library to read from and write to my transmitter, because the transmitter does not support NDEF messages.

When the NFC field is on and my microcontroller indicates that it is selected, I see a very large current draw on the power profiler.

I cannot explain or find out where all this current goes. In normal operation this current draw is not an issue, except when the system is starting up.

I have tried using the profiler on the writable NDEF example project to compare the current draw, but that was even worse.

Can anyone explain where all the current is going, and maybe help me optimize it (if that is possible at all)?

Reading the electrical specifications in the datasheet, I would have expected the current draw to be something like 500 uA.

Kind regards,

Tom

Reply
  • The reason is that the EVENTS_FIELDDETECTED event is unreliable, see Erratum 79. As a workaround, the driver sets up the TIMER peripheral to poll periodically (by default every 100 us) to check whether the field is lost. This consumes quite a lot of current.

    It is possible to change the period of the timer to achieve the desired trade-off between current consumption and field-lost detection latency. You can change the timer period parameter in modules\nrfx\drivers\src\nrfx_nfct.c; the parameter declaration is near the beginning of the file:

    #define NRFX_NFCT_FIELD_TIMER_PERIOD 100  /**< Field polling period in us. */
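
    For example, to poll every 10 ms instead of every 100 us (lower current at the cost of slower field-lost detection), you could change it to:

    #define NRFX_NFCT_FIELD_TIMER_PERIOD 10000  /**< Field polling period in us. */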
Children
  • Changing the timer period to 10000 us does not show any improvement in the overall current consumption.

    Even increasing this value further does not give any improvement in the current consumption.

    And yes, my tag is constantly in an NFC field.

  • Hi, I did some measurements here, and after changing the timer period from 100 us to 10000 us the current drops from about 3 mA to 1.8 mA. This is when powering VDD with 3 V and enabling the DCDC regulator.

    I tried to set up a very simple SENSE test example just to see the minimum achievable current. Just calling TASKS_SENSE makes the NFC peripheral consume about 500 uA when the tag is in the NFC field, which is in line with the spec. Then, after starting the HFXO and the TIMER that is needed for the workaround, I get to a base current of 1 mA when the tag is in the field. I see that if I call TASKS_STARTTX after EVENTS_FIELDDETECTED, the current jumps up to 1.8 mA, same as when using the NFCT library. But I believe this should go back down to 1 mA after the TX/RX is finished, so there might be some issue here. I will continue investigating and let you know.

    Maybe you can try the following code just to see if you get 1 mA as well?

        // Start the HFXO; the NFCT peripheral needs it to operate.
        NRF_CLOCK->TASKS_HFCLKSTART = 1;
        while (!NRF_CLOCK->EVENTS_HFCLKSTARTED) {}
        NRF_CLOCK->EVENTS_HFCLKSTARTED = 0;
        // Start the timer used by the Erratum 79 field-polling workaround.
        NRF_TIMER1->TASKS_START = 1;
        // Enable field sensing and the FIELDDETECTED->ACTIVATE and
        // FIELDLOST->SENSE shortcuts (SHORTS bits 0 and 1).
        NRF_NFCT->TASKS_SENSE = 1;
        NRF_NFCT->SHORTS = 3;
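
    For reference, the extra step I mentioned looks roughly like this (a minimal sketch only; the TX buffer and frame configuration are left out, so it is just meant to reproduce the current jump, not to send a valid frame):

        // Wait for the reader field to be detected (the event that Erratum 79
        // makes unreliable on its own, hence the timer workaround).
        while (!NRF_NFCT->EVENTS_FIELDDETECTED) {}
        NRF_NFCT->EVENTS_FIELDDETECTED = 0;
        // Starting a transmission is what pushed the consumption from about
        // 1 mA up to about 1.8 mA in my measurements.
        NRF_NFCT->TASKS_STARTTX = 1;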