
Problem reading back flash with NVS when used concurrently with Bluetooth

Hi,

We need to save 3 strings to flash using NVS. Based on the code sample in zephyr\samples\subsys\nvs we set up the initialization and then write and read this data.

To get a successful read back we first need to make a dummy write to NVS; otherwise the value read does not correspond to the previously written one.

We have made the following observations:

  • reading/writing NVS before enabling Bluetooth works fine
  • reading/writing after enabling Bluetooth retrieves the very first data instead of the last written value (and some other strange behaviour)
  • the Bluetooth stack performs a second nvs_init() on the fs structure with a larger sector count

Our questions are the following:

  • is there more documentation on NVS than the basic developer source?
  • are there any precautions for using "custom" access to NVS at the same time as the Bluetooth stack?
  • should we split the storage area for each use, or are IDs fully managed by the Bluetooth stack without any problems?

We initialize fs from the DTS flash controller definition (e.g. nRF52833 -> storage begins at 0x7a000).

Config is NCS 1.5.0 or 1.6.0, nRF52840DK

Best regards,

Romain

  • I tried to reproduce this issue but was not able to. Can you test the attached sample with NCS v1.6.0 and check whether you get the same output:

    *** Booting Zephyr OS build v2.6.0-rc1-ncs1  ***
    Before nvs_init() from app
    Enable bluetooth sucess!
    Item at id 1 was found: data1
    Item at id 2 was found: data2
    Item at id 3 was found: data3
    > fs_nvs.nvs_recover_last_ate: Recovering last ate from sector 0
    [00:00:00.013,549] <inf> fs_nvs: 3 Sectors of 4096 bytes
    [00:00:00.013,549] <inf> fs_nvs: alloc wra: 0, ff0
    [00:00:00.013,549] <inf> fs_nvs: data wra: 0, 0
    [00:00:00.017,303] <dbg> fs_nvs.nvs_recover_last_ate: Recovering last ate from sector 0
    [00:00:00.022,796] <inf> fs_nvs: 8 Sectors of 4096 bytes
    [00:00:00.022,827] <inf> fs_nvs: alloc wra: 0, ff0
    [00:00:00.022,827] <inf> fs_nvs: data wra: 0, 0
    [00:00:00.022,918] <inf> sdc_hci_driver: SoftDevice Controller build revision:
                                             58 5d 8b 31 54 67 00 e9  b8 4a a7 df a9 9c e4 1c |X].1Tg.. .J......
                                             b3 0b ce 74                                      |...t
    [00:00:00.024,688] <inf> bt_hci_core: No ID address. App must call settings_load()
    

    nvs_and_ble.zip

    If I have misunderstood your issue, please attach a sample that demonstrates it.

    Best regards,

    Simon

  • Hi Simon,

    Yes, I got (almost) the same log, except that:

    • I had to add CONFIG_LOG_PRINTK=y in prj.conf
    • the order of the printk messages is not the same (due to priority level?)

    *** Booting Zephyr OS build v2.6.0-rc1-ncs1  ***
    Before nvs_init() from app
    [00:00:10.525,390] <dbg> fs_nvs.nvs_recover_last_ate: Recovering last ate from sector 0
    [00:00:10.530,975] <inf> fs_nvs: 3 Sectors of 4096 bytes
    [00:00:10.530,975] <inf> fs_nvs: alloc wra: 0, ff0
    [00:00:10.530,975] <inf> fs_nvs: data wra: 0, 0
    [00:00:10.531,280] <dbg> fs_nvs.nvs_recover_last_ate: Recovering last ate from sector 0
    [00:00:10.536,804] <inf> fs_nvs: 8 Sectors of 4096 bytes
    [00:00:10.536,804] <inf> fs_nvs: alloc wra: 0, ff0
    [00:00:10.536,834] <inf> fs_nvs: data wra: 0, 0
    [00:00:10.536,926] <inf> sdc_hci_driver: SoftDevice Controller build revision: 
                                             58 5d 8b 31 54 67 00 e9  b8 4a a7 df a9 9c e4 1c |X].1Tg.. .J......
                                             b3 0b ce 74                                      |...t             
    [00:00:10.538,604] <inf> bt_hci_core: No ID address. App must call settings_load()
    Enable bluetooth sucess!
    Item at id 1 was found: data1
    Item at id 2 was found: data2
    Item at id 3 was found: data3

    I will try to put together some code that reproduces our issue...

    Basically we are doing the following:

    1. init NVS
    2. read, then write the default value if NVS is empty
    3. enable Bluetooth (which inits NVS again)
    4. connect to a central and enable some characteristics
    5. write a new value to NVS
    6. read the last value from NVS
  • When I explicitly set CONFIG_BT_GATT_CACHING=n (the default is y), it works fine until step 6.

    I guess you need to connect to a central to reproduce the issue, as there is no issue until step 3.

    How does GATT caching interact with NVS?

  • Hello,

    I didn't manage to replicate the issue with the sample code Simon uploaded here, but I'm guessing the problem in your case might be that the app is attempting to read the data back before it has actually been stored to flash (note: the write API is not synchronous).

    Have you tried adding a delay (k_sleep(), etc.) after the write to give NVS more time to complete the flash writes before reading the data back?

    Romain said:
    How is GATT caching interacting with NVS ?

    GATT caching causes more data to be stored with NVS, so that may explain why it takes longer to store your user data.

  • Hi Vidar,

    I am reading and writing a characteristic manually through an app, so there are a few seconds between a write and the subsequent read back. I tried again with more than 5 s, but nothing changed.

    Vidar said:
    I didn't manage to replicate the issue with the sample code Simon had uploaded here

    As I said before, you need to connect to a central to reproduce the issue.
  • Hi,

    It sounds like NVS should have plenty of time to complete the write operations then, so the question of why CONFIG_BT_GATT_CACHING=n seems to improve the behaviour remains. Would you be able to share your project so I can try to debug it here (assuming it can run on a Nordic DK without too much effort)? Otherwise I'll just try to set up a new project.

    Edit: have you defined a new flash partition for your NVS instance, so there is no conflict with the NVS instance used by Settings?

    e.g.
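A sketch of such an overlay, using the thread's numbers (the app_nvs label, the 0x7a000 base address, and the 16 KiB split are illustrative assumptions, not Vidar's actual attachment):

```dts
/* Illustrative devicetree overlay: shrink the 32 KiB storage partition
 * to 16 KiB for Settings/Bluetooth and add a separate 16 KiB partition
 * for the application's own NVS instance. */
&storage_partition {
    reg = <0x0007a000 0x00004000>;
};

&flash0 {
    partitions {
        app_nvs_partition: partition@7e000 {
            label = "app_nvs";
            reg = <0x0007e000 0x00004000>;
        };
    };
};
```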

  • Hi Vidar,

    have you defined a new flash partition for your NVS instance

    No I didn't.

    But I am not clear about the sectors overlapping on the storage partition, which is 32 kB. Bluetooth settings defines 8 sectors of 4096 B (the whole storage partition as defined in the DTS), my custom NVS parameters define 2 sectors of 4096 B (3 in Simon's example), and you propose separating NVS into 2 x 16 kB partitions. Could that really be OK?

  • Hi,

    you propose to separate NVS into 2 x 16 kB partitions. Could that really be OK?

    This was just an example. With this partitioning, Bluetooth settings will allocate 4 sectors. My point is that your NVS instance should not cover the same flash area as the NVS instance used by Settings.

    my custom NVS parameters define 2 sectors (3 in Simon's example)

    Are these 2 sectors currently placed in the same partition as Bluetooth settings?

  • Are these 2 sectors currently placed in the same partition as Bluetooth settings?

    Yes they were...

    But I tried your proposal and it seems to work fine.
    Bluetooth settings now takes 4 sectors instead of 8.

    So your point is that user NVS data should go in a dedicated partition, while the OS NVS data stays in the original storage partition.

    Thank you
