
How to store a large amount of data (73 KB) into flash using FDS?

We are using the nRF52840 with SDK 15.0 and Segger Embedded Studio. We are able to store a small amount of configuration data into flash using FDS, but now we want to store a large data array of about 73 KB. I wrote example code for storing this data and created a dummy array of 73728 bytes, but I get the following error:

FDS_ERR_RECORD_TOO_LARGE

I am confused about how to configure FDS to store 73 KB.

Here is what I have edited in sdk_config.h:

FDS_VIRTUAL_PAGES 10

FDS_VIRTUAL_PAGE_SIZE 2048

FDS_OP_QUEUE_SIZE 4
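
These are ordinary #define entries in sdk_config.h, so the edit looks like this:

#define FDS_VIRTUAL_PAGES      10
#define FDS_VIRTUAL_PAGE_SIZE  2048
#define FDS_OP_QUEUE_SIZE      4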

Could you please help me understand how to use this configuration to store 73 KB of data, and how to fix the FDS_ERR_RECORD_TOO_LARGE error?

Is it possible to store this data with a single fds_record_write(...) call?

Thanks in advance!

  • Is it possible to store this data with a single fds_record_write(...) call?

    Nope. You need to split this large block of data into smaller chunks, each small enough to fit in a single virtual page (see the next reply for the exact limit).

  • The number that you give for FDS_VIRTUAL_PAGE_SIZE is in 4-byte words, and a few of those words are taken by the page and record headers. So a single record can hold at most (2048 - 5) * 4 = 8172 bytes of data, and you will need to split up your data if you want to save more than that. A sketch of the same arithmetic in code is below.
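
    If you prefer the limit as a define rather than a magic number, something like this should match (assuming the usual SDK 15 FDS layout, where the page tag takes 2 words and each record header 3 words):

    #include "fds.h"

    /* FDS_VIRTUAL_PAGE_SIZE is expressed in 4-byte words. The page tag uses
     * 2 of them and each record header another 3, so the largest single
     * record on a 2048-word virtual page holds (2048 - 5) * 4 = 8172 bytes. */
    #define MAX_RECORD_DATA_WORDS  (FDS_VIRTUAL_PAGE_SIZE - 5)
    #define MAX_RECORD_DATA_BYTES  (MAX_RECORD_DATA_WORDS * 4)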

  • OK. Is it possible to increase FDS_VIRTUAL_PAGE_SIZE above 2048 to match the required data storage? If I increase it, I get error 11, i.e. NRF_ERROR_INVALID_DATA, from fds_flash_init().

    We will split our data into smaller chunks within the maximum limit above, but I am still confused about whether the record key needs to change for every write request.

    Here below is my sample code snippet for storing a dummy array of 73728 bytes.

    void fds_data_write(void) {
      // Static: ~74 KB is far too large for the stack, and FDS needs the
      // buffer to stay valid until the write completes.
      static uint8_t data_write[74000];
      for (uint32_t i = 0; i < sizeof(data_write); i++) {  // '<', not '<=', to stay in bounds
        data_write[i] = (uint8_t)i;
      }
      ret_code_t ret = fds_write(FILE_ID_READING_1, REC_KEY_READING_1,
                                 (char *)data_write, sizeof(data_write));
      if (ret == FDS_SUCCESS) {
        NRF_LOG_DEBUG("FDS data stored");
      } else {
        NRF_LOG_DEBUG("FDS write failed, error code: %d", ret);
      }
    }

    Could you please provide a program snippet showing how to issue multiple write requests to store 73 KB of data with a FILE_ID and REC_KEY? After storing this data, we also want to retrieve it later when required.

    Let's say the first 8172 bytes of the whole array above are stored in one request with a specific FILE_ID and REC_KEY. When I then write the next chunk, is it necessary to change the REC_KEY?

    Because if the REC_KEY is not changed for the second write request, it will overwrite the old data. Is there any other way to store this large data with one FILE_ID and REC_KEY?

    Thanks.

  • Hey,

    "Ok, Is it possible to increase FDS_VIRTUAL_PAGE_SIZE as per required data storage above of 2048."

    I don't think that is possible. 2048 is the maximum value.

    "We will split our data with smaller chunks with multiple of above maximum limit. But still confused for is it needed to change record key every request of that."

    Yes, you will need to change the REC_KEY for every request, but you can use the same FILE_ID. Then, when you want to retrieve the data, just use the same FILE_ID and REC_KEY pairs (see the sketch at the end of this reply).

    "Is any other way to store this large data with one FILEID & KECKEY?"

    I don't know of any other way of saving this with a single FILE_ID and REC_KEY.
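
    Here is a rough, untested sketch of one way to do the chunked write and the read-back with the SDK 15 FDS API. FILE_ID_READING_1, REC_KEY_READING_BASE and the chunk size are placeholders for your own defines; it assumes FDS is already initialized and that the source buffer is static and word-aligned.

    #include <string.h>
    #include "fds.h"

    #define FILE_ID_READING_1     0x0001   /* your file ID */
    #define REC_KEY_READING_BASE  0x0001   /* keys 0x0001..0xBFFF are free for the application */
    #define CHUNK_BYTES           8172     /* (FDS_VIRTUAL_PAGE_SIZE - 5) * 4 for a 2048-word page */

    /* Write the buffer as a series of records: same FILE_ID, record key
     * incremented for every chunk. The buffer must stay valid (and should be
     * word-aligned) until the FDS_EVT_WRITE events have arrived. */
    static ret_code_t big_buffer_write(uint8_t const * p_data, uint32_t total_bytes)
    {
        uint16_t key = REC_KEY_READING_BASE;

        for (uint32_t offset = 0; offset < total_bytes; offset += CHUNK_BYTES, key++)
        {
            uint32_t chunk = total_bytes - offset;
            if (chunk > CHUNK_BYTES) { chunk = CHUNK_BYTES; }

            fds_record_t const record =
            {
                .file_id = FILE_ID_READING_1,
                .key     = key,
                .data    = { .p_data       = p_data + offset,
                             .length_words = (chunk + 3) / 4 },  /* length is in 4-byte words */
            };

            fds_record_desc_t desc = {0};
            ret_code_t ret = fds_record_write(&desc, &record);
            if (ret != FDS_SUCCESS)
            {
                /* FDS_ERR_NO_SPACE_IN_QUEUES is likely with FDS_OP_QUEUE_SIZE 4:
                 * wait for FDS_EVT_WRITE events between chunks, or enlarge the queue. */
                return ret;
            }
        }
        return FDS_SUCCESS;
    }

    /* Read the chunks back in the same order, using the same FILE_ID/REC_KEY pairs. */
    static ret_code_t big_buffer_read(uint8_t * p_out, uint32_t total_bytes)
    {
        uint16_t key    = REC_KEY_READING_BASE;
        uint32_t offset = 0;

        while (offset < total_bytes)
        {
            fds_record_desc_t  desc  = {0};
            fds_find_token_t   token = {0};
            fds_flash_record_t rec;

            ret_code_t ret = fds_record_find(FILE_ID_READING_1, key, &desc, &token);
            if (ret != FDS_SUCCESS) { return ret; }

            ret = fds_record_open(&desc, &rec);
            if (ret != FDS_SUCCESS) { return ret; }

            uint32_t chunk = rec.p_header->length_words * 4;
            if (chunk > total_bytes - offset) { chunk = total_bytes - offset; }
            memcpy(p_out + offset, rec.p_data, chunk);

            (void) fds_record_close(&desc);
            offset += chunk;
            key++;
        }
        return FDS_SUCCESS;
    }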
