
Continuously Store Data + Transmit Data at Specific Intervals

I have an application where I am collecting data from the ADC at about 1 Hz. I am considering storing that data in an SPI flash (still figuring out how to do this); at each interval the data will also be sent out via UART to a connected mobile device. My question is: if the user requests the full data from storage, can I send it while still collecting data from the ADC in real time? In the worst case they could request 86,400 records to synchronize missing data on their mobile device with what is in storage. Does BLE transmission cause a timing delay on the ADC read? That is, if data is stored at 0 seconds, do I only have 1 second to transmit the data before new data comes in? Or does the BLE transmission occur as a separate process, so that even if it took 15 s to send the data it wouldn't interfere with ADC readings coming in at 1 Hz?

If there is a better way to achieve this - I'm interested (and any advice on using external SPI Flash is always appreciated :) )

  • Hi 


    As long as you make sure to give the ADC sampling higher priority than the record update, you shouldn't have an issue with this.

    For instance, you can set up an interrupt to do the ADC sampling, using the app_timer module or a dedicated timer interrupt, and have the record update run from the main context. The ADC sampling will then run reliably at 1 Hz, and the record update can take advantage of any spare time to update the records in the background.
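    To illustrate the split between interrupt and main context, here is a minimal sketch of the usual pattern: the ADC interrupt pushes each sample into a lock-free single-producer/single-consumer ring buffer, and the main loop drains it whenever it has spare time. All names here are illustrative, not from any SDK; in a real application `rb_push()` would be called from the ADC/timer interrupt handler and `rb_pop()` from the main loop before writing to flash.

    ```c
    #include <stdint.h>
    #include <stdbool.h>
    #include <stdio.h>

    /* Single-producer/single-consumer ring buffer: the ISR only writes
     * `head`, the main loop only writes `tail`, so no locking is needed. */
    #define RB_SIZE 256  /* must be a power of two */

    typedef struct {
        volatile uint16_t buf[RB_SIZE];
        volatile uint32_t head;  /* advanced only by the interrupt  */
        volatile uint32_t tail;  /* advanced only by the main loop  */
    } ringbuf_t;

    static bool rb_push(ringbuf_t *rb, uint16_t sample)
    {
        if (rb->head - rb->tail >= RB_SIZE) {
            return false;                      /* buffer full: sample lost */
        }
        rb->buf[rb->head & (RB_SIZE - 1)] = sample;
        rb->head++;                            /* publish after writing data */
        return true;
    }

    static bool rb_pop(ringbuf_t *rb, uint16_t *sample)
    {
        if (rb->head == rb->tail) {
            return false;                      /* empty */
        }
        *sample = rb->buf[rb->tail & (RB_SIZE - 1)];
        rb->tail++;
        return true;
    }

    int main(void)
    {
        ringbuf_t rb = {0};
        for (uint16_t i = 0; i < 5; i++) {
            rb_push(&rb, i);                   /* stands in for 5 ADC interrupts */
        }
        uint16_t s;
        int drained = 0;
        while (rb_pop(&rb, &s)) {
            drained++;
        }
        printf("drained %d samples\n", drained);  /* prints "drained 5 samples" */
        return 0;
    }
    ```

    With this decoupling, a long BLE history transfer running in the main context can fall behind for seconds without losing any 1 Hz samples, as long as the buffer is deep enough.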

    Also, you should be aware that the BLE stack (SoftDevice) can interrupt the CPU even when you are not sending any data. If you want the 1 Hz sampling to be completely unaffected by BLE activity, you need to trigger the ADC from a dedicated timer over PPI, so that no CPU activity is required to start the ADC.
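    A rough sketch of that PPI wiring is below, using nRF52-style register names; treat it as an outline to check against the product specification for your device rather than finished code. The choice of TIMER1 and PPI channel 0 is arbitrary, and you would still handle the SAADC END event in software to read out the result.

    ```c
    #include "nrf.h"   /* device header from the SDK */

    void adc_trigger_init(void)
    {
        /* TIMER1: compare event once per second, auto-clear on compare */
        NRF_TIMER1->PRESCALER = 9;                 /* 16 MHz / 2^9 = 31250 Hz */
        NRF_TIMER1->CC[0]     = 31250;             /* compare fires at 1 Hz   */
        NRF_TIMER1->SHORTS    = TIMER_SHORTS_COMPARE0_CLEAR_Msk;

        /* PPI channel 0: route the timer compare event directly to the
         * SAADC sample task, so sampling starts with no CPU involvement */
        NRF_PPI->CH[0].EEP = (uint32_t)&NRF_TIMER1->EVENTS_COMPARE[0];
        NRF_PPI->CH[0].TEP = (uint32_t)&NRF_SAADC->TASKS_SAMPLE;
        NRF_PPI->CHENSET   = PPI_CHENSET_CH0_Msk;

        NRF_TIMER1->TASKS_START = 1;
    }
    ```

    Because the event-to-task routing happens entirely in hardware, the sample instant stays exact even while the SoftDevice is busy with radio activity.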

    A final word of advice: make sure you merge several record updates into longer packets before sending, to make the process more efficient when you are sending thousands of samples quickly.
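    As a small illustration of that merging, the sketch below packs 16-bit samples into a single 20-byte payload (the default BLE payload: ATT_MTU of 23 minus 3 bytes of ATT header). The packet layout and function names are made up for this example; a 2-byte sequence number lets the phone detect lost notifications.

    ```c
    #include <stdint.h>
    #include <stddef.h>
    #include <string.h>
    #include <stdio.h>

    #define BLE_PAYLOAD_MAX 20                            /* default ATT_MTU 23 - 3 */
    #define SAMPLES_PER_PKT ((BLE_PAYLOAD_MAX - 2) / 2)   /* 9 samples per packet  */

    /* Pack up to SAMPLES_PER_PKT samples after a 2-byte sequence number;
     * returns the number of payload bytes actually used. */
    static size_t pack_samples(uint8_t *pkt, uint16_t seq,
                               const uint16_t *samples, size_t count)
    {
        if (count > SAMPLES_PER_PKT) {
            count = SAMPLES_PER_PKT;
        }
        pkt[0] = (uint8_t)(seq & 0xFF);        /* little-endian sequence no. */
        pkt[1] = (uint8_t)(seq >> 8);
        memcpy(&pkt[2], samples, count * 2);   /* assumes little-endian MCU */
        return 2 + count * 2;
    }

    int main(void)
    {
        uint16_t samples[9] = {100, 101, 102, 103, 104, 105, 106, 107, 108};
        uint8_t pkt[BLE_PAYLOAD_MAX];
        size_t len = pack_samples(pkt, 7, samples, 9);
        printf("packet length: %zu bytes\n", len);  /* prints "packet length: 20 bytes" */
        return 0;
    }
    ```

    Sending 9 samples per notification instead of 1 cuts the number of packets for an 86,400-record sync by roughly a factor of nine; negotiating a larger ATT_MTU (where the stack and phone support it) improves this further.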

    Best regards
    Torbjørn

