
uint8_t type variables behave unexpectedly on nRF52 when converting to int32_t

Stack - nRF52 DK, SDK 15.0.0, Segger Embedded Studio, Windows. Using code examples described below.

So my high-level goal is to read data from registers in an I2C sensor and send these values via Bluetooth. I'm reading I2C data using "nrf_drv_twi" rx and tx, which stores the read sensor data in variables of type uint8_t. To send data via Bluetooth, I'm working from this tutorial (https://devzone.nordicsemi.com/tutorials/b/bluetooth-low-energy/posts/ble-characteristics-a-beginners-tutorial), which in "Step 3.F, Update the characteristic with temperature data" passes data as int32_t type variables. I'm having a lot of problems converting the uint8_t variables into any other type (e.g. the program behaves unexpectedly when storing values in uint16_t, int8_t, int32_t, etc.).

So for example, I created a standalone project in Eclipse to check my sanity, just debugged locally with no build target:

#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

int main(void) {
	uint8_t a = 0x12;
	uint8_t b = 0x34;
	int16_t c = 0;
	c = a;
	c = c<<8;
	c = c + b;
	return EXIT_SUCCESS;
}

When debugging, the variables behave exactly as I expect. After all the code has run, the debug console shows:

This is my desired output. Essentially, I want to "concatenate" all the values together and parse them after sending to an iOS app.

However, running the same code snippet contained in main() within a project built for the nRF52DK gives bizarre results:

I'm now noticing other weird behavior. When I first start debugging, a/b/c all have these values to start. When I change a/b to other values, e.g. 0x56/0x78, the values in the watch list remain unchanged even after those lines of code have run. I tried setting breakpoints on each of those lines, but SES won't stop on those breakpoints and catches on one a couple of lines down...

What can I do to get this code to behave as it does when building in Eclipse without a target?

Or maybe I can circumvent this problem entirely. Is there an easier way to send TWI sensor data via BLE?
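On that last question, one common pattern (a sketch of my own, not from this thread; the function name and byte order are assumptions) is to skip the integer concatenation entirely and pack the raw sensor bytes into a uint8_t array, since a BLE characteristic value is ultimately just a byte array:

```c
#include <stdint.h>

/* Pack six raw sensor bytes into a buffer in a fixed order.
 * This is a generic sketch, not part of the Nordic SDK; the
 * names mirror typical accelerometer register bytes. */
static void pack_accel_bytes(uint8_t *buf,
                             uint8_t ax_low, uint8_t ax_high,
                             uint8_t ay_low, uint8_t ay_high,
                             uint8_t az_low, uint8_t az_high)
{
    buf[0] = ax_low;
    buf[1] = ax_high;
    buf[2] = ay_low;
    buf[3] = ay_high;
    buf[4] = az_low;
    buf[5] = az_high;
}
```

The buffer and its length can then be handed to whatever characteristic-update call the tutorial uses, and the app can reassemble the values on its side, which sidesteps signed/unsigned conversion issues altogether.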

  • GCC might optimize out variables that are not used.  Try adding a dummy printf that just prints the values of a, b, c and break at the printf line.  Since the printf references those vars, GCC will keep them, and you will see them in the debugger.  If you don't see the correct values, try stepping until you reach the printf.  The break might not land at the intended place when compiled with optimization on.

      

  • Thanks, that was a helpful first step that got me a lot farther! Do you know how to turn optimization off? Not essential but would help.

    So after that I changed the code around a bit, but encountered another strange bug - I'm looking to "concatenate" 6 values, and the code works correctly for values 1-5 but then completely messes up the concatenated variable when adding value 6. I took a 50-second video of the debugging process here:

    URL - https://www.youtube.com/watch?v=Vt3peCRBgi0&feature=youtu.be&hd=1

    the function included in the video is:

    void get_LSM_acc_data(int64_t * data_string)
    {
        uint8_t ax_low = 0;
        uint8_t ax_high = 0;
        uint8_t ay_low = 0;
        uint8_t ay_high = 0;
        uint8_t az_low = 0;
        uint8_t az_high = 0;
    
        read_accel_data(&ax_low, &ax_high, &ay_low, &ay_high, &az_low, &az_high);
    
        int64_t temp = 0;
        temp = temp | (uint64_t)ax_low;
        temp = temp << 8;
        temp = temp | (int8_t)ax_high;
        temp = temp << 8; 
        temp = temp | (int8_t)ay_low;
        temp = temp << 8; 
        temp = temp | (int8_t)ay_high;
        temp = temp << 8; 
        temp = temp | (int8_t)az_low;
        temp = temp << 8; 
        temp = temp | (int8_t)az_high;
        NRF_LOG_DEBUG("%x",temp);
        NRF_LOG_FLUSH();
        *data_string = temp; //ORDER - ax_low, ax_high, ay_low, ay_high, az_low, az_high
    }

    So from the video it pulls these values from the sensor:

    ax_low = 0x07
    ax_high = 0x00
    ay_low = 0x5e
    ay_high = 0x00
    az_low = 0x29
    az_high = 0xe1

    and after the lines of code, we want to get

    temp = 0x0700530029e1 //or 0x00000700530029e1

    I made a mistake in the video: when debugging, "temp" never actually gets to 0x0700530029e1, it only gets to 0x070053002900, which makes it look like it hasn't yet run the last line of code,

    temp = temp | (int8_t)az_high;

    but then after stepping through the breakpoint on NRF_LOG_DEBUG(...) a couple of times, the value changes to 0xffffffffffffffe1, which makes it look like an error occurred on that aforementioned last line of code.

    If you know how to turn off optimization I can try to look into it with higher resolution, but alternatively... do you have any suggestions for how to prevent the data from turning from 0x070053002900 into 0xffffffffffffffe1?

  • Casting az_high to int8_t turns it into a negative value; hence you get 0xffffffffffffffe1 at the end.  What you should do is either keep it unsigned or AND it with 0xff before ORing it with temp.
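    That fix can be sketched as a standalone version of the routine above (the `concat_accel` name is hypothetical; widening through uint64_t is one of the two options described):

```c
#include <stdint.h>

/* Corrected concatenation: accumulate in an unsigned 64-bit value and
 * OR each byte in as uint8_t, so a byte like 0xe1 can never sign-extend
 * and wipe out the upper bytes. Order matches the original routine:
 * ax_low, ax_high, ay_low, ay_high, az_low, az_high. */
static int64_t concat_accel(uint8_t ax_low, uint8_t ax_high,
                            uint8_t ay_low, uint8_t ay_high,
                            uint8_t az_low, uint8_t az_high)
{
    uint64_t temp = 0;
    temp = (temp << 8) | ax_low;   /* uint8_t promotes to a non-negative int */
    temp = (temp << 8) | ax_high;
    temp = (temp << 8) | ay_low;
    temp = (temp << 8) | ay_high;
    temp = (temp << 8) | az_low;
    temp = (temp << 8) | az_high;
    return (int64_t)temp;
}
```

    Equivalently, the original int8_t casts could be replaced with `& 0xff` masks on each byte before the OR.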

  • Wow, in hindsight that feels so obvious, I need to get more sleep... Thanks!
