
Problem with USB CDC Data Transfer

I'm experiencing a strange issue with the USB CDC example when connecting the nRF52840 to Windows 10. This is with the 15.2 SDK. I'm using the usbd_ble_uart example on the BMD340 DK. When I build the stock example and connect the BMD-USB port to my PC, the DK connects as a virtual comm port as expected. I can open it with a terminal emulator and see that data is being transferred. I'm using this example project to build software to connect to a custom application. If my custom application tries to talk to the DK BEFORE I test the connection with the terminal emulator, the custom application fails to send data to the virtual comm port. Simply opening and then closing the port with the terminal emulator puts the DK port in a "better" state, and the custom application is then able to talk to the port.

It smelled like an OS issue, so I fired up a serial port sniffer to see what was being sent by the different connection attempts. When the custom application tries to read the baud rate and data width from the virtual comm port BEFORE the terminal emulator has used the port, it receives a baud rate of 0 and a data width of 0. Once the terminal emulator has opened and closed the connection and I re-sniff the session, the baud rate is 9600 and the data width is 8 as expected. Please see the attached screen captures of side-by-side comparisons of good and bad connections (good on left, bad on right; first baud rate, then data width):

I've seen this issue on multiple PCs. I've also tried updating the Windows USB CDC driver with the nordic_cdc_acm_examples.inf file. Somehow, for the initial connection, either the nRF52840's USB stack is in a strange state or the virtual comm port is in a weird state, and there can be no good connection until the terminal emulator makes a connection. Using the same port sniffing software, I can see that the terminal emulator doesn't check the current values; it just writes the values it wants and moves forward.

So my questions are:
- If it's the nRF52's USB stack, is there a way to pre-set it so that it returns the correct baud rate and data width on the initial connection?
- If it's the virtual comm port, is there any way to set/reset it from the nRF52 side so that it's in the correct state for the initial connection?
Thanks


  • Hi,

     

    Virtual com-ports do not really care about the settings, as everything goes over USB on the transport side anyway. You can open the port at 300 baud and still send Mbit/s, given that the device handles it. What is the issue that you are observing? Is the data missing or corrupted?

    Kind regards,

    Håkon

    I've been able to look more closely at the log files from the sniffer software (Serial Port Monitor, Eltima). After comparing the logs from the terminal emulator and from a simple Python script, I can see that the initial 0 values for baud rate and data width seem to be normal. From what I gather, it's normal for the OS to open and close the port multiple times behind the scenes before it's actually opened for tx/rx.

    The start of the issue is actually later in the log files. The nRF52840 gives no sign that a connection was made (no APP_USBD_CDC_ACM_USER_EVT_PORT_OPEN event), even though the driver thinks it has made a connection. When the port is finally opened for writing, the first call to IRP_MJ_WRITE registers as successful when it actually wasn't (no RX shown by the nRF52840, i.e. no APP_USBD_CDC_ACM_USER_EVT_RX_DONE event). The second call to IRP_MJ_WRITE times out with STATUS_TIMEOUT. All following IRP_MJ_WRITE attempts also fail, but with STATUS_IO_TIMEOUT.

    I'm still doing compares between good and bad logs to find more clues.


    Are you testing on a custom board? If yes, are there any components on the D+/D- lines that might cause this?

    Are you able to reproduce this with a DK?

     

    Kind regards,

    Håkon

  • Hi Håkon,

    I've been doing all my testing on the PCA10056 DK while waiting for the customer's custom board to be completed. I'm using the usbd_ble_uart and peripheral/cli examples. The issue is observable with the unmodified usbd_ble_uart example by watching the DK LEDs. With the peripheral/cli example, I made a small modification that toggles a DK LED every time a byte is received. With the usbd_ble_uart example, to remove possible sources of the problem, I've removed/disabled the SoftDevice and re-compiled with ORIGIN = 0x0. The BT part is disabled, but the USB CDC part is still testable.

    The two examples set up the USB CDC UART differently, but both suffer from the same issue. We're continuing to dig into the SW libraries used on the Windows 10 side to see if there is some kind of negative interaction between the serial port libraries and the virtual com-port setup.

    So far, the only difference I can detect between a good connection and a bad connection is that with a good connection the virtual comm port connection parameters are already set up. That's why I was looking for a way to preset them from the nRF SDK side at compile time. As far as I can tell, there isn't a way to do this. Do you know of a way?

    Thanks

    .

  • jemiaha said:
    With the usbd_ble_uart example, to remove possible sources of the problem, I've removed/disabled the soft device and re-compiled it with ORIGIN = 0x0. The BT part is disabled but the USB CDC part is still testable.

    Why not test directly with usbd_cdc_acm? If you remove all the handling wrt. BLE in the usbd_ble_uart sample, you essentially end up with the same functionality as in that sample; you just also need to handle the logic around the USBD events so that they no longer go through the SoftDevice.

     

    jemiaha said:
    So far, the only difference I can detect between a good connection and a bad connection is that with a good connection the virtual comm port connection parameters are already setup. That why I was looking for a way to preset them from the nRF SDK side at compile time. So far as I can tell, there isn't a way to do this. Do you know of a way?

    There's no function for setting this explicitly, but you can do this in main:

        const uint32_t baud_rate = 115200;
        app_usbd_cdc_line_coding_t line_coding =
        {
            .bCharFormat = APP_USBD_CDC_LINE_STOPBIT_1,
            .bParityType = APP_USBD_CDC_LINE_PARITY_NONE,
            .bDataBits   = 8,
        };
        /* Copy the rate in by value; assigning &baud_rate here would store
         * the address of the variable instead of the baud rate itself. */
        memcpy(&line_coding.dwDTERate, &baud_rate, sizeof(baud_rate));
        memcpy(&m_app_cdc_acm.specific.p_data->ctx.line_coding, &line_coding, sizeof(app_usbd_cdc_line_coding_t));

     

    I am a bit unsure about the endianness of the dwDTERate member; you might have to swap the bytes if you see some weird values in that field.

     

    Kind regards,

    Håkon

  • Thanks! I'm exploring that. It has an effect, and I do see some strange values in the sniffer. Right now, in the USB setup, there is the following code:

        app_usbd_class_inst_t const * class_cdc_acm = app_usbd_cdc_acm_class_inst_get(&m_app_cdc_acm);
        ret = app_usbd_class_append(class_cdc_acm);
        APP_ERROR_CHECK(ret);

    Where should the memcpy go in relation to that? (Before or after?)
    Thanks


  • I placed it just before entering the main loop. The important part is to set it before you enumerate; usually doing it during init will be fine.

    Here's a quick 32-bit unsigned LE-to-BE macro, so that your baud rate is read out correctly on the host side:

    #define LE_TO_BE32(x)                   \
            ((((x) >> 24) & 0xff)         | \
             (((x) << 8)  & 0xff0000)     | \
             (((x) >> 8)  & 0xff00)       | \
             (((x) << 24) & 0xff000000))

        const uint32_t baud_rate = LE_TO_BE32(115200);

    You can of course set this to whatever baud rate you'd like (it has no effect on the USB throughput); it's really only meant as metadata if you actually communicate with an external device over UART.

    Do you get the correct baud settings now?

     

    Kind regards,

    Håkon

