Hello,
I am working on bare metal with the BSDLib on a custom board.
While debugging a case in which our server refuses the TLS connection, I wanted to see which cipher suite was being used.
However, I am not able to read this option, and I also get strange results from the library.
Here is sample code that reproduces the issue.
self->Socket = nrf_socket(NRF_AF_INET, NRF_SOCK_STREAM, NRF_SPROTO_TLS1v2);

UInt32 verify = 2;
err = nrf_setsockopt(self->Socket, NRF_SOL_SECURE, NRF_SO_SEC_PEER_VERIFY, &verify, sizeof(verify));
err = nrf_setsockopt(self->Socket, NRF_SOL_SECURE, NRF_SO_SEC_TAG_LIST, tls_sec_tag, sizeof(tls_sec_tag));

err = nrf_connect(self->Socket, &(self->CloudAiAddr), sizeof(struct nrf_sockaddr_in));
// Fails with errno = 41 (NRF_EPROTOTYPE), but that is not important here

nrf_sec_cipher_t cipher_in_use = 0;
nrf_socklen_t optLen = sizeof(nrf_sec_cipher_t);
int res = nrf_getsockopt(self->Socket, NRF_SOL_SECURE, NRF_SO_CIPHER_IN_USE, &cipher_in_use, &optLen);
// res = 42, optLen = 4, cipher_in_use = 0
// Expected res to be either 0 or -1; also, cipher_in_use stays 0.
nrf_getsockopt() returning 42 is unexpected: I would expect either 0 on success or -1 with errno set on failure, and cipher_in_use stays at 0.
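For reference, this is roughly how I expected the option to behave. The sketch below is only my assumption based on the usual 0-on-success / -1-plus-errno convention of the nrf_* socket calls; the helper name and the print statements are made up for illustration, and only the option and type names come from my real code.

#include "nrf_socket.h"
#include <errno.h>
#include <stdio.h>

// Illustrative only: query the negotiated cipher after a (successful) handshake.
static void print_cipher_in_use(int sock)
{
    nrf_sec_cipher_t cipher_in_use = 0;
    nrf_socklen_t opt_len = sizeof(cipher_in_use);

    int res = nrf_getsockopt(sock, NRF_SOL_SECURE, NRF_SO_CIPHER_IN_USE,
                             &cipher_in_use, &opt_len);
    if (res == 0) {
        // Expected on success: the identifier of the negotiated cipher suite.
        printf("cipher in use: 0x%04x\n", (unsigned int)cipher_in_use);
    } else if (res == -1) {
        // Expected on failure: -1 with the reason in errno.
        printf("nrf_getsockopt failed, errno = %d\n", errno);
    } else {
        // What I actually see: a bare positive value (42) that matches neither case.
        printf("unexpected return value %d\n", res);
    }
}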
Versions:
- modem FW 1.2.0
- bsdlib: 0.7.6 hard-float (using one of the more recent commits that fixes the TLS limitation to <2 kByte packets, Commit)
Am I using this wrong? Or is this unexpected behavior?