External sensor example

A camera produces 2D images made of pixels that can be displayed directly, while a non-camera sensor produces data packets that cannot be displayed as easily. To integrate a driver for a non-camera sensor such as a lidar, you integrate it as an external sensor driver library.

external_sensor_defs implementation

sensor_external_sensor_t external_sensor_defs = {
    open : open_external_sensor,
    close : close_external_sensor,
    init : init_sensor,
    deinit : deinit_sensor,
    start_streaming : start_streaming,
    stop_streaming : stop_streaming,
    get_packet : get_packet,
    get_buffer_requirements : get_buffer_requirements,
    get_time : get_time,
    parse_config : parse_config,
    set_sensor_metadata : NULL,
    get_metadata_limits : NULL,
    fill_format_info : fill_format_info,
    set_callbacks : set_callbacks
};

For the full implementation, refer to the example source code. Functions in the snippet above are similar to those of an external camera library.
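As an illustration of how these entry points fit together, below is a minimal sketch of an open/close pair. The context structure is simplified and the callback signatures are assumptions for illustration; the actual definitions are in the example source and the sensor framework headers.

#include <errno.h>
#include <stdbool.h>
#include <stdint.h>
#include <stdlib.h>

/* Simplified stand-in for externalSensorContext_t; the shipped example
 * defines more fields (callbacks, simulated geometry, and so on). */
typedef struct {
    struct {
        uint32_t out_data_format;   /* one of the SENSOR_FORMAT_LIDAR_* values */
    } mParams;
    bool     mFirstPacket;
    uint64_t mStart;
    uint64_t mEnd;
    uint64_t mDataPeriodInNs;
    uint64_t mNsPerCycle;
} externalSensorContext_t;

/* Illustrative open/close pair: allocate the driver context and return it as
 * the opaque handle that the other entry points receive. The parameter lists
 * here are assumptions; use the prototypes from the sensor framework header. */
static int open_external_sensor(void** handle)
{
    externalSensorContext_t* ctx;

    if (handle == NULL) {
        return EINVAL;
    }
    ctx = calloc(1, sizeof(*ctx));
    if (ctx == NULL) {
        return ENOMEM;
    }
    ctx->mFirstPacket = true;
    *handle = ctx;
    return EOK;
}

static int close_external_sensor(void* handle)
{
    if (handle == NULL) {
        return EINVAL;
    }
    free(handle);
    return EOK;
}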

get_buffer_requirements() implementation

static int get_buffer_requirements(void* handle, uint32_t* numBuffers,
                                   uint32_t* bufSize)
{
    externalSensorContext_t* ctx = (externalSensorContext_t*) handle;

    if (handle == NULL || numBuffers == NULL || bufSize == NULL) {
        return EINVAL;
    }

    *numBuffers = 1;
    switch (ctx->mParams.out_data_format) {
    case SENSOR_FORMAT_LIDAR_POLAR:
        *bufSize = NB_DOME_POINTS * sizeof(sensor_lidar_polar_t);
        break;
    case SENSOR_FORMAT_LIDAR_POINT_CLOUD:
        *bufSize = NB_CUBE_POINTS * sizeof(sensor_lidar_point_cloud_t);
        break;
    case SENSOR_FORMAT_LIDAR_SPHERICAL:
        *bufSize = NB_SPHERE_POINTS * sizeof(sensor_lidar_spherical_t);
        break;
    default:
        LOG_ERROR("Unsupported data_format %d", ctx->mParams.out_data_format);
        return EINVAL;
    }
    return EOK;
}

A sensor driver must report the number and size of the buffers it needs so that the sensor service can allocate them; the get_buffer_requirements() function in the code snippet above does exactly that. In your sensor configuration, you can set data_format = SENSOR_FORMAT_LIDAR_POLAR to indicate that the driver outputs polar coordinates.
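How the data_format value reaches the driver depends on the framework's configuration handling; if the driver interprets the string itself (for example, inside parse_config()), a mapping like the hypothetical helper below could populate ctx->mParams.out_data_format. The helper name and signature are illustrative and not part of the example library.

#include <errno.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical helper: translate the data_format string from the .conf file
 * into the enum value checked by get_buffer_requirements() above. */
static int parse_data_format(const char* value, uint32_t* out_data_format)
{
    if (value == NULL || out_data_format == NULL) {
        return EINVAL;
    }
    if (strcmp(value, "SENSOR_FORMAT_LIDAR_POLAR") == 0) {
        *out_data_format = SENSOR_FORMAT_LIDAR_POLAR;
    } else if (strcmp(value, "SENSOR_FORMAT_LIDAR_POINT_CLOUD") == 0) {
        *out_data_format = SENSOR_FORMAT_LIDAR_POINT_CLOUD;
    } else if (strcmp(value, "SENSOR_FORMAT_LIDAR_SPHERICAL") == 0) {
        *out_data_format = SENSOR_FORMAT_LIDAR_SPHERICAL;
    } else {
        return EINVAL;   /* unsupported format */
    }
    return EOK;
}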

get_packet() implementation

static int get_packet(void* handle, void* bufferIn, sensor_flags_t *flags,
                      void** bufferOut, int64_t *timestamp)
{
    externalSensorContext_t* ctx = (externalSensorContext_t*) handle;

    if (handle == NULL || bufferIn == NULL || flags == NULL || bufferOut == NULL || timestamp == NULL) {
        return EINVAL;
    }

    if (ctx->mFirstPacket) {
        ctx->mStart = ClockCycles();
        ctx->mFirstPacket = false;
    }
    flags->captured = false;

    if (ctx->mParams.out_data_format == SENSOR_FORMAT_LIDAR_SPHERICAL) {
        generateSphere(handle, (sensor_lidar_spherical_t *)bufferIn);
    } else if (ctx->mParams.out_data_format == SENSOR_FORMAT_LIDAR_POLAR) {
        generateDome(handle, (sensor_lidar_polar_t *)bufferIn);
    } else if (ctx->mParams.out_data_format == SENSOR_FORMAT_LIDAR_POINT_CLOUD) {
        generateCube(handle, (sensor_lidar_point_cloud_t *)bufferIn);
    }
    *bufferOut = bufferIn;
    *timestamp = get_time(handle);
    flags->captured = true;

    ctx->mEnd = ClockCycles();
    // This will handle the case where the free running counter wraps around once
    // in between start and end
    uint64_t duration = ctx->mEnd - ctx->mStart;

    nsSleep(handle, ctx->mDataPeriodInNs - (duration * ctx->mNsPerCycle));
    ctx->mStart = ClockCycles();

    return EOK;
}

The get_packet() function is similar to the get_preview_frame() function in the external camera example library source code. It generates fake data into bufferIn, sets the captured flag to true, and sleeps briefly to simulate a lidar's data rate. For a real lidar, bufferIn might instead be queued because a packet has not arrived yet; in that case, flags->captured would be left as false before returning.
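For illustration, a real lidar driver's get_packet() might follow the pattern sketched below: the incoming buffer is queued for an asynchronous receive path, and flags->captured stays false whenever no filled packet is available yet. The two helper functions are placeholders, not part of the example library, and the sketch reuses the types from the snippet above.

/* Hypothetical variation of get_packet() for a real lidar. The two helpers
 * declared here stand in for a driver-specific receive path. */
extern void  lidar_queue_empty_buffer(void* handle, void* buffer);
extern void* lidar_take_filled_buffer(void* handle, int64_t* timestamp);

static int get_packet(void* handle, void* bufferIn, sensor_flags_t *flags,
                      void** bufferOut, int64_t *timestamp)
{
    if (handle == NULL || bufferIn == NULL || flags == NULL ||
        bufferOut == NULL || timestamp == NULL) {
        return EINVAL;
    }

    /* Hand the empty buffer to the receive path so it can be filled later. */
    lidar_queue_empty_buffer(handle, bufferIn);

    /* Return a previously filled buffer if one is ready; otherwise leave
     * flags->captured as false and let the sensor service call again. */
    flags->captured = false;
    void* filled = lidar_take_filled_buffer(handle, timestamp);
    if (filled != NULL) {
        *bufferOut = filled;
        flags->captured = true;
    }
    return EOK;
}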

Running external_sensor_example on a QNX target

Build external_sensor_example, then scp it to a QNX target:

# Navigate to the external sensor example code
cd source_package_sf_sensor/lib/sensor_drivers/external_sensors/example

# Source SDP script
source ~/qnx800/qnxsdp-env.sh

# Build external_sensor_example
make

# scp libexternal_sensor_example.so over to /system/lib on QNX Raspberry Pi
scp ./nto/aarch64/so.le/libexternal_sensor_example.so root@<ip-address>:/system/lib

For the sensor configuration, note that you must set data_format to SENSOR_FORMAT_LIDAR_POLAR, SENSOR_FORMAT_LIDAR_POINT_CLOUD, or SENSOR_FORMAT_LIDAR_SPHERICAL.

Create the following file at /system/etc/system/config/external_sensor_example.conf:

begin SENSOR_UNIT_1
    type = external_sensor
    name = external_sensor_example
    data_format = SENSOR_FORMAT_LIDAR_POLAR
    address = /system/lib/libexternal_sensor_example.so
end SENSOR_UNIT_1

Start the sensor service and run sensor_example to interact with the external_sensor_example driver library:

# Start sensor service
sensor -U 521:521,1001 -r /data/share/sensor -c /system/etc/system/config/external_sensor_example.conf

# Run sensor_example
sensor_example
Select which example you want to run:
        1) Sensor stream
        2) Sensor recorder
        3) Sensor data publisher
        4) Sensor data subscriber streaming
        5) Sensor data subscriber recording
        6) Sensor query
        7) Sensor data query
        8) Sensor data subscriber (event-mode)
        x) Exit the example