Use streams

Updated: April 19, 2023

Instead of using the viewfinder window (a Screen window) that the Camera library creates and joining it to a Screen window group from your application, you can use a Stream object.

A Stream object gives you access to screen_buffer_t objects. Using a Stream object allows you to share buffers within the same application or process, as well as between different applications and processes. Applications can create consumer streams that read from the Stream object provided by the Camera library, which is the producer of the stream. This approach is useful when you don't need to show the image buffers from the camera on a display, when you need to use the GPU to render the image buffers to a display, or when you want to use an external engine to composite the image buffers instead of using Screen's composition features.

For example, you could use OpenGL to display the contents of the image buffer. For more information about working with consumer streams, see the Consumer topic in the Screen Developer's Guide.

To use a Stream object, your application must disable the default behavior of creating a viewfinder window. To do so, call camera_set_vf_property() with CAMERA_IMGPROP_CREATEWINDOW set to zero before you call camera_start_viewfinder(). After you start the viewfinder, use Screen event handling and call screen_get_event_property_pv() to get a reference to the Stream object. After you have this reference, you can acquire screen_buffer_t objects from that stream in the manner suitable for your application.

Here are the steps to use streams:
  1. Connect to the camera.
  2. Set the viewfinder mode for the camera.
  3. Create a Screen context, window, and buffers. For more information, see Using Streams in the Resource Sharing chapter of the Screen Developer's Guide.
  4. Start the viewfinder and configure camera settings.
  5. Stop the viewfinder.
For example, you may want to process the image buffer in some manner to run algorithms on it, and then use EGL and OpenGL to post the content to the display. Here's a code snippet to illustrate:
...
...
screen_context_t context;
screen_window_t window;

// Create a screen context and a window
screen_create_context(&context, 0);
screen_create_window(&window, context);

camera_handle_t cameraHandle;

// Connect to the camera and set the viewfinder mode
camera_open(CAMERA_UNIT_1, CAMERA_MODE_RW | CAMERA_MODE_ROLL, &cameraHandle);
camera_set_vf_mode(cameraHandle, CAMERA_VFMODE_VIDEO);

// Disable the viewfinder window creation
camera_set_vf_property(cameraHandle, CAMERA_IMGPROP_CREATEWINDOW, 0);

// Start the viewfinder
camera_start_viewfinder(cameraHandle, NULL, statusCallback, NULL);



// Application's (consumer) stream handle
screen_stream_t cstream;

// Camera library's (producer) stream handle. Get a reference to it through
// Screen event handling, as described above (a sketch; error handling omitted):
// wait for the stream-creation event and read its stream property.
screen_stream_t pstream = NULL;
screen_event_t event;
screen_create_event(&event);
while (pstream == NULL) {
    int type = SCREEN_EVENT_NONE;
    screen_get_event(context, event, -1);
    screen_get_event_property_iv(event, SCREEN_PROPERTY_TYPE, &type);
    if (type == SCREEN_EVENT_CREATE) {
        screen_get_event_property_pv(event, SCREEN_PROPERTY_STREAM, (void **)&pstream);
    }
}
screen_destroy_event(event);

// Create a consumer stream to "look" at the Stream object
screen_buffer_t sbuffer = NULL;
screen_create_stream(&cstream, context);

const char *id = "mystreamexample";
screen_set_stream_property_cv(cstream, SCREEN_PROPERTY_ID_STRING, strlen(id), id);
    
const int usage = SCREEN_USAGE_NATIVE;
screen_set_stream_property_iv(cstream, SCREEN_PROPERTY_USAGE, &usage);

// It's possible that this code runs before the Sensor service creates the necessary buffers.
// If this code runs before the buffers are created, delay and try again later.
int rc;
while ((rc = screen_consume_stream_buffers(cstream, 0, pstream)) != 0) {
    printf("screen_consume_stream_buffers() ret %d, errno=%d\n", rc, errno);
    sleep(1);
}

// Your application should be connected to the Camera library's viewfinder stream now
int count;
uint64_t frameCount = 0;
struct timespec currentTime;
struct timespec previousTime;
// Seconds between framerate log messages (example value)
const int STREAM_FRAME_RATE_LOG_INTERVAL = 5;
// Preset the starting time
clock_gettime(CLOCK_MONOTONIC, &previousTime);
do {
    screen_buffer_t prev = sbuffer;
    count = 0;
    int *dirty = NULL;

    while (screen_acquire_buffer(&sbuffer, cstream, &count, &dirty, NULL, 0)) {
        printf("screen_acquire_buffer() failed, err=%d\n", errno);
    }

    if (prev && prev != sbuffer) {
        screen_release_buffer(prev);
    }

    if (count > 0 && dirty != NULL) {
        frameCount++;
        // Get the current time
        clock_gettime(CLOCK_MONOTONIC, &currentTime);
        if (currentTime.tv_sec >= (previousTime.tv_sec + STREAM_FRAME_RATE_LOG_INTERVAL)) {
            // Calculate the framerate since last time
            time_t intervalTime = (currentTime.tv_sec*1000 + currentTime.tv_nsec/1000000) -
                                  (previousTime.tv_sec*1000 + previousTime.tv_nsec/1000000);
            if (intervalTime) {
                printf("framerate %llu\n", (unsigned long long)(frameCount*1000/intervalTime));
            }
            memcpy(&previousTime, &currentTime, sizeof(struct timespec));
            frameCount = 0;
        }
    } else if (count > 0) {
        printf("acquired buffer count = %d\n", count);
    } else if (sbuffer) {
        printf("acquired buffer\n");
    }
    if (dirty) {
        free(dirty);
    }
} while (count);
...
...