
Examples


In active development

Contents

  1. Image capture
  2. Rapid image capture
  3. Raw image capture
  4. Timelapse mode
  5. Timeout mode
  6. Video recording
  7. Segmented video recording
  8. Quantization parameter
  9. Change encoding type
  10. Resizer component
  11. Splitter component
  12. Print pipeline
  13. Encode / Decode from FileStream - Image
  14. Static render overlay
  15. FFmpeg - RTMP streaming
  16. FFmpeg - Raw video convert
  17. FFmpeg - Images to video

Notes

If you want to change any of the default configuration settings, you can do so by modifying the static properties within the MMALCameraConfig class. The main class, MMALCamera, which provides an interface to the rest of the functionality the library offers, is a singleton and is accessed as follows: MMALCamera cam = MMALCamera.Instance.
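
For example, a minimal sketch using configuration properties referenced elsewhere on this page:

MMALCameraConfig.InlineHeaders = true; // e.g. required for segmented recording mode
MMALCamera cam = MMALCamera.Instance;  // singleton access
cam.ConfigureCameraSettings();         // apply the changed configuration to the camera component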

FFmpeg

For FFmpeg functionality, you will need to install the latest version of FFmpeg from source - do not install from the Raspbian repositories, as those builds lack H.264 support.

A guide to installing FFmpeg from source, including the H.264 codec, can be found here.

Camera warm-up

Note: The await Task.Delay(2000); is required to allow the camera sensor to "warm up". Due to the rolling shutter used in the Raspberry Pi camera modules, we need to wait a few seconds before valid image data can be used; otherwise your images will likely be under-exposed. Two seconds is a safe amount of time to wait, and the delay is only required after enabling the camera component, either on first run or after a manual disable.

Additionally, the call to ConfigureCameraSettings() is only required if you have made changes to the camera's configuration.
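
Expressed in code, these two notes translate to the setup pattern repeated throughout the examples below:

MMALCamera cam = MMALCamera.Instance;

cam.ConfigureCameraSettings(); // only required if MMALCameraConfig was changed
await Task.Delay(2000);        // camera warm-up; required after enabling the camera component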

PPM/TGA support

Support for these encoders was added in later firmware releases, so you will likely need to run sudo rpi-update for them to work. Please see this issue for reference.

Image capture

The examples below describe how to take a simple JPEG image, either by using the built-in helper method or in manual mode. Here we are using an Image Encoder component, which encodes the raw image data into JPEG format; you can change the encoding format to one of the following: JPEG, BMP, PNG, GIF. In addition, you can change the pixel format you would like to encode with - in the examples below we are using YUV420.

Helper mode

public async Task TakePictureHelper()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))        
    {            
        await cam.TakePicture(imgCaptureHandler, MMALEncoding.JPEG, MMALEncoding.I420);
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Manual mode

public async Task TakePictureManual()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();
        
        // Create our component pipeline.         
        imgEncoder.ConfigureOutputPort(0, MMALEncoding.JPEG, MMALEncoding.I420, 90);
                
        cam.Camera.StillPort.ConnectTo(imgEncoder);                    
        cam.Camera.PreviewPort.ConnectTo(nullSink);
        
        // Camera warm up time
        await Task.Delay(2000);        
        await cam.ProcessAsync(cam.Camera.StillPort);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Rapid image capture

By utilising the camera's video port, we are able to retrieve image frames at a much higher rate than via the conventional still port. Images captured via the video port are of lower quality and do not support EXIF metadata.

public async Task TakePictureFromVideoPort()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var splitter = new MMALSplitterComponent(null))
    using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler, continuousCapture: true))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();
        
        // Create our component pipeline.         
        imgEncoder.ConfigureOutputPort(0, MMALEncoding.JPEG, MMALEncoding.I420, 90);
                
        cam.Camera.VideoPort.ConnectTo(splitter);
        splitter.Outputs[0].ConnectTo(imgEncoder);                    
        cam.Camera.PreviewPort.ConnectTo(nullSink);
        
        // Camera warm up time
        await Task.Delay(2000);
                
        CancellationTokenSource cts = new CancellationTokenSource(TimeSpan.FromSeconds(15));
        
        // Process images for 15 seconds.        
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Raw image capture from camera

In this example we capture raw, unencoded image data directly from the camera sensor. You can change the pixel format of the raw data via the MMALCameraConfig.StillEncoding and MMALCameraConfig.StillSubFormat properties.
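
For example, a minimal sketch (using the property names above, and one of the pixel formats noted below as working) that switches the raw capture to RGB24:

MMALCameraConfig.StillEncoding = MMALEncoding.RGB24;
MMALCameraConfig.StillSubFormat = MMALEncoding.RGB24;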

Helper mode

public async Task TakeRawPictureHelper()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "raw"))        
    {            
        await cam.TakeRawPicture(imgCaptureHandler);
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Note: I422 encoding will prevent the native callback handler from being called, ultimately requiring a reboot of your Pi. I have tested I420, RGB24 and RGBA, which work as expected.

Timelapse mode

The timelapse mode example describes how to take an image every 10 seconds for 4 hours. You can change the frequency and duration of the timelapse mode by changing the various properties in the Timelapse object.

public async Task TakeTimelapsePicture()
{                        
    MMALCamera cam = MMALCamera.Instance;

    // This example will take an image every 10 seconds for 4 hours
    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    {
        var cts = new CancellationTokenSource(TimeSpan.FromHours(4));
        var tl = new Timelapse { Mode = TimelapseMode.Second, CancellationToken = cts.Token, Value = 10 };
        await cam.TakePictureTimelapse(imgCaptureHandler, MMALEncoding.JPEG, MMALEncoding.I420, tl);
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Timeout mode

The timeout mode example shows how to capture images continuously for a set duration. This is done via a helper method in the MMALCamera class. We pass in a CancellationToken which signals when image capturing should stop.

public async Task TakeTimeoutPicture()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    {
        var cts = new CancellationTokenSource(TimeSpan.FromHours(4));
        await cam.TakePictureTimeout(imgCaptureHandler, MMALEncoding.JPEG, MMALEncoding.I420, cts.Token);
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Video recording

The examples below show how to capture video using MMALSharp. For basic video recording, there is a built-in helper method which uses H.264 encoding. If you wish to use a different encoding type, or would like to customise additional parameters such as bitrate, you can do this manually.

Helper mode

// Self-contained method for recording H.264 video for a specified amount of time. Records at 30fps, 25Mb/s at the highest quality.
public async Task TakeVideoHelper()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "avi"))        
    {    
        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));
        // Take video for 3 minutes.
        await cam.TakeVideo(vidCaptureHandler, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Manual mode

public async Task TakeVideoManual()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "avi"))
    using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler))
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be taken at 25Mb/s.
        vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 0, 25000000);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);
                                              
        // Camera warm up time
        await Task.Delay(2000);
        
        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

        // Take video for 3 minutes.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Segmented recording mode

The segmented recording mode allows us to split video recording into multiple files. The user is able to specify the frequency at which the split occurs via the Split object.

Note: MMALCameraConfig.InlineHeaders must be set to true in order for this to work.

public async Task TakeSegmentedVideo()
{       
    // Required for segmented recording mode
    MMALCameraConfig.InlineHeaders = true;
                 
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "avi"))
    using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler, null, new Split { Mode = TimelapseMode.Second, Value = 30 }))
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be taken at 25Mb/s.
        vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 0, 25000000);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);
   
        // Camera warm up time
        await Task.Delay(2000);

        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(1));

        // Record video for 1 minute, using segmented video record to split into multiple files every 30 seconds.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Quantization parameter

The quantization parameter allows us to set a variable bitrate when recording with H.264 encoding. To enable this behaviour, set the bitrate parameter to 0 and set the quality parameter to a value between 1-10. Note: this only applies to H.264; MJPEG makes use of both the quality and bitrate values.

public async Task QuantizationParameterExample()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "avi"))
    using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler))
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. We make use of the quantization parameter (quality) to set a variable bitrate. The value 10 is the highest setting.
        vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 10);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);
                   
        // Camera warm up time
        await Task.Delay(2000);                 

        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));
                 
        // Take video for 3 minutes.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Change encoding type

Due to the way MMALSharp handles the lifecycle of each component, an encoder's output port can be reconfigured between captures to change its encoding type, as shown below. Leaving the scope of the encoder's using block frees its unmanaged resources, allowing a fresh instance to be created with a different encoding type.

public async Task ChangeImageEncodingType()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();

        // Camera warm up time
        await Task.Delay(2000);
        
        // Create our component pipeline.         
        imgEncoder.ConfigureOutputPort(0, MMALEncoding.JPEG, MMALEncoding.I420, 90);
                
        cam.Camera.StillPort.ConnectTo(imgEncoder);                    
        cam.Camera.PreviewPort.ConnectTo(nullSink);
                
        await cam.ProcessAsync(cam.Camera.StillPort);
        
        imgCaptureHandler.Extension = "bmp";
        imgEncoder.ConfigureOutputPort(0, MMALEncoding.BMP, MMALEncoding.RGB32, 90);
        await cam.ProcessAsync(cam.Camera.StillPort);
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

The same applies to video encoders too.

public async Task ChangeVideoEncodingType()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "avi"))
    using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler))
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();   

        // Camera warm up time
        await Task.Delay(2000);                   
        
        // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be taken at 25Mb/s.
        vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 0, 25000000);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);
                
        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));
               
        // Take video for 3 minutes.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
        
        vidCaptureHandler.Extension = "mjpeg";
        vidEncoder.ConfigureOutputPort(0, MMALEncoding.MJPEG, MMALEncoding.I420, 90, 25000000);
        cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));
        
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);        
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Resizer Component

The MMALResizerComponent can be connected to your pipeline to change the width/height and encoding type/pixel format of frames captured by the camera component. The resizer component is itself an MMALDownstreamHandlerComponent, meaning you can process data to a file directly from it without the need to connect an encoder; see the sketch after the example below.

public async Task ResizerComponentExample()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var resizer = new MMALResizerComponent(800, 600, null))
    using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();

        // Create our component pipeline.         
        resizer.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, cam.Camera.StillPort)
               .ConfigureOutputPort(0, MMALEncoding.I420, MMALEncoding.I420, 0);
        
        imgEncoder.ConfigureOutputPort(0, MMALEncoding.JPEG, MMALEncoding.I420, 90);
                
        cam.Camera.StillPort.ConnectTo(resizer);                    
        resizer.Outputs[0].ConnectTo(imgEncoder);
        cam.Camera.PreviewPort.ConnectTo(nullSink);
                          
        // Camera warm up time
        await Task.Delay(2000);

        await cam.ProcessAsync(cam.Camera.StillPort);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
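
As a hedged sketch of the encoder-less path mentioned above - assuming the resizer's third constructor argument (passed as null in the example above) accepts the capture handler - the resizer can write resized raw frames to a file itself:

public async Task ResizerWithoutEncoder()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var rawCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "raw"))
    using (var resizer = new MMALResizerComponent(640, 480, rawCaptureHandler))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();

        // No encoder: the resizer's raw I420 output, resized to 640x480, goes straight to the handler.
        resizer.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, cam.Camera.StillPort)
               .ConfigureOutputPort(0, MMALEncoding.I420, MMALEncoding.I420, 0);

        cam.Camera.StillPort.ConnectTo(resizer);
        cam.Camera.PreviewPort.ConnectTo(nullSink);

        // Camera warm up time
        await Task.Delay(2000);

        await cam.ProcessAsync(cam.Camera.StillPort);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}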

Splitter Component

The MMALSplitterComponent connects exclusively to the video port of the camera component. From here, the splitter provides 4 output ports, allowing you to further extend your pipeline and produce up to 4 file outputs at any given time.

public async Task SplitterComponentExample()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var handler = new VideoStreamCaptureHandler("/home/pi/video/", "avi"))
    using (var handler2 = new VideoStreamCaptureHandler("/home/pi/video/", "avi"))
    using (var handler3 = new VideoStreamCaptureHandler("/home/pi/video/", "avi"))
    using (var handler4 = new VideoStreamCaptureHandler("/home/pi/video/", "avi"))
    using (var splitter = new MMALSplitterComponent(null))
    using (var vidEncoder = new MMALVideoEncoder(handler, DateTime.Now.AddSeconds(10)))
    using (var vidEncoder2 = new MMALVideoEncoder(handler2, DateTime.Now.AddSeconds(15)))
    using (var vidEncoder3 = new MMALVideoEncoder(handler3, DateTime.Now.AddSeconds(10)))
    using (var vidEncoder4 = new MMALVideoEncoder(handler4, DateTime.Now.AddSeconds(10)))
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        // Create our component pipeline.         
        splitter.ConfigureInputPort(MMALEncoding.I420, MMALEncoding.I420, cam.Camera.VideoPort)        
                .ConfigureOutputPort(0, MMALEncoding.OPAQUE, MMALEncoding.I420, 0)
                .ConfigureOutputPort(1, MMALEncoding.OPAQUE, MMALEncoding.I420, 0)
                .ConfigureOutputPort(2, MMALEncoding.OPAQUE, MMALEncoding.I420, 0)
                .ConfigureOutputPort(3, MMALEncoding.OPAQUE, MMALEncoding.I420, 0);

        vidEncoder.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, splitter.Outputs[0])
                  .ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 10, 25000000);

        vidEncoder2.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, splitter.Outputs[1])
                   .ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 20, 25000000);
        
        vidEncoder3.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, splitter.Outputs[2])
                   .ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 30, 25000000);

        vidEncoder4.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, splitter.Outputs[3])
                   .ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 40, 25000000);

        cam.Camera.VideoPort.ConnectTo(splitter);

        splitter.Outputs[0].ConnectTo(vidEncoder);
        splitter.Outputs[1].ConnectTo(vidEncoder2);
        splitter.Outputs[2].ConnectTo(vidEncoder3);
        splitter.Outputs[3].ConnectTo(vidEncoder4);

        cam.Camera.PreviewPort.ConnectTo(renderer);

        // Camera warm up time
        await Task.Delay(2000);

        await cam.ProcessAsync(cam.Camera.VideoPort);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

You can also use the splitter component to record video and capture images at the same time:

public async Task VideoAndImages()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "avi"))
    using (var splitter = new MMALSplitterComponent(null))
    using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler, continuousCapture: true))
    using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();
        
        // Create our component pipeline.         
        imgEncoder.ConfigureOutputPort(0, MMALEncoding.JPEG, MMALEncoding.I420, 90);
        vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 0, 25000000);
                
        cam.Camera.VideoPort.ConnectTo(splitter);
        splitter.Outputs[0].ConnectTo(imgEncoder);
        splitter.Outputs[1].ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(nullSink);
        
        // Camera warm up time
        await Task.Delay(2000);
                
        CancellationTokenSource cts = new CancellationTokenSource(TimeSpan.FromSeconds(15));
        
        // Process for 15 seconds.        
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Print pipeline

Version 0.3 brings the ability to print out the current component pipeline you have configured - this can be useful when using many components and encoders (such as the splitter).

Calling the PrintPipeline() method on the MMALCamera instance will print your current pipeline to the console window.

public async Task PrintComponentPipeline()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();

        // Create our component pipeline.         
        imgEncoder.ConfigureOutputPort(0, MMALEncoding.JPEG, MMALEncoding.I420, 90);
                
        cam.Camera.StillPort.ConnectTo(imgEncoder);                    
        cam.Camera.PreviewPort.ConnectTo(nullSink);
        
        cam.PrintPipeline();

        // Camera warm up time
        await Task.Delay(2000);  
                
        await cam.ProcessAsync(cam.Camera.StillPort);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Encode / Decode from FileStream - Image

MMALSharp provides the ability to encode/decode images fed from FileStreams. It supports the GIF, BMP, JPEG and PNG file formats, and decoding must target one of the following pixel formats:

  • JPEG -> YUV420/422 (I420/422)
  • GIF -> RGB565 (RGB16)
  • BMP/PNG -> RGBA

Encode

public async Task EncodeFromFilestream()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var stream = File.OpenRead("/home/pi/raw_jpeg_decode.raw"))
    using (var imgCaptureHandler = new TransformStreamCaptureHandler(stream, "/home/pi/images/", "raw"))                
    using (var imgEncoder = new MMALImageFileEncoder(imgCaptureHandler))
    {
        // Create our component pipeline.
        imgEncoder.ConfigureInputPort(MMALEncoding.I420, null, 2592, 1944)
                  .ConfigureOutputPort(MMALEncoding.BMP, MMALEncoding.I420, 90, zeroCopy: true);

        await imgEncoder.Convert();
        Console.WriteLine("Finished");
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Decode

public async Task DecodeFromFilestream()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var stream = File.OpenRead("/home/pi/test.jpg"))
    using (var imgCaptureHandler = new TransformStreamCaptureHandler(stream, "/home/pi/images/", "raw"))                
    using (var imgDecoder = new MMALImageFileDecoder(imgCaptureHandler))
    {
        // Create our component pipeline.
        imgDecoder.ConfigureInputPort(MMALEncoding.JPEG, null)
                  .ConfigureOutputPort(MMALEncoding.I420, null, 90, zeroCopy: true);

        await imgDecoder.Convert();
        Console.WriteLine("Finished");
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Static render overlay

MMAL allows you to create additional video preview renderers which sit alongside the usual Null Sink or Video renderers shown in previous examples. These additional renderers allow you to overlay static content onto the display your Pi is connected to.

The overlay renderers only work with unencoded images, which must use one of the following pixel formats:

  • YUV420 (I420)
  • RGB888 (RGB24)
  • RGBA
  • BGR888 (BGR24)
  • BGRA

An easy way to get an unencoded image for use with the overlay renderers is to use the raw image capture functionality as described in this example, setting the MMALCameraConfig.StillEncoding and MMALCameraConfig.StillSubFormat properties to one of the accepted pixel formats. Once you have your test frame, follow the example below to overlay your image:

public async Task StaticOverlayExample()
{                        
    MMALCamera cam = MMALCamera.Instance;
    
    PreviewConfiguration previewConfig = new PreviewConfiguration
    {
        FullScreen = false,
        PreviewWindow = new Rectangle(160, 0, 640, 480),
        Layer = 2,
        Opacity = 1
    };

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
    using (var video = new MMALVideoRenderer(previewConfig))
    {                    
        cam.ConfigureCameraSettings();
        video.ConfigureRenderer();
                
        PreviewOverlayConfiguration overlayConfig = new PreviewOverlayConfiguration
        {
            FullScreen = true,
            PreviewWindow = new Rectangle(50, 0, 640, 480),
            Layer = 1,
            Resolution = new Resolution(640, 480),
            Encoding = MMALEncoding.I420,
            Opacity = 255
        };
                
        var overlay = cam.AddOverlay(video, overlayConfig, File.ReadAllBytes("/home/pi/test1.raw"));
        overlay.ConfigureRenderer();
        overlay.UpdateOverlay();
             
        //Create our component pipeline.  
        imgEncoder.ConfigureOutputPort(0, MMALEncoding.JPEG, MMALEncoding.I420, 90);
                
        cam.Camera.StillPort.ConnectTo(imgEncoder);
        cam.Camera.PreviewPort.ConnectTo(video);
                
        cam.PrintPipeline();
                
        await cam.ProcessAsync(cam.Camera.StillPort);        
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

In this example, we are using an unencoded YUV420 image and configuring the renderer using the settings in overlayConfig.

FFmpeg - RTMP streaming

public async Task FFmpegRTMPStreaming()
{                        
    MMALCamera cam = MMALCamera.Instance;

    // An RTMP server needs to be listening on the address specified in the capture handler. I have used the Nginx RTMP module for testing.    
    using (var ffCaptureHandler = FFmpegCaptureHandler.RTMPStreamer("mystream", "rtmp://192.168.1.91:6767/live"))
    using (var vidEncoder = new MMALVideoEncoder(ffCaptureHandler))
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings(); 

        // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be taken at 25Mb/s.
        vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 0, 25000000);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);
                                
        // Camera warm up time
        await Task.Delay(2000);

        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));
          
        // Take video for 3 minutes.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Note:

If you intend to use the YouTube live streaming service, you will need to create the method below to return your own FFmpegCaptureHandler, and use it in place of the built-in FFmpegCaptureHandler.RTMPStreamer seen in the example above. This is because YouTube requires your RTMP stream to contain an audio input, otherwise it won't work. Internally, our RTMP streaming method does not include an audio stream, and at the current time we don't intend to change it for this specific purpose.

public static FFmpegCaptureHandler RTMPStreamerWithAudio(string streamName, string streamUrl)
    => new FFmpegCaptureHandler($"-re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero -f h264 -i - -vcodec copy -acodec aac -ab 128k -g 50 -strict experimental -f flv -metadata streamName={streamName} {streamUrl}");
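
A usage sketch - the ingest URL and stream key below are placeholders, so substitute the values from your own YouTube dashboard:

using (var ffCaptureHandler = RTMPStreamerWithAudio("mystream", "rtmp://a.rtmp.youtube.com/live2/YOUR-STREAM-KEY"))
using (var vidEncoder = new MMALVideoEncoder(ffCaptureHandler))
using (var renderer = new MMALVideoRenderer())
{
    // Configure and connect the pipeline exactly as in the RTMP streaming example above.
}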

Please see here for an in-depth discussion of the issue.

FFmpeg - Raw video convert

This is a useful capture mode, as it pushes the elementary H.264 stream into an AVI container which can be opened by media players such as VLC.

public async Task FFmpegRawVideoConvert()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var ffCaptureHandler = FFmpegCaptureHandler.RawVideoToAvi("/home/pi/videos/", "testing1234"))
    using (var vidEncoder = new MMALVideoEncoder(ffCaptureHandler))
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings(); 

        vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 0, 25000000);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);
                          
        // Camera warm up time
        await Task.Delay(2000);

        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));
                
        // Take video for 3 minutes.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

FFmpeg - Images to video

This example will push all images processed by an image capture handler into a playable video.

public async Task FFmpegImagesToVideo()
{                        
    MMALCamera cam = MMALCamera.Instance;
    
    // This example will take an image every 10 seconds for 4 hours
    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    {
        var cts = new CancellationTokenSource(TimeSpan.FromHours(4));

        var tl = new Timelapse { Mode = TimelapseMode.Second, CancellationToken = cts.Token, Value = 10 };
        await cam.TakePictureTimelapse(imgCaptureHandler, MMALEncoding.JPEG, MMALEncoding.I420, tl);

        // Process all images captured into a video at 2fps.
        imgCaptureHandler.ImagesToVideo("/home/pi/images/", 2);
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}