Publishing Point with User Push Source 3

This use-case demonstrates how to manually encode raw pixel data into H.264 video using EncoderFactory and IEncoder, then push the encoded frames via VirtualNetworkSource. This approach gives full control over the encoding pipeline, including the choice of encoding backend and hardware acceleration.

Overview

The pushingRoutine3() method runs in a background task and performs the following:

  1. Loads a background image and creates a SkiaSharp canvas for dynamic rendering
  2. Defines uncompressed (input) and encoded (output) media types
  3. Creates an encoder via EncoderFactory with configurable parameters
  4. Creates a VirtualNetworkSource using the encoder's actual output media type
  5. Runs a loop that renders frames, encodes them manually, and pushes encoded data to the source
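Put together, the routine has roughly this shape (an illustrative outline only; the body of each step is shown in the sections that follow, and the loop condition is an assumption):

```csharp
// Sketch of pushingRoutine3(): each numbered step above maps to one region.
private void pushingRoutine3()
{
    // 1. Canvas setup: background bitmap, 1280x720 render target, SKCanvas.
    // 2. Media types: uncompressed BGRA input, H.264 output.
    // 3. Encoder: VAST.Media.EncoderFactory.Create(...).
    // 4. Source: VirtualNetworkSource + AddStream(encoder.OutputMediaType).
    while (true) // 5. Render, encode, and push one frame per iteration.
    {
        // render -> LockBuffer/Append -> encoder.Write -> encoder.Read -> source.PushMedia
        Task.Delay(10).Wait();
    }
}
```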

Difference from Push Source 2

Push Source 2 uses ImageSource which handles encoding internally. This use-case performs encoding explicitly — user code creates the encoder, feeds raw pixel data into it, and retrieves encoded frames. This allows choosing the encoding backend (built-in or FFmpeg), enabling hardware acceleration, and fine-tuning encoder parameters.

Direct encoding is far more efficient and less resource-hungry than pushing bitmaps into ImageSource, because the encoder receives raw pixel data directly without the overhead of the image processing pipeline.

Setting Up the Canvas

The sample uses SkiaSharp for bitmap rendering, identical to Push Source 2:

SKBitmap bmpBackground = null;
SKBitmap bmpImage = null;
SKCanvas canvas = null;

// Load the background image from embedded resources.
using (var ms = new System.IO.MemoryStream(Properties.Resources.logo))
{
    using (SKImage img = SKImage.FromEncodedData(ms))
    {
        bmpBackground = SKBitmap.FromImage(img);
    }
}

// Create the 1280x720 BGRA render target and a canvas that draws into it.
bmpImage = new SKBitmap(1280, 720, SKColorType.Bgra8888, SKAlphaType.Premul);
canvas = new SKCanvas(bmpImage);

// Typeface, font, and paints used for per-frame text rendering.
SKTypeface tf = SKTypeface.FromFamilyName("Arial", SKFontStyle.Bold);
SKFont font = new SKFont { Size = 48, Typeface = tf };
SKPaint paintFont = new SKPaint { Color = SKColors.Purple, Style = SKPaintStyle.Fill };
SKPaint paintFill = new SKPaint { Color = SKColors.White, Style = SKPaintStyle.Fill };

// Draw the background once; the loop redraws dynamic content on top of it.
canvas.DrawBitmap(bmpBackground, new SKRect(0, 0, 1280, 720));
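Inside the push loop, the canvas can be redrawn before each frame is encoded. A minimal sketch using the paints created above (the text, layout, and the frameIndex counter are illustrative assumptions, not part of the sample):

```csharp
// Illustrative per-frame rendering: restore the background region,
// then draw a frame counter on top of it.
canvas.DrawBitmap(bmpBackground, new SKRect(0, 0, 1280, 720));
canvas.DrawRect(new SKRect(40, 40, 560, 110), paintFill);
canvas.DrawText($"Frame {frameIndex}", 50, 90, font, paintFont);
canvas.Flush();
```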

Defining Media Types

Two media types are needed: one describing the uncompressed input and one describing the desired encoded output.

Uncompressed Input

VAST.Common.MediaType uncompressedVideoMediaType = new VAST.Common.MediaType
{
    ContentType = VAST.Common.ContentType.Video,
    CodecId = VAST.Common.Codec.Uncompressed,
    PixelFormat = VAST.Common.PixelFormat.BGRA,
    Width = 1280,
    Height = 720,
    Framerate = new VAST.Common.Rational(30),
};

The PixelFormat must match the bitmap format. SkiaSharp's Bgra8888 corresponds to PixelFormat.BGRA.
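Since the two formats must agree, a defensive check can make the assumption explicit (a sketch; the exception message is illustrative):

```csharp
// Guard: the render target must be BGRA to match the declared PixelFormat.BGRA.
if (bmpImage.ColorType != SKColorType.Bgra8888)
{
    throw new InvalidOperationException(
        "Bitmap must be Bgra8888 to match the declared PixelFormat.BGRA input.");
}
```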

Encoded Output

VAST.Common.MediaType encodedVideoMediaType = new VAST.Common.MediaType
{
    ContentType = VAST.Common.ContentType.Video,
    CodecId = VAST.Common.Codec.H264,
    Bitrate = 4000000,
    Width = 1280,
    Height = 720,
    Framerate = new VAST.Common.Rational(30),
};

Creating the Encoder

VAST.Media.EncoderParameters encoderParameters = new VAST.Media.EncoderParameters
{
    PreferredMediaFramework = VAST.Common.MediaFramework.Builtin,
    AllowHardwareAcceleration = true,
};

VAST.Media.IEncoder encoder = VAST.Media.EncoderFactory.Create(
    uncompressedVideoMediaType, encodedVideoMediaType, encoderParameters);

EncoderFactory selects the best available encoder based on the requested codec, media framework preference, and hardware acceleration setting.

EncoderParameters

  PreferredMediaFramework (MediaFramework)
      Builtin selects the built-in encoder; FFmpeg selects the FFmpeg-based encoder.
  AllowHardwareAcceleration (bool)
      true to prefer GPU-accelerated encoding (e.g., NVENC). When false with FFmpeg, the software x264 encoder is used.
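For example, to force software encoding through the FFmpeg backend instead of the built-in encoder, the same factory call can be made with different parameters (a sketch; variable names are illustrative):

```csharp
// Prefer the FFmpeg backend and disable hardware acceleration,
// which selects the software x264 encoder.
var ffmpegParameters = new VAST.Media.EncoderParameters
{
    PreferredMediaFramework = VAST.Common.MediaFramework.FFmpeg,
    AllowHardwareAcceleration = false,
};

VAST.Media.IEncoder softwareEncoder = VAST.Media.EncoderFactory.Create(
    uncompressedVideoMediaType, encodedVideoMediaType, ffmpegParameters);
```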

Retrieving the Actual Output Type

encodedVideoMediaType = encoder.OutputMediaType;

After creation, the encoder's OutputMediaType contains the finalized media type including codec private data (SPS/PPS for H.264). This must be used when registering the stream with the source — it ensures clients receive the correct codec configuration.

Creating the Source

VAST.Network.VirtualNetworkSource source = new VAST.Network.VirtualNetworkSource();
int videoStreamIndex = source.AddStream(encodedVideoMediaType);

this.server.CreatePublishingPoint("builtin", source);

The stream is registered with the encoder's actual output media type, not the originally requested one.

Encode and Push Loop

The loop renders frames, encodes them, and pushes encoded data at real-time pace.

Preparing Pixel Data

// Copy the rendered frame out of the SkiaSharp bitmap into a pooled buffer.
IntPtr ptr = bmpImage.GetPixels();
int imageSize = bmpImage.Info.RowBytes * bmpImage.Info.Height;
VAST.Common.VersatileBuffer pixelData = VAST.Media.MediaGlobal.LockBuffer(imageSize);
pixelData.Append(ptr, imageSize);

// Stamp the buffer with the timestamp of the current frame.
pixelData.Dts = videoFileTime;
pixelData.Pts = videoFileTime;

Raw pixel data is copied from the SkiaSharp bitmap into a VersatileBuffer obtained from the buffer pool via MediaGlobal.LockBuffer(). The buffer's Dts and Pts are set to the calculated frame timestamp.
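The computation of videoFileTime is not shown in the sample. One plausible sketch, assuming a frame counter and 100-ns (FILETIME-style) timestamp units (both are assumptions, not confirmed by the sample):

```csharp
// Hypothetical timestamp derivation: frameIndex is an assumed loop counter,
// and 10,000,000 ticks per second (100-ns units) is an assumed time base.
long frameIndex = 0;

// ...inside the loop, before filling the buffer:
long videoFileTime = frameIndex * 10_000_000L / 30; // 30 fps
frameIndex++;
```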

Encoding

// Feed the raw frame to the encoder; the input buffer can be released immediately.
encoder.Write(pixelData);
pixelData.Release();

// Drain every encoded frame the encoder has ready. A single write may
// yield zero, one, or several frames depending on internal buffering.
while (true)
{
    VAST.Common.VersatileBuffer encodedData = encoder.Read();
    if (encodedData == null)
    {
        break;
    }

    encodedData.StreamIndex = videoStreamIndex;
    source.PushMedia(encodedData);
    encodedData.Release();
}

The encoding pipeline is a write/read pattern:

  1. Write — feed uncompressed pixel data into the encoder via encoder.Write(). Release the input buffer immediately after writing.
  2. Read — call encoder.Read() in a loop to retrieve encoded frames. The encoder may buffer frames internally (e.g., for B-frame reordering), so a single write may produce zero or multiple encoded frames.
  3. Push — set the StreamIndex on each encoded buffer and push it to the source via PushMedia(). Release the buffer after pushing.

Pacing

A short delay at the end of each iteration keeps the loop from spinning at full CPU speed while frames are produced at roughly real-time pace:

Task.Delay(10).Wait();

Platform Support

This use-case requires the VAST.Codec library when MediaFramework.FFmpeg or MediaFramework.CUDA is selected. With MediaFramework.FFmpeg, the FFmpeg libraries must also be available on the system. Hardware acceleration requires a compatible GPU (e.g., NVIDIA with NVENC support), and MediaFramework.CUDA requires an NVIDIA GPU to be present in the system.

See Also