Publishing Point with User Push Source 2
This use-case demonstrates how to generate an encoded video stream from dynamically updated bitmap images using ImageSource, combined with pre-encoded AAC audio via VirtualNetworkSource. The two sources are merged using AggregatedNetworkSource.
Overview
The pushingRoutine2() method runs in a background task and performs the following:
- Loads a background image and creates a SkiaSharp canvas for dynamic rendering
- Creates an ImageSource that encodes bitmaps into H.264 video
- Creates a VirtualNetworkSource for pre-encoded AAC audio
- Combines both sources into an AggregatedNetworkSource
- Creates a publishing point and runs a loop that updates the image and pushes audio at real-time pace
Difference from Push Source 1
Push Source 1 pushes already-compressed H.264 data. This use-case pushes uncompressed bitmap images — the ImageSource handles the encoding to H.264 internally.
Setting Up the Canvas
The sample uses SkiaSharp for bitmap rendering:
```csharp
SKBitmap bmpBackground = null;
SKBitmap bmpImage = null;
SKCanvas canvas = null;

// Decode the embedded logo resource into a background bitmap
using (var ms = new System.IO.MemoryStream(Properties.Resources.logo))
{
    using (SKImage img = SKImage.FromEncodedData(ms))
    {
        bmpBackground = SKBitmap.FromImage(img);
    }
}

// 1280x720 BGRA bitmap that is re-drawn on every video frame
bmpImage = new SKBitmap(1280, 720, SKColorType.Bgra8888, SKAlphaType.Premul);
canvas = new SKCanvas(bmpImage);

SKTypeface tf = SKTypeface.FromFamilyName("Arial", SKFontStyle.Bold);
SKFont font = new SKFont { Size = 48, Typeface = tf };
SKPaint paintFont = new SKPaint { Color = SKColors.Purple, Style = SKPaintStyle.Fill };
SKPaint paintFill = new SKPaint { Color = SKColors.White, Style = SKPaintStyle.Fill };

// Scale the background image onto the full 1280x720 canvas
canvas.DrawBitmap(bmpBackground, new SKRect(0, 0, 1280, 720));
```
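Not shown above: SkiaSharp objects wrap native resources and should be released explicitly. A minimal cleanup sketch for when the routine exits (the exact placement in the sample may differ):

```csharp
// Dispose SkiaSharp objects when the push routine finishes;
// each of these types implements IDisposable.
canvas.Dispose();
bmpImage.Dispose();
bmpBackground.Dispose();
font.Dispose();
tf.Dispose();
paintFont.Dispose();
paintFill.Dispose();
```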
Creating the Video Source
```csharp
// true = push mode: frames are produced when user code calls SetImage()
VAST.Image.ImageSource videoSource = new VAST.Image.ImageSource(true);
videoSource.SetImage(bmpImage);
videoSource.Open();

// H.264 encoding parameters for the video output
VAST.Common.MediaType videoMediaType = new VAST.Common.MediaType
{
    ContentType = VAST.Common.ContentType.Video,
    CodecId = VAST.Common.Codec.H264,
    Bitrate = 1000000,
    Width = 1280,
    Height = 720,
    Framerate = new VAST.Common.Rational(30),
};
videoSource.SetDesiredOutputType(0, videoMediaType);
```
The true parameter in the ImageSource constructor indicates that the source is used in push mode — the image will be updated by user code via SetImage() calls rather than generated automatically. The SetDesiredOutputType call configures the H.264 encoding parameters.
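The sample uses a whole-number frame rate, but fractional rates can be expressed as well. A hedged sketch, assuming VAST.Common.Rational also offers a numerator/denominator constructor (not shown in the sample):

```csharp
// Assumption: Rational(numerator, denominator) constructor exists.
// 30000/1001 ≈ 29.97 fps (NTSC-style frame rate)
videoMediaType.Framerate = new VAST.Common.Rational(30000, 1001);
```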
Creating the Audio Source
Audio is handled separately using a VirtualNetworkSource, the same approach as in Push Source 1:
```csharp
VAST.Network.VirtualNetworkSource audioSource = new VAST.Network.VirtualNetworkSource();

VAST.Common.MediaType audioMediaType = new VAST.Common.MediaType
{
    ContentType = VAST.Common.ContentType.Audio,
    CodecId = VAST.Common.Codec.AAC,
    SampleRate = 44100,
    Channels = 2,
    Bitrate = 128000,
};

// Generate the AAC codec private data required by downstream protocols
VAST.Codecs.AAC.ConfigurationParser.GenerateCodecPrivateData(audioMediaType);
int audioStreamIndex = audioSource.AddStream(audioMediaType);
```
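The push loop further below reads ADTS frames from an in-memory buffer. A minimal setup sketch, assuming the audio is loaded from a file (the path and variable initialization here are illustrative, not from the sample):

```csharp
// Hypothetical setup for the push loop; "audio.aac" is a placeholder path.
byte[] audioBuffer = System.IO.File.ReadAllBytes("audio.aac");
int audioBitstreamPosition = 0;   // byte offset of the next ADTS frame
long audioFileTime = 0;           // media time of the next frame, in 100 ns ticks
long audioFrameCount = 0;
```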
Combining Sources with AggregatedNetworkSource
Since video and audio come from different source objects, they are combined using AggregatedNetworkSource:
```csharp
VAST.Network.AggregatedNetworkSource source = new VAST.Network.AggregatedNetworkSource();
source.AddSource(videoSource);
source.AddSource(audioSource);

this.server.CreatePublishingPoint("builtin", source);
```
AggregatedNetworkSource merges multiple IMediaSource instances into a single source, combining their streams. The server sees a unified source with both video and audio streams.
Push Loop
The loop updates the bitmap and pushes audio frames at real-time pace:
Updating the Image
```csharp
if (videoFileTime <= playbackTime)
{
    // Erase the previous timestamp, then draw the current time
    canvas.DrawRect(250, 550 - font.Spacing, 780, font.Spacing, paintFill);
    canvas.DrawText(string.Format("{0:yyyy-MM-dd HH:mm:ss.fff}", DateTime.Now), 250, 550, font, paintFont);
    canvas.Flush();

    // Hand the updated bitmap to the ImageSource, which encodes and pushes it
    videoSource.SetImage(bmpImage);

    videoFrameCount++;
    // Media time of the next frame, in 100 ns ticks
    videoFileTime = videoFrameCount * 10000000L * videoMediaType.Framerate.Den / videoMediaType.Framerate.Num;
}
```
On each video frame interval, the sample draws the current time over the background image and calls SetImage() to trigger encoding. The ImageSource encodes the updated bitmap into an H.264 frame and pushes it downstream automatically — no manual PushMedia call is needed for video.
Pushing Audio
```csharp
if (audioFileTime <= playbackTime)
{
    // Current read offset into the ADTS bitstream
    int pos = audioBitstreamPosition;

    // 13-bit frame_length field of the ADTS header (spans bytes 3-5)
    int frameSize = ((audioBuffer[pos + 3] & 0x03) << 11)
        | (audioBuffer[pos + 4] << 3)
        | ((audioBuffer[pos + 5] & 0xE0) >> 5);

    // Push the raw AAC payload, skipping the 7-byte ADTS header
    audioSource.PushMedia(audioStreamIndex, audioBuffer, pos + 7, frameSize - 7,
        audioFileTime + streamingStarted.Ticks, long.MinValue,
        VAST.Common.VersatileBufferFlag.AbsoluteTimestamp);

    audioBitstreamPosition += frameSize;
    audioFrameCount++;
    // Each AAC frame carries 1024 samples
    audioFileTime = audioFrameCount * 10000000L * 1024 / audioMediaType.SampleRate;
}
```
Audio is pushed the same way as in Push Source 1 — ADTS frames are extracted and pushed with absolute timestamps.
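The frame-size arithmetic can be factored into a small helper that also validates the ADTS syncword. This is an illustrative refactoring, not part of the sample:

```csharp
// Hypothetical helper encapsulating the ADTS parsing shown above.
// Returns the total frame length (header + payload) at the given offset,
// or -1 if the buffer is exhausted or no valid syncword is present.
static int AdtsFrameLength(byte[] buf, int pos)
{
    if (pos + 7 > buf.Length) return -1;
    // 12-bit syncword: 0xFFF
    if (buf[pos] != 0xFF || (buf[pos + 1] & 0xF0) != 0xF0) return -1;
    // 13-bit frame_length spans bytes 3-5
    return ((buf[pos + 3] & 0x03) << 11)
         | (buf[pos + 4] << 3)
         | ((buf[pos + 5] & 0xE0) >> 5);
}
```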
Pacing
To avoid busy-waiting, the loop sleeps briefly between iterations:
```csharp
Task.Delay(10).Wait();
```
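The playbackTime value that both branches compare against represents the elapsed wall-clock time since streaming started. A sketch of how it can be computed on each iteration, assuming streamingStarted was captured with DateTime.UtcNow when the loop began:

```csharp
// Elapsed real time in 100 ns ticks since the push loop started.
long playbackTime = DateTime.UtcNow.Ticks - streamingStarted.Ticks;
```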
Platform Support
This use-case requires the VAST.Image library (VAST_FEATURE_IMAGE compilation symbol) and is supported on Windows, macOS, and Linux.
See Also
- Sample Applications — overview of all demo projects
- .NET Server Demo — parent page with setup instructions, license key configuration, and access URL reference
- Multi-Protocol Server — overview and full publishing point table
- Initialization — server creation and protocol configuration
- Server Event Handlers — authorization, connections, and error handling
- User Push Source 1 — push pre-encoded H.264/AAC data
- VAST.Network Library — StreamingServer API reference