Publishing Point with User Push Source 1
This use case demonstrates how to push pre-encoded media data (H.264 video, AAC audio, and text metadata) into the server using VirtualNetworkSource. This is the most flexible push approach: user code has full control over the compressed bitstream and the timestamps.
Overview
The pushingRoutine1() method runs in a background task and performs the following:
- Creates a VirtualNetworkSource
- Defines video, audio, and metadata streams with their media types
- Creates a publishing point with the source
- Runs a loop that reads pre-encoded frames from embedded resources and pushes them at the correct pace
Creating the Source
VAST.Network.VirtualNetworkSource source = new VAST.Network.VirtualNetworkSource();
VirtualNetworkSource is a virtual media source that accepts pre-encoded media samples pushed by user code. It implements IMediaSource and can be used anywhere a media source is expected.
Defining Streams
Each stream must be registered with the source before creating the publishing point. The AddStream method returns the stream index used for pushing samples.
Video Stream
byte[] videoBuffer = new byte[Properties.Resources.h264_demo.Length];
Properties.Resources.h264_demo.CopyTo(videoBuffer, 0);
VAST.Codecs.H264.ConfigurationParser h264Parser = new VAST.Codecs.H264.ConfigurationParser();
h264Parser.Parse(videoBuffer);
VAST.Common.MediaType videoMediaType = new VAST.Common.MediaType
{
    ContentType = VAST.Common.ContentType.Video,
    CodecId = VAST.Common.Codec.H264,
    Bitrate = 800000,
    Width = h264Parser.Width,
    Height = h264Parser.Height,
    Framerate = h264Parser.FrameRate,
    PixelAspectRatio = h264Parser.AspectRatio,
};
VAST.Codecs.H264.ConfigurationParser.GenerateCodecPrivateData(videoMediaType, videoBuffer);
int videoStreamIndex = source.AddStream(videoMediaType);
The ConfigurationParser extracts resolution, framerate, and aspect ratio from the H.264 bitstream. GenerateCodecPrivateData fills the codec-specific configuration (SPS/PPS) into the media type, which is required for proper stream initialization.
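The source of GenerateCodecPrivateData is not shown, but filling SPS/PPS requires locating those NAL units in the Annex B bitstream. As a rough illustration of how such a scan works (illustrative Python, not the VAST implementation, and ignoring the emulation-prevention bytes a real parser must also unescape):

```python
def nal_unit_types(stream: bytes):
    """Yield the NAL unit type following each Annex B start code (00 00 01)."""
    i = 0
    while i <= len(stream) - 4:
        if stream[i:i+3] == b"\x00\x00\x01":
            yield stream[i+3] & 0x1F  # low 5 bits of the NAL header byte
            i += 4
        else:
            i += 1

# An SPS (type 7) followed by a PPS (type 8), each behind a 4-byte start code
stream = b"\x00\x00\x00\x01\x67\x64\x00\x1f" + b"\x00\x00\x00\x01\x68\xee\x3c\x80"
types = list(nal_unit_types(stream))  # [7, 8]
```

The 3-byte comparison also matches the tail of a 4-byte start code, so both prefix forms are handled by the same test.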
Audio Stream
VAST.Common.MediaType audioMediaType = new VAST.Common.MediaType
{
    ContentType = VAST.Common.ContentType.Audio,
    CodecId = VAST.Common.Codec.AAC,
    SampleRate = 44100,
    Channels = 2,
    Bitrate = 128000,
};
VAST.Codecs.AAC.ConfigurationParser.GenerateCodecPrivateData(audioMediaType);
int audioStreamIndex = source.AddStream(audioMediaType);
GenerateCodecPrivateData creates the AAC AudioSpecificConfig from the media type parameters.
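The AudioSpecificConfig is a small bit-packed structure defined by MPEG-4 Audio. A sketch of how the two-byte AAC-LC config could be assembled from the same sample rate and channel count (illustrative Python, not the VAST implementation):

```python
# Sampling-frequency index table from ISO/IEC 14496-3
FREQ_INDEX = {96000: 0, 88200: 1, 64000: 2, 48000: 3, 44100: 4,
              32000: 5, 24000: 6, 22050: 7, 16000: 8, 12000: 9,
              11025: 10, 8000: 11, 7350: 12}

def audio_specific_config(sample_rate: int, channels: int) -> bytes:
    """Two-byte AudioSpecificConfig for AAC-LC (audioObjectType = 2)."""
    object_type = 2                      # AAC-LC
    freq_index = FREQ_INDEX[sample_rate]
    # 5 bits objectType | 4 bits frequency index | 4 bits channel config | 3 pad bits
    bits = (object_type << 11) | (freq_index << 7) | (channels << 3)
    return bytes([bits >> 8, bits & 0xFF])

asc = audio_specific_config(44100, 2)  # b'\x12\x10'
```

0x12 0x10 is the canonical config for 44.1 kHz stereo AAC-LC, which matches the media type defined above.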
Metadata Stream
VAST.Common.MediaType metadataMediaType = new VAST.Common.MediaType
{
    ContentType = VAST.Common.ContentType.Text,
    CodecId = VAST.Common.Codec.Text,
    Bitrate = 23 * 10 * 8 // 23 bytes per packet, 10 packets per second
};
int metadataStreamIndex = source.AddStream(metadataMediaType);
The metadata stream carries arbitrary text data. In this sample, it sends the current timestamp as a UTF-8 string 10 times per second.
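The bitrate arithmetic checks out: the formatted timestamp is exactly 23 single-byte UTF-8 characters, and at 10 packets per second that gives 1840 bits/s. A quick check using Python's equivalent of the .NET "yyyy-MM-dd HH:mm:ss.fff" format string:

```python
from datetime import datetime

# "%Y-%m-%d %H:%M:%S.%f" yields microseconds; trimming 3 digits gives milliseconds
stamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S.%f")[:-3]
packet_bytes = len(stamp.encode("utf-8"))  # 23
bitrate = packet_bytes * 10 * 8            # 10 packets/s -> 1840 bits/s
```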
Creating the Publishing Point
this.server.CreatePublishingPoint("builtin", source);
The server takes ownership of the source. Connected clients can access it at rtsp://server/builtin, rtmp://server/live/builtin, etc.
Pushing Media Samples
The push loop runs on a background task and uses real-time pacing to deliver samples at the correct rate.
Timestamp Calculation
DateTime streamingStarted = DateTime.Now;
long timestampBase = DateTime.UtcNow.Ticks;
long videoFrameCount = 0;
long audioFrameCount = 0;
All timestamps are in 100-nanosecond units (the .NET Ticks unit). The timestampBase provides an absolute reference point, and per-frame timestamps are calculated relative to it.
// video timestamp in 100ns units
long videoFileTime = videoFrameCount * 10000000L * videoMediaType.Framerate.Den / videoMediaType.Framerate.Num;
// audio timestamp in 100ns units (AAC frame = 1024 samples)
long audioFileTime = audioFrameCount * 10000000L * 1024 / audioMediaType.SampleRate;
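To make the arithmetic concrete, here is the same calculation in Python with worked values, assuming 30 fps video and the standard 1024-sample AAC frame:

```python
TICKS_PER_SECOND = 10_000_000  # 100-ns units, same as .NET DateTime.Ticks

def video_timestamp(frame_count: int, fr_num: int, fr_den: int) -> int:
    # frame duration = fr_den / fr_num seconds
    return frame_count * TICKS_PER_SECOND * fr_den // fr_num

def audio_timestamp(frame_count: int, sample_rate: int, samples_per_frame: int = 1024) -> int:
    return frame_count * TICKS_PER_SECOND * samples_per_frame // sample_rate

# At 30 fps, frame 30 starts exactly one second in:
v = video_timestamp(30, 30, 1)   # 10_000_000
# At 44100 Hz there are ~43 AAC frames per second; frame 43 lands just before 1 s:
a = audio_timestamp(43, 44100)   # 9_984_580
```

Multiplying before dividing, as the C# code does, keeps the integer math exact and avoids cumulative rounding drift across millions of frames.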
Pushing Video
source.PushMedia(videoStreamIndex, videoBuffer, h264BitstreamPosition, frameSize,
    timestampBase + videoFileTime, timestampBase + videoFileTime,
    VAST.Common.VersatileBufferFlag.AbsoluteTimestamp);
The sample walks through the H.264 bitstream, finding frame boundaries by locating access unit delimiter NAL units. Each frame is pushed with absolute DTS and PTS timestamps.
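The boundary-scanning code itself is not shown in this excerpt. One plausible sketch (illustrative Python, not the sample's implementation) locates access unit delimiters (NAL type 9) behind 4-byte start codes and derives each frame's size from consecutive offsets:

```python
def access_unit_offsets(stream: bytes):
    """Offsets of each access unit delimiter (NAL type 9) in an Annex B stream."""
    offsets = []
    i = 0
    while i <= len(stream) - 5:
        if stream[i:i+4] == b"\x00\x00\x00\x01" and (stream[i+4] & 0x1F) == 9:
            offsets.append(i)
            i += 5
        else:
            i += 1
    return offsets

# Two access units: each starts with an AUD (0x09) followed by slice data
s = (b"\x00\x00\x00\x01\x09\xf0" + b"\x00\x00\x00\x01\x65\xaa"
     + b"\x00\x00\x00\x01\x09\xf0" + b"\x00\x00\x00\x01\x41\xbb")
bounds = access_unit_offsets(s)  # [0, 12]
frame_sizes = [b2 - b1 for b1, b2 in zip(bounds, bounds[1:] + [len(s)])]  # [12, 12]
```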
Pushing Audio
// parse the ADTS frame length (13 bits spanning header bytes 3-5)
int frameSize = ((audioBuffer[pos + 3] & 0x03) << 11)
    | (audioBuffer[pos + 4] << 3)
    | ((audioBuffer[pos + 5] & 0xE0) >> 5);
source.PushMedia(audioStreamIndex, audioBuffer, pos + 7, frameSize - 7,
    timestampBase + audioFileTime, timestampBase + audioFileTime,
    VAST.Common.VersatileBufferFlag.AbsoluteTimestamp);
AAC frames are extracted from the ADTS bitstream. The ADTS frame length field counts the 7-byte header itself, so the header is stripped and only the raw AAC payload (frameSize - 7 bytes) is pushed.
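The bit layout can be verified by packing and unpacking a known length (illustrative Python mirroring the C# shifts above):

```python
def encode_adts_length(length: int) -> tuple:
    """Spread a 13-bit frame length across ADTS header bytes 3-5."""
    b3 = (length >> 11) & 0x03   # top 2 bits -> low bits of byte 3
    b4 = (length >> 3) & 0xFF    # middle 8 bits -> byte 4
    b5 = (length & 0x07) << 5    # low 3 bits -> top bits of byte 5
    return b3, b4, b5

def decode_adts_length(b3: int, b4: int, b5: int) -> int:
    """Same extraction as the C# snippet above."""
    return ((b3 & 0x03) << 11) | (b4 << 3) | ((b5 & 0xE0) >> 5)

roundtrip = decode_adts_length(*encode_adts_length(371))  # 371
```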
Pushing Metadata
byte[] data = System.Text.Encoding.UTF8.GetBytes(DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss.fff"));
source.PushMedia(metadataStreamIndex, data, 0, data.Length,
    timestampBase + metadataTime, timestampBase + metadataTime,
    VAST.Common.VersatileBufferFlag.AbsoluteTimestamp);
Pacing
Task.Delay(10).Wait();
The loop sleeps 10 ms between iterations. On each iteration, it pushes all frames whose timestamps have elapsed, ensuring real-time delivery.
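The per-iteration logic can be sketched as follows (illustrative Python; push_frame is a hypothetical callback standing in for PushMedia, and 333_333 ticks is one frame at roughly 30 fps):

```python
def push_due_frames(now_ticks: int, next_ticks: int, push_frame, frame_ticks: int = 333_333):
    """Push every frame whose timestamp has already elapsed, then
    return how many were pushed and the next due timestamp."""
    pushed = 0
    while next_ticks <= now_ticks:
        push_frame()
        pushed += 1
        next_ticks += frame_ticks
    return pushed, next_ticks

# After one second of wall clock, every frame due inside that second gets pushed
count, next_ticks = push_due_frames(10_000_000, 0, lambda: None)
```

Because the loop catches up on all elapsed frames each pass, a 10 ms sleep granularity does not cause drift; it only bounds how bursty the pushes are.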
PushMedia Method
The VirtualNetworkSource provides two PushMedia overloads:
| Overload | Description |
|---|---|
| PushMedia(streamIndex, buffer, offset, size, dts, pts, flags) | Push from a byte array with explicit timestamps |
| PushMedia(sample) | Push a pre-built VersatileBuffer |
Parameters
| Parameter | Type | Description |
|---|---|---|
| streamIndex | int | Stream index returned by AddStream |
| buffer | byte[] | Buffer containing the compressed media data |
| offset | int | Starting position in the buffer |
| size | int | Number of bytes to push |
| dts | long | Decode timestamp in 100ns units |
| pts | long | Presentation timestamp in 100ns units |
| flags | VersatileBufferFlag | AbsoluteTimestamp for absolute timestamps, None for relative |
VersatileBufferFlag.AbsoluteTimestamp
When AbsoluteTimestamp is set, the timestamps represent absolute time (based on DateTime.UtcNow.Ticks). The server uses absolute timestamps to synchronize streams and calculate correct segment boundaries for HLS/DASH. If this flag is not set, timestamps are treated as relative to the source start.
Looping
When the bitstream position reaches the end of the buffer, it resets to zero to loop the content:
if (h264BitstreamPosition >= videoBuffer.Length)
{
    h264BitstreamPosition = 0;
}
The frame counter continues incrementing, so timestamps increase monotonically even across loop boundaries.
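A toy model of this wrap-around (illustrative Python with a hypothetical helper; 30 fps and fixed-size frames assumed) shows the position cycling while the timestamp keeps growing:

```python
TICKS = 10_000_000  # 100-ns units per second

def next_frame(position: int, frame_count: int, buffer_len: int, frame_size: int, fps: int = 30):
    """Advance through the buffer; wrap the position but never the frame counter."""
    position += frame_size
    if position >= buffer_len:
        position = 0               # loop the content
    frame_count += 1               # timestamps keep growing
    return position, frame_count, frame_count * TICKS // fps

pos, count = 0, 0
for _ in range(5):
    pos, count, ts = next_frame(pos, count, buffer_len=300, frame_size=100)
# pos has wrapped back through 0, but ts is still strictly increasing
```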
See Also
- Sample Applications — overview of all demo projects
- .NET Server Demo — parent page with setup instructions, license key configuration, and access URL reference
- Multi-Protocol Server — overview and full publishing point table
- Initialization — server creation and protocol configuration
- Server Event Handlers — authorization, connections, and error handling
- VAST.Network Library — StreamingServer API reference