The server demo shows how to use the StreamingServer object in user applications, how to use stand-alone single-protocol servers, and how to receive and transmit data without visualization. The demo contains three projects to run on Windows, Linux, or macOS; they are already configured with platform-specific extensions for convenience.
The server demo requires administrator privileges because it uses the Microsoft HTTP.sys server, which can only be started by an administrator.
There are several lines in the Program.Main method that deserve attention. First, the VASTreaming log must be initialized:
VAST.Common.Log.LogFileName = System.IO.Path.Combine(logFolderName, logFileName);
VAST.Common.Log.LogLevel = VAST.Common.Log.Level.Debug;
The demo already contains these log initialization lines, but they must be copied into the user project when integrating the VASTreaming library.
The next important line is buffer initialization:
// initialize internal buffers
VAST.Media.MediaGlobal.Initialize(100);
This line can also be copied into the user project. The VAST.Media.MediaGlobal.Initialize parameter specifies the expected application load and can be thought of as the number of expected concurrent connections to the server. If the server is expected to run in a low-performance environment, set the parameter to a small value; for a high-performance server, increase it.
Next, the user should set the license key:
// enter the license
VAST.Common.License.Key = "YOUR_LICENSE_KEY";
This line must be copied into the user project if the license type is binary (either an annual subscription or a lifetime license); a source code license does not need it.
Finally, the user should choose which sample to run (MultiProtoServer is the default):
// TODO: choose which server you want to run
MultiProtoServer sample = new MultiProtoServer();
//SimpleRtmpServer sample = new SimpleRtmpServer();
//SimpleRtspServer sample = new SimpleRtspServer();
//SendSample1 sample = new SendSample1();
//SendSample2 sample = new SendSample2();
//ReceiveSample sample = new ReceiveSample("rtp://127.0.0.1:10000");
//SimpleWebRtcServer sample = new SimpleWebRtcServer();
The guide below explains the server demo samples line by line.
The MultiProtoServer demo shows how to use the multi-protocol streaming server. This demo includes all server components provided by the VASTreaming libraries: RTMP, RTSP, HLS, MPEG-DASH, etc.
In the constructor, MultiProtoServer creates an instance of the StreamingServer object:
server = new VAST.Network.StreamingServer(100);
Note that the StreamingServer parameter specifies the expected number of connections in order to optimize some internal lists and structures; it does not actually limit the number of connections.
Further down, MultiProtoServer initializes all servers included in the multi-protocol server and demonstrates the use of the server configuration parameters:
// RTMP
server.RtmpPort = 1935;
// uncomment RTMPS port assignment if you want to run secure protocol
//server.RtmpsPort = 1936;
server.RtmpServerParameters = new VAST.RTMP.RtmpServerParameters
{
    RtmpApplication = "live"
};

// RTSP
server.RtspPort = 554;
// uncomment RTSPS port assignment if you want to run secure protocol
//server.RtspsPort = 555;
server.RtspServerParameters = new VAST.RTSP.RtspServerParameters
{
    AllowedRtpTransports = VAST.RTP.RtpTransportType.Any
};

// common HTTP server for WebRTC, MPEG-DASH and HLS
server.HttpPort = 8888;
// uncomment HTTPS port assignment if you want to run secure protocol
//server.HttpsPort = 8889;

if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
{
    // WebRTC is supported only on Windows at the moment
    server.WebRtcPath = "/rtc";
    server.WebRtcServerParameters = new VAST.WebRTC.WebRtcServerParameters
    {
        IceServers = "stun:stun.l.google.com:19302",
        MediaTransport = VAST.WebRTC.WebRtcTransport.Auto,
    };
}

// MPEG-DASH
server.MpegDashPath = "/dash";

// HLS
server.HlsPath = "/hls";
server.HlsServerParameters = new VAST.HLS.HlsServerParameters
{
    EnableLowLatency = false
};

// TS over HTTP
// uncomment if you want to run this server
//server.TsHttpPath = "/ts";

// MJPEG over HTTP
// uncomment if you want to run this server
//server.MjpegPath = "/mjpeg";
Comment out port/path initializations to disable unnecessary protocols, or uncomment the secure port assignments to enable secure protocols.
The following lines initialize the API server:
// API server
server.EnableJsonApi = true;
server.ApiServerParameters.Users.Add("admin", "admin");
See API usage samples further down in this section.
To test secure protocols, the thumbprint of the TLS certificate the server should use must be specified:
// proper certificate thumbprint must be assigned if you want to run secure protocols
server.CertificateThumbprint = "YOUR_CERTIFICATE_THUMBPRINT";
However, this is not enough for HTTPS. The following command must also be run in an elevated command prompt:
netsh http add sslcert ipport=0.0.0.0:<https-port> certhash=<thumbprint> appid=<application-id>
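For instance, assuming the demo's HTTPS port 8889 and a hypothetical certificate thumbprint, the command could look like this (appid can be any GUID identifying your application):

netsh http add sslcert ipport=0.0.0.0:8889 certhash=0123456789abcdef0123456789abcdef01234567 appid={12345678-1234-1234-1234-123456789abc}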
The next lines of the MultiProtoServer constructor subscribe to StreamingServer events and start the server:
server.PublishingPointRequested += Server_PublishingPointRequested;
server.Authorize += Server_Authorize;
server.PublisherConnected += Server_PublisherConnected;
server.ClientConnected += Server_ClientConnected;
server.Error += Server_Error;
server.Disconnected += Server_Disconnected;

server.Start();
Further down, there are multiple common use cases of the server. Note that the lines below can be called from any part of user code, not necessarily from the server constructor. By default, all samples are commented out; uncomment the ones you need.
Publishing point sample 1. Create publishing point with a pull source shows how to create a publishing point that pulls a stream from a remote server:
///////////////////////////////////////////////////////////////////
// 1. Create publishing point with a pull source
///////////////////////////////////////////////////////////////////
this.pullSourceId = this.server.CreatePublishingPoint("publishing-path", "source-uri");
Publishing point sample 2. Create VOD publishing point for a single file shows how to create a publishing point that serves a single VOD file:
///////////////////////////////////////////////////////////////////
// 2. Create VOD publishing point for a single file
// NOTE: accessible via HLS http://127.0.0.1:8888/hls/VOD_PUBLISHING_POINT_NAME
// or MPEG-DASH http://127.0.0.1:8888/dash/VOD_PUBLISHING_POINT_NAME
///////////////////////////////////////////////////////////////////
Uri uri = new Uri(@"MP4_FILE_PATH");
this.server.CreatePublishingPoint("publishing-path", uri.AbsoluteUri, VAST.Common.StreamingMode.Vod);
Publishing point sample 3. Create VOD publishing point for a single file from user stream shows how to create a VOD publishing point fed by a user stream instead of being read from a file, i.e., the user can virtualize a source that is absent from physical media:
///////////////////////////////////////////////////////////////////
// 3. Create VOD publishing point for a single file from user stream
// NOTE: accessible via HLS http://127.0.0.1:8888/hls/VOD_PUBLISHING_POINT_NAME
///////////////////////////////////////////////////////////////////
VAST.File.ISO.IsoSource source = new VAST.File.ISO.IsoSource();
System.IO.Stream file = System.IO.File.Open(@"MP4_FILE_PATH",
    System.IO.FileMode.Open, System.IO.FileAccess.Read, System.IO.FileShare.ReadWrite);
byte[] buffer = new byte[file.Length];
file.Read(buffer);
source.Stream = new System.IO.MemoryStream(buffer);
// if using user stream then publishing point can support only one output protocol
this.server.CreatePublishingPoint("publishing-path", source, VAST.Common.StreamingMode.Vod,
    new VAST.Network.PublishingPointParameters
    {
        IsTemporary = true,
        AllowedProtocols = VAST.Common.StreamingProtocol.HLS
    });
Publishing point sample 4. Create VOD publishing point for a directory shows how to create a VOD publishing point that serves all files in a given directory, i.e., any file in that directory can be played as a VOD file:
///////////////////////////////////////////////////////////////////
// 4. Create VOD publishing point for a directory
// NOTE: accessible via HLS http://127.0.0.1:8888/hls/publishing-path/filename.mp4
// or MPEG-DASH http://127.0.0.1:8888/dash/publishing-path/filename.mp4
// where filename.mp4 is a file path relative to PATH_TO_FOLDER_WITH_FILES
///////////////////////////////////////////////////////////////////
Uri uri = new Uri(@"PATH_TO_FOLDER_WITH_FILES");
// the ?recursive=1 parameter can be used to access files in subdirectories
string wildcardUri = uri.AbsoluteUri + "/*.mp4?recursive=1";
this.server.CreatePublishingPoint("publishing-path", wildcardUri, VAST.Common.StreamingMode.Vod);
Publishing point sample 5. Create live publishing point with a file source that is played in an endless loop:
///////////////////////////////////////////////////////////////////
// 5. Create publishing point with a file source (endless loop of the same file)
///////////////////////////////////////////////////////////////////
VAST.File.FileSource source = new VAST.File.FileSource();
source.Uri = @"VIDEO_FILE_PATH";
source.Loop = true;
this.pullSourceId = this.server.CreatePublishingPoint("publishing-path", source);
Publishing point sample 6. Create publishing point with an image source. This sample shows how to create a publishing point whose input receives a single static image or a sequence of images from the user and whose output forms a video stream:
///////////////////////////////////////////////////////////////////
// 6. Create publishing point with an image source
///////////////////////////////////////////////////////////////////
VAST.File.ImageSource source = new VAST.File.ImageSource();
// source image can be set via file path, stream or Bitmap
source.SetImage(@"IMAGE_FILE_PATH");
source.Open();

// set video encoding parameters
VAST.Common.MediaType mt = new VAST.Common.MediaType
{
    ContentType = VAST.Common.ContentType.Video,
    CodecId = VAST.Common.Codec.H264,
    Bitrate = 1000000,
    Width = 1280,
    Height = 720,
    Framerate = new VAST.Common.Rational(30),
};
source.SetDesiredOutputType(0, mt);

// create publishing point
this.pullSourceId = this.server.CreatePublishingPoint("publishing-path", source);
Publishing point sample 7. Create publishing point with a capture source. This sample shows how to use a capture source, e.g., a web camera, as a source for the streaming server:
///////////////////////////////////////////////////////////////////
// 7. Create publishing point with a capture source
///////////////////////////////////////////////////////////////////
if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
{
    Task.Run(() => createCaptureSource());
}
The createCaptureSource() function enumerates the available audio and video capture devices, chooses the first of each, creates capture sources, adds them to a VAST.Network.AggregatedNetworkSource to form a single source from the two streams, and finally creates a publishing point with the aggregated source. A sketch of this flow is shown below.
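A minimal sketch of that flow, under stated assumptions: the device-opening helpers are hypothetical placeholders (the actual enumeration code lives in the demo's createCaptureSource()), and the AddSource() call on AggregatedNetworkSource is an assumption, while CreatePublishingPoint is used exactly as in the samples above:

private void createCaptureSource()
{
    // HYPOTHETICAL helpers standing in for the demo's device enumeration code;
    // they are assumed to return opened capture sources for the first devices found
    VAST.Media.IMediaSource videoCapture = openFirstVideoCaptureDevice();
    VAST.Media.IMediaSource audioCapture = openFirstAudioCaptureDevice();

    // combine both capture sources into a single source with two streams
    VAST.Network.AggregatedNetworkSource aggregated = new VAST.Network.AggregatedNetworkSource();
    aggregated.AddSource(videoCapture); // assumed aggregation call
    aggregated.AddSource(audioCapture);

    // publish the aggregated source, same overload as in sample 5
    this.server.CreatePublishingPoint("publishing-path", aggregated);
}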
Publishing point sample 8. Create publishing point which uses pre-encoded media data of the user:
///////////////////////////////////////////////////////////////////
// 8. Create publishing point which uses pre-encoded media data of user
///////////////////////////////////////////////////////////////////
this.pushingTask = Task.Run(() => pushingRoutine1());
The pushingRoutine1() function creates a VAST.Network.VirtualNetworkSource, adds one video and one audio stream to it, and creates a publishing point with this source and the publishing path builtin. It then runs an endless loop, continuously pushing pre-encoded media data into the VAST.Network.VirtualNetworkSource. The demo takes the media data from resources, but a user can obtain it by any means in real time; a condensed sketch of the routine follows.
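This sketch assumes VirtualNetworkSource accepts media via the same AddStream()/PushMedia() calls that the sinks in this guide use; the media type values, the getNextEncodedFrame() helper, and the fixed 30 fps pacing are illustrative:

private void pushingRoutine1()
{
    // describe one pre-encoded H.264 stream (values are illustrative)
    VAST.Common.MediaType videoType = new VAST.Common.MediaType
    {
        ContentType = VAST.Common.ContentType.Video,
        CodecId = VAST.Common.Codec.H264,
        Width = 1280,
        Height = 720,
    };

    VAST.Network.VirtualNetworkSource source = new VAST.Network.VirtualNetworkSource();
    source.AddStream(0, videoType);                        // assumed to mirror IMediaSink.AddStream
    this.server.CreatePublishingPoint("builtin", source);  // same overload as sample 5

    long timestamp = 0;
    while (this.running)
    {
        // getNextEncodedFrame() is a hypothetical helper; the demo reads frames from resources
        byte[] frame = getNextEncodedFrame();
        VAST.Common.PacketBuffer packet = VAST.Media.MediaGlobal.LockBuffer(frame.Length);
        packet.Append(frame, 0, frame.Length);
        packet.Pts = packet.Dts = timestamp;               // timestamps in 100ns units
        packet.StreamIndex = 0;
        source.PushMedia(0, packet);                       // assumed to mirror IMediaSink.PushMedia
        packet.Release();
        timestamp += 10000000L / 30;                       // advance by one 30 fps frame
        Task.Delay(33).Wait();                             // crude real-time pacing
    }
}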
The builtin publishing point can then be watched via any of the enabled protocols, e.g. HLS at http://127.0.0.1:8888/hls/builtin or MPEG-DASH at http://127.0.0.1:8888/dash/builtin.
Publishing point sample 9. Create publishing point which uses ImageSource for encoding the video stream:
///////////////////////////////////////////////////////////////////
// 9. Create publishing point which uses ImageSource for encoding video stream
///////////////////////////////////////////////////////////////////
this.pushingTask = Task.Run(() => pushingRoutine2());
The pushingRoutine2() function creates a VAST.Image.ImageSource to generate an H.264 video stream from user-provided bitmaps and a VAST.Network.VirtualNetworkSource for audio, adds them to a VAST.Network.AggregatedNetworkSource to form a single source from the two streams, and then creates a publishing point with this source and the publishing path builtin. It then runs an endless loop, continuously pushing raw bitmaps to the VAST.Image.ImageSource and pre-encoded audio data to the VAST.Network.VirtualNetworkSource.
Publishing point sample 10. Create publishing point which receives uncompressed video data from the user, encodes it with IEncoder, and pushes it into the publishing point:
///////////////////////////////////////////////////////////////////
// 10. Encode uncompressed user video on the fly and push to the server
///////////////////////////////////////////////////////////////////
this.pushingTask = Task.Run(() => pushingRoutine3());
The pushingRoutine3() function creates a VAST.Media.IEncoder to encode raw pixel buffers to H.264 video, then creates a VAST.Network.VirtualNetworkSource and a publishing point with this source and the publishing path builtin. It then runs an endless loop, continuously encoding video samples and pushing them to the VAST.Network.VirtualNetworkSource. A sketch of the encoding step follows.
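A minimal sketch of that encoding step, assuming an encoder factory symmetric to the DecoderFactory/IDecoder Write()/Read() pattern shown in ReceiveSample below; VAST.Media.EncoderFactory, the rawFrame input, and the push into the source are assumptions:

// desired conversion: uncompressed BGRA frames in, H.264 out (values are illustrative)
VAST.Common.MediaType rawType = new VAST.Common.MediaType
{
    ContentType = VAST.Common.ContentType.Video,
    CodecId = VAST.Common.Codec.Uncompressed,
    PixelFormat = VAST.Common.PixelFormat.BGRA,
    Width = 1280,
    Height = 720
};
VAST.Common.MediaType h264Type = new VAST.Common.MediaType
{
    ContentType = VAST.Common.ContentType.Video,
    CodecId = VAST.Common.Codec.H264,
    Bitrate = 1000000,
    Width = 1280,
    Height = 720
};
// EncoderFactory is assumed by symmetry with VAST.Media.DecoderFactory.Create
VAST.Media.IEncoder encoder = VAST.Media.EncoderFactory.Create(rawType, h264Type, null);

// for each raw frame: write the pixel buffer, then drain the encoded output
encoder.Write(rawFrame); // rawFrame: a buffer with BGRA pixels obtained from the user
while (true)
{
    VAST.Common.VersatileBuffer encoded = encoder.Read();
    if (encoded == null)
    {
        break;
    }
    source.PushMedia(0, encoded); // source: the VirtualNetworkSource created earlier
    encoded.Release();
}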
The server demo also shows how to use the JSON API. The JSON API is initialized in the MultiProtoServer constructor. Different examples of JSON API usage can also be found in the API/Sample.txt file in the Visual Studio project.
API sample 1. Authorize.
POST http://127.0.0.1:8888/api/v1/auth

Sample body:

{
    "name": "admin",
    "password": "admin"
}
API sample 2. Create mixing publishing point.
POST http://127.0.0.1:8888/api/v1/publishing-point?token=TODO_replace_with_authentication_token

Sample body:

{
    "path": "mix",
    "title": "Mixing Publishing Point Test",
    "streaming-mode": "live",
    "allow-video-processing": true,
    "sources": [
        { "uri": "vast://publishing-point/host1" },
        { "uri": "vast://publishing-point/host2" },
        { "uri": "vast://publishing-point/host3" },
        { "uri": "vast://publishing-point/host4" },
        { "uri": "file:///api-background.jpg" },
        { "uri": "file:///api-watermark.png" },
        { "uri": "file:///api-audio-only.png" }
    ],
    "processing": {
        "video": {
            "discard": false,
            "transcoding": true,
            "tracks": [
                {
                    "index": 0,
                    "width": 1280,
                    "height": 720,
                    "framerate": "30/1",
                    "keyframe-interval": 30,
                    "bitrate": 3000000,
                    "codec": "H264",
                    "profile": 66,
                    "mixing": {
                        "type": "all",
                        "layers": [
                            { "sources": [ 4 ], "layout": "manual", "stretch": "fill" },
                            { "sources": [ 0, 1, 2, 3 ], "layout": "auto", "stretch": "zoom" },
                            { "sources": [ 5 ], "layout": "manual", "stretch": "fill",
                              "location": [ 50, 50, 295, 109 ], "opacity": 0.7 }
                        ],
                        "audio-source-image": 6
                    }
                }
            ]
        },
        "audio": {
            "discard": false,
            "transcoding": true,
            "tracks": [
                {
                    "index": 1,
                    "sample-rate": 44100,
                    "channels": 1,
                    "bitrate": 128000,
                    "codec": "AAC",
                    "mixing": { "type": "all" }
                }
            ]
        }
    }
}
The source URI vast://publishing-point/<publishing-path> means that the mixing publishing point uses the publishing point with the path <publishing-path> as a source. So, in the sample above, when somebody starts publishing to rtmp://127.0.0.1/live/host1, the mixing publishing point picks up the stream and shows it on the auto-layout layer (the second of the three declared layers). The image file path file:///api-background.jpg is a shortcut to the folder containing the server binary. An absolute file URI can also be used, e.g., file:///C:/Images/image.png. It is also possible to use an HTTP link as an image source.
Layers are declared in increasing Z order, i.e., layer N+1 overlaps layers 0..N; the last declared layer overlaps all others.
A layer with manual layout must link exactly one source. A layer with auto layout can link any number of sources.
Location is an array of 4 integer coordinates: left, top, right, bottom.
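For example, assuming a 640x360 mixing canvas, a manual layer pinned to the top-left quadrant could be declared as follows (the source index and canvas size are illustrative):

{ "sources": [ 5 ], "layout": "manual", "stretch": "fill", "location": [ 0, 0, 320, 180 ] }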
Refer to the server OpenAPI specification for more details.
API sample 3a. Update mixing settings.
PUT http://127.0.0.1:8888/api/v1/publishing-point/mix?token=TODO_replace_with_authentication_token

Sample body:

{
    "processing": {
        "video": {
            "tracks": [
                {
                    "index": 0,
                    "mixing": {
                        "type": "all",
                        "layers": [
                            { "sources": [ 4 ], "layout": "manual", "stretch": "fill" },
                            { "sources": [ 0 ], "layout": "manual", "stretch": "zoom" },
                            { "sources": [ 5 ], "layout": "manual", "stretch": "fill",
                              "location": [ 50, 50, 295, 109 ], "opacity": 0.7 }
                        ],
                        "audio-source-image": 6
                    }
                }
            ]
        }
    }
}
The provided sample changes the second (middle) layer from showing all connected sources in an automatic grid to showing single source 0 full screen.
API sample 3b. Update sources.
PUT http://127.0.0.1:8888/api/v1/publishing-point/mix?token=TODO_replace_with_authentication_token

Sample body:

{
    "sources": [
        { "uri": "vast://publishing-point/host1" },
        { "uri": "vast://publishing-point/host2" },
        { "uri": "vast://publishing-point/host3" },
        { "uri": "vast://publishing-point/host4" },
        { "uri": "vast://publishing-point/host5" },
        { "uri": "vast://publishing-point/host6" },
        { "uri": "file:///api-background.jpg" },
        { "uri": "file:///api-watermark.png" },
        { "uri": "file:///api-audio-only.png" }
    ],
    "processing": {
        "video": {
            "tracks": [
                {
                    "index": 0,
                    "mixing": {
                        "type": "all",
                        "layers": [
                            { "sources": [ 6 ], "layout": "manual", "stretch": "fill" },
                            { "sources": [ 0, 1, 2, 3, 4, 5 ], "layout": "auto", "stretch": "zoom" },
                            { "sources": [ 7 ], "layout": "manual", "stretch": "fill",
                              "location": [ 50, 50, 295, 109 ], "opacity": 0.7 }
                        ],
                        "audio-source-image": 8
                    }
                }
            ]
        }
    }
}
To update the source list, the complete new list must be specified.
If the new source list affects sources currently active in the mixing settings, new mixing settings must be provided as well.
API sample 4. Delete mixing publishing point.
DELETE http://127.0.0.1:8888/api/v1/publishing-point/mix?token=TODO_replace_with_authentication_token
The user should also pay attention to the sample event handlers for all StreamingServer events. See the chapter above for more details about server events.
PublishingPointRequested event handler:
private void Server_PublishingPointRequested(
    Guid connectionId,
    VAST.Network.ConnectionInfo connectionInfo,
    VAST.Network.CreatedPublishingPointParameters createdPublishingPointParameters)
{
    // newly connected client has requested publishing point which doesn't exist yet
    // user code in this event handler can create on-demand publishing point here
    if (connectionInfo.PublishingPath == "YOUR_ON_DEMAND_PUBLISHING_PATH")
    {
        this.server.CreatePublishingPoint("YOUR_ON_DEMAND_PUBLISHING_PATH", "source-uri");
    }
}
Authorize event handler:
private void Server_Authorize(Guid connectionId, VAST.Network.ConnectionInfo connectionInfo)
{
    // connection type has been detected as well as all other parameters
    // user code has to choose whether connection is valid and set IsValid property accordingly
    connectionInfo.IsValid = true;
}
PublisherConnected event handler:
private void Server_PublisherConnected(Guid connectionId, VAST.Network.PublishingPoint publishingPoint)
{
    // publisher (or pull source created by user) has been connected and authorized
    // new publishing point has been created and is ready for forwarding, recording etc

    // below is a sample code to start forwarding of your source to another server
    // sample is using pull source but you can apply this logic to any source
    if (this.pullSourceId == connectionId)
    {
        //publishingPoint.StartForwarding("YOUR_FORWARDING_URI");
    }

    // below is a sample code to start recording of your source to a local file
    // sample is using pull source but you can apply this logic to any source
    if (this.pullSourceId == connectionId)
    {
        //publishingPoint.StartRecording("YOUR_RECORDING_FILE_PATH");
    }
}
ClientConnected event handler:
private void Server_ClientConnected(Guid connectionId, VAST.Network.ConnectedClient client)
{
    // new client has been connected
    // VAST.Network.ConnectedClient contains additional information about a client
    // including a publishing point this client belongs to
}
Error event handler:
private void Server_Error(Guid connectionId, string errorDescription)
{
    // session has encountered unrecoverable error and will be closed shortly
}
Disconnected event handler:
private void Server_Disconnected(Guid connectionId, VAST.Network.ExtendedSocketError socketError)
{
    // connection has been closed
    // socketError may contain additional information about the disconnection reason
    // if Error event has been raised for this connectionId
    // then it means that server forcibly closed the connection
}
ReceiveSample shows how to use IMediaSource directly in order to receive data from a remote peer and, optionally, decode it for further processing, i.e., it demonstrates how to receive encoded data and decode it on the fly. The demo runs offscreen, without visualization.
First, create the source object, subscribe to its events, and start opening it:
this.source = VAST.Media.SourceFactory.Create(uri);
if (this.source == null)
{
    throw new Exception("Unsupported protocol");
}
this.source.Uri = uri;
this.source.NewStream += source_NewStream;
this.source.NewSample += source_NewSample;
this.source.Error += source_Error;
this.source.StateChanged += source_StateChanged;
// start opening
this.source.Open();
As soon as the source detects input streams, the user receives the NewStream event. The provided sample stores the media type of each stream and creates decoders. Note that decoding is entirely optional and is provided only to demonstrate the approach:
private void source_NewStream(object sender, Media.NewStreamEventArgs e)
{
    while (this.streams.Count < e.StreamCount)
    {
        this.streams.Add(null);
        this.decoders.Add(null);
    }
    this.streams[e.StreamIndex] = e.MediaType;
    try
    {
        VAST.Media.DecoderParameters decoderParameters = new VAST.Media.DecoderParameters
        {
            // set VAST.Common.MediaFramework.FFmpeg if you want to utilize FFmpeg for decoding
            PreferredMediaFramework = VAST.Common.MediaFramework.Builtin,
            // set to false if you want to force CPU decoding
            AllowHardwareAcceleration = true,
        };
        switch (e.MediaType.ContentType)
        {
            case VAST.Common.ContentType.Video:
            {
                VAST.Common.MediaType desiredVideoMediaType = new VAST.Common.MediaType
                {
                    ContentType = VAST.Common.ContentType.Video,
                    CodecId = VAST.Common.Codec.Uncompressed,
                    PixelFormat = VAST.Common.PixelFormat.BGRA,
                    Width = e.MediaType.Width,
                    Height = e.MediaType.Height
                };
                this.decoders[e.StreamIndex] = VAST.Media.DecoderFactory.Create(
                    e.MediaType, desiredVideoMediaType, decoderParameters);
                break;
            }
            case VAST.Common.ContentType.Audio:
            {
                VAST.Common.MediaType desiredAudioMediaType = new VAST.Common.MediaType
                {
                    ContentType = VAST.Common.ContentType.Audio,
                    CodecId = VAST.Common.Codec.PCM,
                    SampleFormat = VAST.Common.SampleFormat.S16,
                    SampleRate = e.MediaType.SampleRate,
                    Channels = e.MediaType.Channels
                };
                this.decoders[e.StreamIndex] = VAST.Media.DecoderFactory.Create(
                    e.MediaType, desiredAudioMediaType, decoderParameters);
                break;
            }
            default:
                // no decoding of anything else
                break;
        }
    }
    catch (Exception ex)
    {
        VAST.Common.Log.WarnFormat(
            "Failed to create decoder for stream {0}, type {1}: {2}",
            e.StreamIndex, e.MediaType, ex.ToString());
    }
}
When the object has opened and the user receives the VAST.Media.MediaState.Opened state, the source needs to be started:
private void source_StateChanged(object sender, Media.MediaState e)
{
    lock (this)
    {
        switch (e)
        {
            case VAST.Media.MediaState.Opened:
                // source successfully opened, start it
                this.source.Start();
                break;
            case VAST.Media.MediaState.Started:
                // source has been started, samples will come soon
                break;
            case VAST.Media.MediaState.Closed:
                // source has been disconnected
                break;
        }
    }
}
In case of an error, the developer may want to show the error to the user or perform other necessary actions:
private void source_Error(object sender, Media.ErrorEventArgs e)
{
    // source error occurred
    VAST.Common.Log.ErrorFormat("Error occurred: {0}", e.ErrorDescription);
}
Finally, after the source reaches the VAST.Media.MediaState.Started state, media samples start arriving:
private void source_NewSample(object sender, Media.NewSampleEventArgs e)
{
    if (e.Sample.StreamIndex < 0 || e.Sample.StreamIndex >= this.streams.Count)
        return; // unexpected stream index
    if (this.decoders[e.Sample.StreamIndex] == null || !this.decoders[e.Sample.StreamIndex].IsUsable)
        return; // unsupported decoder

    VAST.Media.IDecoder decoder = this.decoders[e.Sample.StreamIndex];
    decoder.Write(e.Sample);
    while (true)
    {
        VAST.Common.VersatileBuffer decodedSample = decoder.Read();
        if (decodedSample == null)
        {
            break;
        }
        // TODO: process decodedSample
        decodedSample.Release();
    }
}
SendSample1 shows how to use IMediaSink directly in order to send data to a remote server. SendSample1 uses pre-encoded media data and operates a single audio stream while sending.
First, create the sink object, subscribe to its events, add the source stream(s), and start opening:
this.sink = new VAST.RTSP.RtspPublisherSink();
this.sink.Uri = "rtp://127.0.0.1:10000";
this.sink.Error += sink_Error;
this.sink.StateChanged += sink_StateChanged;

audioBuffer = new byte[Properties.Resources.g711u.Length];
Properties.Resources.g711u.CopyTo(audioBuffer, 0);

audioMediaType = new VAST.Common.MediaType
{
    ContentType = Common.ContentType.Audio,
    CodecId = Common.Codec.G711U,
    SampleRate = 8000,
    Channels = 1,
    BitsPerSample = 8,
    Bitrate = 64000,
    FrameSize = 1280
};
// add audio media type
this.sink.AddStream(0, audioMediaType);

this.sink.Open();
When the object has opened and the user receives the VAST.Media.MediaState.Opened state, the pushing thread can be started:
private void sink_StateChanged(object sender, Media.MediaState e)
{
    lock (this)
    {
        switch (e)
        {
            case VAST.Media.MediaState.Opened:
                // sink successfully opened, start it
                this.sink.Start();
                this.pushingTask = Task.Run(() => pushingRoutine());
                break;
            case VAST.Media.MediaState.Started:
                // sink has been started
                break;
            case VAST.Media.MediaState.Closed:
                // sink has been disconnected
                break;
        }
    }
}
private void sink_Error(object sender, Media.ErrorEventArgs e)
{
    // sink error occurred
    VAST.Common.Log.ErrorFormat("Error occurred: {0}", e.ErrorDescription);
}
In the pushing thread, the user has to keep pushing pre-encoded media data to the sink at a real-time pace. The demo takes the media data from a resource, but a user can obtain it by any other means as well.
private void pushingRoutine()
{
    DateTime streamingStarted = DateTime.Now;
    long audioFrameCount = 0;
    int audioBitstreamPosition = 0;
    while (this.running)
    {
        // playback time in 100ns units
        long playbackTime = (long)(DateTime.Now - streamingStarted).Ticks;
        // audio timestamp converted to 100ns units
        // (one frame is FrameSize bytes = FrameSize * 8 bits at the given bitrate)
        long audioFileTime = audioFrameCount * 10000000L *
            audioMediaType.FrameSize * 8 / audioMediaType.Bitrate;
        do
        {
            if (audioBitstreamPosition >= audioBuffer.Length)
            {
                // EOS, loop back to the beginning of the resource
                audioBitstreamPosition = 0;
            }
            if (audioFileTime <= playbackTime)
            {
                // prepare buffer
                VAST.Common.PacketBuffer packet =
                    VAST.Media.MediaGlobal.LockBuffer(audioMediaType.FrameSize);
                packet.Append(audioBuffer, audioBitstreamPosition, audioMediaType.FrameSize);
                packet.Pts = packet.Dts = audioFileTime;
                packet.KeyFrame = packet.CleanPoint = true;
                packet.StreamIndex = 0;
                // push sample to clients if any
                this.sink.PushMedia(0, packet);
                packet.Release();
                audioBitstreamPosition += audioMediaType.FrameSize;
                audioFrameCount++;
                audioFileTime = audioFrameCount * 10000000L *
                    audioMediaType.FrameSize * 8 / audioMediaType.Bitrate;
            }
        } while (audioFileTime <= playbackTime);
        Task.Delay(10).Wait();
    }
}
SendSample2 also shows how to use IMediaSink directly, but it demonstrates how to send audio and video streams simultaneously.
The logic behind this sample is very similar to SendSample1. The only difference is that SendSample2 pushes two streams: one video and one audio. Refer to the previous section for a detailed explanation of the sample logic; the two-stream setup is sketched below.
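For reference, a sketch of the two-stream registration, reusing the MediaType patterns shown elsewhere in this guide (the exact field values and the AAC codec id are illustrative assumptions, not the demo's literal code):

// video stream 0 (H.264) and audio stream 1 (AAC) registered on the same sink
VAST.Common.MediaType videoMediaType = new VAST.Common.MediaType
{
    ContentType = VAST.Common.ContentType.Video,
    CodecId = VAST.Common.Codec.H264,
    Width = 1280,
    Height = 720,
    Framerate = new VAST.Common.Rational(30),
};
VAST.Common.MediaType audioMediaType = new VAST.Common.MediaType
{
    ContentType = VAST.Common.ContentType.Audio,
    CodecId = VAST.Common.Codec.AAC, // assumed codec id
    SampleRate = 44100,
    Channels = 2,
};
this.sink.AddStream(0, videoMediaType);
this.sink.AddStream(1, audioMediaType);
this.sink.Open();
// the pushing routine then interleaves PushMedia(0, ...) and PushMedia(1, ...)
// by timestamp, pacing both streams against wall-clock time as in SendSample1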
SimpleRtmpServer shows how to use the RTMP server in stand-alone mode, without the multi-protocol server. Be aware that RtmpServer is very simple: the media flow between connected peers must be implemented in user code, and the provided sample shows how to do it. If you do not want to handle such low-level details, consider using StreamingServer, which handles the media flow automatically.
First, create the server object, subscribe to its events, and start it:
// initialize RTMP
VAST.RTMP.RtmpGlobal.RtmpDefaultAckWindowSize = 2500000;
VAST.RTMP.RtmpGlobal.RtmpDefaultOurChunkSize = 1024;
VAST.RTMP.RtmpGlobal.RtmpSessionInactivityTimeoutMs = 30000;
VAST.RTMP.RtmpGlobal.RtmpApplication = "live";

// create RTMP server
this.server = new VAST.RTMP.RtmpServer();
this.server.Connected += Server_Connected;
this.server.Disconnected += Server_Disconnected;
this.server.PublisherConnected += Server_PublisherConnected;
this.server.ClientConnected += Server_ClientConnected;
// TODO: add non-zero port for secure RTMPS as a second parameter and certificate
// thumbprint as a third parameter if you have a properly installed certificate
this.server.Start(1935, 0, "YOUR_CERTIFICATE_THUMBPRINT");
Once a publisher connects, keep a reference to it and subscribe to its events:
private void Server_PublisherConnected(object sender, VAST.Network.INetworkSource publisher)
{
    lock (this)
    {
        if (this.publishedStreams.ContainsKey(publisher.PublishingPoint))
        {
            VAST.Common.Log.ErrorFormat(
                "Stream {0} already published", publisher.PublishingPoint);
            // reject the duplicate publisher
            publisher.Accept = false;
            return;
        }
        // accept any stream name
        publisher.Accept = true;
        publisher.NewStream += Publisher_NewStream;
        publisher.NewSample += Publisher_NewSample;
        PublishedStream publishedStream = new PublishedStream(publisher.PublishingPoint);
        this.publishedStreams.Add(publisher.PublishingPoint, publishedStream);
        this.connectedPublishers.Add(publisher.EndPoint, publishedStream);
    }
}
Then store the media types of all publisher streams:
private void Publisher_NewStream(object sender, Media.NewStreamEventArgs e)
{
    lock (this)
    {
        VAST.RTMP.RtmpPublisherSource publisher = sender as VAST.RTMP.RtmpPublisherSource;
        if (!this.publishedStreams.ContainsKey(publisher.PublishingPoint))
        {
            VAST.Common.Log.ErrorFormat(
                "Stream {0} is not published", publisher.PublishingPoint);
            return;
        }
        PublishedStream publishedStream = this.publishedStreams[publisher.PublishingPoint];
        while (publishedStream.MediaStreams.Count < e.StreamCount)
        {
            publishedStream.MediaStreams.Add(null);
        }
        // save MediaType of the media stream, it'll be necessary to send to connected clients
        publishedStream.MediaStreams[e.StreamIndex] = e.MediaType.Clone();
        VAST.Common.Log.DebugFormat(
            "Publisher {0} new media type {1}: {2}",
            publisher.PublishingPoint, e.StreamIndex, e.MediaType.ToString());

        bool descriptorReady = true;
        for (int i = 0; i < e.StreamCount; ++i)
        {
            if (publishedStream.MediaStreams[i] == null)
            {
                descriptorReady = false;
                break;
            }
        }
        if (descriptorReady)
        {
            // TODO: start file recording if necessary
            //this.startRecording(publishedStream, @"RECORDING_FILE_PATH");
        }
    }
}
The sample also demonstrates how to create an optional file recorder: simply uncomment the line that calls the startRecording() method.
Once a client connects, keep a reference to it and add the media streams received from the publisher:
private void Server_ClientConnected(object sender, VAST.Network.INetworkSink client)
{
    lock (this)
    {
        if (!this.publishedStreams.ContainsKey(client.PublishingPoint))
        {
            VAST.Common.Log.ErrorFormat("Stream {0} is not published", client.PublishingPoint);
            return;
        }
        PublishedStream publishedStream = this.publishedStreams[client.PublishingPoint];
        // accept any client
        client.Accept = true;
        int index = 0;
        foreach (VAST.Common.MediaType mt in publishedStream.MediaStreams)
        {
            client.AddStream(index++, mt);
        }
        publishedStream.ConnectedClients.Add(client.EndPoint, client);
        this.connectedClients.Add(client.EndPoint, publishedStream);
    }
}
Once clients are connected, media samples from the publisher must be pushed to each of them:
private void Publisher_NewSample(object sender, Media.NewSampleEventArgs e)
{
    lock (this)
    {
        VAST.RTMP.RtmpPublisherSource publisher = sender as VAST.RTMP.RtmpPublisherSource;
        if (!this.publishedStreams.ContainsKey(publisher.PublishingPoint))
        {
            VAST.Common.Log.ErrorFormat(
                "Stream {0} is not published", publisher.PublishingPoint);
            return;
        }
        PublishedStream publishedStream = this.publishedStreams[publisher.PublishingPoint];
        foreach (VAST.Media.IMediaSink client in publishedStream.ConnectedClients.Values)
        {
            client.PushMedia(e.Sample.StreamIndex, e.Sample);
        }
    }
}
When either a publisher or a client disconnects, its reference must be removed from our structures. When a publisher disconnects, all clients associated with its publishing point must be disconnected as well:
private void Server_Disconnected(object sender, VAST.Transport.TransportArgs e)
{
    lock (this)
    {
        if (this.connectedPublishers.ContainsKey(e.EndPoint))
        {
            PublishedStream publishedStream = this.connectedPublishers[e.EndPoint];
            VAST.Common.Log.DebugFormat(
                "Disconnected publisher {0} publishing point {1}",
                e.EndPoint, publishedStream.PublishName);
            foreach (var entry in publishedStream.ConnectedClients)
            {
                this.connectedClients.Remove(entry.Key);
                entry.Value.Stop();
            }
            this.publishedStreams.Remove(publishedStream.PublishName);
            this.connectedPublishers.Remove(e.EndPoint);
        }
        else if (this.connectedClients.ContainsKey(e.EndPoint))
        {
            PublishedStream publishedStream = this.connectedClients[e.EndPoint];
            if (publishedStream.ConnectedClients.ContainsKey(e.EndPoint))
            {
                VAST.Common.Log.DebugFormat(
                    "Disconnected client {0} publishing point {1}",
                    e.EndPoint, publishedStream.PublishName);
                publishedStream.ConnectedClients.Remove(e.EndPoint);
            }
            else
            {
                VAST.Common.Log.ErrorFormat(
                    "Disconnected client {0} not found in publishing point {1}",
                    e.EndPoint, publishedStream.PublishName);
            }
            this.connectedClients.Remove(e.EndPoint);
        }
        else
        {
            VAST.Common.Log.ErrorFormat("Disconnected peer {0} not found", e.EndPoint);
        }
    }
}
The optional file writer has to be created explicitly by the user:
private void startRecording(PublishedStream publishedStream, string path)
{
    VAST.Media.IMediaSink fileWriter = VAST.Media.SinkFactory.Create(path);
    if (fileWriter == null)
    {
        throw new Exception("File type is not supported");
    }
    if (fileWriter is VAST.File.ISO.IsoSink)
    {
        // additional parameters for MP4 writer
        ((VAST.File.ISO.IsoSink)fileWriter).Parameters = new VAST.File.ISO.ParsingParameters
        {
            WriteMediaDataLast = true
        };
    }
    fileWriter.Error += FileWriter_Error;
    fileWriter.Uri = path;
    fileWriter.Open();
    int streamIndex = 0;
    foreach (VAST.Common.MediaType mt in publishedStream.MediaStreams)
    {
        fileWriter.AddStream(streamIndex++, mt);
    }
    fileWriter.Start();
    publishedStream.ConnectedClients.Add(new IPEndPoint(0, 0), fileWriter);
    this.activeWriters.Add(fileWriter.UniqueId, publishedStream);
}
SimpleRtspServer shows how to use the RTSP server in stand-alone mode, without the multi-protocol server. Be aware that RtspServer is very simple: the media flow between connected peers must be implemented in user code, and the provided sample shows how to do it. If you do not want to handle such low-level details, consider using StreamingServer, which handles the media flow automatically.
SimpleRtspServer logic is essentially the same as that of SimpleRtmpServer described above; refer to the previous section for a detailed explanation. Startup is sketched below.
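As a sketch only, assuming RtspServer mirrors the RtmpServer surface shown above (the constructor and the Start(port, securePort, certificateThumbprint) signature are assumptions based on that symmetry; consult the demo source for the actual calls):

// create stand-alone RTSP server, mirroring the RtmpServer pattern above
this.server = new VAST.RTSP.RtspServer();
this.server.Connected += Server_Connected;
this.server.Disconnected += Server_Disconnected;
this.server.PublisherConnected += Server_PublisherConnected;
this.server.ClientConnected += Server_ClientConnected;
// standard RTSP port; pass a non-zero second argument and a real thumbprint for RTSPS
this.server.Start(554, 0, "YOUR_CERTIFICATE_THUMBPRINT");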
SimpleWebRtcServer shows how to use the stand-alone WebRTC signaling server without the multi-protocol server. The stand-alone WebRTC signaling server is intended to create and control rooms with two or more users; media data is exchanged directly among the connected peers.
Since it is a pure signaling server, neither receiving nor sending any media data, its usage is very simple. First, create and start it:
this.server = new VAST.WebRTC.WebRtcStandaloneSignalingServer(8888, 0, "/");
this.server.Authorize += Server_Authorize;
this.server.Start();
When a new peer connects, it must be accepted or rejected:
private void Server_Authorize(object sender,
    VAST.WebRTC.WebRtcStandaloneSignalingServer.AuthorizeEventArgs e)
{
    // connection type has been detected as well as all other parameters
    // user code has to choose whether connection is valid and set Accept property accordingly
    e.Accept = true;
}