Video Data Workflow
Internal Server Data Sources and Data Consumers
When a camera is connected, a data source is created. Several data consumers then subscribe to the data source, including storage recorder, motion detector, desktop client, analytics plugin, etc.
Every data consumer runs in its own thread in asynchronous mode, which minimizes potential delays: the latency introduced by the server is typically a few milliseconds (less than 12 ms). All threads receive a pointer to the same memory, so no data copying takes place.
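The fan-out described above can be sketched as follows (illustrative Python; the class and method names are hypothetical, not the server's real API):

```python
import queue


class DataSource:
    """Created when a camera connects; fans incoming frames out to
    every subscribed consumer (recorder, motion detector, client, ...)."""

    def __init__(self):
        self._consumers = []

    def subscribe(self) -> queue.Queue:
        # Each consumer drains its own queue in a dedicated thread.
        q = queue.Queue()
        self._consumers.append(q)
        return q

    def push(self, frame: bytes):
        # Every consumer receives a reference to the same buffer object;
        # the frame data itself is never copied.
        for q in self._consumers:
            q.put(frame)
```

Because each consumer drains its queue in a separate thread, a slow consumer cannot delay the others.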
The server supports RTCP and the RTSP ONVIF extension to receive absolute timestamp values from the camera.
By default, the server doesn’t trust the camera’s timestamp:
- Once the first packet is received from the camera, the server calculates the difference between the server time and the camera time. This difference also includes the network delay.
- This difference is then applied to all packets coming from the camera.
- The difference is calculated once and applied to both video streams to keep them synchronized.
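The three steps above can be sketched as follows (a hypothetical Python sketch, not the server's actual code):

```python
class TimestampNormalizer:
    """Replaces untrusted camera timestamps with server-clock-based ones.

    The offset is computed once, from the first packet, and then applied
    to every subsequent packet of both streams so they stay in sync.
    """

    def __init__(self):
        self._offset = None  # server_time - camera_time, fixed after the first packet

    def normalize(self, camera_ts: float, server_now: float) -> float:
        if self._offset is None:
            # Computed from the first packet; inherently includes the
            # network delay, since the packet took time to arrive.
            self._offset = server_now - camera_ts
        return camera_ts + self._offset
```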
To disable this function:
- Open Nx Meta Desktop and go to Camera Settings
- Find the Expert tab, and check Trust camera timestamp
This option will make the server trust timestamps coming from the camera, as long as the time difference between the server and camera is less than 10 seconds. In this mode, network delay doesn’t affect the timestamp.
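The trust behavior might look like this (an illustrative sketch; the names and the fallback are simplifications of the real logic):

```python
def effective_timestamp(camera_ts: float, server_now: float,
                        trust_camera: bool, max_skew: float = 10.0) -> float:
    # In "Trust camera timestamp" mode the camera clock is used verbatim,
    # but only while it stays within 10 seconds of the server clock.
    if trust_camera and abs(server_now - camera_ts) < max_skew:
        return camera_ts
    # Otherwise fall back to server-based timing (simplified here).
    return server_now
```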
The server streams video with timestamps in accordance with RFC 2326. To receive absolute timestamp values, the client application should specify the header "Require: onvif-replay". The server will then include the absolute timestamp of every frame in an RTP header extension, in accordance with the ONVIF format.
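On the receiving side, a client can pull the absolute time out of the RTP header extension. The sketch below assumes the layout from the ONVIF Streaming Specification (a 16-bit profile-defined ID, a 16-bit length of 3 words, then a 64-bit NTP timestamp); treat the constants as assumptions rather than guaranteed values:

```python
import struct

ONVIF_EXT_ID = 0xABAC  # profile-defined ID from the ONVIF Streaming Spec


def parse_onvif_extension(ext: bytes) -> float:
    """Extract the absolute timestamp (seconds since the NTP epoch, 1900)
    from an ONVIF replay RTP header extension."""
    ext_id, length = struct.unpack_from("!HH", ext, 0)
    assert ext_id == ONVIF_EXT_ID and length == 3
    ntp_sec, ntp_frac = struct.unpack_from("!II", ext, 4)
    return ntp_sec + ntp_frac / 2**32
```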
As an alternative to setting a header, the client application can use the onvif_replay query parameter. Either way, clients can report an accurate and stable timestamp for each video frame played back.
Example link using onvif-replay
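The exact URL scheme depends on the deployment; as a purely hypothetical illustration (server address, port, camera id, and the parameter value are placeholders):

```
rtsp://<server-address>:7001/<cameraId>?onvif_replay=1
```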
The server supports streaming RTSP over TCP or UDP. Transport is specified by RTSP client according to the standard: RFC 2326.
TCP is the recommended transport protocol, as the overhead is only a few milliseconds (less than 12 ms).
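Per RFC 2326, the client chooses the transport in its SETUP request. A hedged example of requesting TCP-interleaved delivery (URL and channel numbers are placeholders):

```
SETUP rtsp://<server-address>:7001/<cameraId>/trackID=0 RTSP/1.0
CSeq: 3
Transport: RTP/AVP/TCP;unicast;interleaved=0-1
```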
There are two ways to use multicast streams in the mediaserver:
General Multicast Plugin
Functions as a plugin and can accept any multicast video/audio data in MPEG-TS format. To use it, add a stream URL such as "udp://239.0.0.1:5000" (the address must be in the IPv4 multicast range, 224.0.0.0/4) using Nx Meta Desktop. With this plugin, the server does not control the provided multicast stream.
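Receiving such a stream boils down to joining the multicast group and reading MPEG-TS datagrams. A minimal Python sketch (addresses and ports are examples only):

```python
import ipaddress
import socket
import struct


def is_multicast(addr: str) -> bool:
    # Valid IPv4 multicast addresses fall in 224.0.0.0/4.
    return ipaddress.ip_address(addr).is_multicast


def open_mpegts_socket(group: str, port: int) -> socket.socket:
    # Join the multicast group and return a socket ready to receive
    # raw MPEG-TS datagrams (188-byte packets, sync byte 0x47).
    assert is_multicast(group), "expected an address in 224.0.0.0/4"
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    mreq = struct.pack("4s4s", socket.inet_aton(group),
                       socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock
```

A reader would then scan each datagram for 188-byte TS packets beginning with the 0x47 sync byte.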
RTSP Multicast Stream
The server can configure a multicast stream from the camera automatically. Nx Meta Desktop lets users select the transport type under Camera Settings -> Expert tab -> Media Streaming section -> RTP Transport. When multicast is selected, the server changes the RTP transport to multicast while establishing the RTSP session; the camera then starts a multicast stream and the server joins the group.
When the stream is no longer needed, the server leaves the multicast group, and the camera closes the stream if no other clients are using it.
Video and metadata are stored separately on the hard drives. Metadata is stored in an SQLite database file, while the video is stored as a set of ~1-minute files. During playback, the mediaserver retrieves metadata from the database and creates a custom proprietary RTSP track. The client parses the RTSP stream, extracts the video and metadata tracks, and synchronizes them during playback.
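The lookup idea can be sketched with a hypothetical schema (the real database layout is internal and differs):

```python
import sqlite3

# Hypothetical schema: one row per ~1-minute recorded file.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE chunk (
    camera_id TEXT, start_ms INTEGER, duration_ms INTEGER, path TEXT)""")
conn.executemany("INSERT INTO chunk VALUES (?, ?, ?, ?)", [
    ("cam1", 0,      60000, "/data/cam1/000.mkv"),
    ("cam1", 60000,  60000, "/data/cam1/001.mkv"),
    ("cam1", 120000, 60000, "/data/cam1/002.mkv"),
])


def chunks_for_range(camera_id: str, begin_ms: int, end_ms: int) -> list:
    # Return the files overlapping [begin_ms, end_ms); playback then
    # stitches them into a single RTSP session with the metadata track.
    rows = conn.execute(
        "SELECT path FROM chunk WHERE camera_id = ? "
        "AND start_ms < ? AND start_ms + duration_ms > ? "
        "ORDER BY start_ms", (camera_id, end_ms, begin_ms))
    return [r[0] for r in rows]


print(chunks_for_range("cam1", 30000, 90000))
# → ['/data/cam1/000.mkv', '/data/cam1/001.mkv']
```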