Video Data Workflow
Internal Server Data Sources and Data Consumers
When a camera is connected, the server creates a data source. Several data consumers then subscribe to it, including the storage recorder, motion detector, desktop client, and analytics plugins.
Each data consumer runs in a separate thread in asynchronous mode, which minimizes potential delays. The latency introduced by the server is typically a few milliseconds (less than 12 ms). Every thread receives a pointer to the same memory buffer, so no data copying takes place.
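The fan-out described above can be sketched as a simple publish/subscribe structure in which every consumer receives a reference to the same frame buffer rather than a copy. This is only an illustration; the class and consumer names are assumptions, not the server's actual API, and the real server runs each consumer in its own thread.

```python
import queue

class DataSource:
    """Fans frames out to subscribed consumers without copying the payload."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, name):
        # In the real server each consumer would drain its queue
        # in a dedicated thread; here we just hand back the queue.
        q = queue.Queue()
        self.subscribers.append((name, q))
        return q

    def publish(self, frame):
        # Every consumer queue receives a reference to the same frame
        # object -- no per-consumer copy is made.
        for _, q in self.subscribers:
            q.put(frame)

source = DataSource()
recorder_q = source.subscribe("storage recorder")
motion_q = source.subscribe("motion detector")

frame = bytearray(1024)  # stand-in for an encoded video frame
source.publish(frame)

# Both consumers see the exact same buffer (identical object identity).
a = recorder_q.get()
b = motion_q.get()
print(a is b is frame)  # True
```

Because only pointers are passed around, adding another consumer (e.g. an analytics plugin) adds negligible memory cost.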
The server supports RTCP and the ONVIF RTSP extension to receive absolute timestamp values from the camera:
- RTCP: https://tools.ietf.org/html/rfc3550
- ONVIF extension: https://www.onvif.org/specs/stream/ONVIF-Streaming-Spec-v210.pdf
By default, the server doesn’t trust the camera’s timestamp:
- When the first packet is received from the camera, the server calculates the difference between the server and camera clocks. This difference also includes the network delay.
- The difference is then applied to all subsequent packets coming from the camera.
- The difference is calculated once and used for both video streams to keep them synchronized.
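The steps above can be sketched as follows. This is a minimal illustration; the class name and units are assumptions, not the server's actual code.

```python
class TimestampNormalizer:
    """Applies a server/camera clock offset, computed once from the first packet."""
    def __init__(self):
        self.offset = None  # seconds; inevitably includes the network delay

    def normalize(self, camera_ts, server_now):
        if self.offset is None:
            # Computed once, on the first packet, and then reused for
            # both video streams so they stay synchronized.
            self.offset = server_now - camera_ts
        return camera_ts + self.offset

norm = TimestampNormalizer()
# In this example the camera clock is 100 s behind the server clock.
print(norm.normalize(camera_ts=1000.0, server_now=1100.0))  # 1100.0
print(norm.normalize(camera_ts=1001.0, server_now=1100.4))  # 1101.0
```

Note that the second call ignores `server_now`: once the offset is fixed, packet timing is derived entirely from the camera's own timestamps plus the stored offset.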
To disable this function:
- Open the Nx Desktop Client and go to Camera Settings
- Open the Expert tab and check Trust camera timestamp
This option makes the server trust timestamps coming from the camera, as long as the time difference between the server and the camera is less than 10 seconds. In this mode, network delay doesn’t affect the timestamp.
The server streams video with timestamps in accordance with RFC 2326. To receive absolute timestamp values, the client application should send the header "Require: onvif-replay". The server will then include the absolute timestamp of every frame in an RTP header extension, formatted according to the ONVIF Streaming Specification.
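A client that requested onvif-replay can decode the absolute timestamp from the RTP header extension. Per the ONVIF Streaming Specification, the extension data begins with a 64-bit NTP timestamp (32-bit seconds and 32-bit fraction, counted from 1900). The sketch below decodes only that timestamp and ignores the trailing flag bits; it is a simplified illustration, not a full RTP parser.

```python
import struct

NTP_EPOCH_OFFSET = 2208988800  # seconds between the NTP epoch (1900) and Unix epoch (1970)

def parse_onvif_timestamp(ext_payload):
    """Extract the absolute Unix timestamp from an ONVIF replay RTP header extension.

    ext_payload is the extension data that follows the RTP extension
    header; the first 8 bytes hold the NTP timestamp.
    """
    seconds, fraction = struct.unpack("!II", ext_payload[:8])
    return (seconds - NTP_EPOCH_OFFSET) + fraction / 2**32

# Build a sample payload for Unix time 1600000000.5 and decode it back.
payload = struct.pack("!II", 1600000000 + NTP_EPOCH_OFFSET, 2**31)
print(parse_onvif_timestamp(payload))  # 1600000000.5
```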
The server supports RTSP streaming over TCP or UDP; multicast streaming is not supported. The transport is specified by the RTSP client according to the standard (RFC 2326).
TCP is the recommended transport protocol, as its overhead is only a few milliseconds (less than 12 ms).
Video and metadata are stored separately on the hard drives. Metadata is stored in an SQLite database, while video is stored as a set of ~1-minute files. During playback, the media server retrieves metadata from the database and creates a custom, proprietary RTSP track. The client parses the RTSP stream, extracts the video and metadata tracks, and synchronizes them while playing the video.
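The storage layout described above can be sketched as a time-indexed table of ~1-minute chunk files. The table and column names below are illustrative only, not the server's actual schema.

```python
import sqlite3

# An in-memory stand-in for the metadata database: each row indexes
# one ~1-minute video file by its start time and duration.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE chunks (start_ms INTEGER, duration_ms INTEGER, path TEXT)")
db.executemany("INSERT INTO chunks VALUES (?, ?, ?)", [
    (0,       60_000, "cam1/000.mkv"),
    (60_000,  60_000, "cam1/001.mkv"),
    (120_000, 60_000, "cam1/002.mkv"),
])

def chunks_for_playback(start_ms, end_ms):
    """Return the chunk files overlapping the requested playback window."""
    rows = db.execute(
        "SELECT path FROM chunks "
        "WHERE start_ms < ? AND start_ms + duration_ms > ? "
        "ORDER BY start_ms",
        (end_ms, start_ms))
    return [r[0] for r in rows]

# A request for 30-90 s touches the first two chunk files only.
print(chunks_for_playback(30_000, 90_000))  # ['cam1/000.mkv', 'cam1/001.mkv']
```

The overlap test (`start < window_end AND start + duration > window_start`) is what lets playback begin mid-chunk rather than only on file boundaries.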