Let's take a deep dive into the inner workings of the Server from a plugin's perspective. The diagram above can serve as a reference.
During the initialization of a video device, the Server creates a Stream Reader (one per device) that reads encoded video frames from the device.
The Server always decodes video frames if motion detection is configured on a device. If recording is turned on for the device, the Server writes the encoded video stream (without transcoding it) to the video archive.
If a Plugin is enabled on a device, the Server creates a DeviceAgent class instance per device (see Plugin control flow) and a dedicated Muxer instance per DeviceAgent.
If a Plugin requests video frames (which is not always necessary), a special option in the capabilities section of the Engine manifest has to be set.
For example:
"capabilities": "needUncompressedVideoFrames_yuv420"
The Server supplies DeviceAgent with frames by calling pushUncompressedVideoFrame() or pushCompressedVideoFrame().
After processing a frame, DeviceAgent can generate object and event metadata and pass it to the Server as pointers to ObjectMetadataPacket and EventMetadataPacket class instances, respectively.
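The flow above can be sketched in a few lines of C++. This is a simplified model, not SDK code: the struct definitions stand in for the real nx::sdk::analytics interfaces, the object type id is hypothetical, and the packet is returned directly here, whereas a real DeviceAgent hands it to a Server-provided handler. The one detail the sketch insists on is that the metadata packet carries the same timestamp as the frame it was produced from, which is what lets the Muxer pair them later:

```cpp
#include <cassert>
#include <cstdint>
#include <memory>
#include <string>
#include <vector>

// Minimal stand-ins for the SDK types; the real classes carry more fields
// (attributes, track ids, etc.).
struct ObjectMetadata {
    std::string typeId;  // hypothetical object type id
    float x, y, w, h;    // bounding box in relative [0..1] coordinates
};

struct ObjectMetadataPacket {
    int64_t timestampUs = 0;  // must match the source frame timestamp
    std::vector<ObjectMetadata> objects;
};

struct CompressedVideoFrame {
    int64_t timestampUs = 0;
    std::vector<uint8_t> data;  // encoded bitstream (unused in this sketch)
};

// A DeviceAgent-style consumer: receives frames, emits metadata packets.
class DeviceAgentSketch {
public:
    // Mirrors pushCompressedVideoFrame() in spirit; returns the packet
    // directly instead of passing it to a handler, for simplicity.
    std::unique_ptr<ObjectMetadataPacket> pushCompressedVideoFrame(
        const CompressedVideoFrame& frame)
    {
        // A real agent would decode and analyze; here we fabricate one detection.
        auto packet = std::make_unique<ObjectMetadataPacket>();
        packet->timestampUs = frame.timestampUs;  // keep frame and metadata in sync
        packet->objects.push_back({"analytics.object.person", 0.1f, 0.2f, 0.3f, 0.4f});
        return packet;
    }
};
```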
Metadata is saved to the analytics DB, and pointers are passed to a Muxer, which combines video frames with the metadata and sends them to the Desktop client. In order for the metadata to be displayed in the live stream, the delay must be at most 2 seconds.
On accepting a metadata packet (i.e. a pointer to IMetadataPacket, of which ObjectMetadataPacket and EventMetadataPacket are descendants), the Muxer puts it into a reordering buffer with a 5-second limit, then writes the metadata to the archive along with the video frames.
NOTE: The combined video frames and metadata are sent to the client and recorded into the video archive simultaneously.
The metadata is written to a separate track of the .mkv file. The 5-second limit is introduced to reduce memory consumption. For the metadata to be displayed during video playback from the archive, the delay must not exceed this limit.
At the same time, the Server writes index data keyed by timestamp to the metadata DB, for searching. If a metadata packet is delayed by more than 5 seconds, it is written to another .mkv chunk, so the metadata may not line up with the corresponding video content.
The metadata track will still store a valid timestamp, but such metadata will not be displayed during playback in the GUI. It will still be searchable, thanks to the indexes in the DB.
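The 5-second reordering window can be modeled as follows. This is an illustrative sketch, not the Server's actual implementation: packets are held sorted by timestamp, anything that falls out of the window (relative to the newest timestamp seen) is flushed in order, and a packet arriving more than 5 seconds behind the newest one is flagged as late, corresponding to the "another .mkv chunk" case above:

```cpp
#include <cassert>
#include <cstdint>
#include <set>
#include <vector>

// Illustrative model of the Muxer's metadata reordering buffer (not actual
// Server code). Packets may arrive out of order; they are buffered and
// released sorted by timestamp, but only within a 5-second window.
class ReorderBuffer {
public:
    static constexpr int64_t kWindowUs = 5'000'000;  // 5-second limit

    // Returns the timestamps that are now safe to write, in order.
    // A packet arriving later than the window is marked "late": it keeps a
    // valid timestamp but would land in a different chunk and be skipped
    // during playback (while remaining searchable via the DB indexes).
    std::vector<int64_t> push(int64_t timestampUs, bool* late)
    {
        *late = timestampUs + kWindowUs < m_newestUs;
        if (timestampUs > m_newestUs)
            m_newestUs = timestampUs;
        m_buffer.insert(timestampUs);

        std::vector<int64_t> ready;
        // Flush everything that has fallen out of the reordering window.
        while (!m_buffer.empty() && *m_buffer.begin() + kWindowUs <= m_newestUs) {
            ready.push_back(*m_buffer.begin());
            m_buffer.erase(m_buffer.begin());
        }
        return ready;
    }

private:
    int64_t m_newestUs = 0;
    std::multiset<int64_t> m_buffer;
};
```

The trade-off the 5-second limit encodes is visible here: a larger window tolerates more reordering but holds more packets in memory and delays archive writes.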