HTTP Generic Event timestamp synchronization

Answered

11 comments

  • Nx Support

    Hi Eduardo,

    Previews generated for Generic Events are taken from the closest key-frame; they do not use exact timestamps. This is done to conserve resources on the server.

    The best way to solve the problem is to integrate through the SDK; in that case, the previews will be precise.

  • Eduardo Cuesta

    Yes, we will use the plugin.

    But in any case, if we read the video streams from the Nx server, how can we get the timestamps in these streams? The streams coming from the Nx server do carry timestamps, right?

  • Nx Support

    When you request an RTSP stream from the server, you can add ?onvif_replay=1 to the URL and the server will send absolute timestamps in the RTSP stream.

    When you receive frames through the SDK, every frame carries a timestamp by default.

  • Eduardo Cuesta

    Great, so we have modified our pipeline to do the following:

    1. Open the stream using the URL:

    rtsp://user:password@192.168.2.105:7001/e3e9a385-7fe0-3ba5-5482-a86cde7faf48?onvif_replay=1

    2. Grab video frames in our third-party application using live555. This library provides presentation times synchronised via the RTCP sender reports [1][2]. We use these presentation times as the absolute timestamps (see the sketch below).

    3. We send objects with the corresponding timestamp information to a plugin running on the Nx Meta VMS server, which exposes a REST API.

    4. This plugin creates the corresponding Nx object using the absolute timestamp we sent before.

    The behaviour is better, but we still see a significant delay between what we see in our third-party application (using live555) and the thumbnails in Nx Meta VMS.

    [1] http://lists.live555.com/pipermail/live-devel/2008-July/009224.html

    [2] https://live-devel.live.narkive.com/WgLSYFbC/synchronize-the-rtcp-time-using-rtsp-range
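
    To illustrate step 2, here is a minimal sketch of a live555 frame sink that reads the RTCP-synchronised presentation times and converts them to microseconds since the Unix epoch. It assumes the usual RTSP client scaffolding from live555's testRTSPClient.cpp (RTSPClient, DESCRIBE/SETUP/PLAY, event loop); handleFrame() is a hypothetical hook for the detector, not a live555 API.

    // Sketch only: a MediaSink that captures RTCP-synchronised presentation times.
    #include <stdint.h>
    #include <liveMedia.hh>

    class TimestampSink: public MediaSink
    {
    public:
        static TimestampSink* createNew(UsageEnvironment& env, MediaSubsession& subsession)
        {
            return new TimestampSink(env, subsession);
        }

    private:
        enum { kBufferSize = 200000 };

        TimestampSink(UsageEnvironment& env, MediaSubsession& subsession)
            : MediaSink(env), fSubsession(subsession), fBuffer(new u_int8_t[kBufferSize]) {}

        virtual ~TimestampSink() { delete[] fBuffer; }

        // Trampoline with the callback signature live555 expects.
        static void afterGettingFrame(void* clientData, unsigned frameSize,
            unsigned /*numTruncatedBytes*/, struct timeval presentationTime,
            unsigned /*durationInMicroseconds*/)
        {
            ((TimestampSink*) clientData)->afterGettingFrame1(frameSize, presentationTime);
        }

        void afterGettingFrame1(unsigned frameSize, struct timeval presentationTime)
        {
            // presentationTime is an absolute (NTP wall-clock) time only once the
            // first RTCP sender report has been received for this subsession.
            const bool synced = fSubsession.rtpSource() != NULL
                && fSubsession.rtpSource()->hasBeenSynchronizedUsingRTCP();

            // Microseconds since the Unix epoch -- the value we later hand to the
            // plugin as the absolute timestamp.
            const int64_t timestampUs =
                (int64_t) presentationTime.tv_sec * 1000000 + presentationTime.tv_usec;

            if (synced)
                handleFrame(fBuffer, frameSize, timestampUs); // hypothetical detector hook

            continuePlaying(); // request the next frame
        }

        virtual Boolean continuePlaying()
        {
            if (fSource == NULL) return False;
            fSource->getNextFrame(fBuffer, kBufferSize, afterGettingFrame, this,
                onSourceClosure, this);
            return True;
        }

        void handleFrame(const u_int8_t*, unsigned, int64_t) { /* detector goes here */ }

        MediaSubsession& fSubsession;
        u_int8_t* fBuffer;
    };

    The epoch-based microsecond value computed this way should line up with the microsecond timestamps the plugin later passes to setTimestampUs.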

  • Andrey Terentyev

    Hello Eduardo,

    Although the pipeline you described is possible, the recommended one is presented below in order to minimize delays.

    1. Create an analytics plugin using the Metadata SDK. The plugin runs on the same machine as the Server, in the same process as the Server. Once the plugin is enabled on a camera, the Server passes all the video frames from this camera to the plugin. This happens within the same process, so no network is involved, which reduces delays a lot. (A sketch of this step follows after the scheme below.)

    Your plugin then sends either all the frames or just some of them to the "third-party application" on your remote machine. This can significantly reduce the delays, too:

    a) You can detect objects not in every frame, but only in selected ones. Since this step does not perform complete frame processing, let's call it pre-processing.

    b) Once frames are pre-processed, only those particular frames are sent over the network to the "third-party application" for further processing.

    2. "Third party application" processes the frames and sends back to the plugin object metadata.

    3. On receiving it, the plugin passes the object metadata to the Server.

    Here is the scheme.

    Nx Witness Server --1. passes frames--> analytics plugin --2. sends frames over the network--> "third-party application"

    "third-party application" --3. sends metadata over the network--> analytics plugin --4. passes metadata--> Nx Witness Server

    Here is a similar case:

    https://support.networkoptix.com/hc/en-us/community/posts/360038771573-Integrate-video-source-SDK-plugin-in-Nx-Witness-4-0-Trail-Version
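
    To make step 1 of this scheme concrete, here is a minimal sketch of the frame-consuming side of such a plugin, assuming the ConsumingDeviceAgent helper class shipped with the Metadata SDK samples. isWorthProcessing() and forwardFrameToRemoteApp() are hypothetical placeholders for your own pre-processing logic and transport; the rest of the plugin (Engine, manifest, entry point) would follow the SDK sample code.

    // Sketch only: the Server calls pushUncompressedVideoFrame() in-process for
    // every frame of a camera this plugin is enabled on.
    #include <stdint.h>

    #include <nx/sdk/analytics/helpers/consuming_device_agent.h>
    #include <nx/sdk/analytics/i_uncompressed_video_frame.h>

    class DeviceAgent: public nx::sdk::analytics::ConsumingDeviceAgent
    {
    protected:
        virtual bool pushUncompressedVideoFrame(
            const nx::sdk::analytics::IUncompressedVideoFrame* videoFrame) override
        {
            // Every frame delivered by the Server already carries a timestamp,
            // in microseconds.
            const int64_t timestampUs = videoFrame->timestampUs();

            // 1.a) Cheap pre-processing: decide whether this frame deserves full
            // processing at all (hypothetical check).
            if (!isWorthProcessing(videoFrame))
                return true; // keep consuming frames, nothing to forward

            // 1.b) Send only the selected frame, together with its timestamp, to
            // the "third-party application" (hypothetical transport).
            forwardFrameToRemoteApp(videoFrame, timestampUs);
            return true;
        }

    private:
        // Hypothetical helpers, to be implemented by the integrator.
        bool isWorthProcessing(const nx::sdk::analytics::IUncompressedVideoFrame* frame) const;
        void forwardFrameToRemoteApp(
            const nx::sdk::analytics::IUncompressedVideoFrame* frame, int64_t timestampUs);

        // The constructor, manifestString() and the metadata path are omitted;
        // see the Metadata SDK sample plugins for the full skeleton.
    };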

  • Andrey Terentyev

    An improvement to the recommended scenario:

    You could run your "third-party application" in a separate process (or container, or VM) on the same machine the Server runs on. In this case, physical network delays are eliminated.

     

  • Eduardo Cuesta

    Thank you very much for your prompt response. 

    I am not concerned about the delay, but rather about the synchronisation. I can afford a small delay in the visualisation, but object detections that are out of sync with the frames do not look good. Therefore, I want to develop the scenario I proposed further. Moreover, I want to avoid (as far as possible) a scenario where I send the frames from the plugin through a socket (or anything like that) to my third-party application, since that seems to me like reinventing the wheel... we are already using RTSP, so it should be possible to synchronise some metadata on top of the stream, I think.

    So, I wonder... how do you obtain the timestamp within the server in the following scenarios?

    1. Trust camera timestamps disabled: I guess you take it from the system clock.

    2. Trust camera timestamps enabled: here we call the method setTimestampUs within the plugin, passing the timestamp we grabbed from the stream in our third-party application with live555. Is that compatible with your implementation?

     

     

  • Nx Support

    1. Trust camera timestamps disabled: the Server ignores timestamps from the camera and assigns the server time to every frame.

    2. Trust camera timestamps enabled: the Server keeps timestamps from the camera as long as the difference between camera time and server time is less than 10 seconds. (A sketch of how the plugin side can set this timestamp follows at the end of this comment.)

     

    Regarding the synchronization issue: can you please create a short video demonstrating the problem? (If you are running the Nx Client on Windows, you can use the screen recording feature to do it.)

    To debug the problem further, we need to understand which specific component introduces the discrepancy.

    Can you try to track a single frame through the pipeline from beginning to end, saving the timestamp at every step?

    Also, you can request a precise image from the server for any given timestamp, compare it to a saved frame, and see whether the server returns the image you saved in the pipeline.
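
    To make the setTimestampUs flow from point 2 above concrete, here is a minimal sketch of how the plugin could turn a detection received from the third-party application into Nx object metadata stamped with the externally measured absolute timestamp. It assumes the ObjectMetadata / ObjectMetadataPacket helper classes and the pushMetadataPacket() call used in the Metadata SDK samples; Detection, kObjectTypeId and handleDetection() are hypothetical names, and DeviceAgent is the ConsumingDeviceAgent subclass sketched earlier in this thread.

    // Sketch only: attaching the externally measured absolute timestamp to an
    // object metadata packet, so the Server matches it to the right frame.
    #include <stdint.h>
    #include <string>

    #include <nx/sdk/analytics/helpers/object_metadata.h>
    #include <nx/sdk/analytics/helpers/object_metadata_packet.h>
    #include <nx/sdk/analytics/rect.h>
    #include <nx/sdk/helpers/uuid_helper.h>
    #include <nx/sdk/ptr.h>

    struct Detection // hypothetical payload received from the third-party application
    {
        int64_t timestampUs; // absolute time, microseconds since the Unix epoch
        float x, y, width, height; // bounding box, normalized to [0..1]
    };

    static const std::string kObjectTypeId = "my.company.detectedObject"; // hypothetical

    void DeviceAgent::handleDetection(const Detection& detection)
    {
        using namespace nx::sdk;
        using namespace nx::sdk::analytics;

        const auto objectMetadata = makePtr<ObjectMetadata>();
        objectMetadata->setTypeId(kObjectTypeId);
        objectMetadata->setTrackId(UuidHelper::randomUuid());
        objectMetadata->setBoundingBox(
            Rect(detection.x, detection.y, detection.width, detection.height));

        const auto packet = makePtr<ObjectMetadataPacket>();
        // The timestamp measured in the third-party application, not the time the
        // metadata happened to arrive at the plugin.
        packet->setTimestampUs(detection.timestampUs);
        packet->addItem(objectMetadata.get());

        // Hand the packet to the Server; it is associated with the archive and
        // the thumbnails by this timestamp.
        pushMetadataPacket(packet.releasePtr());
    }

    With "Trust camera timestamps" enabled, the Server should keep this timestamp as long as it stays within the 10-second window described above.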

  • Eduardo Cuesta

    I have performed two experiments in order to debug the issue further.

    Experiment 1

    For this experiment, I streamed a low-framerate video with your testcamera application, with the frame number overlaid at the bottom. I blurred this video for privacy reasons.

    I have tracked some frames, comparing the thumbnails of Nx Meta VMS on the left with the images from my pipeline on the right. You can see a difference of 2-4 frames between the server and my pipeline images, which corresponds to a delay of around 600 ms.

    Experiment 2

    For this experiment, I used an IP RTSP camera. I recorded a physical chronometer and compared images from the server and from my pipeline, measuring a delay of 600-700 ms between the two. In this experiment, the images presented in Nx Meta VMS are late (compared to those of my pipeline).

    Moreover, I have noticed no difference whether or not I use the "Trust camera timestamp" option. I think that makes sense, since my pipeline captures the stream generated by Nx Meta VMS.

  • Andrey Terentyev

    Hello Eduardo,

    I'm going to create a ticket for further investigation of this issue. Please check your email inbox for a message from the ticket system.

    Upon resolution, the result will be published in the forum.

  • Andrey Terentyev

    Solution from the support ticket:

    For the live stream, Nx Desktop displays object metadata and thumbnails in the right-hand panel as it receives them, i.e. without regard to, or syncing with, the frames displayed on the scene.
    So, there might be a situation where you see an object thumbnail and its metadata in the right-hand panel but don't see a corresponding bounding box on the live picture on the scene, because the metadata arrives too late to be displayed on the live frame: that frame is already outdated and can't be shown live anymore.
    There is an intentionally introduced delay of at most 2 seconds before the frame is displayed, precisely in order to absorb this latency between the metadata and the frame.
    This delay is calculated dynamically from the actual metadata arrival statistics.
    So, you might see a delay of up to 2 seconds between the physical scene and its representation in the live stream on the scene in Nx Desktop.

