testcamera: synchronize multiple streams

Answered

6 comments

  • Anton Babinov

    Hi Mat,


    Generally, if you open 10 streams of the same camera in a desktop client, you'll see that the streams won't be perfectly synced. Also, the server does not start reading a stream from the first frame of the video file, as there is some delay between testcamera starting to stream and the server opening the stream. It is unlikely that the first frame the server receives from camera 1 corresponds to the first frame it receives from camera 2. Check my example below:
    T0 -> T1 -> T2 -> T3
    1. At T0 you start the testcamera tool, or the camera streams otherwise become available.
    2. At T1 the server opens the stream from cam 1 and starts receiving frames.
    3. At T2 the server opens the stream from cam 2 and starts receiving frames. By this time, the server has already received some frames from cam 1.

    Do you measure the delay between frames in your plugin somehow, or do you check by viewing the streams in the desktop UI? Could you please describe how you sync frames in your code?

  • Mat L

    Hi,

    The plugin I'm developing sends all uncompressed frames and their timestamps (videoFrame->timestampUs()) to another "synchronizer" program I'm also developing.

    This so-called "synchronizer" is responsible for grouping frames from different cameras that have approximately the same timestamps (i.e., they all fall within the same 100 ms interval, which is precise enough for the tracking task). Grouped frames are then processed by the multi-camera object tracking system.
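    For what it's worth, the grouping step described above could be sketched roughly like this. This is a minimal, hypothetical sketch: `Frame`, `groupFrames`, and `kWindowUs` are illustrative names I made up, not part of the Nx SDK; the only assumption carried over from the thread is that each frame carries a microsecond timestamp as returned by `videoFrame->timestampUs()`.

    ```cpp
    #include <cstdint>
    #include <cstdio>
    #include <map>
    #include <vector>
    #include <cassert>

    // Illustrative frame record: camera id plus the microsecond timestamp
    // (e.g. the value obtained from videoFrame->timestampUs() in the plugin).
    struct Frame {
        int cameraId;
        int64_t timestampUs;
    };

    constexpr int64_t kWindowUs = 100'000; // 100 ms grouping window

    // Buckets frames by 100 ms timestamp window. Frames landing in the same
    // bucket are treated as "simultaneous enough" for multi-camera tracking.
    std::map<int64_t, std::vector<Frame>> groupFrames(const std::vector<Frame>& frames) {
        std::map<int64_t, std::vector<Frame>> groups;
        for (const auto& f : frames)
            groups[f.timestampUs / kWindowUs].push_back(f);
        return groups;
    }

    int main() {
        std::vector<Frame> frames = {
            {1, 1'000'040'000}, {2, 1'000'070'000}, // same 100 ms window
            {1, 1'000'140'000},                     // next window
        };
        auto groups = groupFrames(frames);
        assert(groups.size() == 2);
        assert(groups.at(10000).size() == 2); // cam 1 and cam 2 grouped together
        std::printf("%zu groups\n", groups.size());
        return 0;
    }
    ```

    One caveat with fixed windows: two frames only 1 ms apart can still fall into adjacent buckets if they straddle a window boundary, so a real synchronizer might instead match each frame to its nearest neighbour per camera.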

    As you can guess from this description: during tests, I want to replicate the behaviour of multiple cameras sending (approximately) synchronized frames.

    By the way, is it correct to assume that videoFrame->timestampUs() gives a timestamp suitable for the task I'm trying to achieve? I'm aware that network latency might affect this timestamp, but it should be well below 100 ms. The camera sends a frame => local network => Nx Witness receives the frame and passes it to the plugin: in the end, is timestampUs() approximately the time at which the camera sent the frame?

  • Evgeny Balashov

    Mat, with testcamera you can try this hack:

    1. You need the server to open all the streams at the same time. To do that, add the cameras to the server, disable recording and analytics, close all cameras, and then open all of them at the same time on a layout. The server will try to request all streams simultaneously.
    2. Then you need the server to maintain the connections continuously. To do that, enable continuous recording on all cameras (while they are still open on the layout). This should force the server to maintain the connections after you close the layout.

    This should help you: the streams will be as close as they can be with testcamera, but if a connection randomly closes, they will get out of sync again. Restarting testcamera or the server might also help you get more or less predictable results.

    If those things do not produce good enough results, then testcamera will not help. In that case, you can try third-party streaming software (for example, Wowza).

  • Mat L

    Thanks for the idea, Evgeny. I'll try this "hack".

    Do you think that opening the streams with the Video API (rtsp://localhost:7001/[camera id]...) would have the same effect? That way, it might be easier to open the streams within a short time window.

    I hope you'll have some time to answer my question about videoFrame->timestampUs() (last paragraph of my previous post).

  • Evgeny Balashov

    Yes, using Video API should have the same effect.

    Sorry for the delay.

  • Andrey Terentyev

    Hello Mat,

    By the way, is it correct to assume that videoFrame->timestampUs() gives a timestamp suitable for the task I'm trying to achieve? I'm aware that network latency might affect this timestamp, but it should be well below 100 ms. The camera sends a frame => local network => Nx Witness receives the frame and passes it to the plugin: in the end, is timestampUs() approximately the time at which the camera sent the frame?

    If you set "Trust camera timestamp" in the camera settings' "Expert" tab in Nx Desktop, timestampUs() will return the value set by the camera; otherwise, the method returns the value set by the Server when a frame is received.

