We read an RTSP video stream* in our third-party application. We then detect objects in the stream frame by frame. Finally, we create an HTTP generic event using a timestamp in milliseconds. However, we are unable to synchronize the event created in Nx MetaVMS with the frames we have in our third-party application: there is a delay of a few seconds (1-4 s) between the frame we use to create the event and the frame displayed on the Nx MetaVMS events screen.
We have tried several strategies for setting this timestamp in our application:
1. Using the local timestamp taken at the moment we read the frame.
2. Using the presentation timestamp (PTS) provided by the RTSP stream.
3. Using the decoding timestamp (DTS) provided by the RTSP stream.
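For context, here is a minimal sketch of how we fire the event using strategy 1 (the server address and camera name are placeholders, and authentication is omitted; the `/api/createEvent` endpoint and its `timestamp` parameter are taken from the Nx server HTTP API):

```python
import time
from urllib.parse import urlencode

# Placeholder server address; substitute your own, and add authentication.
SERVER = "https://127.0.0.1:7001"

def local_timestamp_ms() -> int:
    """Strategy 1: local wall-clock time, in milliseconds since epoch,
    sampled at the moment the frame is read from the RTSP stream."""
    return int(time.time() * 1000)

def build_create_event_url(timestamp_ms: int, source: str, caption: str) -> str:
    """Build a generic-event request for the Nx server's /api/createEvent
    endpoint; 'timestamp' is the event time in milliseconds."""
    query = urlencode({
        "timestamp": str(timestamp_ms),
        "source": source,
        "caption": caption,
    })
    return f"{SERVER}/api/createEvent?{query}"

# One event per detected object, stamped with the frame-read time.
url = build_create_event_url(local_timestamp_ms(), "testcamera", "object detected")
print(url)
```

Strategies 2 and 3 only differ in where `timestamp_ms` comes from (the stream's PTS or DTS mapped to epoch time instead of the local clock); all three produce the offset described above.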
None of them has worked, and we have also tried the "Trust camera timestamp" option. So how do you pick the frame to display when you receive an HTTP event? Which timestamp do you use?
*This stream is generated by your testcamera utility, always using the same test video to ensure the experiment is reproducible.