Integrating with NVIDIA DeepStream

Answered

10 comments

  • Nx Support

    Hi Sebastian, it is possible to integrate Nx with DeepStream. 

    Here is a basic idea:

    A DeepStream pipeline consists of GStreamer plugins: the pipeline receives video, sends frames into the decoder, performs several processing steps, and retrieves metadata after the tracking step to do visualization or other tasks.

    To integrate DeepStream with Nx Server:

    Implement a Metadata SDK plugin with a modified DeepStream pipeline (a sketch follows after the list):

    • Replace the RTSP component and send frames from the Nx Server directly into the decoder
    • Remove the visualization component and send metadata back to the Nx Server
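
    A minimal sketch of that modified pipeline, using the GStreamer Python bindings: appsrc stands in for rtspsrc, and the visualization sink is replaced by appsink. Element names assume DeepStream 5.x; the push_frame_from_nx callback and the config file path are illustrative placeholders, not part of any SDK.

        import gi
        gi.require_version("Gst", "1.0")
        from gi.repository import Gst

        Gst.init(None)

        # rtspsrc is replaced by appsrc (frames arrive from the Nx Server plugin);
        # the visualization sink is replaced by appsink (metadata goes back to Nx).
        pipeline = Gst.parse_launch(
            'appsrc name=nx_src is-live=true format=time '
            'caps="video/x-h264,stream-format=byte-stream,alignment=au" '
            '! h264parse ! nvv4l2decoder '
            '! m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 '
            '! nvinfer config-file-path=config_infer.txt '
            '! appsink name=nx_sink emit-signals=true'
        )
        appsrc = pipeline.get_by_name("nx_src")

        def push_frame_from_nx(encoded_frame: bytes, pts_ns: int):
            """Hypothetical glue: called with one encoded frame from the Nx plugin."""
            buf = Gst.Buffer.new_wrapped(encoded_frame)
            buf.pts = pts_ns  # carry the Nx-provided timestamp through the pipeline
            appsrc.emit("push-buffer", buf)

        pipeline.set_state(Gst.State.PLAYING)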

  • Gourav Vemula

    How do I batch frames from multiple cameras and send them to NVIDIA DeepStream for analytics? I have successfully integrated Nx Witness with NVIDIA DeepStream, but it creates a separate instance for each pipeline. How can I do batched inference on multiple streams with a single DeepStream plugin?
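
    One possible direction, sketched here only as an assumption rather than a confirmed answer from the thread: give each camera its own decoder branch and link every branch into one nvstreammux, so a single nvinfer instance runs batched inference. Element names assume DeepStream; the URIs and config path are illustrative.

        import gi
        gi.require_version("Gst", "1.0")
        from gi.repository import Gst

        Gst.init(None)

        # Two sources share one nvstreammux; nvinfer then sees batches of 2 frames.
        pipeline = Gst.parse_launch(
            "nvstreammux name=mux batch-size=2 width=1280 height=720 "
            "! nvinfer config-file-path=config_infer.txt batch-size=2 "
            "! fakesink "
            "uridecodebin uri=rtsp://nx-server:7001/cam1 ! mux.sink_0 "
            "uridecodebin uri=rtsp://nx-server:7001/cam2 ! mux.sink_1"
        )
        pipeline.set_state(Gst.State.PLAYING)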

  • Floris De Smedt

    Hi,

    is there an example available of how to do this?

    Currently I'm connecting to an RTSP stream with rtspsrc, but I'm not able to retrieve the timestamp from that stream (it always shows up as 0 in the ntp_timestamp field).

    In your figure, it seems you skip the RTSP source. How would this pipeline be defined then?

    Best regards,
    Floris
    Senior Data Scientist, Smart City Applications
    Robovision

  • Tagir Gadelshin

    Hi, Floris De Smedt
    Good to hear that Robovision is creating an integration with Nx!

    What is your full scenario? Are you developing a plugin? The picture above implies that the plugin receives frames from the Nx Server, so there is no need to retrieve timestamps from RTSP; timestamps are handled on the Nx Server side, and the plugin simply receives frames.

    But in case you need a timestamp from RTSP, you should have a look at RTCP packets called Sender Report:
    https://datatracker.ietf.org/doc/html/rfc3550#section-6.4.1
    https://datatracker.ietf.org/doc/html/rfc3550#section-4

    Some other useful links and discussions:
    https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_NTP_Timestamp.html
    https://stackoverflow.com/questions/20094998/retrieving-timestamp-in-rtp-rtsp
    https://stackoverflow.com/questions/6149983/h-264-rtsp-absolute-timestamp
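
    As an illustration of what those Sender Reports carry, here is a minimal sketch that decodes the NTP timestamp from a raw RTCP SR packet (packet type 200, RFC 3550 §6.4.1); how you capture the raw packet from the transport is left out:

        import struct

        # Seconds between the NTP epoch (1900) and the Unix epoch (1970).
        NTP_UNIX_OFFSET = 2208988800

        def parse_sender_report(packet: bytes):
            """Return (ssrc, unix_time, rtp_timestamp) from an RTCP SR, else None."""
            if len(packet) < 20 or packet[1] != 200:  # payload type 200 == SR
                return None
            ssrc, ntp_msw, ntp_lsw, rtp_ts = struct.unpack_from("!IIII", packet, 4)
            unix_time = ntp_msw - NTP_UNIX_OFFSET + ntp_lsw / 2**32
            return ssrc, unix_time, rtp_ts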

  • Floris De Smedt

    Hi,

    yes, we are working on integration(s) with Network Optix.

    The "default" way of processing a frame completely in the plugin itself has some limitations for us:

    • We make use of (Python) libraries for task parallelism in complex processing pipelines
    • Most of our code is Python-based
    • The Nx Server and the machine(s) that perform the processing are not always the same system

    Therefore, for the demonstrator I worked on in the past, I used a different architecture: we read the RTSP stream (which has the timestamp embedded), did the processing in our own pipeline (which can run on any machine), and posted the metadata on a message bus. The plugin we wrote subscribes to the same message bus, reads the metadata, and stores it according to the SDK (a sketch follows below).

    Although the C/C++-based DeepStream would allow us to put all processing into the plugin, we have a few projects where information from different cameras needs to be combined to form the metadata, so the message-bus architecture would still give us some additional flexibility :)
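
    A minimal sketch of that message-bus leg, assuming ZeroMQ PUB/SUB; the endpoint, topic name, and JSON metadata schema are illustrative assumptions, not an Nx or Robovision format:

        import json
        import zmq

        def publish_detections(endpoint="tcp://0.0.0.0:5555"):
            """Processing side: publish one metadata message per analyzed frame."""
            pub = zmq.Context.instance().socket(zmq.PUB)
            pub.bind(endpoint)
            pub.send_multipart([b"detections", json.dumps({
                "camera_id": "cam-01",             # illustrative schema
                "timestamp_us": 1633036800000000,  # frame timestamp, microseconds
                "boxes": [{"x": 0.1, "y": 0.2, "w": 0.3, "h": 0.4, "label": "person"}],
            }).encode()])

        def consume_detections(endpoint="tcp://127.0.0.1:5555"):
            """Plugin side (conceptually; the real consumer is the C++ Nx plugin)."""
            sub = zmq.Context.instance().socket(zmq.SUB)
            sub.connect(endpoint)
            sub.setsockopt(zmq.SUBSCRIBE, b"detections")
            while True:
                topic, payload = sub.recv_multipart()
                metadata = json.loads(payload)  # then handed to the Metadata SDK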

    Currently I'm looking into using appsrc for the DeepStream pipeline, to start from an RTSP stream. I did try the DeepStream manner of reading timestamps, but they still end up being 0 all the time.

  • Tagir Gadelshin

    Floris De Smedt
    I've seen in another thread that you are using the ?onvif_replay=1 attribute when requesting the RTSP stream, but can you confirm that you are doing that in this scenario as well?
    In general, I see that you stumbled on synchronization issues at that time; does that mean that you managed to take timestamps from RTSP?

    https://support.networkoptix.com/hc/en-us/community/posts/360050059114-Issue-with-timestamp-in-rtsp-stream-in-combination-with-plugin-


  • Floris De Smedt

    Hi,

    I tried both with the onvif option and without, and confirmed the timestamp is part of the data using a sniffer.

    At the time I was indeed working on this synchronisation issue. In the meantime we have succeeded in solving it, which allows us to run Python processing on the RTSP stream and store the metadata back into Nx. I think I will do something similar for the integration with DeepStream.

  • Tagir Gadelshin

    Floris De Smedt

    In the meantime we succeeded in solving that

    What was the issue? We have several similar requests, maybe this will help other developers.

    And do you need any additional support from us regarding your initial questions?

  • Floris De Smedt

    Sorry for the late reply.

    What was the issue? We have several similar requests, maybe this will help other developers.

    Using this code (https://github.com/ramoncaldeira/PyAV/tree/rtcp-ntp-time) I succeeded in getting the timestamps + frame data. It is, however, important to read sufficiently fast to avoid the frames and the timestamps getting out of sync; that was the issue I had. By reading and processing the stream in different processes/threads, this problem seems to be resolved (see the sketch below).
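
    A minimal sketch of that decoupling: one thread drains the stream as fast as possible while a worker consumes frames from a bounded queue, so slow processing never stalls the reader. The ntp_timestamp attribute comes from the PyAV fork linked above (stock PyAV does not expose it), and run_inference is a hypothetical stand-in for the actual processing:

        import queue
        import threading
        import av  # the rtcp-ntp-time fork linked above

        frames = queue.Queue(maxsize=100)  # bounded, so memory use stays flat

        def reader(url):
            """Drain the RTSP stream quickly so frames and timestamps stay in sync."""
            container = av.open(url, options={"rtsp_transport": "tcp"})
            for frame in container.decode(video=0):
                image = frame.to_ndarray(format="bgr24")
                frames.put((image, getattr(frame, "ntp_timestamp", None)))

        def worker():
            """Run the (slow) processing without blocking the reader."""
            while True:
                image, ntp_ts = frames.get()
                run_inference(image, ntp_ts)  # hypothetical processing step

        threading.Thread(target=reader, args=("rtsp://nx-server:7001/stream?onvif_replay=1",), daemon=True).start()
        threading.Thread(target=worker, daemon=True).start()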

    This implementation works for live video feeds, but when I tested it in the past weeks, it turned out that when reading from a virtual camera (created from video files), the frame data exhibits ghosting (the whole frame is not correctly updated, causing movements to smear out). We are working on a new implementation that does not have that issue.

    And do you need any additional support from us regarding your initial questions?

    For now I'm following the track of using appsrc to get the frame data + timestamps. I did not get to an end-to-end integration (showing the detections as an overlay in the Nx Client), so I don't know for sure, but I think this will work.

  • Tagir Gadelshin

    Floris De Smedt
    Thanks! This is very informative!

    I'll try to outline your full solution:

    1. You take RTSP streams from the Nx Server
    2. Using PyAV (https://github.com/ramoncaldeira/PyAV/tree/rtcp-ntp-time) you are getting the timestamps + frame data. You do stream reading and processing in separate processes/threads to avoid delays.
    3. Using appsrc (https://gstreamer.freedesktop.org/documentation/app/appsrc.html?gi-language=c) you insert this data into the DeepStream pipeline (this article probably describes the process: https://gstreamer.freedesktop.org/documentation/tutorials/basic/short-cutting-the-pipeline.html?gi-language=c); some examples I've found are located in the DeepStream SDK:
       /opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-appsrc-test
    4. Probably, using appsink you get the metadata & timestamps and send them to the Nx Server plugin over the zmq message bus (a sketch follows below).
    5. The plugin code listens on the same message bus and calls "pushMetaDataPacket".
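
    A minimal sketch of step 4, assuming the DeepStream Python bindings (pyds) and an appsink named "nx_sink" at the end of the pipeline; publishing the extracted values over zmq would reuse the message-bus code sketched earlier in the thread:

        import pyds
        from gi.repository import Gst

        def on_new_sample(sink):
            """appsink callback: walk the batch metadata and collect detections."""
            sample = sink.emit("pull-sample")
            batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(sample.get_buffer()))
            l_frame = batch_meta.frame_meta_list
            while l_frame is not None:
                frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
                l_obj = frame_meta.obj_meta_list
                while l_obj is not None:
                    obj = pyds.NvDsObjectMeta.cast(l_obj.data)
                    rect = obj.rect_params
                    # ship (frame_meta.ntp_timestamp, obj.obj_label, rect.left,
                    # rect.top, rect.width, rect.height) to the message bus here
                    try:
                        l_obj = l_obj.next
                    except StopIteration:
                        break
                try:
                    l_frame = l_frame.next
                except StopIteration:
                    break
            return Gst.FlowReturn.OK

        appsink = pipeline.get_by_name("nx_sink")  # pipeline from the first sketch
        appsink.connect("new-sample", on_new_sample)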

    I hope this outlines your solution, and maybe we will refer to it when creating some example plugins or developing new features in the SDK.

