objectMetadata not appearing in Client

In Progress

Comments

11 comments

  • Andrey Terentyev

    Hello,

    Could you please provide the OS version, VMS build number, and Metadata SDK version?

  • Andrey Terentyev

    Graham Parry,

    Could you please do the following?

    Take the sample_analytics_plugin of the Metadata SDK, port your example code there, and check whether the issue still persists. If it does, share the plugin code with us.

  • Graham Parry

    Hi Andrey
    FYI I'm using nxwitness-metadata_sdk-5.0.0.35745-universal with a 5.0.0.35744 Server/Client on Windows 10.
    I'm using the stub object streamer example with all the file handling removed, and just two functions to convert the incoming JSON arriving via HTTP.

    I've left it a day or two as I was persisting with the code, and I eventually began to see the objects appear. Just occasionally I'll see nothing arrive, but for the most part I can load the metadata OK.
    As this is an early experiment for me, I need to start adding multiple frames per data object, similar to the stub object streamer example, since single-shot examples won't do anything. In the absence of real-time stream analysis I'm creating a looped app to send in, say, 300 frames for the same JSON objects.

    I am, though, finding it tricky to add a correct timestamp that fits the timestampUs criteria. As I'm bypassing pushCompressedVideoFrame I can't grab the videoPacket timestamp, so I have to create one in code for each objectMetadataPacket I'm creating. Down the line there may well be timestamp-matching issues anyway, but for now, if you have a formula for a uint64_t timestamp, that would be useful.


  • Graham Parry

    Update to my last post.

    Much better response now that I have multiple frames being sent in. I think I have the timestamp nailed, at least a version good enough for test data, so I have answered my own question there.
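
    For the record, a minimal sketch of the kind of formula that works for test data, assuming plain wall-clock microseconds since the Unix epoch are acceptable (my assumption; the exact code isn't shown here):

    #include <chrono>
    #include <cstdint>

    //  Current wall-clock time in microseconds since the Unix epoch, as a
    //  uint64_t suitable for ObjectMetadataPacket::setTimestampUs().
    uint64_t nowUs()
    {
        using namespace std::chrono;
        return static_cast<uint64_t>(duration_cast<microseconds>(
            system_clock::now().time_since_epoch()).count());
    }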

    However, the one thing still missing is the boundingBox appearing on the video. I'm taking values from those generated by the generate.py application, so they shouldn't be beyond what the server is expecting.

  • Graham Parry

    I'd like the bounding boxes to appear on playback, but I'm not achieving it. I did read in another answer that the whole process from video analysis to plugin must take less than 2 seconds, but I am timestamping each frame in real time and it is sent to the plugin in an instant, so I can't see that being the issue. Is there something in the settings or manifests that may be missing? While the frames are being generated and sent to the plugin, I don't see the boxes either.
    On the other hand, with your stub streamer example the boxes appear while the objects are being created from the file; but afterwards, when I select any of them in the client for playback, they don't show the boxes. Could you help with the criteria required for a boundingBox to appear both as the object is created (passed to the plugin) and also on playback?

  • Andrey Terentyev

    Graham,

    I'd like to help you, but you're asking questions I can't answer without a source code example.

    I need a piece of code reproducing the issue so that we're on the same page.

    Could you please do the following?

    Take the sample_analytics_plugin of the Metadata SDK and port your example code there; there is no need to port the HTTP server. Check whether the issue still persists. If it does, share the plugin code with us and describe the reproduction scenario, i.e. steps to reproduce, expected result, and actual result.

  • Graham Parry

    Thanks Andrey

    I've made a start on implementing our code in the sample plugin. I will advise how it goes and supply the code block if the boxes still don't appear.

  • Graham Parry

    I tried it out using the sample_analytics_plugin as the most basic framework, and there are still no visible bounding boxes.

    This is the JSON person object I'm creating:

    {
        "timestampUs": 0,
        "durationUs": 0,
        "objects": [
            {
                "typeId": "nx.base.Person",
                "trackId": "95b57074-641f-42a3-94ba-473846dda36",
                "attributes": {
                    "Age": "Adult",
                    "Gender": "Woman",
                    "Top Clothing Color": "Pink",
                    "Bottom Clothing Color": "Grey"
                },
                "boundingBox": {
                    "x": 0.2418023131836082,
                    "y": 0.2579128121515938,
                    "width": 0.33715904075347155,
                    "height": 0.3583852790585783
                }
            }
        ]
    }

    My loop code generates any number of test frames, creates the timestamp each time, and adjusts the x/y coordinates very slightly for each frame, along the lines of the sketch below. I put recording on, run my frame generator, and send in n frames of JSON metadata. The person object is then visible in the client and will play back, but with no boundingBox visible.
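
    A minimal sketch of that kind of generator loop, as a standalone test program (hypothetical, not the actual app; the trackId is a placeholder, and std::cout stands in for the HTTP POST that delivers each body to the plugin):

    #include <chrono>
    #include <cstdint>
    #include <iostream>
    #include <sstream>
    #include <string>
    #include <thread>

    int main()
    {
        //  Placeholder track id: in real use, one fixed UUID per object track.
        const std::string trackId = "00000000-0000-0000-0000-000000000001";

        double x = 0.24;
        double y = 0.26;

        for (int i = 0; i < 300; ++i)
        {
            //  Timestamp each frame in real time, in microseconds since the epoch.
            const auto timestampUs = static_cast<uint64_t>(
                std::chrono::duration_cast<std::chrono::microseconds>(
                    std::chrono::system_clock::now().time_since_epoch()).count());

            std::ostringstream json;
            json << "{ \"timestampUs\": " << timestampUs
                 << ", \"durationUs\": 0, \"objects\": [ { "
                 << "\"typeId\": \"nx.base.Person\", "
                 << "\"trackId\": \"" << trackId << "\", "
                 << "\"attributes\": { \"Gender\": \"Woman\" }, "
                 << "\"boundingBox\": { \"x\": " << x << ", \"y\": " << y
                 << ", \"width\": 0.34, \"height\": 0.36 } } ] }";

            //  A real sender would HTTP-POST json.str() to the plugin here.
            std::cout << json.str() << "\n";

            //  Nudge the coordinates slightly so each frame differs.
            x += 0.0005;
            y += 0.0005;

            std::this_thread::sleep_for(std::chrono::milliseconds(33));  // ~30 fps
        }
        return 0;
    }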

    The plugin code creating the objectMetadata for the server is shown below.  Hope this helps.  

    std::tuple<int, std::string> DeviceAgent::transferIncomingData(const crow::request& request)
    {

        const auto objectMetadataPacket = makePtr<ObjectMetadataPacket>();

        std::string body = request.body;

        rapidjson::Document d;
        d.Parse<0>(body.c_str());

        if (d.HasParseError())
            return std::make_tuple(400, "Json data did not parse");

        //  Set the packet timestamp and duration
        uint64_t ts = static_cast <uint64_t>(d["timestampUs"].GetInt64());
        uint64_t dr = static_cast <uint64_t>(d["durationUs"].GetInt64());
        objectMetadataPacket->setTimestampUs(ts);
        objectMetadataPacket->setDurationUs(dr);

        //  Now find the objects
        const rapidjson::Value& obj = d["objects"];

        //  Iterate through the object(s)
        for (rapidjson::Value::ConstValueIterator ito = obj.Begin(); ito != obj.End(); ++ito) {

            auto objectMetadata = makePtr<ObjectMetadata>();
            objectMetadata->setTypeId((*ito)["typeId"].GetString());
            objectMetadata->setTrackId(nx::sdk::UuidHelper::fromStdString((*ito)["trackId"].GetString()));

            //  Create the bounding box for the object in this frame
            float x = 0;
            float y = 0;
            float width = 0;
            float height = 0;

            const rapidjson::Value& box = (*ito)["boundingBox"];
            for (rapidjson::Value::ConstMemberIterator itb = box.MemberBegin(); itb != box.MemberEnd(); ++itb)
            {
                //  Compare member names with std::strcmp (from <cstring>):
                //  comparing GetString() to a string literal with == only
                //  compares pointers and would never match.
                if (std::strcmp(itb->name.GetString(), "x") == 0)
                    x = itb->value.GetFloat();
                else if (std::strcmp(itb->name.GetString(), "y") == 0)
                    y = itb->value.GetFloat();
                else if (std::strcmp(itb->name.GetString(), "width") == 0)
                    width = itb->value.GetFloat();
                else if (std::strcmp(itb->name.GetString(), "height") == 0)
                    height = itb->value.GetFloat();
            }

            objectMetadata->setBoundingBox(Rect(x, y, width, height));

            //  Add attributes after the bounding box
            const rapidjson::Value& attributes = (*ito)["attributes"];
            for (rapidjson::Value::ConstMemberIterator itr = attributes.MemberBegin(); itr != attributes.MemberEnd(); ++itr)
            {
                objectMetadata->addAttribute(makePtr<Attribute>(itr->name.GetString(), itr->value.GetString()));
            }

            //  Bounding box color
            objectMetadata->addAttribute(makePtr<Attribute>("nx.sys.color", "Red"));

            //  Add this object to the packet
            objectMetadataPacket->addItem(objectMetadata.get());

        }   //  Object Iteration

        //  Send the metadata packet to the NX Server
        objectMetadataPacket->addRef();
        pushMetadataPacket(objectMetadataPacket.get());

        //  Return a successful message in the HTTP response
        return std::make_tuple(200, "Json data successfully converted.");
    }

  • Graham Parry

    Hi Andrey
    Any further thoughts on the boundingBox being visible on playback? Since my last message I've run the plugin on a live system with active cameras (rather than experimenting with TestCamera), and when I fire JSON in, it successfully saves the objects (in this case nx.base.Person), but again there are no bounding boxes, either while the data is arriving or, more importantly, when I go to Advanced Search and play back a detected object.

  • Andrey Terentyev

    Graham,

    Any further thoughts on boundingBox being visible on playback?  

    You could use analytics logging along the analytics data flow to debug this issue. Here is an article that should help:

    https://support.networkoptix.com/hc/en-us/articles/360058902133-Analytics-Logs-Monitoring-Metadata-Flow

    In your case, I'd suggest checking the logs on the Nx Witness Desktop part of the flow, i.e. the following log files:

    • rtp_parser
      Metadata extracted from the stream coming from the Server.
    • widget_analytics_controller
      Metadata to be displayed on the video.

  • Graham Parry

    Thanks Andrey. It looks like those logs will prove the best way of working out what's happening to the metadata at its various stages.

