The Nx Witness Server will often need to react to certain events detected by a plugin. For example, when an object of a specific type is detected, the Nx Witness Server may need to send an HTTP request to an external system to notify it about the event. For this purpose, the Nx Witness Server is equipped with the Rules Engine, which can be configured in the “Camera Rules” dialog of the desktop client to process Analytics Events on a camera and to perform specific actions.
In this section, we will learn how to organize the code of a plugin to send Analytics Events to the Nx Witness Server. All the framework-specific details can be found in the source code of the current example in the step5/src/sample_company/vms_server_plugins/opencv_object_detection/ folder.
The first step is to familiarize ourselves with the documentation. In the src/nx/sdk/analytics/manifests.md file of the Metadata SDK, read the “Engine Manifest” and “DeviceAgent Manifest” sections regarding the “eventTypes” property, as well as the “Event types” section.
Declaring event types
Following the documentation, define the Event type ids and several additional constants in device_agent.h:
const std::string kDetectionEventType = "sample.opencv_object_detection.detection";
const std::string kDetectionEventCaptionSuffix = " detected";
const std::string kDetectionEventDescriptionSuffix = " detected";
const std::string kProlongedDetectionEventType =
    "sample.opencv_object_detection.prolongedDetection";
Declare the event types in DeviceAgent::manifestString() in device_agent.cpp:
"eventTypes": [
{
"id": ")json" + kDetectionEventType + R"json(",
"name": "Object detected"
},
{
"id": ")json" + kProlongedDetectionEventType + R"json(",
"name": "Object detected (prolonged)",
"flags": "stateDependent"
}
]
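To make the splicing mechanics concrete, the fragment above can be sketched as a self-contained function. This is a hypothetical stand-in for the relevant part of DeviceAgent::manifestString() (the real method returns the full manifest, not just the eventTypes fragment); it shows how C++ raw string literals and string concatenation combine the constants into JSON:

```cpp
#include <string>

const std::string kDetectionEventType = "sample.opencv_object_detection.detection";
const std::string kProlongedDetectionEventType =
    "sample.opencv_object_detection.prolongedDetection";

// Hypothetical stand-in for part of DeviceAgent::manifestString():
// demonstrates splicing the event type ids into the JSON manifest fragment
// via raw string literals.
std::string eventTypesManifestFragment()
{
    return R"json(
"eventTypes": [
    {
        "id": ")json" + kDetectionEventType + R"json(",
        "name": "Object detected"
    },
    {
        "id": ")json" + kProlongedDetectionEventType + R"json(",
        "name": "Object detected (prolonged)",
        "flags": "stateDependent"
    }
]
)json";
}
```

Each `)json"` closes one raw string literal so the id constant can be concatenated in, and the following `R"json(` reopens the literal.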
Structures for Event data conversion
Similarly to the structures for object detection, we must define several entities in event.h to pass Event information, in different formats, to and from the OpenCV framework:
enum class EventType
{
    detection_started,
    detection_finished,
    object_detected
};

struct Event
{
    const EventType eventType;
    const int64_t timestampUs;
    const std::string classLabel;
};
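To make the intent of these entities concrete, here is a minimal, self-contained sketch of how a tracker might emit Event values when a track for a class first appears. The helper name and trigger logic are illustrative assumptions, not the tutorial's actual code, and the `const` qualifiers are dropped here to keep the sketch copyable:

```cpp
#include <cstdint>
#include <string>
#include <vector>

enum class EventType
{
    detection_started,
    detection_finished,
    object_detected
};

struct Event
{
    EventType eventType;
    int64_t timestampUs;
    std::string classLabel;
};

using EventList = std::vector<Event>;

// Hypothetical helper: when a new track appears, a tracker could emit both
// the start of the prolonged event and an instant "object detected" event.
EventList eventsForNewTrack(const std::string& classLabel, int64_t timestampUs)
{
    return {
        {EventType::detection_started, timestampUs, classLabel},
        {EventType::object_detected, timestampUs, classLabel},
    };
}
```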
Event generation
The information about a detected event is passed to the Nx Witness Server in the form of a metadata packet of the EventMetadataPacket type, which is different from the ObjectMetadataPacket type we used in Step 4. Both packet types (ObjectMetadataPacket and EventMetadataPacket) may need to be passed to the Nx Witness Server at the same time, so the plugin must be able to generate several packets of different types.
For storing metadata packets, continue to use MetadataPacketList (a vector of IMetadataPacket):
using MetadataPacketList = std::vector<nx::sdk::Ptr<nx::sdk::analytics::IMetadataPacket>>;
The events will be generated by the ObjectTracker::run() method, and ultimately by the OpenCV framework. We need to change the type of the returned result so that it contains both objects and events, by defining a structure in object_tracker.h:
struct Result
{
    DetectionList detections;
    EventList events;
};
Refactor both the declaration and the implementation of ObjectTracker::run() and ObjectTracker::runImpl() to return the Result structure.
We will stop using the DeviceAgent::generateEventMetadataPacket() method and define a new one in the DeviceAgent class:
MetadataPacketList eventsToEventMetadataPacketList(
    const EventList& events,
    int64_t timestampUs);
This method converts the information stored in the Event structures received from ObjectTracker into a MetadataPacketList, which can then be passed to the Nx Witness Server. In effect, this method is the interface between the plugin and ObjectTracker, which utilizes the OpenCV framework.
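As a rough illustration of what this conversion does, here is a self-contained sketch that replaces the SDK's EventMetadataPacket with a simplified stand-in struct. The real implementation builds nx::sdk::analytics::EventMetadataPacket objects; the stand-in type, the caption rule, and the mapping from EventType to event type id below are assumptions for illustration only:

```cpp
#include <cstdint>
#include <string>
#include <vector>

enum class EventType { detection_started, detection_finished, object_detected };

struct Event { EventType eventType; int64_t timestampUs; std::string classLabel; };

// Simplified stand-in for the SDK's EventMetadataPacket, for illustration.
struct SimplePacket
{
    std::string typeId;
    std::string caption;
    bool isActive;
    int64_t timestampUs;
};

const std::string kDetectionEventType = "sample.opencv_object_detection.detection";
const std::string kProlongedDetectionEventType =
    "sample.opencv_object_detection.prolongedDetection";
const std::string kDetectionEventCaptionSuffix = " detected";

// Hypothetical sketch of the conversion: one packet per Event, with the
// event type id and active flag chosen from the Event kind.
std::vector<SimplePacket> eventsToPackets(
    const std::vector<Event>& events, int64_t timestampUs)
{
    std::vector<SimplePacket> result;
    for (const Event& event: events)
    {
        SimplePacket packet;
        packet.timestampUs = timestampUs;
        packet.caption = event.classLabel + kDetectionEventCaptionSuffix;
        if (event.eventType == EventType::object_detected)
        {
            packet.typeId = kDetectionEventType;
            packet.isActive = true;
        }
        else
        {
            // Prolonged event: active while the detection is ongoing.
            packet.typeId = kProlongedDetectionEventType;
            packet.isActive = (event.eventType == EventType::detection_started);
        }
        result.push_back(packet);
    }
    return result;
}
```

The key point is the state handling of the prolonged event type: its packets carry an active flag that turns on at detection_started and off at detection_finished, matching the "stateDependent" flag declared in the manifest.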
Finally, we will refactor DeviceAgent::processFrame() to process the events. The try section needs to be modified:
try
{
    DetectionList detections = m_objectDetector->run(frame);
    ObjectTracker::Result objectTrackerResult = m_objectTracker->run(frame, detections);

    const auto& objectMetadataPacket = detectionsToObjectMetadataPacket(
        objectTrackerResult.detections,
        frame.timestampUs);
    const auto& eventMetadataPacketList = eventsToEventMetadataPacketList(
        objectTrackerResult.events,
        frame.timestampUs);

    MetadataPacketList result;
    if (objectMetadataPacket)
        result.push_back(objectMetadataPacket);
    result.insert(
        result.end(),
        std::make_move_iterator(eventMetadataPacketList.begin()),
        std::make_move_iterator(eventMetadataPacketList.end()));
    return result;
}
That’s it. Let’s build our plugin and see how Analytics Events work.
After activating the plugin on a camera, open Event Rules in System Administration and click Add.
In the Event section, select Analytics Event from the drop-down list.
In the At field, choose the camera the plugin is enabled on.
In the Event Type drop-down list, you should see the Analytics Events defined in this section.