Analysis

IRIS+ Professional provides a wide range of analysis capabilities, including video indexing, object detection, and classification. Analysis is performed on the video streams of cameras and on video files added to the system.

Analysis Parameters

Parameters

The analysis parameters can be configured for each camera or video added to the system. The parameters are divided into several sections, each with its own set of options.

Note

Currently, it is not possible to edit indexing parameters after the camera has been added. If you need to change them, delete the camera and add it again with the new parameters.

Here you can set the parameters for indexing the video.

  • Detector FPS (4 by default): The number of frames the detector will analyse per second.

Warning

A higher detector FPS allows the detector to analyse more frames, which can improve detection accuracy but also increases GPU usage. The default value of 4 FPS is a good compromise between accuracy and processing load. Consult the hardware requirements for more information.
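
As a rough illustration of the trade-off (example values only, not product internals), the snippet below estimates how many frames are sampled and skipped at a given detector FPS.

```python
# Rough illustration of detector frame sampling (example values, not product internals).
stream_fps = 25      # frames per second delivered by the camera
detector_fps = 4     # frames per second actually analysed (default)

sampling_interval = stream_fps / detector_fps      # roughly every 6th frame is analysed
skipped_per_second = stream_fps - detector_fps     # frames not seen by the detector

print(f"Detector analyses ~1 in every {sampling_interval:.1f} frames")
print(f"{skipped_per_second} of {stream_fps} frames per second are skipped")
```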

Enables the extraction of attributes for all object types from the video stream (on by default).

  • Number of feature vectors (2 by default): The required number of feature vectors depends on the number of objects in the video. For a scene with low or rare activity, leave the default. As the number of objects increases, consider increasing this value so that no objects are missed.

Enables the extraction of face attributes from the video stream (off by default).

  • Number of feature vectors (2 by default): The required number of feature vectors depends on the number of faces in the video. For a scene with low activity, leave the default. As the number of faces increases, consider increasing this value so that no faces are missed.

This feature is currently unavailable for editing. It will be supported in a future release.

Enables the extraction of attributes from the environment around classifiable objects (on by default). It is used to detect changes in the background, such as movement or changes in lighting or environmental conditions.

  • Max background vector calculations per frame (1 by default): The optimal number of analysed vectors depends on how likely the background is to change. If the background is expected to remain mostly static, leave the default. If the background changes frequently or drastically (such as drone footage or a PTZ camera, where the environment changes continuously due to camera movement), set it to 2 or more, as more frequent calculations will be necessary.
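
For reference, the sketch below collects the indexing parameters described above into a single configuration object. The field names are illustrative assumptions made for this documentation only, not the actual configuration schema of IRIS+ Professional.

```python
# Hypothetical indexing configuration; field names are illustrative, not the real API.
indexing_params = {
    "detector_fps": 4,                     # frames analysed per second (default 4)
    "object_attributes": {
        "enabled": True,                   # attribute extraction for all object types (on by default)
        "feature_vectors": 2,              # increase for busy scenes so no objects are missed
    },
    "face_attributes": {
        "enabled": False,                  # face attribute extraction (off by default)
        "feature_vectors": 2,              # increase for crowded scenes so no faces are missed
    },
    "background_attributes": {
        "enabled": True,                   # background change detection (on by default)
        "max_vector_calculations_per_frame": 1,  # set to 2+ for drone/PTZ footage
    },
}
```
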
Feature vectors

Feature vectors are quantifiable attributes extracted from video streams. They are used to identify objects in the video and can be used for various purposes, such as object tracking, classification, and recognition.
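
For intuition only, the following sketch shows how feature vectors are commonly compared in recognition-style tasks: detections whose vectors have a high cosine similarity are likely to show the same object. The vector values are made up for illustration and do not reflect the product's internal representation.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Made-up feature vectors for two detections of (possibly) the same person.
vector_frame_1 = [0.12, 0.80, 0.33, 0.05]
vector_frame_2 = [0.10, 0.78, 0.35, 0.07]

print(f"similarity: {cosine_similarity(vector_frame_1, vector_frame_2):.3f}")
```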

List of Object Types

IRIS+ Professional supports a wide range of object types that can be detected and classified in video streams. These object types can be used in queries to filter detections and to create custom use cases.

  • airplane
  • apple
  • backpack
  • banana
  • baseball bat
  • baseball glove
  • bear
  • bed
  • bench
  • bg
  • bicycle
  • bird
  • boat
  • book
  • bottle
  • bowl
  • broccoli
  • bus
  • cake
  • car
  • carrot
  • cat
  • cell phone
  • chair
  • clock
  • couch
  • cow
  • cup
  • dining table
  • dog
  • donut
  • elephant
  • fire hydrant
  • fork
  • frisbee
  • giraffe
  • hair drier
  • handbag
  • horse
  • hot dog
  • keyboard
  • kite
  • knife
  • laptop
  • microwave
  • motorcycle
  • mouse
  • orange
  • oven
  • parking meter
  • person
  • pizza
  • potted plant
  • refrigerator
  • remote
  • sandwich
  • scissors
  • sheep
  • sink
  • skateboard
  • skis
  • snowboard
  • spoon
  • sports ball
  • stop sign
  • suitcase
  • surfboard
  • teddy bear
  • tennis racket
  • tie
  • toaster
  • toilet
  • toothbrush
  • traffic light
  • train
  • truck
  • tv
  • umbrella
  • vase
  • wine glass
  • zebra
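
As a minimal illustration of how these object type names can be used to filter detections, the sketch below keeps only vehicle detections from a hypothetical result list; the record format is an assumption made for this example, not the product's actual query API.

```python
# Hypothetical detection records; the real query API may differ.
detections = [
    {"object_type": "person", "confidence": 0.91},
    {"object_type": "car", "confidence": 0.88},
    {"object_type": "bicycle", "confidence": 0.73},
    {"object_type": "truck", "confidence": 0.95},
]

vehicle_types = {"car", "truck", "bus", "motorcycle", "bicycle"}
vehicles = [d for d in detections if d["object_type"] in vehicle_types]

print(vehicles)
```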

List of Classifiers

Classifiers (Few-Shot Learning, or FSL, classifiers) can be used in queries to filter detections. They are lightweight and require only a few dozen examples for training.
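
To illustrate why a few dozen examples can suffice, the sketch below shows one common few-shot approach: averaging the feature vectors of labelled examples into per-class prototypes and assigning new detections to the nearest prototype. This is a generic illustration with made-up values, not the actual classifier implementation used by IRIS+ Professional.

```python
import math

def centroid(vectors):
    """Average a list of feature vectors into a single class prototype."""
    return [sum(dim) / len(vectors) for dim in zip(*vectors)]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# A handful of made-up labelled feature vectors per class.
examples = {
    "wearing_helmet":     [[0.90, 0.10], [0.80, 0.20], [0.85, 0.15]],
    "not_wearing_helmet": [[0.10, 0.90], [0.20, 0.80], [0.15, 0.85]],
}
prototypes = {label: centroid(vecs) for label, vecs in examples.items()}

def classify(feature_vector):
    """Assign the class whose prototype is closest to the feature vector."""
    return min(prototypes, key=lambda label: distance(feature_vector, prototypes[label]))

print(classify([0.82, 0.18]))  # -> wearing_helmet
```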

Custom Classifiers

If your needs aren't met by the existing classifiers, feel free to contact us by opening a ticket. Custom classifiers can be developed and delivered in binary format, so they integrate seamlessly with the existing system without requiring a new software release.

Each classifier is listed below with its availability*, the object types it applies to, and its classes.

Gender (available: yes; applies to: Person)
  • Male
  • Female

Age (available: planned; applies to: Person)
  • 10-year age bands (0-80)

Person on the phone (available: yes; applies to: Person)
  • Talking on the phone
  • Looking at phone
  • None of the above

Hands up (available: yes; applies to: Person)
  • Raised hands
  • Hands not raised

Emergency vehicle (available: yes; applies to: Car, Truck, Bus)
  • Emergency vehicle
  • Non-emergency vehicle

PPE helmet (available: yes; applies to: Person)
  • Wearing
  • Not wearing

PPE vest (available: yes; applies to: Person)
  • Wearing
  • Not wearing

Person wearing a face mask (available: yes; applies to: Person)
  • Wearing
  • Unsure whether wearing or not

Person with a gun (available: planned; applies to: Person)
  • With a gun
  • Without a gun

Person is smoking (available: planned; applies to: Person)
  • Smoking
  • Not smoking

Face of person is hidden (available: planned; applies to: Person)
  • Visible face
  • Hidden face

Gate/door open/closed (available: planned; applies to: Background)
  • Open
  • Closed

Simple pose (available: planned; applies to: Person)
  • Standing
  • Sitting
  • Lying down

Fire and smoke (available: planned; applies to: Background)
  • Fire/smoke visible
  • Normal

Construction vehicles (available: planned; applies to: Car, Truck)
  • 12 construction vehicle classes

*If "yes", the classifier is available in the latest release. Otherwise, it is planned for a future release.

Query Types

IRIS+ Professional provides a wide range of use cases across multiple domains. For advanced needs, generic use cases (for example, the Generic Object Search) can be leveraged to create custom use cases tailored to specific requirements. Specialized use cases can be configured quickly by filling in a few parameters.

All use cases listed below can be executed in two modes:

  • Forensic Mode: Processes historical data by specifying a start and end time.
  • Live Mode: Runs continuously to generate real-time data or event streams.

All use cases that produce events can also be configured to aggregate events and to produce statistics.
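
As a generic illustration of event aggregation (not the product's internal mechanism), the sketch below buckets event timestamps into fixed time windows and counts events per window.

```python
from collections import Counter
from datetime import datetime, timedelta

# Made-up event timestamps (e.g., line crossings).
events = [
    datetime(2024, 1, 1, 8, 0, 12),
    datetime(2024, 1, 1, 8, 3, 45),
    datetime(2024, 1, 1, 8, 7, 2),
    datetime(2024, 1, 1, 8, 16, 30),
]

window = timedelta(minutes=15)

def window_start(ts):
    """Floor a timestamp to the start of its aggregation window."""
    elapsed = ts - datetime.min
    return datetime.min + (elapsed // window) * window

stats = Counter(window_start(ts) for ts in events)
for start, count in sorted(stats.items()):
    print(start, count)
```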

Both event and statistical data can be visualized in the IRIS+ Professional application and forwarded to third-party applications such as a VMS.

Many use cases use classifiers (see above) to detect object attributes or actions.

Custom use cases

If you have unique requirements that aren't addressed by the existing use cases, feel free to contact us by opening a ticket. Custom use cases can be developed and delivered as binaries that integrate seamlessly with your system, eliminating the need to wait for a new software release.

Each use case is listed below with its domain in parentheses, followed by a description and its main configuration parameters.

Near Miss Detection (Safety)
Detects if a moving vehicle is close to a person.
  • Vehicle Classes
  • Vehicle Attributes (list of classifiers with desired classes)
  • Distance

PPE Detection in Area (Safety)
Detects if a person is wearing a hard hat and/or safety vest in a given area.
  • Area Polygon

PPE Detection at Entrance (Safety)
Detects if a person is wearing a hard hat and/or safety vest when entering an area.
  • Crossline

Fall Detection (Safety)
Detects persons in a prone position.

Intrusion Detection (Security)
Detects if a person appears in a protected area.
  • Area Polygon
  • Sensitivity

Loitering Detection (Security)
Detects if a person stays in a given area for a configurable amount of time.
  • Area Polygon
  • Max Time Threshold

Tailgating (Security)
Detects tailgating for persons or vehicles.
  • Object Class
  • Min Following Time Threshold

Measure Vehicle Speed* (Smart City)
Measures vehicle speed on one camera based on multiple crossing lines.
  • Crosslines
  • Real Distance of Crosslines

Measure Vehicle Following Distance* (Smart City)
Measures the following distance of vehicles.
  • Crosslines

Wrong Vehicle* (Smart City, Security)
Detects if a configured vehicle type is in a given area, for example allowing only branded vehicles or flagging a wrong vehicle in a lane (e.g., a car in a bus lane).
  • Area Polygon
  • Vehicle Attributes (list of classifiers with alert classes)

Single Crossline Traffic Counting (Smart City, Retail)
Counts line-crossing objects (persons, vehicles, etc.). Counting can be grouped by attributes, direction, and time window.
  • Crossline
  • Attributes (list of classifiers with classes of interest)
  • Aggregation Time Window

Multiple Crossline Traffic Counting (Smart City, Retail)
Counts line-crossing objects (persons, vehicles, etc.). Counting can be grouped by attributes and time window. Two crosslines can be added to filter directional traffic.
  • Crosslines and Directions
  • Attributes (list of classifiers with classes of interest)
  • Aggregation Time Window

Red Light Running (Smart City)
Detects if a vehicle runs a red light.
  • Crossline
  • Traffic Light ROI

Measure Queue Length* (Retail)
Measures the number of persons in a queue.
  • Area Polygon
  • Aggregation Window Size

* Part of an upcoming release
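
For the crossline-based use cases above, the underlying geometric idea is to determine which side of the line a tracked object is on and to detect when that side changes between consecutive positions. The sketch below is a generic illustration with made-up coordinates, not the product's implementation.

```python
def side_of_line(point, line_start, line_end):
    """Sign of the cross product: >0 on one side, <0 on the other, 0 on the line."""
    (px, py), (ax, ay), (bx, by) = point, line_start, line_end
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def count_crossings(track, line_start, line_end):
    """Count how often a tracked object's position switches sides of the crossline."""
    crossings = 0
    previous = side_of_line(track[0], line_start, line_end)
    for point in track[1:]:
        current = side_of_line(point, line_start, line_end)
        if previous != 0 and current != 0 and (previous > 0) != (current > 0):
            crossings += 1
        previous = current
    return crossings

# Made-up crossline and object track (e.g., pixel coordinates from a tracker).
line_a, line_b = (0, 5), (10, 5)
track = [(2, 1), (3, 3), (4, 6), (5, 8)]   # object moves from below to above the line

print(count_crossings(track, line_a, line_b))  # -> 1
```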