Use Cases for Search Analytics

Invariant Search allows the user to search for objects in a way that is invariant to size, orientation, and position. It will also find objects that are occluded or only partially visible. In this example, the school bus is found at different angles, in different locations, when it is near and when it is far away, and when part of it is hidden from view.
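
The matching step can be pictured as a nearest-neighbor comparison between signatures. The sketch below is illustrative only: it assumes the signatures are fixed-length embedding vectors, and the function name, threshold, and 128-dimension size are hypothetical rather than part of the actual system.

```python
# A minimal sketch of invariant search, assuming object and frame signatures
# are fixed-length embedding vectors (the real signature model is not shown here).
import numpy as np

def cosine_similarity(query, candidates):
    """Cosine similarity between one query vector and many candidate vectors."""
    query = query / np.linalg.norm(query)
    candidates = candidates / np.linalg.norm(candidates, axis=1, keepdims=True)
    return candidates @ query

def invariant_search(query_sig, frame_sigs, threshold=0.8):
    """Return indices of frames whose signature matches the query.

    Because the signature is assumed to be invariant to size, orientation,
    and position, the same comparison finds the object near or far, rotated,
    or partially occluded, as long as it still scores above the threshold.
    """
    scores = cosine_similarity(query_sig, frame_sigs)
    return np.where(scores >= threshold)[0]

# Hypothetical usage: query with a school-bus signature against 10,000 frames.
frame_signatures = np.random.rand(10_000, 128)   # stand-in frame signatures
bus_signature = np.random.rand(128)              # stand-in query signature
hits = invariant_search(bus_signature, frame_signatures)
```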

Patterns-of-Life allows the user to determine who was together, and when, in a video. The Boolean operators “and”, “or”, and “not” can be used in the search. In this example, the “and” operator found the three vehicles together several times; the “or” operator then returned every time any one of the three vehicles was found.
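
Since each single-object search returns the set of frames (or times) where that object appears, the “and”, “or”, and “not” operators can be pictured as set intersection, union, and difference. A minimal sketch, assuming each search result is a set of frame indices (the variable names and values are hypothetical):

```python
# Hypothetical frame indices where each vehicle was detected.
frames_a = {10, 11, 12, 40, 41, 95}   # vehicle A
frames_b = {11, 12, 41, 60}           # vehicle B
frames_c = {12, 41, 80, 95}           # vehicle C

together    = frames_a & frames_b & frames_c   # "and": all three vehicles present
any_of      = frames_a | frames_b | frames_c   # "or": any of the three present
a_without_b = frames_a - frames_b              # "not": A present, B absent

print(sorted(together))     # frames where the three vehicles were found together
print(sorted(any_of))       # frames where any of the three vehicles was found
print(sorted(a_without_b))  # frames where A appears without B
```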

Mapping Videos to the Search Space allows the user to see where each frame of multiple videos falls in a 3-D view of the search space. Each point represents the signature of one video frame projected into the search space, and the colors distinguish the videos. In this example, several clusters form across the videos.
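
One way to picture this mapping is to project the high-dimensional frame signatures down to three dimensions and color each point by its source video. The sketch below is an assumption-laden illustration: it uses PCA for the projection and random stand-in signatures, whereas the actual projection method and signature format are not specified here.

```python
# A minimal sketch of mapping videos into a 3-D search space view, assuming
# frame signatures are high-dimensional vectors; PCA is used here only as a
# stand-in projection, and the signatures are randomly generated for illustration.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Stand-in signatures for three videos, 200 frames each, 128-D, offset so they cluster.
videos = [rng.normal(loc=i * 2.0, size=(200, 128)) for i in range(3)]
signatures = np.vstack(videos)
video_labels = np.repeat(np.arange(3), 200)

# PCA via SVD: project centered signatures onto the top three principal axes.
centered = signatures - signatures.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
points = centered @ vt[:3].T

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(points[:, 0], points[:, 1], points[:, 2], c=video_labels, s=4)
ax.set_title("Frame signatures projected into the search space")
plt.show()
```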